Supporting our scientists with Google Earth-based UIs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Janine
2010-10-01
Google Earth and Google Maps are incredibly useful for researchers looking for easily-digestible displays of data. This presentation will provide a step-by-step tutorial on how to begin using Google Earth to create tools that further the mission of the DOE national lab complex.
2013-08-09
CAPE CANAVERAL, Fla. – As seen on Google Maps, space shuttle Endeavour goes through transition and retirement processing in high bay 4 of the Vehicle Assembly Building at NASA's Kennedy Space Center. The spacecraft completed 25 missions beginning with its first flight, STS-49, in May 1992, and ending with STS-134 in May 2011. It helped construct the International Space Station and travelled more than 122 million miles in orbit during its career. The reaction control system pods in the shuttle's nose and aft section were removed for processing before Endeavour was put on public display at the California Science Center in Los Angeles. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang
a Map Mash-Up Application: Investigating the Temporal Effects of Climate Change on the Salt Lake Basin
NASA Astrophysics Data System (ADS)
Kirtiloglu, O. S.; Orhan, O.; Ekercin, S.
2016-06-01
The main purpose of this paper is to investigate climate change effects that have occurred since the beginning of the twenty-first century in the Konya Closed Basin (KCB), located in the semi-arid central Anatolian region of Turkey, and particularly in the Salt Lake region, where many of the major wetlands of the KCB are situated, and to share the analysis results online in a Web Geographical Information System (GIS) environment. 71 Landsat 5-TM, 7-ETM+ and 8-OLI images and meteorological data obtained from 10 meteorological stations have been used within the scope of this work. 56 of the Landsat images have been used to extract the Salt Lake surface area from multi-temporal imagery collected between 2000 and 2014 in the Salt Lake basin. 15 of the Landsat images have been used to make thematic maps of the Normalised Difference Vegetation Index (NDVI) in the KCB, and data from the 10 meteorological stations have been used to generate the Standardized Precipitation Index (SPI), which is widely used in drought studies. For the purpose of visualizing and sharing the results, a Web GIS environment has been established using Google Maps and Fusion Tables, Google's free-of-charge data storage and manipulation service. The infrastructure of the web application includes HTML5, CSS3, JavaScript, Google Maps API V3 and Google Fusion Tables API technologies. These technologies make it possible to build effective "Map Mash-Ups" by embedding a Google Map in a Web page, storing spatial or tabular data in Fusion Tables, and adding those data as a layer on the embedded map. The analysis process and the map mash-up application are discussed in detail in the main sections of this paper.
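To illustrate the mash-up pattern this abstract describes, here is a minimal sketch using the Google Maps JavaScript API v3 with a Fusion Tables layer; the table ID, map div, and centre coordinates are placeholders, not the authors' actual application.

```javascript
// Minimal map mash-up: embed a Google Map and overlay a Fusion Tables layer (Maps API v3 era).
function initMap() {
  var map = new google.maps.Map(document.getElementById('map'), {
    center: new google.maps.LatLng(38.7, 33.4),  // placeholder centre near the Konya Closed Basin
    zoom: 8
  });
  // Draw the geometry column of a (hypothetical) Fusion Table as a map layer.
  var layer = new google.maps.FusionTablesLayer({
    query: {
      select: 'geometry',
      from: 'EXAMPLE_FUSION_TABLE_ID'  // placeholder table ID
    }
  });
  layer.setMap(map);
}
```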
Visualize Your Data with Google Fusion Tables
NASA Astrophysics Data System (ADS)
Brisbin, K. E.
2011-12-01
Google Fusion Tables is a modern data management platform that makes it easy to host, manage, collaborate on, visualize, and publish tabular data online. Fusion Tables allows users to upload their own data to the Google cloud, which they can then use to create compelling, interactive visualizations. Users can view data on a Google Map, plot data in a line chart, or display data along a timeline. Users can share these visualizations with others to explore and discover interesting trends in various types of data, including scientific data such as invasive species or global trends in disease. Fusion Tables has been used by many organizations to visualize a variety of scientific data. One example is the California Redistricting Map created by the LA Times: http://goo.gl/gwZt5. The Pacific Institute and Circle of Blue have used Fusion Tables to map the quality of water around the world: http://goo.gl/T4SX8. The World Resources Institute mapped the threat level of coral reefs using Fusion Tables: http://goo.gl/cdqe8. What attendees will learn in this session: This session will cover all the steps necessary to use Fusion Tables to create a variety of interactive visualizations. Attendees will begin by learning about the various options for uploading data into Fusion Tables, including Shapefile, KML file, and CSV file import. Attendees will then learn how to use Fusion Tables to manage their data by merging it with other data and controlling the permissions of the data. Finally, the session will cover how to create a customized visualization from the data, and share that visualization with others using both Fusion Tables and the Google Maps API.
ERIC Educational Resources Information Center
Wang, Kening; Mulvenon, Sean W.; Stegman, Charles; Anderson, Travis
2008-01-01
Google Maps API (Application Programming Interface), released in late June 2005 by Google, is an amazing technology that allows users to embed Google Maps in their own Web pages with JavaScript. Google Maps API has accelerated the development of new Google Maps based applications. This article reports a Web-based interactive mapping system…
Boulos, Maged N Kamel
2005-01-01
This eye-opener article aims at introducing the health GIS community to the emerging online consumer geoinformatics services from Google and Microsoft (MSN), and their potential utility in creating custom online interactive health maps. Using the programmable interfaces provided by Google and MSN, we created three interactive demonstrator maps of England's Strategic Health Authorities, which can be browsed online: a Google Maps API (Application Programming Interface) version, a Google Earth KML (Keyhole Markup Language) version, and an MSN Virtual Earth Map Control version. Google and MSN's worldwide distribution of "free" geospatial tools, imagery, and maps is to be commended as a significant step towards the ultimate "wikification" of maps and GIS. A discussion is provided of these emerging online mapping trends, their expected future implications and development directions, and associated individual privacy, national security and copyright issues. Although ESRI have announced their planned response to Google (and MSN), it remains to be seen how their envisaged plans will materialize and compare to the offerings from Google and MSN, and also how Google and MSN mapping tools will further evolve in the near future. PMID:16176577
NASA Astrophysics Data System (ADS)
Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.
2006-12-01
Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API, an Internet-based tool combining DHTML and AJAX that allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface, making it ideal for visualizing multiple data sets of the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which are then mapped in the Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting these coordinates into the equivalent Mercator projection before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures for the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how faults are identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. Although this project is in its early stages, high school and college teachers, as well as researchers, have expressed interest in using and extending these tools for visualizing and interacting with data on Earth and other planetary bodies.
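The coordinate conversion mentioned above (general cylindrical coordinates to the Mercator projection used by the Maps API) can be sketched with the standard spherical Mercator formula; this is an illustrative simplification, not the MARTIAN project's actual utility code, and the radius argument is whatever body the data refer to.

```javascript
// Spherical Mercator: project longitude/latitude (degrees) to x/y on a sphere of radius R.
// Latitudes are clipped near the poles because the Mercator y-coordinate diverges there.
function toMercator(lonDeg, latDeg, R) {
  var lon = lonDeg * Math.PI / 180;
  var lat = Math.max(-85, Math.min(85, latDeg)) * Math.PI / 180;
  return {
    x: R * lon,
    y: R * Math.log(Math.tan(Math.PI / 4 + lat / 2))
  };
}
```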
From Google Maps to Google Models (Invited)
NASA Astrophysics Data System (ADS)
Moore, R. V.
2010-12-01
Why hasn’t integrated modelling taken off? To its advocates, it is self-evidently the best and arguably the only tool available for understanding and predicting the likely response of the environment to events and policies. Legislation requires managers to ensure that their plans are sustainable. How, other than by modelling the interacting processes involved, can the option with the greatest benefits be identified? Integrated modelling (IM) is seen to have huge potential. In science, IM is used to extend and encapsulate our understanding of the whole earth system. Such models are beginning to be incorporated in operational decision support systems and used to seek sustainable solutions to society’s problems, but only on a limited scale. Commercial take-up is negligible, yet the opportunities would appear limitless. The need is there; the potential is there; so what is inhibiting IM’s take-up? What must be done to reap the rewards of the R & D to date? To answer the question, it is useful to look back at the developments which have seen paper maps evolve into Google Maps and the systems that now surround it; facilities available not just to experts and governments but to anyone with an iPhone and an internet connection. The initial objective was to automate the process of drawing lines on paper, though it was quickly realised that digitising maps was the key to unlocking the information they held. However, it took thousands of PhD and MSc projects before a computer could generate a map comparable to that produced by a cartographer and many more before it was possible to extract reliable, useful information from maps. It also required advances in IT and a change of mindset from one focused on paper map production to one focused on information delivery. To move from digital maps to Google Maps required the availability of data on a world scale, the resources to bring them together, the development of remote sensing, satellite navigation and communications technology and the creation of a commercial climate and conditions that allowed businesses anywhere to exploit the new information. This talk will draw lessons from the experience and imagine how Google Maps could become Google Models. The first lesson is time scale: it took far longer for digital mapping to move out of the development phase than most expected. Its first real customers were the public utilities. They are large organisations, risk averse, and take time to change their ways of working; integrated modellers should not be surprised by the slow take-up. Few of the early commercial entrants made any significant profits. It was only when the data reached critical mass and became accessible, when the systems became easy to use, affordable and accessible via the web, when convincing demonstrations became available and the necessary standards emerged that Google Maps could emerge. IM has yet to reach this point. It has far bigger technical, scientific and institutional challenges to overcome. The resources required will be large. It is possible though that they could be marshalled by creating an open source community of practice. However, that community will need a facilitating core group and standards to succeed. Having seen what Google Maps made possible, and the innovative ideas it released, it is not difficult to imagine where a community of practice might take IM.
ERIC Educational Resources Information Center
Jacobsen, Mikael
2008-01-01
Librarians use online mapping services such as Google Maps, MapQuest, Yahoo Maps, and others to check traffic conditions, find local businesses, and provide directions. However, few libraries are using one of Google Maps' most outstanding applications, My Maps, for the creation of enhanced and interactive multimedia maps. My Maps is a simple and…
Alabama Public Scoping Meeting | NOAA Gulf Spill Restoration
Location: Mobile, AL. Start Time: 6:30 p.m. Central Time. Description: As part of the public scoping process, the session will open at 6:30 p.m. and the meeting will begin at 7:30 p.m. Location: The Battle House Renaissance Mobile Hotel & Spa, 26 North Royal Street, Mobile, AL 36602 (Google map of location)
Using Google Earth as an innovative tool for community mapping.
Lefer, Theodore B; Anderson, Matthew R; Fornari, Alice; Lambert, Anastasia; Fletcher, Jason; Baquero, Maria
2008-01-01
Maps are used to track diseases and illustrate the social context of health problems. However, commercial mapping software requires special training. This article illustrates how nonspecialists used Google Earth, a free program, to create community maps. The Bronx, New York, is characterized by high levels of obesity and diabetes. Residents and medical students measured the variety and quality of food and exercise sources around a residency training clinic and a student-run free clinic, using Google Earth to create maps with minimal assistance. Locations were identified using street addresses or simply by pointing to them on a map. Maps can be shared via e-mail, viewed online with Google Earth or Google Maps, and the data can be incorporated into other mapping software.
In Pursuit of Agile Acquisition: Are We There Yet?
2013-03-01
digital mapping capabilities like Google, Microsoft, and Wikimapia are readily obtainable in the commercial marketplace. This knowledge...
Online Public Access Catalog: The Google Maps of the Library World
ERIC Educational Resources Information Center
Bailey, Kieren
2011-01-01
What do Google Maps and a library's Online Public Access Catalog (OPAC) have in common? Google Maps provides users with all the information they need for a trip in one place; users can get directions and find out what attractions, hotels, and restaurants are close by. Librarians must find the ultimate OPAC that will provide, in one place, all the…
SpaceTime Environmental Image Information for Scene Understanding
2016-04-01
public Internet resources such as Google, MapQuest, Bing, and Yahoo Maps. ... (Table 3 lists terrain and location sources, including USACE AGC satellite/aerial imagery and terrain analysis, and Google, MapQuest, Bing, and Yahoo Maps.)
Google Maps offers a new way to evaluate claudication.
Khambati, Husain; Boles, Kim; Jetty, Prasad
2017-05-01
Accurate determination of walking capacity is important for the clinical diagnosis and management plan for patients with peripheral arterial disease. The current "gold standard" of measurement is walking distance on a treadmill. However, treadmill testing is not always reflective of the patient's natural walking conditions, and it may not be fully accessible in every vascular clinic. The objective of this study was to determine whether Google Maps, the readily available GPS-based mapping tool, offers an accurate and accessible method of evaluating walking distances in vascular claudication patients. Patients presenting to the outpatient vascular surgery clinic between November 2013 and April 2014 at the Ottawa Hospital with vasculogenic calf, buttock, and thigh claudication symptoms were identified and prospectively enrolled in our study. Onset of claudication symptoms and maximal walking distance (MWD) were evaluated using four tools: history; Walking Impairment Questionnaire (WIQ), a validated claudication survey; Google Maps distance calculator (patients were asked to report their daily walking routes on the Google Maps-based tool runningmap.com, and walking distances were calculated accordingly); and treadmill testing for onset of symptoms and MWD, recorded in a double-blinded fashion. Fifteen patients were recruited for the study. Determination of walking distances using Google Maps proved to be more accurate than by both clinical history and WIQ, correlating highly with the gold standard of treadmill testing for both claudication onset (r = .805; P < .001) and MWD (r = .928; P < .0001). In addition, distances were generally under-reported on history and WIQ. The Google Maps tool was also efficient, with reporting times averaging below 4 minutes. For vascular claudicants with no other walking limitations, Google Maps is a promising new tool that combines the objective strengths of the treadmill test and incorporates real-world walking environments. It offers an accurate, efficient, inexpensive, and readily accessible way to assess walking distances in patients with peripheral vascular disease. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Learning to Map the Earth and Planets using a Google Earth-based Multi-student Game
NASA Astrophysics Data System (ADS)
De Paor, D. G.; Wild, S. C.; Dordevic, M.
2011-12-01
We report on progress in developing an interactive geological and geophysical mapping game employing the Google Earth, Google Moon, and Google Mars virtual globes. Working in groups of four, students represent themselves on the Google Earth surface by selecting an avatar. One member of the group drives to each field stop in a model vehicle using game-like controls. When they arrive at a field stop and get out of their field vehicle, students can control their own avatars' movements independently and can communicate with one another by text message. They are geo-fenced and receive automatic messages if they wander off target. Individual movements are logged and stored in a MySQL database for later analysis. Students collaborate on mapping decisions and submit a report to their instructor through a JavaScript interface to the Google Earth API. Unlike real mapping, students are not restricted by geographic access and can engage in comparative mapping on different planets. Using newly developed techniques, they can also explore and map the sub-surface down to the core-mantle boundary. Virtual specimens created with a 3D scanner, Gigapan images of outcrops, and COLLADA models of mantle structures such as subducted lithospheric slabs all contribute to an engaging learning experience.
Google Earth and Geo Applications: A Toolset for Viewing Earth's Geospatial Information
NASA Astrophysics Data System (ADS)
Tuxen-Bettman, K.
2016-12-01
Earth scientists measure and derive fundamental data that can be of broad general interest to the public and policy makers. Yet, one of the challenges that has always faced the Earth science community is how to present their data and findings in an easy-to-use and compelling manner. Google's Geo Tools offer an efficient and dynamic way for scientists, educators, journalists and others to both access data and view or tell stories in a dynamic three-dimensional geospatial context. Google Earth in particular provides a dense canvas of satellite imagery on which rich vector and raster datasets can be viewed using the medium of Keyhole Markup Language (KML). Through KML, Google Earth can combine the analytical capabilities of Earth Engine, the collaborative mapping of My Maps, the storytelling of Tour Builder, and more to make Google's Geo Applications a coherent suite of tools for exploring our planet. https://earth.google.com/ https://earthengine.google.com/ https://mymaps.google.com/ https://tourbuilder.withgoogle.com/ https://www.google.com/streetview/
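As a reminder of what the KML medium mentioned above looks like, here is a minimal placemark document of the kind Google Earth loads directly; the names and coordinates are invented for the example.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <name>Example dataset</name>
    <Placemark>
      <name>Sample station</name>
      <description>A single observation point rendered on the virtual globe.</description>
      <Point>
        <!-- longitude,latitude,altitude -->
        <coordinates>-122.0822,37.4222,0</coordinates>
      </Point>
    </Placemark>
  </Document>
</kml>
```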
2013-08-09
CAPE CANAVERAL, Fla. – Google used an assortment of vehicles to precisely map NASA's Kennedy Space Center in Florida to be featured on the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Google used a car, tricycle and pushcart to maneuver around the center and through some of its facilities. Photo credit: Google/Wendy Wang
ERIC Educational Resources Information Center
Gross, Liz
2012-01-01
As a new year begins, higher education professionals who manage social media are getting to know the latest social network, Google+, and how they can best use Google+ Pages to advance their institutions. When Google+ first came on the scene in late June 2011, several institutions signed up and began using the service. Given the popularity of other…
Recent Advances in Geospatial Visualization with the New Google Earth
NASA Astrophysics Data System (ADS)
Anderson, J. C.; Poyart, E.; Yan, S.; Sargent, R.
2017-12-01
Google Earth's detailed, world-wide imagery and terrain data provide a rich backdrop for geospatial visualization at multiple scales, from global to local. The Keyhole Markup Language (KML) is an open standard that has been the primary way for users to author and share data visualizations in Google Earth. Despite its ease of use and flexibility for relatively small amounts of data, users can quickly run into difficulties and limitations working with large-scale or time-varying datasets using KML in Google Earth. Recognizing these challenges, we present our recent work toward extending Google Earth to be a more powerful data visualization platform. We describe a new KML extension to simplify the display of multi-resolution map tile pyramids - which can be created by analysis platforms like Google Earth Engine, or by a variety of other map tile production pipelines. We also describe how this implementation can pave the way to creating novel data visualizations by leveraging custom graphics shaders. Finally, we present our investigations into native support in Google Earth for data storage and transport formats that are well-suited for big raster and vector data visualization. Taken together, these capabilities make it easier to create and share new scientific data visualization experiences using Google Earth, and simplify the integration of Google Earth with existing map data products, services, and analysis pipelines.
Price, Richard; Marsh, Abbie J; Fisher, Marisa H
2018-03-01
Facilitating the use of public transportation enhances opportunities for independent living and competitive, community-based employment for individuals with intellectual and developmental disabilities (IDD). Four young adults with IDD were taught through total-task chaining to use the Google Maps application, a self-prompting, visual navigation system, to take the bus to locations around a college campus and the community. Three of four participants learned to use Google Maps to independently navigate public transportation. Google Maps may be helpful in supporting independent travel, highlighting the importance of future research in teaching navigation skills. Learning to independently use public transportation increases access to autonomous activities, such as opportunities to work and to attend postsecondary education programs on large college campuses. Individuals with IDD can be taught through chaining procedures to use the Google Maps application to navigate public transportation. Mobile map applications are an effective and functional modern tool that can be used to teach community navigation.
What Google Maps can do for biomedical data dissemination: examples and a design study.
Jianu, Radu; Laidlaw, David H
2013-05-04
Biologists often need to assess whether unfamiliar datasets warrant the time investment required for more detailed exploration. Basing such assessments on brief descriptions provided by data publishers is unwieldy for large datasets that contain insights dependent on specific scientific questions. Alternatively, using complex software systems for a preliminary analysis may be deemed as too time consuming in itself, especially for unfamiliar data types and formats. This may lead to wasted analysis time and discarding of potentially useful data. We present an exploration of design opportunities that the Google Maps interface offers to biomedical data visualization. In particular, we focus on synergies between visualization techniques and Google Maps that facilitate the development of biological visualizations which have both low-overhead and sufficient expressivity to support the exploration of data at multiple scales. The methods we explore rely on displaying pre-rendered visualizations of biological data in browsers, with sparse yet powerful interactions, by using the Google Maps API. We structure our discussion around five visualizations: a gene co-regulation visualization, a heatmap viewer, a genome browser, a protein interaction network, and a planar visualization of white matter in the brain. Feedback from collaborative work with domain experts suggests that our Google Maps visualizations offer multiple, scale-dependent perspectives and can be particularly helpful for unfamiliar datasets due to their accessibility. We also find that users, particularly those less experienced with computer use, are attracted by the familiarity of the Google Maps API. Our five implementations introduce design elements that can benefit visualization developers. We describe a low-overhead approach that lets biologists access readily analyzed views of unfamiliar scientific datasets. We rely on pre-computed visualizations prepared by data experts, accompanied by sparse and intuitive interactions, and distributed via the familiar Google Maps framework. Our contributions are an evaluation demonstrating the validity and opportunities of this approach, a set of design guidelines benefiting those wanting to create such visualizations, and five concrete example visualizations.
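The general approach described here, serving pre-rendered visualizations as map tiles through the Google Maps API, can be sketched as follows; the tile URL pattern and element IDs are hypothetical, not the authors' service.

```javascript
// Overlay pre-rendered visualization tiles (standard z/x/y pyramid) on a Google Map.
var vizLayer = new google.maps.ImageMapType({
  getTileUrl: function (coord, zoom) {
    // Hypothetical tile server for the pre-rendered imagery.
    return 'https://example.org/tiles/' + zoom + '/' + coord.x + '/' + coord.y + '.png';
  },
  tileSize: new google.maps.Size(256, 256),
  maxZoom: 8,
  name: 'Pre-rendered visualization'
});

var map = new google.maps.Map(document.getElementById('map'), {
  center: new google.maps.LatLng(0, 0),
  zoom: 2
});
map.overlayMapTypes.push(vizLayer);  // stack the visualization above the base map
```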
What Google Maps can do for biomedical data dissemination: examples and a design study
2013-01-01
Background Biologists often need to assess whether unfamiliar datasets warrant the time investment required for more detailed exploration. Basing such assessments on brief descriptions provided by data publishers is unwieldy for large datasets that contain insights dependent on specific scientific questions. Alternatively, using complex software systems for a preliminary analysis may be deemed as too time consuming in itself, especially for unfamiliar data types and formats. This may lead to wasted analysis time and discarding of potentially useful data. Results We present an exploration of design opportunities that the Google Maps interface offers to biomedical data visualization. In particular, we focus on synergies between visualization techniques and Google Maps that facilitate the development of biological visualizations which have both low-overhead and sufficient expressivity to support the exploration of data at multiple scales. The methods we explore rely on displaying pre-rendered visualizations of biological data in browsers, with sparse yet powerful interactions, by using the Google Maps API. We structure our discussion around five visualizations: a gene co-regulation visualization, a heatmap viewer, a genome browser, a protein interaction network, and a planar visualization of white matter in the brain. Feedback from collaborative work with domain experts suggests that our Google Maps visualizations offer multiple, scale-dependent perspectives and can be particularly helpful for unfamiliar datasets due to their accessibility. We also find that users, particularly those less experienced with computer use, are attracted by the familiarity of the Google Maps API. Our five implementations introduce design elements that can benefit visualization developers. Conclusions We describe a low-overhead approach that lets biologists access readily analyzed views of unfamiliar scientific datasets. We rely on pre-computed visualizations prepared by data experts, accompanied by sparse and intuitive interactions, and distributed via the familiar Google Maps framework. Our contributions are an evaluation demonstrating the validity and opportunities of this approach, a set of design guidelines benefiting those wanting to create such visualizations, and five concrete example visualizations. PMID:23642009
ERIC Educational Resources Information Center
Hsu, Hsiao-Ping; Tsai, Bor-Wen; Chen, Che-Ming
2018-01-01
Teaching high-school geomorphological concepts and topographic map reading entails many challenges. This research reports the applicability and effectiveness of Google Earth in teaching topographic map skills and geomorphological concepts, by a single teacher, in a one-computer classroom. Compared to learning via a conventional instructional…
PhyloGeoViz: a web-based program that visualizes genetic data on maps.
Tsai, Yi-Hsin E
2011-05-01
The first step of many population genetic studies is the simple visualization of allele frequencies on a landscape. This basic data exploration can be challenging without proprietary software, and the manual plotting of data is cumbersome and unfeasible at large sample sizes. I present an open source, web-based program that plots any kind of frequency or count data as pie charts in Google Maps (Google Inc., Mountain View, CA). Pie polygons are then exportable to Google Earth (Google Inc.), a free Geographic Information Systems platform. Import of genetic data into Google Earth allows phylogeographers access to a wealth of spatial information layers integral to forming hypotheses and understanding patterns in the data. © 2010 Blackwell Publishing Ltd.
Feature Positioning on Google Street View Panoramas
NASA Astrophysics Data System (ADS)
Tsai, V. J. D.; Chang, C.-T.
2012-07-01
Location-based services (LBS) on web-based maps and images have expanded rapidly since Google launched its Street View imaging service in 2007. This research employs the Google Maps API and Web Service, GAE for JAVA, AJAX, Proj4js, CSS and HTML in developing an internet platform for accessing the orientation parameters of Google Street View (GSV) panoramas in order to determine the three-dimensional position of features of interest that appear on two overlapping panoramas by geometric intersection. A pair of GSV panoramas was examined using known points located on the Library Building of National Chung Hsing University (NCHU), with root-mean-square errors of ±0.522 m, ±1.230 m, and ±5.779 m for intersection and ±0.142 m, ±1.558 m, and ±5.733 m for resection in X, Y, and h (elevation), respectively. Potential error sources in GSV positioning were analyzed, showing that the errors in the Google-provided GSV positional parameters dominate the errors in geometric intersection. The developed system is suitable for data collection in establishing LBS applications integrated with Google Maps and Google Earth for traffic sign and infrastructure inventory, by adding automatic extraction and matching techniques for points of interest (POI) from GSV panoramas.
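The geometric intersection step can be illustrated with a simplified planar sketch: two panorama stations and the horizontal azimuths to the same feature define two rays whose crossing gives the feature's horizontal position. This is an assumption-laden 2D illustration, not the authors' full 3D solution.

```javascript
// Planar intersection of two sight rays. Stations p1, p2 are {e, n} in metres (e.g. a local grid);
// azimuths are degrees clockwise from north. Returns {e, n} or null if the rays are near-parallel.
function intersectRays(p1, az1Deg, p2, az2Deg) {
  var a1 = az1Deg * Math.PI / 180, a2 = az2Deg * Math.PI / 180;
  var d1 = {e: Math.sin(a1), n: Math.cos(a1)};
  var d2 = {e: Math.sin(a2), n: Math.cos(a2)};
  var det = d1.e * (-d2.n) - (-d2.e) * d1.n;      // determinant of [d1 | -d2]
  if (Math.abs(det) < 1e-9) return null;          // lines of sight are (nearly) parallel
  var de = p2.e - p1.e, dn = p2.n - p1.n;
  var t1 = (de * (-d2.n) - (-d2.e) * dn) / det;   // Cramer's rule for p1 + t1*d1 = p2 + t2*d2
  return {e: p1.e + t1 * d1.e, n: p1.n + t1 * d1.n};
}
```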
ERIC Educational Resources Information Center
Lin, Yu-Tzu; Chang, Chia-Hu; Hou, Huei-Tse; Wu, Ke-Chou
2016-01-01
This study investigated the effectiveness of using Google Docs in collaborative concept mapping (CCM) by comparing it with a paper-and-pencil approach. A quasi-experimental study was conducted in a physics course. The control group drew concept maps using the paper-and-pencil method and face-to-face discussion, whereas the experimental group…
Usability analysis of indoor map application in a shopping centre
NASA Astrophysics Data System (ADS)
Dewi, R. S.; Hadi, R. K.
2018-04-01
Although indoor navigation is still new in Indonesia, its future development is very promising. Similar to the outdoor case, indoor navigation technology provides several important functions to support route and landmark finding. Furthermore, indoor navigation can also support public safety, especially during the evacuation of a building in a disaster. Indoor navigation technologies are commonly built as applications that users access through their smartphones, tablets, or personal computers. Therefore, a usability analysis is important to ensure that indoor navigation applications can be operated by users with full functionality. Among the several indoor map applications available in the market, this study chose to analyse indoor Google Maps because of its availability and popularity in Indonesia. The experiment to test indoor Google Maps was conducted in one of the biggest shopping centre buildings in Surabaya, Indonesia. Usability was measured using the System Usability Scale (SUS) questionnaire. The results showed that the SUS score of indoor Google Maps was below the average score of other mobile applications, indicating that users still had considerable difficulty operating and learning the features of indoor Google Maps.
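For reference, the SUS score used in this study is computed from ten 1-to-5 responses with the standard scoring rule; the example responses below are illustrative only.

```javascript
// Standard SUS scoring: odd items contribute (response - 1), even items (5 - response);
// the sum of contributions is multiplied by 2.5 to give a 0-100 score.
function susScore(responses) {  // responses: array of 10 values, each 1..5
  var sum = 0;
  for (var i = 0; i < 10; i++) {
    sum += (i % 2 === 0) ? responses[i] - 1   // items 1, 3, 5, 7, 9
                         : 5 - responses[i];  // items 2, 4, 6, 8, 10
  }
  return sum * 2.5;
}

console.log(susScore([3, 3, 3, 3, 3, 3, 3, 3, 3, 3])); // middling answers give a score of 50
```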
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-10
... explained in the legislative history of the Omnibus Trade and Competitiveness Act of 1988, the Department... Google Maps: https://maps.google.com . The rates were in effect prior to the POR, so we adjusted them to...
Positional Accuracy Assessment of Google Earth in Riyadh
NASA Astrophysics Data System (ADS)
Farah, Ashraf; Algarni, Dafer
2014-06-01
Google Earth is a virtual globe, map and geographical information program operated by Google. It maps the Earth by superimposing images obtained from satellite imagery, aerial photography and GIS onto a 3D globe. With millions of users all around the globe, Google Earth® has become the ultimate source of spatial data and information for private and public decision-support systems, besides many types and forms of social interaction. Many users, mostly in developing countries, are also using it for surveying applications, which raises questions about the positional accuracy of the Google Earth program. This research presents a small-scale assessment study of the positional accuracy of Google Earth® imagery in Riyadh, capital of the Kingdom of Saudi Arabia (KSA). The results show that the RMSE of the Google Earth imagery is 2.18 m and 1.51 m for the horizontal and height coordinates, respectively.
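The RMSE figures quoted here are the usual root-mean-square of the checkpoint misfits; a small sketch follows (the error values are illustrative, not the study's data).

```javascript
// Root-mean-square error of positional misfits (metres) between Google Earth
// coordinates and surveyed reference coordinates at a set of checkpoints.
function rmse(errors) {
  var sumSq = errors.reduce(function (s, e) { return s + e * e; }, 0);
  return Math.sqrt(sumSq / errors.length);
}

console.log(rmse([1.9, 2.4, 2.1, 2.6, 1.8]).toFixed(2)); // illustrative misfits only
```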
Google Earth mapping of damage from the Niigata-Ken-Chuetsu M6.6 earthquake of 16 July 2007
Kayen, Robert E.; Steele, WM. Clint; Collins, Brian; Walker, Kevin
2008-01-01
We describe the use of Google Earth during and after a large damaging earthquake that struck the central Japan coast on 16 July 2007 to collect and organize damage information and guide the reconnaissance activities. This software enabled greater real-time collaboration among scientists and engineers. After the field investigation, the Google Earth map was used as a final reporting product that was directly linked to the more traditional research report document. Finally, we analyze the use of the software within the context of a post-disaster reconnaissance investigation, and link it to student use of Google Earth in field situations.
2013-08-09
CAPE CANAVERAL, Fla. – As seen on Google Maps, the massive F-1 engines of the Saturn V's first stage are on display inside the Apollo/Saturn V Center at the Kennedy Space Center Visitor Complex. Each engine stands 19 feet tall with a diameter of more than 12 feet. The five engines on the first stage produced 7.5 million pounds of thrust at liftoff. The Saturn V was used to launch NASA's Apollo missions to the moon, which saw 12 astronauts land and work on the lunar surface. Google precisely mapped Kennedy Space Center and some of its historical facilities for the company's map page. Photo credit: Google/Wendy Wang
Using Mobile App Development Tools to Build a GIS Application
NASA Astrophysics Data System (ADS)
Mital, A.; Catchen, M.; Mital, K.
2014-12-01
Our group designed and built working web, Android, and iOS applications using different mapping libraries as bases on which to overlay fire data from NASA. The group originally planned to make app versions for Google Maps, Leaflet, and OpenLayers. However, because the Leaflet library did not properly load on Android, the group focused efforts on the other two mapping libraries. For Google Maps, the group first designed a UI for the web app and made a working version of the app. After updating the source of fire data to one which also provided historical fire data, the design had to be modified to include the extra data. After completing a working version of the web app, the group used WebView on Android, a built-in component which allowed porting the web app to Android without rewriting the code. Upon completing this, the group found Apple iOS devices had a similar capability, and so decided to add an iOS app to the project using a function similar to WebView. Alongside this effort, the group began implementing an OpenLayers fire map using a simpler UI. This web app was completed fairly quickly relative to Google Maps; however, it did not include functionality such as satellite imagery or searchable locations. The group finished the project with a working Android version of the Google Maps based app supporting API levels 14-19 and an OpenLayers based app supporting API levels 8-19, as well as a Google Maps based iOS app supporting both old and new screen formats. This project was implemented by high school and college students under an SGT Inc. STEM internship program.
Visualizing Cross-sectional Data in a Real-World Context
NASA Astrophysics Data System (ADS)
Van Noten, K.; Lecocq, T.
2016-12-01
If you could fly around your research results in three dimensions, wouldn't you like to do it? Visualizing research results properly during scientific presentations already does half the job of informing the public about the geographic framework of your research. Many scientists use the Google Earth™ mapping service (V7.1.2.2041) because it's a great interactive mapping tool for assigning geographic coordinates to individual data points, localizing a research area, and draping maps of results over Earth's surface for 3D visualization. However, visualizations of research results in vertical cross-sections are often not shown simultaneously with the maps in Google Earth. A few tutorials and programs to display cross-sectional data in Google Earth do exist, and the workflow is rather simple. By importing a cross-sectional figure into the free software SketchUp Make [Trimble Navigation Limited, 2016], any spatial model can be exported to a vertical figure in Google Earth. This presentation provides a clear workflow/tutorial showing how to display cross-sections manually in Google Earth. No software skills or programming are required. It is very easy to use, offers great possibilities for teaching and allows fast figure manipulation in Google Earth. The full workflow can be found in "Van Noten, K. 2016. Visualizing Cross-Sectional Data in a Real-World Context. EOS, Transactions AGU, 97, 16-19". The video tutorial can be found here: https://www.youtube.com/watch?v=Tr8LwFJ4RYU Figure: Cross-sectional Research Examples Illustrated in Google Earth
MaRGEE: Move and Rotate Google Earth Elements
NASA Astrophysics Data System (ADS)
Dordevic, Mladen M.; Whitmeyer, Steven J.
2015-12-01
Google Earth is recognized as a highly effective visualization tool for geospatial information. However, there remain serious limitations that have hindered its acceptance as a tool for research and education in the geosciences. One significant limitation is the inability to translate or rotate geometrical elements on the Google Earth virtual globe. Here we present a new JavaScript web application to "Move and Rotate Google Earth Elements" (MaRGEE). MaRGEE includes tools to simplify, translate, and rotate elements, add intermediate steps to a transposition, and batch process multiple transpositions. The transposition algorithm uses spherical geometry calculations, such as the haversine formula, to accurately reposition groups of points, paths, and polygons on the Google Earth globe without distortion. Due to the imminent deprecation of the Google Earth API and browser plugin, MaRGEE uses a Google Maps interface to facilitate and illustrate the transpositions. However, the inherent spatial distortions that result from the Google Maps Web Mercator projection are not apparent once the transposed elements are saved as a KML file and opened in Google Earth. Potential applications of the MaRGEE toolkit include tectonic reconstructions, the movements of glaciers or thrust sheets, and time-based animations of other large- and small-scale geologic processes.
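The haversine formula named in this abstract gives the great-circle distance between two points on a sphere; a minimal sketch of the standard form follows, with the mean Earth radius assumed as the default.

```javascript
// Great-circle distance (metres) between two latitude/longitude points (degrees), haversine formula.
function haversineDistance(lat1, lon1, lat2, lon2, radius) {
  var R = radius || 6371000;  // mean Earth radius in metres by default
  var toRad = function (d) { return d * Math.PI / 180; };
  var dLat = toRad(lat2 - lat1);
  var dLon = toRad(lon2 - lon1);
  var a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
          Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) *
          Math.sin(dLon / 2) * Math.sin(dLon / 2);
  return 2 * R * Math.asin(Math.sqrt(a));
}
```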
The Adversarial Route Analysis Tool: A Web Application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casson, William H. Jr.
2012-08-02
The Adversarial Route Analysis Tool is a kind of Google Maps for adversaries. It is a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.
NASA Astrophysics Data System (ADS)
Minnett, R. C.; Koppers, A. A.; Staudigel, D.; Staudigel, H.
2008-12-01
EarthRef.org is a comprehensive and convenient resource for Earth Science reference data and models. It encompasses four main portals: the Geochemical Earth Reference Model (GERM), the Magnetics Information Consortium (MagIC), the Seamount Biogeosciences Network (SBN), and the Enduring Resources for Earth Science Education (ERESE). Their underlying databases are publicly available and the scientific community has contributed widely and is urged to continue to do so. However, the net result is a vast and largely heterogeneous warehouse of geospatial data, ranging from carefully prepared maps of seamounts to geochemical data/metadata, daily reports from seagoing expeditions, large volumes of raw and processed multibeam data, images of paleomagnetic sampling sites, etc. This presents a considerable obstacle for integrating other rich media content, such as videos, images, data files, cruise tracks, and interoperable database results, without overwhelming the web user. The four EarthRef.org portals clearly lend themselves to a more intuitive user interface and have, therefore, been an invaluable test bed for the design and implementation of FlashMap, a versatile KML-driven geospatial browser written for reliability and speed in Adobe Flash. FlashMap allows layers of content to be loaded and displayed over a streaming high-resolution map which can be zoomed and panned similarly to Google Maps and Google Earth. Many organizations, from National Geographic to the USGS, have begun using Google Earth software to display geospatial content. However, Google Earth, as a desktop application, does not integrate cleanly with existing websites, requiring the user to navigate away from the browser and focus on a separate application, and Google Maps, written in JavaScript, does not scale up reliably to large datasets. FlashMap remedies these problems as a web-based application that allows for seamless integration of the real-time display power of Google Earth and the flexibility of the web without losing scalability and control of the base maps. Our Flash-based application is fully compatible with KML (Keyhole Markup Language) 2.2, the most recent iteration of KML, allowing users with existing Google Earth KML files to effortlessly display their geospatial content embedded in a web page. As a test case for FlashMap, the annual Iron-Oxidizing Microbial Observatory (FeMO) dive cruise to the Loihi Seamount, in conjunction with data available from ongoing and published FeMO laboratory studies, showcases the flexibility of this single web-based application. With a KML 2.2 compatible web service providing the content, any database can display results in FlashMap. The user can then hide and show multiple layers of content, potentially from several data sources, and rapidly digest a vast quantity of information to narrow the search results. This flexibility gives experienced users the ability to drill down to exactly the record they are looking for (see SERC at Carleton College's educational application of FlashMap at http://serc.carleton.edu/sp/erese/activities/22223.html) and allows users familiar with Google Earth the ability to load and view geospatial data content within a browser from any computer with an internet connection.
KML-based teaching lessons developed by Google in partnership with the University of Alaska.
NASA Astrophysics Data System (ADS)
Kolb, E. J.; Bailey, J.; Bishop, A.; Cain, J.; Goddard, M.; Hurowitz, K.; Kennedy, K.; Ornduff, T.; Sfraga, M.; Wernecke, J.
2008-12-01
The focus of Google's Geo Education outreach efforts (http://www.google.com/educators/geo.html) is on helping primary, secondary, and post-secondary educators incorporate Google Earth and Sky, Google Maps, and SketchUp into their classroom lessons. In this poster and demonstration, we will show our KML-based science lessons that were developed in partnership with the University of Alaska and used by our team in classroom teaching with Alaskan high-school students.
Preservation in the Age of Google: Digitization, Digital Preservation, and Dilemmas
ERIC Educational Resources Information Center
Conway, Paul
2010-01-01
The cultural heritage preservation community now functions largely within the environment of digital technologies. This article begins by juxtaposing definitions of the terms "digitization for preservation" and "digital preservation" within a sociotechnical environment for which Google serves as a relevant metaphor. It then reviews two reports…
Rousselet, Jérôme; Imbert, Charles-Edouard; Dekri, Anissa; Garcia, Jacques; Goussard, Francis; Vincent, Bruno; Denux, Olivier; Robinet, Christelle; Dorkeld, Franck; Roques, Alain; Rossi, Jean-Pierre
2013-01-01
Mapping species spatial distribution using spatial inference and prediction requires a lot of data. Occurrence data are generally not easily available from the literature and are very time-consuming to collect in the field. For that reason, we designed a survey to explore to what extent large-scale databases such as Google Maps and Google Street View could be used to derive valid occurrence data. We worked with the Pine Processionary Moth (PPM) Thaumetopoea pityocampa because the larvae of that moth build silk nests that are easily visible. The presence of the species at one location can therefore be inferred from visual records derived from the panoramic views available in Google Street View. We designed a standardized procedure allowing evaluation of the presence of the PPM on a sampling grid covering the landscape under study. The outputs were compared to field data. We investigated two landscapes using grids of different extent and mesh size. Data derived from Google Street View were highly similar to field data in the large-scale analysis based on a square grid with a mesh of 16 km (96% of matching records). Using a 2 km mesh size led to a strong divergence between field and Google-derived data (46% of matching records). We conclude that the Google database might provide useful occurrence data for mapping the distribution of species whose presence can be visually evaluated, such as the PPM. However, the accuracy of the output strongly depends on the spatial scales considered and on the sampling grid used. Other factors, such as the coverage of the Google Street View network with regard to sampling grid size and the spatial distribution of host trees with regard to the road network, may also be determinant.
Dekri, Anissa; Garcia, Jacques; Goussard, Francis; Vincent, Bruno; Denux, Olivier; Robinet, Christelle; Dorkeld, Franck; Roques, Alain; Rossi, Jean-Pierre
2013-01-01
Mapping species spatial distribution using spatial inference and prediction requires a lot of data. Occurrence data are generally not easily available from the literature and are very time-consuming to collect in the field. For that reason, we designed a survey to explore to what extent large-scale databases such as Google Maps and Google Street View could be used to derive valid occurrence data. We worked with the Pine Processionary Moth (PPM) Thaumetopoea pityocampa because the larvae of that moth build silk nests that are easily visible. The presence of the species at one location can therefore be inferred from visual records derived from the panoramic views available in Google Street View. We designed a standardized procedure allowing evaluation of the presence of the PPM on a sampling grid covering the landscape under study. The outputs were compared to field data. We investigated two landscapes using grids of different extent and mesh size. Data derived from Google Street View were highly similar to field data in the large-scale analysis based on a square grid with a mesh of 16 km (96% of matching records). Using a 2 km mesh size led to a strong divergence between field and Google-derived data (46% of matching records). We conclude that the Google database might provide useful occurrence data for mapping the distribution of species whose presence can be visually evaluated, such as the PPM. However, the accuracy of the output strongly depends on the spatial scales considered and on the sampling grid used. Other factors, such as the coverage of the Google Street View network with regard to sampling grid size and the spatial distribution of host trees with regard to the road network, may also be determinant. PMID:24130675
2013-08-09
CAPE CANAVERAL, Fla. – As seen on Google Maps, a Space Shuttle Main Engine, or SSME, stands inside the Engine Shop at Orbiter Processing Facility 3 at NASA's Kennedy Space Center. Each orbiter used three of the engines during launch and ascent into orbit. The engines burn super-cold liquid hydrogen and liquid oxygen and each one produces 155,000 pounds of thrust. The engines, known in the industry as RS-25s, could be reused on multiple shuttle missions. They will be used again later this decade for NASA's Space Launch System rocket. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang
Kaewpitoon, Soraya J; Rujirakul, Ratana; Joosiri, Apinya; Jantakate, Sirinun; Sangkudloa, Amnat; Kaewthani, Sarochinee; Chimplee, Kanokporn; Khemplila, Kritsakorn; Kaewpitoon, Natthawut
2016-01-01
Cholangiocarcinoma (CCA) is a serious problem in Thailand, particularly in the northeastern and northern regions. A database of the population at risk is required for monitoring, surveillance, home health care, and home visits. Therefore, this study aimed to develop a geographic information system (GIS) database and Google map of the population at risk of CCA in Mueang Yang district, Nakhon Ratchasima province, northeastern Thailand, between June and October 2015. The population at risk was screened using the Korat CCA verbal screening test (KCVST). Software included Microsoft Excel, ArcGIS, and Google Maps. The secondary data included village points, sub-district boundaries, district boundaries, and hospital points in Mueang Yang district, which were used to create the spatial database. The populations at risk for CCA and opisthorchiasis were used to create an attribute database. Data were transferred to WGS84 UTM Zone 48. After the conversion, all of the data were imported into Google Earth using the online web page www.earthpoint.us. Some 222 of the 4,800 people at risk for CCA constituted a high-risk group. The geo-visual display is available at www.google.com/maps/d/u/0/edit?mid=zPxtcHv_iDLo.kvPpxl5mAs90&hl=th. The geo-visual display comprises 5 layers: layer 1, village locations and the number of people at risk for CCA; layer 2, sub-district health promotion hospitals in Mueang Yang district and the number of opisthorchiasis cases; layer 3, sub-districts and the number of people at risk for CCA; layer 4, the district hospital and the numbers of people at risk for CCA and opisthorchiasis cases; and layer 5, the district and the numbers of people at risk for CCA and opisthorchiasis cases. This GIS database and Google map production process is suitable for further monitoring, surveillance, and home health care for CCA sufferers.
Google's Evolution Leads to Library Revolution
ERIC Educational Resources Information Center
Jaworski, Susan; Sullivan, Roberta
2011-01-01
Do library catalogs compete with Google or is it the other way around? We know which came first but which will finish in the end? Only trained library professionals were considered qualified to develop reliable catalog records. However, with the increased sophistication of search engines, we are beginning to realize that a collaborative effort may…
Google Voice: Connecting Your Telephone to the 21st Century
ERIC Educational Resources Information Center
Johnson, Benjamin E.
2010-01-01
The foundation of the mighty Google Empire rests upon an algorithm that connects people to information--things such as websites, maps, and restaurant reviews. Lately it seems that people are less interested in connecting with information than they are with connecting to one another, which begs the question, "Is Facebook the new Google?" Given this…
Cultural Adventures for the Google[TM] Generation
ERIC Educational Resources Information Center
Dann, Tammy
2010-01-01
Google Earth is a computer program that allows users to view the Earth through satellite imagery and maps, to see cities from above and through street views, and to search for addresses and browse locations. Many famous buildings and structures from around the world have detailed 3D views accessible on Google Earth. It is possible to explore the…
Visualizing Geographic Data in Google Earth for Education and Outreach
NASA Astrophysics Data System (ADS)
Martin, D. J.; Treves, R.
2008-12-01
Google Earth is an excellent tool to help students and the public visualize scientific data, since with little technical skill scientific content can be shown in three dimensions against a background of remotely sensed imagery. It therefore has a variety of uses in university education and as a tool for public outreach. However, in both situations it is of limited value if it is only used to attract attention with flashy three-dimensional animations. In this poster we shall illustrate several applications that represent what we believe is good educational practice. The first example shows how the combination of a floor map and a projection of Google Earth on a screen can be used to produce active learning. Students are asked to imagine where they would build a house on Big Island Hawaii in order to avoid volcanic hazards. In the second example Google Earth is used to illustrate evidence over a range of scales in a description of Lake Agassiz flood events which would be more difficult to comprehend in a traditional paper-based format. In the final example a simple text manipulation application "TMapper" is used to change the color palette of a thematic map generated by the students in Google Earth to teach them about the use of color in map design.
[Who Hits the Mark? A Comparative Study of the Free Geocoding Services of Google and OpenStreetMap].
Lemke, D; Mattauch, V; Heidinger, O; Hense, H W
2015-09-01
Geocoding, the process of converting textual information (addresses) into geographic coordinates, is increasingly used in public health/epidemiological research and practice. To date, little attention has been paid to geocoding quality and its impact on different types of spatially-related health studies. The primary aim of this study was to compare 2 freely available geocoding services (Google and OpenStreetMap) with regard to matching rate (percentage of address records capable of being geocoded) and positional accuracy (distance between geocodes and the ground truth locations). Residential addresses were geocoded by the NRW state office for information and technology and were considered the reference data (gold standard). The gold standard included the coordinates, the quality of the addresses (4 categories), and a binary urbanity indicator based on the CORINE land cover data. After stratification for address quality and the urbanity indicator, 2 500 addresses were randomly sampled per stratum (approximately 20 000 addresses in total). These address samples were geocoded using the geocoding services from Google and OSM. In general, both geocoding services showed a decrease in the matching rate with decreasing address quality and urbanity. Google consistently showed higher completeness than OSM (>93 vs. >82%). Also, the cartographic confounding between urban and rural regions was less distinct with Google's geocoding API. Regarding the positional accuracy of the geo-coordinates, Google also showed the smallest deviations from the reference coordinates, with a median of <9 vs. <175.8 m. The cumulative density function derived from the positional accuracy showed that nearly 95% of the addresses for Google, and 50% for OSM, were geocoded within <50 m of their reference coordinates. The geocoding API from Google is superior to OSM regarding completeness and positional accuracy of the geocoded addresses. On the other hand, Google has several restrictions, such as the limitation of requests to 2 500 addresses per 24 h and the presentation of the results exclusively on Google Maps, which may complicate its use for scientific purposes. © Georg Thieme Verlag KG Stuttgart · New York.
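A comparison like the one described can be scripted against the two services' public HTTP endpoints; the sketch below shows the general idea only (the API key is a placeholder, the response handling is simplified, and both services impose usage limits and terms such as those the study notes).

```javascript
// Geocode one address with Google's Geocoding API and OSM's Nominatim, returning the
// first candidate from each. Illustrative sketch, not the study's pipeline.
async function geocodeBoth(address, googleApiKey) {
  var g = await fetch('https://maps.googleapis.com/maps/api/geocode/json?address=' +
                      encodeURIComponent(address) + '&key=' + googleApiKey)
                .then(function (r) { return r.json(); });
  var o = await fetch('https://nominatim.openstreetmap.org/search?format=json&limit=1&q=' +
                      encodeURIComponent(address))
                .then(function (r) { return r.json(); });
  return {
    google: g.results && g.results.length ? g.results[0].geometry.location : null, // {lat, lng}
    osm: o.length ? {lat: +o[0].lat, lng: +o[0].lon} : null
  };
}
```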
Keemei: cloud-based validation of tabular bioinformatics file formats in Google Sheets.
Rideout, Jai Ram; Chase, John H; Bolyen, Evan; Ackermann, Gail; González, Antonio; Knight, Rob; Caporaso, J Gregory
2016-06-13
Bioinformatics software often requires human-generated tabular text files as input and has specific requirements for how those data are formatted. Users frequently manage these data in spreadsheet programs, which is convenient for researchers who are compiling the requisite information because the spreadsheet programs can easily be used on different platforms including laptops and tablets, and because they provide a familiar interface. It is increasingly common for many different researchers to be involved in compiling these data, including study coordinators, clinicians, lab technicians and bioinformaticians. As a result, many research groups are shifting toward using cloud-based spreadsheet programs, such as Google Sheets, which support the concurrent editing of a single spreadsheet by different users working on different platforms. Most of the researchers who enter data are not familiar with the formatting requirements of the bioinformatics programs that will be used, so validating and correcting file formats is often a bottleneck prior to beginning bioinformatics analysis. We present Keemei, a Google Sheets Add-on, for validating tabular files used in bioinformatics analyses. Keemei is available free of charge from Google's Chrome Web Store. Keemei can be installed and run on any web browser supported by Google Sheets. Keemei currently supports the validation of two widely used tabular bioinformatics formats, the Quantitative Insights into Microbial Ecology (QIIME) sample metadata mapping file format and the Spatially Referenced Genetic Data (SRGD) format, but is designed to easily support the addition of others. Keemei will save researchers time and frustration by providing a convenient interface for tabular bioinformatics file format validation. By allowing everyone involved with data entry for a project to easily validate their data, it will reduce the validation and formatting bottlenecks that are commonly encountered when human-generated data files are first used with a bioinformatics system. Simplifying the validation of essential tabular data files, such as sample metadata, will reduce common errors and thereby improve the quality and reliability of research outcomes.
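The sketch below is a generic illustration of the kinds of checks such a validator performs on a QIIME 1-style sample metadata mapping file (unique sample IDs, no empty cells, IDs restricted to alphanumerics and periods); it is not Keemei's implementation, and the file name is a placeholder.

```python
# Generic sketch of tabular metadata validation of the kind Keemei performs
# (not Keemei's implementation). Checks a tab-separated, QIIME 1-style
# sample metadata mapping file for a missing '#SampleID' header, duplicate
# sample IDs, invalid ID characters, and empty cells.
import csv
import re

VALID_ID = re.compile(r"^[a-zA-Z0-9.]+$")

def validate_mapping_file(path):
    errors = []
    with open(path, newline="") as fh:
        rows = list(csv.reader(fh, delimiter="\t"))
    if not rows:
        return ["file is empty"]
    header, records = rows[0], rows[1:]
    if not header or header[0] != "#SampleID":
        errors.append("row 1: first column header must be '#SampleID'")
    seen = set()
    for i, row in enumerate(records, start=2):
        sid = row[0] if row else ""
        if not VALID_ID.match(sid):
            errors.append(f"row {i}: sample ID {sid!r} has invalid characters")
        if sid in seen:
            errors.append(f"row {i}: duplicate sample ID {sid!r}")
        seen.add(sid)
        for j, cell in enumerate(row):
            if cell.strip() == "":
                errors.append(f"row {i}, column {j + 1}: empty cell")
    return errors

for problem in validate_mapping_file("sample_metadata.tsv"):
    print(problem)
```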
2013-08-09
CAPE CANAVERAL, Fla. – As seen on Google Maps, the Rotating Service Structure at Launch Complex 39A at NASA's Kennedy Space Center housed space shuttle payloads temporarily so they could be loaded inside the 60-foot-long cargo bay of a shuttle before launch. The RSS, as the structure was known, was hinged to the Fixed Service Structure on one side and rolled on a rail on the other. As its name suggests, the enclosed facility would rotate into place around the shuttle as it stood at the launch pad. Once in place, the RSS protected the shuttle and its cargo. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang
ERIC Educational Resources Information Center
Fluke, Christopher J.
2009-01-01
I report on a pilot study on the use of Google Maps to provide virtual field trips as a component of a wholly online graduate course on the history of astronomy. The Astronomical Tourist Web site (http://astronomy.swin.edu.au/sao/tourist), themed around the role that specific locations on Earth have contributed to the development of astronomical…
Spatio-temporal Change Patterns of Tropical Forests from 2000 to 2014 Using MOD09A1 Dataset
NASA Astrophysics Data System (ADS)
Qin, Y.; Xiao, X.; Dong, J.
2016-12-01
Large-scale deforestation and forest degradation in the tropical region have resulted in extensive carbon emissions and biodiversity loss. However, restricted by the availability of good-quality observations, large uncertainty exists in mapping the spatial distribution of forests and their spatio-temporal changes. In this study, we proposed a pixel- and phenology-based algorithm to identify and map annual tropical forests from 2000 to 2014, using the 8-day, 500-m MOD09A1 (v005) product, with the support of Google cloud computing (Google Earth Engine). A temporal filter was applied to reduce random noise and to identify the spatio-temporal changes of forests. We then built a confusion matrix and assessed the accuracy of the annual forest maps based on ground reference data interpreted from high-spatial-resolution images in Google Earth. The resultant forest maps show consistent forest/non-forest, forest loss, and forest gain patterns in the pan-tropical zone during 2000-2014. The proposed algorithm shows the potential for tropical forest mapping, and the resultant forest maps are important for the estimation of carbon emissions and biodiversity loss.
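As a minimal sketch of working with this dataset in Google Earth Engine (Python API), the snippet below builds an annual maximum-NDVI composite from MOD09A1 and applies a simple threshold forest mask. It is not the authors' pixel- and phenology-based algorithm; the collection version (006 rather than the paper's 005), the NDVI threshold, and the study region are illustrative assumptions.

```python
# Minimal Earth Engine (Python API) sketch: annual NDVI composite from
# MOD09A1 and a simple threshold forest mask. Illustration only, not the
# authors' pixel- and phenology-based algorithm; collection version,
# threshold, and AOI are placeholders.
import ee
ee.Initialize()

region = ee.Geometry.Rectangle([100.0, -5.0, 105.0, 0.0])  # placeholder AOI

def add_ndvi(img):
    ndvi = img.normalizedDifference(["sur_refl_b02", "sur_refl_b01"]).rename("NDVI")
    return img.addBands(ndvi)

def annual_forest_mask(year):
    col = (ee.ImageCollection("MODIS/006/MOD09A1")   # v006 here; the paper used v005
           .filterDate(f"{year}-01-01", f"{year}-12-31")
           .filterBounds(region)
           .map(add_ndvi))
    # Use the annual maximum NDVI as a crude phenology summary.
    max_ndvi = col.select("NDVI").max()
    return max_ndvi.gte(0.7).rename("forest").set("year", year)  # illustrative threshold

masks = ee.ImageCollection([annual_forest_mask(y) for y in range(2000, 2015)])
print(masks.size().getInfo(), "annual forest masks built")
```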
Masthi, N R Ramesh; Madhusudan, M; Puthussery, Yannick P
2015-11-01
The global positioning system (GPS) technology along with Google Earth is used to measure (spatially map) the accurate distribution of morbidity and mortality and to plan interventions in the community. We used this technology to find out its role in the investigation of a cholera outbreak, and also to identify the cause of the outbreak. This study was conducted in a village near Bengaluru, Karnataka in June 2013 during a cholera outbreak. A house-to-house survey was done to identify acute watery diarrhoea cases. A handheld GPS receiver was used to record north and east coordinates of the households of cases, and these values were subsequently plotted on a Google Earth map. Water samples were collected from suspected sources for microbiological analysis. A total of 27 cases of acute watery diarrhoea were reported. Fifty per cent of cases were in the age group of 14-44 yr and one death was reported. GPS technology and Google Earth accurately described the locations of case households, and the spot map generated showed clustering of cases around the suspected water sources. The attack rate was 6.92 per cent and the case fatality rate was 3.7 per cent. Water samples collected from suspected sources showed the presence of Vibrio cholerae O1 Ogawa. GPS technology and Google Earth were easy to use and helpful for accurately pinpointing the locations of case households, constructing the spot map, and following up cases. The outbreak was found to be due to contamination of drinking water sources.
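A small sketch of the mapping step follows: writing household GPS coordinates of cases to a KML file that Google Earth can open as a spot map. The coordinates are placeholders, not data from the outbreak.

```python
# Sketch: write household case coordinates (e.g., from a handheld GPS
# receiver) to a KML file that Google Earth can open as a spot map.
# The coordinates below are placeholders, not data from the outbreak.
cases = [
    ("Case 01", 77.5946, 13.0827),   # (name, longitude, latitude)
    ("Case 02", 77.5952, 13.0831),
    ("Case 03", 77.5939, 13.0825),
]

placemarks = "\n".join(
    f"""  <Placemark>
    <name>{name}</name>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>"""
    for name, lon, lat in cases
)

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
<Document>
  <name>Cholera outbreak spot map</name>
{placemarks}
</Document>
</kml>"""

with open("spot_map.kml", "w") as fh:
    fh.write(kml)
```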
2013-08-09
CAPE CANAVERAL, Fla. – As seen on Google Maps, the view from the top of the Fixed Service Structure at Launch Complex 39A at NASA's Kennedy Space Center. The FSS, as the structure is known, is 285 feet high and overlooks the Rotating Service Structure that was rolled into place when a space shuttle was at the pad. The path taken by NASA's massive crawler-transporters, which carried the shuttle stack 3 miles from the Vehicle Assembly Building, is also visible leading up to the launch pad. In the distance are the launch pads and support structures at Cape Canaveral Air Force Station for the Atlas V, Delta IV, and Falcon 9 rockets. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang
Assessing species habitat using Google Street View: a case study of cliff-nesting vultures.
Olea, Pedro P; Mateo-Tomás, Patricia
2013-01-01
The assessment of a species' habitat is a crucial issue in ecology and conservation. While the collection of habitat data has been boosted by the availability of remote sensing technologies, certain habitat types still have to be surveyed through costly, on-ground surveys, limiting study over large areas. Cliffs are ecosystems that provide habitat for a rich biodiversity, especially raptors. Because of their principally vertical structure, however, cliffs are not easy to study by remote sensing technologies, posing a challenge for many researchers and managers working with cliff-related biodiversity. We explore the feasibility of Google Street View, a freely available on-line tool, to remotely identify and assess the nesting habitat of two cliff-nesting vultures (the griffon vulture and the globally endangered Egyptian vulture) in northwestern Spain. Two main uses of Google Street View for ecologists and conservation biologists were evaluated: i) remotely identifying a species' potential habitat and ii) extracting fine-scale habitat information. Google Street View imagery covered 49% (1,907 km) of the roads of our study area (7,000 km²). The potential visibility covered by on-ground surveys was significantly greater (mean: 97.4%) than that of Google Street View (48.1%). However, incorporating Google Street View into the vultures' habitat survey would save, on average, 36% in time and 49.5% in funds with respect to the on-ground survey only. The ability of Google Street View to identify cliffs (overall accuracy = 100%) outperformed the classification maps derived from digital elevation models (DEMs) (62-95%). Nonetheless, high-performance DEM maps may be useful to compensate for Google Street View's coverage limitations. Through Google Street View we could examine 66% of the vultures' nesting cliffs existing in the study area (n = 148): 64% for griffon vultures and 65% for Egyptian vultures. It also allowed us to extract fine-scale features of the cliffs. This World Wide Web-based methodology may be a useful, complementary tool to remotely map and assess the potential habitat of cliff-dependent biodiversity over large geographic areas, saving survey-related costs.
A method for creating a three-dimensional model from published geologic maps and cross sections
Walsh, Gregory J.
2009-01-01
This brief report presents a relatively inexpensive and rapid method for creating a 3D model of geology from published quadrangle-scale maps and cross sections using Google Earth and Google SketchUp software. An example from the Green Mountains of Vermont, USA, is used to illustrate the step-by-step methods used to create such a model. A second example is provided from the Jebel Saghro region of the Anti-Atlas Mountains of Morocco. The report was published to help enhance the public's ability to use and visualize geologic map data.
The impact of geo-tagging on the photo industry and creating revenue streams
NASA Astrophysics Data System (ADS)
Richter, Rolf; Böge, Henning; Weckmann, Christoph; Schloen, Malte
2010-02-01
Internet geo and mapping services like Google Maps, Google Earth, and Microsoft Bing Maps have reinvented the use of geographical information and have reached enormous popularity. In addition, location technologies like GPS have become affordable and are now being integrated into many camera phones. GPS is also available for standalone cameras as an add-on product or integrated into the camera itself. These developments enable new products for the photo industry and enhance existing ones. New commercial opportunities have been identified in the areas of photo hardware, internet/software, and photo finishing.
Google Mercury: The Launch of a New Planet
NASA Astrophysics Data System (ADS)
Hirshon, B.; Chapman, C. R.; Edmonds, J.; Goldstein, J.; Hallau, K. G.; Solomon, S. C.; Vanhala, H.; Weir, H. M.; Messenger Education; Public Outreach Epo Team
2010-12-01
The NASA MESSENGER mission’s Education and Public Outreach (EPO) Team, in cooperation with Google, Inc., has launched Google Mercury, an immersive new environment on the Google Earth platform. Google Mercury features hundreds of surface features, most of them newly revealed by the three flybys of the innermost planet by the MESSENGER spacecraft. As with Google Earth, Google Mercury is available on line at no cost. This presentation will demonstrate how our team worked with Google staff, features we incorporated, how games can be developed within the Google Earth platform, and how others can add tours, games, and other educational features. Finally, we will detail new enhancements to be added once MESSENGER enters into orbit about Mercury in March 2011 and begins sending back compelling images and other global data sets on a daily basis. The MESSENGER EPO Team comprises individuals from the American Association for the Advancement of Science (AAAS); Carnegie Academy for Science Education (CASE); Center for Educational Resources (CERES) at Montana State University (MSU) - Bozeman; National Center for Earth and Space Science Education (NCESSE); Johns Hopkins University Applied Physics Laboratory (JHU/APL); National Air and Space Museum (NASM); Science Systems and Applications, Inc. (SSAI); and Southwest Research Institute (SwRI). Screen shot of Google Mercury as a work in progress
Web GIS in practice V: 3-D interactive and real-time mapping in Second Life
Boulos, Maged N Kamel; Burden, David
2007-01-01
This paper describes technologies from Daden Limited for geographically mapping and accessing live news stories/feeds, as well as other real-time, real-world data feeds (e.g., Google Earth KML feeds and GeoRSS feeds) in the 3-D virtual world of Second Life, by plotting and updating the corresponding Earth location points on a globe or some other suitable form (in-world), and further linking those points to relevant information and resources. This approach enables users to visualise, interact with, and even walk or fly through, the plotted data in 3-D. Users can also do the reverse: put pins on a map in the virtual world, and then view the data points on the Web in Google Maps or Google Earth. The technologies presented thus serve as a bridge between mirror worlds like Google Earth and virtual worlds like Second Life. We explore the geo-data display potential of virtual worlds and their likely convergence with mirror worlds in the context of the future 3-D Internet or Metaverse, and reflect on the potential of such technologies and their future possibilities, e.g. their use to develop emergency/public health virtual situation rooms to effectively manage emergencies and disasters in real time. The paper also covers some of the issues associated with these technologies, namely user interface accessibility and individual privacy. PMID:18042275
Fazeli Dehkordy, Soudabeh; Carlos, Ruth C; Hall, Kelli S; Dalton, Vanessa K
2014-09-01
Millions of people use online search engines every day to find health-related information and voluntarily share their personal health status and behaviors on various Web sites. Thus, data from tracking online information seekers' behavior offer potential opportunities for use in public health surveillance and research. Google Trends is a feature of Google which allows Internet users to graph the frequency of searches for a single term or phrase over time or by geographic region. We used Google Trends to describe patterns of information-seeking behavior on the subject of dense breasts and to examine their correlation with the passage or introduction of dense breast notification legislation. To capture the temporal variations of information seeking about dense breasts, the Web search query "dense breast" was entered in the Google Trends tool. We then mapped the dates of legislative actions regarding dense breasts that received widespread coverage in the lay media to information-seeking trends about dense breasts over time. Newsworthy events and legislative actions appear to correlate well with peaks in search volume for "dense breast". Geographic regions with the highest search volumes have passed, denied, or are currently considering dense breast legislation. Our study demonstrated that legislative actions and the respective news coverage correlate with increases in information seeking for "dense breast" on Google, suggesting that Google Trends has the potential to serve as a data source for policy-relevant research. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
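For readers who want to retrieve a comparable search-interest series programmatically, the sketch below uses the unofficial third-party pytrends library (not a Google product, and its interface may change); the timeframe and geography are illustrative, and the study itself used the Google Trends web tool.

```python
# Sketch: retrieve Google Trends search interest for "dense breast" using the
# unofficial third-party `pytrends` library (not a Google product; interface
# may change). Timeframe and geography are illustrative.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(kw_list=["dense breast"],
                       timeframe="2012-01-01 2014-12-31", geo="US")

over_time = pytrends.interest_over_time()                    # relative search volume over time
by_region = pytrends.interest_by_region(resolution="REGION") # relative volume by US state

print(over_time["dense breast"].idxmax(), "date of peak search interest")
print(by_region.sort_values("dense breast", ascending=False).head())
```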
Google's Geo Education Outreach: Results and Discussion of Outreach Trip to Alaskan High Schools.
NASA Astrophysics Data System (ADS)
Kolb, E. J.; Bailey, J.; Bishop, A.; Cain, J.; Goddard, M.; Hurowitz, K.; Kennedy, K.; Ornduff, T.; Sfraga, M.; Wernecke, J.
2008-12-01
The focus of Google's Geo Education outreach efforts (http://www.google.com/educators/geo.html) is on helping primary, secondary, and post-secondary educators incorporate Google Earth and Sky, Google Maps, and SketchUp into their classroom lessons. In partnership with the University of Alaska, our Geo Education team members visited several remote Alaskan high schools during a one-week period in September. At each school, we led several 40-minute hands-on learning sessions in which Google products were used by the students to investigate local geologic and environmental processes. For the teachers, we provided several resources including follow-on lesson plans, example KML-based lessons, useful URLs, and website resources that multiple users can contribute to. This talk will highlight results of the trip and discuss how educators can access and use Google's Geo Education resources.
2013-08-09
CAPE CANAVERAL, Fla. – As seen on Google Maps, Firing Room 3 inside the Launch Control Center at NASA's Kennedy Space Center was one of the four control rooms used by NASA and contractor launch teams to oversee a space shuttle countdown. This firing room is furnished in the classic style with the same metal computer cabinets and some of the same monitors in place when the first shuttle mission launched April 12, 1981. Specialized operators worked at consoles tailored to keep track of the status of shuttle systems while the spacecraft was processed in the Orbiter Processing Facility, being stacked inside the Vehicle Assembly Building and standing at the launch pad before liftoff. The firing rooms, including 3, were also used during NASA's Apollo Program. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang
An interactive GIS based tool on Chinese history and its topography
NASA Astrophysics Data System (ADS)
Konda, Ashish Reddy
The aim of this thesis is to demonstrate how China was attacked by foreign powers, the rise and fall of its empires, its border conflicts with India, Russia, and Vietnam, and the territorial disputes in the South China Sea. The thesis is focused on creating a GIS tool showcasing modern Chinese history, including the major wars fought during that period. The tool is developed using the features of Google Maps to show the locations of the wars. The topography of China is also represented on the interactive Google Map through layers for rivers, mountain ranges, and deserts. The provinces with the highest populations are represented on the map with circles. The application also shows historical events in chronological order using a timeline feature. This has been implemented using jQuery, JavaScript, HTML5, and CSS. Chinese culture and biographies of important leaders are also included in this thesis, embedded with pictures and videos.
Mandel, Jacob E; Morel-Ovalle, Louis; Boas, Franz E; Ziv, Etay; Yarmohammadi, Hooman; Deipolyi, Amy; Mohabir, Heeralall R; Erinjeri, Joseph P
2018-02-20
The purpose of this study is to determine whether a custom Google Maps application can optimize site selection when scheduling outpatient interventional radiology (IR) procedures within a multi-site hospital system. The Google Maps for Business Application Programming Interface (API) was used to develop an internal web application that uses real-time traffic data to determine the estimated travel time (ETT; minutes) and estimated travel distance (ETD; miles) from a patient's home to each nearby IR facility in our hospital system. Hypothetical patient home addresses based on the 33 cities comprising our institution's catchment area were used to determine the optimal IR site for hypothetical patients traveling from each city under real-time traffic conditions. For 10/33 (30%) cities, there was discordance between the optimal IR site based on ETT and the optimal IR site based on ETD at non-rush hour or rush hour times. By choosing to travel to an IR site based on ETT rather than ETD, patients from discordant cities were predicted to save an average of 7.29 min during non-rush hour (p = 0.03) and 28.80 min during rush hour (p < 0.001). Using a custom Google Maps application to schedule outpatients for IR procedures can effectively reduce patient travel time when more than one location providing IR procedures is available within the same hospital system.
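The following minimal sketch illustrates the same idea with the googlemaps Python client's Distance Matrix call: query traffic-aware travel time and distance from a patient's home to each candidate site and pick the best by either criterion. It is not the authors' internal application; the addresses and API key are placeholders.

```python
# Sketch of the site-selection idea: query travel time and distance from a
# patient's home to each candidate IR site with the googlemaps client's
# Distance Matrix call. Not the authors' application; addresses and API key
# are placeholders.
import googlemaps

gmaps = googlemaps.Client(key="YOUR_API_KEY")

home = "123 Example St, White Plains, NY"                    # placeholder
ir_sites = ["1275 York Ave, New York, NY",                   # placeholder sites
            "500 Westchester Ave, West Harrison, NY"]

matrix = gmaps.distance_matrix(origins=[home],
                               destinations=ir_sites,
                               mode="driving",
                               departure_time="now")          # traffic-aware durations

best_by_time, best_by_distance = None, None
for site, element in zip(ir_sites, matrix["rows"][0]["elements"]):
    # duration_in_traffic is returned when departure_time is set.
    minutes = element.get("duration_in_traffic", element["duration"])["value"] / 60
    miles = element["distance"]["value"] / 1609.34
    print(f"{site}: {minutes:.1f} min (ETT), {miles:.1f} mi (ETD)")
    if best_by_time is None or minutes < best_by_time[1]:
        best_by_time = (site, minutes)
    if best_by_distance is None or miles < best_by_distance[1]:
        best_by_distance = (site, miles)

print("Optimal site by ETT:", best_by_time[0])
print("Optimal site by ETD:", best_by_distance[0])
```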
Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui.
Newton, Richard; Deonarine, Andrew; Wernisch, Lorenz
2012-09-24
The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications.
How to Display Hazards and other Scientific Data Using Google Maps
NASA Astrophysics Data System (ADS)
Venezky, D. Y.; Fee, J. M.
2007-12-01
The U.S. Geological Survey's (USGS) Volcano Hazards Program (VHP) is launching a map-based interface to display hazards information using the Google® Maps API (Application Programming Interface). Map-based interfaces provide a synoptic view of data, making patterns easier to detect and allowing users to quickly ascertain where hazards are in relation to major population and infrastructure centers. Several map-based interfaces are now simple to run on a web server, providing ideal platforms for sharing information with colleagues, emergency managers, and the public. There are three main steps to making data accessible on a map-based interface: formatting the input data, plotting the data on the map, and customizing the user interface. The presentation, "Creating Geospatial RSS and ATOM feeds for Map-based Interfaces" (Fee and Venezky, this session), reviews key features for map input data. Join us for this presentation on how to plot data in a geographic context and then format the display with images, custom markers, and links to external data. Examples will show how the VHP Volcano Status Map was created and how to plot a field trip with driving directions.
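As a small illustration of the "format the input data" step, the sketch below writes a handful of volcano status points to a KML layer with custom marker icons. It is not the VHP implementation; the volcano names, alert levels, and icon URLs are placeholders.

```python
# Sketch of the "format the input data" step: write volcano status points to
# a KML layer with custom marker icons. Illustration only, not the VHP
# implementation; names, alert levels, and icon URLs are placeholders.
volcanoes = [
    # (name, longitude, latitude, alert level)
    ("Volcano A", -155.29, 19.41, "WATCH"),
    ("Volcano B", -122.18, 46.20, "NORMAL"),
]
icon_for_level = {                              # placeholder icon URLs
    "NORMAL": "http://example.com/icons/green.png",
    "WATCH": "http://example.com/icons/orange.png",
}

styles, placemarks = [], []
for level, href in icon_for_level.items():
    styles.append(
        f'  <Style id="{level}"><IconStyle><Icon><href>{href}</href></Icon></IconStyle></Style>'
    )
for name, lon, lat, level in volcanoes:
    placemarks.append(f"""  <Placemark>
    <name>{name}</name>
    <styleUrl>#{level}</styleUrl>
    <description>Alert level: {level}</description>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>""")

kml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
       '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
       + "\n".join(styles + placemarks) + "\n</Document>\n</kml>\n")

with open("volcano_status.kml", "w") as fh:
    fh.write(kml)
```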
A Land-Use-Planning Simulation Using Google Earth
ERIC Educational Resources Information Center
Bodzin, Alec M.; Cirucci, Lori
2009-01-01
Google Earth (GE) is proving to be a valuable tool in the science classroom for understanding the environment and making responsible environmental decisions (Bodzin 2008). GE provides learners with a dynamic mapping experience using a simple interface with a limited range of functions. This interface makes geospatial analysis accessible and…
NASA Astrophysics Data System (ADS)
Farda, N. M.
2017-12-01
Coastal wetlands provide ecosystem services essential to people and the environment. Changes in coastal wetlands, especially in land use, are important to monitor by utilizing multi-temporal imagery. Google Earth Engine (GEE) provides many machine learning algorithms (10 algorithms) that are very useful for extracting land use from imagery. The research objective is to explore machine learning in Google Earth Engine and its accuracy for multi-temporal land use mapping of a coastal wetland area. Landsat 3 MSS (1978), Landsat 5 TM (1991), Landsat 7 ETM+ (2001), and Landsat 8 OLI (2014) images located in the Segara Anakan lagoon were selected to represent multi-temporal imagery. The inputs for machine learning are the visible and near-infrared bands, PCA bands, inverse PCA bands, bare soil index, vegetation index, wetness index, elevation from the ASTER GDEM, and GLCM (Haralick) texture, together with polygon samples at 140 locations. Ten machine learning algorithms were applied to extract coastal wetland land use from the Landsat imagery: Fast Naive Bayes, CART (Classification and Regression Tree), Random Forests, GMO Max Entropy, Perceptron (Multi Class Perceptron), Winnow, Voting SVM, Margin SVM, Pegasos (Primal Estimated sub-GrAdient SOlver for SVM), and IKPamir (Intersection Kernel Passive Aggressive Method for Information Retrieval, SVM). Machine learning in Google Earth Engine is very helpful for multi-temporal land use mapping; the highest accuracy for land use mapping of the coastal wetland was obtained with CART, at 96.98% overall accuracy using K-fold cross-validation (K = 10). GEE is particularly useful for multi-temporal land use mapping with ready-to-use imagery and classification algorithms, and also offers challenging possibilities for other applications.
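A minimal Earth Engine (Python API) sketch of supervised classification with one of the listed algorithms (CART) follows. The image collection, band selection, class property, and training-polygon asset are placeholders, and the paper's full feature set (PCA, indices, ASTER GDEM, GLCM texture) and K-fold validation are not reproduced.

```python
# Minimal Earth Engine (Python API) sketch of supervised land-use
# classification with CART, one of the algorithms listed above. The
# collection, bands, class property, and training-polygon asset are
# placeholders; the paper's full feature set is not reproduced.
import ee
ee.Initialize()

region = ee.Geometry.Rectangle([108.7, -7.8, 109.1, -7.6])  # placeholder AOI

image = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
         .filterBounds(region)
         .filterDate("2014-01-01", "2014-12-31")
         .median()
         .select(["SR_B2", "SR_B3", "SR_B4", "SR_B5"]))      # visible + NIR

# Placeholder FeatureCollection of labelled sample polygons with a numeric
# class property "landuse".
samples = ee.FeatureCollection("users/your_account/training_polygons")

training = image.sampleRegions(collection=samples,
                               properties=["landuse"],
                               scale=30)

classifier = ee.Classifier.smileCart().train(
    features=training, classProperty="landuse",
    inputProperties=image.bandNames())

classified = image.classify(classifier)

# Simple resubstitution accuracy; the paper used K-fold cross-validation.
matrix = training.classify(classifier).errorMatrix("landuse", "classification")
print("Overall accuracy:", matrix.accuracy().getInfo())
```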
There's An App For That: Planning Ahead for the Solar Eclipse in August 2017
NASA Astrophysics Data System (ADS)
Chizek Frouard, Malynda R.; Lesniak, Michael V.; Bell, Steve
2017-01-01
With the total solar eclipse of 2017 August 21 over the continental United States approaching, the U.S. Naval Observatory (USNO) on-line Solar Eclipse Computer can now be accessed via an Android application, available on Google Play.Over the course of the eclipse, as viewed from a specific site, several events may be visible: the beginning and ending of the eclipse (first and fourth contacts), the beginning and ending of totality (second and third contacts), the moment of maximum eclipse, sunrise, or sunset. For each of these events, the USNO Solar Eclipse 2017 Android application reports the time, Sun's altitude and azimuth, and the event's position and vertex angles. The app also lists the duration of the total phase, the duration of the eclipse, the magnitude of the eclipse, and the percent of the Sun obscured for a particular eclipse site.All of the data available in the app comes from the flexible USNO Solar Eclipse Computer Application Programming Interface (API), which produces JavaScript Object Notation (JSON) that can be incorporated into third-party Web sites or custom applications. Additional information is available in the on-line documentation (http://aa.usno.navy.mil/data/docs/api.php).For those who prefer using a traditional data input form, the local circumstances can still be requested at http://aa.usno.navy.mil/data/docs/SolarEclipses.php.In addition the 2017 August 21 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2017.php) consolidates all of the USNO resources for this event, including a Google Map view of the eclipse track designed by Her Majesty's Nautical Almanac Office (HMNAO).Looking further ahead, a 2024 April 8 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2024.php) is also available.
Global Coastal and Marine Spatial Planning (CMSP) from Space Based AIS Ship Tracking
NASA Astrophysics Data System (ADS)
Schwehr, K. D.; Foulkes, J. A.; Lorenzini, D.; Kanawati, M.
2011-12-01
All nations need to be developing long-term, integrated strategies for how to use and preserve our natural resources. As a part of these strategies, we must evaluate how communities of users react to changes in the rules and regulations governing ocean use. Global characterization of vessel traffic on the Earth's oceans is essential to understanding existing uses and to developing international Coastal and Marine Spatial Planning (CMSP). Ship traffic within 100-200 km of shore is beginning to be effectively covered at low latitudes by ground-based receivers collecting position reports from the maritime Automatic Identification System (AIS). Unfortunately, remote islands, high latitudes, and open-ocean Marine Protected Areas (MPA) are not covered by these ground systems. Deploying enough autonomous airborne (UAV) and surface (USV) vessels and buoys to provide adequate coverage is a difficult task; while individual device costs are plummeting, a large fleet of AIS receivers is expensive to maintain. The global AIS coverage from SpaceQuest's low Earth orbit satellite receivers, combined with the visualization and data storage infrastructure of Google (e.g., Maps, Earth, and Fusion Tables), provides a platform that enables researchers and resource managers to begin answering the question of how ocean resources are being utilized. Near real-time vessel traffic data will allow managers of marine resources to understand how changes to education, enforcement, rules, and regulations alter usage and compliance patterns. We will demonstrate the potential of this system using a sample SpaceQuest data set processed with libais, which stores the results in a Fusion Table. From there, the data are imported into PyKML and visualized in Google Earth with a custom gx:Track visualization utilizing KML's extended data functionality to facilitate ship track interrogation. Analysts can then annotate and discuss vessel tracks in Fusion Tables.
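The sketch below illustrates only the final visualization step: writing time-stamped vessel positions into a KML gx:Track that Google Earth can animate along its time slider. It is not the libais/Fusion Tables pipeline described above; the vessel ID, positions, and timestamps are placeholders.

```python
# Sketch of the visualization step: write time-stamped vessel positions to a
# KML gx:Track that Google Earth can animate. Not the libais/Fusion Tables
# pipeline; positions, timestamps, and the vessel ID are placeholders.
positions = [
    # (ISO 8601 time, longitude, latitude)
    ("2011-06-01T00:00:00Z", -70.90, 42.35),
    ("2011-06-01T01:00:00Z", -70.70, 42.40),
    ("2011-06-01T02:00:00Z", -70.50, 42.47),
]

whens = "\n".join(f"      <when>{t}</when>" for t, _, _ in positions)
coords = "\n".join(f"      <gx:coord>{lon} {lat} 0</gx:coord>"
                   for _, lon, lat in positions)

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"
     xmlns:gx="http://www.google.com/kml/ext/2.2">
  <Placemark>
    <name>Vessel 123456789</name>
    <gx:Track>
{whens}
{coords}
    </gx:Track>
  </Placemark>
</kml>"""

with open("vessel_track.kml", "w") as fh:
    fh.write(kml)
```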
3D Viewer Platform of Cloud Clustering Management System: Google Map 3D
NASA Astrophysics Data System (ADS)
Choi, Sung-Ja; Lee, Gang-Soo
A new management framework is needed for cloud environments as computing platforms converge. Independent software vendors (ISVs) and small businesses find it hard to adopt the management platforms offered by large providers. This article proposes a clustering management system for cloud computing environments aimed at ISVs and small-business enterprises. It applies a 3D viewer adapted from Google Map 3D and Google Earth, and is called 3DV_CCMS, an extension of the CCMS [1].
Rattanasing, Wannaporn; Kaewpitoon, Soraya J; Loyd, Ryan A; Rujirakul, Ratana; Yodkaw, Eakachai; Kaewpitoon, Natthawut
2015-01-01
Cholangiocarcinoma (CCA) is a serious public health problem in the northeast of Thailand. CCA is considered to be an incurable and rapidly lethal disease. Knowledge of the distribution of CCA patients is necessary for management strategies. This study aimed to utilize a Geographic Information System and Google Earth™ for distribution mapping of cholangiocarcinoma in Satuek District, Buriram, Thailand, during a 5-year period (2008-2012). In this retrospective study, data were collected and reviewed from the OPD cards; definitive cases of CCA were patients who were treated in Satuek hospital and were diagnosed with CCA or ICD-10 code C22.1. CCA cases were analyzed with ArcGIS 9.2, and all data were imported into Google Earth using the online web page www.earthpoint.us. Data were displayed at village points. A total of 53 cases were diagnosed and identified as CCA. The incidence was 53.57 per 100,000 population (65.5 for males and 30.8 for females), and the majority of CCA cases were in stages IV and IIA. The average age was 67 years. The highest attack rate was observed in Thung Wang sub-district (161.4 per 100,000 population). The map display at village points for CCA patients based on Google Earth gave a clear visual distribution. CCA is still a major problem in Satuek district, Buriram province of Thailand. The Google Earth production process is very simple and easy to learn. It is suitable for use in the further development of CCA management strategies.
The World in Spatial Terms: Mapmaking and Map Reading
ERIC Educational Resources Information Center
Ekiss, Gale Olp; Trapido-Lurie, Barbara; Phillips, Judy; Hinde, Elizabeth
2007-01-01
Maps and mapping activities are essential in the primary grades. Maps are truly ubiquitous today, as evidenced by the popularity of websites such as Google Earth and Mapquest, and by devices such as Global Positioning System (GPS) units in cars, planes, and boats. Maps can give visual settings to travel stories and historical narratives and can…
Wampler, Peter J; Rediske, Richard R; Molla, Azizur R
2013-01-18
A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth and exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes were initially mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed, and 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field-locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent, and only rarely was local knowledge required to identify and locate households. This method provides an important technique that can be applied to other developing countries where a randomized study design is needed but infrastructure is lacking to implement more traditional participant selection methods.
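A minimal sketch of the random-selection step follows: read home placemarks from a KML file exported from Google Earth and draw a random subset for field visits. The authors' workflow used ArcMap and Excel; this stand-alone illustration makes different tooling assumptions, and the file name, seed, and sample size are placeholders.

```python
# Sketch of the random-selection step: read home placemarks from a KML file
# exported from Google Earth and draw a random subset for field visits.
# Stand-alone illustration (the study used ArcMap and Excel); file name,
# seed, and sample size are placeholders.
import csv
import random
import xml.etree.ElementTree as ET

NS = {"kml": "http://www.opengis.net/kml/2.2"}

tree = ET.parse("mapped_homes.kml")
homes = []
for pm in tree.iter("{http://www.opengis.net/kml/2.2}Placemark"):
    name = pm.findtext("kml:name", default="unnamed", namespaces=NS)
    coords = pm.findtext(".//kml:coordinates", default="", namespaces=NS).strip()
    if coords:
        lon, lat = coords.split(",")[:2]
        homes.append((name, float(lat), float(lon)))

random.seed(2012)                      # reproducible draw
selected = random.sample(homes, k=min(96, len(homes)))

# Write a waypoint list that can be loaded onto a handheld GPS.
with open("selected_homes.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["name", "latitude", "longitude"])
    writer.writerows(selected)
```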
ERIC Educational Resources Information Center
Lucking, Robert A.; Christmann, Edwin P.; Whiting, Mervyn J.
2008-01-01
"Mashup" is a new technology term used to describe a web application that combines data or technology from several different sources. You can apply this concept in your classroom by having students create their own mashup maps. Google Maps provides you with the simple tools, map databases, and online help you'll need to quickly master this…
2013-08-09
CAPE CANAVERAL, Fla. – As seen on Google Maps, Firing Room 4 inside the Launch Control Center at NASA's Kennedy Space Center was one of the four control rooms used by NASA and contractor launch teams to oversee a space shuttle countdown. This firing room was the most advanced of the control rooms used for shuttle missions and was the primary firing room for the shuttle's final series of launches before retirement. It is furnished in a more contemporary style with wood cabinets and other features, although it retains many of the computer systems the shuttle counted on to operate safely. Specialized operators worked at consoles tailored to keep track of the status of shuttle systems while the spacecraft was processed in the Orbiter Processing Facility, being stacked inside the Vehicle Assembly Building and standing at the launch pad before liftoff. The firing rooms, including 3, were also used during NASA's Apollo Program. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang
NASA Astrophysics Data System (ADS)
Askay, S.
2009-12-01
Published on Memorial Day 2009, Map the Fallen is a Google Earth visualization of the 5500+ US and international soldiers who have died in Iraq and Afghanistan since 2001. In addition to providing photos, stories, and links for each soldier, the time-animated map visually connects hometowns to places of death. This novel way of representing casualty data brings the geographic reach and magnitude of the issue into focus together with the very personal nature of the individual stories. Innovative visualization techniques were used to illustrate the spatio-temporal nature of this information and to show the global reach and interconnectivity of this issue. Several of the advanced KML techniques employed to create this engaging and performance-conscious map will be discussed during this session. These include: 1) the use of HTML iframes and JavaScript to minimize the KML size, and extensive cross-linking throughout the content; 2) the creation of a time-animated, on-screen casualty counter; 3) the use of parabolic arcs to connect each hometown to the place of death; 4) the use of concentric spirals to represent chronological data; and 5) numerous performance optimizations to ensure the 23K placemarks, 2500 screen overlays, and nearly 250k line vertices performed well in Google Earth. This session will include a demonstration of the map, conceptual discussions of the techniques used, and some in-depth technical explanation of the KML code.
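A simplified sketch of technique (3) follows: sample points along the line between two places and raise their altitude with a parabola so a KML LineString renders as an arc in Google Earth. This is not the Map the Fallen code; the endpoint coordinates and arc height are placeholders, and simple linear interpolation is used instead of a great-circle path for brevity.

```python
# Simplified sketch of technique (3): a parabolic arc between a hometown and
# a place of death as a KML LineString with altitudes. Not the Map the
# Fallen code; coordinates and arc height are placeholders, and linear
# interpolation stands in for a great-circle path.
N_SAMPLES = 50
PEAK_ALT_M = 400000.0   # arc height at the midpoint, in metres

def parabolic_arc(lon1, lat1, lon2, lat2):
    points = []
    for i in range(N_SAMPLES + 1):
        t = i / N_SAMPLES
        lon = lon1 + t * (lon2 - lon1)
        lat = lat1 + t * (lat2 - lat1)
        alt = 4.0 * PEAK_ALT_M * t * (1.0 - t)   # 0 at both ends, peak at t=0.5
        points.append(f"{lon},{lat},{alt}")
    return " ".join(points)

coords = parabolic_arc(-93.27, 44.98, 69.17, 34.53)   # placeholder endpoints

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Hometown to place of death</name>
    <LineString>
      <altitudeMode>absolute</altitudeMode>
      <coordinates>{coords}</coordinates>
    </LineString>
  </Placemark>
</kml>"""

with open("arc.kml", "w") as fh:
    fh.write(kml)
```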
NASA Astrophysics Data System (ADS)
Cao, Y. B.; Hua, Y. X.; Zhao, J. X.; Guo, S. M.
2013-11-01
With China's rapid economic development and growing comprehensive national strength, border work has become a long-term and important task in China's diplomatic work. Rapid plotting, real-time sharing, and mapping of surrounding affairs have therefore taken on great significance for government policy makers and diplomatic staff. At present, however, existing boundary information systems suffer from heavy geospatial data update workloads, a serious lack of plotting tools, and difficulty in sharing geographic events, which has seriously hampered the smooth progress of border tasks. The development of geographic information system technology, and of Web GIS in particular, offers the possibility of solving these problems. This paper adopts a four-layer B/S architecture and, with the support of the Google Maps service, uses the free Google Maps API and its openness, ease of use, sharing features, and high-resolution imagery to design and implement a surrounding-affairs plotting and management system based on the web development technologies ASP.NET, C#, and Ajax. The system can provide decision support for government policy makers as well as real-time plotting and sharing of surrounding information for diplomatic staff. Practice has shown that the system has good usability and strong real-time performance.
Creating a Geo-Referenced Bibliography with Google Earth and Geocommons: The Coos Bay Bibliography
ERIC Educational Resources Information Center
Schmitt, Jenni; Butler, Barb
2012-01-01
We compiled a geo-referenced bibliography of research including theses, peer-reviewed articles, agency literature, and books having sample collection sites in and around Coos Bay, Oregon. Using Google Earth and GeoCommons we created a map that allows users such as visiting researchers, faculty, students, and local agencies to identify previous…
NASA Astrophysics Data System (ADS)
Díaz, Elkin; Arguello, Henry
2016-05-01
Urban ecosystem studies require monitoring, controlling, and planning to analyze building density, urban density, urban planning, atmospheric modeling, and land use. In urban planning, there are many methods for building height estimation using optical remote sensing images. These methods, however, depend highly on sun illumination and cloud-free weather. In contrast, high-resolution synthetic aperture radar provides images independent of daytime and weather conditions, although these images rely on special hardware and expensive acquisition. Most of the biggest cities around the world have been photographed by Google Street View under different conditions, so thousands of images from the principal streets of a city can be accessed online. The availability of this and similar rich city imagery, such as StreetSide from Microsoft, represents a huge opportunity in computer vision because these images can be used as input in many applications such as 3D modeling, segmentation, recognition, and stereo correspondence. This paper proposes a novel algorithm to estimate building heights using public Google Street View imagery. The objective of this work is to obtain thousands of geo-referenced images from Google Street View through a representational state transfer (REST) interface and to estimate average building heights using single-view metrology. Furthermore, the resulting measurements and image metadata are used to derive a layer of heights in a Google map available online. The experimental results show that the proposed algorithm can estimate an accurate average building height map from thousands of Google Street View images of any city.
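The image-acquisition step can be illustrated with the Street View Static API, which returns geo-referenced panoramic crops over HTTP; the sketch below assumes a valid API key, and the location and camera parameters are placeholders. The single-view-metrology height estimation itself is not shown.

```python
# Sketch: download geo-referenced Street View crops over HTTP with the
# Street View Static API (an API key is required; location and camera
# parameters below are placeholders). The height estimation step is not
# shown here.
import requests

BASE = "https://maps.googleapis.com/maps/api/streetview"
params = {
    "size": "640x640",               # image size in pixels
    "location": "4.5981,-74.0760",   # placeholder lat,lng
    "heading": 90,                   # camera heading in degrees
    "pitch": 10,                     # tilt the camera slightly upward
    "fov": 90,                       # horizontal field of view
    "key": "YOUR_API_KEY",
}

resp = requests.get(BASE, params=params, timeout=30)
resp.raise_for_status()
with open("facade.jpg", "wb") as fh:
    fh.write(resp.content)
print("saved", len(resp.content), "bytes")
```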
ERIC Educational Resources Information Center
Giorgis, Scott
2015-01-01
Three-dimensional thinking skills are extremely useful for geoscientists, and at the undergraduate level, these skills are often emphasized in structural geology courses. Google Earth is a powerful tool for visualizing the three-dimensional nature of data collected on the surface of Earth. The results of a 5 y pre- and posttest study of the…
Drawing the Line with Google Earth: The Place of Digital Mapping outside of Geography
ERIC Educational Resources Information Center
Mercier, O. Ripeka; Rata, Arama
2017-01-01
The "Te Kawa a Maui Atlas" project explores how mapping activities support undergraduate student engagement and learning in Maori studies. This article describes two specific assignments, which used online mapping allowing students to engage with the work of their peers. By analysing student evaluations of these activities, we identify…
Traffic Sign Inventory from Google Street View Images
NASA Astrophysics Data System (ADS)
Tsai, Victor J. D.; Chen, Jyun-Han; Huang, Hsun-Sheng
2016-06-01
Traffic sign detection and recognition (TSDR) has drawn considerable attention in the development of intelligent transportation systems (ITS) and autonomous vehicle driving systems (AVDS) since the 1980s. Unlike general TSDR systems that deal with real-time images captured by in-vehicle cameras, this research aims at developing techniques for detecting, extracting, and positioning traffic signs from Google Street View (GSV) images along user-selected routes, for the low-cost, high-volume, and quick establishment of a traffic sign infrastructure database that may be associated with Google Maps. The framework and techniques employed in the proposed system are described.
Using Google Streetview Panoramic Imagery for Geoscience Education
NASA Astrophysics Data System (ADS)
De Paor, D. G.; Dordevic, M. M.
2014-12-01
Google Streetview is a feature of Google Maps and Google Earth that allows viewers to switch from map or satellite view to 360° panoramic imagery recorded close to the ground. Most panoramas are recorded by Google engineers using special cameras mounted on the roofs of cars. Bicycles, snowmobiles, and boats have also been used, and sometimes the camera has been mounted on a backpack for off-road use by hikers and skiers or attached to scuba-diving gear for "Underwater Streetview (sic)." Streetview panoramas are linked together so that the viewer can change viewpoint by clicking forward and reverse buttons. They therefore create a 4-D touring effect. As part of the GEODE project ("Google Earth for Onsite and Distance Education"), we are experimenting with the use of Streetview imagery for geoscience education. Our web-based test application allows instructors to select locations for students to study. Students are presented with a set of questions or tasks that they must address by studying the panoramic imagery. Questions include identification of rock types, structures such as faults, and general geological setting. The student view is locked into Streetview mode until they submit their answers, whereupon the map and satellite views become available, allowing students to zoom out and verify their location on Earth. Student learning is scaffolded by automatic computerized feedback. There are lots of existing Streetview panoramas with rich geological content. Additionally, instructors and members of the general public can create panoramas, including 360° Photo Spheres, by stitching images taken with their mobile devices and submitting them to Google for evaluation and hosting. A multi-thousand-dollar, multi-directional camera and mount can be purchased from DIY-streetview.com. This allows power users to generate their own high-resolution panoramas. A cheaper, 360° video camera is soon to be released according to geonaute.com. Thus there are opportunities for geoscience educators both to use existing Streetview imagery and to generate new imagery for specific locations of geological interest. The GEODE team includes the authors and: H. Almquist, C. Bentley, S. Burgin, C. Cervato, G. Cooper, P. Karabinos, T. Pavlis, J. Piatek, B. Richards, J. Ryan, R. Schott, K. St. John, B. Tewksbury, and S. Whitmeyer.
Cartographic analyses of geographic information available on Google Earth Images
NASA Astrophysics Data System (ADS)
Oliveira, J. C.; Ramos, J. R.; Epiphanio, J. C.
2011-12-01
The purpose was to evaluate the planimetric accuracy of satellite images available in the Google Earth database. These images refer to the vicinity of the Federal University of Viçosa, Minas Gerais, Brazil. The methodology evaluated the geographic information of three groups of images defined by the level of detail displayed on screen (zoom). These groups were labeled Zoom 1000 (a single image covering the entire study area), Zoom 100 (a mosaic of 73 images), and Zoom 100 with geometric correction (the same mosaic after a geometric correction applied through control points). Cartographic accuracy was measured for each group based on statistical analyses and the parameters of Brazilian law for planimetric mapping. For this evaluation, 22 points were identified in each group of images, and the coordinates of each point were compared with field coordinates obtained by GPS (Global Positioning System). Table 1 shows the results for accuracy (based on a threshold equal to 0.5 mm * mapping scale) and tendency (abscissa and ordinate) between the image coordinates and the field coordinates. The geometric correction applied to the Zoom 100 group reduced the trends identified earlier, and the statistical tests indicated the usefulness of the data for mapping at a scale of 1/5000 with an error of less than 0.5 mm * scale. The analyses demonstrated the quality of the cartographic data provided by Google, as well as the possibility of reducing the positional divergences present in the data. It can be concluded that it is possible to obtain geographic information from the database available on Google Earth; however, the level of detail (zoom) used when viewing and capturing information on screen influences the cartographic quality of the mapping. Despite the cartographic and thematic potential present in the database, it is important to note that both the software and the data distributed by Google Earth have policies for use and distribution.
Table 1 - Planimetric analysis (table not reproduced in this record)
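A small sketch of the kind of planimetric check described above follows: compute the RMSE between image-derived and GPS field coordinates and compare it with a 0.5 mm × map-scale tolerance. The coordinate pairs are placeholders, and the full Brazilian accuracy standard (separate trend and precision tests) is not reproduced.

```python
# Sketch of the planimetric check described above: compare image-derived
# coordinates with GPS field coordinates, compute the RMSE, and test it
# against a 0.5 mm x map-scale tolerance. Coordinate pairs are placeholders;
# the full Brazilian standard (trend and precision tests) is not reproduced.
import math

# (easting_image, northing_image, easting_gps, northing_gps) in metres
checkpoints = [
    (721510.2, 7702315.8, 721512.0, 7702314.9),
    (722040.7, 7701980.3, 722039.5, 7701982.1),
    (721875.4, 7702650.0, 721873.9, 7702648.8),
]

errors = [math.hypot(ei - eg, ni - ng) for ei, ni, eg, ng in checkpoints]
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))

scale = 5000                       # candidate mapping scale 1:5000
tolerance = 0.0005 * scale         # 0.5 mm at map scale, in metres

print(f"RMSE = {rmse:.2f} m, tolerance at 1:{scale} = {tolerance:.2f} m")
print("passes" if rmse <= tolerance else "fails", "the 0.5 mm * scale threshold")
```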
Sean A. Parks; Lisa M. Holsinger; Morgan A. Voss; Rachel A. Loehman; Nathaniel P. Robinson
2018-01-01
Landsat-based fire severity datasets are an invaluable resource for monitoring and research purposes. These gridded fire severity datasets are generally produced with pre- and post-fire imagery to estimate the degree of fire-induced ecological change. Here, we introduce methods to produce three Landsat-based fire severity metrics using the Google Earth Engine (GEE)...
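The abstract above is truncated, but the general recipe for one Landsat-based severity metric, dNBR, can be sketched with the Earth Engine Python API. The dataset ID, band names, dates, and geometry below are illustrative assumptions, not the authors' published workflow.

```python
import ee
ee.Initialize()

# Illustrative fire location and seasonal windows (placeholders).
fire_point = ee.Geometry.Point([-113.9, 47.0])
landsat = ee.ImageCollection("LANDSAT/LC08/C01/T1_SR").filterBounds(fire_point)

def nbr(image):
    # Normalized Burn Ratio: (NIR - SWIR2) / (NIR + SWIR2); Landsat 8 bands B5/B7.
    return image.normalizedDifference(["B5", "B7"]).rename("NBR")

pre_fire = landsat.filterDate("2016-06-01", "2016-08-31").map(nbr).median()
post_fire = landsat.filterDate("2017-06-01", "2017-08-31").map(nbr).median()

# dNBR: larger positive values indicate a greater fire-induced drop in NBR.
dnbr = pre_fire.subtract(post_fire).rename("dNBR")
print(dnbr.getInfo()["bands"])
```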
NASA Astrophysics Data System (ADS)
Griffith, P. C.; Wilcox, L. E.; Morrell, A.
2009-12-01
The central objective of the North American Carbon Program (NACP), a core element of the US Global Change Research Program, is to quantify the sources and sinks of carbon dioxide, carbon monoxide, and methane in North America and adjacent ocean regions. The NACP consists of a wide range of investigators at universities and federal research centers. Although many of these investigators have worked together in the past, many have had few prior interactions and may not know of similar work within knowledge domains, much less across the diversity of environments and scientific approaches in the Program. Coordinating interactions and sharing data are major challenges in conducting NACP. The Google Earth and Google Map Collections on the NACP website (www.nacarbon.org) provide a geographical view of the research products contributed by each core and affiliated NACP project. Other relevant data sources (e.g. AERONET, LVIS) can also be browsed in spatial context with NACP contributions. Each contribution links to project-oriented metadata, or “project profiles”, that provide a greater understanding of the scientific and social context of each dataset and are an important means of communicating within the NACP and to the larger carbon cycle science community. Project profiles store information such as a project's title, leaders, participants, an abstract, keywords, funding agencies, associated intensive campaigns, expected data products, data needs, publications, and URLs to associated data centers, datasets, and metadata. Data products are research contributions that include biometric inventories, flux tower estimates, remote sensing land cover products, tools, services, and model inputs / outputs. Project leaders have been asked to identify these contributions to the site level whenever possible, either through simple latitude/longitude pair, or by uploading a KML, KMZ, or shape file. Project leaders may select custom icons to graphically categorize their contributions; for example, a ship for oceanographic samples, a tower for tower measurements. After post-processing, research contributions are added to the NACP Google Earth and Google Map Collection to facilitate discovery and use in synthesis activities of the Program.
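As a rough sketch of how a site-level contribution with a custom icon could be encoded for such a collection, the snippet below writes a minimal KML placemark using Python's standard library. The project name, coordinates, and icon URL are invented examples, not actual NACP records.

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"
ET.register_namespace("", KML_NS)

def q(tag):
    """Qualify a tag name with the KML namespace."""
    return f"{{{KML_NS}}}{tag}"

kml = ET.Element(q("kml"))
doc = ET.SubElement(kml, q("Document"))

# Custom icon style, e.g. a tower symbol for flux-tower measurements (example URL).
style = ET.SubElement(doc, q("Style"), id="towerIcon")
icon = ET.SubElement(ET.SubElement(style, q("IconStyle")), q("Icon"))
ET.SubElement(icon, q("href")).text = "http://example.org/icons/flux_tower.png"

# One site-level research contribution (invented project and coordinates).
pm = ET.SubElement(doc, q("Placemark"))
ET.SubElement(pm, q("name")).text = "Example flux tower site"
ET.SubElement(pm, q("description")).text = "Project profile: http://example.org/profiles/123"
ET.SubElement(pm, q("styleUrl")).text = "#towerIcon"
point = ET.SubElement(pm, q("Point"))
ET.SubElement(point, q("coordinates")).text = "-105.55,40.03,0"  # lon,lat,alt

ET.ElementTree(kml).write("nacp_example_site.kml", xml_declaration=True, encoding="UTF-8")
```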
ERIC Educational Resources Information Center
Williams, Lesley
2006-01-01
In a survey of a representative sample of over 3300 online information consumers and their information-seeking behavior, survey findings indicate that 84 percent of information searches begin with a search engine. Library web sites were selected by just one percent of respondents as the source used to begin an information search and 72 percent had…
NASA Astrophysics Data System (ADS)
Gorelick, Noel
2013-04-01
The Google Earth Engine platform is a system designed to enable petabyte-scale, scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment including a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment to allow interactive data and algorithm development and exploration and provides a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based data sets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit-depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, a user is able to add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geospatial data. All computation is performed lazily; nothing is computed until it's required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing. Through integration with a variety of other services, Earth Engine is able to bring to bear considerable analytic and technical firepower in a transparent fashion, including: AI-based classification via integration with Google's machine learning infrastructure, publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth, and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.
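A minimal sketch of the lazy, server-side computation model described above, using the Earth Engine Python API: each chained call only extends an expression graph, and nothing is evaluated until a result is explicitly requested. The collection ID, bands, and region are arbitrary examples.

```python
import ee
ee.Initialize()

region = ee.Geometry.Rectangle([36.5, 49.5, 37.5, 50.5])  # arbitrary example area

# Each line below only describes the computation; nothing runs yet.
collection = (ee.ImageCollection("LANDSAT/LT05/C01/T1_SR")
              .filterBounds(region)
              .filterDate("2010-06-01", "2010-09-01"))
ndvi = collection.map(lambda img: img.normalizedDifference(["B4", "B3"]).rename("NDVI"))
composite = ndvi.median()

# Only this call sends the expression graph to the servers and waits for a result.
stats = composite.reduceRegion(reducer=ee.Reducer.mean(), geometry=region, scale=300)
print(stats.getInfo())
```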
NASA Astrophysics Data System (ADS)
Gorelick, N.
2012-12-01
The Google Earth Engine platform is a system designed to enable petabyte-scale, scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment including a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment to allow interactive data and algorithm development and exploration and provides a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based data sets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit-depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, a user is able to add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geospatial data. All computation is performed lazily; nothing is computed until it's required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing. Through integration with a variety of other services, Earth Engine is able to bring to bear considerable analytic and technical firepower in a transparent fashion, including: AI-based classification via integration with Google's machine learning infrastructure, publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth, and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.
NASA Astrophysics Data System (ADS)
Triantafyllou, Antoine; Bastin, Christophe; Watlet, Arnaud
2016-04-01
GIS software suites are today's essential tools to gather and visualise geological data, to apply spatial and temporal analysis and, ultimately, to create and share interactive maps for further geoscience investigations. For these purposes, we developed GeolOkit: an open-source, freeware and lightweight software package, written in Python, a high-level, cross-platform programming language. GeolOkit is accessible through a graphical user interface designed to run in parallel with Google Earth. It is a highly user-friendly toolbox that allows 'geo-users' to import their raw data (e.g. GPS, sample locations, structural data, field pictures, maps), to use fast data analysis tools and to plot these into the Google Earth environment using KML code. This workflow requires no third-party software except Google Earth itself. GeolOkit comes with a large number of geoscience labels, symbols, colours and placemarks and can process: (i) multi-point data, (ii) contours via several interpolation methods, (iii) discrete planar and linear structural data in 2D or 3D, supporting a wide range of structural data input formats, (iv) clustered stereonets and rose diagrams, (v) drawn cross-sections as vertical sections, (vi) georeferenced maps and vectors, and (vii) field pictures, using either geo-tracking metadata from a camera's built-in GPS module or the same-day track of an external GPS. We invite you to discover all the functionalities of the GeolOkit software. As this project is under development, we welcome discussion of your needs, your ideas and your contributions to the GeolOkit project.
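A toy illustration of the kind of conversion such a tool performs, writing strike/dip field measurements into KML placemarks that Google Earth can load. This is not GeolOkit's actual code, and the measurements and file name are invented.

```python
# Invented bedding measurements: (longitude, latitude, strike, dip).
measurements = [
    (4.35, 50.62, 120, 35),
    (4.37, 50.63, 132, 40),
]

placemarks = []
for lon, lat, strike, dip in measurements:
    placemarks.append(f"""
  <Placemark>
    <name>Bedding {strike:03d}/{dip:02d}</name>
    <description>Strike {strike}, dip {dip} (right-hand rule)</description>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>""")

kml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
       '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
       + "".join(placemarks) + "\n</Document></kml>")

# The resulting file can be dragged directly into Google Earth.
with open("structural_data.kml", "w", encoding="utf-8") as f:
    f.write(kml)
```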
Google Sky: A Digital View of the Night Sky
NASA Astrophysics Data System (ADS)
Connolly, A. Scranton, R.; Ornduff, T.
2008-11-01
From its inception Astronomy has been a visual science, from careful observations of the sky using the naked eye, to the use of telescopes and photographs to map the distribution of stars and galaxies, to the current era of digital cameras that can image the sky over many decades of the electromagnetic spectrum. Sky in Google Earth (http://earth.google.com) and Google Sky (http://www.google.com/sky) continue this tradition, providing an intuitive visual interface to some of the largest astronomical imaging surveys of the sky. Streaming multi-color imagery, catalogs, time domain data, as well as annotating interesting astronomical sources and events with placemarks, podcasts and videos, Sky provides a panchromatic view of the universe accessible to anyone with a computer. Beyond a simple exploration of the sky Google Sky enables users to create and share content with others around the world. With an open interface available on Linux, Mac OS X and Windows, and translations of the content into over 20 different languages we present Sky as the embodiment of a virtual telescope for discovery and sharing the excitement of astronomy and science as a whole.
Distributed Kernelized Locality-Sensitive Hashing for Faster Image Based Navigation
2015-03-26
Facebook, Google, and Yahoo!. Current methods for image retrieval become problematic when implemented on image datasets that can easily reach billions of ... correlations. Tech industry leaders like Facebook, Google, and Yahoo! sort and index even larger volumes of "big data" daily. When attempting to process ... open source implementation of Google's MapReduce programming paradigm [13] which has been used for many different things. Using Apache Hadoop, Yahoo
LLMapReduce: Multi-Lingual Map-Reduce for Supercomputing Environments
2015-11-20
1990s. Popularized by Google [36] and Apache Hadoop [37], map-reduce has become a staple technology of the ever-growing big data community ... Lexington, MA, U.S.A. Abstract— The map-reduce parallel programming model has become extremely popular in the big data community. Many big data ... to big data users running on a supercomputer. LLMapReduce dramatically simplifies map-reduce programming by providing simple parallel programming
Crop classification and mapping based on Sentinel missions data in cloud environment
NASA Astrophysics Data System (ADS)
Lavreniuk, M. S.; Kussul, N.; Shelestov, A.; Vasiliev, V.
2017-12-01
Availability of high resolution satellite imagery (Sentinel-1/2/3, Landsat) over large territories opens new opportunities in agricultural monitoring. In particular, it becomes feasible to solve crop classification and crop mapping tasks at country and regional scale using time series of heterogeneous satellite imagery. But in this case we face the problem of Big Data. Dealing with time series of high resolution (10 m) multispectral imagery, we need to download huge volumes of data and then process them. The solution is to move the "processing chain" closer to the data itself to drastically shorten the time needed for data transfer. One more advantage of such an approach is the possibility to parallelize the data processing workflow and efficiently implement machine learning algorithms. This can be done with a cloud platform where Sentinel imagery is stored. In this study, we investigate the usability and efficiency of two different cloud platforms, Amazon and Google, for crop classification and crop mapping problems. Two pilot areas were investigated: Ukraine and England. Google provides the user-friendly Google Earth Engine environment for Earth observation applications, with many data processing and machine learning tools already deployed. At the same time, Amazon offers much more flexibility in implementing one's own workflow. A detailed analysis of the pros and cons will be given in the presentation.
SECURE INTERNET OF THINGS-BASED CLOUD FRAMEWORK TO CONTROL ZIKA VIRUS OUTBREAK.
Sareen, Sanjay; Sood, Sandeep K; Gupta, Sunil Kumar
2017-01-01
Zika virus (ZikaV) is currently one of the most important emerging viruses in the world; it has caused outbreaks and epidemics and has also been associated with severe clinical manifestations and congenital malformations. Traditional approaches to combat the ZikaV outbreak are not effective for detection and control. The aim of this study is to propose a cloud-based system to prevent and control the spread of Zika virus disease using the integration of mobile phones and the Internet of Things (IoT). A Naive Bayesian Network (NBN) is used to diagnose possibly infected users, and the Google Maps Web service is used to provide geographic positioning system (GPS)-based risk assessment to prevent the outbreak. It is used to represent each ZikaV-infected user, mosquito-dense sites, and breeding sites on the Google map, which helps government healthcare authorities to control such risk-prone areas effectively and efficiently. The performance and accuracy of the proposed system are evaluated using a dataset of 2 million users. Our system provides high accuracy for the initial diagnosis of different users according to their symptoms and appropriate GPS-based risk assessment. The cloud-based proposed system contributed to the accurate NBN-based classification of infected users and accurate identification of risk-prone areas using Google Maps.
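A small sketch of the classification step described above, using a Naive Bayes classifier over binary symptom indicators. The feature set, training examples, and labels are invented stand-ins, not the study's 2-million-user dataset or its exact Bayesian network.

```python
from sklearn.naive_bayes import BernoulliNB

# Binary symptom vector: [fever, rash, joint_pain, conjunctivitis, headache]
X_train = [
    [1, 1, 1, 1, 0],
    [1, 1, 0, 1, 1],
    [0, 0, 1, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 1],
]
y_train = [1, 1, 0, 0, 1, 0]   # 1 = possibly ZikaV infected, 0 = unlikely (invented)

model = BernoulliNB().fit(X_train, y_train)

new_user = [[1, 1, 1, 0, 0]]   # symptoms reported from a phone (invented example)
prob = model.predict_proba(new_user)[0][1]
print(f"Probability of possible ZikaV infection: {prob:.2f}")
# A flagged user could then be placed on the Google map at their GPS coordinates.
```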
Cross-disciplinary Undergraduate Research: A Case Study in Digital Mapping, western Ireland
NASA Astrophysics Data System (ADS)
Whitmeyer, S. J.; de Paor, D. G.; Nicoletti, J.; Rivera, M.; Santangelo, B.; Daniels, J.
2008-12-01
As digital mapping technology becomes ever more advanced, field geologists spend a greater proportion of time learning digital methods relative to analyzing rocks and structures. To explore potential solutions to the time commitment implicit in learning digital field methods, we paired James Madison University (JMU) geology majors (experienced in traditional field techniques) with Worcester Polytechnic Institute (WPI) engineering students (experienced in computer applications) during a four week summer mapping project in Connemara, western Ireland. The project consisted of approximately equal parts digital field mapping (directed by the geology students), and lab-based map assembly, evaluation and formatting for virtual 3D terrains (directed by the engineering students). Students collected geologic data in the field using ruggedized handheld computers (Trimble GeoExplorer® series) with ArcPAD® software. Lab work initially focused on building geologic maps in ArcGIS® from the digital field data and then progressed to developing Google Earth-based visualizations of field data and maps. Challenges included exporting GIS data, such as locations and attributes, to KML tags for viewing in Google Earth, which we accomplished using a Linux bash script written by one of our engineers - a task outside the comfort zone of the average geology major. We also attempted to expand the scope of Google Earth by using DEMs of present-day geologically-induced landforms as representative models for paleo-geographic reconstructions of the western Ireland field area. As our integrated approach to digital field work progressed, we found that our digital field mapping produced data at a faster rate than could be effectively managed during our allotted time for lab work. This likely reflected the more developed methodology for digital field data collection, as compared with our lab-based attempts to develop new methods for 3D visualization of geologic maps. However, this experiment in cross-disciplinary undergraduate research was a big success, with an enthusiastic interchange of expertise between undergraduate geology and engineering students that produced new, cutting-edge methods for visualizing geologic data and maps.
What's New in the Ocean in Google Earth and Maps
NASA Astrophysics Data System (ADS)
Austin, J.; Sandwell, D. T.
2014-12-01
Jenifer Austin (1), Jamie Adams (1), Kurt Schwehr (1), Brian Sullivan (1), David Sandwell (2), Walter Smith (3), Vicki Ferrini (4), and Barry Eakins (5). (1) Google Inc., 1600 Amphitheatre Parkway, Mountain View, California, USA; (2) University of California San Diego, Scripps Institution of Oceanography, La Jolla, California, USA; (3) NOAA Laboratory for Satellite Altimetry, College Park, Maryland, USA; (4) Lamont-Doherty Earth Observatory, Columbia University; (5) NOAA. More than two-thirds of Earth is covered by oceans. On the almost six-year anniversary of launching an explorable ocean seafloor in Google Earth and Maps, we updated our global underwater terrain dataset in partnership with Lamont-Doherty at Columbia, the Scripps Institution of Oceanography, and NOAA. With this update to our ocean map, we reveal an additional 2% of the ocean in high resolution, representing 2 years of work by Columbia and pulling in data from numerous institutions, including the Campeche Escarpment in the Gulf of Mexico in partnership with Charlie Paull at MBARI and the Schmidt Ocean Institute. The Scripps Institution of Oceanography at UCSD has curated 30 years of data from more than 8,000 ship cruises and 135 different institutions to reveal 15 percent of the seafloor at 1 km resolution. In addition, explore new data from an automated pipeline built to make updates to our Ocean Map more scalable, in partnership with NOAA's National Geophysical Data Center (http://www.ngdc.noaa.gov/mgg/bathymetry/) and the University of Colorado CIRES program (http://cires.colorado.edu/index.html).
NASA Astrophysics Data System (ADS)
Danladi, Iliya Bauchi; Kore, Basiru Mohammed; Gül, Murat
2017-10-01
Coastal areas are important regions of the world, as they host large populations, diverse ecosystems and natural resources. However, owing to their settings, elevations and proximity to the sea, climate change (global warming) and human activities are threatening issues. Herein, we report the coastline changes and possible future threats related to sea level rise, owing to global warming and human activities, in the coastal region of Nigeria. Google Earth images, a Digital Elevation Model (DEM) and geological maps were used. Using Google Earth images, coastal changes over the past 43 years, and over the 3 years prior to and after the construction of breakwaters along Goshen Beach Estate (Lekki), were examined. Additionally, coastline changes along Lekki Phase I from 2013 to 2016 were evaluated. The DEM map was used to delineate the 0-2 m, 2-5 m and 5-10 m asl zones, which correspond to undifferentiated sands and gravels to clays on the geological map. The Google Earth images revealed remarkable erosion along both Lekki and Lekki Phase I, with the destruction of a lagoon in Lekki Phase I. Based on the DEM map and the geology, elevations of 0-2 m, 2-5 m and 5-10 m asl were interpreted as highly risky, moderately risky and risky, respectively. Considering the factors threatening coastal regions, the erosion and the destruction of the lagoon along the Nigerian coast may be ascribed to sea level rise as a result of global warming and to intense human activities, respectively.
Implementing a geographical information system to assess endemic fluoride areas in Lamphun, Thailand
Theerawasttanasiri, Nonthaphat; Taneepanichskul, Surasak; Pingchai, Wichain; Nimchareon, Yuwaree; Sriwichai, Sangworn
2018-01-01
Introduction Many studies have shown that fluoride can cross the placenta and that exposure to high fluoride during pregnancy may result in premature birth and/or a low birth weight. Lamphun is one of six provinces in Thailand where natural water fluoride (WF) concentrations >10.0 mg/L were found, and it was also found that >50% of households used water with high fluoride levels. Nevertheless, a geographical information system (GIS) and maps of endemic fluoride areas are lacking. We aimed to measure the fluoride level of village water supplies to assess endemic fluoride areas and present the GIS with maps in Google Maps. Methods A cross-sectional survey was conducted from July 2016 to January 2017. Purposive sampling was used to identify villages of districts with WF >10.0 mg/L in the Mueang Lamphun, Pasang, and Ban Thi districts. Water samples were collected with the geolocation measured by Smart System Info. Fluoride was analyzed with an ion-selective electrode instrument using a total ionic strength adjustment buffer. WF >0.70 mg/L was used to identify unsafe drinking water and areas with high endemic fluoride levels. Descriptive statistics were used to describe the findings, and MS Excel was used to create the GIS database. Maps were created in Google Earth and presented in Google Maps. Results We found that WF concentrations ranged between 0.10–13.60 mg/L. Forty-four percent (n=439) of samples were at unsafe levels (>0.70 mg/L), and 54% (n=303) of villages and 46% (n=79,807) of households used the unsafe drinking water. Fifty percent (n=26) of subdistricts were classified as being endemic fluoride areas. Five subdistricts were endemic fluoride areas, and in those, there were two subdistricts in which every household used unsafe drinking water. Conclusion These findings show the distribution of endemic fluoride areas and unsafe drinking water in Lamphun. This is useful for health policy authorities, local governments, and villagers and enables collaboration to resolve these issues. The GIS data are available at https://drive.google.com/open?id=1mi4Pvomf5xHZ1MQjK44pdp2xXFw&usp=sharing. PMID:29398924
Theerawasttanasiri, Nonthaphat; Taneepanichskul, Surasak; Pingchai, Wichain; Nimchareon, Yuwaree; Sriwichai, Sangworn
2018-01-01
Many studies have shown that fluoride can cross the placenta and that exposure to high fluoride during pregnancy may result in premature birth and/or a low birth weight. Lamphun is one of six provinces in Thailand where natural water fluoride (WF) concentrations >10.0 mg/L were found, and it was also found that >50% of households used water with high fluoride levels. Nevertheless, a geographical information system (GIS) and maps of endemic fluoride areas are lacking. We aimed to measure the fluoride level of village water supplies to assess endemic fluoride areas and present the GIS with maps in Google Maps. A cross-sectional survey was conducted from July 2016 to January 2017. Purposive sampling was used to identify villages of districts with WF >10.0 mg/L in the Mueang Lamphun, Pasang, and Ban Thi districts. Water samples were collected with the geolocation measured by Smart System Info. Fluoride was analyzed with an ion-selective electrode instrument using a total ionic strength adjustment buffer. WF >0.70 mg/L was used to identify unsafe drinking water and areas with high endemic fluoride levels. Descriptive statistics were used to describe the findings, and MS Excel was used to create the GIS database. Maps were created in Google Earth and presented in Google Maps. We found that WF concentrations ranged between 0.10-13.60 mg/L. Forty-four percent (n=439) of samples were at unsafe levels (>0.70 mg/L), and 54% (n=303) of villages and 46% (n=79,807) of households used the unsafe drinking water. Fifty percent (n=26) of subdistricts were classified as being endemic fluoride areas. Five subdistricts were endemic fluoride areas, and in those, there were two subdistricts in which every household used unsafe drinking water. These findings show the distribution of endemic fluoride areas and unsafe drinking water in Lamphun. This is useful for health policy authorities, local governments, and villagers and enables collaboration to resolve these issues. The GIS data are available at https://drive.google.com/open?id=1mi4Pvomf5xHZ1MQjK44pdp2xXFw&usp=sharing.
Moonshot Laboratories' Lava Relief Google Mapping Project
NASA Astrophysics Data System (ADS)
Brennan, B.; Tomita, M.
2016-12-01
The Moonshot Laboratories were conceived at the University Laboratory School (ULS) on Oahu, Hawaii as a way to develop creative problem solvers able to resourcefully apply 21st century technologies to respond to the problems and needs of their communities. One example involved students from ULS using modern mapping and imaging technologies to assist peers who had been displaced from their own school in Pahoa on the Big Island of Hawaii. During 2015, lava flows from the eruption of Kilauea Volcano were slowly encroaching into the district of Puna. The lava flow was cutting the main town of Pahoa in half, leaving no safe routes of passage into or out of the town. One elementary school in the path of the flow was closed entirely and a new one was erected north of the flow for students living on that side. Pahoa High School students and teachers living to the north had been forced to leave their school and transfer to Kea'au High School. These students were separated from friends, family and the community they grew up in and were thrust into a foreign environment that until then had been their local rival. Using Google Mapping technologies, Moonshot Laboratories students created a dynamic map to introduce the incoming Pahoa students to their new school in Kea'au. Elements included a stylized My Maps basemap, YouTube video descriptions of the buildings, videos recorded with Google Glass showing first-person experiences, and immersive images of classrooms created with 360° cameras. During the first day of orientation at Kea'au, each of the 200 Pahoa students was given a tablet to view the map as they toured and got to know their new campus. The methods, technologies and, more importantly, the innovative thinking used to create this map have enormous potential for educating all students about the world around us and the issues facing it. http://www.moonshotincubator.com/
Visualizing Mars data and imagery with Google Earth
NASA Astrophysics Data System (ADS)
Beyer, R. A.; Broxton, M.; Gorelick, N.; Hancher, M.; Lundy, M.; Kolb, E.; Moratto, Z.; Nefian, A.; Scharff, T.; Weiss-Malik, M.
2009-12-01
There is a vast store of planetary geospatial data that has been collected by NASA but is difficult to access and visualize. Virtual globes have revolutionized the way we visualize and understand the Earth, but other planetary bodies including Mars and the Moon can be visualized in similar ways. Extraterrestrial virtual globes are poised to revolutionize planetary science, bring an exciting new dimension to science education, and allow ordinary users to explore imagery being sent back to Earth by planetary science satellites. The original Google Mars Web site allowed users to view base maps of Mars via the Web, but it did not have the full features of the 3D Google Earth client. We have previously demonstrated the use of Google Earth to display Mars imagery, but now with the launch of Mars in Google Earth, there is a base set of Mars data available for anyone to work from and add to. There are a variety of global maps to choose from and display. The Terrain layer has the MOLA gridded data topography, and where available, HRSC terrain models are mosaicked into the topography. In some locations there is also meter-scale terrain derived from HiRISE stereo imagery. There is rich information in the form of the IAU nomenclature database, data for the rovers and landers on the surface, and a Spacecraft Imagery layer which contains the image outlines for all HiRISE, CTX, CRISM, HRSC, and MOC image data released to the PDS and links back to their science data. There are also features like the Traveler's Guide to Mars, Historic Maps, Guided Tours, as well as the 'Live from Mars' feature, which shows the orbital tracks of both the Mars Odyssey and Mars Reconnaissance Orbiter for a few days in the recent past. It shows where they have acquired imagery, and also some preview image data. These capabilities have obvious public outreach and education benefits, but the potential benefits of allowing planetary scientists to rapidly explore these large and varied data collections—in geological context and within a single user interface—are also becoming evident. Because anyone can produce additional KML content for use in Google Earth, scientists can customize the environment to their needs as well as publish their own processed data and results for others to use. Many scientists and organizations have begun to do this already, resulting in a useful and growing collection of planetary-science-oriented Google Earth layers.
A campus-based course in field geology
NASA Astrophysics Data System (ADS)
Richard, G. A.; Hanson, G. N.
2009-12-01
GEO 305: Field Geology offers students practical experience in the field and in the computer laboratory conducting geological field studies on the Stony Brook University campus. Computer laboratory exercises feature mapping techniques and field studies of glacial and environmental geology, and include geophysical and hydrological analysis, interpretation, and mapping. Participants learn to use direct measurement and mathematical techniques to compute the location and geometry of features and gain practical experience in representing raster imagery and vector geographic data as features on maps. Data collecting techniques in the field include the use of hand-held GPS devices, compasses, ground-penetrating radar, tape measures, pacing, and leveling devices. Assignments that utilize these skills and techniques include mapping campus geology with GPS, using Google Earth to explore our geologic context, data file management and ArcGIS, tape and compass mapping of woodland trails, pace and compass mapping of woodland trails, measuring elevation differences on a hillside, measuring geologic sections and cores, drilling through glacial deposits, using ground penetrating radar on glaciotectonic topography, mapping the local water table, and the identification and mapping of boulders. Two three-hour sessions are offered per week, apportioned as needed between lecture; discussion; guided hands-on instruction in geospatial and other software such as ArcGIS, Google Earth, spreadsheets, and custom modules such as an arc intersection calculator; outdoor data collection and mapping; and writing of illustrated reports.
Aanensen, David M; Huntley, Derek M; Feil, Edward J; al-Own, Fada'a; Spratt, Brian G
2009-09-16
Epidemiologists and ecologists often collect data in the field and, on returning to their laboratory, enter their data into a database for further analysis. The recent introduction of mobile phones that utilise the open source Android operating system, and which include (among other features) both GPS and Google Maps, provides new opportunities for developing mobile phone applications which, in conjunction with web applications, allow two-way communication between field workers and their project databases. Here we describe a generic framework, consisting of mobile phone software, EpiCollect, and a web application located within www.spatialepidemiology.net. Data collected by multiple field workers can be submitted by phone, together with GPS data, to a common web database and can be displayed and analysed, along with previously collected data, using Google Maps (or Google Earth). Similarly, data from the web database can be requested and displayed on the mobile phone, again using Google Maps. Data filtering options allow the display of data submitted by individual field workers or, for example, those data within certain values of a measured variable or within a time period. Data collection frameworks utilising mobile phones, with data submission to and from central databases, are widely applicable and can give a field worker display and analysis tools on their mobile phone similar to those they would have when viewing the data in their laboratory via the web. We demonstrate their utility for epidemiological data collection and display, and briefly discuss their application in ecological and community data collection. Furthermore, such frameworks offer great potential for recruiting 'citizen scientists' to contribute data easily to central databases through their mobile phones.
NASA Astrophysics Data System (ADS)
Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii
2017-02-01
Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large scale crop mapping requires processing and management of large amounts of heterogeneous satellite imagery acquired by various sensors, which consequently leads to a “Big Data” problem. The main objective of this study is to explore the efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery, with the potential to apply the platform at a larger scale (e.g. country level) and to multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory (approximately 28,100 km2 and 1.0 M ha of cropland). Though this study does not involve large volumes of data, it does address the efficiency of the GEE platform in executing the complex satellite data processing workflows required by large scale applications such as crop mapping. The study discusses strengths and weaknesses of the classifiers, assesses the accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to a benchmark classifier using a neural network approach that was developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (north of Ukraine) in 2013. We found that Google Earth Engine (GEE) provides very good performance in terms of enabling access to remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed the support vector machine (SVM), decision tree and random forest classifiers available in GEE.
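A compressed sketch of the kind of classifier comparison described above, training two of GEE's built-in classifiers on the same samples and classifying a seasonal composite. The asset ID, band list, dates, and property names are placeholders, and the classifier constructor names reflect the current Earth Engine Python API rather than the exact calls used in the study.

```python
import ee
ee.Initialize()

bands = ["B2", "B3", "B4", "B5", "B6", "B7"]            # example Landsat-8 SR bands
composite = (ee.ImageCollection("LANDSAT/LC08/C01/T1_SR")
             .filterDate("2013-05-01", "2013-09-30")
             .median()
             .select(bands))

# Hypothetical ground-truth polygons with a numeric "crop" class property.
truth = ee.FeatureCollection("users/example/jecam_ukraine_ground_truth")
samples = composite.sampleRegions(collection=truth, properties=["crop"], scale=30)

for name, clf in [("Random forest", ee.Classifier.smileRandomForest(100)),
                  ("SVM", ee.Classifier.libsvm())]:
    trained = clf.train(features=samples, classProperty="crop", inputProperties=bands)
    crop_map = composite.classify(trained)
    # Resubstitution accuracy only; a held-out validation set would be used in practice.
    matrix = samples.classify(trained).errorMatrix("crop", "classification")
    print(name, "training accuracy:", matrix.accuracy().getInfo())
```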
Using Google Earth for Submarine Operations at Pavilion Lake
NASA Astrophysics Data System (ADS)
Deans, M. C.; Lees, D. S.; Fong, T.; Lim, D. S.
2009-12-01
During the July 2009 Pavilion Lake field test, we supported submarine "flight" operations using Google Earth. The Intelligent Robotics Group at NASA Ames has experience with ground data systems for NASA missions, earth analog field tests, disaster response, and the Gigapan camera system. Leveraging this expertise and existing software, we put together a set of tools to support sub tracking and mapping, called the "Surface Data System." This system supports flight planning, real time flight operations, and post-flight analysis. For planning, we make overlays of the regional bedrock geology, sonar bathymetry, and sonar backscatter maps that show geology, depth, and structure of the bottom. Placemarks show the mooring locations for start and end points. Flight plans are shown as polylines with icons for waypoints. Flight tracks and imagery from previous field seasons are embedded in the map for planning follow-on activities. These data provide context for flight planning. During flights, sub position is updated every 5 seconds from the nav computer on the chase boat. We periodically update tracking KML files and refresh them with network links. A sub icon shows current location of the sub. A compass rose shows bearings to indicate heading to the next waypoint. A "Science Stenographer" listens on the voice loop and transcribes significant observations in real time. Observations called up to the surface immediately appear on the map as icons with date, time, position, and what was said. After each flight, the science back room immediately has the flight track and georeferenced notes from the pilots. We add additional information in post-processing. The submarines record video continuously, with "event" timestamps marked by the pilot. We cross-correlate the event timestamps with position logs to geolocate events and put a preview image and compressed video clip into the map. Animated flight tracks are also generated, showing timestamped position and providing timelapse playback of the flight. Neogeography tools are increasing in popularity and offer an excellent platform for geoinformatics. The scientists on the team are already familiar with Google Earth, eliminating up-front training on new tools. The flight maps and archived data are available immediately and in a usable format. Google Earth provides lots of measurement tools, annotation tools, and other built-in functions that we can use to create and analyze the map. All of this information is saved to a shared filesystem so that everyone on the team has access to all of the same map data. After the field season, the map data will be used by the team to analyse and correlate information from across the lake and across different flights to support their research, and to plan next year's activities.
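A simplified sketch of the network-link pattern described above: Google Earth loads a small NetworkLink file once, and a separate script rewrites the linked placemark file as new positions arrive, so the sub icon updates on each refresh. The file names, coordinates, and update loop are placeholders, not the Surface Data System's code.

```python
import time

PLACEMARK_FILE = "sub_position.kml"    # rewritten on every update cycle
NETWORK_LINK_FILE = "sub_tracker.kml"  # loaded once in Google Earth

network_link = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <NetworkLink>
    <name>Sub tracker</name>
    <Link>
      <href>{href}</href>
      <refreshMode>onInterval</refreshMode>
      <refreshInterval>5</refreshInterval>
    </Link>
  </NetworkLink>
</kml>"""

def write_position(lon, lat, label):
    """Overwrite the placemark file with the latest sub position."""
    kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{label}</name>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>
</kml>"""
    with open(PLACEMARK_FILE, "w", encoding="utf-8") as f:
        f.write(kml)

with open(NETWORK_LINK_FILE, "w", encoding="utf-8") as f:
    f.write(network_link.format(href=PLACEMARK_FILE))

# Placeholder feed: in the field this would read the chase boat's nav computer.
for lon, lat in [(-121.744, 50.865), (-121.745, 50.866), (-121.746, 50.867)]:
    write_position(lon, lat, time.strftime("%H:%M:%S"))
    time.sleep(5)
```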
Hurricane Ike Deposits on the Bolivar Peninsula, Galveston Bay, Texas
NASA Astrophysics Data System (ADS)
Evans, C. A.; Wilkinson, M. J.; Eppler, D.
2011-12-01
In September 2008, Hurricane Ike made landfall on Galveston Bay, close to the NASA Johnson Space Center (JSC). The storm flooded much of the area with a storm surge ranging from 11-20 feet. The Bolivar peninsula, the southeastern coast of Galveston Bay, experienced the brunt of the surge. Several agencies collected excellent imagery baselines before the storm and complementary data a few days afterward that helped define the impacts of the storm. In April of 2011, a team of scientists and astronauts from JSC conducted field mapping exercises along the Bolivar Peninsula, the section of the Galveston Bay coast most impacted by the storm. Astronauts routinely observe and document coastal changes from orbit aboard the International Space Station. As part of their basic Earth Science training, scientists at the Johnson Space Center take astronauts out for field mapping exercises so that they can better recognize and understand features and processes that they will later observe from the International Space Station. Using pre-storm baseline images of the Bolivar Peninsula near Rollover Pass and Gilchrist (NOAA/Google Earth Imagery and USGS aerial imagery and lidar data), the astronauts mapped current coastline positions at defined locations, and related their findings to specific coastal characteristics, including channel, jetties, and other developments. In addition to mapping, we dug trenches along both the Gulf of Mexico coast as well as the Galveston Bay coast of the Bolivar peninsula to determine the depth of the scouring from the storm on the Gulf side, and the amount of deposition of the storm surge deposits on the Bay side of the peninsula. The storm signature was easy to identify by sharp sediment transitions and, in the case of storm deposits, a layer of storm debris (roof shingles, PVC pipes, etc) and black, organic rich layers containing buried sea grasses in areas that were marshes before the storm. The amount of deposition was generally about 20-25 cm; the local areas experiencing obvious deposition are readily obvious in post-Ike imagery of the region. We used a March 2010 aerial photograph from the NOAA-Google Earth collection because construction and vegetation recovery was minimal. Based on the before and after aerial imagery and the trenching data collected over two days, we can begin to characterize the material transported and deposited by Hurricane Ike along one stretch of the Bolivar peninsula. We summarize the results from our mapping and trenching data. The basic data collected 2.5 years after the storm are ephemeral as the storm deposits become reworked and overprinted by coastal processes, vegetation regrowth and reconstruction.
Hurricane Ike Deposits on the Bolivar Peninsula, Galveston Bay, Texas
NASA Technical Reports Server (NTRS)
Evans, Cynthia A.; Wilkinson, M. J.; Eppler, Dean
2011-01-01
In September 2008, Hurricane Ike made landfall on Galveston Bay, close to the NASA Johnson Space Center (JSC). The storm flooded much of the area with a storm surge ranging from 11-20 feet. The Bolivar peninsula, the southeastern coast of Galveston Bay, experienced the brunt of the surge. Several agencies collected excellent imagery baselines before the storm and complementary data a few days afterward that helped define the impacts of the storm. In April of 2011, a team of scientists and astronauts from JSC conducted field mapping exercises along the Bolivar Peninsula, the section of the Galveston Bay coast most impacted by the storm. Astronauts routinely observe and document coastal changes from orbit aboard the International Space Station. As part of their basic Earth Science training, scientists at the Johnson Space Center take astronauts out for field mapping exercises so that they can better recognize and understand features and processes that they will later observe from the International Space Station. Using pre-storm baseline images of the Bolivar Peninsula near Rollover Pass and Gilchrist (NOAA/Google Earth Imagery and USGS aerial imagery and lidar data), the astronauts mapped current coastline positions at defined locations, and related their findings to specific coastal characteristics, including channel, jetties, and other developments. In addition to mapping, we dug trenches along both the Gulf of Mexico coast as well as the Galveston Bay coast of the Bolivar peninsula to determine the depth of the scouring from the storm on the Gulf side, and the amount of deposition of the storm surge deposits on the Bay side of the peninsula. The storm signature was easy to identify by sharp sediment transitions and, in the case of storm deposits, a layer of storm debris (roof shingles, PVC pipes, etc) and black, organic rich layers containing buried sea grasses in areas that were marshes before the storm. The amount of deposition was generally about 20-25 cm; the local areas experiencing obvious deposition are readily obvious in post-Ike imagery of the region. We used a March 2010 aerial photograph from the NOAA-Google Earth collection because construction and vegetation recovery was minimal. Based on the before and after aerial imagery and the trenching data collected over two days, we can begin to characterize the material transported and deposited by Hurricane Ike along one stretch of the Bolivar peninsula. We summarize the results from our mapping and trenching data. The basic data collected 2.5 years after the storm are ephemeral as the storm deposits become reworked and overprinted by coastal processes, vegetation regrowth and reconstruction.
Association between Stock Market Gains and Losses and Google Searches
Arditi, Eli; Yechiam, Eldad; Zahavi, Gal
2015-01-01
Experimental studies in the area of Psychology and Behavioral Economics have suggested that people change their search pattern in response to positive and negative events. Using Internet search data provided by Google, we investigated the relationship between stock-specific events and related Google searches. We studied daily data from 13 stocks from the Dow-Jones and NASDAQ100 indices, over a period of 4 trading years. Focusing on periods in which stocks were extensively searched (Intensive Search Periods), we found a correlation between the magnitude of stock returns at the beginning of the period and the volume, peak, and duration of search generated during the period. This relation between magnitudes of stock returns and subsequent searches was considerably magnified in periods following negative stock returns. Yet, we did not find that intensive search periods following losses were associated with more Google searches than periods following gains. Thus, rather than increasing search, losses improved the fit between people’s search behavior and the extent of real-world events triggering the search. The findings demonstrate the robustness of the attentional effect of losses. PMID:26513371
Streets? Where We're Going, We Don't Need Streets
NASA Astrophysics Data System (ADS)
Bailey, J.
2017-12-01
In 2007 Google Street View started as a project to provide 360-degree imagery along streets, but in the decade since it has evolved into a platform through which to explore everywhere from the slopes of Everest, to the middle of the Amazon rainforest, to under the ocean. As camera technology has evolved, it has also become a tool for ground truthing maps and has provided scientific observations, storytelling and education. The Google Street View "special collects" team has undertaken increasingly challenging projects across 80+ countries and every continent, culminating in possibly the most ambitious collection yet: the capture of Street View on board the International Space Station. Learn about the preparation and obstacles behind this and other special collects. Explore these datasets through both Google Earth and Google Expeditions VR, an educational tool that takes students on virtual field trips using 360-degree imagery.
Scales, David; Zelenev, Alexei; Brownstein, John S.
2013-01-01
Background This is the first study quantitatively evaluating the effect that media-related limitations have on data from an automated epidemic intelligence system. Methods We modeled time series of HealthMap's two main data feeds, Google News and Moreover, to test for evidence of two potential limitations: first, human resources constraints, and second, high-profile outbreaks “crowding out” coverage of other infectious diseases. Results Google News events declined by 58.3%, 65.9%, and 14.7% on Saturday, Sunday and Monday, respectively, relative to other weekdays. Events were reduced by 27.4% during Christmas/New Years weeks and 33.6% lower during American Thanksgiving week than during an average week for Google News. Moreover data yielded similar results with the addition of Memorial Day (US) being associated with a 36.2% reduction in events. Other holiday effects were not statistically significant. We found evidence for a crowd out phenomenon for influenza/H1N1, where a 50% increase in influenza events corresponded with a 4% decline in other disease events for Google News only. Other prominent diseases in this database – avian influenza (H5N1), cholera, or foodborne illness – were not associated with a crowd out phenomenon. Conclusions These results provide quantitative evidence for the limited impact of editorial biases on HealthMap's web-crawling epidemic intelligence. PMID:24206612
Using GeoRSS feeds to distribute house renting and selling information based on Google map
NASA Astrophysics Data System (ADS)
Nong, Yu; Wang, Kun; Miao, Lei; Chen, Fei
2007-06-01
Geographically Encoded Objects RSS (GeoRSS) is a way to encode location in RSS feeds. RSS is a widely supported format for syndication of news and weblogs, and is extendable to publish any sort of itemized data. As weblogs have exploded and RSS feeds have become new portals, geo-tagged feeds are needed to show the location each story refers to. GeoRSS adopts the core of the RSS framework, expressing map annotations in the RSS XML format. The case studied illustrates that GeoRSS can be maximally concise in representation and conception, so it is simple to generate GeoRSS feeds and then mash them up with Google Maps through the API, showing real estate information together with other attributes in the information window. After subscribing to feeds on subjects of interest, users can easily check for new bulletins shown on the map through syndication. The primary design goal of GeoRSS is to make spatial data creation as easy as regular Web content development. Beyond that, its simplicity and effectiveness successfully bridge the gap between traditional GIS professionals and amateurs, Web map hackers, and the numerous services that enable location-based content.
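A minimal sketch of a GeoRSS-Simple feed item of the kind described above, with an invented rental listing; a Google Maps mash-up page could then read the feed and drop a marker at each georss:point.

```python
# Minimal RSS 2.0 feed with one GeoRSS-Simple item (invented listing data).
listing = {"title": "2-room apartment for rent",
           "desc": "80 m2, 3rd floor, 2500/month",
           "lat": 31.23, "lon": 121.47}   # placeholder coordinates

feed = f"""<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:georss="http://www.georss.org/georss">
  <channel>
    <title>House renting and selling bulletins</title>
    <link>http://example.org/listings</link>
    <description>Geo-tagged real estate announcements</description>
    <item>
      <title>{listing["title"]}</title>
      <description>{listing["desc"]}</description>
      <georss:point>{listing["lat"]} {listing["lon"]}</georss:point>
    </item>
  </channel>
</rss>"""

with open("listings_georss.xml", "w", encoding="utf-8") as f:
    f.write(feed)
```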
Visualizing Moon Data and Imagery with Google Earth
NASA Astrophysics Data System (ADS)
Weiss-Malik, M.; Scharff, T.; Nefian, A.; Moratto, Z.; Kolb, E.; Lundy, M.; Hancher, M.; Gorelick, N.; Broxton, M.; Beyer, R. A.
2009-12-01
There is a vast store of planetary geospatial data that has been collected by NASA but is difficult to access and visualize. Virtual globes have revolutionized the way we visualize and understand the Earth, but other planetary bodies including Mars and the Moon can be visualized in similar ways. Extraterrestrial virtual globes are poised to revolutionize planetary science, bring an exciting new dimension to science education, and allow ordinary users to explore imagery being sent back to Earth by planetary science satellites. The original Google Moon Web site was a limited series of maps and Apollo content. The new Moon in Google Earth feature provides a virtual planet experience for the Moon similar to the one available for the Earth and Mars. We incorporated existing Clementine and Lunar Orbiter imagery for the basemaps and a combination of Kaguya LALT topography and some terrain created from Apollo Metric and Panoramic images. We also have information about the Apollo landings and other robotic landers on the surface, as well as historic maps and charts, and guided tours. Some of the first-released LROC imagery of the Apollo landing sites has been put in place, and we look forward to incorporating more data as it is released from LRO, Chandrayaan-1, and Kaguya. These capabilities have obvious public outreach and education benefits, but the potential benefits of allowing planetary scientists to rapidly explore these large and varied data collections — in geological context and within a single user interface — are also becoming evident. Because anyone can produce additional KML content for use in Google Earth, scientists can customize the environment to their needs as well as publish their own processed data and results for others to use. Many scientists and organizations have begun to do this already, resulting in a useful and growing collection of planetary-science-oriented Google Earth layers. (Figure: screen shot of Moon in Google Earth, a freely downloadable application for visualizing Moon imagery and data.)
Real-time bus location monitoring using Arduino
NASA Astrophysics Data System (ADS)
Ibrahim, Mohammad Y. M.; Audah, Lukman
2017-09-01
The Internet of Things (IoT) is the network of objects, such as vehicles, mobile devices, and buildings, that have electronic components, software, and network connectivity enabling them to collect data, run commands, and be controlled through the Internet. Controlling physical items from the Internet increases efficiency and saves time. The growing number of devices used by people increases the practicality of having IoT devices on the market. The IoT is also an opportunity to develop products that can save money and time and increase work efficiency. Real-time bus location systems are needed, especially on university campuses. Such a system can easily find accurate bus locations, the distances between bus stops, and the estimated time to reach a new location. The system is separated into two parts, hardware and software. The hardware parts are the Arduino Uno and a Global Positioning System (GPS) module, while Google Earth and GpsGate are the software parts. The GPS continuously takes input data from the satellites and stores the latitude and longitude values in the Arduino Uno. To track the vehicle, the longitude and latitude are sent as a message to the Google Earth software, which converts them into a map for navigation. Once the Arduino Uno is activated, it takes the last received latitude and longitude values from GpsGate and sends a message to Google Earth. Once the message has been sent to Google Earth, the current location is shown and navigation is activated automatically. The view is then broadcast to users using ManyCam, Google+ Hangouts, YouTube, and Facebook. An additional feature uses Google Forms to capture problems faced by students, so that issues can be raised immediately with the responsible department. After several successful simulations, the results are shown in real time on a map.
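For illustration, here is a rough sketch (in Python rather than Arduino C++) of the coordinate handling such a pipeline needs: parsing a GPGGA sentence from a GPS module into decimal degrees that can then be written into a KML placemark or sent to a mapping service. The sample sentence is a generic example, not output from the described hardware.

```python
def nmea_to_decimal(value, hemisphere):
    """Convert NMEA ddmm.mmmm / dddmm.mmmm to signed decimal degrees."""
    degrees = int(value // 100)
    minutes = value - degrees * 100
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

def parse_gpgga(sentence):
    """Extract latitude and longitude from a $GPGGA sentence."""
    fields = sentence.split(",")
    lat = nmea_to_decimal(float(fields[2]), fields[3])
    lon = nmea_to_decimal(float(fields[4]), fields[5])
    return lat, lon

# Generic example sentence (not from the described hardware).
sample = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
lat, lon = parse_gpgga(sample)
print(f"Latitude {lat:.5f}, longitude {lon:.5f}")  # approximately 48.11730, 11.51667
```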
Chang, Aileen Y; Parrales, Maria E; Jimenez, Javier; Sobieszczyk, Magdalena E; Hammer, Scott M; Copenhaver, David J; Kulkarni, Rajan P
2009-01-01
Background Dengue fever is a mosquito-borne illness that places significant burden on tropical developing countries with unplanned urbanization. A surveillance system using Google Earth and GIS mapping technologies was developed in Nicaragua as a management tool. Methods and Results Satellite imagery of the town of Bluefields, Nicaragua captured from Google Earth was used to create a base-map in ArcGIS 9. Indices of larval infestation, locations of tire dumps, cemeteries, large areas of standing water, etc. that may act as larval development sites, and locations of the homes of dengue cases collected during routine epidemiologic surveying were overlaid onto this map. Visual imagery of the location of dengue cases, larval infestation, and locations of potential larval development sites were used by dengue control specialists to prioritize specific neighborhoods for targeted control interventions. Conclusion This dengue surveillance program allows public health workers in resource-limited settings to accurately identify areas with high indices of mosquito infestation and interpret the spatial relationship of these areas with potential larval development sites such as garbage piles and large pools of standing water. As a result, it is possible to prioritize control strategies and to target interventions to highest risk areas in order to eliminate the likely origin of the mosquito vector. This program is well-suited for resource-limited settings since it utilizes readily available technologies that do not rely on Internet access for daily use and can easily be implemented in many developing countries for very little cost. PMID:19627614
Kamadjeu, Raoul
2009-01-01
Background The use of GIS in public health is growing, a consequence of a rapidly evolving technology and increasing accessibility to a wider audience. Google Earth™ (GE) is becoming an important mapping infrastructure for public health. However, generating traditional public health maps for GE is still beyond the reach of most public health professionals. In this paper, we explain, through the example of polio eradication activities in the Democratic Republic of Congo, how we used GE as a planning tool, and we share the methods used to generate public health maps. Results The use of GE improved field operations and resulted in better dispatch of vaccination teams and allocation of resources. It also allowed the creation of high-quality maps for advocacy and training and helped us understand the spatiotemporal relationship between all the entities involved in the polio outbreak and response. Conclusion GE has the potential of making mapping available to a new set of public health users in developing countries. High quality and free satellite imagery and rich features, including Keyhole Markup Language and image overlays, provide a flexible yet powerful platform that sets it apart from traditional GIS tools, and this power has yet to be fully harnessed by public health professionals. PMID:19161606
Using Google Maps to Access USGS Volcano Hazards Information
NASA Astrophysics Data System (ADS)
Venezky, D. Y.; Snedigar, S.; Guffanti, M.; Bailey, J. E.; Wall, B. G.
2006-12-01
The U.S. Geological Survey (USGS) Volcano Hazard Program (VHP) is revising the information architecture of our website to provide data within a geospatial context for emergency managers, educators, landowners in volcanic areas, researchers, and the general public. Using a map-based interface for displaying hazard information provides a synoptic view of volcanic activity along with the ability to quickly ascertain where hazards are in relation to major population and infrastructure centers. At the same time, the map interface provides a gateway for educators and the public to find information about volcanoes in their geographic context. A plethora of data visualization solutions are available that are flexible, customizable, and can be run on individual websites. We are currently using a Google map interface because it can be accessed immediately from a website (a downloadable viewer is not required), and it provides simple features for moving around and zooming within the large map area that encompasses U.S. volcanism. A text interface will also be available. The new VHP website will serve as a portal to information for each volcano the USGS monitors with icons for alert levels and aviation color codes. When a volcano is clicked, a window will provide additional information including links to maps, images, and real-time data, thereby connecting information from individual observatories, the Smithsonian Institution, and our partner universities. In addition to the VHP home page, many observatories and partners have detailed graphical interfaces to data and images that include the activity pages for the Alaska Volcano Observatory, the Smithsonian Google Earth files, and Yellowstone Volcano Observatory pictures and data. Users with varied requests such as raw data, scientific papers, images, or brief overviews expect to be able to quickly access information for their specialized needs. Over the next few years we will be gathering, cleansing, reorganizing, and posting data in multiple formats to meet these needs.
A Google Glass navigation system for ultrasound and fluorescence dual-mode image-guided surgery
NASA Astrophysics Data System (ADS)
Zhang, Zeshu; Pei, Jing; Wang, Dong; Hu, Chuanzhen; Ye, Jian; Gan, Qi; Liu, Peng; Yue, Jian; Wang, Benzhong; Shao, Pengfei; Povoski, Stephen P.; Martin, Edward W.; Yilmaz, Alper; Tweedle, Michael F.; Xu, Ronald X.
2016-03-01
Surgical resection remains the primary curative intervention for cancer treatment. However, the occurrence of a residual tumor after resection is very common, leading to the recurrence of the disease and the need for re-resection. We develop a surgical Google Glass navigation system that combines near infrared fluorescent imaging and ultrasonography for intraoperative detection of tumor sites and assessment of surgical resection boundaries, as well as for guiding sentinel lymph node (SLN) mapping and biopsy. The system consists of a monochromatic CCD camera, a computer, a Google Glass wearable headset, an ultrasonic machine and an array of LED light sources. All the above components, except the Google Glass, are connected to a host computer by a USB or HDMI port. A wireless connection is established between the glass and the host computer for image acquisition and data transport tasks. A control program is written in C++ to call OpenCV functions for image calibration, processing and display. The technical feasibility of the system is tested in both tumor-simulating phantoms and in a human subject. When the system is used for simulated phantom resection tasks, the tumor boundaries, invisible to the naked eye, can be clearly visualized with the surgical Google Glass navigation system. This system has also been used in an IRB-approved protocol in a single patient during SLN mapping and biopsy at the First Affiliated Hospital of Anhui Medical University, demonstrating the ability to successfully localize and resect all apparent SLNs. In summary, our tumor-simulating phantom and human subject studies have demonstrated the technical feasibility of successfully using the proposed goggle navigation system during cancer surgery.
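The abstract states that the authors' control program is written in C++ with OpenCV; as a rough, hypothetical illustration of the display step only, the Python/OpenCV sketch below blends a near-infrared fluorescence frame onto a visible-light frame so that bright (fluorescent) regions are highlighted in green. The file names and the simple resize-based registration are assumptions, not the published method.

    import cv2
    import numpy as np

    visible = cv2.imread("visible_frame.png")                         # BGR colour frame (assumed input)
    nir = cv2.imread("fluorescence_frame.png", cv2.IMREAD_GRAYSCALE)  # NIR fluorescence intensity

    # Crude registration: resize the NIR frame to the visible frame's dimensions.
    nir = cv2.resize(nir, (visible.shape[1], visible.shape[0]))
    nir = cv2.normalize(nir, None, 0, 255, cv2.NORM_MINMAX)

    overlay = np.zeros_like(visible)
    overlay[:, :, 1] = nir                                            # fluorescence shown in the green channel

    fused = cv2.addWeighted(visible, 0.7, overlay, 0.6, 0)            # weighted blend for display
    cv2.imwrite("fused_frame.png", fused)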
Towards a geospatial wikipedia
NASA Astrophysics Data System (ADS)
Fritz, S.; McCallum, I.; Schill, C.; Perger, C.; Kraxner, F.; Obersteiner, M.
2009-04-01
Based on the Google Earth (http://earth.google.com) platform we have developed a geospatial Wikipedia (geo-wiki.org). The tool allows everybody in the world to contribute to spatial validation and is made available to the internet community interested in that task. We illustrate how this tool can be used for different applications. In our first application we combine uncertainty hotspot information from three global land cover datasets (GLC, MODIS, GlobCover). With an ever increasing amount of high resolution imagery available on Google Earth, it is becoming increasingly possible to distinguish land cover features with a high degree of accuracy. We first direct the land cover validation community to hotspots of land cover uncertainty and then ask them to fill in a small popup menu indicating the type of land cover, optionally a picture at that location facing the different cardinal points, as well as the date and the type of validation chosen (Google Earth imagery/Panoramio, or whether the person has ground truth data). We have implemented the tool via a land cover validation community on Facebook, based on a snowball system that allows tracking of individuals and the ability to ignore users who misuse the system. In a second application we illustrate how the tool could be used for mapping malaria occurrence and small water bodies, as well as overall malaria risk. For this application we have implemented polygon and attribute functions using Google Maps along with Virtual Earth via OpenLayers. The third application deals with illegal logging and how an alert system for illegal logging detection within a certain land tenure system could be implemented. Here we show how the tool can be used to document illegal logging via a YouTube video.
NASA Astrophysics Data System (ADS)
Werts, Joshua D.; Mikhailova, Elena A.; Post, Christopher J.; Sharp, Julia L.
2012-04-01
Volunteered geographic information and social networking in a WebGIS has the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may be otherwise unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.
Analysis of world terror networks from the reduced Google matrix of Wikipedia
NASA Astrophysics Data System (ADS)
El Zant, Samer; Frahm, Klaus M.; Jaffrès-Runser, Katia; Shepelyansky, Dima L.
2018-01-01
We apply the reduced Google matrix method to analyze interactions between 95 terrorist groups and determine their relationships and influence on 64 world countries. This is done on the basis of the Google matrix of the English Wikipedia (2017), composed of 5 416 537 articles which accumulate a great part of global human knowledge. The reduced Google matrix takes into account the direct and hidden links between a selection of 159 nodes (articles) appearing due to all paths of a random surfer moving over the whole network. As a result we obtain the network structure of terrorist groups and their relations with selected countries, including hidden indirect links. Using the sensitivity of PageRank to a weight variation of specific links, we determine the geopolitical sensitivity and influence of specific terrorist groups on world countries. World maps of the sensitivity of various countries to the influence of specific terrorist groups are obtained. We argue that this approach can find useful application in the analysis of more extensive and detailed databases.
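For readers unfamiliar with the underlying construction, the sketch below builds the full Google matrix G = alpha*S + (1 - alpha)/N for a toy 4-node network and runs the power iteration that yields the PageRank vector; the reduced Google matrix of the paper is then derived from such a G for a selected subset of nodes. The adjacency matrix here is invented purely for illustration.

    import numpy as np

    A = np.array([[0, 1, 1, 0],
                  [0, 0, 1, 0],
                  [1, 0, 0, 1],
                  [0, 0, 1, 0]], dtype=float)   # A[i, j] = 1 if article i links to article j (toy data)

    N = A.shape[0]
    out_deg = A.sum(axis=1)
    # S: row-stochastic link matrix; dangling nodes (no out-links) spread weight uniformly.
    S = np.where(out_deg[:, None] > 0, A / np.maximum(out_deg, 1)[:, None], 1.0 / N)

    alpha = 0.85
    G = alpha * S + (1 - alpha) / N              # Google matrix

    p = np.full(N, 1.0 / N)
    for _ in range(100):                         # power iteration for the PageRank vector
        p = p @ G
        p /= p.sum()
    print(p)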
Ahmetovic, Dragan; Manduchi, Roberto; Coughlan, James M.; Mascetti, Sergio
2016-01-01
In this paper we propose a computer vision-based technique that mines existing spatial image databases for discovery of zebra crosswalks in urban settings. Knowing the location of crosswalks is critical for a blind person planning a trip that includes street crossing. By augmenting existing spatial databases (such as Google Maps or OpenStreetMap) with this information, a blind traveler may make more informed routing decisions, resulting in greater safety during independent travel. Our algorithm first searches for zebra crosswalks in satellite images; all candidates thus found are validated against spatially registered Google Street View images. This cascaded approach enables fast and reliable discovery and localization of zebra crosswalks in large image datasets. While fully automatic, our algorithm could also be complemented by a final crowdsourcing validation stage for increased accuracy. PMID:26824080
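A minimal sketch of the satellite-image screening stage, not the authors' algorithm: it flags an image tile as a crosswalk candidate when it contains several elongated bright blobs sharing roughly the same orientation, a crude proxy for zebra stripes. Thresholds, file names and parameters are assumptions.

    import cv2
    import numpy as np

    def crosswalk_candidate(tile_path, min_stripes=4):
        """Return True if a satellite tile contains a group of parallel bright bands."""
        gray = cv2.imread(tile_path, cv2.IMREAD_GRAYSCALE)
        # Bright road markings stand out against asphalt; Otsu picks the split point.
        _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        angles = []
        for c in contours:
            if cv2.contourArea(c) < 30:          # ignore noise specks
                continue
            (_, _), (w, h), angle = cv2.minAreaRect(c)
            if min(w, h) == 0:
                continue
            if max(w, h) / min(w, h) > 3:        # keep elongated (stripe-like) blobs only
                angles.append(angle % 180)
        # Stripes belonging to one crosswalk share roughly the same orientation.
        return len(angles) >= min_stripes and np.std(angles) < 10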
Application based on ArcObject inquiry and Google maps demonstration to real estate database
NASA Astrophysics Data System (ADS)
Hwang, JinTsong
2007-06-01
The real estate industry in Taiwan has been flourishing in recent years. Acquiring varied and abundant information about real estate for sale is a common goal of consumers and brokerages. Therefore, before looking at a property, it is important to get all the pertinent information possible. Not only is this beneficial for the real estate agent, who can provide sellers with the most complete information and thereby solidify the buyer's interest, but it may also save time and manpower costs when something is out of place. Most brokerage sites utilize the Internet as a form of publicity; however, the content is limited to the specific property itself and the query functions mostly provide only searching by condition. This paper proposes a website query interface that offers zone queries by spatial analysis for non-GIS users, developed as a user-friendly interface with ArcObjects in VB6, together with query by condition. The query results can be shown on a web page that embeds Google Maps and UrMap API functions. In addition, the query results are presented in a multimedia fashion, including a hyperlink to Google Earth showing the property's surroundings, a virtual reality scene of the house, a panorama of the building interior, and so on. The website therefore provides an additional spatial solution for querying and presenting abundant real estate information in both two-dimensional and three-dimensional views.
Chien, Tsair-Wei; Chang, Yu; Wang, Hsien-Yi
2018-02-01
Many researchers have used the National Health Insurance database to publish medical papers, which are often retrospective, population-based cohort studies. However, the authors' research domains and academic characteristics remain unclear. By searching the PubMed database (Pubmed.com), we used the keywords [Taiwan] and [National Health Insurance Research Database], then downloaded 2913 articles published from 1995 to 2017. Social network analysis (SNA), the Gini coefficient, and Google Maps were applied to these data to visualize: the most productive author; the pattern of coauthor collaboration teams; and the author's research domain denoted by abstract keywords and PubMed MESH (medical subject heading) terms. Utilizing the 2913 papers from Taiwan's National Health Insurance database, we chose the top 10 research teams shown on Google Maps and analyzed one author (Dr. Kao) who published 149 papers in the database in 2015. In the past 15 years, we found Dr. Kao had 2987 connections with other coauthors from 13 research teams. The co-occurring abstract keywords with the highest frequency are cohort study and National Health Insurance Research Database. The most coexistent MESH terms are tomography, X-ray computed, and positron-emission tomography. The concentration of the author's research domain is very low (Gini < 0.40). SNA incorporated with Google Maps and the Gini coefficient provides insight into the relationships between entities. The results obtained in this study can be applied for a comprehensive understanding of other productive authors in academia.
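For reference, the sketch below computes a Gini coefficient over an author's per-topic publication counts, the kind of concentration measure the abstract reports (Gini < 0.40 indicating a dispersed research domain); the counts are invented.

    import numpy as np

    def gini(values):
        """Gini coefficient of a 1-D array of non-negative counts (0 = even, 1 = concentrated)."""
        x = np.sort(np.asarray(values, dtype=float))
        n = x.size
        if n == 0 or x.sum() == 0:
            return 0.0
        cum = np.cumsum(x)
        return (n + 1 - 2 * (cum / cum[-1]).sum()) / n

    keyword_counts = [40, 25, 20, 10, 5]   # papers per MESH term (illustrative only)
    print(round(gini(keyword_counts), 2))  # ~0.34, i.e. below the 0.40 threshold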
ERIC Educational Resources Information Center
McMahon, Don; Cihak, David F.; Wright, Rachel
2015-01-01
The purpose of this study was to examine the effects of location-based augmented reality navigation compared to Google Maps and paper maps as navigation aids for students with disabilities. The participants in this single subject study were three college students with intellectual disability and one college student with autism spectrum disorder.…
In the current study, three Google Street View cars were equipped with the Aclima Environmental Intelligence ™ Platform. The air pollutants of interest, including O3, NO, NO2, CO2, black carbon, and particle number in several size ranges, were measured using a suite of fast...
The Internet and the Google Age: Introduction
ERIC Educational Resources Information Center
James, Jonathan D.
2014-01-01
In this introductory chapter, the author begins by looking at the Internet from an historical and communication perspective in an effort to understand its significance in the contemporary world. Then he gives an overview of the most searched topics on the Internet and identifies prospects that have opened up and perils that lurk in the information…
Opinion: High-Quality Mathematics Resources as Public Goods
ERIC Educational Resources Information Center
Russo, James
2017-01-01
James Russo begins a discussion of the difficulty and time-consuming activity of Googling to find lesson plans and resources to keep his lessons more interesting and engaging, since such resources seem particularly scarce for math teachers. Russo writes that joining professional associations has given him ready access to higher quality resources…
2013-01-01
Background Molecular biology knowledge can be formalized and systematically represented in a computer-readable form as a comprehensive map of molecular interactions. There exist an increasing number of maps of molecular interactions containing detailed and step-wise description of various cell mechanisms. It is difficult to explore these large maps, to organize discussion of their content and to maintain them. Several efforts were recently made to combine these capabilities together in one environment, and NaviCell is one of them. Results NaviCell is a web-based environment for exploiting large maps of molecular interactions, created in CellDesigner, allowing their easy exploration, curation and maintenance. It is characterized by a combination of three essential features: (1) efficient map browsing based on Google Maps; (2) semantic zooming for viewing different levels of details or of abstraction of the map and (3) integrated web-based blog for collecting community feedback. NaviCell can be easily used by experts in the field of molecular biology for studying molecular entities of interest in the context of signaling pathways and crosstalk between pathways within a global signaling network. NaviCell allows both exploration of detailed molecular mechanisms represented on the map and a more abstract view of the map up to a top-level modular representation. NaviCell greatly facilitates curation, maintenance and updating the comprehensive maps of molecular interactions in an interactive and user-friendly fashion due to an imbedded blogging system. Conclusions NaviCell provides user-friendly exploration of large-scale maps of molecular interactions, thanks to Google Maps and WordPress interfaces, with which many users are already familiar. Semantic zooming which is used for navigating geographical maps is adopted for molecular maps in NaviCell, making any level of visualization readable. In addition, NaviCell provides a framework for community-based curation of maps. PMID:24099179
Kuperstein, Inna; Cohen, David P A; Pook, Stuart; Viara, Eric; Calzone, Laurence; Barillot, Emmanuel; Zinovyev, Andrei
2013-10-07
Molecular biology knowledge can be formalized and systematically represented in a computer-readable form as a comprehensive map of molecular interactions. There exist an increasing number of maps of molecular interactions containing detailed and step-wise description of various cell mechanisms. It is difficult to explore these large maps, to organize discussion of their content and to maintain them. Several efforts were recently made to combine these capabilities together in one environment, and NaviCell is one of them. NaviCell is a web-based environment for exploiting large maps of molecular interactions, created in CellDesigner, allowing their easy exploration, curation and maintenance. It is characterized by a combination of three essential features: (1) efficient map browsing based on Google Maps; (2) semantic zooming for viewing different levels of details or of abstraction of the map and (3) integrated web-based blog for collecting community feedback. NaviCell can be easily used by experts in the field of molecular biology for studying molecular entities of interest in the context of signaling pathways and crosstalk between pathways within a global signaling network. NaviCell allows both exploration of detailed molecular mechanisms represented on the map and a more abstract view of the map up to a top-level modular representation. NaviCell greatly facilitates curation, maintenance and updating the comprehensive maps of molecular interactions in an interactive and user-friendly fashion due to an imbedded blogging system. NaviCell provides user-friendly exploration of large-scale maps of molecular interactions, thanks to Google Maps and WordPress interfaces, with which many users are already familiar. Semantic zooming which is used for navigating geographical maps is adopted for molecular maps in NaviCell, making any level of visualization readable. In addition, NaviCell provides a framework for community-based curation of maps.
Leveraging Google Geo Tools for Interactive STEM Education: Insights from the GEODE Project
NASA Astrophysics Data System (ADS)
Dordevic, M.; Whitmeyer, S. J.; De Paor, D. G.; Karabinos, P.; Burgin, S.; Coba, F.; Bentley, C.; St John, K. K.
2016-12-01
Web-based imagery and geospatial tools have transformed our ability to immerse students in global virtual environments. Google's suite of geospatial tools, such as Google Earth (± Engine), Google Maps, and Street View, allow developers and instructors to create interactive and immersive environments, where students can investigate and resolve common misconceptions in STEM concepts and natural processes. The GEODE (.net) project is developing digital resources to enhance STEM education. These include virtual field experiences (VFEs), such as an interactive visualization of the breakup of the Pangaea supercontinent, a "Grand Tour of the Terrestrial Planets," and GigaPan-based VFEs of sites like the Canadian Rockies. Web-based challenges, such as EarthQuiz (.net) and the "Fold Analysis Challenge," incorporate scaffolded investigations of geoscience concepts. EarthQuiz features web-hosted imagery, such as Street View, Photo Spheres, GigaPans, and Satellite View, as the basis for guided inquiry. In the Fold Analysis Challenge, upper-level undergraduates use Google Earth to evaluate a doubly-plunging fold at Sheep Mountain, WY. GEODE.net also features: "Reasons for the Seasons"—a Google Earth-based visualization that addresses misconceptions that abound amongst students, teachers, and the public, many of whom believe that seasonality is caused by large variations in Earth's distance from the Sun; "Plate Euler Pole Finder," which helps students understand rotational motion of tectonic plates on the globe; and "Exploring Marine Sediments Using Google Earth," an exercise that uses empirical data to explore the surficial distribution of marine sediments in the modern ocean. The GEODE research team includes the authors and: Heather Almquist, Cinzia Cervato, Gene Cooper, Helen Crompton, Terry Pavlis, Jen Piatek, Bill Richards, Jeff Ryan, Ron Schott, Barb Tewksbury, and their students and collaborating colleagues. We are supported by NSF DUE 1323419 and a Google Geo Curriculum Award.
Mapping for the masses: using free remote sensing data for disaster management
NASA Astrophysics Data System (ADS)
Teeuw, R.; McWilliam, N.; Morris, N.; Saunders, C.
2009-04-01
We examine the uses of free satellite imagery and Digital Elevation Models (DEMs) for disaster management, targeting three data sources: the United Nations Charter on Space and Disasters, Google Earth and internet-based satellite data archives, such as the Global Land Cover Facility (GLCF). The research has assessed SRTM and ASTER DEM data, Landsat TM/ETM+ and ASTER imagery, as well as utilising datasets and basic GIS operations available via Google Earth. As an aid to Disaster Risk Reduction, four sets of maps can be produced from satellite data: (i) Multiple Geohazards: areas prone to slope instability, coastal inundation and fluvial flooding; (ii) Vulnerability: population density, habitation types, land cover types and infrastructure; (iii) Disaster Risk: produced by combining severity scores from (i) and (ii); (iv) Reconstruction: zones of rock/sediment with construction uses; areas of woodland (for fuel/construction); water sources; transport routes; zones suitable for re-settlement. This set of Disaster Risk Reduction maps is ideal for regional (1:50,000 to 1:250,000 scale) planning in low-income countries: more detailed assessments require relatively expensive high resolution satellite imagery or aerial photography, although Google Earth has a good track record for posting high-resolution imagery of disaster zones (e.g. the 2008 Burma storm surge). The Disaster Risk maps highlight areas of maximum risk to a region's emergency planners and decision makers, enabling various types of public education and other disaster mitigation measures. The Reconstruction map also helps to save lives, by facilitating disaster recovery. Many problems have been identified. Access to the UN Charter imagery is fine after a disaster, but very difficult if assessing pre-disaster indicators: the data supplied also tend to be pre-processed, whereas some relief agencies would prefer to have raw data. The limited and expensive internet access in many developing countries restricts access to archives of free satellite data, such as the GLCF. Finally, data integration, spatial/temporal analysis and map production are all hindered by the high price of most GIS software, making the development of suitable open-source software a priority.
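A minimal sketch of step (iii), combining severity scores into a Disaster Risk layer, assuming hazard and vulnerability have already been scored on a common grid; the 3x3 grids and class breaks below are invented.

    import numpy as np

    hazard = np.array([[1, 2, 3],
                       [2, 3, 3],
                       [1, 1, 2]])         # 1 = low, 3 = high geohazard severity (toy grid)

    vulnerability = np.array([[1, 1, 2],
                              [3, 2, 1],
                              [2, 3, 1]])  # population / infrastructure exposure (toy grid)

    risk = hazard * vulnerability          # simple multiplicative combination of severity scores
    risk_class = np.digitize(risk, bins=[3, 6])   # 0 = low, 1 = moderate, 2 = high
    print(risk_class)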
Google Maps for Crowdsourced Emergency Routing
NASA Astrophysics Data System (ADS)
Nedkov, S.; Zlatanova, S.
2012-08-01
Gathering infrastructure data in emergency situations is challenging. The areas affected by a disaster are often large and the needed observations numerous. Spaceborne remote sensing techniques cover large areas, but they are of limited use as their field of view may be blocked by clouds, smoke, buildings, highways, etc. Remote sensing products furthermore require specialists to collect and analyze the data. This contrasts with the nature of the damage detection problem: almost everyone is capable of observing whether a street is usable or not. The crowd is fit for solving these challenges as its members are numerous, willing to help, and often in the vicinity of the disaster, thereby forming a highly dispersed sensor network. This paper proposes and implements a small WebGIS application for performing shortest path calculations based on crowdsourced information about infrastructure health. The application is built on top of Google Maps and uses its routing service to calculate the shortest distance between two locations. Impassable areas are indicated on a map by people performing in-situ observations on a mobile device, and by users on a desktop machine who consult a multitude of information sources.
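A minimal sketch of the routing idea using the networkx package instead of the Google Maps routing service: street segments reported impassable by the crowd are dropped from the graph before the shortest path is computed. The node names, weights and reports are invented.

    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from([
        ("depot", "a", 2.0), ("a", "b", 1.5), ("b", "hospital", 2.5),
        ("depot", "c", 3.0), ("c", "hospital", 4.0),
    ])

    crowd_reports = [("a", "b")]            # segments flagged as blocked by in-situ observers

    for u, v in crowd_reports:
        if G.has_edge(u, v):
            G.remove_edge(u, v)             # remove impassable segments before routing

    route = nx.shortest_path(G, "depot", "hospital", weight="weight")
    print(route)                            # ['depot', 'c', 'hospital']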
Do, Nhan V; Barnhill, Rick; Heermann-Do, Kimberly A; Salzman, Keith L; Gimbel, Ronald W
2011-01-01
To design, build, implement, and evaluate a personal health record (PHR), tethered to the Military Health System, that leverages Microsoft® HealthVault and Google® Health infrastructure based on user preference. A pilot project was conducted in 2008-2009 at Madigan Army Medical Center in Tacoma, Washington. Our PHR was architected to a flexible platform that incorporated standards-based models of Continuity of Document and Continuity of Care Record to map Department of Defense-sourced health data, via a secure Veterans Administration data broker, to Microsoft® HealthVault and Google® Health based on user preference. The project design and implementation were guided by provider and patient advisory panels with formal user evaluation. The pilot project included 250 beneficiary users. Approximately 73.2% of users were < 65 years of age, and 38.4% were female. Of the users, 169 (67.6%) selected Microsoft® HealthVault, and 81 (32.4%) selected Google® Health as their PHR of preference. Sample evaluation of users reflected 100% (n = 60) satisfied with convenience of record access and 91.7% (n = 55) satisfied with overall functionality of PHR. Key lessons learned related to data-transfer decisions (push vs pull), purposeful delays in reporting sensitive information, understanding and mapping PHR use and clinical workflow, and decisions on information patients may choose to share with their provider. Currently PHRs are being viewed as empowering tools for patient activation. Design and implementation issues (eg, technical, organizational, information security) are substantial and must be thoughtfully approached. Adopting standards into design can enhance the national goal of portability and interoperability.
Use of Openly Available Satellite Images for Remote Sensing Education
NASA Astrophysics Data System (ADS)
Wang, C.-K.
2011-09-01
With the advent of Google Earth, Google Maps, and Microsoft Bing Maps, high resolution satellite imagery is becoming more easily accessible than ever. College students may already have extensive experience with high resolution satellite imagery through these software packages and web services prior to any formal remote sensing education. Remote sensing education should therefore be adjusted to the fact that the audience is already a consumer of remote sensing products (through the use of the above-mentioned services). This paper reports the use of openly available satellite imagery in an introductory-level remote sensing course in the Department of Geomatics of National Cheng Kung University as a term project. Experience from the fall of 2009 and 2010 shows that this term project has effectively aroused the students' enthusiasm for remote sensing.
In campus location finder using mobile application services
NASA Astrophysics Data System (ADS)
Fai, Low Weng; Audah, Lukman
2017-09-01
Navigation services have become very common in this era; applications include Google Maps, Waze, and others. Although navigation applications provide the main routing service in open areas, not all buildings are recorded in their databases. In this project, an application was made for indoor and outdoor navigation in Universiti Tun Hussein Onn Malaysia (UTHM). It helps outsiders and new incoming students by navigating them from their current location to a destination using a mobile application named "U Finder". The Thunkable website was used to build the application for outdoor and indoor navigation. Outdoor navigation is linked to Google Maps, and indoor navigation uses QR codes for positioning and routing pictures for guidance. Outdoor navigation can route users to the main faculties in UTHM, while indoor navigation is implemented only for the G1 building in UTHM.
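A rough sketch of the indoor step as described (a QR code for positioning, a stored routing picture for guidance), assuming OpenCV's QR detector and an invented payload-to-picture table; this is an illustration, not the actual "U Finder" implementation built with Thunkable.

    import cv2

    ROUTE_PICTURES = {                       # QR payload -> routing picture (hypothetical table)
        "G1-ENTRANCE": "routes/g1_entrance_to_lab.png",
        "G1-LEVEL2": "routes/g1_level2_to_office.png",
    }

    frame = cv2.imread("scanned_qr.png")     # photo of the QR code posted at the user's position
    payload, _, _ = cv2.QRCodeDetector().detectAndDecode(frame)

    if payload in ROUTE_PICTURES:
        print("Show routing picture:", ROUTE_PICTURES[payload])
    else:
        print("Unknown indoor location; fall back to Google Maps for outdoor routing")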
USGS Coastal and Marine Geology Survey Data in Google Earth
NASA Astrophysics Data System (ADS)
Reiss, C.; Steele, C.; Ma, A.; Chin, J.
2006-12-01
The U.S. Geological Survey (USGS) Coastal and Marine Geology (CMG) program has a rich data catalog of geologic field activities and metadata called InfoBank, which has been a standard tool for researchers within and outside of the agency. Along with traditional web maps, the data are now accessible in Google Earth, which greatly expands the possible user audience. The Google Earth interface provides geographic orientation and panning/zooming capabilities to locate data relative to topography, bathymetry, and coastal areas. Viewing navigation against Google Earth's background imagery also answers questions such as why certain areas were not surveyed (often the presence of islands, shorelines, cliffs, etc.). Detailed box core subsample photos from selected sampling activities, published geotechnical data, and sample descriptions are now viewable in Google Earth (for example, the M-1-95-MB, P-2-95-MB, and P-1-97-MB box core samples). One example of the use of Google Earth is CMG's surveys of San Francisco's Ocean Beach since 2004. The surveys are conducted with an all-terrain vehicle (ATV) and shallow-water personal watercraft (PWC) equipped with Global Positioning System (GPS) receivers, and elevation and echo sounder data collectors. 3D topographic models with centimeter accuracy have been produced from these surveys to monitor beach and nearshore processes, including sand transport, sedimentation patterns, and seasonal trends. Using Google Earth, multiple track line data (examples: OB-1-05-CA and OB-2-05-CA) can be overlaid on beach imagery. The images also help explain the shape of track lines as objects are encountered.
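A minimal sketch, assuming the simplekml Python package: one survey track written as a KML line string so it can be overlaid on Google Earth imagery, much as the OB-1-05-CA and OB-2-05-CA track lines are. The coordinates are placeholders, not an actual Ocean Beach track.

    import simplekml

    # (longitude, latitude, altitude) vertices of one survey pass (placeholder values)
    track = [(-122.511, 37.760, 0), (-122.510, 37.762, 0), (-122.509, 37.764, 0)]

    kml = simplekml.Kml()
    line = kml.newlinestring(name="Ocean Beach survey track", coords=track)
    line.style.linestyle.width = 3
    line.style.linestyle.color = simplekml.Color.red
    kml.save("ob_track.kml")   # open the file in Google Earth to view the track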
Googling trends in conservation biology.
Proulx, Raphaël; Massicotte, Philippe; Pépino, Marc
2014-02-01
Web-crawling approaches, that is, automated programs data mining the internet to obtain information about a particular process, have recently been proposed for monitoring early signs of ecosystem degradation or for establishing crop calendars. However, lack of a clear conceptual and methodological framework has prevented the development of such approaches within the field of conservation biology. Our objective was to illustrate how Google Trends, a freely accessible web-crawling engine, can be used to track changes in timing of biological processes, spatial distribution of invasive species, and level of public awareness about key conservation issues. Google Trends returns the number of internet searches that were made for a keyword in a given region of the world over a defined period. Using data retrieved online for 13 countries, we exemplify how Google Trends can be used to study the timing of biological processes, such as the seasonal recurrence of pollen release or mosquito outbreaks across a latitudinal gradient. We mapped the spatial extent of results from Google Trends for 5 invasive species in the United States and found geographic patterns in invasions that are consistent with their coarse-grained distribution at state levels. From 2004 through 2012, Google Trends showed that the level of public interest and awareness about conservation issues related to ecosystem services, biodiversity, and climate change increased, decreased, and followed both trends, respectively. Finally, to further the development of research approaches at the interface of conservation biology, collective knowledge, and environmental management, we developed an algorithm that allows the rapid retrieval of Google Trends data. © 2013 Society for Conservation Biology.
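The authors developed their own retrieval algorithm; as a rough, unofficial illustration of the same idea, the sketch below uses the third-party pytrends package to pull search-interest data for an invasive-species keyword over the 2004-2012 study window.

    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US")
    pytrends.build_payload(["emerald ash borer"],
                           timeframe="2004-01-01 2012-12-31",
                           geo="US")

    interest = pytrends.interest_over_time()   # pandas DataFrame of weekly search interest
    print(interest.head())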
Participating in the Geospatial Web: Collaborative Mapping, Social Networks and Participatory GIS
NASA Astrophysics Data System (ADS)
Rouse, L. Jesse; Bergeron, Susan J.; Harris, Trevor M.
In 2005, Google, Microsoft and Yahoo! released free Web mapping applications that opened up digital mapping to mainstream Internet users. Importantly, these companies also released free APIs for their platforms, allowing users to geo-locate and map their own data. These initiatives have spurred the growth of the Geospatial Web and represent spatially aware online communities and new ways of enabling communities to share information from the bottom up. This chapter explores how the emerging Geospatial Web can meet some of the fundamental needs of Participatory GIS projects to incorporate local knowledge into GIS, as well as promote public access and collaborative mapping.
Harnessing Satellite Imageries in Feature Extraction Using Google Earth Pro
NASA Astrophysics Data System (ADS)
Fernandez, Sim Joseph; Milano, Alan
2016-07-01
Climate change has been a long-standing concern worldwide. Impending flooding, for one, is among its unwanted consequences. The Phil-LiDAR 1 project of the Department of Science and Technology (DOST), Republic of the Philippines, has developed an early warning system for flood hazards. The project utilizes remote sensing technologies to identify the population at risk by mapping and attributing building features using LiDAR datasets and satellite imagery. A free mapping software named Google Earth Pro (GEP) is used to load these satellite images as base maps. Geotagging of building features has so far been done with handheld Global Positioning System (GPS) units. Alternatively, mapping and attribution of building features using GEP saves a substantial amount of resources such as manpower, time and budget. Accuracy-wise, geotagging in GEP depends on the satellite imagery or on the half-meter-resolution orthophotographs obtained during LiDAR acquisition, rather than on GPS units with three-meter accuracy. The attributed building features are overlaid on the Phil-LiDAR 1 flood hazard map in order to determine the exposed population. The building features obtained from satellite imagery may be used not only in flood exposure assessment but also in assessing other hazards, along with a number of other uses. Several other features may also be extracted from the satellite imagery.
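A minimal sketch of the exposure overlay, assuming the geopandas package and hypothetical file names: building footprints digitized over GEP imagery are intersected with the Phil-LiDAR 1 flood-hazard polygons to count potentially exposed buildings.

    import geopandas as gpd

    buildings = gpd.read_file("buildings_from_gep.geojson")   # features digitized in Google Earth Pro (assumed file)
    flood = gpd.read_file("flood_hazard_phil_lidar1.shp")     # flood hazard polygons (assumed file)

    flood = flood.to_crs(buildings.crs)                       # put both layers in the same coordinate system
    exposed = gpd.sjoin(buildings, flood, how="inner", predicate="intersects")
    print(len(exposed), "buildings fall inside a flood hazard zone")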
NASA Astrophysics Data System (ADS)
Carraro, Francesco
"Mars @ ASDC" is a project born with the goal of using the new web technologies to assist researches involved in the study of Mars. This project employs Mars map and javascript APIs provided by Google to visualize data acquired by space missions on the planet. So far, visualization of tracks acquired by MARSIS and regions observed by VIRTIS-Rosetta has been implemented. The main reason for the creation of this kind of tool is the difficulty in handling hundreds or thousands of acquisitions, like the ones from MARSIS, and the consequent difficulty in finding observations related to a particular region. This led to the development of a tool which allows to search for acquisitions either by defining the region of interest through a set of geometrical parameters or by manually selecting the region on the map through a few mouse clicks The system allows the visualization of tracks (acquired by MARSIS) or regions (acquired by VIRTIS-Rosetta) which intersect the user defined region. MARSIS tracks can be visualized both in Mercator and polar projections while the regions observed by VIRTIS can presently be visualized only in Mercator projection. The Mercator projection is the standard map provided by Google. The polar projections are provided by NASA and have been developed to be used in combination with APIs provided by Google The whole project has been developed following the "open source" philosophy: the client-side code which handles the functioning of the web page is written in javascript; the server-side code which executes the searches for tracks or regions is written in PHP and the DB which undergoes the system is MySQL.
Naive (commonsense) geography and geobrowser usability after ten years of Google Earth
NASA Astrophysics Data System (ADS)
Hamerlinck, J. D.
2016-04-01
In 1995, the concept of ‘naive geography’ was formally introduced as an area of cognitive geographic information science representing ‘the body of knowledge that people have about the surrounding geographic world’ and reflecting ‘the way people think and reason about geographic space and time, both consciously and subconsciously’. The need to incorporate such commonsense knowledge and reasoning into design of geospatial technologies was identified but faced challenges in formalizing these relationships and processes in software implementation. Ten years later, the Google Earth geobrowser was released, marking the beginning of a new era of open access to, and application of, geographic data and information in society. Fast-forward to today, and the opportunity presents itself to take stock of twenty years of naive geography and a decade of the ubiquitous virtual globe. This paper introduces an ongoing research effort to explore the integration of naive (or commonsense) geography concepts in the Google Earth geobrowser virtual globe and their possible impact on Google Earth's usability, utility, and usefulness. A multi-phase methodology is described, combining usability reviews and usability testing with use-case scenarios involving the U.S.-Canadian Yellowstone to Yukon Initiative. Initial progress on a usability review combining cognitive walkthroughs and heuristics evaluation is presented.
Comparison of Genetic Algorithm and Hill Climbing for Shortest Path Optimization Mapping
NASA Astrophysics Data System (ADS)
Fronita, Mona; Gernowo, Rahmat; Gunawan, Vincencius
2018-02-01
The Traveling Salesman Problem (TSP) is an optimization problem: finding the shortest route that visits several destinations in one trip without passing through the same city twice and returns to the departure city; the process is applied to delivery systems. This comparison is done using two optimization methods, namely a genetic algorithm and hill climbing. Hill climbing works by repeatedly exchanging cities with their neighbours and directly accepting any new path whose distance is smaller than that of the previous path. Genetic algorithms depend on input parameters: the population size, the crossover probability, the mutation probability and the number of generations. To simplify the process of determining the shortest path, supporting software was developed that uses the Google Maps API. Tests were carried out 20 times each with 8, 16, 24 and 32 cities to see which method is optimal in terms of distance and computation time. Experiments conducted with 3, 4, 5 and 6 cities produced the same optimal distance for the genetic algorithm and hill climbing; the distances begin to differ at 7 cities. The overall results show that hill climbing is more effective for small numbers of cities, while problems with more than 30 cities are better optimized using genetic algorithms.
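A minimal hill-climbing sketch for the TSP using a segment-reversal (2-opt style) neighbour move on a made-up distance matrix; in the study the city-to-city distances came from the Google Maps API, and the genetic algorithm side is not shown here.

    import random

    dist = [[0, 2, 9, 10],
            [2, 0, 6, 4],
            [9, 6, 0, 3],
            [10, 4, 3, 0]]            # symmetric city-to-city distances (illustrative values)

    def tour_length(tour):
        return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

    def hill_climb(n_cities, iterations=1000):
        tour = list(range(n_cities))
        random.shuffle(tour)
        best = tour_length(tour)
        for _ in range(iterations):
            i, j = sorted(random.sample(range(n_cities), 2))
            candidate = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # reverse one segment
            length = tour_length(candidate)
            if length < best:                                           # accept improvements only
                tour, best = candidate, length
        return tour, best

    print(hill_climb(len(dist)))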
Fleischman, Ross J.; Lundquist, Mark; Jui, Jonathan; Newgard, Craig D.; Warden, Craig
2014-01-01
Objective To derive and validate a model that accurately predicts ambulance arrival time that could be implemented as a Google Maps web application. Methods This was a retrospective study of all scene transports in Multnomah County, Oregon, from January 1 through December 31, 2008. Scene and destination hospital addresses were converted to coordinates. ArcGIS Network Analyst was used to estimate transport times based on street network speed limits. We then created a linear regression model to improve the accuracy of these street network estimates using weather, patient characteristics, use of lights and sirens, daylight, and rush-hour intervals. The model was derived from a 50% sample and validated on the remainder. Significance of the covariates was determined by p < 0.05 for a t-test of the model coefficients. Accuracy was quantified by the proportion of estimates that were within 5 minutes of the actual transport times recorded by computer-aided dispatch. We then built a Google Maps-based web application to demonstrate application in real-world EMS operations. Results There were 48,308 included transports. Street network estimates of transport time were accurate within 5 minutes of actual transport time less than 16% of the time. Actual transport times were longer during daylight and rush-hour intervals and shorter with use of lights and sirens. Age under 18 years, gender, wet weather, and trauma system entry were not significant predictors of transport time. Our model predicted arrival time within 5 minutes 73% of the time. For lights and sirens transports, accuracy was within 5 minutes 77% of the time. Accuracy was identical in the validation dataset. Lights and sirens saved an average of 3.1 minutes for transports under 8.8 minutes, and 5.3 minutes for longer transports. Conclusions An estimate of transport time based only on a street network significantly underestimated transport times. A simple model incorporating few variables can predict ambulance time of arrival to the emergency department with good accuracy. This model could be linked to global positioning system data and an automated Google Maps web application to optimize emergency department resource use. Use of lights and sirens had a significant effect on transport times. PMID:23865736
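A minimal sketch of the modelling step, not the published model: actual transport time is regressed on the street-network estimate plus indicator covariates (lights and sirens, rush hour, daylight), and accuracy is summarised as the share of predictions within 5 minutes. The tiny data frame is invented.

    import pandas as pd
    from sklearn.linear_model import LinearRegression

    df = pd.DataFrame({
        "network_minutes": [6.2, 12.5, 8.0, 15.3, 9.1],   # street-network estimate (toy values)
        "lights_sirens":   [1, 0, 1, 0, 1],
        "rush_hour":       [0, 1, 0, 1, 1],
        "daylight":        [1, 1, 0, 1, 0],
        "actual_minutes":  [7.0, 16.8, 8.9, 19.0, 12.2],  # recorded by computer-aided dispatch (toy values)
    })

    X = df[["network_minutes", "lights_sirens", "rush_hour", "daylight"]]
    model = LinearRegression().fit(X, df["actual_minutes"])

    predicted = model.predict(X)
    within_5 = (abs(predicted - df["actual_minutes"]) <= 5).mean()
    print(f"{within_5:.0%} of estimates within 5 minutes")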
NASA Astrophysics Data System (ADS)
Zhang, X.; Wu, B.; Zhang, M.; Zeng, H.
2017-12-01
Rice is one of the main staple foods in East and Southeast Asia; it feeds more than half of the world's population while occupying 11% of cultivated land. Studies of rice can provide direct or indirect information on food security and water resource management. Remote sensing has proven to be the most effective method for monitoring cropland at large scale by using temporal and spectral information. Two main kinds of satellites have been used to map rice: microwave and optical. The main feature that distinguishes rice, as the main crop of paddy fields, from other crops is the flooding phenomenon at the planting stage (Figure 1). Microwave satellites can penetrate clouds and are efficient at monitoring the flooding phenomenon. Meanwhile, vegetation indices based on optical satellites can distinguish rice well from other vegetation. Google Earth Engine is a cloud-based platform that makes it easy to access high-performance computing resources for processing very large geospatial datasets. Google has collected a large amount of remote sensing satellite data from around the world, providing researchers with the possibility of building applications that use multi-source remote sensing data over large areas. In this work, we map rice planting area in south China through integration of Landsat-8 OLI, Sentinel-2, and Sentinel-1 Synthetic Aperture Radar (SAR) images. The flowchart is shown in Figure 2. First, a threshold method applied to the VH-polarized backscatter from the SAR sensor and to vegetation indices from the optical sensors, including the normalized difference vegetation index (NDVI) and enhanced vegetation index (EVI), was used to classify the rice extent map. The forest and water surface extent maps provided by Earth Engine were used to mask forest and water. To overcome the "salt and pepper" effect of pixel-based classification as the spatial resolution increases, we segment the optical image and merge the pixel-based classification results with the object-oriented segmentation, finally obtaining the rice extent map. At last, time series analysis was used to obtain the peak count for each rice area and determine the cropping intensity. Rice ground points from a GVG crowdsourcing smartphone application and rice area statistics from the National Bureau of Statistics were used to validate and evaluate our result.
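A minimal Google Earth Engine Python API sketch of the optical part of such a workflow; the dataset ID, band names, date range, region and NDVI threshold below are assumptions for illustration, not the authors' parameters.

    import ee

    ee.Initialize()

    aoi = ee.Geometry.Rectangle([112.0, 22.0, 114.0, 24.0])    # illustrative region in south China

    composite = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")  # Landsat 8 surface reflectance (assumed dataset ID)
                 .filterBounds(aoi)
                 .filterDate("2017-06-01", "2017-09-30")
                 .median())

    ndvi = composite.normalizedDifference(["SR_B5", "SR_B4"]).rename("NDVI")
    vegetation_mask = ndvi.gt(0.4)                              # candidate vegetated (possibly rice) pixels
    print(vegetation_mask.bandNames().getInfo())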
Integrating Radar Image Data with Google Maps
NASA Technical Reports Server (NTRS)
Chapman, Bruce D.; Gibas, Sarah
2010-01-01
A public Web site has been developed as a method for displaying the multitude of radar imagery collected by NASA's Airborne Synthetic Aperture Radar (AIRSAR) instrument during its 16-year mission. Utilizing NASA's internal AIRSAR site, the new Web site features more sophisticated visualization tools that enable the general public to have access to these images. The site was originally maintained at NASA on six computers: one that held the Oracle database, two that took care of the software for the interactive map, and three that were for the Web site itself. Several tasks were involved in moving this complicated setup to just one computer. First, the AIRSAR database was migrated from Oracle to MySQL. Then the back-end of the AIRSAR Web site was updated in order to access the MySQL database. To do this, a few of the scripts needed to be modified; specifically three Perl scripts that query that database. The database connections were then updated from Oracle to MySQL, numerous syntax errors were corrected, and a query was implemented that replaced one of the stored Oracle procedures. Lastly, the interactive map was designed, implemented, and tested so that users could easily browse and access the radar imagery through the Google Maps interface.
NASA Astrophysics Data System (ADS)
Manaud, Nicolas; Carter, John; Boix, Oriol
2016-10-01
The "Where On Mars?" project is essentially the evolution of an existing outreach product developed in collaboration between ESA and CartoDB; an interactive map visualisation of the ESA's ExoMars Rover candidate landing sites (whereonmars.co). Planetary imagery data and maps are increasingly produced by the scientific community, and shared typically as images, in scientific publications, presentations or public outreach websites. However, this media lacks of interactivity and contextual information available for further exploration, making it difficult for any audience to relate one location-based information to another. We believe that interactive web maps are a powerful way of telling stories, engaging with and educating people who, over the last decade, have become familiar with tools such as Google Maps. A few planetary web maps exist but they are either too complex for non-experts, or are closed-systems that do not allows anyone to publish and share content. The long-term vision for the project is to provide researchers, communicators, educators and a worldwide public with an open planetary mapping and social platform enabling them to create, share, communicate and consume research-based content. We aim for this platform to become the reference website everyone will go to learn about Mars and other planets in our Solar System; just like people head to Google Maps to find their bearings or any location-based information. The driver is clearly to create for people an emotional connection with Mars. The short-term objectives for the project are (1) to produce and curate an open repository of basemaps, geospatial data sets, map visualisations, and story maps; (2) to develop a beautifully crafted and engaging interactive map of Mars. Based on user-generated content, the underlying framework should (3) make it easy to create and share additional interactive maps telling specific stories.
Definitions of Quality in Higher Education: A Synthesis of the Literature
ERIC Educational Resources Information Center
Schindler, Laura; Puls-Elvidge, Sarah; Welzant, Heather; Crawford, Linda
2015-01-01
The aim of this paper is to provide a synthesis of the literature on defining quality in the context of higher education. During a search for relevant literature, the authors intentionally cast a wide net, beginning with a broad search in Google Scholar and followed by a narrower search in educational databases, including Academic Search Complete,…
Measuring Neighborhood Walkable Environments: A Comparison of Three Approaches
Chiang, Yen-Cheng; Sullivan, William; Larsen, Linda
2017-01-01
Multiple studies have revealed the impact of walkable environments on physical activity. Scholars attach considerable importance to leisure and health-related walking. Recent studies have used Google Street View as an instrument to assess city streets and walkable environments; however, no study has compared the validity of Google Street View assessments of walkable environment attributes to assessments made by local residents and compiled from field visits. In this study, we involved nearby residents and compared the extent to which Google Street View assessments of the walkable environment correlated with assessments from local residents and with field visits. We determined the assessment approaches (local resident or field visit assessments) that exhibited the highest agreement with Google Street View. One city with relatively high-quality walkable environments and one city with relatively low-quality walkable environments were examined, and three neighborhoods from each city were surveyed. Participants in each neighborhood used one of three approaches to assess the walkability of the environment: 15 local residents assessed the environment using a map, 15 participants made a field visit to assess the environment, and 15 participants used Google Street View to assess the environment, yielding a total of 90 valid samples for the two cities. Findings revealed that the three approaches to assessing neighborhood walkability were highly correlated for traffic safety, aesthetics, sidewalk quality, and physical barriers. Compared with assessments from participants making field visits, assessments by local residents were more highly correlated with Google Street View assessments. Google Street View provides a more convenient, low-cost, efficient, and safe approach to assess neighborhood walkability. The results of this study may facilitate future large-scale walkable environment surveys, effectively reduce expenses, and improve survey efficiency. PMID:28587186
Vanwolleghem, Griet; Van Dyck, Delfien; Ducheyne, Fabian; De Bourdeaudhuij, Ilse; Cardon, Greet
2014-06-10
Google Street View provides a valuable and efficient alternative to on-site fieldwork for observing the physical environment. However, studies on the use, reliability and validity of Google Street View in a cycling-to-school context are lacking. We aimed to study the intra- and inter-rater reliability and criterion validity of EGA-Cycling (Environmental Google Street View Based Audit - Cycling to school), a newly developed audit using Google Street View to assess the physical environment along cycling routes to school. Parents (n = 52) of 11-to-12-year-old Flemish children, who mostly cycled to school, completed a questionnaire and identified their child's cycling route to school on a street map. Fifty cycling routes of 11-to-12-year-olds were identified, and physical environmental characteristics along the identified routes were rated with EGA-Cycling (5 subscales; 37 items), based on Google Street View. To assess reliability, two researchers performed the audit. Criterion validity of the audit was examined by comparing the ratings based on Google Street View with ratings from on-site assessments. Intra-rater reliability was high (kappa range 0.47-1.00). Large variations in the inter-rater reliability (kappa range -0.03-1.00) and criterion validity scores (kappa range -0.06-1.00) were reported, with acceptable inter-rater reliability values for 43% of all items and acceptable criterion validity for 54% of all items. EGA-Cycling can be used to assess physical environmental characteristics along cycling routes to school. However, to assess the micro-environment specifically related to cycling, on-site assessments have to be added.
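For reference, a minimal sketch (assuming scikit-learn) of the agreement statistic used here: Cohen's kappa between two raters' item scores. The ratings below are invented.

    from sklearn.metrics import cohen_kappa_score

    rater_1 = [1, 0, 1, 1, 0, 1, 0, 1]   # one rater's scores on eight audit items (toy data)
    rater_2 = [1, 0, 1, 0, 0, 1, 0, 1]   # second rater's scores on the same items

    print(round(cohen_kappa_score(rater_1, rater_2), 2))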
Geolokit: An interactive tool for visualising and exploring geoscientific data in Google Earth
NASA Astrophysics Data System (ADS)
Triantafyllou, Antoine; Watlet, Arnaud; Bastin, Christophe
2017-10-01
Virtual globes have been developed to showcase different types of data combining a digital elevation model and basemaps of high resolution satellite imagery. Hence, they became a standard to share spatial data and information, although they suffer from a lack of toolboxes dedicated to the formatting of large geoscientific dataset. From this perspective, we developed Geolokit: a free and lightweight software that allows geoscientists - and every scientist working with spatial data - to import their data (e.g., sample collections, structural geology, cross-sections, field pictures, georeferenced maps), to handle and to transcribe them to Keyhole Markup Language (KML) files. KML files are then automatically opened in the Google Earth virtual globe and the spatial data accessed and shared. Geolokit comes with a large number of dedicated tools that can process and display: (i) multi-points data, (ii) scattered data interpolations, (iii) structural geology features in 2D and 3D, (iv) rose diagrams, stereonets and dip-plunge polar histograms, (v) cross-sections and oriented rasters, (vi) georeferenced field pictures, (vii) georeferenced maps and projected gridding. Therefore, together with Geolokit, Google Earth becomes not only a powerful georeferenced data viewer but also a stand-alone work platform. The toolbox (available online at http://www.geolokit.org) is written in Python, a high-level, cross-platform programming language and is accessible through a graphical user interface, designed to run in parallel with Google Earth, through a workflow that requires no additional third party software. Geolokit features are demonstrated in this paper using typical datasets gathered from two case studies illustrating its applicability at multiple scales of investigation: a petro-structural investigation of the Ile d'Yeu orthogneissic unit (Western France) and data collection of the Mariana oceanic subduction zone (Western Pacific).
Usability evaluation of cloud-based mapping tools for the display of very large datasets
NASA Astrophysics Data System (ADS)
Stotz, Nicole Marie
The elasticity and on-demand nature of cloud services have made it easier to create web maps. Users only need access to a web browser and the Internet to use cloud-based web maps, eliminating the need for specialized software. To encourage a wide variety of users, a map must be well designed; usability is a very important concept in designing a web map. Fusion Tables, a product from Google, is one example of newer cloud-based distributed GIS services. It allows for easy spatial data manipulation and visualization within the Google Maps framework. ESRI has also introduced a cloud-based version of their software, called ArcGIS Online, built on Amazon's EC2 cloud. Utilizing a user-centered design framework, two prototype maps were created with data from the San Diego East County Economic Development Council: one map built on Fusion Tables, and another on ESRI's ArcGIS Online. A usability analysis was conducted and used to compare both map prototypes in terms of design and functionality. Load tests were also run, and performance metrics gathered on both map prototypes. The usability analysis was taken by 25 geography students and consisted of time-based tasks and questions on map design and functionality. Survey participants completed the time-based tasks for the Fusion Tables map prototype more quickly than those of the ArcGIS Online map prototype. While responses were generally positive towards the design and functionality of both prototypes, overall the Fusion Tables map prototype was preferred. For the load tests, the data set was broken into 22 groups for a total of 44 tests. While the Fusion Tables map prototype performed more efficiently than the ArcGIS Online prototype, the differences are almost unnoticeable. A SWOT analysis was conducted for each prototype. The results from this research point to the Fusion Tables map prototype as the better option. A redesign of this prototype would incorporate design suggestions from the usability survey, while some functionality would need to be dropped. It is a free product and would therefore be the best option if cost is an issue, but this map may not be supported in the future.
DistMap: a toolkit for distributed short read mapping on a Hadoop cluster.
Pandey, Ram Vinay; Schlötterer, Christian
2013-01-01
With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/
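DistMap's own implementation is not shown in the abstract, but the core step it automates, splitting a large FASTQ file into chunks so that each chunk can be mapped by a separate cluster task, can be sketched in a few lines. The snippet below is a hedged, generic illustration (file name and chunk size are arbitrary), not DistMap code.

```python
# Illustrative sketch only -- not DistMap's implementation. It shows the general
# pattern such a workflow automates: splitting a FASTQ file into fixed-size chunks
# so each chunk can be dispatched to a separate mapper task on a cluster.
import itertools

def split_fastq(path, reads_per_chunk=1_000_000, prefix="chunk"):
    """Write chunks of a FASTQ file; each read occupies exactly 4 lines."""
    with open(path) as fin:
        chunk_id = 0
        while True:
            # Grab the next block of reads (4 lines per read).
            lines = list(itertools.islice(fin, reads_per_chunk * 4))
            if not lines:
                break
            out_name = f"{prefix}_{chunk_id:05d}.fastq"
            with open(out_name, "w") as fout:
                fout.writelines(lines)
            chunk_id += 1
            yield out_name

if __name__ == "__main__":
    for name in split_fastq("lane1_R1.fastq"):   # hypothetical input file
        print("would submit to a mapper task:", name)
```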
Learning GIS and exploring geolocated data with the all-in-one Geolokit toolbox for Google Earth
NASA Astrophysics Data System (ADS)
Watlet, A.; Triantafyllou, A.; Bastin, C.
2016-12-01
GIS software packages are today's essential tools to gather and visualize geological data, to apply spatial and temporal analysis and, finally, to create and share interactive maps for further investigations in geosciences. Such skills are especially essential for students who go through field trips, sample collections or field experiments. However, time is generally lacking to teach in detail all the aspects of visualizing geolocated geoscientific data. For these purposes, we developed Geolokit: a lightweight freeware dedicated to geodata visualization and written in Python, a high-level, cross-platform programming language. Geolokit is accessible through a graphical user interface designed to run in parallel with Google Earth, benefitting from its numerous interactive capabilities. It is designed as a very user-friendly toolbox that allows 'geo-users' to import their raw data (e.g. GPS, sample locations, structural data, field pictures, maps), to use fast data analysis tools and to visualize these in the Google Earth environment using KML code, with no requirement for third-party software except Google Earth itself. Geolokit comes with a large number of geoscience labels, symbols, colours and placemarks and can display several types of geolocated data, including: multi-point datasets; automatically computed contours of multi-point datasets via several interpolation methods; discrete planar and linear structural geology data in 2D or 3D, supporting a large range of structure input formats; clustered stereonets and rose diagrams; 2D cross-sections as vertical sections; georeferenced maps and grids with user-defined coordinates; and field pictures, using either geo-tagging metadata from a camera's built-in GPS module or the same-day track of an external GPS. In the end, Geolokit is helpful for quickly visualizing and exploring data without losing too much time in the numerous capabilities of GIS software suites. We invite students and teachers to discover all the functionalities of Geolokit. As this project is under development and planned to be open source, we welcome discussions regarding particular needs or ideas, as well as contributions to the Geolokit project.
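As a rough illustration of the kind of conversion Geolokit performs, the sketch below turns a table of geolocated samples into KML placemarks that Google Earth can display. The file name and column names are assumptions for the example; Geolokit's own KML output is richer (symbols, colours, structural data).

```python
# Minimal sketch of the kind of conversion Geolokit automates: turning a table of
# geolocated samples into KML placemarks for Google Earth. File and column names
# are hypothetical; Geolokit's actual implementation differs.
import csv
from xml.sax.saxutils import escape

KML_HEADER = ('<?xml version="1.0" encoding="UTF-8"?>\n'
              '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>')
KML_FOOTER = "</Document></kml>"

def samples_to_kml(csv_path, kml_path):
    with open(csv_path, newline="") as fin, open(kml_path, "w") as fout:
        fout.write(KML_HEADER)
        for row in csv.DictReader(fin):  # expects columns: name, lat, lon, note
            fout.write(
                "<Placemark>"
                f"<name>{escape(row['name'])}</name>"
                f"<description>{escape(row.get('note', ''))}</description>"
                # KML coordinates are longitude,latitude,altitude
                f"<Point><coordinates>{row['lon']},{row['lat']},0</coordinates></Point>"
                "</Placemark>"
            )
        fout.write(KML_FOOTER)

samples_to_kml("field_samples.csv", "field_samples.kml")  # open the result in Google Earth
```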
Large Scale Crop Mapping in Ukraine Using Google Earth Engine
NASA Astrophysics Data System (ADS)
Shelestov, A.; Lavreniuk, M. S.; Kussul, N.
2016-12-01
There are no globally available high-resolution satellite-derived crop-specific maps at present. Only coarse-resolution imagery (> 250 m spatial resolution) has been utilized to derive global cropland extent. In 2016 we are carrying out a country-level demonstration of Sentinel-2 use for crop classification in Ukraine within the ESA Sen2-Agri project. However, optical imagery can be contaminated by cloud cover, which makes it difficult to acquire imagery in an optimal time range to discriminate certain crops. Thanks to the Copernicus programme, a large amount of Sentinel-1 SAR data at high spatial resolution has been freely available for Ukraine since 2015, which allows us to use time series of SAR data for crop classification. Our experiment for one administrative region in 2015 showed much higher crop classification accuracy with SAR data than with optical-only time series [1, 2]. Therefore, in 2016, within the Google Earth Engine Research Award, we use SAR data together with optical data for large-area crop mapping (the entire territory of Ukraine) using the cloud computing capabilities available in Google Earth Engine (GEE). This study compares different classification methods for crop mapping over the whole territory of Ukraine using data and algorithms from GEE. Classification performance is assessed using overall classification accuracy, Kappa coefficients, and user's and producer's accuracies. Crop areas from the derived classification maps are also compared with official statistics [3]. S. Skakun et al., "Efficiency assessment of multitemporal C-band Radarsat-2 intensity and Landsat-8 surface reflectance satellite imagery for crop classification in Ukraine," IEEE Journal of Selected Topics in Applied Earth Observ. and Rem. Sens., 2015, DOI: 10.1109/JSTARS.2015.2454297. N. Kussul, S. Skakun, A. Shelestov, O. Kussul, "The use of satellite SAR imagery to crop classification in Ukraine within JECAM project," IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 1497-1500, 13-18 July 2014, Quebec City, Canada. F.J. Gallego, N. Kussul, S. Skakun, O. Kravchenko, A. Shelestov, O. Kussul, "Efficiency assessment of using satellite data for crop area estimation in Ukraine," International Journal of Applied Earth Observation and Geoinformation, vol. 29, pp. 22-30, 2014.
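For readers unfamiliar with GEE, the following is a minimal sketch, in the Earth Engine Python API, of a SAR-based supervised crop classification of the kind described above. The training asset ID, class property, date range and region are placeholders; the study's actual workflow and classifiers are more elaborate.

```python
# A minimal Google Earth Engine (Python API) sketch of SAR-based crop classification.
# The training asset ID and class property are hypothetical placeholders.
import ee
ee.Initialize()

region = ee.Geometry.Rectangle([30.0, 49.0, 32.0, 51.0])            # part of Ukraine (illustrative)
training = ee.FeatureCollection('users/example/crop_ground_truth')  # hypothetical asset

# Seasonal Sentinel-1 composite (VV/VH backscatter) used as classification features.
s1 = (ee.ImageCollection('COPERNICUS/S1_GRD')
        .filterBounds(region)
        .filterDate('2016-04-01', '2016-09-30')
        .filter(ee.Filter.eq('instrumentMode', 'IW'))
        .select(['VV', 'VH'])
        .median())

samples = s1.sampleRegions(collection=training, properties=['crop_class'], scale=10)
classifier = ee.Classifier.smileRandomForest(100).train(
    features=samples, classProperty='crop_class', inputProperties=['VV', 'VH'])
crop_map = s1.classify(classifier)   # per-pixel crop class over the region
```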
Results of Prospecting of Impact Craters in Morocco
NASA Astrophysics Data System (ADS)
Chaabout, S.; Chennaoui Aoudjehane, H.; Reimold, W. U.; Baratoux, D.
2014-09-01
This work is based on the use of Google Earth satellite images and Yahoo Maps scenes: we examined the surface of Morocco to locate structures with a circular morphology that could potentially be impact craters.
NASA Astrophysics Data System (ADS)
Davias, M. E.; Gilbride, J. L.
2011-12-01
Aerial photographs of Carolina bays taken in the 1930s sparked the initial research into their geomorphology. Satellite imagery available today through the Google Earth virtual globe facility expands the regions available for interrogation, but reveals only part of their unique planforms. Digital Elevation Maps (DEMs), using Light Detection And Ranging (LiDAR) remote sensing data, accentuate the visual presentation of these aligned ovoid shallow basins by emphasizing their robust circumferential rims. To support a geospatial survey of Carolina bay landforms in the continental USA, 400,000 km2 of hsv-shaded DEMs were created as KML-JPEG tile sets. A majority of these DEMs were generated with LiDAR-derived data. We demonstrate the tile generation process and their integration into Google Earth, where the DEMs augment available photographic imagery for the visualization of bay planforms. While the generic Carolina bay planform is considered oval, we document subtle regional variations. Using a small set of empirically derived planform shapes, we created corresponding Google Earth overlay templates. We demonstrate the analysis of an individual Carolina bay by placing an appropriate overlay onto the virtual globe, then orienting, sizing and rotating it with edit handles such that it satisfactorily represents the bay's rim. The resulting overlay data element is extracted from Google Earth's object directory and programmatically processed to generate metrics such as geographic location, elevation, major and minor axes and inferred orientation. Utilizing a virtual globe facility for data capture may result in higher quality data compared with methods that reference flat maps, where the geospatial shape and orientation of the bays could be skewed and distorted in the orthographic projection process. Using the methodology described, we have measured over 25k distinct Carolina bays. We discuss the Google Fusion geospatial data repository facility, through which these data have been assembled and made web-accessible to other researchers. Preliminary findings from the survey are discussed, such as how bay surface area, eccentricity and orientation vary across ~800 1/4° × 1/4° grid elements. Future work includes measuring 25k additional bays, as well as interrogation of the orientation data to identify any possible systematic geospatial relationships.
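The overlay post-processing step can be illustrated with a short script: a Google Earth GroundOverlay stores its footprint as a LatLonBox (north, south, east, west, rotation), from which approximate planform metrics can be derived. This is a simplified sketch under those assumptions, not the authors' processing code.

```python
# Hedged illustration of the post-processing described above: read the LatLonBox of
# a Google Earth GroundOverlay (the fitted bay template) and derive center, axis
# extents and orientation. Simplified planform metrics, not the authors' code.
import math
import xml.etree.ElementTree as ET

NS = {"kml": "http://www.opengis.net/kml/2.2"}

def overlay_metrics(kml_path):
    box = ET.parse(kml_path).find(".//kml:GroundOverlay/kml:LatLonBox", NS)
    north = float(box.find("kml:north", NS).text)
    south = float(box.find("kml:south", NS).text)
    east = float(box.find("kml:east", NS).text)
    west = float(box.find("kml:west", NS).text)
    rot_el = box.find("kml:rotation", NS)
    rotation = float(rot_el.text) if rot_el is not None else 0.0

    lat_c, lon_c = (north + south) / 2.0, (east + west) / 2.0
    km_per_deg_lat = 111.32                                   # rough spherical approximation
    km_per_deg_lon = 111.32 * math.cos(math.radians(lat_c))
    return {
        "center": (lat_c, lon_c),
        "ns_extent_km": (north - south) * km_per_deg_lat,
        "ew_extent_km": (east - west) * km_per_deg_lon,
        "orientation_deg": rotation,   # KML rotation: degrees counter-clockwise from north
    }
```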
Mapping of Sample Collection Data: GIS Tools for the Natural Product Researcher
Oberlies, Nicholas H.; Rineer, James I.; Alali, Feras Q.; Tawaha, Khaled; Falkinham, Joseph O.; Wheaton, William D.
2009-01-01
Scientists engaged in the research of natural products often either conduct field collections themselves or collaborate with partners who do, such as botanists, mycologists, or SCUBA divers. The information gleaned from such collecting trips (e.g. longitude/latitude coordinates, geography, elevation, and a multitude of other field observations) has provided valuable data to the scientific community (e.g., biodiversity), even if it is tangential to the direct aims of the natural products research, which are often focused on drug discovery and/or chemical ecology. Geographic Information Systems (GIS) have been used to display, manage, and analyze geographic data, including collection sites for natural products. However, to the uninitiated, these tools are often beyond the financial and/or computational means of the natural products scientist. With new, free, and easy-to-use geospatial visualization tools, such as Google Earth, mapping and geographic imaging of sampling data are now within the reach of natural products scientists. The goals of the present study were to develop simple tools that are tailored for the natural products setting, thereby presenting a means to map such information, particularly via freely available software like Google Earth. PMID:20161345
iPads at Field Camp: A First Test of the Challenges and Opportunities
NASA Astrophysics Data System (ADS)
Hurst, S. D.; Stewart, M. A.
2011-12-01
An iPad 2 was given to approximately half of the University of Illinois students attending the Wasatch-Uinta Field Camp (WUFC) in summer 2011. The iPads were provisioned with orientation-measuring, mapping and location software. The software would automatically transfer an orientation measurement to the current location in the Google Maps application and was able to output a full list of orientation data. Students also had normal access to more traditional mapping tools such as Brunton compasses and GPS units and were required to map with these tools along with the other WUFC students who were not provided iPads. Compared to traditional tools, iPads have drawbacks such as increased weight, breakability, and the need for a power source and wireless connectivity; in sum, they need a substantial infrastructure that reduces range, availability, and, probably most importantly, convenience. Some of these drawbacks inhibited adoption by our students, the primary reasons being the added weight and the inability to map directly to a GIS application with detailed topographic maps equivalent to the physical topographic map sheets used at WUFC. In their favor, the iPads combine a host of tools into one, including software that can measure orientation in a fashion more intuitive than a Brunton. They also allow storage, editing and analysis of data and notes (spoken and/or written) and potentially unlimited access to a variety of maps. Via a post-field-camp survey of the University of Illinois students at WUFC, we have identified some of the important issues that need to be addressed before portable tablets like the iPad become the tool of choice for general field work. Some problems are intrinsic to almost any advanced technology; some are artifacts of the current generations of hardware and software available for these devices. Technical drawbacks aside, the adoption of iPads was further inhibited primarily by inexperience with their use as a mapping tool and secondarily by their redundancy with traditional tools. We are addressing some aspects of the software limitations, and future technology improvements by the industry will naturally reduce other limitations. We will continue testing iPads during field trips and courses for the foreseeable future. As we begin to deal with these limitations and students become more accustomed to their use in the field, we expect our students to more fully embrace iPads as a convenient field and mapping tool.
Satellite Radar Detects Damage from Sept. 19, 2017 Raboso, Mexico, Quake
2017-09-20
The Advanced Rapid Imaging and Analysis (ARIA) team at NASA's Jet Propulsion Laboratory in Pasadena, California, and Caltech, also in Pasadena, created this Damage Proxy Map (DPM) depicting areas of Central Mexico, including Mexico City, that are likely damaged (shown by red and yellow pixels) from the magnitude 7.1 Raboso earthquake of Sept. 19, 2017 (local time). The map is derived from synthetic aperture radar (SAR) images from the Copernicus Sentinel-1A and Sentinel-1B satellites, operated by the European Space Agency (ESA). The images were taken before (Sept. 8, 2017) and after (Sept. 20, 2017) the earthquake. The map covers an area of 109 by 106 miles (175 by 170 kilometers). Each pixel measures about 33 yards (30 meters) across. The color variation from yellow to red indicates increasingly more significant ground and building surface change. Preliminary validation was done by comparing the DPM to a crowd-sourced Google Map (https://www.google.com/maps/d/u/0/viewer?mid=1_-V97lbdgLFHpx-CtqhLWlJAnYY&ll=19.41452166501326%2C-99.16498240436704&z=16). This damage proxy map should be used as guidance to identify damaged areas, and may be less reliable over vegetated areas. Sentinel-1 data were accessed through the Copernicus Open Access Hub. The image contains modified Copernicus Sentinel data (2017), processed by ESA and analyzed by the NASA-JPL/Caltech ARIA team. This research was carried out at JPL under contract with NASA. https://photojournal.jpl.nasa.gov/catalog/PIA21963
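ARIA's damage proxy maps are derived from changes in SAR interferometric coherence between pre- and post-event acquisitions; as a much simpler illustration of the underlying idea, the sketch below flags pixels whose backscatter amplitude changes strongly between two co-registered images. The threshold and data are synthetic.

```python
# Much-simplified illustration of a change "proxy" map. ARIA's actual DPM is based
# on changes in SAR interferometric coherence; here we only threshold the log-ratio
# of backscatter amplitude between co-registered pre- and post-event images.
import numpy as np

def damage_proxy(pre, post, threshold_db=3.0):
    """pre, post: co-registered 2-D arrays of backscatter amplitude (linear units)."""
    eps = 1e-6                                   # avoid division by zero
    ratio_db = 10.0 * np.log10((post + eps) / (pre + eps))
    change = np.abs(ratio_db)                    # magnitude of change in dB
    proxy = np.clip((change - threshold_db) / threshold_db, 0.0, 1.0)
    return proxy                                 # 0 = no change, 1 = strong change

# Example with synthetic data:
rng = np.random.default_rng(0)
pre = rng.gamma(2.0, 1.0, size=(512, 512))
post = pre.copy()
post[200:260, 300:380] *= 4.0                    # simulate a strongly changed block
dpm = damage_proxy(pre, post)
```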
Measurable realistic image-based 3D mapping
NASA Astrophysics Data System (ADS)
Liu, W.; Wang, J.; Wang, J. J.; Ding, W.; Almagbile, A.
2011-12-01
Maps with 3D visual models are becoming a remarkable feature of 3D map services. High-resolution image data are obtained for the construction of 3D visualized models. The 3D map not only provides the capabilities of 3D measurement and knowledge mining, but also provides a virtual experience of places of interest, as demonstrated in Google Earth. Applications of 3D maps are expanding into the areas of architecture, property management, and urban environment monitoring. However, the reconstruction of high-quality 3D models is time consuming and requires robust hardware and powerful software to handle the enormous amount of data. This is especially true for the automatic generation of 3D models and the representation of complicated surfaces, which still need improvements in visualisation techniques. The shortcoming of 3D model-based maps is their limited coverage of detail, since a user can only view and measure objects that are already modelled in the virtual environment. This paper proposes and demonstrates a 3D map concept that is realistic and image-based, and that enables geometric measurements and geo-location services. Additionally, image-based 3D maps provide more detailed information about the real world than 3D model-based maps. Image-based 3D maps use geo-referenced stereo images or panoramic images. The geometric relationships between objects in the images can be resolved from the geometric model of the stereo images. The panoramic function makes 3D maps more interactive for users and also creates an immersive experience. Unmeasurable image-based 3D maps already exist, such as Google Street View, but they only provide virtual experiences in terms of photos; topographic and terrain attributes, such as shapes and heights, are omitted. This paper also discusses the potential for using a low-cost land Mobile Mapping System (MMS) to implement realistic image-based 3D mapping, and evaluates the positioning accuracy that a measurable realistic image-based (MRI) system can produce. The major contribution here is the implementation of measurable images on 3D maps to obtain various measurements from real scenes.
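The measurability of geo-referenced stereo images rests on a simple geometric relationship: for a rectified stereo pair with focal length f (in pixels), baseline B and disparity d, depth is Z = fB/d. The sketch below back-projects pixels into 3D and measures the distance between two scene points; the camera parameters are invented for the example.

```python
# Minimal sketch of the geometric relationship that makes stereo image pairs
# "measurable": for a rectified pair with focal length f (pixels), baseline B (m)
# and disparity d (pixels), depth is Z = f * B / d. Parameter values are made up.
import numpy as np

def triangulate(u, v, disparity, f=1500.0, baseline=0.5, cx=960.0, cy=540.0):
    """Back-project a pixel (u, v) with a given disparity into camera coordinates."""
    Z = f * baseline / disparity          # depth along the optical axis (m)
    X = (u - cx) * Z / f                  # offset right of the optical axis (m)
    Y = (v - cy) * Z / f                  # offset below the optical axis (m)
    return np.array([X, Y, Z])

# Distance between two measured points in the scene:
p1 = triangulate(1012, 610, disparity=42.0)
p2 = triangulate(880, 595, disparity=40.5)
print("distance (m):", np.linalg.norm(p1 - p2))
```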
NASA Astrophysics Data System (ADS)
Igarashi, Masayasu; Murao, Osamu
In this paper, the authors develop a multiple regression model which estimates urban earthquake vulnerability (building collapse risk and conflagration risk) for different eras, and clarify the historical changes of urban risk in Marunouchi and Ginza Districts in Tokyo, Japan using old maps and contemporary geographic information data. Also, we compare the change of urban vulnerability of the districts with the significant historical events in Tokyo. Finally, the results are loaded onto Google Earth with timescale extension to consider the possibility of urban recovery digital archives in the era of the recent geoinformatic technologies.
Immunochromatographic diagnostic test analysis using Google Glass.
Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan
2014-03-25
We demonstrate a Google Glass-based rapid diagnostic test (RDT) reader platform capable of qualitative and quantitative measurements of various lateral flow immunochromatographic assays and similar biomedical diagnostics tests. Using a custom-written Glass application and without any external hardware attachments, one or more RDTs labeled with Quick Response (QR) code identifiers are simultaneously imaged using the built-in camera of the Google Glass that is based on a hands-free and voice-controlled interface and digitally transmitted to a server for digital processing. The acquired JPEG images are automatically processed to locate all the RDTs and, for each RDT, to produce a quantitative diagnostic result, which is returned to the Google Glass (i.e., the user) and also stored on a central server along with the RDT image, QR code, and other related information (e.g., demographic data). The same server also provides a dynamic spatiotemporal map and real-time statistics for uploaded RDT results accessible through Internet browsers. We tested this Google Glass-based diagnostic platform using qualitative (i.e., yes/no) human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) tests. For the quantitative RDTs, we measured activated tests at various concentrations ranging from 0 to 200 ng/mL for free and total PSA. This wearable RDT reader platform running on Google Glass combines a hands-free sensing and image capture interface with powerful servers running our custom image processing codes, and it can be quite useful for real-time spatiotemporal tracking of various diseases and personal medical conditions, providing a valuable tool for epidemiology and mobile health.
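The server-side quantification step can be illustrated with a toy example: given a cropped, grayscale image of the RDT membrane, the relative darkness of the test line is estimated against the membrane background. The band position and file name are hypothetical; the authors' image-processing pipeline is considerably more sophisticated.

```python
# Hedged sketch of a server-side quantification step of the kind described above:
# estimate the relative darkness of the test line on a cropped, grayscale RDT strip.
# Strip geometry and band positions are hypothetical.
import numpy as np
from PIL import Image

def band_signal(strip_png, band_rows=(120, 140)):
    """Return background-corrected mean darkness of the band region (0..1)."""
    img = np.asarray(Image.open(strip_png).convert("L"), dtype=float) / 255.0
    profile = img.mean(axis=1)                      # average intensity per image row
    background = np.median(profile)                 # membrane background level
    band = profile[band_rows[0]:band_rows[1]].mean()
    return max(background - band, 0.0)              # darker band -> larger signal

signal = band_signal("rdt_strip.png")               # hypothetical cropped strip image
print(f"relative test-line signal: {signal:.3f}")
```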
JournalMap: Geo-semantic searching for relevant knowledge
USDA-ARS?s Scientific Manuscript database
Ecologists struggling to understand rapidly changing environments and evolving ecosystem threats need quick access to relevant research and documentation of natural systems. The advent of semantic and aggregation searching (e.g., Google Scholar, Web of Science) has made it easier to find useful lite...
GIS tool to locate major Sikh temples in USA
NASA Astrophysics Data System (ADS)
Sharma, Saumya
This tool is a GIS-based, interactive, graphical-user-interface tool that locates the major Sikh temples of the USA on a map. The tool is written in the Java programming language along with MOJO (Map Object Java Object) provided by ESRI, the organization that provides the GIS software. It also integrates some of Google's APIs, such as the Google Translator API. The application tells users about the origin of Sikhism in India and the USA and, for the major Sikh temples in each state of the USA, provides the location, name and detailed information through their websites. The primary purpose of this application is to make people aware of this religion and culture. The tool can also measure the distance between two temple points on a map and display the result in miles and kilometers. In addition, there is support to convert each temple's website language from English to Punjabi or any other language using a language converter tool, so that people from different nationalities can understand the culture. By clicking on each point on the map, a new window pops up showing a picture of the temple and a hyperlink that redirects to the website of that particular temple. It also contains links to their dance, music, and history, as well as a help menu to guide users in using the software efficiently.
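The distance measurement described above is a great-circle computation; a Python illustration using the haversine formula (the tool itself is written in Java with ESRI MOJO) might look like this, with example coordinates standing in for two temple locations.

```python
# Python illustration of the great-circle distance computation the tool performs
# (the tool itself is written in Java with ESRI MOJO); coordinates are examples only.
import math

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, returned as (kilometers, miles)."""
    r_km = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    km = 2 * r_km * math.asin(math.sqrt(a))
    return km, km * 0.621371

km, miles = haversine(37.6922, -121.9047, 40.3573, -74.4174)  # two example locations
print(f"{km:.1f} km / {miles:.1f} miles")
```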
New Delhi Metallo-beta-lactamase around the world: an eReview using Google Maps.
Berrazeg, M; Diene, Sm; Medjahed, L; Parola, P; Drissi, M; Raoult, D; Rolain, Jm
2014-05-22
Gram-negative carbapenem-resistant bacteria, in particular those producing New Delhi Metallo-beta-lactamase-1 (NDM-1), are a major global health problem. To inform the scientific and medical community in real time about the worldwide dissemination of isolates of NDM-1-producing bacteria, we used the PubMed database to review all available publications from the first description in 2009 up to 31 December 2012, and created a regularly updated worldwide dissemination map using a web-based mapping application. We retrieved 33 reviews and 136 case reports describing 950 isolates of NDM-1-producing bacteria. Klebsiella pneumoniae (n=359) and Escherichia coli (n=268) were the most commonly reported bacteria producing the NDM-1 enzyme. Infections due to imported NDM-1-producing bacteria have been reported in a number of countries, including the United Kingdom, Italy, and Oman. In most cases (132/153, 86.3%), patients had connections with the Indian subcontinent or Balkan countries: those infected were originally from these areas, had either spent time and/or been hospitalised there, or were potentially linked to other patients who had been hospitalised in these regions. By using Google Maps, we were able to trace the spread of NDM-1-producing bacteria. We strongly encourage epidemiologists to use these types of interactive tools for surveillance purposes and to use the information to prevent the spread and outbreaks of such bacteria.
Researchermap: a tool for visualizing author locations using Google maps.
Rastegar-Mojarad, Majid; Bales, Michael E; Yu, Hong
2013-01-01
We hereby present ResearcherMap, a tool to visualize the locations of authors of scholarly papers. In response to a query, the system returns a map of author locations. To develop the system, we first populated a database of author locations, geocoding institution locations for all available institutional affiliation data in our database. The database includes all authors of Medline papers from 1990 to 2012. We conducted a formative heuristic usability evaluation of the system and measured the system's accuracy and performance. The accuracy of identifying the correct institutional address is 97.5% in our system.
Peuchaud, Sheila
2014-06-01
This paper presents a case study of how social media activists have harnessed the power of Facebook, Twitter and mobile phone networks to address sexual harassment in Egypt. HarassMap plots reports of sexual harassment on a Google Map and informs victims of support services. Tahrir Bodyguard and Operation Anti-Sexual Harassment (OpAntiSH) protect female protestors who have been vulnerable to sexual aggression at the hands of unruly mobs and by agents of the state. Activists have access to an Android app called 'I'm Getting Arrested', or 'Byt2ebed 3alia' in Egyptian Arabic. The app sends the time and GPS coordinates of an arrest to family, fellow activists, legal counsel and social media outlets. The hope is that the initiatives described in this paper could inspire public health ministries and activist NGOs to incorporate crowdsourcing social media applications in the spirit of health in all policies (HiAP). To that end, this paper begins by defining social media activism from the perspective of the communications discipline. It then demonstrates the significance of sexual harassment as a public health issue and describes several social media efforts to document incidents and protect victims. The paper concludes with a discussion of how these innovations could be integrated into the HiAP approach.
Signalling maps in cancer research: construction and data analysis
Kondratova, Maria; Sompairac, Nicolas; Barillot, Emmanuel; Zinovyev, Andrei
2018-01-01
Generation and usage of high-quality molecular signalling network maps can be augmented by standardizing notations, establishing curation workflows and application of computational biology methods to exploit the knowledge contained in the maps. In this manuscript, we summarize the major aims and challenges of assembling information in the form of comprehensive maps of molecular interactions. Mainly, we share our experience gained while creating the Atlas of Cancer Signalling Network. In the step-by-step procedure, we describe the map construction process and suggest solutions for map complexity management by introducing a hierarchical modular map structure. In addition, we describe the NaviCell platform, a computational technology using Google Maps API to explore comprehensive molecular maps similar to geographical maps and explain the advantages of semantic zooming principles for map navigation. We also provide the outline to prepare signalling network maps for navigation using the NaviCell platform. Finally, several examples of cancer high-throughput data analysis and visualization in the context of comprehensive signalling maps are presented. PMID:29688383
Brunger, Fern; Welch, Vivian; Asghari, Shabnam; Kaposy, Chris
2018-01-01
Background This paper focuses on the collision of three factors: a growing emphasis on sharing research through open access publication, an increasing awareness of big data and its potential uses, and an engaged public interested in the privacy and confidentiality of their personal health information. One conceptual space where this collision is brought into sharp relief is with the open availability of patient medical photographs from peer-reviewed journal articles in the search results of online image databases such as Google Images. Objective The aim of this study was to assess the availability of patient medical photographs from published journal articles in Google Images search results and the factors impacting this availability. Methods We conducted a cross-sectional study using data from an evidence map of research with transgender, gender non-binary, and other gender diverse (trans) participants. For the original evidence map, a comprehensive search of 15 academic databases was developed in collaboration with a health sciences librarian. Initial search results produced 25,230 references after duplicates were removed. Eligibility criteria were established to include empirical research of any design that included trans participants or their personal information and that was published in English in peer-reviewed journals. We identified all articles published between 2008 and 2015 with medical photographs of trans participants. For each reference, images were individually numbered in order to track the total number of medical photographs. We used odds ratios (OR) to assess the association between availability of the clinical photograph on Google Images and the following factors: whether the article was openly available online (open access, Researchgate.net, or Academia.edu), whether the article included genital images, if the photographs were published in color, and whether the photographs were located on the journal article landing page. Results We identified 94 articles with medical photographs of trans participants, including a total of 605 photographs. Of the 94 publications, 35 (37%) included at least one medical photograph that was found on Google Images. The ability to locate the article freely online contributes to the availability of at least one image from the article on Google Images (OR 2.99, 95% CI 1.20-7.45). Conclusions This is the first study to document the existence of medical photographs from peer-reviewed journals appearing in Google Images search results. Almost all of the images we searched for included sensitive photographs of patient genitals, chests, or breasts. Given that it is unlikely that patients consented to sharing their personal health information in these ways, this constitutes a risk to patient privacy. Based on the impact of current practices, revisions to informed consent policies and guidelines are required. PMID:29483069
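For readers who want to reproduce the style of analysis, the odds ratio and Wald 95% confidence interval reported above (e.g. OR 2.99, 95% CI 1.20-7.45) can be computed from a 2x2 table as in the sketch below; the cell counts shown are placeholders, not the study's data.

```python
# Sketch of an odds-ratio calculation with a Wald 95% confidence interval from a
# 2x2 table. The cell counts below are placeholders, not the study's data.
import math

def odds_ratio(a, b, c, d):
    """2x2 table: a,b = exposed with/without outcome; c,d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)          # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

print(odds_ratio(30, 20, 5, 10))   # placeholder counts -> (OR, lower CI, upper CI)
```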
Transport Statistics - Transport - UNECE
Traffic Census 2015 available. Two new datasets have been added to the transport statistics database: bus and coach statistics.
An Automated Approach to Extracting River Bank Locations from Aerial Imagery Using Image Texture
2013-01-01
Atchafalaya River, LA. Map data: Google, United States Department of Agriculture Farm Service Agency, Europa Technologies. ...traverse morphologically smooth landscapes including rivers in sand or ice. Within these limitations, we hold that this technique represents a valuable...
Children Creating Multimodal Stories about a Familiar Environment
ERIC Educational Resources Information Center
Kervin, Lisa; Mantei, Jessica
2017-01-01
Storytelling is a practice that enables children to apply their literacy skills. This article shares a collaborative literacy strategy devised to enable children to create multimodal stories about their familiar school environment. The strategy uses resources, including the children's own drawings, images from Google Maps, and the Puppet Pals…
Being There is Only the Beginning: Toward More Effective Web 2.0 Use in Academic Libraries
2010-01-02
Google is Our Friend,” and “Plagiarism 101.” Also unlike the hard-to-find blogs, many academic libraries, including both Hollins University and Urbana... By Hanna C. Bachrach, Pratt Institute.
Takada, Kenta
2012-01-01
Seasonal changes in the popularity of fireflies [usually Genji-fireflies (Luciola cruciata Motschulsky) in Japan] and Japanese rhinoceros beetles [Allomyrina dichotoma (Linne)] were investigated to examine whether contemporary Japanese are interested in the visible emergence of these insects as seasonal events. The popularity of fireflies and Japanese rhinoceros beetles was assessed from the Google search volume of their Japanese names, “Hotaru” and “Kabuto-mushi”, in Japanese Katakana script using Google Trends. The search volume index for fireflies and Japanese rhinoceros beetles was distributed across seasons with a clear peak at only particular times of each year from 2004 to 2011. In addition, the seasonal peak of popularity for fireflies occurred at the beginning of June, whereas that for Japanese rhinoceros beetles occurred from the middle of July to the beginning of August. Thus the seasonal peak for each species coincided with the peak period of emergence of its adult stage. These findings indicate that the Japanese are interested in these insects primarily during the time when the two species are most visibly abundant. Although untested, this could suggest that fireflies and Japanese rhinoceros beetles are perceived by the general public as indicators or symbols of summer in Japan. PMID:26466535
Mapping the Urban Side of the Earth- the new GUF+ Layer
NASA Astrophysics Data System (ADS)
Gorelick, N.; Marconcini, M.; Üreyen, S.; Zeidler, J.; Svaton, V.; Esch, T.
2017-12-01
Since the beginning of the 2000s, it is estimated that more than half of the global population has been living in cities, and the dynamic trend of urbanization is growing at an unprecedented speed. In this framework, how does an expanding population affect the surrounding landscape? Are urban areas making good use of limited space, or is rapid urbanization threatening the planet's sustainability? What is the impact of urbanization on vulnerability to natural disasters? To answer these and other challenging questions, a key piece of information is reliable knowledge of the location and characteristics (e.g. shape, extent, greenness) of human settlements worldwide. In this context, over the last decade different global maps outlining urban areas have started being produced. Here, DLR's Global Urban Footprint (GUF) layer, generated on the basis of very high resolution radar imagery, represents one of the most accurate and most widely employed datasets. However, in order to overcome the remaining limitations of the GUF layer, often originating from specifics of the underlying radar imagery, DLR developed a novel methodology that for the first time exploits massive multitemporal collections of optical and radar satellite imagery. The new approach has been employed to generate the GUF+ 2015 layer, a global map of settlement areas derived at 10 m spatial resolution based on a joint analysis of hundreds of thousands of Landsat and Sentinel-1 scenes (processed with the support of Google Earth Engine) collected in the years 2014-2015. The GUF+ 2015 outperforms all other existing global human settlement maps and allows, among other things, considerably improved detection of very small settlements in rural regions and better outlining of scattered peri-urban areas. Nevertheless, this is not an end point but rather a starting point for generating a suite of additional products (the GUF+ suite) intended to support a 360° analysis of global urbanization, e.g. with data on imperviousness/greenness and the spatiotemporal development of built-up area over the last decades.
NASA Astrophysics Data System (ADS)
Ryan, J. G.; McIlrath, J. A.
2008-12-01
Web-accessible geospatial information system (GIS) technologies have advanced in concert with an expansion of data resources that can be accessed and used by researchers, educators and students. These resources facilitate the development of data-rich instructional resources and activities that can transition seamlessly into undergraduate research projects. MARGINS Data in the Classroom (http://serc.carleton.edu/margins/index.html) seeks to engage MARGINS researchers and educators in using the images, datasets, and visualizations produced by NSF-MARGINS Program-funded research and related efforts to create Web-deliverable instructional materials for use in undergraduate-level geoscience courses (MARGINS Mini-Lessons). MARGINS science data are managed by the Marine Geosciences Data System (MGDS), and these and all other MGDS-hosted data can be accessed, manipulated and visualized using GeoMapApp (www.geomapapp.org; Carbotte et al, 2004), a freely available geographic information system focused on the marine environment. Both "packaged" MGDS datasets (i.e., global earthquake foci, volcanoes, bathymetry) and "raw" data (seismic surveys, magnetics, gravity) are accessible via GeoMapApp, with WFS linkages to other resources (geodesy from UNAVCO; seismic profiles from IRIS; geochemical and drill-site data from EarthChem, IODP, and others), permitting the comprehensive characterization of many regions of the ocean basins. Geospatially controlled datasets can be imported into GeoMapApp visualizations, and these visualizations can be exported into Google Earth as .kmz image files. Many of the MARGINS Mini-Lessons produced thus far use (or have students use) the varied capabilities of GeoMapApp (i.e., constructing topographic profiles, overlaying varied geophysical and bathymetric datasets, characterizing geochemical data). These materials are available for use and testing from the project webpage (http://serc.carleton.edu/margins/). Classroom testing and assessment of the Mini-Lessons begins this Fall.
Using open-source programs to create a web-based portal for hydrologic information
NASA Astrophysics Data System (ADS)
Kim, H.
2013-12-01
Some hydrologic data sets, such as basin climatology, precipitation, and terrestrial water storage, are not easily obtainable and distributable due to their size and complexity. We present a Hydrologic Information Portal (HIP) that has been implemented at the University of California Center for Hydrologic Modeling (UCCHM) and that has been organized around the large river basins of North America. This portal can be accessed through a modern web browser, enabling easy access to and visualization of such hydrologic data sets. The main features of our HIP include a set of data visualization capabilities that let users search, retrieve, analyze, integrate, organize, and map data within large river basins. Recent information technologies such as Google Maps, Tornado (a Python asynchronous web server), NumPy/SciPy (scientific libraries for Python) and d3.js (a visualization library for JavaScript) were incorporated into the HIP to make navigating large data sets easy. With such open-source libraries, the HIP gives public users a way to combine and explore various data sets by generating multiple chart types (line, bar, pie, scatter plot) directly from the Google Maps viewport. Every rendered object, such as a basin shape on the viewport, is clickable, and this is the first step toward accessing visualizations of the data sets.
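As a hedged sketch of how such a portal can feed its JavaScript front end, the snippet below shows a minimal Tornado handler that returns a basin time series as JSON, which a Google Maps/d3.js client could then chart. The route, data store and field names are invented for the example.

```python
# Hedged sketch of a portal back end: a small Tornado handler that returns a JSON
# time series for a basin, which the Google Maps / d3.js client can chart.
# Route names and the in-memory data store are hypothetical.
import tornado.ioloop
import tornado.web

FAKE_STORE = {"colorado": [{"date": "2002-04", "tws_cm": 1.8},
                           {"date": "2002-05", "tws_cm": 1.1}]}  # placeholder data

class BasinSeriesHandler(tornado.web.RequestHandler):
    def get(self, basin):
        series = FAKE_STORE.get(basin.lower())
        if series is None:
            raise tornado.web.HTTPError(404)
        self.write({"basin": basin, "series": series})   # Tornado serializes dicts to JSON

def make_app():
    return tornado.web.Application([(r"/api/basins/([A-Za-z_]+)/storage", BasinSeriesHandler)])

if __name__ == "__main__":
    make_app().listen(8888)
    tornado.ioloop.IOLoop.current().start()
```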
Kuperstein, I; Bonnet, E; Nguyen, H-A; Cohen, D; Viara, E; Grieco, L; Fourquet, S; Calzone, L; Russo, C; Kondratova, M; Dutreix, M; Barillot, E; Zinovyev, A
2015-01-01
Cancerogenesis is driven by mutations leading to aberrant functioning of a complex network of molecular interactions and simultaneously affecting multiple cellular functions. Therefore, the successful application of bioinformatics and systems biology methods for analysis of high-throughput data in cancer research heavily depends on the availability of global and detailed reconstructions of signalling networks amenable to computational analysis. We present here the Atlas of Cancer Signalling Network (ACSN), an interactive and comprehensive map of molecular mechanisms implicated in cancer. The resource includes tools for map navigation, visualization and analysis of molecular data in the context of signalling network maps. Constructing and updating ACSN involves careful manual curation of the molecular biology literature and participation of experts in the corresponding fields. The cancer-oriented content of ACSN is completely original and covers major mechanisms involved in cancer progression, including DNA repair, cell survival, apoptosis, cell cycle, EMT and cell motility. Cell signalling mechanisms are depicted in detail, together creating a seamless 'geographic-like' map of molecular interactions frequently deregulated in cancer. The map is browsable through the NaviCell web interface, which uses the Google Maps engine and the semantic zooming principle. The associated web blog provides a forum for commenting on and curating the ACSN content. ACSN allows users to upload heterogeneous omics data on top of the maps for visualization and functional analysis. We suggest several scenarios for ACSN application in cancer research, particularly for visualizing high-throughput data, ranging from small interfering RNA-based screening results or mutation frequencies to innovative ways of exploring transcriptomes and phosphoproteomes. Integration and analysis of these data in the context of ACSN may help interpret their biological significance and formulate mechanistic hypotheses. ACSN may also support patient stratification, prediction of treatment response and resistance to cancer drugs, as well as design of novel treatment strategies. PMID:26192618
Understanding Urban Watersheds through Digital Interactive Maps, San Francisco Bay Area, California
NASA Astrophysics Data System (ADS)
Sowers, J. M.; Ticci, M. G.; Mulvey, P.
2014-12-01
Dense urbanization has resulted in the "disappearance" of many local creeks in urbanized areas surrounding the San Francisco Bay. Long reaches of creeks now flow in underground pipes. Municipalities and water agencies trying to reduce non-point-source pollution are faced with a public that cannot see, and therefore does not understand, the interconnected nature of the drainage system or its ultimate discharge to the bay. Since 1993, we have collaborated with the Oakland Museum, the San Francisco Estuary Institute, public agencies, and municipalities to create creek and watershed maps that address the need for public understanding of watershed concepts. Fifteen paper maps are now published (www.museumca.org/creeks), which have become a standard reference for educators and anyone working on local creek-related issues. We now present digital interactive creek and watershed maps in Google Earth. Four maps are completed, covering urbanized areas of Santa Clara and Alameda Counties. The maps provide a 3D visualization of the watersheds, with cartography draped over the landscape in transparent colors. Each mapped area includes both Present and Past (circa 1800s) layers, which can be toggled on or off by the user. The Present layers include the modern drainage network, watershed boundaries, and reservoirs. The Past layers include the 1800s-era creek systems, tidal marshes, lagoons, and other habitats. All data are developed in ArcGIS software and converted to Google Earth format. To ensure the maps are interesting and engaging, clickable icons pop up to provide information on places to visit, restoration projects, history, plants, and animals. Maps of Santa Clara Valley are available at http://www.valleywater.org/WOW.aspx. Maps of western Alameda County will soon be available at http://acfloodcontrol.org/. Digital interactive maps provide several advantages over paper maps. They are seamless within each map area, and the user can zoom in or out, tilt, and fly over to explore any area of interest. They can be easily customized, for example by adding placemarks or notes. Enrichment information can be added using clickable icons without cluttering the map. Best of all, the maps are fun to use. Digital interactive maps will be another effective tool for enhancing public understanding of urban creeks and watersheds.
The implementation of a modernized Dynamic Digital Map on Gale Crater, Mars
NASA Astrophysics Data System (ADS)
McBeck, J.; Condit, C. D.
2012-12-01
Currently, geology instructors present information to students via PowerPoint, Word, Excel and other programs that are not designed to parse or present geologic data. More tech-savvy, and perhaps better-funded, instructors use Google Earth or ArcGIS to display geologic maps and other visual information. However, Google Earth lacks the ability to present large portions of text, and ArcGIS restricts such functionality to labels and annotations. The original Dynamic Digital Map, which we have renamed Dynamic Digital Map Classic (DDMC), allows instructors to represent both visual and large portions of textual information to students. This summer we generalized the underlying architecture of DDMC, redesigned the user interface, modernized the analytical functionality, renamed the older version and labeled this new creature Dynamic Digital Map Extended (DDME). With the new DDME instructors can showcase maps, images, articles and movies, and create digital field trips. They can set the scale, coordinate system and caption of maps and images, add symbol links to maps and images that can transport the user to any specified destination—either internally (to data contained within the DDME) or externally (to a website address). Instructors and students can also calculate non-linear distances and irregular areas of maps and images, and create digital field trips with any number of stops—complete with notes and driving directions. DDMEs are perhaps best described as a sort of computerized, self-authored, interactive textbook. To display the vast capabilities of DDME, we created a DDME of Gale Crater (DDME-GC), which is the landing site of the most sophisticated NASA Mars Rover—Curiosity. DDME-GC hosts six thematic maps: a detailed geologic map provided by Brad Thompson of the Boston University Center for Remote Sensing (Thompson, et al., 2010), and five maps maintained in ASU's JMARS system, including global mosaics from Mars Global Surveyor's Mars Orbiter Laser Altimeter (MOLA), Mars Odyssey's Thermal Emission Imaging System (THEMIS), and the Mars Digital Image Model. DDME-GC offers a diverse suite of images, with over 40 images captured in the High Resolution Imaging Science Experiment (HiRISE), as well as several global mosaics created from Viking Orbiter, Hubble Telescope, THEMIS, MOLA and HiRISE data. DDME-GC also provides more than 25 articles that span subjects from the possible origins of the mound located in Gale Crater to the goals of NASA's Mars Exploration Program. The movies hosted by DDME-GC describe the difficulties of selecting a landing site for Curiosity, landing Curiosity on Mars and several other dynamic topics. The most significant advantage of the modernized DDME is its easily augmented functionality. In the future, DDME will be able to communicate with databases, import Keyhole Markup Language (KML) files from Google Earth, and be available on iOS and Android operating system. (Imagine: a field trip without the burden of notebooks, pens or pencils, paper or clipboards, with this information maintained on a mobile device.) The most recent DDME is a mere skeleton of its full capabilities—a robust architecture upon which myriad functionality can be supplemented.
NASA Astrophysics Data System (ADS)
Piman, T.; Schellekens, J.; Haag, A.; Donchyts, G.; Apirumanekul, C.; Hlaing, K. T.
2017-12-01
Changes in river morphology are one of the key issues on the Ayeyarwady River in Myanmar, with impacts on navigation, riverine habitats, agricultural lands, communities and livelihoods near the banks of the river. This study aims to track changes in river morphology in the middle reach of the Ayeyarwady River over the last 30 years (1984-2014) to improve understanding of riverbank dynamics and erosion and deposition processes. Earth observations, including Landsat-7, Landsat-8, digital elevation models from SRTM Plus and ASTER-2, Google Maps and OpenStreetMap, were obtained for the study. GIS and remote sensing tools were used to analyze changes in river morphology, while a surface water mapping tool was applied to characterize the dynamic behaviour of surface water and the effects of river morphology changes. The tool consists of two components: (1) a Google Earth Engine (GEE) JavaScript or Python application that performs image analysis and (2) a user-friendly site/app using Google's appspot.com that exposes the application to users. The results of this study show that the fluvial morphology in the middle reach of the Ayeyarwady River is continuously changing under the influence of high water flows, in particular from extreme flood events, and of land use change from mining and deforestation. It was observed that some meandering sections of the riverbank were straightened, which moves sediment downstream and creates new meandering sections of riverbank. Several large islands have formed, stabilized by vegetation and reinforced by sedimentation, while many small bars formed and migrated dynamically due to changes in water levels and flow velocity in the wet and dry seasons. The main channel has changed into a secondary channel in some sections of the river, resulting in a constant shift of the navigation route. We also found that some villages are facing riverbank erosion, which can force villagers to relocate. The study results demonstrate that products from earth observations and the surface water mapping tool can detect dynamic changes of river morphology in the Ayeyarwady River. This information is useful for supporting navigation and riverbank protection planning and for formulating mitigation measures for local communities affected by riverbank erosion.
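A stripped-down Earth Engine (Python API) sketch of the surface-water mapping idea is shown below: an annual Landsat 8 composite is reduced to an NDWI water mask over an approximate bounding box of the middle reach. The geometry, year and threshold are illustrative assumptions, not the study's actual tool.

```python
# Hedged Earth Engine (Python API) sketch of a simple surface-water mask of the kind
# such a mapping tool produces: an NDWI threshold on a Landsat 8 annual composite.
# The geometry and threshold are illustrative, not the study's actual tool.
import ee
ee.Initialize()

reach = ee.Geometry.Rectangle([95.0, 21.0, 96.5, 23.0])   # middle Ayeyarwady (approximate)

composite = (ee.ImageCollection('LANDSAT/LC08/C01/T1_SR')
               .filterBounds(reach)
               .filterDate('2014-01-01', '2014-12-31')
               .median())

ndwi = composite.normalizedDifference(['B3', 'B5'])   # (green - NIR) / (green + NIR)
water = ndwi.gt(0.0).selfMask()                        # 1 where water, masked elsewhere
```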
NASA Astrophysics Data System (ADS)
De Paor, D. G.; Bailey, J. E.; Whitmeyer, S. J.
2012-12-01
Our TUES research centers on the role of digital data, visualizations, animations, and simulations in undergraduate geoscience education. Digital hardware (smartphones, tablets, GPS units, GigaPan robotic camera mounts, etc.) is revolutionizing field data collection. Software products (GIS, 3-D scanning and modeling programs, virtual globes, etc.) have truly transformed the way geoscientists teach, learn, and do research. While Google-Earth-style visualizations are famously user-friendly for the person browsing, they can be notoriously unfriendly for the content creator. Therefore, we developed tools to help educators create and share visualizations as easily as if posting on Facebook. Anyone who wishes to display geological cross sections on Google Earth can go to digitalplanet.org, upload image files, position them on a line of section, and share them with the world through our KMZ hosting service. Other tools facilitate screen overlays and 3-D map symbol generation. We advocate use of such technology to enable undergraduate students to 'publish' their first mapping efforts even while they are working in the field. A second outcome of our TUES projects merges Second-Life-style interaction with Google Earth. We created games in which students act as first responders for natural hazard mitigation, prospectors for natural resource exploration, and structural geologists making maps. Students are represented by avatars and collaborate by exchanging text messages - the natural mode of communication for the current generation. Teachers view logs showing student movements as well as transcripts of text messages, and can scaffold student learning and geofence students to prevent wandering. Early results of in-class testing show positive learning outcomes. The third aspect of our program emphasizes dissemination. Experience shows that great effort is required to overcome the activation energy needed to ensure adoption of new technology into the curriculum. We organized a GSA Penrose Conference, a GSA Pardee Keynote Symposium, an AGU Town Hall Meeting, and numerous workshops at annual and regional meetings, and set up a web site dedicated to dissemination of program products. Future plans include development of augmented reality teaching resources, hosting of community mapping services, and creation of a truly 4-D virtual globe.
Improving Land Cover Mapping: a Mobile Application Based on ESA Sentinel 2 Imagery
NASA Astrophysics Data System (ADS)
Melis, M. T.; Dessì, F.; Loddo, P.; La Mantia, C.; Da Pelo, S.; Deflorio, A. M.; Ghiglieri, G.; Hailu, B. T.; Kalegele, K.; Mwasi, B. N.
2018-04-01
The increasing availability of satellite data is of real value for the enhancement of environmental knowledge and land management. The possibilities for integrating different sources of geo-data are growing, and methodologies to create thematic databases are becoming very sophisticated. Moreover, access to internet services and, in particular, to web mapping services is well developed and widespread among both expert users and citizens. Web map services such as Google Maps or OpenStreetMap give access to updated optical imagery or topographic maps, but information on land cover/use is still not provided. Therefore, there are many shortcomings in the general use of, and access to, such maps by non-specialized users. This issue is particularly felt where digital (web) maps could form the basis for land use management, as they are more economical and accessible than paper maps. These conditions are well known in many African countries where, while internet access is becoming open to all, local map agencies and their products are not widespread.
A global map of rainfed cropland areas (GMRCA) at the end of last millennium using remote sensing
Biradar, C.M.; Thenkabail, P.S.; Noojipady, P.; Li, Y.; Dheeravath, V.; Turral, H.; Velpuri, M.; Gumma, M.K.; Gangalakunta, O.R.P.; Cai, X.L.; Xiao, X.; Schull, M.A.; Alankara, R.D.; Gunasinghe, S.; Mohideen, S.
2009-01-01
The overarching goal of this study was to produce a global map of rainfed cropland areas (GMRCA) and calculate country-by-country rainfed area statistics using remote sensing data. A suite of spatial datasets, methods and protocols for mapping GMRCA are described. These consist of: (a) data fusion and composition of a multi-resolution time-series mega-file data-cube (MFDC), (b) image segmentation based on precipitation, temperature, and elevation zones, (c) spectral correlation similarity (SCS), (d) protocols for class identification and labeling through the use of SCS R2-values, bi-spectral plots, space-time spiral curves (ST-SCs), a rich source of field-plot data, and zoom-in views of Google Earth (GE), and (e) techniques for resolving mixed classes by decision tree algorithms and spatial modeling. The outcome was a 9-class GMRCA from which country-by-country rainfed area statistics were computed for the end of the last millennium. The global rainfed cropland area estimate from the GMRCA 9-class map was 1.13 billion hectares (Bha). The total global cropland area (rainfed plus irrigated) was 1.53 Bha, which is close to the national statistics compiled by FAOSTAT (1.51 Bha). The accuracies and errors of GMRCA were assessed using field-plot and Google Earth data points. The accuracy varied between 92 and 98% with a kappa value of about 0.76, errors of omission of 2-8%, and errors of commission of 19-36%. © 2008 Elsevier B.V.
Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)
NASA Astrophysics Data System (ADS)
Hancher, M.
2013-12-01
Geoscientists have increasing access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.
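As a hedged illustration of the just-in-time computation model described above, the following sketch uses the Earth Engine Python API to describe a cloud-side median composite from a Landsat collection; the collection ID, dates, and point coordinates are placeholders and may differ from what a given analysis needs.

```python
import ee

ee.Initialize()  # assumes Earth Engine credentials are already configured

# Hypothetical area of interest: a 20 km buffer around a point
roi = ee.Geometry.Point(-122.3, 37.9).buffer(20000)

# Example collection ID; the catalog offers Landsat 4, 5, 7, and 8 products
landsat8 = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
            .filterBounds(roi)
            .filterDate('2013-06-01', '2013-09-30'))

# The composite is described lazily; computation happens server-side,
# in parallel, only when results are requested
composite = landsat8.median().clip(roi)
print('Scenes in collection:', landsat8.size().getInfo())
```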
UNAVCO Software and Services for Visualization and Exploration of Geoscience Data
NASA Astrophysics Data System (ADS)
Meertens, C.; Wier, S.
2007-12-01
UNAVCO has been involved in visualization of geoscience data to support education and research for several years. An early and ongoing service is the Jules Verne Voyager, a web browser applet built on GMT that displays any area on Earth, with many data set choices, including maps, satellite images, topography, geoid heights, sea-floor ages, strain rates, political boundaries, rivers and lakes, earthquake and volcano locations, focal mechanisms, stress axes, and observed and modeled plate motion and deformation velocity vectors from geodetic measurements around the world. As part of the GEON project, UNAVCO has developed the GEON IDV, a research-level, 4D (earth location, depth and/or altitude, and time), Java application for interactive display and analysis of geoscience data. The GEON IDV is designed to meet the challenge of investigating complex, multi-variate, time-varying, three-dimensional geoscience data anywhere on earth. The GEON IDV supports simultaneous displays of data sets from differing sources, with complete control over colors, time animation, map projection, map area, point of view, and vertical scale. The GEON IDV displays gridded and point data, images, GIS shape files, and several other types of data. The GEON IDV has symbols and displays for GPS velocity vectors, seismic tomography, earthquake focal mechanisms, earthquake locations with magnitude or depth, seismic ray paths in 3D, seismic anisotropy, convection model visualization, earth strain axes and strain field imagery, and high-resolution 3D topographic relief maps. Multiple data sources and display types may appear in one view. As an example of GEON IDV utility, it can display hypocenters under a volcano, a surface geology map of the volcano draped over 3D topographic relief, town locations and political boundaries, and real-time 3D weather radar clouds of volcanic ash in the atmosphere, with time animation. The GEON IDV can drive a GeoWall or other 3D stereo system. IDV output includes imagery, movies, and KML files that allow static IDV images to be viewed in Google Earth, where Google Earth can handle the display. The IDV can be scripted to create display images on user request or automatically on data arrival, offering the use of the IDV as a back end to support a data web site. We plan to extend the power of the IDV by accepting new data types and data services, such as GeoSciML. An active program of online and video training in GEON IDV use is planned. UNAVCO will support users who need assistance converting their data to the standard formats used by the GEON IDV. The UNAVCO Facility provides web-accessible support for Google Earth and Google Maps display of any of more than 9500 GPS stations and survey points, including metadata for each installation. UNAVCO provides corresponding Open Geospatial Consortium (OGC) web services with the same data. UNAVCO's goal is to facilitate data access, interoperability, and efficient searches, exploration, and use of data by promoting web services, standards for GEON IDV data formats and metadata, and software able to simultaneously read and display multiple data sources, formats, and map locations or projections. Retention and propagation of semantics and metadata with observational and experimental values is essential for interoperability and understanding diverse data sources.
Near real-time qualitative monitoring of lake water chlorophyll globally using Google Earth Engine
NASA Astrophysics Data System (ADS)
Zlinszky, András; Supan, Peter; Koma, Zsófia
2017-04-01
Monitoring ocean chlorophyll and suspended sediment has been made possible using optical satellite imaging, and has contributed immensely to our understanding of the Earth and its climate. However, lake water quality monitoring has limitations due to the optical complexity of shallow, sediment- and organic matter-laden waters. Meanwhile, timely and detailed information on basic lake water quality parameters would be essential for sustainable management of inland waters. Satellite-based remote sensing can deliver area-covering, high resolution maps of basic lake water quality parameters, but scientific application of these datasets for lake monitoring has been hindered by limitations to calibration and accuracy evaluation, and therefore access to such data has been the privilege of scientific users. Nevertheless, since for many inland waters satellite imaging is the only source of monitoring data, we believe it is urgent to make map products of chlorophyll and suspended sediment concentrations available to a wide range of users. Even if absolute accuracy cannot be validated, patterns, processes and qualitative information delivered by such datasets in near-real time can act as an early warning system, raise awareness of water quality processes and serve education, in addition to complementing local monitoring activities. By making these datasets openly available on the internet through an easy-to-use framework, dialogue between stakeholders, management and governance authorities can be facilitated. We use Google Earth Engine to access and process archive and current satellite data. Google Earth Engine is a development and visualization framework that provides access to satellite datasets and processing capacity for analysis at the petabyte scale. Based on earlier investigations, we chose the fluorescence line height (FLH) index to represent water chlorophyll concentration. This index relies on the chlorophyll fluorescence peak at 680 nm, and has been tested for open-ocean as well as inland lake situations with MODIS and MERIS satellite sensor data. In addition to being relatively robust and less sensitive to atmospheric influence, this algorithm is also very simple, being based on the height of the 680 nm peak above the linear interpolation of the two neighbouring bands. However, not all satellite datasets suitable for FLH are catalogued in Google Earth Engine. In the current testing phase, Landsat 7, Landsat 8 (30 m resolution), and Sentinel 2 (20 m) are being tested. Landsat 7 has a suitable band configuration, but suffers from a striping error due to a sensor problem. Landsat 8 and Sentinel 2 lack a single spectral band optimal for FLH. Sentinel 3 would be an optimal data source and has shown good performance during small-scale initial tests, but is not distributed globally in Google Earth Engine. In addition to FLH data from these satellites, our system delivers cloud and ice masking, qualitative suspended sediment data (based on the band closest to 600 nm) and true colour images, all within an easy-to-use Google Maps background. This allows on-demand understanding and interpretation of water quality patterns and processes in near real time. While the system is still under development, we believe it could significantly contribute to lake water quality management and monitoring worldwide.
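The fluorescence line height computation described above reduces to the height of the red-edge peak above a baseline interpolated between its two neighbouring bands. Below is a minimal per-pixel sketch; the band wavelengths and reflectance values are illustrative only, and an operational system would apply this over whole scenes.

```python
def fluorescence_line_height(r_left, r_peak, r_right,
                             wl_left=665.0, wl_peak=680.0, wl_right=709.0):
    """Height of the ~680 nm peak above the straight line joining the two
    neighbouring bands (wavelengths in nm, reflectances unitless)."""
    baseline = r_left + (r_right - r_left) * (wl_peak - wl_left) / (wl_right - wl_left)
    return r_peak - baseline

# Illustrative reflectances for a single water pixel
print(fluorescence_line_height(0.031, 0.038, 0.029))
```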
NASA Astrophysics Data System (ADS)
Welton, B.; Chouinard, K.; Sultan, M.; Becker, D.; Milewski, A.; Becker, R.
2010-12-01
Rising populations in the arid and semi-arid parts of the world are increasing the demand for fresh water supplies worldwide. Many data sets needed for assessment of hydrologic applications across vast regions of the world are expensive, unpublished, difficult to obtain, or at varying scales, which complicates their use. Fortunately, this situation is changing with the development of global remote sensing datasets and web-based platforms such as ArcGIS Server. GIS provides a cost-effective vehicle for comparing, analyzing, and querying a variety of spatial datasets as geographically referenced layers. We have recently constructed a web-based GIS that incorporates all relevant geological, geochemical, geophysical, and remote sensing data sets, which were readily used to identify reservoir types and potential well locations on local and regional scales in various tectonic settings including: (1) extensional environments (Red Sea rift), (2) transcurrent fault systems (Najd Fault in the Arabian-Nubian Shield), and (3) compressional environments (Himalayas). The web-based GIS could also be used to detect spatial and temporal trends in precipitation, recharge, and runoff in large watersheds on local, regional, and continental scales. These applications were enabled through the construction of a web-based ArcGIS Server with a Google Maps interface and the development of customized geoprocessing tools. ArcGIS Server provides out-of-the-box setups that are generic in nature. This platform includes all of the standard web-based GIS tools (e.g. pan, zoom, identify, search, data querying, and measurement). In addition to the standard suite of tools provided by ArcGIS Server, an additional set of advanced data manipulation and display tools was also developed to allow for a more complete and customizable view of the area of interest. The most notable addition to the standard ArcGIS Server tools is the custom on-demand geoprocessing tools (e.g., graph, statistical functions, custom raster creation, profile, TRMM). The generation of a wide range of derivative maps (e.g., buffer zone, contour map, graphs, temporal rainfall distribution maps) from various map layers (e.g., geologic maps, geophysics, satellite images) allows for more user flexibility. The use of these tools, along with the Google Maps API, which enables the website user to utilize high-quality GeoEye 2 images provided by Google in conjunction with our data, creates a more complete image of the area being observed and allows custom derivative maps to be created in the field and viewed immediately on the web, processes that were previously restricted to offline databases.
The Dimensions of the Solar System
ERIC Educational Resources Information Center
Schneider, Stephen E.; Davis, Kathleen S.
2007-01-01
A few new wrinkles have been added to the popular activity of building a scale model of the solar system. Students can learn about maps and scaling using easily accessible online resources that include satellite images. This is accomplished by taking advantage of some of the special features of Google Earth. This activity gives students a much…
Secure and Resilient Cloud Computing for the Department of Defense
2015-11-16
platform as a service (PaaS), and software as a service (SaaS)—that target system administrators, developers, and end-users respectively (see Table 2). [Table 2 excerpt: PaaS — application programming interfaces (APIs) and services — Medium — e.g., Amazon Elastic MapReduce, MathWorks Cloud, Red Hat OpenShift; SaaS — full-fledged applications — Low — e.g., Google Gmail.]
Geospatial Services in Special Libraries: A Needs Assessment Perspective
ERIC Educational Resources Information Center
Barnes, Ilana
2013-01-01
Once limited to geographers and mapmakers, Geographic Information Systems (GIS) has taken a growing central role in information management and visualization. Geospatial services run a gamut of different products and services from Google maps to ArcGIS servers to Mobile development. Geospatial services are not new. Libraries have been writing about…
Map Scale, Proportion, and Google[TM] Earth
ERIC Educational Resources Information Center
Roberge, Martin C.; Cooper, Linda L.
2010-01-01
Aerial imagery has a great capacity to engage and maintain student interest while providing a contextual setting to strengthen their ability to reason proportionally. Free, on-demand, high-resolution, large-scale aerial photography provides both a bird's eye view of the world and a new perspective on one's own community. This article presents an…
Midekisa, Alemayehu; Holl, Felix; Savory, David J; Andrade-Pacheco, Ricardo; Gething, Peter W; Bennett, Adam; Sturrock, Hugh J W
2017-01-01
Quantifying and monitoring the spatial and temporal dynamics of the global land cover is critical for better understanding many of the Earth's land surface processes. However, the lack of regularly updated, continental-scale, and high spatial resolution (30 m) land cover data limits our ability to better understand the spatial extent and the temporal dynamics of land surface changes. Despite the free availability of high spatial resolution Landsat satellite data, continental-scale land cover mapping using high resolution Landsat satellite data was not feasible until now due to the need for high-performance computing to store, process, and analyze this large volume of high resolution satellite data. In this study, we present an approach to quantify continental land cover and impervious surface changes over a long period of time (15 years) using high resolution Landsat satellite observations and the Google Earth Engine cloud computing platform. The approach applied here to overcome the computational challenges of handling big earth observation data by using cloud computing can help scientists and practitioners who lack high-performance computational resources.
Detecting Potential Water Quality Issues by Mapping Trophic Status Using Google Earth Engine
NASA Astrophysics Data System (ADS)
Nguy-Robertson, A. L.; Harvey, K.; Huening, V.; Robinson, H.
2017-12-01
The identification, timing, and spatial distribution of recurrent algal blooms and aquatic vegetation can help water managers and policy makers make better water resource decisions. In many parts of the world there is little monitoring or reporting of water quality due to the required costs and effort to collect and process water samples. We propose to use Google Earth Engine to quickly identify the recurrence of trophic states in global inland water systems. Utilizing Landsat and Sentinel multispectral imagery, inland water quality parameters (i.e. chlorophyll a concentration) can be estimated and waters can be classified by trophic state: oligotrophic, mesotrophic, eutrophic, and hypereutrophic. The recurrence of eutrophic and hypereutrophic observations can highlight potentially problematic locations where algal blooms or aquatic vegetation occur routinely. Eutrophic and hypereutrophic waters commonly include many harmful algal blooms and waters prone to fish die-offs from hypoxia. While these maps may be limited by the accuracy of the algorithms utilized to estimate chlorophyll a, relative comparisons at a local scale can help water managers to focus limited resources.
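A minimal sketch of the classification step described above: chlorophyll-a estimates, however derived, are binned into the four trophic states and the recurrence of problem states is counted. The concentration boundaries below are illustrative values loosely following common trophic-state conventions, not thresholds stated by the authors.

```python
def trophic_state(chl_a_ug_per_l):
    """Classify a chlorophyll-a concentration (ug/L) into a trophic state.
    Boundary values are illustrative only."""
    if chl_a_ug_per_l < 2.6:
        return "oligotrophic"
    elif chl_a_ug_per_l < 7.3:
        return "mesotrophic"
    elif chl_a_ug_per_l < 56.0:
        return "eutrophic"
    else:
        return "hypereutrophic"

# Hypothetical time series of chlorophyll-a estimates for one lake pixel
observations = [3.1, 12.4, 61.0, 8.8, 2.2, 75.3]
recurrence = sum(trophic_state(c) in ("eutrophic", "hypereutrophic") for c in observations)
print(f"{recurrence} of {len(observations)} observations were eutrophic or worse")
```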
A landslide susceptibility map of Africa
NASA Astrophysics Data System (ADS)
Broeckx, Jente; Vanmaercke, Matthias; Duchateau, Rica; Poesen, Jean
2017-04-01
Studies on landslide risks and fatalities indicate that landslides are a global threat to humans, infrastructure and the environment, certainly in Africa. Nonetheless, our understanding of the spatial patterns of landslides and rockfalls on this continent is very limited. Also in global landslide susceptibility maps, Africa is mostly underrepresented in the inventories used to construct these maps. As a result, predicted landslide susceptibilities remain subject to very large uncertainties. This research aims to produce a first continent-wide landslide susceptibility map for Africa, calibrated with a well-distributed landslide dataset. As a first step, we compiled all available landslide inventories for Africa. These data were supplemented by additional landslide mapping with Google Earth in underrepresented regions. This way, we compiled 60 landslide inventories from the literature (ca. 11000 landslides) and an additional 6500 landslides through mapping in Google Earth (including 1500 rockfalls). Various environmental variables such as slope, lithology, soil characteristics, land use, precipitation and seismic activity were investigated for their significance in explaining the observed spatial patterns of landslides. To account for potential mapping biases in our dataset, we used Monte Carlo simulations that selected different subsets of mapped landslides, tested the significance of the considered environmental variables and evaluated the performance of the fitted multiple logistic regression model against another subset of mapped landslides. Based on these analyses, we constructed two landslide susceptibility maps for Africa: one for all landslide types and one excluding rockfalls. In both maps, topography, lithology and seismic activity were the most significant variables. The latter factor may be surprising, given the overall limited degree of seismicity in Africa. However, its significance indicates that frequent seismic events may serve as an important preparatory factor for landslides. This finding concurs with several other recent studies. Rainfall explains a significant but limited part of the observed landslide pattern and becomes insignificant when rockfalls are also considered. This may be explained by the fact that a significant fraction of the mapped rockfalls occurred in the Sahara desert. Overall, both maps perform well in predicting intra-continental patterns of mass movements in Africa and explain about 80% of the observed variance in landslide occurrence. As a result, these maps may be a valuable tool for planning and risk reduction strategies.
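The Monte Carlo procedure outlined above, repeatedly fitting a multiple logistic regression on one subset of mapped landslides and evaluating it on another, could look roughly like the sketch below. The variable names, the synthetic feature table, and the evaluation metric are assumptions for illustration, not the authors' code or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical predictor table: slope, lithology score, seismicity, rainfall (one row per cell)
X = rng.random((5000, 4))
# Hypothetical presence/absence of mapped landslides, loosely tied to slope and seismicity
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.3, 5000) > 1.0).astype(int)

aucs = []
for _ in range(100):  # Monte Carlo runs over different landslide subsets
    X_fit, X_eval, y_fit, y_eval = train_test_split(X, y, test_size=0.3)
    model = LogisticRegression(max_iter=1000).fit(X_fit, y_fit)
    aucs.append(roc_auc_score(y_eval, model.predict_proba(X_eval)[:, 1]))

print(f"mean AUC over runs: {np.mean(aucs):.2f} +/- {np.std(aucs):.2f}")
```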
NASA Astrophysics Data System (ADS)
McCarthy, K.
2017-12-01
NASA's Operation IceBridge (OIB), the largest airborne survey of Earth's polar ice, uses remote sensing methods to collect data on changing sea and land ice. PolarTREC teacher Kelly McCarthy joined the team during the 2016 Spring Arctic Campaign. This presentation explores ways in which K-12 students were engaged in the work being done by OIB through classroom learning experiences, digital communications, and independent research. Initially, digital communication including chats via NASA's Mission Tools Suite for Education (MTSE) platform was leveraged to engage students in the daily work of OIB. Two lessons were piloted with student groups during the 2016-2017 academic year, both for students who actively engaged in communications with the team during the expedition and those who had no prior connections to the field. All of the data collected on OIB missions are stored for public use in a digital portal on the National Snow and Ice Data Center (NSIDC) website. In one lesson, 10th-12th grade students were guided through a tutorial to learn how to access data and begin to develop a story about Greenland's Jakobshavn Glacier using pre-selected data sets, Google's MyMaps app, and independent research methods. In the second lesson, 8th grade students were introduced to remote sensing, first through a discussion on vocabulary using productive talk moves and then via a demonstration using Vernier motion detectors and a graph matching simulation. Students worked in groups to develop procedures to map a hidden surface region (a boxed assortment of miscellaneous objects) using a Vernier motion sensor to simulate sonar. Students translated data points collected from the motion sensor into a vertical profile of the simulated surface region. Both lessons allowed students to engage in two of the most important components of OIB. The ability to work with real data collected by the OIB team provided a unique context through which students gained skill and overcame challenges in Excel, Google Apps, construction of graphs, and data analysis. The remote sensing simulation allowed students to practice and gain hands-on knowledge of the components of OIB discussed in the digital communications that may have felt unclear to students who have had limited or no exposure to remote sensing technologies or the science behind them.
Ameisen, David; Deroulers, Christophe; Perrier, Valérie; Bouhidel, Fatiha; Battistella, Maxime; Legrès, Luc; Janin, Anne; Bertheau, Philippe; Yunès, Jean-Baptiste
2014-01-01
Since microscopic slides can now be automatically digitized and integrated in the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation in multiple sites, both public university-hospitals and private entities. It is part of the FlexMIm R&D project which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Development and testing have been carried out on a MacBook Pro i7 and on a bi-Xeon 2.7GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using VIPS and Openslide libraries. We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real-time (3 billion pixels per minute). Tests were made on 5000 single images, 200 NDPI WSI, 100 Aperio SVS WSI converted to the Google Maps format. Applications based on our method and libraries can be used upstream, as calibration and quality control tool for the WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire the complete slides that are below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions. Such quality assessment scores could be integrated as WSI's metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics workflow.
Snake River Plain Geothermal Play Fairway Analysis - Phase 1 KMZ files
John Shervais
2015-10-10
This dataset contains raw data in KMZ files (Google Earth georeferenced format). These files include volcanic vent locations and ages, the distribution of fine-grained lacustrine sediments (which act as both a seal and an insulating layer for hydrothermal fluids), and post-Miocene faults compiled from the Idaho Geological Survey, the USGS Quaternary Fault database, and unpublished mapping. It also contains the Composite Common Risk Segment Map created during Phase 1 studies, as well as a file with locations of select deep wells used to interrogate the subsurface.
Combating Conflict Related Sexual Violence: More Than a Stability Concern
2014-06-13
violence can cause serious bodily harm or mental harm to members of the group (International Criminal Court 2002, 3; Ellis 2007). Under crimes against… population was subjugated to Japanese rule and were provided with horrific visual, physical, and emotional reminders of the futility of any… [embedded map: maps.google.com/maps/ms?msid=214870171076954118166.0004b9bcb533b0ee2c1f8&msa=0&ie=UTF8&ll=12.848235,58.136902&spn=43.135476,135.258178&t=m&output=embed]
NASA Astrophysics Data System (ADS)
Li, P.; Turk, J.; Vu, Q.; Knosp, B.; Hristova-Veleva, S. M.; Lambrigtsen, B.; Poulsen, W. L.; Licata, S.
2009-12-01
NASA is planning a new field experiment, the Genesis and Rapid Intensification Processes (GRIP) experiment, in the summer of 2010 to better understand how tropical storms form and develop into major hurricanes. The DC-8 aircraft and the Global Hawk Unmanned Airborne System (UAS) will be deployed loaded with instruments for measurements including lightning, temperature, 3D wind, precipitation, liquid and ice water contents, and aerosol and cloud profiles. During the field campaign, both the spaceborne and the airborne observations will be collected in real time and integrated with the hurricane forecast models. This observation-model integration will help the campaign achieve its science goals by allowing team members to effectively plan the mission with current forecasts. To support the GRIP experiment, JPL developed a website for interactive visualization of all related remote-sensing observations in GRIP's geographical domain using the new Google Earth API. All the observations are collected in near real-time (NRT) with 2 to 5 hour latency. The observations include a 1 km blended Sea Surface Temperature (SST) map from GHRSST L2P products; 6-hour composite images of GOES IR; stability indices, temperature and vapor profiles from AIRS and AMSU-B; microwave brightness temperature and rain index maps from AMSR-E, SSMI and TRMM-TMI; ocean surface wind vectors, vorticity and divergence of the wind from QuikSCAT; the 3D precipitation structure from TRMM-PR; and vertical profiles of cloud and precipitation from CloudSAT. All the NRT observations are collected from the data centers and science facilities at NASA and NOAA, subsetted, re-projected, and composited into hourly or daily data products depending on the frequency of the observation. The data products are then displayed on the 3D Google Earth plug-in at the JPL Tropical Cyclone Information System (TCIS) website. The data products offered by the TCIS in the Google Earth display include image overlays, wind vectors, clickable placemarks with vertical profiles of temperature and water vapor, and curtain plots along the satellite tracks. Multiple products can be overlaid, each with individually adjustable opacity control. Time-sequence visualization is supported by a calendar and Google Earth time animation. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.
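The kind of image-overlay product delivered to the Google Earth plug-in can be sketched with the simplekml library; the overlay URL, bounding box, and opacity below are placeholders, and the TCIS itself may generate its KML by other means.

```python
import simplekml

kml = simplekml.Kml()

# Hypothetical hourly SST composite served as a pre-rendered image
overlay = kml.newgroundoverlay(name="Blended SST (hourly composite)")
overlay.icon.href = "https://example.org/products/sst_latest.png"  # placeholder URL
overlay.latlonbox.north = 40.0
overlay.latlonbox.south = 10.0
overlay.latlonbox.west = -100.0
overlay.latlonbox.east = -50.0
# Semi-transparent so other layers remain visible underneath
overlay.color = simplekml.Color.changealphaint(160, simplekml.Color.white)

kml.save("sst_overlay.kml")
```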
Accuracy comparison in mapping water bodies using Landsat images and Google Earth Images
NASA Astrophysics Data System (ADS)
Zhou, Z.; Zhou, X.
2016-12-01
A great deal of research has been done on the extraction of water bodies from satellite images. Water indices computed from multi-spectral images are the most commonly used methods for water body extraction. When extracting the area of water bodies from satellite images, accuracy may depend on the spatial resolution of the images and the relative size of the water bodies. To quantify the impact of spatial resolution and size (major and minor lengths) of the water bodies on the accuracy of water area extraction, we use Georgetown Lake, Montana and coalbed methane (CBM) water retention ponds in the Montana Powder River Basin as test sites to evaluate the impact of spatial resolution and the size of water bodies on water area extraction. Data sources used include Landsat images and Google Earth images covering both large water bodies and small ponds. First, we used water indices to extract water coverage from Landsat images for both the large lake and the small ponds. Second, we used a newly developed visible-index method to extract water coverage from Google Earth images covering both the large lake and the small ponds. Third, we used an image fusion method in which the Google Earth images are fused with multi-spectral Landsat images to obtain multi-spectral images with the same high spatial resolution as the Google Earth images. The actual areas of the lake and ponds were measured using GPS surveys. Results will be compared and the optimal method will be selected for water body extraction.
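A minimal sketch of the water-index step, assuming green and near-infrared reflectance arrays are already loaded; the band choice, threshold, and pixel size are illustrative, since the study compares several indices and image sources.

```python
import numpy as np

def ndwi_water_mask(green, nir, threshold=0.0):
    """McFeeters-style NDWI = (green - nir) / (green + nir); pixels above
    the threshold are flagged as water. The threshold is illustrative."""
    ndwi = (green - nir) / (green + nir + 1e-10)
    return ndwi > threshold

# Hypothetical 2x3 reflectance arrays for a tiny scene
green = np.array([[0.10, 0.12, 0.30], [0.09, 0.28, 0.31]])
nir = np.array([[0.30, 0.28, 0.05], [0.29, 0.06, 0.04]])

mask = ndwi_water_mask(green, nir)
pixel_area_m2 = 30 * 30  # Landsat pixel size
print(f"water area: {mask.sum() * pixel_area_m2} m^2")
```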
Marshall, Zack; Brunger, Fern; Welch, Vivian; Asghari, Shabnam; Kaposy, Chris
2018-02-26
This paper focuses on the collision of three factors: a growing emphasis on sharing research through open access publication, an increasing awareness of big data and its potential uses, and an engaged public interested in the privacy and confidentiality of their personal health information. One conceptual space where this collision is brought into sharp relief is with the open availability of patient medical photographs from peer-reviewed journal articles in the search results of online image databases such as Google Images. The aim of this study was to assess the availability of patient medical photographs from published journal articles in Google Images search results and the factors impacting this availability. We conducted a cross-sectional study using data from an evidence map of research with transgender, gender non-binary, and other gender diverse (trans) participants. For the original evidence map, a comprehensive search of 15 academic databases was developed in collaboration with a health sciences librarian. Initial search results produced 25,230 references after duplicates were removed. Eligibility criteria were established to include empirical research of any design that included trans participants or their personal information and that was published in English in peer-reviewed journals. We identified all articles published between 2008 and 2015 with medical photographs of trans participants. For each reference, images were individually numbered in order to track the total number of medical photographs. We used odds ratios (OR) to assess the association between availability of the clinical photograph on Google Images and the following factors: whether the article was openly available online (open access, Researchgate.net, or Academia.edu), whether the article included genital images, if the photographs were published in color, and whether the photographs were located on the journal article landing page. We identified 94 articles with medical photographs of trans participants, including a total of 605 photographs. Of the 94 publications, 35 (37%) included at least one medical photograph that was found on Google Images. The ability to locate the article freely online contributes to the availability of at least one image from the article on Google Images (OR 2.99, 95% CI 1.20-7.45). This is the first study to document the existence of medical photographs from peer-reviewed journals appearing in Google Images search results. Almost all of the images we searched for included sensitive photographs of patient genitals, chests, or breasts. Given that it is unlikely that patients consented to sharing their personal health information in these ways, this constitutes a risk to patient privacy. Based on the impact of current practices, revisions to informed consent policies and guidelines are required. ©Zack Marshall, Fern Brunger, Vivian Welch, Shabnam Asghari, Chris Kaposy. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 26.02.2018.
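The odds-ratio estimate reported above can be reproduced from a standard 2x2 table. In the sketch below the cell counts are hypothetical placeholders (the abstract reports only the resulting OR and confidence interval), and the Wald interval shown is one common choice among several.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts: articles freely available online vs. not,
# cross-tabulated against whether an image surfaced on Google Images
print(odds_ratio_ci(a=28, b=30, c=7, d=29))
```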
Physics collaboration and communication through emerging media: *odcasts, blogs and wikis
NASA Astrophysics Data System (ADS)
Clark, Charles W.; Williams, Jamie
2006-05-01
The entertainment and news industries are being transformed by the emergence of innovative, internet-based media tools. Audio and video downloads are beginning to compete with traditional entertainment distribution channels, and the blogosphere has become an alternative press with demonstrated news-making power of its own. The scientific community, and physics in particular, is just beginning to experiment with these tools. We believe that they have great potential for enhancing the quality and effectiveness of collaboration and communication, and that the coming generation of physicists will expect them to be used creatively. We will report on our experience in producing seminar podcasts (google ``QIBEC'' or search ``quantum'' on Apple iTunes), and on operating a distributed research institute using a group-based blog.
Sedimentation and erosion in Lake Diefenbaker, Canada: solutions for shoreline retreat monitoring.
Sadeghian, Amir; de Boer, Dirk; Lindenschmidt, Karl-Erich
2017-09-15
This study looks into sedimentation and erosion rates in Lake Diefenbaker, a prairie reservoir in Saskatchewan, Canada, which has been in operation since 1968. First, we looked at the historical data in all different formats over the last 70 years, which includes data from more than 20 years before the formation of the lake. The field observations indicate high rates of shoreline erosion, especially in the upstream portion, as a potential region for shoreline retreat. Because of the great importance of this waterbody to the province, monitoring sedimentation and erosion rates is necessary for maintaining water quality, especially after severe floods, which are becoming more common due to climate change effects. Second, we used the Google Maps Elevation API, a new tool from Google that provides elevation data for cross sections drawn between two points, by drawing 24 cross sections in the upstream area extending 250 m from each bank. This feature from Google can be used as an easy and fast monitoring tool, is free of charge, and provides excellent control capabilities for monitoring changes in cross-sectional profiles.
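A cross-section like those described can be requested from the Google Maps Elevation API with a single HTTP call. The endpoint and parameters below follow the public API documentation, while the coordinates, sample count, and API key are placeholders.

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
start = (51.030, -106.850)  # hypothetical point on one bank
end = (51.030, -106.830)    # hypothetical point across the section

resp = requests.get(
    "https://maps.googleapis.com/maps/api/elevation/json",
    params={
        "path": f"{start[0]},{start[1]}|{end[0]},{end[1]}",
        "samples": 50,  # number of elevation points along the cross section
        "key": API_KEY,
    },
    timeout=30,
)
profile = [(pt["location"]["lat"], pt["location"]["lng"], pt["elevation"])
           for pt in resp.json().get("results", [])]
print(profile[:3])
```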
2016-02-27
Sam Choi and Naiara Pinto observe Google Earth overlaid, in almost real time, with what the synthetic aperture radar is mapping from the C-20A aircraft. Researchers were in the sky and on the ground to take measurements of plant mass, the distribution of trees, shrubs, and ground cover, the diversity of plants, and how much carbon is absorbed by them.
Ingress in Geography: Portals to Academic Success?
ERIC Educational Resources Information Center
Davis, Michael
2017-01-01
Niantic Labs has developed an augmented virtual reality mobile app game called Ingress in which agents must seek out and control locations for their designated factions. The app uses the Google Maps interface along with GPS to enhance a geocaching-like experience with elements of other classical games such as capture-the-flag. This study aims to…
Applying Modern Stage Theory to Mauritania: A Prescription to Encourage Entrepreneurship
2014-12-01
[Report documentation excerpt] Keywords: entrepreneurship, stage theory, development, Africa, factor-driven, trade freedom, business freedom. Thesis from the Naval Postgraduate School, December 2014; author: Jennifer M. Warren. Figure 2: Satellite map of West Africa (from Google Earth).
Re-Purposing Google Maps Visualisation for Teaching Logistics Systems
ERIC Educational Resources Information Center
Cheong, France; Cheong, Christopher; Jie, Ferry
2012-01-01
Routing is the process of selecting appropriate paths and ordering waypoints in a network. It plays an important part in logistics and supply chain management as choosing the optimal route can minimise distribution costs. Routing optimisation, however, is a difficult problem to solve and computer software is often used to determine the best route.…
Developing a scientific procedure for community based hazard mapping and risk mitigation
NASA Astrophysics Data System (ADS)
Verrier, M.
2011-12-01
As an international exchange student from the Geological Sciences Department at San Diego State University (SDSU), I joined the KKN-PPM program at Universitas Gadjah Mada (UGM), Yogyakarta, Indonesia, in July 2011 for 12 days (July 4th to July 16th) of its two-month duration (July 4th to August 25th). The KKN-PPM group I was attached to was designated 154 and was focused on Plosorejo Village, Karanganyar, Kerjo, Central Java, Indonesia. The mission of KKN-PPM 154 was to survey Plosorejo village for existing landslides, to generate a simple hazard susceptibility map that can be understood by local villagers, and then to begin dissemination of that map into the community. To generate our susceptibility map we first conducted a geological survey of the existing landslides in the field study area, with a focus on determining landslide triggers and gauging areas of susceptibility to future landslides. The methods for gauging susceptibility included lithological observation, the presence of linear cracking, and visible loss of structural integrity in structures such as villager homes, as well as collaboration with local residents and with the local rescue and response team. Three color distinctions were used to represent susceptibility: green, where there is no immediate danger of landslide damage; orange, where transportation routes are at risk of being disrupted by landslides; and red, where imminent landslide potential puts a home in direct danger. The landslide inventory and susceptibility data were compiled into digital media such as CorelDraw, ArcGIS and Google Earth. Once a technical map was generated, we presented it to the village leadership for confirmation and modification based on their experience. Finally, we began to use the technical susceptibility map to draft evacuation routes and meeting points in the event of landslides, as well as simple susceptibility maps that can be understood and utilized by local villagers. Landslide mitigation projects being conducted alongside the community hazard map include marking evacuation routes with painted bamboo signs, creating a meaningful landslide awareness mural, and installing simple early warning systems that detect land movement and alert residents that evacuation routes should be used. KKN-PPM is scheduled to continue until August 25th, 2011. In the future, research will be done into using the model for community-based hazard mapping outlined here in the Geological Sciences Department at SDSU to increase georisk awareness and improve mitigation of landslides in local areas of need such as Tijuana, Mexico.
Regional early flood warning system: design and implementation
NASA Astrophysics Data System (ADS)
Chang, L. C.; Yang, S. N.; Kuo, C. L.; Wang, Y. F.
2017-12-01
This study proposes a prototype of a regional early flood inundation warning system in Tainan City, Taiwan. AI technology is used to forecast multi-step-ahead regional flood inundation maps during storm events. The computing time is only a few seconds, which enables real-time regional flood inundation forecasting. A database is built to organize data and information for building real-time forecasting models, maintaining the relations of forecasted points, and displaying forecasted results, while real-time data acquisition is another key task, as the model requires immediate access to rain gauge information to provide forecast services. All database-related programs are constructed in Microsoft SQL Server using Visual C# to extract real-time hydrological data, manage data, store the forecasted data, and provide the information to the visual map-based display. The regional early flood inundation warning system uses up-to-date Web technologies, driven by the database and real-time data acquisition, to display the on-line forecast flood inundation depths in the study area. The user-friendly interface sequentially shows the inundated area on Google Maps, along with the maximum inundation depth and its location, and provides a KMZ file download of the results, which can be viewed in Google Earth. The developed system can provide all the relevant information and on-line forecast results that help city authorities make decisions during typhoon events and take actions to mitigate losses.
Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping
NASA Astrophysics Data System (ADS)
Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.
2017-12-01
Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be at regional to national levels and cover long time periods. As a result of large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain to develop operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat imagery (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive combined with other satellite, climate, and weather data, is creating never imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field-scales.
How Would You Move Mount Fuji - And Why Would You Want To?
NASA Astrophysics Data System (ADS)
de Paor, D. G.
2008-12-01
According to author William Poundstone, "How Would You Move Mt Fuji?" typifies the kind of question that corporations such as Microsoft are wont to ask job applicants in order to test their lateral thinking skills. One answer (albeit not one that would necessarily secure a job at Microsoft) is: "With Google Earth and a Macintosh or PC." The answer to the more profound follow-up question "Why Would You Want To?" is hinted at by one of the great quotations of earth science, namely Charles Lyell's proposition that "The Present Is Key to the Past." Google Earth is a phenomenally powerful tool for visualizing today's earth, ocean, and atmosphere. With the aid of Google SketchUp, that visualization can be extended to reconstruct the past using relocated samples of present-day landscapes and environments as models of paleo-DEM and paleogeography. Volcanoes are particularly useful models because their self-similar growth can be simulated by changing KML altitude tags within a timespan, but numerous other landforms and geologic structures serve as useful keys to the past. Examples range in scale from glaciers and fault scarps to island arcs and mountain ranges. The ability to generate a paleo-terrain model in Google Earth brings us one step closer to a truly four-dimensional, interactive geological map of the world throughout time.
Seismicity map tools for earthquake studies
NASA Astrophysics Data System (ADS)
Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos
2014-05-01
We report on the development of a new, online set of tools for use within Google Maps for earthquake research. We demonstrate this server-based online platform (developed with PHP, JavaScript, and MySQL) and its new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data using Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw line segments on the map, multiple straight lines horizontally and vertically, as well as multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and earthquake cluster shift within the segments in space. The platform offers many filters, such as plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the 'b' value, etc. What is novel about the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
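One of the statistical quantities mentioned, the Gutenberg-Richter 'b' value, can be estimated from a magnitude catalogue with the Aki maximum-likelihood formula. The sketch below is a generic illustration under assumed inputs (the catalogue, completeness magnitude, and bin width are placeholders), not the platform's own code.

```python
import numpy as np

def b_value_aki(magnitudes, completeness_mag, bin_width=0.1):
    """Aki (1965) maximum-likelihood b-value with a half-bin correction.
    Only events at or above the completeness magnitude are used."""
    m = np.asarray(magnitudes)
    m = m[m >= completeness_mag]
    return np.log10(np.e) / (m.mean() - (completeness_mag - bin_width / 2))

# Hypothetical catalogue magnitudes
catalogue = [2.1, 2.3, 2.0, 3.4, 2.8, 2.2, 4.1, 2.6, 3.0, 2.4, 2.9, 5.2]
print(f"b value ~ {b_value_aki(catalogue, completeness_mag=2.0):.2f}")
```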
The New USGS Volcano Hazards Program Web Site
NASA Astrophysics Data System (ADS)
Venezky, D. Y.; Graham, S. E.; Parker, T. J.; Snedigar, S. F.
2008-12-01
The U.S. Geological Survey's (USGS) Volcano Hazard Program (VHP) has launched a revised web site that uses a map-based interface to display hazards information for U.S. volcanoes. The web site is focused on better communication of hazards and background volcano information to our varied user groups by reorganizing content based on user needs and improving data display. The Home Page provides a synoptic view of the activity level of all volcanoes for which updates are written using a custom Google® Map. Updates are accessible by clicking on one of the map icons or clicking on the volcano of interest in the adjacent color-coded list of updates. The new navigation provides rapid access to volcanic activity information, background volcano information, images and publications, volcanic hazards, information about VHP, and the USGS volcano observatories. The Volcanic Activity section was tailored for emergency managers but provides information for all our user groups. It includes a Google® Map of the volcanoes we monitor, an Elevated Activity Page, a general status page, information about our Volcano Alert Levels and Aviation Color Codes, monitoring information, and links to monitoring data from VHP's volcano observatories: Alaska Volcano Observatory (AVO), Cascades Volcano Observatory (CVO), Long Valley Observatory (LVO), Hawaiian Volcano Observatory (HVO), and Yellowstone Volcano Observatory (YVO). The YVO web site was the first to move to the new navigation system and we are working on integrating the Long Valley Observatory web site next. We are excited to continue to implement new geospatial technologies to better display our hazards and supporting volcano information.
NASA Astrophysics Data System (ADS)
Drosos, Vasileios C.; Liampas, Sarantis-Aggelos G.; Doukas, Aristotelis-Kosmas G.
2014-08-01
Geographic Information Systems (GIS) have become important tools, not only in the geosciences and environmental sciences, but also in virtually all research that requires monitoring, planning, or land management. The purpose of this paper was to develop a planning and decision-making tool using AutoCAD Map software, ArcGIS and Google Earth, with emphasis on investigating the suitability of forest road mapping and the scope of its implementation in Greece at the prefecture level. Integrating spatial information into a database makes data available throughout the organization, improving quality, productivity, and data management. Working in such an environment, one can access and edit information, integrate and analyze data, and communicate effectively. Desired information, such as the forest road network, can be selected at a very early stage in the planning of silviculture operations, for example before harvest planning is carried out. The software programs used were AutoCAD Map to export the GPS data to shapefiles, ArcGIS (ArcGlobe) for shapefiles, and Google Earth with KML (Keyhole Markup Language) files, in order to better visualize and evaluate existing conditions, design in a real-world context, and exchange information with government agencies, utilities, and contractors in both CAD and GIS data formats. Automating the updating procedure and the transfer of files between agencies and departments is one of the main tasks the integrated GIS tool should address.
Google Earth-Based Grand Tours of the World's Ocean Basins and Marine Sediments
NASA Astrophysics Data System (ADS)
St John, K. K.; De Paor, D. G.; Suranovic, B.; Robinson, C.; Firth, J. V.; Rand, C.
2016-12-01
The GEODE project has produced a collection of Google Earth-based marine geology teaching resources that offer grand tours of the world's ocean basins and marine sediments. We use a map of oceanic crustal ages from Müller et al (2008; doi:10.1029/2007GC001743), and a set of emergent COLLADA models of IODP drill core data as a basis for a Google Earth tour introducing students to the world's ocean basins. Most students are familiar with basic seafloor spreading patterns but teaching experience suggests that few students have an appreciation of the number of abandoned ocean basins on Earth. Students also lack a valid visualization of the west Pacific where the oldest crust forms an isolated triangular patch and the ocean floor becomes younger towards the subduction zones. Our tour links geographic locations to mechanical models of rifting, seafloor spreading, subduction, and transform faulting. Google Earth's built-in earthquake and volcano data are related to ocean floor patterns. Marine sediments are explored in a Google Earth tour that draws on exemplary IODP core samples of a range of sediment types (e.g., turbidites, diatom ooze). Information and links are used to connect location to sediment type. This tour complements a physical core kit of core catcher sections that can be employed for classroom instruction (geode.net/marine-core-kit/). At a larger scale, we use data from IMLGS to explore the distribution of marine sediment types in the modern global ocean. More than 2,500 sites are plotted with access to original data. Students are guided to compare modern "type sections" of primary marine sediment lithologies, as well as examine site transects to address questions of bathymetric setting, ocean circulation, chemistry (e.g., CCD), and bioproductivity as influences on modern seafloor sedimentation. KMZ files, student exercises, and tips for instructors are available at geode.net/exploring-marine-sediments-using-google-earth.
ERIC Educational Resources Information Center
Lott, Kimberly; Read, Sylvia
2015-01-01
All writing begins with ideas, but young students often need visual cues to help them organize their thoughts before beginning to write. For this reason, many elementary teachers use graphic organizers or thinking maps to help students visualize patterns and organize their ideas within the different genres of writing. Graphic organizers such as…
NASA Astrophysics Data System (ADS)
Fritz, S.; Nordling, J.; See, L. M.; McCallum, I.; Perger, C.; Becker-Reshef, I.; Mucher, S.; Bydekerke, L.; Havlik, P.; Kraxner, F.; Obersteiner, M.
2014-12-01
The International Institute for Applied Systems Analysis (IIASA) has developed a global cropland extent map, which supports the monitoring and assessment activities of GEOGLAM (Group on Earth Observations Global Agricultural Monitoring Initiative). Through the European-funded SIGMA (Stimulating Innovation for Global Monitoring of Agriculture and its Impact on the Environment in support of GEOGLAM) project, IIASA is continuing to support GEOGLAM by providing cropland projections in the future and modelling environmental impacts on agriculture under various scenarios. In addition, IIASA is focusing on two specific elements within SIGMA: the development of a global field size and irrigation map; and mobile app development for in-situ data collection and validation of remotely-sensed products. Cropland field size is a very useful indicator for agricultural monitoring yet the information we have at a global scale is currently very limited. IIASA has already created a global map of field size at a 1 km resolution using crowdsourced data from Geo-Wiki as a first approximation. Using automatic classification of Landsat imagery and algorithms contained within Google Earth Engine, initial experimentation has shown that circular fields and landscape structures can easily be extracted. Not only will this contribute to improving the global map of field size, it can also be used to create a global map that contains a large proportion of the world's irrigated areas, which will be another useful contribution to GEOGLAM. The field size map will also be used to stratify and develop a global crop map in SIGMA. Mobile app development in support of in-situ data collection is another area where IIASA is currently working. An Android app has been built using the Open Data Toolkit (ODK) and extended further with spatial mapping capabilities called GeoODK. The app allows users to collect data on different crop types and delineate fields on the ground, which can be used to validate the field size map. The app can also cache map data so that high resolution satellite imagery and reference data from the users can be viewed in the field without the need for an internet connection. This app will be used for calibration and validation of the data products in SIGMA, e.g. data collection at JECAM (Joint Experiment of Crop Assessment and Monitoring) sites.
NASA Astrophysics Data System (ADS)
Raup, B. H.; Khalsa, S. S.; Armstrong, R.
2007-12-01
The Global Land Ice Measurements from Space (GLIMS) project has built a geospatial and temporal database of glacier data, composed of glacier outlines and various scalar attributes. These data are being derived primarily from satellite imagery, such as from ASTER and Landsat. Each "snapshot" of a glacier is from a specific time, and the database is designed to store multiple snapshots representative of different times. We have implemented two web-based interfaces to the database; one enables exploration of the data via interactive maps (web map server), while the other allows searches based on text-field constraints. The web map server is an Open Geospatial Consortium (OGC) compliant Web Map Server (WMS) and Web Feature Server (WFS). This means that other web sites can display glacier layers from our site over the Internet, or retrieve glacier features in vector format. All components of the system are implemented using Open Source software: Linux, PostgreSQL, PostGIS (geospatial extensions to the database), MapServer (WMS and WFS), and several supporting components such as Proj.4 (a geographic projection library) and PHP. These tools are robust and provide a flexible and powerful framework for web mapping applications. As a service to the GLIMS community, the database contains metadata on all ASTER imagery acquired over glacierized terrain. Reduced-resolution versions of the images (browse imagery) can be viewed either as a layer in the MapServer application, or overlaid on the virtual globe within Google Earth. The interactive map application allows the user to constrain by time what data appear on the map. For example, ASTER or glacier outlines from 2002 only, or from Autumn in any year, can be displayed. The system allows users to download their selected glacier data in a choice of formats. The results of a query based on spatial selection (using a mouse) or text-field constraints can be downloaded in any of these formats: ESRI shapefiles, KML (Google Earth), MapInfo, GML (Geography Markup Language) and GMT (Generic Mapping Tools). This "clip-and-ship" function allows users to download only the data they are interested in. Our flexible web interfaces to the database, which include various support layers (e.g. a layer to help collaborators identify satellite imagery over their region of expertise), will facilitate enhanced analysis to be undertaken on glacier systems, their distribution, and their impacts on other Earth systems.
NASA Astrophysics Data System (ADS)
Chen, Bangqian; Xiao, Xiangming; Li, Xiangping; Pan, Lianghao; Doughty, Russell; Ma, Jun; Dong, Jinwei; Qin, Yuanwei; Zhao, Bin; Wu, Zhixiang; Sun, Rui; Lan, Guoyu; Xie, Guishui; Clinton, Nicholas; Giri, Chandra
2017-09-01
Due to rapid losses of mangrove forests caused by anthropogenic disturbances and climate change, accurate and contemporary maps of mangrove forests are needed to understand how mangrove ecosystems are changing and establish plans for sustainable management. In this study, a new classification algorithm was developed using the biophysical characteristics of mangrove forests in China. More specifically, these forests were mapped by identifying: (1) greenness, canopy coverage, and tidal inundation from time series Landsat data, and (2) elevation, slope, and intersection-with-sea criterion. The annual mean Normalized Difference Vegetation Index (NDVI) was found to be a key variable in determining the classification thresholds of greenness, canopy coverage, and tidal inundation of mangrove forests, which are greatly affected by tide dynamics. In addition, the integration of the Sentinel-1A VH band and the modified Normalized Difference Water Index (mNDWI) shows great potential in identifying yearlong tidal and fresh water bodies, which are related to mangrove forests. This algorithm was developed using 6 typical Regions of Interest (ROIs) for algorithm training and was run on the Google Earth Engine (GEE) cloud computing platform to process 1941 Landsat images (25 Path/Row) and 586 Sentinel-1A images circa 2015. The resultant mangrove forest map of China at 30 m spatial resolution has overall, user's and producer's accuracies greater than 95% when validated with ground reference data. In 2015, China's mangrove forests had a total area of 20,303 ha, about 92% of which was in the Guangxi Zhuang Autonomous Region, Guangdong, and Hainan Provinces. This study has demonstrated the potential of using the GEE platform, time series Landsat and Sentinel-1A SAR images to identify and map mangrove forests along the coastal zones. The resultant mangrove forest maps are likely to be useful for the sustainable management and ecological assessments of mangrove forests in China.
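An illustrative sketch, in the Earth Engine Python API, of the kind of rule-based classification the study describes: annual mean NDVI as a greenness/canopy proxy and mNDWI as a water proxy. The collection ID, coastal point, and thresholds (0.3 and 0.0) are assumptions for illustration, not the study's values.
    import ee
    ee.Initialize()

    l8 = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")        # assumed Landsat 8 SR collection
            .filterDate("2015-01-01", "2015-12-31")
            .filterBounds(ee.Geometry.Point([110.0, 20.0])))  # example coastal location

    def add_indices(img):
        ndvi = img.normalizedDifference(["SR_B5", "SR_B4"]).rename("NDVI")
        mndwi = img.normalizedDifference(["SR_B3", "SR_B6"]).rename("mNDWI")
        return img.addBands(ndvi).addBands(mndwi)

    annual = l8.map(add_indices).select(["NDVI", "mNDWI"]).mean()
    # Candidate mangrove pixels: persistently green canopy that is not open water.
    mangrove_candidate = annual.select("NDVI").gt(0.3).And(annual.select("mNDWI").lt(0.0))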
Dong, Jinwei; Xiao, Xiangming; Menarguez, Michael A.; Zhang, Geli; Qin, Yuanwei; Thau, David; Biradar, Chandrashekhar; Moore, Berrien
2016-01-01
Area and spatial distribution information of paddy rice are important for understanding food security, water use, greenhouse gas emission, and disease transmission. Due to climatic warming and increasing food demand, paddy rice has been expanding rapidly in high latitude areas in the last decade, particularly in northeastern (NE) Asia. Current knowledge about paddy rice fields in these cold regions is limited. The phenology- and pixel-based paddy rice mapping (PPPM) algorithm, which identifies the flooding signals in the rice transplanting phase, has been effectively applied in tropical areas, but has not been tested at the large scale of cold regions yet. Despite the effects from more snow/ice, paddy rice mapping in high latitude areas is assumed to be more encouraging due to less clouds, lower cropping intensity, and more observations from Landsat sidelaps. Moreover, the enhanced temporal and geographic coverage from Landsat 8 provides an opportunity to acquire phenology information and map paddy rice. This study evaluated the potential of Landsat 8 images for annual paddy rice mapping in NE Asia, which is dominated by a single cropping system, including Japan, North Korea, South Korea, and NE China. The cloud computing approach was used to process all the available Landsat 8 imagery in 2014 (143 path/rows, ~3290 scenes) with the Google Earth Engine (GEE) platform. The results indicated that Landsat 8, GEE, and the improved PPPM algorithm can effectively support the yearly mapping of paddy rice in NE Asia. The resultant paddy rice map has a high accuracy with the producer (user) accuracy of 73% (92%), based on the validation using very high resolution images and intensive field photos. Geographic characteristics of paddy rice distribution were analyzed from aspects of country, elevation, latitude, and climate. The resultant 30-m paddy rice map is expected to provide unprecedented details about the area, spatial distribution, and landscape pattern of paddy rice fields in NE Asia, which will contribute to food security assessment, water resource management, estimation of greenhouse gas emissions, and disease control. PMID:28025586
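A sketch of the core flooding-signal test behind phenology- and pixel-based paddy rice mapping: during the transplanting window, a pixel is flagged when the Land Surface Water Index rises close to or above the vegetation index. The collection ID, date window, band names, and the 0.05 offset are illustrative assumptions, not the study's exact parameterisation.
    import ee
    ee.Initialize()

    def flooding_signal(img):
        nir, red, swir = img.select("B5"), img.select("B4"), img.select("B6")
        ndvi = nir.subtract(red).divide(nir.add(red))     # greenness
        lswi = nir.subtract(swir).divide(nir.add(swir))   # surface wetness
        return lswi.add(0.05).gte(ndvi).rename("flooded") # LSWI + 0.05 >= NDVI -> flooding/transplanting

    transplanting = (ee.ImageCollection("LANDSAT/LC08/C02/T1_TOA")   # assumed collection
                       .filterDate("2014-05-01", "2014-06-30")       # assumed transplanting window
                       .map(flooding_signal))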
NASA Astrophysics Data System (ADS)
Listyorini, Tri; Muzid, Syafiul
2017-06-01
The promotion team of Muria Kudus University (UMK) conducts annual promotional visits to several senior high schools in Indonesia. The visits cover a number of schools in Kudus, Jepara, Demak, Rembang and Purwodadi. To simplify scheduling, each visit round is limited to 15 (fifteen) schools. However, the team frequently faces obstacles during the visits, particularly in determining the route to take toward a targeted school; long distances or difficult routes lead to long travel times and inefficient fuel costs. To solve these problems, an application was developed using a heuristic genetic algorithm with a dynamic population size, known as Population Resizing on Fitness Improvement Genetic Algorithm (PRoFIGA). This Android-based application was developed to make the visits easier and to determine shorter routes for the team, so that the visiting period becomes effective and efficient. The result of this research is an Android-based application that determines the shortest route by combining the heuristic method with the Google Maps Application Programming Interface (API) and displays the route options for the team.
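A minimal sketch of querying the Google Maps Directions web service for the driving distance between two schools; a genetic-algorithm route optimiser such as PRoFIGA could call a service like this when evaluating candidate visit orders. The API key and coordinates below are placeholders.
    import requests

    def route_distance_m(origin, destination, api_key):
        """Return the driving distance in metres between two 'lat,lng' strings."""
        resp = requests.get(
            "https://maps.googleapis.com/maps/api/directions/json",
            params={"origin": origin, "destination": destination, "key": api_key},
        )
        legs = resp.json()["routes"][0]["legs"]
        return sum(leg["distance"]["value"] for leg in legs)

    # Example call (placeholder coordinates near Kudus, Indonesia):
    # route_distance_m("-6.8048,110.8405", "-6.7320,110.8670", "YOUR_API_KEY")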
Web application and database modeling of traffic impact analysis using Google Maps
NASA Astrophysics Data System (ADS)
Yulianto, Budi; Setiono
2017-06-01
Traffic impact analysis (TIA) is a traffic study that aims to identify the impact of traffic generated by development or a change in land use. In addition to identifying the traffic impact, a TIA also includes mitigation measures to minimize the resulting impact. TIA has become increasingly important since it was defined in law as one of the requirements for a building permit application. The legislation has prompted a number of TIA studies in various cities in Indonesia, including Surakarta. For that reason, it is necessary to study the development of TIA by adopting the concept of Transportation Impact Control (TIC) in the implementation of the TIA standard document and multimodal modeling. This includes the standardization of TIA technical guidelines, a database, and inspection through TIA checklists, monitoring and evaluation. The research was undertaken by collecting historical data on junctions, modeling the data as a relational database, and building a web user interface with Google Maps libraries for CRUD (Create, Read, Update and Delete) operations on the TIA data. The result of the research is a system that provides information supporting the improvement and repair of today's TIA documents, making them more transparent, reliable and credible.
KML Super Overlay to WMS Translator
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2007-01-01
This translator is a server-based application that automatically generates KML super overlay configuration files required by Google Earth for map data access via the Open Geospatial Consortium WMS (Web Map Service) standard. The translator uses a set of URL parameters that mirror the WMS parameters as much as possible, and it also can generate a super overlay subdivision of any given area that is only loaded when needed, enabling very large areas of coverage at very high resolutions. It can make almost any dataset available as a WMS service visible and usable in any KML application, without the need to reformat the data.
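An illustrative sketch of the kind of KML a super-overlay translator emits: a GroundOverlay for one tile whose image is fetched from a WMS GetMap URL, wrapped in a Region/Lod so Google Earth only loads the tile when the view needs it. The WMS endpoint and layer name are placeholders, and the URL parameters mirror WMS only loosely.
    def tile_kml(layer, west, south, east, north, min_lod_pixels=128):
        """Return a KML string for one WMS-backed super-overlay tile."""
        wms = ("https://example.org/wms?SERVICE=WMS&REQUEST=GetMap&VERSION=1.1.1"
               f"&LAYERS={layer}&SRS=EPSG:4326&FORMAT=image/png"
               f"&BBOX={west},{south},{east},{north}&WIDTH=256&HEIGHT=256")
        wms_href = wms.replace("&", "&amp;")   # escape ampersands for XML
        return f"""<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2"><Document>
      <Region>
        <LatLonAltBox><north>{north}</north><south>{south}</south>
          <east>{east}</east><west>{west}</west></LatLonAltBox>
        <Lod><minLodPixels>{min_lod_pixels}</minLodPixels><maxLodPixels>-1</maxLodPixels></Lod>
      </Region>
      <GroundOverlay>
        <Icon><href>{wms_href}</href></Icon>
        <LatLonBox><north>{north}</north><south>{south}</south>
          <east>{east}</east><west>{west}</west></LatLonBox>
      </GroundOverlay>
    </Document></kml>"""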
The Lunar Mapping and Modeling Project
NASA Technical Reports Server (NTRS)
Nall, M.; French, R.; Noble, S.; Muery, K.
2010-01-01
The Lunar Mapping and Modeling Project (LMMP) is managing a suite of lunar mapping and modeling tools and data products that support lunar exploration activities, including the planning, design, development, test, and operations associated with crewed and/or robotic operations on the lunar surface. Although the project was initiated primarily to serve the needs of the Constellation program, it is equally suited for supporting landing site selection and planning for a variety of robotic missions, including NASA science and/or human precursor missions and commercial missions such as those planned by the Google Lunar X-Prize participants. In addition, LMMP should prove to be a convenient and useful tool for scientific analysis and for education and public outreach (E/PO) activities.
Using Webgis and Cloud Tools to Promote Cultural Heritage Dissemination: the Historic up Project
NASA Astrophysics Data System (ADS)
Tommasi, A.; Cefalo, R.; Zardini, F.; Nicolaucig, M.
2017-05-01
On the occasion of the First World War centennial, GeoSNav Lab (Geodesy and Satellite Navigation Laboratory), Department of Engineering and Architecture, University of Trieste, Italy, in cooperation with the Radici&Futuro Association, Trieste, Italy, carried out an educational project named "Historic Up" involving a group of students from the "F. Petrarca" High School of Trieste, Italy. The main goal of the project is to make available to middle and high school students a set of historical and cultural contents in a simple and immediate way, through the production of a virtual and interactive tour following the event that triggered the outbreak of the First World War: the assassination of Franz Ferdinand and his wife Sofia in Sarajevo on June 28, 1914. A set of Google Apps was used, including Google Earth, Maps, Tour Builder, Street View, Gmail, Drive, and Docs. The authors instructed the students in the software and in team-working and supported them throughout the research. After being checked, all the historical and geographic data were uploaded to Google Tour Builder to create a sequence of historical checkpoints. Each checkpoint has texts, pictures and videos that connect the tour users to 1914. Moreover, GeoSNaV Lab researchers produced a KML (Keyhole Markup Language) file, formed by several polylines and points, representing the itinerary of the funeral procession, which has been superimposed on ad-hoc georeferenced historical maps. This tour, freely available online, starts with the arrival of the royals on June 28th, 1914, and follows the couple along the events, from the assassination to the burial in Artstetten (Austria), including their passages through Trieste (Italy), Ljubljana (Slovenia), Graz and Wien (Austria).
Google it: obtaining information about local STD/HIV testing services online.
Habel, Melissa A; Hood, Julia; Desai, Sheila; Kachur, Rachel; Buhi, Eric R; Liddon, Nicole
2011-04-01
Although the Internet is one of the most commonly accessed resources for health information, finding information on local sexual health services, such as sexually transmitted disease (STD) testing, can be challenging. Recognizing that most quests for online health information begin with search engines, the purpose of this exploratory study was to examine the extent to which online information about local STD/HIV testing services can be found using Google. Queries on STD and HIV testing services were executed in Google for 6 geographically unique locations across the United States. The first 3 websites that resulted from each query were coded for the following characteristics: (1) relevancy to the search topic, (2) domain and purpose, (3) rank in Google results, and (4) content. Websites hosted at .com (57.3%), .org (25.7%), and .gov (10.5%) domains were retrieved most frequently. Roughly half of all websites (n = 376) provided information relevant to the query, and about three-quarters (77.0%) of all queries yielded at least 1 relevant website within the first 3 results. Searches for larger cities were more likely to yield relevant results compared with smaller cities (odds ratio [OR] = 10.0, 95% confidence interval [CI] = 5.6, 17.9). Compared with .com domains, .gov (OR = 2.9, 95% CI = 1.4, 5.6) and .org domains (OR = 2.9, 95% CI = 1.7, 4.8) were more likely to provide information on where to get tested. Ease of online access to information about sexual health services varies by search topic and locale. Sexual health service providers must optimize their website placement so as to reach a greater proportion of the sexually active population who use web search engines.
TumorMap: Exploring the Molecular Similarities of Cancer Samples in an Interactive Portal.
Newton, Yulia; Novak, Adam M; Swatloski, Teresa; McColl, Duncan C; Chopra, Sahil; Graim, Kiley; Weinstein, Alana S; Baertsch, Robert; Salama, Sofie R; Ellrott, Kyle; Chopra, Manu; Goldstein, Theodore C; Haussler, David; Morozova, Olena; Stuart, Joshua M
2017-11-01
Vast amounts of molecular data are being collected on tumor samples, which provide unique opportunities for discovering trends within and between cancer subtypes. Such cross-cancer analyses require computational methods that enable intuitive and interactive browsing of thousands of samples based on their molecular similarity. We created a portal called TumorMap to assist in exploration and statistical interrogation of high-dimensional complex "omics" data in an interactive and easily interpretable way. In the TumorMap, samples are arranged on a hexagonal grid based on their similarity to one another in the original genomic space and are rendered with Google's Map technology. While the important feature of this public portal is the ability for the users to build maps from their own data, we pre-built genomic maps from several previously published projects. We demonstrate the utility of this portal by presenting results obtained from The Cancer Genome Atlas project data. Cancer Res; 77(21); e111-4. ©2017 AACR.
Map based multimedia tool on Pacific theatre in World War II
NASA Astrophysics Data System (ADS)
Pakala Venkata, Devi Prasada Reddy
Maps have been used for depicting data of all kinds in the educational community for many years. One rapidly evolving method of teaching is the development of interactive and dynamic maps. The emphasis of the thesis is to develop an intuitive map-based multimedia tool, which provides a timeline of battles and events in the Pacific theatre of World War II. The tool contains summaries of major battles and commanders and has multimedia content embedded in it. The primary advantage of this map tool is that one can quickly learn about all the battles and campaigns of the Pacific Theatre by accessing a timeline of battles in each region, individual battles in each region, or a summary of each battle in an interactive way. This tool can be accessed via any standard web browser and motivates the user to learn more about the battles of the Pacific Theatre. It was made responsive using the Google Maps API, JavaScript, HTML5 and CSS.
NASA Astrophysics Data System (ADS)
Moore, R. T.; Hansen, M. C.
2011-12-01
Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically-parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and strengthen the ability for public institutions and civil society to better understand, manage and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly-collected Landsat imagery is downloaded from USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports generation, on-demand, of spatial and temporal mosaics, "best-pixel" composites (for example to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming framework, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically-parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011. Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration, as well as transparency in data and methods. Methods developed for global processing of MODIS data to map land cover are being adopted for use with Landsat data. Specifically, the MODIS Vegetation Continuous Field product methodology has been applied for mapping forest extent and change at national scales using Landsat time-series data sets. Scaling this method to continental and global scales is enabled by Google Earth Engine computing capabilities. By combining the supervised learning VCF approach with the Landsat archive and cloud computing, unprecedented monitoring of land cover dynamics is enabled.
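A minimal sketch, in the Earth Engine Python API, of the operations described above: building a "best-pixel" composite (here a simple per-pixel median) from the Landsat archive and deriving a spectral index from it. The collection ID and date range are illustrative.
    import ee
    ee.Initialize()

    l7 = (ee.ImageCollection("LANDSAT/LE07/C02/T1_TOA")   # assumed Landsat 7 TOA collection
            .filterDate("2010-01-01", "2010-12-31"))

    composite = l7.median()   # per-pixel median suppresses most clouds and gaps
    ndvi = composite.normalizedDifference(["B4", "B3"]).rename("NDVI")   # NIR, red for Landsat 7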
Urban topography for flood modeling by fusion of OpenStreetMap, SRTM and local knowledge
NASA Astrophysics Data System (ADS)
Winsemius, Hessel; Donchyts, Gennadii; Eilander, Dirk; Chen, Jorik; Leskens, Anne; Coughlan, Erin; Mawanda, Shaban; Ward, Philip; Diaz Loaiza, Andres; Luo, Tianyi; Iceland, Charles
2016-04-01
Topography data are essential for understanding and modeling urban flood hazard. Within urban areas, much of the topography is defined by highly localized man-made features such as roads, channels, ditches, culverts and buildings. As a result, urban flood models require high-resolution topography, and water-conveying connections within the topography must be considered. In recent years, more and more topography information has been collected through LIDAR surveys; however, there are still many cities in the world where high-resolution topography data are not available. Furthermore, information on connectivity is required for flood modelling, even when LIDAR data are used. In this contribution, we demonstrate how high-resolution terrain data can be synthesized through a fusion between features in OpenStreetMap (OSM) data (including roads, culverts, channels and buildings) and existing low-resolution, noisy SRTM elevation data using the Google Earth Engine platform. Our method uses typical existing OSM properties to estimate heights and topology associated with the features, and uses these to correct noise and burn features on top of the existing low-resolution SRTM elevation data. The method has been set up in the Google Earth Engine platform so that local stakeholders and mapping teams can, on the fly, propose, include and visualize the effect of additional features and properties of features which are deemed important for topography and water conveyance. These features can be included in a workshop environment. We pilot our tool over Dar es Salaam.
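A simplified sketch of the fusion idea in Earth Engine: start from SRTM and "burn" OSM-derived features into the elevation surface, here a hypothetical FeatureCollection of drainage channels with an estimated depth attribute. The asset ID and attribute name are placeholders, not the project's data.
    import ee
    ee.Initialize()

    srtm = ee.Image("USGS/SRTMGL1_003")                                # 30 m SRTM elevation
    channels = ee.FeatureCollection("users/example/osm_channels")      # hypothetical OSM export

    # Rasterise each feature's estimated depth (metres) and lower the terrain there.
    depth = ee.Image(0).float().paint(channels, "depth_m")
    fused_dem = srtm.subtract(depth)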
Machine Learning for Flood Prediction in Google Earth Engine
NASA Astrophysics Data System (ADS)
Kuhn, C.; Tellman, B.; Max, S. A.; Schwarz, B.
2015-12-01
With the increasing availability of high-resolution satellite imagery, dynamic flood mapping in near real time is becoming a reachable goal for decision-makers. This talk describes a newly developed framework for predicting biophysical flood vulnerability using public data, cloud computing and machine learning. Our objective is to define an approach to flood inundation modeling using statistical learning methods deployed in a cloud-based computing platform. Traditionally, static flood extent maps grounded in physically based hydrologic models can require hours of human expertise to construct at significant financial cost. In addition, desktop modeling software and limited local server storage can impose restraints on the size and resolution of input datasets. Data-driven, cloud-based processing holds promise for predictive watershed modeling at a wide range of spatio-temporal scales. However, these benefits come with constraints. In particular, parallel computing limits a modeler's ability to simulate the flow of water across a landscape, rendering traditional routing algorithms unusable in this platform. Our project pushes these limits by testing the performance of two machine learning algorithms, Support Vector Machine (SVM) and Random Forests, at predicting flood extent. Constructed in Google Earth Engine, the model mines a suite of publicly available satellite imagery layers to use as algorithm inputs. Results are cross-validated using MODIS-based flood maps created using the Dartmouth Flood Observatory detection algorithm. Model uncertainty highlights the difficulty of deploying unbalanced training data sets based on rare extreme events.
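A sketch of the supervised-learning setup described above in the Earth Engine Python API: sample a stack of predictor layers at labelled flood/non-flood points, train a random forest, and classify the stack. The predictor image, training asset, column name, and scale are placeholders, and the classifier name reflects the current API rather than the 2015-era one.
    import ee
    ee.Initialize()

    predictors = ee.Image("users/example/flood_predictor_stack")            # hypothetical multi-band image
    labels = ee.FeatureCollection("users/example/flood_training_points")    # points with 'flooded' = 0/1

    samples = predictors.sampleRegions(collection=labels, properties=["flooded"], scale=250)
    rf = ee.Classifier.smileRandomForest(100).train(
        features=samples, classProperty="flooded", inputProperties=predictors.bandNames())
    flood_map = predictors.classify(rf)   # predicted flood / non-flood surface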
Aggregating Concept Map Data to Investigate the Knowledge of Beginning CS Students
ERIC Educational Resources Information Center
Mühling, Andreas
2016-01-01
Concept maps have a long history in educational settings as a tool for teaching, learning, and assessing. As an assessment tool, they are predominantly used to extract the structural configuration of learners' knowledge. This article presents an investigation of the knowledge structures of a large group of beginning CS students. The investigation…
Google Earth Engine derived areal extents to infer elevation variation of lakes and reservoirs
NASA Astrophysics Data System (ADS)
Nguy-Robertson, Anthony; May, Jack; Dartevelle, Sebastien; Griffin, Sean; Miller, Justin; Tetrault, Robert; Birkett, Charon; Lucero, Eileen; Russo, Tess; Zentner, Matthew
2017-04-01
Monitoring water supplies is important for identifying potential national security issues before they begin. As a means to estimate lake and reservoir storage for sites without reliable water stage data, this study defines correlations between water body levels from hypsometry curves based on in situ gauge station and altimeter data (i.e. TOPEX/Poseidon, Jason series) and sensor areal extents observed in historic multispectral (i.e. MODIS and Landsat TM/ETM+/OLI) imagery. Water levels measured using in situ observations and altimeters, when in situ data were unavailable, were used to estimate the relationship between water elevation and surface area for 18 sites globally. Altimeters were generally more accurate (RMSE: 0.40 - 0.49 m) for estimating in situ lake elevations from Iraq and Afghanistan than the modeled elevation data using multispectral sensor areal extents: Landsat (RMSE: 0.25 - 1.5 m) and MODIS (RMSE 0.53 - 3.0 m). Correlations between altimeter data and Landsat imagery processed with Google Earth Engine confirmed that similar relationships exist for a broader range of lakes without reported in situ data across the globe (RMSE: 0.24 - 1.6 m). Thus, while altimetry is still preferred to an areal extent model, lake surface area derived with Google Earth Engine can be used as a reasonable proxy for lake storage, expanding the number of observable lakes beyond the current constellation of altimeters and in situ gauges.
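A sketch of fitting the kind of area-elevation (hypsometric) relationship used above: given coincident altimeter lake levels and satellite-derived surface areas, fit a simple linear model and use it to infer level from area alone. The numbers below are made-up illustrative values, not data from the study.
    import numpy as np

    area_km2 = np.array([310.0, 355.0, 402.0, 447.0, 498.0])       # observed surface areas
    level_m  = np.array([1020.1, 1020.9, 1021.8, 1022.6, 1023.5])  # coincident altimeter levels

    slope, intercept = np.polyfit(area_km2, level_m, 1)            # linear hypsometric fit
    estimated_level = slope * 430.0 + intercept                    # level inferred from a new area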
iAnn: an event sharing platform for the life sciences.
Jimenez, Rafael C; Albar, Juan P; Bhak, Jong; Blatter, Marie-Claude; Blicher, Thomas; Brazas, Michelle D; Brooksbank, Cath; Budd, Aidan; De Las Rivas, Javier; Dreyer, Jacqueline; van Driel, Marc A; Dunn, Michael J; Fernandes, Pedro L; van Gelder, Celia W G; Hermjakob, Henning; Ioannidis, Vassilios; Judge, David P; Kahlem, Pascal; Korpelainen, Eija; Kraus, Hans-Joachim; Loveland, Jane; Mayer, Christine; McDowall, Jennifer; Moran, Federico; Mulder, Nicola; Nyronen, Tommi; Rother, Kristian; Salazar, Gustavo A; Schneider, Reinhard; Via, Allegra; Villaveces, Jose M; Yu, Ping; Schneider, Maria V; Attwood, Teresa K; Corpas, Manuel
2013-08-01
We present iAnn, an open source community-driven platform for dissemination of life science events, such as courses, conferences and workshops. iAnn allows automatic visualisation and integration of customised event reports. A central repository lies at the core of the platform: curators add submitted events, and these are subsequently accessed via web services. Thus, once an iAnn widget is incorporated into a website, it permanently shows timely relevant information as if it were native to the remote site. At the same time, announcements submitted to the repository are automatically disseminated to all portals that query the system. To facilitate the visualization of announcements, iAnn provides powerful filtering options and views, integrated in Google Maps and Google Calendar. All iAnn widgets are freely available. http://iann.pro/iannviewer manuel.corpas@tgac.ac.uk.
Tactical Level Commander and Staff Toolkit
2010-01-01
Sites: Geodata.gov (for maps), http://gos2.geodata.gov; Google Earth for .mil (United States Army Corps of Engineers (USACE) site), https... the eyes, ears, head, hands, back, and feet. When appropriate, personnel should wear protective lenses, goggles, or face shields. Leaders should... Typical hurricanes are about 300 miles wide, although they can vary considerably. Size is not necessarily an indication of hurricane intensity. The
2014-01-01
Background Since microscopic slides can now be automatically digitized and integrated in the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation in multiple sites, both public university-hospitals and private entities. It is part of the FlexMIm R&D project which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Methods Development and testing have been carried out on a MacBook Pro i7 and on a bi-Xeon 2.7GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using VIPS and Openslide libraries. Results We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real-time (3 billion pixels per minute). Tests were made on 5000 single images, 200 NDPI WSI, 100 Aperio SVS WSI converted to the Google Maps format. Conclusions Applications based on our method and libraries can be used upstream, as calibration and quality control tool for the WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire the complete slides that are below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions. Such quality assessment scores could be integrated as WSI's metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics workflow. PMID:25565494
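A sketch of the tiling step mentioned in the Methods above: converting a whole-slide image into a Google Maps-style tile pyramid with libvips (pyvips). Reading Aperio SVS relies on libvips being built with OpenSlide support; the file name and JPEG quality are placeholders.
    import pyvips

    slide = pyvips.Image.new_from_file("slide.svs", access="sequential")
    # Writes a z/y/x directory tree of JPEG tiles compatible with Google Maps-style viewers.
    slide.dzsave("slide_tiles", layout="google", suffix=".jpg[Q=85]")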
Utility of Mobile phones to support In-situ data collection for Land Cover Mapping
NASA Astrophysics Data System (ADS)
Oduor, P.; Omondi, S.; Wahome, A.; Mugo, R. M.; Flores, A.
2017-12-01
With the compelling need to create better monitoring tools for our landscapes and to enhance decision-making processes, it becomes imperative to do so in sophisticated yet simple ways, making it possible to leverage the untapped potential of lay users while responding to the complexity of the information that must be delivered. SERVIR Eastern and Southern Africa has developed a mobile app that can be used with very little or no prior knowledge to collect spatial information on land cover. This set of in-situ data can be collected by the masses because the tool is very simple to use, and the information can be fed into classification algorithms that can then be used to map our ever-changing landscape. The LULC Mapper is a subset of the JiMap system and is able to pull Google Earth imagery and OpenStreetMap layers to help users familiarize themselves with their location. It uses the phone's GPS and network information to map location coordinates and, at the same time, gives the user sample pictures for categorizing their landscape. The system is able to work offline, and when users get access to the internet they can push the information into an Amazon-hosted database as bulk data. The location details, including geotagged photos, allow the data to be used in the development of a wide range of spatial information, including land cover data. The app is currently available in the Google Play Store and will soon be uploaded to the App Store for use by a wider community. We foresee great potential in this tool for making data collection cheaper and more affordable by taking advantage of advances in phone technology. We envisage a data collection campaign in which the tool is used for crowdsourcing.
Interactive Mapping of the Planets: An Online Activity Using the Google Earth Platform
NASA Astrophysics Data System (ADS)
Osinski, G. R.; Gilbert, A.; Harrison, T. N.; Mader, M. M.; Shankar, B.; Tornabene, L. L.
2013-12-01
With funding from the Natural Sciences and Engineering Research Council of Canada's PromoScience program and support from the Department of Earth Sciences at The University of Western Ontario, the Centre for Planetary Science and Exploration (CPSX) has developed a new web-based initiative called Interactive Mapping of the Planets (IMAPS). Additional components include in-person school visits to deliver inquiry-based workshops, week-long summer camps, and pre-prepared impact rock lending kits, all framed around the IMAPS activity. IMAPS is now in beta-testing mode and will be demonstrated in this session. The general objective of the online activity is for participants to plan and design a rover mission to Mars based on a given mission goal - e.g., to find evidence for past water flow. The activity begins with participants receiving image-analysis training to learn about the different landforms on Mars and which ones are potentially caused by water flow. They then need to pass a short test to show they can consistently identify Martian landforms. From there, the participants choose a landing site and plan a traverse - utilizing the free Google Earth plug-in - taking into account factors such as hazards and their sites of interest. A mission control blog will provide updates on the status of their mission and a 'choose your rover' option provides the opportunity to unlock more advanced rovers by collaborating with other scientists and rating their missions. Indeed, evaluation of missions will be done using a crowd-sourcing method. In addition to being fully accessible online, CPSX will also target primary- and secondary-school grades in which astronomy and space science are taught. Teachers in K-12 classrooms will be able to sign up for the activity ahead of time in order to receive a workshop package, which will guide them on how to use the IMAPS online activity with their class. Teachers will be able to set up groups for their classroom so that they can evaluate their students based on pre-determined criteria. The IMAPS activities are developed in partnership with the Department of Earth Sciences at Western University, Sports Western, the Thames Valley District School Board, and Dimentians Web Marketing and Design. We are continually looking for new collaborators to help design or test our inquiry- and web-based activities, provide feedback on our programs, or volunteer with us. Please contact cpsxoutreach@uwo.ca if you are interested.
High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.
Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue
2010-11-13
Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when drugs are approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, was introduced by Google to support large-scale, data-intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it by mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, using this pharmacovigilance case as one specific example. The results demonstrated that the MapReduce programming model could improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates.
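A sketch of the Proportional Reporting Ratio and how it decomposes into a map/reduce pattern: mappers emit one key per (drug, event) pair found in each spontaneous report, reducers aggregate the 2x2 contingency counts per pair, and PRR = [a / (a + b)] / [c / (c + d)], where a-d are the cells of that table. The toy reports below are illustrative, not FDA data.
    from collections import Counter
    from itertools import product

    def map_report(report):
        """report = {'drugs': [...], 'events': [...]}; emit ((drug, event), 1) pairs."""
        for drug, event in product(report["drugs"], report["events"]):
            yield (drug, event), 1

    def prr(a, b, c, d):
        """a: drug & event, b: drug w/o event, c: other drugs & event, d: other drugs w/o event."""
        return (a / (a + b)) / (c / (c + d))

    reports = [
        {"drugs": ["drugX"], "events": ["nausea"]},
        {"drugs": ["drugX", "drugY"], "events": ["rash"]},
        {"drugs": ["drugY"], "events": ["nausea"]},
    ]
    # Toy "reduce": aggregate pair counts emitted by the mappers.
    counts = Counter(key for r in reports for key, _ in map_report(r))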
Mapping or Tracing? Rethinking Curriculum Mapping in Higher Education
ERIC Educational Resources Information Center
Wang, Chia-Ling
2015-01-01
Curriculum mapping has been emphasized in recent curriculum innovations in higher education in the drive for global competitiveness. This paper begins by providing an outline of current discourses of curriculum mapping in higher education. Curriculum mapping is frequently associated with outcome-based learning and work readiness, and guiding the…
NASA Astrophysics Data System (ADS)
Westgard, Kerri S. W.
Success in today's globalized, multi-dimensional, and connected world requires individuals to have a variety of skill sets -- i.e. oracy, numeracy, literacy, as well as the ability to think spatially. Students' spatial literacy, based on various national and international assessment results, indicates that even though there have been gains in U.S. scores over the past decade, overall performance, including that specific to spatial skills, is still below proficiency. Existing studies have focused on the potential of virtual learning environment technology to reach students in a variety of academic areas, but a need still exists to study specifically the phenomenon of using Google Earth as a potentially more useful pedagogical tool to develop spatial literacy than the currently employed methods. The purpose of this study was to determine the extent to which graphicacy achievement scores of students who were immersed in a Google Earth environment were different from students who were provided with only two-dimensional instruction for developing spatial skills. Situated learning theory and the work of Piaget and Inhelder's Child's Conception of Space provided the theoretical grounding from which this study evolved. The National Research Council's call to develop spatial literacy, as seen in Learning to Think Spatially, provided the impetus to begin research. The target population (N = 84) for this study consisted of eighth grade geography students at an upper Midwest Jr. High School during the 2009-2010 academic year. Students were assigned to the control or experimental group based on when they had geography class. Control group students (n = 44) used two-dimensional PowerPoint images to complete activities, while experimental group students (n = 40) were immersed in the three-dimensional Google Earth world for activity completion. Research data were then compiled and statistically analyzed to answer five research questions developed for this study. One-way ANOVAs were run on data collected and no statistically significant difference was found between the control and experimental group. However, two of the five research questions yielded practically significant data indicating that students who used Google Earth outperformed their counterparts who used PowerPoint on pattern prediction and spatial relationship understanding.
Extensible Probabilistic Repository Technology (XPRT)
2004-10-01
projects, such as Centaurus, Evidence Data Base (EDB), etc.; others were fabricated, such as INS and FED, while others contain data from the open... Google Web Report Unlimited SOAP API; News BBC News Unlimited WEB RSS 1.0; Centaurus Person Demographics 204,402 people from 240 countries... objects of the domain ontology map to the various simulated data-sources. For example, the PersonDemographics are stored in the Centaurus database, while
Reaching Forward in the War against the Islamic State
2016-12-07
every week with U.S.- and Coalition-advised ISOF troops taking the lead in combat operations using cellular communications systems that link them... tions—Offline Maps, Google Earth, and Viber, to name a few—which allowed them to bring tablets and phones on their operations to help communicate... provided an initial Remote Advise and Assist capability that enabled the special forces advisors to track, communicate, and share limited data with
Sato, H.P.; Harp, E.L.
2009-01-01
The 12 May 2008 M7.9 Wenchuan earthquake in the People's Republic of China represented a unique opportunity for the international community to use commonly available GIS (Geographic Information System) tools, like Google Earth (GE), to rapidly evaluate and assess landslide hazards triggered by the destructive earthquake and its aftershocks. In order to map earthquake-triggered landslides, we provide details on the applicability and limitations of publicly available 3-day-post- and pre-earthquake imagery provided by GE from the FORMOSAT-2 (formerly ROCSAT-2; Republic of China Satellite 2). We interpreted landslides on the 8-m-resolution FORMOSAT-2 image by GE; as a result, 257 large landslides were mapped with the highest concentration along the Beichuan fault. An estimated density of 0.3 landslides/km2 represents a minimum bound on density given the resolution of available imagery; higher resolution data would have identified more landslides. This is a preliminary study, and further study is needed to understand the landslide characteristics in detail. Although it is best to obtain landslide locations and measurements from satellite imagery having high resolution, it was found that GE is an effective and rapid reconnaissance tool. © 2009 Springer-Verlag.
GeneOnEarth: fitting genetic PC plots on the globe.
Torres-Sánchez, Sergio; Medina-Medina, Nuria; Gignoux, Chris; Abad-Grau, María M; González-Burchard, Esteban
2013-01-01
Principal component (PC) plots have become widely used to summarize genetic variation of individuals in a sample. The similarity between genetic distance in PC plots and geographical distance has been shown to be quite impressive. However, in most situations, individual ancestral origins are not precisely known or they are heterogeneously distributed; hence, they are hardly linked to a geographical area. We have developed GeneOnEarth, a user-friendly web-based tool to help geneticists understand whether a linear isolation-by-distance model may apply to a genetic data set, that is, whether genetic distances among a set of individuals resemble geographical distances among their origins. Its main goal is to allow users to first apply a by-view Procrustes method to visually learn whether this model holds. To do that, the user can choose the exact geographical area from an online 2D or 3D world map by using, respectively, Google Maps or Google Earth, and rotate, flip, and resize the images. GeneOnEarth can also compute the optimal rotation angle using Procrustes analysis and assess statistical evidence of similarity when a different rotation angle has been chosen by the user. An online version of GeneOnEarth is available for testing and use at http://bios.ugr.es/GeneOnEarth.
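A sketch of the Procrustes step described above: superimpose the first two genetic principal components onto the geographic coordinates of the chosen map area and report the residual disparity. The arrays here are small illustrative values, not real genotype or location data.
    import numpy as np
    from scipy.spatial import procrustes

    pc_coords = np.array([[0.1, 0.2], [0.3, 0.1], [0.2, 0.5], [0.6, 0.4]])           # PC1, PC2 per individual
    geo_coords = np.array([[43.0, -3.0], [41.5, 2.1], [45.1, 7.6], [40.4, -3.7]])     # lat, lon of origins

    geo_std, pc_std, disparity = procrustes(geo_coords, pc_coords)
    # 'disparity' (sum of squared residuals after optimal translation, rotation and scaling)
    # quantifies how well genetic distances mirror geographic distances.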
NASA Astrophysics Data System (ADS)
Sareen, Sanjay; Gupta, Sunil Kumar; Sood, Sandeep K.
2017-10-01
Zika virus is a mosquito-borne disease that spreads very quickly in different parts of the world. In this article, we propose a system to prevent and control the spread of Zika virus disease using an integration of fog computing, cloud computing, mobile phones and Internet of Things (IoT)-based sensor devices. Fog computing is used as an intermediary layer between the cloud and end users to reduce the latency time and extra communication cost that are usually high in cloud-based systems. A fuzzy k-nearest neighbour classifier is used to diagnose possibly infected users, and the Google Maps web service is used to provide geographic positioning system (GPS)-based risk assessment to prevent the outbreak. It is used to represent each Zika virus (ZikaV)-infected user, mosquito-dense sites and breeding sites on the Google map, which helps the government healthcare authorities to control such risk-prone areas effectively and efficiently. The proposed system is deployed on the Amazon EC2 cloud to evaluate its performance and accuracy using a data set of 2 million users. Our system provides a high accuracy of 94.5% for the initial diagnosis of users according to their symptoms, together with appropriate GPS-based risk assessment.
An Interactive Web System for Field Data Sharing and Collaboration
NASA Astrophysics Data System (ADS)
Weng, Y.; Sun, F.; Grigsby, J. D.
2010-12-01
A Web 2.0 system is designed and developed to facilitate data collection for the field studies in the Geological Sciences department at Ball State University. The system provides a student-centered learning platform that enables the users to first upload their collected data in various formats, interact and collaborate dynamically online, and ultimately create a shared digital repository of field experiences. The data types considered for the system and their corresponding format and requirements are listed in the table below. The system has six main functionalities as follows. (1) Only the registered users can access the system with confidential identification and password. (2) Each user can upload/revise/delete data in various formats such as image, audio, video, and text files to the system. (3) Interested users are allowed to co-edit the contents and join the collaboration whiteboard for further discussion. (4) The system integrates with Google, Yahoo, or Flickr to search for similar photos with the same tags. (5) Users can search the web system according to specific keywords. (6) Photos with recorded GPS readings can be mashed and mapped to Google Maps/Earth for visualization. Application of the system to geology field trips at Ball State University will be demonstrated to assess the usability of the system.
HCLS 2.0/3.0: health care and life sciences data mashup using Web 2.0/3.0.
Cheung, Kei-Hoi; Yip, Kevin Y; Townsend, Jeffrey P; Scotch, Matthew
2008-10-01
We describe the potential of current Web 2.0 technologies to achieve data mashup in the health care and life sciences (HCLS) domains, and compare that potential to the nascent trend of performing semantic mashup. After providing an overview of Web 2.0, we demonstrate two scenarios of data mashup, facilitated by the following Web 2.0 tools and sites: Yahoo! Pipes, Dapper, Google Maps and GeoCommons. In the first scenario, we exploited Dapper and Yahoo! Pipes to implement a challenging data integration task in the context of DNA microarray research. In the second scenario, we exploited Yahoo! Pipes, Google Maps, and GeoCommons to create a geographic information system (GIS) interface that allows visualization and integration of diverse categories of public health data, including cancer incidence and pollution prevalence data. Based on these two scenarios, we discuss the strengths and weaknesses of these Web 2.0 mashup technologies. We then describe Semantic Web, the mainstream Web 3.0 technology that enables more powerful data integration over the Web. We discuss the areas of intersection of Web 2.0 and Semantic Web, and describe the potential benefits that can be brought to HCLS research by combining these two sets of technologies.
Biswal, Devendra Kumar; Debnath, Manish; Kharumnuid, Graciously; Thongnibah, Welfrank; Tandon, Veena
2016-01-01
Most metazoan parasites that invade vertebrate hosts belong to three phyla: Platyhelminthes, Nematoda and Acanthocephala. Many of the parasitic members of these phyla are collectively known as helminths and are causative agents of many debilitating, deforming and lethal diseases of humans and animals. The North-East India Helminth Parasite Information Database (NEIHPID) project aimed to document and characterise the spectrum of helminth parasites in the north-eastern region of India, providing host, geographical distribution, diagnostic characters and image data. The morphology-based taxonomic data are supplemented with information on DNA sequences of nuclear, ribosomal and mitochondrial gene marker regions that aid in parasite identification. In addition, the database contains raw next generation sequencing (NGS) data for 3 foodborne trematode parasites, with more to follow. The database will also provide study material for students interested in parasite biology. Users can search the database at various taxonomic levels (phylum, class, order, superfamily, family, genus, and species), or by host, habitat and geographical location. Specimen collection locations are noted as co-ordinates in a MySQL database and can be viewed on Google maps, using Google Maps JavaScript API v3. The NEIHPID database has been made freely available at http://nepiac.nehu.ac.in/index.php.
High-Resolution Air Pollution Mapping with Google Street View Cars: Exploiting Big Data.
Apte, Joshua S; Messier, Kyle P; Gani, Shahzad; Brauer, Michael; Kirchstetter, Thomas W; Lunden, Melissa M; Marshall, Julian D; Portier, Christopher J; Vermeulen, Roel C H; Hamburg, Steven P
2017-06-20
Air pollution affects billions of people worldwide, yet ambient pollution measurements are limited for much of the world. Urban air pollution concentrations vary sharply over short distances (≪1 km) owing to unevenly distributed emission sources, dilution, and physicochemical transformations. Accordingly, even where present, conventional fixed-site pollution monitoring methods lack the spatial resolution needed to characterize heterogeneous human exposures and localized pollution hotspots. Here, we demonstrate a measurement approach to reveal urban air pollution patterns at 4-5 orders of magnitude greater spatial precision than possible with current central-site ambient monitoring. We equipped Google Street View vehicles with a fast-response pollution measurement platform and repeatedly sampled every street in a 30-km 2 area of Oakland, CA, developing the largest urban air quality data set of its type. Resulting maps of annual daytime NO, NO 2 , and black carbon at 30 m-scale reveal stable, persistent pollution patterns with surprisingly sharp small-scale variability attributable to local sources, up to 5-8× within individual city blocks. Since local variation in air quality profoundly impacts public health and environmental equity, our results have important implications for how air pollution is measured and managed. If validated elsewhere, this readily scalable measurement approach could address major air quality data gaps worldwide.
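A sketch of one plausible data reduction for this kind of mobile-monitoring data set (not the authors' published pipeline): snap each drive-by measurement to a 30 m road segment and take the median across repeated passes, which suppresses transient plumes and yields a stable segment-level map. The segment IDs and NO2 values are illustrative.
    import pandas as pd

    df = pd.DataFrame({
        "segment_id": [101, 101, 101, 102, 102],        # 30 m road-segment identifier (assumed)
        "no2_ppb":    [18.2, 22.5, 19.1, 41.0, 38.7],   # illustrative NO2 readings from repeated passes
    })
    segment_medians = df.groupby("segment_id")["no2_ppb"].median()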
Mapping the Americanization of English in space and time.
Gonçalves, Bruno; Loureiro-Porto, Lucía; Ramasco, José J; Sánchez, David
2018-01-01
As global political preeminence gradually shifted from the United Kingdom to the United States, so did the capacity to culturally influence the rest of the world. In this work, we analyze how the world-wide varieties of written English are evolving. We study both the spatial and temporal variations of vocabulary and spelling of English using a large corpus of geolocated tweets and the Google Books datasets corresponding to books published in the US and the UK. The advantage of our approach is that we can address both standard written language (Google Books) and the more colloquial forms of microblogging messages (Twitter). We find that American English is the dominant form of English outside the UK and that its influence is felt even within the UK borders. Finally, we analyze how this trend has evolved over time and the impact that some cultural events have had in shaping it.
NASA Astrophysics Data System (ADS)
Schmaltz, J. E.; Ilavajhala, S.; Plesea, L.; Hall, J. R.; Boller, R. A.; Chang, G.; Sadaqathullah, S.; Kim, R.; Murphy, K. J.; Thompson, C. K.
2012-12-01
Expedited processing of imagery from NASA satellites for near-real time use by non-science applications users has a long history, especially since the beginning of the Terra and Aqua missions. Several years ago, the Land Atmosphere Near-real-time Capability for EOS (LANCE) was created to greatly expand the range of near-real time data products from a variety of Earth Observing System (EOS) instruments. NASA's Earth Observing System Data and Information System (EOSDIS) began exploring methods to distribute these data as imagery in an intuitive, geo-referenced format, which would be available within three hours of acquisition. Toward this end, EOSDIS has developed the Global Imagery Browse Services (GIBS, http://earthdata.nasa.gov/gibs) to provide highly responsive, scalable, and expandable imagery services. The baseline technology chosen for GIBS was a Tiled Web Mapping Service (TWMS) developed at the Jet Propulsion Laboratory. Using this, global images and mosaics are divided into tiles with fixed bounding boxes for a pyramid of fixed resolutions. Initially, the satellite imagery is created at the existing data systems for each sensor, ensuring the oversight of those most knowledgeable about the science. There, the satellite data is geolocated and converted to an image format such as JPEG, TIFF, or PNG. The GIBS ingest server retrieves imagery from the various data systems and converts them into image tiles, which are stored in a highly-optimized raster format named Meta Raster Format (MRF). The image tiles are then served to users via HTTP by means of an Apache module. Services are available for the entire globe (lat-long projection) and for both polar regions (polar stereographic projection). Requests to the services can be made with the non-standard, but widely known, TWMS format or via the well-known OGC Web Map Tile Service (WMTS) standard format. Standard OGC Web Map Service (WMS) access to the GIBS server is also available. In addition, users may request a KML pyramid. This variety of access methods allows stakeholders to develop visualization/browse clients for a diverse variety of specific audiences. Currently, EOSDIS is providing an OpenLayers web client, Worldview (http://earthdata.nasa.gov/worldview), as an interface to GIBS. A variety of other existing clients can also be developed using such tools as Google Earth, Google Earth browser Plugin, ESRI's Adobe Flash/Flex Client Library, NASA World Wind, Perceptive Pixel Client, Esri's iOS Client Library, and OpenLayers for Mobile. The imagery browse capabilities from GIBS can be combined with other EOSDIS services (i.e. ECHO OpenSearch) via a client that ties them both together to provide an interface that enables data download from the onscreen imagery. Future plans for GIBS include providing imagery based on science quality data from the entire data record of these EOS instruments.
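A sketch of fetching a single GIBS image tile via the WMTS REST pattern the services expose; the layer name, date, tile-matrix set, and tile indices below are illustrative, and the URL template should be checked against the current GIBS documentation before use.
    import requests

    # Pattern (assumed): {layer}/default/{time}/{TileMatrixSet}/{TileMatrix}/{TileRow}/{TileCol}.{ext}
    url = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
           "MODIS_Terra_CorrectedReflectance_TrueColor/default/"
           "2012-07-09/250m/2/1/2.jpg")
    tile = requests.get(url, timeout=30)
    with open("gibs_tile.jpg", "wb") as f:
        f.write(tile.content)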
1994-01-01
In genealogy, maps are most often used as clues to where public or other records about an ancestor are likely to be found. Searching for maps seldom begins until a newcomer to genealogy has mastered basic genealogical routines
NASA Astrophysics Data System (ADS)
Tellman, B.; Sullivan, J.; Kettner, A.; Brakenridge, G. R.; Slayback, D. A.; Kuhn, C.; Doyle, C.
2016-12-01
There is an increasing need to understand flood vulnerability as the societal and economic effects of flooding increase. Risk models from insurance companies and flood models from hydrologists must be calibrated against flood observations in order to make future predictions that can improve planning and help societies reduce future disasters. Specifically, both traditional physically based flood prediction models and data-driven techniques, such as machine learning, require spatial flood observations to validate model outputs and quantify uncertainty. A key dataset missing for flood model validation is a global historical geo-database of flood event extents. Currently, the most advanced database of historical flood extent is hosted and maintained at the Dartmouth Flood Observatory (DFO), which has catalogued 4320 floods (1985-2015) but has mapped only 5% of them. We are addressing this data gap by mapping the inventory of floods in the DFO database to create a first-of-its-kind, comprehensive, global, and historical geospatial database of flood events. To do so, we combine water detection algorithms on MODIS and Landsat 5, 7, and 8 imagery in Google Earth Engine to map discrete flood events. The resulting database will be available in the Earth Engine Catalogue for download by country, region, or time period. This dataset can be leveraged for new data-driven hydrologic modeling using machine learning algorithms in Earth Engine's highly parallelized computing environment, and we will show examples for New York and Senegal.
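For readers unfamiliar with Earth Engine, the sketch below shows the general shape of a per-event water-detection step using a simple NDWI threshold in the Earth Engine Python API. It is not the Dartmouth Flood Observatory algorithm; the collection id, band names, date window, region, and threshold are illustrative assumptions.

```python
# Hedged sketch: map surface water for one flood window with a simple NDWI
# threshold in Google Earth Engine (not the DFO algorithm; collection id,
# bands, dates, region, and threshold are illustrative).
import ee

ee.Initialize()

region = ee.Geometry.Rectangle([-74.5, 42.5, -73.5, 43.5])  # hypothetical AOI

composite = (ee.ImageCollection("LANDSAT/LC08/C01/T1_SR")
             .filterBounds(region)
             .filterDate("2013-07-01", "2013-07-31")   # assumed event window
             .median())

# NDWI = (green - NIR) / (green + NIR); Landsat 8 SR: B3 = green, B5 = NIR.
ndwi = composite.normalizedDifference(["B3", "B5"]).rename("ndwi")
water = ndwi.gt(0.1).selfMask()   # crude water mask; threshold is a guess

task = ee.batch.Export.image.toDrive(
    image=water, description="flood_extent_sketch",
    region=region, scale=30, maxPixels=1e9)
task.start()
```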
Toubal, Abderrezak Kamel; Achite, Mohammed; Ouillon, Sylvain; Dehni, Abdelatif
2018-03-12
Soil losses must be quantified over watersheds in order to set up protection measures against erosion. The main objective of this paper is to quantify and map soil losses in the Wadi Sahouat basin (2140 km²) in the north-west of Algeria, using the Revised Universal Soil Loss Equation (RUSLE) model assisted by a Geographic Information System (GIS) and remote sensing. The Model Builder of the GIS allowed the automation of the different operations for establishing thematic layers of the model parameters: the erosivity factor (R), the erodibility factor (K), the topographic factor (LS), the crop management factor (C), and the conservation support practice factor (P). The average annual soil loss rate in the Wadi Sahouat basin ranges from 0 to 255 t ha⁻¹ year⁻¹, maximum values being observed over steep slopes of more than 25% and between 600 and 1000 m elevation. 3.4% of the basin is classified as highly susceptible to erosion, 4.9% at medium risk, and 91.6% at low risk. Comparison with Google Earth imagery shows clear agreement with the erosion sensitivity of the mapped zones. Based on the soil loss map, 32 sub-basins were classified into three categories by priority of intervention: high, moderate, and low. This prioritization supports a management plan against sediment filling of the Ouizert dam at the basin outlet. The method, combining the RUSLE model with verification against Google Earth, can easily be adapted to other watersheds.
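The RUSLE estimate is the product of the five factor layers, A = R x K x LS x C x P. A minimal raster sketch of that overlay, assuming the factor grids have already been derived and co-registered, is given below; the array values and risk-class thresholds are placeholders.

```python
# Hedged sketch of the RUSLE overlay A = R * K * LS * C * P on co-registered
# factor rasters (toy arrays stand in for the GIS layers described above).
import numpy as np

shape = (100, 100)                        # placeholder grid
R  = np.full(shape, 55.0)                 # rainfall erosivity
K  = np.random.uniform(0.1, 0.5, shape)   # soil erodibility
LS = np.random.uniform(0.1, 8.0, shape)   # slope length/steepness
C  = np.random.uniform(0.01, 0.6, shape)  # cover management
P  = np.full(shape, 1.0)                  # support practice

A = R * K * LS * C * P                    # soil loss, t per ha per year

# Classify into three risk classes (thresholds here are assumed, not the
# paper's class boundaries).
low, high = 7.0, 20.0
risk = np.digitize(A, [low, high])        # 0 = low, 1 = medium, 2 = high
print("share of high-risk cells:", (risk == 2).mean())
```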
Benford’s Law Applies to Online Social Networks
Golbeck, Jennifer
2015-01-01
Benford’s Law states that, in naturally occurring systems, the frequency of numbers’ first digits is not evenly distributed. Numbers beginning with a 1 occur roughly 30% of the time, and are six times more common than numbers beginning with a 9. We show that Benford’s Law applies to social and behavioral features of users in online social networks. Using social data from five major social networks (Facebook, Twitter, Google Plus, Pinterest, and LiveJournal), we show that the distribution of first significant digits of friend and follower counts for users in these systems follows Benford’s Law. The same is true for the number of posts users make. We extend this to egocentric networks, showing that friend counts among the people in an individual’s social network also follow the expected distribution. We discuss how this can be used to detect suspicious or fraudulent activity online and to validate datasets. PMID:26308716
Benford's Law Applies to Online Social Networks.
Golbeck, Jennifer
2015-01-01
Benford's Law states that, in naturally occurring systems, the frequency of numbers' first digits is not evenly distributed. Numbers beginning with a 1 occur roughly 30% of the time, and are six times more common than numbers beginning with a 9. We show that Benford's Law applies to social and behavioral features of users in online social networks. Using social data from five major social networks (Facebook, Twitter, Google Plus, Pinterest, and LiveJournal), we show that the distribution of first significant digits of friend and follower counts for users in these systems follows Benford's Law. The same is true for the number of posts users make. We extend this to egocentric networks, showing that friend counts among the people in an individual's social network also follow the expected distribution. We discuss how this can be used to detect suspicious or fraudulent activity online and to validate datasets.
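The expected first-digit frequencies follow P(d) = log10(1 + 1/d), giving a digit-1 share of about 30% and a digit-9 share of about 4.6%. A quick sketch for checking a list of follower counts against that expectation (with made-up sample data) is shown below.

```python
# Hedged sketch: compare first-digit frequencies of a count vector with the
# Benford expectation P(d) = log10(1 + 1/d). The sample data are made up.
import math
from collections import Counter

def first_digit(n):
    s = str(abs(int(n))).lstrip("0")
    return int(s[0]) if s else None

follower_counts = [12, 150, 93, 1048, 27, 301, 18, 5, 1220, 164]  # toy data

observed = Counter(d for d in map(first_digit, follower_counts) if d)
total = sum(observed.values())

for d in range(1, 10):
    expected = math.log10(1 + 1 / d)
    print(f"digit {d}: observed {observed[d] / total:.2f}, "
          f"expected {expected:.2f}")
```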
ERIC Educational Resources Information Center
Kopcha, Theodore J.; Otumfuor, Beryl A.; Wang, Lu
2015-01-01
This study examines the effects of spatial ability, gender differences, and pictorial training on fourth grade students' ability to recall landmark locations from memory. Ninety-six students used Google Earth over a 3-week period to locate landmarks (3-D) and mark their location on a 2-D topographical map. Analysis of covariance on posttest scores…
VizieR Online Data Catalog: Orion Integral Filament ALMA+IRAM30m N2H+(1-0) data (Hacar+, 2018)
NASA Astrophysics Data System (ADS)
Hacar, A.; Tafalla, M.; Forbrich, J.; Alves, J.; Meingast, S.; Grossschedl, J.; Teixeira, P. S.
2018-01-01
Combined ALMA+IRAM30m large-scale N2H+(1-0) emission in the Orion ISF. Two datasets are presented here in FITS format: (1) the full data cube (spectral resolution = 0.1 km s⁻¹); (2) the total integrated line intensity (moment 0) map. Units are Jy/beam. See also: https://sites.google.com/site/orion4dproject/home (2 data files).
NASA Technical Reports Server (NTRS)
Xiong, Jun; Thenkabail, Prasad S.; Tilton, James C.; Gumma, Murali K.; Teluguntla, Pardhasaradhi; Oliphant, Adam; Congalton, Russell G.; Yadav, Kamini; Gorelick, Noel
2017-01-01
A satellite-derived cropland extent map at high spatial resolution (30-m or better) is a must for food and water security analysis. Precise and accurate global cropland extent maps, indicating cropland and non-cropland areas, are a starting point for developing higher-level products such as crop watering methods (irrigated or rainfed), cropping intensities (e.g., single, double, or continuous cropping), crop types, cropland fallows, as well as assessments of cropland productivity (productivity per unit of land) and crop water productivity (productivity per unit of water). Uncertainties associated with the cropland extent map have cascading effects on all higher-level cropland products. However, precise and accurate cropland extent maps at high spatial resolution over large areas (e.g., continents or the globe) are challenging to produce due to the small-holder-dominated agricultural systems found in most of Africa and Asia. Cloud-based geospatial computing platforms and multi-date, multi-sensor satellite image inventories on Google Earth Engine offer opportunities for mapping croplands with precision and accuracy over large areas, satisfying the requirements of a broad range of applications. Such maps are expected to provide highly significant improvements compared to existing products, which tend to be coarser in resolution and often fail to capture fragmented small-holder farms, especially in regions with high dynamic change within and across years. To overcome these limitations, in this research we present an approach for cropland extent mapping at high spatial resolution (30-m or better) using the 10-day, 10 to 20-m Sentinel-2 data in combination with 16-day, 30-m Landsat-8 data on Google Earth Engine (GEE). First, nominal 30-m resolution satellite imagery composites were created from 36,924 scenes of Sentinel-2 and Landsat-8 images for the entire African continent in 2015-2016. These composites were generated using a median mosaic of five bands (blue, green, red, near-infrared, NDVI) during each of two periods (period 1: January-June 2016 and period 2: July-December 2015) plus a 30-m slope layer derived from the Shuttle Radar Topography Mission (SRTM) elevation dataset. Second, we selected cropland/non-cropland training samples (sample size 9791) from various sources in GEE to create pixel-based classifications. Random Forest (RF) was used as the primary supervised classifier because of its efficiency; where RF over-fitted due to noisy training data, a Support Vector Machine (SVM) was applied to compensate in specific areas. Third, the Recursive Hierarchical Segmentation (RHSeg) algorithm was employed to generate an object-oriented segmentation layer based on spectral and spatial properties from the same input data. This layer was merged with the pixel-based classification to improve segmentation accuracy. Accuracies of the merged 30-m crop extent product were computed using an error matrix approach in which 1754 independent validation samples were used. In addition, a comparison was performed with other available cropland maps as well as with LULC maps to show spatial similarity. Finally, the cropland area results derived from the map were compared with UN FAO statistics. The independent accuracy assessment showed a weighted overall accuracy of 94%, with a producer's accuracy of 85.9% (omission error of 14.1%) and a user's accuracy of 68.5% (commission error of 31.5%) for the cropland class. The total net cropland area (TNCA) of Africa was estimated as 313 Mha for the nominal year 2015.
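A minimal Earth Engine sketch of the compositing-plus-classification step described above is given below. The collection id, band list, training asset, region, and tree count are assumptions, and the SVM fallback and RHSeg segmentation stages are omitted.

```python
# Hedged sketch: median composite + Random Forest cropland classification in
# Google Earth Engine (collection id, bands, region, training asset, and
# parameters are illustrative; SVM fallback and RHSeg are not shown).
import ee

ee.Initialize()

region = ee.Geometry.Rectangle([30.0, -2.0, 31.0, -1.0])  # hypothetical AOI
bands = ["B2", "B3", "B4", "B8"]                          # blue, green, red, NIR

composite = (ee.ImageCollection("COPERNICUS/S2")
             .filterBounds(region)
             .filterDate("2015-07-01", "2015-12-31")      # "period 2" window
             .median()
             .select(bands))

# Hypothetical FeatureCollection of points carrying a 0/1 'cropland' property.
training_points = ee.FeatureCollection("users/example/cropland_samples")

samples = composite.sampleRegions(
    collection=training_points, properties=["cropland"], scale=30)

classifier = ee.Classifier.smileRandomForest(100).train(
    features=samples, classProperty="cropland", inputProperties=bands)

cropland_map = composite.classify(classifier)   # 0 = non-cropland, 1 = cropland
```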
Research of cartographer laser SLAM algorithm
NASA Astrophysics Data System (ADS)
Xu, Bo; Liu, Zhengjun; Fu, Yiran; Zhang, Changsai
2017-11-01
Because indoor spaces are relatively small and enclosed, total stations, GPS, and close-range photogrammetry have difficulty achieving fast and accurate indoor three-dimensional reconstruction. LIDAR SLAM technology does not rely on a priori knowledge of the external environment; it uses only portable on-board sensors such as a lidar, an IMU, and an odometer to build an environment map independently, which solves this problem well. This paper analyzes the Google Cartographer laser SLAM algorithm in terms of point cloud matching and loop closure detection. Finally, the whole workflow, from data acquisition and processing to creation of the environment map, is presented in the 3D visualization tool RViz, completing the SLAM pipeline and realizing indoor three-dimensional space reconstruction.
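As a rough illustration of the scan-matching building block discussed above, the toy sketch below performs brute-force correlative matching of a 2D scan against an occupancy grid. It is not the Cartographer implementation, which accelerates this search and refines the result with non-linear optimization; the grid resolution and search window here are arbitrary.

```python
# Hedged toy illustration of 2D scan-to-map matching (the building block that
# Cartographer refines and accelerates); a brute-force search over a small
# pose grid, not the Cartographer code.
import numpy as np

def score(pose, scan_xy, occupancy, resolution=0.05):
    """Count scan points landing on occupied cells after applying pose."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    pts = scan_xy @ np.array([[c, -s], [s, c]]).T + np.array([x, y])
    idx = np.round(pts / resolution).astype(int)
    h, w = occupancy.shape
    ok = (idx[:, 0] >= 0) & (idx[:, 0] < w) & (idx[:, 1] >= 0) & (idx[:, 1] < h)
    return occupancy[idx[ok, 1], idx[ok, 0]].sum()

def match(scan_xy, occupancy, search=0.2, step=0.05,
          angles=np.radians([-2.0, 0.0, 2.0])):
    """Return the (dx, dy, dtheta) that best aligns the scan with the map."""
    best, best_pose = -1.0, (0.0, 0.0, 0.0)
    for dx in np.arange(-search, search + step, step):
        for dy in np.arange(-search, search + step, step):
            for dth in angles:
                s = score((dx, dy, dth), scan_xy, occupancy)
                if s > best:
                    best, best_pose = s, (dx, dy, dth)
    return best_pose
```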
Integration of Bim, Web Maps and Iot for Supporting Comfort Analysis
NASA Astrophysics Data System (ADS)
Gunduz, M.; Isikdag, U.; Basaraner, M.
2017-11-01
The use of the Internet is expanding and the technological capabilities of electronic devices are evolving. Today, Internet of Things (IoT) solutions can be developed that were never even imaginable before. In this paper, a case study is presented on the joint use of Building Information Model (BIM), Geographical Information Systems (GIS) and Internet of Things (IoT) technologies. It is part of an ongoing study that intends to overcome some problems in the management of complex facilities. In the study, a BIM has been converted and displayed in 2D on Google Maps, and information from various sensors has been represented on the web with geographic coordinates in real time.
NASA Astrophysics Data System (ADS)
Giardino, Marco; Magagna, Alessandra; Ferrero, Elena; Perrone, Gianluigi
2015-04-01
Digital field mapping has certainly provided geoscientists with the opportunity to map and gather data in the field directly using digital tools and software, rather than using paper maps, notebooks and analogue devices and then transferring the data to a digital format for subsequent analysis. The same opportunity should be recognized for Geoscience education, as well as for stimulating and helping students in the recognition of landforms and the interpretation of the geological and geomorphological components of a landscape. Moreover, early exposure to mapping during school, prior to university, can optimise the ability to "read" and identify uncertainty in 3D models. During 2014, about 200 Secondary School students (aged 12-15) of the Piedmont region (NW Italy) participated in a research program involving the use of mobile devices (smartphones and tablets) in the field. Students, divided into groups, used the application Trimble Outdoors Navigator for tracking a geological trail in the Sangone Valley and for taking georeferenced pictures and notes. Back at school, students downloaded the digital data as a .kml file for visualization in Google Earth. This allowed them: to compare the hand-tracked trail on a paper map with the digital trail, and to discuss the functioning and precision of the tools; to overlay a digital, semi-transparent version of the 2D paper map (a Regional Technical Map) used during the field trip on the 2.5D landscape of Google Earth, helping them interpret conventional symbols such as contour lines; to perceive the landforms seen during the field trip as part of a more complex Pleistocene glacial landscape; and to understand the classical and innovative contributions from different geoscientific disciplines to the generation of a 3D structural geological model of the Rivoli-Avigliana Morainic Amphitheatre. In 2013 and 2014, some other pilot projects were carried out in different areas of the Piedmont region, and in the Sesia Val Grande Geopark, to test the utility of digital field mapping in Geoscience education. Feedback from students is positive: they are stimulated and involved by the use of ICT for learning Geoscience, and they voluntarily choose to work with their personal mobile devices (more than 90% of them own a smartphone); they are interested in knowing the features of GPS and of software for the visualization of satellite and aerial images, but they recognize the importance of integrating and comparing traditional and innovative methods in the field.
NASA Astrophysics Data System (ADS)
Bajo, J. V.; Martinez-Hackert, B.; Polio, C.; Gutierrez, E.
2015-12-01
Santa Ana (Ilamatepec) Volcano is an active composite volcano in the Apaneca Volcanic Field in the western part of El Salvador, Central America. The volcano is surrounded by rural communities in its proximal areas and by the second (Santa Ana, 13 km) and fourth (Sonsonate, 15 km) largest cities of the country. On October 1st, 2005, the volcano erupted after months of increased activity. Following the eruption, volcanic mitigation projects were conducted in the region, but the communities had little or no input on them. This project consisted of the creation of a lahar hazard map for the Cantón Buenos Aires on the northern part of the volcano by incorporating the community's knowledge of prior events into model parameters and results. The work with the community consisted of several meetings in which community members recounted past events. They were asked to map the outcomes of those events using either a topographic map of the area, a Google Earth image, or a blank poster-size sheet of paper. These maps have been used to identify hazardous and vulnerable areas and for model validation. The resulting hazard maps were presented to the communities, which accepted the results and the maps.
NaviCell Web Service for network-based data visualization.
Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P A; Barillot, Emmanuel; Zinovyev, Andrei
2015-07-01
Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of 'omics' data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
NASA Astrophysics Data System (ADS)
Dong, Weihua; Liao, Hua
2016-06-01
Despite the now-ubiquitous two-dimensional (2D) maps, photorealistic three-dimensional (3D) representations of cities (e.g., Google Earth) have gained much attention from scientists and public users as another option. However, there is no consistent evidence on the influence of 3D photorealism on pedestrian navigation. Whether 3D photorealism can communicate cartographic information for navigation with higher effectiveness and efficiency and lower cognitive workload compared to traditional symbolic 2D maps remains unknown. This study aims to explore whether photorealistic 3D representations can facilitate map reading and navigation in digital environments using a lab-based eye tracking approach. Here we show the differences between symbolic 2D maps and photorealistic 3D representations based on users' eye-movement and navigation behaviour data. We found that participants using the 3D representation were less effective, less efficient, and required a higher cognitive workload than those using the 2D map for map reading. However, participants using the 3D representation performed more efficiently in self-localization and orientation at complex decision points. The empirical results can help improve the usability of pedestrian navigation maps in future designs.
NaviCell Web Service for network-based data visualization
Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P. A.; Barillot, Emmanuel; Zinovyev, Andrei
2015-01-01
Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of ‘omics’ data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. PMID:25958393
ERIC Educational Resources Information Center
Levi, Peter
2010-01-01
Purpose: The purpose of this paper is to describe a project to digitise maps at the Royal Tropical Institute, or Koninklijk Instituut voor de Tropen (KIT), of The Netherlands. KIT has an extensive collection of maps and nautical charts of (sub-) tropical regions, including general maps and topographical map series, city maps, thematic maps and…
Steven H. Ackers; Raymond J. Davis; Keith A. Olsen; Katie M. Dugger
2015-01-01
Wildlife habitat mapping has evolved at a rapid pace over the last few decades. Beginning with simple, often subjective, hand-drawn maps, habitat mapping now involves complex species distribution models (SDMs) using mapped predictor variables derived from remotely sensed data. For species that inhabit large geographic areas, remote sensing technology is often...
ERIC Educational Resources Information Center
Merrett, Christopher E.
This guide to the theory and practice of map classification begins with a discussion of the filing of maps and the function of map classification based on area and theme as illustrated by four maps of Africa. The description of the various classification systems which follows is divided into book schemes with provision for maps (including Dewey…
A virtual tour of geological heritage: Valourising geodiversity using Google Earth and QR code
NASA Astrophysics Data System (ADS)
Martínez-Graña, A. M.; Goy, J. L.; Cimarra, C. A.
2013-12-01
When making land-use plans, it is necessary to inventory and catalogue the geological heritage and geodiversity of a site to establish an apolitical conservation protection plan that meets the educational and social needs of society. New technologies make it possible to create virtual databases using virtual globes - e.g., Google Earth - and other personal-use geomatics applications (smartphones, tablets, PDAs) for accessing geological heritage information in "real time" for scientific, educational, and cultural purposes via a virtual geological itinerary. Seventeen mapped and georeferenced geosites have been encoded in Keyhole Markup Language for use as map layers at the geological itinerary stops in different applications. A virtual tour has been developed for Las Quilamas Natural Park, which is located in the Spanish Central System, using geological layers and topographic and digital terrain models that can be overlaid in a 3D model. The Google Earth application was used to import the geosite placemarks. For each geosite, a tab has been developed that shows a description of the geology with photographs and diagrams and that evaluates its scientific, educational, and tourism quality. Augmented reality allows the user to access these georeferenced thematic layers and overlay data, images, and graphics in real time on their mobile devices. These virtual tours can be incorporated into subject guides designed by the public. Seven educational and interpretive panels describing some of the geosites were designed and tagged with a QR code that can be printed at each stop or in the printed itinerary. These QR codes can be scanned with the camera found on most mobile devices, and video virtual tours can be viewed on these devices. The virtual tour of the geological heritage can be used to show tourists the geological history of Las Quilamas Natural Park using new geomatics technologies (virtual globes, augmented reality, and QR codes).
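A placemark like the geosites described above can be written to KML with a few lines of Python; the sketch below uses the simplekml package, and the coordinates, names, and description text are hypothetical.

```python
# Hedged sketch: build one georeferenced geosite placemark in KML for a
# Google Earth virtual itinerary (coordinates, names, and text are made up;
# simplekml is one convenient way to write KML from Python).
import simplekml

kml = simplekml.Kml()
kml.document.name = "Las Quilamas virtual geological itinerary (sketch)"

geosite = kml.newpoint(
    name="Geosite 01 - example outcrop",
    coords=[(-6.05, 40.55)],               # (lon, lat), placeholder values
)
geosite.description = (
    "<b>Stop 1</b><br/>"
    "Short geological description, photographs and assessment of the "
    "scientific, educational and tourism value would go here."
)

kml.save("quilamas_geosites.kml")          # open the result in Google Earth
```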
Google Earth Grand Tour Themes
NASA Astrophysics Data System (ADS)
De Paor, D. G.; Whitmeyer, S. J.; Bentley, C.; Dordevic, M. M.
2014-12-01
As part of an NSF TUES Type 3 project entitled "Google Earth for Onsite and Distance Education (GEODE)," we are assembling a "Grand Tour" of locations on Earth and other terrestrial bodies that every geoscience student should know about and visit at least in virtual reality. Based on feedback from colleagues at previous meetings, we have identified nine Grand Tour themes: "Plates and Plumes," "Rocks and Regions," "Geology Through Time," "The Mapping Challenge*," "U.S. National Parks*," "The Magical Mystery Tour*," "Resources and Hazards," "Planets and Moons," and "Top of the Pops." Themes marked with an asterisk are most developed at this stage and will be demonstrated in real time. The Mapping Challenge invites students to trace geological contacts, measure bedding strike and dip and the plunge, trend, and facing of a fold. There is an advanced tool for modeling periclinal folds. The challenge is presented in a game-like format with an emphasis on puzzle-solving that will appeal to students regardless of gender. For the tour of U.S. national parks, we divided the most geologically important parks into four groups—Western Pacific, West Coast, Rockies, and East Coast. We are combining our own team's GigaPan imagery with imagery already available on the Internet. There is a great deal of imagery just waiting to be annotated for geological education purposes. The Magical Mystery Tour takes students to Google Streetview locations selected by instructors. Students are presented with questions or tasks and are given automatic feedback. Other themes are under development. Within each theme, we are crowd-sourcing contributions from colleagues and inviting colleagues to vote for or against proposed locations and student interactions. The GEODE team includes the authors and: Heather Almquist, Stephen Burgin, Cinzia Cervato, Gene Cooper, Paul Karabinos, Terry Pavlis, Jen Piatek, Bill Richards, Jeff Ryan, Ron Schott, Kristen St. John, and Barb Tewksbury.
Next Generation Landsat Products Delivered Using Virtual Globes and OGC Standard Services
NASA Astrophysics Data System (ADS)
Neiers, M.; Dwyer, J.; Neiers, S.
2008-12-01
The Landsat Data Continuity Mission (LDCM) is the next in the series of Landsat satellite missions and is tasked with the objective of delivering data acquired by the Operational Land Imager (OLI). The OLI instrument will provide data continuity to over 30 years of global multispectral data collected by the Landsat series of satellites. The U.S. Geological Survey Earth Resources Observation and Science (USGS EROS) Center has responsibility for the development and operation of the LDCM ground system. One of the mission objectives of the LDCM is to distribute OLI data products electronically over the Internet to the general public on a nondiscriminatory basis and at no cost. To ensure the user community and general public can easily access LDCM data from multiple clients, the User Portal Element (UPE) of the LDCM ground system will use OGC standards and services such as Keyhole Markup Language (KML), Web Map Service (WMS), Web Coverage Service (WCS), and Geographic encoding of Really Simple Syndication (GeoRSS) feeds for both access to and delivery of LDCM products. The USGS has developed and tested the capabilities of several successful UPE prototypes for delivery of Landsat metadata, full resolution browse, and orthorectified (L1T) products from clients such as Google Earth, Google Maps, ESRI ArcGIS Explorer, and Microsoft's Virtual Earth. Prototyping efforts included the following services: using virtual globes to search the historical Landsat archive by dynamic generation of KML; notification of and access to new Landsat acquisitions and L1T downloads from GeoRSS feeds; Google indexing of KML files containing links to full resolution browse and data downloads; WMS delivery of reduced resolution browse, full resolution browse, and cloud mask overlays; and custom data downloads using WCS clients. These various prototypes will be demonstrated and LDCM service implementation plans will be discussed during this session.
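The OGC services listed above follow well-known request patterns. The sketch below assembles a standard WMS 1.1.1 GetMap request in Python; the host and layer name are hypothetical, and only the key-value parameters follow the OGC specification.

```python
# Hedged sketch of an OGC WMS 1.1.1 GetMap request such as a virtual-globe or
# GIS client would issue for a browse layer (the host and layer names are
# hypothetical; only the KVP parameters follow the OGC standard).
import requests

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "landsat_browse",        # hypothetical browse layer
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "-105.0,39.0,-104.0,40.0", # minx,miny,maxx,maxy
    "WIDTH": "512",
    "HEIGHT": "512",
    "FORMAT": "image/png",
    "TRANSPARENT": "TRUE",
}

resp = requests.get("https://example.usgs.gov/wms", params=params, timeout=30)
resp.raise_for_status()
with open("browse.png", "wb") as f:
    f.write(resp.content)
```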
Walsh, Gregory J.
2014-01-01
The bedrock geology of the 7.5-minute Uxbridge quadrangle consists of Neoproterozoic metamorphic and igneous rocks of the Avalon zone. In this area, rocks of the Avalon zone lie within the core of the Milford antiform, south and east of the terrane-bounding Bloody Bluff fault zone. Permian pegmatite dikes and quartz veins occur throughout the quadrangle. The oldest metasedimentary rocks include the Blackstone Group, which represents a Neoproterozoic peri-Gondwanan marginal shelf sequence. The metasedimentary rocks are intruded by Neoproterozoic arc-related plutonic rocks of the Rhode Island batholith. This report presents mapping by G.J. Walsh. The complete report consists of a map, text pamphlet, and GIS database. The map and text pamphlet are available only as downloadable files (see frame at right). The GIS database is available for download in ESRI™ shapefile and Google Earth™ formats, and includes contacts of bedrock geologic units, faults, outcrops, structural geologic information, geochemical data, and photographs.
Mapping 2000 2010 Impervious Surface Change in India Using Global Land Survey Landsat Data
NASA Technical Reports Server (NTRS)
Wang, Panshi; Huang, Chengquan; Brown De Colstoun, Eric C.
2017-01-01
Understanding and monitoring the environmental impacts of global urbanization requires better urban datasets. Continuous-field impervious surface change (ISC) mapping using Landsat data is an effective way to quantify the spatiotemporal dynamics of urbanization. It is well acknowledged that Landsat-based estimation of impervious surface is subject to seasonal and phenological variations. The overall goal of this paper is to map 2000-2010 ISC for India using Global Land Survey datasets and training data only available for 2010. To this end, a method was developed that could transfer the regression tree model developed for mapping 2010 impervious surface to 2000 using an iterative training and prediction (ITP) approach. An independent validation dataset was also developed using Google Earth imagery. Based on the reference ISC from the validation dataset, the RMSE of predicted ISC was estimated to be 18.4%. At 95% confidence, the total estimated ISC for India between 2000 and 2010 is 2274.62 +/- 7.84 sq km.
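As a toy illustration of the regression-tree step, the sketch below fits a tree to predict percent impervious surface from per-pixel spectral features using scikit-learn. The synthetic features stand in for the Global Land Survey composites, and the iterative training and prediction (ITP) transfer itself is not reproduced.

```python
# Hedged sketch of the regression-tree step: predict percent impervious
# surface from per-pixel spectral features (toy synthetic data; this is not
# the iterative training and prediction (ITP) transfer procedure itself).
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(5000, 6))            # 6 synthetic band values
y = np.clip(100 * X[:, 0] - 40 * X[:, 3] + rng.normal(0, 5, 5000), 0, 100)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tree = DecisionTreeRegressor(max_depth=8, min_samples_leaf=20, random_state=0)
tree.fit(X_train, y_train)

pred = tree.predict(X_test)
rmse = mean_squared_error(y_test, pred) ** 0.5   # compare with the ~18% RMSE
print(f"toy RMSE: {rmse:.1f}% impervious cover")
```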
Krystosik, Amy R; Curtis, Andrew; Buritica, Paola; Ajayakumar, Jayakrishnan; Squires, Robert; Dávalos, Diana; Pacheco, Robinson; Bhatta, Madhav P; James, Mark A
2017-01-01
Cali, Colombia has experienced chikungunya and Zika outbreaks and hypoendemic dengue. Studies have explained Cali's dengue patterns but lack the sub-neighborhood-scale detail investigated here. Spatial-video geonarratives (SVG) with Ministry of Health officials and Community Health Workers were collected in hotspots, providing perspective on perceptions of why dengue, chikungunya and Zika hotspots exist, impediments to control, and social outcomes. Using spatial video and Google Street View, sub-neighborhood features possibly contributing to incidence were mapped to create risk surfaces, later compared with dengue, chikungunya and Zika case data. SVG captured insights in 24 neighborhoods. Trash and water risks in Calipso were mapped using SVG results. Perceived risk factors included proximity to standing water, canals, poverty, invasions, localized violence and military migration. These risks overlapped case density maps and identified areas that are suitable for transmission but are possibly underreporting to the surveillance system. The resulting risk maps with local context could be leveraged to increase vector-control efficiency by targeting key areas of environmental risk.
Integrating Socioeconomic and Earth Science Data Using Geobrowsers and Web Services: A Demonstration
NASA Astrophysics Data System (ADS)
Schumacher, J. A.; Yetman, G. G.
2007-12-01
The societal benefit areas identified as the focus for the Global Earth Observing System of Systems (GEOSS) 10- year implementation plan are an indicator of the importance of integrating socioeconomic data with earth science data to support decision makers. To aid this integration, CIESIN is delivering its global and U.S. demographic data to commercial and open source Geobrowsers and providing open standards based services for data access. Currently, data on population distribution, poverty, and detailed census data for the U.S. are available for visualization and access in Google Earth, NASA World Wind, and a browser-based 2-dimensional mapping client. The mapping client allows for the creation of web map documents that pull together layers from distributed servers and can be saved and shared. Visualization tools with Geobrowsers, user-driven map creation and sharing via browser-based clients, and a prototype for characterizing populations at risk to predicted precipitation deficits will be demonstrated.
Quantification of Plant Chlorophyll Content Using Google Glass
Cortazar, Bingen; Koydemir, Hatice Ceylan; Tseng, Derek; Feng, Steve; Ozcan, Aydogan
2015-01-01
Measuring plant chlorophyll concentration is a well-known and commonly used method in agriculture and environmental applications for monitoring plant health, which also correlates with many other plant parameters including, e.g., carotenoids, nitrogen, maximum green fluorescence, etc. Direct chlorophyll measurement using chemical extraction is destructive, complex and time-consuming, which has led to the development of mobile optical readers, providing non-destructive but at the same time relatively expensive tools for evaluation of plant chlorophyll levels. Here we demonstrate accurate measurement of chlorophyll concentration in plant leaves using Google Glass and a custom-developed software application together with a cost-effective leaf holder and multi-spectral illuminator device. Two images, taken using Google Glass, of a leaf placed in our portable illuminator device under red and white (i.e., broadband) light-emitting-diode (LED) illumination are uploaded to our servers for remote digital processing and chlorophyll quantification, with results returned to the user in less than 10 seconds. Intensity measurements extracted from the uploaded images are mapped against gold-standard colorimetric measurements made through a commercially available reader to generate calibration curves for plant leaf chlorophyll concentration. Using five plant species to calibrate our system, we demonstrate that our approach can accurately and rapidly estimate chlorophyll concentration of fifteen different plant species under both indoor and outdoor lighting conditions. This Google Glass based chlorophyll measurement platform can display the results in spatiotemporal and tabular forms and would be highly useful for monitoring of plant health in environmental and agriculture related applications, including e.g., urban plant monitoring, indirect measurements of the effects of climate change, and as an early indicator for water, soil, and air quality degradation. PMID:25669673
Quantification of plant chlorophyll content using Google Glass.
Cortazar, Bingen; Koydemir, Hatice Ceylan; Tseng, Derek; Feng, Steve; Ozcan, Aydogan
2015-04-07
Measuring plant chlorophyll concentration is a well-known and commonly used method in agriculture and environmental applications for monitoring plant health, which also correlates with many other plant parameters including, e.g., carotenoids, nitrogen, maximum green fluorescence, etc. Direct chlorophyll measurement using chemical extraction is destructive, complex and time-consuming, which has led to the development of mobile optical readers, providing non-destructive but at the same time relatively expensive tools for evaluation of plant chlorophyll levels. Here we demonstrate accurate measurement of chlorophyll concentration in plant leaves using Google Glass and a custom-developed software application together with a cost-effective leaf holder and multi-spectral illuminator device. Two images, taken using Google Glass, of a leaf placed in our portable illuminator device under red and white (i.e., broadband) light-emitting-diode (LED) illumination are uploaded to our servers for remote digital processing and chlorophyll quantification, with results returned to the user in less than 10 seconds. Intensity measurements extracted from the uploaded images are mapped against gold-standard colorimetric measurements made through a commercially available reader to generate calibration curves for plant leaf chlorophyll concentration. Using five plant species to calibrate our system, we demonstrate that our approach can accurately and rapidly estimate chlorophyll concentration of fifteen different plant species under both indoor and outdoor lighting conditions. This Google Glass based chlorophyll measurement platform can display the results in spatiotemporal and tabular forms and would be highly useful for monitoring of plant health in environmental and agriculture related applications, including e.g., urban plant monitoring, indirect measurements of the effects of climate change, and as an early indicator for water, soil, and air quality degradation.
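The calibration idea behind this measurement pipeline can be sketched in a few lines: extract leaf intensities from the red-LED and white-LED images and map a simple intensity ratio to chlorophyll through a fitted calibration curve. The file names, the ratio feature, and the calibration pairs below are illustrative assumptions, not values from the study.

```python
# Hedged sketch of the calibration idea behind the Glass application: extract
# leaf intensities from the red-LED and white-LED images and map their ratio
# to chlorophyll through a fitted calibration curve (file names, the ratio
# feature, and the calibration pairs are all illustrative assumptions).
import numpy as np
from PIL import Image

def mean_red_intensity(path):
    """Mean red-channel value over the image (a stand-in for a leaf-area ROI)."""
    return np.asarray(Image.open(path).convert("RGB"))[:, :, 0].mean()

feature = (mean_red_intensity("leaf_red_led.jpg")
           / mean_red_intensity("leaf_white_led.jpg"))

# Calibration pairs (feature value, chlorophyll reading) that would come from
# the gold-standard colorimetric reader; the values here are made up.
cal_x = np.array([0.45, 0.55, 0.65, 0.75, 0.85])
cal_y = np.array([55.0, 42.0, 31.0, 22.0, 14.0])
coeffs = np.polyfit(cal_x, cal_y, deg=2)          # simple polynomial fit

chlorophyll = np.polyval(coeffs, feature)
print(f"estimated chlorophyll index: {chlorophyll:.1f}")
```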
Li, Ya-pin; Fang, Li-qun; Gao, Su-qing; Wang, Zhen; Gao, Hong-wei; Liu, Peng; Wang, Ze-Rui; Li, Yan-Li; Zhu, Xu-Guang; Li, Xin-Lou; Xu, Bo; Li, Yin-Jun; Yang, Hong; de Vlas, Sake J; Shi, Tao-Xing; Cao, Wu-Chun
2013-01-01
For years, emerging infectious diseases have appeared worldwide and threatened the health of people. The emergence and spread of an infectious-disease outbreak are usually unforeseen, and have the features of suddenness and uncertainty. Timely understanding of basic information in the field, and the collection and analysis of epidemiological information, is helpful in making rapid decisions and responding to an infectious-disease emergency. Therefore, it is necessary to have an unobstructed channel and convenient tool for the collection and analysis of epidemiologic information in the field. Baseline information for each county in mainland China was collected and a database was established by geo-coding information on a digital map of county boundaries throughout the country. Google Maps was used to display geographic information and to conduct calculations related to maps, and the 3G wireless network was used to transmit information collected in the field to the server. This study established a decision support system for the response to infectious-disease emergencies based on WebGIS and mobile services (DSSRIDE). The DSSRIDE provides functions including data collection, communication and analyses in real time, epidemiological detection, the provision of customized epidemiological questionnaires and guides for handling infectious disease emergencies, and the querying of professional knowledge in the field. These functions of the DSSRIDE could be helpful for epidemiological investigations in the field and the handling of infectious-disease emergencies. The DSSRIDE provides a geographic information platform based on the Google Maps application programming interface to display information of infectious disease emergencies, and transfers information between workers in the field and decision makers through wireless transmission based on personal computers, mobile phones and personal digital assistants. After a 2-year practice and application in infectious disease emergencies, the DSSRIDE is becoming a useful platform and is a useful tool for investigations in the field carried out by response sections and individuals. The system is suitable for use in developing countries and low-income districts.
Kaewpitoon, Soraya J; Rujirakul, Ratana; Sangkudloa, Amnat; Kaewthani, Sarochinee; Khemplila, Kritsakorn; Cherdjirapong, Karuna; Kujapun, Jirawoot; Norkaew, Jun; Chavengkun, Wasugree; Ponphimai, Sukanya; Polsripradist, Poowadol; Padchasuwan, Natnapa; Joosiri, Apinya; Wakkhuwattapong, Parichart; Loyd, Ryan A; Matrakool, Likit; Tongtawee, Taweesak; Panpimanmas, Sukij; Kaewpitoon, Natthawut
2016-01-01
Cholangiocarcinoma (CCA), a major health problem in Thailand, particularly in the Northeastern and Northern regions, is generally incurable and rapidly lethal because of presentation at stage 3 or 4. Early diagnosis at stage 1 or 2 could allow better survival. Therefore, this study aimed to provide a distribution map of populations at risk for CCA in Bua Yai district of Nakhon Ratchasima province, Northeast Thailand. A cross-sectional survey was carried out in 10 sub-districts and 122 villages during June and November 2015. The populations at risk for CCA were screened using the Korat CCA verbal screening test (KCVST), and the risk areas were then displayed using Google Maps (GM). A total of 11,435 individuals from a population of 26,198 completed the KCVST. The majority had a low risk score for CCA (1-4 points; 93.3%). High scores of 6, 7 and 8 points accounted for 1.20%, 0.13% and 0.02%, respectively. The population at risk was found most frequently in sub-district municipalities, followed by sub-district administrative organizations and town municipalities (F=396.220, P-value=0.000). The distribution map comprised 11 layers: 1, district; 2, local administrative organization; 3, hospital; 4, KCVST opisthorchiasis; 5, KCVST praziquantel use; 6, KCVST cholelithiasis; 7, KCVST raw fish consumption; 8, KCVST alcohol consumption; 9, KCVST pesticide use; 10, KCVST relative family with CCA; and 11, KCVST native northeastern people. The geovisual display is now available online. This study indicated that the population at high risk of CCA in Bua Yai district is small; therefore, setting up a zero-CCA model project is possible. Key success factors for disease prevention and control need further study. GM production is suitable for further CCA surveillance and monitoring of the population with a high risk score in this area.
Walsh, Gregory J.; Jahns, Richard H.; Aleinikoff, John N.
2013-01-01
The bedrock geology of the 7.5-minute Nashua South quadrangle consists primarily of deformed Silurian metasedimentary rocks of the Berwick Formation. The metasedimentary rocks are intruded by a Late Silurian to Early Devonian diorite-gabbro suite, Devonian rocks of the Ayer Granodiorite, Devonian granitic rocks of the New Hampshire Plutonic Suite including pegmatite and the Chelmsford Granite, and Jurassic diabase dikes. The bedrock geology was mapped to study the tectonic history of the area and to provide a framework for ongoing hydrogeologic characterization of the fractured bedrock of Massachusetts and New Hampshire. This report presents mapping by G.J. Walsh and R.H. Jahns and zircon U-Pb geochronology by J.N. Aleinikoff. The complete report consists of a map, text pamphlet, and GIS database. The map and text pamphlet are only available as downloadable files (see frame at right). The GIS database is available for download in ESRI™ shapefile and Google Earth™ formats, and includes contacts of bedrock geologic units, faults, outcrops, structural geologic information, photographs, and a three-dimensional model.
BatMis: a fast algorithm for k-mismatch mapping.
Tennakoon, Chandana; Purbojati, Rikky W; Sung, Wing-Kin
2012-08-15
Second-generation sequencing (SGS) generates millions of reads that need to be aligned to a reference genome allowing errors. Although current aligners can efficiently map reads allowing a small number of mismatches, they are not well suited for handling a large number of mismatches. The efficiency of aligners can be improved using various heuristics, but the sensitivity and accuracy of the alignments are sacrificed. In this article, we introduce the Basic Alignment tool for Mismatches (BatMis), an efficient method to align short reads to a reference allowing k mismatches. BatMis is a Burrows-Wheeler-transform-based aligner that uses a seed-and-extend approach, and it is an exact method. Benchmark tests show that BatMis performs better than competing aligners in solving the k-mismatch problem. Furthermore, it competes favorably even against the heuristic modes of the other aligners. BatMis is a useful alternative for applications where fast k-mismatch mappings, unique mappings or multiple mappings of SGS data are required. BatMis is written in C/C++ and is freely available from http://code.google.com/p/batmis/
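For readers new to the problem, the toy function below illustrates what a k-mismatch hit means: every reference position where a read aligns with at most k substitutions. BatMis solves this exactly and efficiently with a Burrows-Wheeler index; the brute-force scan here only defines the criterion.

```python
# Hedged toy illustration of the k-mismatch mapping criterion: report every
# reference position where the read aligns with at most k substitutions.
# This is not the BatMis algorithm, just a definition of "k-mismatch hit".
def k_mismatch_hits(reference, read, k):
    hits = []
    for start in range(len(reference) - len(read) + 1):
        mismatches = 0
        for a, b in zip(reference[start:start + len(read)], read):
            if a != b:
                mismatches += 1
                if mismatches > k:
                    break
        else:
            hits.append((start, mismatches))
    return hits

print(k_mismatch_hits("ACGTACGTTACG", "ACGTT", k=1))
# [(0, 1), (4, 0)] -> positions where the read maps with <= 1 mismatch
```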
ANTP Protocol Suite Software Implementation Architecture in Python
2011-06-03
a popular platform for network programming, an area in which C has traditionally dominated. NetController, AeroRP, AeroNP, AeroNP API, AeroTP ... visualisation of the running system. For example, using the Google Maps API, the main logging web page can show all the running nodes in the system. ... communication between AeroNP and AeroRP and runs on the operating system as a daemon. Furthermore, it creates an API interface to manage the communication between
Optimizing Distributed Sensor Placement for Border Patrol Interdiction Using Microsoft Excel
2007-04-01
weather conditions, and they can be evaded by using techniques which minimize heat signatures; use of lasers and other technologies day or night (26:8) ... technologies which can be used for border security. Maier [2004] developed a seismic intrusion sensor technology which uses fiber optic cables, lasers, and ... the program originally developed by Keyhole (now part of Google Inc.) is used as the base map for the network. It provides satellite images of
NASA Astrophysics Data System (ADS)
Girvetz, E. H.; Zganjar, C.; Raber, G. T.; Hoekstra, J.; Lawler, J. J.; Kareiva, P.
2008-12-01
Now that there is overwhelming evidence of global climate change, scientists, managers and planners (i.e. practitioners) need to assess the potential impacts of climate change on particular ecological systems, within specific geographic areas, and at spatial scales they care about, in order to make better land management, planning, and policy decisions. Unfortunately, this application of climate science to real-world decisions and planning has proceeded too slowly because we lack tools for translating cutting-edge climate science and climate-model outputs into something managers and planners can work with at local or regional scales (CCSP 2008). To help increase the accessibility of climate information, we have developed a freely available, easy-to-use, web-based climate-change analysis toolbox, called ClimateWizard, for assessing how climate has changed and is projected to change at specific geographic locations throughout the world. The ClimateWizard uses geographic information systems (GIS), web services (SOAP/XML), statistical analysis platforms (e.g. R-project), and web-based mapping services (e.g. Google Earth/Maps, KML/GML) to provide a variety of different analyses (e.g. trends and departures) and outputs (e.g. maps, graphs, tables, GIS layers). Because ClimateWizard analyzes large climate datasets stored remotely on powerful computers, users of the tool do not need fast computers or expensive software, but simply access to the internet. The analysis results are then provided to users in a Google Maps webpage tailored to the specific climate-change question being asked. The ClimateWizard is not a static product, but rather a framework to be built upon and modified to suit the purposes of specific scientific, management, and policy questions. For example, it can be expanded to include bioclimatic variables (e.g. evapotranspiration) and marine data (e.g. sea surface temperature), as well as improved future climate projections and climate-change impact analyses involving hydrology, vegetation, wildfire, disease, and food security. By harnessing the power of computer and web-based technologies, the ClimateWizard puts local, regional, and global climate-change analyses in the hands of a wider array of managers, planners, and scientists.
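The trend and departure analyses mentioned above reduce, at their simplest, to fitting a least-squares line to annual means and differencing against a baseline period. The sketch below does this for one synthetic temperature series; it is not ClimateWizard code or output.

```python
# Hedged sketch of the kind of "trend" and "departure" analysis described
# above: a least-squares linear trend over annual mean temperatures for one
# grid cell or region (the data below are synthetic).
import numpy as np
from scipy import stats

years = np.arange(1951, 2007)
rng = np.random.default_rng(1)
annual_temp_c = 11.0 + 0.02 * (years - years[0]) + rng.normal(0, 0.4, years.size)

trend = stats.linregress(years, annual_temp_c)
print(f"trend: {trend.slope * 10:.2f} deg C per decade "
      f"(p = {trend.pvalue:.3f}, r^2 = {trend.rvalue ** 2:.2f})")

# Departure of the most recent decade from a 1951-1980 baseline mean.
baseline = annual_temp_c[(years >= 1951) & (years <= 1980)].mean()
departure = annual_temp_c[years >= 1997].mean() - baseline
print(f"recent departure from baseline: {departure:+.2f} deg C")
```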
GIS Application Management for Disabled People
NASA Astrophysics Data System (ADS)
Tongkaw, Sasalak
2017-08-01
This research aimed to develop and design a Geographical Information System (GIS) for assisting disabled people by presenting useful accessibility information on Google Maps. The map provides information related to types of disability, such as blindness, deafness and impaired physical movement. The research employed the Multiview 2 theory and method to plan the work and identify the problems in a real-world situation. Several data-structure design methods were used, such as Data Flow Diagrams and ER-Diagrams. The research focused on two parts: the server side and the client side, which includes the interface for the Web-based application. Clear information about disabled people on the map was useful for helping disabled people find the information they need. In addition, it provided specialized data for companies and government officers for managing and planning local facilities for disabled people in cities. Disabled people can access the system over the Internet at any time using mobile or portable devices.
Environmental asbestos exposure sources in Korea
2016-01-01
Background: Because of the long asbestos-related disease latencies (10–50 years), detection, diagnosis, and epidemiologic studies require asbestos exposure history. However, environmental asbestos exposure source (EAES) data are lacking. Objectives: To survey the available data for past EAES and supplement these data with interviews. Methods: We constructed an EAES database using a literature review and interviews of experts, former traders, and workers. Exposure sources by time period and type were visualized using a geographic information system (ArcGIS), web-based mapping (Google Maps), and OpenWeatherMap. The data were mounted in the GIS to show the exposure source location and trend. Results: The majority of asbestos mines, factories, and consumption was located in Chungnam; Gyeonggi, Busan, and Gyeongnam; and Gyeonggi, Daejeon, and Busan, respectively. Shipbuilding and repair companies were mostly located in Busan and Gyeongnam. Conclusions: These tools might help evaluate past exposure from EAES and estimate the future asbestos burden in Korea. PMID:27726756
Environmental asbestos exposure sources in Korea.
Kang, Dong-Mug; Kim, Jong-Eun; Kim, Ju-Young; Lee, Hyun-Hee; Hwang, Young-Sik; Kim, Young-Ki; Lee, Yong-Jin
2016-10-01
Because of the long asbestos-related disease latencies (10-50 years), detection, diagnosis, and epidemiologic studies require asbestos exposure history. However, environmental asbestos exposure source (EAES) data are lacking. To survey the available data for past EAES and supplement these data with interviews. We constructed an EAES database using a literature review and interviews of experts, former traders, and workers. Exposure sources by time period and type were visualized using a geographic information system (ArcGIS), web-based mapping (Google Maps), and OpenWeatherMap. The data were mounted in the GIS to show the exposure source location and trend. The majority of asbestos mines, factories, and consumption was located in Chungnam; Gyeonggi, Busan, and Gyeongnam; and Gyeonggi, Daejeon, and Busan, respectively. Shipbuilding and repair companies were mostly located in Busan and Gyeongnam. These tools might help evaluate past exposure from EAES and estimate the future asbestos burden in Korea.
Map of Life - A Dashboard for Monitoring Planetary Species Distributions
NASA Astrophysics Data System (ADS)
Jetz, W.
2016-12-01
Geographic information about biodiversity is vital for understanding the many services nature provides and their potential changes, yet remains unreliable and often insufficient. By integrating a wide range of knowledge about species distributions and their dynamics over time, Map of Life supports global biodiversity education, monitoring, research and decision-making. Built on a scalable web platform geared for large biodiversity and environmental data, Map of Life provides species range information globally and species lists for any area. With data and technology provided by NASA and Google Earth Engine, tools under development use remote sensing-based environmental layers to enable on-the-fly predictions of species distributions, range changes, and early warning signals for threatened species. The ultimate vision is a globally connected, collaborative knowledge- and tool-base for regional and local biodiversity decision-making, education, monitoring, and projection. For currently available tools, more information and to follow progress, go to MOL.org.
Occupancy Grid Map Merging Using Feature Maps
2010-11-01
Each robot begins exploring at a different starting point; once two robots can communicate, they send their odometry data, LIDAR observations, and maps ... robots [11]. Moreover, significant success has been achieved in solving SLAM problems when using hybrid maps [12 ... represents the environment by parametric features. Our method is capable of representing a LIDAR-scanned environment map in a parametric fashion. In general
The North America tapestry of time and terrain
Barton, Kate E.; Howell, David G.; Vigil, Jose F.
2003-01-01
The North America Tapestry of Time and Terrain (1:8,000,000 scale) is a product of the US Geological Survey in the I-map series (I-2781). This map was prepared in collaboration with the Geological Survey of Canada and the Mexican Consejo Recursos de Minerales. This cartographic Tapestry is woven from a geologic map and a shaded relief image. This digital combination reveals the geologic history of North America through the interrelation of rock type, topography and time. Regional surface processes as well as continent-scale tectonic events are exposed in the three dimensions of space and the fourth dimension, geologic time. The large map shows the varying age of bedrock underlying North America, while four smaller maps show the distribution of four principal types of rock: sedimentary, volcanic, plutonic and metamorphic.This map expands the original concept of the 2000 Tapestry of Time and Terrain, by José F. Vigil, Richard J. Pike and David G. Howell, which covered the conterminous United States. The U.S. Tapestry poster and website have been popular in classrooms, homes, and even the Google office building, and we anticipate the North America Tapestry will have a similarly wide appeal, and to a larger audience.
Technology and Information Tool Preferences of Academics in the Field of Anaesthesiology
Akkaya, Akcan; Bilgi, Murat; Demirhan, Abdullah; Kurt, Adem Deniz; Tekelioğlu, Ümit Yaşar; Akkaya, Kadir; Koçoğlu, Hasan; Tekçe, Hikmet
2014-01-01
Objective: Researchers use a large number of information technology tools from the beginning of a scientific study until its publication. The aim of this study is to investigate the technology and data-processing tool preferences of academics who produce scientific publications in the field of anaesthesiology. Methods: A multiple-choice survey, including 18 questions regarding the use of technology, was performed to assess the preferences of academics. Results: PubMed was the most preferred article search portal, and the second was Google Scholar. Medscape was the most preferred website for tracking medical innovations. Only 12% of academics obtained a clinical trial registration number for their randomized clinical research. In total, 28% of respondents used the Consolidated Standards of Reporting Trials checklist in their clinical trials. Of all participants, 21% used Dropbox and 9% used Google Drive for sharing files. Google Chrome was the most preferred internet browser (32.25%) for academic purposes. English language editing services were obtained from the Scribendi (21%) and Textcheck (12%) websites. Half of the academics received statistical help from a specialist through a personal relationship, 27% did the statistics themselves, and 24% obtained professional assistance. Sixty percent of the participants did not use a reference editing program, and 21% used EndNote. Nine percent of the academics spent money on article writing, with a mean cost of 1287 Turkish Liras/year. Conclusion: Academics in the field of anaesthesiology benefit significantly from technology and informatics tools to produce scientific publications. PMID:27366448
The excitement of Google Scholar, the worry of Google Print
Banks, Marcus A
2005-01-01
In late 2004 Google announced two major projects, the unveiling of Google Scholar and a major expansion of the Google Print digitization program. Both projects have generated discussion within the library and research communities, and Google Print has received significant media attention. This commentary describes exciting educational possibilities stimulated by Google Scholar, and argues for caution regarding the Google Print project. PMID:15784147
Usability evaluation of mobile applications using ISO 9241 and ISO 25062 standards.
Moumane, Karima; Idri, Ali; Abran, Alain
2016-01-01
This paper presents an empirical study based on a set of measures to evaluate the usability of mobile applications running on different mobile operating systems, including Android, iOS and Symbian. The aim is to empirically evaluate a framework that we developed on the use of the Software Quality Standard ISO 9126 in mobile environments, especially its usability characteristic. To do so, 32 users participated in the experiment, and we used the ISO 25062 and ISO 9241 standards for objective measures by working with two widely used mobile applications: Google Apps and Google Maps. The QUIS 7.0 questionnaire was used to collect measures assessing the users' level of satisfaction with these two mobile applications. By analyzing the results, we highlight a set of mobile usability issues, related to the hardware as well as to the software, that need to be taken into account by designers and developers in order to improve the usability of mobile applications.
3D Immersive Visualization with Astrophysical Data
NASA Astrophysics Data System (ADS)
Kent, Brian R.
2017-01-01
We present the refinement of a new 3D immersion technique for astrophysical data visualization. Methodology to create 360 degree spherical panoramas is reviewed. The 3D software package Blender, coupled with Python and the Google Spatial Media module, is used to create the final data products. Data can be viewed interactively with a mobile phone or tablet or in a web browser. The technique can be applied to different kinds of astronomical data, including 3D stellar and galaxy catalogs, images, and planetary maps.
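For readers unfamiliar with 360 degree spherical panoramas, the sketch below shows the underlying equirectangular mapping from view directions to panorama pixels. It is an illustrative aside only, not the authors' Blender/Google Spatial Media pipeline; the panorama dimensions are assumed values.

```python
import numpy as np

def direction_to_equirectangular(dirs, width=4096, height=2048):
    """Map unit view directions (x, y, z) to pixel coordinates in an
    equirectangular (360 x 180 degree) panorama. Illustrative only."""
    x, y, z = dirs[:, 0], dirs[:, 1], dirs[:, 2]
    lon = np.arctan2(y, x)                  # longitude in [-pi, pi]
    lat = np.arcsin(np.clip(z, -1.0, 1.0))  # latitude in [-pi/2, pi/2]
    u = (lon / (2 * np.pi) + 0.5) * (width - 1)
    v = (0.5 - lat / np.pi) * (height - 1)
    return np.column_stack([u, v])

# Example: the +x view direction lands at the centre of the panorama.
print(direction_to_equirectangular(np.array([[1.0, 0.0, 0.0]])))
```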
Dynamic prescription maps for site-specific variable rate irrigation of cotton
USDA-ARS?s Scientific Manuscript database
A prescription map is a set of instructions that controls a variable rate irrigation (VRI) system. These maps, which may be based on prior yield, soil texture, topography, or soil electrical conductivity data, are often manually applied at the beginning of an irrigation season and remain static. The...
27 CFR 9.194 - San Antonio Valley.
Code of Federal Regulations, 2011 CFR
2011-04-01
... significance. (b) Approved Maps. The appropriate maps for determining the boundary of the San Antonio Valley...) Hames Valley, California, 1949, photorevised 1978; (2) Tierra Redonda Mountain, California, 1949... southeast corner of section 14, T23S, R9E, on the Hames Valley map; (2) From the beginning point, proceed...
Assessing Geographic Knowledge with Sketch Maps.
ERIC Educational Resources Information Center
Wise, Naomi; Kon, Jane Heckley
1990-01-01
Maintains that comparison of students' sketch maps at the beginning and end of the year can provide information on how students' representations of the world change. Describes a study from the California International Studies Project (CISP) that provides an easy method for sorting and summarizing sketch map data. Illustrates the method with…
NASA Astrophysics Data System (ADS)
Lilly, M. R.; Feditova, A.; Levine, K.; Giardino, J. R.
2017-12-01
The Harris County Flood Control District has an impressive amount of information available for the public related to flood management and response. During Hurricane Harvey, this information was used by the authors to help address daily questions from family and friends living in the Houston area. Common near-real-time reporting data included precipitation and water levels. Maps included locations of data stations, stream or bayou conditions (in bank, out of bank) and watershed or drainage boundaries. In general, the data station reporting and online information was updating well throughout the hurricane and post-flooding period. Only a few of the data reporting stations had problems with water level sensor measurements. The overall information was helpful to hydrologists and floodplain managers. The online information could not easily answer all common questions residents may have during a flood event. Some of the more common questions were how to use the water-level information to know the potential extent of flooding and relative location of flooding to the location of residents. To help address the questions raised during the flooding on how to use the available water level data, we used Google Earth to get lot and intersection locations to help show the relative differences between nearby water-level stations and residences of interest. The reported resolution of the Google Earth elevation data is 1-foot. To help confirm the use of this data, we compared Google Earth approximate elevations with reported Harris County Floodplain Reference Mark individual reports. This method helped verify we could use the Google Earth information for approximate comparisons. We also faced questions on what routes to take if evacuation was needed, and where to go to get to higher ground elevations. Google Earth again provided a helpful and easy to use interface to look at road and intersection elevations and develop suggested routes for family and friends to take to avoid low areas that may be subject to flooding. These and other recommendations that helped answer common questions by residents reacting to the hurricane and subsequent flooding conditions are summarized with examples.
Marek, Lukáš; Tuček, Pavel; Pászto, Vít
2015-01-28
Visual analytics aims to connect the processing power of information technologies with the user's capacity for logical thinking and reasoning through complex visual interaction. Moreover, most data contain a spatial component, so the need for geovisual tools and methods arises. One can either develop a custom system, in which case dissemination of findings and usability may be problematic, or build on a widespread, well-known platform. The aim of this paper is to demonstrate the applicability of Google Earth™ software as a tool for geovisual analytics that helps to understand the spatio-temporal patterns of disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation. We analysed the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, visualised in the form of bubble charts; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics, employed to identify clusters of municipalities with high or low rates. The final data are stored in Keyhole Markup Language files and visualised in Google Earth™ in order to apply geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or relying on numerical statistics alone, we created a set of interactive visualisations to explore and communicate the results of the analyses to a wider audience. The geovisual analytics identified periodic patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk. We show that Google Earth™ is a usable tool for geovisual analysis of disease distribution. Google Earth™ has many indisputable advantages (it is widespread, freely available, and intuitive, with space-time visualisation capabilities and animations, and supports communication of results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data into a form suitable for the geovisual analytics itself.
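As an illustrative aside, the minimal sketch below writes time-stamped placemarks to a KML file that Google Earth can animate with its time slider. The place names, coordinates and incidence values are hypothetical; the study's actual KML was generated from its kriged surfaces and cluster results.

```python
# Minimal sketch: write weekly incidence records as time-stamped KML
# placemarks viewable in Google Earth. All records below are hypothetical.
records = [
    {"name": "Olomouc, week 14", "lon": 17.251, "lat": 49.594,
     "when": "2010-04-05", "incidence": 12.3},
    {"name": "Brno, week 14", "lon": 16.608, "lat": 49.195,
     "when": "2010-04-05", "incidence": 8.7},
]

placemarks = []
for r in records:
    placemarks.append(f"""  <Placemark>
    <name>{r['name']}</name>
    <TimeStamp><when>{r['when']}</when></TimeStamp>
    <description>Weekly incidence per 100,000: {r['incidence']}</description>
    <Point><coordinates>{r['lon']},{r['lat']},0</coordinates></Point>
  </Placemark>""")

kml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
       '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
       + "\n".join(placemarks) + "\n</Document>\n</kml>\n")

with open("incidence.kml", "w", encoding="utf-8") as f:
    f.write(kml)
```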
NASA Astrophysics Data System (ADS)
de Paor, D. G.; Whitmeyer, S. J.; Gobert, J.
2009-12-01
We previously reported on innovative techniques for presenting data on virtual globes such as Google Earth using emergent Collada models that reveal subsurface geology and geophysics. We here present several new and enhanced models and linked lesson plans to aid deployment in undergraduate geoscience courses, along with preliminary results from our assessment of their effectiveness. The new Collada models are created with Google SketchUp, Bonzai3D, and MeshLab software, and are grouped to cover (i) small-scale field mapping areas; (ii) regional-scale studies of the North Atlantic Ocean Basin, the Appalachian Orogen, and the Pacific Ring of Fire; and (iii) global-scale studies of terrestrial planets, moons, and asteroids. Enhancements include emergent block models with three-dimensional surface topography; models that conserve structural orientation data; interactive virtual specimens; models that animate plate movements on the virtual globe; exploded 3-D views of planetary mantles and cores; and server-generated dynamic KML. We tested volunteer students and professors using Silverback monitoring software, think-aloud verbalizations, and questionnaires designed to assess their understanding of the underlying geo-scientific phenomena. With the aid of a cohort of instructors across the U.S., we are continuing to assess areas in which users encounter difficulties with both the software and geoscientific concepts. Preliminary results suggest that it is easy to overestimate the computer expertise of novice users even when they are content knowledge experts (i.e., instructors), and that a detailed introduction to virtual globe manipulation is essential before moving on to geoscience applications. Tasks that seem trivial to developers may present barriers to non-technical users, and technicalities that challenge instructors may block adoption in the classroom. We have developed new models using the Google Earth API, which permits enhanced interaction and dynamic feedback, and are assessing their relative merits versus the Google Earth application. Overall, test students and professors value the models very highly. There are clear pedagogical opportunities for using materials such as these to create engaging in-course research opportunities for undergraduates.
Assessing the methods needed for improved dengue mapping: a SWOT analysis
Attaway, David Frost; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Waters, Nigel M
2014-01-01
Introduction Dengue fever, a mosquito-borne viral infection, is a growing threat to human health in tropical and subtropical areas worldwide. There is a demand from public officials for maps that capture the current distribution of dengue and maps that analyze risk factors to predict the future burden of disease. Methods To identify relevant articles, we searched Google Scholar, PubMed, BioMed Central, and WHOLIS (World Health Organization Library Database) for published articles with a specific set of dengue criteria between January 2002 and July 2013. Results After evaluating the currently available dengue models, we identified four key barriers to the creation of high-quality dengue maps: (1) data limitations related to the expense of diagnosing and reporting dengue cases in places where health information systems are underdeveloped; (2) issues related to the use of socioeconomic proxies in places with limited dengue incidence data; (3) mosquito ranges which may be changing as a result of climate changes; and (4) the challenges of mapping dengue events at a variety of scales. Conclusion An ideal dengue map will present endemic and epidemic dengue information from both rural and urban areas. Overcoming the current barriers requires expanded collaboration and data sharing by geographers, epidemiologists, and entomologists. Enhanced mapping techniques would allow for improved visualizations of dengue rates and risks. PMID:25328585
Human papillomavirus and oral cancer: a primer for dental public health professionals.
Chattopadhyay, A; Weatherspoon, D; Pinto, A
2015-06-01
There is strong evidence for a causal association between human papillomavirus (HPV) and cervical cancer, and evidence of an association between HPV and oropharyngeal cancer is beginning to mount. To review the HPV-oral cancer literature for a comprehensive assessment of the issues involved. Literature search conducted using PubMed, Google Scholar and the Google search engine. Both available HPV vaccines are efficacious and safe, although expensive. Policy for mandatory HPV vaccination for cervical cancer prevention is mired in political issues stemming from a negative cost-effectiveness balance. Dental professionals are not ready to discuss the role of the HPV vaccine in cancer prevention. This review discusses the impact of HPV on cervical cancer, transmission of HPV among humans, the impact of HPV on oral health and its plausible role in oral and oropharyngeal cancer, prevention of HPV transmission, available vaccines against HPV, testing, cost, policy and use of HPV vaccines internationally, and dentists' readiness for HPV-associated health communication. Given the mounting literature on the association between HPV and oropharyngeal cancer, the dental community must be prepared to answer patients' HPV-related questions and to educate patients about the role of HPV as a risk factor for oral and oropharyngeal cancers.
Public health preparedness for the impact of global warming on human health.
Wassel, John J
2009-01-01
To assess the changes in weather and weather-associated disturbances related to global warming; the impact on human health of these changes; and the public health preparedness mandated by this impact. Qualitative review of the literature. Articles will be obtained by searching PubMed database, Google, and Google Scholar search engines using terms such as "global warming," "climate change," "human health," "public health," and "preparedness." Sixty-seven journal articles were reviewed. The projections and signs of global environmental changes are worrisome, and there are reasons to believe that related information may have been conservatively interpreted and presented in the recent past. Although the challenges are great, there are many opportunities for devising beneficial solutions at individual, community, and global levels. It is essential for public health professionals to become involved in advocating for change at all of these levels, as well as through professional organizations. We must begin "greening" our own lives and clinical practice, and start talking about these issues with patients. As we build walkable neighborhoods, change methods of energy production, and make water use and food production and distribution more sustainable, the benefits to improved air quality, a stabilized climate, social support, and individual and community health will be dramatic.
Cropland Capture: A Game to Improve Global Cropland through Crowdsourcing
NASA Astrophysics Data System (ADS)
Fritz, Steffen; Sturn, Tobias; See, Linda; Perger, Christoph; Schill, Christian; McCallum, Ian; Schepaschenko, Dmitry; Karner, Mathias; Dueruer, Martina; Kraxner, Florian; Obersteiner, Michael
2014-05-01
Accurate and reliable global cropland extent maps are essential for estimating and forecasting crop yield, in particular losses due to drought and production anomalies. Major questions surrounding energy futures and environmental change (EU and US biofuel target setting, determination of greenhouse gas emissions, REDD initiatives, and implications of climate change on crop production and productivity patterns) also require reliable information on the spatial distribution of cropland as well as crop types. Although global land cover maps identify cropland (which exists as one or more land cover categories), this information is currently not accurate enough for many applications. There are several ways of improving current cropland extent through hybrid approaches and by integrating information collected through Geo-Wiki (a global crowdsourcing platform) from very high resolution imagery such as that found on Google Earth. Another way of getting improved cropland extent maps would be to classify all very high resolution images found on Google Earth and to create a wall-to-wall map of cropland. This is a very ambitious task that would require a large number of individuals, like that found in massive multiplayer online games. For this reason we have developed a game called 'Cropland Capture'. The game can be played on a desktop, on a tablet (iPad or Android) or on a mobile phone (iPhone or Android), and the game mechanics are very simple. The player is provided with a satellite image or in-situ photo and must determine whether the image contains cropland or not. The game was launched in the middle of November 2013 and will run for 6 months, after which the weekly winners will be entered into a draw to win large prizes. To date we have collected more than 2.5 million areas, and we will continue to expand the sample to more locations around the world. Eventually the data will be used to calibrate and validate a new version of our global cropland map, the latest version of which is available from http://beta-hybrid.geo-wiki.org. If we find, however, that a large number of people participate in the game, we will aim to make wall-to-wall cropland maps for those countries where no national maps exist. This paper will present an overview of the game and a summary of the crowdsourced data from the game, including information about quality and user performance. If successful, this gaming approach could be used to gather information about other land cover types in the future in order to improve global land cover information more generally.
Embracing Open Software Development in Solar Physics
NASA Astrophysics Data System (ADS)
Hughitt, V. K.; Ireland, J.; Christe, S.; Mueller, D.
2012-12-01
We discuss two ongoing software projects in solar physics that have adopted best practices of the open source software community. The first, the Helioviewer Project, is a powerful data visualization tool which includes online and Java interfaces inspired by Google Maps (tm). This effort allows users to find solar features and events of interest, and download the corresponding data. Having found data of interest, the user now has to analyze it. The dominant solar data analysis platform is an open-source library called SolarSoft (SSW). Although SSW itself is open-source, the programming language used is IDL, a proprietary language with licensing costs that are prohibitive for many institutions and individuals. SSW is composed of a collection of related scripts written by missions and individuals for solar data processing and analysis, without any consistent data structures or common interfaces. Further, at the time when SSW was initially developed, many of the best software development processes of today (mirrored and distributed version control, unit testing, continuous integration, etc.) were not standard, and have not since been adopted. The challenges inherent in developing SolarSoft led to a second software project known as SunPy. SunPy is an open-source Python-based library which seeks to create a unified solar data analysis environment including a number of core datatypes such as Maps, Lightcurves, and Spectra which have consistent interfaces and behaviors. By taking advantage of the large and sophisticated body of scientific software already available in Python (e.g. SciPy, NumPy, Matplotlib), and by adopting many of the best practices refined in open-source software development, SunPy has been able to develop at a very rapid pace while still ensuring a high level of reliability. The Helioviewer Project and SunPy represent two pioneering technologies in solar physics - simple yet flexible data visualization and a powerful, new data analysis environment. We discuss the development of both these efforts and how they are beginning to influence the solar physics community.
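As a minimal illustration of the unified Map datatype mentioned above (a sketch assuming SunPy and its sample-data package are installed; the sample file is downloaded on first use):

```python
# Minimal SunPy sketch: load a sample AIA image into the Map datatype,
# read its standardized metadata, and make a quick-look plot.
import sunpy.map
import sunpy.data.sample

aia = sunpy.map.Map(sunpy.data.sample.AIA_171_IMAGE)  # load an AIA 171 A image
print(aia.date, aia.instrument, aia.dimensions)       # consistent metadata access
aia.peek()                                            # quick-look plot via matplotlib
```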
Fourth international circumpolar arctic vegetation mapping workshop
Raynolds, Martha K.; Markon, C.J.
2002-01-01
During the week of April 10, 2001, the Fourth International Circumpolar Arctic Vegetation Mapping Workshop was held in Moscow, Russia. The purpose of this meeting was to bring together the vegetation scientists working on the Circumpolar Arctic Vegetation Map (CAVM) to (1) review the progress of current mapping activities, (2) discuss and agree upon a standard set of arctic tundra subzones, (3) plan for the production and dissemination of a draft map, and (4) begin work on a legend for the final map.
Extra-terra incognita: Martian maps in the digital age.
Messeri, Lisa
2017-02-01
Science and technology studies (STS) and critical cartography are both asking questions about the ontological fixity of maps and other scientific objects. This paper examines how a group of NASA computer scientists who call themselves The Mapmakers conceptualizes and creates maps in service of different commitments. The maps under construction are those of alien Mars, produced through partnerships that NASA has established with Google and Microsoft. With the goal of bringing an experience of Mars to as many people as possible, these maps influence how we imagine our neighbouring planet. This paper analyzes two attributes of the map, evident in both its representation and the attending cartographic practices: a sense of Mars as dynamic and a desire for a democratic experience of Mars in which up-to-date Mars data can be intuitively accessed not only by scientists but by lay users as well. Whereas a democratic Mars promises users the ability to decide how to interact with the map and understand Mars, dynamic Mars imposes a more singular sense of Mars as a target of continued robotic and maybe even human exploration. Because maps of Mars have a different (and arguably less complex) set of social and political commitments than those of Earth, they help us see how different goals contradict and complement each other in matters of exploration and state-craft relevant both to other worlds and our own.
The Cellular Automata for modelling of spreading of lava flow on the earth surface
NASA Astrophysics Data System (ADS)
Jarna, A.
2012-12-01
Volcanic risk assessment is a very important scientific, political and economic issue in densely populated areas close to active volcanoes. Development of effective tools for early prediction of a potential volcanic hazard and for management of crises is paramount. To date, volcanic hazard maps represent the most appropriate way to illustrate the geographical area that can potentially be affected by a volcanic event. Volcanic hazard maps are usually produced by mapping out old volcanic deposits; however, dynamic lava flow simulation is gaining popularity and can give crucial information to corroborate other methodologies. The methodology used here for the generation of volcanic hazard maps is based on numerical simulation of eruptive processes using the principle of Cellular Automata (CA). The Python script is integrated into ArcToolbox in ArcMap (ESRI), and the user can select several input and output parameters which influence surface morphology, size and shape of the flow, flow thickness, flow velocity and length of lava flows. Once the input parameters are selected, the software computes and generates hazard maps on the fly. The results can be exported to Google Maps (.kml format) to visualize the results of the computation. Data from a real lava flow are used to validate the simulation code. A comparison of the simulation results with real lava flows mapped from satellite images will be presented.
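A toy cellular-automaton sketch of lava spreading over a synthetic DEM is given below. The grid size, effusion rate and steepest-descent rule are illustrative assumptions, not the ArcToolbox tool described in the abstract.

```python
# Toy cellular automaton: at each step, every cell holding lava sheds half of
# it to its lowest neighbour (DEM + lava surface), while the vent keeps
# erupting. Grid, vent location and rates are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
dem = np.cumsum(rng.normal(0, 1, (60, 60)), axis=0)  # synthetic sloping terrain
lava = np.zeros_like(dem)
vent = (5, 30)

for step in range(200):
    lava[vent] += 1.0                                 # constant effusion at the vent
    surface = dem + lava
    new_lava = lava.copy()
    for i in range(1, 59):
        for j in range(1, 59):
            if lava[i, j] <= 0:
                continue
            window = surface[i - 1:i + 2, j - 1:j + 2]
            di, dj = np.unravel_index(np.argmin(window), window.shape)
            if window[di, dj] < surface[i, j]:        # a lower neighbour exists
                moved = 0.5 * lava[i, j]
                new_lava[i, j] -= moved
                new_lava[i + di - 1, j + dj - 1] += moved
    lava = new_lava

print("cells covered by lava:", int((lava > 0.01).sum()))
```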
The U.S. Geological Survey mapping and cartographic database activities, 2006-2010
Craun, Kari J.; Donnelly, John P.; Allord, Gregory J.
2011-01-01
The U.S. Geological Survey (USGS) began systematic topographic mapping of the United States in the 1880s, beginning with scales of 1:250,000 and 1:125,000 in support of geological mapping. Responding to the need for higher resolution and more detail, the 1:62,500-scale, 15-minute topographic map series was begun at the start of the 20th century. Finally, in the 1950s the USGS adopted the 1:24,000-scale, 7.5-minute topographic map series to portray even more detail, completing coverage of the conterminous 48 states of the United States with this series in 1992. In 2001, the USGS developed the vision and concept of The National Map, a topographic database for the 21st century and the source for a new generation of topographic maps (http://nationalmap.gov/). In 2008, initial production of those maps began with a 1:24,000-scale digital product. In a separate but related project, the USGS began scanning the existing inventory of historical topographic maps at all scales to accompany the new topographic maps. The USGS also developed a digital database of The National Atlas of the United States. The digital version of the Atlas is now available on the Web and supports a mapping engine for small-scale maps of the United States and North America. These three efforts define the topographic mapping activities of the USGS during the last few years and are discussed below.
Can Satellite Remote Sensing be Applied in Geological Mapping in Tropics?
NASA Astrophysics Data System (ADS)
Magiera, Janusz
2018-03-01
Remote sensing (RS) techniques are based on spectral data registered by RS scanners as energy reflected from the Earth's surface or emitted by it. In "geological" RS the reflectance (or emittance) should come from rock or sediment. The problem in tropical and subtropical areas is dense vegetation: the spectral response from rocks and sediments is gathered only from the gaps among the trees and shrubs. High-resolution images are therefore particularly valuable here. The new generation of satellites and scanners (Digital Globe WV2, WV3 and WV4) yields imagery with a spatial resolution of 2 m and up to 16 spectral bands (WV3). Images acquired by Landsat (TM, ETM+, OLI) and Sentinel 2 also have good spectral resolution (6-12 bands in the visible and infrared) and, despite lower spatial resolution (10-60 m pixel size), are useful for extracting lithological information. A lithological RS map may achieve good precision (down to a single rock or outcrop of metre size). Supplemented with analysis of a Digital Elevation Model and high-resolution orthophotomaps (Google Maps, Bing etc.), it allows quick and cheap mapping of unsurveyed areas.
Curtis, Andrew; Buritica, Paola; Ajayakumar, Jayakrishnan; Squires, Robert; Dávalos, Diana; Pacheco, Robinson; Bhatta, Madhav P.; James, Mark A.
2017-01-01
Background Cali, Colombia has experienced chikungunya and Zika outbreaks and hypoendemic dengue. Studies have explained Cali’s dengue patterns but lack the sub-neighborhood-scale detail investigated here. Methods Spatial-video geonarratives (SVG) with Ministry of Health officials and Community Health Workers were collected in hotspots, providing perspective on perceptions of why dengue, chikungunya and Zika hotspots exist, impediments to control, and social outcomes. Using spatial video and Google Street View, sub-neighborhood features possibly contributing to incidence were mapped to create risk surfaces, later compared with dengue, chikungunya and Zika case data. Results SVG captured insights in 24 neighborhoods. Trash and water risks in Calipso were mapped using SVG results. Perceived risk factors included proximity to standing water, canals, poverty, invasions, localized violence and military migration. These risks overlapped case density maps and identified areas that are suitable for transmission but are possibly underreporting to the surveillance system. Conclusion The resulting risk maps with local context could be leveraged to increase vector-control efficiency, targeting key areas of environmental risk. PMID:28767730
NASA Astrophysics Data System (ADS)
Bouiflane, Mustapha; Manar, Ahmed; Medina, Fida; Youbi, Nasrrddine; Rimi, Abdelkrim
2017-06-01
A high-resolution aeromagnetic survey was carried out in the Anti-Atlas, Morocco, covering the main areas traversed by the Great CAMP Foum Zguid dyke (FZD). This 'doleritic' dyke belongs to the Central Atlantic Magmatic Province (CAMP), a Large Igneous Province associated with the fragmentation of the supercontinent Pangaea and the initial stages of rifting of the Central Atlantic Ocean. It also coincides in time with the mass extinction at the Triassic-Jurassic boundary. Based on the study of geological maps and Google Earth satellite images, it appears that the FZD is poorly exposed and often covered by Quaternary deposits. This work proposes aeromagnetic modelling and interpretation of the FZD in order to better constrain its structural extent. The data have allowed (i) mapping of the dyke over great distances, under the Quaternary deposits and through areas where it was poorly characterized on the geological map; (ii) identifying major tectonic lineaments interpreted as faults; (iii) recognizing magnetic anomalies related to mafic intrusive bodies; and (iv) informing about the regional structural context.
Virtual GEOINT Center: C2ISR through an avatar's eyes
NASA Astrophysics Data System (ADS)
Seibert, Mark; Tidbal, Travis; Basil, Maureen; Muryn, Tyler; Scupski, Joseph; Williams, Robert
2013-05-01
As the number of devices collecting and sending data in the world increases, finding ways to visualize and understand that data is becoming more and more of a problem, often described as the problem of "Big Data." The Virtual Geoint Center (VGC) aims to aid in solving that problem by providing a way to combine the use of the virtual world with outside tools. Using open-source software such as OpenSim and Blender, the VGC uses a visually stunning 3D environment to display the data sent to it. The VGC consists of two major components: the Kinect Minimap and the Geoint Map. The Kinect Minimap uses the Microsoft Kinect and its open-source software to make a miniature display of the people the Kinect detects in front of it. The Geoint Map collects smartphone sensor information from online databases and displays it in real time on a map generated by Google Maps. By combining outside tools and the virtual world, the VGC can help a user "visualize" data, and provide additional tools to "understand" the data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Susandi, Armi, E-mail: armi@meteo.itb.ac.id; Tamamadin, Mamad, E-mail: mamadtama@meteo.itb.ac.id; Djamal, Erizal, E-mail: erizal-jamal@yahoo.com
This paper describes an information system of the rice planting calendar to help farmers determine the time for rice planting. The information includes a rainfall prediction on a ten-day (dasarian) scale overlaid on a map of rice fields to produce a map of rice planting at the village level. The rainfall prediction was produced by stochastic modeling using Fast Fourier Transform (FFT) and non-linear least squares methods to fit the curve of the function to the rainfall data. In this research, the Fourier series has been modified to become a non-linear function in order to follow the recent characteristics of rainfall, which is non-stationary. The results have also been validated in 4 steps, including R-square, RMSE, R-skill, and comparison with field data. The information system (cyber extension) provides information such as the rainfall prediction, the prediction of the planting time, and an interactive space for farmers to respond to the information submitted. Interfaces for interactive response will be critical to improving the prediction accuracy of the information, for both rainfall and planting time. The method used to build this information system includes mapping the rice planting prediction, converting the file format, developing the database system, developing the website, and publishing the website. Because this map is overlaid on Google Maps, the map files must be converted to the .kml file format.
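As an illustrative aside, the sketch below fits a seasonal harmonic to synthetic ten-day (dasarian) rainfall totals with non-linear least squares and extrapolates one year ahead. The model form and the data are assumptions, not the study's calibrated model.

```python
# Sketch: fit a seasonal harmonic plus trend to ten-day rainfall totals and
# extrapolate it forward, in the spirit of the Fourier-plus-nonlinear-least-
# squares approach described above. Data and model form are synthetic.
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(36 * 5)                      # five years of 36 dasarian periods
rng = np.random.default_rng(1)
rain = 80 + 60 * np.sin(2 * np.pi * t / 36 - 1.0) + rng.normal(0, 15, t.size)

def harmonic(t, mean, amp1, phase1, amp2, phase2, trend):
    """Annual + semi-annual harmonics plus a linear trend term."""
    w = 2 * np.pi / 36.0
    return (mean + trend * t
            + amp1 * np.sin(w * t + phase1)
            + amp2 * np.sin(2 * w * t + phase2))

params, _ = curve_fit(harmonic, t, rain, p0=[80, 50, 0, 10, 0, 0])
future = np.arange(t[-1] + 1, t[-1] + 37)  # predict the next year
print(np.round(harmonic(future, *params), 1))
```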
A Different Web-Based Geocoding Service Using Fuzzy Techniques
NASA Astrophysics Data System (ADS)
Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.
2015-12-01
Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the concepts of proximity and nearness are not modelled appropriately, and these services search for an address only by address matching based on descriptive data. There are also some limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system was designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, to search for places based on their location, non-point representation of results, and display of search results based on their priority.
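A minimal sketch of the fuzzy distance-map idea is shown below: two fuzzy "nearness" surfaces are built on a toy grid and combined with a fuzzy AND (minimum) overlay. The grid, the membership function and the reference places are hypothetical, not the paper's implementation.

```python
# Sketch: fuzzy "nearness" surfaces around two reference places, combined with
# a fuzzy AND (minimum) overlay to rank candidate locations. Illustrative only.
import numpy as np

ny, nx = 200, 200
yy, xx = np.mgrid[0:ny, 0:nx]

def nearness(px, py, half_distance=40.0):
    """Fuzzy membership that decays with Euclidean distance from (px, py)."""
    d = np.hypot(xx - px, yy - py)
    return 1.0 / (1.0 + (d / half_distance) ** 2)

near_station = nearness(60, 80)     # e.g. "near the railway station"
near_park = nearness(140, 120)      # e.g. "near the city park"

combined = np.minimum(near_station, near_park)   # fuzzy AND overlay
best = np.unravel_index(np.argmax(combined), combined.shape)
print("best matching cell (row, col):", best,
      "membership:", round(float(combined[best]), 3))
```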
The Status of Topographic Mapping in the World a Unggim-Isprs Project 2012-2015
NASA Astrophysics Data System (ADS)
Konecny, G.; Breitkopf, U.; Radtke, A.
2016-06-01
In December 2011, UNGGIM initiated a cooperative project with ISPRS to resume the former UN Secretariat studies on the status of topographic mapping in the world, conducted between 1968 and 1986. After the design of a questionnaire with 27 questions, the UNGGIM Secretariat sent the questionnaires to the UN member states. 115 replies were received from the 193 member states and regions thereof. Regarding global data coverage and age, the UN questionnaire survey was supplemented by data from the Eastview database. For each of the 27 questions, an interactive viewer was programmed permitting analysis of the results. The authoritative data coverage at the various scale ranges has greatly increased between 1986 and 2012: 30 % map data coverage at 1 : 25 000 and 75 % coverage at 1 : 50 000 have now been completed. Nevertheless, there is still an updating problem, as data for some countries are 10 to 30 years old. Private industry, with Google, Microsoft and navigation system providers, has undertaken huge efforts to supplement authoritative mapping. For critical areas of the globe, MGCP committed to military mapping at 1 : 50 000. ISPRS has decided to make such surveys a sustainable activity by establishing a working group.
Wang, Jie; Xiao, Xiangming; Qin, Yuanwei; Dong, Jinwei; Zhang, Geli; Kou, Weili; Jin, Cui; Zhou, Yuting; Zhang, Yao
2015-05-12
As farmland systems vary over space and time (season and year), accurate and updated maps of paddy rice are needed for studies of food security and environmental problems. We selected a wheat-rice double-cropped area from fragmented landscapes along the rural-urban complex (Jiangsu Province, China) and explored the potential utility of integrating time series optical images (Landsat-8, MODIS) and radar images (PALSAR) in mapping paddy rice planting areas. We first identified several main types of non-cropland land cover and then identified paddy rice fields by selecting pixels that were inundated only during paddy rice flooding periods. These key temporal windows were determined based on MODIS Land Surface Temperature and vegetation indices. The resultant paddy rice map was evaluated using regions of interest (ROIs) drawn from multiple high-resolution images, Google Earth, and in-situ cropland photos. The estimated overall accuracy and Kappa coefficient were 89.8% and 0.79, respectively. In comparison with the National Land Cover Data (China) from 2010, the resultant map better detected changes in the paddy rice fields and revealed more details about their distribution. These results demonstrate the efficacy of using images from multiple sources to generate paddy rice maps for two-crop rotation systems.
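As an illustrative aside, the toy sketch below applies the flooding-window logic described above: a pixel is retained as paddy rice only if a water signal exceeds the vegetation signal during the transplanting window and at no other time. The arrays, index names and the 0.05 offset are assumptions for illustration, not the paper's calibrated values.

```python
# Toy version of the flooding-based rice logic: keep pixels flooded only
# during the transplanting window; reject permanently flooded or never
# flooded pixels. All values here are illustrative.
import numpy as np

dates = 12
in_window = np.zeros(dates, dtype=bool)
in_window[4:7] = True                      # assumed transplanting/flooding period

# Three example pixels: rice paddy (floods only in the window), permanent
# water (always flooded), dryland crop (never flooded).
evi = np.full((dates, 3), 0.45)            # vegetation index time series
lswi = np.full((dates, 3), 0.10)           # water-sensitive index time series
lswi[4:7, 0] = 0.55                        # paddy pixel flooded during window
lswi[:, 1] = 0.60                          # open-water pixel flooded year-round

flooded = (lswi + 0.05) >= evi             # per-date flooding flag
flood_in = flooded[in_window].any(axis=0)
flood_out = flooded[~in_window].any(axis=0)
paddy_rice = flood_in & ~flood_out
print(paddy_rice)                          # expected: [ True False False ]
```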
Google Scholar: The 800-Pound Gorilla in the Room
ERIC Educational Resources Information Center
Shapiro, Steven
2012-01-01
There is a "clash of civilizations" going on in the information field--a clash characterized by a brash upstart, Google, and its attendant creations, Google Scholar and Google Books, and the old guard represented by the library world. Librarians who deprecate Google Scholar or simply ignore the Google phenomenon do so at their own risk. Google…
Huh, Sun
2013-04-01
The Korean Journal of Urology began to be published exclusively in English in 2010 and is indexed in PubMed Central/PubMed. This study analyzed a variety of citation indicators of the Korean Journal of Urology before and after 2010 to clarify the present position of the journal among the urology category journals. The impact factor, SCImago Journal Rank (SJR), impact index, Z-impact factor (ZIF, impact factor excluding self-citation), and Hirsch Index (H-index) were referenced or calculated from Web of Science, Scopus, SCImago Journal & Country Ranking, Korean Medical Citation Index (KoMCI), KoreaMed Synapse, and Google Scholar. Both the impact factor and the total citations rose rapidly beginning in 2011. The 2012 impact factor corresponded to the upper 84.9% in the nephrology-urology category, whereas the 2011 SJR was in the upper 58.5%. The ZIF in KoMCI was one fifth of the impact factor because there are only two other urology journals in KoMCI. Up to 2009, more than half of the citations in the Web of Science were from Korean researchers, but from 2010 to 2012, more than 85% of the citations were from international researchers. The H-indexes from Web of Science, Scopus, KoMCI, KoreaMed Synapse, and Google Scholar were 8, 10, 12, 9, and 18, respectively. The strategy of the language change in 2010 was successful from the perspective of citation indicators. The values of the citation indicators will continue to increase rapidly and consistently as the research achievement of authors of the Korean Journal of Urology increases.
The ethics of Google Earth: crossing thresholds from spatial data to landscape visualisation.
Sheppard, Stephen R J; Cizek, Petr
2009-05-01
'Virtual globe' software systems such as Google Earth are growing rapidly in popularity as a way to visualise and share 3D environmental data. Scientists and environmental professionals, many of whom are new to 3D modeling and visual communications, are beginning routinely to use such techniques in their work. While the appeal of these techniques is evident, with unprecedented opportunities for public access to data and collaborative engagement over the web, are there nonetheless risks in their widespread usage when applied in areas of the public interest such as planning and policy-making? This paper argues that the Google Earth phenomenon, which features realistic imagery of places, cannot be dealt with only as a question of spatial data and geographic information science. The virtual globe type of visualisation crosses several key thresholds in communicating scientific and environmental information, taking it well beyond the realm of conventional spatial data and geographic information science, and engaging more complex dimensions of human perception and aesthetic preference. The realism, perspective views, and social meanings of the landscape visualisations embedded in virtual globes invoke not only cognition but also emotional and intuitive responses, with associated issues of uncertainty, credibility, and bias in interpreting the imagery. This paper considers the types of risks as well as benefits that may exist with participatory uses of virtual globes by experts and lay-people. It is illustrated with early examples from practice and relevant themes from the literature in landscape visualisation and related disciplines such as environmental psychology and landscape planning. Existing frameworks and principles for the appropriate use of environmental visualisation methods are applied to the special case of widely accessible, realistic 3D and 4D visualisation systems such as Google Earth, in the context of public awareness-building and agency decision-making on environmental issues. Relevant principles are suggested which lend themselves to much-needed evaluation of risks and benefits of virtual globe systems. Possible approaches for balancing these benefits and risks include codes of ethics, software design, and metadata templates.
CERT Research Annual Report 2009
2009-01-01
Domain Name System (DNS), which maps names to IP addresses, is a vital component of the Internet. Nearly every transaction on the Internet begins by...many different ASNs (Autonomous System Numbers, which map to Internet Service Providers) there are. If there are more than 20, then it is extremely...functions, that is, mappings from their domains to ranges, or inputs to outputs. These mappings are pre-defined as a starting point for the FX
McGough, Sarah F.; Brownstein, John S.; Hawkins, Jared B.; Santillana, Mauricio
2017-01-01
Background Over 400,000 people across the Americas are thought to have been infected with Zika virus as a consequence of the 2015–2016 Latin American outbreak. Official government-led case count data in Latin America are typically delayed by several weeks, making it difficult to track the disease in a timely manner. Thus, timely disease tracking systems are needed to design and assess interventions to mitigate disease transmission. Methodology/Principal Findings We combined information from Zika-related Google searches, Twitter microblogs, and the HealthMap digital surveillance system with historical Zika suspected case counts to track and predict estimates of suspected weekly Zika cases during the 2015–2016 Latin American outbreak, up to three weeks ahead of the publication of official case data. We evaluated the predictive power of these data and used a dynamic multivariable approach to retrospectively produce predictions of weekly suspected cases for five countries: Colombia, El Salvador, Honduras, Venezuela, and Martinique. Models that combined Google (and Twitter data where available) with autoregressive information showed the best out-of-sample predictive accuracy for 1-week ahead predictions, whereas models that used only Google and Twitter typically performed best for 2- and 3-week ahead predictions. Significance Given the significant delay in the release of official government-reported Zika case counts, we show that these Internet-based data streams can be used as timely and complementary ways to assess the dynamics of the outbreak. PMID:28085877
Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation.
Boulos, Maged N Kamel; Blanchard, Bryan J; Walker, Cory; Montero, Julio; Tripathy, Aalap; Gutierrez-Osuna, Ricardo
2011-07-26
This paper covers the use of depth sensors such as Microsoft Kinect and ASUS Xtion to provide a natural user interface (NUI) for controlling 3-D (three-dimensional) virtual globes such as Google Earth (including its Street View mode), Bing Maps 3D, and NASA World Wind. The paper introduces the Microsoft Kinect device, briefly describing how it works (the underlying technology by PrimeSense), as well as its market uptake and application potential beyond its original intended purpose as a home entertainment and video game controller. The different software drivers available for connecting the Kinect device to a PC (Personal Computer) are also covered, and their comparative pros and cons briefly discussed. We survey a number of approaches and application examples for controlling 3-D virtual globes using the Kinect sensor, then describe Kinoogle, a Kinect interface for natural interaction with Google Earth, developed by students at Texas A&M University. Readers interested in trying out the application on their own hardware can download a Zip archive (included with the manuscript as additional files 1, 2, &3) that contains a 'Kinnogle installation package for Windows PCs'. Finally, we discuss some usability aspects of Kinoogle and similar NUIs for controlling 3-D virtual globes (including possible future improvements), and propose a number of unique, practical 'use scenarios' where such NUIs could prove useful in navigating a 3-D virtual globe, compared to conventional mouse/3-D mouse and keyboard-based interfaces.
Using Google Location History to track personal exposure to air pollution
NASA Astrophysics Data System (ADS)
Marais, E. A.; Wiedinmyer, C.
2017-12-01
Big data is increasingly used in air pollution research to monitor air quality and develop mitigation strategies. Google Location History provides an archive of geolocation and time information from mobile devices that can be used to track personal exposure to air pollution. Here we demonstrate the utility of Google Location History for assessing the true exposure of individuals to air pollution hazardous to human health in an increasingly mobile world. We use the GEOS-Chem chemical transport model at coarse resolution (2° × 2.5°; latitude × longitude) to calculate surface concentrations of fine particulate matter (PM2.5) and ozone, sample them at the time and location of each of six volunteers over 2 years (June 2015 to May 2017), and compare this to the annual mean PM2.5 and ozone estimated at their postal addresses. The latter is analogous to Global Burden of Disease studies that use a static population distribution map. We find that mobile PM2.5 exposure is higher than static PM2.5 exposure for most (five out of six) volunteers and can lead to a 10% increase in the estimated risk of ischemic heart disease and stroke mortality. The difference may be larger if a high-resolution CTM or a dense air quality monitoring network is used instead. There is tremendous potential to exploit geolocation and time data from mobile devices for cohort health studies and to determine best practices for limiting personal exposure to air pollution.
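The sketch below illustrates the mobile-versus-static comparison on synthetic data: a coarse gridded PM2.5 field is sampled at timestamped locations and at a fixed home address. The grid, concentrations and location track are stand-ins for the GEOS-Chem output and Google Location History used in the study.

```python
# Sketch: compare "mobile" exposure (grid sampled along a timestamped track)
# with "static" exposure (the home-address grid cell). All values synthetic.
import numpy as np
import pandas as pd

lat_edges = np.arange(-90, 90.1, 2.0)          # 2-degree latitude bins
lon_edges = np.arange(-180, 180.1, 2.5)        # 2.5-degree longitude bins
rng = np.random.default_rng(4)
pm25 = rng.uniform(5, 60, (lat_edges.size - 1, lon_edges.size - 1))  # ug/m3

track = pd.DataFrame({
    "time": pd.date_range("2016-01-01", periods=1000, freq="6h"),
    "lat": 51.5 + rng.normal(0, 0.8, 1000),
    "lon": -0.1 + rng.normal(0, 1.2, 1000),
})

def sample(lat, lon):
    """Look up the PM2.5 grid cell containing each (lat, lon) pair."""
    i = np.clip(np.digitize(lat, lat_edges) - 1, 0, pm25.shape[0] - 1)
    j = np.clip(np.digitize(lon, lon_edges) - 1, 0, pm25.shape[1] - 1)
    return pm25[i, j]

mobile = sample(track["lat"].to_numpy(), track["lon"].to_numpy()).mean()
static = sample(np.array([51.5]), np.array([-0.1]))[0]   # home address only
print(f"mobile exposure {mobile:.1f} ug/m3 vs static {static:.1f} ug/m3")
```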
Landsat Based Woody Vegetation Loss Detection in Queensland, Australia Using the Google Earth Engine
NASA Astrophysics Data System (ADS)
Johansen, K.; Phinn, S. R.; Taylor, M.
2014-12-01
Land clearing detection and woody Foliage Projective Cover (FPC) monitoring at the state and national level in Australia has mainly been undertaken by state governments and the Terrestrial Ecosystem Research Network (TERN) because of the considerable expense, expertise, sustained duration of activities and staffing levels needed. Only recently have services become available, providing low budget, generalized access to change detection tools suited to this task. The objective of this research was to examine if a globally available service, Google Earth Engine Beta, could be used to predict woody vegetation loss with accuracies approaching the methods used by TERN and the government of the state of Queensland, Australia. Two change detection approaches were investigated using Landsat Thematic Mapper time series and the Google Earth Engine Application Programming Interface: (1) CART and Random Forest classifiers; and (2) a normalized time series of Foliage Projective Cover (FPC) and NDVI combined with a spectral index. The CART and Random Forest classifiers produced high user's and producer's mapping accuracies of clearing (77-92% and 54-77%, respectively) when detecting change within epochs for which training data were available, but extrapolation to epochs without training data reduced the mapping accuracies. The use of FPC and NDVI time series provided a more robust approach for calculation of a clearing probability, as it did not rely on training data but instead on the difference of the normalized FPC / NDVI mean and standard deviation of a single year at the change point in relation to the remaining time series. However, the FPC and NDVI time series approach represented a trade-off between user's and producer's accuracies. Both change detection approaches explored in this research were sensitive to ephemeral greening and drying of the landscape. However, the developed normalized FPC and NDVI time series approach can be tuned to provide automated alerts for large woody vegetation clearing events by selecting suitable thresholds to identify very likely clearing. This research provides a comprehensive foundation to build further capacity to use globally accessible, free, online image datasets and processing tools to accurately detect woody vegetation clearing in an automated and rapid manner.
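One reasonable reading of the normalized time-series test described above is sketched below on synthetic data: each year's NDVI is scored against the mean and spread of the preceding years, and the strongest negative anomaly is flagged as the likely clearing year. The series, the z-score form and the scoring window are illustrative assumptions, not the calibrated Queensland workflow.

```python
# Sketch: score each year's NDVI against the mean and standard deviation of
# the years before it; a large negative anomaly suggests woody clearing.
import numpy as np

years = np.arange(1990, 2012)
rng = np.random.default_rng(5)
ndvi = 0.65 + rng.normal(0, 0.03, years.size)   # stable woody cover
ndvi[years >= 2003] -= 0.25                     # simulated clearing from 2003 onward

def anomaly(series, idx):
    """Z-score of year idx relative to the years before it."""
    prior = series[:idx]
    return (series[idx] - prior.mean()) / prior.std()

scores = np.array([anomaly(ndvi, i) for i in range(5, years.size)])
candidate = years[5:][np.argmin(scores)]
print("most anomalous (likely clearing) year:", candidate)
print("anomaly score:", round(float(scores.min()), 1))
```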
Global Analysis of River Planform Change using the Google Earth Engine
NASA Astrophysics Data System (ADS)
Bryk, A.; Dietrich, W. E.; Gorelick, N.; Sargent, R.; Braudrick, C. A.
2014-12-01
Geomorphologists have historically tracked river dynamics using a combination of maps, aerial photographs, and the stratigraphic record. Although stratigraphic records can extend into deep time, maps and aerial photographs often confine our record of change to sparse measurements over the last ~80 years, and in some cases much less time. For the first time, Google's Earth Engine (GEE) cloud-based platform gives researchers the means to analyze quantitatively the pattern and pace of river channel change over the last 30 years with high temporal resolution across the entire planet. The GEE provides an application programming interface (API) that enables quantitative analysis of various data sets, including the entire Landsat L1T archive. This allows change detection for channels wider than about 150 m over 30 years of successive, georeferenced imagery. Qualitatively, it becomes immediately evident that the pace of channel morphodynamics for similar planforms varies by orders of magnitude across the planet and downstream along individual rivers. To quantify these rates of change and to explore their controls, we have developed methods for differentiating channels from floodplain along large alluvial rivers. We introduce a new metric of morphodynamics: the ratio of eroded area to channel area per unit time, referred to as "M". We also keep track of depositional areas resulting from channel shifting. To date, our quantitative analysis has focused on rivers in the Andean foreland. Our analysis shows that channel bank erosion rates, M, vary by orders of magnitude for these rivers, from 0 to ~0.25 yr-1, yet these rivers have essentially identical curvature and sinuosity and are visually indistinguishable. By tracking both bank paths in time, we find that, for some meandering rivers, a significant fraction of new floodplain is produced through outer-bank accretion rather than point bar deposition. This process is perhaps more important in generating floodplain stratigraphy than previously recognized. These initial findings indicate a new set of quantitative observations will emerge to further test and advance morphodynamic theory. The Google Earth Engine offers the opportunity to explore river morphodynamics on an unprecedented scale and provides a powerful tool for addressing fundamental questions in river morphodynamics.
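A minimal sketch of the metric M is given below: two binary channel masks a decade apart yield the area newly eroded into the floodplain, which is divided by the channel area and the time interval. The masks are synthetic stand-ins for channels classified from Landsat scenes.

```python
# Sketch of the morphodynamic metric M: newly eroded (bank-erosion) area as a
# fraction of channel area per year, from two binary channel masks. The masks
# are synthetic; in practice they would come from classified Landsat imagery.
import numpy as np

ny, nx = 200, 400
channel_1990 = np.zeros((ny, nx), dtype=bool)
channel_2000 = np.zeros((ny, nx), dtype=bool)
channel_1990[90:110, :] = True                 # straight channel belt in 1990
channel_2000[100:120, :] = True                # channel has shifted south by 2000

eroded = channel_2000 & ~channel_1990          # floodplain newly occupied by the channel
deposited = channel_1990 & ~channel_2000       # abandoned channel (accretion/deposition)
dt_years = 10.0

M = eroded.sum() / channel_1990.sum() / dt_years
print(f"M = {M:.3f} per year; "
      f"deposited fraction = {deposited.sum() / channel_1990.sum():.2f}")
```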
Improving fieldwork by using GIS for quantitative exploration, data management and digital mapping
NASA Astrophysics Data System (ADS)
Marra, Wouter; Alberti, Koko; van de Grint, Liesbeth; Karssenberg, Derek
2016-04-01
Fieldwork is an essential part of teaching geosciences. The essence of fieldwork is to study natural phenomena in their proper context. Fieldwork predominantly utilizes a learning-by-experiencing style and is often light on abstract thinking skills. We introduce more of the latter skills to a first-year fieldwork of several weeks by using Geographical Information Systems (GIS). We use simple techniques, as the students involved had no prior experience with GIS. In our project, we introduced new tutorials prior to the fieldwork in which students explored their research area using aerial photos, satellite images, an elevation model and a slope map in Google Earth and QGIS. The goal of these tutorials was to get acquainted with the area, plan the first steps of the fieldwork, and formulate hypotheses in the form of a preliminary map based on quantitative data. During the actual fieldwork, half of the students processed and managed their field data using GIS, used elevation data as an additional data source, and made digital geomorphological maps. This was in contrast to the other half of the students, who used classic techniques with paper maps. We evaluated the learning benefits with two questionnaires (one before and one after the fieldwork) and a group interview with students who used GIS in the field. Students liked the use of Google Earth and GIS, and many indicated the added value of using quantitative maps. The hypotheses and fieldwork plans of the students were quickly superseded by insights during the fieldwork itself, but making these plans and hypotheses in advance improved the students' ability to perform empirical research. Students were very positive towards the use of GIS for their fieldwork, mainly because they experienced it as a modern and relevant technique for research and the labour market. Tech-savvy students were extra motivated and explored additional methods. There were some minor technical difficulties with using GIS during the fieldwork, but these can be solved by focussing the preparatory tutorials on what to expect during the fieldwork. We did not observe a significant difference in the quality of the products created by the two groups, since both digital and classic maps showed a large range of aesthetic and scientific quality. To conclude, we had a positive experience with our first attempt to add GIS components to a classic fieldwork. The main benefit is that students use quantitative data, which provides a different view of the fieldwork area and triggers abstract thinking. Future plans include using the students' field data in a web-GIS app to allow easy remote supervision and using digital maps in the field.
Taking advantage of Google's Web-based applications and services.
Brigham, Tara J
2014-01-01
Google is a company that is constantly expanding and growing its services and products. While most librarians have a "love/hate" relationship with Google, there are a number of reasons to consider exploring some of the tools Google has created and made freely available. Applications and services such as Google Docs, Slides, and Google+ are functional and dynamic without the cost of comparable products. This column addresses some of the issues users should be aware of before signing up to use Google's tools, describes some of Google's Web applications and services, and explains how they can be useful to librarians in health care.
Emergency Response Damage Assessment using Satellite Remote Sensing Data
NASA Astrophysics Data System (ADS)
Clandillon, Stephen; Yésou, Hervé; Schneiderhan, Tobias; de Boissezon, Hélène; de Fraipont, Paul
2013-04-01
During disasters, rescue and relief organisations need quick access to reliable and accurate information to be better equipped to do their job. It is increasingly felt that satellites offer a unique near-real-time (NRT) tool to aid disaster management. A short introduction to the International Charter 'Space and Major Disasters', in operation since 2000 and promoting worldwide cooperation among member space agencies, will be given, as it is the foundation on which satellite-based, emergency-response damage assessment has been built. Other complementary mechanisms will also be discussed. The user access and triggering mechanism, an essential component of this user-driven service, will be highlighted with its 24/7 single access point. Then, a clear distinction will be made between data provision and geo-information delivery mechanisms to underline the user need for geo-information that is easily integrated into their working environments. Briefly, the path to assured emergency-response product quality will be presented, beginning with user requirements, expressed early on, for emergency-response value-adding services. Initiatives were then established, supported by national and European institutions, to develop the sector, with SERTIT and DLR being key players, providing support to decision makers in headquarters and relief teams in the field. To consistently meet the high quality levels demanded by users, rapid mapping has been transformed via workflow and quality-control standardisation to improve both speed and quality. As such, SERTIT, located in Alsace, France, and DLR/ZKI, from Bavaria, Germany, join their knowledge in this presentation to report on recent standards, as both have ISO-certified their rapid mapping services based on experienced, well-trained, 24/7 on-call teams and established systems providing the first crisis analysis product within 6 hours of satellite data reception. The three main product types provided are then outlined: up-to-date pre-event reference maps, disaster extent maps, and damage assessment or intensity/grading maps. With Google and open-source information the need for reference maps has diminished, but not altogether, as damage extent and assessment products also require coherent reference geo-information which often has to be produced internally. Increasingly, users need up-to-date, highly detailed, customised products; it is in damage assessment that an operator's working environment, geomatic skills and experience can often provide the highest levels of value-adding while adapting to user requests. Accordingly, DLR and SERTIT are involved in R&D work integrating data, e.g. TerraSAR-X and Pléiades sources plus simulated Sentinel data, which have interesting emergency mapping capacities. Their close interaction with the research sector is essential to remain at the cutting edge of the field, implementing effective and efficient analysis methods. Future R&D challenges to further improve the quality of the damage mapping service will be highlighted. Finally, this presentation will show some practical examples and thus how, at present, space-based rapid mapping, with more than 10 years of experience, can provide, if rapidly programmed and acquired, geo-information linked to disaster extent and damage assessment from overview scales down to street level, with an ever increasing array of satellite data sources.
MapMyFlu: visualizing spatio-temporal relationships between related influenza sequences
Nolte, Nicholas; Kurzawa, Nils; Eils, Roland; Herrmann, Carl
2015-01-01
Understanding the molecular dynamics of viral spreading is crucial for anticipating the epidemiological implications of disease outbreaks. In the case of influenza, reassortments or point mutations affect adaptation to new hosts or resistance to anti-viral drugs and can determine whether a new strain will result in a pandemic infection or a less severe progression. To this end, tools integrating molecular information with epidemiological parameters are important for understanding how molecular characteristics are reflected in infection dynamics. We present a new web tool, MapMyFlu, which allows users to display, spatially and temporally, influenza viruses related to a query sequence on a Google Map, based on BLAST results against the NCBI Influenza Database. Temporal and geographical trends appear clearly and may help in reconstructing the evolutionary history of a particular sequence. The tool is accessible through a web server, hence without the need for local installation. The website has an intuitive design, provides an easy-to-use service, and is available at http://mapmyflu.ipmb.uni-heidelberg.de PMID:25940623
NASA Soil Moisture Mission Produces First Global Radar Map
2015-04-21
With its antenna now spinning at full speed, NASA's new Soil Moisture Active Passive (SMAP) observatory has successfully re-tested its science instruments and generated its first global maps, a key step toward beginning routine science operations in May 2015.
NASA Soil Moisture Mission Produces First Global Radiometer Map
2015-04-21
With its antenna now spinning at full speed, NASA's new Soil Moisture Active Passive (SMAP) observatory has successfully re-tested its science instruments and generated its first global maps, a key step toward beginning routine science operations in May 2015.
Gao, Su-qing; Wang, Zhen; Gao, Hong-wei; Liu, Peng; Wang, Ze-rui; Li, Yan-li; Zhu, Xu-guang; Li, Xin-lou; Xu, Bo; Li, Yin-jun; Yang, Hong; de Vlas, Sake J.; Shi, Tao-xing; Cao, Wu-chun
2013-01-01
Background For years, emerging infectious diseases have appeared worldwide and threatened people's health. The emergence and spread of an infectious-disease outbreak are usually unforeseen, sudden and uncertain. Timely understanding of basic information in the field, and the collection and analysis of epidemiological information, is helpful for making rapid decisions and responding to an infectious-disease emergency. Therefore, it is necessary to have an unobstructed channel and a convenient tool for the collection and analysis of epidemiologic information in the field. Methodology/Principal Findings Baseline information for each county in mainland China was collected, and a database was established by geo-coding information on a digital map of county boundaries throughout the country. Google Maps was used to display geographic information and to conduct map-related calculations, and the 3G wireless network was used to transmit information collected in the field to the server. This study established a decision support system for the response to infectious-disease emergencies based on WebGIS and mobile services (DSSRIDE). The DSSRIDE provides functions including data collection, communication and analysis in real time, epidemiological detection, the provision of customized epidemiological questionnaires and guides for handling infectious-disease emergencies, and the querying of professional knowledge in the field. These functions could be helpful for epidemiological investigations in the field and the handling of infectious-disease emergencies. Conclusions/Significance The DSSRIDE provides a geographic information platform based on the Google Maps application programming interface to display information on infectious-disease emergencies, and it transfers information between workers in the field and decision makers through wireless transmission based on personal computers, mobile phones and personal digital assistants. After two years of practice and application in infectious-disease emergencies, the DSSRIDE has become a useful platform for field investigations carried out by response sections and individuals. The system is suitable for use in developing countries and low-income districts. PMID:23372780
Dynamic Flood Vulnerability Mapping with Google Earth Engine
NASA Astrophysics Data System (ADS)
Tellman, B.; Kuhn, C.; Max, S. A.; Sullivan, J.
2015-12-01
Satellites capture the rate and character of environmental change from local to global levels, yet integrating these changes into flood exposure models can be cost or time prohibitive. We explore an approach to global flood modeling by leveraging satellite data with computing power in Google Earth Engine to dynamically map flood hazards. Our research harnesses satellite imagery in two main ways: first to generate a globally consistent flood inundation layer, and second to dynamically model flood vulnerability. Accurate and relevant hazard maps rely on high-quality observation data. Advances in publicly available spatial, spectral, and radar data, together with cloud computing, allow us to improve existing efforts to develop a comprehensive flood extent database to support model training and calibration. This talk will demonstrate the classification results of algorithms developed in Earth Engine designed to detect flood events by combining observations from MODIS, Landsat 8, and Sentinel-1. Our method of deriving flood footprints increases the number, resolution, and precision of spatial observations for flood events both in the US, recorded in the NCDC (National Climatic Data Center) storm events database, and globally, as recorded in the Colorado Flood Observatory database. This improved dataset can then be used to train machine learning models that relate spatio-temporal flood observations to satellite-derived spatio-temporal predictor variables such as precipitation, antecedent soil moisture, and impervious surface. This modeling approach allows us to rapidly update models with each new flood observation, providing near-real-time vulnerability maps. We will share the water detection algorithms used with each satellite and discuss flood detection results with examples from Bihar, India, and the state of New York. We will also demonstrate how these flood observations are used to train machine learning models and estimate flood exposure. The final stage of our comprehensive approach to flood vulnerability couples inundation extent with social data to determine which flood-exposed communities have the greatest propensity for loss. Specifically, we link model outputs to census-derived social vulnerability estimates (Indian and US, respectively) to predict how many people are at risk.
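A hedged sketch of the kind of per-sensor water detection the talk describes, written with the Earth Engine Python API, is shown below. The thresholds, dates, area of interest and the simple OR of the two masks are illustrative assumptions, not the authors' algorithm.

```python
# Illustrative optical + radar water detection in Earth Engine (not the authors' method).
import ee
ee.Initialize()

aoi = ee.Geometry.Rectangle([85.0, 25.5, 86.0, 26.5])  # hypothetical extent in Bihar, India
start, end = '2015-08-01', '2015-08-31'

# Optical water: MNDWI (green vs SWIR) from a Landsat 8 median composite.
l8 = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
      .filterBounds(aoi).filterDate(start, end).median())
mndwi_water = l8.normalizedDifference(['SR_B3', 'SR_B6']).gt(0)

# Radar water: low VV backscatter from a Sentinel-1 median composite.
s1 = (ee.ImageCollection('COPERNICUS/S1_GRD')
      .filterBounds(aoi).filterDate(start, end)
      .filter(ee.Filter.listContains('transmitterReceiverPolarisation', 'VV'))
      .select('VV').median())
sar_water = s1.lt(-16)  # dB threshold, assumed

flood_mask = mndwi_water.Or(sar_water).rename('flood')
print(flood_mask.reduceRegion(ee.Reducer.sum(), aoi, 100, maxPixels=1e9).getInfo())
```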
Harvesting rockfall hazard evaluation parameters from Google Earth Street View
NASA Astrophysics Data System (ADS)
Partsinevelos, Panagiotis; Agioutantis, Zacharias; Tripolitsiotis, Achilles; Steiakakis, Chrysanthos; Mertikas, Stelios
2015-04-01
Rockfall incidents along highways and railways prove extremely dangerous for property, infrastructure and human lives. Several qualitative metrics, such as the Rockfall Hazard Rating System (RHRS) and the Colorado Rockfall Hazard Rating System (CRHRS), have been established to estimate rockfall potential and provide risk maps in order to control and monitor rockfall incidents. The implementation of such metrics for efficient and reliable risk modeling requires accurate knowledge of multi-parametric attributes such as the geological, geotechnical and topographic parameters of the study area. The Missouri Rockfall Hazard Rating System (MORH RS) identifies the most potentially problematic areas using digital video logging for the determination of parameters like slope height and angle, face irregularities, etc. This study aims to harvest, in a semi-automated approach, geometric and qualitative measures through open source platforms that provide 3-dimensional views of the areas of interest. More specifically, the Street View platform from Google Maps is used here to provide essential information that can support 3-dimensional reconstruction of slopes along highways. The potential of image capturing along a programmable virtual route to provide the input data for photogrammetric processing is also evaluated. Moreover, qualitative characterization of the geological and geotechnical status, based on the Street View images, is performed. These attributes are then integrated to deliver a GIS-based rockfall hazard map. The 3-dimensional models are compared to actual photogrammetric measurements in a rockfall-prone area in Crete, Greece, while in-situ geotechnical characterization is also used to compare and validate the hazard risk. This work is considered a first step towards the exploitation of open source platforms to improve road safety and the development of an operational system in which authorized agencies (i.e., civil protection) will be able to acquire near-real-time hazard maps based on video images retrieved from open source platforms, operational unmanned aerial vehicles, and/or simple video recordings from users. This work has been performed under the framework of the "Cooperation 2011" project ISTRIA (11_SYN_9_13989) funded by the Operational Program "Competitiveness and Entrepreneurship" (co-funded by the European Regional Development Fund (ERDF)) and managed by the Greek General Secretariat for Research and Technology.
HapMap filter 1.0: a tool to preprocess the HapMap genotypic data for association studies.
Zhang, Wei; Duan, Shiwei; Dolan, M Eileen
2008-05-13
The International HapMap Project provides a resource of genotypic data on single nucleotide polymorphisms (SNPs), which can be used in various association studies to identify the genetic determinants of phenotypic variation. Prior to association studies, the HapMap dataset should be preprocessed in order to reduce computation time and control the multiple testing problem. Less informative SNPs, including those with a very low genotyping rate and those with rare minor allele frequencies in one or more populations, are removed. Some research designs only use SNPs in a subset of HapMap cell lines. Although the HapMap website and other association software packages have provided some basic tools for optimizing these datasets, a fast and user-friendly program to generate filtered genotypic data would be beneficial for association studies. Here, we present a flexible, straightforward bioinformatics program that can be useful in preparing HapMap genotypic data for association studies by specifying cell lines and two common filtering criteria: minor allele frequency and genotyping rate. The software was developed for Microsoft Windows and written in C++. The Windows executable and source code in Microsoft Visual C++ are available at Google Code (http://hapmap-filter-v1.googlecode.com/) or upon request. Their distribution is subject to the GNU General Public License v3.
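The two filtering criteria named above (minor allele frequency and genotyping rate) can be illustrated with a small, illustrative-only Python re-implementation; the input layout and thresholds below are assumptions, not the HapMap Filter 1.0 file format.

```python
# Toy SNP filter by genotyping (call) rate and minor allele frequency.
# Input layout is assumed: rows = SNPs, columns = selected cell lines,
# values like 'AA', 'AG', 'GG' or NaN for missing calls.
import pandas as pd

def filter_hapmap(genotypes: pd.DataFrame, min_maf=0.05, min_call_rate=0.95):
    keep = []
    for snp, calls in genotypes.iterrows():
        observed = calls.dropna()
        call_rate = len(observed) / len(calls)
        alleles = pd.Series(list(''.join(observed)))   # split 'AG' -> 'A', 'G'
        maf = alleles.value_counts(normalize=True).min() if alleles.nunique() > 1 else 0.0
        if call_rate >= min_call_rate and maf >= min_maf:
            keep.append(snp)
    return genotypes.loc[keep]
```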
Solid-State Recorders Enhance Scientific Data Collection
NASA Technical Reports Server (NTRS)
2010-01-01
Under Small Business Innovation Research (SBIR) contracts with Goddard Space Flight Center, SEAKR Engineering Inc., of Centennial, Colorado, crafted a solid-state recorder (SSR) to replace the tape recorder onboard a Spartan satellite carrying NASA's Inflatable Antenna Experiment. Work for that mission and others has helped SEAKR become the world leader in SSR technology for spacecraft. The company has delivered more than 100 systems, more than 85 of which have launched onboard NASA, military, and commercial spacecraft including imaging satellites that provide much of the high-resolution imagery for online mapping services like Google Earth.
NASA Astrophysics Data System (ADS)
Ray, S. E.; Fetzer, E. J.; Lambrigtsen, B.; Olsen, E. T.; Licata, S. J.; Hall, J. R.; Penteado, P. F.; Realmuto, V. J.; Thrastarson, H. T.; Teixeira, J.; Granger, S. L.; Behrangi, A.; Farahmand, A.
2017-12-01
The Atmospheric Infrared Sounder (AIRS) has been returning daily global observations of Earth's atmospheric constituents and properties since 2002. With its 15-year data record and near-real-time capability, AIRS data are being used in the development of applications that fall within many of the NASA Applied Science focus areas. An automated alert system for volcanic plumes has been developed that triggers on threshold breaches of SO2, ash and dust in granules of AIRS data. The system generates a suite of granule-scale maps that depict both plume and clouds, all accessible from the AIRS web site. Alerts are sent to a curated list of volcano community members, and links to views in NASA Worldview and Google Earth are also available. Seasonal influenza epidemics are a major public health concern, with millions of cases of severe illness and a large economic impact. Recent studies have highlighted the role of absolute or specific humidity as a likely player in the seasonal nature of these outbreaks. A quasi-operational influenza outbreak prediction system has been developed based on the SIRS model, which uses AIRS and NCEP humidity data, Centers for Disease Control and Prevention reports on flu and flu-like illnesses, and results from Google Flu Trends. Work is underway to account for spatial diffusion in addition to the temporal spreading of influenza. The US Drought Monitor (USDM) is generated weekly by the National Drought Mitigation Center (NDMC) and is used by policymakers for drought decision-making. AIRS data have demonstrated utility in monitoring the development and detection of meteorological drought with both AIRS-derived standardized vapor pressure deficit and standardized relative humidity, showing early detection lead times of up to two months. An agreement was secured with the NDMC to begin a trial period using AIRS products in the production of the USDM, and in July 2017 the operational delivery of weekly CONUS AIRS images of relative humidity, surface air temperature, and vapor pressure deficit to the National Drought Mitigation Center commenced. Next objectives include determining whether AIRS drought products can also be useful in the NDMC's VegDRI and QuickDRI products. This poster provides an overview of the work being done in these three application areas and summarizes additional application efforts using data from AIRS.
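For readers unfamiliar with the SIRS framework mentioned above, the minimal sketch below shows the basic susceptible-infectious-recovered-susceptible bookkeeping with a hypothetical humidity-dependent transmission rate. It is not the quasi-operational system described in the abstract; every parameter value and the form of beta(q) are illustrative assumptions.

```python
# Minimal SIRS model with an assumed humidity-modulated transmission rate.
import numpy as np

def sirs_step(S, I, R, beta, gamma=1/4.0, xi=1/365.0, N=1.0, dt=1.0):
    """One explicit-Euler day of the SIRS equations."""
    new_inf = beta * S * I / N * dt
    new_rec = gamma * I * dt
    new_sus = xi * R * dt              # loss of immunity, R -> S
    return S - new_inf + new_sus, I + new_inf - new_rec, R + new_rec - new_sus

def beta_from_humidity(q, beta_min=0.8, beta_max=2.0, a=180.0):
    """Assumed exponential dependence of transmission on specific humidity q (kg/kg)."""
    return beta_min + (beta_max - beta_min) * np.exp(-a * q)

S, I, R = 0.999, 0.001, 0.0
humidity_cycle = 0.004 + 0.004 * np.cos(2 * np.pi * np.arange(365) / 365)  # toy seasonal q
for q in humidity_cycle:
    S, I, R = sirs_step(S, I, R, beta_from_humidity(q))
print(f"infectious fraction after one simulated year: {I:.4f}")
```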
ERIC Educational Resources Information Center
Columbus State Community Coll., OH.
This document contains materials developed for and about the environmental technology tech prep program of the South-Western City Schools in Ohio. Part 1 begins with a map of the program, which begins with an environmental science technology program in grades 11 and 12 that leads to entry-level employment or a 2-year environmental technology…
Bright Beginnings. WWC Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2009
2009-01-01
Bright Beginnings is an early childhood curriculum, based in part on High/Scope[R] and Creative Curriculum[R], with an additional emphasis on literacy skills. The curriculum consists of nine thematic units designed to enhance children's cognitive, social, emotional, and physical development, and each unit includes concept maps, literacy lessons,…
Automotive Diagnostic Technologies.
ERIC Educational Resources Information Center
Columbus State Community Coll., OH.
This document contains materials developed for and about the automotive diagnostic technologies tech prep program of the South-Western City Schools in Ohio. Part 1 begins with a map of the program, which begins with an automotive/diagnostic technologies program in grades 11 and 12 that leads to entry-level employment or a 2-year automotive…
NASA Astrophysics Data System (ADS)
Yang, D.; Fu, C. S.; Binford, M. W.
2017-12-01
The southeastern United States has high landscape heterogeneity, with heavily managed forestlands, highly developed agricultural lands, and multiple metropolitan areas. Human activities are transforming and altering land patterns and structures in both negative and positive ways. A land-use map at the regional scale is a heavy computational task but is critical to most landowners, researchers, and decision makers, enabling them to make informed decisions for varying objectives. There are two major difficulties in generating classification maps at the regional scale: the necessity of large training point sets, and the expensive computational cost, in terms of both money and time, of classifier modeling. Volunteered Geographic Information (VGI) opens a new era in mapping and visualizing our world: the platform is open for collecting valuable georeferenced information from volunteer citizens, and the data is freely available to the public. As one of the best-known VGI initiatives, OpenStreetMap (OSM) contributes not only road network distribution, but also the potential to use its data to support land cover and land use classifications. Google Earth Engine (GEE) is a platform designed for cloud-based mapping with robust and fast computing power. Most large-scale and national mapping approaches confuse "land cover" and "land use", or build up the land-use database from modeled land cover datasets. Unlike most other large-scale approaches, we distinguish and differentiate land use from land cover. By focusing on our prime objective of mapping land use and management practices, a robust regional land-use mapping approach is developed by incorporating the OpenStreetMap dataset into Earth observation remote sensing imagery instead of the often-used land cover base maps.
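A hedged sketch of the general workflow, not the authors' pipeline, is shown below: OSM-derived land-use polygons are used to sample training points, and a classifier is fit to Landsat 8 spectral bands in Earth Engine. The asset ID, class labels and band choices are assumptions.

```python
# Illustrative supervised land-use classification in Earth Engine using
# OSM-derived training polygons (hypothetical asset); not the authors' method.
import ee
ee.Initialize()

aoi = ee.Geometry.Rectangle([-84.0, 29.0, -81.0, 32.0])   # part of the southeastern US
image = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
         .filterBounds(aoi).filterDate('2016-01-01', '2016-12-31')
         .median().select(['SR_B2', 'SR_B3', 'SR_B4', 'SR_B5', 'SR_B6', 'SR_B7']))

# Hypothetical table of polygons derived from OpenStreetMap tags, with an
# integer 'landuse' property (e.g. 0 = forestry, 1 = agriculture, 2 = urban).
osm_polygons = ee.FeatureCollection('users/example/osm_landuse_training')

training = image.sampleRegions(collection=osm_polygons, properties=['landuse'], scale=30)
classifier = ee.Classifier.smileRandomForest(100).train(
    features=training, classProperty='landuse', inputProperties=image.bandNames())
landuse_map = image.classify(classifier)
```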
Treatments options for alopecia.
Iorizzo, Matilde; Tosti, Antonella
2015-01-01
Hair disorders have a very high social and psychological impact. Treatment is often frustrating and time-consuming for both patients and clinicians and requires special skills and expertise. This paper aims to provide an overview of available treatments for the most common forms of alopecia in adults (androgenetic alopecia [AGA], alopecia areata and cicatricial alopecias) after reviewing the literature in PubMed, Google Scholar and ClinicalTrials.gov. Before starting treatment, it is very important to confirm the diagnosis and discuss the patient's expectations. Treatment of hair disorders requires time, and first results are usually visible only a few months after the beginning of therapy. Treatment of most hair disorders is largely not evidence-based, as randomized controlled trials are available only for AGA.
NASA Astrophysics Data System (ADS)
Jacobsen, Jurma; Edlich, Stefan
2009-02-01
There is a broad range of potentially useful mobile location-based applications. One crucial point seems to be making them available to the public at large. This case illuminates the ability of Android, the operating system for mobile devices, to fulfill this demand in the mashup way, by use of some special geocoding web services and one integrated web service for retrieving data on the nearest cash machines. It shows an exemplary approach for building mobile location-based mashups for everyone: 1. As a basis for reaching as many people as possible, the open source Android OS is assumed to spread widely. 2. Everyone also means that the handset does not have to be an expensive GPS device. This is realized by re-utilization of the existing GSM infrastructure with the Cell of Origin (COO) method, which looks up the CellID in one of the growing number of web-accessible CellID databases. Some of these databases are still undocumented and not yet published. Furthermore, the Google Maps API for Mobile (GMM) and the open source counterpart OpenCellID are used. Localizing the user's current position via lookup of the closest cell to which the handset is currently connected (COO) is not as precise as GPS, but appears to be sufficient for many applications. For this reason the GPS user is the most pleased one: for this user the system is fully automated. In contrast, some users may not own a GPS-enabled phone. Such a user should refine his or her location with one click on the map inside the determined circular region. The users are then shown, and guided along, a path to the nearest cash machine by integrating the Google Maps API with an overlay. Additionally, the GPS user can keep track of his or her own position through a frequently updated view based on constantly requested precise GPS data.
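Once a position is known (from GPS or a CellID-derived estimate), the core of the mashup reduces to a nearest-neighbour search over the cash machine list. The library-free sketch below illustrates only that step; the ATM names and coordinates are made-up example data, not part of the described system.

```python
# Pick the nearest cash machine by great-circle (haversine) distance.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS84 points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

atms = {"Alexanderplatz": (52.5219, 13.4132),        # example data only
        "Potsdamer Platz": (52.5096, 13.3759),
        "Zoologischer Garten": (52.5070, 13.3325)}

user = (52.5200, 13.4050)  # position from GPS or a CellID lookup
nearest = min(atms, key=lambda name: haversine_km(*user, *atms[name]))
print(nearest, round(haversine_km(*user, *atms[nearest]), 2), "km")
```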
Monitoring Global Precipitation through UCI CHRS's RainMapper App on Mobile Devices
NASA Astrophysics Data System (ADS)
Nguyen, P.; Huynh, P.; Braithwaite, D.; Hsu, K. L.; Sorooshian, S.
2014-12-01
The Water and Development Information for Arid Lands - a Global Network (G-WADI) Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks - Cloud Classification System (PERSIANN-CCS) GeoServer has been developed through a collaboration between the Center for Hydrometeorology and Remote Sensing (CHRS) at the University of California, Irvine (UCI) and UNESCO's International Hydrological Program (IHP). The G-WADI PERSIANN-CCS GeoServer provides near-real-time, high-resolution (0.04°, approx. 4 km) global (60°N-60°S) satellite precipitation estimated by the PERSIANN-CCS algorithm developed by the scientists at CHRS. The G-WADI PERSIANN-CCS GeoServer utilizes the open-source MapServer software from the University of Minnesota to provide user-friendly, web-based mapping and visualization of satellite precipitation data. Recent efforts have been made by the scientists at CHRS to provide free on-the-go access to the PERSIANN-CCS precipitation data through an application named RainMapper for mobile devices. RainMapper provides visualization of global satellite precipitation for the most recent 3, 6, 12, 24, 48 and 72-hour periods, overlaid on various basemaps. RainMapper uses the Google Maps application programming interface (API) and embedded global positioning system (GPS) access to better monitor global precipitation data on mobile devices. Functionalities such as geographical search with voice recognition technology make it easy for the user to explore near-real-time precipitation in a given location. RainMapper also allows the precipitation information and visualizations to be conveniently shared with the public through social networks such as Facebook and Twitter. RainMapper is available for iOS and Android devices and can be downloaded (free) from the App Store and Google Play. The usefulness of RainMapper was demonstrated through an application in tracking the evolution of Typhoon Rammasun over the Philippines in mid-July 2014.
NASA Astrophysics Data System (ADS)
García-Flores, Agustín.; Paz-Gallardo, Abel; Plaza, Antonio; Li, Jun
2016-10-01
This paper describes a new web platform dedicated to the classification of satellite images, called Hypergim. The current implementation of this platform enables users to perform classification of satellite images from any part of the world thanks to the worldwide maps provided by Google Maps. To perform this classification, Hypergim uses unsupervised algorithms like Isodata and K-means. Here, we present an extension of the original platform in which we adapt Hypergim to use supervised algorithms to improve the classification results. This involves a significant modification of the user interface, providing the user with a way to obtain samples of classes present in the images for use in the training phase of the classification process. Another main goal of this development is to improve the runtime of the image classification process. To achieve this goal, we use a parallel implementation of the Random Forest classification algorithm. This implementation is a modification of the well-known CURFIL software package. The use of this type of algorithm for image classification is widespread today thanks to its precision and ease of training. The implementation of Random Forest was developed using the CUDA platform, which enables us to exploit the potential of several models of NVIDIA graphics processing units, using them to execute general-purpose computing tasks such as image classification algorithms. As well as CUDA, we use other parallel libraries, such as Intel Boost, taking advantage of the multithreading capabilities of modern CPUs. To ensure the best possible results, the platform is deployed on a cluster of commodity graphics processing units (GPUs), so that multiple users can use the tool concurrently. The experimental results indicate that the new supervised algorithm widely outperforms the unsupervised algorithms previously implemented in Hypergim, in both runtime and precision of the actual classification of the images.
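For orientation, the sketch below shows the kind of unsupervised per-pixel classification that Hypergim's first version performed (K-means on pixel values). It uses scikit-learn rather than the platform's own GPU implementation, and the input file name is a placeholder; it is purely illustrative.

```python
# K-means pixel clustering of an RGB image tile (illustrative only).
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

img = np.asarray(Image.open('tile.png').convert('RGB'), dtype=np.float32) / 255.0  # placeholder file
h, w, _ = img.shape

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(img.reshape(-1, 3))
class_map = kmeans.labels_.reshape(h, w)      # per-pixel cluster label
print(np.bincount(class_map.ravel()))         # pixel count per class
```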
2009-07-19
Michael Weiss-Malik, Product Manager for Moon in Google Earth, Google, Inc., speaks during a press conference, Monday, July 20, 2009, announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon, accessible within Google Earth 5.0, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)
2009-07-19
Tiffany Montague, Technical Program Manager for NASA and Google Lunar X PRIZE, Google, Inc., speaks during a press conference, Monday, July 20, 2009, announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon, accessible within Google Earth 5.0, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)
Exploring Google to Enhance Reference Services
ERIC Educational Resources Information Center
Jia, Peijun
2011-01-01
Google is currently recognized as the world's most powerful search engine. Google is so powerful and intuitive that one does not need to possess many skills to use it. However, Google is more than just simple search. For those who have special search skills and know Google's superior search features, it becomes an extraordinary tool. To understand…
Jin, Shan-Xue; Arai, Junko; Tian, Xuejun; Kumar-Singh, Rajendra; Feig, Larry A
2013-07-26
RAS-GRF1 is a guanine nucleotide exchange factor with the ability to activate RAS and RAC GTPases in response to elevated calcium levels. We previously showed that beginning at 1 month of age, RAS-GRF1 mediates NMDA-type glutamate receptor (NMDAR)-induction of long term depression in the CA1 region of the hippocampus of mice. Here we show that beginning at 2 months of age, when mice first acquire the ability to discriminate between closely related contexts, RAS-GRF1 begins to contribute to the induction of long term potentiation (LTP) in the CA1 hippocampus by mediating the action of calcium-permeable, AMPA-type glutamate receptors (CP-AMPARs). Surprisingly, LTP induction by CP-AMPARs through RAS-GRF1 occurs via activation of p38 MAP kinase rather than ERK MAP kinase, which has more frequently been linked to LTP. Moreover, contextual discrimination is blocked by knockdown of Ras-Grf1 expression specifically in the CA1 hippocampus, infusion of a p38 MAP kinase inhibitor into the CA1 hippocampus, or the injection of an inhibitor of CP-AMPARs. These findings implicate the CA1 hippocampus in the developmentally dependent capacity to distinguish closely related contexts through the appearance of a novel LTP-supporting signaling pathway.
Forced Folds and Craters of Elevation in the Afar
NASA Astrophysics Data System (ADS)
Hetherington, Rachel; Mussetti, Giulio; Hagos, Miruts; Deering, Chad; Corti, Giacomo; Magee, Craig; Bastow, Ian; van Wyk de Vries, Benjamin; Marques, Alvaro
2015-04-01
Uplifts caused by magma intrusion have been observed and mapped since the pioneering work of von Buch two hundred years ago. Von Buch's "craters of elevation" theory, developed in the Auvergne and the Canaries, was unfortunately discredited and mostly forgotten until recent work showed that the forced folds mapped in seismic sections in sedimentary basins are the same type of feature. These magmatic bulges are also being found on an increasing number of volcanoes. The Danakil region of Ethiopia contains a superb range of forced folds with craters of elevation, which we have mapped using Google Earth and other satellite imagery, and some ground truthing. This preliminary work provides a geological map inspired by the 1970 Barberi and Vallet map, which sets out in detail the structure and lava flow units of this area. We describe the main structures of each uplift and suggest a preliminary general structural evolutionary pattern that provides a model for future research. This work has been undertaken by a group from multiple universities sharing interpretations and data in an open format. We aim to continue this work to study these superb geological features. Their nature and preservation are such that we would also press for them to be made a geoheritage reserve of global importance: either a Geopark or a World Heritage site.
Google Scholar Goes to School: The Presence of Google Scholar on College and University Web Sites
ERIC Educational Resources Information Center
Neuhaus, Chris; Neuhaus, Ellen; Asher, Alan
2008-01-01
This study measured the degree of Google Scholar adoption within academia by analyzing the frequency of Google Scholar appearances on 948 campus and library Web sites, and by ascertaining the establishment of link resolution between Google Scholar and library resources. Results indicate a positive correlation between the implementation of Google…
The Google-ization of Knowledge
ERIC Educational Resources Information Center
Larson, Natasja; Parsons, Jim; Servage, Laura
2007-01-01
How has GOOGLE shaped knowledge? How has it shaped those who use it? This article considers the impact of online knowledge upon the content of knowledge and upon the people who seek it and create it. The authors suggest that 1. Google-ization is reshaping knowledge. 2. Google-ization is changing how knowledge counts as important. 3. Google-ization…
Students' Google Drive Intended Usage: A Case Study of Mathematics Courses in Bangkok University
ERIC Educational Resources Information Center
Prasertsith, Krisawan; Kanthawongs, Penjira; Limpachote, Tan
2016-01-01
Many technologies have changed the way individuals live and learn. Google Inc. has played significant roles in the business and academic worlds. Google Apps for Education and Google Classroom have been offered to higher institutions around the globe. Although large cloud service providers such as Google do not encrypt all their stored electronic data…
Tools for Knowledge Analysis, Synthesis, and Sharing
ERIC Educational Resources Information Center
Medland, Michael B.
2007-01-01
Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own…
Accuracy of remote electrocardiogram interpretation with the use of Google Glass technology.
Jeroudi, Omar M; Christakopoulos, George; Christopoulos, George; Kotsia, Anna; Kypreos, Megan A; Rangan, Bavana V; Banerjee, Subhash; Brilakis, Emmanouil S
2015-02-01
We sought to investigate the accuracy of remote electrocardiogram (ECG) interpretation using Google Glass (Google, Mountain View, California). Google Glass is an optical head-mounted display device with growing applications in medicine. We compared interpretation of 10 ECGs with 21 clinically important findings by faculty and fellow cardiologists by (1) viewing the electrocardiographic image on the Google Glass screen; (2) viewing a photograph of the ECG taken using Google Glass and interpreted on a mobile device; (3) viewing the original paper ECG; and (4) viewing a photograph of the ECG taken with a high-resolution camera and interpreted on a mobile device. One point was given for identification of each correct finding. Subjective rating of the user experience was also recorded. Twelve physicians (4 faculty and 8 fellow cardiologists) participated. The average electrocardiographic interpretation score (maximum 21 points) as viewed through Google Glass, a Google Glass photograph on a mobile device, on paper, and a high-resolution photograph on a mobile device was 13.5 ± 1.8, 16.1 ± 2.6, 18.3 ± 1.7, and 18.6 ± 1.5, respectively (p = 0.0005 between Google Glass and mobile device, p = 0.0005 between Google Glass and paper, and p = 0.002 between mobile device and paper). Of the 12 physicians, 9 (75%) were dissatisfied with viewing ECGs on the prism display of Google Glass. In conclusion, further improvements are needed before Google Glass can be reliably used for remote electrocardiographic analysis. Published by Elsevier Inc.
Data visualization in interactive maps and time series
NASA Astrophysics Data System (ADS)
Maigne, Vanessa; Evano, Pascal; Brockmann, Patrick; Peylin, Philippe; Ciais, Philippe
2014-05-01
State-of-the-art data visualization has little in common with the plots and maps we used only a few years ago. Many open-source tools are now available to provide access to scientific data and to implement accessible, interactive, and flexible web applications. Here we will present a web site opened in November 2013 to create custom global and regional maps and time series from research models and datasets. For maps, we explore and access data sources from a THREDDS Data Server (TDS) with the OGC WMS protocol (using the ncWMS implementation), then create interactive maps with the OpenLayers JavaScript library and extra information layers from a GeoServer. Maps become dynamic, zoomable, synchronously connected to each other, and exportable to Google Earth. For time series, we extract data from a TDS with the NetCDF Subset Service (NCSS), then display interactive graphs with a custom library based on the Data Driven Documents JavaScript library (D3.js). This time series application provides dynamic functionalities such as interpolation, interactive zoom on different axes, display of point values, and export to different formats. These tools were implemented for the Global Carbon Atlas (http://www.globalcarbonatlas.org): a web portal to explore, visualize, and interpret global and regional carbon fluxes from various model simulations arising from both human activities and natural processes, a work led by the Global Carbon Project.
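The time-series path described above (query a THREDDS NetCDF Subset Service, then plot the result) can be sketched from a client in a few lines. The server URL, dataset path, variable name and CSV column names below are placeholders and not the Global Carbon Atlas's actual endpoints; the parameter names follow the common NCSS point-subset convention but should be checked against the target server's documentation.

```python
# Hedged sketch: fetch a point time series from a hypothetical NCSS endpoint and plot it.
import io
import requests
import pandas as pd
import matplotlib.pyplot as plt

ncss = "https://thredds.example.org/thredds/ncss/carbon/fluxes.nc"  # placeholder endpoint
params = {"var": "co2_flux", "latitude": 48.8, "longitude": 2.3,
          "time_start": "2000-01-01", "time_end": "2010-12-31", "accept": "csv"}

resp = requests.get(ncss, params=params, timeout=60)
resp.raise_for_status()
series = pd.read_csv(io.StringIO(resp.text), parse_dates=["time"])  # assumed column name

series.plot(x="time", y="co2_flux")
plt.ylabel("CO2 flux")
plt.show()
```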
Mapping of traditional settlements by unmanned airborne vehicles towards architectural restoration
NASA Astrophysics Data System (ADS)
Partsinevelos, Panagiotis; Skoutelis, Nikolaos; Tripolitsiotis, Achilleas; Tsatsarounos, Stelios; Tsitonaki, Anna; Zervos, Panagiotis
2015-06-01
Conservation and restoration of traditional settlements are amongst the actions that international directives proclaim in order to protect our cultural heritage. Towards this end, a mandatory base step in all archaeological and historical practices is the surveying and mapping of the study area. Often, new, unexplored or abandoned settlements are considered, where dense vegetation, damaged structures and ruins, the incorporation of newer structures, and renovations make precise surveying labor-intensive and time-consuming. Unmanned airborne vehicles (UAVs) have been effectively incorporated into several cultural heritage projects, mainly for mapping archaeological sites. However, the majority of relevant publications lack quantitative evaluation of their results, and when such a validation is provided it is usually a procedural error estimate readily available from the software used, without independent ground truth verification. In this study, a low-cost custom-built hexacopter prototype was employed to deliver accurate mapping of the traditional settlement of Kamariotis in east Crete, Greece. The Kamariotis settlement includes highly dense urban structures with continuous building forms, curved walls and missing terraces, while wild vegetation made classic geodetic surveying unfeasible. The resulting maps were qualitatively compared against those derived from Google Earth and the Greek Cadastral Orthophoto Viewing platform to evaluate their applicability for architectural mapping. Moreover, the overall precision of the photogrammetric procedure was compared against geodetic surveying.
Georeferenced LiDAR 3D vine plantation map generation.
Llorens, Jordi; Gil, Emilio; Llop, Jordi; Queraltó, Meritxell
2011-01-01
The use of electronic devices for canopy characterization has recently been widely discussed. Among such devices, LiDAR sensors appear to be the most accurate and precise. The information obtained with a LiDAR sensor while driving a tractor along a crop row can be managed and transformed into canopy density maps by evaluating the frequency of LiDAR returns. This paper describes a proposed methodology to obtain a georeferenced canopy map by combining the information obtained with LiDAR with that generated by a GPS receiver installed on top of the tractor. Data regarding the velocity of LiDAR measurements and the UTM coordinates of each measured point on the canopy were obtained by applying the proposed transformation process. The process allows the generated canopy density map to be overlaid on an image of the measured area in Google Earth®, providing accurate information about the canopy distribution and/or the location of damage along the rows. This methodology was applied and tested on different vine varieties and crop stages in two important vine production areas in Spain. The results indicate that the georeferenced information obtained with LiDAR sensors is an interesting tool with the potential to improve crop management processes.
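The georeferencing step can be illustrated with a short sketch, which is not the authors' processing chain: GPS latitude/longitude fixes are projected to UTM with pyproj and LiDAR return counts are accumulated into a gridded canopy-density map. The UTM zone, grid resolution and the tiny input arrays are assumed.

```python
# Illustrative GPS-to-UTM conversion and gridding of LiDAR return counts.
import numpy as np
from pyproj import Transformer

to_utm = Transformer.from_crs("EPSG:4326", "EPSG:25831", always_xy=True)  # ETRS89 / UTM 31N (NE Spain)

# Assumed inputs: one GPS fix (lon, lat) and one canopy-return count per LiDAR scan.
lons = np.array([0.6211, 0.6213, 0.6215])
lats = np.array([41.6301, 41.6302, 41.6303])
returns = np.array([120, 90, 150])

x, y = to_utm.transform(lons, lats)
res = 0.5                                    # grid cell size in metres (assumed)
ix = ((x - x.min()) / res).astype(int)
iy = ((y - y.min()) / res).astype(int)

density = np.zeros((iy.max() + 1, ix.max() + 1))
np.add.at(density, (iy, ix), returns)        # canopy density grid, ready to export as an overlay
```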
Creating global comparative analyses of tectonic rifts, monogenetic volcanism and inverted relief
NASA Astrophysics Data System (ADS)
van Wyk de Vries, Benjamin
2016-04-01
I have been all around the world, and to other planets and have travelled from the present to the Archaean and back to seek out the most significant tectonic rifts, monogenetic volcanoes and examples of inverted relief. I have done this to provide a broad foundation of the comparative analysis for the Chaîne des Puys - Limagne fault nomination to UNESCO world Heritage. This would have been an impossible task, if not for the cooperation of the scientific community and for Google Earth, Google Maps and academic search engines. In preparing global comparisons of geological features, these quite recently developed tools provide a powerful way to find and describe geological features. The ability to do scientific crowd sourcing, rapidly discussing with colleagues about features, allows large numbers of areas to be checked and the open GIS tools (such as Google Earth) allow a standardised description. Search engines also allow the literature on areas to be checked and compared. I will present a comparative study of rifts of the world, monogenetic volcanic field and inverted relief, integrated to analyse the full geological system represented by the Chaîne des Puys - Limagne fault. The analysis confirms that the site is an exceptional example of the first steps of continental drift in a mountain rift setting, and that this is necessarily seen through the combined landscape of tectonic, volcanic and geomorphic features. The analysis goes further to deepen the understanding of geological systems and stresses the need for more study on geological heritage using such a global and broad systems approach.
Google glass based immunochromatographic diagnostic test analysis
NASA Astrophysics Data System (ADS)
Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan
2015-03-01
Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.
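The quantitative PSA step described above amounts to fitting a calibration curve relating test-line intensity to concentration and inverting it for new readings. The sketch below uses a 4-parameter logistic, which is only one common choice of model; the concentration and intensity values are invented placeholders, not the paper's measurements.

```python
# Hedged sketch: fit a 4-parameter logistic calibration curve and invert it.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(c, a, b, c50, d):
    """4PL: line intensity as a function of concentration c (ng/mL)."""
    return d + (a - d) / (1.0 + (c / c50) ** b)

conc = np.array([2, 10, 25, 50, 100, 200], dtype=float)          # ng/mL (invented)
intensity = np.array([0.08, 0.21, 0.35, 0.48, 0.60, 0.68])       # normalized signal (invented)

popt, _ = curve_fit(four_pl, conc, intensity, p0=[0.0, 1.0, 30.0, 0.8], maxfev=10000)

def invert(signal, a, b, c50, d):
    """Solve the 4PL for concentration given a measured line intensity."""
    return c50 * ((a - d) / (signal - d) - 1.0) ** (1.0 / b)

print(invert(0.30, *popt))   # estimated concentration for a new test reading
```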
Google Book Search: The Good, the Bad, & the Ugly
ERIC Educational Resources Information Center
Schaffhauser, Dian
2008-01-01
Google is opening up whole new worlds for internet surfers and researchers everywhere. Google Book Search (books.google.com), which is still in beta after several years of testing, offers the ubiquitous Google search box on its home page. It also has categories of books as well as book cover images that refresh every time the home page is…
Solar Eclipse Computer API: Planning Ahead for August 2017
NASA Astrophysics Data System (ADS)
Bartlett, Jennifer L.; Chizek Frouard, Malynda; Lesniak, Michael V.; Bell, Steve
2016-01-01
With the total solar eclipse of 2017 August 21 over the continental United States approaching, the U.S. Naval Observatory (USNO) on-line Solar Eclipse Computer can now be accessed via an application programming interface (API). This flexible interface returns local circumstances for any solar eclipse in JavaScript Object Notation (JSON) that can be incorporated into third-party Web sites or applications. For a given year, it can also return a list of solar eclipses that can be used to build a more specific request for local circumstances. Over the course of a particular eclipse as viewed from a specific site, several events may be visible: the beginning and ending of the eclipse (first and fourth contacts), the beginning and ending of totality (second and third contacts), the moment of maximum eclipse, sunrise, or sunset. For each of these events, the USNO Solar Eclipse Computer reports the time, the Sun's altitude and azimuth, and the event's position and vertex angles. The computer also reports the duration of the total phase, the duration of the eclipse, the magnitude of the eclipse, and the percent of the Sun obscured for a particular eclipse site. On-line documentation for using the API-enabled Solar Eclipse Computer, including sample calls, is available (http://aa.usno.navy.mil/data/docs/api.php). The same Web page also describes how to reach the Complete Sun and Moon Data for One Day, Phases of the Moon, Day and Night Across the Earth, and Apparent Disk of a Solar System Object services using API calls. For those who prefer using a traditional data input form, local circumstances can still be requested that way at http://aa.usno.navy.mil/data/docs/SolarEclipses.php. In addition, the 2017 August 21 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2017.php) consolidates all of the USNO resources for this event, including a Google Map view of the eclipse track designed by Her Majesty's Nautical Almanac Office (HMNAO). Looking further ahead, a 2024 April 8 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2024.php) is also available.
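A JSON-returning service of this kind can be called from any scripting language. The sketch below is illustrative only: the endpoint URL, parameter names and response keys are placeholders, and the authoritative call syntax is on the documentation page cited above (http://aa.usno.navy.mil/data/docs/api.php).

```python
# Hedged sketch of querying a JSON solar-eclipse local-circumstances service.
import requests

url = "https://api.example.org/eclipses/solar"                       # placeholder endpoint
params = {"date": "8/21/2017", "coords": "36.17,-86.78", "height": 182}  # assumed parameter names

resp = requests.get(url, params=params, timeout=30)
resp.raise_for_status()
circumstances = resp.json()                                          # JSON local circumstances

for event in circumstances.get("local_data", []):                    # assumed key names
    print(event.get("phenomenon"), event.get("time"), event.get("altitude"))
```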
Publications - DGGS Annual Reports | Alaska Division of Geological & Geophysical Surveys
Beginning in 2000, the DGGS Annual Report series was reactivated to produce reports
Mapping ecological systems in southeastern Arizona
Jim Malusa; Donald Falk; Larry Laing; Brooke Gebow
2013-01-01
Beginning in 2007 in and around the Huachuca Mountains, the Coronado National Forest and other partners have been mapping ecosystems at multiple scales. The approach has focused on identifying land type associations (LTA), which represent the sum of bedrock and superficial geology, topography, elevation, potential and existing vegetation, soil properties, and local...
Confessions of a Librarian or: How I Learned to Stop Worrying and Love Google
ERIC Educational Resources Information Center
Gunnels, Claire B.; Sisson, Amy
2009-01-01
Have you ever stopped to think about life before Google? We will make the argument that Google is the first manifestation of Web 2.0, of the power and promise of social networking and the ubiquitous wiki. We will discuss the positive influence of Google and how Google and other social networking tools afford librarians leading-edge technologies…
Davis, Christopher R; Rosenfield, Lorne K
2015-03-01
Google Glass has the potential to become a ubiquitous and translational technological tool within clinical plastic surgery. Google Glass allows clinicians to remotely view patient notes, laboratory results, and imaging; training can be augmented via streamed expert master classes; and patient safety can be improved by remote advice from a senior colleague. This systematic review identified and appraised every Google Glass publication relevant to plastic surgery and describes the first plastic surgical procedures recorded using Google Glass. A systematic review was performed using PubMed National Center for Biotechnology Information, Ovid MEDLINE, and the Cochrane Central Register of Controlled Trials, following modified Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Key search terms "Google" and "Glass" identified mutually inclusive publications that were screened for inclusion. Eighty-two publications were identified, with 21 included for review. Google Glass publications were formal articles (n = 3), editorial/commentary articles (n = 7), conference proceedings (n = 1), news reports (n = 3), and online articles (n = 7). Data support Google Glass' positive impact on health care delivery, clinical training, medical documentation, and patient safety. Concerns exist regarding patient confidentiality, technical issues, and limited software. The first plastic surgical procedure performed using Google Glass was a blepharoplasty on October 29, 2013. Google Glass is an exciting translational technology with the potential to positively impact health care delivery, medical documentation, surgical training, and patient safety. Further high-quality scientific research is required to formally appraise Google Glass in the clinical setting.
Developing scholarly thinking using mind maps in graduate nursing education.
Kotcherlakota, Suhasini; Zimmerman, Lani; Berger, Ann M
2013-01-01
Two broad issues that beginning graduate nursing students face are identifying a research focus and learning how to organize complex information. Developing a mind map is 1 strategy to help students clarify their thinking and lay the foundation for in-depth expertise related to their research focus, review of the literature, and conceptual framework. The authors discuss their use of mind mapping combined with feedback using a fishbowl technique.
Boeker, Martin; Vach, Werner; Motschall, Edith
2013-10-26
Recent research indicates a high recall in Google Scholar searches for systematic reviews. These reports raised high expectations of Google Scholar as a unified and easy-to-use search interface. However, studies on the coverage of Google Scholar rarely used the search interface in a realistic approach, instead merely checking for the existence of gold standard references. In addition, the severe limitations of the Google Scholar search interface must be taken into consideration when comparing it with professional literature retrieval tools. The objectives of this work are to measure the relative recall and precision of searches with Google Scholar under conditions derived from the structured search procedures conventional in scientific literature retrieval, and to provide an overview of the current advantages and disadvantages of the Google Scholar search interface in scientific literature retrieval. General and MEDLINE-specific search strategies were retrieved from 14 Cochrane systematic reviews. The Cochrane systematic review search strategies were translated into Google Scholar search expressions as faithfully as possible, preserving the original search semantics. The references of the included studies from the Cochrane reviews were checked for their inclusion in the result sets of the Google Scholar searches. Relative recall and precision were calculated. We investigated Cochrane reviews with between 11 and 70 included references each, for a total of 396 references. The Google Scholar searches resulted in sets of between 4,320 and 67,800 hits, for a total of 291,190 hits. The relative recall of the Google Scholar searches had a minimum of 76.2% and a maximum of 100% (7 searches). The precision of the Google Scholar searches had a minimum of 0.05% and a maximum of 0.92%. The overall relative recall for all searches was 92.9%; the overall precision was 0.13%. The reported relative recall must be interpreted with care. It is a quality indicator of Google Scholar confined to an experimental setting which is unattainable in systematic retrieval due to the severe limitations of the Google Scholar search interface. Currently, Google Scholar does not provide the elements necessary for systematic scientific literature retrieval, such as tools for incremental query optimization, export of a large number of references, a visual search builder or a history function. Google Scholar is not ready to serve as a professional search tool for tasks where structured retrieval methodology is necessary.
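The two measures reported above follow directly from the counts in the study (assuming the standard definitions of relative recall and precision); the small worked example below reproduces the headline figures, with the number of retrieved gold-standard references inferred from the reported 92.9% rather than taken from the paper.

```python
# Tiny worked example of relative recall and precision.
gold_refs_total = 396        # included references across the 14 Cochrane reviews
gold_refs_found = 368        # of these, found in the Google Scholar result sets (inferred, illustrative)
hits_total = 291_190         # total Google Scholar hits

relative_recall = gold_refs_found / gold_refs_total   # gold-standard refs retrieved / all gold-standard refs
precision = gold_refs_found / hits_total               # gold-standard refs retrieved / all hits
print(f"recall {relative_recall:.1%}, precision {precision:.2%}")   # ~92.9% and ~0.13%
```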
Are you ready for the net generation or the free agent learner?
Desilets, Lynore D
2011-08-01
The newest generation of soon to be health care professionals was raised by Father Google and Mother IM. The world has been a connected place for them their entire lives. They are experts at multitasking. They prefer electronic over print news, dictionaries, and maps; cell phones that do more than make phone calls; e-mail exchanges over face-to-face visits; online payments over checks; and credit cards over cash. In this column, I share some information about technology and these digital natives that I, a digital immigrant, have recently discovered. Copyright 2011, SLACK Incorporated.
Effective Web and Desktop Retrieval with Enhanced Semantic Spaces
NASA Astrophysics Data System (ADS)
Daoud, Amjad M.
We describe the design and implementation of the NETBOOK prototype system for collecting, structuring, and efficiently creating semantic vectors for concepts, noun phrases, and documents from a corpus of free full-text ebooks available on the World Wide Web. Concept maps generated automatically from correlated index terms and extracted noun phrases are used to build a powerful conceptual index of individual pages. To ensure the scalability of our system, dimension reduction is performed using Random Projection [13]. Furthermore, we present a complete evaluation of the relative effectiveness of the NETBOOK system versus Google Desktop [8].
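Random projection reduces dimensionality by multiplying the term or concept matrix by a random matrix whose columns are approximately orthogonal, which preserves pairwise distances with high probability. The sketch below shows a generic Gaussian random projection with NumPy; it is an illustration of the technique, not the NETBOOK implementation cited as [13].

```python
import numpy as np

def random_projection(X: np.ndarray, k: int, seed: int = 0) -> np.ndarray:
    """Project rows of X (n x d) into k dimensions with a Gaussian random matrix."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Scaling by 1/sqrt(k) approximately preserves Euclidean distances.
    R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))
    return X @ R

# Example: 1,000 "documents" over a 50,000-term vocabulary reduced to 300 dimensions.
X = np.random.default_rng(1).random((1000, 50_000))
X_reduced = random_projection(X, k=300)
print(X_reduced.shape)  # (1000, 300)
```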
Video Games - Did They Begin at Brookhaven
Synopsis: a program developed at Brookhaven by William Higinbotham led to the pioneering development of video games. First Pong, now Space Invaders, next Star Castle: video games have mesmerized children of all ages.
Master Plan for Educational Facilities: Garwood, Union County, New Jersey.
ERIC Educational Resources Information Center
Engelhardt and Engelhardt, Inc., Purdy Station, NY.
Garwood, New Jersey, is a small borough of 0.69 square miles with an estimated population in 1978 of 4,856 persons. This master plan for educational facilities begins with an overview of the district that describes its beginnings as an industrial community. A number of maps illustrate characteristics of the area including its topography,…
Aggregating concept map data to investigate the knowledge of beginning CS students
NASA Astrophysics Data System (ADS)
Mühling, Andreas
2016-07-01
Concept maps have a long history in educational settings as a tool for teaching, learning, and assessing. As an assessment tool, they are predominantly used to extract the structural configuration of learners' knowledge. This article presents an investigation of the knowledge structures of a large group of beginning CS students. The investigation is based on a method that collects, aggregates, and automatically analyzes the concept maps of a group of learners as a whole, to identify common structural configurations and differences in the learners' knowledge. It shows that those students who have attended CS education in their secondary school life have, on average, configured their knowledge about typical core CS/OOP concepts differently. Also, artifacts of their particular CS curriculum are visible in their externalized knowledge. The data structures and analysis methods necessary for working with concept landscapes have been implemented as a GNU R package that is freely available.
Network-Based Mitigation of Illegal Immigration in Aegean Sea (Greece)
2010-09-01
[List of figures recovered from front matter, all sourced from Google Images: Figure 1; Figure 2, The perilous trip; Figure 3, EU countries; Figure 4, Eastern Aegean Sea and territorial water line; Figure 5, Cross-border zone.]
NASA Astrophysics Data System (ADS)
Azhdari, G. H.; Deilami, K.; Firooznia, E.
2015-12-01
Natural resources are essential for the security and sustainable development of every country. Therefore, in order to achieve sustainable development, conservation, and optimum utilization of natural resources, executing a natural resources cadastral plan is necessary. In Iran, land management is conducted by the government, so a comprehensive plan with a well-arranged program is needed for proper evaluation. In this research, the city of Pasargadae was selected as a pilot area. The Pasargadae region is located north-east of Shiraz in Fars province, at latitude 30° 15′ 53″ N and longitude 53° 13′ 29″ E. In order to generate the cadastral maps, images from the QuickBird satellite with 50-60 cm resolution were first georeferenced using ground control points with accurate GPS coordinates. In addition to the satellite images, old 1:10,000-scale paper maps in a local coordinate system, produced by the Ministry of Agriculture in 1963, were digitized against the 1:25,000-scale map from the Army Geographical Organization using AutoCAD software. Besides these, 1:50,000-scale paper maps and Google Earth were used to identify changes over time. All of the above maps were added to the QuickBird images as new layers using ArcMap software and were also used to determine the different land uses. The lands were then divided into two groups in ArcMap: first, lands with official documents, owned by either natural or legal persons, and second, national lands under different uses such as forestry, range management, and desertification plans. Consequently, the generated cadastral maps allow a clearer distinction between private and national lands. In addition, producing cadastral maps helps prevent the destruction and illegal possession of natural lands by individuals.
Simultaneous comparison and assessment of eight remotely sensed maps of Philippine forests
NASA Astrophysics Data System (ADS)
Estoque, Ronald C.; Pontius, Robert G.; Murayama, Yuji; Hou, Hao; Thapa, Rajesh B.; Lasco, Rodel D.; Villar, Merlito A.
2018-05-01
This article compares and assesses eight remotely sensed maps of Philippine forest cover in the year 2010. We examined eight Forest versus Non-Forest maps reclassified from eight land cover products: the Philippine Land Cover, the Climate Change Initiative (CCI) Land Cover, the Landsat Vegetation Continuous Fields (VCF), the MODIS VCF, the MODIS Land Cover Type product (MCD12Q1), the Global Tree Canopy Cover, the ALOS-PALSAR Forest/Non-Forest Map, and the GlobeLand30. The reference data consisted of 9852 randomly distributed sample points interpreted from Google Earth. We created methods to assess the maps and their combinations. Results show that the percentage of the Philippines covered by forest ranges among the maps from a low of 23% for the Philippine Land Cover to a high of 67% for GlobeLand30. Landsat VCF estimates 36% forest cover, which is closest to the 37% estimate based on the reference data. The eight maps plus the reference data agree unanimously on 30% of the sample points, of which 11% are attributable to forest and 19% to non-forest. The overall disagreement between the reference data and Philippine Land Cover is 21%, which is the least among the eight Forest versus Non-Forest maps. About half of the 9852 points have a nested structure such that the forest in a given dataset is a subset of the forest in the datasets that have more forest than the given dataset. The variation among the maps regarding forest quantity and allocation relates to the combined effects of the various definitions of forest and classification errors. Scientists and policy makers must consider these insights when producing future forest cover maps and when establishing benchmarks for forest cover monitoring.
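A sketch of the kind of point-based agreement calculation described above: given reference labels and a map's labels at the same sample points (1 = forest, 0 = non-forest), the forest share and the overall disagreement are simple proportions. The labels below are made up for illustration; this is not the authors' analysis code.

```python
import numpy as np

# Hypothetical labels at shared sample points (1 = forest, 0 = non-forest).
reference = np.array([1, 1, 0, 0, 1, 0, 1, 1, 0, 0])
forest_map = np.array([1, 0, 0, 1, 1, 0, 1, 1, 0, 0])

forest_share = forest_map.mean()                     # fraction of points mapped as forest
reference_share = reference.mean()                   # fraction of reference points that are forest
overall_disagreement = (forest_map != reference).mean()

print(f"map forest share:       {forest_share:.0%}")
print(f"reference forest share: {reference_share:.0%}")
print(f"overall disagreement:   {overall_disagreement:.0%}")
```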
Community Near-Port Modeling System (C-PORT): Briefing for ...
What C-PORT is: a screening-level tool for assessing port activities and exploring the range of potential impacts that changes to port operations might have on local air quality; an analysis of decision alternatives through mapping of the likely pattern of potential pollutant dispersion and the estimated change in pollutant concentrations for user-designated scenarios; designed primarily to evaluate the local air quality impacts of proposed port expansion or modernization, as well as to identify options for mitigating any impacts; currently includes data from 21 US seaports and features a map-based interface similar to the widely used Google Earth. Still under development, C-PORT is designed as an easy-to-use computer modeling tool for users such as state air quality managers and planners. This briefing is part of our product outreach prior to the model's public release and is intended to solicit additional beta testers.
Voice Based City Panic Button System
NASA Astrophysics Data System (ADS)
Febriansyah; Zainuddin, Zahir; Bachtiar Nappu, M.
2018-03-01
The voice-activated panic button application is designed to provide faster early notification of hazardous conditions in the community to the nearest police unit by using speech as the trigger; existing applications still rely on touch combinations on the screen and on orders relayed from a control center, so early notification takes longer. The method used in this research combined voice recognition for detecting the user's speech with the haversine formula for finding the shortest distance between the user and the police. The application is also equipped with automatic SMS, which sends a notification to the victim's relatives, and is integrated with Google Maps (GMaps) to map the route to the victim's location. The results show that voice registration in the application reaches 100% success, incident detection using speech recognition while the application is running averages 94.67%, and automatic SMS delivery to the victim's relatives reaches 100%.
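The haversine formula mentioned above gives the great-circle distance between two latitude/longitude points, which is how the closest police unit can be ranked. A minimal, self-contained sketch (not the authors' code, with made-up coordinates) is shown below.

```python
import math

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in kilometres between two WGS84 points."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical victim location and police posts: pick the nearest post.
victim = (-6.2001, 106.8166)
posts = {"post_a": (-6.1900, 106.8200), "post_b": (-6.2500, 106.9000)}
nearest = min(posts, key=lambda p: haversine_km(*victim, *posts[p]))
print(nearest)
```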
Near real-time aftershock hazard maps for earthquakes
NASA Astrophysics Data System (ADS)
McCloskey, J.; Nalbant, S. S.
2009-04-01
Stress interaction modelling is routinely used to explain the spatial relationships between earthquakes and their aftershocks. On 28 October 2008 a M6.4 earthquake occurred near the Pakistan-Afghanistan border, killing several hundred people and causing widespread devastation. A second M6.4 event occurred 12 hours later, 20 km to the south-east. By making some well-supported assumptions concerning the source event and the geometry of any likely triggered event, it was possible to map those areas most likely to experience further activity. Using Google Earth, it would further have been possible to identify particular settlements in the source area that were especially at risk and to publish their locations globally within about 3 hours of the first earthquake. Such actions could have significantly focused the initial emergency response management. We argue for routine prospective testing of such forecasts and for dialogue between social and physical scientists and emergency response professionals around the practical application of these techniques.
Retrieving clinical evidence: a comparison of PubMed and Google Scholar for quick clinical searches.
Shariff, Salimah Z; Bejaimal, Shayna Ad; Sontrop, Jessica M; Iansavichus, Arthur V; Haynes, R Brian; Weir, Matthew A; Garg, Amit X
2013-08-15
Physicians frequently search PubMed for information to guide patient care. More recently, Google Scholar has gained popularity as another freely accessible bibliographic database. To compare the performance of searches in PubMed and Google Scholar. We surveyed nephrologists (kidney specialists) and provided each with a unique clinical question derived from 100 renal therapy systematic reviews. Each physician provided the search terms they would type into a bibliographic database to locate evidence to answer the clinical question. We executed each of these searches in PubMed and Google Scholar and compared results for the first 40 records retrieved (equivalent to 2 default search pages in PubMed). We evaluated the recall (proportion of relevant articles found) and precision (ratio of relevant to nonrelevant articles) of the searches performed in PubMed and Google Scholar. Primary studies included in the systematic reviews served as the reference standard for relevant articles. We further documented whether relevant articles were available as free full-texts. Compared with PubMed, the average search in Google Scholar retrieved twice as many relevant articles (PubMed: 11%; Google Scholar: 22%; P<.001). Precision was similar in both databases (PubMed: 6%; Google Scholar: 8%; P=.07). Google Scholar provided significantly greater access to free full-text publications (PubMed: 5%; Google Scholar: 14%; P<.001). For quick clinical searches, Google Scholar returns twice as many relevant articles as PubMed and provides greater access to free full-text articles.
Google searches help with diagnosis in dermatology.
Amri, Montassar; Feroz, Kaliyadan
2014-01-01
Several previous studies have tried to assess the usefulness of Google search as a diagnostic aid, with discordant results that have led to controversy. Our aim was to investigate how often Google search helps reach correct diagnoses in dermatology. Two fifth-year students (A and B) and one demonstrator (C) participated as investigators. Twenty-five dermatological diagnostic cases were selected from the clinical cases published in the web-only "images in clinical medicine" from March 2005 to November 2009. The main outcome measure was the number of correct diagnoses provided by the investigators without, and then with, Google search. Investigator A gave correct diagnoses in 9/25 (36%) cases without Google search; his diagnostic success after Google search was 18/25 (72%). Investigator B's results were 11/25 (44%) correct diagnoses without Google search and 19/25 (76%) after the search. For investigator C, the results were 12/25 (48%) without Google search and 18/25 (72%) after the use of this tool. In total, the three investigators provided 32 (42.6%) correct diagnoses without Google search and 55 (73.3%) when using it. The difference between the total number of correct diagnoses given without and with Google search was statistically significant (p = 0.0002). In light of these results, Google search appears to be a useful diagnostic aid in dermatology. However, we emphasize that diagnosis is primarily an art based on clinical skills and experience.
Beletsky, Leo; Arredondo, Jaime; Werb, Dan; Vera, Alicia; Abramovitz, Daniela; Amon, Joseph J; Brouwer, Kimberly C; Strathdee, Steffanie A; Gaines, Tommi L
2016-07-28
As geospatial data have become increasingly integral to health and human rights research, their collection using formal address designations or paper maps has been complicated by numerous factors, including poor cartographic literacy, nomenclature imprecision, and human error. As part of a longitudinal study of people who inject drugs in Tijuana, Mexico, respondents were prompted to georeference specific experiences. At baseline, only about one third of the 737 participants were native to Tijuana, underscoring prevalence of migration/deportation experience. Areas frequented typically represented locations with no street address (e.g. informal encampments). Through web-based cartographic technology and participatory mapping, this study was able to overcome the use of vernacular names and difficulties mapping liminal spaces in generating georeferenced data points that were subsequently analyzed in other research. Integrating low-threshold virtual navigation as part of data collection can enhance investigations of mobile populations, informal settlements, and other locations in research into structural production of health at low- or no cost. However, further research into user experience is warranted.
Enhancing Simulation Learning with Team Mental Model Mapping
ERIC Educational Resources Information Center
Goltz, Sonia M.
2017-01-01
Simulations have been developed for many business courses because of enhanced student engagement and learning. A challenge for instructors using simulations is how to take this learning to the next level since student reflection and learning can vary. This article describes how to use a conceptual mapping game at the beginning and end of a…
Navigating Maps to Support Comprehension: When Textbooks Don't Have GPS
ERIC Educational Resources Information Center
Roberts, Kathryn L.; Brugar, Kristy A.
2014-01-01
In this article, Kathryn Roberts & Kristy Brugar discuss their assessment of third-, fourth- and fifth-grade children's understandings of the types of maps commonly found in their social studies text and trade books, and the knowledge and misconceptions it revealed. They then discuss how teachers might begin to address those issues with…
ERIC Educational Resources Information Center
White, Brian
2004-01-01
This paper presents a generally applicable method for characterizing subjects' hypothesis-testing behaviour based on a synthesis that extends on previous work. Beginning with a transcript of subjects' speech and videotape of their actions, a Reasoning Map is created that depicts the flow of their hypotheses, tests, predictions, results, and…
Counting the Nouns: Simple Structural Cues to Verb Meaning
ERIC Educational Resources Information Center
Yuan, Sylvia; Fisher, Cynthia; Snedeker, Jesse
2012-01-01
Two-year-olds use the sentence structures verbs appear in--"subcategorization frames"--to guide verb learning. This is syntactic bootstrapping. This study probed the developmental origins of this ability. The structure-mapping account proposes that children begin with a bias toward one-to-one mapping between nouns in sentences and participant…
Code of Federal Regulations, 2010 CFR
2010-04-01
... are two 1:24,000 Scale USGS topographic maps. They are titled: (1) Patterson, California Quadrangle... the town of Patterson. The Salado Creek viticultural area boundary is as follows: (1) Beginning on the Patterson Quadrangle map, section 19, T6S, R8E, at the intersection of Interstate Highway 5 and Fink Road...
Use of "Google Scholar" in Corpus-Driven EAP Research
ERIC Educational Resources Information Center
Brezina, Vaclav
2012-01-01
This primarily methodological article makes a proposition for linguistic exploration of textual resources available through the "Google Scholar" search engine. These resources ("Google Scholar virtual corpus") are significantly larger than any existing corpus of academic writing. "Google Scholar", however, was not designed for linguistic searches…
Application of neogeographic tools for geochemistry
NASA Astrophysics Data System (ADS)
Zhilin, Denis
2010-05-01
Neogeography is the use of geographical tools by non-expert users. It has been developing rapidly over the last ten years and is founded on (a) the availability of Global Positioning System (GPS) receivers, which allow very precise geographical positions to be obtained, (b) services that link geographical positions with satellite images, for example Google Earth, and (c) programs such as GPS Track Maker or OziExplorer that link geographical coordinates with other raster images (for example, maps). However, the possibilities of the neogeographic approach are much wider. It allows different parameters to be linked with geographical coordinates on the one hand and with a satellite image or map on the other. If a parameter is easy to measure, a large database can be collected in a very short time. The results can be presented in very different ways. One can plot a parameter versus the distance from a particular point (for example, a source of a substance), map the two-dimensional distribution of a parameter, or put the results onto a map or satellite image. In the case of chemical parameters this can help find the source of pollution, trace its influence, and reveal geochemical processes and patterns. The main advantage of the neogeographic approach is the involvement of non-experts in collecting data. Non-experts can now easily measure the electrical conductivity and pH of natural waters, the concentration of different gases in the atmosphere, solar irradiation, radioactivity, and so on. If the results are obtained (for example, by students of secondary schools) and shared, experts can process them and draw significant conclusions. An interface for sharing the results (http://maps.sch192.ru/) was developed by V. Ilyin. Within the interface a user can load a *.csv file with the coordinates, type, and value of a parameter at a particular point. The points are marked on the Google Earth map with a color corresponding to the value of the parameter; the color scale can be edited manually. We would like to show some results of practical and scientific importance obtained by non-experts. In 2006 our secondary school students investigated the distribution of snow salinity around Kosygina Street in Moscow. One can conclude that the distribution of salinity is reproducible and that the street influences the snow up to 150 meters away. Another example obtained by our students is the distribution of the electrical conductivity of swamp water, showing the extreme irregularity of this parameter within a small area (about 0.5 x 0.5 km): the electrical conductivity varied from 22 to 77 uS with no regularity, which points to the key role of local processes in swamp water chemistry. A third example (maps of the electrical conductivity and pH of water over a large area) can be seen at http://fenevo.narod.ru/maps/ec-maps.htm and http://fenevo.narod.ru/maps/ph-maps.htm. Based on these maps one can infer the mechanisms by which water mineralization forms in the area. The availability of GPS receivers and of systems for easily measuring chemical parameters can lead to a neogeochemical revolution, just as GPS receivers led to the neogeographical one. A great number of non-experts can share their geochemical results, forming a huge amount of available geochemical data. This will help to test and visualize concepts of geochemistry and environmental chemistry and, perhaps, develop new ones. Geophysical and biological data could be shared as well, with the same advantages for the corresponding sciences.
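A workflow like the one described, a CSV of point measurements rendered as color-coded placemarks in Google Earth, can be sketched in a few lines of Python. The CSV layout and color ramp below are assumptions for illustration, not the format used by the http://maps.sch192.ru/ interface.

```python
import csv

def value_to_kml_color(value: float, vmin: float, vmax: float) -> str:
    """Linear green-to-red ramp; KML colors are aabbggrr hex."""
    t = 0.0 if vmax <= vmin else (value - vmin) / (vmax - vmin)
    red, green = int(255 * t), int(255 * (1 - t))
    return f"ff00{green:02x}{red:02x}"

def csv_to_kml(csv_path: str, kml_path: str) -> None:
    # Assumed CSV columns: lat, lon, parameter, value
    with open(csv_path, newline="") as f:
        rows = [dict(r, value=float(r["value"])) for r in csv.DictReader(f)]
    vmin = min(r["value"] for r in rows)
    vmax = max(r["value"] for r in rows)
    placemarks = []
    for r in rows:
        color = value_to_kml_color(r["value"], vmin, vmax)
        placemarks.append(
            f"<Placemark><name>{r['parameter']}: {r['value']}</name>"
            f"<Style><IconStyle><color>{color}</color></IconStyle></Style>"
            f"<Point><coordinates>{r['lon']},{r['lat']},0</coordinates></Point></Placemark>"
        )
    with open(kml_path, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>'
                '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
                + "".join(placemarks) + "</Document></kml>")

# csv_to_kml("swamp_conductivity.csv", "swamp_conductivity.kml")
```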
2013-01-01
Background Recent research indicates a high recall in Google Scholar searches for systematic reviews. These reports raised high expectations of Google Scholar as a unified and easy-to-use search interface. However, studies on the coverage of Google Scholar rarely used the search interface in a realistic approach but instead merely checked for the existence of gold standard references. In addition, the severe limitations of the Google Search interface must be taken into consideration when comparing with professional literature retrieval tools. The objectives of this work are to measure the relative recall and precision of searches with Google Scholar under conditions derived from the structured search procedures conventional in scientific literature retrieval, and to provide an overview of the current advantages and disadvantages of the Google Scholar search interface in scientific literature retrieval. Methods General and MEDLINE-specific search strategies were retrieved from 14 Cochrane systematic reviews. The Cochrane systematic review search strategies were translated into Google Scholar search expressions as faithfully as possible while preserving the original search semantics. The references of the studies included in the Cochrane reviews were checked for their inclusion in the result sets of the Google Scholar searches. Relative recall and precision were calculated. Results We investigated Cochrane reviews with between 11 and 70 included references, 396 references in total. The Google Scholar searches produced result sets of between 4,320 and 67,800 hits, 291,190 hits in total. The relative recall of the Google Scholar searches ranged from a minimum of 76.2% to a maximum of 100% (7 searches). The precision of the Google Scholar searches ranged from a minimum of 0.05% to a maximum of 0.92%. The overall relative recall for all searches was 92.9%; the overall precision was 0.13%. Conclusion The reported relative recall must be interpreted with care. It is a quality indicator of Google Scholar confined to an experimental setting which is unavailable in systematic retrieval because of the severe limitations of the Google Scholar search interface. Currently, Google Scholar does not provide elements necessary for systematic scientific literature retrieval, such as tools for incremental query optimization, export of a large number of references, a visual search builder, or a history function. Google Scholar is not ready as a professional search tool for tasks where structured retrieval methodology is necessary. PMID:24160679
Social.Water--Open Source Citizen Science Software for CrowdHydrology
NASA Astrophysics Data System (ADS)
Fienen, M. N.; Lowry, C.
2013-12-01
CrowdHydrology is a crowd-sourced citizen science project in which passersby near streams are encouraged to read a gage and send an SMS (text) message with the water level to a number indicated on a sign. The project initially used free services such as Google Voice, Gmail, and Google Maps to acquire and present the data on the internet. Social.Water is open-source software, written in Python and JavaScript, that automates the acquisition, categorization, and presentation of the data. Open-source objectives pervade both the project and the software: the code is hosted on GitHub, only free scripting tools are used, and any person or organization can install a gage and join the CrowdHydrology network. In the first year, 10 sites were deployed in upstate New York, USA. In the second year, the network expanded to 44 sites throughout the upper Midwest of the USA. Comparison with official USGS and academic measurements has shown low error rates. Citizen participation varies greatly from site to site, so surveys or other social information are being sought for insight into why some sites experience higher rates of participation than others.
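The core of such a pipeline is parsing free-form SMS text into a station identifier and a water level. The message format and station-ID pattern below are assumptions for illustration; they are not taken from the Social.Water codebase.

```python
import re

# Assumed message style: "NY1000 4.25" or "station ny1000 level is 4.3 ft"
PATTERN = re.compile(r"(?P<station>[A-Z]{2}\d{3,4}).*?(?P<level>\d+(?:\.\d+)?)",
                     re.IGNORECASE | re.DOTALL)

def parse_report(body: str):
    """Return (station_id, water_level) or None if the text cannot be parsed."""
    match = PATTERN.search(body)
    if not match:
        return None
    return match.group("station").upper(), float(match.group("level"))

print(parse_report("NY1000 4.25"))                     # ('NY1000', 4.25)
print(parse_report("station ny1000 level is 4.3 ft"))  # ('NY1000', 4.3)
print(parse_report("hello"))                           # None
```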
Muellner, Ulrich J; Vial, Flavie; Wohlfender, Franziska; Hadorn, Daniela; Reist, Martin; Muellner, Petra
2015-01-01
The reporting of outputs from health surveillance systems should be done in a near real-time and interactive manner in order to provide decision makers with powerful means to identify, assess, and manage health hazards as early and efficiently as possible. While this is currently rarely the case in veterinary public health surveillance, reporting tools do exist for the visual exploration and interactive interrogation of health data. In this work, we used tools freely available from the Google Maps and Charts libraries to develop a web application reporting health-related data derived from slaughterhouse surveillance and from a newly established web-based equine surveillance system in Switzerland. Both sets of tools allowed entry-level usage with no or minimal programming skills while being flexible enough to cater for more complex scenarios for users with greater programming skills. In particular, interfaces linking statistical software and Google tools provide additional analytical functionality (such as algorithms for the detection of unusually high case occurrences) for inclusion in the reporting process. We show that such powerful approaches could improve the timely dissemination and communication of technical information to decision makers and other stakeholders and could foster the early-warning capacity of animal health surveillance systems.
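The abstract refers to "algorithms for the detection of unusually high case occurrences" without naming one; a minimal stand-in is a historical-baseline threshold that flags a week whose case count exceeds the mean of previous weeks by more than a chosen number of standard deviations. The sketch below implements that simple rule, not the algorithm used in the Swiss systems, and the counts are hypothetical.

```python
import statistics

def unusually_high(history: list, current: int, k: float = 2.0) -> bool:
    """Flag `current` if it exceeds the historical mean by more than k standard deviations."""
    mean = statistics.mean(history)
    sd = statistics.pstdev(history)
    return current > mean + k * sd

# Hypothetical weekly counts of a slaughterhouse finding.
weekly_counts = [3, 5, 4, 6, 2, 5, 4, 3]
print(unusually_high(weekly_counts, current=12))  # True: worth highlighting on the dashboard
print(unusually_high(weekly_counts, current=5))   # False
```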
Ajax Architecture Implementation Techniques
NASA Astrophysics Data System (ADS)
Hussaini, Syed Asadullah; Tabassum, S. Nasira; Baig, Tabassum, M. Khader
2012-03-01
Today's rich Web applications use a mix of JavaScript and asynchronous communication with the application server. This mechanism is known as Ajax: Asynchronous JavaScript and XML. The intent of Ajax is to exchange small pieces of data between the browser and the application server, and in doing so to use partial page refreshes instead of reloading the entire Web page. AJAX is a powerful Web development model for browser-based Web applications. The technologies that form the AJAX model, such as XML, JavaScript, HTTP, and XHTML, are individually widely used and well known. However, AJAX combines these technologies to let Web pages retrieve small amounts of data from the server without having to reload the entire page. This capability makes Web pages more interactive and lets them behave like local applications. Web 2.0, enabled by the Ajax architecture, has given rise to a new level of user interactivity through web browsers. Many new and extremely popular Web applications have been introduced, such as Google Maps, Google Docs, and Flickr. Ajax toolkits such as Dojo allow web developers to build Web 2.0 applications quickly and with little effort.
A Web-Based Information System for Field Data Management
NASA Astrophysics Data System (ADS)
Weng, Y. H.; Sun, F. S.
2014-12-01
A web-based field data management system has been designed and developed to allow field geologists to store, organize, manage, and share field data online. System requirements were first analyzed and clearly defined regarding what data are to be stored, who the potential users are, and what system functions are needed in order to deliver the right data in the right way to the right user. A 3-tiered architecture was adopted to create this secure, scalable system, which consists of a web browser at the front end, a database at the back end, and a functional logic server in the middle. Specifically, HTML, CSS, and JavaScript were used to implement the user interface in the front-end tier, an Apache web server running PHP scripts forms the middle tier, and a MySQL server provides the back-end database. The system accepts various types of field information, including image, audio, video, numeric, and text data. It allows users to select data and plot them on either Google Earth or Google Maps for the examination of spatial relations. It also makes the sharing of field data easy by converting them into an XML format that is both human-readable and machine-readable, and thus ready for reuse.
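To illustrate the XML-export step, the sketch below serializes a single field record into XML with Python's standard library. The field names are hypothetical; the actual schema used by the system is not described in the abstract.

```python
import xml.etree.ElementTree as ET

def record_to_xml(record: dict) -> str:
    """Serialize a flat field record (hypothetical keys) into a small XML document."""
    root = ET.Element("fieldRecord")
    for key, value in record.items():
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

record = {
    "site": "Outcrop-12",
    "latitude": 41.0712,
    "longitude": -81.5123,
    "lithology": "cross-bedded sandstone",
    "photo": "outcrop12_ne_face.jpg",
}
print(record_to_xml(record))
```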
Crime event 3D reconstruction based on incomplete or fragmentary evidence material--case report.
Maksymowicz, Krzysztof; Tunikowski, Wojciech; Kościuk, Jacek
2014-09-01
Using our own experience in 3D analysis, the authors demonstrate the possibilities of 3D crime scene and event reconstruction in cases where the originally collected material evidence is largely insufficient. The necessity to repeat a forensic evaluation often stems from the emergence of new facts in the course of case proceedings. Even in cases where a crime scene and its surroundings have undergone partial or complete transformation of elements significant to the case, or where the scene was not satisfactorily secured, it is still possible to reconstruct it in a 3D environment based on the originally collected, even incomplete, material evidence. In particular cases where no image of the crime scene is available, its partial or even full reconstruction is still potentially feasible, and such a reconstruction can still satisfy the evidentiary requirements in court. Reconstruction of the missing elements of the crime scene is possible with the use of information obtained from current, publicly available databases. In this study, we demonstrate that these can include Google Maps®, Google Street View® and available construction and architecture archives. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
A spatio-temporal landslide inventory for the NW of Spain: BAPA database
NASA Astrophysics Data System (ADS)
Valenzuela, Pablo; Domínguez-Cuesta, María José; Mora García, Manuel Antonio; Jiménez-Sánchez, Montserrat
2017-09-01
A landslide database has been created for the Principality of Asturias, NW Spain: the BAPA (Base de datos de Argayos del Principado de Asturias - Principality of Asturias Landslide Database). Data collection is mainly performed through searching local newspaper archives. Moreover, a BAPA App and a BAPA website (http://geol.uniovi.es/BAPA) have been developed to obtain additional information from citizens and institutions. Presently, the dataset covers the period 1980-2015, recording 2063 individual landslides. The use of free cartographic servers, such as Google Maps, Google Street View and Iberpix (Government of Spain), combined with the spatial descriptions and pictures contained in the press news, makes it possible to assess different levels of spatial accuracy. In the database, 59% of the records show an exact spatial location, and 51% of the records provided accurate dates, showing the usefulness of press archives as temporal records. Thus, 32% of the landslides show the highest spatial and temporal accuracy levels. The database also gathers information about the type and characteristics of the landslides, the triggering factors and the damage and costs caused. Field work was conducted to validate the methodology used in assessing the spatial location, temporal occurrence and characteristics of the landslides.
IRIS Earthquake Browser with Integration to the GEON IDV for 3-D Visualization of Hypocenters.
NASA Astrophysics Data System (ADS)
Weertman, B. R.
2007-12-01
We present a new generation of web-based earthquake query tool: the IRIS Earthquake Browser (IEB). The IEB combines the DMC's large set of earthquake catalogs (provided by USGS/NEIC, ISC, and the ANF) with the popular Google Maps web interface. With the IEB you can quickly and easily find earthquakes in any region of the globe. Using Google's detailed satellite images, earthquakes can easily be co-located with natural geographic features such as volcanoes as well as man-made features such as commercial mines. A set of controls allows earthquakes to be filtered by time, magnitude, and depth range, as well as by catalog name, contributor name, and magnitude type. Displayed events can easily be exported in NetCDF format into the GEON Integrated Data Viewer (IDV), where hypocenters may be visualized in three dimensions. Looking "under the hood", the IEB is based on AJAX technology and utilizes REST-style web services hosted at the IRIS DMC. The IEB is part of a broader effort at the DMC aimed at making our data holdings available via web services. The IEB is useful both educationally and as a research tool.
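The abstract does not spell out the IEB's service endpoints, so as a stand-in the sketch below queries the publicly documented USGS FDSN event web service, which serves comparable catalog data in GeoJSON; treat the URL and parameters as an assumption for illustration rather than the IEB's own API.

```python
import requests

# Assumed endpoint: USGS FDSN event service (GeoJSON output).
URL = "https://earthquake.usgs.gov/fdsnws/event/1/query"
params = {
    "format": "geojson",
    "starttime": "2008-10-01",
    "endtime": "2008-11-01",
    "minmagnitude": 5.0,
    "minlatitude": 25, "maxlatitude": 40,
    "minlongitude": 60, "maxlongitude": 75,
}
events = requests.get(URL, params=params, timeout=30).json()
for feat in events["features"][:5]:
    props, (lon, lat, depth) = feat["properties"], feat["geometry"]["coordinates"]
    print(f"M{props['mag']:.1f}  {props['place']}  depth {depth} km  ({lat:.2f}, {lon:.2f})")
```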
Retrieving Clinical Evidence: A Comparison of PubMed and Google Scholar for Quick Clinical Searches
Bejaimal, Shayna AD; Sontrop, Jessica M; Iansavichus, Arthur V; Haynes, R Brian; Weir, Matthew A; Garg, Amit X
2013-01-01
Background Physicians frequently search PubMed for information to guide patient care. More recently, Google Scholar has gained popularity as another freely accessible bibliographic database. Objective To compare the performance of searches in PubMed and Google Scholar. Methods We surveyed nephrologists (kidney specialists) and provided each with a unique clinical question derived from 100 renal therapy systematic reviews. Each physician provided the search terms they would type into a bibliographic database to locate evidence to answer the clinical question. We executed each of these searches in PubMed and Google Scholar and compared results for the first 40 records retrieved (equivalent to 2 default search pages in PubMed). We evaluated the recall (proportion of relevant articles found) and precision (ratio of relevant to nonrelevant articles) of the searches performed in PubMed and Google Scholar. Primary studies included in the systematic reviews served as the reference standard for relevant articles. We further documented whether relevant articles were available as free full-texts. Results Compared with PubMed, the average search in Google Scholar retrieved twice as many relevant articles (PubMed: 11%; Google Scholar: 22%; P<.001). Precision was similar in both databases (PubMed: 6%; Google Scholar: 8%; P=.07). Google Scholar provided significantly greater access to free full-text publications (PubMed: 5%; Google Scholar: 14%; P<.001). Conclusions For quick clinical searches, Google Scholar returns twice as many relevant articles as PubMed and provides greater access to free full-text articles. PMID:23948488
Hanna, Alan; Hanna, Lezley-Anne
2018-03-30
The aim was to provide a comprehensive overview (using pertinent examples) of the various ways that Google Trends and Google data could inform pharmacy practice. The objectives were to: examine what type of information people search for in relation to a common class of medicines; ascertain where people are directed to (websites) following an initial search for a medicine or medical condition; and establish information about when they search. The methodology differed depending on whether Google Trends or Google was being interrogated, but the search domain was always limited to the United Kingdom. Google Trends was queried, typically for a 5-year time frame, and data downloaded for many search inputs relating to medical conditions (self-treatable and non-self-treatable) and medicines (bought over-the-counter and prescribed). Google was queried and data collected for searches related to 'antibiotics'. Google Trends revealed a previously unknown seasonality pattern for irritable bowel syndrome. Related searches for 'antibiotics' revealed a high level of interest in the appropriateness of concomitant alcohol consumption and queries about what antibiotics are. Largely, people were being directed to reputable websites following their initial search input about a prescription-only medicine. However, searches for over-the-counter medicines were more likely to lead to commercial domains. This is one of the first studies to investigate use of Google Trends and Google in a pharmacy-specific context. It is relevant for practice as it could inform marketing strategies, public health policy and help tailor patient advice and counselling. © 2018 Royal Pharmaceutical Society.
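For readers who want to reproduce this kind of query programmatically rather than through the Google Trends web page, the sketch below uses pytrends, an unofficial third-party Python client for Google Trends; the library, its call signatures, and the example search term are assumptions for illustration and are not part of the study's methodology.

```python
# pip install pytrends pandas
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-GB", tz=0)

# UK-only interest over the last five years, mirroring the paper's search domain.
pytrends.build_payload(kw_list=["irritable bowel syndrome"], timeframe="today 5-y", geo="GB")
interest = pytrends.interest_over_time()

# Monthly means make a seasonal pattern easier to spot by eye.
monthly = interest["irritable bowel syndrome"].resample("M").mean()
print(monthly.tail(12))
```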
Accuracy of remote chest X-ray interpretation using Google Glass technology.
Spaedy, Emily; Christakopoulos, Georgios E; Tarar, Muhammad Nauman J; Christopoulos, Georgios; Rangan, Bavana V; Roesle, Michele; Ochoa, Cristhiaan D; Yarbrough, William; Banerjee, Subhash; Brilakis, Emmanouil S
2016-09-15
We sought to explore the accuracy of remote chest X-ray reading using hands-free, wearable technology (Google Glass, Google, Mountain View, California). We compared interpretation of twelve chest X-rays with 23 major cardiopulmonary findings by faculty and fellows from cardiology, radiology, and pulmonary-critical care via: (1) viewing the chest X-ray image on the Google Glass screen; (2) viewing a photograph of the chest X-ray taken using Google Glass and interpreted on a mobile device; (3) viewing the original chest X-ray on a desktop computer screen. One point was given for identification of each correct finding and a subjective rating of user experience was recorded. Fifteen physicians (5 faculty and 10 fellows) participated. The average chest X-ray reading score (maximum 23 points) as viewed through the Google Glass, Google Glass photograph on a mobile device, and the original X-ray viewed on a desktop computer was 14.1±2.2, 18.5±1.5 and 21.3±1.7, respectively (p<0.0001 between Google Glass and mobile device, p<0.0001 between Google Glass and desktop computer and p=0.0004 between mobile device and desktop computer). Of 15 physicians, 11 (73.3%) felt confident in detecting findings using the photograph taken by Google Glass as viewed on a mobile device. Remote chest X-ray interpretation using hands-free, wearable technology (Google Glass) is less accurate than interpretation using a desktop computer or a mobile device, suggesting that further technical improvements are needed before widespread application of this novel technology. Published by Elsevier Ireland Ltd.
2009-07-19
Brian McLendon, VP of Engineering, Google, Inc., speaks during a press conference, Monday, July 20, 2009, announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon, accessible within Google Earth 5.0, Monday, July 20, 2009, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)
2009-07-19
Alan Eustace, Senior VP of Engineering and Research, Google, Inc., speaks during a press conference, Monday, July 20, 2009, announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon, accessible within Google Earth 5.0, Monday, July 20, 2009, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)
The Google Online Marketing Challenge and Research Opportunities
ERIC Educational Resources Information Center
Neale, Larry; Treiblmaier, Horst; Henderson, Vani; Hunter, Lee; Hudson, Karen; Murphy, Jamie
2009-01-01
The Google Online Marketing Challenge is an ongoing collaboration between Google and academics, to give students experiential learning. The Challenge gives student teams US$200 in AdWords, Google's flagship advertising product, to develop online marketing campaigns for actual businesses. The end result is an engaging in-class exercise that…
Where Did Google Get Its Value?
ERIC Educational Resources Information Center
Caufield, James
2005-01-01
Google's extraordinary success is usually attributed to innovative technology and new business models. By contrast, this paper argues that Google's success is mostly due to its adoption of certain library values. First, Google has refused to adopt the standard practices of the search engine business, practices that compromised service to the user…
Putting Google Scholar to the Test: A Preliminary Study
ERIC Educational Resources Information Center
Robinson, Mary L.; Wusteman, Judith
2007-01-01
Purpose: To describe a small-scale quantitative evaluation of the scholarly information search engine, Google Scholar. Design/methodology/approach: Google Scholar's ability to retrieve scholarly information was compared to that of three popular search engines: Ask.com, Google and Yahoo! Test queries were presented to all four search engines and…
ERIC Educational Resources Information Center
Wang, Xueli
2016-01-01
This research focuses on course-taking patterns of beginning community college students enrolled in one or more non-remedial science, technology, engineering, and mathematics (STEM) courses during their first year of college, and how these patterns are mapped against upward transfer in STEM fields of study. Drawing upon postsecondary transcript…
Reaching the Next Generation of College Students via Their Digital Devices.
NASA Astrophysics Data System (ADS)
Whitmeyer, S. J.; De Paor, D. G.; Bentley, C.
2015-12-01
Current college students attended school during a decade in which many school districts banned cellphones from the classroom or even from school grounds. These students are used to being told to put away their mobile devices and concentrate on traditional classroom activities such as watching PowerPoint presentations or calculating with pencil and paper. However, due to a combination of parental security concerns and recent education research, schools are rapidly changing policy and embracing mobile devices for ubiquitous learning opportunities inside and outside of the classroom. Consequently, many of the next generation of college students will have expectations of learning via mobile technology. We have developed a range of digital geology resources to aid mobile-based geoscience education at college level, including mapping on iPads and other tablets, "crowd-sourced" field projects, augmented reality-supported asynchronous field classes, 3D and 4D split-screen virtual reality tours, macroscopic and microscopic gigapixel imagery, 360° panoramas, assistive devices for inclusive field education, and game-style educational challenges. Class testing of virtual planetary tours shows modest short-term learning gains, but more work is needed to ensure long-term retention. Many of our resources rely on the Google Earth browser plug-in and application program interface (API). Because of security concerns, browser plug-ins in general are being phased out and the Google Earth API will not be supported in future browsers. However, a new plug-in-free API is promised by Google and an alternative open-source virtual globe called Cesium is undergoing rapid development. It already supports the main aspects of Keyhole Markup Language and has features of significant benefit to geoscience, including full support on mobile devices and sub-surface viewing and touring. The research team includes: Heather Almquist, Stephen Burgin, Cinzia Cervato, Filis Coba, Chloe Constants, Gene Cooper, Mladen Dordevic, Marissa Dudek, Brandon Fitzwater, Bridget Gomez, Tyler Hansen, Paul Karabinos, Terry Pavlis, Jen Piatek, Alan Pitts, Robin Rohrback, Bill Richards, Caroline Robinson, Jeff Rollins, Jeff Ryan, Ron Schott, Kristen St. John, and Barb Tewksbury. Supported by NSF DUE 1323419 and by Google Geo Curriculum Awards.
Development of Waypoint Planning Tool in Response to NASA Field Campaign Challenges
NASA Technical Reports Server (NTRS)
He, Matt; Hardin, Danny; Mayer, Paul; Blakeslee, Richard; Goodman, Michael
2012-01-01
Airborne real-time observations are a major component of NASA's Earth Science research and satellite ground validation studies. Multiple aircraft are involved in most NASA field campaigns. The coordination of the aircraft with satellite overpasses, other airplanes, and the constantly evolving, dynamic weather conditions often determines the success of the campaign. Planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real-time situational awareness of the weather conditions that affect the aircraft track. A flight planning tool is needed to provide situational awareness information to the mission scientists and help them plan and modify the flight tracks. Scientists at the University of Alabama in Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map filled with real-time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analysis during and after each campaign helped identify both issues and new requirements, and initiated the next wave of development. The Waypoint Planning Tool has now gone through three rounds of development and analysis. Its development has been directly shaped by advances in GIS and mapping technologies: from the standalone Google Earth application and simple KML functionality, to the Google Earth Plugin on a web platform, and on to the rising open-source GIS tools with new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives.
Villa-González, Emilio; Rodríguez-López, Carlos; Barranco-Ruiz, Yaira; Cabezas-Arévalo, Luis Fabián; Chillón, Palma
2016-06-30
Objective: to analyze the agreement between two measurement methods (Google Maps™ vs. Geographic Information System) for determining the distance from the family home to school. Methods: a total of 542 schoolchildren aged 8-11 years (mean = 9.36 ± 0.6) from southern Spain participated in the study, providing their home address. The distance from the family home to school was calculated using two different programs, Google Maps™ and a Geographic Information System (GIS), both along the route and in a straight line. The association between the two methods was analyzed using Spearman correlation, and the degree of agreement was assessed with the intraclass correlation coefficient (ICC) and the Bland-Altman method. Results: the correlation between the two measurement methods was highly significant (r = 0.966, p < 0.001; r = 0.984, p < 0.001; and r = 0.954, p < 0.001, respectively), and agreement was excellent (ICC = 0.96, p < 0.001; ICC = 0.92, p < 0.001; ICC = 0.97, p < 0.001). Conclusions: given their high agreement, either measurement method could be used depending on the needs of the research. However, the route-based Geographic Information System is recommended when resources and funding allow, as it is a method of established reliability and validity.
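The Bland-Altman agreement statistic reported above is straightforward to compute: the bias is the mean of the paired differences and the 95% limits of agreement are the bias ± 1.96 standard deviations. The sketch below shows the calculation with made-up distances rather than the study's data.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Return bias and 95% limits of agreement between two paired measurement methods."""
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical home-to-school distances in metres (Google Maps route vs. GIS route).
gmaps_m = [350, 820, 1210, 540, 2300, 760]
gis_m   = [360, 800, 1195, 555, 2340, 750]
bias, limits = bland_altman(gmaps_m, gis_m)
print(f"bias = {bias:.1f} m, limits of agreement = ({limits[0]:.1f}, {limits[1]:.1f}) m")
```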
NASA Astrophysics Data System (ADS)
Yen, Y.-N.; Wu, Y.-W.; Weng, K.-H.
2013-07-01
E-learning-assisted teaching and learning is a trend of the 21st century and has many advantages: freedom from the constraints of time and space, and hypertext- and multimedia-rich resources that enhance the interaction between students and the teaching materials. The purpose of this study is to explore how rich Internet resources assisted students in a Western Architectural History course. First, we explored the Internet resources that could assist teaching and learning activities. Second, according to the course objectives, we built a web-based platform that integrated Google spreadsheet forms, the SIMILE widget, Wikipedia, and Google Maps, and applied it to the Western Architectural History course. Finally, action research was applied to understand the effectiveness of this teaching and learning mode. The participants were students of the Department of Architecture at a private university of technology in northern Taiwan. The results showed that students were willing to use the web-based platform to assist their learning. They found the platform useful in understanding the relationships between buildings of different periods. Through the map view, the platform also helped students expand their international perspective. However, we found that the information shared by students via the Internet was not always correct; one possible reason is that students could easily acquire information on the Internet but could not judge its correctness. To conclude, this study found some useful and rich resources that could be well integrated, from which we built a web-based platform to collect information and present it in diverse modes to stimulate students' learning motivation. We recommend that future studies consider hiring teaching assistants in order to ease the burden on teachers and to assist in maintaining information quality.
Data-Driven Geospatial Visual Analytics for Real-Time Urban Flooding Decision Support
NASA Astrophysics Data System (ADS)
Liu, Y.; Hill, D.; Rodriguez, A.; Marini, L.; Kooper, R.; Myers, J.; Wu, X.; Minsker, B. S.
2009-12-01
Urban flooding is responsible for the loss of life and property as well as the release of pathogens and other pollutants into the environment. Previous studies have shown that the spatial distribution of intense rainfall significantly affects the triggering and behavior of urban flooding. However, no general-purpose tools yet exist for deriving rainfall data and rendering them in real time at the resolution of the hydrologic units used for analyzing urban flooding. This paper presents a new visual analytics system that derives and renders rainfall data from the NEXRAD weather radar system at the sewershed (i.e., urban hydrologic unit) scale in real time for a Chicago stormwater management project. We introduce a lightweight Web 2.0 approach that takes advantage of the scientific workflow management and publishing capabilities developed at NCSA (National Center for Supercomputing Applications), a streaming-data-aware semantic content management repository, web-based Google Earth/Maps, and time-aware KML (Keyhole Markup Language). A collection of polygon-based virtual sensors is created from the NEXRAD Level II data using spatial, temporal, and thematic transformations at the sewershed level in order to produce persistent virtual rainfall data sources for the animation. An animated, color-coded rainfall map of the sewersheds can be played in real time as a movie, using time-aware KML inside the browser-based Google Earth, for visually analyzing the spatiotemporal patterns of rainfall intensity. Such a system provides valuable information for situational awareness and improved decision support during extreme storm events in an urban area. Our further work includes incorporating additional data (such as basement flooding event data) or physics-based predictive models that can be used for more integrated data-driven decision support.
Phan, Thanh G; Beare, Richard; Chen, Jian; Clissold, Benjamin; Ly, John; Singhal, Shaloo; Ma, Henry; Srikanth, Velandai
2017-05-01
There is great interest in how endovascular clot retrieval hubs provide services to a population. We applied a computational method to objectively generate service boundaries, defined by traveling time to hub, for such endovascular clot retrieval hubs. Stroke incidence data were merged with population census data to estimate the numbers of strokes in metropolitan Melbourne, Australia. Traveling times from randomly generated addresses to 4 endovascular clot retrieval-capable hubs (Royal Melbourne Hospital [RMH], Monash Medical Center [MMC], Alfred Hospital [ALF], and Austin Hospital [AUS]) were estimated using the Google Maps application program interface. Boundary maps were generated based on traveling time at various times of day for combinations of hubs. In a 2-hub model, catchment was best distributed when RMH was paired with MMC (model 1a, RMH 1765 km² and MMC 1164 km²) or with AUS (model 1c, RMH 1244 km² and AUS 1685 km²), with no statistical difference between models (P=0.20). Catchment was poorly distributed when RMH was paired with ALF (model 1b, RMH 2252 km² and ALF 676 km²), significantly different from both models 1a and 1c (both P<0.05). Model 1a had the greatest proportion of patients arriving within the ideal time of 30 minutes, followed by model 1c (P<0.001). In a 3-hub model, the combination of RMH, MMC, and AUS was superior to that of RMH, MMC, and ALF in catchment distribution and travel time. The method was also successfully applied to the city of Adelaide, demonstrating wider applicability. We provide proof of concept for a novel computational method to objectively designate service boundaries for endovascular clot retrieval hubs. © 2017 American Heart Association, Inc.
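Travel-time estimation of this kind can be scripted against the Google Maps Distance Matrix web service; the request below is a generic illustration (the exact service, parameters, and billing setup used by the authors are not stated in the abstract, and the API key and addresses are placeholders).

```python
import requests

URL = "https://maps.googleapis.com/maps/api/distancematrix/json"
params = {
    "origins": "-37.8500,145.0500",            # a randomly generated point
    "destinations": "Royal Melbourne Hospital, Parkville VIC|"
                    "Monash Medical Centre, Clayton VIC",
    "mode": "driving",
    "departure_time": "now",                   # traffic-aware estimate
    "key": "YOUR_API_KEY",                     # placeholder
}
resp = requests.get(URL, params=params, timeout=30).json()

for destination, element in zip(params["destinations"].split("|"),
                                resp["rows"][0]["elements"]):
    if element.get("status") == "OK":
        print(destination, "->", element["duration"]["text"])
```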
Data Visualization of Lunar Orbiter KAGUYA (SELENE) using web-based GIS
NASA Astrophysics Data System (ADS)
Okumura, H.; Sobue, S.; Yamamoto, A.; Fujita, T.
2008-12-01
The Japanese lunar orbiter KAGUYA (SELENE) was launched on September 14, 2007, and started nominal observation on December 21, 2007. KAGUYA carries 15 observation instruments and is obtaining various physical quantity data for the Moon, such as elemental abundance, mineralogical composition, geological features, magnetic field, and gravity field. We are working on the visualization of these data and their application to web-based GIS. Our purpose in data visualization is the promotion of science and of education and public outreach (EPO). For scientific usage and public outreach, we have already constructed a KAGUYA Web Map Server (WMS) at the JAXA Sagamihara Campus and have begun testing it within the KAGUYA project. The KAGUYA science team plans integrated science using the data of multiple instruments, with the aim of obtaining new findings on the origin and evolution of the Moon. In such integrated studies, scientists have to access, compare, and analyze various types of data with different resolutions. A web-based GIS allows users to map, overlay, and share the data and information easily, so it is a natural platform for this work, and we are developing the KAGUYA WMS as the platform for KAGUYA integrated science. For EPO, we are customizing NASA World Wind (NWW) Java for KAGUYA, supported by the NWW project. Users will be able to search and view many KAGUYA images and movies in NWW Java in an easy and attractive way. In addition, we are considering supplying KAGUYA images to Google Moon in KML format and adding KAGUYA movies to Google/YouTube.
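A WMS such as the one described answers standard OGC GetMap requests; the sketch below builds one with Python. The server address and layer name are placeholders, since the abstract does not publish them; only the request parameters follow the WMS 1.1.1 convention.

```python
import requests

WMS_BASE = "https://example.jaxa.jp/kaguya/wms"   # placeholder endpoint
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "kaguya_mi_mosaic",                 # hypothetical layer name
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "-10,-10,10,10",                      # lon_min, lat_min, lon_max, lat_max
    "WIDTH": 512,
    "HEIGHT": 512,
    "FORMAT": "image/png",
}
response = requests.get(WMS_BASE, params=params, timeout=60)
with open("kaguya_tile.png", "wb") as f:
    f.write(response.content)
```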
Undergraduate Course on Global Concerns
NASA Astrophysics Data System (ADS)
Richard, G. A.; Weidner, D. J.
2008-12-01
GEO 311: Geoscience and Global Concerns is an undergraduate course taught at Stony Brook University during each fall semester. The class meets twice per week, with one session consisting of a lecture and the other, an interactive activity in a computer laboratory that engages the students in exploring real world problems. A specific concern or issue serves as a focus during each session. The students are asked to develop answers to a series of questions that engage them in identifying causes of the problem, connections with the Earth system, relationships to other problems, and possible solutions on both a global and local scale. The questions are designed to facilitate an integrated view of the Earth system. Examples of topics that the students explore during the laboratory sessions are: 1) fossil fuel reserves and consumption rates and the effect of their use on climate, 2) alternative sources of energy and associated technologies, such as solar photovoltaics, nuclear energy, tidal power, geothermal energy, and wind power, 3) effects of tsunamis and earthquakes on human populations and infrastructure, 4) climate change, and 5) hurricanes and storms. The selection and scheduling of topics often takes advantage of the occurrence of media attention or events that can serve as case studies. Tools used during the computer sessions include Google Earth, ArcGIS, spreadsheets, and web sites that offer data and maps. The students use Google Earth or ArcGIS to map events such as earthquakes, storms, tsunamis, and changes in the extent of polar ice. Spreadsheets are employed to discern trends in fossil fuel supply and consumption, and to experiment with models that make predictions for the future. We present examples of several of these activities and discuss how they facilitate an understanding of interrelationships within the Earth system.
Global trends in the awareness of sepsis: insights from search engine data between 2012 and 2017.
Jabaley, Craig S; Blum, James M; Groff, Robert F; O'Reilly-Shah, Vikas N
2018-01-17
Sepsis is an established global health priority with high mortality that can be curtailed through early recognition and intervention; as such, efforts to raise awareness are potentially impactful and increasingly common. We sought to characterize trends in the awareness of sepsis by examining temporal, geographic, and other changes in search engine utilization for sepsis information-seeking online. Using time series analyses and mixed descriptive methods, we retrospectively analyzed publicly available global usage data reported by Google Trends (Google, Palo Alto, CA, USA) concerning web searches for the topic of sepsis between 24 June 2012 and 24 June 2017. Google Trends reports aggregated and de-identified usage data for its search products, including interest over time, interest by region, and details concerning the popularity of related queries where applicable. Outlying epochs of search activity were identified using autoregressive integrated moving average modeling with transfer functions. We then identified awareness campaigns and news media coverage that correlated with epochs of significantly heightened search activity. A second-order autoregressive model with transfer functions was specified following preliminary outlier analysis. Nineteen significant outlying epochs above the modeled baseline were identified in the final analysis that correlated with 14 awareness and news media events. Our model demonstrated that the baseline level of search activity increased in a nonlinear fashion. A recurrent cyclic increase in search volume beginning in 2012 was observed that correlates with World Sepsis Day. Numerous other awareness and media events were correlated with outlying epochs. The average worldwide search volume for sepsis was less than that of influenza, myocardial infarction, and stroke. Analyzing aggregate search engine utilization data has promise as a mechanism to measure the impact of awareness efforts. Heightened information-seeking about sepsis occurs in close proximity to awareness events and relevant news media coverage. Future work should focus on validating this approach in other contexts and comparing its results to traditional methods of awareness campaign evaluation.
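For readers who want to reproduce the general approach, the sketch below pulls a weekly Google Trends series with the pytrends library and flags outlying weeks from the residuals of a simple ARIMA fit; it uses the plain search term rather than the Freebase topic code, and a residual threshold instead of the authors' transfer-function model, so it is only a simplified approximation of their analysis.

```python
import numpy as np
from pytrends.request import TrendReq
from statsmodels.tsa.arima.model import ARIMA

# Weekly worldwide interest for the search term "sepsis" over the study window
pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(["sepsis"], timeframe="2012-06-24 2017-06-24")
series = pytrends.interest_over_time()["sepsis"].astype(float)

# Simple second-order autoregressive fit (no transfer functions, unlike the paper)
fit = ARIMA(series, order=(2, 1, 0)).fit()
resid = fit.resid

# Flag epochs whose residual exceeds three standard deviations above baseline
threshold = 3 * resid.std()
outlying_weeks = series.index[resid > threshold]
print(outlying_weeks)
```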
Seaside, Oregon, Tsunami Pilot Study-Modernization of FEMA Flood Hazard Maps: GIS Data
Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.
2006-01-01
Introduction: The Federal Emergency Management Agency (FEMA) Federal Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study (Chowdhury and others, 2005). Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Analysis (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines (Tsunami Pilot Study Working Group, 2006). The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey, and the National Oceanic and Atmospheric Administration (NOAA), in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. We present the spatial (geographic information system, GIS) data from the pilot study in standard GIS formats and provide files for visualization in Google Earth, a global map viewer.
Mashup Scheme Design of Map Tiles Using Lightweight Open Source Webgis Platform
NASA Astrophysics Data System (ADS)
Hu, T.; Fan, J.; He, H.; Qin, L.; Li, G.
2018-04-01
To address the difficulty of integrating multi-source image data with existing commercial Geographic Information System platforms, this research proposes loading multi-source local tile data based on CesiumJS and examines the tile data organization mechanisms and spatial reference differences of the CesiumJS platform, as well as various tile data sources such as Google Maps, Map World, and Bing Maps. Two types of tile-loading schemes have been designed for the mashup of tiles: a single-data-source loading scheme and a multi-data-source loading scheme. The multi-source digital map tiles used in this paper cover two mainstream spatial references, the WGS84 coordinate system and the Web Mercator coordinate system. According to the experimental results, the single-data-source loading scheme and the multi-data-source loading scheme with a common spatial reference both showed favorable visualization effects; however, the multi-data-source scheme was prone to tile image deformation when loading tile data with different spatial references. The resulting method provides a low-cost and highly flexible solution for small and medium-scale GIS programs and has practical application value. The deformation that occurs when transitioning between different spatial references is an important topic for further research.
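To make the spatial-reference issue concrete, the following sketch computes the tile indices of the same point under the Web Mercator (Google/Bing-style) XYZ scheme and under a simple WGS84 geographic tiling; the two-tile-wide root level assumed for the geographic scheme is an illustration of a common layout, not necessarily the exact scheme used by any particular provider.

```python
import math

def mercator_tile(lat_deg, lon_deg, zoom):
    """XYZ tile indices in the Web Mercator scheme (Google/Bing-style)."""
    lat = math.radians(lat_deg)
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.log(math.tan(lat) + 1.0 / math.cos(lat)) / math.pi) / 2.0 * n)
    return x, y

def geographic_tile(lat_deg, lon_deg, zoom):
    """Tile indices in a simple WGS84 geographic (plate carree) scheme,
    assumed here to be two root tiles wide at zoom 0."""
    n_x = 2 ** (zoom + 1)   # assumption: 2x1 root tiling
    n_y = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n_x)
    y = int((90.0 - lat_deg) / 180.0 * n_y)
    return x, y

if __name__ == "__main__":
    lat, lon, z = 39.9, 116.4, 8   # example point near Beijing
    print("Web Mercator tile:", mercator_tile(lat, lon, z))
    print("WGS84 geographic tile:", geographic_tile(lat, lon, z))
```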
Google in the Research and Teaching of Instruction Librarians
ERIC Educational Resources Information Center
Sorensen, Charlene; Dahl, Candice
2008-01-01
This exploratory study assesses the differences and similarities between how instruction librarians in Western Canada use Google and how they instruct students to use it. Survey results indicate that these librarians do use Google but can be influenced by faculty to present Google negatively to students. (Contains 4 figures and 1 table.)
Google Docs in an Out-of-Class Collaborative Writing Activity
ERIC Educational Resources Information Center
Zhou, Wenyi; Simpson, Elizabeth; Domizi, Denise Pinette
2012-01-01
Google Docs, an online word processing application, is a promising tool for collaborative learning. However, many college instructors and students lack knowledge to effectively use Google Docs to enhance teaching and learning. Goals of this study include (1) assessing the effectiveness of using Google Docs in an out-of-class collaborative writing…
Tag Team Tech: What Makes Google Tick.
ERIC Educational Resources Information Center
Janes, Joseph
2002-01-01
The Google search engine is growing in popularity and usually shines in performance ratings. This article summarizes findings from a technical paper written by Google's developers in 1998 before anyone had seen Google. Suggests that a careful reader of the paper will be rewarded with a deeper appreciation of the designers' ideas and…
Hedging the Commons: Google Books, Libraries, and Open Access to Knowledge
ERIC Educational Resources Information Center
Bottando, Evelyn
2012-01-01
This dissertation analyzes the legal, social, technological, and cultural environment that gave rise to Google's library partnership program in order to propose an institutional corrective to Google's project to digitize cultural heritage. Interview research done with those actively involved with Google's project revealed the need for a history of…
Flipping the Online Classroom with Web 2.0: The Asynchronous Workshop
ERIC Educational Resources Information Center
Cummings, Lance
2016-01-01
This article examines how Web 2.0 technologies can be used to "flip" the online classroom by creating asynchronous workshops in social environments where immediacy and social presence can be maximized. Using experience teaching several communication and writing classes in Google Apps (Google+, Google Hangouts, Google Drive, etc.), I…
A Virtual Tour of the 1868 Hayward Earthquake in Google EarthTM
NASA Astrophysics Data System (ADS)
Lackey, H. G.; Blair, J. L.; Boatwright, J.; Brocher, T.
2007-12-01
The 1868 Hayward earthquake has been overshadowed by the subsequent 1906 San Francisco earthquake that destroyed much of San Francisco. Nonetheless, a modern recurrence of the 1868 earthquake would cause widespread damage to the densely populated Bay Area, particularly in the east Bay communities that have grown up virtually on top of the Hayward fault. Our concern is heightened by paleoseismic studies suggesting that the recurrence interval for the past five earthquakes on the southern Hayward fault is 140 to 170 years. Our objective is to build an educational web site that illustrates the cause and effect of the 1868 earthquake drawing on scientific and historic information. We will use Google EarthTM software to visually illustrate complex scientific concepts in a way that is understandable to a non-scientific audience. This web site will lead the viewer from a regional summary of the plate tectonics and faulting system of western North America, to more specific information about the 1868 Hayward earthquake itself. Text and Google EarthTM layers will include modeled shaking of the earthquake, relocations of historic photographs, reconstruction of damaged buildings as 3-D models, and additional scientific data that may come from the many scientific studies conducted for the 140th anniversary of the event. Earthquake engineering concerns will be stressed, including population density, vulnerable infrastructure, and lifelines. We will also present detailed maps of the Hayward fault, measurements of fault creep, and geologic evidence of its recurrence. Understanding the science behind earthquake hazards is an important step in preparing for the next significant earthquake. We hope to communicate to the public and students of all ages, through visualizations, not only the cause and effect of the 1868 earthquake, but also modern seismic hazards of the San Francisco Bay region.
Telling and measuring urban floods: event reconstruction by means of public-domain media
NASA Astrophysics Data System (ADS)
Macchia, S.; Gallo, E.; Claps, P.
2012-04-01
In the last decade, the spread of mobile telephones and low-cost digital cameras has changed the public approach to catastrophes. With regard to floods, images and videos taken in urban areas have become widely available. Searching YouTube or YouReporter, for example, shows how often citizens are willing to report even frightening events. These amateur videos are now often used in world news reports, which can amplify or dampen the public perception of flood risk. More importantly, they can play a crucial role in the didactic and technical representation of urban flooding problems. The question thus arises: why not use amateur videos for civil-protection purposes? This work shows a new way to use flood images and videos to obtain technical data and spread safety information. Specifically, we show how to determine the height and speed of the water flow reached at several locations during the Genoa flood of 4 November 2011. For this event we downloaded more than 50 videos from different websites, in which the authors provided the time of recording, the geographical coordinates, and the height above ground of the recording point. Google tools such as Google Maps and Street View allowed us to geographically locate the recording points and assemble the shots needed for a complete reconstruction of the event. Future research will aim to turn these videos into a tool for the Google platforms, in order to deliver easily accessible yet accurate information to the public and warn people how to behave when floods are imminent.
Al-Kindi, Sara M; Naiem, Ahmed A; Taqi, Kadhim M; Al-Gheiti, Najla M; Al-Toobi, Ikhtiyar S; Al-Busaidi, Nasra Q; Al-Harthy, Ahmed Z; Taqi, Alaa M; Ba-Alawi, Sharif A; Al-Qadhi, Hani A
2017-11-01
Road traffic injuries (RTIs) are considered a major public health problem worldwide. In Oman, high numbers of RTIs and RTI-related deaths are frequently registered. This study aimed to evaluate the distribution of trauma care facilities in Oman with regards to their proximity to RTI-prevalent areas. This descriptive pilot study analysed RTI data recorded in the national Royal Oman Police registry from January to December 2014. The distribution of trauma care facilities was analysed by calculating distances between areas of peak RTI incidence and the closest trauma centre using Google Earth and Google Maps software (Google Inc., Googleplex, Mountain View, California, USA). A total of 32 trauma care facilities were identified. Four facilities (12.5%) were categorised as class V trauma centres. Of the facilities in Muscat, 42.9% were ranked as class IV or V. There were no class IV or V facilities in Musandam, Al-Wusta or Al-Buraimi. General surgery, orthopaedic surgery and neurosurgery services were available in 68.8%, 59.3% and 12.5% of the centres, respectively. Emergency services were available in 75.0% of the facilities. Intensive care units were available in 11 facilities, with four located in Muscat. The mean distance between a RTI hotspot and the nearest trauma care facility was 34.7 km; however, the mean distance to the nearest class IV or V facility was 83.3 km. The distribution and quality of trauma care facilities in Oman needs modification. It is recommended that certain centres upgrade their levels of trauma care in order to reduce RTI-associated morbidity and mortality in Oman.
Wang, Wei-Ming; Zhou, Hua-Yun; Liu, Yao-Bao; Li, Ju-Lin; Cao, Yuan-Yuan; Cao, Jun
2013-04-01
To explore a new mode of malaria elimination through the application of a digital earth system to malaria epidemic management and surveillance. While investigating malaria cases and dealing with the epidemic areas of Jiangsu Province in 2011, we used a JISIBAO UniStrong G330 GIS data acquisition unit (GPS) to collect the latitude and longitude of case locations, and then established a landmark library of early-warning areas and an image management system using Google Earth Free 6.2 and its image processing software. A total of 374 malaria cases were reported in Jiangsu Province in 2011. Among them, there were 13 local vivax malaria cases, 11 vivax malaria cases imported from other provinces, 20 vivax malaria cases imported from abroad, 309 falciparum malaria cases imported from abroad, 7 quartan malaria cases (Plasmodium malariae infection) imported from abroad, and 14 ovale malaria cases (P. ovale infection) imported from abroad. Analysis with the Google Earth mapping system showed that these malaria cases had a certain degree of aggregation, except for the imported quartan malaria cases, which were highly sporadic. The local vivax malaria cases were mainly concentrated in Sihong County, the vivax malaria cases imported from other provinces were mainly concentrated in Suzhou City and Wuxi City, the vivax malaria cases imported from abroad were concentrated in Nanjing City, the falciparum malaria cases imported from abroad clustered in the middle part of Jiangsu Province, and the ovale malaria cases imported from abroad clustered in Liyang City. The operation of Google Earth Free 6.2 is simple, convenient, and quick, and could help public health authorities make decisions on malaria prevention and control, including the allocation of funds and other health resources.
Youm, Julie; Wiechmann, Warren
2018-01-01
This case study explored the use of Google Glass in a clinical examination scenario to capture the first-person perspective of a standardized patient as a way to provide formative feedback on students' communication and empathy skills 'through the patient's eyes.' During a 3-year period between 2014 and 2017, third-year students enrolled in a family medicine clerkship participated in a Google Glass station during a summative clinical examination. At this station, standardized patients wore Google Glass to record an encounter focused on communication and empathy skills 'through the patient's eyes.' Students completed an online survey using a 4-point Likert scale about their perspectives on Google Glass as a feedback tool (N= 255). We found that the students' experiences with Google Glass 'through the patient's eyes' were largely positive and that students felt the feedback provided by the Google Glass recording to be helpful. Although a third of the students felt that Google Glass was a distraction, the majority believed that the first-person perspective recordings provided an opportunity for feedback that did not exist before. Continuing exploration of first-person perspective recordings using Google Glass to improve education on communication and empathy skills is warranted.
I Still Haven't Found What I'm Looking For... Bono, Google and Glaucoma Awareness.
Lyons, C; Ellard, R; McElnea, E; Townley, D
2017-05-10
The effect of celebrity diagnosis on public awareness of health conditions has already been well documented. In October 2014, Bono, the lead singer with U2, revealed publicly for the first time that he has glaucoma. This study aimed to analyze the impact of Bono's announcement on public awareness of glaucoma using Google Search trends as an indicator of public interest in the disease. Google Trends was used to examine Google Search activity for the term 'Glaucoma' between 2009 and 2015 in both Ireland and the United Kingdom. Trend analyses were performed using Microsoft Excel Version 14.3.5. Increased Google Search activity for 'Glaucoma' in October 2014 was found in both Ireland and the United Kingdom. A five-fold increase from the mean Google Search activity for this term was found in Ireland and a two-fold increase from the mean Google Search activity for this term was found in the United Kingdom. No such increase in Google Search activity occurred during each country's 2014 Glaucoma Awareness week. Google Trends is useful in medical research as a means of assessing public awareness of, and/or interest in, health related topics. Current approaches to glaucoma related health promotion in both Ireland and the United Kingdom have failed to yield an increase in on-line Google Search activity. While there was an increase in interest in glaucoma it is unclear whether this led to an increase in health seeking behaviour.
Abdoun, Oussama; Joucla, Sébastien; Mazzocco, Claire; Yvert, Blaise
2010-01-01
A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrodes array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several square millimeters) of whole neural structures combined with microscopic resolution (about 50 μm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current sources localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under GNU General Public License and available at http://sites.google.com/site/neuromapsoftware. PMID:21344013
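NeuroMap itself is a Matlab package, but the core idea of thin-plate-spline interpolation of MEA activity can be sketched in Python with SciPy; the electrode layout and voltages below are synthetic, and the Laplacian is approximated by finite differences rather than the analytical estimate described in the abstract.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical 8x8 MEA layout (positions in micrometres) and one frame of voltages
rng = np.random.default_rng(0)
xx, yy = np.meshgrid(np.arange(8) * 50.0, np.arange(8) * 50.0)
electrodes = np.column_stack([xx.ravel(), yy.ravel()])   # (64, 2) electrode positions
voltages = rng.normal(size=electrodes.shape[0])          # stand-in for one time sample

# Thin-plate-spline interpolation of activity between recording sites
tps = RBFInterpolator(electrodes, voltages, kernel="thin_plate_spline", smoothing=0.0)

# Evaluate on a fine grid for mapping; extrema need not fall on electrode positions
gx, gy = np.meshgrid(np.linspace(0, 350, 141), np.linspace(0, 350, 141))
grid = np.column_stack([gx.ravel(), gy.ravel()])
activity_map = tps(grid).reshape(gx.shape)

# Crude finite-difference spatial Laplacian of the interpolated map,
# standing in for the analytical Laplacian used for current source localization
lap = np.gradient(np.gradient(activity_map, axis=0), axis=0) + \
      np.gradient(np.gradient(activity_map, axis=1), axis=1)
print(activity_map.shape, lap.shape)
```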
Detailed Mapping of the Alu Volcano, Ethiopia
NASA Astrophysics Data System (ADS)
Agrain, Guillaume; Buso, Roxane; Carlier, Jean; van Wyk de Vries, Benjamin
2017-04-01
The Alu volcano in the Danakil Depression is interpreted as a forced-fold uplift produced by progressive intrusion of sills or similar tabular bodies. Alu lies in a very isolated and difficult-to-access area, but Google Earth provides high-resolution images that can be used for mapping its structure and volcanic features. We use the imagery to map in as much detail as possible all the morphological features of Alu, which we separate into primary volcanic features and secondary structural features. The mapping has been undertaken by a group of undergraduates, graduates, and researchers, and the group has checked and validated the interpretation of each mapped feature. The data set is available as a KMZ file and has been imported into QGIS. The detailed mapping reveals a complex history of multiple lava fields and fissure eruptions, some of which pre-date the uplift, while others occurred during uplift but were subsequently deformed. Similarly, there are cross-cutting structures, and we are able to establish a chronology of events. This shows that the uplift grew in an area that was already covered by lavas, that some lava was probably erupted from Alu's flanks, and that most eruptions came from around the base of Alu. Early in the deformation, thrust faults developed on the lower flanks, similar to those described near the Grosmanaux uplift (van Wyk de Vries et al. 2014). These are cut by the larger faults and by minor fissures. The mapping provides an accessible way of preparing for dedicated fieldwork ahead of an eventual expedition to Alu, while extracting the most from remote sensing data.
Topographic Map and Compass Use. A Teaching Packet to Supplement the Student Manual.
ERIC Educational Resources Information Center
Taylor, Michael
This teacher's manual is designed to supplement the student manual for a unit of study on topographic map and compass use. The beginning section of the manual discusses (1) teaching strategy and evaluation, (2) teaching time and facilities, (3) materials and equipment required, (4) suggested field experience, (5) setting up a compass competition,…
What We Found on Our Journey through Fantasy Land
ERIC Educational Resources Information Center
Baker, Deirdre F.
2006-01-01
This paper sketches a "map" of certain patterns in current children's fantasy. Beginning with literal maps of fantasy worlds, I point out the similarities of the physical layout of a number of invented worlds, suggesting that sameness of geography often indicates a lack of innovation in the ideological or philosophical ideas behind the stories.…
The Social Maps of Children Approaching Adolescence: Studying the Ecology of Youth Development.
ERIC Educational Resources Information Center
Garbarino, James; And Others
This paper reports the first results of a three-year longitudinal study of the social maps of children beginning the transition to adolescence. This exploratory study is guided by Bronfenbrenner's conception of the ecology of human development stressing the importance of a phenomenological orientation to development in the context of ecological…
Getting to Know Education in the Pacific Region
ERIC Educational Resources Information Center
Regional Educational Laboratory Pacific, 2014
2014-01-01
The Pacific region is comprised of American Samoa, the Commonwealth of the Northern Mariana Islands (CNMI); the Federated States of Micronesia (FSM)-Chuuk, Kosrae, Pohnpei, and Yap; Guam; Hawai'i; the Republic of the Marshall Islands; and the Republic of Palau. This document begins by providing a map of the REL Pacific region overlaid on a map of…
Comparison of flank modification on Ascraeus and Arsia Montes volcanoes, Mars
NASA Technical Reports Server (NTRS)
Zimbelman, James R.
1993-01-01
Geologic mapping of the Tharsis Montes on Mars is in progress as part of the Mars Geologic Mapping Program of NASA. Mapping of the southern flanks of Ascraeus Mons at 1:500,000 scale was undertaken first followed by detailed mapping of Arsia Mons; mapping of Pavonis Mons will begin later this year. Results indicate that each of the Tharsis volcanoes displays unique variations on the general 'theme' of a martian shield volcano. Here we concentrate on the flank characteristics on Ascraeus Mons and Arsia Mons, the northernmost and southernmost of the Tharsis Montes, as illustrative of the most prominent trends.
Panoramic Images Mapping Tools Integrated Within the ESRI ArcGIS Software
NASA Astrophysics Data System (ADS)
Guo, Jiao; Zhong, Ruofei; Zeng, Fanyang
2014-03-01
Panoramic images have attracted broad research interest since the appearance of Google Street View. Beyond 360-degree street viewing, many more applications can be realized with panoramic imagery. This paper develops a toolkit plugged into ArcGIS that allows panoramic photographs to be viewed at street level directly from ArcMap, and all visible elements, such as building frontages, trees, and bridges, to be measured and captured. We use a series of panoramic images georeferenced with absolute coordinates from GPS and IMU. Two methods are presented for measuring objects from these panoramic images: one intersects the object position from a stereo pair; the other uses multi-image matching involving three or more images that all cover the object. When a user wants to measure an object, any two panoramic images that both contain the object can be chosen and displayed in ArcMap; the correlation coefficient between the two images is then computed to derive the object's coordinates. Our study tests different configurations of panoramic pairs and compares the measurements with the true values of the objects in order to recommend the best pair selection. The article mainly elaborates the principles of correlation-coefficient computation and multi-image matching.
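The forward-intersection step can be illustrated with a small least-squares computation: given two panorama stations with known GPS/IMU positions and the bearing vectors derived from the object's pixel position in each image, the point closest to both rays estimates the object's coordinates. The station positions and bearings below are invented for illustration and are not taken from the paper.

```python
import numpy as np

def intersect_rays(origins, directions):
    """Least-squares point closest to a set of 3D rays.
    origins: (n, 3) camera positions; directions: (n, 3) bearing vectors."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Hypothetical example: two panorama stations observing the same facade corner
origins = np.array([[0.0, 0.0, 2.0],
                    [10.0, 0.0, 2.0]])
directions = np.array([[0.60, 0.80, 0.05],    # bearing from pixel position in panorama 1
                       [-0.50, 0.86, 0.05]])  # bearing from pixel position in panorama 2
print(intersect_rays(origins, directions))
```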
Visualization of High-Resolution LiDAR Topography in Google Earth
NASA Astrophysics Data System (ADS)
Crosby, C. J.; Nandigam, V.; Arrowsmith, R.; Blair, J. L.
2009-12-01
The growing availability of high-resolution LiDAR (Light Detection And Ranging) topographic data has proven to be revolutionary for Earth science research. These data allow scientists to study the processes acting on the Earth’s surfaces at resolutions not previously possible, yet essential for their appropriate representation. In addition to their utility for research, the data have also been recognized as powerful tools for communicating earth science concepts for education and outreach purposes. Unfortunately, the massive volume of data produced by LiDAR mapping technology can be a barrier to their use. To facilitate access to these powerful data for research and educational purposes, we have been exploring the use of Keyhole Markup Language (KML) and Google Earth to deliver LiDAR-derived visualizations. The OpenTopography Portal (http://www.opentopography.org/) is a National Science Foundation-funded facility designed to provide access to Earth science-oriented LiDAR data. OpenTopography hosts a growing collection of LiDAR data for a variety of geologic domains, including many of the active faults in the western United States. We have found that the wide spectrum of LiDAR users has variable scientific applications, computing resources, and technical experience and thus requires a data distribution system that provides various levels of access to the data. For users seeking a synoptic view of the data, and for education and outreach purposes, delivering full-resolution images derived from LiDAR topography into the Google Earth virtual globe is powerful. The virtual globe environment provides a freely available and easily navigated viewer and enables quick integration of the LiDAR visualizations with imagery, geographic layers, and other relevant data available in KML format. Through region-dependent network-linked KML, OpenTopography currently delivers over 20 GB of LiDAR-derived imagery to users via simple, easily downloaded KMZ files hosted at the Portal. This method provides seamless access to hillshaded imagery for both bare earth and first return terrain models with various angles of illumination. Seamless access to LiDAR-derived imagery in Google Earth has proven to be the most popular product available in the OpenTopography Portal. The hillshade KMZ files have been downloaded over 3000 times by users ranging from earthquake scientists to K-12 educators who wish to introduce cutting edge real world data into their earth science lessons. OpenTopography also provides dynamically generated KMZ visualizations of LiDAR data products produced when users choose to use the OpenTopography point cloud access and processing system. These Google Earth compatible products allow users to quickly visualize the custom terrain products they have generated without the burden of loading the data into a GIS environment. For users who have installed the Google Earth browser plug-in, these visualizations can be launched directly from the OpenTopography results page and viewed directly in the browser.
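A region-dependent network link of the kind described above can be written as plain KML; the sketch below generates one such link in Python, with a hypothetical bounding box and child KMZ URL standing in for the OpenTopography-hosted files.

```python
# Minimal sketch of a region-dependent NetworkLink written as raw KML from Python.
# The child KMZ URL and the bounding box are hypothetical placeholders.
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <NetworkLink>
      <name>{name}</name>
      <Region>
        <LatLonAltBox>
          <north>{north}</north><south>{south}</south>
          <east>{east}</east><west>{west}</west>
        </LatLonAltBox>
        <!-- Only fetch the child file once the region covers at least 128 screen pixels -->
        <Lod><minLodPixels>128</minLodPixels><maxLodPixels>-1</maxLodPixels></Lod>
      </Region>
      <Link>
        <href>{href}</href>
        <viewRefreshMode>onRegion</viewRefreshMode>
      </Link>
    </NetworkLink>
  </Document>
</kml>
"""

def write_networklink(path, name, bbox, href):
    north, south, east, west = bbox
    with open(path, "w", encoding="utf-8") as f:
        f.write(KML_TEMPLATE.format(name=name, north=north, south=south,
                                    east=east, west=west, href=href))

write_networklink("hillshade_tile.kml", "Bare-earth hillshade (example tile)",
                  (37.05, 37.00, -121.95, -122.00),
                  "https://example.org/lidar/hillshade_tile12.kmz")
```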
Google Scholar Users and User Behaviors: An Exploratory Study
ERIC Educational Resources Information Center
Herrera, Gail
2011-01-01
The University of Mississippi Library created a profile to provide linking from Google Scholar (GS) to library resources in 2005. Although Google Scholar does not provide usage statistics for institutions, use of Google Scholar is clearly evident in looking at library link resolver logs. The purpose of this project is to examine users of Google…
From Tech Skills to Life Skills: Google Online Marketing Challenge and Experiential Learning
ERIC Educational Resources Information Center
Croes, Jo-Anne V.; Visser, Melina M.
2015-01-01
The Google Online Marketing Challenge (GOMC) is a global, online student competition sponsored by Google. It is a prime example of an experiential learning activity that includes using real money ($250 sponsored by Google) with a real client. The GOMC has yielded compelling results in student engagement and learning objectives related to the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxberry, Geoffrey
Google Test MPI Listener is a plugin for the Google Test C++ unit-testing library that organizes the test output of software using both the MPI parallel programming model and Google Test. Typically, such output is arbitrarily ordered and disorganized, making test results difficult to interpret. This plugin organizes output in MPI rank order, enabling easy interpretation of test results.
100 Colleges Sign Up with Google to Speed Access to Library Resources
ERIC Educational Resources Information Center
Young, Jeffrey R.
2005-01-01
More than 100 colleges and universities have arranged to give people using the Google Scholar search engine on their campuses more-direct access to library materials. Google Scholar is a free tool that searches scholarly materials on the Web and in academic databases. The new arrangements essentially let Google know which online databases the…
Places to Go: Google's Search Results for "Net Generation"
ERIC Educational Resources Information Center
Downes, Stephen
2007-01-01
In his Places to Go column for a special issue on the Net Generation, Stephen Downes takes an unexpected trip--to Google. According to Downes, Google epitomizes the essence of the Net Generation. Infinitely searchable and adaptable, Google represents the spirit of a generation raised in the world of the Internet, a generation that adapts…
Decision support system for emergency management of oil spill accidents in the Mediterranean Sea
NASA Astrophysics Data System (ADS)
Liubartseva, Svitlana; Coppini, Giovanni; Pinardi, Nadia; De Dominicis, Michela; Lecci, Rita; Turrisi, Giuseppe; Cretì, Sergio; Martinelli, Sara; Agostini, Paola; Marra, Palmalisa; Palermo, Francesco
2016-08-01
This paper presents an innovative web-based decision support system to facilitate emergency management in the case of oil spill accidents, called WITOIL (Where Is The Oil). The system can be applied to create a forecast of oil spill events, evaluate uncertainty of the predictions, and calculate hazards based on historical meteo-oceanographic datasets. To compute the oil transport and transformation, WITOIL uses the MEDSLIK-II oil spill model forced by operational meteo-oceanographic services. Results of the modeling are visualized through Google Maps. A special application for Android is designed to provide mobile access for competent authorities, technical and scientific institutions, and citizens.
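WITOIL renders its MEDSLIK-II output through Google Maps; as a rough stand-in, the sketch below overlays a handful of invented oil-parcel positions on an interactive web map with the open-source folium library, just to show how forecast points can be pushed to a browser map.

```python
import folium

# Hypothetical MEDSLIK-II-style output: oil parcel positions at one forecast hour
spill_parcels = [(43.10, 9.80), (43.11, 9.82), (43.09, 9.83), (43.12, 9.85)]
release_point = (43.10, 9.80)

m = folium.Map(location=release_point, zoom_start=10, tiles="OpenStreetMap")
folium.Marker(release_point, tooltip="Reported spill location").add_to(m)
for lat, lon in spill_parcels:
    folium.CircleMarker(location=(lat, lon), radius=4, fill=True,
                        tooltip="Oil parcel (forecast +24 h)").add_to(m)
m.save("oil_spill_forecast_demo.html")   # open the HTML file in a browser
```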
Spatiotemporal-Thematic Data Processing for the Semantic Web
NASA Astrophysics Data System (ADS)
Hakimpour, Farshad; Aleman-Meza, Boanerges; Perry, Matthew; Sheth, Amit
This chapter presents practical approaches to data processing in the space, time and theme dimensions using existing Semantic Web technologies. It describes how we obtain geographic and event data from Internet sources and also how we integrate them into an RDF store. We briefly introduce a set of functionalities in space, time and semantics. These functionalities are implemented based on our existing technology for main-memory-based RDF data processing developed at the LSDIS Lab. A number of these functionalities are exposed as REST Web services. We present two sample client-side applications that are developed using a combination of our services with Google Maps service.
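A minimal Python analogue of the space-time-theme querying described here can be built with rdflib: store event triples carrying latitude, longitude, time, and theme, then run a SPARQL query that filters on theme and time and returns coordinates ready to plot on a Google Map. The namespace and property names below are invented for illustration and do not reproduce the LSDIS data model.

```python
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import XSD

EX = Namespace("http://example.org/stt#")   # hypothetical namespace, not the LSDIS schema
g = Graph()

# One event with spatial, temporal, and thematic properties
ev = URIRef("http://example.org/events/quake42")
g.add((ev, RDF.type, EX.Event))
g.add((ev, EX.theme, Literal("earthquake")))
g.add((ev, EX.lat, Literal(34.05, datatype=XSD.double)))
g.add((ev, EX.lon, Literal(-118.25, datatype=XSD.double)))
g.add((ev, EX.when, Literal("2008-07-29T18:42:00", datatype=XSD.dateTime)))

# Combined theme + time query; matching coordinates could then be sent
# to a Google Maps front end as markers.
q = """
PREFIX ex: <http://example.org/stt#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
SELECT ?e ?lat ?lon WHERE {
  ?e a ex:Event ;
     ex:theme "earthquake" ;
     ex:when ?t ;
     ex:lat ?lat ;
     ex:lon ?lon .
  FILTER (?t >= "2008-01-01T00:00:00"^^xsd:dateTime)
}
"""
for row in g.query(q):
    print(row.e, row.lat, row.lon)
```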
ERIC Educational Resources Information Center
Georgas, Helen
2013-01-01
Federated searching was once touted as the library world's answer to Google, but ten years since federated searching technology's inception, how does it actually compare? This study focuses on undergraduate student preferences and perceptions when doing research using both Google and a federated search tool. Students were asked about their…
Google earth as a source of ancillary material in a history of psychology class.
Stevison, Blake K; Biggs, Patrick T; Abramson, Charles I
2010-06-01
This article discusses the use of Google Earth to visit significant geographical locations associated with events in the history of psychology. The process of opening files, viewing content, adding placemarks, and saving customized virtual tours on Google Earth are explained. Suggestions for incorporating Google Earth into a history of psychology course are also described.
Enhancing Geographic and Digital Literacy with a Student-Generated Course Portfolio in Google Earth
ERIC Educational Resources Information Center
Guertin, Laura; Stubbs, Christopher; Millet, Christopher; Lee, Tsan-Kuang; Bodek, Matthew
2012-01-01
Google Earth can serve as a platform for students to construct a course ePortfolio. By having students construct their own placemarks in a customized Google Earth file, students document their learning in a geospatial context, learn an innovative use of Google Earth, and have the opportunity for creativity and flexibility with disseminating their…
Google Earth: A Virtual Globe for Elementary Geography
ERIC Educational Resources Information Center
Britt, Judy; LaFontaine, Gus
2009-01-01
Originally called Earth Viewer in 2004, Google Earth was the first virtual globe easily available to the ordinary user of the Internet. Google Earth, at earth.google.com, is a free, 3-dimensional computer model of Earth, but that means more than just a large collection of pretty pictures. It allows the viewer to "fly" anywhere on Earth "to view…
Institutional Repositories in the UK: What Can the Google User Find There?
ERIC Educational Resources Information Center
Markland, Margaret
2006-01-01
This study investigates the efficiency of the Google search engine at retrieving items from 26 UK Institutional Repositories, covering a wide range of subject areas. One item is chosen from each repository and four searches are carried out: two keyword searches and two full title searches, each using both Google and then Google Scholar. A further…
From Google Maps to a fine-grained catalog of street trees
NASA Astrophysics Data System (ADS)
Branson, Steve; Wegner, Jan Dirk; Hall, David; Lang, Nico; Schindler, Konrad; Perona, Pietro
2018-01-01
Up-to-date catalogs of the urban tree population are of importance for municipalities to monitor and improve quality of life in cities. Despite much research on automation of tree mapping, mainly relying on dedicated airborne LiDAR or hyperspectral campaigns, tree detection and species recognition is still mostly done manually in practice. We present a fully automated tree detection and species recognition pipeline that can process thousands of trees within a few hours using publicly available aerial and street view images of Google MapsTM. These data provide rich information from different viewpoints and at different scales from global tree shapes to bark textures. Our work-flow is built around a supervised classification that automatically learns the most discriminative features from thousands of trees and corresponding, publicly available tree inventory data. In addition, we introduce a change tracker that recognizes changes of individual trees at city-scale, which is essential to keep an urban tree inventory up-to-date. The system takes street-level images of the same tree location at two different times and classifies the type of change (e.g., tree has been removed). Drawing on recent advances in computer vision and machine learning, we apply convolutional neural networks (CNN) for all classification tasks. We propose the following pipeline: download all available panoramas and overhead images of an area of interest, detect trees per image and combine multi-view detections in a probabilistic framework, adding prior knowledge; recognize fine-grained species of detected trees. In a later, separate module, track trees over time, detect significant changes and classify the type of change. We believe this is the first work to exploit publicly available image data for city-scale street tree detection, species recognition and change tracking, exhaustively over several square kilometers, respectively many thousands of trees. Experiments in the city of Pasadena, California, USA show that we can detect >70% of the street trees, assign correct species to >80% for 40 different species, and correctly detect and classify changes in >90% of the cases.
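The species-recognition stage is a standard supervised CNN classification; a minimal fine-tuning sketch in PyTorch is shown below, assuming a hypothetical folder-per-species set of tree crops cut from aerial and street-view images. The backbone, hyperparameters, and data layout are illustrative choices, not the architecture reported in the paper.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical folder layout: street_trees/train/<species_name>/<crop>.jpg (40 species)
tfm = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("street_trees/train", transform=tfm)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Fine-tune a pretrained CNN; requires torchvision >= 0.13 for the weights API
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):                     # a few epochs, just to illustrate the loop
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```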
Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes
NASA Astrophysics Data System (ADS)
Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.
2014-12-01
With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP: (1) developing a vocabulary matching service that links terms from different vocabularies with similar concepts; the service implements the Google Refine reconciliation service interface so that users can leverage the Google Refine application as a friendly user interface while linking different vocabulary terms; (2) developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological Oceanographic System (SAMOS) vocabularies to internationally served vocabularies; each SAMOS vocabulary term (data parameter and quality control flag) will be described as an RDF resource page, allowing enhanced discoverability and retrieval of SAMOS data through parameter-based searches; (3) improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies; we have collaborated with ODIP participating organizations to build a generalized data model that will populate a SPARQL endpoint and provide expressive querying over our data files; (4) mapping local and regional vocabularies used by R2R to those used by ODIP partners, work that is described more fully in a companion poster; and (5) making published Linked Data Web-developer-friendly with a RESTful service, achieved by defining a proxy layer on top of the existing SPARQL endpoint that translates HTTP requests into SPARQL queries and renders the returned results as required by the request sender using content negotiation, suffixes, and parameters.
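Item (5) above, a RESTful proxy in front of a SPARQL endpoint, can be sketched in a few lines of Python with Flask and SPARQLWrapper; the endpoint URL and the SKOS-based vocabulary lookup are assumptions for illustration, and a production service would also sanitize the interpolated label and negotiate output formats as the abstract describes.

```python
from flask import Flask, jsonify
from SPARQLWrapper import SPARQLWrapper, JSON

app = Flask(__name__)
ENDPOINT = "http://example.org/sparql"   # hypothetical SPARQL endpoint, not the R2R service

@app.route("/parameters/<label>")
def lookup_parameter(label):
    """Translate a simple REST request into a SPARQL query and return JSON bindings."""
    sparql = SPARQLWrapper(ENDPOINT)
    # Note: direct string interpolation is used only to keep the sketch short;
    # a real service should escape or validate the label.
    sparql.setQuery("""
        PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
        SELECT ?concept ?match WHERE {
          ?concept skos:prefLabel "%s"@en .
          OPTIONAL { ?concept skos:exactMatch ?match }
        }""" % label)
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()
    return jsonify(results["results"]["bindings"])

if __name__ == "__main__":
    app.run(port=5000)
```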
NASA Astrophysics Data System (ADS)
Solvik, K.; Macedo, M.; Graesser, J.; Lathuilliere, M. J.
2017-12-01
Large-scale agriculture and cattle ranching in Brazil have driven the creation of tens of thousands of small stream impoundments to provide water for crops and livestock. These impoundments are a source of methane emissions and have significant impacts on stream temperature, connectivity, and water use over a large region. Due to their large numbers and small size, they are difficult to map using conventional methods. Here, we present a two-stage object-based supervised classification methodology for identifying man-made impoundments in Brazil. First, in Google Earth Engine, pixels are classified as water or non-water using satellite data and HydroSHEDS products as predictors. Second, using Python's scikit-learn and scikit-image modules, the water objects are classified as man-made or natural based on a variety of shape and spectral properties. Both classifications are performed by a random forest classifier. Training data are acquired by visually identifying impoundments and natural water bodies using high resolution satellite imagery from Google Earth. This methodology was applied to the state of Mato Grosso using a cloud-free mosaic of Sentinel-1 (10 m resolution) radar and Sentinel-2 (10-20 m) multispectral data acquired during the 2016 dry season. Independent test accuracy was estimated at 95% for the first stage and 93% for the second. We identified 54,294 man-made impoundments in Mato Grosso in 2016. The methodology is generalizable to other high resolution satellite data and has been tested on Landsat 5 and 8 imagery. Applying the same approach to Landsat 8 images (30 m), we identified 35,707 impoundments in the 2015 dry season. The difference in number is likely because the coarser-scale imagery fails to detect small (< 900 m2) objects. On-going work will apply this approach to satellite time series for the entire Amazon-Cerrado frontier, allowing us to track changes in the number, size, and distribution of man-made impoundments. Automated impoundment mapping over large areas may help with management of streams in agricultural landscapes in Brazil and other tropical regions.
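The second-stage shape classification can be sketched with scikit-image and scikit-learn as described: label the stage-one water mask, extract per-object shape descriptors, and feed them to a random forest. The feature choices and the tiny hand-made training set below are illustrative only.

```python
import numpy as np
from skimage.measure import label, regionprops
from sklearn.ensemble import RandomForestClassifier

def shape_features(binary_water_mask):
    """Per-object shape descriptors from a water/non-water mask (stage-1 output)."""
    feats, objects = [], []
    for region in regionprops(label(binary_water_mask)):
        feats.append([region.area, region.eccentricity, region.solidity, region.extent])
        objects.append(region)
    return np.array(feats), objects

# Hypothetical training rows for visually labelled water bodies
# (1 = man-made impoundment, 0 = natural water body); real labels came from
# interpreting high-resolution Google Earth imagery.
X_train = np.array([[120, 0.30, 0.95, 0.80],    # small, compact, rectangular -> impoundment
                    [90, 0.25, 0.97, 0.85],
                    [4000, 0.90, 0.55, 0.40],   # large, elongated, ragged -> natural
                    [5200, 0.85, 0.60, 0.45]])
y_train = np.array([1, 1, 0, 0])

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Toy stage-1 mask with one rectangular "reservoir"
mask = np.zeros((50, 50), dtype=bool)
mask[10:20, 10:22] = True
X_new, objs = shape_features(mask)
print(clf.predict(X_new))    # 1 -> classified as man-made
```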
Mapping and analysis of phosphorylation sites: a quick guide for cell biologists
Dephoure, Noah; Gould, Kathleen L.; Gygi, Steven P.; Kellogg, Douglas R.
2013-01-01
A mechanistic understanding of signaling networks requires identification and analysis of phosphorylation sites. Mass spectrometry offers a rapid and highly sensitive approach to mapping phosphorylation sites. However, mass spectrometry has significant limitations that must be considered when planning to carry out phosphorylation-site mapping. Here we provide an overview of key information that should be taken into consideration before beginning phosphorylation-site analysis, as well as a step-by-step guide for carrying out successful experiments. PMID:23447708
Moss, Donald B
2006-01-01
The author uses the metaphor of mapping to illuminate a structural feature of racist thought, locating the degraded object along vertical and horizontal axes. These axes establish coordinates of hierarchy and of distance. With the coordinates in place, racist thought begins to seem grounded in natural processes. The other's identity becomes consolidated, and parochialism results. The use of this kind of mapping is illustrated via two patient vignettes. The author presents Freud's (1905, 1927) views in relation to such a "mapping" process, as well as Adorno's (1951) and Baldwin's (1965). Finally, the author conceptualizes the crucial status of primitivity in the workings of racist thought.
Designing Courses that Encourage Post-College Scientific Literacy in General Education Students
NASA Astrophysics Data System (ADS)
Horodyskyj, L.
2010-12-01
In a time when domestic and foreign policy is becoming increasingly dependent on a robust understanding of scientific concepts (especially in regards to climate science), it is of vital importance that non-specialist students taking geoscience courses gain an understanding not only of Earth system processes, but also of how to discern scientific information from "spin". An experimental introductory level environmental geology course was developed at the Glendale Community College in Glendale, Arizona, in the fall of 2010 that sought to integrate collaborative learning, online resources, and science in the media. The goal of this course was for students to end the semester with not just an understanding of basic Earth systems concepts, but also with a set of tools for evaluating information presented by the media. This was accomplished by integrating several online sites that interface scientific data with popular web tools (ie, Google Maps) and collaborative exercises that required students to generate ideas based on their observations followed by evaluation and refinement of these ideas through interactions with peers and the instructor. The capstone activity included a series of homework assignments that required students to make note of science-related news stories in the media early in the semester, and then gradually begin critically evaluating these news sources, which will become their primary source of post-college geoscience information. This combination of activities will benefit students long after the semester has ended by giving them access to primary sources of scientific information, encouraging them to discuss and evaluate their ideas with their peers, and, most importantly, to critically evaluate the information they receive from the media and their peers so that they can become more scientifically literate citizens.
Asubonteng, Kwabena; Pfeffer, Karin; Ros-Tonen, Mirjam; Verbesselt, Jan; Baud, Isa
2018-05-11
Tree crops such as cocoa and oil palm are important to smallholders' livelihoods and the national economies of tropical producer countries. Governments seek to expand tree-crop acreages and improve yields. Existing literature has analyzed socioeconomic and environmental effects of tree-crop expansion, but its spatial effects on the landscape are yet to be explored. This study aims to assess the effects of tree-crop farming on the composition and the extent of land-cover transitions in a mixed cocoa/oil palm landscape in Ghana. Land-cover maps of 1986 and 2015, produced through ISODATA and maximum likelihood classification, were validated with field reference data, Google Earth data, and key respondent interviews. Post-classification change detection was conducted and the transition matrix analyzed using intensity analysis. Cocoa and oil palm areas have increased in extent by 8.9% and 11.2%, respectively, mainly at the expense of food-crop land and forest. Forest is lost to both tree crops at a lower intensity than food-crop land. There were transitions between cocoa and oil palm, but the gains in oil palm outweigh those of cocoa. Cocoa and oil palm have increased in area and dominance. The main cover types converted to tree-crop areas are food-crop land and off-reserve forest. This is beginning to have serious implications for food security and livelihood options that depend on ecosystem services provided by the mosaic landscape. Tree-crop policies should take account of the geographical distribution of tree-commodity production at the landscape level and its implications for food production and ecosystem services.
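The post-classification change detection step amounts to cross-tabulating the two classified maps pixel by pixel; a small sketch with synthetic rasters and hypothetical class codes is shown below, producing the transition matrix that intensity analysis then operates on.

```python
import numpy as np
import pandas as pd

# Hypothetical class codes: 1 = forest, 2 = food crop, 3 = cocoa, 4 = oil palm
classes = {1: "forest", 2: "food crop", 3: "cocoa", 4: "oil palm"}

# Stand-ins for the classified 1986 and 2015 rasters (same grid, here 100 x 100 pixels)
rng = np.random.default_rng(1)
lc_1986 = rng.choice(list(classes), size=(100, 100), p=[0.4, 0.4, 0.1, 0.1])
lc_2015 = rng.choice(list(classes), size=(100, 100), p=[0.3, 0.3, 0.2, 0.2])

# Post-classification change detection: cross-tabulate the two maps pixel by pixel
transition = pd.crosstab(
    pd.Series(lc_1986.ravel(), name="1986").map(classes),
    pd.Series(lc_2015.ravel(), name="2015").map(classes),
)
print(transition)                              # pixel counts for each from->to transition
print(transition / transition.values.sum())   # proportions, the input to intensity analysis
```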
ERIC Educational Resources Information Center
McCullough, Mark; Holmberg, Melissa
2005-01-01
The purpose of this research was to explore Google's potential for detecting occurrences of word-for-word plagiarism in master's theses. The authors sought answers to these questions: 1. Is Google an effective tool for detecting plagiarism in master's theses? 2. Is Google an efficient tool for detecting plagiarism in master's theses? The first…
Curating the Web: Building a Google Custom Search Engine for the Arts
ERIC Educational Resources Information Center
Hennesy, Cody; Bowman, John
2008-01-01
Google's first foray onto the web made search simple and results relevant. With its Co-op platform, Google has taken another step toward dramatically increasing the relevancy of search results, further adapting the World Wide Web to local needs. Google Custom Search Engine, a tool on the Co-op platform, puts one in control of his or her own search…
Information We Collect: Surveillance and Privacy in the Implementation of Google Apps for Education
ERIC Educational Resources Information Center
Lindh, Maria; Nolin, Jan
2016-01-01
The aim of this study is to show how Google's business model is concealed within Google Apps for Education (GAFE) as well as how such a bundle is perceived within one educational organisation, consisting of approximately 30 schools. The study consists of two parts: 1) a rhetorical analysis of Google policy documents and 2) an interview study in a…
ERIC Educational Resources Information Center
Bartolo, Paula
2017-01-01
The purpose of this phenomenological study was to understand the lived experiences of public school teachers using Google Suite for Education with Google Chromebooks integrated into the core curriculum. With the adoption of Common Core standards by 46 states, the increased use of technology has occurred due to standards that integrate technology.…
Cervellin, Gianfranco; Comelli, Ivan; Lippi, Giuseppe
2017-09-01
Internet-derived information has recently been recognized as a valuable tool for epidemiological investigation. Google Trends, a Google Inc. portal, generates data on geographical and temporal patterns according to specified keywords. The aim of this study was to compare the reliability of Google Trends in different clinical settings, both for common diseases with lower media coverage and for less common diseases attracting major media coverage. We carried out a search in Google Trends using the keywords "renal colic", "epistaxis", and "mushroom poisoning", selected on the basis of available and reliable epidemiological data. Besides this search, we carried out a second search for three clinical conditions (i.e., "meningitis", "Legionella pneumophila pneumonia", and "Ebola fever") which recently received major focus from the Italian media. In our analysis, no correlation was found between data captured from Google Trends and the epidemiology of renal colic, epistaxis, or mushroom poisoning. Only when searching for the term "mushroom" alone did Google Trends generate a seasonal pattern that almost overlaps with the epidemiological profile, but this was probably due mostly to searches about harvesting and cooking rather than poisoning. The Google Trends data also failed to reflect the geographical and temporal patterns of disease for meningitis, Legionella pneumophila pneumonia, and Ebola fever. The results of our study confirm that Google Trends has modest reliability for defining the epidemiology of relatively common diseases with minor media coverage, or relatively rare diseases with a higher audience. Overall, Google Trends seems to be more influenced by media clamor than by true epidemiological burden.
Croatian Medical Journal citation score in Web of Science, Scopus, and Google Scholar.
Sember, Marijan; Utrobicić, Ana; Petrak, Jelka
2010-04-01
To analyze the 2007 citation count of articles published by the Croatian Medical Journal in 2005-2006, based on data from Web of Science, Scopus, and Google Scholar. Web of Science and Scopus were searched for the articles published in 2005-2006. As all articles returned by Scopus were included in Web of Science, the latter list was the sample for further analysis. Total citation counts for each article on the list were retrieved from Web of Science, Scopus, and Google Scholar. The overlapping and unique citations were compared and analyzed. Proportions were compared using the chi-squared test. Google Scholar returned the greatest proportion of articles with citations (45%), followed by Scopus (42%) and Web of Science (38%). Almost half (49%) of the articles had no citations, and 11% had an equal number of identical citations in all 3 databases. The greatest overlap was found between Web of Science and Scopus (54%), followed by Scopus and Google Scholar (51%), and Web of Science and Google Scholar (44%). The greatest number of unique citations was found by Google Scholar (n=86). The majority of these citations (64%) came from journals, followed by books and PhD theses. Approximately 55% of all citing documents were full-text resources in open access. The language of citing documents was mostly English, but as many as 25 citing documents (29%) were in Chinese. Google Scholar shares a total of 42% of the citations returned by the two other, more influential bibliographic resources. The list of unique citations in Google Scholar is predominantly journal based, but these journals are mainly of local character. Citations received by internationally recognized medical journals are crucial for increasing the visibility of small medical journals, but Google Scholar may serve as an alternative bibliometric tool for an orientational insight into citations.
Accurate estimation of influenza epidemics using Google search data via ARGO.
Yang, Shihao; Santillana, Mauricio; Kou, S C
2015-11-24
Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search-based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people's online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions.
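The essence of ARGO, an autoregression on past influenza activity augmented with Google search volumes and an L1 penalty, can be sketched with scikit-learn; the synthetic series below stand in for CDC ILI rates and Google Trends/Correlate terms, and the single LassoCV fit omits the weekly refitting and other refinements of the published model.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_weeks, n_queries = 200, 25

# Stand-ins for weekly influenza-like-illness (ILI) rates and Google search volumes
ili = np.abs(np.sin(np.arange(n_weeks) / 8.0)) * 5 + rng.normal(0, 0.2, n_weeks)
queries = ili[:, None] * rng.uniform(0.5, 1.5, n_queries) + rng.normal(0, 0.5, (n_weeks, n_queries))

# ARGO-style design matrix: autoregressive lags of ILI plus current search volumes
lags = 52
X, y = [], []
for t in range(lags, n_weeks):
    X.append(np.concatenate([ili[t - lags:t], queries[t]]))
    y.append(ili[t])
X, y = np.array(X), np.array(y)

# L1-regularised regression; the real method refits every week, one fit shown here
model = LassoCV(cv=5).fit(X[:-10], y[:-10])
print("held-out predictions:", np.round(model.predict(X[-10:]), 2))
print("actual values:       ", np.round(y[-10:], 2))
```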
Duran-Nelson, Alisa; Gladding, Sophia; Beattie, Jim; Nixon, L James
2013-06-01
To determine which resources residents use at the point-of-care (POC) for decision making, the drivers for selection of these resources, and how residents use Google/Google Scholar to answer clinical questions at the POC. In January 2012, 299 residents from three internal medicine residencies were sent an electronic survey regarding resources used for POC decision making. Resource use frequency and factors influencing choice were determined using descriptive statistics. Binary logistic regression analysis was performed to determine relationships between the independent variables. A total of 167 residents (56%) responded; similar numbers responded at each level of training. Residents most frequently reported using UpToDate and Google at the POC at least daily (85% and 63%, respectively), with speed and trust in the quality of information being the primary drivers of selection. Google, used by 68% of residents, was used primarily to locate Web sites and general information about diseases, whereas Google Scholar, used by 30% of residents, tended to be used for treatment and management decisions or locating a journal article. The findings suggest that internal medicine residents use UpToDate most frequently, followed by consultation with faculty and the search engines Google and Google Scholar; speed, trust, and portability are the biggest drivers for resource selection; and time and information overload appear to be the biggest barriers to resources such as Ovid MEDLINE. Residents frequently used Google and may benefit from further training in information management skills.
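The binary logistic regression step described above can be sketched in a few lines; the predictors (training level, speed, trust) and the tiny dataset are hypothetical stand-ins for the survey variables, not the study's data.

import numpy as np
from sklearn.linear_model import LogisticRegression

# columns: PGY level, values speed (0/1), values trust (0/1)
X = np.array([[1, 1, 0], [1, 0, 1], [2, 1, 1], [2, 1, 0],
              [3, 0, 1], [3, 1, 1], [1, 0, 0], [2, 0, 1],
              [3, 1, 1], [1, 1, 1], [2, 0, 0], [3, 0, 0]])
y = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 0])  # reports using Google daily at the POC?

clf = LogisticRegression().fit(X, y)
print("Coefficients:", clf.coef_[0])
print("Odds ratios:", np.exp(clf.coef_[0]))  # per-predictor change in odds of daily Google use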
Results of new petrologic and remote sensing studies in the Big Bend region
NASA Astrophysics Data System (ADS)
Benker, Stevan Christian
The initial section of this manuscript involves the South Rim Formation, a series of 32.2-32 Ma comenditic quartz trachytic-rhyolitic volcanics and associated intrusives erupted and emplaced in Big Bend National Park, Texas. Magmatic parameters have previously been interpreted for only one of the two diverse petrogenetic suites comprising this formation. Here, new mineralogic data for the South Rim Formation rocks are presented. Magmatic parameters interpreted from these data assist in deciphering lithospheric characteristics during the mid-Tertiary. Results indicate low temperatures (< 750 °C), reduced conditions (generally below the FMQ buffer), and low pressures (≤ 100 MPa) associated with South Rim Formation magmatism, with slight differences in conditions between the two suites. Newly discovered fayalite microphenocrysts allowed determination of oxygen fugacity values for the Emory Peak Suite (between -0.14 and -0.25 ΔFMQ over a temperature range of 680-700 °C) via mineral-equilibria-based QUILF95 calculations. This petrologic information is correlated with structural evidence from Trans-Pecos Texas and adjacent regions to evaluate the debated timing of the tectonic transition (Laramide compression to Basin and Range extension) and the onset of the southern Rio Grande Rift during the mid-Tertiary. The A-type and peralkaline characteristics of the South Rim Formation and other pre-31 Ma magmatism in Trans-Pecos Texas, together with evidence implying earlier Rio Grande Rift onset in Colorado and New Mexico, support a near-neutral to transtensional setting in Trans-Pecos Texas by 32 Ma. This idea sharply contrasts with interpretations of tectonic compression and arc-related magmatism persisting until 31 Ma, as suggested by some authors. However, the evidence discussed cannot preclude the pre-36 Ma onset proposed by other authors. The later section of this manuscript involves research in the Big Bend area using Google Earth. At present there is high interest in using Google Earth in a variety of scientific investigations; however, the program's developers have disclosed limited information concerning the program and its accuracy. While some authors have attempted to independently constrain the accuracy of Google Earth, their results have potentially lost validity over time due to technological advances and updates to the imagery archives. For this reason we attempt to constrain more current horizontal and vertical position accuracies for the Big Bend region of West Texas. In Google Earth, a series of 268 data points was virtually traced along various early Tertiary unconformities in Big Bend National Park and Big Bend Ranch State Park. These data points were compared with high-precision GPS measurements collected in the field and yielded a horizontal position accuracy of 2.64 meters RMSE. Complications arose in determining vertical position accuracy for Google Earth because default Keyhole Markup Language (.kml) files currently do not export elevation data. This drawback forces users to hand-record and manually input the elevation values listed on screen, a significant handicap that renders Google Earth data useless for larger datasets. As a workaround, however, the missing elevation values can be replaced from other data sources based on Google Earth horizontal positioning. We used the Fledermaus three-dimensional visualization software to drape the Google Earth horizontal positions over a National Elevation Dataset (NED) digital elevation model (DEM) in order to adopt a large set of elevation data.
A vertical position accuracy of 1.63 meters RMSE was determined between the 268 Google Earth data points and the NED. Since the determined accuracies were considerably lower than those reported in previous investigations, we devoted a later portion of this investigation to testing Google Earth-NED data in paleo-surface modeling of the Big Bend region. An 18 x 30 kilometer area in easternmost Big Bend Ranch State Park was selected to create a post-Laramide paleo-surface model via interpolation of approximately 2900 Google Earth-NED data points representing sections of an early Tertiary unconformity. The area proved difficult to model, as unconformity tracing and interpolation were often hindered by surface inflation due to regional magmatism, burial of Laramide topography by subsequent volcanism and sedimentation, and overprinting by Basin and Range extensional features that mask Laramide compressional features. Despite these difficulties, a model was created illustrating paleo-topographic highs in the southeastern Bofecillos Mountains and at Lajitas Mesa. Based on the amount of surface relief depicted, its inconsistency with subsequent normal faulting, and its distance from magmatic features capable of surface doming or inflation, we believe the modeled paleo-topographic highs legitimately reflect the post-Laramide surface. We interpret the paleo-surface in this area as a post-Laramide surface that has experienced significant erosion, and we attribute the paleo-topographic highs to more resistant Laramide topography. The model also implies a southern paleo-drainage direction for the area and suggests that the present-day topographic low through which the Rio Grande flows may have formed very soon after the Laramide Orogeny. Based on the newly calculated horizontal and vertical position accuracies for the Big Bend region and the results of the modeled Google Earth-NED data in easternmost Big Bend Ranch State Park, Google Earth can be used effectively for remote sensing and geologic studies; however, we urge caution because the developers remain reluctant to disclose detailed program information to the public.
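The workaround described above, replacing missing Google Earth elevations with values sampled from the NED at the same horizontal positions and then comparing the two elevation sets by RMSE, might be sketched as follows; the DEM file name, the point coordinates, and the elevation values are hypothetical, and the coordinates must be expressed in the DEM's coordinate reference system.

import numpy as np
import rasterio

points = [(-103.51, 29.42), (-103.48, 29.40), (-103.45, 29.38)]  # lon, lat along a traced unconformity
ge_elev = np.array([1012.0, 987.5, 1004.2])  # elevations read off the Google Earth screen (m)

with rasterio.open("ned_dem_bigbend.tif") as dem:
    ned_elev = np.array([v[0] for v in dem.sample(points)])  # DEM elevations at the same positions

rmse = np.sqrt(np.mean((ge_elev - ned_elev) ** 2))
print(f"Vertical RMSE between Google Earth and NED: {rmse:.2f} m")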
Cat-Map: putting cataract on the map
Bennett, Thomas M.; Hejtmancik, J. Fielding
2010-01-01
Lens opacities, or cataract(s), may be inherited as a classic Mendelian disorder, usually with early onset, or, more commonly, acquired with age as a multi-factorial or complex trait. Many genetic forms of cataract have been described in mice and other animal models. Considerable progress has been made in mapping and identifying the genes and mutations responsible for inherited forms of cataract, and genetic determinants of age-related cataract are beginning to be discovered. To provide a convenient and accurate summary of current information focused on the increasing genetic complexity of Mendelian and age-related cataract, we have created an online chromosome map and reference database for cataract in humans and mice (Cat-Map). PMID:21042563
2013-06-01
Design-time architecture for an open-architecture (OA) system: browser, email, widget, database, OS. Instance architectures are assembled from functionally similar components compatible with the OA system design, for example Chrome, Gmail, and Google widgets; Firefox, AbiWord, Evolution, and Fedora; or Firefox, Google Calendar, Google Docs, and Fedora.
ERIC Educational Resources Information Center
Albanese, Andrew Richard
2006-01-01
This article observes that it's not hard to understand why Google creates such unease among librarians. The profession, however, can't afford to be myopic when it comes to Google. As inescapable as it is, Google is not the Internet. And as the web evolves, new opportunities and challenges loom larger for libraries than who's capturing the bulk of…
Alignment of genetic maps and QTLs between inter- and intra-specific sorghum populations.
Feltus, F A; Hart, G E; Schertz, K F; Casa, A M; Kresovich, S; Abraham, S; Klein, P E; Brown, P J; Paterson, A H
2006-05-01
To increase the value of associated molecular tools and also to begin to explore the degree to which interspecific and intraspecific genetic variation in Sorghum is attributable to corresponding genetic loci, we have aligned genetic maps derived from two sorghum populations that share one common parent (Sorghum bicolor L. Moench accession BTx623) but differ in their morphologically and evolutionarily distant alternate parents (S. propinquum or S. bicolor accession IS3620C). A total of 106 well-distributed DNA markers provide for map alignment, revealing only six nominal differences in marker order that are readily explained by sampling variation or mapping of paralogous loci. We also report a total of 61 new QTLs detected from 17 traits in these crosses. Among eight corresponding traits (some new, some previously published) that could be directly compared between the two maps, QTLs for two (tiller height and tiller number) were found to correspond in a non-random manner (P<0.05). For several other traits, correspondence of subsets of QTLs narrowly missed statistical significance. In particular, several QTLs for leaf senescence were near loci previously mapped for 'stay-green' that have been implicated by others in drought tolerance. These data provide strong validation for the value of molecular tools developed in the interspecific cross for utilization in cultivated sorghum, and begin to separate QTLs that distinguish among Sorghum species from those that are informative within the cultigen (S. bicolor).
NASA Astrophysics Data System (ADS)
Ivankovic, D.; Dadic, V.
2009-04-01
Some oceanographic parameters have to be inserted into the database manually; others (for example, data from a CTD probe) are imported from various files. All of these parameters require visualization, validation, and manipulation from the research vessel or the scientific institution, as well as public presentation. For these purposes a web-based system has been developed, containing dynamic SQL procedures and Java applets. The technology background is an Oracle 10g relational database and Oracle Application Server. Web interfaces are developed using PL/SQL stored database procedures (mod_plsql). Additional components for data visualization use Java applets and JavaScript. The mapping tool is the Google Maps API (JavaScript), with a Java applet as an alternative. The graph is realized as a dynamically generated web page containing a Java applet. Both the mapping tool and the graph are georeferenced: clicking on part of the graph automatically zooms or places a marker at the location where the parameter was measured. This feature is very useful for data validation. The code for data manipulation and visualization is partially realized with dynamic SQL, which allows us to separate data definitions from the code for data manipulation. Adding a new parameter to the system therefore requires only a data definition and description, without programming a new interface for that kind of data.
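The design idea of adding new parameters through data definitions alone can be illustrated generically; the sketch below uses Python and SQLite rather than the authors' Oracle PL/SQL environment, and all table and column names are invented for illustration.

import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE parameter_def (id INTEGER PRIMARY KEY, name TEXT, unit TEXT, description TEXT);
CREATE TABLE measurement (param_id INTEGER REFERENCES parameter_def(id),
                          lon REAL, lat REAL, measured_at TEXT, value REAL);
""")

# Adding a new parameter is a data operation only; no new code or schema change is needed.
db.execute("INSERT INTO parameter_def VALUES (1, 'sea_temperature', 'degC', 'CTD probe temperature')")
db.execute("INSERT INTO measurement VALUES (1, 16.44, 43.51, '2009-03-01T10:00', 14.2)")

# A georeferenced query that a map or graph front end could consume.
for row in db.execute("""SELECT p.name, m.lon, m.lat, m.value
                         FROM measurement m JOIN parameter_def p ON p.id = m.param_id"""):
    print(row)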
Semi-Automatic Building Models and Façade Texture Mapping from Mobile Phone Images
NASA Astrophysics Data System (ADS)
Jeong, J.; Kim, T.
2016-06-01
Research on 3D urban modelling has been actively carried out for a long time. Recently the need for 3D urban modelling has increased rapidly due to improved geo-web services and the popularity of smart devices. Current 3D urban models, such as those provided by Google Earth, are built from aerial photos but have some limitations: updating building models immediately after changes is difficult, many buildings lack a 3D model and texture, and maintaining and updating the models requires large resources. To resolve these limitations, we propose a method for semi-automatic building modelling and façade texture mapping from mobile phone images, and we analyze the resulting models against actual measurements. Our method consists of a camera geometry estimation step, an image matching step, and a façade mapping step. Models generated by this method were compared with measured values of real buildings by comparing the ratios of corresponding edge lengths. The result showed a 5.8% average error in the length ratio. Through this method, we could generate a simple building model with fine façade textures without expensive dedicated tools and datasets.
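The camera geometry estimation and image matching steps can be illustrated with a generic two-view structure-from-motion sketch in Python using OpenCV; the image file names and the intrinsic matrix are hypothetical stand-ins, and this is not the authors' implementation.

import cv2
import numpy as np

img1 = cv2.imread("facade_view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("facade_view2.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(4000)                      # detect and describe keypoints
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

K = np.array([[3000.0, 0, 2000.0],              # hypothetical phone-camera intrinsics
              [0, 3000.0, 1500.0],
              [0, 0, 1.0]])

# Relative camera geometry between the two views (rotation and translation direction, up to scale)
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
print("Relative rotation:\n", R, "\nTranslation direction:", t.ravel())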
75 FR 61724 - Combined Notice of Filings #2
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-06
..., October 13, 2010. Docket Numbers: ER10-2835-000. Applicants: Google Energy LLC. Description: Google Energy LLC submits tariff filing per 35.12: Google Energy LLC Baseline Filing for MBR Tariff to be effective...