Sample records for Google Maps API

  1. A Web-Based Interactive Mapping System of State Wide School Performance: Integrating Google Maps API Technology into Educational Achievement Data

    ERIC Educational Resources Information Center

    Wang, Kening; Mulvenon, Sean W.; Stegman, Charles; Anderson, Travis

    2008-01-01

    Google Maps API (Application Programming Interface), released in late June 2005 by Google, is an amazing technology that allows users to embed Google Maps in their own Web pages with JavaScript. Google Maps API has accelerated the development of new Google Maps based applications. This article reports a Web-based interactive mapping system…

  2. Web GIS in practice III: creating a simple interactive map of England's Strategic Health Authorities using Google Maps API, Google Earth KML, and MSN Virtual Earth Map Control

    PubMed Central

    Boulos, Maged N Kamel

    2005-01-01

    This eye-opening article aims to introduce the health GIS community to the emerging online consumer geoinformatics services from Google and Microsoft (MSN), and their potential utility in creating custom online interactive health maps. Using the programmable interfaces provided by Google and MSN, we created three interactive demonstrator maps of England's Strategic Health Authorities. These can be browsed online at – Google Maps API (Application Programming Interface) version, – Google Earth KML (Keyhole Markup Language) version, and – MSN Virtual Earth Map Control version. Google and MSN's worldwide distribution of "free" geospatial tools, imagery, and maps is to be commended as a significant step towards the ultimate "wikification" of maps and GIS. A discussion is provided of these emerging online mapping trends, their expected future implications and development directions, and associated individual privacy, national security and copyrights issues. Although ESRI have announced their planned response to Google (and MSN), it remains to be seen how their envisaged plans will materialize and compare to the offerings from Google and MSN, and also how Google and MSN mapping tools will further evolve in the near future. PMID:16176577

  3. A Map Mash-Up Application: Investigating the Temporal Effects of Climate Change on the Salt Lake Basin

    NASA Astrophysics Data System (ADS)

    Kirtiloglu, O. S.; Orhan, O.; Ekercin, S.

    2016-06-01

    The main purpose of this paper is to investigate the climate change effects that occurred at the beginning of the twenty-first century in the Konya Closed Basin (KCB), located in the semi-arid central Anatolian region of Turkey, particularly in the Salt Lake region, where many major wetlands are situated, and to share the analysis results online in a Web Geographical Information System (GIS) environment. 71 Landsat 5-TM, 7-ETM+ and 8-OLI images and meteorological data obtained from 10 meteorological stations have been used in this work. 56 of the Landsat images were used to extract the Salt Lake surface area from multi-temporal imagery collected from 2000 to 2014; 15 were used to make thematic maps of the Normalised Difference Vegetation Index (NDVI) in the KCB; and the data from the 10 meteorological stations were used to generate the Standardized Precipitation Index (SPI), which is widely used in drought studies. To visualize and share the results, a Web GIS environment has been established using Google Maps and its data storage and manipulation product Fusion Tables, both free-of-charge Google Web services. The infrastructure of the web application includes HTML5, CSS3, JavaScript, the Google Maps API V3 and the Google Fusion Tables API. These technologies make it possible to build effective "map mash-ups" that embed a Google Map in a Web page, store spatial or tabular data in Fusion Tables, and add these data as a layer on the embedded map. The analysis process and the map mash-up application are discussed in detail in the main sections of this paper.
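The NDVI thematic mapping step described above reduces to a simple per-pixel band ratio. A minimal sketch (in JavaScript, with hypothetical reflectance values; for Landsat 8 OLI the near-infrared and red measurements come from bands 5 and 4):

```javascript
// Illustrative NDVI computation over arrays of per-pixel reflectances.
// NDVI = (NIR - Red) / (NIR + Red); values near 1 indicate dense vegetation.
function ndvi(nir, red) {
  return nir.map((n, i) => {
    const denom = n + red[i];
    return denom === 0 ? 0 : (n - red[i]) / denom;
  });
}

// Example: healthy vegetation reflects strongly in the near infrared.
const nirBand = [0.5, 0.3, 0.1];
const redBand = [0.1, 0.1, 0.1];
console.log(ndvi(nirBand, redBand)); // higher values => denser vegetation
```

In the application described above, a raster of such values would be classified, rendered, and served as a map layer; the computation itself is this simple ratio.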

  4. Using Mobile App Development Tools to Build a GIS Application

    NASA Astrophysics Data System (ADS)

    Mital, A.; Catchen, M.; Mital, K.

    2014-12-01

    Our group designed and built working web, Android, and iOS applications using different mapping libraries as bases on which to overlay fire data from NASA. The group originally planned to make app versions for Google Maps, Leaflet, and OpenLayers. However, because the Leaflet library did not properly load on Android, the group focused efforts on the other two mapping libraries. For Google Maps, the group first designed a UI for the web app and made a working version of the app. After updating the source of fire data to one which also provided historical fire data, the design had to be modified to include the extra data. After completing a working version of the web app, the group used WebView in Android, a built-in component which allowed porting the web app to Android without rewriting the code. Upon completing this, the group found Apple iOS devices had a similar capability, and so decided to add an iOS app to the project using a function similar to WebView. Alongside this effort, the group began implementing an OpenLayers fire map using a simpler UI. This web app was completed fairly quickly relative to Google Maps; however, it did not include functionality such as satellite imagery or searchable locations. The group finished the project with a working Android version of the Google Maps based app supporting API levels 14-19 and an OpenLayers based app supporting API levels 8-19, as well as a Google Maps based iOS app supporting both old and new screen formats. This project was implemented by high school and college students under an SGT Inc. STEM internship program.

  5. What Google Maps can do for biomedical data dissemination: examples and a design study.

    PubMed

    Jianu, Radu; Laidlaw, David H

    2013-05-04

    Biologists often need to assess whether unfamiliar datasets warrant the time investment required for more detailed exploration. Basing such assessments on brief descriptions provided by data publishers is unwieldy for large datasets that contain insights dependent on specific scientific questions. Alternatively, using complex software systems for a preliminary analysis may be deemed as too time consuming in itself, especially for unfamiliar data types and formats. This may lead to wasted analysis time and discarding of potentially useful data. We present an exploration of design opportunities that the Google Maps interface offers to biomedical data visualization. In particular, we focus on synergies between visualization techniques and Google Maps that facilitate the development of biological visualizations which have both low-overhead and sufficient expressivity to support the exploration of data at multiple scales. The methods we explore rely on displaying pre-rendered visualizations of biological data in browsers, with sparse yet powerful interactions, by using the Google Maps API. We structure our discussion around five visualizations: a gene co-regulation visualization, a heatmap viewer, a genome browser, a protein interaction network, and a planar visualization of white matter in the brain. Feedback from collaborative work with domain experts suggests that our Google Maps visualizations offer multiple, scale-dependent perspectives and can be particularly helpful for unfamiliar datasets due to their accessibility. We also find that users, particularly those less experienced with computer use, are attracted by the familiarity of the Google Maps API. Our five implementations introduce design elements that can benefit visualization developers. We describe a low-overhead approach that lets biologists access readily analyzed views of unfamiliar scientific datasets. 
We rely on pre-computed visualizations prepared by data experts, accompanied by sparse and intuitive interactions, and distributed via the familiar Google Maps framework. Our contributions are an evaluation demonstrating the validity and opportunities of this approach, a set of design guidelines benefiting those wanting to create such visualizations, and five concrete example visualizations.
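The low-overhead delivery described above works because Google Maps serves pre-rendered imagery as a tile pyramid, so a visualization exported as tiles is fetched one viewport at a time. A sketch of the tile arithmetic (the 256-pixel tile size and power-of-two zoom levels are the scheme the Maps API uses; the helper names are ours):

```javascript
// At zoom level z the map is a 2^z x 2^z grid of 256-pixel tiles.
// Pre-rendered visualizations are cut into this pyramid so the browser
// only downloads the tiles currently in view.
const TILE_SIZE = 256;

function tilesPerSide(zoom) {
  return 2 ** zoom;
}

function worldPixels(zoom) {
  return TILE_SIZE * tilesPerSide(zoom);
}

// Which tile contains a given fractional map coordinate in [0, 1)?
function tileFor(xFrac, yFrac, zoom) {
  const n = tilesPerSide(zoom);
  return { x: Math.floor(xFrac * n), y: Math.floor(yFrac * n) };
}

console.log(worldPixels(3));        // 2048
console.log(tileFor(0.6, 0.25, 3)); // { x: 4, y: 2 }
```

This is why the approach scales to arbitrarily large datasets: zooming in one level quadruples the total image area while the number of tiles fetched per screen stays constant.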

  6. What google maps can do for biomedical data dissemination: examples and a design study

    PubMed Central

    2013-01-01

    Background Biologists often need to assess whether unfamiliar datasets warrant the time investment required for more detailed exploration. Basing such assessments on brief descriptions provided by data publishers is unwieldy for large datasets that contain insights dependent on specific scientific questions. Alternatively, using complex software systems for a preliminary analysis may be deemed as too time consuming in itself, especially for unfamiliar data types and formats. This may lead to wasted analysis time and discarding of potentially useful data. Results We present an exploration of design opportunities that the Google Maps interface offers to biomedical data visualization. In particular, we focus on synergies between visualization techniques and Google Maps that facilitate the development of biological visualizations which have both low-overhead and sufficient expressivity to support the exploration of data at multiple scales. The methods we explore rely on displaying pre-rendered visualizations of biological data in browsers, with sparse yet powerful interactions, by using the Google Maps API. We structure our discussion around five visualizations: a gene co-regulation visualization, a heatmap viewer, a genome browser, a protein interaction network, and a planar visualization of white matter in the brain. Feedback from collaborative work with domain experts suggests that our Google Maps visualizations offer multiple, scale-dependent perspectives and can be particularly helpful for unfamiliar datasets due to their accessibility. We also find that users, particularly those less experienced with computer use, are attracted by the familiarity of the Google Maps API. Our five implementations introduce design elements that can benefit visualization developers. Conclusions We describe a low-overhead approach that lets biologists access readily analyzed views of unfamiliar scientific datasets. 
We rely on pre-computed visualizations prepared by data experts, accompanied by sparse and intuitive interactions, and distributed via the familiar Google Maps framework. Our contributions are an evaluation demonstrating the validity and opportunities of this approach, a set of design guidelines benefiting those wanting to create such visualizations, and five concrete example visualizations. PMID:23642009

  7. ANTP Protocol Suite Software Implementation Architecture in Python

    DTIC Science & Technology

    2011-06-03

    a popular platform of network programming, an area in which C has traditionally dominated. 2 NetController AeroRP AeroNP AeroNP API AeroTP...visualisation of the running system. For example, using the Google Maps API, the main logging web page can show all the running nodes in the system. By...communication between AeroNP and AeroRP and runs on the operating system as a daemon. Furthermore, it creates an API interface to manage the communication between

  8. Design and Implementation of Surrounding Transaction Plotting and Management System Based on Google Map API

    NASA Astrophysics Data System (ADS)

    Cao, Y. B.; Hua, Y. X.; Zhao, J. X.; Guo, S. M.

    2013-11-01

    With China's rapid economic development and growing comprehensive national strength, border work has become a long-term and important task in China's diplomatic work. Rapid plotting, real-time sharing and mapping of surrounding affairs have taken on great significance for government policy makers and diplomatic staff. However, existing boundary information systems suffer from several problems: updating geospatial data is labour-intensive, plotting tools are seriously lacking, and geographic events are difficult to share, which has seriously hampered border work. The development of Geographic Information System technology, especially Web GIS, offers a way to solve these problems. This paper adopts a four-layer B/S architecture and, with the support of the Google Maps service and its free API (noted for its openness, ease of use, sharing features and high-resolution imagery), designs and implements a surrounding transaction plotting and management system based on the web development technologies ASP.NET, C# and Ajax. The system can provide decision support for government policy makers as well as real-time plotting and sharing of surrounding information for diplomatic staff. Practice has shown that the system has good usability and strong real-time performance.

  9. [Who Hits the Mark? A Comparative Study of the Free Geocoding Services of Google and OpenStreetMap].

    PubMed

    Lemke, D; Mattauch, V; Heidinger, O; Hense, H W

    2015-09-01

    Geocoding, the process of converting textual information (addresses) into geographic coordinates, is increasingly used in public health/epidemiological research and practice. To date, little attention has been paid to geocoding quality and its impact on different types of spatially-related health studies. The primary aim of this study was to compare 2 freely available geocoding services (Google and OpenStreetMap) with regard to matching rate (percentage of address records capable of being geocoded) and positional accuracy (distance between geocodes and the ground truth locations). Residential addresses were geocoded by the NRW state office for information and technology and were considered as reference data (gold standard). The gold standard included the coordinates, the quality of the addresses (4 categories), and a binary urbanity indicator based on the CORINE land cover data. 2 500 addresses were randomly sampled after stratification for address quality and urbanity indicator (approximately 20 000 addresses). These address samples were geocoded using the geocoding services from Google and OSM. In general, both geocoding services showed a decrease in the matching rate with decreasing address quality and urbanity. Google consistently showed higher completeness than OSM (>93 vs. >82%). Also, the cartographic confounding between urban and rural regions was less distinct with Google's geocoding API. Regarding the positional accuracy of the geo-coordinates, Google also showed the smallest deviations from the reference coordinates, with a median of <9 vs. <175.8 m. The cumulative density function derived from the positional accuracy showed that nearly 95% of addresses for Google, and 50% for OSM, were geocoded within <50 m of their reference coordinates. The geocoding API from Google is superior to OSM regarding completeness and positional accuracy of the geocoded addresses. 
On the other hand, Google has several restrictions, such as the limitation of the requests to 2 500 addresses per 24 h and the presentation of the results exclusively on Google Maps, which may complicate the use for scientific purposes. © Georg Thieme Verlag KG Stuttgart · New York.
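The two headline measures of the study, matching rate and median positional error, can be sketched as follows (sample distances are invented for illustration; `null` stands for an address the service failed to geocode):

```javascript
// Matching rate: share of address records the service could geocode at all.
function matchingRate(results) {
  const matched = results.filter((d) => d !== null).length;
  return matched / results.length;
}

// Median of the distances (metres) between geocode and ground-truth location.
function median(values) {
  const s = [...values].sort((a, b) => a - b);
  const mid = Math.floor(s.length / 2);
  return s.length % 2 ? s[mid] : (s[mid - 1] + s[mid]) / 2;
}

// Hypothetical per-address positional errors from one geocoding service.
const errorsMetres = [4.2, 8.9, 12.5, 3.1, null, 7.7];
console.log(matchingRate(errorsMetres));                     // ≈0.83
console.log(median(errorsMetres.filter((d) => d !== null))); // 7.7
```

The median (rather than the mean) is the appropriate summary here because geocoding errors are heavily right-skewed: a few badly placed addresses would dominate a mean.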

  10. Learning to Map the Earth and Planets using a Google Earth-based Multi-student Game

    NASA Astrophysics Data System (ADS)

    De Paor, D. G.; Wild, S. C.; Dordevic, M.

    2011-12-01

    We report on progress in developing an interactive geological and geophysical mapping game employing the Google Earth, Google Moon, and Google Mars virtual globes. Working in groups of four, students represent themselves on the Google Earth surface by selecting an avatar. One member of the group drives to each field stop in a model vehicle using game-like controls. When they arrive at a field stop and get out of their field vehicle, students can control their own avatars' movements independently and can communicate with one another by text message. They are geo-fenced and receive automatic messages if they wander off target. Individual movements are logged and stored in a MySQL database for later analysis. Students collaborate on mapping decisions and submit a report to their instructor through a JavaScript interface to the Google Earth API. Unlike real mapping, students are not restricted by geographic access and can engage in comparative mapping on different planets. Using newly developed techniques, they can also explore and map the sub-surface down to the core-mantle boundary. Virtual specimens created with a 3D scanner, Gigapan images of outcrops, and COLLADA models of mantle structures such as subducted lithospheric slabs all contribute to an engaging learning experience.

  11. MaRGEE: Move and Rotate Google Earth Elements

    NASA Astrophysics Data System (ADS)

    Dordevic, Mladen M.; Whitmeyer, Steven J.

    2015-12-01

    Google Earth is recognized as a highly effective visualization tool for geospatial information. However, there remain serious limitations that have hindered its acceptance as a tool for research and education in the geosciences. One significant limitation is the inability to translate or rotate geometrical elements on the Google Earth virtual globe. Here we present a new JavaScript web application to "Move and Rotate Google Earth Elements" (MaRGEE). MaRGEE includes tools to simplify, translate, and rotate elements, add intermediate steps to a transposition, and batch process multiple transpositions. The transposition algorithm uses spherical geometry calculations, such as the haversine formula, to accurately reposition groups of points, paths, and polygons on the Google Earth globe without distortion. Due to the imminent deprecation of the Google Earth API and browser plugin, MaRGEE uses a Google Maps interface to facilitate and illustrate the transpositions. However, the inherent spatial distortions that result from the Google Maps Web Mercator projection are not apparent once the transposed elements are saved as a KML file and opened in Google Earth. Potential applications of the MaRGEE toolkit include tectonic reconstructions, the movements of glaciers or thrust sheets, and time-based animations of other large- and small-scale geologic processes.
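The haversine formula mentioned above gives the great-circle distance between two points from their latitudes and longitudes, which is what lets MaRGEE-style transpositions avoid planar distortion. A sketch in JavaScript (function and constant names are ours, not MaRGEE's):

```javascript
// Great-circle distance via the haversine formula.
// R is the mean Earth radius in kilometres; inputs are in degrees.
const R = 6371;

function toRad(deg) { return (deg * Math.PI) / 180; }

function haversineKm(lat1, lon1, lat2, lon2) {
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// One degree of latitude is roughly 111 km.
console.log(haversineKm(0, 0, 1, 0)); // ≈ 111.19
```

Working on the sphere rather than in a projected plane is the key design choice: distances and angles computed this way stay correct for groups of points, paths, and polygons anywhere on the globe.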

  12. Detecting Runtime Anomalies in AJAX Applications through Trace Analysis

    DTIC Science & Technology

    2011-08-10

    statements by adding the instrumentation to the GWT UI classes, leaving the user code untouched. Some content management frameworks such as Drupal [12...Google web toolkit.” http://code.google.com/webtoolkit/. [12] “Form generation – drupal api.” http://api.drupal.org/api/group/form_api/6. 9

  13. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    NASA Astrophysics Data System (ADS)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API -- an Internet-based tool combining DHTML and AJAX -- which allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface, making it ideal for visualizing multiple data sets at the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which is then mapped in Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting these coordinates into the equivalent Mercator projection before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures for the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how faults are identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. 
Although this project is in its early stages, high school and college teachers, as well as researchers have expressed interest in using and extending these tools for visualizing and interacting with data on Earth and other planetary bodies.
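The projection utilities described above ultimately map latitude and longitude into Mercator coordinates, where longitude scales linearly and latitude is stretched toward the poles. A sketch on a unit sphere (the helper name is ours):

```javascript
// Mercator projection on a unit sphere: x is the longitude in radians,
// y = ln(tan(pi/4 + lat/2)). The stretch in y grows without bound
// toward the poles, which is why Mercator maps cut off at high latitudes.
function mercator(latDeg, lonDeg) {
  const lat = (latDeg * Math.PI) / 180;
  const lon = (lonDeg * Math.PI) / 180;
  return { x: lon, y: Math.log(Math.tan(Math.PI / 4 + lat / 2)) };
}

console.log(mercator(0, 0));   // y ≈ 0 at the equator
console.log(mercator(60, 90)); // y grows rapidly toward the poles
```

Converting data from a general cylindrical coordinate system into this form, as the MARTIAN utilities do, lets the Maps API treat any planetary dataset exactly like its standard Earth tiles.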

  14. Feature Positioning on Google Street View Panoramas

    NASA Astrophysics Data System (ADS)

    Tsai, V. J. D.; Chang, C.-T.

    2012-07-01

    Location-based services (LBS) on web-based maps and images have operated in real time since Google launched its Street View imaging service in 2007. This research employs the Google Maps API and Web Service, GAE for Java, AJAX, Proj4js, CSS and HTML in developing an internet platform for accessing the orientation parameters of Google Street View (GSV) panoramas, in order to determine the three-dimensional position of features of interest that appear on two overlapping panoramas by geometric intersection. A pair of GSV panoramas was examined using known points located on the Library Building of National Chung Hsing University (NCHU), with root-mean-squared errors of ±0.522 m, ±1.230 m, and ±5.779 m for intersection and ±0.142 m, ±1.558 m, and ±5.733 m for resection in X, Y, and h (elevation), respectively. Potential error sources in GSV positioning were analyzed, showing that the errors in the Google-provided GSV positional parameters dominate the errors in geometric intersection. The developed system is suitable for data collection in establishing LBS applications integrated with Google Maps and Google Earth for traffic sign and infrastructure inventories, by adding automatic extraction and matching techniques for points of interest (POI) from GSV panoramas.
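The geometric intersection used above can be illustrated in a simplified planar form: two camera stations observe the same feature along known bearings, and the feature lies where the two rays cross. (The paper solves the full 3D problem from panorama orientation parameters; this 2D sketch, with our own function names, only shows the principle.)

```javascript
// Intersect two rays p1 + t*d1 and p2 + s*d2, with bearings given in
// radians measured from the x-axis. Returns null for parallel rays.
function intersect(p1, b1, p2, b2) {
  const d1 = { x: Math.cos(b1), y: Math.sin(b1) };
  const d2 = { x: Math.cos(b2), y: Math.sin(b2) };
  const r = { x: p2.x - p1.x, y: p2.y - p1.y };
  const cross = (a, b) => a.x * b.y - a.y * b.x;
  const denom = cross(d1, d2);
  if (Math.abs(denom) < 1e-12) return null; // parallel rays: no fix
  const t = cross(r, d2) / denom;
  return { x: p1.x + t * d1.x, y: p1.y + t * d1.y };
}

// Stations at (0,0) and (1,0) sighting the same feature at 45° and 135°.
console.log(intersect({ x: 0, y: 0 }, Math.PI / 4,
                      { x: 1, y: 0 }, (3 * Math.PI) / 4)); // ≈ { x: 0.5, y: 0.5 }
```

The error analysis in the paper follows directly from this geometry: any error in the station positions or bearings (the panorama's positional parameters) shifts the computed crossing point.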

  15. Secure and Resilient Cloud Computing for the Department of Defense

    DTIC Science & Technology

    2015-11-16

    platform as a service (PaaS), and software as a service (SaaS)—that target system administrators, developers, and end-users respectively (see Table 2...interfaces (API) and services Medium Amazon Elastic MapReduce, MathWorks Cloud, Red Hat OpenShift SaaS Full-fledged applications Low Google Gmail

  16. GIS tool to locate major Sikh temples in USA

    NASA Astrophysics Data System (ADS)

    Sharma, Saumya

    This tool is a GIS-based interactive tool with a graphical user interface that locates the major Sikh temples of the USA on a map. It is written in the Java programming language with MOJO (Map Object Java Object) provided by ESRI, the organization behind the GIS software, and integrates some of Google's APIs, such as the Google Translator API. The application tells users about the origin of Sikhism in India and the USA and presents the major Sikh temples in each US state, with location, name and detailed information available through their websites. The primary purpose of the application is to make people aware of this religion and culture. The tool can also measure the distance between two temple points on the map and display the result in miles and kilometers. In addition, each temple's website can be translated from English to Punjabi or any other language using a language-converter tool, so that people of different nationalities can understand the culture. Clicking a point on the map opens a new window showing a picture of the temple and a hyperlink that redirects to that temple's website. The tool also contains links to dance, music and history resources, as well as a help menu that guides users in using the software efficiently.

  17. Mars @ ASDC

    NASA Astrophysics Data System (ADS)

    Carraro, Francesco

    "Mars @ ASDC" is a project born with the goal of using the new web technologies to assist researches involved in the study of Mars. This project employs Mars map and javascript APIs provided by Google to visualize data acquired by space missions on the planet. So far, visualization of tracks acquired by MARSIS and regions observed by VIRTIS-Rosetta has been implemented. The main reason for the creation of this kind of tool is the difficulty in handling hundreds or thousands of acquisitions, like the ones from MARSIS, and the consequent difficulty in finding observations related to a particular region. This led to the development of a tool which allows to search for acquisitions either by defining the region of interest through a set of geometrical parameters or by manually selecting the region on the map through a few mouse clicks The system allows the visualization of tracks (acquired by MARSIS) or regions (acquired by VIRTIS-Rosetta) which intersect the user defined region. MARSIS tracks can be visualized both in Mercator and polar projections while the regions observed by VIRTIS can presently be visualized only in Mercator projection. The Mercator projection is the standard map provided by Google. The polar projections are provided by NASA and have been developed to be used in combination with APIs provided by Google The whole project has been developed following the "open source" philosophy: the client-side code which handles the functioning of the web page is written in javascript; the server-side code which executes the searches for tracks or regions is written in PHP and the DB which undergoes the system is MySQL.

  18. How to Display Hazards and other Scientific Data Using Google Maps

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Fee, J. M.

    2007-12-01

    The U.S. Geological Survey's (USGS) Volcano Hazard Program (VHP) is launching a map-based interface to display hazards information using the Google® Map API (Application Program Interface). Map-based interfaces provide a synoptic view of data, making patterns easier to detect and allowing users to quickly ascertain where hazards are in relation to major population and infrastructure centers. Several map-based interfaces are now simple to run on a web server, providing ideal platforms for sharing information with colleagues, emergency managers, and the public. There are three main steps to making data accessible on a map-based interface: formatting the input data, plotting the data on the map, and customizing the user interface. The presentation, "Creating Geospatial RSS and ATOM feeds for Map-based Interfaces" (Fee and Venezky, this session), reviews key features for map input data. Join us for this presentation on how to plot data in a geographic context and then format the display with images, custom markers, and links to external data. Examples will show how the VHP Volcano Status Map was created and how to plot a field trip with driving directions.
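The "plotting the data on the map" step can be sketched as converting hazard records into the option objects a Maps API marker takes (field names, icon files and alert levels here are invented; in the browser each object would feed a marker constructor such as `new google.maps.Marker(opts)`):

```javascript
// Map hypothetical volcano alert levels to custom marker icons.
const ICONS = { normal: 'green.png', advisory: 'yellow.png', warning: 'red.png' };

// Build the marker options for one hazard record.
function toMarkerOptions(volcano) {
  return {
    position: { lat: volcano.lat, lng: volcano.lon },
    title: `${volcano.name} (${volcano.alert})`,
    icon: ICONS[volcano.alert],
  };
}

const opts = toMarkerOptions({ name: 'Mt. Example', lat: 46.2, lon: -122.18, alert: 'warning' });
console.log(opts.icon); // red.png
```

Keeping the record-to-marker conversion in one pure function makes it easy to restyle the whole map (new icons, new alert scheme) without touching the data feed or the map setup code.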

  19. Population resizing on fitness improvement genetic algorithm to optimize promotion visit route based on android and google maps API

    NASA Astrophysics Data System (ADS)

    Listyorini, Tri; Muzid, Syafiul

    2017-06-01

    The promotion team of Muria Kudus University (UMK) makes annual promotion visits to several senior high schools in Indonesia. The visits cover schools in Kudus, Jepara, Demak, Rembang and Purwodadi. To simplify the visits, each round is limited to 15 (fifteen) schools. However, the team frequently faces obstacles during the visits, particularly in determining the route to take toward a targeted school: long distances or difficult routes lead to elongated travel times and inefficient fuel costs. To solve these problems, an application was developed using a heuristic genetic algorithm with dynamic population size, Population Resizing on Fitness Improvement Genetic Algorithm (PRoFIGA). This Android-based application was developed to make the visits easier and to determine shorter routes for the team, so that the visiting period becomes effective and efficient. The result of this research is an Android-based application that determines the shortest route by combining the heuristic method with the Google Maps Application Programming Interface (API) and displays the route options for the team.
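At the core of a route-optimizing genetic algorithm such as PRoFIGA is a fitness function that scores a candidate visit order by total tour length; the GA then evolves permutations toward shorter tours, resizing the population as fitness improves. A sketch with a hypothetical distance matrix (road kilometres between four schools):

```javascript
// Hypothetical symmetric distance matrix: dist[i][j] is the road
// distance in kilometres between school i and school j.
const dist = [
  [0, 2, 9, 10],
  [2, 0, 6, 4],
  [9, 6, 0, 8],
  [10, 4, 8, 0],
];

// Fitness of a candidate route: total length of the closed tour that
// visits the schools in the given order and returns to the start.
function tourLength(order) {
  let total = 0;
  for (let i = 0; i < order.length; i++) {
    total += dist[order[i]][order[(i + 1) % order.length]];
  }
  return total;
}

// A GA would evolve permutations; here we just compare two candidates.
console.log(tourLength([0, 1, 3, 2])); // 2 + 4 + 8 + 9 = 23
console.log(tourLength([0, 2, 1, 3])); // 9 + 6 + 4 + 10 = 29
```

In the application described above, the distance matrix would come from the Google Maps API rather than being hard-coded, and the selected permutation would be rendered as a route on the map.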

  20. Participating in the Geospatial Web: Collaborative Mapping, Social Networks and Participatory GIS

    NASA Astrophysics Data System (ADS)

    Rouse, L. Jesse; Bergeron, Susan J.; Harris, Trevor M.

    In 2005, Google, Microsoft and Yahoo! released free Web mapping applications that opened up digital mapping to mainstream Internet users. Importantly, these companies also released free APIs for their platforms, allowing users to geo-locate and map their own data. These initiatives have spurred the growth of the Geospatial Web and represent spatially aware online communities and new ways of enabling communities to share information from the bottom up. This chapter explores how the emerging Geospatial Web can meet some of the fundamental needs of Participatory GIS projects to incorporate local knowledge into GIS, as well as promote public access and collaborative mapping.

  1. Finding, Weighting and Describing Venues: CSIRO at the 2012 TREC Contextual Suggestion Track

    DTIC Science & Technology

    2012-11-01

    commercial system (namely the Google Places API), and whether the current experimental setup encourages diversity. The remaining two submissions...baseline systems that rely on the Google Places API and the user reviews it provides, and two more complex systems that incorporate information...from the Foursquare API, and are sensitive to personal preference and time. The remainder of this paper is structured as follows. The next section

  2. Visualize Your Data with Google Fusion Tables

    NASA Astrophysics Data System (ADS)

    Brisbin, K. E.

    2011-12-01

    Google Fusion Tables is a modern data management platform that makes it easy to host, manage, collaborate on, visualize, and publish tabular data online. Fusion Tables allows users to upload their own data to the Google cloud, which they can then use to create compelling and interactive visualizations with the data. Users can view data on a Google Map, plot data in a line chart, or display data along a timeline. Users can share these visualizations with others to explore and discover interesting trends about various types of data, including scientific data such as invasive species or global trends in disease. Fusion Tables has been used by many organizations to visualize a variety of scientific data. One example is the California Redistricting Map created by the LA Times: http://goo.gl/gwZt5. The Pacific Institute and Circle of Blue have used Fusion Tables to map the quality of water around the world: http://goo.gl/T4SX8. The World Resources Institute mapped the threat level of coral reefs using Fusion Tables: http://goo.gl/cdqe8. What attendees will learn in this session: This session will cover all the steps necessary to use Fusion Tables to create a variety of interactive visualizations. Attendees will begin by learning about the various options for uploading data into Fusion Tables, including Shapefile, KML file, and CSV file import. Attendees will then learn how to use Fusion Tables to manage their data by merging it with other data and controlling the permissions of the data. Finally, the session will cover how to create a customized visualization from the data, and share that visualization with others using both Fusion Tables and the Google Maps API.

  3. Extensible Probabilistic Repository Technology (XPRT)

    DTIC Science & Technology

    2004-10-01

    projects, such as Centaurus, Evidence Data Base (EDB), etc.; others were fabricated, such as INS and FED, while others contain data from the open...Google Web Report Unlimited SOAP API News BBC News Unlimited WEB RSS 1.0 Centaurus Person Demographics 204,402 people from 240 countries...objects of the domain ontology map to the various simulated data-sources. For example, the PersonDemographics are stored in the Centaurus database, while

  4. jmzIdentML API: A Java interface to the mzIdentML standard for peptide and protein identification data.

    PubMed

    Reisinger, Florian; Krishna, Ritesh; Ghali, Fawaz; Ríos, Daniel; Hermjakob, Henning; Vizcaíno, Juan Antonio; Jones, Andrew R

    2012-03-01

    We present a Java application programming interface (API), jmzIdentML, for the Human Proteome Organisation (HUPO) Proteomics Standards Initiative (PSI) mzIdentML standard for peptide and protein identification data. The API combines the power of Java Architecture for XML Binding (JAXB) and an XPath-based random-access indexer to allow fast and efficient mapping of extensible markup language (XML) elements to Java objects. The internal references in mzIdentML files are resolved in an on-demand manner: the whole file is accessed as a random-access swap file, and only the relevant piece of XML is selected for mapping to its corresponding Java object. The API is highly efficient in its memory usage and can handle files of arbitrary sizes. The API follows the official release of the mzIdentML (version 1.1) specifications and is available in the public domain under a permissive licence at http://www.code.google.com/p/jmzidentml/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
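The on-demand mapping strategy the abstract describes (index the file once, then parse only the element a reference points to) can be sketched in miniature. This is a hypothetical Python illustration of the idea, not the actual Java/JAXB API; the tag names, document, and helper functions are invented for the example and assume non-nested elements:

```python
import re
import xml.etree.ElementTree as ET

def build_index(xml_bytes, tag):
    """Record the byte span of every <tag>...</tag> element (non-nested)."""
    pattern = re.compile(
        br"<" + tag.encode() + br"\b.*?</" + tag.encode() + br">", re.S
    )
    return [(m.start(), m.end()) for m in pattern.finditer(xml_bytes)]

def fetch(xml_bytes, span):
    """Parse a single indexed element on demand; the rest stays unparsed."""
    start, end = span
    return ET.fromstring(xml_bytes[start:end])

doc = (b"<mzIdentML>"
       b'<Peptide id="p1"><Seq>MKTAY</Seq></Peptide>'
       b'<Peptide id="p2"><Seq>GGRQL</Seq></Peptide>'
       b"</mzIdentML>")

index = build_index(doc, "Peptide")
second = fetch(doc, index[1])          # only this element is parsed
print(second.get("id"), second.findtext("Seq"))
```

Because only byte offsets are held in memory, an index like this stays small no matter how large the underlying file grows, which is the property the abstract credits for handling files of arbitrary sizes.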

  5. Optimizing Travel Time to Outpatient Interventional Radiology Procedures in a Multi-Site Hospital System Using a Google Maps Application.

    PubMed

    Mandel, Jacob E; Morel-Ovalle, Louis; Boas, Franz E; Ziv, Etay; Yarmohammadi, Hooman; Deipolyi, Amy; Mohabir, Heeralall R; Erinjeri, Joseph P

    2018-02-20

    The purpose of this study is to determine whether a custom Google Maps application can optimize site selection when scheduling outpatient interventional radiology (IR) procedures within a multi-site hospital system. The Google Maps for Business Application Programming Interface (API) was used to develop an internal web application that uses real-time traffic data to determine estimated travel time (ETT; minutes) and estimated travel distance (ETD; miles) from a patient's home to each nearby IR facility in our hospital system. Hypothetical patient home addresses based on the 33 cities comprising our institution's catchment area were used to determine the optimal IR site for hypothetical patients traveling from each city based on real-time traffic conditions. For 10/33 (30%) cities, there was discordance between the optimal IR site based on ETT and the optimal IR site based on ETD at non-rush hour time or rush hour time. By choosing to travel to an IR site based on ETT rather than ETD, patients from discordant cities were predicted to save an average of 7.29 min during non-rush hour (p = 0.03), and 28.80 min during rush hour (p < 0.001). Using a custom Google Maps application to schedule outpatients for IR procedures can effectively reduce patient travel time when more than one location providing IR procedures is available within the same hospital system.
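The site-selection logic reduces to comparing minima over two metrics. A minimal Python sketch, with invented site names and estimates (in practice the ETT and ETD values would come from a routing service such as the Google Maps API):

```python
def optimal_site(estimates):
    """Pick the IR site with the shortest estimated travel time (ETT).

    `estimates` maps site name -> (ett_minutes, etd_miles), as a routing
    service would report for one patient address.
    """
    by_time = min(estimates, key=lambda s: estimates[s][0])
    by_distance = min(estimates, key=lambda s: estimates[s][1])
    discordant = by_time != by_distance
    return by_time, by_distance, discordant

# Hypothetical rush-hour estimates for one patient address:
rush_hour = {
    "Main campus":   (55, 12.0),   # closer in miles, but congested
    "Suburban site": (35, 18.5),   # farther, but faster to reach
}
site_ett, site_etd, discordant = optimal_site(rush_hour)
print(site_ett, site_etd, discordant)   # Suburban site Main campus True
```

Discordance, as in the study's 10 of 33 cities, arises exactly when the time-optimal and distance-optimal sites differ.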

  6. From Analysis to Impact: Challenges and Outcomes from Google's Cloud-based Platforms for Analyzing and Leveraging Petapixels of Geospatial Data

    NASA Astrophysics Data System (ADS)

    Thau, D.

    2017-12-01

    For the past seven years, Google has made petabytes of Earth observation data, and the tools to analyze it, freely available to researchers around the world via cloud computing. These data and tools were initially available via Google Earth Engine and are increasingly available on the Google Cloud Platform. We have introduced a number of APIs for both the analysis and presentation of geospatial data that have been successfully used to create impactful datasets and web applications, including studies of global surface water availability, global tree cover change, and crop yield estimation. Each of these projects used the cloud to analyze thousands to millions of Landsat scenes. The APIs support a range of publishing options, from outputting imagery and data for inclusion in papers, to providing tools for full scale web applications that provide analysis tools of their own. Over the course of developing these tools, we have learned a number of lessons about how to build a publicly available cloud platform for geospatial analysis, and about how the characteristics of an API can affect the kinds of impacts a platform can enable. This study will present an overview of how Google Earth Engine works and how Google's geospatial capabilities are extending to Google Cloud Platform. We will provide a number of case studies describing how these platforms, and the data they host, have been leveraged to build impactful decision support tools used by governments, researchers, and other institutions, and we will describe how the available APIs have shaped (or constrained) those tools. [Image Credit: Tyler A. Erickson]

  7. An integrated WebGIS framework for volunteered geographic information and social media in soil and water conservation.

    PubMed

    Werts, Joshua D; Mikhailova, Elena A; Post, Christopher J; Sharp, Julia L

    2012-04-01

    Volunteered geographic information and social networking in a WebGIS have the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may be otherwise unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) to develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.

  8. An Integrated WebGIS Framework for Volunteered Geographic Information and Social Media in Soil and Water Conservation

    NASA Astrophysics Data System (ADS)

    Werts, Joshua D.; Mikhailova, Elena A.; Post, Christopher J.; Sharp, Julia L.

    2012-04-01

    Volunteered geographic information and social networking in a WebGIS have the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may be otherwise unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) to develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.

  9. Google Earth Engine: a new cloud-computing platform for global-scale earth observation data and analysis

    NASA Astrophysics Data System (ADS)

    Moore, R. T.; Hansen, M. C.

    2011-12-01

    Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically-parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and strengthen the ability for public institutions and civil society to better understand, manage and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly-collected Landsat imagery is downloaded from USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports generation, on-demand, of spatial and temporal mosaics, "best-pixel" composites (for example to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming framework, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically-parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011. 
Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration, as well as transparency in data and methods. Methods developed for global processing of MODIS data to map land cover are being adopted for use with Landsat data. Specifically, the MODIS Vegetation Continuous Field product methodology has been applied for mapping forest extent and change at national scales using Landsat time-series data sets. Scaling this method to continental and global scales is enabled by Google Earth Engine computing capabilities. By combining the supervised learning VCF approach with the Landsat archive and cloud computing, unprecedented monitoring of land cover dynamics is enabled.
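The "best-pixel" compositing mentioned above can be illustrated with a toy per-pixel median that skips cloud- or gap-masked observations. This is a plain-Python sketch of the concept, not Earth Engine's implementation; the grids and mask convention are invented for the example:

```python
def best_pixel_composite(scenes, nodata=None):
    """Median composite across co-registered scenes, skipping masked pixels.

    `scenes` is a list of equally sized 2-D grids (lists of rows); cloudy or
    gap pixels are marked with `nodata` and excluded from the statistic.
    """
    rows, cols = len(scenes[0]), len(scenes[0][0])
    out = [[nodata] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = sorted(s[r][c] for s in scenes if s[r][c] != nodata)
            if vals:                      # at least one clear observation
                mid = len(vals) // 2
                out[r][c] = (vals[mid] if len(vals) % 2
                             else (vals[mid - 1] + vals[mid]) / 2)
    return out

# Three tiny "scenes"; None marks cloud/gap pixels:
scenes = [
    [[10, None]],
    [[30, 7]],
    [[20, 9]],
]
print(best_pixel_composite(scenes))   # [[20, 8.0]]
```

The per-pixel independence of this statistic is what makes the operation embarrassingly parallel and hence a natural fit for a cluster of thousands of machines.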

  10. Using GeoRSS feeds to distribute house renting and selling information based on Google map

    NASA Astrophysics Data System (ADS)

    Nong, Yu; Wang, Kun; Miao, Lei; Chen, Fei

    2007-06-01

    Geographically Encoded Objects RSS (GeoRSS) is a way to encode location in RSS feeds. RSS is a widely supported format for syndicating news and weblogs, and is extendable to publish any sort of itemized data. As weblogs have exploded and RSS feeds have become new portals, geo-tagged feeds are needed to show the locations the stories describe. GeoRSS adopts the core of the RSS framework, so its map annotations are specified in the RSS XML format. The case studied illustrates that GeoRSS can be maximally concise in representation and conception: feeds are simple to generate and then mash up with Google Maps through the API, showing real estate information with other attributes in the information window. After subscribing to feeds on subjects of interest, users can easily check for new bulletins shown on the map through syndication. The primary design goal of GeoRSS is to make spatial data creation as easy as regular Web content development. Its simplicity and effectiveness, however, accomplish more: they bridge the gap between traditional GIS professionals and amateurs, Web map hackers, and the numerous services that enable location-based content.
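A GeoRSS-Simple annotation is just an extra `georss:point` element ("lat lon" in WGS84) inside an ordinary RSS item. A minimal Python sketch of generating one such item for a property listing (the listing text and coordinates are invented):

```python
import xml.etree.ElementTree as ET

GEORSS = "http://www.georss.org/georss"
ET.register_namespace("georss", GEORSS)

def georss_item(title, description, lat, lon):
    """Build one RSS <item> carrying a GeoRSS-Simple point annotation."""
    item = ET.Element("item")
    ET.SubElement(item, "title").text = title
    ET.SubElement(item, "description").text = description
    # GeoRSS-Simple encodes "lat lon" in WGS84:
    ET.SubElement(item, f"{{{GEORSS}}}point").text = f"{lat} {lon}"
    return item

item = georss_item("2BR flat for rent", "Sunny, near metro", 39.9042, 116.4074)
xml = ET.tostring(item, encoding="unicode")
print(xml)
```

A consumer such as a Google Maps mashup reads the `georss:point` value to place the item's marker, while the title and description feed the info window.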

  11. How Accurately Can the Google Web Speech API Recognize and Transcribe Japanese L2 English Learners' Oral Production?

    ERIC Educational Resources Information Center

    Ashwell, Tim; Elam, Jesse R.

    2017-01-01

    The ultimate aim of our research project was to use the Google Web Speech API to automate scoring of elicited imitation (EI) tests. However, in order to achieve this goal, we had to take a number of preparatory steps. We needed to assess how accurate this speech recognition tool is in recognizing native speakers' production of the test items; we…

  12. Signalling maps in cancer research: construction and data analysis

    PubMed Central

    Kondratova, Maria; Sompairac, Nicolas; Barillot, Emmanuel; Zinovyev, Andrei

    2018-01-01

    Generation and usage of high-quality molecular signalling network maps can be augmented by standardizing notations, establishing curation workflows and applying computational biology methods to exploit the knowledge contained in the maps. In this manuscript, we summarize the major aims and challenges of assembling information in the form of comprehensive maps of molecular interactions. Mainly, we share our experience gained while creating the Atlas of Cancer Signalling Network. In a step-by-step procedure, we describe the map construction process and suggest solutions for map complexity management by introducing a hierarchical modular map structure. In addition, we describe the NaviCell platform, a computational technology that uses the Google Maps API to explore comprehensive molecular maps in the same way as geographical maps, and explain the advantages of semantic zooming principles for map navigation. We also provide an outline of how to prepare signalling network maps for navigation using the NaviCell platform. Finally, several examples of cancer high-throughput data analysis and visualization in the context of comprehensive signalling maps are presented. PMID:29688383
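Semantic zooming works like map labelling: each entity carries a minimum zoom level, and only entities appropriate to the current scale are drawn, so coarse modules appear first and individual molecules only once the user zooms in. A schematic Python sketch with invented entity names (NaviCell's actual mechanics are richer):

```python
def visible_labels(entities, zoom):
    """Semantic zoom: show only entities whose detail level fits the zoom.

    Each entity carries a `min_zoom`; like map labels, coarse modules appear
    at low zoom and individual nodes only at high zoom.
    """
    return [name for name, min_zoom in entities if zoom >= min_zoom]

# Invented hierarchy, coarse to fine:
atlas = [
    ("Apoptosis module", 0),     # always visible
    ("Caspase cascade",  2),     # appears at mid zoom
    ("CASP3",            4),     # individual protein, high zoom only
]
print(visible_labels(atlas, zoom=0))   # ['Apoptosis module']
print(visible_labels(atlas, zoom=4))   # all three
```

Tying detail level to zoom is what keeps a comprehensive map readable: the hierarchical modular structure the abstract describes supplies exactly these per-entity detail levels.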

  13. Reaching the Next Generation of College Students via Their Digital Devices.

    NASA Astrophysics Data System (ADS)

    Whitmeyer, S. J.; De Paor, D. G.; Bentley, C.

    2015-12-01

    Current college students attended school during a decade in which many school districts banned cellphones from the classroom or even from school grounds. These students are used to being told to put away their mobile devices and concentrate on traditional classroom activities such as watching PowerPoint presentations or calculating with pencil and paper. However, due to a combination of parental security concerns and recent education research, schools are rapidly changing policy and embracing mobile devices for ubiquitous learning opportunities inside and outside of the classroom. Consequently, many of the next generation of college students will have expectations of learning via mobile technology. We have developed a range of digital geology resources to aid mobile-based geoscience education at college level, including mapping on iPads and other tablets, "crowd-sourced" field projects, augmented reality-supported asynchronous field classes, 3D and 4D split-screen virtual reality tours, macroscopic and microscopic gigapixel imagery, 360° panoramas, assistive devices for inclusive field education, and game-style educational challenges. Class testing of virtual planetary tours shows modest short-term learning gains, but more work is needed to ensure long-term retention. Many of our resources rely on the Google Earth browser plug-in and application program interface (API). Because of security concerns, browser plug-ins in general are being phased out and the Google Earth API will not be supported in future browsers. However, a new plug-in-free API is promised by Google and an alternative open-source virtual globe called Cesium is undergoing rapid development. It already supports the main aspects of Keyhole Markup Language and has features of significant benefit to geoscience, including full support on mobile devices and sub-surface viewing and touring. 
The research team includes: Heather Almquist, Stephen Burgin, Cinzia Cervato, Filis Coba, Chloe Constants, Gene Cooper, Mladen Dordevic, Marissa Dudek, Brandon Fitzwater, Bridget Gomez, Tyler Hansen, Paul Karabinos, Terry Pavlis, Jen Piatek, Alan Pitts, Robin Rohrback, Bill Richards, Caroline Robinson, Jeff Rollins, Jeff Ryan, Ron Schott, Kristen St. John, and Barb Tewksbury. Supported by NSF DUE 1323419 and by Google Geo Curriculum Awards.

  14. Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Gorelick, Noel

    2013-04-01

    The Google Earth Engine platform is a system designed to enable petabyte-scale, scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment including a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment to allow interactive data and algorithm development and exploration and provides a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based data sets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit-depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, a user is able to add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geospatial data. All computation is performed lazily; nothing is computed until it's required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing. 
Through integration with a variety of other services, Earth Engine is able to bring to bear considerable analytic and technical firepower in a transparent fashion, including: AI-based classification via integration with Google's machine learning infrastructure, publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth, and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.
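The lazy, demand-driven computation model described above can be illustrated with a tiny deferred-evaluation graph: building the graph does no work, and each node's function runs only when a result is requested. This is a conceptual Python sketch with invented stand-in functions, not Earth Engine code:

```python
class Lazy:
    """A deferred computation: nothing runs until .compute() is called."""
    def __init__(self, fn, *deps):
        self.fn, self.deps = fn, deps

    def compute(self, trace=None):
        args = [d.compute(trace) if isinstance(d, Lazy) else d
                for d in self.deps]
        if trace is not None:
            trace.append(self.fn.__name__)
        return self.fn(*args)

def load(name):          # stand-in for reading a scene from the catalog
    return {"scene-1": [3, 1, 2]}[name]

def sort_bands(vals):
    return sorted(vals)

def first(vals):
    return vals[0]

# Building the graph performs no work...
graph = Lazy(first, Lazy(sort_bands, Lazy(load, "scene-1")))
trace = []
# ...computation happens only when a result is demanded:
print(graph.compute(trace))   # 1
print(trace)                  # ['load', 'sort_bands', 'first']
```

Deferring work this way is what lets the same pipeline drive both fast interactive previews (compute only visible tiles) and full-scale batch production runs.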

  15. Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Gorelick, N.

    2012-12-01

    The Google Earth Engine platform is a system designed to enable petabyte-scale, scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment including a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment to allow interactive data and algorithm development and exploration and provides a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based data sets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit-depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, a user is able to add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geospatial data. All computation is performed lazily; nothing is computed until it's required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing. 
Through integration with a variety of other services, Earth Engine is able to bring to bear considerable analytic and technical firepower in a transparent fashion, including: AI-based classification via integration with Google's machine learning infrastructure, publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth, and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.

  16. 3D Visualization of near real-time remote-sensing observation for hurricanes field campaign using Google Earth API

    NASA Astrophysics Data System (ADS)

    Li, P.; Turk, J.; Vu, Q.; Knosp, B.; Hristova-Veleva, S. M.; Lambrigtsen, B.; Poulsen, W. L.; Licata, S.

    2009-12-01

    NASA is planning a new field experiment, the Genesis and Rapid Intensification Processes (GRIP) experiment, in the summer of 2010 to better understand how tropical storms form and develop into major hurricanes. The DC-8 aircraft and the Global Hawk Unmanned Airborne System (UAS) will be deployed, loaded with instruments for measurements including lightning, temperature, 3D wind, precipitation, liquid and ice water contents, and aerosol and cloud profiles. During the field campaign, both the spaceborne and the airborne observations will be collected in real time and integrated with the hurricane forecast models. This observation-model integration will help the campaign achieve its science goals by allowing team members to plan the mission effectively with current forecasts. To support the GRIP experiment, JPL developed a website for interactive visualization of all related remote-sensing observations in GRIP's geographical domain using the new Google Earth API. All the observations are collected in near real-time (NRT) with 2 to 5 hour latency. The observations include a 1 km blended Sea Surface Temperature (SST) map from GHRSST L2P products; 6-hour composite images of GOES IR; stability indices, temperature and vapor profiles from AIRS and AMSU-B; microwave brightness temperature and rain index maps from AMSR-E, SSMI and TRMM-TMI; ocean surface wind vectors, vorticity and divergence of the wind from QuikSCAT; the 3D precipitation structure from TRMM-PR; and vertical profiles of cloud and precipitation from CloudSAT. All the NRT observations are collected from the data centers and science facilities at NASA and NOAA, subsetted, re-projected, and composited into hourly or daily data products depending on the frequency of the observation. The data products are then displayed on the 3D Google Earth plug-in at the JPL Tropical Cyclone Information System (TCIS) website. 
The data products offered by the TCIS in the Google Earth display include image overlays, wind vectors, clickable placemarks with vertical profiles of temperature and water vapor, and curtain plots along the satellite tracks. Multiple products can be overlaid, each with individually adjustable opacity control. Time-sequence visualization is supported through a calendar and Google Earth time animation. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.
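The clickable placemarks with attached vertical profiles can be sketched as generated KML: a Placemark whose description carries an HTML sounding table. The coordinates and profile values below are invented for illustration; note that KML orders coordinates as lon,lat,alt:

```python
def kml_placemark(name, lat, lon, profile):
    """Emit a minimal KML Placemark whose description holds a sounding table.

    `profile` is a list of (pressure_hPa, temperature_C) pairs, standing in
    for a temperature profile attached to a clickable placemark.
    """
    rows = "".join(f"<tr><td>{p}</td><td>{t}</td></tr>" for p, t in profile)
    table = f"<table><tr><th>hPa</th><th>degC</th></tr>{rows}</table>"
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
        f"<Placemark><name>{name}</name>"
        f"<description><![CDATA[{table}]]></description>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark></Document></kml>"
    )

kml_doc = kml_placemark("AIRS sounding", 25.0, -80.0, [(1000, 26.5), (500, -8.2)])
print("-80.0,25.0,0" in kml_doc)   # True: KML orders lon,lat,alt
```

Wrapping the table in CDATA lets Google Earth render it as HTML inside the placemark balloon when the user clicks the marker.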

  17. Application based on ArcObject inquiry and Google maps demonstration to real estate database

    NASA Astrophysics Data System (ADS)

    Hwang, JinTsong

    2007-06-01

    The real estate industry in Taiwan has flourished in recent years. Consumers and brokerages share the same goal of acquiring varied and abundant information about real estate for sale. Before visiting a property, it is therefore important to gather all pertinent information possible. This not only benefits real estate agents, who can provide sellers with the most complete information and thereby solidify the buyer's interest, but also saves time and manpower when something is out of place. Most brokerage sites use the Internet as a medium for publicity; however, their content is limited to the specific property itself, and their query functions mostly support only search by condition. This paper proposes a web query interface that offers zone queries via spatial analysis for non-GIS users, a user-friendly interface developed with ArcObject in VB6, and query by condition. The inquiry results are shown on a web page with the Google Maps and UrMap API functions embedded in it. In addition, the inquiry results are presented through multimedia, including hyperlinks to Google Earth showing the surroundings of the property, a Virtual Reality scene of the house, panoramas of the building interior, and so on. The website thus provides an additional spatial solution for querying and for presenting abundant real estate information in both two-dimensional and three-dimensional views.

  18. Sedimentation and erosion in Lake Diefenbaker, Canada: solutions for shoreline retreat monitoring.

    PubMed

    Sadeghian, Amir; de Boer, Dirk; Lindenschmidt, Karl-Erich

    2017-09-15

    This study looks into sedimentation and erosion rates in Lake Diefenbaker, a prairie reservoir in Saskatchewan, Canada, which has been in operation since 1968. First, we examined the historical data in all its different formats over the last 70 years, including data from more than 20 years before the formation of the lake. The field observations indicate high rates of shoreline erosion, especially in the upstream portion, a potential region for shoreline retreat. Because of the great importance of this waterbody to the province, monitoring sedimentation and erosion rates is necessary for maintaining water quality, especially after severe floods, which are becoming more common due to climate change. Second, we used the Google Maps Elevation API, a tool from Google that provides elevation data for cross sections drawn between two points, to draw 24 cross sections in the upstream area extending 250 m from each bank. This feature from Google can be used as an easy and fast monitoring tool, is free of charge, and provides excellent control capabilities for monitoring changes in cross-sectional profiles.
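A cross-section query of this kind is a single `path` request to the Elevation API, sampling elevations between two endpoints. A Python sketch that only builds the request URL (the coordinates are invented, and a real API key is required to actually fetch the JSON response):

```python
from urllib.parse import urlencode

def elevation_path_url(p1, p2, samples, key="YOUR_API_KEY"):
    """Build an Elevation API request sampling a cross section between two
    (lat, lon) points; the service returns elevations as JSON."""
    path = f"{p1[0]},{p1[1]}|{p2[0]},{p2[1]}"
    query = urlencode({"path": path, "samples": samples, "key": key})
    return "https://maps.googleapis.com/maps/api/elevation/json?" + query

# One hypothetical cross section near the reservoir's upstream banks:
url = elevation_path_url((51.030, -106.970), (51.032, -106.967), samples=50)
print(url)
```

Repeating such a request for each of the 24 cross sections at intervals, and differencing the returned profiles over time, gives the simple change-monitoring workflow the abstract describes.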

  19. Map based multimedia tool on Pacific theatre in World War II

    NASA Astrophysics Data System (ADS)

    Pakala Venkata, Devi Prasada Reddy

    Maps have been used for depicting data of all kinds in the educational community for many years. One standout among rapidly changing teaching methods is the development of interactive and dynamic maps. The emphasis of this thesis is to develop an intuitive map-based multimedia tool that provides a timeline of battles and events in the Pacific theatre of World War II. The tool contains summaries of major battles and commanders and has multimedia content embedded in it. The primary advantage of this map tool is that one can quickly learn about all the battles and campaigns of the Pacific Theatre by interactively accessing the timeline of battles in each region, individual battles in each region, or a summary of each battle. The tool can be accessed via any standard web browser and motivates the user to learn more about the battles of the Pacific Theatre. It was made responsive using the Google Maps API, JavaScript, HTML5 and CSS.

  20. Development, Deployment, and Assessment of Dynamic Geological and Geophysical Models Using the Google Earth APP and API: Implications for Undergraduate Education in the Earth and Planetary Sciences

    NASA Astrophysics Data System (ADS)

    de Paor, D. G.; Whitmeyer, S. J.; Gobert, J.

    2009-12-01

    We previously reported on innovative techniques for presenting data on virtual globes such as Google Earth using emergent Collada models that reveal subsurface geology and geophysics. We here present several new and enhanced models and linked lesson plans to aid deployment in undergraduate geoscience courses, along with preliminary results from our assessment of their effectiveness. The new Collada models are created with Google SketchUp, Bonzai3D, and MeshLab software, and are grouped to cover (i) small scale field mapping areas; (ii) regional scale studies of the North Atlantic Ocean Basin, the Appalachian Orogen, and the Pacific Ring of Fire; and (iii) global scale studies of terrestrial planets, moons, and asteroids. Enhancements include emergent block models with three-dimensional surface topography; models that conserve structural orientation data; interactive virtual specimens; models that animate plate movements on the virtual globe; exploded 3-D views of planetary mantles and cores; and server-generated dynamic KML. We tested volunteer students and professors using Silverback monitoring software, think-aloud verbalizations, and questionnaires designed to assess their understanding of the underlying geo-scientific phenomena. With the aid of a cohort of instructors across the U.S., we are continuing to assess areas in which users encounter difficulties with both the software and geoscientific concepts. Preliminary results suggest that it is easy to overestimate the computer expertise of novice users even when they are content knowledge experts (i.e., instructors), and that a detailed introduction to virtual globe manipulation is essential before moving on to geoscience applications. Tasks that seem trivial to developers may present barriers to non-technical users and technicalities that challenge instructors may block adoption in the classroom. 
We have developed new models using the Google Earth API which permits enhanced interaction and dynamic feedback and are assessing their relative merits versus the Google Earth APP. Overall, test students and professors value the models very highly. There are clear pedagogical opportunities for using materials such as these to create engaging in-course research opportunities for undergraduates.

  1. An Android based location service using GSMCellID and GPS to obtain a graphical guide to the nearest cash machine

    NASA Astrophysics Data System (ADS)

    Jacobsen, Jurma; Edlich, Stefan

    2009-02-01

    There is a broad range of potentially useful mobile location-based applications. One crucial point is making them available to the public at large. This case illustrates the ability of Android - the operating system for mobile devices - to fulfill this demand in the mashup fashion, using special geocoding web services and one integrated web service for retrieving data on the nearest cash machines. It shows an exemplary approach for building mobile location-based mashups for everyone: 1. As a basis for reaching as many people as possible, the open-source Android OS is assumed to spread widely. 2. "Everyone" also means that the handset does not have to be an expensive GPS device. This is realized by re-utilizing the existing GSM infrastructure with the Cell of Origin (COO) method, which looks up the CellID in one of the growing number of web-accessible CellID databases. Some of these databases are still undocumented and not yet published. Furthermore, the Google Maps API for Mobile (GMM) and the open-source counterpart OpenCellID are used. Localizing the user's current position via lookup of the closest cell to which the handset is currently connected (COO) is not as precise as GPS, but appears to be sufficient for many applications. For this reason, GPS users are the most pleased: for them the system is fully automated. Users without a GPS handset, in contrast, refine their location with one click on the map inside the determined circular region. The users are then shown, and guided along, a path to the nearest cash machine by integrating the Google Maps API with an overlay. Additionally, GPS users can keep track of themselves through a frequently updated view based on constantly requested precise GPS positions.
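Once candidate cash machines and a (possibly coarse, COO-derived) user position are known, picking the nearest one is a great-circle distance comparison. A self-contained Python sketch with invented ATM names and coordinates:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS84 points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

def nearest(position, machines):
    """Pick the cash machine closest to the (possibly coarse COO) position."""
    lat, lon = position
    return min(machines, key=lambda m: haversine_km(lat, lon, m[1], m[2]))

machines = [
    ("ATM Alexanderplatz", 52.5219, 13.4132),
    ("ATM Potsdamer Platz", 52.5096, 13.3759),
]
# Cell-derived position near Alexanderplatz:
print(nearest((52.5200, 13.4100), machines)[0])   # ATM Alexanderplatz
```

Because COO fixes can be off by hundreds of metres, straight-line nearest is only a first guess; the walking path the overlay draws would come from a routing service.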

  2. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    NASA Astrophysics Data System (ADS)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

Field-scale evapotranspiration (ET) estimates are needed for improving surface- and groundwater-use and water budget studies. Ideally, field-scale ET estimates would extend to regional and national levels and cover long time periods. Because of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates from 30-meter resolution Landsat imagery (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model, in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advanced from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating previously unimagined opportunities for assessing ET model behavior and uncertainty, and ultimately provides the ability for more robust operational monitoring and assessment of water use at field scales.
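The authors' Earth Engine code is not reproduced here; the following scalar sketch only illustrates the published core of the SSEBop formulation (an ET fraction scaled between cold/wet and hot/dry reference temperatures, multiplied by reference ET). All parameter values are invented for the example.

```python
def ssebop_et_fraction(ts, t_cold, dt):
    """SSEBop ET fraction: 1 at the cold (wet) reference, 0 at the hot (dry) limit.

    ts     : land-surface temperature of the pixel (K)
    t_cold : cold reference temperature (K)
    dt     : predefined temperature difference, hot limit = t_cold + dt
    """
    etf = (t_cold + dt - ts) / dt
    return max(0.0, min(etf, 1.05))  # clamp to a plausible range

def actual_et(ts, t_cold, dt, eto, k=1.0):
    """Actual ET = ET fraction * scaling coefficient * reference ET (mm/day)."""
    return ssebop_et_fraction(ts, t_cold, dt) * k * eto

# A fully transpiring pixel observed at the cold reference temperature:
print(actual_et(ts=300.0, t_cold=300.0, dt=10.0, eto=6.0))  # → 6.0
```

In the operational system this per-pixel arithmetic is expressed as Earth Engine image operations over the full Landsat archive rather than Python scalars.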

  3. Interactive Computing and Processing of NASA Land Surface Observations Using Google Earth Engine

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Burks, Jason; Bell, Jordan

    2016-01-01

Google's Earth Engine offers a "big data" approach to processing large volumes of NASA and other remote sensing products (https://earthengine.google.com/). Interfaces include JavaScript- and Python-based APIs, useful for accessing and processing long periods of record for Landsat and MODIS observations. Other data sets are frequently added, including weather and climate model data sets. Demonstrations here focus on exploratory efforts to perform land-surface change detection related to severe weather and other disaster events.
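As a toy stand-in for the change-detection work described, the sketch below differences per-pixel NDVI between a before and an after scene, a common first step for flagging storm or disaster damage. The reflectance values are invented, and real analyses would run as Earth Engine image operations rather than on local lists.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def change_mask(before, after, threshold=0.2):
    """Flag pixels whose NDVI dropped by more than `threshold` between dates."""
    return [
        [(ndvi(*p0) - ndvi(*p1)) > threshold for p0, p1 in zip(row0, row1)]
        for row0, row1 in zip(before, after)
    ]

# Each pixel is a (NIR, red) reflectance pair; the second pixel loses vegetation.
before = [[(0.5, 0.1), (0.5, 0.1)]]
after  = [[(0.5, 0.1), (0.2, 0.2)]]
print(change_mask(before, after))  # → [[False, True]]
```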

  4. A Java API for working with PubChem datasets.

    PubMed

    Southern, Mark R; Griffin, Patrick R

    2011-03-01

PubChem is a public repository of chemical structures and associated biological activities. The PubChem BioAssay database contains assay descriptions, conditions, readouts, and biological screening results that have been submitted by the biomedical research community. The PubChem web site and Power User Gateway (PUG) web service allow users to interact with the data, and raw files are available via FTP. These resources are helpful to many, but there can also be great benefit in using a software API to manipulate the data. Here, we describe a Java API with entity objects mapped to the PubChem Schema and with wrapper functions for calling the NCBI eUtilities and PubChem PUG web services. PubChem BioAssays and associated chemical compounds can then be queried and manipulated in a local relational database. Features include chemical structure searching and the generation and display of curve fits from stored dose-response experiments, something that is not yet available within PubChem itself. The aim is to provide researchers with a fast, consistent, queryable local resource from which to manipulate PubChem BioAssays in a database-agnostic manner. It is not intended as an end-user tool but as a platform for further automation and tools development. http://code.google.com/p/pubchemdb.
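The library itself is Java (see the project URL above); as a language-neutral illustration of the NCBI eUtilities calls it wraps, this sketch merely constructs an ESearch request URL. `pcassay` is NCBI's Entrez database name for PubChem BioAssay; no network request is made here.

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def esearch_url(db, term, retmax=20):
    """Build an NCBI eUtilities ESearch URL (one of the services the API wraps)."""
    return f"{EUTILS}/esearch.fcgi?" + urlencode(
        {"db": db, "term": term, "retmax": retmax}
    )

print(esearch_url("pcassay", "aspirin"))
```

Fetching that URL returns an XML list of assay identifiers, which a wrapper library would then map onto entity objects.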

  5. Northeast India Helminth Parasite Information Database (NEIHPID): Knowledge Base for Helminth Parasites

    PubMed Central

    Debnath, Manish; Kharumnuid, Graciously; Thongnibah, Welfrank; Tandon, Veena

    2016-01-01

Most metazoan parasites that invade vertebrate hosts belong to three phyla: Platyhelminthes, Nematoda and Acanthocephala. Many of the parasitic members of these phyla are collectively known as helminths and are causative agents of many debilitating, deforming and lethal diseases of humans and animals. The North-East India Helminth Parasite Information Database (NEIHPID) project aimed to document and characterise the spectrum of helminth parasites in the north-eastern region of India, providing host, geographical distribution, diagnostic characters and image data. The morphology-based taxonomic data are supplemented with information on DNA sequences of nuclear, ribosomal and mitochondrial gene marker regions that aid in parasite identification. In addition, the database contains raw next generation sequencing (NGS) data for 3 foodborne trematode parasites, with more to follow. The database will also provide study material for students interested in parasite biology. Users can search the database at various taxonomic levels (phylum, class, order, superfamily, family, genus, and species), or by host, habitat and geographical location. Specimen collection locations are noted as co-ordinates in a MySQL database and can be viewed on Google Maps using the Google Maps JavaScript API v3. The NEIHPID database has been made freely available at http://nepiac.nehu.ac.in/index.php PMID:27285615
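As a minimal sketch of the mapping step described (coordinates stored in MySQL, rendered with the Google Maps JavaScript API v3), the code below generates marker-placement JavaScript on the server side. The species name and coordinates are invented, and a real page would also need the surrounding map bootstrap code.

```python
def marker_js(specimens):
    """Emit Google Maps JavaScript API v3 calls placing one marker per record."""
    lines = []
    for name, lat, lon in specimens:
        lines.append(
            "new google.maps.Marker({"
            f"position: new google.maps.LatLng({lat}, {lon}), "
            f"map: map, title: {name!r}}});"
        )
    return "\n".join(lines)

# Hypothetical collection site in Meghalaya, India:
records = [("Fasciolopsis buski", 25.57, 91.88)]
print(marker_js(records))
```

Each emitted line assumes a `map` variable already created by `new google.maps.Map(...)` in the page.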

  6. Northeast India Helminth Parasite Information Database (NEIHPID): Knowledge Base for Helminth Parasites.

    PubMed

    Biswal, Devendra Kumar; Debnath, Manish; Kharumnuid, Graciously; Thongnibah, Welfrank; Tandon, Veena

    2016-01-01

Most metazoan parasites that invade vertebrate hosts belong to three phyla: Platyhelminthes, Nematoda and Acanthocephala. Many of the parasitic members of these phyla are collectively known as helminths and are causative agents of many debilitating, deforming and lethal diseases of humans and animals. The North-East India Helminth Parasite Information Database (NEIHPID) project aimed to document and characterise the spectrum of helminth parasites in the north-eastern region of India, providing host, geographical distribution, diagnostic characters and image data. The morphology-based taxonomic data are supplemented with information on DNA sequences of nuclear, ribosomal and mitochondrial gene marker regions that aid in parasite identification. In addition, the database contains raw next generation sequencing (NGS) data for 3 foodborne trematode parasites, with more to follow. The database will also provide study material for students interested in parasite biology. Users can search the database at various taxonomic levels (phylum, class, order, superfamily, family, genus, and species), or by host, habitat and geographical location. Specimen collection locations are noted as co-ordinates in a MySQL database and can be viewed on Google Maps using the Google Maps JavaScript API v3. The NEIHPID database has been made freely available at http://nepiac.nehu.ac.in/index.php.

  7. Interacting with Petabytes of Earth Science Data using Jupyter Notebooks, IPython Widgets and Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T. A.; Granger, B.; Grout, J.; Corlay, S.

    2017-12-01

The volume of Earth science data gathered from satellites, aircraft, drones, and field instruments continues to increase. For many scientific questions in the Earth sciences, managing this large volume of data is a barrier to progress, as it is difficult to explore and analyze large volumes of data using the traditional paradigm of downloading datasets to a local computer for analysis. Furthermore, methods for communicating Earth science algorithms that operate on large datasets in an easily understandable and reproducible way are needed. Here we describe a system for developing, interacting with, and sharing well-documented Earth science algorithms that combines existing software components: Jupyter Notebook: An open-source, web-based environment that supports documents combining code and computational results with text narrative, mathematics, images, and other media. These notebooks provide an environment for interactive exploration of data and development of well-documented algorithms. Jupyter Widgets / ipyleaflet: An architecture for creating interactive user-interface controls (such as sliders, text boxes, etc.) in Jupyter Notebooks that communicate with Python code. This architecture includes a default set of UI controls (sliders, dropdowns, etc.) as well as APIs for building custom UI controls. The ipyleaflet project is one example that offers a custom interactive map control allowing a user to display and manipulate geographic data within the Jupyter Notebook. Google Earth Engine: A cloud-based geospatial analysis platform that provides access to petabytes of Earth science data via a Python API. The combination of Jupyter Notebooks, Jupyter Widgets, ipyleaflet, and Google Earth Engine makes it possible to explore and analyze massive Earth science datasets via a web browser, in an environment suitable for interactive exploration, teaching, and sharing. Using these environments can make Earth science analyses easier to understand and reproduce, which may increase the rate of scientific discoveries and the transition of discoveries into real-world impacts.
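At its core, the Jupyter Widgets architecture is an observer pattern between UI controls and Python callbacks. The stand-alone sketch below mimics that pattern without Jupyter; the `Slider` class is a simplified stand-in, not the real ipywidgets API.

```python
class Slider:
    """Minimal stand-in for a Jupyter widget: observers run when `value` changes."""
    def __init__(self, value):
        self._value = value
        self._observers = []

    def observe(self, callback):
        self._observers.append(callback)

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        old, self._value = self._value, new
        for cb in self._observers:
            cb({"old": old, "new": new})

shown = []
year = Slider(2000)
# In a notebook, this callback would redraw an ipyleaflet map layer:
year.observe(lambda change: shown.append(f"map layer for {change['new']}"))
year.value = 2015
print(shown)  # → ['map layer for 2015']
```

In the real system the setter side runs in the browser and the callback side in the Python kernel, with the Jupyter comm machinery relaying the change events between them.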

  8. The jmzQuantML programming interface and validator for the mzQuantML data standard.

    PubMed

    Qi, Da; Krishna, Ritesh; Jones, Andrew R

    2014-03-01

    The mzQuantML standard from the HUPO Proteomics Standards Initiative has recently been released, capturing quantitative data about peptides and proteins, following analysis of MS data. We present a Java application programming interface (API) for mzQuantML called jmzQuantML. The API provides robust bridges between Java classes and elements in mzQuantML files and allows random access to any part of the file. The API provides read and write capabilities, and is designed to be embedded in other software packages, enabling mzQuantML support to be added to proteomics software tools (http://code.google.com/p/jmzquantml/). The mzQuantML standard is designed around a multilevel validation system to ensure that files are structurally and semantically correct for different proteomics quantitative techniques. In this article, we also describe a Java software tool (http://code.google.com/p/mzquantml-validator/) for validating mzQuantML files, which is a formal part of the data standard. © 2014 The Authors. Proteomics published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
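jmzQuantML itself is a Java library; as a schematic of the bridge it provides between classes and mzQuantML elements, this Python sketch maps a minimal, invented XML fragment onto objects. Real mzQuantML documents are far richer than this, so treat the element and attribute names here as illustrative only.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment loosely inspired by mzQuantML, not the real schema:
doc = """<MzQuantML>
  <ProteinList>
    <Protein id="prot_1" accession="P12345"/>
    <Protein id="prot_2" accession="Q67890"/>
  </ProteinList>
</MzQuantML>"""

class Protein:
    """A bean-style class bridged to a <Protein> element, as a Java API might do."""
    def __init__(self, elem):
        self.id = elem.get("id")
        self.accession = elem.get("accession")

root = ET.fromstring(doc)
proteins = [Protein(e) for e in root.iter("Protein")]
print([p.accession for p in proteins])  # → ['P12345', 'Q67890']
```

The real API adds what this sketch omits: write support, random access into large files, and the multilevel semantic validation described above.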

  9. Google Maps: You Are Here

    ERIC Educational Resources Information Center

    Jacobsen, Mikael

    2008-01-01

    Librarians use online mapping services such as Google Maps, MapQuest, Yahoo Maps, and others to check traffic conditions, find local businesses, and provide directions. However, few libraries are using one of Google Maps most outstanding applications, My Maps, for the creation of enhanced and interactive multimedia maps. My Maps is a simple and…

  10. The AppScale Cloud Platform

    PubMed Central

    Krintz, Chandra

    2013-01-01

    AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721

  11. A Java API for working with PubChem datasets

    PubMed Central

    Southern, Mark R.; Griffin, Patrick R.

    2011-01-01

    Summary: PubChem is a public repository of chemical structures and associated biological activities. The PubChem BioAssay database contains assay descriptions, conditions and readouts and biological screening results that have been submitted by the biomedical research community. The PubChem web site and Power User Gateway (PUG) web service allow users to interact with the data and raw files are available via FTP. These resources are helpful to many but there can also be great benefit by using a software API to manipulate the data. Here, we describe a Java API with entity objects mapped to the PubChem Schema and with wrapper functions for calling the NCBI eUtilities and PubChem PUG web services. PubChem BioAssays and associated chemical compounds can then be queried and manipulated in a local relational database. Features include chemical structure searching and generation and display of curve fits from stored dose–response experiments, something that is not yet available within PubChem itself. The aim is to provide researchers with a fast, consistent, queryable local resource from which to manipulate PubChem BioAssays in a database agnostic manner. It is not intended as an end user tool but to provide a platform for further automation and tools development. Availability: http://code.google.com/p/pubchemdb Contact: southern@scripps.edu PMID:21216779

  12. Towards better digital pathology workflows: programming libraries for high-speed sharpness assessment of Whole Slide Images.

    PubMed

    Ameisen, David; Deroulers, Christophe; Perrier, Valérie; Bouhidel, Fatiha; Battistella, Maxime; Legrès, Luc; Janin, Anne; Bertheau, Philippe; Yunès, Jean-Baptiste

    2014-01-01

Since microscopic slides can now be automatically digitized and integrated in the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation in multiple sites, both public university-hospitals and private entities. It is part of the FlexMIm R&D project, which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Development and testing have been carried out on a MacBook Pro i7 and on a bi-Xeon 2.7GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using the VIPS and Openslide libraries. We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real-time (3 billion pixels per minute). Tests were made on 5,000 single images, 200 NDPI WSI, and 100 Aperio SVS WSI converted to the Google Maps format. Applications based on our method and libraries can be used upstream, as a calibration and quality-control tool for WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire the complete slides that are below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions. Such quality assessment scores could be integrated as WSI metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics workflow.
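The paper's own blur metric is defined in its publications; as a generic stand-in, the sketch below scores tile sharpness with the variance of a 4-neighbour Laplacian, a common no-reference focus measure. The tiny hand-written tiles are invented; production code operates on full image tiles at scan speed.

```python
def laplacian_variance(tile):
    """Variance of the 4-neighbour Laplacian; low values suggest a blurred tile."""
    h, w = len(tile), len(tile[0])
    lap = [
        tile[y - 1][x] + tile[y + 1][x] + tile[y][x - 1] + tile[y][x + 1]
        - 4 * tile[y][x]
        for y in range(1, h - 1)
        for x in range(1, w - 1)
    ]
    mean = sum(lap) / len(lap)
    return sum((v - mean) ** 2 for v in lap) / len(lap)

sharp = [[0, 0, 0, 0],
         [0, 255, 0, 0],
         [0, 0, 0, 0],
         [0, 0, 0, 0]]                   # a strong local edge
blurred = [[100, 100, 100, 100]] * 4     # a flat, featureless region
print(laplacian_variance(sharp) > laplacian_variance(blurred))  # → True
```

A scanner-side quality gate would threshold such a score per tile and queue low-scoring tiles for reacquisition.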

  13. Using Google Earth as an innovative tool for community mapping.

    PubMed

    Lefer, Theodore B; Anderson, Matthew R; Fornari, Alice; Lambert, Anastasia; Fletcher, Jason; Baquero, Maria

    2008-01-01

    Maps are used to track diseases and illustrate the social context of health problems. However, commercial mapping software requires special training. This article illustrates how nonspecialists used Google Earth, a free program, to create community maps. The Bronx, New York, is characterized by high levels of obesity and diabetes. Residents and medical students measured the variety and quality of food and exercise sources around a residency training clinic and a student-run free clinic, using Google Earth to create maps with minimal assistance. Locations were identified using street addresses or simply by pointing to them on a map. Maps can be shared via e-mail, viewed online with Google Earth or Google Maps, and the data can be incorporated into other mapping software.
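Under the hood, the community maps described are stored and shared as KML, the format Google Earth and Google Maps read. The sketch below generates a minimal KML document; the placemark is invented for the example, and note that KML lists longitude before latitude.

```python
def kml_placemark(name, lat, lon, description=""):
    """One KML Placemark; KML coordinates are longitude,latitude[,altitude]."""
    return (
        "<Placemark>"
        f"<name>{name}</name>"
        f"<description>{description}</description>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark>"
    )

def kml_document(placemarks):
    """Wrap placemarks in a complete KML 2.2 document."""
    body = "".join(placemarks)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
        f"{body}</Document></kml>"
    )

marks = [kml_placemark("Grocery (fresh produce)", 40.8448, -73.8648)]
print(kml_document(marks)[:38])
```

Saving the output as a `.kml` file makes it shareable by e-mail and viewable in Google Earth, which is the workflow the article describes.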

  14. There's An App For That: Planning Ahead for the Solar Eclipse in August 2017

    NASA Astrophysics Data System (ADS)

    Chizek Frouard, Malynda R.; Lesniak, Michael V.; Bell, Steve

    2017-01-01

With the total solar eclipse of 2017 August 21 over the continental United States approaching, the U.S. Naval Observatory (USNO) on-line Solar Eclipse Computer can now be accessed via an Android application, available on Google Play. Over the course of the eclipse, as viewed from a specific site, several events may be visible: the beginning and ending of the eclipse (first and fourth contacts), the beginning and ending of totality (second and third contacts), the moment of maximum eclipse, sunrise, or sunset. For each of these events, the USNO Solar Eclipse 2017 Android application reports the time, the Sun's altitude and azimuth, and the event's position and vertex angles. The app also lists the duration of the total phase, the duration of the eclipse, the magnitude of the eclipse, and the percent of the Sun obscured for a particular eclipse site. All of the data available in the app come from the flexible USNO Solar Eclipse Computer Application Programming Interface (API), which produces JavaScript Object Notation (JSON) that can be incorporated into third-party Web sites or custom applications. Additional information is available in the on-line documentation (http://aa.usno.navy.mil/data/docs/api.php). For those who prefer using a traditional data input form, the local circumstances can still be requested at http://aa.usno.navy.mil/data/docs/SolarEclipses.php. In addition, the 2017 August 21 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2017.php) consolidates all of the USNO resources for this event, including a Google Map view of the eclipse track designed by Her Majesty's Nautical Almanac Office (HMNAO). Looking further ahead, a 2024 April 8 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2024.php) is also available.
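A third-party consumer of the JSON service might look like the sketch below. The field names and times here are invented for illustration; consult the on-line API documentation linked above for the real response shape.

```python
import json

# Hypothetical response shape, not the actual USNO payload:
payload = json.loads("""{
  "properties": {
    "local_data": [
      {"phenomenon": "Eclipse Begins",  "time": "16:21:12.1", "altitude": "52.6"},
      {"phenomenon": "Maximum Eclipse", "time": "17:47:36.9", "altitude": "58.4"},
      {"phenomenon": "Eclipse Ends",    "time": "19:09:01.8", "altitude": "50.9"}
    ]
  }
}""")

# Index the contact events by name for display in an app or web page:
events = {e["phenomenon"]: e["time"] for e in payload["properties"]["local_data"]}
print(events["Maximum Eclipse"])  # → 17:47:36.9
```

In a real client the `payload` string would come from an HTTP GET against the API endpoint with the site's coordinates as query parameters.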

  15. Generating and Visualizing Climate Indices using Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T. A.; Guentchev, G.; Rood, R. B.

    2017-12-01

Climate change is expected to have the largest impacts on regional and local scales. Relevant and credible climate information is needed to support planning and adaptation efforts in our communities. The volume of climate projections of temperature and precipitation is steadily increasing, as datasets are generated on finer spatial and temporal grids with an increasing number of ensemble members to characterize uncertainty. Despite advancements in tools for querying and retrieving subsets of these large, multi-dimensional datasets, ease of access remains a barrier for many existing and potential users who want to derive useful information from these data, particularly those outside the climate modelling research community. Climate indices that can be derived from daily temperature and precipitation data, such as the annual number of frost days or the growing season length, can provide useful information to practitioners and stakeholders. For this work, the NASA Earth Exchange Global Daily Downscaled Projections (NEX-GDDP) dataset was loaded into Google Earth Engine, a cloud-based geospatial processing platform, and algorithms that use the Earth Engine API to generate several climate indices were written. The indices were chosen from the set developed by the joint CCl/CLIVAR/JCOMM Expert Team on Climate Change Detection and Indices (ETCCDI). Simple user interfaces were created that allow users to query the indices, produce maps and graphs of them, and download results for additional analyses. These browser-based interfaces could allow users in low-bandwidth environments to access climate information. This research shows that calculating climate indices from global downscaled climate projection datasets and sharing them widely using cloud computing technologies is feasible. Further development will focus on exposing the climate indices to existing applications via the Earth Engine API and on building custom user interfaces for presenting climate indices to a diverse set of user groups.
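The two indices named above can be sketched directly from daily series. The frost-days count below follows the standard ETCCDI "FD" definition; the growing-season helper is deliberately simplified (the full ETCCDI GSL rule has additional calendar constraints), and the input series is invented.

```python
def frost_days(daily_tmin_c):
    """ETCCDI 'FD': annual count of days with daily minimum temperature < 0 degC."""
    return sum(1 for t in daily_tmin_c if t < 0.0)

def growing_season_length_simple(daily_tmean_c, threshold=5.0, run=6):
    """Simplified GSL: days spanned from the first to the last run of `run`
    consecutive days with mean temperature above `threshold` degC."""
    above = [t > threshold for t in daily_tmean_c]
    starts = [i for i in range(len(above) - run + 1) if all(above[i:i + run])]
    if not starts:
        return 0
    return (starts[-1] + run) - starts[0]

# An invented year of daily minima: three sub-zero days in total.
year = [-3.0, -1.2, 0.5, 2.0] + [8.0] * 200 + [1.0, -0.5]
print(frost_days(year))  # → 3
```

In the actual system, the equivalent reducers run per pixel and per model ensemble member inside Earth Engine rather than over Python lists.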

  16. Low-cost Tools for Aerial Video Geolocation and Air Traffic Analysis for Delay Reduction Using Google Earth

    NASA Astrophysics Data System (ADS)

    Zetterlind, V.; Pledgie, S.

    2009-12-01

Low-cost, low-latency, robust geolocation and display of aerial video is a common need for a wide range of earth observing as well as emergency response and security applications. While hardware costs for aerial video collection systems, GPS, and inertial sensors have been decreasing, software costs for geolocation algorithms and reference imagery/DTED remain expensive and highly proprietary. As part of a Federal Small Business Innovative Research project, MosaicATM and EarthNC, Inc. have developed a simple geolocation system based on the Google Earth API and Google's built-in DTED and reference imagery libraries. This system geolocates aerial video from platform and camera position, attitude, and field-of-view metadata, using the geometric photogrammetric principle of ray intersection with DTED. Geolocated video can be directly rectified and viewed in the Google Earth API during processing. Work is underway to extend our geolocation code to NASA World Wind for additional flexibility and a fully open-source platform. In addition to our airborne remote sensing work, MosaicATM has developed the Surface Operations Data Analysis and Adaptation (SODAA) tool, funded by NASA Ames, which supports analysis of airport surface operations to optimize aircraft movements and reduce fuel burn and delays. As part of SODAA, MosaicATM and EarthNC, Inc. have developed powerful tools to display national airspace data and time-animated 3D flight tracks in Google Earth for 4D analysis. The SODAA tool can convert raw-format flight track data, FAA National Flight Data (NFD), and FAA "Adaptation" airport surface data to a spatial database representation and then to Google Earth KML. The SODAA client provides users with a simple graphical interface through which to generate queries with a wide range of predefined and custom filters, plot results, and export for playback in Google Earth in conjunction with NFD and Adaptation overlays.
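A much-simplified version of the ray-intersection step can be sketched by intersecting the camera boresight with flat terrain; real systems replace the flat-ground assumption with DTED lookups along the ray. All numbers below are invented.

```python
import math

def geolocate_flat_terrain(cam_lat, cam_lon, alt_m, heading_deg, depression_deg):
    """Intersect the camera boresight with flat terrain at the camera's base
    elevation, then convert the metre offsets to geographic degrees."""
    ground_range = alt_m / math.tan(math.radians(depression_deg))
    north = ground_range * math.cos(math.radians(heading_deg))
    east = ground_range * math.sin(math.radians(heading_deg))
    lat = cam_lat + north / 111_320.0                       # metres per degree latitude
    lon = cam_lon + east / (111_320.0 * math.cos(math.radians(cam_lat)))
    return lat, lon

# 1000 m above ground, looking due north, 45 degrees below the horizon:
lat, lon = geolocate_flat_terrain(34.0, -118.0, 1000.0, 0.0, 45.0)
print(round(lat - 34.0, 4), round(lon + 118.0, 4))  # → 0.009 0.0
```

With DTED, the same ray is stepped forward until its height first drops below the terrain surface, which is what makes the footprint accurate over relief.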

  17. A SNP based high-density linkage map of Apis cerana reveals a high recombination rate similar to Apis mellifera.

    PubMed

    Shi, Yuan Yuan; Sun, Liang Xian; Huang, Zachary Y; Wu, Xiao Bo; Zhu, Yong Qiang; Zheng, Hua Jun; Zeng, Zhi Jiang

    2013-01-01

The Eastern honey bee, Apis cerana Fabricius, is distributed in southern and eastern Asia, from India and China to Korea and Japan and southeast to the Moluccas. Besides Apis mellifera, this species is also widely kept for honey production. Apis cerana is also a model organism for studying social behavior, caste determination, mating biology, sexual selection, and host-parasite interactions. Few resources are available for molecular research in this species, and a linkage map had never been constructed. A linkage map is a prerequisite for quantitative trait loci mapping and for analyzing genome structure. We used the Chinese honey bee, Apis cerana cerana, to construct the first linkage map in the Eastern honey bee. F2 workers (N = 103) were genotyped for 126,990 single nucleotide polymorphisms (SNPs). After filtering out low-quality SNPs and those failing the Mendel test, we obtained 3,000 SNPs; 1,535 of these were informative and were used to construct a linkage map. The preliminary map contains 19 linkage groups; we then mapped these 19 linkage groups to 16 chromosomes by comparing the markers to the genome of A. mellifera. The final map contains 16 linkage groups with a total of 1,535 markers. The total genetic distance is 3,942.7 centimorgans (cM), with the largest linkage group (180 loci) measuring 574.5 cM. The average marker interval across the 16 linkage groups is 2.6 cM. We constructed a high-density linkage map for A. c. cerana with 1,535 markers. Because the map is based on SNP markers, it will enable easier and faster genotyping assays than the randomly amplified polymorphic DNA or microsatellite based maps used in A. mellifera.
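The centimorgan distances in such a map come from recombination fractions between adjacent markers; the Haldane mapping function is the standard no-interference conversion. The sketch below applies it to a toy, backcross-style genotype coding; the genotype strings are invented and far simpler than real F2 SNP data.

```python
import math

def haldane_cm(r):
    """Haldane map distance in cM from recombination fraction r (no interference)."""
    return -50.0 * math.log(1.0 - 2.0 * r)

def recombination_fraction(genotypes_a, genotypes_b):
    """Toy estimate: fraction of individuals whose alleles differ between two
    adjacent markers, under a simple two-allele (A/B) backcross-style coding."""
    recomb = sum(1 for a, b in zip(genotypes_a, genotypes_b) if a != b)
    return recomb / len(genotypes_a)

# 10 individuals scored at two adjacent SNP markers; 2 are recombinant:
r = recombination_fraction("AAAAABBBBB", "AAAAABBABA")
print(round(haldane_cm(r), 1))  # → 25.5
```

Note that Haldane distance exceeds the raw recombination percentage (25.5 cM vs. 20%) because it corrects for unobserved double crossovers.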

  18. Web based tools for data manipulation, visualisation and validation with interactive georeferenced graphs

    NASA Astrophysics Data System (ADS)

    Ivankovic, D.; Dadic, V.

    2009-04-01

Some oceanographic parameters have to be inserted into the database manually; others (for example, data from a CTD probe) are loaded from various files. All these parameters require visualization, validation and manipulation from the research vessel or a scientific institution, as well as public presentation. For these purposes a web-based system was developed, containing dynamic SQL procedures and Java applets. The technology background is an Oracle 10g relational database and Oracle Application Server. Web interfaces are developed using PL/SQL stored database procedures (mod_plsql). Additional parts for data visualization include Java applets and JavaScript. The mapping tool is the Google Maps API (JavaScript), with a Java applet as an alternative. Graphs are realized as dynamically generated web pages containing a Java applet. The mapping tool and graphs are georeferenced: a click on some part of a graph automatically initiates a zoom or marker at the location where the parameter was measured. This feature is very useful for data validation. The code for data manipulation and visualization is partially realized with dynamic SQL, which allows us to separate data definition from the code for data manipulation. Adding a new parameter to the system requires only its definition and description, without programming an interface for that kind of data.
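The click-to-map georeferencing can be sketched as a nearest-point lookup along the graph's x-axis: the clicked x-value resolves to a measurement station whose coordinates drive the map zoom or marker. The station list below is invented.

```python
import bisect

# Measurement stations along a transect: (distance_km, lat, lon), sorted by distance.
stations = [(0.0, 43.51, 16.44), (12.5, 43.45, 16.60), (30.0, 43.38, 16.85)]
xs = [s[0] for s in stations]

def station_for_click(x_km):
    """Map a click on the graph's x-axis to the nearest station's coordinates."""
    i = bisect.bisect_left(xs, x_km)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(stations)]
    j = min(candidates, key=lambda k: abs(xs[k] - x_km))
    return stations[j][1], stations[j][2]

print(station_for_click(11.0))  # → (43.45, 16.6)
```

In the described system this lookup happens in JavaScript, and the returned coordinates are passed to the Google Maps API to center a marker.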

  19. Creation of a Web-Based GIS Server and Custom Geoprocessing Tools for Enhanced Hydrologic Applications

    NASA Astrophysics Data System (ADS)

    Welton, B.; Chouinard, K.; Sultan, M.; Becker, D.; Milewski, A.; Becker, R.

    2010-12-01

Rising populations in the arid and semi-arid parts of the world are increasing the demand for fresh water supplies worldwide. Many data sets needed for the assessment of hydrologic applications across vast regions of the world are expensive, unpublished, difficult to obtain, or at varying scales, which complicates their use. Fortunately, this situation is changing with the development of global remote sensing datasets and web-based platforms such as GIS Server. GIS provides a cost-effective vehicle for comparing, analyzing, and querying a variety of spatial datasets as geographically referenced layers. We have recently constructed a web-based GIS that incorporates all relevant geological, geochemical, geophysical, and remote sensing data sets, which were readily used to identify reservoir types and potential well locations on local and regional scales in various tectonic settings, including: (1) an extensional environment (Red Sea rift), (2) a transcurrent fault system (Najd Fault in the Arabian-Nubian Shield), and (3) compressional environments (Himalayas). The web-based GIS can also be used to detect spatial and temporal trends in precipitation, recharge, and runoff in large watersheds on local, regional, and continental scales. These applications were enabled through the construction of a web-based ArcGIS Server with a Google Maps interface and the development of customized geoprocessing tools. ArcGIS Server provides out-of-the-box setups that are generic in nature. This platform includes all of the standard web-based GIS tools (e.g., pan, zoom, identify, search, data querying, and measurement). In addition to the standard suite of tools provided by ArcGIS Server, an additional set of advanced data manipulation and display tools was developed to allow a more complete and customizable view of the area of interest. 
The most notable addition to the standard GIS Server tools is the set of custom on-demand geoprocessing tools (e.g., graph, statistical functions, custom raster creation, profile, TRMM). The generation of a wide range of derivative maps (e.g., buffer zones, contour maps, graphs, temporal rainfall distribution maps) from various map layers (e.g., geologic maps, geophysics, satellite images) allows for more user flexibility. The use of these tools, along with the Google Maps API, which enables website users to view the high-quality GeoEye imagery provided by Google in conjunction with our data, creates a more complete picture of the area being observed and allows custom derivative maps to be created in the field and viewed immediately on the web, processes that were previously restricted to offline databases.
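As a toy example of the kind of on-demand geoprocessing tool described, the sketch below implements a buffer-zone filter using an equirectangular distance approximation (adequate for small regions). The well names and coordinates are invented; the real tools run server-side in ArcGIS Server.

```python
import math

def within_buffer(points, center, radius_km):
    """Simple buffer-zone geoprocessing: keep named points within `radius_km`
    of `center`, using an equirectangular approximation."""
    lat0, lon0 = center
    km_per_deg = 111.32
    kept = []
    for name, lat, lon in points:
        dy = (lat - lat0) * km_per_deg
        dx = (lon - lon0) * km_per_deg * math.cos(math.radians(lat0))
        if math.hypot(dx, dy) <= radius_km:
            kept.append(name)
    return kept

wells = [("well_A", 24.10, 38.05), ("well_B", 24.90, 38.90)]
print(within_buffer(wells, center=(24.0, 38.0), radius_km=25.0))  # → ['well_A']
```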

  20. Towards better digital pathology workflows: programming libraries for high-speed sharpness assessment of Whole Slide Images

    PubMed Central

    2014-01-01

    Background Since microscopic slides can now be automatically digitized and integrated in the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation in multiple sites, both public university-hospitals and private entities. It is part of the FlexMIm R&D project which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Methods Development and testing have been carried out on a MacBook Pro i7 and on a bi-Xeon 2.7GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using VIPS and Openslide libraries. Results We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real-time (3 billion pixels per minute). Tests were made on 5000 single images, 200 NDPI WSI, 100 Aperio SVS WSI converted to the Google Maps format. Conclusions Applications based on our method and libraries can be used upstream, as calibration and quality control tool for the WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire the complete slides that are below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions. 
Such quality assessment scores could be integrated as WSI's metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics workflow. PMID:25565494

  1. In Pursuit of Agile Acquisition: Are We There Yet?

    DTIC Science & Technology

    2013-03-01

digital mapping capabilities like Google, Microsoft, and Wikimapia, are readily obtainable in the commercial marketplace. This knowledge... Fox, Defense Acquisition Reform, 14. Ibid., 8. XBRADTC, "Army Acquisition Woes," Bring the Heat Bring the Stupid, entry posted May 1, 2011, https://xbradtc.wordpress.com/2011/05/01/Army-acquisition-woes/ (accessed on December 5, 2012). Google Maps, http://maps.google.com/maps (accessed

  2. Be-safe travel, a web-based geographic application to explore safe-route in an area

    NASA Astrophysics Data System (ADS)

    Utamima, Amalia; Djunaidy, Arif

    2017-08-01

    In large cities in developing countries, various forms of criminality are often found. For instance, the most prominent crimes in Surabaya, Indonesia are the "3C" crimes: theft with violence (curas), aggravated theft (curat), and motor vehicle theft (curanmor). 3C cases most often occur on highways and in residential areas, so newcomers to an area should be aware of these kinds of crimes. Route-planning systems such as Google Maps consider only the shortest distance when calculating the optimal route. The selection of the optimal path in this study considers not only the shortest distance but also another factor, the security level. This research addresses the need for an application that recommends the safest roads for vehicle passengers driving through an area. We propose Be-Safe Travel, a web-based application using the Google API that can be accessed by people who drive in an area but lack knowledge of which routes are safe from crime. Be-Safe Travel is useful not only for newcomers, but also for couriers delivering valuable goods along the safest streets.
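
    The abstract does not give the authors' cost function; one plausible sketch blends edge length with a per-edge crime-risk score inside Dijkstra's algorithm (the weight `alpha`, the graph, and the risk scores below are invented for illustration):

```python
import heapq

def safest_route(graph, start, goal, alpha=0.5):
    """Dijkstra over edges stored as {node: {next: (length_km, risk)}},
    risk in [0, 1]. Edge cost blends distance and risk via an
    illustrative weight alpha (alpha=0 -> pure shortest path)."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nxt, (length, risk) in graph.get(node, {}).items():
            cost = d + (1 - alpha) * length + alpha * risk * length
            if cost < dist.get(nxt, float("inf")):
                dist[nxt] = cost
                prev[nxt] = node
                heapq.heappush(pq, (cost, nxt))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1]

# Short but risky direct route vs. slightly longer, safer detour.
g = {"A": {"B": (1.0, 0.9), "C": (1.2, 0.1)},
     "B": {"D": (1.0, 0.9)},
     "C": {"D": (1.2, 0.1)}}
assert safest_route(g, "A", "D", alpha=0.8) == ["A", "C", "D"]
```

    With a high safety weight the planner picks the longer, low-risk detour via C.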

  3. A Knowledge Portal and Collaboration Environment for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    D'Agnese, F. A.

    2008-12-01

    Earth Knowledge is developing a web-based 'Knowledge Portal and Collaboration Environment' that will serve as the information-technology-based foundation of a modular Internet-based Earth-Systems Monitoring, Analysis, and Management Tool. This 'Knowledge Portal' is essentially a 'mash- up' of web-based and client-based tools and services that support on-line collaboration, community discussion, and broad public dissemination of earth and environmental science information in a wide-area distributed network. In contrast to specialized knowledge-management or geographic-information systems developed for long- term and incremental scientific analysis, this system will exploit familiar software tools using industry standard protocols, formats, and APIs to discover, process, fuse, and visualize existing environmental datasets using Google Earth and Google Maps. An early form of these tools and services is being used by Earth Knowledge to facilitate the investigations and conversations of scientists, resource managers, and citizen-stakeholders addressing water resource sustainability issues in the Great Basin region of the desert southwestern United States. These ongoing projects will serve as use cases for the further development of this information-technology infrastructure. This 'Knowledge Portal' will accelerate the deployment of Earth- system data and information into an operational knowledge management system that may be used by decision-makers concerned with stewardship of water resources in the American Desert Southwest.

  4. An application of traveling salesman problem using the improved genetic algorithm on android google maps

    NASA Astrophysics Data System (ADS)

    Narwadi, Teguh; Subiyanto

    2017-03-01

    The Travelling Salesman Problem (TSP) is one of the best-known NP-hard problems, meaning that no exact algorithm is known to solve it in polynomial time. This paper presents a new application of a genetic algorithm combined with a local search technique to solve the TSP. For the local search, an iterative hill climbing method is used. The system is implemented on Android, because Android is widely used around the world and runs on mobile devices. It is also integrated with the Google API to obtain the geographical locations of and distances between cities, and to display the route. We ran experiments to test the behavior of the application, comparing the hybrid genetic algorithm (HGA) against a simple GA on 5 samples of cities in Central Java, Indonesia, with different numbers of cities. The results show that in average solution quality the HGA is better than the simple GA in 5 tests out of 5 (100%). The hybrid genetic algorithm outperforms the plain genetic algorithm, especially on problems of higher complexity.
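
    The local search step can be sketched as generic iterative hill climbing over city swaps; this is an illustration of the technique, not the paper's exact operators, and the 4-city distance matrix is invented:

```python
import itertools

def tour_length(tour, dist):
    """Total length of a closed tour under a symmetric distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def hill_climb(tour, dist):
    """Iterative hill climbing on a tour: keep applying any pairwise
    city swap that shortens the tour until no swap improves it."""
    best = list(tour)
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(best)), 2):
            cand = list(best)
            cand[i], cand[j] = cand[j], cand[i]
            if tour_length(cand, dist) < tour_length(best, dist):
                best, improved = cand, True
    return best

# 4 cities on a square (diagonals cost 2): the optimum tour has length 4.
d = [[0, 1, 2, 1],
     [1, 0, 1, 2],
     [2, 1, 0, 1],
     [1, 2, 1, 0]]
assert tour_length(hill_climb([0, 2, 1, 3], d), d) == 4
```

    In the hybrid GA, each offspring produced by crossover and mutation would be polished by such a climb before re-entering the population.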

  5. Online Public Access Catalog: The Google Maps of the Library World

    ERIC Educational Resources Information Center

    Bailey, Kieren

    2011-01-01

    What do Google Maps and a library's Online Public Access Catalog (OPAC) have in common? Google Maps provides users with all the information they need for a trip in one place; users can get directions and find out what attractions, hotels, and restaurants are close by. Librarians must find the ultimate OPAC that will provide, in one place, all the…

  6. Solar Eclipse Computer API: Planning Ahead for August 2017

    NASA Astrophysics Data System (ADS)

    Bartlett, Jennifer L.; Chizek Frouard, Malynda; Lesniak, Michael V.; Bell, Steve

    2016-01-01

    With the total solar eclipse of 2017 August 21 over the continental United States approaching, the U.S. Naval Observatory (USNO) on-line Solar Eclipse Computer can now be accessed via an application programming interface (API). This flexible interface returns local circumstances for any solar eclipse in JavaScript Object Notation (JSON) that can be incorporated into third-party Web sites or applications. For a given year, it can also return a list of solar eclipses that can be used to build a more specific request for local circumstances. Over the course of a particular eclipse as viewed from a specific site, several events may be visible: the beginning and ending of the eclipse (first and fourth contacts), the beginning and ending of totality (second and third contacts), the moment of maximum eclipse, sunrise, or sunset. For each of these events, the USNO Solar Eclipse Computer reports the time, the Sun's altitude and azimuth, and the event's position and vertex angles. The computer also reports the duration of the total phase, the duration of the eclipse, the magnitude of the eclipse, and the percent of the Sun obscured for a particular eclipse site. On-line documentation for using the API-enabled Solar Eclipse Computer, including sample calls, is available (http://aa.usno.navy.mil/data/docs/api.php). The same Web page also describes how to reach the Complete Sun and Moon Data for One Day, Phases of the Moon, Day and Night Across the Earth, and Apparent Disk of a Solar System Object services using API calls. For those who prefer using a traditional data input form, local circumstances can still be requested that way at http://aa.usno.navy.mil/data/docs/SolarEclipses.php. In addition, the 2017 August 21 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2017.php) consolidates all of the USNO resources for this event, including a Google Map view of the eclipse track designed by Her Majesty's Nautical Almanac Office (HMNAO).
Looking further ahead, a 2024 April 8 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2024.php) is also available.

  7. Using Web Speech Technology with Language Learning Applications

    ERIC Educational Resources Information Center

    Daniels, Paul

    2015-01-01

    In this article, the author presents the history of human-to-computer interaction based upon the design of sophisticated computerized speech recognition algorithms. Advancements such as the arrival of cloud-based computing and software like Google's Web Speech API allow anyone with an Internet connection and a Chrome browser to take advantage of…

  8. Developing an Approach to Prioritize River Restoration using Data Extracted from Flood Risk Information System Databases.

    NASA Astrophysics Data System (ADS)

    Vimal, S.; Tarboton, D. G.; Band, L. E.; Duncan, J. M.; Lovette, J. P.; Corzo, G.; Miles, B.

    2015-12-01

    Prioritizing river restoration requires information on river geometry. In many US states, detailed river geometry has been collected for floodplain mapping and is available in Flood Risk Information Systems (FRIS). In particular, North Carolina has developed, for its 100 counties, a database of numerous HEC-RAS models available through its Flood Risk Information System (FRIS). These models, which include over 260 variables, were developed and updated by numerous contractors. They contain detailed surveyed or LiDAR-derived cross-sections and modeled flood extents for different extreme-event return periods. In this work, data from over 4,700 HEC-RAS models were integrated and upscaled, using detailed cross-section information and 100-year modeled flood extents, to enable river restoration prioritization for the entire state of North Carolina. We developed procedures to extract geomorphic properties such as the entrenchment ratio and incision ratio from these models. The entrenchment ratio quantifies the vertical containment of rivers, and thereby their vulnerability to flooding, while the incision ratio quantifies depth per unit width. A map of entrenchment ratio for the whole state was derived by linking the model results to a geodatabase, and a ranking of highly entrenched counties was obtained, enabling prioritization for flood allowance and mitigation. The results were shared through HydroShare, with web maps developed for visualization using the Google Maps Engine API.
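
    The abstract names the entrenchment ratio without a formula; a widely used definition (Rosgen's: flood-prone width, measured at twice the maximum bankfull depth, divided by bankfull width) can be sketched from HEC-RAS-style station/elevation cross-sections. The section geometry below is invented:

```python
def width_at(section, elev):
    """Total wetted width of a cross-section (list of (station, elevation)
    points, left to right) at a water-surface elevation, by linear
    interpolation (vertices lying exactly at elev are not handled)."""
    width, inside, x_in = 0.0, False, None
    for (x0, z0), (x1, z1) in zip(section, section[1:]):
        if (z0 - elev) * (z1 - elev) < 0:          # segment crosses elev
            xc = x0 + (elev - z0) * (x1 - x0) / (z1 - z0)
            if not inside:
                inside, x_in = True, xc
            else:
                width += xc - x_in
                inside = False
    return width

def entrenchment_ratio(section, bankfull_elev, thalweg_elev):
    """Rosgen-style entrenchment ratio: width at twice the maximum
    bankfull depth divided by bankfull width (illustrative only)."""
    flood_elev = thalweg_elev + 2 * (bankfull_elev - thalweg_elev)
    return width_at(section, flood_elev) / width_at(section, bankfull_elev)

# Symmetric V-shaped channel cut into gently sloping banks.
xs = [(0, 5), (40, 3), (45, 0), (50, 3), (90, 5)]
er = entrenchment_ratio(xs, bankfull_elev=1.0, thalweg_elev=0.0)
assert abs(er - 2.0) < 1e-9
```

    Run per cross-section and aggregated per reach, such a procedure yields the county-level rankings described above.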

  9. Everglades Ecological Forecasting II: Utilizing NASA Earth Observations to Enhance the Capabilities of Everglades National Park to Monitor & Predict Mangrove Extent to Aid Current Restoration Efforts

    NASA Technical Reports Server (NTRS)

    Kirk, Donnie; Wolfe, Amy; Ba, Adama; Nyquist, Mckenzie; Rhodes, Tyler; Toner, Caitlin; Cabosky, Rachel; Gotschalk, Emily; Gregory, Brad; Kendall, Candace

    2016-01-01

    Mangroves act as a transition zone between fresh and salt water habitats by filtering and indicating salinity levels along the coast of the Florida Everglades. However, dredging and canals built in the early 1900s depleted the Everglades of much of its freshwater resources. In an attempt to assist in maintaining the health of threatened habitats, efforts have been made within Everglades National Park to rebalance the ecosystem and manage mangrove forests sustainably. The Everglades Ecological Forecasting II team utilized the Google Earth Engine API and satellite imagery from Landsat 5, 7, and 8 to create land-change maps over a 25-year period, and to allow park officials to continue producing maps in the future. To make the process replicable for project partners at Everglades National Park, the team used a supervised classification approach to map mangrove regions in 1995, 2000, 2005, 2010 and 2015. As freshwater was depleted, mangroves encroached further inland and freshwater marshes declined. The current extent map, along with the transition maps, helped create forecasting models that show mangrove encroachment continuing further inland by the year 2030. This project highlights the changes to Everglades habitats in relation to a changing climate and hydrological changes throughout the park.
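
    The abstract says only "supervised classification" without naming the classifier; as a generic illustration (not the team's actual method), a minimal nearest-centroid classifier over per-pixel band vectors looks like this:

```python
def train_centroids(samples):
    """Per-class mean spectra from labeled training pixels:
    {class: [band vectors]} -> {class: centroid vector}."""
    return {cls: [sum(v[i] for v in vecs) / len(vecs)
                  for i in range(len(vecs[0]))]
            for cls, vecs in samples.items()}

def classify(pixel, centroids):
    """Assign a pixel's band vector to the nearest class centroid
    (squared Euclidean distance)."""
    return min(centroids,
               key=lambda c: sum((p - m) ** 2
                                 for p, m in zip(pixel, centroids[c])))

# Toy 2-band (red, NIR) training data: mangrove canopy is NIR-bright,
# open water is dark in NIR. Reflectance values are invented.
train = {"mangrove": [[0.05, 0.45], [0.06, 0.50]],
         "water":    [[0.04, 0.02], [0.05, 0.03]]}
cents = train_centroids(train)
assert classify([0.05, 0.40], cents) == "mangrove"
assert classify([0.05, 0.05], cents) == "water"
```

    In Earth Engine the same idea is applied per Landsat pixel with training polygons supplying the labeled samples.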

  10. Assessing natural hazard risk using images and data

    NASA Astrophysics Data System (ADS)

    Mccullough, H. L.; Dunbar, P. K.; Varner, J. D.; Mungov, G.

    2012-12-01

    Photographs and other visual media provide valuable pre- and post-event data for natural hazard assessment. Scientific research, mitigation, and forecasting rely on visual data for risk analysis, inundation mapping and historic records. Instrumental data reveal only a portion of the whole story; photographs explicitly illustrate the physical and societal impacts of the event. Visual data are increasing rapidly as portable high-resolution cameras and video recorders become more affordable. Incorporating these data into archives ensures a more complete historical account of events. Integrating natural hazards data, such as tsunami, earthquake and volcanic eruption events, socio-economic information, and tsunami deposits and runups, along with images and photographs enhances event comprehension. Global historic databases at NOAA's National Geophysical Data Center (NGDC) consolidate these data, providing the user with easy access to a network of information. NGDC's Natural Hazards Image Database (ngdc.noaa.gov/hazardimages) was recently improved to provide a more efficient and dynamic user interface. It uses the Google Maps API and Keyhole Markup Language (KML) to provide geographic context for the images and events. Descriptive tags, or keywords, have been applied to each image, enabling easier navigation and discovery. In addition, the Natural Hazards Map Viewer (maps.ngdc.noaa.gov/viewers/hazards) provides the ability to search and browse data layers on a Mercator-projection globe with a variety of map backgrounds. This combination of features creates a simple and effective way to enhance our understanding of hazard events and risks using imagery.
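
    Geotagged image records can be exposed to Google Maps or Google Earth as KML placemarks; a minimal sketch follows (the record name and URL are invented, while the element names are standard KML 2.2):

```python
from xml.sax.saxutils import escape

def images_to_kml(records):
    """Build a minimal KML document with one Placemark per geotagged
    image record (name, lon, lat, url). KML coordinates are lon,lat."""
    placemarks = []
    for name, lon, lat, url in records:
        placemarks.append(
            "  <Placemark>\n"
            f"    <name>{escape(name)}</name>\n"
            f"    <description>{escape(url)}</description>\n"
            f"    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
            "  </Placemark>"
        )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
            + "\n".join(placemarks) + "\n</Document>\n</kml>")

# Hypothetical record; the image URL is illustrative only.
kml = images_to_kml([("1946 Aleutian tsunami", -163.5, 53.5,
                      "https://example.org/hazardimages/0001.jpg")])
assert "<coordinates>-163.5,53.5,0</coordinates>" in kml
```

    The same document loads in both Google Earth and the Google Maps API, which is what gives the archive its geographic context.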

  11. Monitoring Global Precipitation through UCI CHRS's RainMapper App on Mobile Devices

    NASA Astrophysics Data System (ADS)

    Nguyen, P.; Huynh, P.; Braithwaite, D.; Hsu, K. L.; Sorooshian, S.

    2014-12-01

    The Water and Development Information for Arid Lands-a Global Network (G-WADI) Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks—Cloud Classification System (PERSIANN-CCS) GeoServer has been developed through a collaboration between the Center for Hydrometeorology and Remote Sensing (CHRS) at the University of California, Irvine (UCI) and UNESCO's International Hydrological Program (IHP). The G-WADI PERSIANN-CCS GeoServer provides near real-time, high-resolution (0.04°, approx. 4 km) global (60°N to 60°S) satellite precipitation estimated by the PERSIANN-CCS algorithm developed by scientists at CHRS. The G-WADI PERSIANN-CCS GeoServer utilizes the open-source MapServer software from the University of Minnesota to provide user-friendly web-based mapping and visualization of satellite precipitation data. Recent efforts have been made by the scientists at CHRS to provide free on-the-go access to the PERSIANN-CCS precipitation data through an application named RainMapper for mobile devices. RainMapper provides visualization of global satellite precipitation for the most recent 3, 6, 12, 24, 48 and 72-hour periods, overlaid on various basemaps. RainMapper uses the Google Maps application programming interface (API) and embedded global positioning system (GPS) access to better monitor global precipitation data on mobile devices. Functionalities such as geographical search with voice recognition technology make it easy for the user to explore near real-time precipitation at a given location. RainMapper also allows for conveniently sharing precipitation information and visualizations with the public through social networks such as Facebook and Twitter. RainMapper is available for iOS and Android devices and can be downloaded (free) from the App Store and Google Play.
The usefulness of RainMapper was demonstrated through an application in tracking the evolution of the recent Rammasun Typhoon over the Philippines in mid July 2014.

  12. Using Clouds for MapReduce Measurement Assignments

    ERIC Educational Resources Information Center

    Rabkin, Ariel; Reiss, Charles; Katz, Randy; Patterson, David

    2013-01-01

    We describe our experiences teaching MapReduce in a large undergraduate lecture course using public cloud services and the standard Hadoop API. Using the standard API, students directly experienced the quality of industrial big-data tools. Using the cloud, every student could carry out scalability benchmarking assignments on realistic hardware,…

  13. SpaceTime Environmental Image Information for Scene Understanding

    DTIC Science & Technology

    2016-04-01

    public Internet resources such as Google, MapQuest, Bing, and Yahoo Maps... Terrain and location: USACE AGC — Satellite/aerial imagery and terrain analysis; Terrain and location: Google, MapQuest, Bing, Yahoo Maps... Bing Maps. [accessed 2015 Dec]. https://www.bing.com/maps/. Yahoo Maps. [accessed 2015 Dec]. https://maps.yahoo.com/b/. 557th Weather Wing. US

  14. Google Maps offers a new way to evaluate claudication.

    PubMed

    Khambati, Husain; Boles, Kim; Jetty, Prasad

    2017-05-01

    Accurate determination of walking capacity is important for the clinical diagnosis and management plan for patients with peripheral arterial disease. The current "gold standard" of measurement is walking distance on a treadmill. However, treadmill testing is not always reflective of the patient's natural walking conditions, and it may not be fully accessible in every vascular clinic. The objective of this study was to determine whether Google Maps, the readily available GPS-based mapping tool, offers an accurate and accessible method of evaluating walking distances in vascular claudication patients. Patients presenting to the outpatient vascular surgery clinic between November 2013 and April 2014 at the Ottawa Hospital with vasculogenic calf, buttock, and thigh claudication symptoms were identified and prospectively enrolled in our study. Onset of claudication symptoms and maximal walking distance (MWD) were evaluated using four tools: history; the Walking Impairment Questionnaire (WIQ), a validated claudication survey; the Google Maps distance calculator (patients were asked to report their daily walking routes on the Google Maps-based tool runningmap.com, and walking distances were calculated accordingly); and treadmill testing for onset of symptoms and MWD, recorded in a double-blinded fashion. Fifteen patients were recruited for the study. Determination of walking distances using Google Maps proved to be more accurate than both clinical history and the WIQ, correlating highly with the gold standard of treadmill testing for both claudication onset (r = .805; P < .001) and MWD (r = .928; P < .0001). In addition, distances were generally under-reported by history and the WIQ. The Google Maps tool was also efficient, with reporting times averaging below 4 minutes. For vascular claudicants with no other walking limitations, Google Maps is a promising new tool that combines the objective strengths of the treadmill test and incorporates real-world walking environments.
It offers an accurate, efficient, inexpensive, and readily accessible way to assess walking distances in patients with peripheral vascular disease. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  15. A Driving Cycle Detection Approach Using Map Service API

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Lei; Gonder, Jeffrey D

    Following advancements in smartphone and portable global positioning system (GPS) data collection, wearable GPS data have seen extensive use in transportation surveys and studies. The task of detecting driving cycles (driving or car-mode trajectory segments) from wearable GPS data has been the subject of much research; specifically, distinguishing driving cycles from other motorized trips (such as taking a bus) is the main research problem in this paper. Many mode detection methods focus only on raw GPS speed data, while some studies apply additional information, such as geographic information system (GIS) data, to obtain better detection performance. Procuring and maintaining dedicated road GIS data are costly and not trivial, whereas the technical maturity and broad use of map service application program interface (API) queries offer opportunities for mode detection tasks. The proposed driving cycle detection method takes advantage of map service APIs to obtain high-quality car-mode API route information and uses a trajectory segmentation algorithm to find the best-matched API route. The car-mode API route data, combined with the actual route information, including the actual mode information, are used to train a logistic regression machine learning model, which estimates car modes and non-car modes with probability rates. The experimental results show promise for the proposed method's ability to detect vehicle mode accurately.
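
    The model is a logistic regression; a minimal single-feature sketch follows, where the feature, a route-match score between the observed GPS track and the best-matched API driving route, and the toy data are my own illustration, not the paper's feature set:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(features, labels, lr=0.5, epochs=2000):
    """Fit w, b for P(car | x) = sigmoid(w * x + b) by batch gradient
    descent on a single scalar feature per trip segment."""
    w = b = 0.0
    n = len(features)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(features, labels):
            err = sigmoid(w * x + b) - y   # prediction error
            gw += err * x
            gb += err
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Toy data: high route-match scores come from car trips (label 1),
# low scores from bus or other modes (label 0).
xs = [0.95, 0.90, 0.85, 0.30, 0.20, 0.10]
ys = [1, 1, 1, 0, 0, 0]
w, b = train_logistic(xs, ys)
assert sigmoid(w * 0.9 + b) > 0.5 > sigmoid(w * 0.2 + b)
```

    The real model would combine several such features (speed statistics, match quality, segment length) in the same framework.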

  16. Fast segmentation of satellite images using SLIC, WebGL and Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Donchyts, Gennadii; Baart, Fedor; Gorelick, Noel; Eisemann, Elmar; van de Giesen, Nick

    2017-04-01

    Google Earth Engine (GEE) is a parallel geospatial processing platform, which harmonizes access to petabytes of freely available satellite images. It provides a very rich API, allowing development of dedicated algorithms to extract useful geospatial information from these images. At the same time, modern GPUs provide thousands of computing cores, which are mostly not utilized in this context. In recent years, WebGL became a popular and well-supported API, allowing fast image processing directly in web browsers. In this work, we will evaluate the applicability of WebGL to enable fast segmentation of satellite images. A new implementation of a Simple Linear Iterative Clustering (SLIC) algorithm using GPU shaders will be presented. SLIC is a simple and efficient method to decompose an image into visually homogeneous regions. It adapts a k-means clustering approach to generate superpixels efficiently. While this approach will be hard to scale, due to the significant amount of data to be transferred to the client, it should significantly improve exploratory possibilities and simplify development of dedicated algorithms for geoscience applications. Our prototype implementation will be used to improve surface water detection of reservoirs using multispectral satellite imagery.
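
    The clustering core of SLIC can be sketched in a few lines; this simplified scalar-intensity version (the paper targets GPU shaders and multispectral bands, and full SLIC restricts the search window per centre) alternates nearest-centre assignment under a combined intensity and spatial distance with centre updates:

```python
def slic_like(img, centers, m=10.0, iters=5):
    """Simplified SLIC: assign each pixel of a 2-D grayscale image to the
    nearest cluster under a combined intensity + spatial distance, then
    update cluster centres (k-means style). `m` weights compactness.
    Centers are (row, col, intensity) tuples."""
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    for _ in range(iters):
        for y in range(h):
            for x in range(w):
                best, best_d = 0, float("inf")
                for k, (cy, cx, cv) in enumerate(centers):
                    d = (img[y][x] - cv) ** 2 \
                        + m * ((y - cy) ** 2 + (x - cx) ** 2)
                    if d < best_d:
                        best, best_d = k, d
                labels[y][x] = best
        for k in range(len(centers)):
            pts = [(y, x) for y in range(h) for x in range(w)
                   if labels[y][x] == k]
            if pts:
                centers[k] = (sum(p[0] for p in pts) / len(pts),
                              sum(p[1] for p in pts) / len(pts),
                              sum(img[p[0]][p[1]] for p in pts) / len(pts))
    return labels

# Two homogeneous halves should end up as two superpixels.
img = [[0, 0, 0, 200, 200, 200] for _ in range(4)]
labels = slic_like(img, centers=[(1.5, 1.0, 0.0), (1.5, 4.0, 200.0)])
assert labels[0][0] != labels[0][5]
assert labels[0][0] == labels[3][2]
```

    A WebGL port moves the per-pixel assignment loop into a fragment shader, which is what makes the interactive browser implementation fast.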

  17. Fine mapping implicates two immunity genes in larval resistance to the honey bee brood fungal disease, Chalkbrood

    USDA-ARS?s Scientific Manuscript database

    Chalkbrood infection of honey bee (Apis mellifera) brood by the fungus Ascosphaera apis results in fatal encapsulation of susceptible larvae with a mycelial coat. Recent QTL analysis indicates that some level of physiological resistance exists in individual larvae. We performed a fine mapping anal...

  18. Global Analysis of River Planform Change using the Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Bryk, A.; Dietrich, W. E.; Gorelick, N.; Sargent, R.; Braudrick, C. A.

    2014-12-01

    Geomorphologists have historically tracked river dynamics using a combination of maps, aerial photographs, and the stratigraphic record. Although stratigraphic records can extend into deep time, maps and aerial photographs often confine our record of change to sparse measurements over the last ~80 years and in some cases much less time. For the first time, Google's Earth Engine (GEE) cloud-based platform gives researchers the means to quantitatively analyze the pattern and pace of river channel change over the last 30 years, with high temporal resolution, across the entire planet. GEE provides an application programming interface (API) that enables quantitative analysis of various data sets, including the entire Landsat L1T archive. This allows change detection for channels wider than about 150 m across 30 years of successive, georeferenced imagery. Qualitatively, it is immediately evident that the pace of channel morphodynamics for similar planforms varies by orders of magnitude across the planet and downstream along individual rivers. To quantify these rates of change and to explore their controls, we have developed methods for differentiating channels from floodplain along large alluvial rivers. We introduce a new metric of morphodynamics: the ratio of eroded area to channel area per unit time, referred to as "M". We also track depositional areas resulting from channel shifting. To date our quantitative analysis has focused on rivers in the Andean foreland. Our analysis shows that channel bank erosion rates, M, vary by orders of magnitude for these rivers, from 0 to ~0.25 yr⁻¹, even though the rivers have essentially identical curvature and sinuosity and are visually indistinguishable. By tracking both bank paths in time, we find that, for some meandering rivers, a significant fraction of new floodplain is produced through outer-bank accretion rather than point bar deposition.
This process is perhaps more important in generating floodplain stratigraphy than previously recognized. These initial findings indicate a new set of quantitative observations will emerge to further test and advance morphodynamic theory. The Google Earth Engine offers the opportunity to explore river morphodynamics on an unprecedented scale and provides a powerful tool for addressing fundamental questions in river morphodynamics.
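
    The metric M (eroded area over channel area, per unit time) can be computed directly from two classified channel masks; a sketch under the assumption of binary grids (channel = 1, floodplain = 0) derived from Landsat scenes:

```python
def channel_change_metric(mask_t0, mask_t1, dt_years):
    """Bank-erosion metric M: area that was floodplain at t0 but is
    channel at t1, divided by the channel area at t0, per unit time.
    Masks are equal-shape 2-D 0/1 grids; pixel area cancels out."""
    eroded = channel = 0
    for row0, row1 in zip(mask_t0, mask_t1):
        for c0, c1 in zip(row0, row1):
            channel += c0
            eroded += (1 - c0) * c1        # newly occupied by the channel
    return eroded / (channel * dt_years)

# A 1-pixel-wide channel shifts one column to the right over 4 years,
# so the whole channel area is turned over: M = 1/4 per year.
t0 = [[0, 1, 0, 0]] * 3
t1 = [[0, 0, 1, 0]] * 3
assert channel_change_metric(t0, t1, dt_years=4) == 0.25
```

    Depositional area (channel at t0 that becomes floodplain at t1) can be tallied symmetrically to track the outer-bank accretion discussed above.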

  19. Google Earth and Geo Applications: A Toolset for Viewing Earth's Geospatial Information

    NASA Astrophysics Data System (ADS)

    Tuxen-Bettman, K.

    2016-12-01

    Earth scientists measure and derive fundamental data that can be of broad general interest to the public and policy makers. Yet, one of the challenges that has always faced the Earth science community is how to present their data and findings in an easy-to-use and compelling manner. Google's Geo Tools offer an efficient and dynamic way for scientists, educators, journalists and others to both access data and view or tell stories in a dynamic three-dimensional geospatial context. Google Earth in particular provides a dense canvas of satellite imagery on which can be viewed rich vector and raster datasets using the medium of Keyhole Markup Language (KML). Through KML, Google Earth can combine the analytical capabilities of Earth Engine, collaborative mapping of My Maps, and storytelling of Tour Builder and more to make Google's Geo Applications a coherent suite of tools for exploring our planet. https://earth.google.com/ https://earthengine.google.com/ https://mymaps.google.com/ https://tourbuilder.withgoogle.com/ https://www.google.com/streetview/

  20. A New Metazoan Recombination Rate Record and Consistently High Recombination Rates in the Honey Bee Genus Apis Accompanied by Frequent Inversions but Not Translocations.

    PubMed

    Rueppell, Olav; Kuster, Ryan; Miller, Katelyn; Fouks, Bertrand; Rubio Correa, Sara; Collazo, Juan; Phaincharoen, Mananya; Tingek, Salim; Koeniger, Nikolaus

    2016-12-01

    Western honey bees (Apis mellifera) far exceed the commonly observed 1–2 meiotic recombination events per chromosome and exhibit the highest Metazoan recombination rate (20 cM/Mb) described thus far. However, the reasons for this exceptional rate of recombination are not sufficiently understood. In a comparative study, we report on the newly constructed genomic linkage maps of Apis florea and Apis dorsata that represent the two honey bee lineages without recombination rate estimates so far. Each linkage map was generated de novo, based on SNP genotypes of haploid male offspring of a single female. The A. florea map spans 4,782 cM with 1,279 markers in 16 linkage groups. The A. dorsata map is 5,762 cM long and contains 1,189 markers in 16 linkage groups. Respectively, these map sizes result in average recombination rate estimates of 20.8 and 25.1 cM/Mb. Synteny analyses indicate that frequent intra-chromosomal rearrangements but no translocations among chromosomes accompany the high rates of recombination during the independent evolution of the three major honey bee lineages. Our results imply a common cause for the evolution of very high recombination rates in Apis. Our findings also suggest that frequent homologous recombination during meiosis might increase ectopic recombination and rearrangements within but not between chromosomes. It remains to be investigated whether the resulting inversions may have been important in the evolutionary differentiation between honey bee species.

  1. KSC-2013-3233

    NASA Image and Video Library

    2013-08-09

    CAPE CANAVERAL, Fla. – Google used an assortment of vehicles to precisely map NASA's Kennedy Space Center in Florida to be featured on the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Google used a car, tricycle and pushcart to maneuver around the center and through some of its facilities. Photo credit: Google/Wendy Wang

  2. Recent Advances in Geospatial Visualization with the New Google Earth

    NASA Astrophysics Data System (ADS)

    Anderson, J. C.; Poyart, E.; Yan, S.; Sargent, R.

    2017-12-01

    Google Earth's detailed, world-wide imagery and terrain data provide a rich backdrop for geospatial visualization at multiple scales, from global to local. The Keyhole Markup Language (KML) is an open standard that has been the primary way for users to author and share data visualizations in Google Earth. Despite its ease of use and flexibility for relatively small amounts of data, users can quickly run into difficulties and limitations working with large-scale or time-varying datasets using KML in Google Earth. Recognizing these challenges, we present our recent work toward extending Google Earth to be a more powerful data visualization platform. We describe a new KML extension to simplify the display of multi-resolution map tile pyramids - which can be created by analysis platforms like Google Earth Engine, or by a variety of other map tile production pipelines. We also describe how this implementation can pave the way to creating novel data visualizations by leveraging custom graphics shaders. Finally, we present our investigations into native support in Google Earth for data storage and transport formats that are well-suited for big raster and vector data visualization. Taken together, these capabilities make it easier to create and share new scientific data visualization experiences using Google Earth, and simplify the integration of Google Earth with existing map data products, services, and analysis pipelines.
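
    For context, KML's existing mechanism for multi-resolution pyramids, which the extension described above simplifies, is the Region/Lod "super-overlay": each node loads only when it would occupy enough screen pixels. One such node can be generated like this (the tile path is illustrative; the element names are standard KML 2.2):

```python
def tile_network_link(href, north, south, east, west, min_lod=128):
    """One super-overlay node: a KML NetworkLink whose Region only loads
    the linked tile once it covers at least `min_lod` screen pixels."""
    return f"""<NetworkLink>
  <Region>
    <LatLonAltBox>
      <north>{north}</north><south>{south}</south>
      <east>{east}</east><west>{west}</west>
    </LatLonAltBox>
    <Lod><minLodPixels>{min_lod}</minLodPixels></Lod>
  </Region>
  <Link><href>{href}</href><viewRefreshMode>onRegion</viewRefreshMode></Link>
</NetworkLink>"""

# Root node of a hypothetical world-spanning tile pyramid.
node = tile_network_link("tiles/0/0/0.kml", 90, -90, 180, -180)
assert "<minLodPixels>128</minLodPixels>" in node
```

    A full pyramid nests four such children per tile, one per quadrant, each pointing at the next zoom level.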

  3. Teaching Young Adults with Intellectual and Developmental Disabilities Community-Based Navigation Skills to Take Public Transportation.

    PubMed

    Price, Richard; Marsh, Abbie J; Fisher, Marisa H

    2018-03-01

    Facilitating the use of public transportation enhances opportunities for independent living and competitive, community-based employment for individuals with intellectual and developmental disabilities (IDD). Four young adults with IDD were taught through total-task chaining to use the Google Maps application, a self-prompting, visual navigation system, to take the bus to locations around a college campus and the community. Three of four participants learned to use Google Maps to independently navigate public transportation. Google Maps may be helpful in supporting independent travel, highlighting the importance of future research in teaching navigation skills. Learning to independently use public transportation increases access to autonomous activities, such as opportunities to work and to attend postsecondary education programs on large college campuses. Individuals with IDD can be taught through chaining procedures to use the Google Maps application to navigate public transportation. Mobile map applications are an effective and functional modern tool that can be used to teach community navigation.

  4. Destination Information System for Bandung City Using Location-Based Services (LBS) on Android

    NASA Astrophysics Data System (ADS)

    Kurniawan, B.; Pranoto, H.

    2018-02-01

Bandung is a city in West Java, Indonesia, with many interesting locations to visit. The most popular destinations are easy to find on Google, where many blogs discuss them, but there is no guarantee that a destination is actually frequented by visitors. In this research, we built an application to help users choose destinations that are frequented by visitors. The use of information technology, in the form of pictures, maps, and text in an Android application, makes it possible for users to obtain information about a destination together with its visitors over a period of time. If a destination has a visit history, suitable destinations can be suggested with fresh information. The application runs well on Android Lollipop (API level 21) or above with a minimum of 2 GB of RAM, since it compares two coordinates for every data point. The app makes it possible to access information about a location together with its visitor history, and can help users choose suitable destinations.
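Comparing two coordinates per record, as described above, typically means a great-circle distance check; a sketch using the standard haversine formula (the sample coordinates are approximate points in Bandung):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Approximate distance between two points in central Bandung
d = haversine_km(-6.9175, 107.6191, -6.9025, 107.6186)
```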

  5. Teaching Topographic Map Skills and Geomorphology Concepts with Google Earth in a One-Computer Classroom

    ERIC Educational Resources Information Center

    Hsu, Hsiao-Ping; Tsai, Bor-Wen; Chen, Che-Ming

    2018-01-01

    Teaching high-school geomorphological concepts and topographic map reading entails many challenges. This research reports the applicability and effectiveness of Google Earth in teaching topographic map skills and geomorphological concepts, by a single teacher, in a one-computer classroom. Compared to learning via a conventional instructional…

  6. Spatial decision supporting for winter wheat irrigation and fertilizer optimizing in North China Plain

    NASA Astrophysics Data System (ADS)

    Yang, Xiaodong; Yang, Hao; Dong, Yansheng; Yu, Haiyang

    2014-11-01

Production management of winter wheat is more complicated than for other crops, since its growth period spans all four seasons and its growing environment is complex, with frost injury, drought, insect and disease damage, and other stresses. In traditional irrigation and fertilizer management, agricultural technicians or farmers make decisions mainly from phenology and planting experience. For example, experience says that wheat needs more nitrogen fertilizer at the jointing and booting stages, so when the wheat reaches those two growth stages the farmer fertilizes it whether it needs fertilizer or not. We developed a WebGIS-based spatial decision support system for optimizing irrigation and fertilizer measures that monitors winter wheat growth and soil moisture content by combining a crop model, remote sensing data, and wireless sensor data, and then derives a professional management schedule from an expert knowledge warehouse. The system is built with ArcIMS and IDL on the server side and jQuery, the Google Maps API, and ASP.NET on the client side. All computing tasks run on the server side, such as computing 11 standard vegetation indexes (NDVI/NDWI/NDWI2/NRI/NSI/WI/G_SWIR/G_SWIR2/SPSI/TVDI/VSWI) and custom VIs from remote sensing imagery with IDL, while map configuration files and thematic maps are built in real time by ArcIMS.
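Of the vegetation indexes the system computes, NDVI has a standard definition, (NIR - Red) / (NIR + Red); a minimal sketch of that per-pixel calculation (the reflectance values are illustrative, not from the paper):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel.
    Values near +1 indicate dense green vegetation."""
    total = nir + red
    return (nir - red) / total if total != 0 else 0.0

# Illustrative reflectance values for a healthy wheat canopy
value = ndvi(nir=0.45, red=0.08)
```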

  7. Using Map Service API for Driving Cycle Detection for Wearable GPS Data: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Lei; Gonder, Jeffrey D

Following advancements in smartphone and portable global positioning system (GPS) data collection, wearable GPS data have seen extensive use in transportation surveys and studies. The task of detecting driving cycles (driving or car-mode trajectory segments) from wearable GPS data has been the subject of much research; specifically, distinguishing driving cycles from other motorized trips (such as taking a bus) is the main research problem in this paper. Many mode detection methods focus only on raw GPS speed data, while some studies apply additional information, such as geographic information system (GIS) data, to obtain better detection performance. Procuring and maintaining dedicated road GIS data are costly and not trivial, whereas the technical maturity and broad use of map service application programming interface (API) queries offer opportunities for mode detection tasks. The proposed driving cycle detection method takes advantage of map service APIs to obtain high-quality car-mode API route information and uses a trajectory segmentation algorithm to find the best-matched API route. The car-mode API route data, combined with the actual route information, including the actual mode information, are used to train a logistic regression machine learning model, which estimates car modes and non-car modes with probability rates. The experimental results show promise for the proposed method's ability to detect vehicle mode accurately.
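The paper's trained model and feature set are not reproduced in the abstract; the sketch below only illustrates the final logistic-regression scoring step, with hypothetical features (agreement with the matched API route's speed profile, and the share of the trajectory lying on that route) and hand-set weights:

```python
import math

# Hypothetical weights; the paper learns these from labelled trips.
WEIGHTS = {"bias": -3.0, "speed_match": 4.0, "route_overlap": 2.5}

def car_mode_probability(speed_match, route_overlap):
    """Logistic-regression score: probability a trajectory segment is a
    car trip. Both features are assumed to lie in [0, 1]."""
    z = (WEIGHTS["bias"]
         + WEIGHTS["speed_match"] * speed_match
         + WEIGHTS["route_overlap"] * route_overlap)
    return 1.0 / (1.0 + math.exp(-z))

# A segment that tracks the driving-directions route closely
p = car_mode_probability(speed_match=0.9, route_overlap=0.8)
```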

  8. interPopula: a Python API to access the HapMap Project dataset

    PubMed Central

    2010-01-01

Background: The HapMap project is a publicly available catalogue of common genetic variants that occur in humans, currently including several million SNPs across 1,115 individuals spanning 11 different populations. This important database does not provide any programmatic access to the dataset; furthermore, no standard relational database interface is provided. Results: interPopula is a Python API to access the HapMap dataset. interPopula provides integration facilities with both the Python software ecosystem (e.g. Biopython and matplotlib) and other relevant human population datasets (e.g. Ensembl gene annotation and UCSC Known Genes). A set of guidelines and code examples to address possible inconsistencies across heterogeneous data sources is also provided. Conclusions: interPopula is a straightforward and flexible Python API that facilitates the construction of scripts and applications that require access to the HapMap dataset. PMID:21210977

  9. Supporting our scientists with Google Earth-based UIs.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Janine

    2010-10-01

    Google Earth and Google Maps are incredibly useful for researchers looking for easily-digestible displays of data. This presentation will provide a step-by-step tutorial on how to begin using Google Earth to create tools that further the mission of the DOE national lab complex.

  10. PhyloGeoViz: a web-based program that visualizes genetic data on maps.

    PubMed

    Tsai, Yi-Hsin E

    2011-05-01

    The first step of many population genetic studies is the simple visualization of allele frequencies on a landscape. This basic data exploration can be challenging without proprietary software, and the manual plotting of data is cumbersome and unfeasible at large sample sizes. I present an open source, web-based program that plots any kind of frequency or count data as pie charts in Google Maps (Google Inc., Mountain View, CA). Pie polygons are then exportable to Google Earth (Google Inc.), a free Geographic Information Systems platform. Import of genetic data into Google Earth allows phylogeographers access to a wealth of spatial information layers integral to forming hypotheses and understanding patterns in the data. © 2010 Blackwell Publishing Ltd.

  11. Exploring the Effects of Employing Google Docs in Collaborative Concept Mapping on Achievement, Concept Representation, and Attitudes

    ERIC Educational Resources Information Center

    Lin, Yu-Tzu; Chang, Chia-Hu; Hou, Huei-Tse; Wu, Ke-Chou

    2016-01-01

    This study investigated the effectiveness of using Google Docs in collaborative concept mapping (CCM) by comparing it with a paper-and-pencil approach. A quasi-experimental study was conducted in a physics course. The control group drew concept maps using the paper-and-pencil method and face-to-face discussion, whereas the experimental group…

  12. Usability analysis of indoor map application in a shopping centre

    NASA Astrophysics Data System (ADS)

    Dewi, R. S.; Hadi, R. K.

    2018-04-01

Although indoor navigation is still new in Indonesia, its future development is very promising. Like its outdoor counterpart, indoor navigation technology provides several important functions to support route and landmark finding. Indoor navigation can also support public safety, especially during the evacuation of a building in a disaster. Indoor navigation technologies are commonly built as applications that users access through their smartphones, tablets, or personal computers. A usability analysis is therefore important to ensure that indoor navigation applications can be operated effectively by users. Among the several indoor map applications available on the market, this study chose to analyse indoor Google Maps because of its availability and popularity in Indonesia. The experiments testing indoor Google Maps were conducted in one of the biggest shopping centres in Surabaya, Indonesia. Usability was measured with the System Usability Scale (SUS) questionnaire. The results showed that the SUS score of indoor Google Maps was below the average score of other mobile applications, indicating that users still had considerable difficulty operating and learning the features of indoor Google Maps.
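SUS scoring itself is standardized: each odd-numbered (positively worded) item contributes its response minus 1, each even-numbered item contributes 5 minus its response, and the sum is multiplied by 2.5 to give a 0-100 score. A minimal sketch:

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses.
    Odd-numbered items are positively worded, even-numbered negatively."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to 0-100

score = sus_score([3] * 10)  # all-neutral answers give 50.0
```

A commonly cited benchmark puts the average SUS score around 68, which is the kind of reference point the study compares against.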

  13. Perils of using speed zone data to assess real-world compliance to speed limits.

    PubMed

    Chevalier, Anna; Clarke, Elizabeth; Chevalier, Aran John; Brown, Julie; Coxon, Kristy; Ivers, Rebecca; Keay, Lisa

    2017-11-17

    Real-world driving studies, including those involving speeding alert devices and autonomous vehicles, can gauge an individual vehicle's speeding behavior by comparing measured speed with mapped speed zone data. However, there are complexities with developing and maintaining a database of mapped speed zones over a large geographic area that may lead to inaccuracies within the data set. When this approach is applied to large-scale real-world driving data or speeding alert device data to determine speeding behavior, these inaccuracies may result in invalid identification of speeding. We investigated speeding events based on service provider speed zone data. We compared service provider speed zone data (Speed Alert by Smart Car Technologies Pty Ltd., Ultimo, NSW, Australia) against a second set of speed zone data (Google Maps Application Programming Interface [API] mapped speed zones). We found a systematic error in the zones where speed limits of 50-60 km/h, typical of local roads, were allocated to high-speed motorways, which produced false speed limits in the speed zone database. The result was detection of false-positive high-range speeding. Through comparison of the service provider speed zone data against a second set of speed zone data, we were able to identify and eliminate data most affected by this systematic error, thereby establishing a data set of speeding events with a high level of sensitivity (a true positive rate of 92% or 6,412/6,960). Mapped speed zones can be a source of error in real-world driving when examining vehicle speed. We explored the types of inaccuracies found within speed zone data and recommend that a second set of speed zone data be utilized when investigating speeding behavior or developing mapped speed zone data to minimize inaccuracy in estimates of speeding.
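The recommended cross-check of two speed-zone datasets, and the sensitivity figure quoted above, can be sketched as follows (the segment IDs and limits are illustrative; the 6,412/6,960 ratio is from the abstract):

```python
def flag_mismatches(zones_a, zones_b, tolerance=0):
    """Return road segments whose speed limits disagree between two
    mapped speed-zone datasets (dicts of segment id -> limit, km/h).
    Disagreement flags the kind of systematic error described above,
    e.g. a 50-60 km/h local-road limit allocated to a motorway."""
    return {seg for seg in zones_a.keys() & zones_b.keys()
            if abs(zones_a[seg] - zones_b[seg]) > tolerance}

def sensitivity(true_positives, all_positives):
    """True-positive rate of the cleaned speeding-event dataset."""
    return true_positives / all_positives

provider = {"m1": 50, "m2": 110, "local_a": 50}   # service provider zones
google = {"m1": 110, "m2": 110, "local_a": 50}    # second dataset
suspect = flag_mismatches(provider, google)       # m1 disagrees

rate = sensitivity(6412, 6960)  # figures reported in the abstract
```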

  14. 77 FR 40579 - Tapered Roller Bearings and Parts Thereof, Finished or Unfinished, From the People's Republic of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... explained in the legislative history of the Omnibus Trade and Competitiveness Act of 1988, the Department... Google Maps: https://maps.google.com . The rates were in effect prior to the POR, so we adjusted them to...

  15. TouchTerrain: A simple web-tool for creating 3D-printable topographic models

    NASA Astrophysics Data System (ADS)

    Hasiuk, Franciszek J.; Harding, Chris; Renner, Alex Raymond; Winer, Eliot

    2017-12-01

An open-source web application, TouchTerrain, was developed to simplify the production of 3D-printable terrain models. Direct Digital Manufacturing (DDM) using 3D printers can change how geoscientists, students, and stakeholders interact with 3D data, with the potential to improve geoscience communication and environmental literacy. No other manufacturing technology can convert digital data into tangible objects as quickly and at relatively low cost; however, the expertise necessary to produce a 3D-printed terrain model can be a substantial burden: knowledge of geographical information systems, computer-aided design (CAD) software, and 3D printers may all be required. Furthermore, printing models larger than the build volume of a 3D printer can pose additional technical hurdles. The TouchTerrain web application simplifies DDM for elevation data by generating digital 3D models customized for a specific 3D printer's capabilities. The only required user input is the selection of a region of interest using the provided web application with a Google Maps-style interface. Publicly available digital elevation data is processed via the Google Earth Engine API. To allow the manufacture of 3D terrain models larger than a 3D printer's build volume, the selected area can be split into multiple tiles without third-party software. This application significantly reduces the time and effort required for a non-expert, such as an educator, to obtain 3D terrain models for use in class. The web application is deployed at http://touchterrain.geol.iastate.edu/.
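Splitting a selected region into tiles that fit a printer's build volume amounts to subdividing the bounding box; this is a hedged sketch of that idea, not TouchTerrain's actual code:

```python
def split_bbox(north, south, east, west, rows, cols):
    """Split a geographic bounding box into rows x cols tiles, each
    returned as (north, south, east, west). Each tile can then be
    exported as a separate printable model."""
    dlat = (north - south) / rows
    dlon = (east - west) / cols
    tiles = []
    for r in range(rows):
        for c in range(cols):
            tiles.append((north - r * dlat,          # tile north edge
                          north - (r + 1) * dlat,    # tile south edge
                          west + (c + 1) * dlon,     # tile east edge
                          west + c * dlon))          # tile west edge
    return tiles

# A 2x2 split of a region in central Iowa (coordinates illustrative)
tiles = split_bbox(44.0, 42.0, -93.0, -95.0, rows=2, cols=2)
```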

  16. Positional Accuracy Assessment of Googleearth in Riyadh

    NASA Astrophysics Data System (ADS)

    Farah, Ashraf; Algarni, Dafer

    2014-06-01

Google Earth is a virtual globe, map, and geographical information program operated by Google. It maps the Earth by superimposing images obtained from satellite imagery and aerial photography, together with GIS data, on a 3D globe. With millions of users all around the globe, Google Earth has become the ultimate source of spatial data and information for private and public decision-support systems, as well as for many types and forms of social interaction. Many users, mostly in developing countries, also use it for surveying applications, which raises questions about the positional accuracy of the Google Earth program. This research presents a small-scale assessment study of the positional accuracy of Google Earth imagery in Riyadh, capital of the Kingdom of Saudi Arabia (KSA). The results show that the RMSE of the Google Earth imagery is 2.18 m for the horizontal coordinates and 1.51 m for the height coordinates.
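The quoted accuracy figures are root-mean-square errors over check points; a sketch of the calculation with illustrative residuals (not the paper's data):

```python
import math

def rmse(errors):
    """Root-mean-square error of a list of residuals (metres)."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Illustrative horizontal residuals between surveyed GPS points and
# the coordinates read off Google Earth imagery.
horizontal = rmse([1.9, 2.4, 2.1, 2.3])
```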

  17. Google Earth mapping of damage from the Niigata-Ken-Chuetsu M6.6 earthquake of 16 July 2007

    USGS Publications Warehouse

    Kayen, Robert E.; Steele, WM. Clint; Collins, Brian; Walker, Kevin

    2008-01-01

We describe the use of Google Earth during and after a large damaging earthquake that struck the central Japan coast on 16 July 2007 to collect and organize damage information and guide reconnaissance activities. The software enabled greater real-time collaboration among scientists and engineers. After the field investigation, the Google Earth map was used as a final reporting product that was directly linked to the more traditional research report document. Finally, we analyze the use of the software within the context of a post-disaster reconnaissance investigation, and link it to student use of Google Earth in field situations.

  18. KSC-2013-3239

    NASA Image and Video Library

    2013-08-09

    CAPE CANAVERAL, Fla. – As seen on Google Maps, the massive F-1 engines of the Saturn V's first stage on display inside the Apollo/Saturn V Center at the Kennedy Space Center Visitor Complex. Each engine stands 19 feet tall with a diameter of more than 12 feet. The five engines on the first stage produced 7.5 million pounds of thrust at liftoff. The Saturn V was used to launch NASA's Apollo missions to the moon which saw 12 astronauts land and work on the lunar surface. Google precisely mapped Kennedy Space Center and some of its historical facilities for the company's map page. Photo credit: Google/Wendy Wang

  19. Visualizing Cross-sectional Data in a Real-World Context

    NASA Astrophysics Data System (ADS)

    Van Noten, K.; Lecocq, T.

    2016-12-01

If you could fly around your research results in three dimensions, wouldn't you like to do it? Properly visualizing research results during scientific presentations already does half the job of informing the public about the geographic framework of your research. Many scientists use the Google Earth™ mapping service (V7.1.2.2041) because it is a great interactive mapping tool for assigning geographic coordinates to individual data points, localizing a research area, and draping maps of results over Earth's surface for 3D visualization. However, research results in vertical cross-sections are often not shown simultaneously with the maps in Google Earth. A few tutorials and programs to display cross-sectional data in Google Earth do exist, and the workflow is rather simple: by importing a cross-sectional figure into the open software SketchUp Make [Trimble Navigation Limited, 2016], any spatial model can be exported as a vertical figure in Google Earth. This presentation provides a clear workflow/tutorial for displaying cross-sections manually in Google Earth. No software skills or programming are required. The approach is very easy to use, offers great possibilities for teaching, and allows fast figure manipulation in Google Earth. The full workflow can be found in "Van Noten, K. 2016. Visualizing Cross-Sectional Data in a Real-World Context. EOS, Transactions AGU, 97, 16-19". The video tutorial is at https://www.youtube.com/watch?v=Tr8LwFJ4RYU. Figure: cross-sectional research examples illustrated in Google Earth.

  20. The Adversarial Route Analysis Tool: A Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casson, William H. Jr.

    2012-08-02

The Adversarial Route Analysis Tool is, in effect, a Google Maps for adversaries: a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.

  1. G2S: a web-service for annotating genomic variants on 3D protein structures.

    PubMed

    Wang, Juexin; Sheridan, Robert; Sumer, S Onur; Schultz, Nikolaus; Xu, Dong; Gao, Jianjiong

    2018-06-01

Accurately mapping and annotating genomic locations on 3D protein structures is a key step in structure-based analysis of genomic variants detected by recent large-scale sequencing efforts. There are several mapping resources currently available, but none of them provides a web API (Application Programming Interface) that supports programmatic access. We present G2S, a real-time web API that provides automated mapping of genomic variants on 3D protein structures. G2S can align genomic locations of variants, protein locations, or protein sequences to protein structures and retrieve the mapped residues from the structures. The G2S API uses a REST-inspired design and can be used by various clients, such as web browsers, command terminals, programming languages, and other bioinformatics tools, to bring 3D structures into genomic variant analysis. The webserver and source code are freely available at https://g2s.genomenexus.org. Contact: g2s@genomenexus.org. Supplementary data are available at Bioinformatics online.
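A REST-style API like G2S is queried by URL; the endpoint name and parameters below are illustrative assumptions, not taken from the paper, and only the server address comes from the abstract:

```python
from urllib.parse import urlencode

BASE = "https://g2s.genomenexus.org"  # server named in the abstract

def build_alignment_query(gene, position, endpoint="residueMapping"):
    """Construct a query URL for mapping a protein position onto 3D
    structures. The endpoint name and parameter names are hypothetical;
    consult the actual G2S documentation for the real paths."""
    params = urlencode({"gene": gene, "position": position})
    return f"{BASE}/{endpoint}?{params}"

url = build_alignment_query("TP53", 273)
```

The point of a REST design is exactly this: any client that can build a URL and parse the response (browser, curl, a script) can use the service.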

  2. Towards democracy in spatial planning through spatial information built by communities: The investigation of spatial information built by citizens from participatory mapping to volunteered geographic information in Indonesia

    NASA Astrophysics Data System (ADS)

    Yudono, Adipandang

    2017-06-01

Recently, crowd-sourced information has been used to produce and improve collective knowledge and community capacity building. Triggered by broadening and expanding access to the Internet and cellular telephones, the utilisation of crowd-sourcing for policy advocacy, e-government, and e-participation has increased globally [1]. Crowd-sourced information can conceivably support government or general social initiatives to inform, consult, and cooperate, by engaging citizens and empowering decentralisation and democratisation [2]. Crowd-sourcing has become a major technique for interactive mapping initiatives by urban and rural communities because of its capability to incorporate a wide range of data. Continuously accumulated spatial data can be sorted, layered, and visualised in ways that even beginners can comprehend with ease. Interactive spatial visualisation has the potential to be a useful democratic planning tool that empowers citizens to participate in spatial data provision and sharing in government programmes. Since the global emergence of World Wide Web (WWW) technology, the interaction between information providers and users has increased. Local communities are able to produce and share spatial data and to build web interfaces with territorial information on public mapping application programming interfaces (APIs), such as Google Maps, OSM, and Wikimapia [3][4][5]. In terms of democratic spatial planning action, Volunteered Geographic Information (VGI) is considered an effective voluntary method of helping people feel comfortable with the technology and with other co-participants in order to shape coalitions of local knowledge. This paper aims to investigate 'How is spatial data created by citizens used in Indonesia?' by discussing the characteristics of spatial data usage by citizens to support spatial policy formulation, starting with the history of participatory mapping and continuing to current VGI development in Indonesia.

  3. Predicting plant attractiveness to pollinators with passive crowdsourcing.

    PubMed

    Bahlai, Christie A; Landis, Douglas A

    2016-06-01

Global concern regarding pollinator decline has intensified interest in enhancing pollinator resources in managed landscapes. These efforts frequently emphasize restoration or planting of flowering plants to provide pollen and nectar resources that are highly attractive to the desired pollinators. However, determining exactly which plant species should be used to enhance a landscape is difficult. Empirical screening of plants for such purposes is logistically daunting, but could be streamlined by crowdsourcing data to create lists of plants most likely to attract the desired pollinator taxa. People frequently photograph plants in bloom, and the Internet has become a vast repository of such images. A proportion of these images also capture floral visitation by arthropods. Here, we test the hypothesis that the abundance of floral images containing identifiable pollinator and other beneficial insects is positively associated with the observed attractiveness of the same species in controlled field trials from previously published studies. We used Google Image searches to determine the correlation of pollinator visitation captured by photographs on the Internet relative to the attractiveness of the same species in common-garden field trials for 43 plant species. From the first 30 photographs that successfully identified the plant, we recorded the number of Apis (managed honeybees) and non-Apis (exclusively wild) bees and the number of bee-mimicking syrphid flies. We used these observations from search hits, as well as bloom period (BP), as predictor variables in Generalized Linear Models (GLMs) for the field-observed abundances of each of these groups. We found that non-Apis bees observed in controlled field trials were positively associated with observations of these taxa in Google Image searches (pseudo-R² of 0.668). Syrphid fly observations in the field were also associated with the frequency at which they were observed in images, but this relationship was weak.
Apis bee observations were not associated with Internet images, but were slightly associated with BP. Our results suggest that passively crowdsourced image data can potentially be a useful screening tool to identify candidate plants for pollinator habitat restoration efforts directed at wild bee conservation. Increasing our understanding of the attractiveness of a greater diversity of plants increases the potential for more rapid and efficient research in creating pollinator-supportive landscapes.

  4. Efficiently Communicating Rich Heterogeneous Geospatial Data from the FeMO2008 Dive Cruise with FlashMap on EarthRef.org

    NASA Astrophysics Data System (ADS)

    Minnett, R. C.; Koppers, A. A.; Staudigel, D.; Staudigel, H.

    2008-12-01

EarthRef.org is a comprehensive and convenient resource for Earth Science reference data and models. It encompasses four main portals: the Geochemical Earth Reference Model (GERM), the Magnetics Information Consortium (MagIC), the Seamount Biogeosciences Network (SBN), and the Enduring Resources for Earth Science Education (ERESE). Their underlying databases are publicly available, and the scientific community has contributed widely and is urged to continue to do so. However, the net result is a vast and largely heterogeneous warehouse of geospatial data, ranging from carefully prepared maps of seamounts to geochemical data/metadata, daily reports from seagoing expeditions, large volumes of raw and processed multibeam data, images of paleomagnetic sampling sites, etc. This presents a considerable obstacle for integrating other rich media content, such as videos, images, data files, cruise tracks, and interoperable database results, without overwhelming the web user. The four EarthRef.org portals clearly lend themselves to a more intuitive user interface and have therefore been an invaluable test bed for the design and implementation of FlashMap, a versatile KML-driven geospatial browser written in Adobe Flash for reliability and speed. FlashMap allows layers of content to be loaded and displayed over a streaming high-resolution map that can be zoomed and panned similarly to Google Maps and Google Earth. Many organizations, from National Geographic to the USGS, have begun using Google Earth software to display geospatial content. However, Google Earth, as a desktop application, does not integrate cleanly with existing websites, requiring the user to navigate away from the browser and focus on a separate application, while Google Maps, written in JavaScript, does not scale up reliably to large datasets.
FlashMap remedies these problems as a web-based application that allows for seamless integration of the real-time display power of Google Earth and the flexibility of the web without losing scalability and control of the base maps. Our Flash-based application is fully compatible with KML (Keyhole Markup Language) 2.2, the most recent iteration of KML, allowing users with existing Google Earth KML files to effortlessly display their geospatial content embedded in a web page. As a test case for FlashMap, the annual Iron-Oxidizing Microbial Observatory (FeMO) dive cruise to the Loihi Seamount, in conjunction with data available from ongoing and published FeMO laboratory studies, showcases the flexibility of this single web-based application. With a KML 2.2 compatible web-service providing the content, any database can display results in FlashMap. The user can then hide and show multiple layers of content, potentially from several data sources, and rapidly digest a vast quantity of information to narrow the search results. This flexibility gives experienced users the ability to drill down to exactly the record they are looking for (SERC at Carleton College's educational application of FlashMap at http://serc.carleton.edu/sp/erese/activities/22223.html) and allows users familiar with Google Earth the ability to load and view geospatial data content within a browser from any computer with an internet connection.
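FlashMap consumes standard KML 2.2, so any client can extract, for example, Point placemarks from such a file; a sketch using Python's standard library:

```python
import xml.etree.ElementTree as ET

KML_NS = {"kml": "http://www.opengis.net/kml/2.2"}

def placemark_coords(kml_text):
    """Extract (name, lon, lat) tuples from Point placemarks in a
    KML 2.2 document, the format FlashMap and Google Earth consume.
    KML lists coordinates in lon,lat[,alt] order."""
    root = ET.fromstring(kml_text)
    out = []
    for pm in root.iter("{http://www.opengis.net/kml/2.2}Placemark"):
        name = pm.findtext("kml:name", default="", namespaces=KML_NS)
        coords = pm.findtext("kml:Point/kml:coordinates",
                             default="", namespaces=KML_NS)
        if coords.strip():
            lon, lat = coords.strip().split(",")[:2]
            out.append((name, float(lon), float(lat)))
    return out

# Minimal sample: a placemark near the Loihi Seamount (illustrative)
sample = """<kml xmlns="http://www.opengis.net/kml/2.2"><Document>
<Placemark><name>Loihi</name>
<Point><coordinates>-155.27,18.92,0</coordinates></Point>
</Placemark></Document></kml>"""
points = placemark_coords(sample)
```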

  5. Complaint go: an online complaint registration system using web services and android

    NASA Astrophysics Data System (ADS)

    Mareeswari, V.; Gopalakrishnan, V.

    2017-11-01

In many countries there are city bodies, the local governing bodies that help maintain and run urban communities; these governing bodies are usually called MCs (Municipal Corporations). An MC may need to install CCTV cameras and other surveillance devices to ensure the city runs smoothly and efficiently, and it is important for the MC to know about faults occurring within the city. At present this is only practically possible by installing sensors and cameras, or by allowing citizens to address the authorities directly. The day-to-day operations and working of the city are handled by governing bodies known as Government Authorities (GAs). Maintaining a large city requires that the Government Authority be aware of any problem or fault, either through sensors/CCTV cameras or by enabling citizens to complain about these issues. The second option is generally favoured because it provides properly substantiated information. The GA generally allows its residents to register grievances through several channels. In this application, citizens can send complaints directly from their smartphones to the relevant officials. Several APIs function as web services that make it easier to register a complaint, such as the Google Places API, which detects the user's current location and shows it on a map. A web portal, supported by different web services, is used to process the various complaints.

  6. Travel behavior of low income older adults and implementation of an accessibility calculator

    PubMed Central

    Moniruzzaman, Md; Chudyk, Anna; Páez, Antonio; Winters, Meghan; Sims-Gould, Joanie; McKay, Heather

    2016-01-01

    Given the aging demographic landscape, the concept of walkable neighborhoods has emerged as a topic of interest, especially during the last decade. However, we know very little about whether walkable neighborhoods promote walking among older adults, particularly those with lower incomes. Therefore in this paper we: (i) examine the relation between trip distance and sociodemographic attributes and accessibility features of lower income older adults in Metro Vancouver; and, (ii) implement a web-based application to calculate the accessibility of lower income older adults in Metro Vancouver based on their travel behavior. We use multilevel linear regression to estimate the determinants of trip length. We find that in this population distance traveled is associated with gender, living arrangements, and dog ownership. Furthermore, significant geographical variations (measured using a trend surface) were also found. To better visualize the impact of travel behavior on accessibility by personal profile and location, we also implemented a web-based calculator that generates an Accessibility (A)-score using Google Maps API v3 that can be used to evaluate the accessibility of neighborhoods from the perspective of older adults. PMID:27104148
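The abstract does not give the A-score formula, so the sketch below is purely hypothetical: an accessibility score that decays with trip distance, just to illustrate the kind of calculation such a web calculator performs:

```python
def accessibility_score(trip_distances_km, max_score=100.0, half_km=2.0):
    """Hypothetical A-score: average of per-destination scores, where
    each score halves every `half_km` kilometres of travel distance.
    NOT the published formula; an illustration only."""
    if not trip_distances_km:
        return 0.0
    scores = [max_score * 0.5 ** (d / half_km) for d in trip_distances_km]
    return sum(scores) / len(scores)

# Typical trip distances (km) for one neighbourhood, illustrative
a = accessibility_score([1.0, 2.0, 4.0])
```

The real calculator presumably draws its distances from the Google Maps API's routing results rather than straight-line estimates.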

  7. From Google Maps to Google Models (Invited)

    NASA Astrophysics Data System (ADS)

    Moore, R. V.

    2010-12-01

Why hasn't integrated modelling taken off? To its advocates, it is self-evidently the best, and arguably the only, tool available for understanding and predicting the likely response of the environment to events and policies. Legislation requires managers to ensure that their plans are sustainable. How, other than by modelling the interacting processes involved, can the option with the greatest benefits be identified? Integrated modelling (IM) is seen to have huge potential. In science, IM is used to extend and encapsulate our understanding of the whole earth system. Such models are beginning to be incorporated in operational decision support systems and used to seek sustainable solutions to society's problems, but only on a limited scale. Commercial take-up is negligible, yet the opportunities would appear limitless. The need is there and the potential is there, so what is inhibiting IM's take-up? What must be done to reap the rewards of the R&D to date? To answer the question, it is useful to look back at the developments which have seen paper maps evolve into Google Maps and the systems that now surround it: facilities available not just to experts and governments but to anyone with an iPhone and an internet connection. The initial objective was to automate the process of drawing lines on paper, though it was quickly realised that digitising maps was the key to unlocking the information they held. However, it took thousands of PhD and MSc projects before a computer could generate a map comparable to that produced by a cartographer, and many more before it was possible to extract reliable, useful information from maps. It also required advances in IT and a change of mindset from one focused on paper map production to one focused on information delivery.
To move from digital maps to Google Maps required the availability of data on a world scale, the resources to bring them together, the development of remote sensing, satellite navigation and communications technology, and the creation of a commercial climate and conditions that allowed businesses anywhere to exploit the new information. This talk will draw lessons from that experience and imagine how Google Maps could become Google Models. The first lesson concerns time scale: it took far longer for digital mapping to move out of the development phase than most expected. Its first real customers were the public utilities. These are large organisations, risk-averse, and take time to change their ways of working; integrated modellers should not be surprised by the slow take-up. Few of the early commercial entrants made any significant profits. It was only when the data reached critical mass and became accessible, when the systems became easy to use, affordable and accessible via the web, when convincing demonstrations became available, and when the necessary standards emerged that Google Maps could emerge. IM has yet to reach this point. It has far bigger technical, scientific and institutional challenges to overcome. The resources required will be large. It is possible, though, that they could be marshalled by creating an open-source community of practice. However, that community will need a facilitating core group and standards to succeed. Having seen what Google Maps made possible, and the innovative ideas it released, it is not difficult to imagine where a community of practice might take IM.

  8. KML-based teaching lessons developed by Google in partnership with the University of Alaska.

    NASA Astrophysics Data System (ADS)

    Kolb, E. J.; Bailey, J.; Bishop, A.; Cain, J.; Goddard, M.; Hurowitz, K.; Kennedy, K.; Ornduff, T.; Sfraga, M.; Wernecke, J.

    2008-12-01

    The focus of Google's Geo Education outreach efforts (http://www.google.com/educators/geo.html) is on helping primary, secondary, and post-secondary educators incorporate Google Earth and Sky, Google Maps, and SketchUp into their classroom lessons. In this poster and demonstration, we will show our KML-based science lessons, developed in partnership with the University of Alaska and used by our team in classroom teaching with Alaskan high-school students.

  9. Using Social Media and Mobile Devices to Discover and Share Disaster Data Products Derived From Satellites

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Cappelaere, Patrice; Frye, Stuart; Evans, John; Moe, Karen

    2014-01-01

    Data products derived from Earth observing satellites are difficult to find and share without specialized software and, oftentimes, highly paid and specialized staff. For our research effort, we endeavored to prototype a distributed architecture that depends on a standardized communication protocol and application programming interface (API) that makes it easy for anyone to discover and access disaster-related data. Providers can easily supply the public with their disaster-related products by building an adapter for our API. Users can use the API to browse and find products that relate to the disaster at hand (for example, floods) without a centralized catalogue, and are then able to share that data via social media. Furthermore, a longer-term goal for this architecture is to enable other users who see the shared disaster product to generate the same product for other areas of interest via simple point-and-click actions on the API on their mobile device. The user will also be able to edit the data with on-the-ground local observations and return the updated information to the original repository if configured for this function. This architecture leverages SensorWeb functionality [1] presented at previous IGARSS conferences. The architecture is divided into two pieces: the front-end, which is the GeoSocial API, and the back-end, which is a standardized disaster node that knows how to talk to other disaster nodes and can also communicate with the GeoSocial API. The GeoSocial API, along with the disaster node's basic functionality, enables crowdsourcing and thus can leverage in situ observations by people external to a group to perform tasks such as improving water reference maps, which are maps of existing water before floods. This can lower the cost of generating precision water maps. Keywords: Data Discovery, Disaster Decision Support, Disaster Management, Interoperability, CEOS WGISS Disaster Architecture

  10. Assessing species distribution using Google Street View: a pilot study with the Pine Processionary Moth.

    PubMed

    Rousselet, Jérôme; Imbert, Charles-Edouard; Dekri, Anissa; Garcia, Jacques; Goussard, Francis; Vincent, Bruno; Denux, Olivier; Robinet, Christelle; Dorkeld, Franck; Roques, Alain; Rossi, Jean-Pierre

    2013-01-01

    Mapping species' spatial distribution using spatial inference and prediction requires a lot of data. Occurrence data are generally not easily available from the literature and are very time-consuming to collect in the field. For that reason, we designed a survey to explore to what extent large-scale databases such as Google Maps and Google Street View could be used to derive valid occurrence data. We worked with the Pine Processionary Moth (PPM) Thaumetopoea pityocampa because the larvae of that moth build silk nests that are easily visible. The presence of the species at one location can therefore be inferred from visual records derived from the panoramic views available from Google Street View. We designed a standardized procedure allowing evaluation of the presence of the PPM on a sampling grid covering the landscape under study. The outputs were compared to field data. We investigated two landscapes using grids of different extent and mesh size. Data derived from Google Street View were highly similar to field data in the large-scale analysis based on a square grid with a mesh of 16 km (96% of matching records). Using a 2 km mesh size led to a strong divergence between field and Google-derived data (46% of matching records). We conclude that the Google database might provide useful occurrence data for mapping the distribution of species whose presence can be visually evaluated, such as the PPM. However, the accuracy of the output strongly depends on the spatial scales considered and on the sampling grid used. Other factors, such as the coverage of the Google Street View network with regard to sampling grid size and the spatial distribution of host trees with regard to the road network, may also be determinant.
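
    The study's headline numbers (96% vs. 46% matching records at the two grid scales) come down to a simple cell-by-cell agreement rate between field surveys and Street View-derived presence/absence records. A minimal sketch, with entirely made-up grid data:

```python
# Sketch (hypothetical data): agreement between field records and Street
# View-derived presence/absence records on the same sampling grid.
def match_rate(field, street_view):
    """Percentage of grid cells where both sources agree on presence/absence."""
    assert len(field) == len(street_view)
    matches = sum(1 for f, s in zip(field, street_view) if f == s)
    return 100.0 * matches / len(field)

# 1 = silk nests observed in the cell, 0 = none (illustrative values only)
field_cells = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
sv_cells    = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
print(f"{match_rate(field_cells, sv_cells):.0f}% matching records")
```

At a coarse mesh a single confirmed sighting anywhere in a large cell makes the two sources agree; at a fine mesh gaps in Street View road coverage drive the rate down, which is the scale effect the paper reports.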

  11. Assessing Species Distribution Using Google Street View: A Pilot Study with the Pine Processionary Moth

    PubMed Central

    Dekri, Anissa; Garcia, Jacques; Goussard, Francis; Vincent, Bruno; Denux, Olivier; Robinet, Christelle; Dorkeld, Franck; Roques, Alain; Rossi, Jean-Pierre

    2013-01-01

    Mapping species' spatial distribution using spatial inference and prediction requires a lot of data. Occurrence data are generally not easily available from the literature and are very time-consuming to collect in the field. For that reason, we designed a survey to explore to what extent large-scale databases such as Google Maps and Google Street View could be used to derive valid occurrence data. We worked with the Pine Processionary Moth (PPM) Thaumetopoea pityocampa because the larvae of that moth build silk nests that are easily visible. The presence of the species at one location can therefore be inferred from visual records derived from the panoramic views available from Google Street View. We designed a standardized procedure allowing evaluation of the presence of the PPM on a sampling grid covering the landscape under study. The outputs were compared to field data. We investigated two landscapes using grids of different extent and mesh size. Data derived from Google Street View were highly similar to field data in the large-scale analysis based on a square grid with a mesh of 16 km (96% of matching records). Using a 2 km mesh size led to a strong divergence between field and Google-derived data (46% of matching records). We conclude that the Google database might provide useful occurrence data for mapping the distribution of species whose presence can be visually evaluated, such as the PPM. However, the accuracy of the output strongly depends on the spatial scales considered and on the sampling grid used. Other factors, such as the coverage of the Google Street View network with regard to sampling grid size and the spatial distribution of host trees with regard to the road network, may also be determinant. PMID:24130675

  12. KSC-2013-3238

    NASA Image and Video Library

    2013-08-09

    CAPE CANAVERAL, Fla. – As seen on Google Maps, a Space Shuttle Main Engine, or SSME, stands inside the Engine Shop at Orbiter Processing Facility 3 at NASA's Kennedy Space Center. Each orbiter used three of the engines during launch and ascent into orbit. The engines burn super-cold liquid hydrogen and liquid oxygen and each one produces 155,000 pounds of thrust. The engines, known in the industry as RS-25s, could be reused on multiple shuttle missions. They will be used again later this decade for NASA's Space Launch System rocket. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang

  13. Comparison of Genetic Algorithm and Hill Climbing for Shortest Path Optimization Mapping

    NASA Astrophysics Data System (ADS)

    Fronita, Mona; Gernowo, Rahmat; Gunawan, Vincencius

    2018-02-01

    The Traveling Salesman Problem (TSP) is an optimization problem: find the shortest path that reaches several destinations in one trip without passing through the same city twice before returning to the departure city, a process applied in delivery systems. This comparison uses two methods: a genetic algorithm and hill climbing. Hill climbing works by directly exchanging part of the current path with a neighbouring alternative and keeping the new path only if its distance is smaller than the previous one. Genetic algorithms depend on their input parameters: population size, crossover probability, mutation probability, and number of generations. To simplify the process of determining the shortest path, we developed software that uses the Google Maps API. Tests were carried out 20 times with 8, 16, 24, and 32 cities to see which method is optimal in terms of distance and computation time. Experiments with 3, 4, 5, and 6 cities produced the same optimal distance for the genetic algorithm and hill climbing; the distances begin to differ at 7 cities. The overall results show that hill climbing is more optimal for small numbers of cities, while problems with more than 30 cities are better optimized using genetic algorithms.
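
    The hill-climbing variant described above (swap part of the tour, keep the swap only if the total distance shrinks) can be sketched in a few lines. This is a minimal illustration with an invented 5-city distance matrix, not the authors' implementation:

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour (returns to the start city)."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def hill_climb(dist, iterations=1000, seed=0):
    """Swap two cities at random; keep the swap only if the tour gets shorter."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    rng.shuffle(tour)
    best = tour_length(tour, dist)
    for _ in range(iterations):
        i, j = rng.sample(range(n), 2)
        tour[i], tour[j] = tour[j], tour[i]
        length = tour_length(tour, dist)
        if length < best:
            best = length                          # keep the improvement
        else:
            tour[i], tour[j] = tour[j], tour[i]    # undo the swap
    return tour, best

# Symmetric 5-city distance matrix (illustrative values only)
dist = [[0, 2, 9, 10, 7],
        [2, 0, 6, 4, 3],
        [9, 6, 0, 8, 5],
        [10, 4, 8, 0, 6],
        [7, 3, 5, 6, 0]]
tour, length = hill_climb(dist)
print(tour, length)
```

Because each step only ever accepts improvements, the search can stall in a local optimum, which is consistent with the paper's finding that a genetic algorithm overtakes hill climbing as the city count grows.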

  14. Navigation API Route Fuel Saving Opportunity Assessment on Large-Scale Real-World Travel Data for Conventional Vehicles and Hybrid Electric Vehicles: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Lei; Holden, Jacob; Gonder, Jeffrey D

    The green routing strategy, which instructs a vehicle to select a fuel-efficient route, benefits the current transportation system with fuel-saving opportunities. This paper introduces a navigation API route fuel-saving evaluation framework for estimating the fuel advantages of alternative API routes based on large-scale, real-world travel data for conventional vehicles (CVs) and hybrid electric vehicles (HEVs). Navigation APIs, such as the Google Directions API, integrate traffic conditions and provide feasible alternative routes for origin-destination pairs. This paper develops two link-based fuel-consumption models stratified by link-level speed, road grade, and functional class (local/non-local), one for CVs and the other for HEVs. The link-based fuel-consumption models are built by assigning travel from a large number of GPS driving traces to the links in TomTom MultiNet as the underlying road network layer, with road grade data from a U.S. Geological Survey elevation data set. Fuel consumption on a link is calculated by the proposed fuel consumption model. This paper envisions two kinds of applications: 1) identifying alternate routes that save fuel, and 2) quantifying the potential fuel savings for large amounts of travel. An experiment based on a large-scale California Household Travel Survey GPS trajectory data set is conducted. The fuel consumption and savings of CVs and HEVs are investigated. At the same time, the trade-off between fuel saving and time saving when choosing different routes is also examined for both powertrains.
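
    The framework's core idea, summing a per-link fuel rate stratified by speed, grade, and functional class over each candidate route, can be sketched as follows. The rate table here is entirely hypothetical (the paper fits its models to large-scale GPS travel data); only the structure of the calculation is illustrated:

```python
# Sketch of a link-based fuel estimate. Coefficients are invented for
# illustration; the paper's models are fit from real-world GPS traces.
def link_fuel_l(length_km, speed_kph, grade_pct, local):
    base = 0.07 if local else 0.05                   # litres/km, hypothetical
    speed_penalty = 0.02 if speed_kph < 30 else 0.0  # stop-and-go traffic
    grade_penalty = max(grade_pct, 0.0) * 0.005      # uphill costs extra fuel
    return length_km * (base + speed_penalty + grade_penalty)

def route_fuel_l(links):
    """Sum fuel over a route's links: (length_km, speed_kph, grade_pct, local)."""
    return sum(link_fuel_l(*link) for link in links)

# Two alternative routes between the same origin-destination pair
route_a = [(2.0, 25, 1.0, True), (5.0, 60, 0.0, False)]   # short but congested
route_b = [(8.0, 90, 0.0, False)]                          # longer but free-flowing
extra = route_fuel_l(route_a) - route_fuel_l(route_b)
print(f"route A uses {extra:.3f} L more than route B")
```

With numbers like these the longer highway route wins on fuel, which is exactly the kind of distance-versus-conditions trade-off the evaluation framework is meant to quantify.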

  15. GIS Database and Google Map of the Population at Risk of Cholangiocarcinoma in Mueang Yang District, Nakhon Ratchasima Province of Thailand.

    PubMed

    Kaewpitoon, Soraya J; Rujirakul, Ratana; Joosiri, Apinya; Jantakate, Sirinun; Sangkudloa, Amnat; Kaewthani, Sarochinee; Chimplee, Kanokporn; Khemplila, Kritsakorn; Kaewpitoon, Natthawut

    2016-01-01

    Cholangiocarcinoma (CCA) is a serious problem in Thailand, particularly in the northeastern and northern regions. A database of the population at risk is required for monitoring, surveillance, home health care, and home visits. Therefore, this study aimed to develop a geographic information system (GIS) database and Google map of the population at risk of CCA in Mueang Yang district, Nakhon Ratchasima province, northeastern Thailand, during June to October 2015. Populations at risk were screened using the Korat CCA verbal screening test (KCVST). Software included Microsoft Excel, ArcGIS, and Google Maps. The secondary data, including village points, sub-district boundaries, district boundaries, and hospital points in Mueang Yang district, were used to create the spatial database. The populations at risk for CCA and opisthorchiasis were used to create an attribute database. Data were transferred to WGS84 UTM ZONE 48. After the conversion, all of the data were imported into Google Earth using the online web page www.earthpoint.us. Some 222 of the 4,800 people at risk for CCA constituted a high-risk group. A geo-visual display is available at www.google.com/maps/d/u/0/edit?mid=zPxtcHv_iDLo.kvPpxl5mAs90&hl=th. The geo-visual display has 5 layers: layer 1, village locations and the number of the population at risk for CCA; layer 2, sub-district health promotion hospitals in Mueang Yang district and the number of opisthorchiasis cases; layer 3, sub-districts and the number of the population at risk for CCA; layer 4, the district hospital and the numbers of the population at risk for CCA and of opisthorchiasis cases; and layer 5, the district and the numbers of the population at risk for CCA and of opisthorchiasis cases. This GIS database and Google map production process is suitable for further monitoring, surveillance, and home health care for CCA sufferers.
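
    The export step in workflows like this, turning screened point data into a layer Google Earth can display, amounts to writing KML placemarks. A minimal sketch with invented village names and coordinates (the study used the earthpoint.us converter rather than hand-written KML):

```python
# Sketch: exporting point data (e.g. villages with at-risk counts) as a
# KML layer for Google Earth. All names and coordinates are made up.
from xml.sax.saxutils import escape

def to_kml(placemarks):
    """placemarks: list of (name, lon, lat) tuples -> KML document string."""
    body = "".join(
        f"<Placemark><name>{escape(name)}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
        for name, lon, lat in placemarks
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
            f"{body}</Document></kml>")

points = [("Village A (12 at risk)", 102.95, 15.42),
          ("Village B (7 at risk)", 103.01, 15.38)]
print(to_kml(points))
```

Note that KML coordinates are geographic longitude/latitude (WGS84), so data held in UTM Zone 48 would need reprojecting before export, which is what the conversion step in the study handles.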

  16. Migrating Department of Defense (DoD) Web Service Based Applications to Mobile Computing Platforms

    DTIC Science & Technology

    2012-03-01

    World Wide Web Consortium (W3C) Geolocation API to identify the device’s location and then center the map on the device. Finally, we modify the entry...

  17. Vocabulary services to support scientific data interoperability

    NASA Astrophysics Data System (ADS)

    Cox, Simon; Mills, Katie; Tan, Florence

    2013-04-01

    Shared vocabularies are a core element in interoperable systems. Vocabularies need to be available at run-time, and where the vocabularies are shared by a distributed community this implies the use of web technology to provide vocabulary services. Given the ubiquity of vocabularies or classifiers in systems, vocabulary services are effectively the base of the interoperability stack. In contemporary knowledge organization systems, a vocabulary item is considered a concept, with the "terms" denoting it appearing as labels. The Simple Knowledge Organization System (SKOS) formalizes this as an RDF Schema (RDFS) application, with a bridge to formal logic in Web Ontology Language (OWL). For maximum utility, a vocabulary should be made available through the following interfaces: * the vocabulary as a whole - at an ontology URI corresponding to a vocabulary document * each item in the vocabulary - at the item URI * summaries, subsets, and resources derived by transformation * through the standard RDF web API - i.e. a SPARQL endpoint * through a query form for human users. However, the vocabulary data model may be leveraged directly in a standard vocabulary API that uses the semantics provided by SKOS. SISSvoc3 [1] accomplishes this as a standard set of URI templates for a vocabulary. Any URI conforming to the template selects a vocabulary subset based on the SKOS properties, including labels (skos:prefLabel, skos:altLabel, rdfs:label) and a subset of the semantic relations (skos:broader, skos:narrower, etc.). SISSvoc3 thus provides a RESTful SKOS API to query a vocabulary while hiding the complexity of SPARQL. It has been implemented using the Linked Data API (LDA) [2], which connects to a SPARQL endpoint. By using LDA, we also get content negotiation, alternative views, paging, metadata and other functionality provided in a standard way.
A number of vocabularies have been formalized in SKOS and deployed by CSIRO, the Australian Bureau of Meteorology (BOM) and their collaborators using SISSvoc3, including: * geologic timescale (multiple versions) * soils classification * definitions from OGC standards * GeoSciML vocabularies * mining commodities * hyperspectral scalars. Several other agencies in Australia have adopted SISSvoc3 for their vocabularies. SISSvoc3 differs from other SKOS-based vocabulary-access APIs such as GEMET [3] and NVS [4] in that (a) the service is decoupled from the content store, and (b) the service URI is independent of the content URIs. This means that a SISSvoc3 interface can be deployed over any SKOS vocabulary which is available at a SPARQL endpoint. As an example, a SISSvoc3 query and presentation interface has been deployed over the NERC vocabulary service hosted by the BODC, providing a search interface which is not available natively. We use vocabulary services to populate menus in user interfaces, to support data validation, and to configure data conversion routines. Related services built on LDA have also been used as a generic registry interface, and extended for serving gazetteer information. ACKNOWLEDGEMENTS The CSIRO SISSvoc3 implementation is built using the Epimorphics ELDA platform http://code.google.com/p/elda/. We thank Jacqui Githaiga and Terry Rankine for their contributions to SISSvoc design and implementation. REFERENCES 1. SISSvoc3 Specification https://www.seegrid.csiro.au/wiki/Siss/SISSvoc30Specification 2. Linked Data API http://code.google.com/p/linked-data-api/wiki/Specification 3. GEMET https://svn.eionet.europa.eu/projects/Zope/wiki/GEMETWebServiceAPI 4. NVS 2.0 http://vocab.nerc.ac.uk/
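
    The URI-template idea behind SISSvoc — one fixed service URL pattern that selects concepts by SKOS label, with the SPARQL store hidden behind it — can be sketched without a triple store. The service URL, concept URIs, and labels below are invented; a real deployment answers such requests from a SPARQL endpoint via the Linked Data API:

```python
# Sketch of a SISSvoc-style label lookup, answered from an in-memory
# vocabulary instead of a SPARQL endpoint. All URIs/labels are invented.
from urllib.parse import quote

VOCAB = {
    "http://example.org/timescale/Jurassic": {
        "prefLabel": "Jurassic", "broader": "http://example.org/timescale/Mesozoic"},
    "http://example.org/timescale/Mesozoic": {
        "prefLabel": "Mesozoic", "broader": None},
}

def concept_uri(service, label):
    """Hypothetical URI template: <service>/resource?anylabel=<label>."""
    return f"{service}/resource?anylabel={quote(label)}"

def lookup(label):
    """Return concept URIs whose prefLabel matches, case-insensitively."""
    return [uri for uri, props in VOCAB.items()
            if props["prefLabel"].lower() == label.lower()]

print(concept_uri("http://vocab.example.org/geologicage", "Jurassic"))
print(lookup("jurassic"))
```

Because the template is decoupled from the content URIs, the same front end can sit over any SKOS vocabulary, which is the portability point the abstract makes about deploying SISSvoc3 over the NERC vocabulary service.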

  18. Google Voice: Connecting Your Telephone to the 21st Century

    ERIC Educational Resources Information Center

    Johnson, Benjamin E.

    2010-01-01

    The foundation of the mighty Google Empire rests upon an algorithm that connects people to information--things such as websites, maps, and restaurant reviews. Lately it seems that people are less interested in connecting with information than they are with connecting to one another, which begs the question, "Is Facebook the new Google?" Given this…

  19. Cultural Adventures for the Google[TM] Generation

    ERIC Educational Resources Information Center

    Dann, Tammy

    2010-01-01

    Google Earth is a computer program that allows users to view the Earth through satellite imagery and maps, to see cities from above and through street views, and to search for addresses and browse locations. Many famous buildings and structures from around the world have detailed 3D views accessible on Google Earth. It is possible to explore the…

  20. Visualizing Geographic Data in Google Earth for Education and Outreach

    NASA Astrophysics Data System (ADS)

    Martin, D. J.; Treves, R.

    2008-12-01

    Google Earth is an excellent tool to help students and the public visualize scientific data, since scientific content can be shown in three dimensions against a background of remotely sensed imagery with little technical skill. It therefore has a variety of uses in university education and as a tool for public outreach. However, in both situations it is of limited value if it is only used to attract attention with flashy three-dimensional animations. In this poster we shall illustrate several applications that represent what we believe is good educational practice. The first example shows how the combination of a floor map and a projection of Google Earth on a screen can be used to produce active learning: students are asked to imagine where they would build a house on Big Island, Hawaii, in order to avoid volcanic hazards. In the second example, Google Earth is used to illustrate evidence over a range of scales in a description of Lake Agassiz flood events, which would be more difficult to comprehend in a traditional paper-based format. In the final example, a simple text manipulation application, "TMapper", is used to change the color palette of a thematic map generated by the students in Google Earth, to teach them about the use of color in map design.

  1. KSC-2013-3236

    NASA Image and Video Library

    2013-08-09

    CAPE CANAVERAL, Fla. – As seen on Google Maps, the Rotating Service Structure at Launch Complex 39A at NASA's Kennedy Space Center housed space shuttle payloads temporarily so they could be loaded inside the 60-foot-long cargo bay of a shuttle before launch. The RSS, as the structure was known, was hinged to the Fixed Service Structure on one side and rolled on a rail on the other. As its name suggests, the enclosed facility would rotate into place around the shuttle as it stood at the launch pad. Once in place, the RSS protected the shuttle and its cargo. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang

  2. Active Fire Mapping Program

    MedlinePlus

    The Active Fire Mapping Program provides current and new large-incident information, fire detection maps, MODIS and VIIRS satellite imagery, fire detection GIS data, and fire data in Google Earth.

  3. Leveraging Open Standards and Technologies to Search and Display Planetary Image Data

    NASA Astrophysics Data System (ADS)

    Rose, M.; Schauer, C.; Quinol, M.; Trimble, J.

    2011-12-01

    Mars and the Moon have both been visited by multiple NASA spacecraft. A large number of images and other data have been gathered by the spacecraft and are publicly available in NASA's Planetary Data System. Through a collaboration with Google, Inc., the User Centered Technologies group at NASA Ames Research Center has developed a tool for searching and browsing among images from multiple Mars and Moon missions. Development of this tool was facilitated by the use of several open technologies and standards. First, an open-source full-text search engine is used both to search place names on the target body and to find images matching a geographic region. Second, the published API of the Google Earth browser plugin is used to geolocate the images on a virtual globe and allow the user to navigate on the globe to see related images. The structure of the application also employs standard protocols and services. The back-end is exposed as RESTful APIs, which could be reused by other client systems in the future. Further, the communication between the front- and back-end portions of the system utilizes open data standards, including XML and KML (Keyhole Markup Language), for representation of textual and geographic data. The creation of the search index was facilitated by reuse of existing, publicly available metadata, including the Gazetteer of Planetary Nomenclature from the USGS, available in KML format. And the image metadata was reused from standards-compliant archives in the Planetary Data System. The system also supports collaboration with other tools by allowing export of search results in KML, and the ability to display those results in the Google Earth desktop application. We will demonstrate the search and visualization capabilities of the system, with emphasis on how the system facilitates reuse of data and services through the adoption of open standards.

  4. Virtual Field Trips: Using Google Maps to Support Online Learning and Teaching of the History of Astronomy

    ERIC Educational Resources Information Center

    Fluke, Christopher J.

    2009-01-01

    I report on a pilot study on the use of Google Maps to provide virtual field trips as a component of a wholly online graduate course on the history of astronomy. The Astronomical Tourist Web site (http://astronomy.swin.edu.au/sao/tourist), themed around the role that specific locations on Earth have contributed to the development of astronomical…

  5. Spatio-temporal Change Patterns of Tropical Forests from 2000 to 2014 Using MOD09A1 Dataset

    NASA Astrophysics Data System (ADS)

    Qin, Y.; Xiao, X.; Dong, J.

    2016-12-01

    Large-scale deforestation and forest degradation in the tropical region have resulted in extensive carbon emissions and biodiversity loss. However, restricted by the availability of good-quality observations, large uncertainty exists in mapping the spatial distribution of forests and their spatio-temporal changes. In this study, we proposed a pixel- and phenology-based algorithm to identify and map annual tropical forests from 2000 to 2014, using the 8-day, 500-m MOD09A1 (v005) product, with the support of Google cloud computing (Google Earth Engine). A temporal filter was applied to reduce random noise and to identify the spatio-temporal changes of forests. We then built a confusion matrix and assessed the accuracy of the annual forest maps against ground reference data interpreted from high spatial resolution images in Google Earth. The resultant forest maps showed consistent forest/non-forest, forest loss, and forest gain in the pan-tropical zone during 2000-2014. The proposed algorithm shows the potential for tropical forest mapping, and the resultant forest maps are important for the estimation of carbon emissions and biodiversity loss.
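
    The accuracy assessment step described above is a standard confusion-matrix calculation: map classifications are tabulated against ground reference samples, and overall, producer's, and user's accuracies are read off the table. A minimal sketch with invented pixel counts:

```python
# Sketch: accuracy metrics from a forest/non-forest confusion matrix.
# Counts are illustrative only, not the study's results.
def accuracy_metrics(matrix, labels):
    """matrix[i][j] = reference class i pixels classified as map class j."""
    total = sum(sum(row) for row in matrix)
    correct = sum(matrix[i][i] for i in range(len(labels)))
    overall = correct / total
    # producer's accuracy: correct / reference total (row sum)
    producers = {labels[i]: matrix[i][i] / sum(matrix[i]) for i in range(len(labels))}
    # user's accuracy: correct / map total (column sum)
    users = {labels[j]: matrix[j][j] / sum(row[j] for row in matrix)
             for j in range(len(labels))}
    return overall, producers, users

labels = ["forest", "non-forest"]
matrix = [[90, 10],   # reference forest pixels
          [5, 95]]    # reference non-forest pixels
overall, prod, user = accuracy_metrics(matrix, labels)
print(f"overall accuracy: {overall:.2%}")
```

Producer's accuracy penalizes omission errors (reference forest mapped as non-forest) while user's accuracy penalizes commission errors, so reporting both gives a fuller picture than the overall figure alone.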

  6. Predicting plant attractiveness to pollinators with passive crowdsourcing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bahlai, Christie A.; Landis, Douglas A.

    Global concern regarding pollinator decline has intensified interest in enhancing pollinator resources in managed landscapes. These efforts frequently emphasize restoration or planting of flowering plants to provide pollen and nectar resources that are highly attractive to the desired pollinators. However, determining exactly which plant species should be used to enhance a landscape is difficult. Empirical screening of plants for such purposes is logistically daunting, but could be streamlined by crowdsourcing data to create lists of plants most probable to attract the desired pollinator taxa. People frequently photograph plants in bloom and the Internet has become a vast repository of such images. A proportion of these images also capture floral visitation by arthropods. Here, we test the hypothesis that the abundance of floral images containing identifiable pollinator and other beneficial insects is positively associated with the observed attractiveness of the same species in controlled field trials from previously published studies. We used Google Image searches to determine the correlation of pollinator visitation captured by photographs on the Internet relative to the attractiveness of the same species in common-garden field trials for 43 plant species. From the first 30 photographs which successfully identified the plant, we recorded the number of Apis (managed honeybees), the number of non-Apis bees (exclusively wild bees) and the number of bee-mimicking syrphid flies. We used these observations from search hits as well as bloom period (BP) as predictor variables in Generalized Linear Models (GLMs) for field-observed abundances of each of these groups. We found that non-Apis bees observed in controlled field trials were positively associated with observations of these taxa in Google Image searches (pseudo-R2 of 0.668). Syrphid fly observations in the field were also associated with the frequency they were observed in images, but this relationship was weak.
Apis bee observations were not associated with Internet images, but were slightly associated with BP. Our results suggest that passively crowdsourced image data can potentially be a useful screening tool to identify candidate plants for pollinator habitat restoration efforts directed at wild bee conservation. Increasing our understanding of the attractiveness of a greater diversity of plants increases the potential for more rapid and efficient research in creating pollinator-supportive landscapes.

  7. Predicting plant attractiveness to pollinators with passive crowdsourcing

    DOE PAGES

    Bahlai, Christie A.; Landis, Douglas A.

    2016-06-01

    Global concern regarding pollinator decline has intensified interest in enhancing pollinator resources in managed landscapes. These efforts frequently emphasize restoration or planting of flowering plants to provide pollen and nectar resources that are highly attractive to the desired pollinators. However, determining exactly which plant species should be used to enhance a landscape is difficult. Empirical screening of plants for such purposes is logistically daunting, but could be streamlined by crowdsourcing data to create lists of plants most probable to attract the desired pollinator taxa. People frequently photograph plants in bloom and the Internet has become a vast repository of such images. A proportion of these images also capture floral visitation by arthropods. Here, we test the hypothesis that the abundance of floral images containing identifiable pollinator and other beneficial insects is positively associated with the observed attractiveness of the same species in controlled field trials from previously published studies. We used Google Image searches to determine the correlation of pollinator visitation captured by photographs on the Internet relative to the attractiveness of the same species in common-garden field trials for 43 plant species. From the first 30 photographs which successfully identified the plant, we recorded the number of Apis (managed honeybees), the number of non-Apis bees (exclusively wild bees) and the number of bee-mimicking syrphid flies. We used these observations from search hits as well as bloom period (BP) as predictor variables in Generalized Linear Models (GLMs) for field-observed abundances of each of these groups. We found that non-Apis bees observed in controlled field trials were positively associated with observations of these taxa in Google Image searches (pseudo-R2 of 0.668). Syrphid fly observations in the field were also associated with the frequency they were observed in images, but this relationship was weak.
Apis bee observations were not associated with Internet images, but were slightly associated with BP. Our results suggest that passively crowdsourced image data can potentially be a useful screening tool to identify candidate plants for pollinator habitat restoration efforts directed at wild bee conservation. Increasing our understanding of the attractiveness of a greater diversity of plants increases the potential for more rapid and efficient research in creating pollinator-supportive landscapes.« less

  8. Predicting plant attractiveness to pollinators with passive crowdsourcing

    PubMed Central

    Bahlai, Christie A.; Landis, Douglas A.

    2016-01-01

    Global concern regarding pollinator decline has intensified interest in enhancing pollinator resources in managed landscapes. These efforts frequently emphasize restoration or planting of flowering plants to provide pollen and nectar resources that are highly attractive to the desired pollinators. However, determining exactly which plant species should be used to enhance a landscape is difficult. Empirical screening of plants for such purposes is logistically daunting, but could be streamlined by crowdsourcing data to create lists of plants most probable to attract the desired pollinator taxa. People frequently photograph plants in bloom and the Internet has become a vast repository of such images. A proportion of these images also capture floral visitation by arthropods. Here, we test the hypothesis that the abundance of floral images containing identifiable pollinator and other beneficial insects is positively associated with the observed attractiveness of the same species in controlled field trials from previously published studies. We used Google Image searches to determine the correlation of pollinator visitation captured by photographs on the Internet relative to the attractiveness of the same species in common-garden field trials for 43 plant species. From the first 30 photographs, which successfully identified the plant, we recorded the number of Apis (managed honeybees), non-Apis (exclusively wild bees) and the number of bee-mimicking syrphid flies. We used these observations from search hits as well as bloom period (BP) as predictor variables in Generalized Linear Models (GLMs) for field-observed abundances of each of these groups. We found that non-Apis bees observed in controlled field trials were positively associated with observations of these taxa in Google Image searches (pseudo-R2 of 0.668). Syrphid fly observations in the field were also associated with the frequency they were observed in images, but this relationship was weak. 
Apis bee observations were not associated with Internet images, but were slightly associated with BP. Our results suggest that passively crowdsourced image data can potentially be a useful screening tool to identify candidate plants for pollinator habitat restoration efforts directed at wild bee conservation. Increasing our understanding of the attractiveness of a greater diversity of plants increases the potential for more rapid and efficient research in creating pollinator-supportive landscapes. PMID:27429762
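    The pseudo-R2 reported above can be illustrated with McFadden's definition for a count-data GLM. A minimal Python sketch, assuming a Poisson model; the counts and fitted means below are invented for illustration and are not the authors' data or model specification:

```python
import math

def poisson_loglik(counts, means):
    """Poisson log-likelihood of observed counts given fitted means."""
    return sum(y * math.log(mu) - mu - math.lgamma(y + 1)
               for y, mu in zip(counts, means))

def mcfadden_pseudo_r2(counts, fitted_means):
    """McFadden pseudo-R2: 1 - LL(model) / LL(null), where the null
    model fits only a single mean (the overall average count)."""
    null_mean = sum(counts) / len(counts)
    ll_null = poisson_loglik(counts, [null_mean] * len(counts))
    ll_model = poisson_loglik(counts, fitted_means)
    return 1.0 - ll_model / ll_null
```

A model whose fitted means equal the observed counts scores close to 1, while a model no better than the overall mean scores 0.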

  9. Global positioning system & Google Earth in the investigation of an outbreak of cholera in a village of Bengaluru Urban district, Karnataka.

    PubMed

    Masthi, N R Ramesh; Madhusudan, M; Puthussery, Yannick P

    2015-11-01

    The global positioning system (GPS) technology along with Google Earth is used to map the accurate distribution of morbidity and mortality and to plan interventions in the community. We used this technology to find out its role in the investigation of a cholera outbreak, and also to identify the cause of the outbreak. This study was conducted in a village near Bengaluru, Karnataka in June 2013 during a cholera outbreak. A house-to-house survey was done to identify acute watery diarrhoea cases. A handheld GPS receiver was used to record north and east coordinates of the households of cases and these values were subsequently plotted on Google Earth map. Water samples were collected from suspected sources for microbiological analysis. A total of 27 cases of acute watery diarrhoea were reported. Fifty per cent of cases were in the age group of 14-44 yr and one death was reported. GPS technology and Google Earth accurately described the locations of case households, and the spot map generated showed clustering of cases around the suspected water sources. The attack rate was 6.92 per cent and the case fatality rate was 3.7 per cent. Water samples collected from suspected sources showed the presence of Vibrio cholerae O1 Ogawa. GPS technology and Google Earth were easy to use and helpful in accurately pinpointing the locations of case households, constructing the spot map, and following up cases. The outbreak was found to be due to contamination of drinking water sources.

  10. KSC-2013-3237

    NASA Image and Video Library

    2013-08-09

    CAPE CANAVERAL, Fla. – As seen on Google Maps, the view from the top of the Fixed Service Structure at Launch Complex 39A at NASA's Kennedy Space Center. The FSS, as the structure is known, is 285 feet high and overlooks the Rotating Service Structure that was rolled into place when a space shuttle was at the pad. The path taken by NASA's massive crawler-transporters, which carried the shuttle stack 3 miles from the Vehicle Assembly Building, is also visible leading up to the launch pad. In the distance are the launch pads and support structures at Cape Canaveral Air Force Station for the Atlas V, Delta IV and Falcon 9 rockets. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang

  11. KSC-2013-3240

    NASA Image and Video Library

    2013-08-09

    CAPE CANAVERAL, Fla. – As seen on Google Maps, space shuttle Endeavour goes through transition and retirement processing in high bay 4 of the Vehicle Assembly Building at NASA's Kennedy Space Center. The spacecraft completed 25 missions beginning with its first flight, STS-49, in May 1992, and ending with STS-134 in May 2011. It helped construct the International Space Station and travelled more than 122 million miles in orbit during its career. The reaction control system pods in the shuttle's nose and aft section were removed for processing before Endeavour was put on public display at the California Science Center in Los Angeles. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang

  12. Assessing species habitat using Google Street View: a case study of cliff-nesting vultures.

    PubMed

    Olea, Pedro P; Mateo-Tomás, Patricia

    2013-01-01

    The assessment of a species' habitat is a crucial issue in ecology and conservation. While the collection of habitat data has been boosted by the availability of remote sensing technologies, data on certain habitat types must still be collected through costly, on-ground surveys, limiting study over large areas. Cliffs are ecosystems that provide habitat for a rich biodiversity, especially raptors. Because of their principally vertical structure, however, cliffs are not easy to study by remote sensing technologies, posing a challenge for many researchers and managers working with cliff-related biodiversity. We explore the feasibility of Google Street View, a freely available on-line tool, to remotely identify and assess the nesting habitat of two cliff-nesting vultures (the griffon vulture and the globally endangered Egyptian vulture) in northwestern Spain. Two main uses of Google Street View for ecologists and conservation biologists were evaluated: i) remotely identifying a species' potential habitat and ii) extracting fine-scale habitat information. Google Street View imagery covered 49% (1,907 km) of the roads of our study area (7,000 km²). The potential visibility covered by on-ground surveys was significantly greater (mean: 97.4%) than that of Google Street View (48.1%). However, incorporating Google Street View into the vulture's habitat survey would save, on average, 36% in time and 49.5% in funds with respect to the on-ground survey only. The ability of Google Street View to identify cliffs (overall accuracy = 100%) outperformed the classification maps derived from digital elevation models (DEMs) (62-95%). Nonetheless, high-performance DEM maps may be useful to compensate for Google Street View coverage limitations. Through Google Street View we could examine 66% of the vultures' nesting-cliffs existing in the study area (n = 148): 64% from griffon vultures and 65% from Egyptian vultures. It also allowed us to extract fine-scale features of cliffs. This World Wide Web-based methodology may be a useful, complementary tool to remotely map and assess the potential habitat of cliff-dependent biodiversity over large geographic areas, saving survey-related costs.
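    The DEM-derived cliff classification that the study compares against can be approximated by thresholding local slope. A simplified sketch; the elevation grid, cell size, and 50-degree threshold are illustrative assumptions, not the paper's parameters:

```python
import math

def max_slope_deg(dem, i, j, cell_size):
    """Steepest slope (degrees) from cell (i, j) to its 8 neighbours."""
    steepest = 0.0
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == dj == 0:
                continue
            ni, nj = i + di, j + dj
            if 0 <= ni < len(dem) and 0 <= nj < len(dem[0]):
                run = cell_size * math.hypot(di, dj)   # horizontal distance
                rise = abs(dem[ni][nj] - dem[i][j])    # elevation change
                steepest = max(steepest, math.degrees(math.atan2(rise, run)))
    return steepest

def cliff_cells(dem, cell_size=25.0, slope_threshold=50.0):
    """Flag DEM cells whose steepest neighbour slope exceeds the threshold."""
    return [(i, j)
            for i in range(len(dem))
            for j in range(len(dem[0]))
            if max_slope_deg(dem, i, j, cell_size) >= slope_threshold]
```

On a flat grid no cell is flagged; a 200 m elevation jump between adjacent 25 m cells is flagged as a cliff candidate.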

  13. Assessing Species Habitat Using Google Street View: A Case Study of Cliff-Nesting Vultures

    PubMed Central

    Olea, Pedro P.; Mateo-Tomás, Patricia

    2013-01-01

    The assessment of a species’ habitat is a crucial issue in ecology and conservation. While the collection of habitat data has been boosted by the availability of remote sensing technologies, certain habitat types have yet to be collected through costly, on-ground surveys, limiting study over large areas. Cliffs are ecosystems that provide habitat for a rich biodiversity, especially raptors. Because of their principally vertical structure, however, cliffs are not easy to study by remote sensing technologies, posing a challenge for many researches and managers working with cliff-related biodiversity. We explore the feasibility of Google Street View, a freely available on-line tool, to remotely identify and assess the nesting habitat of two cliff-nesting vultures (the griffon vulture and the globally endangered Egyptian vulture) in northwestern Spain. Two main usefulness of Google Street View to ecologists and conservation biologists were evaluated: i) remotely identifying a species’ potential habitat and ii) extracting fine-scale habitat information. Google Street View imagery covered 49% (1,907 km) of the roads of our study area (7,000 km2). The potential visibility covered by on-ground surveys was significantly greater (mean: 97.4%) than that of Google Street View (48.1%). However, incorporating Google Street View to the vulture’s habitat survey would save, on average, 36% in time and 49.5% in funds with respect to the on-ground survey only. The ability of Google Street View to identify cliffs (overall accuracy = 100%) outperformed the classification maps derived from digital elevation models (DEMs) (62–95%). Nonetheless, high-performance DEM maps may be useful to compensate Google Street View coverage limitations. Through Google Street View we could examine 66% of the vultures’ nesting-cliffs existing in the study area (n = 148): 64% from griffon vultures and 65% from Egyptian vultures. It also allowed us the extraction of fine-scale features of cliffs. 
This World Wide Web-based methodology may be a useful, complementary tool to remotely map and assess the potential habitat of cliff-dependent biodiversity over large geographic areas, saving survey-related costs. PMID:23355880

  14. archAR: an archaeological augmented reality experience

    NASA Astrophysics Data System (ADS)

    Wiley, Bridgette; Schulze, Jürgen P.

    2015-03-01

    We present an application for Android phones or tablets called "archAR" that uses augmented reality as an alternative, portable way of viewing archaeological information from UCSD's Levantine Archaeology Laboratory. archAR provides a unique experience of flying through an archaeological dig site in the Levantine area and exploring the artifacts uncovered there. Using a Google Nexus tablet and Qualcomm's Vuforia API, we use an image target as a map and overlay a three-dimensional model of the dig site onto it, augmenting reality such that we are able to interact with the plotted artifacts. The user can physically move the Android device around the image target and see the dig site model from any perspective. The user can also move the device closer to the model in order to "zoom" into the view of a particular section of the model and its associated artifacts. This is especially useful, as the dig site model and the collection of artifacts are very detailed. The artifacts are plotted as points, colored by type. The user can touch the virtual points to trigger a popup information window that contains details of the artifact, such as photographs, material descriptions, and more.

  15. NASA Earth Observations (NEO): Data Imagery for Education and Visualization

    NASA Astrophysics Data System (ADS)

    Ward, K.

    2008-12-01

    NASA Earth Observations (NEO) has dramatically simplified public access to georeferenced imagery of NASA remote sensing data. NEO targets the non-traditional data users who are currently underserved by functionality and formats available from the existing data ordering systems. These users include formal and informal educators, museum and science center personnel, professional communicators, and citizen scientists. NEO currently serves imagery from 45 different datasets with daily, weekly, and/or monthly temporal resolutions, with more datasets currently under development. The imagery from these datasets is produced in coordination with several data partners who are affiliated either with the instrument science teams or with the respective data processing center. NEO is a system of three components -- website, WMS (Web Mapping Service), and ftp archive -- which together are able to meet the wide-ranging needs of our users. Some of these needs include the ability to: view and manipulate imagery using the NEO website -- e.g., applying color palettes, resizing, exporting to a variety of formats including PNG, JPEG, KMZ (Google Earth), GeoTIFF; access the NEO collection via a standards-based API (WMS); and create customized exports for select users (ftp archive) such as Science on a Sphere, NASA's Earth Observatory, and others.
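    A WMS request like those NEO serves can be assembled as a plain URL with standard GetMap parameters. A minimal sketch; the endpoint and layer name below are placeholders, not actual NEO identifiers:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=1024, height=512,
                   crs="CRS:84", fmt="image/png"):
    """Assemble a standard WMS 1.3.0 GetMap request URL.

    bbox is (min_x, min_y, max_x, max_y) in the given CRS.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)
```

Any WMS client (or a browser) can then fetch the resulting URL as a rendered map image.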

  16. Genetics Home Reference: cri-du-chat syndrome

    MedlinePlus

    ... Pinkel D. High-resolution mapping of genotype-phenotype relationships in cri du chat syndrome using array comparative ...

  17. A method for creating a three-dimensional model from published geologic maps and cross sections

    USGS Publications Warehouse

    Walsh, Gregory J.

    2009-01-01

    This brief report presents a relatively inexpensive and rapid method for creating a 3D model of geology from published quadrangle-scale maps and cross sections using Google Earth and Google SketchUp software. An example from the Green Mountains of Vermont, USA, is used to illustrate the step-by-step methods used to create such a model. A second example is provided from the Jebel Saghro region of the Anti-Atlas Mountains of Morocco. The report was published to help enhance the public's ability to use and visualize geologic map data.
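    Draping a scanned map over terrain in Google Earth, the first step of such a workflow, uses a KML GroundOverlay. A minimal generator; the bounding coordinates and file name are invented examples:

```python
def ground_overlay_kml(name, image_href, north, south, east, west):
    """Return a minimal KML document draping an image (e.g. a scanned
    geologic map) over terrain in Google Earth via <GroundOverlay>."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>{name}</name>
    <Icon><href>{image_href}</href></Icon>
    <LatLonBox>
      <north>{north}</north>
      <south>{south}</south>
      <east>{east}</east>
      <west>{west}</west>
    </LatLonBox>
  </GroundOverlay>
</kml>"""
```

Saving the returned string as a `.kml` file and opening it in Google Earth displays the image georeferenced to the given bounds.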

  18. The impact of geo-tagging on the photo industry and creating revenue streams

    NASA Astrophysics Data System (ADS)

    Richter, Rolf; Böge, Henning; Weckmann, Christoph; Schloen, Malte

    2010-02-01

    Internet geo and mapping services like Google Maps, Google Earth and Microsoft Bing Maps have reinvented the use of geographical information and have reached enormous popularity. Besides that, location technologies like GPS have become affordable and are now being integrated in many camera phones. GPS is also available for standalone cameras as add-on products or integrated in cameras. These developments enable new products for the photo industry and enhance existing ones. New commercial opportunities have been identified in the areas of photo hardware, internet/software and photo finishing.
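    Geo-tagging stores coordinates in EXIF as degree/minute/second values plus a hemisphere reference; converting them to the signed decimal degrees that mapping services expect is a small calculation:

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds plus a hemisphere
    reference ('N', 'S', 'E', 'W') to signed decimal degrees."""
    decimal = degrees + minutes / 60.0 + seconds / 3600.0
    # South and West hemispheres are negative in decimal notation.
    return -decimal if ref in ("S", "W") else decimal
```

For example, 52° 30' 0" N becomes 52.5, and a western longitude comes out negative.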

  19. Web GIS in practice V: 3-D interactive and real-time mapping in Second Life

    PubMed Central

    Boulos, Maged N Kamel; Burden, David

    2007-01-01

    This paper describes technologies from Daden Limited for geographically mapping and accessing live news stories/feeds, as well as other real-time, real-world data feeds (e.g., Google Earth KML feeds and GeoRSS feeds) in the 3-D virtual world of Second Life, by plotting and updating the corresponding Earth location points on a globe or some other suitable form (in-world), and further linking those points to relevant information and resources. This approach enables users to visualise, interact with, and even walk or fly through, the plotted data in 3-D. Users can also do the reverse: put pins on a map in the virtual world, and then view the data points on the Web in Google Maps or Google Earth. The technologies presented thus serve as a bridge between mirror worlds like Google Earth and virtual worlds like Second Life. We explore the geo-data display potential of virtual worlds and their likely convergence with mirror worlds in the context of the future 3-D Internet or Metaverse, and reflect on the potential of such technologies and their future possibilities, e.g. their use to develop emergency/public health virtual situation rooms to effectively manage emergencies and disasters in real time. The paper also covers some of the issues associated with these technologies, namely user interface accessibility and individual privacy. PMID:18042275
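    Plotting an Earth location point on an in-world globe amounts to mapping latitude/longitude onto a sphere. A minimal sketch; the globe radius is an arbitrary in-world unit, not anything from Daden's implementation:

```python
import math

def globe_xyz(lat_deg, lon_deg, radius=10.0):
    """Map a latitude/longitude pair onto a sphere of the given radius,
    e.g. for positioning a data point on an in-world globe model."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    x = radius * math.cos(lat) * math.cos(lon)
    y = radius * math.cos(lat) * math.sin(lon)
    z = radius * math.sin(lat)
    return (x, y, z)
```

The north pole lands on top of the sphere and the equator/prime-meridian intersection on its x-axis, so plotted feed items keep their geographic relationships.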

  20. Caltrans - California Department of Transportation

    Science.gov Websites

    Caltrans QuickMap and QuickMap Mobile let users check current highway conditions by entering a highway number. The QuickMap Android app is available on Google Play and the iOS app on the Apple App Store; QuickMap is offered in mobile and full versions.

  1. Direct visualization of in vitro drug mobilization from Lescol XL tablets using two-dimensional (19)F and (1)H magnetic resonance imaging.

    PubMed

    Chen, Chen; Gladden, Lynn F; Mantle, Michael D

    2014-02-03

    This article reports the application of in vitro multinuclear ((19)F and (1)H) two-dimensional magnetic resonance imaging (MRI) to study both dissolution media ingress and drug egress from a commercial Lescol XL extended release tablet in a United States Pharmacopeia Type IV (USP-IV) dissolution cell under pharmacopoeial conditions. Noninvasive spatial maps of tablet swelling and dissolution, as well as the mobilization and distribution of the drug are quantified and visualized. Two-dimensional active pharmaceutical ingredient (API) mobilization and distribution maps were obtained via (19)F MRI. (19)F API maps were coregistered with (1)H T2-relaxation time maps enabling the simultaneous visualization of drug distribution and gel layer dynamics within the swollen tablet. The behavior of the MRI data is also discussed in terms of its relationship to the UV drug release behavior.

  2. Standardized mappings--a framework to combine different semantic mappers into a standardized web-API.

    PubMed

    Neuhaus, Philipp; Doods, Justin; Dugas, Martin

    2015-01-01

    Automatic coding of medical terms is an important, but highly complicated and laborious task. To compare and evaluate different strategies a framework with a standardized web-interface was created. Two UMLS mapping strategies are compared to demonstrate the interface. The framework is a Java Spring application running on a Tomcat application server. It accepts different parameters and returns results in JSON format. To demonstrate the framework, a list of medical data items was mapped by two different methods: similarity search in a large table of terminology codes versus search in a manually curated repository. These mappings were reviewed by a specialist. The evaluation shows that the framework is flexible (due to standardized interfaces like HTTP and JSON), performant and reliable. Accuracy of automatically assigned codes is limited (up to 40%). Combining different semantic mappers into a standardized Web-API is feasible. This framework can be easily enhanced due to its modular design.
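    The standardized-response idea can be sketched as follows, with a toy exact-match lookup standing in for a real semantic mapper. The JSON field names, the code table, and the example UMLS-style code are all illustrative, not the framework's actual schema:

```python
import json

def similarity_mapper(term, code_table):
    """Toy mapper: exact, case-insensitive lookup in a code table
    (a stand-in for the similarity search described above)."""
    return code_table.get(term.lower())

def map_term(term, mappers):
    """Run every registered mapper and wrap all results in one
    standardized JSON structure, as a web-API endpoint might."""
    results = [{"mapper": name, "term": term, "code": fn(term)}
               for name, fn in mappers.items()]
    return json.dumps({"query": term, "results": results})
```

Because every mapper returns through the same JSON envelope, callers can swap strategies or compare them without changing client code.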

  3. GIS Technologies For The New Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Docasal, R.; Barbarisi, I.; Rios, C.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; De Marchi, G.; Martinez, S.; Grotheer, E.; Lim, T.; Besse, S.; Heather, D.; Fraga, D.; Barthelemy, M.

    2015-12-01

    Geographical information systems (GIS) are increasingly used in planetary science. GIS are computerised systems for the storage, retrieval, manipulation, analysis, and display of geographically referenced data. Some data stored in the Planetary Science Archive (PSA), for instance, a set of Mars Express/Venus Express data, have spatial metadata associated with them. To facilitate users in handling and visualising spatial data in GIS applications, the new PSA should support interoperability with interfaces implementing the standards approved by the Open Geospatial Consortium (OGC). These standards are followed in order to develop open interfaces and encodings that allow data to be exchanged with GIS client applications, well-known examples of which are Google Earth and NASA World Wind as well as open source tools such as OpenLayers. The technology already exists within PostgreSQL databases to store searchable geometrical data in the form of the PostGIS extension. GeoServer is an existing open source map server; an instance deployed for the new PSA uses the OGC standards to allow, among other things, the sharing, processing and editing of spatial data through the Web Feature Service (WFS) standard, as well as serving georeferenced map images through the Web Map Service (WMS). The final goal of the new PSA, being developed by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is to create an archive which enables science exploitation of ESA's planetary missions datasets. This can be facilitated through the GIS framework, offering interfaces (both web GUI and scriptable APIs) that the community can use more easily and scientifically, and that will also enable the community to build added value services on top of the PSA.
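    At their simplest, the spatial searches such an archive enables reduce to footprint/bounding-box intersection tests. A minimal sketch; the product IDs and footprints are invented, and a real WFS/PostGIS query would of course handle geometry far more generally:

```python
def bbox_intersects(a, b):
    """True if two (min_lon, min_lat, max_lon, max_lat) boxes overlap."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def query_footprints(products, search_box):
    """Return products whose spatial footprint intersects the search box,
    mimicking a WFS-style spatial filter over archived observations."""
    return [p["id"] for p in products if bbox_intersects(p["bbox"], search_box)]
```

A request for a region of interest then returns only the observations whose footprints touch it.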

  4. Novel data sources for women's health research: mapping breast screening online information seeking through Google trends.

    PubMed

    Fazeli Dehkordy, Soudabeh; Carlos, Ruth C; Hall, Kelli S; Dalton, Vanessa K

    2014-09-01

    Millions of people use online search engines every day to find health-related information and voluntarily share their personal health status and behaviors in various Web sites. Thus, data from tracking of online information seekers' behavior offer potential opportunities for use in public health surveillance and research. Google Trends is a feature of Google that allows Internet users to graph the frequency of searches for a single term or phrase over time or by geographic region. We used Google Trends to describe patterns of information-seeking behavior on the subject of dense breasts and to examine their correlation with the passage or introduction of dense breast notification legislation. To capture the temporal variations of information seeking about dense breasts, the Web search query "dense breast" was entered in the Google Trends tool. We then mapped the dates of legislative actions regarding dense breasts that received widespread coverage in the lay media to information-seeking trends about dense breasts over time. Newsworthy events and legislative actions appear to correlate well with peaks in search volume of "dense breast". Geographic regions with the highest search volumes have passed, denied, or are currently considering dense breast legislation. Our study demonstrated that legislative actions and respective news coverage correlate with increases in information seeking for "dense breast" on Google, suggesting that Google Trends has the potential to serve as a data source for policy-relevant research. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
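    The kind of correlation the study describes can be illustrated by comparing a search-volume series with a 0/1 event indicator. A sketch with invented weekly values (not the study's data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical weekly search volumes and a 0/1 indicator of weeks
# with legislative action or related news coverage.
volume = [10, 12, 11, 60, 55, 14, 13, 70, 12, 11]
events = [0,  0,  0,  1,  1,  0,  0,  1,  0,  0]
```

With volume spikes landing on event weeks, the correlation comes out strongly positive, mirroring the peak-alignment pattern the authors report qualitatively.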

  5. Google's Geo Education Outreach: Results and Discussion of Outreach Trip to Alaskan High Schools.

    NASA Astrophysics Data System (ADS)

    Kolb, E. J.; Bailey, J.; Bishop, A.; Cain, J.; Goddard, M.; Hurowitz, K.; Kennedy, K.; Ornduff, T.; Sfraga, M.; Wernecke, J.

    2008-12-01

    The focus of Google's Geo Education outreach efforts (http://www.google.com/educators/geo.html) is on helping primary, secondary, and post-secondary educators incorporate Google Earth and Sky, Google Maps, and SketchUp into their classroom lessons. In partnership with the University of Alaska, our Geo Education team members visited several remote Alaskan high schools during a one-week period in September. At each school, we led several 40-minute hands-on learning sessions in which Google products were used by the students to investigate local geologic and environmental processes. For the teachers, we provided several resources including follow-on lesson plans, example KML-based lessons, useful URL's, and website resources that multiple users can contribute to. This talk will highlight results of the trip and discuss how educators can access and use Google's Geo Education resources.
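    KML-based lessons like those mentioned can be generated programmatically as Placemark documents students open in Google Earth. A minimal builder; the site name and coordinates are invented examples, not material from the outreach trip:

```python
def lesson_kml(placemarks):
    """Build a KML document of Placemarks (name, lon, lat), such as local
    geologic features for students to explore in Google Earth."""
    body = "\n".join(
        f"  <Placemark><name>{name}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
        for name, lon, lat in placemarks)
    return ("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
            "<kml xmlns=\"http://www.opengis.net/kml/2.2\">\n<Document>\n"
            + body + "\n</Document>\n</kml>")
```

Note that KML orders coordinates longitude-first, the reverse of the usual lat/lon convention.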

  6. Geospatial Data Science Applications and Visualizations | Geospatial Data

    Science.gov Websites

    Since before the time of Google Maps, NREL has used the internet to allow stakeholders to view and explore geospatial data. Around the world, these maps drive understanding. See our collection of key maps for examples.

  7. KSC-2013-3234

    NASA Image and Video Library

    2013-08-09

    CAPE CANAVERAL, Fla. – As seen on Google Maps, Firing Room 3 inside the Launch Control Center at NASA's Kennedy Space Center was one of the four control rooms used by NASA and contractor launch teams to oversee a space shuttle countdown. This firing room is furnished in the classic style with the same metal computer cabinets and some of the same monitors in place when the first shuttle mission launched April 12, 1981. Specialized operators worked at consoles tailored to keep track of the status of shuttle systems while the spacecraft was processed in the Orbiter Processing Facility, being stacked inside the Vehicle Assembly Building and standing at the launch pad before liftoff. The firing rooms, including 3, were also used during NASA's Apollo Program. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang

  8. An interactive GIS based tool on Chinese history and its topography

    NASA Astrophysics Data System (ADS)

    Konda, Ashish Reddy

    The aim of the thesis is to demonstrate how China was attacked by foreign powers, the rise and fall of the empires, the border conflicts with India, Russia and Vietnam, and territorial disputes in the South China Sea. This thesis is focused on creating a GIS tool showcasing modern Chinese history, including the major wars fought during that period. The tool is developed using the features of Google Maps to show the locations of the wars. The topography of China is also represented on the interactive Google Map by creating layers for rivers, mountain ranges and deserts. The provinces with the highest population are represented on the Google Map with circles. The application also shows the historical events in chronological order using a timeline feature. This has been implemented using jQuery, JavaScript, HTML5 and CSS. Chinese culture and biographies of important leaders are also included in this thesis, embedded with pictures and videos.
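    One common way to draw population circles like those described is to scale circle area, not radius, with the mapped value. A sketch emitting a GeoJSON-style feature; the coordinates, population figure and scale factor below are illustrative, not values from the thesis:

```python
import math

def population_circle(lon, lat, population, scale=0.002):
    """GeoJSON-style point feature whose rendered circle radius (pixels)
    scales with the square root of population, so circle *area* is
    proportional to the value being mapped."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {
            "population": population,
            "radius_px": round(scale * math.sqrt(population), 1),
        },
    }
```

Square-root scaling avoids the perceptual exaggeration that linear radius scaling gives large provinces.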

  9. The Ruby UCSC API: accessing the UCSC genome database using Ruby.

    PubMed

    Mishima, Hiroyuki; Aerts, Jan; Katayama, Toshiaki; Bonnal, Raoul J P; Yoshiura, Koh-ichiro

    2012-09-21

    The University of California, Santa Cruz (UCSC) genome database is among the most used sources of genomic annotation in human and other organisms. The database offers an excellent web-based graphical user interface (the UCSC genome browser) and several means for programmatic queries. A simple application programming interface (API) in a scripting language aimed at the biologist was however not yet available. Here, we present the Ruby UCSC API, a library to access the UCSC genome database using Ruby. The API is designed as a BioRuby plug-in and built on the ActiveRecord 3 framework for the object-relational mapping, making writing SQL statements unnecessary. The current version of the API supports databases of all organisms in the UCSC genome database including human, mammals, vertebrates, deuterostomes, insects, nematodes, and yeast. The API uses the bin index, if available, when querying for genomic intervals. The API also supports genomic sequence queries using locally downloaded *.2bit files that are not stored in the official MySQL database. The API is implemented in pure Ruby and is therefore available in different environments and with different Ruby interpreters (including JRuby). Assisted by the straightforward object-oriented design of Ruby and ActiveRecord, the Ruby UCSC API will enable biologists to query the UCSC genome database programmatically. The API is available through the RubyGem system. Source code and documentation are available at https://github.com/misshie/bioruby-ucsc-api/ under the Ruby license. Feedback and help are provided via the website at http://rubyucscapi.userecho.com/.
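    The bin index mentioned is the UCSC genome browser's standard binning scheme (128 kb leaf bins, with levels growing by a factor of 8). A Python transcription of the widely documented algorithm; the API itself is Ruby, so this only illustrates the indexing logic:

```python
def ucsc_bin(start, end):
    """Smallest UCSC bin fully containing the 0-based, half-open
    interval [start, end) under the standard binning scheme."""
    bin_offsets = (512 + 64 + 8 + 1, 64 + 8 + 1, 8 + 1, 1, 0)
    start_bin = start >> 17          # 2**17 = 128 kb leaf bins
    end_bin = (end - 1) >> 17
    for offset in bin_offsets:
        if start_bin == end_bin:
            return offset + start_bin
        start_bin >>= 3              # climb one level: bins 8x larger
        end_bin >>= 3
    raise ValueError("interval out of range for the binning scheme")
```

Restricting a query to the handful of bins an interval can fall in is what makes interval lookups fast without a full table scan.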

  10. The Ruby UCSC API: accessing the UCSC genome database using Ruby

    PubMed Central

    2012-01-01

    Background The University of California, Santa Cruz (UCSC) genome database is among the most used sources of genomic annotation in human and other organisms. The database offers an excellent web-based graphical user interface (the UCSC genome browser) and several means for programmatic queries. A simple application programming interface (API) in a scripting language aimed at the biologist was however not yet available. Here, we present the Ruby UCSC API, a library to access the UCSC genome database using Ruby. Results The API is designed as a BioRuby plug-in and built on the ActiveRecord 3 framework for the object-relational mapping, making writing SQL statements unnecessary. The current version of the API supports databases of all organisms in the UCSC genome database including human, mammals, vertebrates, deuterostomes, insects, nematodes, and yeast. The API uses the bin index, if available, when querying for genomic intervals. The API also supports genomic sequence queries using locally downloaded *.2bit files that are not stored in the official MySQL database. The API is implemented in pure Ruby and is therefore available in different environments and with different Ruby interpreters (including JRuby). Conclusions Assisted by the straightforward object-oriented design of Ruby and ActiveRecord, the Ruby UCSC API will enable biologists to query the UCSC genome database programmatically. The API is available through the RubyGem system. Source code and documentation are available at https://github.com/misshie/bioruby-ucsc-api/ under the Ruby license. Feedback and help are provided via the website at http://rubyucscapi.userecho.com/. PMID:22994508

  11. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui

    PubMed Central

    2012-01-01

    Background The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web-based access to global maps and satellite imagery. We describe a method for displaying the spatial output from an R script directly on a Google dynamic map. Methods This is achieved by creating a Java-based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java-based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications that run R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically, so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Results Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks, and the coordinates of the region of interest will automatically be made available for use by the R script. Conclusions This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly, it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real-world data. Secondly, it allows researchers who are using R to study health geographics data to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R-coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications. PMID:22998945

  12. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui.

    PubMed

    Newton, Richard; Deonarine, Andrew; Wernisch, Lorenz

    2012-09-24

    The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web-based access to global maps and satellite imagery. We describe a method for displaying the spatial output from an R script directly on a Google dynamic map. This is achieved by creating a Java-based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java-based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications that run R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically, so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks, and the coordinates of the region of interest will automatically be made available for use by the R script. This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly, it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real-world data. Secondly, it allows researchers who are using R to study health geographics data to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R-coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications.
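The marker display described above amounts to handing coordinate/value pairs produced by the R script to the Google Maps JavaScript running in the browser. A minimal, illustrative sketch of serializing such results as JSON (the lat/lng/title field names follow Google Maps marker conventions; the exact structure Rwui uses is not specified here):

```python
import json

def results_to_markers(rows):
    """Convert (latitude, longitude, label) tuples from a statistical
    script into a JSON array a Google Maps page could consume."""
    return json.dumps(
        [{"lat": lat, "lng": lng, "title": label} for lat, lng, label in rows]
    )

# Two hypothetical study sites serialized for the map front-end.
markers = results_to_markers([(52.2, 0.12, "site A"), (51.5, -0.13, "site B")])
```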

  13. A Land-Use-Planning Simulation Using Google Earth

    ERIC Educational Resources Information Center

    Bodzin, Alec M.; Cirucci, Lori

    2009-01-01

    Google Earth (GE) is proving to be a valuable tool in the science classroom for understanding the environment and making responsible environmental decisions (Bodzin 2008). GE provides learners with a dynamic mapping experience using a simple interface with a limited range of functions. This interface makes geospatial analysis accessible and…

  14. Multi-temporal Land Use Mapping of Coastal Wetlands Area using Machine Learning in Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Farda, N. M.

    2017-12-01

    Coastal wetlands provide ecosystem services essential to people and the environment. Changes in coastal wetlands, especially in land use, are important to monitor by utilizing multi-temporal imagery. The Google Earth Engine (GEE) provides many machine learning algorithms (10 algorithms) that are very useful for extracting land use from imagery. The research objective is to explore machine learning in Google Earth Engine and its accuracy for multi-temporal land use mapping of a coastal wetland area. Landsat 3 MSS (1978), Landsat 5 TM (1991), Landsat 7 ETM+ (2001), and Landsat 8 OLI (2014) images located in the Segara Anakan lagoon were selected to represent multi-temporal images. The inputs for machine learning are the visible and near-infrared bands, PCA bands, inverse PCA bands, bare soil index, vegetation index, wetness index, elevation from ASTER GDEM, and GLCM (Haralick) texture, together with polygon samples at 140 locations. Ten machine learning algorithms were applied to extract coastal wetland land use from the Landsat imagery: Fast Naive Bayes, CART (Classification and Regression Tree), Random Forests, GMO Max Entropy, Perceptron (Multi Class Perceptron), Winnow, Voting SVM, Margin SVM, Pegasos (Primal Estimated sub-GrAdient SOlver for SVM), and IKPamir (Intersection Kernel Passive Aggressive Method for Information Retrieval, SVM). Machine learning in Google Earth Engine is very helpful for multi-temporal land use mapping; the highest accuracy for land use mapping of the coastal wetland was achieved by CART, with 96.98% overall accuracy using K-fold cross validation (K = 10). GEE is particularly useful for multi-temporal land use mapping, with ready-to-use imagery and classification algorithms, and also offers challenging opportunities for other applications.
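The reported accuracy figure combines two simple ingredients, K-fold splitting and overall accuracy, which can be sketched outside Earth Engine in plain Python (illustrative only; GEE performs the actual classification):

```python
from random import Random

def k_fold_indices(n, k, seed=0):
    """Split sample indices 0..n-1 into k shuffled, disjoint folds."""
    idx = list(range(n))
    Random(seed).shuffle(idx)
    # Strided slices give folds of near-equal size.
    return [idx[i::k] for i in range(k)]

def overall_accuracy(truth, predicted):
    """Fraction of samples whose predicted class matches the reference."""
    hits = sum(t == p for t, p in zip(truth, predicted))
    return hits / len(truth)

# Each fold is held out once; the classifier is trained on the rest
# and overall_accuracy is averaged across the k held-out folds.
folds = k_fold_indices(140, 10)   # 140 polygon samples, K = 10
```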

  15. compuGUT: An in silico platform for simulating intestinal fermentation

    NASA Astrophysics Data System (ADS)

    Moorthy, Arun S.; Eberl, Hermann J.

    The microbiota inhabiting the colon and its effect on health is a topic of significant interest. In this paper, we describe the compuGUT, a simulation tool developed to assist in exploring interactions between intestinal microbiota and their environment. The primary numerical machinery is implemented in C, and the accessory scripts for loading and visualization are prepared in bash (Linux) and R. SUNDIALS libraries are employed for numerical integration, and the googleVis API for interactive visualization. Supplementary material includes a concise description of the underlying mathematical model, and a detailed characterization of numerical errors and computing times associated with implementation parameters.
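The compuGUT relies on SUNDIALS for its actual numerical integration; as a much simplified stand-in, a fixed-step forward Euler integrator applied to a Monod-type substrate-uptake term (parameter values invented for illustration, not taken from compuGUT) shows the basic shape of such a computation:

```python
def euler(f, y0, t0, t1, steps):
    """Fixed-step forward Euler integration of dy/dt = f(t, y)."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)
        t += h
    return y

# Illustrative Monod-type substrate uptake: ds/dt = -vmax * s / (K + s)
vmax, K = 1.0, 0.5
s_end = euler(lambda t, s: -vmax * s / (K + s), 10.0, 0.0, 1.0, 1000)
```

Production solvers such as SUNDIALS use adaptive, error-controlled steps instead of this fixed step, which matters for the stiff fermentation kinetics the paper characterizes.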

  16. 3D Viewer Platform of Cloud Clustering Management System: Google Map 3D

    NASA Astrophysics Data System (ADS)

    Choi, Sung-Ja; Lee, Gang-Soo

    A new management-system framework is needed as cloud computing environments converge across platforms. It is hard for an ISV or a small business to adopt the platform management systems offered by large vendors. This article proposes a clustering management system for cloud computing environments aimed at ISVs and small-business enterprises. It applies a 3D viewer adapted from Google Map 3D and Google Earth, and is called 3DV_CCMS, an extension of the CCMS [1].

  17. Utilization of Google Earth for Distribution Mapping of Cholangiocarcinoma: a Case Study in Satuek District, Buriram, Thailand.

    PubMed

    Rattanasing, Wannaporn; Kaewpitoon, Soraya J; Loyd, Ryan A; Rujirakul, Ratana; Yodkaw, Eakachai; Kaewpitoon, Natthawut

    2015-01-01

    Cholangiocarcinoma (CCA) is a serious public health problem in the Northeast of Thailand. CCA is considered to be an incurable and rapidly lethal disease. Knowledge of the distribution of CCA patients is necessary for management strategies. This study aimed to utilize the Geographic Information System and Google Earth™ for distribution mapping of cholangiocarcinoma in Satuek District, Buriram, Thailand, during a 5-year period (2008-2012). In this retrospective study, data were collected and reviewed from the OPD cards; definitive cases of CCA were patients who were treated in Satuek hospital and were diagnosed with CCA or ICD-10 code C22.1. CCA cases were analyzed with ArcGIS 9.2, and all data were imported into Google Earth using the online web page www.earthpoint.us. Data were displayed at village points. A total of 53 cases were diagnosed and identified as CCA. The incidence was 53.57 per 100,000 population (65.5 for males and 30.8 for females), and the majority of CCA cases were in stages IV and IIA. The average age was 67 years old. The highest attack rate was observed in Thung Wang sub-district (161.4 per 100,000 population). The map display at village points for CCA patients based on Google Earth gave a clear visual distribution. CCA is still a major problem in Satuek district, Buriram province of Thailand. The Google Earth production process is very simple and easy to learn. It is suitable for use in further development of CCA management strategies.
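The incidence figures above follow directly from case counts and population, and the village points can be expressed as KML for display in Google Earth. A small sketch (the village name and coordinates below are placeholders, not the study's data; note that KML orders coordinates longitude,latitude):

```python
def incidence_per_100k(cases, population):
    """Crude incidence rate per 100,000 population."""
    return cases / population * 100000

def village_placemark(name, lat, lon):
    """Minimal KML Placemark for a village point, loadable in Google Earth."""
    return (
        "<Placemark><name>{0}</name>"
        "<Point><coordinates>{2},{1},0</coordinates></Point>"
        "</Placemark>"
    ).format(name, lat, lon)

# Placeholder village point (not the study's coordinates).
pm = village_placemark("Thung Wang", 15.26, 103.3)
```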

  18. Novel Data Sources for Women’s Health Research: Mapping Breast Screening Online Information Seeking Through Google Trends

    PubMed Central

    Dehkordy, Soudabeh Fazeli; Carlos, Ruth C.; Hall, Kelli S.; Dalton, Vanessa K.

    2015-01-01

    Rationale and Objectives Millions of people use online search engines every day to find health-related information and voluntarily share their personal health status and behaviors on various Web sites. Thus, data from tracking online information seekers' behavior offer potential opportunities for use in public health surveillance and research. Google Trends is a feature of Google that allows internet users to graph the frequency of searches for a single term or phrase over time or by geographic region. We used Google Trends to describe patterns of information-seeking behavior on the subject of dense breasts and to examine their correlation with the passage or introduction of dense breast notification legislation. Materials and Methods In order to capture the temporal variations of information seeking about dense breasts, the web search query "dense breast" was entered in the Google Trends tool. We then mapped the dates of legislative actions regarding dense breasts that received widespread coverage in the lay media to information-seeking trends about dense breasts over time. Results Newsworthy events and legislative actions appear to correlate well with peaks in the search volume of "dense breast". Geographic regions with the highest search volumes have either passed, rejected, or are currently considering dense breast legislation. Conclusions Our study demonstrated that legislative actions and the respective news coverage correlate with an increase in information seeking for "dense breast" on Google, suggesting that Google Trends has the potential to serve as a data source for policy-relevant research. PMID:24998689

  19. Development of RESTful services and map-based user interface tools for access and delivery of data and metadata from the Marine-Geo Digital Library

    NASA Astrophysics Data System (ADS)

    Morton, J. J.; Ferrini, V. L.

    2015-12-01

    The Marine Geoscience Data System (MGDS, www.marine-geo.org) operates an interactive digital data repository and metadata catalog that provides access to a variety of marine geology and geophysical data from throughout the global oceans. Its Marine-Geo Digital Library includes common marine geophysical data types and supporting data and metadata, as well as complementary long-tail data. The Digital Library also includes community data collections and custom data portals for the GeoPRISMS, MARGINS and Ridge2000 programs, for active source reflection data (Academic Seismic Portal), and for marine data acquired by the US Antarctic Program (Antarctic and Southern Ocean Data Portal). Ensuring that these data are discoverable not only through our own interfaces but also through standards-compliant web services is critical for enabling investigators to find data of interest. Over the past two years, MGDS has developed several new RESTful web services that enable programmatic access to metadata and data holdings. These web services are compliant with the EarthCube GeoWS Building Blocks specifications and are currently used to drive our own user interfaces. New web applications have also been deployed to provide a more intuitive user experience for searching, accessing and browsing metadata and data. Our new map-based search interface combines components of the Google Maps API with our web services for dynamic searching and exploration of geospatially constrained data sets. Direct introspection of nearly all data formats for hundreds of thousands of data files curated in the Marine-Geo Digital Library has allowed for precise geographic bounds, which allow geographic searches to an extent not previously possible. All MGDS map interfaces utilize the web services of the Global Multi-Resolution Topography (GMRT) synthesis for displaying global basemap imagery and for dynamically providing depth values at the cursor location.
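Programmatic access of the kind described typically means composing a query URL against a RESTful endpoint. A sketch of building such a request (the endpoint path and parameter names here are hypothetical, not the actual Marine-Geo service):

```python
from urllib.parse import urlencode

def build_search_url(base, **filters):
    """Compose a RESTful metadata-search URL from keyword filters,
    dropping any filter left unset. Endpoint and parameter names
    are illustrative only."""
    query = {k: v for k, v in filters.items() if v is not None}
    return base + "?" + urlencode(sorted(query.items()))

# A geospatially constrained search for bathymetry data.
url = build_search_url(
    "https://www.example.org/services/search",
    data_type="bathymetry",
    north=40.0, south=30.0, east=-60.0, west=-70.0,
)
```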

  20. The World in Spatial Terms: Mapmaking and Map Reading

    ERIC Educational Resources Information Center

    Ekiss, Gale Olp; Trapido-Lurie, Barbara; Phillips, Judy; Hinde, Elizabeth

    2007-01-01

    Maps and mapping activities are essential in the primary grades. Maps are truly ubiquitous today, as evidenced by the popularity of websites such as Google Earth and Mapquest, and by devices such as Global Positioning System (GPS) units in cars, planes, and boats. Maps can give visual settings to travel stories and historical narratives and can…

  1. Dry coating of micronized API powders for improved dissolution of directly compacted tablets with high drug loading.

    PubMed

    Han, Xi; Ghoroi, Chinmay; Davé, Rajesh

    2013-02-14

    Motivated by our recent study showing improved flow and dissolution rate of the active pharmaceutical ingredient (API) powders (20 μm) produced via simultaneous micronization and surface modification through continuous fluid energy milling (FEM) process, the performance of blends and direct compacted tablets with high drug loading is examined. Performance of 50 μm API powders dry coated without micronization is also considered for comparison. Blends of micronized, non-micronized, dry coated or uncoated API powders at 30, 60 and 70% drug loading, are examined. The results show that the blends containing dry coated API powders, even micronized ones, have excellent flowability and high bulk density compared to the blends containing uncoated API, which are required for direct compaction. As the drug loading increases, the difference between dry coated and uncoated blends is more pronounced, as seen in the proposed bulk density-FFC phase map. Dry coating led to improved tablet compactibility profiles, corresponding with the improvements in blend compressibility. The most significant advantage is in tablet dissolution where for all drug loadings, the t(80) for the tablets with dry coated APIs was well under 5 min, indicating that this approach can produce nearly instant release direct compacted tablets at high drug loadings. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Design of Deformation Monitoring System for Volcano Mitigation

    NASA Astrophysics Data System (ADS)

    Islamy, M. R. F.; Salam, R. A.; Munir, M. M.; Irsyam, M.; Khairurrijal

    2016-08-01

    Indonesia has many active volcanoes that are potentially disastrous. Good mitigation systems are needed to prevent fatalities and reduce casualties from potential disasters caused by volcanic eruptions. Therefore, a system to monitor volcano deformation was built. This system employed telemetry combining Radio Frequency (RF) communication via XBee and General Packet Radio Service (GPRS) communication via the SIM900. There are two types of modules in this system: the first is the coordinator, acting as a parent, and the second is the node, acting as a child. Each node was connected to the coordinator, forming a Wireless Sensor Network (WSN) with a star topology, and each has an inclinometer-based sensor, a Global Positioning System (GPS) receiver, and an XBee module. The coordinator collects data from each node, one at a time, to prevent data collisions between nodes, saves the data to an SD card, and transmits the data to a web server via GPRS. The inclinometer was calibrated with a self-built calibrator and tested in a high-temperature environment to check its durability. The GPS was tested by displaying its position on a web server via the Google Maps Application Programming Interface (API v3). It was shown that the coordinator can receive and transmit data from every node to the web server very well and that the system works well in a high-temperature environment.
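An inclinometer of the kind used in each node commonly derives tilt from the gravity components reported by an accelerometer. A generic sketch of that calculation (a textbook model, not necessarily the authors' exact sensor processing):

```python
import math

def tilt_degrees(ax, ay, az):
    """Tilt of the sensor's z-axis away from vertical, in degrees,
    from static accelerometer readings (gravity components).
    Assumes the sensor is at rest so only gravity is measured."""
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

# Flat on the ground: gravity entirely on the z-axis, zero tilt.
print(tilt_degrees(0.0, 0.0, 1.0))   # -> 0.0
```

Changes in this angle over time, relayed through the star-topology WSN, are what indicate slope deformation on the volcano flank.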

  3. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    PubMed

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

    A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes were initially mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed, and 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field-locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent, and only rarely was local knowledge required to identify and locate households. This method provides an important technique that can be applied to other developing countries where a randomized study design is needed but infrastructure is lacking to implement more traditional participant selection methods.
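The core of the sampling step, drawing a reproducible random subset of mapped homes, can be sketched as follows (home IDs and the seed are illustrative; the study performed this step in Microsoft Excel rather than Python):

```python
import random

def select_households(homes, n, seed=2012):
    """Draw a reproducible simple random sample of mapped home IDs.
    A fixed seed makes the selection repeatable for auditing."""
    rng = random.Random(seed)
    return sorted(rng.sample(homes, n))

homes = ["H%03d" % i for i in range(537)]   # 537 homes mapped in Google Earth
survey = select_households(homes, 96)       # 96 potential survey locations
```

The selected IDs, paired with their Google Earth coordinates, are what would be loaded onto the handheld GPS units for field location.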

  4. Make Your Own Mashup Maps

    ERIC Educational Resources Information Center

    Lucking, Robert A.; Christmann, Edwin P.; Whiting, Mervyn J.

    2008-01-01

    "Mashup" is a new technology term used to describe a web application that combines data or technology from several different sources. You can apply this concept in your classroom by having students create their own mashup maps. Google Maps provides you with the simple tools, map databases, and online help you'll need to quickly master this…

  5. KSC-2013-3235

    NASA Image and Video Library

    2013-08-09

    CAPE CANAVERAL, Fla. – As seen on Google Maps, Firing Room 4 inside the Launch Control Center at NASA's Kennedy Space Center was one of the four control rooms used by NASA and contractor launch teams to oversee a space shuttle countdown. This firing room was the most advanced of the control rooms used for shuttle missions and was the primary firing room for the shuttle's final series of launches before retirement. It is furnished in a more contemporary style with wood cabinets and other features, although it retains many of the computer systems the shuttle counted on to operate safely. Specialized operators worked at consoles tailored to keep track of the status of shuttle systems while the spacecraft was processed in the Orbiter Processing Facility, being stacked inside the Vehicle Assembly Building and standing at the launch pad before liftoff. The firing rooms, including 3, were also used during NASA's Apollo Program. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang

  6. Cluster Analysis of Indonesian Province Based on Household Primary Cooking Fuel Using K-Means

    NASA Astrophysics Data System (ADS)

    Huda, S. N.

    2017-03-01

    Every household requires facilities for cooking. Kerosene, refined from petroleum, once dominated as the primary cooking fuel in Indonesia, but kerosene is expensive and inefficient. Other households use LPG as their primary cooking fuel; however, the LPG supply is also limited. In addition, the very diverse environments and cultures in Indonesia lead to a diversity of cooking installations, such as wood-burning stoves and braziers. The government is also promoting alternative fuels, such as charcoal briquettes and fuels from biomass. The use of other fuels is part of a diversification of energy that is expected to reduce community dependence on petroleum-based fuels. The variation in cooking fuels from one region to another reflects the distribution of primary fuel use by households. By knowing the characteristics of each province, the government can adopt policies appropriate to each province's character. It would therefore be very useful to have a cluster analysis of all provinces in Indonesia based on the type of primary household cooking fuel. Cluster analysis was done using the K-Means method with K ranging from 2 to 5. Cluster results were validated using the Silhouette Coefficient (SC). The results show that the highest SC was achieved with K = 2, with an SC value of 0.39135818388151. The two clusters reflect the provinces of Indonesia: one is a cluster of more traditional provinces and the other a cluster of more modern provinces. The cluster results are then shown on a map using the Google Maps API.
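The Silhouette Coefficient used for validation measures, for each sample, how close it is to its own cluster relative to the nearest other cluster. A pure-Python sketch for one-dimensional feature values (illustrative, not the study's implementation):

```python
def silhouette(points, labels):
    """Mean silhouette coefficient for 1-D points with cluster labels.
    For each point: a = mean distance to its own cluster,
    b = mean distance to the nearest other cluster,
    score = (b - a) / max(a, b)."""
    clusters = set(labels)
    scores = []
    for i, (p, lab) in enumerate(zip(points, labels)):
        own = [abs(p - q) for j, (q, l) in enumerate(zip(points, labels))
               if l == lab and j != i]
        a = sum(own) / len(own) if own else 0.0
        b = min(
            sum(abs(p - q) for q, l in zip(points, labels) if l == other)
            / sum(1 for l in labels if l == other)
            for other in clusters if other != lab
        )
        scores.append((b - a) / max(a, b) if max(a, b) > 0 else 0.0)
    return sum(scores) / len(scores)

# Two tight, well-separated clusters score close to 1.
s = silhouette([0.0, 0.1, 10.0, 10.1], [0, 0, 1, 1])
```

Running this for each candidate K (here 2 through 5) and keeping the K with the highest mean score mirrors the study's validation step.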

  7. Visualizing spatio-temporal war casualty data in Google Earth - A case study of Map the Fallen (Invited)

    NASA Astrophysics Data System (ADS)

    Askay, S.

    2009-12-01

    Published on Memorial Day 2009, Map the Fallen is a Google Earth visualization of the 5,500+ US and international soldiers who have died in Iraq and Afghanistan since 2001. In addition to providing photos, stories, and links for each soldier, the time-animated map visually connects hometowns to places of death. This novel way of representing casualty data brings the geographic reach and magnitude of the issue into focus together with the very personal nature of individual stories. Innovative visualization techniques were used to illustrate the spatio-temporal nature of this information and to show the global reach and interconnectivity of this issue. Several of the advanced KML techniques employed to create this engaging and performance-conscious map will be discussed during this session. These include: 1) the use of HTML iframes and JavaScript to minimize the KML size, and extensive cross-linking throughout the content; 2) the creation of a time-animated, on-screen casualty counter; 3) the use of parabolic arcs to connect each hometown to the place of death; 4) the use of concentric spirals to represent chronological data; and 5) numerous performance optimizations to ensure the 23K placemarks, 2,500 screen overlays, and nearly 250K line vertices performed well in Google Earth. This session will include a demonstration of the map, conceptual discussion of the techniques used, and some in-depth technical explanation of the KML code.
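The parabolic-arc technique can be reconstructed generically: interpolate linearly between the two endpoints while giving the altitude a parabolic profile that peaks mid-arc. A sketch (a plausible reconstruction, not Map the Fallen's actual code; the peak altitude and step count are arbitrary):

```python
def arc_coordinates(lat1, lon1, lat2, lon2, peak_alt=200000.0, steps=32):
    """KML coordinate string for a parabolic arc between two points:
    linear interpolation in lat/lon, parabolic altitude peaking mid-arc.
    KML expects lon,lat,alt triples separated by spaces."""
    coords = []
    for i in range(steps + 1):
        t = i / steps
        lat = lat1 + t * (lat2 - lat1)
        lon = lon1 + t * (lon2 - lon1)
        alt = 4.0 * peak_alt * t * (1.0 - t)   # 0 at the ends, peak at t = 0.5
        coords.append("%f,%f,%f" % (lon, lat, alt))
    return " ".join(coords)

# Arc from a hometown to a place of death (coordinates invented).
coords = arc_coordinates(29.95, -90.07, 33.9, 65.7)
```

Wrapped in a `<LineString>` with `altitudeMode` set to absolute, such strings draw the hometown-to-destination arcs in Google Earth.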

  8. JEnsembl: a version-aware Java API to Ensembl data systems.

    PubMed

    Paterson, Trevor; Law, Andy

    2012-11-01

    The Ensembl Project provides release-specific Perl APIs for efficient high-level programmatic access to data stored in various Ensembl database schema. Although Perl scripts are perfectly suited for processing large volumes of text-based data, Perl is not ideal for developing large-scale software applications nor for embedding in graphical interfaces. The provision of a novel Java API would facilitate type-safe, modular, object-orientated development of new Bioinformatics tools with which to access, analyse and visualize Ensembl data. The JEnsembl API implementation provides basic data retrieval and manipulation functionality from the Core, Compara and Variation databases for all species in Ensembl and EnsemblGenomes and is a platform for the development of a richer API to Ensembl data sources. The JEnsembl architecture uses a text-based configuration module to provide evolving, versioned mappings from database schema to code objects. A single installation of the JEnsembl API can therefore simultaneously and transparently connect to current and previous database instances (such as those in the public archive), thus facilitating better analysis repeatability and allowing 'through time' comparative analyses to be performed. Project development, released code libraries, Maven repository and documentation are hosted at SourceForge (http://jensembl.sourceforge.net).
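The version-aware mapping idea can be illustrated with a toy registry that resolves a requested release number to the code objects for the matching schema version (the release ranges and class names below are invented, not JEnsembl's actual configuration):

```python
# Toy version-aware registry: each entry maps an inclusive release
# range (upper bound None = open-ended) to the handler for that schema.
MAPPINGS = [
    ((1, 47), "schema_v1.Gene"),
    ((48, 66), "schema_v2.Gene"),
    ((67, None), "schema_v3.Gene"),
]

def resolve(release):
    """Pick the mapping whose release range covers the requested release,
    so current and archived database instances can be queried side by side."""
    for (lo, hi), cls in MAPPINGS:
        if release >= lo and (hi is None or release <= hi):
            return cls
    raise KeyError("no mapping for release %d" % release)
```

Keeping such ranges in configuration rather than code is what lets one installation talk to archived and current schema simultaneously.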

  9. Creating a Geo-Referenced Bibliography with Google Earth and Geocommons: The Coos Bay Bibliography

    ERIC Educational Resources Information Center

    Schmitt, Jenni; Butler, Barb

    2012-01-01

    We compiled a geo-referenced bibliography of research including theses, peer-reviewed articles, agency literature, and books having sample collection sites in and around Coos Bay, Oregon. Using Google Earth and GeoCommons we created a map that allows users such as visiting researchers, faculty, students, and local agencies to identify previous…

  10. Visualizing the geography of genetic variants.

    PubMed

    Marcus, Joseph H; Novembre, John

    2017-02-15

    One of the key characteristics of any genetic variant is its geographic distribution. The geographic distribution can shed light on where an allele first arose, what populations it has spread to, and in turn on how migration, genetic drift, and natural selection have acted. The geographic distribution of a genetic variant can also be of great utility for medical/clinical geneticists and collectively many genetic variants can reveal population structure. Here we develop an interactive visualization tool for rapidly displaying the geographic distribution of genetic variants. Through a REST API and dynamic front-end, the Geography of Genetic Variants (GGV) browser (http://popgen.uchicago.edu/ggv/) provides maps of allele frequencies in populations distributed across the globe. GGV is implemented as a website (http://popgen.uchicago.edu/ggv/) which employs an API to access frequency data (http://popgen.uchicago.edu/freq_api/). Python and javascript source code for the website and the API are available at http://github.com/NovembreLab/ggv/ and http://github.com/NovembreLab/ggv-api/. jnovembre@uchicago.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  11. On-line Geoscience Data Resources for Today's Undergraduates

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.; Ryan, W.; Carbotte, S.; Melkonian, A.; Coplan, J.; Arko, R.; O'Hara, S.; Ferrini, V.; Leung, A.; Bonckzowski, J.

    2008-12-01

    Broadening the experience of undergraduates can be achieved by enabling free, unrestricted and convenient access to real scientific data. With funding from the U.S. National Science Foundation, the Marine Geoscience Data System (MGDS) (http://www.marine-geo.org/) serves as the integrated data portal for various NSF-funded projects and provides free public access and preservation to a wide variety of marine and terrestrial data including rock, fluid, biology and sediment sample information, underway geophysical data and multibeam bathymetry, water column and multi-channel seismics data. Users can easily view the locations of cruise tracks, sample and station locations against a backdrop of a multi-resolution global digital elevation model. A Search For Data web page rapidly extracts data holdings from the database and can be filtered on data and device type, field program ID, investigator name, and geographical and date bounds. The data access experience is boosted by MGDS's use of standardised OGC-compliant Web Services to support uniform programmatic interfaces. GeoMapApp (http://www.geomapapp.org/), a free MGDS data visualization tool, supports map-based dynamic exploration of a broad suite of geosciences data. Built-in land and marine data sets include tectonic plate boundary compilations, DSDP/ODP core logs, earthquake events, seafloor photos, and submersible dive tracks. Seamless links take users to data held by external partner repositories including PetDB, UNAVCO, IRIS and NGDC. Users can generate custom maps and grids and import their own data sets and grids. A set of short, video-style on-line tutorials familiarises users step-by-step with GeoMapApp functionality (http://www.geomapapp.org/tutorials/). Virtual Ocean (http://www.virtualocean.org/) combines the functionality of GeoMapApp with a 3-D earth browser built using the NASA WorldWind API for a powerful new data resource. MGDS education involvement (http://www.marine-geo.org/, go to Education tab) includes the searchable Media Bank of images and video; KML files for viewing several MGDS data sets in Google Earth™; and support in developing undergraduate-level teaching modules using NSF-MARGINS data. Examples of many of these data sets will be shown.

  12. An algorithm to estimate building heights from Google street-view imagery using single view metrology across a representational state transfer system

    NASA Astrophysics Data System (ADS)

    Díaz, Elkin; Arguello, Henry

    2016-05-01

    Urban ecosystem studies require monitoring, controlling and planning to analyze building density, urban density, urban planning, atmospheric modeling and land use. In urban planning, there are many methods for building height estimation using optical remote sensing images. These methods, however, depend strongly on sun illumination and cloud-free weather. In contrast, high-resolution synthetic aperture radar provides images independent of daytime and weather conditions, although these images rely on special hardware and expensive acquisition. Most of the biggest cities around the world have been photographed by Google Street View under different conditions. Thus, thousands of images from the principal streets of a city can be accessed online. The availability of this and similar rich city imagery, such as StreetSide from Microsoft, represents a huge opportunity in computer vision because these images can be used as input in many applications such as 3D modeling, segmentation, recognition and stereo correspondence. This paper proposes a novel algorithm to estimate building heights using public Google Street View imagery. The objective of this work is to obtain thousands of geo-referenced images from Google Street View using a representational state transfer system, and to estimate average building heights using single view metrology. Furthermore, the resulting measurements and image metadata are used to derive a layer of heights on a Google map available online. The experimental results show that the proposed algorithm can estimate an accurate average building height map from thousands of Google Street View images of any city.
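Under simplifying assumptions (flat ground, a level camera at a known height), single view metrology reduces to a ratio of image-row differences between the building's base, its top, and the horizon. A sketch of that textbook relation (the numbers are invented; the paper's full method is more involved):

```python
def building_height(cam_height, y_base, y_top, y_horizon):
    """Estimate building height from a single image.
    Assumes flat ground, a level camera at height cam_height (metres),
    and image rows that increase downward, so y_base > y_horizon > y_top.
    The camera height maps to the base-to-horizon row span; the building
    height scales with the base-to-top row span."""
    return cam_height * (y_base - y_top) / (y_base - y_horizon)

# Camera 2.5 m up; base at row 800, top at row 200, horizon at row 500.
h = building_height(2.5, 800, 200, 500)   # -> 5.0 metres
```

Averaging such estimates over many geo-referenced street-level views of the same block is what yields the per-area height layer described above.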

  13. Google Earth Mapping Exercises for Structural Geology Students--A Promising Intervention for Improving Penetrative Visualization Ability

    ERIC Educational Resources Information Center

    Giorgis, Scott

    2015-01-01

    Three-dimensional thinking skills are extremely useful for geoscientists, and at the undergraduate level, these skills are often emphasized in structural geology courses. Google Earth is a powerful tool for visualizing the three-dimensional nature of data collected on the surface of Earth. The results of a 5 y pre- and posttest study of the…

  14. JBioWH: an open-source Java framework for bioinformatics data integration

    PubMed Central

    Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor

    2013-01-01

    The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database schema and includes Java API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh PMID:23846595
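
JBioWH itself targets a MySQL warehouse queried through Java; as a minimal, hypothetical stand-in, the sketch below uses Python's built-in sqlite3 with an invented two-table schema to show the kind of cross-source join query such an integrated local database enables.

```python
import sqlite3

# Toy integrated warehouse: SQLite here instead of MySQL, with an
# invented schema linking genes to pathways from different sources.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE gene(id INTEGER PRIMARY KEY, symbol TEXT);
CREATE TABLE pathway(gene_id INTEGER, name TEXT);
INSERT INTO gene VALUES (1, 'TP53'), (2, 'BRCA1');
INSERT INTO pathway VALUES (1, 'p53 signaling'), (2, 'DNA repair');
""")
# A single SQL join answers a question that would otherwise require
# hitting two separate upstream databases.
rows = con.execute("""
    SELECT g.symbol, p.name
    FROM gene g JOIN pathway p ON p.gene_id = g.id
    WHERE g.symbol = 'TP53'
""").fetchall()
```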

  16. Drawing the Line with Google Earth: The Place of Digital Mapping outside of Geography

    ERIC Educational Resources Information Center

    Mercier, O. Ripeka; Rata, Arama

    2017-01-01

    The "Te Kawa a Maui Atlas" project explores how mapping activities support undergraduate student engagement and learning in Maori studies. This article describes two specific assignments, which used online mapping allowing students to engage with the work of their peers. By analysing student evaluations of these activities, we identify…

  17. JEnsembl: a version-aware Java API to Ensembl data systems

    PubMed Central

    Paterson, Trevor; Law, Andy

    2012-01-01

    Motivation: The Ensembl Project provides release-specific Perl APIs for efficient high-level programmatic access to data stored in various Ensembl database schema. Although Perl scripts are perfectly suited for processing large volumes of text-based data, Perl is not ideal for developing large-scale software applications nor embedding in graphical interfaces. The provision of a novel Java API would facilitate type-safe, modular, object-orientated development of new Bioinformatics tools with which to access, analyse and visualize Ensembl data. Results: The JEnsembl API implementation provides basic data retrieval and manipulation functionality from the Core, Compara and Variation databases for all species in Ensembl and EnsemblGenomes and is a platform for the development of a richer API to Ensembl datasources. The JEnsembl architecture uses a text-based configuration module to provide evolving, versioned mappings from database schema to code objects. A single installation of the JEnsembl API can therefore simultaneously and transparently connect to current and previous database instances (such as those in the public archive) thus facilitating better analysis repeatability and allowing ‘through time’ comparative analyses to be performed. Availability: Project development, released code libraries, Maven repository and documentation are hosted at SourceForge (http://jensembl.sourceforge.net). Contact: jensembl-develop@lists.sf.net, andy.law@roslin.ed.ac.uk, trevor.paterson@roslin.ed.ac.uk PMID:22945789
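
The version-aware mapping idea can be sketched generically: a configuration maps each database release to the schema details the code should use, so one client can talk to current and archived instances transparently. The releases and column names below are invented for illustration and are not JEnsembl's actual configuration.

```python
# Hypothetical release-to-schema configuration: each entry records where
# a logical field lives in that release's schema.
SCHEMA_MAPPINGS = {
    67: {"gene_table": "gene", "stable_id_col": "stable_id"},
    48: {"gene_table": "gene", "stable_id_col": "gene_stable_id"},
}

def gene_query(release: int) -> str:
    """Resolve the mapping for a release, then build the SQL against
    that release's schema."""
    m = SCHEMA_MAPPINGS[release]
    return f"SELECT {m['stable_id_col']} FROM {m['gene_table']}"

assert gene_query(48) == "SELECT gene_stable_id FROM gene"
```

Because the mapping, not the calling code, carries the schema knowledge, the same analysis can run "through time" against archived releases.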

  18. How to Use This Website | USDA Plant Hardiness Zone Map

    Science.gov Websites

    Explains how to view, save, and print state, regional, or national plant hardiness zone maps in three different resolutions using the site's Open Full Map and Save Full Map buttons, and how to share a map by copying the address into a different e-mail client (e.g., Google Gmail, Yahoo).

  19. Traffic Sign Inventory from Google Street View Images

    NASA Astrophysics Data System (ADS)

    Tsai, Victor J. D.; Chen, Jyun-Han; Huang, Hsun-Sheng

    2016-06-01

    Traffic sign detection and recognition (TSDR) has drawn considerable attention in the development of intelligent transportation systems (ITS) and autonomous vehicle driving systems (AVDS) since the 1980s. Unlike general TSDR systems that deal with real-time images captured by in-vehicle cameras, this research aims at developing techniques for detecting, extracting, and positioning traffic signs from Google Street View (GSV) images along user-selected routes, for low-cost, high-volume, and quick establishment of a traffic sign infrastructural database that may be associated with Google Maps. The framework and techniques employed in the proposed system are described.
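
One common way to position a roadside object seen from two camera stations is to intersect the bearings to it. The planar sketch below illustrates that idea under simplified assumptions (local planar coordinates, compass-style bearings); it is not necessarily the authors' method, which works with geodetic GSV metadata and image-derived azimuths.

```python
import math

def intersect_bearings(p1, b1, p2, b2):
    """Intersect two rays given as (x, y) origins and compass bearings
    in degrees (0 deg = +y axis). Returns the intersection point."""
    d1 = (math.sin(math.radians(b1)), math.cos(math.radians(b1)))
    d2 = (math.sin(math.radians(b2)), math.cos(math.radians(b2)))
    # Solve p1 + t*d1 = p2 + s*d2 for t via Cramer's rule.
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-12:
        raise ValueError("parallel bearings, no unique intersection")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two stations 10 m apart, both sighting the same sign.
sign = intersect_bearings((0.0, 0.0), 45.0, (10.0, 0.0), 315.0)
```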

  20. Using Google Streetview Panoramic Imagery for Geoscience Education

    NASA Astrophysics Data System (ADS)

    De Paor, D. G.; Dordevic, M. M.

    2014-12-01

    Google Streetview is a feature of Google Maps and Google Earth that allows viewers to switch from map or satellite view to 360° panoramic imagery recorded close to the ground. Most panoramas are recorded by Google engineers using special cameras mounted on the roofs of cars. Bicycles, snowmobiles, and boats have also been used, and sometimes the camera has been mounted on a backpack for off-road use by hikers and skiers or attached to scuba-diving gear for "Underwater Streetview (sic)." Streetview panoramas are linked together so that the viewer can change viewpoint by clicking forward and reverse buttons, creating a 4-D touring effect. As part of the GEODE project ("Google Earth for Onsite and Distance Education"), we are experimenting with the use of Streetview imagery for geoscience education. Our web-based test application allows instructors to select locations for students to study. Students are presented with a set of questions or tasks that they must address by studying the panoramic imagery. Questions include identification of rock types, structures such as faults, and general geological setting. The student view is locked into Streetview mode until they submit their answers, whereupon the map and satellite views become available, allowing students to zoom out and verify their location on Earth. Student learning is scaffolded by automatic computerized feedback. There are many existing Streetview panoramas with rich geological content. Additionally, instructors and members of the general public can create panoramas, including 360° Photo Spheres, by stitching images taken with their mobile devices and submitting them to Google for evaluation and hosting. A multi-thousand-dollar, multi-directional camera and mount can be purchased from DIY-streetview.com, allowing power users to generate their own high-resolution panoramas. A cheaper 360° video camera is soon to be released, according to geonaute.com.
Thus there are opportunities for geoscience educators both to use existing Streetview imagery and to generate new imagery for specific locations of geological interest. The GEODE team includes the authors and: H. Almquist, C. Bentley, S. Burgin, C. Cervato, G. Cooper, P. Karabinos, T. Pavlis, J. Piatek, B. Richards, J. Ryan, R. Schott, K. St. John, B. Tewksbury, and S. Whitmeyer.

  1. Cartographic analyses of geographic information available on Google Earth Images

    NASA Astrophysics Data System (ADS)

    Oliveira, J. C.; Ramos, J. R.; Epiphanio, J. C.

    2011-12-01

    The purpose was to evaluate the planimetric accuracy of satellite images available in the Google Earth database. These images cover the vicinity of the Federal University of Viçosa, Minas Gerais, Brazil. The methodology evaluated the geographic information of three groups of images, according to the level of detail presented on screen (zoom). These groups were labeled Zoom 1000 (a single image for the entire study area), Zoom 100 (a mosaic of 73 images) and Zoom 100 with geometric correction (the same mosaic after a geometric correction applied using control points). For each group of images, cartographic accuracy was measured based on statistical analyses and the Brazilian legal parameters for planimetric mapping. For this evaluation, 22 points were identified in each group of images, and the coordinates of each point were compared with field coordinates obtained by GPS (Global Positioning System). Table 1 shows results for accuracy (based on a threshold equal to 0.5 mm * mapping scale) and tendency (abscissa and ordinate) between the image coordinates and the field coordinates. The geometric correction applied to the Zoom 100 group reduced the trends identified earlier, and the statistical tests indicated the usefulness of the data for mapping at a scale of 1/5000 with error smaller than 0.5 mm * scale. The analyses demonstrated the quality of the cartographic data provided by Google, as well as the possibility of reducing the positioning divergences present in the data. It can be concluded that it is possible to obtain geographic information from the database available on Google Earth; however, the level of detail (zoom) used at the time of viewing and capturing information on the screen influences the cartographic quality of the mapping.
Despite the cartographic and thematic potential of the database, it is important to note that both the software and the data distributed by Google Earth are subject to use and distribution policies.
    Table 1 - PLANIMETRIC ANALYSIS
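
The 0.5 mm * scale accuracy test described above can be sketched as follows; the point misfits are invented for illustration, not the study's Table 1 values.

```python
import math

def rmse(errors_m):
    """Root-mean-square error of point misfits, in metres."""
    return math.sqrt(sum(e * e for e in errors_m) / len(errors_m))

def meets_scale(errors_m, scale_denominator):
    """True if RMSE is within 0.5 mm at the given map scale,
    e.g. scale 1/5000 gives a tolerance of 0.0005 * 5000 = 2.5 m."""
    tolerance_m = 0.0005 * scale_denominator
    return rmse(errors_m) <= tolerance_m

errors = [1.2, 0.8, 1.9, 1.1]   # invented image-vs-GPS misfits in metres
ok_5000 = meets_scale(errors, 5000)
```

With these invented misfits (RMSE about 1.3 m), the data would pass at 1/5000 (2.5 m tolerance) but fail at 1/1000 (0.5 m tolerance), mirroring how zoom level and correction change the usable mapping scale.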

  2. Mean composite fire severity metrics computed with Google Earth engine offer improved accuracy and expanded mapping potential

    Treesearch

    Sean A. Parks; Lisa M. Holsinger; Morgan A. Voss; Rachel A. Loehman; Nathaniel P. Robinson

    2018-01-01

    Landsat-based fire severity datasets are an invaluable resource for monitoring and research purposes. These gridded fire severity datasets are generally produced with pre- and post-fire imagery to estimate the degree of fire-induced ecological change. Here, we introduce methods to produce three Landsat-based fire severity metrics using the Google Earth Engine (GEE)...
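
The abstract is truncated, but a standard Landsat-based severity metric of this kind is the differenced Normalized Burn Ratio (dNBR), computed from pre- and post-fire imagery. The sketch below shows the per-pixel arithmetic with invented reflectance values; it illustrates the class of metric involved, not necessarily the three metrics the paper introduces.

```python
def nbr(nir, swir2):
    """Normalized Burn Ratio from NIR and SWIR2 surface reflectance."""
    return (nir - swir2) / (nir + swir2)

def dnbr(pre_nir, pre_swir2, post_nir, post_swir2):
    """Delta NBR: pre-fire NBR minus post-fire NBR.
    Larger values indicate greater fire-induced change."""
    return nbr(pre_nir, pre_swir2) - nbr(post_nir, post_swir2)

# Invented reflectances for one burned pixel.
severity = dnbr(0.45, 0.15, 0.20, 0.30)
```

In Google Earth Engine this same arithmetic is applied as image-band operations over whole fire perimeters rather than single pixels.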

  3. Exploring Research Contributions of the North American Carbon Program using Google Earth and Google Map

    NASA Astrophysics Data System (ADS)

    Griffith, P. C.; Wilcox, L. E.; Morrell, A.

    2009-12-01

    The central objective of the North American Carbon Program (NACP), a core element of the US Global Change Research Program, is to quantify the sources and sinks of carbon dioxide, carbon monoxide, and methane in North America and adjacent ocean regions. The NACP consists of a wide range of investigators at universities and federal research centers. Although many of these investigators have worked together in the past, many have had few prior interactions and may not know of similar work within knowledge domains, much less across the diversity of environments and scientific approaches in the Program. Coordinating interactions and sharing data are major challenges in conducting NACP. The Google Earth and Google Map Collections on the NACP website (www.nacarbon.org) provide a geographical view of the research products contributed by each core and affiliated NACP project. Other relevant data sources (e.g. AERONET, LVIS) can also be browsed in spatial context with NACP contributions. Each contribution links to project-oriented metadata, or “project profiles”, that provide a greater understanding of the scientific and social context of each dataset and are an important means of communicating within the NACP and to the larger carbon cycle science community. Project profiles store information such as a project's title, leaders, participants, an abstract, keywords, funding agencies, associated intensive campaigns, expected data products, data needs, publications, and URLs to associated data centers, datasets, and metadata. Data products are research contributions that include biometric inventories, flux tower estimates, remote sensing land cover products, tools, services, and model inputs / outputs. Project leaders have been asked to identify these contributions to the site level whenever possible, either through simple latitude/longitude pair, or by uploading a KML, KMZ, or shape file. 
Project leaders may select custom icons to graphically categorize their contributions; for example, a ship for oceanographic samples, a tower for tower measurements. After post-processing, research contributions are added to the NACP Google Earth and Google Map Collection to facilitate discovery and use in synthesis activities of the Program.
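
A site-level contribution of the kind project leaders upload can be expressed as a minimal KML placemark. The sketch below generates one; the site name, description, and coordinates are invented, and real NACP submissions may carry richer metadata.

```python
# KML coordinates are ordered longitude,latitude[,altitude].
def placemark(name, lat, lon, description=""):
    return (
        "<Placemark>"
        f"<name>{name}</name>"
        f"<description>{description}</description>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark>"
    )

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
    + placemark("Flux tower A", 45.2, -93.1, "eddy covariance site")
    + "</Document></kml>"
)
```

Saved as a .kml file, such a document loads directly into Google Earth or a Google Maps overlay.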

  4. Namibia Dashboard Enhancements

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Handy, Matthew

    2014-01-01

    The purpose of this presentation is for a Technical Interchange Meeting with the Namibia Hydrological Services (NHS) in Namibia. The meeting serves as a capacity building exercise. This presentation goes over existing software functionality developed in collaboration with NHS over the past five years, called the Namibia Flood Dashboard. Furthermore, it outlines new functionality developed over the past year and future functionality that will be developed. The main purpose of the Dashboard is to assist in decision support for flood warning. The Namibia Flood Dashboard already exists online in a cloud environment and has been used in prototype mode for the past few years. Functionality in the Dashboard includes river gauge hydrographs, TRMM rainfall estimates, EO-1 flood maps, infrastructure maps and other related functions. Future functionality includes attempting to integrate interoperability standards and crowd-sourcing capability. To this end, we are adding OpenStreetMap compatibility and an Applications Program Interface (API) called a GeoSocial API to enable discovery and sharing of data products useful for decision support via social media.

  5. Genetics Home Reference: sialuria

    MedlinePlus

    ... inheritance of sialuria, an inborn error of feedback inhibition. Am J Hum Genet. 2001 Jun;68(6): ... Links Data Files & API Site Map Subscribe Customer Support USA.gov Copyright Privacy Accessibility FOIA Viewers & Players ...

  6. Genetics Home Reference: pilomatricoma

    MedlinePlus

    ... F, Palacios J. beta-catenin expression in pilomatrixomas. Relationship with beta-catenin gene mutations and comparison with ...

  7. Genetics Home Reference: trichothiodystrophy

    MedlinePlus

    ... trichothiodystrophy and Cockayne syndrome: a complex genotype-phenotype relationship. Neuroscience. 2007 Apr 14;145(4):1388-96. ...

  8. Genetics Home Reference: microphthalmia

    MedlinePlus

    ... CR, Ye M, Garcha K, Bigot K, Perera AG, Staehling-Hampton K, Mema SC, Chanda B, Mushegian ...

  9. GEOG 342: Exploring the Virtual Earth

    NASA Astrophysics Data System (ADS)

    Bailey, J. E.; Sfraga, M.

    2007-12-01

    First attributed to Eratosthenes around 200 BC, the word "geography" is derived from Greek words meaning "Earth" and "to describe". It describes the study of our planet, its features, inhabitants, and phenomena. The term "neogeography" is, put simply, new geography, where "new" refers to more than just practices that are new in usage. Methodologies of neogeography tend toward the intuitive, personal, artistic or even absurd, and generally don't conform to traditional protocols and boundaries. Mapping and spatial technologies such as Geobrowsers are typical of the tools used by neogeographers. Much of the success of Geobrowsers can be attributed to the fact that they use the methods and technologies of neogeography to provide a better understanding of traditional topics of Geography. The Geography program at the University of Alaska Fairbanks is embracing these new methodologies by offering a new class that explores the world around us through the use of Geobrowsers and other Web 2.0 technologies. Students will learn to use Keyhole Markup Language (KML), the Google Maps API, SketchUp and a range of Virtual Globe programs, primarily through geospatial datasets from the Earth Sciences. A special focus will be given to datasets that look at the environments and natural hazards that make Alaska such a unique landscape. The role of forums, wikis and blogs in the expansion of the Geoweb will be explored, and students will be encouraged to be active on these websites. Students will also explore Second Life, the concept of which will be introduced through the class text, Neal Stephenson's "Snow Crash". The primary goal of the class is to encourage students to undertake their own explorations of virtual Earths, in order to better understand the physical and social structure of the real world.

  10. Planetary Data Systems (PDS) Imaging Node Atlas II

    NASA Technical Reports Server (NTRS)

    Stanboli, Alice; McAuley, James M.

    2013-01-01

    The Planetary Image Atlas (PIA) is a Rich Internet Application (RIA) that serves planetary imaging data to the science community and the general public. PIA also utilizes the USGS Unified Planetary Coordinate system (UPC) and the on-Mars map server. The Atlas was designed to provide the ability to search and filter through greater than 8 million planetary image files. This software is a three-tier Web application that contains a search engine backend (MySQL, JAVA), a Web service interface (SOAP) between server and client, and a GWT Google Maps API client front end. This application allows for the search, retrieval, and download of planetary images and associated meta-data from the following missions: 2001 Mars Odyssey, Cassini, Galileo, LCROSS, Lunar Reconnaissance Orbiter, Mars Exploration Rover, Mars Express, Magellan, Mars Global Surveyor, Mars Pathfinder, Mars Reconnaissance Orbiter, MESSENGER, Phoenix, Viking Lander, Viking Orbiter, and Voyager. The Atlas utilizes the UPC to translate mission-specific coordinate systems into a unified coordinate system, allowing the end user to query across missions of similar targets. If desired, the end user can also use a mission-specific view of the Atlas. The mission-specific views rely on the same code base. This application is a major improvement over the initial version of the Planetary Image Atlas. It is a multi-mission search engine. This tool includes both basic and advanced search capabilities, providing a product search tool to interrogate the collection of planetary images. This tool lets the end user query information about each image, and ignores the data that the user has no interest in. Users can reduce the number of images to look at by defining an area of interest with latitude and longitude ranges.
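
The area-of-interest query described above reduces to filtering records by latitude/longitude ranges. A minimal sketch with invented image records (the real Atlas does this server-side against millions of rows):

```python
# Invented image metadata records standing in for Atlas search results.
images = [
    {"id": "PIA001", "lat": 4.5,   "lon": 137.4, "mission": "MRO"},
    {"id": "PIA002", "lat": -14.6, "lon": 175.5, "mission": "Cassini"},
]

def in_aoi(rec, lat_min, lat_max, lon_min, lon_max):
    """True if the record's footprint centre lies inside the AOI box."""
    return (lat_min <= rec["lat"] <= lat_max
            and lon_min <= rec["lon"] <= lon_max)

hits = [r["id"] for r in images if in_aoi(r, 0, 10, 130, 140)]
```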

  11. Geospatial relationship of road traffic crashes and healthcare facilities with trauma surgical capabilities in Nairobi, Kenya: defining gaps in coverage.

    PubMed

    Shaw, Brian I; Wangara, Ali Akida; Wambua, Gladys Mbatha; Kiruja, Jason; Dicker, Rochelle A; Mweu, Judith Mutindi; Juillard, Catherine

    2017-01-01

    Road traffic injuries (RTIs) are a cause of significant morbidity and mortality in low- and middle-income countries. Access to timely emergency services is needed to decrease the morbidity and mortality of RTIs and other traumatic injuries. Our objective was to describe the distribution of road traffic crashes (RTCs) in Nairobi with the relative distance and travel times for victims of RTCs to health facilities with trauma surgical capabilities. RTCs in Nairobi County were recorded by the Ma3route app from May 2015 to October 2015, with latitude and longitude coordinates for each RTC extracted using geocoding. Health facility administrators were interviewed to determine the surgical capacity of their facilities. RTCs and health facilities were plotted on maps using ArcGIS. Distances and travel times between RTCs and health facilities were determined using the Google Maps Distance Matrix API. 89 percent (25/28) of health facilities meeting inclusion criteria were evaluated. Overall, health facilities were well equipped for trauma surgery, with 96% meeting WHO Minimal Safety Criteria. 76 percent of facilities performed more than 12 pre-selected procedures, including the three 'Bellwether Procedures' shown to correlate with surgical capability. The average travel time and distance from RTCs to the nearest health facilities surveyed were 7 min and 3.4 km, respectively. This increased to 18 min and 9.6 km if all RTC victims were transported to Kenyatta National Hospital (KNH). Almost all hospitals surveyed in the present study have the ability to care for trauma patients. Treating patients directly at these facilities would decrease travel time compared with transfer to KNH. Nairobi County could benefit from formally coordinating the triage of trauma patients to more facilities to decrease travel time and potentially improve patient outcomes. Level of evidence: III.
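
The study obtained road distances and travel times from the Google Maps Distance Matrix API. As a related, offline computation, the haversine formula below gives the straight-line (great-circle) distance between a crash and a facility, which is a lower bound on the road distance the API returns.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS84 points,
    assuming a spherical Earth of radius 6371 km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```

For triage analysis, each crash would be paired with the facility minimizing this distance before refining the ranking with API-derived road travel times.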

  12. Oceanographic data at your fingertips: the SOCIB App for smartphones

    NASA Astrophysics Data System (ADS)

    Lora, Sebastian; Sebastian, Kristian; Troupin, Charles; Pau Beltran, Joan; Frontera, Biel; Gómara, Sonia; Tintoré, Joaquín

    2015-04-01

    The Balearic Islands Coastal Ocean Observing and Forecasting System (SOCIB, http://www.socib.es) is a multi-platform Marine Research Infrastructure that generates data from the nearshore to the open sea in the Western Mediterranean. In line with SOCIB's principles of discoverable, freely available and standardized data, an application (App) for smartphones has been designed with the objective of providing easy access to all the data managed by SOCIB in real time: underwater gliders, drifters, profiling buoys, research vessel, HF radar and numerical model outputs (hydrodynamics and waves). The Data Centre, responsible for the acquisition, processing and visualisation of all SOCIB data, developed a REpresentational State Transfer (REST) application programming interface (API) called "DataDiscovery" (http://apps.socib.es/DataDiscovery/). This API is made up of RESTful web services that provide information on platforms, instruments and instrument deployments, as well as the data themselves. In this way, it is possible to integrate SOCIB data in third-party applications, developed either by the Data Centre or externally. A single point of data distribution not only allows efficient management but also simplifies the concepts and data access for external developers, who are not necessarily familiar with the tools related to oceanographic or atmospheric data. The SOCIB App for Android (https://play.google.com/store/apps/details?id=com.socib) uses this API as a data backend, so that it is straightforward to manage which information the application shows without having to modify and re-upload it. The only pieces of information that do not depend on the services are the App "Sections" and "Screens", but the content displayed in each of them is obtained through requests to the web services.
The API is not used only for the smartphone app: presently, most of SOCIB's applications for data visualisation and access rely on it, for instance the corporate web site, the deployment application (Dapp, http://apps.socib.es/dapp/), and Sea Boards (http://seaboard.socib.es/).
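
Consuming such a RESTful deployments listing reduces to fetching and parsing JSON. The sketch below parses an invented payload shape; the field names are assumptions for illustration, not the actual DataDiscovery response format.

```python
import json

# Invented payload shaped like a deployments listing from a REST API.
payload = json.loads("""
[
  {"platform": "glider-sdeep01", "instrument": "CTD",  "active": true},
  {"platform": "buoy-bahia",     "instrument": "ADCP", "active": false}
]
""")

# A client (smartphone app, dashboard, script) filters the same single
# source of truth rather than maintaining its own copy of the catalogue.
active = [d["platform"] for d in payload if d["active"]]
```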

  13. GeolOkit 1.0: a new Open Source, Cross-Platform software for geological data visualization in Google Earth environment

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Antoine; Bastin, Christophe; Watlet, Arnaud

    2016-04-01

    GIS software suites are today's essential tools to gather and visualise geological data, to apply spatial and temporal analysis and, in fine, to create and share interactive maps for further geosciences investigations. For these purposes, we developed GeolOkit: an open-source, freeware and lightweight software package written in Python, a high-level, cross-platform programming language. GeolOkit is accessible through a graphical user interface designed to run in parallel with Google Earth. It is a super user-friendly toolbox that allows 'geo-users' to import their raw data (e.g. GPS, sample locations, structural data, field pictures, maps), to use fast data analysis tools and to plot these into the Google Earth environment using KML code. This workflow requires no third-party software except Google Earth itself. GeolOkit comes with a large number of geosciences labels, symbols, colours and placemarks and may process: (i) multi-point data, (ii) contours via several interpolation methods, (iii) discrete planar and linear structural data in 2D or 3D, supporting a large range of structure input formats, (iv) clustered stereonets and rose diagrams, (v) drawn cross-sections as vertical sections, (vi) georeferenced maps and vectors, and (vii) field pictures using either geo-tagging metadata from a camera's built-in GPS module or the same-day track of an external GPS. We invite you to discover all the functionalities of the GeolOkit software. As this project is under development, we welcome discussions regarding your needs, ideas and contributions to the GeolOkit project.
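
One of the plots listed above, the rose diagram, reduces to binning azimuth measurements into angular sectors before drawing. A minimal sketch with invented strike readings (this is an illustration of the technique, not GeolOkit's actual code):

```python
def rose_bins(azimuths_deg, bin_width=30):
    """Count azimuths (degrees, 0-360) into sectors of bin_width degrees,
    returning one count per sector starting at north (0 deg)."""
    n = 360 // bin_width
    counts = [0] * n
    for a in azimuths_deg:
        counts[int(a % 360) // bin_width] += 1
    return counts

# Invented strike measurements from a field campaign.
counts = rose_bins([10, 15, 95, 100, 350], bin_width=30)
```

Each sector count then becomes the radius of one petal when the diagram is rendered.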

  14. Google Sky: A Digital View of the Night Sky

    NASA Astrophysics Data System (ADS)

    Connolly, A.; Scranton, R.; Ornduff, T.

    2008-11-01

    From its inception, astronomy has been a visual science: from careful observations of the sky using the naked eye, to the use of telescopes and photographs to map the distribution of stars and galaxies, to the current era of digital cameras that can image the sky over many decades of the electromagnetic spectrum. Sky in Google Earth (http://earth.google.com) and Google Sky (http://www.google.com/sky) continue this tradition, providing an intuitive visual interface to some of the largest astronomical imaging surveys of the sky. Streaming multi-color imagery, catalogs and time domain data, as well as annotating interesting astronomical sources and events with placemarks, podcasts and videos, Sky provides a panchromatic view of the universe accessible to anyone with a computer. Beyond simple exploration of the sky, Google Sky enables users to create and share content with others around the world. With an open interface available on Linux, Mac OS X and Windows, and translations of the content into over 20 different languages, we present Sky as the embodiment of a virtual telescope for discovering and sharing the excitement of astronomy and science as a whole.

  15. Utilizing HDF4 File Content Maps for the Cloud

    NASA Technical Reports Server (NTRS)

    Lee, Hyokyung Joe

    2016-01-01

    We demonstrate a prototype study showing that an HDF4 file content map can be used to organize data efficiently in a cloud object storage system to facilitate cloud computing. This approach can be extended to any binary data format and to any existing big-data analytics solution powered by cloud computing, because the HDF4 file content map project, which started as a long-term preservation effort for NASA data, doesn't require the HDF4 APIs to access the data.
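
The content-map idea can be sketched in a few lines: a map of byte offsets and lengths lets a client read a single dataset from object storage with one ranged request, without any HDF4 library. The map, object bytes, and dataset names below are invented toy data.

```python
import io

# Invented content map: dataset name -> (byte offset, byte length).
content_map = {"Temperature": (4, 3), "Pressure": (7, 2)}

# An in-memory stand-in for an object stored in a cloud bucket.
obj = io.BytesIO(b"HDR4TMPPR")

def read_dataset(store, cmap, name):
    """Read one dataset's bytes via its content-map entry. Against a
    real object store this would be an HTTP ranged GET
    (Range: bytes=offset-offset+length-1)."""
    offset, length = cmap[name]
    store.seek(offset)
    return store.read(length)

data = read_dataset(obj, content_map, "Temperature")
```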

  16. Genetics Home Reference: Crohn disease

    MedlinePlus

    ... or indirectly, to abnormal inflammation. However, the exact relationship between these factors and Crohn disease risk remains ...

  17. Genetics Home Reference: Cowden syndrome

    MedlinePlus

    ... MS, Eng C. A clinical scoring system for selection of patients for PTEN mutation testing is proposed ...

  18. Genetics Home Reference: Laron syndrome

    MedlinePlus

    ... AL. Obesity, diabetes and cancer: insight into the relationship from a cohort with growth hormone receptor deficiency. ...

  19. Genetics Home Reference: prolidase deficiency

    MedlinePlus

    ... mutations as a tool to investigate structure-function relationship. J Hum Genet. 2004;49(9):500-6. ...

  20. Genetics Home Reference: xeroderma pigmentosum

    MedlinePlus

    ... trichothiodystrophy and Cockayne syndrome: a complex genotype-phenotype relationship. Neuroscience. 2007 Apr 14;145(4):1388-96. ...

  1. Genetics Home Reference: Miyoshi myopathy

    MedlinePlus

    ... Itoyama Y. Dysferlin mutations in Japanese Miyoshi myopathy: relationship to phenotype. Neurology. 2003 Jun 10;60(11): ...

  2. Genetics Home Reference: Lowe syndrome

    MedlinePlus

    ... inheritance is that fathers cannot pass X-linked traits to their sons. In some cases of Lowe ...

  3. Genetics Home Reference: Danon disease

    MedlinePlus

    ... inheritance is that fathers cannot pass X-linked traits to their sons. ...

  4. Genetics Home Reference: frontometaphyseal dysplasia

    MedlinePlus

    ... inheritance is that fathers cannot pass X-linked traits to their sons. ...

  5. Genetics Home Reference: Tourette syndrome

    MedlinePlus

    ... Rasin MR, Gunel M, Davis NR, Ercan-Sencicek AG, Guez DH, Spertus JA, Leckman JF, Dure LS ...

  6. Genetics Home Reference: bradyopsia

    MedlinePlus

    ... 75-8. Vincent A, Robson AG, Holder GE. Pathognomonic (diagnostic) ERGs. A review and ...

  7. Genetics Home Reference: Clouston syndrome

    MedlinePlus

    ... M, Nakamura M, Farooq M, Fujikawa H, Kibbi AG, Ito M, Dahdah M, Matta M, Diab H, ...

  8. Using secure web services to visualize poison center data for nationwide biosurveillance: a case study.

    PubMed

    Savel, Thomas G; Bronstein, Alvin; Duck, William; Rhodes, M Barry; Lee, Brian; Stinn, John; Worthen, Katherine

    2010-01-01

    Real-time surveillance systems are valuable for timely response to public health emergencies. It has been challenging to leverage existing surveillance systems in state and local communities and, within a centralized architecture, to add new data sources and analytical capacity. Because this centralized model has proven difficult to maintain and enhance, the US Centers for Disease Control and Prevention (CDC) has been examining a federated model based on a secure web services architecture, with data stewardship remaining with the data provider. As a case study of this approach, the American Association of Poison Control Centers and the CDC extended an existing data warehouse via a secure web service and shared aggregate clinical-effects and case-count data by geographic region and time period. To visualize these data, CDC developed a web browser-based interface, Quicksilver, which leveraged the Google Maps API and Flot, a JavaScript plotting library. Two iterations of the NPDS web service were completed in 12 weeks; the visualization client, Quicksilver, was developed in four months. This implementation of web services combined with a visualization client represents incremental progress in transitioning national data sources such as BioSense and NPDS to a federated data-exchange model. Quicksilver demonstrates how secure web services, in conjunction with a lightweight, rapidly deployed visualization client, can integrate isolated data sources for biosurveillance.
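    The federated pattern described here, in which only aggregate counts leave the data steward while raw records stay behind, can be sketched in a few lines of Python. This is an illustration only; the field names (`region`, `period`) are invented and do not reflect the actual NPDS schema:

```python
from collections import defaultdict
import json

def aggregate_case_counts(records):
    """Roll raw case records up into counts by (region, period),
    so only de-identified totals are exposed by the web service."""
    counts = defaultdict(int)
    for rec in records:
        counts[(rec["region"], rec["period"])] += 1
    return [
        {"region": region, "period": period, "case_count": n}
        for (region, period), n in sorted(counts.items())
    ]

# Hypothetical records held by the data steward.
records = [
    {"region": "Region IV", "period": "2009-W32"},
    {"region": "Region IV", "period": "2009-W32"},
    {"region": "Region VI", "period": "2009-W33"},
]
# The JSON payload a visualization client such as Quicksilver might fetch.
payload = json.dumps(aggregate_case_counts(records))
```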

  9. Distributed Kernelized Locality-Sensitive Hashing for Faster Image Based Navigation

    DTIC Science & Technology

    2015-03-26

    Facebook, Google, and Yahoo!. Current methods for image retrieval become problematic when implemented on image datasets that can easily reach billions of ... correlations. Tech industry leaders like Facebook, Google, and Yahoo! sort and index even larger volumes of "big data" daily. When attempting to process ... open-source implementation of Google's MapReduce programming paradigm [13], which has been used for many different things. Using Apache Hadoop, Yahoo ...
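    The core idea behind locality-sensitive hashing is that similar vectors should collide in the same hash bucket, so retrieval compares a query only against one bucket instead of the whole dataset. A minimal Python sketch of the random-hyperplane (cosine) variant follows; the kernelized version the report studies generalizes this to kernel feature spaces, which is not shown here:

```python
import random

def lsh_signature(vec, hyperplanes):
    # One bit per random hyperplane: which side of it the vector falls on.
    return tuple(
        int(sum(v * h for v, h in zip(vec, plane)) >= 0)
        for plane in hyperplanes
    )

random.seed(0)
dim, n_bits = 8, 12
hyperplanes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]
database = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(100)]

# Index: hash every database vector into a bucket keyed by its signature.
buckets = {}
for i, vec in enumerate(database):
    buckets.setdefault(lsh_signature(vec, hyperplanes), []).append(i)

# Query: only vectors in the matching bucket are candidate matches.
query = database[42]
candidates = buckets[lsh_signature(query, hyperplanes)]
```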

  10. LLMapReduce: Multi-Lingual Map-Reduce for Supercomputing Environments

    DTIC Science & Technology

    2015-11-20

    1990s. Popularized by Google [36] and Apache Hadoop [37], map-reduce has become a staple technology of the ever-growing big data community ... Lexington, MA, USA. Abstract: The map-reduce parallel programming model has become extremely popular in the big data community. Many big data ... to big data users running on a supercomputer. LLMapReduce dramatically simplifies map-reduce programming by providing simple parallel programming ...
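    The map-reduce model referred to here decomposes a computation into a map phase (emit key-value pairs), a shuffle (group values by key), and a reduce phase (combine each group). A minimal single-process Python sketch of the three phases, using word counting as the example job (real frameworks distribute each phase across workers):

```python
from collections import defaultdict

def map_phase(docs, mapper):
    # Apply the user's mapper to every input record, emitting (key, value) pairs.
    for doc in docs:
        yield from mapper(doc)

def shuffle(pairs):
    # Group all values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    # Apply the user's reducer to each group of values.
    return {key: reducer(values) for key, values in groups.items()}

docs = ["big data big", "map reduce"]
mapper = lambda doc: ((word, 1) for word in doc.split())
counts = reduce_phase(shuffle(map_phase(docs, mapper)), sum)
# counts == {"big": 2, "data": 1, "map": 1, "reduce": 1}
```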

  11. Genetics Home Reference: fish-eye disease

    MedlinePlus

    ... levels of HDL cholesterol and atherosclerosis, a variable relationship--a review of LCAT deficiency. Vasc Health Risk ...

  12. Genetics Home Reference: X-linked juvenile retinoschisis

    MedlinePlus

    ... juvenile retinoschisis (XLRS): a review of genotype-phenotype relationships. Semin Ophthalmol. 2013 Sep-Nov;28(5-6): ...

  13. What Are the Types of Genetic Tests?

    MedlinePlus

    ... or implicate a crime suspect, or establish biological relationships between people (for example, paternity). ...

  14. Genetics Home Reference: Y chromosome infertility

    MedlinePlus

    ... deletions" of the human Y chromosome and their relationship with male infertility. J Genet Genomics. 2008 Apr; ...

  15. Genetics Home Reference: Fukuyama congenital muscular dystrophy

    MedlinePlus

    ... Fujii T, Aiba H, Toda T. Seizure-genotype relationship in Fukuyama-type congenital muscular dystrophy. Brain Dev. ...

  16. Genetics Home Reference: glutaric acidemia type II

    MedlinePlus

    ... E, Bross P, Skovby F, Gregersen N. Clear relationship between ETF/ETFDH genotype and phenotype in patients ...

  17. Genetics Home Reference: complete LCAT deficiency

    MedlinePlus

    ... levels of HDL cholesterol and atherosclerosis, a variable relationship--a review of LCAT deficiency. Vasc Health Risk ...

  18. Genetics Home Reference: neurohypophyseal diabetes insipidus

    MedlinePlus

    ... G, Colao A. Central diabetes insipidus and autoimmunity: relationship between the occurrence of antibodies to arginine vasopressin- ...

  19. Genetics Home Reference: mucopolysaccharidosis type II

    MedlinePlus

    ... inheritance is that fathers cannot pass X-linked traits to their sons. ...

  20. Genetics Home Reference: familial dilated cardiomyopathy

    MedlinePlus

    ... inheritance is that fathers cannot pass X-linked traits to their sons. ...

  1. Genetics Home Reference: phosphoglycerate kinase deficiency

    MedlinePlus

    ... inheritance is that fathers cannot pass X-linked traits to their sons. ...

  2. Genetics Home Reference: Melnick-Needles syndrome

    MedlinePlus

    ... inheritance is that fathers cannot pass X-linked traits to their sons. ...

  3. Genetics Home Reference: encephalocraniocutaneous lipomatosis

    MedlinePlus

    ... PubMed or Free article on PubMed Central Hunter AG. Oculocerebrocutaneous and encephalocraniocutaneous lipomatosis syndromes: blind men and ...

  4. Genetics Home Reference: Dandy-Walker malformation

    MedlinePlus

    ... KA, Lehmann OJ, Hudgins L, Chizhikov VV, Bassuk AG, Ades LC, Krantz ID, Dobyns WB, Millen KJ. ...

  5. Genetics Home Reference: McLeod neuroacanthocytosis syndrome

    MedlinePlus

    ... W, Watt JM, Corbett AJ, Hamdalla HH, Marshall AG, Sutton I, Dotti MT, Malandrini A, Walker RH, ...

  6. Genetics Home Reference: vitiligo

    MedlinePlus

    ... PubMed or Free article on PubMed Central Smith AG, Sturm RA. Multiple genes and locus interactions in ...

  7. Genetics Home Reference: Knobloch syndrome

    MedlinePlus

    ... Fukai N, Eklund L, Marneros AG, Oh SP, Keene DR, Tamarkin L, Niemelä M, ...

  8. Genetics Home Reference: Coffin-Lowry syndrome

    MedlinePlus

    ... 27(1):85-9. Citation on PubMed Hunter AG. Coffin-Lowry syndrome: a 20-year follow-up ...

  9. Genetics Home Reference: essential pentosuria

    MedlinePlus

    ... MacCoss MJ, Levy-Lahad E, King MC, Motulsky AG. Garrod's fourth inborn error of metabolism solved by ...

  10. Genetics Home Reference: Klippel-Trenaunay syndrome

    MedlinePlus

    ... 6):291-8. Review. Citation on PubMed Jacob AG, Driscoll DJ, Shaughnessy WJ, Stanson AW, Clay RP, ...

  11. Genetics Home Reference: age-related macular degeneration

    MedlinePlus

    ... A, Tosakulwong N, Truitt BJ, Tsironi EE, Uitterlinden AG, van Duijn CM, Vijaya L, Vingerling JR, Vithana ...

  12. Genetics Home Reference: pseudohypoaldosteronism type 2

    MedlinePlus

    ... Thameem F, Al-Shahrouri HZ, Radhakrishnan J, Gharavi AG, Goilav B, Lifton RP. Mutations in kelch-like ...

  13. Genetics Home Reference: autosomal dominant vitreoretinochoroidopathy

    MedlinePlus

    ... RE, Davidson AE, Urquhart JE, Holder GE, Robson AG, Moore AT, Keefe RO, Black GC, Manson FD. ...

  14. Genetics Home Reference: spina bifida

    MedlinePlus

    ... PubMed or Free article on PubMed Central Bassuk AG, Kibar Z. Genetic basis of neural tube defects. ...

  15. Genetics Home Reference: Liddle syndrome

    MedlinePlus

    ... Free article on PubMed Central Lifton RP, Gharavi AG, Geller DS. Molecular mechanisms of human hypertension. Cell. ...

  16. Genetics Home Reference: actin-accumulation myopathy

    MedlinePlus

    ... F, Sewry C, Hughes I, Sutphen R, Lacson AG, Swoboda KJ, Vigneron J, Wallgren-Pettersson C, Beggs ...

  17. Genetics Home Reference: anencephaly

    MedlinePlus

    ... PubMed or Free article on PubMed Central Bassuk AG, Kibar Z. Genetic basis of neural tube defects. ...

  18. Genetics Home Reference: Dent disease

    MedlinePlus

    ... 20. Review. Citation on PubMed Wrong OM, Norden AG, Feest TG. Dent's disease; a familial proximal renal ...

  19. Crop classification and mapping based on Sentinel missions data in cloud environment

    NASA Astrophysics Data System (ADS)

    Lavreniuk, M. S.; Kussul, N.; Shelestov, A.; Vasiliev, V.

    2017-12-01

    Availability of high-resolution satellite imagery (Sentinel-1/2/3, Landsat) over large territories opens new opportunities in agricultural monitoring. In particular, it becomes feasible to solve crop classification and crop mapping tasks at country and regional scale using time series of heterogeneous satellite imagery. But in this case we face the problem of Big Data: dealing with time series of high-resolution (10 m) multispectral imagery, we need to download huge volumes of data and then process them. The solution is to move the processing chain closer to the data itself to drastically shorten data-transfer time. A further advantage of this approach is the possibility of parallelizing the data-processing workflow and efficiently implementing machine learning algorithms. This can be done with a cloud platform where Sentinel imagery is stored. In this study, we investigate the usability and efficiency of two different cloud platforms, Amazon and Google, for crop classification and crop mapping problems. Two pilot areas were investigated: Ukraine and England. Google provides the user-friendly Google Earth Engine environment for Earth observation applications, with many data processing and machine learning tools already deployed. With Amazon, on the other hand, one gets much more flexibility in implementing one's own workflow. A detailed analysis of pros and cons will be given in the presentation.

  20. SECURE INTERNET OF THINGS-BASED CLOUD FRAMEWORK TO CONTROL ZIKA VIRUS OUTBREAK.

    PubMed

    Sareen, Sanjay; Sood, Sandeep K; Gupta, Sunil Kumar

    2017-01-01

    Zika virus (ZikaV) is currently one of the most important emerging viruses in the world; it has caused outbreaks and epidemics and has also been associated with severe clinical manifestations and congenital malformations. Traditional approaches to combat the ZikaV outbreak are not effective for detection and control. The aim of this study is to propose a cloud-based system to prevent and control the spread of Zika virus disease by integrating mobile phones and the Internet of Things (IoT). A Naive Bayesian Network (NBN) is used to diagnose possibly infected users, and the Google Maps Web service is used to provide geographic positioning system (GPS)-based risk assessment to prevent the outbreak. The system represents each ZikaV-infected user, mosquito-dense site, and breeding site on a Google map, which helps government healthcare authorities control such risk-prone areas effectively and efficiently. The performance and accuracy of the proposed system are evaluated using a dataset of 2 million users. Our system provides high accuracy for initial diagnosis of different users according to their symptoms, together with appropriate GPS-based risk assessment. The proposed cloud-based system contributed to accurate NBN-based classification of infected users and accurate identification of risk-prone areas using Google Maps.
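    As a toy illustration of the classification step, the snippet below trains a naive Bayes classifier over binary symptom features with add-one smoothing. The symptoms, labels, and training data are invented for the example; this is not the paper's model or dataset:

```python
import math
from collections import defaultdict

def train_nb(samples):
    """samples: list of (symptom_set, label). Returns class priors and
    per-class symptom counts for a binary naive Bayes model."""
    label_counts = defaultdict(int)
    feat_counts = defaultdict(lambda: defaultdict(int))
    vocab = set()
    for feats, label in samples:
        label_counts[label] += 1
        for f in feats:
            feat_counts[label][f] += 1
            vocab.add(f)
    return label_counts, feat_counts, vocab, len(samples)

def classify(feats, model):
    # Pick the label maximizing log P(label) + sum of log feature likelihoods.
    label_counts, feat_counts, vocab, n = model
    best, best_lp = None, float("-inf")
    for label, c in label_counts.items():
        lp = math.log(c / n)
        for f in vocab:
            p = (feat_counts[label][f] + 1) / (c + 2)  # add-one smoothing
            lp += math.log(p if f in feats else 1 - p)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

samples = [
    ({"fever", "rash", "joint_pain"}, "possibly_infected"),
    ({"fever", "rash"}, "possibly_infected"),
    ({"headache"}, "uninfected"),
    (set(), "uninfected"),
]
model = train_nb(samples)
prediction = classify({"fever", "rash"}, model)
```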

  1. jmzReader: A Java parser library to process and visualize multiple text and XML-based mass spectrometry data formats.

    PubMed

    Griss, Johannes; Reisinger, Florian; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2012-03-01

    We here present the jmzReader library: a collection of Java application programming interfaces (APIs) to parse the most commonly used peak-list and XML-based mass spectrometry (MS) data formats: DTA, MS2, MGF, PKL, mzXML, mzData, and mzML (based on the already existing API jmzML). The library is optimized to be used in conjunction with mzIdentML, the recently released standard data format for reporting protein and peptide identifications, developed by the HUPO Proteomics Standards Initiative (PSI). mzIdentML files do not contain spectral data but contain references to different kinds of external MS data files. As a key functionality, all parsers implement a common interface that supports the various methods used by mzIdentML to reference external spectra. Thus, when developing software for mzIdentML, programmers no longer have to support multiple MS data file formats but only this one interface. The library (which includes a viewer) is open source and, together with detailed documentation, can be downloaded from http://code.google.com/p/jmzreader/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
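    For readers unfamiliar with these peak-list formats, the sketch below parses a minimal MGF fragment into spectra with parameters and (m/z, intensity) peaks. It is a simplified illustration of the file format only, not the jmzReader API (which is a Java library), and the example spectrum is invented:

```python
def parse_mgf(text):
    """Minimal MGF peak-list parser: yields one dict per spectrum,
    with KEY=VALUE params and (mz, intensity) peak tuples."""
    spectrum = None
    for line in text.splitlines():
        line = line.strip()
        if line == "BEGIN IONS":
            spectrum = {"params": {}, "peaks": []}
        elif line == "END IONS":
            yield spectrum
            spectrum = None
        elif spectrum is not None and "=" in line:
            key, _, value = line.partition("=")
            spectrum["params"][key] = value
        elif spectrum is not None and line:
            mz, intensity = map(float, line.split()[:2])
            spectrum["peaks"].append((mz, intensity))

mgf = """BEGIN IONS
TITLE=demo spectrum
PEPMASS=896.05
140.95 3.0
162.02 8.5
END IONS
"""
spectra = list(parse_mgf(mgf))
```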

  2. Cross-disciplinary Undergraduate Research: A Case Study in Digital Mapping, western Ireland

    NASA Astrophysics Data System (ADS)

    Whitmeyer, S. J.; de Paor, D. G.; Nicoletti, J.; Rivera, M.; Santangelo, B.; Daniels, J.

    2008-12-01

    As digital mapping technology becomes ever more advanced, field geologists spend a greater proportion of time learning digital methods relative to analyzing rocks and structures. To explore potential solutions to the time commitment implicit in learning digital field methods, we paired James Madison University (JMU) geology majors (experienced in traditional field techniques) with Worcester Polytechnic Institute (WPI) engineering students (experienced in computer applications) during a four week summer mapping project in Connemara, western Ireland. The project consisted of approximately equal parts digital field mapping (directed by the geology students), and lab-based map assembly, evaluation and formatting for virtual 3D terrains (directed by the engineering students). Students collected geologic data in the field using ruggedized handheld computers (Trimble GeoExplorer® series) with ArcPAD® software. Lab work initially focused on building geologic maps in ArcGIS® from the digital field data and then progressed to developing Google Earth-based visualizations of field data and maps. Challenges included exporting GIS data, such as locations and attributes, to KML tags for viewing in Google Earth, which we accomplished using a Linux bash script written by one of our engineers - a task outside the comfort zone of the average geology major. We also attempted to expand the scope of Google Earth by using DEMs of present-day geologically-induced landforms as representative models for paleo-geographic reconstructions of the western Ireland field area. As our integrated approach to digital field work progressed, we found that our digital field mapping produced data at a faster rate than could be effectively managed during our allotted time for lab work. This likely reflected the more developed methodology for digital field data collection, as compared with our lab-based attempts to develop new methods for 3D visualization of geologic maps. 
However, this experiment in cross-disciplinary undergraduate research was a big success, with an enthusiastic interchange of expertise between undergraduate geology and engineering students that produced new, cutting-edge methods for visualizing geologic data and maps.
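    The GIS-to-KML conversion the authors mention, turning point locations and attributes into KML tags for viewing in Google Earth, can be sketched as follows. The station names and coordinates are made up for the example, and this Python version merely stands in for the authors' Linux bash script:

```python
import xml.etree.ElementTree as ET

def points_to_kml(points):
    """Build a minimal KML document from (name, lon, lat) field records.
    Note KML orders coordinates as longitude,latitude."""
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    for name, lon, lat in points:
        pm = ET.SubElement(doc, "Placemark")
        ET.SubElement(pm, "name").text = name
        point = ET.SubElement(pm, "Point")
        ET.SubElement(point, "coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

# Hypothetical field stations (lon, lat) in western Ireland.
stations = [("Station 1", -9.75, 53.50), ("Station 2", -9.60, 53.55)]
kml_text = points_to_kml(stations)
```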

  3. What's New in the Ocean in Google Earth and Maps

    NASA Astrophysics Data System (ADS)

    Austin, J.; Sandwell, D. T.

    2014-12-01

    Jenifer Austin, Jamie Adams, Kurt Schwehr, and Brian Sullivan (1); David Sandwell (2); Walter Smith (3); Vicki Ferrini (4); and Barry Eakins (5). 1 Google Inc., 1600 Amphitheatre Parkway, Mountain View, California, USA; 2 University of California-San Diego, Scripps Institution of Oceanography, La Jolla, California, USA; 3 NOAA Laboratory for Satellite Altimetry, College Park, Maryland, USA; 4 Lamont-Doherty, Columbia University; 5 NOAA. More than two-thirds of Earth is covered by oceans. Almost six years after launching an explorable ocean seafloor in Google Earth and Maps, we have updated our global underwater terrain dataset in partnership with Lamont-Doherty at Columbia, the Scripps Institution of Oceanography, and NOAA. With this update to our ocean map, we reveal an additional 2% of the ocean in high resolution, representing two years of work by Columbia and pulling in data from numerous institutions, including the Campeche Escarpment in the Gulf of Mexico in partnership with Charlie Paul at MBARI and the Schmidt Ocean Institute. The Scripps Institution of Oceanography at UCSD has curated 30 years of data from more than 8,000 ship cruises and 135 different institutions to reveal 15 percent of the seafloor at 1 km resolution. In addition, explore new data from an automated pipeline built to make updates to our Ocean Map more scalable, in partnership with NOAA's National Geophysical Data Center (http://www.ngdc.noaa.gov/mgg/bathymetry/) and the University of Colorado CIRES program (http://cires.colorado.edu/index.html).

  4. Vulnerability of the Nigerian coast: An insight into sea level rise owing to climate change and anthropogenic activities

    NASA Astrophysics Data System (ADS)

    Danladi, Iliya Bauchi; Kore, Basiru Mohammed; Gül, Murat

    2017-10-01

    Coastal areas are important regions of the world, as they host huge populations, diverse ecosystems, and natural resources. However, owing to their settings, elevations, and proximity to the sea, climate change (global warming) and human activities are threatening issues. Herein, we report coastline changes and possible future threats related to sea level rise owing to global warming and human activities in the coastal region of Nigeria. Google Earth images, a Digital Elevation Model (DEM), and geological maps were used. Using Google Earth images, coastal changes over the past 43 years, and for the 3 years before and after the construction of breakwaters along Goshen Beach Estate (Lekki), were examined. Additionally, coastline changes along Lekki Phase I from 2013 to 2016 were evaluated. The DEM map was used to delineate elevations of 0-2 m, 2-5 m and 5-10 m asl, which correspond to undifferentiated sands and gravels to clays on the geological map. The Google Earth images revealed remarkable erosion along both Lekki and Lekki Phase I, with the destruction of a lagoon in Lekki Phase I. Based on the DEM map and the geology, elevations of 0-2 m, 2-5 m and 5-10 m asl were interpreted as highly risky, moderately risky and risky, respectively. Considering the factors threatening coastal regions, the erosion and the destruction of the lagoon along the Nigerian coast may be ascribed to sea level rise as a result of global warming and to intense human activities, respectively.
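    The study's elevation-based interpretation maps DEM elevation bands onto risk categories; a direct transcription follows, with boundary handling assumed (the abstract does not state how band edges are assigned):

```python
def risk_class(elevation_m):
    """Classify a coastal elevation (m above sea level) into the study's
    three sea-level-rise risk bands."""
    if 0 <= elevation_m < 2:
        return "highly risky"
    if 2 <= elevation_m < 5:
        return "moderately risky"
    if 5 <= elevation_m <= 10:
        return "risky"
    return "outside mapped bands"
```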

  5. TerraceM: A Matlab® tool to analyze marine terraces from high-resolution topography

    NASA Astrophysics Data System (ADS)

    Jara-Muñoz, Julius; Melnick, Daniel; Strecker, Manfred

    2015-04-01

    Light detection and ranging (LiDAR) high-resolution topographic data sets now enable remote identification of submeter-scale geomorphic features, providing valuable information about the landscape and about geomorphic markers of tectonic deformation such as fault-scarp offsets and fluvial and marine terraces. Recent studies of marine terraces using LiDAR data have demonstrated that these landforms can be readily isolated from other landforms using slope and roughness parameters, allowing regional extents of terrace sequences to be mapped unambiguously. Marine terrace elevations have been used for decades as geodetic benchmarks of Quaternary deformation. Uplift rates may be estimated by locating the shoreline angle, a geomorphic feature correlated with the high-stand position of past sea levels. Precise identification of the shoreline-angle position is therefore an important requirement for obtaining reliable tectonic rates and coherent spatial correlation. To improve our ability to rapidly assess and map shoreline angles at a regional scale we have developed the TerraceM application. TerraceM is a Matlab® tool for estimating the shoreline angle and its associated error from high-resolution topography. For convenience, TerraceM includes a graphical user interface (GUI) linked with the Google Maps® API. The analysis starts by defining swath profiles from a shapefile, created on a GIS platform, oriented orthogonally to the terrace riser. TerraceM functions are included to extract and analyze the swath profiles. Two types of coastal landscape may be analyzed, using different methodologies: staircase sequences of multiple terraces, and rough, rocky coasts. The former are measured by outlining the paleo-cliffs and paleo-platforms, whereas the latter are assessed by picking the elevation of sea-stack tops. In staircase terraces, the shoreline angle is defined by calculating the intersection between first-order interpolations of the maximum topography of the swath profiles. For rocky coasts, the maximum stack peaks within a defined search radius, together with a defined inflection point on the adjacent main cliff, are interpolated to calculate the shoreline angle at the intersection with the cliff. Error estimates are based on the standard deviation of the linear regressions. The geomorphic age of terraces (Kt) can also be calculated from the linear diffusion equation (Hanks et al., 1989), with the best-fitting model found by minimizing the RMS. TerraceM can efficiently process several profiles in a batch-mode run. Results may be exported in various formats, including Google Earth and ArcGIS, and basic statistics are computed automatically. Test runs have been made at Santa Cruz, California, using various topographic data sets and comparing results with published field measurements (Anderson and Menking, 1994). Repeatability was evaluated using multiple test runs made by students in a classroom setting.
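    The staircase-terrace method described above, intersecting linear fits to the paleo-platform and paleo-cliff to locate the shoreline angle, can be sketched with ordinary least squares. The profile points below are synthetic, not TerraceM code or data:

```python
def linear_fit(xs, ys):
    # Ordinary least squares for y = m*x + b.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

def shoreline_angle(platform_pts, cliff_pts):
    """Intersect the paleo-platform and paleo-cliff regression lines;
    the intersection approximates the shoreline angle (distance, elevation)."""
    m1, b1 = linear_fit(*zip(*platform_pts))
    m2, b2 = linear_fit(*zip(*cliff_pts))
    x = (b2 - b1) / (m1 - m2)
    return x, m1 * x + b1

# Synthetic profile: gently sloping platform meeting a steep cliff.
platform = [(0, 10.0), (50, 10.5), (100, 11.0)]
cliff = [(110, 16.0), (120, 21.0), (130, 26.0)]
x, z = shoreline_angle(platform, cliff)
```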

  6. Implementing a geographical information system to assess endemic fluoride areas in Lamphun, Thailand

    PubMed Central

    Theerawasttanasiri, Nonthaphat; Taneepanichskul, Surasak; Pingchai, Wichain; Nimchareon, Yuwaree; Sriwichai, Sangworn

    2018-01-01

    Introduction Many studies have shown that fluoride can cross the placenta and that exposure to high fluoride during pregnancy may result in premature birth and/or low birth weight. Lamphun is one of six provinces in Thailand where natural water fluoride (WF) concentrations >10.0 mg/L were found, and where >50% of households used water with high fluoride levels. Nevertheless, a geographical information system (GIS) and maps of endemic fluoride areas are lacking. We aimed to measure the fluoride level of village water supplies to assess endemic fluoride areas and to present the GIS with maps in Google Maps. Methods A cross-sectional survey was conducted from July 2016 to January 2017. Purposive sampling was used to identify villages of districts with WF >10.0 mg/L in the Mueang Lamphun, Pasang, and Ban Thi districts. Water samples were collected, with the geolocation measured by Smart System Info. Fluoride was analyzed with an ion-selective electrode instrument using a total ionic strength adjustment buffer. WF >0.70 mg/L was used to identify unsafe drinking water and areas with high endemic fluoride levels. Descriptive statistics were used to describe the findings, and MS Excel was used to create the GIS database. Maps were created in Google Earth and presented in Google Maps. Results We found that WF concentrations ranged between 0.10-13.60 mg/L. Forty-four percent (n=439) of samples were at unsafe levels (>0.70 mg/L), and 54% (n=303) of villages and 46% (n=79,807) of households used the unsafe drinking water. Fifty percent (n=26) of subdistricts were classified as endemic fluoride areas. Five subdistricts were endemic fluoride areas, and among those there were two subdistricts in which every household used unsafe drinking water. Conclusion These findings show the distribution of endemic fluoride areas and unsafe drinking water in Lamphun. This is useful for health policy authorities, local governments, and villagers and enables collaboration to resolve these issues. The GIS data are available at https://drive.google.com/open?id=1mi4Pvomf5xHZ1MQjK44pdp2xXFw&usp=sharing. PMID:29398924
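    The safe/unsafe classification used in the study reduces to thresholding each water sample at 0.70 mg/L and aggregating by village; a small sketch with invented village names and readings:

```python
UNSAFE_WF = 0.70  # mg/L threshold above which drinking water is unsafe

def summarize(samples):
    """samples: list of (village, fluoride_mg_per_L). Returns the share of
    unsafe samples and the set of villages with any unsafe source."""
    unsafe = [(v, wf) for v, wf in samples if wf > UNSAFE_WF]
    unsafe_villages = {v for v, _ in unsafe}
    return len(unsafe) / len(samples), unsafe_villages

# Hypothetical readings spanning the study's observed 0.10-13.60 mg/L range.
samples = [("Ban A", 0.10), ("Ban A", 1.20), ("Ban B", 13.60), ("Ban C", 0.55)]
share, villages = summarize(samples)
# share == 0.5; villages == {"Ban A", "Ban B"}
```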

  7. Implementing a geographical information system to assess endemic fluoride areas in Lamphun, Thailand.

    PubMed

    Theerawasttanasiri, Nonthaphat; Taneepanichskul, Surasak; Pingchai, Wichain; Nimchareon, Yuwaree; Sriwichai, Sangworn

    2018-01-01

    Many studies have shown that fluoride can cross the placenta and that exposure to high fluoride during pregnancy may result in premature birth and/or low birth weight. Lamphun is one of six provinces in Thailand where natural water fluoride (WF) concentrations >10.0 mg/L were found, and where >50% of households used water with high fluoride levels. Nevertheless, a geographical information system (GIS) and maps of endemic fluoride areas are lacking. We aimed to measure the fluoride level of village water supplies to assess endemic fluoride areas and to present the GIS with maps in Google Maps. A cross-sectional survey was conducted from July 2016 to January 2017. Purposive sampling was used to identify villages of districts with WF >10.0 mg/L in the Mueang Lamphun, Pasang, and Ban Thi districts. Water samples were collected, with the geolocation measured by Smart System Info. Fluoride was analyzed with an ion-selective electrode instrument using a total ionic strength adjustment buffer. WF >0.70 mg/L was used to identify unsafe drinking water and areas with high endemic fluoride levels. Descriptive statistics were used to describe the findings, and MS Excel was used to create the GIS database. Maps were created in Google Earth and presented in Google Maps. We found that WF concentrations ranged between 0.10-13.60 mg/L. Forty-four percent (n=439) of samples were at unsafe levels (>0.70 mg/L), and 54% (n=303) of villages and 46% (n=79,807) of households used the unsafe drinking water. Fifty percent (n=26) of subdistricts were classified as endemic fluoride areas. Five subdistricts were endemic fluoride areas, and among those there were two subdistricts in which every household used unsafe drinking water. These findings show the distribution of endemic fluoride areas and unsafe drinking water in Lamphun. This is useful for health policy authorities, local governments, and villagers and enables collaboration to resolve these issues. The GIS data are available at https://drive.google.com/open?id=1mi4Pvomf5xHZ1MQjK44pdp2xXFw&usp=sharing.

  8. Genetics Home Reference: tarsal-carpal coalition syndrome

    MedlinePlus

    ... Belmonte JC, Choe S. Structural basis of BMP signalling inhibition by the cystine knot protein Noggin. Nature. 2002 ...

  9. Genetics Home Reference: bare lymphocyte syndrome type I

    MedlinePlus

    ... R. ABC proteins in antigen translocation and viral inhibition. Nat Chem Biol. 2010 Aug;6(8):572- ...

  10. Genetics Home Reference: familial adenomatous polyposis

    MedlinePlus

    ... Järvinen HJ, Peltomäki P. The complex genotype-phenotype relationship in familial adenomatous polyposis. Eur J Gastroenterol Hepatol. ...

  11. Genetics Home Reference: CDKL5 deficiency disorder

    MedlinePlus

    ... Recurrent mutations in the CDKL5 gene: genotype-phenotype relationships. Am J Med Genet A. 2012 Jul;158A( ...

  12. Genetics Home Reference: Ohdo syndrome, Maat-Kievit-Brunner type

    MedlinePlus

    ... inheritance is that fathers cannot pass X-linked traits to their sons. ...

  13. Genetics Home Reference: action myoclonus-renal failure syndrome

    MedlinePlus

    ... Vears DF, Franceschetti S, Canafoglia L, Wallace R, Bassuk AG, Power DA, Tassinari CA, Andermann E, Lehesjoki AE, ...

  14. Genetics Home Reference: cone-rod dystrophy

    MedlinePlus

    ... Sergouniotis PI, McKibbin M, Robson AG, Bolz HJ, De Baere E, Müller PL, Heller ...

  15. Genetics Home Reference: lysosomal acid lipase deficiency

    MedlinePlus

    ... Cegielska J, Whitley CB, Eckert S, Valayannopoulos V, Quinn AG. Clinical Features of Lysosomal Acid Lipase Deficiency. J ...

  16. Genetics Home Reference: scalp-ear-nipple syndrome

    MedlinePlus

    ... Marneros AG, Beck AE, Turner EH, McMillin MJ, Edwards MJ, ...

  17. Genetics Home Reference: hyperphosphatemic familial tumoral calcinosis

    MedlinePlus

    ... Ichikawa S, Baujat G, Seyahi A, Garoufali AG, Imel EA, Padgett LR, Austin AM, Sorenson AH, ...

  18. Genetics Home Reference: SYNGAP1-related intellectual disability

    MedlinePlus

    ... TK, Ozkan ED, Shi Y, Reish NJ, Almonte AG, Miller BH, Wiltgen BJ, Miller CA, Xu X, ...

  19. Genetics Home Reference: leptin receptor deficiency

    MedlinePlus

    ... Ferraz-Amaro I, Dattani MT, Ercan O, Myhre AG, Retterstol L, Stanhope R, Edge JA, McKenzie S, Lessan ...

  20. Genetics Home Reference: complement factor I deficiency

    MedlinePlus

    ... F, Zelazko M, Marquart H, Muller K, Sjöholm AG, Truedsson L, Villoutreix BO, Blom AM. Genetic, molecular ...

  1. Genetics Home Reference: proximal 18q deletion syndrome

    MedlinePlus

    ... Feenstra I, Vissers LE, Orsel M, van Kessel AG, Brunner HG, Veltman JA, van Ravenswaaij-Arts CM. ...

  2. Genetics Home Reference: spastic paraplegia type 4

    MedlinePlus

    ... 2005 Jul 7. Review. Yip AG, Dürr A, Marchuk DA, Ashley-Koch A, Hentati ...

  3. Moonshot Laboratories' Lava Relief Google Mapping Project

    NASA Astrophysics Data System (ADS)

    Brennan, B.; Tomita, M.

    2016-12-01

    The Moonshot Laboratories were conceived at the University Laboratory School (ULS) on Oahu, Hawaii as a way to develop creative problem solvers able to resourcefully apply 21st century technologies to the problems and needs of their communities. One example involved students from ULS using modern mapping and imaging technologies to assist peers who had been displaced from their own school in Pahoa on the Big Island of Hawaii. During 2015, lava flows from the eruption of Kilauea Volcano were slowly encroaching into the district of Puna. The lava flow was cutting the main town of Pahoa in half, leaving no safe routes of passage into or out of the town. One elementary school in the path of the flow was closed entirely, and a new one was erected north of the flow for students living on that side. Pahoa High School students and teachers living to the north had been forced to leave their school and transfer to Kea'au High School. These students were separated from friends, family, and the community they grew up in, and were thrust into a foreign environment that until then had been their local rival. Using Google Mapping technologies, Moonshot Laboratories students created a dynamic map to introduce the incoming Pahoa students to their new school in Kea'au. Elements included a stylized My Maps basemap, YouTube video descriptions of the buildings, first-person videos recorded with Google Glass, and immersive images of classrooms created with 360-degree cameras. During the first day of orientation at Kea'au, each of the 200 Pahoa students was given a tablet to view the map as they toured and got to know their new campus. The methods, technologies, and, more importantly, the innovative thinking used to create this map have enormous potential for educating all students about the world around us and the issues facing it. http://www.moonshotincubator.com/

  4. Visualizing Mars data and imagery with Google Earth

    NASA Astrophysics Data System (ADS)

    Beyer, R. A.; Broxton, M.; Gorelick, N.; Hancher, M.; Lundy, M.; Kolb, E.; Moratto, Z.; Nefian, A.; Scharff, T.; Weiss-Malik, M.

    2009-12-01

    There is a vast store of planetary geospatial data that has been collected by NASA but is difficult to access and visualize. Virtual globes have revolutionized the way we visualize and understand the Earth, but other planetary bodies including Mars and the Moon can be visualized in similar ways. Extraterrestrial virtual globes are poised to revolutionize planetary science, bring an exciting new dimension to science education, and allow ordinary users to explore imagery being sent back to Earth by planetary science satellites. The original Google Mars Web site allowed users to view base maps of Mars via the Web, but it did not have the full features of the 3D Google Earth client. We have previously demonstrated the use of Google Earth to display Mars imagery, but now with the launch of Mars in Google Earth, there is a base set of Mars data available for anyone to work from and add to. There are a variety of global maps to choose from and display. The Terrain layer has the MOLA gridded data topography, and where available, HRSC terrain models are mosaicked into the topography. In some locations there is also meter-scale terrain derived from HiRISE stereo imagery. There is rich information in the form of the IAU nomenclature database, data for the rovers and landers on the surface, and a Spacecraft Imagery layer which contains the image outlines for all HiRISE, CTX, CRISM, HRSC, and MOC image data released to the PDS and links back to their science data. There are also features like the Traveler's Guide to Mars, Historic Maps, Guided Tours, as well as the 'Live from Mars' feature, which shows the orbital tracks of both the Mars Odyssey and Mars Reconnaissance Orbiter for a few days in the recent past. It shows where they have acquired imagery, and also some preview image data. 
These capabilities have obvious public outreach and education benefits, but the potential benefits of allowing planetary scientists to rapidly explore these large and varied data collections—in geological context and within a single user interface—are also becoming evident. Because anyone can produce additional KML content for use in Google Earth, scientists can customize the environment to their needs as well as publish their own processed data and results for others to use. Many scientists and organizations have begun to do this already, resulting in a useful and growing collection of planetary-science-oriented Google Earth layers.
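
Since the abstract notes that anyone can author KML layers for Google Earth, here is a minimal sketch of one: a Placemark with a polygon footprint (of the kind used for image outlines) and a description link. The corner coordinates and URL are placeholders, not real PDS metadata.

```python
# Sketch: build a minimal KML Placemark with a polygon footprint.
# Corner coordinates and the link below are invented examples.

def footprint_kml(name, corners, link):
    """corners: list of (lon, lat) tuples; the ring is closed automatically."""
    ring = " ".join(f"{lon},{lat},0" for lon, lat in corners + corners[:1])
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        "<Placemark>\n"
        f"  <name>{name}</name>\n"
        f'  <description><![CDATA[<a href="{link}">source data</a>]]></description>\n'
        "  <Polygon><outerBoundaryIs><LinearRing>\n"
        f"    <coordinates>{ring}</coordinates>\n"
        "  </LinearRing></outerBoundaryIs></Polygon>\n"
        "</Placemark>\n</kml>"
    )

kml = footprint_kml(
    "Example image footprint",
    [(137.4, -4.6), (137.5, -4.6), (137.5, -4.5), (137.4, -4.5)],
    "https://example.org/pds/product",  # placeholder link
)
```

A file like this, saved with a .kml extension, can be opened directly in Google Earth as a custom layer.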

  5. A campus-based course in field geology

    NASA Astrophysics Data System (ADS)

    Richard, G. A.; Hanson, G. N.

    2009-12-01

    GEO 305: Field Geology offers students practical experience in the field and in the computer laboratory conducting geological field studies on the Stony Brook University campus. Computer laboratory exercises feature mapping techniques and field studies of glacial and environmental geology, and include geophysical and hydrological analysis, interpretation, and mapping. Participants learn to use direct measurement and mathematical techniques to compute the location and geometry of features and gain practical experience in representing raster imagery and vector geographic data as features on maps. Data collecting techniques in the field include the use of hand-held GPS devices, compasses, ground-penetrating radar, tape measures, pacing, and leveling devices. Assignments that utilize these skills and techniques include mapping campus geology with GPS, using Google Earth to explore our geologic context, data file management and ArcGIS, tape and compass mapping of woodland trails, pace and compass mapping of woodland trails, measuring elevation differences on a hillside, measuring geologic sections and cores, drilling through glacial deposits, using ground penetrating radar on glaciotectonic topography, mapping the local water table, and the identification and mapping of boulders. Two three-hour sessions are offered per week, apportioned as needed between lecture; discussion; guided hands-on instruction in geospatial and other software such as ArcGIS, Google Earth, spreadsheets, and custom modules such as an arc intersection calculator; outdoor data collection and mapping; and writing of illustrated reports.

  6. Application and API for Real-time Visualization of Ground-motions and Tsunami

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Kunugi, T.; Suzuki, W.; Kubo, T.; Nakamura, H.; Azuma, H.; Fujiwara, H.

    2015-12-01

    Due to recent progress in seismographs and communication environments, real-time, continuous ground-motion observation has become technically and economically feasible. K-NET and KiK-net, the nationwide strong-motion networks operated by NIED, cover all of Japan with about 1,750 stations in total. More than half of the stations transmit ground-motion indexes and/or waveform data every second. Traditionally, strong-motion data were recorded by event-triggered instruments over non-continuous telephone lines connected only after an earthquake. Though the data from such networks mainly contribute to preparations for future earthquakes, the huge amount of real-time data from a dense network is expected to contribute directly to the mitigation of ongoing earthquake disasters, e.g., by automatically shutting down plants and by helping decision-making for the initial response. By generating distribution maps of these indexes and uploading them to a website, we implemented the real-time ground-motion monitoring system, Kyoshin ('strong motion' in Japanese) monitor. This web service (www.kyoshin.bosai.go.jp) started in 2008, and anyone can grasp the current ground motions of Japan. This service, however, provides only ground-motion maps in GIF format; to take full advantage of real-time strong-motion data for mitigating ongoing disasters, digital data are important. We have developed a WebAPI to provide real-time data and related information, such as ground motions (5 km mesh) and arrival times estimated from EEW (earthquake early warning). All response data from this WebAPI are in JSON format and are easy to parse. We have also developed a Kyoshin monitor application for smartphones, 'Kmoni view', using the API. In this application, ground motions estimated from EEW are overlaid on the map together with the observed one-second-interval indexes. The application can play back previous earthquakes for demonstration or disaster drills.
In a mobile environment, data traffic and battery life are limited, and it is not practical to regularly visualize all the data. The application therefore has an automatic start-up (pop-up) function triggered by EEW. A similar WebAPI and application for tsunami are being prepared using the pressure data recorded by the dense offshore observation network (S-net), which is under construction along the Japan Trench.
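
Since the WebAPI described above returns JSON, a client can consume it with a few lines of parsing. The response shape below is purely illustrative (the field and station names are made up, not the actual NIED schema):

```python
import json

# Sketch: parse a hypothetical JSON response carrying per-station
# ground-motion indexes, and pick out stations above an intensity threshold.
# Field names and values are illustrative only.

response_text = json.dumps({
    "origin_time": "2015-05-13T06:12:00+09:00",
    "stations": [
        {"code": "MYG004", "intensity": 5.1, "pga_gal": 240.0},
        {"code": "IWT010", "intensity": 3.2, "pga_gal": 35.5},
    ],
})

data = json.loads(response_text)
strong = [s["code"] for s in data["stations"] if s["intensity"] >= 4.0]
```

In a real client, `response_text` would come from an HTTP request to the WebAPI endpoint rather than being constructed locally.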

  7. Raman spectroscopy as a PAT for pharmaceutical blending: Advantages and disadvantages.

    PubMed

    Riolo, Daniela; Piazza, Alessandro; Cottini, Ciro; Serafini, Margherita; Lutero, Emilio; Cuoghi, Erika; Gasparini, Lorena; Botturi, Debora; Marino, Iari Gabriel; Aliatis, Irene; Bersani, Danilo; Lottici, Pier Paolo

    2018-02-05

    Raman spectroscopy has been positively evaluated as a tool for the in-line and real-time monitoring of powder blending processes and has proved effective in determining the endpoint of mixing, showing its potential role as a process analytical technology (PAT). The aim of this study is to show the advantages and disadvantages of Raman spectroscopy with respect to traditional HPLC analysis. The spectroscopic results, obtained directly on raw powders sampled from a two-axis blender in real-case conditions, were compared with the chromatographic data obtained on the same samples. The formulation blend used for the experiment consists of an active pharmaceutical ingredient (API, concentrations 6.0% and 0.5%), lactose, and magnesium stearate (as excipients). The first step of the monitoring process was selecting the appropriate wavenumber region where the Raman signal of the API is maximal and interference from the spectral features of the excipients is minimal. Blend profiles were created by plotting the area ratios of the Raman peak of the API (A_API) at 1598 cm⁻¹ and the Raman bands of the excipients (A_EXC), in the spectral range between 1560 and 1630 cm⁻¹, as a function of mixing time: the API content can be considered homogeneous when the time-dependent dispersion of the area ratio is minimized. In order to achieve representative sampling with Raman spectroscopy, each sample was mapped in a motorized XY stage by a defocused laser beam of a micro-Raman apparatus. Good correlation between the two techniques was found only for the composition at 6.0% (w/w). However, standard deviation analysis, applied to both the HPLC and Raman data, showed that the Raman results are more substantial than the HPLC ones, since Raman spectroscopy enables generating data-rich blend profiles. In addition, the relative standard deviation calculated from a single map (30 points) turned out to be representative of the degree of homogeneity for that blend time.
Copyright © 2017 Elsevier B.V. All rights reserved.
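
The blend-profile arithmetic described above reduces to integrating band areas and tracking the spread of the A_API/A_EXC ratio across map points. A minimal sketch with synthetic toy values (not measured spectra):

```python
import math

# Sketch: trapezoidal band-area integration and relative standard deviation
# (RSD) of area ratios across a Raman map. All numbers are synthetic.

def trapz(ys, xs):
    """Trapezoidal-rule area under ys sampled at xs."""
    return sum((xs[i + 1] - xs[i]) * (ys[i + 1] + ys[i]) / 2.0
               for i in range(len(xs) - 1))

def rsd(values):
    """Relative standard deviation in percent (sample std / mean)."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return 100.0 * math.sqrt(var) / mean

# toy A_API/A_EXC ratios from a 5-point map; a well-mixed blend gives small RSD
ratios = [0.118, 0.121, 0.119, 0.122, 0.120]
homogeneity_rsd = rsd(ratios)
```

In practice each ratio would come from two `trapz` calls over the 1560-1630 cm⁻¹ window of a measured spectrum, and the endpoint of mixing is reached when the RSD stops decreasing.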

  8. EpiCollect: linking smartphones to web applications for epidemiology, ecology and community data collection.

    PubMed

    Aanensen, David M; Huntley, Derek M; Feil, Edward J; al-Own, Fada'a; Spratt, Brian G

    2009-09-16

    Epidemiologists and ecologists often collect data in the field and, on returning to their laboratory, enter their data into a database for further analysis. The recent introduction of mobile phones that utilise the open source Android operating system, and which include (among other features) both GPS and Google Maps, provides new opportunities for developing mobile phone applications which, in conjunction with web applications, allow two-way communication between field workers and their project databases. Here we describe a generic framework consisting of mobile phone software, EpiCollect, and a web application located within www.spatialepidemiology.net. Data collected by multiple field workers can be submitted by phone, together with GPS data, to a common web database and can be displayed and analysed, along with previously collected data, using Google Maps (or Google Earth). Similarly, data from the web database can be requested and displayed on the mobile phone, again using Google Maps. Data filtering options allow the display of data submitted by individual field workers or, for example, those data within certain values of a measured variable or a time period. Data collection frameworks utilising mobile phones with data submission to and from central databases are widely applicable and can give a field worker similar display and analysis tools on their mobile phone to those they would have when viewing the data in their laboratory via the web. We demonstrate their utility for epidemiological data collection and display, and briefly discuss their application in ecological and community data collection. Furthermore, such frameworks offer great potential for recruiting 'citizen scientists' to contribute data easily to central databases through their mobile phones.
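
The record structure and filtering described above can be sketched in a few lines. The field names (`worker`, `titre`, etc.) and values are illustrative, not EpiCollect's actual schema:

```python
# Sketch: field records of the kind such systems submit (a measurement plus
# a GPS fix and a date), with server-side filtering by worker, value range,
# and time window. All names and values are hypothetical.

records = [
    {"worker": "fw01", "lat": 51.49, "lon": -0.17, "date": "2009-06-02", "titre": 8},
    {"worker": "fw02", "lat": 51.52, "lon": -0.10, "date": "2009-07-15", "titre": 32},
    {"worker": "fw01", "lat": 51.50, "lon": -0.12, "date": "2009-07-20", "titre": 64},
]

def filter_records(recs, worker=None, min_value=None, since=None):
    out = recs
    if worker is not None:
        out = [r for r in out if r["worker"] == worker]
    if min_value is not None:
        out = [r for r in out if r["titre"] >= min_value]
    if since is not None:
        out = [r for r in out if r["date"] >= since]  # ISO dates sort lexically
    return out

july_high = filter_records(records, min_value=30, since="2009-07-01")
```

Each filtered record carries its own lat/lon, so the result set maps directly onto Google Maps markers.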

  9. Exploring Google Earth Engine platform for big data processing: classification of multi-temporal satellite imagery for crop mapping

    NASA Astrophysics Data System (ADS)

    Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii

    2017-02-01

    Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large-scale crop mapping requires processing and management of a large amount of heterogeneous satellite imagery acquired by various sensors, which consequently leads to a “Big Data” problem. The main objective of this study is to explore the efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery, with the potential to apply the platform at a larger scale (e.g. country level) and to multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high-resolution (30 m) crop classification map for a large territory (~28,100 km² and 1.0 M ha of cropland). Though this study does not involve large volumes of data, it does address the efficiency of the GEE platform in executing the complex workflows of satellite data processing required for large-scale applications such as crop mapping. The study discusses strengths and weaknesses of the classifiers, assesses the accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to a benchmark classifier using a neural network approach developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (North of Ukraine) in 2013. We found that Google Earth Engine (GEE) provides very good performance in terms of enabling access to remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed the support vector machine (SVM), decision tree, and random forest classifiers available in GEE.
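
Comparing classifiers as the study does comes down to accuracy metrics derived from a confusion matrix. A minimal sketch (the matrix values are made up, not the study's results):

```python
# Sketch: overall accuracy and Cohen's kappa from a confusion matrix
# (rows = reference classes, columns = predicted classes). Toy values.

def overall_accuracy(cm):
    total = sum(sum(row) for row in cm)
    correct = sum(cm[i][i] for i in range(len(cm)))
    return correct / total

def kappa(cm):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    total = sum(sum(row) for row in cm)
    po = overall_accuracy(cm)
    pe = sum(sum(cm[i]) * sum(row[i] for row in cm)
             for i in range(len(cm))) / total ** 2
    return (po - pe) / (1 - pe)

cm = [  # 3 toy classes, e.g. wheat / maize / other
    [90, 5, 5],
    [10, 80, 10],
    [5, 5, 90],
]
acc = overall_accuracy(cm)
```

The same metrics, computed per classifier on a common validation set, make results like "the neural network outperformed SVM, decision tree, and random forest" directly comparable.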

  10. The Proteins API: accessing key integrated protein and genome information

    PubMed Central

    Antunes, Ricardo; Alpi, Emanuele; Gonzales, Leonardo; Liu, Wudong; Luo, Jie; Qi, Guoying; Turner, Edd

    2017-01-01

    Abstract The Proteins API provides searching and programmatic access to protein and associated genomics data, such as curated protein sequence positional annotations from UniProtKB, as well as mapped variation and proteomics data from large-scale data sources (LSS). Using the coordinates service, researchers can retrieve the genomic sequence coordinates for proteins in UniProtKB. Thus, the LSS genomics and proteomics data for UniProt proteins are available programmatically only through this service. A Swagger UI has been implemented to provide documentation and an interface that lets users with little or no programming experience ‘talk’ to the services, quickly and easily formulate queries, and obtain dynamically generated source code for popular programming languages such as Java, Perl, Python and Ruby. Search results are returned as standard JSON, XML or GFF data objects. The Proteins API is a scalable, reliable, fast, easy-to-use RESTful service that provides a broad protein information resource, allowing users to ask questions based upon their field of expertise and to gain an integrated overview of the protein annotations available for proteins in biological processes. The Proteins API is available at http://www.ebi.ac.uk/proteins/api/doc. PMID:28383659
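
As a REST service returning JSON, the API can be used by constructing a URL and parsing the response. The base path below follows the documentation URL given in the abstract; the coordinates-service path and the response fields shown are my assumptions of the shape, trimmed to a tiny illustrative subset:

```python
import json

# Sketch: build a Proteins API request URL for the coordinates service and
# read a (locally constructed) response. The endpoint path and the response
# fields are assumptions for illustration, not the verified schema.

BASE = "https://www.ebi.ac.uk/proteins/api"

def coordinates_url(accession):
    """URL for genomic coordinates of a UniProtKB accession (assumed path)."""
    return f"{BASE}/coordinates/{accession}"

# a trimmed, hypothetical JSON payload of the kind the service might return
sample = json.loads('{"accession": "P04637", "gnCoordinate": []}')
url = coordinates_url(sample["accession"])
```

In real use the JSON would come from an HTTP GET with an `Accept: application/json` header; consult the Swagger UI at the documentation URL for the authoritative endpoints and schema.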

  11. The Proteins API: accessing key integrated protein and genome information.

    PubMed

    Nightingale, Andrew; Antunes, Ricardo; Alpi, Emanuele; Bursteinas, Borisas; Gonzales, Leonardo; Liu, Wudong; Luo, Jie; Qi, Guoying; Turner, Edd; Martin, Maria

    2017-07-03

    The Proteins API provides searching and programmatic access to protein and associated genomics data, such as curated protein sequence positional annotations from UniProtKB, as well as mapped variation and proteomics data from large-scale data sources (LSS). Using the coordinates service, researchers can retrieve the genomic sequence coordinates for proteins in UniProtKB. Thus, the LSS genomics and proteomics data for UniProt proteins are available programmatically only through this service. A Swagger UI has been implemented to provide documentation and an interface that lets users with little or no programming experience 'talk' to the services, quickly and easily formulate queries, and obtain dynamically generated source code for popular programming languages such as Java, Perl, Python and Ruby. Search results are returned as standard JSON, XML or GFF data objects. The Proteins API is a scalable, reliable, fast, easy-to-use RESTful service that provides a broad protein information resource, allowing users to ask questions based upon their field of expertise and to gain an integrated overview of the protein annotations available for proteins in biological processes. The Proteins API is available at http://www.ebi.ac.uk/proteins/api/doc. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  12. Using Google Earth for Submarine Operations at Pavilion Lake

    NASA Astrophysics Data System (ADS)

    Deans, M. C.; Lees, D. S.; Fong, T.; Lim, D. S.

    2009-12-01

    During the July 2009 Pavilion Lake field test, we supported submarine "flight" operations using Google Earth. The Intelligent Robotics Group at NASA Ames has experience with ground data systems for NASA missions, earth analog field tests, disaster response, and the Gigapan camera system. Leveraging this expertise and existing software, we put together a set of tools to support sub tracking and mapping, called the "Surface Data System." This system supports flight planning, real time flight operations, and post-flight analysis. For planning, we make overlays of the regional bedrock geology, sonar bathymetry, and sonar backscatter maps that show geology, depth, and structure of the bottom. Placemarks show the mooring locations for start and end points. Flight plans are shown as polylines with icons for waypoints. Flight tracks and imagery from previous field seasons are embedded in the map for planning follow-on activities. These data provide context for flight planning. During flights, sub position is updated every 5 seconds from the nav computer on the chase boat. We periodically update tracking KML files and refresh them with network links. A sub icon shows current location of the sub. A compass rose shows bearings to indicate heading to the next waypoint. A "Science Stenographer" listens on the voice loop and transcribes significant observations in real time. Observations called up to the surface immediately appear on the map as icons with date, time, position, and what was said. After each flight, the science back room immediately has the flight track and georeferenced notes from the pilots. We add additional information in post-processing. The submarines record video continuously, with "event" timestamps marked by the pilot. We cross-correlate the event timestamps with position logs to geolocate events and put a preview image and compressed video clip into the map. 
Animated flight tracks are also generated, showing timestamped position and providing timelapse playback of the flight. Neogeography tools are increasing in popularity and offer an excellent platform for geoinformatics. The scientists on the team are already familiar with Google Earth, eliminating up-front training on new tools. The flight maps and archived data are available immediately and in a usable format. Google Earth provides lots of measurement tools, annotation tools, and other built-in functions that we can use to create and analyze the map. All of this information is saved to a shared filesystem so that everyone on the team has access to all of the same map data. After the field season, the map data will be used by the team to analyse and correlate information from across the lake and across different flights to support their research, and to plan next year's activities.
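
The "compass rose showing bearings to the next waypoint" mentioned above is a standard forward-azimuth computation. A minimal sketch, with made-up coordinates standing in for a sub fix and a mooring:

```python
import math

# Sketch: initial great-circle bearing from the current position to the next
# waypoint, in degrees clockwise from true north. Coordinates are invented.

def bearing_deg(lat1, lon1, lat2, lon2):
    """Standard forward-azimuth formula on a sphere."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

# hypothetical sub position -> next mooring (points roughly at Pavilion Lake)
hdg = bearing_deg(50.864, -121.737, 50.870, -121.730)
```

Recomputing this every time the 5-second position update arrives is enough to keep a heading indicator current on the map.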

  13. Genetics Home Reference: Pallister-Killian mosaic syndrome

    MedlinePlus


  14. Genetics Home Reference: STING-associated vasculopathy with onset in infancy

    MedlinePlus

    ... Paludan SR, Bowie AG. Immune sensing of DNA. Immunity. 2013 May 23; ...

  15. Genetics Home Reference: myoclonic epilepsy with ragged-red fibers

    MedlinePlus

    ... Rahman S, Poulton J, Marchington DR, Landon DN, Debono AG, Morgan-Hughes JA, Hanna MG. New phenotypic diversity ...

  16. Streets? Where We're Going, We Don't Need Streets

    NASA Astrophysics Data System (ADS)

    Bailey, J.

    2017-12-01

    In 2007, Google Street View started as a project to provide 360-degree imagery along streets, but in the decade since it has evolved into a platform through which to explore everywhere from the slopes of Everest to the middle of the Amazon rainforest to beneath the ocean. As camera technology has evolved, it has also become a tool for ground-truthing maps and has provided scientific observations, storytelling, and education. The Google Street View "special collects" team has undertaken increasingly challenging projects across 80+ countries and every continent, culminating in possibly the most ambitious collection yet: the capture of Street View on board the International Space Station. Learn about the preparation and obstacles behind this and other special collects, and explore these datasets through both Google Earth and Google Expeditions VR, an educational tool for taking students on virtual field trips using 360-degree imagery.

  17. Enabling Mobile Air Quality App Development with an AirNow API

    NASA Astrophysics Data System (ADS)

    Dye, T.; White, J. E.; Ludewig, S. A.; Dickerson, P.; Healy, A. N.; West, J. W.; Prince, L. A.

    2013-12-01

    The U.S. Environmental Protection Agency's (EPA) AirNow program works with over 130 participating state, local, and federal air quality agencies to obtain, quality control, and store real-time air quality observations and forecasts. From these data, the AirNow system generates thousands of maps and products each hour. Each day, information from AirNow is published online and in other media to assist the public in making health-based decisions related to air quality. However, an increasing number of people use mobile devices as their primary tool for obtaining information, and AirNow has responded to this trend by publishing an easy-to-use Web API that is useful for mobile app developers. This presentation will describe the various features of the AirNow application programming interface (API), including Representational State Transfer (REST)-type web services, file outputs, and RSS feeds. In addition, a web portal for the AirNow API will be shown, including documentation on use of the system, a query tool for configuring and running web services, and general information about the air quality data and forecasts available. Data published via the AirNow API includes corresponding Air Quality Index (AQI) levels for each pollutant. We will highlight examples of mobile apps that are using the AirNow API to provide location-based, real-time air quality information. Examples will include mobile apps developed for Minnesota ('Minnesota Air') and Washington, D.C. ('Clean Air Partners Air Quality'), and an app developed by EPA ('EPA AirNow').
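
The AQI levels the API publishes alongside pollutant data come from EPA's piecewise-linear formula AQI = (I_hi − I_lo)/(C_hi − C_lo) · (C − C_lo) + I_lo. A sketch for 24-hour PM2.5; the breakpoints below follow EPA's published AQI tables as I recall them, so verify against current EPA documentation before relying on them:

```python
# Sketch: PM2.5 concentration (ug/m3) -> AQI via linear interpolation within
# the matching breakpoint interval. Breakpoints are quoted from memory of
# EPA's AQI tables and should be checked against the official source.

PM25_BREAKPOINTS = [  # (C_lo, C_hi, I_lo, I_hi)
    (0.0, 12.0, 0, 50),
    (12.1, 35.4, 51, 100),
    (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200),
    (150.5, 250.4, 201, 300),
    (250.5, 500.4, 301, 500),
]

def pm25_aqi(c):
    for c_lo, c_hi, i_lo, i_hi in PM25_BREAKPOINTS:
        if c_lo <= c <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo)
    raise ValueError("concentration out of range")

aqi = pm25_aqi(35.0)
```

A mobile app consuming the AirNow API would normally use the AQI values the API already provides; the formula is shown here to make the published numbers interpretable.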

  18. Sally Ride EarthKAM - Automated Image Geo-Referencing Using Google Earth Web Plug-In

    NASA Technical Reports Server (NTRS)

    Andres, Paul M.; Lazar, Dennis K.; Thames, Robert Q.

    2013-01-01

    Sally Ride EarthKAM is an educational program funded by NASA that aims to provide the public the ability to picture Earth from the perspective of the International Space Station (ISS). A computer-controlled camera is mounted on the ISS in a nadir-pointing window; however, timing limitations in the system cause inaccurate positional metadata. Manually correcting images within an orbit allows the positional metadata to be improved using mathematical regressions. The manual correction process is time-consuming and thus, unfeasible for a large number of images. The standard Google Earth program allows for the importing of KML (keyhole markup language) files that previously were created. These KML file-based overlays could then be manually manipulated as image overlays, saved, and then uploaded to the project server where they are parsed and the metadata in the database is updated. The new interface eliminates the need to save, download, open, re-save, and upload the KML files. Everything is processed on the Web, and all manipulations go directly into the database. Administrators also have the control to discard any single correction that was made and validate a correction. This program streamlines a process that previously required several critical steps and was probably too complex for the average user to complete successfully. The new process is theoretically simple enough for members of the public to make use of and contribute to the success of the Sally Ride EarthKAM project. Using the Google Earth Web plug-in, EarthKAM images, and associated metadata, this software allows users to interactively manipulate an EarthKAM image overlay, and update and improve the associated metadata. The Web interface uses the Google Earth JavaScript API along with PHP-PostgreSQL to present the user the same interface capabilities without leaving the Web. The simpler graphical user interface will allow the public to participate directly and meaningfully with EarthKAM. 
The use of similar techniques is being investigated to place ground-based observations in a Google Mars environment, allowing the MSL (Mars Science Laboratory) Science Team a means to visualize the rover and its environment.
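
The regression idea described above can be sketched simply: within one orbital pass, the manually corrected frames give (timestamp, position) pairs, and a least-squares fit predicts corrected positions for the remaining frames. A linear model over a short arc is a simplification, and the numbers below are invented:

```python
# Sketch: ordinary least squares y = a*t + b fitted to manually corrected
# frames, then used to predict the position of an uncorrected frame.
# Times and latitudes are toy values, not EarthKAM data.

def fit_line(ts, ys):
    """Closed-form OLS slope and intercept."""
    n = len(ts)
    mt = sum(ts) / n
    my = sum(ys) / n
    a = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
         / sum((t - mt) ** 2 for t in ts))
    return a, my - a * mt

# manually corrected frames: seconds into the pass -> ground-track latitude
t_fix = [0.0, 30.0, 60.0, 90.0]
lat_fix = [10.0, 11.8, 13.6, 15.4]

a, b = fit_line(t_fix, lat_fix)
lat_at_45s = a * 45.0 + b  # predicted latitude for an uncorrected frame
```

In the real pipeline the same idea would apply per orbit and per coordinate, with the corrected values written back to the image metadata database.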

  19. Quantifying the effect of media limitations on outbreak data in a global online web-crawling epidemic intelligence system, 2008–2011

    PubMed Central

    Scales, David; Zelenev, Alexei; Brownstein, John S.

    2013-01-01

    Background This is the first study to quantitatively evaluate the effect that media-related limitations have on data from an automated epidemic intelligence system. Methods We modeled time series of HealthMap's two main data feeds, Google News and Moreover, to test for evidence of two potential limitations: first, human resources constraints, and second, high-profile outbreaks “crowding out” coverage of other infectious diseases. Results Google News events declined by 58.3%, 65.9%, and 14.7% on Saturday, Sunday, and Monday, respectively, relative to other weekdays. Events were 27.4% lower during Christmas/New Year's weeks and 33.6% lower during American Thanksgiving week than during an average week for Google News. Moreover data yielded similar results, with the addition of Memorial Day (US) being associated with a 36.2% reduction in events. Other holiday effects were not statistically significant. We found evidence for a crowd-out phenomenon for influenza/H1N1, where a 50% increase in influenza events corresponded with a 4% decline in other disease events for Google News only. Other prominent diseases in this database – avian influenza (H5N1), cholera, and foodborne illness – were not associated with a crowd-out phenomenon. Conclusions These results provide quantitative evidence for the limited impact of editorial biases on HealthMap's web-crawling epidemic intelligence. PMID:24206612
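
Day-of-week effects of the kind reported above can be estimated with a log-linear regression on dummy variables. A minimal sketch on synthetic data follows; the counts, baseline rates, and the least-squares-on-logs shortcut are illustrative and are not HealthMap's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily event counts: a weekday baseline with weekend reductions,
# mimicking the day-of-week effects described in the study.
days = 7 * 52
dow = np.arange(days) % 7  # 0=Mon ... 5=Sat, 6=Sun
base = np.where(dow == 5, 0.42, np.where(dow == 6, 0.34, 1.0)) * 200
counts = rng.poisson(base)

# Log-linear model: log E[count] = b0 + b_sat*Sat + b_sun*Sun,
# fit by ordinary least squares on log counts as a simple approximation.
X = np.column_stack([np.ones(days), (dow == 5).astype(float), (dow == 6).astype(float)])
beta, *_ = np.linalg.lstsq(X, np.log(counts + 1), rcond=None)

# Percent change on Saturdays relative to the weekday baseline.
sat_effect = (np.exp(beta[1]) - 1) * 100
print(f"Saturday effect: {sat_effect:.1f}%")
```

With the synthetic 58% Saturday reduction built in, the recovered effect lands near that value; a production analysis would more likely use Poisson regression on the raw counts.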

  20. Genetics Home Reference: dilated cardiomyopathy with ataxia syndrome

    MedlinePlus


  1. Genetics Home Reference: alveolar capillary dysplasia with misalignment of pulmonary veins

    MedlinePlus


  2. Visualizing Moon Data and Imagery with Google Earth

    NASA Astrophysics Data System (ADS)

    Weiss-Malik, M.; Scharff, T.; Nefian, A.; Moratto, Z.; Kolb, E.; Lundy, M.; Hancher, M.; Gorelick, N.; Broxton, M.; Beyer, R. A.

    2009-12-01

    There is a vast store of planetary geospatial data that has been collected by NASA but is difficult to access and visualize. Virtual globes have revolutionized the way we visualize and understand the Earth, but other planetary bodies including Mars and the Moon can be visualized in similar ways. Extraterrestrial virtual globes are poised to revolutionize planetary science, bring an exciting new dimension to science education, and allow ordinary users to explore imagery being sent back to Earth by planetary science satellites. The original Google Moon Web site was a limited series of maps and Apollo content. The new Moon in Google Earth feature provides a similar virtual planet experience for the Moon as we have for the Earth and Mars. We incorporated existing Clementine and Lunar Orbiter imagery for the basemaps and a combination of Kaguya LALT topography and some terrain created from Apollo Metric and Panoramic images. We also have information about the Apollo landings and other robotic landers on the surface, as well as historic maps and charts, and guided tours. Some of the first-released LROC imagery of the Apollo landing sites has been put in place, and we look forward to incorporating more data as it is released from LRO, Chandrayaan-1, and Kaguya. These capabilities have obvious public outreach and education benefits, but the potential benefits of allowing planetary scientists to rapidly explore these large and varied data collections — in geological context and within a single user interface — are also becoming evident. Because anyone can produce additional KML content for use in Google Earth, scientists can customize the environment to their needs as well as publish their own processed data and results for others to use. Many scientists and organizations have begun to do this already, resulting in a useful and growing collection of planetary-science-oriented Google Earth layers.
Screenshot of Moon in Google Earth, a freely downloadable application for visualizing Moon imagery and data.

  3. Real-time bus location monitoring using Arduino

    NASA Astrophysics Data System (ADS)

    Ibrahim, Mohammad Y. M.; Audah, Lukman

    2017-09-01

    The Internet of Things (IoT) is the network of objects, such as vehicles, mobile devices, and buildings, that have electronic components, software, and network connectivity enabling them to collect data, run commands, and be controlled through the Internet. Controlling physical items over the Internet increases efficiency and saves time, and the growing number of devices used by people increases the practicality of having IoT devices on the market. The IoT is also an opportunity to develop products that save money and time and increase work efficiency. There is a particular need for more efficient real-time bus location systems, especially on university campuses. Such a system can easily find the accurate location of, and distance between, each bus stop and estimate the time to reach the next location. The system is divided into two parts: hardware and software. The hardware parts are the Arduino Uno and a Global Positioning System (GPS) receiver, while Google Earth and GpsGate are the software parts. The GPS receiver continuously takes input data from the satellites, and the latitude and longitude values are stored in the Arduino Uno. To track the vehicle, the longitude and latitude are sent as a message to the Google Earth software, which converts them into maps for navigation. Once the Arduino Uno is activated, it takes the last received latitude and longitude values from GpsGate and sends a message to Google Earth; the current location is then shown, and navigation is activated automatically. The feed is then broadcast to users via ManyCam, Google+ Hangouts, YouTube, and Facebook. As an additional feature, Google Forms is used to collect problems faced by students so that immediate action can be taken by the responsible department. After several successful simulations, the results are shown in real time on a map.
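
The stop-to-stop distance and arrival-time estimates such a tracker needs can be sketched with the haversine great-circle formula; the coordinates and the assumed average bus speed below are hypothetical, chosen only to illustrate the computation.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    R = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def eta_minutes(dist_km, speed_kmh=20.0):
    """Estimated arrival time assuming a constant average bus speed."""
    return dist_km / speed_kmh * 60

# Hypothetical campus bus-stop coordinates (illustrative only).
stop_a = (1.8554, 103.0810)
stop_b = (1.8601, 103.0857)
d = haversine_km(*stop_a, *stop_b)
print(f"{d:.2f} km, ETA {eta_minutes(d):.1f} min")
```

In practice the GPS fix from the Arduino would replace `stop_a`, and the straight-line distance would be an underestimate of the actual route length.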

  4. Combining Google Earth and GIS mapping technologies in a dengue surveillance system for developing countries

    PubMed Central

    Chang, Aileen Y; Parrales, Maria E; Jimenez, Javier; Sobieszczyk, Magdalena E; Hammer, Scott M; Copenhaver, David J; Kulkarni, Rajan P

    2009-01-01

    Background Dengue fever is a mosquito-borne illness that places significant burden on tropical developing countries with unplanned urbanization. A surveillance system using Google Earth and GIS mapping technologies was developed in Nicaragua as a management tool. Methods and Results Satellite imagery of the town of Bluefields, Nicaragua captured from Google Earth was used to create a base-map in ArcGIS 9. Indices of larval infestation, locations of tire dumps, cemeteries, large areas of standing water, etc. that may act as larval development sites, and locations of the homes of dengue cases collected during routine epidemiologic surveying were overlaid onto this map. Visual imagery of the location of dengue cases, larval infestation, and locations of potential larval development sites were used by dengue control specialists to prioritize specific neighborhoods for targeted control interventions. Conclusion This dengue surveillance program allows public health workers in resource-limited settings to accurately identify areas with high indices of mosquito infestation and interpret the spatial relationship of these areas with potential larval development sites such as garbage piles and large pools of standing water. As a result, it is possible to prioritize control strategies and to target interventions to highest risk areas in order to eliminate the likely origin of the mosquito vector. This program is well-suited for resource-limited settings since it utilizes readily available technologies that do not rely on Internet access for daily use and can easily be implemented in many developing countries for very little cost. PMID:19627614

  5. Tracking the polio virus down the Congo River: a case study on the use of Google Earth™ in public health planning and mapping

    PubMed Central

    Kamadjeu, Raoul

    2009-01-01

    Background The use of GIS in public health is growing, a consequence of a rapidly evolving technology and increasing accessibility to a wider audience. Google Earth™ (GE) is becoming an important mapping infrastructure for public health. However, generating traditional public health maps for GE is still beyond the reach of most public health professionals. In this paper, we explain, through the example of polio eradication activities in the Democratic Republic of Congo, how we used GE as a planning tool, and we share the methods used to generate public health maps. Results The use of GE improved field operations and resulted in better dispatch of vaccination teams and allocation of resources. It also allowed the creation of high-quality maps for advocacy and training and helped us understand the spatiotemporal relationship between all the entities involved in the polio outbreak and response. Conclusion GE has the potential to make mapping available to a new set of public health users in developing countries. High-quality, free satellite imagery and rich features, including Keyhole Markup Language and image overlays, provide a flexible yet powerful platform that sets GE apart from traditional GIS tools, and this power has yet to be fully harnessed by public health professionals. PMID:19161606

  6. Using Google Maps to Access USGS Volcano Hazards Information

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Snedigar, S.; Guffanti, M.; Bailey, J. E.; Wall, B. G.

    2006-12-01

    The U.S. Geological Survey (USGS) Volcano Hazards Program (VHP) is revising the information architecture of our website to provide data within a geospatial context for emergency managers, educators, landowners in volcanic areas, researchers, and the general public. Using a map-based interface for displaying hazard information provides a synoptic view of volcanic activity along with the ability to quickly ascertain where hazards are in relation to major population and infrastructure centers. At the same time, the map interface provides a gateway for educators and the public to find information about volcanoes in their geographic context. A plethora of data visualization solutions are available that are flexible, customizable, and can be run on individual websites. We are currently using a Google map interface because it can be accessed immediately from a website (a downloadable viewer is not required), and it provides simple features for moving around and zooming within the large map area that encompasses U.S. volcanism. A text interface will also be available. The new VHP website will serve as a portal to information for each volcano the USGS monitors, with icons for alert levels and aviation color codes. When a volcano is clicked, a window will provide additional information, including links to maps, images, and real-time data, thereby connecting information from individual observatories, the Smithsonian Institution, and our partner universities. In addition to the VHP home page, many observatories and partners have detailed graphical interfaces to data and images, including the activity pages for the Alaska Volcano Observatory, the Smithsonian Google Earth files, and Yellowstone Volcano Observatory pictures and data. Users with varied requests, such as raw data, scientific papers, images, or brief overviews, expect to be able to quickly access information for their specialized needs.
Over the next few years we will be gathering, cleansing, reorganizing, and posting data in multiple formats to meet these needs.

  7. A Google Glass navigation system for ultrasound and fluorescence dual-mode image-guided surgery

    NASA Astrophysics Data System (ADS)

    Zhang, Zeshu; Pei, Jing; Wang, Dong; Hu, Chuanzhen; Ye, Jian; Gan, Qi; Liu, Peng; Yue, Jian; Wang, Benzhong; Shao, Pengfei; Povoski, Stephen P.; Martin, Edward W.; Yilmaz, Alper; Tweedle, Michael F.; Xu, Ronald X.

    2016-03-01

    Surgical resection remains the primary curative intervention for cancer treatment. However, the occurrence of a residual tumor after resection is very common, leading to the recurrence of the disease and the need for re-resection. We develop a surgical Google Glass navigation system that combines near-infrared fluorescence imaging and ultrasonography for intraoperative detection of tumor sites and assessment of surgical resection boundaries, as well as for guiding sentinel lymph node (SLN) mapping and biopsy. The system consists of a monochromatic CCD camera, a computer, a Google Glass wearable headset, an ultrasound machine, and an array of LED light sources. All of the above components, except the Google Glass, are connected to a host computer by USB or HDMI ports. A wireless connection is established between the Glass and the host computer for image acquisition and data transport tasks. A control program is written in C++ to call OpenCV functions for image calibration, processing, and display. The technical feasibility of the system is tested in both tumor-simulating phantoms and in a human subject. When the system is used for simulated phantom resection tasks, tumor boundaries invisible to the naked eye can be clearly visualized with the surgical Google Glass navigation system. The system has also been used in an IRB-approved protocol in a single patient during SLN mapping and biopsy at the First Affiliated Hospital of Anhui Medical University, demonstrating the ability to successfully localize and resect all apparent SLNs. In summary, our tumor-simulating phantom and human subject studies have demonstrated the technical feasibility of using the proposed goggle navigation system during cancer surgery.

  8. Towards a geospatial wikipedia

    NASA Astrophysics Data System (ADS)

    Fritz, S.; McCallum, I.; Schill, C.; Perger, C.; Kraxner, F.; Obersteiner, M.

    2009-04-01

    Based on the Google Earth (http://earth.google.com) platform we have developed a geospatial Wikipedia (geo-wiki.org). The tool allows anybody in the world to contribute to spatial validation and is made available to the Internet community interested in that task. We illustrate how this tool can be used for different applications. In our first application we combine uncertainty hotspot information from three global land cover datasets (GLC, MODIS, GlobCover). With an ever increasing amount of high-resolution imagery available on Google Earth, it is becoming increasingly possible to distinguish land cover features with a high degree of accuracy. We first direct the land cover validation community to certain hotspots of land cover uncertainty and then ask them to fill in a small popup menu giving the type of land cover, possibly a picture at that location facing the different cardinal points, as well as the date and the type of validation chosen (Google Earth imagery/Panoramio, or ground truth data held by the contributor). We have implemented the tool via a land cover validation community on Facebook, which is based on a snowball system that allows the tracking of individuals and the possibility of ignoring users who misuse the system. In a second application we illustrate how the tool could be used for mapping malaria occurrence, small water bodies, and overall malaria risk. For this application we have implemented polygon and attribute functions using Google Maps along with Virtual Earth via OpenLayers. The third application deals with illegal logging and how an alert system for illegal logging detection within a certain land tenure system could be implemented. Here we show how the tool can be used to document illegal logging via a YouTube video.

  9. Jupyter meets Earth: Creating Comprehensible and Reproducible Scientific Workflows with Jupyter Notebooks and Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T.

    2016-12-01

    Deriving actionable information from Earth observation data obtained from sensors or models can be quite complicated, and sharing those insights with others in a form that they can understand, reproduce, and improve upon is equally difficult. Journal articles, even if digital, commonly present just a summary of an analysis that cannot be understood in depth or reproduced without major effort on the part of the reader. Here we show a method of improving scientific literacy by pairing a recently developed scientific presentation technology (Jupyter Notebooks) with a petabyte-scale platform for accessing and analyzing Earth observation and model data (Google Earth Engine). Jupyter Notebooks are interactive web documents that mix live code with annotations such as rich-text markup, equations, images, videos, hyperlinks, and dynamic output. Notebooks were first introduced as part of the IPython project in 2011 and have since gained wide acceptance in the scientific programming community, initially among Python programmers but later across a wide range of scientific programming languages. While Jupyter Notebooks have been widely adopted for general data analysis, data visualization, and machine learning, to date there have been relatively few examples of using Jupyter Notebooks to analyze geospatial datasets. Google Earth Engine is a cloud-based platform for analyzing geospatial data, such as satellite remote sensing imagery and Earth system model output. Through its Python API, Earth Engine makes petabytes of Earth observation data accessible and provides hundreds of algorithmic building blocks that can be chained together to produce high-level algorithms and outputs in real time. We anticipate that this technology pairing will facilitate a better way of creating, documenting, and sharing complex analyses that derive information on our Earth, which can be used to promote broader understanding of the complex issues that it faces.
http://jupyter.org — https://earthengine.google.com
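
Notebooks of the kind described above are ordinary JSON documents, which is what makes them shareable and reproducible. A minimal sketch of the .ipynb structure (nbformat 4) follows; the cell contents are placeholders, and a real notebook would be produced by Jupyter itself rather than built by hand.

```python
import json

# A minimal Jupyter notebook, hand-built as the JSON structure that the
# .ipynb format (nbformat 4) uses: a list of markdown and code cells.
notebook = {
    "nbformat": 4,
    "nbformat_minor": 5,
    "metadata": {"kernelspec": {"name": "python3", "language": "python"}},
    "cells": [
        {"cell_type": "markdown", "metadata": {},
         "source": ["# NDVI over a study area\n", "Annotated analysis narrative."]},
        {"cell_type": "code", "metadata": {}, "execution_count": None, "outputs": [],
         "source": ["ndvi = (nir - red) / (nir + red)  # placeholder analysis step"]},
    ],
}

# Serialize and reload, as a sharing workflow would.
text = json.dumps(notebook, indent=1)
reloaded = json.loads(text)
code_cells = [c for c in reloaded["cells"] if c["cell_type"] == "code"]
print(len(reloaded["cells"]), len(code_cells))
```

Because the whole analysis narrative travels in one plain-text file, a reader can rerun every cell against the same Earth Engine assets the author used.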

  10. Identifying Severe Weather Impacts and Damage with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Molthan, A.; Burks, J. E.; Bell, J. R.

    2015-12-01

    Hazards associated with severe convective storms can lead to rapid changes in land surface vegetation. Depending upon the type of vegetation impacted, the damage can be relatively short-lived, such as damage to seasonal crops that are eventually removed by harvest, or longer-lived, such as damage to a stand of trees or an expanse of forest that requires several years to recover. Since many remote sensing imagers provide their highest-spatial-resolution bands in the red and near-infrared to support monitoring of vegetation, these impacts can be readily identified as short-term, marked decreases in common vegetation indices such as NDVI, along with increases in land surface temperature observed at a reduced spatial resolution. The ability to identify an area of vegetation change is improved by understanding the conditions that are normal for a given time of year and location, along with the typical range of variability in a given parameter. This analysis requires a period of record well beyond the availability of near-real-time data. These activities would typically require an analyst to download large volumes of data from sensors such as NASA's MODIS (aboard Terra and Aqua) or higher-resolution imagers from the Landsat series of satellites. Google's Earth Engine offers a "big data" solution to these challenges by providing a streamlined API and the option to process the full period of record of NASA MODIS and Landsat products through relatively simple JavaScript coding. This presentation will highlight efforts to date in using Earth Engine holdings to produce vegetation and land surface temperature anomalies associated with damage to agricultural and other vegetation caused by severe thunderstorms across the Central and Southeastern United States.
Earth Engine applications will show how large data holdings can be used to map severe weather damage and ascertain longer-term impacts, and we will share best practices and challenges encountered in applying Earth Engine holdings to the analysis of severe weather damage. Other applications are also demonstrated, such as using Earth Engine to prepare pre-event composites that can be used to subjectively identify other severe weather impacts. Future extension to flooding and wildfires is also proposed.
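
The anomaly logic described above — comparing a fresh observation against the climatological range for that place and time of year — can be sketched on synthetic data. The NDVI values below are illustrative stand-ins, not actual MODIS output, and real work would run this per pixel over an image stack.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a multi-year NDVI record at one pixel for a given
# week of the year, plus a sharp post-storm observation.
history = rng.normal(loc=0.72, scale=0.03, size=15)  # 15 prior years
current = 0.41                                       # post-damage drop

mean, std = history.mean(), history.std(ddof=1)
z = (current - mean) / std

# Flag damage where NDVI falls well below the climatological range.
damaged = z < -3
print(f"z-score {z:.1f}, damage flag: {damaged}")
```

The period-of-record requirement in the abstract is exactly the `history` array here: without many prior years, the standard deviation (and hence the z-score) is meaningless.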

  11. CISN ShakeAlert Earthquake Early Warning System Monitoring Tools

    NASA Astrophysics Data System (ADS)

    Henson, I. H.; Allen, R. M.; Neuhauser, D. S.

    2015-12-01

    CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes, and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, from which a small portion produce earthquake alerts. We have developed interactive web browser system-monitoring tools that display near-real-time state-of-health and performance information. This includes station availability, trigger statistics, and communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges, and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted events and rejected events is also available. The Google Web Toolkit and Maps API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.

  12. Geographical accessibility to community pharmacies by the elderly in metropolitan Lisbon.

    PubMed

    Padeiro, Miguel

    2018-07-01

    In ageing societies, community pharmacies play an important role in delivering medicines, responsible advising, and other targeted services. Elderly people are among their main consumers, as they use more prescription drugs, need more specific health care, and experience more mobility issues than other age groups. This makes geographical accessibility a relevant concern for them. We aimed to measure geographical pedestrian accessibility to community pharmacies by elderly people in the Lisbon Metropolitan Area (LMA). The number of elderly people living within a 10- and 15-min walk was estimated based on population census data, the address-based locations of 801 community pharmacies, and a Google Maps Application Programming Interface (API) method for calculating distances between pharmacies and the centroids of census statistical subsections. Results were compared to figures attained via traditional methods. In the LMA, 61.2% of the elderly live less than a 10-min walk from the nearest pharmacy and 76.9% live less than 15 min away. This contradicts the common view that pharmacies are highly accessible in urban areas. In addition, results show high spatial variability in proximity to pharmacies. Despite the illusion of good coverage suggested at the metropolitan scale, accessibility measures demonstrate the existence of pharmaceutical deprivation areas for the elderly. The findings indicate the need for more accuracy in both access measurements and redistribution policies. Measurement methods and population targets should be reconsidered. Copyright © 2017 Elsevier Inc. All rights reserved.
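
Walking-time queries of the kind used in this study can be issued through the Google Maps Distance Matrix API. The sketch below only constructs the request URL rather than sending it; the coordinates are illustrative and the API key is a placeholder, not the study's actual parameters.

```python
from urllib.parse import urlencode

def distance_matrix_url(origin, destination, api_key):
    """Build a Google Maps Distance Matrix API request for a walking route.
    The request is only constructed here, not sent."""
    base = "https://maps.googleapis.com/maps/api/distancematrix/json"
    params = {
        "origins": f"{origin[0]},{origin[1]}",
        "destinations": f"{destination[0]},{destination[1]}",
        "mode": "walking",
        "key": api_key,
    }
    return f"{base}?{urlencode(params)}"

# Census-subsection centroid to a nearby pharmacy (illustrative coordinates).
url = distance_matrix_url((38.7369, -9.1427), (38.7392, -9.1450), "YOUR_API_KEY")
print(url)
```

The JSON response would contain a `duration` element per origin-destination pair, which is the walking time the 10- and 15-minute thresholds are applied to.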

  13. Using Secure Web Services to Visualize Poison Center Data for Nationwide Biosurveillance: A Case Study

    PubMed Central

    Savel, Thomas G; Bronstein, Alvin; Duck, William; Rhodes, M. Barry; Lee, Brian; Stinn, John; Worthen, Katherine

    2010-01-01

    Objectives Real-time surveillance systems are valuable for timely response to public health emergencies. It has been challenging to leverage existing surveillance systems in state and local communities, and, using a centralized architecture, add new data sources and analytical capacity. Because this centralized model has proven to be difficult to maintain and enhance, the US Centers for Disease Control and Prevention (CDC) has been examining the ability to use a federated model based on secure web services architecture, with data stewardship remaining with the data provider. Methods As a case study for this approach, the American Association of Poison Control Centers and the CDC extended an existing data warehouse via a secure web service, and shared aggregate clinical effects and case counts data by geographic region and time period. To visualize these data, CDC developed a web browser-based interface, Quicksilver, which leveraged the Google Maps API and Flot, a javascript plotting library. Results Two iterations of the NPDS web service were completed in 12 weeks. The visualization client, Quicksilver, was developed in four months. Discussion This implementation of web services combined with a visualization client represents incremental positive progress in transitioning national data sources like BioSense and NPDS to a federated data exchange model. Conclusion Quicksilver effectively demonstrates how the use of secure web services in conjunction with a lightweight, rapidly deployed visualization client can easily integrate isolated data sources for biosurveillance. PMID:23569581

  14. RecutClub.com: An Open Source, Whole Slide Image-based Pathology Education System

    PubMed Central

    Christensen, Paul A.; Lee, Nathan E.; Thrall, Michael J.; Powell, Suzanne Z.; Chevez-Barrios, Patricia; Long, S. Wesley

    2017-01-01

    Background: Our institution's pathology unknown conferences provide educational cases for our residents. However, the cases have not been previously available digitally, have not been collated for postconference review, and were not accessible to a wider audience. Our objective was to create an inexpensive whole slide image (WSI) education suite to address these limitations and improve the education of pathology trainees. Materials and Methods: We surveyed residents regarding their preference between four unique WSI systems. We then scanned weekly unknown conference cases and study set cases and uploaded them to our custom built WSI viewer located at RecutClub.com. We measured site utilization and conference participation. Results: Residents preferred our OpenLayers WSI implementation to Ventana Virtuoso, Google Maps API, and OpenSlide. Over 16 months, we uploaded 1366 cases from 77 conferences and ten study sets, occupying 793.5 GB of cloud storage. Based on resident evaluations, the interface was easy to use and demonstrated minimal latency. Residents are able to review cases from home and from their mobile devices. Worldwide, 955 unique IP addresses from 52 countries have viewed cases in our site. Conclusions: We implemented a low-cost, publicly available repository of WSI slides for resident education. Our trainees are very satisfied with the freedom to preview either the glass slides or WSI and review the WSI postconference. Both local users and worldwide users actively and repeatedly view cases in our study set. PMID:28382224

  15. Placing User-Generated Content on the Map with Confidence

    DTIC Science & Technology

    2014-11-03

    Terms: Theory, Algorithms. Keywords: geographic information retrieval, geolocation. 1. INTRODUCTION We describe a method that places on the map short text... we collected using twitter4j, a Java library for the Twitter API. After filtering, there were 44,289 documents in the Twitter test set. We evaluate how... Baldwin. Text-based twitter user geolocation prediction. J. Artif. Intell. Res. (JAIR), 49:451–500, 2014. [4] C. Hauff, B. Thomee, and M. Trevisiol

  16. Analysis of world terror networks from the reduced Google matrix of Wikipedia

    NASA Astrophysics Data System (ADS)

    El Zant, Samer; Frahm, Klaus M.; Jaffrès-Runser, Katia; Shepelyansky, Dima L.

    2018-01-01

    We apply the reduced Google matrix method to analyze interactions between 95 terrorist groups and determine their relationships and influence on 64 world countries. This is done on the basis of the Google matrix of the English Wikipedia (2017), composed of 5 416 537 articles, which accumulate a great part of global human knowledge. The reduced Google matrix takes into account the direct and hidden links between a selection of 159 nodes (articles) appearing due to all paths of a random surfer moving over the whole network. As a result we obtain the network structure of terrorist groups and their relations with selected countries, including hidden indirect links. Using the sensitivity of PageRank to a weight variation of specific links, we determine the geopolitical sensitivity and influence of specific terrorist groups on world countries. World maps of the sensitivity of various countries to the influence of specific terrorist groups are obtained. We argue that this approach can find useful application for more extensive and detailed database analysis.
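
The Google matrix construction and PageRank iteration underlying this method can be sketched on a toy directed network. This shows only the basic power iteration on a column-stochastic matrix with damping; the paper's reduced-matrix and link-sensitivity steps are omitted, and the adjacency matrix is invented for illustration.

```python
import numpy as np

def pagerank(adj, alpha=0.85, tol=1e-10):
    """PageRank by power iteration on a column-stochastic Google matrix."""
    n = adj.shape[0]
    cols = adj.sum(axis=0)
    # Normalize columns; dangling nodes get a uniform column.
    S = np.where(cols > 0, adj / np.where(cols == 0, 1, cols), 1.0 / n)
    G = alpha * S + (1 - alpha) / n  # damping toward the uniform vector
    p = np.full(n, 1.0 / n)
    while True:
        p_next = G @ p
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next

# Toy directed network (adj[i, j] = 1 means a link from node j to node i).
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 1, 0, 0],
                [0, 0, 1, 0]], dtype=float)

p = pagerank(adj)
print(np.round(p, 3))
```

Sensitivity analysis of the kind the paper describes would then perturb one entry of `adj` slightly and measure the resulting change in `p`.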

  17. Local Air Quality Conditions and Forecasts

    MedlinePlus


  18. Zebra Crossing Spotter: Automatic Population of Spatial Databases for Increased Safety of Blind Travelers

    PubMed Central

    Ahmetovic, Dragan; Manduchi, Roberto; Coughlan, James M.; Mascetti, Sergio

    2016-01-01

    In this paper we propose a computer vision-based technique that mines existing spatial image databases for discovery of zebra crosswalks in urban settings. Knowing the location of crosswalks is critical for a blind person planning a trip that includes street crossing. By augmenting existing spatial databases (such as Google Maps or OpenStreetMap) with this information, a blind traveler may make more informed routing decisions, resulting in greater safety during independent travel. Our algorithm first searches for zebra crosswalks in satellite images; all candidates thus found are validated against spatially registered Google Street View images. This cascaded approach enables fast and reliable discovery and localization of zebra crosswalks in large image datasets. While fully automatic, our algorithm could also be complemented by a final crowdsourcing validation stage for increased accuracy. PMID:26824080

  19. Understanding the productive author who published papers in medicine using National Health Insurance Database: A systematic review and meta-analysis.

    PubMed

    Chien, Tsair-Wei; Chang, Yu; Wang, Hsien-Yi

    2018-02-01

Many researchers have used the National Health Insurance Research Database to publish medical papers, which are often retrospective, population-based cohort studies. However, these authors' research domains and academic characteristics remain unclear. Searching the PubMed database (Pubmed.com) with the keywords [Taiwan] and [National Health Insurance Research Database], we downloaded 2913 articles published from 1995 to 2017. Social network analysis (SNA), the Gini coefficient, and Google Maps were applied to these data to visualize: the most productive author; the pattern of coauthor collaboration teams; and the authors' research domains denoted by abstract keywords and PubMed MeSH (medical subject heading) terms. From the 2913 papers based on Taiwan's National Health Insurance database, we chose the top 10 research teams shown on Google Maps and analyzed one author (Dr. Kao), who published 149 papers based on the database in 2015. Over the past 15 years, Dr. Kao had 2987 connections with other coauthors from 13 research teams. The co-occurring abstract keywords with the highest frequency are cohort study and National Health Insurance Research Database. The most coexistent MeSH terms are tomography, X-ray computed, and positron-emission tomography. The concentration of the author's research domain is very low (Gini < 0.40). SNA incorporated with Google Maps and the Gini coefficient provides insight into the relationships between entities. The results obtained in this study can be applied toward a comprehensive understanding of other productive authors in academia.
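The Gini coefficient is used above to quantify how concentrated an author's output is across topics (a value below 0.40 read as a dispersed research domain). A minimal sketch of the standard computation (the per-topic paper counts are invented):

```python
def gini(values):
    """Gini coefficient of non-negative values (0 = perfectly even spread)."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard formula based on the ordered cumulative distribution
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2.0 * cum) / (n * total) - (n + 1.0) / n

# Hypothetical counts of an author's papers per MeSH term
counts = [30, 25, 20, 15, 10]
g = gini(counts)   # a low value (< 0.40) indicates a dispersed research domain
```

For perfectly even counts the coefficient is 0; if all papers fall in one topic it approaches 1.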

  20. Augmented Reality as a Navigation Tool to Employment Opportunities for Postsecondary Education Students with Intellectual Disabilities and Autism

    ERIC Educational Resources Information Center

    McMahon, Don; Cihak, David F.; Wright, Rachel

    2015-01-01

    The purpose of this study was to examine the effects of location-based augmented reality navigation compared to Google Maps and paper maps as navigation aids for students with disabilities. The participants in this single subject study were three college students with intellectual disability and one college student with autism spectrum disorder.…

  1. Exploring the Spatial Representativeness of NAAQS and Near Roadway Sites Using High-Spatial Resolution Air Pollution Maps Produced by A Mobile Mapping Platform

    EPA Science Inventory

In the current study, three Google Street View cars were equipped with the Aclima Environmental Intelligence™ Platform. The air pollutants of interest, including O3, NO, NO2, CO2, black carbon, and particle number in several size ranges, were measured using a suite of fast...

  2. Interactive Mapping of Inundation Metrics Using Cloud Computing for Improved Floodplain Conservation and Management

    NASA Astrophysics Data System (ADS)

    Bulliner, E. A., IV; Lindner, G. A.; Bouska, K.; Paukert, C.; Jacobson, R. B.

    2017-12-01

Within large-river ecosystems, floodplains serve a variety of important ecological functions. A recent survey of 80 managers of floodplain conservation lands along the Upper and Middle Mississippi and Lower Missouri Rivers in the central United States found that the most critical information needed to improve floodplain management centered on metrics for characterizing the depth, extent, frequency, duration, and timing of inundation. These metrics can be delivered to managers efficiently through cloud-based interactive maps. To calculate these metrics, we interpolated an existing one-dimensional hydraulic model for the Lower Missouri River, which simulated water surface elevations at cross sections spaced closely enough (<1 km apart) to characterize water surface profiles along an approximately 800 km stretch upstream from the confluence with the Mississippi River, over an 80-year record at a daily time step. To translate these water surface elevations to inundation depths, we subtracted a merged terrain model consisting of floodplain LIDAR and bathymetric surveys of the river channel. This approach resulted in a 29000+ day time series of inundation depths across the floodplain using grid cells with 30 m spatial resolution. Initially, we used these data on a local workstation to calculate a suite of nine spatially distributed inundation metrics for the entire model domain. These metrics are calculated on a per-pixel basis and encompass a variety of temporal criteria generally relevant to the flora and fauna of interest to floodplain managers, including, for example, the average number of days inundated per year within a growing season. Using a local workstation, calculating these metrics for the entire model domain requires several hours. However, for the needs of individual floodplain managers working at site scales, these metrics may be too general and inflexible.
Instead of creating, a priori, a suite of inundation metrics able to satisfy all user needs, we present the use of Google's cloud-based Earth Engine API to let users define and query their own inundation metrics from our dataset and produce maps nearly instantaneously. This approach allows users to select the time periods and inundation depths germane to managing local species, potentially facilitating conservation of floodplain ecosystems.
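The per-pixel metrics described above reduce a daily depth time series to a single number per grid cell. A numpy stand-in for the Earth Engine computation (synthetic depths and an illustrative 0.5 m threshold; the actual work runs on the 30 m, 29000+ day dataset):

```python
import numpy as np

# Synthetic daily inundation depths: (days, rows, cols) grid in metres
rng = np.random.default_rng(0)
depths = rng.exponential(scale=0.2, size=(365 * 3, 4, 4))  # 3 years, 4x4 pixels

def mean_days_inundated_per_year(depths, threshold=0.5, days_per_year=365):
    """Average number of days per year each pixel is deeper than `threshold`."""
    wet = depths > threshold                   # boolean (days, rows, cols)
    n_years = depths.shape[0] / days_per_year
    return wet.sum(axis=0) / n_years           # per-pixel metric

metric = mean_days_inundated_per_year(depths)
```

A user-defined metric is just a different reduction over the time axis (e.g. restricting to growing-season days before summing), which is what the interactive Earth Engine front end exposes.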

  3. www.fallasdechile.cl, the First Online Repository for Neotectonic Faults in the Chilean Andes

    NASA Astrophysics Data System (ADS)

    Aron, F.; Salas, V.; Bugueño, C. J.; Hernández, C.; Leiva, L.; Santibanez, I.; Cembrano, J. M.

    2016-12-01

We introduce the site www.fallasdechile.cl, created and maintained by undergraduate students and researchers at the Catholic University of Chile. Though the web page seeks to inform and educate the general public about potentially seismogenic faults of the country, layers of increasing content complexity allow students, researchers and educators to consult the site as a scientific tool as well. This is the first comprehensive, open-access database on Chilean geologic faults; we envision that it may grow organically with contributions from peer scientists, resembling the SCEC community fault model for southern California. Our website aims to fill a gap between science and society by providing users the opportunity to get involved through self-driven learning in interactive education modules. The main page highlights recent developments and open questions in Chilean earthquake science. Front pages show first-level information on general concepts in earthquake topics such as tectonic settings, the definition of geologic faults, and the space-time constraints of faults. Users can navigate interactive modules to explore, with real data, different earthquake scenarios and compute values of seismic moment and magnitude. A second level covers Chilean/Andean faults classified according to their geographic location, each containing at least one of the following parameters: mapped trace, 3D geometry, sense of slip, recurrence times and date of last event. Fault traces are displayed on an interactive map using the Google Maps API. The material is compiled and curated in an effort to present, to the best of our knowledge, accurate and up-to-date information. If interested, the user can navigate to a third layer containing more advanced technical details, including primary sources of the data, a brief structural description, published scientific articles, and links to other online content complementing our site.
Also, geographically referenced fault traces with attributes (kml, shapefiles) and fault 3D surfaces (contours, tsurf files) will be available to download. Given its potential for becoming a reference database for active faults in Chile, this project demonstrates that undergraduates can go beyond the classroom, be of service to the scientific community, and make contributions with broader impacts.
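The KML downloads mentioned above encode each fault trace as a list of lon,lat coordinates. A minimal sketch of generating such a file (the trace name and coordinates are invented, not entries from fallasdechile.cl):

```python
# Minimal KML document for one (hypothetical) fault trace as a LineString.
# Coordinates are lon,lat,alt triplets separated by spaces, per the KML spec.
def fault_trace_kml(name, coords):
    coord_str = " ".join(f"{lon},{lat},0" for lon, lat in coords)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>{name}</name>
      <LineString>
        <coordinates>{coord_str}</coordinates>
      </LineString>
    </Placemark>
  </Document>
</kml>"""

# Illustrative (not real) trace coordinates in central Chile
kml = fault_trace_kml("Example fault", [(-70.65, -33.45), (-70.60, -33.50)])
```

A file like this loads directly as a layer in Google Earth or as a KmlLayer in the Google Maps API.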

  4. NaviCell: a web-based environment for navigation, curation and maintenance of large molecular interaction maps

    PubMed Central

    2013-01-01

Background Molecular biology knowledge can be formalized and systematically represented in a computer-readable form as a comprehensive map of molecular interactions. An increasing number of maps of molecular interactions contain detailed, step-wise descriptions of various cell mechanisms. It is difficult to explore these large maps, to organize discussion of their content, and to maintain them. Several efforts were recently made to combine these capabilities in one environment, and NaviCell is one of them. Results NaviCell is a web-based environment for exploiting large maps of molecular interactions, created in CellDesigner, allowing their easy exploration, curation and maintenance. It is characterized by a combination of three essential features: (1) efficient map browsing based on Google Maps; (2) semantic zooming for viewing different levels of detail or of abstraction of the map; and (3) an integrated web-based blog for collecting community feedback. NaviCell can be easily used by experts in the field of molecular biology for studying molecular entities of interest in the context of signaling pathways and crosstalk between pathways within a global signaling network. NaviCell allows both exploration of the detailed molecular mechanisms represented on the map and a more abstract view of the map, up to a top-level modular representation. NaviCell greatly facilitates curation, maintenance and updating of comprehensive maps of molecular interactions in an interactive and user-friendly fashion thanks to an embedded blogging system. Conclusions NaviCell provides user-friendly exploration of large-scale maps of molecular interactions, thanks to Google Maps and WordPress interfaces, with which many users are already familiar. Semantic zooming, which is used for navigating geographical maps, is adopted for molecular maps in NaviCell, making any level of visualization readable.
In addition, NaviCell provides a framework for community-based curation of maps. PMID:24099179
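NaviCell's map browsing reuses the Google Maps tile-pyramid interface, in which each zoom level quadruples the number of tiles. For geographic maps the standard Web Mercator tile addressing looks like the sketch below (NaviCell itself addresses pixels of a molecular map rather than latitude/longitude, so this is only an analogy):

```python
import math

def latlng_to_tile(lat, lng, zoom):
    """Web Mercator tile (x, y) containing a lat/lng point at a zoom level."""
    n = 2 ** zoom                              # n x n tiles at this zoom
    x = int((lng + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# At zoom 0 the whole map is a single tile; each zoom level quadruples the tiles.
origin_z0 = latlng_to_tile(0, 0, 0)
origin_z1 = latlng_to_tile(0, 0, 1)
```

Semantic zooming replaces the usual "same image, more detail" tiles with differently abstracted renderings of the map at each level, while keeping this addressing scheme.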

  5. NaviCell: a web-based environment for navigation, curation and maintenance of large molecular interaction maps.

    PubMed

    Kuperstein, Inna; Cohen, David P A; Pook, Stuart; Viara, Eric; Calzone, Laurence; Barillot, Emmanuel; Zinovyev, Andrei

    2013-10-07

Molecular biology knowledge can be formalized and systematically represented in a computer-readable form as a comprehensive map of molecular interactions. An increasing number of maps of molecular interactions contain detailed, step-wise descriptions of various cell mechanisms. It is difficult to explore these large maps, to organize discussion of their content, and to maintain them. Several efforts were recently made to combine these capabilities in one environment, and NaviCell is one of them. NaviCell is a web-based environment for exploiting large maps of molecular interactions, created in CellDesigner, allowing their easy exploration, curation and maintenance. It is characterized by a combination of three essential features: (1) efficient map browsing based on Google Maps; (2) semantic zooming for viewing different levels of detail or of abstraction of the map; and (3) an integrated web-based blog for collecting community feedback. NaviCell can be easily used by experts in the field of molecular biology for studying molecular entities of interest in the context of signaling pathways and crosstalk between pathways within a global signaling network. NaviCell allows both exploration of the detailed molecular mechanisms represented on the map and a more abstract view of the map, up to a top-level modular representation. NaviCell greatly facilitates curation, maintenance and updating of comprehensive maps of molecular interactions in an interactive and user-friendly fashion thanks to an embedded blogging system. NaviCell provides user-friendly exploration of large-scale maps of molecular interactions, thanks to Google Maps and WordPress interfaces, with which many users are already familiar. Semantic zooming, which is used for navigating geographical maps, is adopted for molecular maps in NaviCell, making any level of visualization readable. In addition, NaviCell provides a framework for community-based curation of maps.

  6. Leveraging Google Geo Tools for Interactive STEM Education: Insights from the GEODE Project

    NASA Astrophysics Data System (ADS)

    Dordevic, M.; Whitmeyer, S. J.; De Paor, D. G.; Karabinos, P.; Burgin, S.; Coba, F.; Bentley, C.; St John, K. K.

    2016-12-01

    Web-based imagery and geospatial tools have transformed our ability to immerse students in global virtual environments. Google's suite of geospatial tools, such as Google Earth (± Engine), Google Maps, and Street View, allow developers and instructors to create interactive and immersive environments, where students can investigate and resolve common misconceptions in STEM concepts and natural processes. The GEODE (.net) project is developing digital resources to enhance STEM education. These include virtual field experiences (VFEs), such as an interactive visualization of the breakup of the Pangaea supercontinent, a "Grand Tour of the Terrestrial Planets," and GigaPan-based VFEs of sites like the Canadian Rockies. Web-based challenges, such as EarthQuiz (.net) and the "Fold Analysis Challenge," incorporate scaffolded investigations of geoscience concepts. EarthQuiz features web-hosted imagery, such as Street View, Photo Spheres, GigaPans, and Satellite View, as the basis for guided inquiry. In the Fold Analysis Challenge, upper-level undergraduates use Google Earth to evaluate a doubly-plunging fold at Sheep Mountain, WY. GEODE.net also features: "Reasons for the Seasons"—a Google Earth-based visualization that addresses misconceptions that abound amongst students, teachers, and the public, many of whom believe that seasonality is caused by large variations in Earth's distance from the Sun; "Plate Euler Pole Finder," which helps students understand rotational motion of tectonic plates on the globe; and "Exploring Marine Sediments Using Google Earth," an exercise that uses empirical data to explore the surficial distribution of marine sediments in the modern ocean. The GEODE research team includes the authors and: Heather Almquist, Cinzia Cervato, Gene Cooper, Helen Crompton, Terry Pavlis, Jen Piatek, Bill Richards, Jeff Ryan, Ron Schott, Barb Tewksbury, and their students and collaborating colleagues. 
We are supported by NSF DUE 1323419 and a Google Geo Curriculum Award.
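The "Plate Euler Pole Finder" activity mentioned above rests on the rigid-plate relation v = ω × r: a plate's surface velocity is the cross product of its angular velocity vector (through the Euler pole) with the position vector of the point. A minimal sketch (the pole location, rotation rate, and sample point are illustrative values, not GEODE course data):

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def unit_vector(lat_deg, lon_deg):
    """Unit position vector of a point on the sphere (geocentric)."""
    lat, lon = np.radians([lat_deg, lon_deg])
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def plate_velocity_mm_per_yr(pole_lat, pole_lon, rate_deg_per_myr, lat, lon):
    """Surface velocity v = omega x r for a rigid plate rotation."""
    omega = np.radians(rate_deg_per_myr) * unit_vector(pole_lat, pole_lon)  # rad/Myr
    r = EARTH_RADIUS_KM * unit_vector(lat, lon)                             # km
    return np.cross(omega, r)   # km/Myr, numerically equal to mm/yr

# A pole at the north pole rotating 1 deg/Myr moves an equatorial point
# at speed |v| = R * omega, roughly 111 mm/yr.
v = plate_velocity_mm_per_yr(90, 0, 1.0, 0, 0)
```

The speed falls off with the sine of the angular distance from the pole, which is exactly the intuition the classroom tool is built to convey.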

  7. Travel burden associated with granulocyte colony-stimulating factor administration in a Medicare aged population: a geospatial analysis.

    PubMed

    Stephens, J Mark; Bensink, Mark; Bowers, Charles; Hollenbeak, Christopher S

    2017-07-31

    Prophylaxis with granulocyte colony-stimulating factors (G-CSFs) is recommended for patients receiving myelosuppressive chemotherapy regimens with a high risk of febrile neutropenia (FN). G-CSFs should be administered starting the day after chemotherapy, necessitating return trips to the oncology clinic at the end of each cycle. We examined the travel burden related to prophylactic G-CSF injections after chemotherapy in the US. We used 2012-2014 Medicare claims data to identify a national cohort of beneficiaries age 65+ with non-myeloid cancers who received both chemotherapy and prophylactic G-CSFs. Patient travel origin was based on residence ZIP code. Oncologist practice locations and hospital addresses were obtained from the Medicare Physician Compare and Hospital Compare websites and geocoded using the Google Maps Application Programming Interface (API). Driving distance and time to the care site from each patient ZIP code tabulation area (ZCTA) were calculated using Open Street Maps road networks. Geographic and socio-economic characteristics of each ZCTA from the US Census Bureau's American Community Survey were used to stratify and analyze travel estimates. The mean one-way driving distance to the G-CSF provider was 23.8 (SD 30.1) miles and the mean one-way driving time was 33.3 (SD 37.8) minutes. When stratified by population density, the mean one-way travel time varied from 12.1 (SD 10.1) minutes in Very Dense Urban areas to 76.7 (SD 72.1) minutes in Super Rural areas. About 48% of patients had one-way travel times of <20 minutes, but 19% of patients traveled ≥50 minutes one way for G-CSF prophylaxis. Patients in areas with above average concentrations of aged, poor or disabled residents were more likely to experience longer travel. Administration of G-CSF therapy after chemotherapy can present a significant travel burden for cancer patients. 
Technological improvements in the form and methods of drug delivery for G-CSFs might significantly reduce this travel burden.
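The study's driving distances were computed over Open Street Maps road networks; a simpler, self-contained proxy is the great-circle (haversine) distance between geocoded points, which lower-bounds the road distance. A sketch (the city coordinates are illustrative, not patient or provider locations):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Driving distance along the road network is always at least this large.
d = haversine_miles(40.7128, -74.0060, 39.9526, -75.1652)  # NYC to Philadelphia
```

Road-network routing, as used in the paper, then inflates this straight-line figure according to the actual street geometry.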

  8. An UAV scheduling and planning method for post-disaster survey

    NASA Astrophysics Data System (ADS)

    Li, G. Q.; Zhou, X. G.; Yin, J.; Xiao, Q. Y.

    2014-11-01

Each year, extreme climate and special geological environments lead to frequent natural disasters, e.g., earthquakes, floods, etc. These disasters often cause serious casualties and enormous economic losses. Post-disaster surveying is very important for disaster relief and assessment. Because Unmanned Aerial Vehicle (UAV) remote sensing offers high efficiency, high precision, high flexibility, and low cost, it has been widely used for emergency surveying in recent years. UAVs cannot stand idle waiting for a disaster to occur, so when a disaster happens they are usually scattered across many working locations. To improve emergency surveying efficiency, the UAVs must be tracked and an emergency surveying task assigned to each selected UAV. Therefore, a UAV tracking and scheduling method for post-disaster survey is presented in this paper. In this method, the Global Positioning System (GPS) and the GSM network are used to track the UAVs. An emergency-tracking UAV information database is built in advance by registration; it includes at least the ID and communication number of each UAV. When a catastrophe happens, the real-time locations of all UAVs in the database are first obtained using the emergency tracking method; then the travel time from each UAV to the disaster region is calculated from the UAVs' real-time locations and the road network using a nearest-services analysis algorithm. The disaster region is subdivided into several emergency surveying regions based on the DEM, area, and population distribution map, and these regions are assigned to the appropriate UAVs according to a shortest-travel-time rule. The UAV tracking and scheduling prototype was implemented using SQL Server 2008, ArcEngine 10.1 SDK, Visual Studio 2010 C#, Android, an SMS modem, and the Google Maps API.
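The final assignment step — giving each surveying region to the UAV with the shortest travel time — can be sketched as a greedy lookup over a precomputed travel-time matrix (all identifiers and times below are hypothetical; the paper computes the times with a nearest-services analysis over the road network):

```python
# Greedy assignment of survey regions to UAVs by minimum travel time.
# travel_time[u][r]: hypothetical minutes for UAV u to reach region r.
travel_time = {
    "uav1": {"regionA": 25, "regionB": 40},
    "uav2": {"regionA": 35, "regionB": 15},
}

def assign_regions(travel_time):
    """Give each region to the UAV that can reach it fastest."""
    regions = next(iter(travel_time.values())).keys()
    assignment = {}
    for r in regions:
        assignment[r] = min(travel_time, key=lambda u: travel_time[u][r])
    return assignment

plan = assign_regions(travel_time)  # {'regionA': 'uav1', 'regionB': 'uav2'}
```

A production scheduler would also balance workload across UAVs; the per-region minimum shown here is only the simplest reading of the shortest-travel-time rule.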

  9. The National 3-D Geospatial Information Web-Based Service of Korea

    NASA Astrophysics Data System (ADS)

    Lee, D. T.; Kim, C. W.; Kang, I. G.

    2013-09-01

3D geospatial information systems should provide efficient spatial analysis tools, be able to use all capabilities of the third dimension, and offer visualization. Currently, many human activities are moving toward the third dimension, such as land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, military applications, etc. To reflect this trend, the Korean government has started to construct a 3D geospatial data and service platform. Since geospatial information was introduced in Korea, the construction of geospatial information (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led, web-based 3D geospatial information service to people interested in this industry; we introduce not only the present state of the constructed 3D geospatial data but also the methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data was constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, photo-realistic textured 3D models at level of detail (LOD) 4, with corresponding ortho photographs, were constructed in 2012 for six metropolitan cities and Dokdo (an island belonging to Korea). In this paper, we describe the composition and infrastructure of the web-based 3D map service system and present a comparison of V-World with the Google Earth service. We also present Open API based service cases and discuss the protection of location privacy when constructing 3D indoor building models. To prevent invasions of privacy, we processed image blurring, elimination, and camouflage. The importance of public-private cooperation and an advanced geospatial information policy is emphasized in Korea. 
Thus, progress in Korea's spatial information industry is expected in the near future.

  10. Mapping for the masses: using free remote sensing data for disaster management

    NASA Astrophysics Data System (ADS)

    Teeuw, R.; McWilliam, N.; Morris, N.; Saunders, C.

    2009-04-01

We examine the uses of free satellite imagery and Digital Elevation Models (DEMs) for disaster management, targeting three data sources: the United Nations Charter on Space and Disasters, Google Earth, and internet-based satellite data archives such as the Global Land Cover Facility (GLCF). The research has assessed SRTM and ASTER DEM data and Landsat TM/ETM+ and ASTER imagery, as well as utilising datasets and basic GIS operations available via Google Earth. As an aid to Disaster Risk Reduction, four sets of maps can be produced from satellite data: (i) Multiple Geohazards: areas prone to slope instability, coastal inundation and fluvial flooding; (ii) Vulnerability: population density, habitation types, land cover types and infrastructure; (iii) Disaster Risk: produced by combining severity scores from (i) and (ii); (iv) Reconstruction: zones of rock/sediment with construction uses; areas of woodland (for fuel/construction); water sources; transport routes; zones suitable for re-settlement. This set of Disaster Risk Reduction maps is ideal for regional (1:50,000 to 1:250,000 scale) planning in low-income countries: more detailed assessments require relatively expensive high-resolution satellite imagery or aerial photography, although Google Earth has a good track record of posting high-resolution imagery of disaster zones (e.g. the 2008 Burma storm surge). The Disaster Risk maps highlight areas of maximum risk to a region's emergency planners and decision makers, enabling various types of public education and other disaster mitigation measures. The Reconstruction map also helps to save lives, by facilitating disaster recovery. Many problems have been identified. Access to the UN Charter imagery is fine after a disaster, but very difficult when assessing pre-disaster indicators: the data supplied also tend to be pre-processed, when some relief agencies would prefer to have raw data.
The limited and expensive internet access in many developing countries limits access to archives of free satellite data, such as the GLCF. Finally, data integration, spatial/temporal analysis and map production are all hindered by the high price of most GIS software, making the development of suitable open-source software a priority.
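Step (iii) above combines severity scores from the geohazard and vulnerability maps into a Disaster Risk map. A minimal raster sketch (the grids and the additive combination rule are illustrative; the paper does not specify the exact operator):

```python
import numpy as np

# Hypothetical per-cell severity scores on a small raster grid (0-3 scale)
hazard = np.array([[0, 1, 2],
                   [1, 2, 3],
                   [2, 3, 3]])
vulnerability = np.array([[1, 1, 2],
                          [0, 2, 2],
                          [1, 3, 3]])

# Disaster risk as the combined severity score; summing is one simple choice
risk = hazard + vulnerability
high_risk = risk >= 5          # cells to flag for emergency planners
```

In a GIS this is a map-algebra overlay: the same cell-wise arithmetic applied to co-registered hazard and vulnerability rasters.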

  11. Electricity Data Browser

    EIA Publications

    The Electricity Data Browser shows generation, consumption, fossil fuel receipts, stockpiles, retail sales, and electricity prices. The data appear on an interactive web page and are updated each month. The Electricity Data Browser includes all the datasets collected and published in EIA's Electric Power Monthly and allows users to perform dynamic charting of data sets as well as map the data by state. The data browser includes a series of reports that appear in the Electric Power Monthly and allows readers to drill down to plant level statistics, where available. All images and datasets are available for download. Users can also link to the data series in EIA's Application Programming Interface (API). An API makes our data machine-readable and more accessible to users.
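Access to EIA's API works by appending a registered key and query parameters to a route URL. A sketch of building such a request (the route and parameter names follow EIA's public v2 API documentation but should be treated as assumptions here; the key is a placeholder obtained by free registration):

```python
from urllib.parse import urlencode

# Build a request URL for EIA's open data API (v2). No network call is made;
# fetching the URL would return JSON suitable for machine processing.
BASE = "https://api.eia.gov/v2/electricity/retail-sales/data/"
params = {
    "api_key": "YOUR_API_KEY",      # placeholder, not a real key
    "frequency": "monthly",
    "data[0]": "price",
    "facets[stateid][]": "CA",
}
url = BASE + "?" + urlencode(params)
```

The same pattern — route, frequency, data columns, facets — applies to the other electricity datasets the browser exposes.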

  12. Google Maps for Crowdsourced Emergency Routing

    NASA Astrophysics Data System (ADS)

    Nedkov, S.; Zlatanova, S.

    2012-08-01

Gathering infrastructure data in emergency situations is challenging. The areas affected by a disaster are often large and the needed observations numerous. Spaceborne remote sensing techniques cover large areas, but they are of limited use as their field of view may be blocked by clouds, smoke, buildings, highways, etc. Remote sensing products furthermore require specialists to collect and analyze the data. This contrasts with the nature of the damage detection problem: almost everyone is capable of observing whether a street is usable or not. The crowd is fit for solving these challenges, as its members are numerous, they are willing to help, and they are often in the vicinity of the disaster, thereby forming a highly dispersed sensor network. This paper proposes and implements a small WebGIS application for performing shortest path calculations based on crowdsourced information about infrastructure health. The application is built on top of Google Maps and uses its routing service to calculate the shortest distance between two locations. Impassable areas are indicated on a map by people performing in-situ observations on mobile devices, and by users on desktop machines who consult a multitude of information sources.
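Routing around crowd-reported blockages amounts to a shortest-path search that skips impassable edges. The application delegates this to Google Maps' routing service; a self-contained sketch of the underlying idea, using Dijkstra's algorithm on a toy road graph:

```python
import heapq

def shortest_path(graph, start, goal, blocked=frozenset()):
    """Dijkstra over an adjacency dict, skipping crowd-reported blocked edges."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue
        for nbr, w in graph[node].items():
            if (node, nbr) in blocked or (nbr, node) in blocked:
                continue  # reported impassable by an in-situ observer
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return None  # no passable route

# Toy road network: edge weights are travel times in minutes
roads = {
    "A": {"B": 5, "C": 10},
    "B": {"A": 5, "C": 3},
    "C": {"A": 10, "B": 3},
}
normal = shortest_path(roads, "A", "C")                          # 8, via B
rerouted = shortest_path(roads, "A", "C", blocked={("B", "C")})  # 10, direct
```

Each new crowd report simply grows the blocked set and triggers a re-query, which is how the WebGIS keeps routes current as observations arrive.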

  13. The military health system's personal health record pilot with Microsoft HealthVault and Google Health.

    PubMed

    Do, Nhan V; Barnhill, Rick; Heermann-Do, Kimberly A; Salzman, Keith L; Gimbel, Ronald W

    2011-01-01

To design, build, implement, and evaluate a personal health record (PHR), tethered to the Military Health System, that leverages Microsoft® HealthVault and Google® Health infrastructure based on user preference. A pilot project was conducted in 2008-2009 at Madigan Army Medical Center in Tacoma, Washington. Our PHR was architected on a flexible platform that incorporated the standards-based Continuity of Care Document and Continuity of Care Record models to map Department of Defense-sourced health data, via a secure Veterans Administration data broker, to Microsoft® HealthVault and Google® Health based on user preference. The project design and implementation were guided by provider and patient advisory panels with formal user evaluation. The pilot project included 250 beneficiary users. Approximately 73.2% of users were < 65 years of age, and 38.4% were female. Of the users, 169 (67.6%) selected Microsoft® HealthVault, and 81 (32.4%) selected Google® Health as their PHR of preference. A sample evaluation of users found 100% (n = 60) satisfied with the convenience of record access and 91.7% (n = 55) satisfied with the overall functionality of the PHR. Key lessons learned related to data-transfer decisions (push vs pull), purposeful delays in reporting sensitive information, understanding and mapping PHR use and clinical workflow, and decisions on information patients may choose to share with their provider. Currently PHRs are being viewed as empowering tools for patient activation. Design and implementation issues (eg, technical, organizational, information security) are substantial and must be thoughtfully approached. Adopting standards into design can enhance the national goal of portability and interoperability.

  14. Use of Openly Available Satellite Images for Remote Sensing Education

    NASA Astrophysics Data System (ADS)

    Wang, C.-K.

    2011-09-01

With the advent of Google Earth, Google Maps, and Microsoft Bing Maps, high-resolution satellite imagery is becoming more easily accessible than ever. College students may already have a wealth of experience with high-resolution satellite imagery from using these software and web services prior to any formal remote sensing education. Remote sensing education should therefore be adjusted to the fact that the audience is already a consumer of remote sensing products (through the use of the above-mentioned services). This paper reports the use of openly available satellite imagery as a term project in an introductory-level remote sensing course in the Department of Geomatics of National Cheng Kung University. Experience from the fall terms of 2009 and 2010 shows that this term project effectively aroused the students' enthusiasm toward remote sensing.

  15. In campus location finder using mobile application services

    NASA Astrophysics Data System (ADS)

    Fai, Low Weng; Audah, Lukman

    2017-09-01

Navigation services have become very common in this era; applications include Google Maps, Waze, etc. Although navigation applications provide routing in open areas, not all buildings are recorded in their databases. In this project, an application was made for indoor and outdoor navigation in Universiti Tun Hussein Onn Malaysia (UTHM). It helps outsiders and new incoming students by navigating them from their current location to a destination using a mobile application named "U Finder". The Thunkable website was used to build the application for outdoor and indoor navigation. Outdoor navigation is linked to Google Maps, and indoor navigation uses QR codes for positioning and routing pictures for navigation. Outdoor navigation can route users to the main faculties in UTHM, while indoor navigation was implemented only for the G1 building in UTHM.

  16. USGS Coastal and Marine Geology Survey Data in Google Earth

    NASA Astrophysics Data System (ADS)

    Reiss, C.; Steele, C.; Ma, A.; Chin, J.

    2006-12-01

The U.S. Geological Survey (USGS) Coastal and Marine Geology (CMG) program has a rich data catalog of geologic field activities and metadata called InfoBank, which has been a standard tool for researchers within and outside of the agency. Along with traditional web maps, the data are now accessible in Google Earth, which greatly expands the possible user audience. The Google Earth interface provides geographic orientation and panning/zooming capabilities to locate data relative to topography, bathymetry, and coastal areas. Viewing navigation tracks against Google Earth's background imagery allows queries such as why certain areas were not surveyed (answer: presence of islands, shorelines, cliffs, etc.). Detailed box core subsample photos from selected sampling activities, published geotechnical data, and sample descriptions are now viewable on Google Earth (for example, the M-1-95-MB, P-2-95-MB, and P-1-97-MB box core samples). One example of the use of Google Earth is CMG's surveys of San Francisco's Ocean Beach since 2004. The surveys are conducted with an all-terrain vehicle (ATV) and a shallow-water personal watercraft (PWC) equipped with the Global Positioning System (GPS) and elevation and echo sounder data collectors. 3D topographic models with centimeter accuracy have been produced from these surveys to monitor beach and nearshore processes, including sand transport, sedimentation patterns, and seasonal trends. Using Google Earth, multiple track line data (examples: OB-1-05-CA and OB-2-05-CA) can be overlaid on beach imagery. The images also help explain the shape of track lines as objects are encountered.

  17. Googling trends in conservation biology.

    PubMed

    Proulx, Raphaël; Massicotte, Philippe; Pépino, Marc

    2014-02-01

    Web-crawling approaches, that is, automated programs data mining the internet to obtain information about a particular process, have recently been proposed for monitoring early signs of ecosystem degradation or for establishing crop calendars. However, lack of a clear conceptual and methodological framework has prevented the development of such approaches within the field of conservation biology. Our objective was to illustrate how Google Trends, a freely accessible web-crawling engine, can be used to track changes in timing of biological processes, spatial distribution of invasive species, and level of public awareness about key conservation issues. Google Trends returns the number of internet searches that were made for a keyword in a given region of the world over a defined period. Using data retrieved online for 13 countries, we exemplify how Google Trends can be used to study the timing of biological processes, such as the seasonal recurrence of pollen release or mosquito outbreaks across a latitudinal gradient. We mapped the spatial extent of results from Google Trends for 5 invasive species in the United States and found geographic patterns in invasions that are consistent with their coarse-grained distribution at state levels. From 2004 through 2012, Google Trends showed that the level of public interest and awareness about conservation issues related to ecosystem services, biodiversity, and climate change increased, decreased, and followed both trends, respectively. Finally, to further the development of research approaches at the interface of conservation biology, collective knowledge, and environmental management, we developed an algorithm that allows the rapid retrieval of Google Trends data. © 2013 Society for Conservation Biology.
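Google Trends does not return raw search counts; it reports interest rescaled to 0-100 with the series peak at 100. A sketch of that normalization (the weekly counts are invented):

```python
def scale_like_trends(series):
    """Rescale raw counts to 0-100 with the peak at 100, as Google Trends does."""
    peak = max(series)
    if peak == 0:
        return [0 for _ in series]
    return [round(100 * v / peak) for v in series]

# Hypothetical weekly search counts for a keyword
raw = [12, 30, 24, 60, 18]
scaled = scale_like_trends(raw)  # [20, 50, 40, 100, 30]
```

Because every series is relative to its own peak, comparisons across keywords or regions describe timing and shape of interest, not absolute search volume — a caveat for the monitoring applications discussed above.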

  18. Analysis of low active-pharmaceutical-ingredient signal drugs based on thin layer chromatography and surface-enhanced Raman spectroscopy.

    PubMed

    Li, Xiao; Chen, Hui; Zhu, Qingxia; Liu, Yan; Lu, Feng

    2016-11-30

    Active pharmaceutical ingredients (API) embedded in the excipients of a formulation can usually be unravelled by normal Raman spectroscopy (NRS). However, more and more drugs with low API content and/or a low Raman scattering coefficient are insensitive to NRS analysis; such drugs are defined here, for the first time, as Low API-Signal Drugs (LASIDs). The NRS spectra of these LASIDs are similar to the profiles of their dominant excipients, such as lactose, starch, and microcrystalline cellulose (MCC), and were classified into three types accordingly. Of 100 kinds of drugs, 21 were screened as LASIDs and characterized further by Raman microscopic mapping. Accordingly, we propose a tailored solution to the qualitative and quantitative analysis of these LASIDs, using surface-enhanced Raman spectroscopic (SERS) detection on the thin-layer chromatographic (TLC) plate both in situ and after separation. Experimental conditions and parameters, including the TLC support matrix, SERS substrate, detection mode, similarity threshold, and internal standard, were optimized. All LASIDs were satisfactorily identified, and the quantitation results agreed well with those of high-performance liquid chromatography (HPLC). Some structural analogues of LASIDs presented highly similar SERS spectra and were difficult to distinguish even with Raman microscopic mapping, but they could be successfully discriminated from each other by coupling SERS (with a portable Raman spectrometer) with TLC. These results demonstrate that the proposed solution can detect LASIDs with high accuracy and cost-effectiveness. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Web-based network analysis and visualization using CellMaps

    PubMed Central

    Salavert, Francisco; García-Alonso, Luz; Sánchez, Rubén; Alonso, Roberto; Bleda, Marta; Medina, Ignacio; Dopazo, Joaquín

    2016-01-01

    Summary: CellMaps is an HTML5 open-source web tool that allows displaying, editing, exploring and analyzing biological networks as well as integrating metadata into them. Computations and analyses are remotely executed in high-end servers, and all the functionalities are available through RESTful web services. CellMaps can easily be integrated in any web page by using an available JavaScript API. Availability and Implementation: The application is available at: http://cellmaps.babelomics.org/ and the code can be found in: https://github.com/opencb/cell-maps. The client is implemented in JavaScript and the server in C and Java. Contact: jdopazo@cipf.es Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27296979

  20. Web-based network analysis and visualization using CellMaps.

    PubMed

    Salavert, Francisco; García-Alonso, Luz; Sánchez, Rubén; Alonso, Roberto; Bleda, Marta; Medina, Ignacio; Dopazo, Joaquín

    2016-10-01

    CellMaps is an HTML5 open-source web tool that allows displaying, editing, exploring and analyzing biological networks as well as integrating metadata into them. Computations and analyses are remotely executed in high-end servers, and all the functionalities are available through RESTful web services. CellMaps can easily be integrated in any web page by using an available JavaScript API. The application is available at http://cellmaps.babelomics.org/ and the code can be found at https://github.com/opencb/cell-maps. The client is implemented in JavaScript and the server in C and Java. Contact: jdopazo@cipf.es. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  1. Harnessing Satellite Imageries in Feature Extraction Using Google Earth Pro

    NASA Astrophysics Data System (ADS)

    Fernandez, Sim Joseph; Milano, Alan

    2016-07-01

    Climate change has been a long-standing concern worldwide, and impending flooding is among its unwanted consequences. The Phil-LiDAR 1 project of the Department of Science and Technology (DOST), Republic of the Philippines, has developed an early warning system for flood hazards. The project uses remote sensing technologies to identify populations in probable danger by mapping and attributing building features using LiDAR datasets and satellite imagery. A free mapping software package named Google Earth Pro (GEP) is used to load these satellite images as base maps. Geotagging of building features has so far been done with handheld Global Positioning System (GPS) units. Alternatively, mapping and attributing building features in GEP saves a substantial amount of resources such as manpower, time, and budget. In terms of accuracy, geotagging in GEP depends on the satellite imagery or on the half-meter-resolution orthophotographs obtained during LiDAR acquisition, rather than on GPS units with three-meter accuracy. The attributed building features are overlaid on the Phil-LiDAR 1 flood hazard map to determine the exposed population. Building features obtained from satellite imagery may be used not only in flood exposure assessment but also in assessing other hazards, among a number of other uses. Several other features may also be extracted from the satellite imagery.

  2. Predicting Ambulance Time of Arrival to the Emergency Department Using Global Positioning System and Google Maps

    PubMed Central

    Fleischman, Ross J.; Lundquist, Mark; Jui, Jonathan; Newgard, Craig D.; Warden, Craig

    2014-01-01

    Objective To derive and validate a model that accurately predicts ambulance arrival time that could be implemented as a Google Maps web application. Methods This was a retrospective study of all scene transports in Multnomah County, Oregon, from January 1 through December 31, 2008. Scene and destination hospital addresses were converted to coordinates. ArcGIS Network Analyst was used to estimate transport times based on street network speed limits. We then created a linear regression model to improve the accuracy of these street network estimates using weather, patient characteristics, use of lights and sirens, daylight, and rush-hour intervals. The model was derived from a 50% sample and validated on the remainder. Significance of the covariates was determined by p < 0.05 for a t-test of the model coefficients. Accuracy was quantified by the proportion of estimates that were within 5 minutes of the actual transport times recorded by computer-aided dispatch. We then built a Google Maps-based web application to demonstrate application in real-world EMS operations. Results There were 48,308 included transports. Street network estimates of transport time were accurate within 5 minutes of actual transport time less than 16% of the time. Actual transport times were longer during daylight and rush-hour intervals and shorter with use of lights and sirens. Age under 18 years, gender, wet weather, and trauma system entry were not significant predictors of transport time. Our model predicted arrival time within 5 minutes 73% of the time. For lights and sirens transports, accuracy was within 5 minutes 77% of the time. Accuracy was identical in the validation dataset. Lights and sirens saved an average of 3.1 minutes for transports under 8.8 minutes, and 5.3 minutes for longer transports. Conclusions An estimate of transport time based only on a street network significantly underestimated transport times. 
A simple model incorporating few variables can predict ambulance time of arrival to the emergency department with good accuracy. This model could be linked to global positioning system data and an automated Google Maps web application to optimize emergency department resource use. Use of lights and sirens had a significant effect on transport times. PMID:23865736
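    A minimal sketch of the kind of correction model the study describes, with invented coefficients (the actual model was derived by linear regression on half of the transports):

```python
# Sketch of a linear correction applied to a street-network transport-time
# estimate.  All coefficient values are assumptions for illustration only;
# the study fit its coefficients by regression on a 50% derivation sample.

def predict_transport_minutes(network_estimate_min, lights_and_sirens,
                              rush_hour, daylight):
    minutes = 1.1 * network_estimate_min + 2.0  # base correction (assumed)
    if lights_and_sirens:
        minutes -= 3.0   # lights/sirens shorten transports (assumed size)
    if rush_hour:
        minutes += 2.5   # rush hour lengthens transports (assumed size)
    if daylight:
        minutes += 1.0   # daylight lengthens transports (assumed size)
    return minutes

print(predict_transport_minutes(10.0, True, False, True))
```

    In a deployed system the network estimate would come from a routing engine and the corrected time would be pushed to the Google Maps web application the study describes.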

  3. Whole-genome scan in thelytokous-laying workers of the Cape honeybee (Apis mellifera capensis): central fusion, reduced recombination rates and centromere mapping using half-tetrad analysis.

    PubMed Central

    Baudry, Emmanuelle; Kryger, Per; Allsopp, Mike; Koeniger, Nikolaus; Vautrin, Dominique; Mougel, Florence; Cornuet, Jean-Marie; Solignac, Michel

    2004-01-01

    While workers of almost all subspecies of honeybee are able to lay only haploid male eggs, Apis mellifera capensis workers are able to produce diploid female eggs by thelytokous parthenogenesis. Cytological analyses have shown that during parthenogenesis, egg diploidy is restored by fusion of the two central meiotic products. This peculiarity of the Cape bee preserves two products of a single meiosis in the daughters and can be used to map centromere positions using half-tetrad analysis. In this study, we use the thelytokous progenies of A. m. capensis workers and a sample of individuals from a naturally occurring A. m. capensis thelytokous clone to map centromere position for most of the linkage groups of the honeybee. We also show that the recombination rate is reduced by >10-fold during the meiosis of A. m. capensis workers. This reduction is restricted to thelytokous parthenogenesis of capensis workers and is not observed in the meiosis of queens within the same subspecies or in arrhenotokous workers of another subspecies. The reduced rate of recombination seems to be associated with negative crossover interference. These results are discussed in relation to the evolution of thelytokous parthenogenesis and the maintenance of heterozygosity and female sex after thelytoky. PMID:15166151

  4. Using the Generic Mapping Tools From Within the MATLAB, Octave and Julia Computing Environments

    NASA Astrophysics Data System (ADS)

    Luis, J. M. F.; Wessel, P.

    2016-12-01

    The Generic Mapping Tools (GMT) is a widely used software infrastructure tool set for analyzing and displaying geoscience data. Its power to analyze and process data and produce publication-quality graphics has made it one of several standard processing toolsets used by a large segment of the Earth and Ocean Sciences. GMT's strengths lie in superior publication-quality vector graphics, geodetic-quality map projections, robust data processing algorithms scalable to enormous data sets, and ability to run under all common operating systems. The GMT tool chest offers over 120 modules sharing a common set of command options, file structures, and documentation. GMT modules are command line tools that accept input and write output, and this design allows users to write scripts in which one module's output becomes another module's input, creating highly customized GMT workflows. With the release of GMT 5, these modules are high-level functions with a C API, potentially allowing users access to high-level GMT capabilities from any programmable environment. Many scientists who use GMT also use other computational tools, such as MATLAB® and its clone Octave. We have built a MATLAB/Octave interface on top of the GMT 5 C API. Thus, MATLAB or Octave now has full access to all GMT modules as well as fundamental input/output of GMT data objects via a MEX function. Internally, the GMT/MATLAB C API defines six high-level composite data objects that handle input and output of data via individual GMT modules. These are data tables, grids, text tables (text/data mixed records), color palette tables, raster images (1-4 color bands), and PostScript. The API is responsible for translating between the six GMT objects and the corresponding native MATLAB objects. References to data arrays are passed if transposing of matrices is not required. 
The GMT and MATLAB/Octave combination is extremely flexible, letting the user harvest the general numerical and graphical capabilities of both systems, and represents a giant step forward in interoperability between GMT and other software packages. We will present examples of the symbiotic benefits of combining these platforms. Two other extensions are also in the works: a nearly finished Julia wrapper and an embryonic Python module. Publication supported by FCT project UID/GEO/50019/2013 - Instituto D. Luiz
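    The module-chaining workflow described above (one module's output feeding another's input) can also be sketched from Python by composing a shell pipeline of GMT module calls. This assumes a GMT 5 installation providing the `gmt` wrapper command; the file names and option values are hypothetical.

```python
# Sketch: chaining GMT 5 modules by building a shell pipeline string,
# mirroring the command-line workflow design described above.  Module
# names (blockmean, surface) are standard GMT modules; the data file,
# region and increment values are hypothetical examples.

def gmt_pipeline(*stages):
    """Join GMT module invocations into one shell pipeline string."""
    return " | ".join("gmt " + stage for stage in stages)

cmd = gmt_pipeline(
    "blockmean data.xyz -R0/10/0/10 -I1",   # pre-filter scattered points
    "surface -R0/10/0/10 -I1 -Gout.grd",    # grid them with splines
)
print(cmd)
# With GMT installed, one could run it via
# subprocess.run(cmd, shell=True, check=True)
```

    The GMT/MATLAB C API discussed in the record removes the need for such shell round-trips by passing the six composite data objects in memory instead.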

  5. Mapping rice extent map with crop intensity in south China through integration of optical and microwave images based on google earth engine

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Wu, B.; Zhang, M.; Zeng, H.

    2017-12-01

    Rice is one of the main staple foods in East and Southeast Asia, feeding more than half of the world's population from 11% of the cultivated land. Studies of rice can provide direct or indirect information on food security and water resource management. Remote sensing has proven to be the most effective method for monitoring cropland at large scales by using temporal and spectral information. Two main kinds of satellite sensors have been used to map rice: microwave and optical. The main feature distinguishing rice, the principal crop of paddy fields, from other crops is the flooding phenomenon at the planting stage (Figure 1). Microwave satellites can penetrate clouds and are efficient at monitoring this flooding phenomenon, while vegetation indices based on optical satellites can distinguish rice well from other vegetation. Google Earth Engine is a cloud-based platform that makes it easy to access high-performance computing resources for processing very large geospatial datasets. Google has collected a large number of remote sensing satellite datasets from around the world, giving researchers the possibility of building applications with multi-source remote sensing data over large areas. In this work, we map the rice planting area in south China through integration of Landsat-8 OLI, Sentinel-2, and Sentinel-1 Synthetic Aperture Radar (SAR) images. The flowchart is shown in Figure 2. First, a threshold method using the VH-polarized backscatter from the SAR sensor and vegetation indices from the optical sensors, including the normalized difference vegetation index (NDVI) and enhanced vegetation index (EVI), was used to classify the rice extent map. Forest and water-surface extent maps provided by Earth Engine were used to mask forest and water. To overcome the "salt and pepper" effect of pixel-based classification at increased spatial resolution, we segment the optical image and merge the pixel-based classification results with the object-oriented segmentation data to obtain the final rice extent map. Finally, time series analysis was used to obtain the peak count for each rice area and thereby determine the crop intensity. In this work, rice ground points from the GVG crowdsourcing smartphone application and rice area statistics from the National Bureau of Statistics were used to validate and evaluate our result.
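    The per-pixel threshold rule described in this record can be sketched as follows; the threshold values and time series below are invented for illustration, not the study's calibrated parameters:

```python
# Illustrative sketch of a rule-based rice classifier: a pixel is flagged
# as rice if its SAR VH backscatter time series shows a flooding-season
# minimum below a threshold AND its optical NDVI later rises above a
# vegetation threshold.  Both thresholds are assumptions for illustration.

VH_FLOOD_DB = -20.0   # assumed VH backscatter threshold for flooded paddies
NDVI_VEG = 0.6        # assumed NDVI threshold for the growing season

def is_rice(vh_series_db, ndvi_series):
    flooded = min(vh_series_db) < VH_FLOOD_DB      # transplanting flood dip
    vegetated = max(ndvi_series) > NDVI_VEG        # later canopy green-up
    return flooded and vegetated

print(is_rice([-15.2, -22.8, -14.1], [0.2, 0.45, 0.75]))  # -> True
```

    On Earth Engine the same logic would run as per-pixel reductions over Sentinel-1 and optical image collections rather than over Python lists.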

  6. Integrating Radar Image Data with Google Maps

    NASA Technical Reports Server (NTRS)

    Chapman, Bruce D.; Gibas, Sarah

    2010-01-01

    A public Web site has been developed as a method for displaying the multitude of radar imagery collected by NASA's Airborne Synthetic Aperture Radar (AIRSAR) instrument during its 16-year mission. Building on NASA's internal AIRSAR site, the new Web site features more sophisticated visualization tools that give the general public access to these images. The site was originally maintained at NASA on six computers: one that held the Oracle database, two that ran the software for the interactive map, and three that served the Web site itself. Several tasks were involved in moving this complicated setup to just one computer. First, the AIRSAR database was migrated from Oracle to MySQL. Then the back-end of the AIRSAR Web site was updated to access the MySQL database. To do this, a few of the scripts needed to be modified, specifically three Perl scripts that query the database. The database connections were then updated from Oracle to MySQL, numerous syntax errors were corrected, and a query was implemented to replace one of the stored Oracle procedures. Lastly, the interactive map was designed, implemented, and tested so that users could easily browse and access the radar imagery through the Google Maps interface.

  7. Wilber 3: A Python-Django Web Application For Acquiring Large-scale Event-oriented Seismic Data

    NASA Astrophysics Data System (ADS)

    Newman, R. L.; Clark, A.; Trabant, C. M.; Karstens, R.; Hutko, A. R.; Casey, R. E.; Ahern, T. K.

    2013-12-01

    Since 2001, the IRIS Data Management Center (DMC) WILBER II system has provided a convenient web-based interface for locating seismic data related to a particular event, and requesting a subset of that data for download. Since its launch, both the scale of available data and the technology of web-based applications have developed significantly. Wilber 3 is a ground-up redesign that leverages a number of public and open-source projects to provide an event-oriented data request interface with a high level of interactivity and scalability for multiple data types. Wilber 3 uses the IRIS/Federation of Digital Seismic Networks (FDSN) web services for event data, metadata, and time-series data. Combining a carefully optimized Google Map with the highly scalable SlickGrid data API, the Wilber 3 client-side interface can load tens of thousands of events or networks/stations in a single request, and provide instantly responsive browsing, sorting, and filtering of event data and metadata in the web browser, without further reliance on the data service. The server-side of Wilber 3 is a Python-Django application, one of over a dozen developed in the last year at IRIS, whose common framework, components, and administrative overhead represent a massive savings in developer resources. Requests for assembled datasets, which may include thousands of data channels and gigabytes of data, are queued and executed using the Celery distributed Python task scheduler, giving Wilber 3 the ability to operate in parallel across a large number of nodes.
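    Event queries of the kind Wilber 3 issues can be sketched as FDSN web-service URLs. The parameter names below follow the FDSN web-service conventions; the host is the IRIS service endpoint and the magnitude/time values are arbitrary examples.

```python
# Sketch: composing an FDSN event web-service query URL of the kind a
# client like Wilber 3 sends for event metadata.  Values are examples.

from urllib.parse import urlencode

def fdsn_event_url(base, **params):
    """Build an FDSN event-service query URL from keyword parameters."""
    return base + "/fdsnws/event/1/query?" + urlencode(sorted(params.items()))

url = fdsn_event_url("http://service.iris.edu",
                     starttime="2013-01-01", endtime="2013-02-01",
                     minmagnitude=6.0, format="text")
print(url)
```

    The response (here requested as text) would be parsed client-side and loaded into the map/grid interface described above.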

  8. Microreact: visualizing and sharing data for genomic epidemiology and phylogeography

    PubMed Central

    Argimón, Silvia; Abudahab, Khalil; Goater, Richard J. E.; Fedosejev, Artemij; Bhai, Jyothish; Glasner, Corinna; Feil, Edward J.; Holden, Matthew T. G.; Yeats, Corin A.; Grundmann, Hajo; Spratt, Brian G.

    2016-01-01

    Visualization is frequently used to aid our interpretation of complex datasets. Within microbial genomics, visualizing the relationships between multiple genomes as a tree provides a framework onto which associated data (geographical, temporal, phenotypic and epidemiological) are added to generate hypotheses and to explore the dynamics of the system under investigation. Selected static images are then used within publications to highlight the key findings to a wider audience. However, these images are a very inadequate way of exploring and interpreting the richness of the data. There is, therefore, a need for flexible, interactive software that presents the population genomic outputs and associated data in a user-friendly manner for a wide range of end users, from trained bioinformaticians to front-line epidemiologists and health workers. Here, we present Microreact, a web application for the easy visualization of datasets consisting of any combination of trees, geographical, temporal and associated metadata. Data files can be uploaded to Microreact directly via the web browser or by linking to their location (e.g. from Google Drive/Dropbox or via API), and an integrated visualization via trees, maps, timelines and tables provides interactive querying of the data. The visualization can be shared as a permanent web link among collaborators, or embedded within publications to enable readers to explore and download the data. Microreact can act as an end point for any tool or bioinformatic pipeline that ultimately generates a tree, and provides a simple, yet powerful, visualization method that will aid research and discovery and the open sharing of datasets. PMID:28348833

  9. "Where On Mars?": An Open Planetary Mapping Platform for Researchers, Educators, and the General Public

    NASA Astrophysics Data System (ADS)

    Manaud, Nicolas; Carter, John; Boix, Oriol

    2016-10-01

    The "Where On Mars?" project is essentially the evolution of an existing outreach product developed in collaboration between ESA and CartoDB: an interactive map visualisation of ESA's ExoMars Rover candidate landing sites (whereonmars.co). Planetary imagery data and maps are increasingly produced by the scientific community, and are typically shared as images in scientific publications, presentations, or public outreach websites. However, this medium lacks the interactivity and contextual information needed for further exploration, making it difficult for any audience to relate one piece of location-based information to another. We believe that interactive web maps are a powerful way of telling stories and of engaging with and educating people who, over the last decade, have become familiar with tools such as Google Maps. A few planetary web maps exist, but they are either too complex for non-experts or are closed systems that do not allow anyone to publish and share content. The long-term vision for the project is to provide researchers, communicators, educators and a worldwide public with an open planetary mapping and social platform enabling them to create, share, communicate and consume research-based content. We aim for this platform to become the reference website everyone will go to in order to learn about Mars and other planets in our Solar System, just as people head to Google Maps to find their bearings or any location-based information. The driver is clearly to create an emotional connection between people and Mars. The short-term objectives for the project are (1) to produce and curate an open repository of basemaps, geospatial data sets, map visualisations, and story maps; and (2) to develop a beautifully crafted and engaging interactive map of Mars. Based on user-generated content, the underlying framework should (3) make it easy to create and share additional interactive maps telling specific stories.

  10. The RNASeq-er API-a gateway to systematically updated analysis of public RNA-seq data.

    PubMed

    Petryszak, Robert; Fonseca, Nuno A; Füllgrabe, Anja; Huerta, Laura; Keays, Maria; Tang, Y Amy; Brazma, Alvis

    2017-07-15

    The exponential growth of publicly available RNA-sequencing (RNA-Seq) data poses an increasing challenge to researchers wishing to discover, analyse and store such data, particularly those based in institutions with limited computational resources. EMBL-EBI is in an ideal position to address these challenges and to allow the scientific community easy access to not just raw, but also processed RNA-Seq data. We present a Web service providing access to the results of a systematically and continually updated standardized alignment, as well as gene and exon expression quantification, of all public bulk (and in the near future also single-cell) RNA-Seq runs in 264 species in the European Nucleotide Archive (ENA), using Representational State Transfer. The RNASeq-er API (Application Programming Interface) enables ontology-powered search for and retrieval of CRAM, bigwig and bedGraph files, gene and exon expression quantification matrices (Fragments Per Kilobase of Exon Per Million Fragments Mapped, Transcripts Per Million, raw counts), as well as sample attributes annotated with ontology terms. To date, over 270 000 RNA-Seq runs in nearly 10 000 studies (1 PB of raw FASTQ data) in 264 species in ENA have been processed and made available via the API. The RNASeq-er API can be accessed at http://www.ebi.ac.uk/fg/rnaseq/api. The commands used to analyse the data are available in the supplementary materials and at https://github.com/nunofonseca/irap/wiki/iRAP-single-library. Contact: rnaseq@ebi.ac.uk; rpetry@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
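    A thin helper for composing REST calls against the API base URL given in this record; note that the endpoint path used in the example is a hypothetical illustration, not a documented route:

```python
# Sketch: composing request URLs against the RNASeq-er API base URL from
# the abstract.  The path segments in the example call are hypothetical;
# real endpoint names must be taken from the API's documentation.

BASE = "http://www.ebi.ac.uk/fg/rnaseq/api"

def api_url(*path_parts):
    """Join path segments onto the API base URL."""
    return "/".join((BASE,) + tuple(str(p) for p in path_parts))

# Hypothetical example call (endpoint and accession invented):
print(api_url("json", "getRunsByStudy", "E-MTAB-1234"))
```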

  11. Learning the Semantics of Structured Data Sources

    ERIC Educational Resources Information Center

    Taheriyan, Mohsen

    2015-01-01

    Information sources such as relational databases, spreadsheets, XML, JSON, and Web APIs contain a tremendous amount of structured data, however, they rarely provide a semantic model to describe their contents. Semantic models of data sources capture the intended meaning of data sources by mapping them to the concepts and relationships defined by a…

  12. Measuring Neighborhood Walkable Environments: A Comparison of Three Approaches

    PubMed Central

    Chiang, Yen-Cheng; Sullivan, William; Larsen, Linda

    2017-01-01

    Multiple studies have revealed the impact of walkable environments on physical activity. Scholars attach considerable importance to leisure and health-related walking. Recent studies have used Google Street View as an instrument to assess city streets and walkable environments; however, no study has compared the validity of Google Street View assessments of walkable environment attributes to assessments made by local residents and compiled from field visits. In this study, we involved nearby residents and compared the extent to which Google Street View assessments of the walkable environment correlated with assessments from local residents and with field visits. We determined the assessment approaches (local resident or field visit assessments) that exhibited the highest agreement with Google Street View. One city with relatively high-quality walkable environments and one city with relatively low-quality walkable environments were examined, and three neighborhoods from each city were surveyed. Participants in each neighborhood used one of three approaches to assess the walkability of the environment: 15 local residents assessed the environment using a map, 15 participants made a field visit to assess the environment, and 15 participants used Google Street View to assess the environment, yielding a total of 90 valid samples for the two cities. Findings revealed that the three approaches to assessing neighborhood walkability were highly correlated for traffic safety, aesthetics, sidewalk quality, and physical barriers. Compared with assessments from participants making field visits, assessments by local residents were more highly correlated with Google Street View assessments. Google Street View provides a more convenient, low-cost, efficient, and safe approach to assess neighborhood walkability. The results of this study may facilitate future large-scale walkable environment surveys, effectively reduce expenses, and improve survey efficiency. PMID:28587186

  13. Assessing the environmental characteristics of cycling routes to school: a study on the reliability and validity of a Google Street View-based audit.

    PubMed

    Vanwolleghem, Griet; Van Dyck, Delfien; Ducheyne, Fabian; De Bourdeaudhuij, Ilse; Cardon, Greet

    2014-06-10

    Google Street View provides a valuable and efficient alternative to on-site fieldwork for observing the physical environment. However, studies on the use, reliability and validity of Google Street View in a cycling-to-school context are lacking. We aimed to study the intra- and inter-rater reliability and criterion validity of EGA-Cycling (Environmental Google Street View Based Audit - Cycling to school), a newly developed audit using Google Street View to assess the physical environment along cycling routes to school. Parents (n = 52) of 11-to-12-year-old Flemish children who mostly cycled to school completed a questionnaire and identified their child's cycling route to school on a street map. Fifty cycling routes of 11-to-12-year-olds were identified, and physical environmental characteristics along the identified routes were rated with EGA-Cycling (5 subscales; 37 items), based on Google Street View. To assess reliability, two researchers performed the audit. Criterion validity of the audit was examined by comparing the ratings based on Google Street View with ratings from on-site assessments. Intra-rater reliability was high (kappa range 0.47-1.00). Large variations in the inter-rater reliability (kappa range -0.03-1.00) and criterion validity scores (kappa range -0.06-1.00) were reported, with acceptable inter-rater reliability values for 43% of all items and acceptable criterion validity for 54% of all items. EGA-Cycling can be used to assess physical environmental characteristics along cycling routes to school. However, to assess the micro-environment specifically related to cycling, on-site assessments have to be added.
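    The kappa statistic reported throughout this record is Cohen's kappa, which corrects observed rater agreement for the agreement expected by chance. A short, self-contained implementation (the example ratings are invented):

```python
# Minimal Cohen's kappa: chance-corrected agreement between two raters
# over the same items.  Example ratings below are invented.

from collections import Counter

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    # Observed proportion of items on which the raters agree:
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from each rater's marginal category frequencies:
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

r1 = ["good", "good", "poor", "good", "poor", "poor"]
r2 = ["good", "poor", "poor", "good", "poor", "good"]
print(round(cohens_kappa(r1, r2), 2))  # -> 0.33
```

    Values near 1 indicate near-perfect agreement (as in the study's intra-rater results), while values near or below 0 indicate agreement no better than chance (as for some inter-rater items).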

  14. Geolokit: An interactive tool for visualising and exploring geoscientific data in Google Earth

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Antoine; Watlet, Arnaud; Bastin, Christophe

    2017-10-01

    Virtual globes have been developed to showcase different types of data, combining a digital elevation model and basemaps of high-resolution satellite imagery. Hence, they have become a standard way to share spatial data and information, although they suffer from a lack of toolboxes dedicated to the formatting of large geoscientific datasets. From this perspective, we developed Geolokit: free and lightweight software that allows geoscientists - and every scientist working with spatial data - to import their data (e.g., sample collections, structural geology, cross-sections, field pictures, georeferenced maps), to handle them, and to transcribe them to Keyhole Markup Language (KML) files. KML files are then automatically opened in the Google Earth virtual globe and the spatial data accessed and shared. Geolokit comes with a large number of dedicated tools that can process and display: (i) multi-points data, (ii) scattered data interpolations, (iii) structural geology features in 2D and 3D, (iv) rose diagrams, stereonets and dip-plunge polar histograms, (v) cross-sections and oriented rasters, (vi) georeferenced field pictures, (vii) georeferenced maps and projected gridding. Therefore, together with Geolokit, Google Earth becomes not only a powerful georeferenced data viewer but also a stand-alone work platform. The toolbox (available online at http://www.geolokit.org) is written in Python, a high-level, cross-platform programming language, and is accessible through a graphical user interface designed to run in parallel with Google Earth, through a workflow that requires no additional third-party software. Geolokit features are demonstrated in this paper using typical datasets gathered from two case studies illustrating its applicability at multiple scales of investigation: a petro-structural investigation of the Ile d'Yeu orthogneissic unit (Western France) and data collection of the Mariana oceanic subduction zone (Western Pacific).
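    The KML transcription step that Geolokit automates can be sketched as follows; this minimal point-placemark generator (with invented sample names and coordinates) is illustrative, not Geolokit's actual code:

```python
# Sketch: writing a minimal KML document with point placemarks, the kind
# of file a tool like Geolokit emits for sample-collection data.
# Names and coordinates are invented examples.

KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
{placemarks}
  </Document>
</kml>"""

def placemark(name, lon, lat):
    return (f"    <Placemark><name>{name}</name>"
            f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
            f"</Placemark>")

def to_kml(points):
    """Render (name, lon, lat) tuples as a KML document string."""
    body = "\n".join(placemark(n, lon, lat) for n, lon, lat in points)
    return KML_TEMPLATE.format(placemarks=body)

doc = to_kml([("Sample A", -2.35, 46.70)])
print(doc)
```

    Saving the string to a `.kml` file and opening it in Google Earth displays the points on the globe, which is exactly the hand-off Geolokit performs automatically.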

  15. The military health system's personal health record pilot with Microsoft HealthVault and Google Health

    PubMed Central

    Barnhill, Rick; Heermann-Do, Kimberly A; Salzman, Keith L; Gimbel, Ronald W

    2011-01-01

    Objective To design, build, implement, and evaluate a personal health record (PHR), tethered to the Military Health System, that leverages Microsoft® HealthVault and Google® Health infrastructure based on user preference. Materials and methods A pilot project was conducted in 2008–2009 at Madigan Army Medical Center in Tacoma, Washington. Our PHR was architected as a flexible platform that incorporated the standards-based Continuity of Care Document and Continuity of Care Record models to map Department of Defense-sourced health data, via a secure Veterans Administration data broker, to Microsoft® HealthVault and Google® Health based on user preference. The project design and implementation were guided by provider and patient advisory panels with formal user evaluation. Results The pilot project included 250 beneficiary users. Approximately 73.2% of users were <65 years of age, and 38.4% were female. Of the users, 169 (67.6%) selected Microsoft® HealthVault, and 81 (32.4%) selected Google® Health as their PHR of preference. Sample evaluation of users reflected 100% (n=60) satisfied with convenience of record access and 91.7% (n=55) satisfied with overall functionality of the PHR. Discussion Key lessons learned related to data-transfer decisions (push vs pull), purposeful delays in reporting sensitive information, understanding and mapping PHR use and clinical workflow, and decisions on information patients may choose to share with their provider. Conclusion Currently PHRs are being viewed as empowering tools for patient activation. Design and implementation issues (eg, technical, organizational, information security) are substantial and must be thoughtfully approached. Adopting standards into design can enhance the national goal of portability and interoperability. PMID:21292705

  16. Usability evaluation of cloud-based mapping tools for the display of very large datasets

    NASA Astrophysics Data System (ADS)

    Stotz, Nicole Marie

The elasticity and on-demand nature of cloud services have made it easier to create web maps. Users need only a web browser and an Internet connection to use cloud-based web maps, eliminating the need for specialized software. To reach a wide variety of users, a map must be well designed; usability is a central concern in web map design. Fusion Tables, a newer product from Google, is one example of cloud-based distributed GIS services. It allows easy spatial data manipulation and visualization within the Google Maps framework. ESRI has also introduced a cloud-based version of its software, ArcGIS Online, built on Amazon's EC2 cloud. Following a user-centered design framework, two prototype maps were created with data from the San Diego East County Economic Development Council: one built on Fusion Tables, the other on ESRI's ArcGIS Online. A usability analysis was conducted to compare the two prototypes in terms of design and functionality. Load tests were also run, and performance metrics were gathered for both prototypes. The usability analysis was completed by 25 geography students and consisted of timed tasks and questions on map design and functionality. Participants completed the timed tasks more quickly on the Fusion Tables prototype than on the ArcGIS Online prototype. While responses were generally positive toward the design and functionality of both prototypes, the Fusion Tables prototype was preferred overall. For the load tests, the dataset was broken into 22 groups for a total of 44 tests. Although the Fusion Tables prototype performed more efficiently than the ArcGIS Online prototype, the differences were almost unnoticeable. A SWOT analysis was conducted for each prototype. The results of this research favor the Fusion Tables prototype.
A redesign of this prototype would incorporate design suggestions from the usability survey, though some functionality would need to be dropped. Fusion Tables is a free product and would therefore be the best option if cost is an issue, but the map may not be supported in the future.

  17. DistMap: a toolkit for distributed short read mapping on a Hadoop cluster.

    PubMed

    Pandey, Ram Vinay; Schlötterer, Christian

    2013-01-01

    With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/

  18. DistMap: A Toolkit for Distributed Short Read Mapping on a Hadoop Cluster

    PubMed Central

    Pandey, Ram Vinay; Schlötterer, Christian

    2013-01-01

    With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/ PMID:24009693

  19. Learning GIS and exploring geolocated data with the all-in-one Geolokit toolbox for Google Earth

    NASA Astrophysics Data System (ADS)

    Watlet, A.; Triantafyllou, A.; Bastin, C.

    2016-12-01

GIS software is today an essential tool for gathering and visualizing geological data, applying spatial and temporal analyses, and creating and sharing interactive maps for further investigation in the geosciences. These skills are especially important for students to learn as they go through field trips, sample collections, or field experiments. However, there is generally not enough time to teach all aspects of visualizing geolocated geoscientific data in detail. For these purposes, we developed Geolokit: a lightweight freeware dedicated to geodata visualization, written in Python, a high-level, cross-platform programming language. Geolokit is accessible through a graphical user interface designed to run alongside Google Earth, benefitting from its numerous interactive capabilities. It is a very user-friendly toolbox that allows `geo-users' to import their raw data (e.g. GPS tracks, sample locations, structural data, field pictures, maps), apply fast data-analysis tools, and visualize the results in the Google Earth environment as KML code, with no third-party software required other than Google Earth itself. Geolokit comes with a large number of geoscience labels, symbols, colours and placemarks, and can display several types of geolocated data, including: multi-point datasets; automatically computed contours of multi-point datasets via several interpolation methods; discrete planar and linear structural geology data in 2D or 3D, supporting a large range of structure input formats; clustered stereonets and rose diagrams; 2D cross-sections as vertical sections; georeferenced maps and grids with user-defined coordinates; and field pictures located using either the geo-tagging metadata of a camera's built-in GPS module or the same-day track of an external GPS. In short, Geolokit helps users quickly visualize and explore data without getting lost in the numerous capabilities of full GIS software suites.
We invite students and teachers to discover the full functionality of Geolokit. As the project is under development and planned to be open source, we welcome discussions of particular needs or ideas, as well as contributions to the Geolokit project.
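As a rough illustration of the kind of KML output a tool like Geolokit produces for Google Earth, the sketch below builds a minimal KML document of placemarks using only the Python standard library. This is not Geolokit's actual code; the function name and sample data are invented for the example:

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def samples_to_kml(samples):
    """Build a minimal KML document with one Placemark per (name, lat, lon) sample."""
    ET.register_namespace("", KML_NS)  # serialize KML tags in the default namespace
    kml = ET.Element("{%s}kml" % KML_NS)
    doc = ET.SubElement(kml, "{%s}Document" % KML_NS)
    for name, lat, lon in samples:
        pm = ET.SubElement(doc, "{%s}Placemark" % KML_NS)
        ET.SubElement(pm, "{%s}name" % KML_NS).text = name
        point = ET.SubElement(pm, "{%s}Point" % KML_NS)
        # KML coordinates are written longitude,latitude[,altitude]
        ET.SubElement(point, "{%s}coordinates" % KML_NS).text = "%f,%f" % (lon, lat)
    return ET.tostring(kml, encoding="unicode")

kml_text = samples_to_kml([("Sample-01", 50.45, 3.95)])
```

Saving the returned string to a .kml file and opening it in Google Earth would show one pin per sample; note the longitude-first coordinate order, a common stumbling block when generating KML.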

  20. Large Scale Crop Mapping in Ukraine Using Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Shelestov, A.; Lavreniuk, M. S.; Kussul, N.

    2016-12-01

There are currently no globally available high-resolution satellite-derived crop-specific maps; only coarse-resolution imagery (>250 m spatial resolution) has been used to derive global cropland extent. In 2016 we are carrying out a country-level demonstration of Sentinel-2 use for crop classification in Ukraine within the ESA Sen2-Agri project. Optical imagery, however, can be contaminated by cloud cover, which makes it difficult to acquire imagery in the optimal time range to discriminate certain crops. Thanks to the Copernicus programme, a large volume of Sentinel-1 SAR data at high spatial resolution has been freely available for Ukraine since 2015, allowing us to use SAR time series for crop classification. Our experiment for one administrative region in 2015 showed much higher crop classification accuracy with SAR data than with optical-only time series [1, 2]. Therefore, in 2016, within the Google Earth Engine Research Award, we use SAR data together with optical data for large-area crop mapping (the entire territory of Ukraine) using the cloud computing capabilities of Google Earth Engine (GEE). This study compares different classification methods for crop mapping over the whole territory of Ukraine using data and algorithms from GEE. Classification performance is assessed using overall classification accuracy, Kappa coefficients, and user's and producer's accuracies. Crop areas from the derived classification maps are also compared to official statistics [3]. References: S. Skakun et al., "Efficiency assessment of multitemporal C-band Radarsat-2 intensity and Landsat-8 surface reflectance satellite imagery for crop classification in Ukraine," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2015, DOI: 10.1109/JSTARS.2015.2454297. N. Kussul, S. Skakun, A. Shelestov, O. Kussul, "The use of satellite SAR imagery to crop classification in Ukraine within JECAM project," IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 1497-1500, 13-18 July 2014, Quebec City, Canada. F.J. Gallego, N. Kussul, S. Skakun, O. Kravchenko, A. Shelestov, O. Kussul, "Efficiency assessment of using satellite data for crop area estimation in Ukraine," International Journal of Applied Earth Observation and Geoinformation, vol. 29, pp. 22-30, 2014.

  1. Results of Prospecting of Impact Craters in Morocco

    NASA Astrophysics Data System (ADS)

    Chaabout, S.; Chennaoui Aoudjehane, H.; Reimold, W. U.; Baratoux, D.

    2014-09-01

This work is based on the use of Google Earth satellite imagery and Yahoo Maps scenes; we examined the surface of Morocco to locate structures with a circular morphology that could potentially be impact craters.

  2. Measuring the Carolina Bays Using Archetype Template Overlays on the Google Earth Virtual Globe; Planform Metrics for 25,000 Bays Extracted from LiDAR and Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Davias, M. E.; Gilbride, J. L.

    2011-12-01

Aerial photographs of Carolina bays taken in the 1930s sparked the initial research into their geomorphology. Satellite imagery available today through the Google Earth virtual globe facility expands the regions available for interrogation, but reveals only part of their unique planforms. Digital Elevation Maps (DEMs) using Light Detection And Ranging (LiDAR) remote sensing data accentuate the visual presentation of these aligned ovoid shallow basins by emphasizing their robust circumferential rims. To support a geospatial survey of Carolina bay landforms in the continental USA, 400,000 km2 of HSV-shaded DEMs were created as KML-JPEG tile sets, the majority generated with LiDAR-derived data. We demonstrate the tile generation process and their integration into Google Earth, where the DEMs augment available photographic imagery for the visualization of bay planforms. While the generic Carolina bay planform is considered oval, we document subtle regional variations. Using a small set of empirically derived planform shapes, we created corresponding Google Earth overlay templates. We demonstrate the analysis of an individual Carolina bay by placing an appropriate overlay onto the virtual globe, then orienting, sizing, and rotating it with its edit handles until it satisfactorily represents the bay's rim. The resulting overlay data element is extracted from Google Earth's object directory and programmatically processed to generate metrics such as geographic location, elevation, major and minor axes, and inferred orientation. Utilizing a virtual globe facility for data capture may yield higher-quality data than methods that reference flat maps, where the geospatial shape and orientation of the bays can be skewed and distorted by the orthographic projection process. Using the methodology described, we have measured over 25,000 distinct Carolina bays.
We discuss the Google Fusion geospatial data repository facility, through which these data have been assembled and made web-accessible to other researchers. Preliminary findings from the survey are discussed, such as how bay surface area, eccentricity and orientation vary across ~800 1/4° × 1/4° grid elements. Future work includes measuring 25k additional bays, as well as interrogation of the orientation data to identify any possible systematic geospatial relationships.
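The per-bay metrics described above (major and minor axes, orientation, and derived quantities such as eccentricity) follow from elementary ellipse geometry once the overlay's fitted ellipse is known. A minimal sketch, with invented example values rather than actual survey data:

```python
import math

def ellipse_metrics(major_m, minor_m, bearing_deg):
    """Derive planform metrics for a bay rim approximated as an ellipse
    (axes in metres, bearing in degrees clockwise from north)."""
    if minor_m > major_m:                      # keep axes in canonical order
        major_m, minor_m = minor_m, major_m
    eccentricity = math.sqrt(1.0 - (minor_m / major_m) ** 2)
    area_km2 = math.pi * (major_m / 2.0) * (minor_m / 2.0) / 1e6
    orientation_deg = bearing_deg % 180.0      # an axis direction is modulo 180 degrees
    return {"eccentricity": eccentricity,
            "area_km2": area_km2,
            "orientation_deg": orientation_deg}

metrics = ellipse_metrics(1200.0, 900.0, 325.0)
```

For a hypothetical 1200 m by 900 m bay this gives an eccentricity of about 0.66 and a surface area of about 0.85 km2; aggregating such values over the grid cells of the survey is how the regional eccentricity and orientation comparisons can be made.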

  3. Mapping of Sample Collection Data: GIS Tools for the Natural Product Researcher

    PubMed Central

    Oberlies, Nicholas H.; Rineer, James I.; Alali, Feras Q.; Tawaha, Khaled; Falkinham, Joseph O.; Wheaton, William D.

    2009-01-01

Scientists engaged in the research of natural products often either conduct field collections themselves or collaborate with partners who do, such as botanists, mycologists, or SCUBA divers. The information gleaned from such collecting trips (e.g. longitude/latitude coordinates, geography, elevation, and a multitude of other field observations) has provided valuable data to the scientific community (e.g., biodiversity), even if it is tangential to the direct aims of the natural products research, which are often focused on drug discovery and/or chemical ecology. Geographic Information Systems (GIS) have been used to display, manage, and analyze geographic data, including collection sites for natural products. However, to the uninitiated, these tools are often beyond the financial and/or computational means of the natural product scientist. With new, free, and easy-to-use geospatial visualization tools, such as Google Earth, mapping and geographic imaging of sampling data are now within the reach of natural products scientists. The goals of the present study were to develop simple tools that are tailored for the natural products setting, thereby presenting a means to map such information, particularly via open source software like Google Earth. PMID:20161345

  4. Satellite Radar Detects Damage from Sept. 19, 2017 Raboso, Mexico, Quake

    NASA Image and Video Library

    2017-09-20

    The Advanced Rapid Imaging and Analysis (ARIA) team at NASA's Jet Propulsion Laboratory in Pasadena, California, and Caltech, also in Pasadena, created this Damage Proxy Map (DPM) depicting areas of Central Mexico, including Mexico City, that are likely damaged (shown by red and yellow pixels) from the magnitude 7.1 Raboso earthquake of Sept. 19, 2017 (local time). The map is derived from synthetic aperture radar (SAR) images from the Copernicus Sentinel-1A and Sentinel-1B satellites, operated by the European Space Agency (ESA). The images were taken before (Sept. 8, 2017) and after (Sept. 20, 2017) the earthquake. The map covers an area of 109 by 106 miles (175 by 170 kilometers). Each pixel measures about 33 yards (30 meters) across. The color variation from yellow to red indicates increasingly more significant ground and building surface change. Preliminary validation was done by comparing the DPM to a crowd-sourced Google Map (https://www.google.com/maps/d/u/0/viewer?mid=1_-V97lbdgLFHpx-CtqhLWlJAnYY&ll=19.41452166501326%2C-99.16498240436704&z=16). This damage proxy map should be used as guidance to identify damaged areas, and may be less reliable over vegetated areas. Sentinel-1 data were accessed through the Copernicus Open Access Hub. The image contains modified Copernicus Sentinel data (2017), processed by ESA and analyzed by the NASA-JPL/Caltech ARIA team. This research was carried out at JPL under contract with NASA. https://photojournal.jpl.nasa.gov/catalog/PIA21963

  5. Measurable realistic image-based 3D mapping

    NASA Astrophysics Data System (ADS)

    Liu, W.; Wang, J.; Wang, J. J.; Ding, W.; Almagbile, A.

    2011-12-01

Maps with 3D visual models are becoming a remarkable feature of 3D map services. High-resolution image data are obtained for the construction of 3D visualized models. The 3D map not only provides the capabilities of 3D measurement and knowledge mining, but also provides a virtual experience of places of interest, as demonstrated in Google Earth. Applications of 3D maps are expanding into the areas of architecture, property management, and urban environment monitoring. However, the reconstruction of high-quality 3D models is time consuming, and requires robust hardware and powerful software to handle the enormous amount of data. This is especially true for the automatic generation of 3D models and the representation of complicated surfaces, which still need improvements in visualisation techniques. The shortcoming of 3D model-based maps is their limited coverage of detail, since a user can only view and measure objects that have already been modelled in the virtual environment. This paper proposes and demonstrates a realistic, image-based 3D map concept that enables geometric measurements and geo-location services. Image-based 3D maps also provide more detailed information about the real world than 3D model-based maps. They use geo-referenced stereo images or panoramic images; the geometric relationships between objects in the images can be resolved from the geometric model of the stereo images. The panoramic function makes 3D maps more interactive for users and also creates an immersive experience. Unmeasurable image-based 3D maps already exist, such as Google Street View, but they only provide virtual experiences in the form of photos; topographic and terrain attributes, such as shapes and heights, are omitted.
This paper also discusses the potential for using a low-cost land Mobile Mapping System (MMS) to implement realistic image-based 3D mapping, and evaluates the positioning accuracy that a measurable realistic image-based (MRI) system can produce. The major contribution here is the implementation of measurable images on 3D maps to obtain various measurements from real scenes.

  6. Changes of Earthquake Vulnerability of Marunouchi and Ginza Area in Tokyo and Urban Recovery Digital Archives on Google Earth

    NASA Astrophysics Data System (ADS)

    Igarashi, Masayasu; Murao, Osamu

    In this paper, the authors develop a multiple regression model which estimates urban earthquake vulnerability (building collapse risk and conflagration risk) for different eras, and clarify the historical changes of urban risk in Marunouchi and Ginza Districts in Tokyo, Japan using old maps and contemporary geographic information data. Also, we compare the change of urban vulnerability of the districts with the significant historical events in Tokyo. Finally, the results are loaded onto Google Earth with timescale extension to consider the possibility of urban recovery digital archives in the era of the recent geoinformatic technologies.

  7. Linkage map of the honey bee, Apis mellifera, based on RAPD markers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, G.J.; Page, R.E. Jr.

A linkage map was constructed for the honey bee based on the segregation of 365 random amplified polymorphic DNA (RAPD) markers in haploid male progeny of a single female bee. The X locus for sex determination and genes for black body color and malate dehydrogenase were mapped to separate linkage groups. RAPD markers were very efficient for mapping, with an average of about 2.8 loci mapped for each 10-nucleotide primer that was used in polymerase chain reactions. The mean interval size between markers on the map was 9.1 cM. The map covered 3110 cM of linked markers on 26 linkage groups. We estimate the total genome size to be approximately 3450 cM. The size of the map indicated a very high recombination rate for the honey bee. The relationship of physical to genetic distance was estimated at 52 kb/cM, suggesting that map-based cloning of genes will be feasible for this species. 71 refs., 6 figs., 1 tab.
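The 52 kb/cM figure is consistent with dividing a physical genome size estimate by the total genetic map length. A quick check, assuming a physical genome size of roughly 178 Mb (an assumption for illustration; the abstract does not state the physical size used):

```python
# Assumed physical genome size (~178 Mb) and the map length from the study
genome_size_kb = 178_000
map_length_cM = 3450

# Physical-to-genetic distance ratio, as quoted in the abstract (kb per cM)
kb_per_cM = genome_size_kb / map_length_cM

# The equivalent recombination rate, expressed as cM per Mb
cM_per_Mb = 1000.0 / kb_per_cM
```

Under this assumption the ratio comes out to about 52 kb/cM, i.e. roughly 19 cM/Mb, an exceptionally high recombination rate compared with most animal genomes.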

  8. Genomic correlates of recombination rate and its variability across eight recombination maps in the western honey bee (Apis mellifera L.).

    PubMed

    Ross, Caitlin R; DeFelice, Dominick S; Hunt, Greg J; Ihle, Kate E; Amdam, Gro V; Rueppell, Olav

    2015-02-21

Meiotic recombination has traditionally been explained based on the structural requirement to stabilize homologous chromosome pairs to ensure their proper meiotic segregation. Competing hypotheses seek to explain the emerging findings of significant heterogeneity in recombination rates within and between genomes, but intraspecific comparisons of genome-wide recombination patterns are rare. The honey bee (Apis mellifera) exhibits the highest rate of genomic recombination among multicellular animals, with about five cross-over events per chromatid. Here, we present a comparative analysis of recombination rates across eight genetic linkage maps of the honey bee genome to investigate which genomic sequence features are correlated with recombination rate and with its variation across the eight data sets, whose average marker spacing ranges from 1 Mbp to 120 kbp. Overall, we found that GC content best explained the variation in local recombination rate along chromosomes at the analyzed 100 kbp scale. In contrast, variation among the different maps was correlated with the abundance of microsatellites and several specific tri- and tetra-nucleotides. The combined evidence from eight medium-scale recombination maps of the honey bee genome suggests that recombination rate variation in this highly recombining genome might be due to the DNA configuration rather than distinct sequence motifs. However, more fine-scale analyses are needed. The empirical basis of eight differing genetic maps allowed for robust conclusions about the correlates of the local recombination rates and enabled the study of the relation between DNA features and variability in local recombination rates, which is particularly relevant in the honey bee genome with its exceptionally high recombination rate.

  9. Immunochromatographic diagnostic test analysis using Google Glass.

    PubMed

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2014-03-25

    We demonstrate a Google Glass-based rapid diagnostic test (RDT) reader platform capable of qualitative and quantitative measurements of various lateral flow immunochromatographic assays and similar biomedical diagnostics tests. Using a custom-written Glass application and without any external hardware attachments, one or more RDTs labeled with Quick Response (QR) code identifiers are simultaneously imaged using the built-in camera of the Google Glass that is based on a hands-free and voice-controlled interface and digitally transmitted to a server for digital processing. The acquired JPEG images are automatically processed to locate all the RDTs and, for each RDT, to produce a quantitative diagnostic result, which is returned to the Google Glass (i.e., the user) and also stored on a central server along with the RDT image, QR code, and other related information (e.g., demographic data). The same server also provides a dynamic spatiotemporal map and real-time statistics for uploaded RDT results accessible through Internet browsers. We tested this Google Glass-based diagnostic platform using qualitative (i.e., yes/no) human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) tests. For the quantitative RDTs, we measured activated tests at various concentrations ranging from 0 to 200 ng/mL for free and total PSA. This wearable RDT reader platform running on Google Glass combines a hands-free sensing and image capture interface with powerful servers running our custom image processing codes, and it can be quite useful for real-time spatiotemporal tracking of various diseases and personal medical conditions, providing a valuable tool for epidemiology and mobile health.

  10. Immunochromatographic Diagnostic Test Analysis Using Google Glass

    PubMed Central

    2014-01-01

    We demonstrate a Google Glass-based rapid diagnostic test (RDT) reader platform capable of qualitative and quantitative measurements of various lateral flow immunochromatographic assays and similar biomedical diagnostics tests. Using a custom-written Glass application and without any external hardware attachments, one or more RDTs labeled with Quick Response (QR) code identifiers are simultaneously imaged using the built-in camera of the Google Glass that is based on a hands-free and voice-controlled interface and digitally transmitted to a server for digital processing. The acquired JPEG images are automatically processed to locate all the RDTs and, for each RDT, to produce a quantitative diagnostic result, which is returned to the Google Glass (i.e., the user) and also stored on a central server along with the RDT image, QR code, and other related information (e.g., demographic data). The same server also provides a dynamic spatiotemporal map and real-time statistics for uploaded RDT results accessible through Internet browsers. We tested this Google Glass-based diagnostic platform using qualitative (i.e., yes/no) human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) tests. For the quantitative RDTs, we measured activated tests at various concentrations ranging from 0 to 200 ng/mL for free and total PSA. This wearable RDT reader platform running on Google Glass combines a hands-free sensing and image capture interface with powerful servers running our custom image processing codes, and it can be quite useful for real-time spatiotemporal tracking of various diseases and personal medical conditions, providing a valuable tool for epidemiology and mobile health. PMID:24571349

  11. JournalMap: Geo-semantic searching for relevant knowledge

    USDA-ARS?s Scientific Manuscript database

    Ecologists struggling to understand rapidly changing environments and evolving ecosystem threats need quick access to relevant research and documentation of natural systems. The advent of semantic and aggregation searching (e.g., Google Scholar, Web of Science) has made it easier to find useful lite...

  12. Correlation of sensitizing capacity and T-cell recognition within the Bet v 1 family

    PubMed Central

    Kitzmüller, Claudia; Zulehner, Nora; Roulias, Anargyros; Briza, Peter; Ferreira, Fatima; Faé, Ingrid; Fischer, Gottfried F.; Bohle, Barbara

    2015-01-01

Background Bet v 1 is the main sensitizing allergen in birch pollen. Like many other major allergens, it contains an immunodominant T cell–activating region (Bet v 1(142-156)). Api g 1, the Bet v 1 homolog in celery, lacks the ability to sensitize and is devoid of major T-cell epitopes. Objective We analyzed the T-cell epitopes of Mal d 1, the nonsensitizing Bet v 1 homolog in apple, and assessed possible differences in uptake and antigen processing of Bet v 1, Api g 1, and Mal d 1. Methods For epitope mapping, Mal d 1–specific T-cell lines were stimulated with overlapping synthetic 12-mer peptides. The surface binding, internalization, and intracellular degradation of Bet v 1, Api g 1, and Mal d 1 by antigen-presenting cells were compared by using flow cytometry. All proteins were digested with endolysosomal extracts, and the resulting peptides were identified by means of mass spectrometry. The binding of Bet v 1(142-156) and the homologous region in Mal d 1 by HLA class II molecules was analyzed in silico. Results Like Api g 1, Mal d 1 lacked dominant T-cell epitopes. The degree of surface binding and the kinetics of uptake and endolysosomal degradation of Bet v 1, Api g 1, and Mal d 1 were comparable. Endolysosomal degradation of Bet v 1 and Mal d 1 resulted in very similar fragments. The Bet v 1(142-156) and Mal d 1(141-155) regions showed no striking difference in their binding affinities to the most frequent HLA-DR alleles. Conclusion The sensitizing activity of different Bet v 1 homologs correlates with the presence of immunodominant T-cell epitopes. However, the presence of Bet v 1(142-156) is not conferred by differential antigen processing. PMID:25670010

  13. Galaxy Portal: interacting with the galaxy platform through mobile devices.

    PubMed

    Børnich, Claus; Grytten, Ivar; Hovig, Eivind; Paulsen, Jonas; Čech, Martin; Sandve, Geir Kjetil

    2016-06-01

We present the Galaxy Portal app, an open source interface to the Galaxy system for smart phones and tablets. The Galaxy Portal provides convenient and efficient monitoring of job completion, as well as opportunities for inspection of results and execution history. In addition to being useful to the Galaxy community, we believe that the app also exemplifies a useful way of exploiting mobile interfaces for research/high-performance computing resources in general. The source is freely available under a GPL license on GitHub, along with user documentation, pre-compiled binaries, and instructions for several platforms: https://github.com/Tarostar/QMLGalaxyPortal. It is available for iOS version 7 (and newer) through the Apple App Store, and for Android version 4.1 (API 16) or newer through Google Play. Contact: geirksa@ifi.uio.no. © The Author 2016. Published by Oxford University Press.

  14. Evaluation of an electrocardiogram on QR code.

    PubMed

    Nakayama, Masaharu; Shimokawa, Hiroaki

    2013-01-01

An electrocardiogram (ECG) is an indispensable tool for diagnosing cardiac diseases such as ischemic heart disease, myocarditis, arrhythmia, and cardiomyopathy. Since ECG patterns vary depending on patient status, the ECG is also used to monitor patients during treatment, and comparison with previous ECGs is important for accurate diagnosis. However, such comparison requires a connection to the ECG data server in a hospital, and data connectivity among hospitals is limited. To improve the portability and availability of ECG data regardless of server connection, we here introduce the conversion of ECG data into 2D barcodes as text data, and the decoding of the QR code to draw the ECG with the Google Chart API. Fourteen cardiologists and six general physicians evaluated the system using an iPhone and an iPad. Overall, they were satisfied with the system's usability and with the accuracy of the decoded ECG compared to the original. This new coding system may be useful for utilizing ECG data irrespective of server connections.
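The core idea of serializing waveform data as text compact enough for a QR code can be sketched as follows. This is a hypothetical packing scheme for illustration, not the authors' actual format: signed 16-bit samples are packed into bytes and Base64-encoded, and the inverse function recovers them:

```python
import base64
import struct

def ecg_to_text(samples_uv):
    """Pack signed 16-bit ECG samples (microvolts) into a Base64 text payload
    suitable for embedding in a QR code."""
    raw = struct.pack("<%dh" % len(samples_uv), *samples_uv)
    return base64.b64encode(raw).decode("ascii")

def text_to_ecg(payload):
    """Recover the sample sequence from the Base64 payload."""
    raw = base64.b64decode(payload)
    return list(struct.unpack("<%dh" % (len(raw) // 2), raw))

payload = ecg_to_text([0, 120, -45, 800])
restored = text_to_ecg(payload)
```

Each sample costs two bytes before encoding, so a few seconds of a single lead at a modest sampling rate fits within a high-capacity QR code; the decoded sequence can then be rendered as a chart, as the paper does with the Google Chart API.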

  15. Scientific Data Storage for Cloud Computing

    NASA Astrophysics Data System (ADS)

    Readey, J.

    2014-12-01

Traditionally, data storage for geophysical software systems has centered on file-based systems and libraries such as NetCDF and HDF5. In contrast, cloud infrastructure providers such as Amazon AWS, Microsoft Azure, and the Google Cloud Platform generally provide storage technologies based on an object storage service (for large binary objects) complemented by a database service (for small objects that can be represented as key-value pairs). These systems have been shown to be highly scalable, reliable, and cost effective. We will discuss a proposed system that leverages these cloud-based storage technologies to provide an API-compatible library for traditional NetCDF and HDF5 applications. This system will enable cloud storage suitable for geophysical applications that can scale to petabytes of data and thousands of users. We will also cover other advantages of this system, such as enhanced metadata search.
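The storage split described above, an object service for large binary chunks plus a database service for small key-value metadata, can be sketched with a plain dict standing in for both services. The function names and the chunking scheme are illustrative only, not the proposed library's API:

```python
import json

CHUNK = 4  # elements per object; real systems use megabyte-scale chunks

def put_dataset(store, name, values):
    """Split a 1-D dataset into fixed-size chunks stored as separate objects,
    plus a small JSON metadata record (the 'database service' side)."""
    nchunks = 0
    for i in range(0, len(values), CHUNK):
        store["%s/chunk/%d" % (name, nchunks)] = json.dumps(values[i:i + CHUNK])
        nchunks += 1
    store["%s/meta" % name] = json.dumps({"length": len(values), "chunks": nchunks})

def get_dataset(store, name):
    """Reassemble the dataset by reading its metadata record and chunks."""
    meta = json.loads(store["%s/meta" % name])
    out = []
    for c in range(meta["chunks"]):
        out.extend(json.loads(store["%s/chunk/%d" % (name, c)]))
    return out[:meta["length"]]

store = {}  # stand-in for an object store such as S3 or Azure Blob Storage
put_dataset(store, "temperature", [21.5, 22.0, 22.4, 23.1, 23.0])
roundtrip = get_dataset(store, "temperature")
```

Because each chunk is an independent object, reads and writes parallelize across clients, which is the property that lets such a design scale to petabytes while remaining compatible, behind an API shim, with NetCDF/HDF5-style access patterns.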

  16. New Delhi Metallo-beta-lactamase around the world: an eReview using Google Maps.

    PubMed

    Berrazeg, M; Diene, Sm; Medjahed, L; Parola, P; Drissi, M; Raoult, D; Rolain, Jm

    2014-05-22

Gram-negative carbapenem-resistant bacteria, in particular those producing New Delhi Metallo-beta-lactamase-1 (NDM-1), are a major global health problem. To inform the scientific and medical community in real time about the worldwide dissemination of isolates of NDM-1-producing bacteria, we used the PubMed database to review all available publications from the first description in 2009 up to 31 December 2012, and created a regularly updated worldwide dissemination map using a web-based mapping application. We retrieved 33 reviews and 136 case reports describing 950 isolates of NDM-1-producing bacteria. Klebsiella pneumoniae (n=359) and Escherichia coli (n=268) were the most commonly reported bacteria producing the NDM-1 enzyme. Several case reports of infections due to imported NDM-1-producing bacteria have been reported in a number of countries, including the United Kingdom, Italy, and Oman. In most cases (132/153, 86.3%), patients had connections with the Indian subcontinent or Balkan countries. Those infected were originally from these areas, had either spent time and/or been hospitalised there, or were potentially linked to other patients who had been hospitalised in these regions. By using Google Maps, we were able to trace the spread of NDM-1-producing bacteria. We strongly encourage epidemiologists to use these types of interactive tools for surveillance purposes and to use the information to prevent the spread and outbreaks of such bacteria.

  17. The neXtProt peptide uniqueness checker: a tool for the proteomics community.

    PubMed

    Schaeffer, Mathieu; Gateau, Alain; Teixeira, Daniel; Michel, Pierre-André; Zahn-Zabal, Monique; Lane, Lydie

    2017-11-01

    The neXtProt peptide uniqueness checker allows scientists to determine which peptides can be used to validate the existence of human proteins, i.e. those that map uniquely, rather than multiply, to human protein sequences, taking into account isobaric substitutions, alternative splicing, and single amino acid variants. The pepx program is available at https://github.com/calipho-sib/pepx and can be launched from the command line or through a CGI web interface. Indexing requires a sequence file in FASTA format. The peptide uniqueness checker tool is freely available on the web at https://www.nextprot.org/tools/peptide-uniqueness-checker and from the neXtProt API at https://api.nextprot.org/. Contact: lydie.lane@sib.swiss. © The Author(s) 2017. Published by Oxford University Press.
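    The core uniqueness question can be illustrated with a toy matcher. This sketch collapses the isobaric residues Ile and Leu before matching, one of the substitutions the tool accounts for; the sequences and function names are illustrative, not pepx's implementation.

```python
def isobaric(seq):
    # Collapse I and L, which have identical masses and are
    # indistinguishable by standard mass spectrometry
    return seq.replace("I", "J").replace("L", "J")

def matching_proteins(peptide, proteome):
    # Return every protein whose (isobarically collapsed) sequence
    # contains the collapsed peptide
    pep = isobaric(peptide)
    return [name for name, seq in proteome.items() if pep in isobaric(seq)]

# Toy proteome; the real tool indexes all human isoforms and variants
proteome = {
    "PROT_A": "MKTAYIAKQR",
    "PROT_B": "MKTAYLAKQR",   # differs from PROT_A only by I -> L
    "PROT_C": "GGWPSAQK",
}

hits = matching_proteins("TAYIAK", proteome)
print(hits)            # maps to both A and B once I/L are collapsed
print(len(hits) == 1)  # False: this peptide is not protein-specific
```

    A peptide that looks unique at the plain-string level can thus still be ambiguous in practice, which is exactly why the checker folds in isobaric substitutions and variants before declaring a peptide proteotypic.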

  18. Researchermap: a tool for visualizing author locations using Google maps.

    PubMed

    Rastegar-Mojarad, Majid; Bales, Michael E; Yu, Hong

    2013-01-01

    We hereby present ResearcherMap, a tool to visualize the locations of authors of scholarly papers. In response to a query, the system returns a map of author locations. To develop the system, we first populated a database of author locations, geocoding institution locations for all available institutional affiliation data in our database. The database includes all authors of Medline papers from 1990 to 2012. We conducted a formative heuristic usability evaluation of the system and measured the system's accuracy and performance. The system identifies the correct address with 97.5% accuracy.

  19. A browser-based event display for the CMS Experiment at the LHC using WebGL

    NASA Astrophysics Data System (ADS)

    McCauley, T.

    2017-10-01

    Modern web browsers are powerful and sophisticated applications that support an ever-wider range of uses. One such use is rendering high-quality, GPU-accelerated, interactive 2D and 3D graphics in an HTML canvas. This can be done via WebGL, a JavaScript API based on OpenGL ES. Applications delivered via the browser have several distinct benefits for the developer and user. For example, they can be implemented using well-known and well-developed technologies, while distribution and use via a browser allows for rapid prototyping and deployment and ease of installation. In addition, delivery of applications via the browser allows for easy use on mobile, touch-enabled devices such as phones and tablets. iSpy WebGL is an application for visualization of events detected and reconstructed by the CMS Experiment at the Large Hadron Collider at CERN. The first event display developed for an LHC experiment to use WebGL, iSpy WebGL is a client-side application written in JavaScript, HTML, and CSS that uses three.js, a JavaScript library built on the WebGL API. iSpy WebGL is used for monitoring of CMS detector performance, for production of images and animations of CMS collision events for the public, as a virtual reality application using Google Cardboard, and as a tool for public education and outreach, such as in the CERN Open Data Portal and the CMS masterclasses. We describe here its design, development, and usage as well as future plans.

  20. Open Availability of Patient Medical Photographs in Google Images Search Results: Cross-Sectional Study of Transgender Research

    PubMed Central

    Brunger, Fern; Welch, Vivian; Asghari, Shabnam; Kaposy, Chris

    2018-01-01

    Background This paper focuses on the collision of three factors: a growing emphasis on sharing research through open access publication, an increasing awareness of big data and its potential uses, and an engaged public interested in the privacy and confidentiality of their personal health information. One conceptual space where this collision is brought into sharp relief is with the open availability of patient medical photographs from peer-reviewed journal articles in the search results of online image databases such as Google Images. Objective The aim of this study was to assess the availability of patient medical photographs from published journal articles in Google Images search results and the factors impacting this availability. Methods We conducted a cross-sectional study using data from an evidence map of research with transgender, gender non-binary, and other gender diverse (trans) participants. For the original evidence map, a comprehensive search of 15 academic databases was developed in collaboration with a health sciences librarian. Initial search results produced 25,230 references after duplicates were removed. Eligibility criteria were established to include empirical research of any design that included trans participants or their personal information and that was published in English in peer-reviewed journals. We identified all articles published between 2008 and 2015 with medical photographs of trans participants. For each reference, images were individually numbered in order to track the total number of medical photographs. We used odds ratios (OR) to assess the association between availability of the clinical photograph on Google Images and the following factors: whether the article was openly available online (open access, Researchgate.net, or Academia.edu), whether the article included genital images, if the photographs were published in color, and whether the photographs were located on the journal article landing page. 
Results We identified 94 articles with medical photographs of trans participants, including a total of 605 photographs. Of the 94 publications, 35 (37%) included at least one medical photograph that was found on Google Images. Articles that could be located freely online had higher odds of having at least one image available on Google Images (OR 2.99, 95% CI 1.20-7.45). Conclusions This is the first study to document the existence of medical photographs from peer-reviewed journals appearing in Google Images search results. Almost all of the images we searched for included sensitive photographs of patient genitals, chests, or breasts. Given that it is unlikely that patients consented to sharing their personal health information in these ways, this constitutes a risk to patient privacy. Based on the impact of current practices, revisions to informed consent policies and guidelines are required. PMID:29483069
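    The odds ratio reported above can be reproduced mechanically from any 2x2 table. A worked sketch with illustrative counts (the paper's underlying table is not given here), using the standard Wald confidence interval:

```python
import math

# Illustrative 2x2 table, NOT the paper's actual counts:
# rows = article freely available online (yes/no),
# columns = image found on Google Images (yes/no).
a, b = 28, 22   # freely available: image found / not found
c, d = 7, 37    # not freely available: image found / not found

or_ = (a * d) / (b * c)                     # odds ratio = (a/b) / (c/d)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)       # standard error of log(OR)
lo = math.exp(math.log(or_) - 1.96 * se)    # 95% CI, Wald method
hi = math.exp(math.log(or_) + 1.96 * se)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

    An OR above 1 with a confidence interval excluding 1 is what lets the authors conclude that free online availability is associated with image exposure.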

  1. A Web Portal-Based Time-Aware KML Animation Tool for Exploring Spatiotemporal Dynamics of Hydrological Events

    NASA Astrophysics Data System (ADS)

    Bao, X.; Cai, X.; Liu, Y.

    2009-12-01

    Understanding the spatiotemporal dynamics of hydrological events such as storms and droughts is highly valuable for decision making on disaster mitigation and recovery. Virtual Globe-based technologies such as Google Earth and the Open Geospatial Consortium KML standard show great promise for collaborative exploration of such events using visual analytical approaches. However, two barriers currently limit wider adoption of these approaches. First, there is no easy way to use open-source tools to convert legacy or existing data formats such as shapefiles, GeoTIFF, or web services-based data sources to KML, or to produce time-aware KML files. Second, an integrated web portal-based time-aware animation tool is not currently available; users can share their files in a portal but have no means to visually explore them without leaving the portal environment they are familiar with. We developed a web portal-based time-aware KML animation tool for viewing extreme hydrologic events. The tool is based on the Google Earth JavaScript API and the Java Portlet 2.0 standard (JSR-286), and it is currently deployable in Liferay, one of the most popular open-source portal frameworks. We have also developed an open-source toolkit, kml-soc-ncsa (http://code.google.com/p/kml-soc-ncsa/), to facilitate the conversion of multiple formats into KML and the creation of time-aware KML files. We illustrate the tool with example cases in which drought and storm events can be explored in both time and space in this web-based KML animation portlet. The tool provides an easy-to-use, browser-based portal environment in which multiple users can collaboratively share and explore their time-aware KML files, improving understanding of the spatiotemporal dynamics of hydrological events.
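    What makes a KML file "time-aware" is the TimeSpan element attached to each feature, which drives Google Earth's time slider. A minimal sketch of generating such a file, with illustrative dates and coordinates rather than output of kml-soc-ncsa:

```python
# Build a small time-aware KML document by hand. Each placemark carries a
# TimeSpan so a Google Earth-style viewer can animate it over time.
# Names, dates, and coordinates are illustrative.
def timespan_placemark(name, lon, lat, begin, end):
    # KML coordinates are ordered longitude,latitude,altitude
    return (
        "<Placemark>"
        f"<name>{name}</name>"
        f"<TimeSpan><begin>{begin}</begin><end>{end}</end></TimeSpan>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark>"
    )

# Three daily "frames" of a storm track moving east
frames = [timespan_placemark(f"storm-{i}", -91.0 + 0.5 * i, 38.0,
                             f"2008-06-0{i + 1}", f"2008-06-0{i + 2}")
          for i in range(3)]

kml = ('<?xml version="1.0" encoding="UTF-8"?>'
       '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
       + "".join(frames) + "</Document></kml>")
print(kml)
```

    Converting a legacy format to time-aware KML is then mostly a matter of extracting a timestamp per record and emitting one such placemark for each.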

  2. Transport Statistics - Transport - UNECE

    Science.gov Websites

    Traffic Census 2015 data are available. Two new datasets have been added to the transport statistics database: bus and coach statistics.

  3. An Automated Approach to Extracting River Bank Locations from Aerial Imagery Using Image Texture

    DTIC Science & Technology

    2013-01-01

    Atchafalaya River, LA. Map data: Google, United States Department of Agriculture Farm Service Agency, Europa Technologies. ...traverse morphologically smooth landscapes including rivers in sand or ice. Within these limitations, we hold that this technique represents a valuable

  4. Children Creating Multimodal Stories about a Familiar Environment

    ERIC Educational Resources Information Center

    Kervin, Lisa; Mantei, Jessica

    2017-01-01

    Storytelling is a practice that enables children to apply their literacy skills. This article shares a collaborative literacy strategy devised to enable children to create multimodal stories about their familiar school environment. The strategy uses resources, including the children's own drawings, images from Google Maps, and the Puppet Pals…

  5. SemantGeo: Powering Ecological and Environment Data Discovery and Search with Standards-Based Geospatial Reasoning

    NASA Astrophysics Data System (ADS)

    Seyed, P.; Ashby, B.; Khan, I.; Patton, E. W.; McGuinness, D. L.

    2013-12-01

    Recent efforts to create and leverage standards for geospatial data specification and inference include the GeoSPARQL standard, geospatial OWL ontologies (e.g., GAZ, Geonames), and RDF triple stores that support GeoSPARQL (e.g., AllegroGraph, Parliament) using RDF instance data for geospatial features of interest. However, there remains a gap in how best to fuse software engineering best practices and GeoSPARQL within semantic web applications to enable flexible search driven by geospatial reasoning. In this abstract we introduce the SemantGeo module for the SemantEco framework, which helps fill this gap by enabling scientists to find data using geospatial semantics and reasoning. SemantGeo provides multiple types of geospatial reasoning for SemantEco modules. The server-side implementation uses the Parliament SPARQL endpoint accessed via a Tomcat servlet. SemantGeo uses the Google Maps API for user-specified polygon construction and JsTree for providing containment and categorical hierarchies for search. SemantGeo uses GeoSPARQL for spatial reasoning, alone and in concert with RDFS/OWL reasoning capabilities, to determine, e.g., which geofeatures are within, partially overlap with, or lie within a certain distance of a given polygon. We also leverage qualitative relationships defined by the Gazetteer ontology that are composites of spatial relationships as well as administrative designations or geophysical phenomena. We provide multiple mechanisms for exploring data, such as polygon (map-based) and named-feature (hierarchy-based) selection, that enable flexible search constraints using boolean combinations of selections. JsTree-based hierarchical search facets present named features and include a 'part of' hierarchy (e.g., measurement-site-01, Lake George, Adirondack Region, NY State) and type hierarchies (e.g., nodes in the hierarchy for WaterBody, Park, MeasurementSite), depending on the 'axis of choice' option selected.
Using GeoSPARQL and the aforementioned ontology, these hierarchies are constrained based on polygon selection, and the polygons of the contained features are visually rendered to assist exploration. Once measurement sites are plotted from an initial search, subsequent searches using JsTree selections can extend the previous one based on nearby waterbodies in some semantic relationship of interest. For example, 'tributary of' captures water bodies that flow into the current one, and extending the original search to include tributaries of the observed water body is useful to environmental scientists for isolating the source of characteristic levels, including pollutants. Ultimately, any SemantEco module can build on SemantGeo's underlying APIs, which are already leveraged both in a deployment of SemantEco that combines EPA and USGS water quality data and in one customized for searching data available from the Darrin Fresh Water Institute. Future work will address generating RDF geometry data from shapefiles, aligning RDF data sources to better leverage qualitative and spatial relationships, and validating newly generated RDF data against the GeoSPARQL standard.
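    The polygon-constrained search described above rests on GeoSPARQL filter functions such as geof:sfWithin. A sketch of such a query, built here as a Python string; the polygon and the property names on the feature are illustrative, not SemantEco's actual schema:

```python
# Assemble a GeoSPARQL query that finds features whose WKT geometry lies
# within a user-drawn polygon. The polygon and graph pattern are illustrative.
polygon_wkt = ("POLYGON((-73.8 43.5, -73.4 43.5, "
               "-73.4 43.8, -73.8 43.8, -73.8 43.5))")

query = f"""
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>

SELECT ?feature WHERE {{
  ?feature geo:hasGeometry ?g .
  ?g geo:asWKT ?wkt .
  FILTER(geof:sfWithin(?wkt,
         "{polygon_wkt}"^^geo:wktLiteral))
}}
"""
print(query)
```

    A GeoSPARQL-capable store such as Parliament evaluates the sfWithin filter spatially, so the same endpoint can serve both ordinary RDFS/OWL queries and map-driven ones.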

  6. A modern Python interface for the Generic Mapping Tools

    NASA Astrophysics Data System (ADS)

    Uieda, L.; Wessel, P.

    2017-12-01

    Figures generated by The Generic Mapping Tools (GMT) are present in countless publications across the Earth sciences. The command-line interface of GMT lends the tool its flexibility but also creates a barrier to entry for beginners. Meanwhile, adoption of the Python programming language has grown across the scientific community. This growth is largely due to the simplicity and low barrier to entry of the language and its ecosystem of tools. Thus, it is not surprising that there have been at least three attempts to create Python interfaces for GMT: gmtpy (github.com/emolch/gmtpy), pygmt (github.com/ian-r-rose/pygmt), and PyGMT (github.com/glimmer-cism/PyGMT). None of these projects is currently active and, with the exception of pygmt, they do not use the GMT Application Programming Interface (API) introduced in GMT 5. The two main Python libraries for plotting data on maps are the matplotlib Basemap toolkit (matplotlib.org/basemap) and Cartopy (scitools.org.uk/cartopy), both of which rely on matplotlib (matplotlib.org) as the backend for generating figures. Basemap is known to have limitations and is being discontinued. Cartopy is an improvement over Basemap but is still bound by the speed and memory constraints of matplotlib. We present a new Python interface for GMT (GMT/Python) that makes use of the GMT API and of new features being developed for the upcoming GMT 6 release. The GMT/Python library is designed according to the norms and styles of the Python community. The library integrates with the scientific Python ecosystem by using the "virtual files" of the GMT API to implement input and output of Python data types (numpy "ndarray" for tabular data and xarray "Dataset" for grids). Other features include an object-oriented interface for creating figures, the ability to display figures in the Jupyter notebook, and descriptive aliases for GMT arguments (e.g., "region" instead of "R" and "projection" instead of "J").
GMT/Python can also serve as a backend for developing new high-level interfaces, which can help make GMT more accessible to beginners and more intuitive for Python users. GMT/Python is an open-source project hosted on Github (github.com/GenericMappingTools/gmt-python) and is in early stages of development. A first release will accompany the release of GMT 6, which is expected for early 2018.
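    The alias feature mentioned above amounts to a translation table from readable keyword arguments to GMT's single-letter flags. A minimal sketch of the idea, assuming an illustrative alias table and command builder, not the library's actual code:

```python
# Map descriptive Python keyword arguments onto GMT's terse command-line
# flags. The table and builder below are illustrative only.
ALIASES = {"region": "R", "projection": "J", "frame": "B"}

def build_args(module, **kwargs):
    # Translate each keyword into "-<flag><value>" form
    args = [module]
    for name, value in kwargs.items():
        flag = ALIASES.get(name, name)   # fall back to the raw flag name
        args.append(f"-{flag}{value}")
    return " ".join(args)

cmd = build_args("pscoast", region="-90/-70/0/20", projection="M15c")
print(cmd)   # pscoast -R-90/-70/0/20 -JM15c
```

    The payoff is readability: a call written with "region" and "projection" documents itself, while the library still emits exactly the arguments GMT expects.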

  7. Interactive web-based portals to improve patient navigation and connect patients with primary care and specialty services in underserved communities.

    PubMed

    Highfield, Linda; Ottenweller, Cecelia; Pfanz, Andre; Hanks, Jeanne

    2014-01-01

    This article presents a case study in the redesign, development, and implementation of a web-based healthcare clinic search tool for virtual patient navigation in underserved populations in Texas. It describes the workflow, assessment of system requirements, and design and implementation of two online portals: Project Safety Net and the Breast Health Portal. The primary focus of the study was to demonstrate the use of health information technology for the purpose of bridging the gap between underserved populations and access to healthcare. A combination of interviews and focus groups was used to guide the development process. Interviewees were asked a series of questions about usage, usability, and desired features of the new system. The redeveloped system offers a multitier architecture consisting of data, business, and presentation layers. The technologies used in the new portals include Microsoft .NET Framework 3.5, Microsoft SQL Server 2008, Google Maps JavaScript API v3, jQuery, Telerik RadControls (ASP.NET AJAX), and HTML. The redesigned portals have 548 registered clinics, and they have averaged 355 visits per month since their launch in late 2011, with the average user visiting five pages per visit. Usage has remained relatively constant over time, with an average of 142 new users (40 percent) each month. This study demonstrates the successful application of health information technology to improve access to healthcare and the successful adoption of the technology by targeted end users. The portals described in this study could be replicated by health information specialists in other areas of the United States to address disparities in healthcare access.

  9. Comparison of spray drying, electroblowing and electrospinning for preparation of Eudragit E and itraconazole solid dispersions.

    PubMed

    Sóti, Péter Lajos; Bocz, Katalin; Pataki, Hajnalka; Eke, Zsuzsanna; Farkas, Attila; Verreck, Geert; Kiss, Éva; Fekete, Pál; Vigh, Tamás; Wagner, István; Nagy, Zsombor K; Marosi, György

    2015-10-15

    Three solvent-based methods: spray drying (SD), electrospinning (ES), and air-assisted electrospinning (electroblowing, EB) were used to prepare solid dispersions of itraconazole and Eudragit E. Samples with the same API/polymer ratios were prepared in order to make the three technologies comparable. The structure and morphology of the solid dispersions were identified by scanning electron microscopy and solid-phase analytical methods such as X-ray powder diffraction (XRPD), differential scanning calorimetry (DSC), and Raman chemical mapping. Moreover, the residual organic solvents of the solid products were determined by static headspace-gas chromatography/mass spectroscopy measurements, and the wettability of the samples was characterized by contact angle measurement. The pharmaceutical performance of the three dispersion types, evaluated by dissolution tests, proved to be very similar. According to XRPD and DSC analyses made after production, all the solid dispersions were free of any API crystal clusters, but about 10 wt% drug crystallinity was observed after three months of storage in the SD samples, in contrast to the samples produced by ES and EB, in which the polymer matrix preserved the API in an amorphous state. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Using open-source programs to create a web-based portal for hydrologic information

    NASA Astrophysics Data System (ADS)

    Kim, H.

    2013-12-01

    Some hydrologic data sets, such as basin climatology, precipitation, and terrestrial water storage, are not easily obtainable and distributable due to their size and complexity. We present a Hydrologic Information Portal (HIP) that has been implemented at the University of California Center for Hydrologic Modeling (UCCHM) and organized around the large river basins of North America. The portal can be accessed through a modern web browser, enabling easy access to and visualization of such hydrologic data sets. The HIP's main features include data visualization tools that let users search, retrieve, analyze, integrate, organize, and map data within large river basins. Recent information technologies such as Google Maps, Tornado (a Python asynchronous web server), NumPy/SciPy (scientific libraries for Python), and d3.js (a visualization library for JavaScript) were incorporated into the HIP to ease navigation of large data sets. With such open-source libraries, the HIP gives public users a way to combine and explore various data sets by generating multiple chart types (line, bar, pie, scatter plot) directly from the Google Maps viewport. Every rendered object on the viewport, such as a basin shape, is clickable, and this is the first step in accessing visualizations of the data sets.

  11. Atlas of Cancer Signalling Network: a systems biology resource for integrative analysis of cancer data with Google Maps

    PubMed Central

    Kuperstein, I; Bonnet, E; Nguyen, H-A; Cohen, D; Viara, E; Grieco, L; Fourquet, S; Calzone, L; Russo, C; Kondratova, M; Dutreix, M; Barillot, E; Zinovyev, A

    2015-01-01

    Carcinogenesis is driven by mutations leading to aberrant functioning of a complex network of molecular interactions and simultaneously affecting multiple cellular functions. Therefore, the successful application of bioinformatics and systems biology methods for analysis of high-throughput data in cancer research heavily depends on the availability of global and detailed reconstructions of signalling networks amenable to computational analysis. We present here the Atlas of Cancer Signalling Network (ACSN), an interactive and comprehensive map of molecular mechanisms implicated in cancer. The resource includes tools for map navigation, visualization, and analysis of molecular data in the context of signalling network maps. Constructing and updating ACSN involves careful manual curation of the molecular biology literature and participation of experts in the corresponding fields. The cancer-oriented content of ACSN is completely original and covers major mechanisms involved in cancer progression, including DNA repair, cell survival, apoptosis, cell cycle, EMT and cell motility. Cell signalling mechanisms are depicted in detail, together creating a seamless 'geographic-like' map of molecular interactions frequently deregulated in cancer. The map is browsable through the NaviCell web interface, which uses the Google Maps engine and the semantic zooming principle. The associated web blog provides a forum for commenting on and curating the ACSN content. ACSN allows users to upload heterogeneous omics data on top of the maps for visualization and functional analyses. We suggest several scenarios for ACSN application in cancer research, particularly for visualizing high-throughput data, from small interfering RNA-based screening results or mutation frequencies to innovative ways of exploring transcriptomes and phosphoproteomes. 
Integration and analysis of these data in the context of ACSN may help interpret their biological significance and formulate mechanistic hypotheses. ACSN may also support patient stratification, prediction of treatment response and resistance to cancer drugs, as well as design of novel treatment strategies. PMID:26192618

  12. Understanding Urban Watersheds through Digital Interactive Maps, San Francisco Bay Area, California

    NASA Astrophysics Data System (ADS)

    Sowers, J. M.; Ticci, M. G.; Mulvey, P.

    2014-12-01

    Dense urbanization has resulted in the "disappearance" of many local creeks in urbanized areas surrounding the San Francisco Bay. Long reaches of creeks now flow in underground pipes. Municipalities and water agencies trying to reduce non-point-source pollution are faced with a public that cannot see, and therefore does not understand, the interconnected nature of the drainage system or its ultimate discharge to the bay. Since 1993, we have collaborated with the Oakland Museum, the San Francisco Estuary Institute, public agencies, and municipalities to create creek and watershed maps that address the need for public understanding of watershed concepts. Fifteen paper maps are now published (www.museumca.org/creeks), which have become a standard reference for educators and anyone working on local creek-related issues. We now present digital interactive creek and watershed maps in Google Earth. Four maps are completed, covering urbanized areas of Santa Clara and Alameda Counties. The maps provide a 3D visualization of the watersheds, with cartography draped over the landscape in transparent colors. Each mapped area includes both Present and Past (circa 1800s) layers, which can be clicked on or off by the user. The Present layers include the modern drainage network, watershed boundaries, and reservoirs. The Past layers include the 1800s-era creek systems, tidal marshes, lagoons, and other habitats. All data are developed in ArcGIS software and converted to Google Earth format. To ensure the maps are interesting and engaging, clickable pop-up icons provide information on places to visit, restoration projects, history, plants, and animals. Maps of Santa Clara Valley are available at http://www.valleywater.org/WOW.aspx. Maps of western Alameda County will soon be available at http://acfloodcontrol.org/. Digital interactive maps provide several advantages over paper maps. 
They are seamless within each map area, and the user can zoom in or out, tilt, and fly over to explore any area of interest. They can be easily customized, for example by adding placemarks or notes. Enrichment information can be added via clickable icons without cluttering the map. Best of all, the maps are fun to use. Digital interactive maps will be another effective tool for enhancing public understanding of urban creeks and watersheds.

  13. Reusable Client-Side JavaScript Modules for Immersive Web-Based Real-Time Collaborative Neuroimage Visualization.

    PubMed

    Bernal-Rusiel, Jorge L; Rannou, Nicolas; Gollub, Randy L; Pieper, Steve; Murphy, Shawn; Robertson, Richard; Grant, Patricia E; Pienaar, Rudolph

    2017-01-01

    In this paper we present a web-based software solution to the problem of implementing real-time collaborative neuroimage visualization. In both clinical and research settings, simple and powerful access to imaging technologies across multiple devices is becoming increasingly useful. Prior technical solutions have used a server-side rendering and push-to-client model wherein only the server has the full image dataset. We propose a rich-client solution in which each client has all the data and uses the Google Drive Realtime API for state synchronization. We have developed a small set of reusable client-side object-oriented JavaScript modules that make use of the XTK toolkit, a popular open-source JavaScript library also developed by our team, for the in-browser rendering and visualization of brain image volumes. Efficient real-time communication among the remote instances is achieved by using just a small JSON object, comprising a representation of the XTK image renderers' state, as the Google Drive Realtime collaborative data model. The developed open-source JavaScript modules have already been instantiated in a web app called MedView, a distributed collaborative neuroimage visualization application that is delivered to users over the web without requiring the installation of any extra software or browser plugin. This responsive application allows multiple physically distant physicians or researchers to cooperate in real time to reach a diagnosis or scientific conclusion. It also serves as a proof of concept for the capabilities of the presented technological solution.
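    The collaborative data model described above is deliberately small: a JSON snapshot of renderer state that each client applies locally. A sketch of the capture/apply round trip, with illustrative field names rather than the actual XTK/MedView schema:

```python
import json

# Capture the renderer state as a small JSON payload; in the real system
# this payload is what the Google Drive Realtime API would relay between
# clients. Field names here are illustrative.
def capture_state(renderer):
    return json.dumps({
        "camera": renderer["camera"],
        "volume_url": renderer["volume_url"],
        "slice_index": renderer["slice_index"],
    })

def apply_state(renderer, payload):
    # Each client applies the shared state to its own local renderer
    renderer.update(json.loads(payload))

local = {"camera": [0, 0, 120], "volume_url": "brain.nii", "slice_index": 42}
remote = {"camera": None, "volume_url": None, "slice_index": None}

apply_state(remote, capture_state(local))
print(remote["slice_index"])   # the remote view now mirrors the local one
```

    Because every client holds the full image data, only this tiny state object crosses the network, which is what makes the rich-client approach responsive.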

  14. Hadoop-BAM: directly manipulating next generation sequencing data in the cloud.

    PubMed

    Niemenmaa, Matti; Kallio, Aleksi; Schumacher, André; Klemelä, Petri; Korpelainen, Eija; Heljanko, Keijo

    2012-03-15

    Hadoop-BAM is a novel library for the scalable manipulation of aligned next-generation sequencing data in the Hadoop distributed computing framework. It acts as an integration layer between analysis applications and BAM files that are processed using Hadoop. Hadoop-BAM solves the issues related to BAM data access by presenting a convenient API for implementing map and reduce functions that can directly operate on BAM records. It builds on top of the Picard SAM JDK, so tools that rely on the Picard API are expected to be easily convertible to support large-scale distributed processing. In this article we demonstrate the use of Hadoop-BAM by building a coverage summarizing tool for the Chipster genome browser. Our results show that Hadoop offers good scalability, and one should avoid moving data in and out of Hadoop between analysis steps.
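    The map and reduce functions such an API exposes can be illustrated with a toy coverage summary, the same task as the Chipster example above. Reads here are plain (start, length) tuples rather than real BAM records, and the in-process pipeline stands in for Hadoop's distributed shuffle:

```python
from collections import Counter

# Toy aligned reads as (start, length); a Hadoop-BAM map function would
# receive real BAM records instead.
reads = [(100, 5), (102, 5), (104, 3)]

def map_read(read):
    # Map step: emit (position, 1) for every base the read covers
    start, length = read
    return [(pos, 1) for pos in range(start, start + length)]

def reduce_counts(pairs):
    # Reduce step: sum the counts per position to get depth of coverage
    coverage = Counter()
    for pos, n in pairs:
        coverage[pos] += n
    return coverage

pairs = [p for read in reads for p in map_read(read)]
coverage = reduce_counts(pairs)
print(coverage[104])   # position covered by all three reads
```

    The appeal of the Hadoop-BAM approach is that the same two functions, written against BAM records, scale out across a cluster without the tool author handling file splitting or data movement.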

  15. The implementation of a modernized Dynamic Digital Map on Gale Crater, Mars

    NASA Astrophysics Data System (ADS)

    McBeck, J.; Condit, C. D.

    2012-12-01

    Currently, geology instructors present information to students via PowerPoint, Word, Excel and other programs that are not designed to parse or present geologic data. More tech-savvy, and perhaps better-funded, instructors use Google Earth or ArcGIS to display geologic maps and other visual information. However, Google Earth lacks the ability to present large portions of text, and ArcGIS restricts such functionality to labels and annotations. The original Dynamic Digital Map, which we have renamed Dynamic Digital Map Classic (DDMC), allows instructors to represent both visual and large portions of textual information to students. This summer we generalized the underlying architecture of DDMC, redesigned the user interface, modernized the analytical functionality, renamed the older version and labeled this new creature Dynamic Digital Map Extended (DDME). With the new DDME instructors can showcase maps, images, articles and movies, and create digital field trips. They can set the scale, coordinate system and caption of maps and images, add symbol links to maps and images that can transport the user to any specified destination—either internally (to data contained within the DDME) or externally (to a website address). Instructors and students can also calculate non-linear distances and irregular areas of maps and images, and create digital field trips with any number of stops—complete with notes and driving directions. DDMEs are perhaps best described as a sort of computerized, self-authored, interactive textbook. To display the vast capabilities of DDME, we created a DDME of Gale Crater (DDME-GC), which is the landing site of the most sophisticated NASA Mars Rover—Curiosity. 
DDME-GC hosts six thematic maps: a detailed geologic map provided by Brad Thompson of the Boston University Center for Remote Sensing (Thompson, et al., 2010), and five maps maintained in ASU's JMARS system, including global mosaics from Mars Global Surveyor's Mars Orbiter Laser Altimeter (MOLA), Mars Odyssey's Thermal Emission Imaging System (THEMIS), and the Mars Digital Image Model. DDME-GC offers a diverse suite of images, with over 40 images captured by the High Resolution Imaging Science Experiment (HiRISE), as well as several global mosaics created from Viking Orbiter, Hubble Space Telescope, THEMIS, MOLA and HiRISE data. DDME-GC also provides more than 25 articles that span subjects from the possible origins of the mound located in Gale Crater to the goals of NASA's Mars Exploration Program. The movies hosted by DDME-GC describe the difficulties of selecting a landing site for Curiosity, landing Curiosity on Mars and several other dynamic topics. The most significant advantage of the modernized DDME is its easily augmented functionality. In the future, DDME will be able to communicate with databases, import Keyhole Markup Language (KML) files from Google Earth, and be available on the iOS and Android operating systems. (Imagine: a field trip without the burden of notebooks, pens or pencils, paper or clipboards, with this information maintained on a mobile device.) The most recent DDME is a mere skeleton of its full capabilities, a robust architecture upon which myriad functionality can be built.
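The distance and area tools described above amount to standard computational-geometry primitives: summed segment lengths for a non-linear path and the shoelace formula for an irregular polygon. A minimal sketch in map units (coordinates hypothetical; DDME's own implementation may differ):

```python
import math

def polyline_length(points):
    """Length of a digitized path: sum of its straight segments."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def polygon_area(points):
    """Shoelace formula for a simple (non-self-intersecting) polygon."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

path = [(0, 0), (3, 4), (3, 8)]              # two segments: length 5 + 4
square = [(0, 0), (4, 0), (4, 4), (0, 4)]    # 4 x 4 square
length = polyline_length(path)
area = polygon_area(square)
```

Multiplying by the map's scale factor (set by the instructor, as described above) would convert these from image units to real-world distances and areas.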

  16. Tracking changes of river morphology in Ayeyarwady River in Myanmar using earth observations and surface water mapping tool

    NASA Astrophysics Data System (ADS)

    Piman, T.; Schellekens, J.; Haag, A.; Donchyts, G.; Apirumanekul, C.; Hlaing, K. T.

    2017-12-01

    River morphology change is one of the key issues in the Ayeyarwady River in Myanmar, causing impacts on navigation, riverine habitats, agricultural lands, and communities and livelihoods near the riverbank. This study aimed to track the changes in river morphology in the middle reach of the Ayeyarwady River over the last 30 years (1984-2014) to improve understanding of riverbank dynamics and erosion and deposition processes. Earth observations including Landsat-7, Landsat-8, digital elevation models from SRTM Plus and ASTER-2, Google Maps and OpenStreetMap were obtained for the study. GIS and remote sensing tools were used to analyze changes in river morphology, while a surface water mapping tool was applied to determine the dynamic behaviour of surface water and the effect of river morphology changes. The tool consists of two components: (1) a Google Earth Engine (GEE) JavaScript or Python application that performs image analysis and (2) a user-friendly site/app using Google's appspot.com that exposes the application to the users. The results of this study show that the fluvial morphology in the middle reach of the Ayeyarwady River is continuously changing under the influence of high water flows, in particular from extreme flood events, and land use change from mining and deforestation. It was observed that some meandering sections of the riverbank were straightened, which moved sediment downstream and created new meandering sections of riverbank. Several large islands have formed, stabilized by vegetation and reinforced by sedimentation, while many small bars formed and migrated dynamically with changes in water levels and flow velocity in the wet and dry seasons. The main channel changed to a secondary channel in some sections of the river, resulting in a constant shift of the navigation route. We also found that some villages were facing riverbank erosion that can force villagers to relocate. 
The study results demonstrated that the products from earth observations and the surface water mapping tool can detect dynamic changes of river morphology in the Ayeyarwady River. This information is useful for supporting navigation and riverbank protection planning and for formulating mitigation measures for local communities that are affected by riverbank erosion.
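Optical surface-water mapping of the kind such tools automate typically starts from a spectral water index. As an illustrative sketch only (the tool described above derives its water classification dynamically per scene rather than with a fixed cut-off), a fixed-threshold NDWI classifier over toy 2x2 green and near-infrared reflectance bands:

```python
def ndwi(green, nir):
    """Normalized Difference Water Index (McFeeters) for one pixel."""
    return (green - nir) / (green + nir)

def water_mask(green_band, nir_band, threshold=0.0):
    """Classify each pixel as water (True) when NDWI exceeds the threshold."""
    return [
        [ndwi(g, n) > threshold for g, n in zip(g_row, n_row)]
        for g_row, n_row in zip(green_band, nir_band)
    ]

# Hypothetical reflectances: water is bright in green, dark in near-infrared
green = [[0.30, 0.10],
         [0.25, 0.08]]
nir   = [[0.05, 0.30],
         [0.04, 0.35]]
mask = water_mask(green, nir)  # water detected in the first column
```

Differencing masks from two acquisition dates would then expose erosion (water gained) and deposition (water lost) along the channel.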

  17. Development of Visualizations and Loggable Activities for the Geosciences. Results from Recent TUES Sponsored Projects

    NASA Astrophysics Data System (ADS)

    De Paor, D. G.; Bailey, J. E.; Whitmeyer, S. J.

    2012-12-01

    Our TUES research centers on the role of digital data, visualizations, animations, and simulations in undergraduate geoscience education. Digital hardware (smartphones, tablets, GPSs, GigaPan robotic camera mounts, etc.) is revolutionizing field data collection. Software products (GIS, 3-D scanning and modeling programs, virtual globes, etc.) have truly transformed the way geoscientists teach, learn, and do research. Whilst Google-Earth-style visualizations are famously user-friendly for the person browsing, they can be notoriously unfriendly for the content creator. Therefore, we developed tools to help educators create and share visualizations as easily as if posting on Facebook. Anyone who wishes to display geological cross sections on Google Earth can go to digitalplanet.org, upload image files, position them on a line of section, and share them with the world through our KMZ hosting service. Other tools facilitate screen overlay and 3-D map symbol generation. We advocate use of such technology to enable undergraduate students to 'publish' their first mapping efforts even while they are working in the field. A second outcome of our TUES projects merges Second-Life-style interaction with Google Earth. We created games in which students act as first responders in natural hazard mitigation, prospectors in natural resource exploration, and structural geologists making maps. Students are represented by avatars and collaborate by exchange of text messages - the natural mode of communication for the current generation. Teachers view logs showing student movements as well as transcripts of text messages, and can scaffold student learning and geofence students to prevent wandering. Early results of in-class testing show positive learning outcomes. The third aspect of our program emphasizes dissemination. Experience shows that great effort is required to overcome activation energy and ensure adoption of new technology into the curriculum. 
We organized a GSA Penrose Conference, a GSA Pardee Keynote Symposium, an AGU Town Hall Meeting, and numerous workshops at annual and regional meetings, and set up a web site dedicated to dissemination of program products. Future plans include development of augmented reality teaching resources, hosting of community mapping services, and creation of a truly 4-D virtual globe.

  18. Improving Land Cover Mapping: a Mobile Application Based on ESA Sentinel 2 Imagery

    NASA Astrophysics Data System (ADS)

    Melis, M. T.; Dessì, F.; Loddo, P.; La Mantia, C.; Da Pelo, S.; Deflorio, A. M.; Ghiglieri, G.; Hailu, B. T.; Kalegele, K.; Mwasi, B. N.

    2018-04-01

    The increasing availability of satellite data is a real asset for the enhancement of environmental knowledge and land management. Possibilities to integrate different sources of geo-data are growing, and methodologies to create thematic databases are becoming very sophisticated. Moreover, access to internet services and, in particular, to web mapping services is well developed and widespread among both expert users and citizens. Web map services like Google Maps or OpenStreetMap give access to updated optical imagery or topographic maps, but information on land cover/use is still not provided. Therefore, there are many gaps in the general utilization of, and access to, such maps by non-specialized users. This issue is particularly felt where digital (web) maps could form the basis for land use management, as they are more economical and accessible than paper maps. These conditions are well known in many African countries where, while internet access is becoming open to all, the local map agencies and their products are not widespread.

  19. A global map of rainfed cropland areas (GMRCA) at the end of last millennium using remote sensing

    USGS Publications Warehouse

    Biradar, C.M.; Thenkabail, P.S.; Noojipady, P.; Li, Y.; Dheeravath, V.; Turral, H.; Velpuri, M.; Gumma, M.K.; Gangalakunta, O.R.P.; Cai, X.L.; Xiao, X.; Schull, M.A.; Alankara, R.D.; Gunasinghe, S.; Mohideen, S.

    2009-01-01

    The overarching goal of this study was to produce a global map of rainfed cropland areas (GMRCA) and calculate country-by-country rainfed area statistics using remote sensing data. A suite of spatial datasets, methods and protocols for mapping GMRCA were described. These consist of: (a) data fusion and composition of a multi-resolution time-series mega-file data-cube (MFDC), (b) image segmentation based on precipitation, temperature, and elevation zones, (c) spectral correlation similarity (SCS), (d) protocols for class identification and labeling through use of SCS R2-values, bi-spectral plots, space-time spiral curves (ST-SCs), a rich source of field-plot data, and zoom-in views of Google Earth (GE), and (e) techniques for resolving mixed classes by decision tree algorithms and spatial modeling. The outcome was a 9-class GMRCA from which country-by-country rainfed area statistics were computed for the end of the last millennium. The global rainfed cropland area estimate from the GMRCA 9-class map was 1.13 billion hectares (Bha). The total global cropland area (rainfed plus irrigated) was 1.53 Bha, close to the national statistics compiled by FAOSTAT (1.51 Bha). The accuracies and errors of GMRCA were assessed using field-plot and Google Earth data points. The accuracy varied between 92 and 98% with a kappa value of about 0.76, errors of omission of 2-8%, and errors of commission of 19-36%. © 2008 Elsevier B.V.
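Step (c), spectral correlation similarity, can be sketched as correlating a pixel's time series against candidate class signatures and labeling the pixel with the best-matching class (highest R-squared). The class names and signature values below are invented for illustration; the study derives its actual signatures from field-plot data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def best_class(pixel_series, class_signatures):
    """Assign the class whose ideal signature correlates best (highest R^2)."""
    return max(class_signatures,
               key=lambda c: pearson_r(pixel_series, class_signatures[c]) ** 2)

# Hypothetical bimonthly vegetation-index signatures
signatures = {
    "rainfed single crop":   [0.2, 0.5, 0.8, 0.5, 0.2, 0.2],
    "irrigated double crop": [0.3, 0.7, 0.4, 0.7, 0.4, 0.3],
}
pixel = [0.25, 0.55, 0.75, 0.45, 0.25, 0.2]  # single growing-season peak
label = best_class(pixel, signatures)
```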

  20. Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)

    NASA Astrophysics Data System (ADS)

    Hancher, M.

    2013-12-01

    Geoscientists have more and more access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.

  1. The research and implementation of coalfield spontaneous combustion of carbon emission WebGIS based on Silverlight and ArcGIS server

    NASA Astrophysics Data System (ADS)

    Zhu, Z.; Bi, J.; Wang, X.; Zhu, W.

    2014-02-01

    As an important sub-topic of the construction of a public information platform for natural-process carbon emission data, a WebGIS system for carbon emissions from coalfield spontaneous combustion has become an important object of study. Given the features of coalfield spontaneous combustion carbon emission data (a wide range of rich and complex data) and their geospatial characteristics, the data were divided into attribute data and spatial data. Based on a full analysis of the data, we completed a detailed design of the Oracle database and stored the data in it. Through Silverlight rich-client technology and extensions of WCF services, we implemented dynamic web query, retrieval, statistics, analysis and other functions for the attribute data. For spatial data, we took advantage of ArcGIS Server and the Silverlight-based API to invoke map services, GP services, image services and other services published by the GIS server, implementing display, analysis and thematic map production for coalfield spontaneous combustion remote sensing imagery and web map data. The study found that Silverlight rich-client technology, combined with an object-oriented WCF service framework, can be used to construct a WebGIS system efficiently. Combined with the ArcGIS Silverlight API to achieve interactive query of the attribute and spatial data of coalfield spontaneous combustion emissions, this can greatly improve the performance of a WebGIS system. At the same time, it provides a strong guarantee for the construction of public information on China's carbon emission data.

  2. UNAVCO Software and Services for Visualization and Exploration of Geoscience Data

    NASA Astrophysics Data System (ADS)

    Meertens, C.; Wier, S.

    2007-12-01

    UNAVCO has been involved in visualization of geoscience data to support education and research for several years. An early and ongoing service is the Jules Verne Voyager, a web browser applet built on the GMT that displays any area on Earth, with many data set choices, including maps, satellite images, topography, geoid heights, sea-floor ages, strain rates, political boundaries, rivers and lakes, earthquake and volcano locations, focal mechanisms, stress axes, and observed and modeled plate motion and deformation velocity vectors from geodetic measurements around the world. As part of the GEON project, UNAVCO has developed the GEON IDV, a research-level, 4D (earth location, depth and/or altitude, and time), Java application for interactive display and analysis of geoscience data. The GEON IDV is designed to meet the challenge of investigating complex, multi-variate, time-varying, three-dimensional geoscience data anywhere on earth. The GEON IDV supports simultaneous displays of data sets from differing sources, with complete control over colors, time animation, map projection, map area, point of view, and vertical scale. The GEON IDV displays gridded and point data, images, GIS shape files, and several other types of data. The GEON IDV has symbols and displays for GPS velocity vectors, seismic tomography, earthquake focal mechanisms, earthquake locations with magnitude or depth, seismic ray paths in 3D, seismic anisotropy, convection model visualization, earth strain axes and strain field imagery, and high-resolution 3D topographic relief maps. Multiple data sources and display types may appear in one view. As an example of GEON IDV utility, it can display hypocenters under a volcano, a surface geology map of the volcano draped over 3D topographic relief, town locations and political boundaries, and real-time 3D weather radar clouds of volcanic ash in the atmosphere, with time animation. The GEON IDV can drive a GeoWall or other 3D stereo system. 
IDV output includes imagery, movies, and KML files for Google Earth use of IDV static images, where Google Earth can handle the display. The IDV can be scripted to create display images on user request or automatically on data arrival, offering the use of the IDV as a back end to support a data web site. We plan to extend the power of the IDV by accepting new data types and data services, such as GeoSciML. An active program of online and video training in GEON IDV use is planned. UNAVCO will support users who need assistance converting their data to the standard formats used by the GEON IDV. The UNAVCO Facility provides web-accessible support for Google Earth and Google Maps display of any of more than 9500 GPS stations and survey points, including metadata for each installation. UNAVCO provides corresponding Open Geospatial Consortium (OGC) web services with the same data. UNAVCO's goal is to facilitate data access, interoperability, and efficient searches, exploration, and use of data by promoting web services, standards for GEON IDV data formats and metadata, and software able to simultaneously read and display multiple data sources, formats, and map locations or projections. Retention and propagation of semantics and metadata with observational and experimental values is essential for interoperability and understanding diverse data sources.

  3. A Web-Based Earth-Systems Knowledge Portal and Collaboration Platform

    NASA Astrophysics Data System (ADS)

    D'Agnese, F. A.; Turner, A. K.

    2010-12-01

    In support of complex water-resource sustainability projects in the Great Basin region of the United States, Earth Knowledge, Inc. has developed several web-based data management and analysis platforms that have been used by its scientists, clients, and the public to facilitate information exchanges, collaborations, and decision making. These platforms support accurate water-resource decision-making by combining second-generation internet (Web 2.0) technologies with traditional 2D GIS and web-based 2D and 3D mapping systems such as Google Maps and Google Earth. Most data management and analysis systems use traditional software systems to address the data needs and usage behavior of the scientific community. In contrast, these platforms employ more accessible open-source and “off-the-shelf” consumer-oriented, hosted web-services. They exploit familiar software tools using industry standard protocols, formats, and APIs to discover, process, fuse, and visualize earth, engineering, and social science datasets. Thus, they respond to the information needs and web-interface expectations of both subject-matter experts and the public. Because the platforms continue to gather and store all the contributions of their broad spectrum of users, each new assessment leverages the data, information, and expertise derived from previous investigations. In the last year, Earth Knowledge completed a conceptual system design and feasibility study for a platform, which has a Knowledge Portal providing access to users wishing to retrieve information or knowledge developed by the science enterprise and a Collaboration Environment Module, a framework that links the user-access functions to a Technical Core supporting technical and scientific analyses including Data Management, Analysis and Modeling, and Decision Management, and to essential system administrative functions within an Administrative Module. 
The overriding technical challenge is the design and development of a single technical platform that is accessed through a flexible series of knowledge portal and collaboration environment styles reflecting the information needs and user expectations of a diverse community of users. Recent investigations have defined the information needs and expectations of the major end-users and have also reviewed and assessed a wide variety of modern web-based technologies. Combining these efforts produced design specifications and recommendations for the selection and integration of web- and client-based tools. When fully developed, the resulting platform will: support new, advanced information systems and decision environments that take full advantage of multiple data sources and platforms; provide a distribution network tailored to the timely delivery of products to the broad range of users needed to support applications in disaster management, resource management, energy, and urban sustainability; establish new integrated multiple-user requirements and knowledge databases that support researchers and promote infusion of successful technologies into existing processes; and develop new decision support strategies and presentation methodologies for applied earth science applications to reduce risk, cost, and time.

  4. Web-Based Survey Application to Collect Contextually Relevant Geographic Data With Exposure Times: Application Development and Feasibility Testing

    PubMed Central

    Tobin, Karin; Rudolph, Jonathan; Latkin, Carl

    2018-01-01

    Background Although studies that characterize the risk environment by linking contextual factors with individual-level data have advanced infectious disease and substance use research, there are opportunities to refine how we define relevant neighborhood exposures; this can in turn reduce the potential for exposure misclassification. For example, for those who do not inject at home, injection risk behaviors may be more influenced by the environment where they inject than where they live. Similarly, among those who spend more time away from home, a measure that accounts for different neighborhood exposures by weighting each unique location proportional to the percentage of time spent there may be more correlated with health behaviors than one’s residential environment. Objective This study aimed to develop a Web-based application that interacts with Google Maps application program interfaces (APIs) to collect contextually relevant locations and the amount of time spent in each. Our analysis examined the extent of overlap across different location types and compared different approaches for classifying neighborhood exposure. Methods Between May 2014 and March 2017, 547 participants enrolled in a Baltimore HIV care and prevention study completed an interviewer-administered Web-based survey that collected information about where participants were recruited, worked, lived, socialized, injected drugs, and spent most of their time. For each location, participants gave an address or intersection which they confirmed using Google Maps and Street View. Geographic coordinates (and hours spent in each location) were joined to neighborhood indicators by Community Statistical Area (CSA). We computed a weighted exposure based on the proportion of time spent in each unique location. 
We compared neighborhood exposures based on each of the different location types with one another and the weighted exposure using analysis of variance with Bonferroni corrections to account for multiple comparisons. Results Participants reported spending the most time at home, followed by the location where they injected drugs. Injection locations overlapped most frequently with locations where people reported socializing and living or sleeping. The least time was spent in the locations where participants reported earning money and being recruited for the study; these locations were also the least likely to overlap with other location types. We observed statistically significant differences in neighborhood exposures according to the approach used. Overall, people reported earning money in higher-income neighborhoods and being recruited for the study and injecting in neighborhoods with more violent crime, abandoned houses, and poverty. Conclusions This analysis revealed statistically significant differences in neighborhood exposures when defined by different locations or weighted based on exposure time. Future analyses are needed to determine which exposure measures are most strongly associated with health and risk behaviors and to explore whether associations between individual-level behaviors and neighborhood exposures are modified by exposure times. PMID:29351899
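The time-weighted exposure measure described above can be sketched as a simple weighted average: each unique location's neighborhood indicator is weighted by the share of reported time spent there. The indicator values and hours below are hypothetical:

```python
def weighted_exposure(visits):
    """Time-weighted neighborhood exposure.

    visits: list of (hours_spent, neighborhood_indicator_value) pairs,
    one per unique location reported by a participant.
    """
    total_hours = sum(hours for hours, _ in visits)
    return sum(hours * value for hours, value in visits) / total_hours

# Hypothetical example: poverty rate (%) of the CSA around each location
visits = [(12, 30.0),  # where the participant lives/sleeps
          (6, 42.0),   # where they inject drugs
          (2, 18.0)]   # where they earn money
exposure = weighted_exposure(visits)  # between the extremes, pulled toward home
```

Using only the first entry would reproduce the conventional residence-based exposure, which is exactly the contrast the study's analysis of variance examines.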

  5. MemAxes Visualization Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardware advancements such as Intel's PEBS and AMD's IBS, as well as software developments such as the perf_event API in Linux, have made it possible to acquire memory access samples annotated with performance information. MemAxes is a visualization and analysis tool for memory access sample data. By mapping the samples to their associated code, variables, node topology, and application dataset, MemAxes provides intuitive views of the data.

  6. Near real-time qualitative monitoring of lake water chlorophyll globally using Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Zlinszky, András; Supan, Peter; Koma, Zsófia

    2017-04-01

    Monitoring ocean chlorophyll and suspended sediment has been made possible using optical satellite imaging, and has contributed immensely to our understanding of the Earth and its climate. However, lake water quality monitoring has limitations due to the optical complexity of shallow, sediment- and organic matter-laden waters. Meanwhile, timely and detailed information on basic lake water quality parameters would be essential for sustainable management of inland waters. Satellite-based remote sensing can deliver area-covering, high resolution maps of basic lake water quality parameters, but scientific application of these datasets for lake monitoring has been hindered by limitations to calibration and accuracy evaluation, and therefore access to such data has been the privilege of scientific users. Nevertheless, since for many inland waters satellite imaging is the only source of monitoring data, we believe it is urgent to make map products of chlorophyll and suspended sediment concentrations available to a wide range of users. Even if absolute accuracy cannot be validated, patterns, processes and qualitative information delivered by such datasets in near-real time can act as an early warning system, raise awareness of water quality processes and serve education, in addition to complementing local monitoring activities. By making these datasets openly available on the internet through an easy to use framework, dialogue between stakeholders, management and governance authorities can be facilitated. We use Google Earth Engine to access and process archive and current satellite data. Google Earth Engine is a development and visualization framework that provides access to satellite datasets and processing capacity for analysis at the petabyte scale. Based on earlier investigations, we chose the fluorescence line height index to represent water chlorophyll concentration. 
This index relies on the chlorophyll fluorescence peak at 680 nm, and has been tested for open-ocean as well as inland lake situations using MODIS and MERIS satellite sensor data. In addition to being relatively robust and less sensitive to atmospheric influence, this algorithm is also very simple, being based on the height of the 680 nm peak above the linear interpolation of the two neighbouring bands. However, not all satellite datasets suitable for FLH are catalogued in Google Earth Engine. In the current testing phase, Landsat 7, Landsat 8 (30 m resolution), and Sentinel 2 (20 m) are being tested. Landsat 7 has a suitable band configuration, but suffers from striping due to a sensor problem. Landsat 8 and Sentinel 2 lack a spectral band optimal for FLH. Sentinel 3 would be an optimal data source and has shown good performance during small-scale initial tests, but is not distributed globally in Google Earth Engine. In addition to FLH data from these satellites, our system delivers cloud and ice masking, qualitative suspended sediment data (based on the band closest to 600 nm) and true colour images, all within an easy-to-use Google Maps background. This allows on-demand understanding and interpretation of water quality patterns and processes in near real time. While the system is still under development, we believe it could significantly contribute to lake water quality management and monitoring worldwide.

  7. The Dimensions of the Solar System

    ERIC Educational Resources Information Center

    Schneider, Stephen E.; Davis, Kathleen S.

    2007-01-01

    A few new wrinkles have been added to the popular activity of building a scale model of the solar system. Students can learn about maps and scaling using easily accessible online resources that include satellite images. This is accomplished by taking advantage of some of the special features of Google Earth. This activity gives students a much…

  8. Geospatial Services in Special Libraries: A Needs Assessment Perspective

    ERIC Educational Resources Information Center

    Barnes, Ilana

    2013-01-01

    Once limited to geographers and mapmakers, Geographic Information Systems (GIS) has taken a growing central role in information management and visualization. Geospatial services run a gamut of different products and services from Google maps to ArcGIS servers to Mobile development. Geospatial services are not new. Libraries have been writing about…

  9. Map Scale, Proportion, and Google[TM] Earth

    ERIC Educational Resources Information Center

    Roberge, Martin C.; Cooper, Linda L.

    2010-01-01

    Aerial imagery has a great capacity to engage and maintain student interest while providing a contextual setting to strengthen their ability to reason proportionally. Free, on-demand, high-resolution, large-scale aerial photography provides both a bird's eye view of the world and a new perspective on one's own community. This article presents an…

  10. Mapping land cover change over continental Africa using Landsat and Google Earth Engine cloud computing.

    PubMed

    Midekisa, Alemayehu; Holl, Felix; Savory, David J; Andrade-Pacheco, Ricardo; Gething, Peter W; Bennett, Adam; Sturrock, Hugh J W

    2017-01-01

    Quantifying and monitoring the spatial and temporal dynamics of the global land cover is critical for better understanding many of the Earth's land surface processes. However, the lack of regularly updated, continental-scale, and high spatial resolution (30 m) land cover data limits our ability to better understand the spatial extent and the temporal dynamics of land surface changes. Despite the free availability of high spatial resolution Landsat satellite data, continental-scale land cover mapping using high resolution Landsat satellite data was not feasible until now due to the need for high-performance computing to store, process, and analyze this large volume of high resolution satellite data. In this study, we present an approach to quantify continental land cover and impervious surface changes over a long period of time (15 years) using high resolution Landsat satellite observations and the Google Earth Engine cloud computing platform. The approach applied here to overcome the computational challenges of handling big earth observation data by using cloud computing can help scientists and practitioners who lack high-performance computational resources.

  11. Detecting Potential Water Quality Issues by Mapping Trophic Status Using Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Nguy-Robertson, A. L.; Harvey, K.; Huening, V.; Robinson, H.

    2017-12-01

    The identification, timing, and spatial distribution of recurrent algal blooms and aquatic vegetation can help water managers and policy makers make better water resource decisions. In many parts of the world there is little monitoring or reporting of water quality due to the required costs and effort to collect and process water samples. We propose to use Google Earth Engine to quickly identify the recurrence of trophic states in global inland water systems. Utilizing Landsat and Sentinel multispectral imagery, inland water quality parameters (e.g. chlorophyll a concentration) can be estimated and waters can be classified by trophic state: oligotrophic, mesotrophic, eutrophic, and hypereutrophic. The recurrence of eutrophic and hypereutrophic observations can highlight potentially problematic locations where algal blooms or aquatic vegetation occur routinely. Eutrophic and hypereutrophic waters commonly include many harmful algal blooms and waters prone to fish die-offs from hypoxia. While these maps may be limited by the accuracy of the algorithms utilized to estimate chlorophyll a, relative comparisons at a local scale can help water managers to focus limited resources.
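A hedged sketch of the classification step: once a chlorophyll-a concentration has been estimated per pixel, assigning a trophic state reduces to fixed thresholds. The boundaries below follow commonly cited OECD-style limits in micrograms per litre; the study's actual class limits may differ:

```python
def trophic_state(chl_a_ug_l):
    """Classify a water pixel by estimated chlorophyll-a concentration (ug/L).

    Thresholds are assumed OECD-style boundaries, not taken from the study.
    """
    if chl_a_ug_l < 2.5:
        return "oligotrophic"
    if chl_a_ug_l < 8.0:
        return "mesotrophic"
    if chl_a_ug_l < 25.0:
        return "eutrophic"
    return "hypereutrophic"

# One hypothetical estimate per class
states = [trophic_state(c) for c in (1.0, 5.0, 15.0, 60.0)]
```

Counting how often a pixel falls in the eutrophic or hypereutrophic class across a time series would then yield the recurrence map the abstract describes.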

  12. Mapping land cover change over continental Africa using Landsat and Google Earth Engine cloud computing

    PubMed Central

    Holl, Felix; Savory, David J.; Andrade-Pacheco, Ricardo; Gething, Peter W.; Bennett, Adam; Sturrock, Hugh J. W.

    2017-01-01

    Quantifying and monitoring the spatial and temporal dynamics of the global land cover is critical for better understanding many of the Earth’s land surface processes. However, the lack of regularly updated, continental-scale, and high spatial resolution (30 m) land cover data limits our ability to better understand the spatial extent and the temporal dynamics of land surface changes. Despite the free availability of high spatial resolution Landsat satellite data, continental-scale land cover mapping using high resolution Landsat satellite data was not feasible until now due to the need for high-performance computing to store, process, and analyze this large volume of high resolution satellite data. In this study, we present an approach to quantify continental land cover and impervious surface changes over a long period of time (15 years) using high resolution Landsat satellite observations and the Google Earth Engine cloud computing platform. The approach applied here, which overcomes the computational challenges of handling big Earth observation data by using cloud computing, can help scientists and practitioners who lack high-performance computational resources. PMID:28953943
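    The per-pixel change accounting behind such a product can be illustrated without Earth Engine. The sketch below cross-tabulates two toy classified rasters with NumPy; the class codes and arrays are invented stand-ins for the continental-scale Landsat classifications of the two epochs.

```python
import numpy as np

# Toy stand-in for a two-epoch land cover change summary; class codes
# are assumptions for illustration only.
WATER, VEGETATION, IMPERVIOUS = 0, 1, 2

def change_matrix(epoch_a, epoch_b, n_classes=3):
    """Cross-tabulate class transitions between two classified rasters."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for ca, cb in zip(epoch_a.ravel(), epoch_b.ravel()):
        m[ca, cb] += 1
    return m

epoch_a = np.array([[VEGETATION, VEGETATION], [WATER, IMPERVIOUS]])
epoch_b = np.array([[IMPERVIOUS, VEGETATION], [WATER, IMPERVIOUS]])
m = change_matrix(epoch_a, epoch_b)
# Pixels that became impervious between the two epochs:
impervious_gain = m[:IMPERVIOUS, IMPERVIOUS].sum()
```

    On Earth Engine the same cross-tabulation would be expressed as a reducer over image pairs rather than a Python loop, but the bookkeeping is identical.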

  13. A landslide susceptibility map of Africa

    NASA Astrophysics Data System (ADS)

    Broeckx, Jente; Vanmaercke, Matthias; Duchateau, Rica; Poesen, Jean

    2017-04-01

    Studies on landslide risks and fatalities indicate that landslides are a global threat to humans, infrastructure and the environment, not least in Africa. Nonetheless, our understanding of the spatial patterns of landslides and rockfalls on this continent is very limited. In global landslide susceptibility maps, Africa is also mostly underrepresented in the inventories used to construct these maps. As a result, predicted landslide susceptibilities remain subject to very large uncertainties. This research aims to produce a first continent-wide landslide susceptibility map for Africa, calibrated with a well-distributed landslide dataset. As a first step, we compiled all available landslide inventories for Africa. These data were supplemented by additional landslide mapping with Google Earth in underrepresented regions. In this way, we compiled 60 landslide inventories from the literature (ca. 11000 landslides) and an additional 6500 landslides through mapping in Google Earth (including 1500 rockfalls). Various environmental variables, such as slope, lithology, soil characteristics, land use, precipitation and seismic activity, were investigated for their significance in explaining the observed spatial patterns of landslides. To account for potential mapping biases in our dataset, we used Monte Carlo simulations that selected different subsets of mapped landslides, tested the significance of the considered environmental variables, and evaluated the performance of the fitted multiple logistic regression model against another subset of mapped landslides. Based on these analyses, we constructed two landslide susceptibility maps for Africa: one for all landslide types and one excluding rockfalls. In both maps, topography, lithology and seismic activity were the most significant variables. The latter factor may be surprising, given the overall limited degree of seismicity in Africa. However, its significance indicates that frequent seismic events may serve as an important preparatory factor for landslides. This finding concurs with several other recent studies. Rainfall explains a significant but limited part of the observed landslide pattern and becomes insignificant when rockfalls are also considered. This may be explained by the fact that a significant fraction of the mapped rockfalls occurred in the Sahara desert. Overall, both maps perform well in predicting intra-continental patterns of mass movements in Africa and explain about 80% of the observed variance in landslide occurrence. As a result, these maps may be a valuable tool for planning and risk reduction strategies.
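    The multiple-logistic-regression model family used for such susceptibility maps can be sketched in a few lines. The coefficients below are invented for illustration (they are not the fitted values from the study), and the predictors are reduced to slope, peak ground acceleration, and a lithology flag.

```python
import math

# Minimal sketch of a logistic-regression susceptibility score; the
# coefficients are made up for illustration, not fitted values.
COEF = {"intercept": -4.0, "slope_deg": 0.08, "pga": 2.5, "soft_lithology": 1.2}

def susceptibility(slope_deg, pga, soft_lithology):
    """Return a probability-like landslide susceptibility score in (0, 1)."""
    z = (COEF["intercept"]
         + COEF["slope_deg"] * slope_deg
         + COEF["pga"] * pga
         + COEF["soft_lithology"] * soft_lithology)
    return 1.0 / (1.0 + math.exp(-z))

flat_terrain = susceptibility(slope_deg=2, pga=0.05, soft_lithology=0)
steep_seismic = susceptibility(slope_deg=35, pga=0.3, soft_lithology=1)
```

    Fitting such a model to repeated Monte Carlo subsets of the landslide inventory, as the abstract describes, yields a distribution of coefficients from which the robustness of each predictor can be judged.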

  14. Snake River Plain Geothermal Play Fairway Analysis - Phase 1 KMZ files

    DOE Data Explorer

    John Shervais

    2015-10-10

    This dataset contains raw data in KMZ files (Google Earth georeferenced format). These files include volcanic vent locations and ages, the distribution of fine-grained lacustrine sediments (which act as both a seal and an insulating layer for hydrothermal fluids), and post-Miocene faults compiled from the Idaho Geological Survey, the USGS Quaternary Fault database, and unpublished mapping. The dataset also contains the Composite Common Risk Segment Map created during the Phase 1 studies, as well as a file with the locations of select deep wells used to interrogate the subsurface.
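    A KMZ file is simply a zip archive containing a KML document at its root (conventionally named doc.kml). A minimal sketch of producing one, with an invented placemark name and coordinates rather than anything from this dataset:

```python
import os
import tempfile
import zipfile

# Example KML payload; the placemark name and coordinates are made up.
KML = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Volcanic vent (example)</name>
    <Point><coordinates>-114.5,43.2,0</coordinates></Point>
  </Placemark>
</kml>
"""

def write_kmz(path, kml_text):
    """Write KML into a zip archive as doc.kml, producing a KMZ."""
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr("doc.kml", kml_text)

out_path = os.path.join(tempfile.gettempdir(), "vents_example.kmz")
write_kmz(out_path, KML)
```

    Opening the resulting file in Google Earth displays the placemark at the given longitude/latitude.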

  15. Combating Conflict Related Sexual Violence: More Than a Stability Concern

    DTIC Science & Technology

    2014-06-13

    violence can cause serious bodily harm or mental harm to members of the group (International Criminal Court 2002, 3; Ellis 2007). Under crimes against...population was subjugated to Japanese rule and were provided with horrific visual, physical, and emotional reminders of the futility of any...maps.google.com/maps/ms?msid=214870171076954118166.0004b9bcb533b0ee2c1f8&msa=0&ie=UTF8&ll=12.848235,58.136902&spn=43.135476,135.258178&t=m&output=embed

  16. Accuracy comparison in mapping water bodies using Landsat images and Google Earth Images

    NASA Astrophysics Data System (ADS)

    Zhou, Z.; Zhou, X.

    2016-12-01

    Much research has been carried out on the extraction of water bodies from satellite images. Water indices computed from multi-spectral images are the most widely used methods for water body extraction. When extracting the area of water bodies from satellite images, accuracy may depend on the spatial resolution of the images and the relative size of the water bodies. To quantify the impact of spatial resolution and size (major and minor lengths) of the water bodies on the accuracy of water area extraction, we use Georgetown Lake, Montana and coalbed methane (CBM) water retention ponds in the Montana Powder River Basin as test sites. Data sources include Landsat images and Google Earth images covering both large water bodies and small ponds. First, we used water indices to extract water coverage from Landsat images for both the large lake and the small ponds. Second, we used a newly developed visible-index method to extract water coverage from Google Earth images covering both. Third, we used an image fusion method in which the Google Earth images are fused with multi-spectral Landsat images to obtain multi-spectral images at the same high spatial resolution as the Google Earth images. The actual areas of the lake and ponds were measured using GPS surveys. Results will be compared and the optimal method selected for water body extraction.
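    As an illustration of the index-based approach, the sketch below computes McFeeters' NDWI, (Green − NIR) / (Green + NIR), over synthetic band values and converts the resulting mask to an area. The 0.0 threshold is a common default, not necessarily the one used in this study, and the reflectance values are invented.

```python
import numpy as np

def ndwi(green, nir):
    """McFeeters' Normalized Difference Water Index; water tends to be > 0."""
    green = green.astype(float)
    nir = nir.astype(float)
    return (green - nir) / (green + nir + 1e-9)  # epsilon avoids 0/0

def water_mask(green, nir, threshold=0.0):
    return ndwi(green, nir) > threshold

# Synthetic 2x2 scene: left column water-like, right column vegetation-like.
green = np.array([[0.30, 0.08], [0.28, 0.07]])
nir   = np.array([[0.05, 0.40], [0.06, 0.35]])
mask = water_mask(green, nir)
area_m2 = mask.sum() * 30 * 30  # Landsat pixels are 30 m x 30 m
```

    Comparing this per-pixel area against a GPS-surveyed shoreline is the basic accuracy check the abstract describes.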

  17. Open Availability of Patient Medical Photographs in Google Images Search Results: Cross-Sectional Study of Transgender Research.

    PubMed

    Marshall, Zack; Brunger, Fern; Welch, Vivian; Asghari, Shabnam; Kaposy, Chris

    2018-02-26

    This paper focuses on the collision of three factors: a growing emphasis on sharing research through open access publication, an increasing awareness of big data and its potential uses, and an engaged public interested in the privacy and confidentiality of their personal health information. One conceptual space where this collision is brought into sharp relief is with the open availability of patient medical photographs from peer-reviewed journal articles in the search results of online image databases such as Google Images. The aim of this study was to assess the availability of patient medical photographs from published journal articles in Google Images search results and the factors impacting this availability. We conducted a cross-sectional study using data from an evidence map of research with transgender, gender non-binary, and other gender diverse (trans) participants. For the original evidence map, a comprehensive search of 15 academic databases was developed in collaboration with a health sciences librarian. Initial search results produced 25,230 references after duplicates were removed. Eligibility criteria were established to include empirical research of any design that included trans participants or their personal information and that was published in English in peer-reviewed journals. We identified all articles published between 2008 and 2015 with medical photographs of trans participants. For each reference, images were individually numbered in order to track the total number of medical photographs. We used odds ratios (OR) to assess the association between availability of the clinical photograph on Google Images and the following factors: whether the article was openly available online (open access, Researchgate.net, or Academia.edu), whether the article included genital images, if the photographs were published in color, and whether the photographs were located on the journal article landing page. 
We identified 94 articles with medical photographs of trans participants, including a total of 605 photographs. Of the 94 publications, 35 (37%) included at least one medical photograph that was found on Google Images. The ability to locate the article freely online contributes to the availability of at least one image from the article on Google Images (OR 2.99, 95% CI 1.20-7.45). This is the first study to document the existence of medical photographs from peer-reviewed journals appearing in Google Images search results. Almost all of the images we searched for included sensitive photographs of patient genitals, chests, or breasts. Given that it is unlikely that patients consented to sharing their personal health information in these ways, this constitutes a risk to patient privacy. Based on the impact of current practices, revisions to informed consent policies and guidelines are required. ©Zack Marshall, Fern Brunger, Vivian Welch, Shabnam Asghari, Chris Kaposy. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 26.02.2018.
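    The reported association can be reproduced in form (though not with the study's data) as a standard 2x2-table odds ratio with a Wald confidence interval on the log-odds scale; the counts below are invented.

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio (a/b)/(c/d) with a 95% Wald CI.
    a, b: exposed articles with image found / not found;
    c, d: unexposed articles with image found / not found."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts for "article freely available online" as the exposure.
or_, lo, hi = odds_ratio(a=20, b=10, c=15, d=49)
```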

  18. libNeuroML and PyLEMS: using Python to combine procedural and declarative modeling approaches in computational neuroscience.

    PubMed

    Vella, Michael; Cannon, Robert C; Crook, Sharon; Davison, Andrew P; Ganapathy, Gautham; Robinson, Hugh P C; Silver, R Angus; Gleeson, Padraig

    2014-01-01

    NeuroML is an XML-based model description language, which provides a powerful common data format for defining and exchanging models of neurons and neuronal networks. In the latest version of NeuroML, the structure and behavior of ion channel, synapse, cell, and network model descriptions are based on underlying definitions provided in LEMS, a domain-independent language for expressing hierarchical mathematical models of physical entities. While declarative approaches for describing models have led to greater exchange of model elements among software tools in computational neuroscience, a frequent criticism of XML-based languages is that they are difficult to work with directly. Here we describe two Application Programming Interfaces (APIs) written in Python (http://www.python.org), which simplify the process of developing and modifying models expressed in NeuroML and LEMS. The libNeuroML API provides a Python object model with a direct mapping to all NeuroML concepts defined by the NeuroML Schema, which facilitates reading and writing the XML equivalents. In addition, it offers a memory-efficient, array-based internal representation, which is useful for handling large-scale connectomics data. The libNeuroML API also includes support for performing common operations that are required when working with NeuroML documents. Access to the LEMS data model is provided by the PyLEMS API, which provides a Python implementation of the LEMS language, including the ability to simulate most models expressed in LEMS. Together, libNeuroML and PyLEMS provide a comprehensive solution for interacting with NeuroML models in a Python environment.
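    The round trip that libNeuroML automates (Python objects to NeuroML XML and back) can be sketched with only the standard library. The element and attribute names below follow NeuroML v2 conventions, but this fragment is illustrative, not a validated model, and does not use the libNeuroML object model itself.

```python
import xml.etree.ElementTree as ET

NS = "http://www.neuroml.org/schema/neuroml2"
ET.register_namespace("", NS)  # serialize with a default namespace

# Build a minimal NeuroML-style document containing one cell element.
root = ET.Element(f"{{{NS}}}neuroml", id="example_doc")
ET.SubElement(root, f"{{{NS}}}izhikevich2007Cell",
              id="izh0", C="100pF", k="0.7nS_per_mV")
xml_text = ET.tostring(root, encoding="unicode")

# Read it back, as a stand-in for loading a NeuroML document.
parsed = ET.fromstring(xml_text)
cell_ids = [e.get("id") for e in parsed.iter(f"{{{NS}}}izhikevich2007Cell")]
```

    libNeuroML replaces this hand-rolled XML handling with schema-derived classes, so the same document is built and queried through typed Python objects instead.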

  19. An application programming interface for CellNetAnalyzer.

    PubMed

    Klamt, Steffen; von Kamp, Axel

    2011-08-01

    CellNetAnalyzer (CNA) is a MATLAB toolbox providing computational methods for studying structure and function of metabolic and cellular signaling networks. In order to allow non-experts to use these methods easily, CNA provides GUI-based interactive network maps as a means of parameter input and result visualization. However, with the availability of high-throughput data, there is a need to make CNA's functionality also accessible in batch mode for automatic data processing. Furthermore, as some algorithms of CNA are of general relevance for network analysis it would be desirable if they could be called as sub-routines by other applications. For this purpose, we developed an API (application programming interface) for CNA allowing users (i) to access the content of network models in CNA, (ii) to use CNA's network analysis capabilities independent of the GUI, and (iii) to interact with the GUI to facilitate the development of graphical plugins. Here we describe the organization of network projects in CNA and the application of the new API functions to these projects. This includes the creation of network projects from scratch, loading and saving of projects and scenarios, and the application of the actual analysis methods. Furthermore, API functions for the import/export of metabolic models in SBML format and for accessing the GUI are described. Lastly, two example applications demonstrate the use and versatile applicability of CNA's API. CNA is freely available for academic use and can be downloaded from http://www.mpi-magdeburg.mpg.de/projects/cna/cna.html. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  20. libNeuroML and PyLEMS: using Python to combine procedural and declarative modeling approaches in computational neuroscience

    PubMed Central

    Vella, Michael; Cannon, Robert C.; Crook, Sharon; Davison, Andrew P.; Ganapathy, Gautham; Robinson, Hugh P. C.; Silver, R. Angus; Gleeson, Padraig

    2014-01-01

    NeuroML is an XML-based model description language, which provides a powerful common data format for defining and exchanging models of neurons and neuronal networks. In the latest version of NeuroML, the structure and behavior of ion channel, synapse, cell, and network model descriptions are based on underlying definitions provided in LEMS, a domain-independent language for expressing hierarchical mathematical models of physical entities. While declarative approaches for describing models have led to greater exchange of model elements among software tools in computational neuroscience, a frequent criticism of XML-based languages is that they are difficult to work with directly. Here we describe two Application Programming Interfaces (APIs) written in Python (http://www.python.org), which simplify the process of developing and modifying models expressed in NeuroML and LEMS. The libNeuroML API provides a Python object model with a direct mapping to all NeuroML concepts defined by the NeuroML Schema, which facilitates reading and writing the XML equivalents. In addition, it offers a memory-efficient, array-based internal representation, which is useful for handling large-scale connectomics data. The libNeuroML API also includes support for performing common operations that are required when working with NeuroML documents. Access to the LEMS data model is provided by the PyLEMS API, which provides a Python implementation of the LEMS language, including the ability to simulate most models expressed in LEMS. Together, libNeuroML and PyLEMS provide a comprehensive solution for interacting with NeuroML models in a Python environment. PMID:24795618

  1. AFRC2016-0054-528

    NASA Image and Video Library

    2016-02-27

    Sam Choi and Naiara Pinto observe Google Earth overlaid, in near real time, with what the synthetic aperture radar is mapping from the C-20A aircraft. Researchers were in the sky and on the ground to take measurements of plant mass, the distribution of trees, shrubs, and ground cover, the diversity of plants, and how much carbon is absorbed by them.

  2. Ingress in Geography: Portals to Academic Success?

    ERIC Educational Resources Information Center

    Davis, Michael

    2017-01-01

    Niantic Labs has developed an augmented virtual reality mobile app game called Ingress in which agents must seek out and control locations for their designated factions. The app uses the Google Maps interface along with GPS to enhance a geocaching-like experience with elements of other classical games such as capture-the-flag. This study aims to…

  3. Applying Modern Stage Theory to Mauritania: A Prescription to Encourage Entrepreneurship

    DTIC Science & Technology

    2014-12-01

    entrepreneurship, stage theory, development, Africa , factor-driven, trade freedom, business freedom 15. NUMBER OF PAGES 77 16. PRICE CODE 17...SOUTH ASIA, SUB-SAHARAN AFRICA ) from the NAVAL POSTGRADUATE SCHOOL December 2014 Author: Jennifer M. Warren Approved by: Robert E...Notes, Coins) .......................................................................... 4  Figure 2.  Satellite map of West Africa (from Google Earth

  4. Re-Purposing Google Maps Visualisation for Teaching Logistics Systems

    ERIC Educational Resources Information Center

    Cheong, France; Cheong, Christopher; Jie, Ferry

    2012-01-01

    Routing is the process of selecting appropriate paths and ordering waypoints in a network. It plays an important part in logistics and supply chain management as choosing the optimal route can minimise distribution costs. Routing optimisation, however, is a difficult problem to solve and computer software is often used to determine the best route.…

  5. Regional early flood warning system: design and implementation

    NASA Astrophysics Data System (ADS)

    Chang, L. C.; Yang, S. N.; Kuo, C. L.; Wang, Y. F.

    2017-12-01

    This study proposes a prototype of a regional early flood inundation warning system in Tainan City, Taiwan. AI technology is used to forecast multi-step-ahead regional flood inundation maps during storm events. The computing time is only a few seconds, which enables real-time regional flood inundation forecasting. A database is built to organize the data and information used for building the real-time forecasting models, maintaining the relations of forecasted points, and displaying forecasted results, while real-time data acquisition is another key task, as the model requires immediate access to rain gauge information to provide forecast services. All database-related programs are built on Microsoft SQL Server using Visual C# to extract real-time hydrological data, manage data, store the forecasted data, and supply the information to the visual map-based display. The regional early flood inundation warning system uses up-to-date Web technologies, driven by the database and real-time data acquisition, to display the on-line forecast flood inundation depths in the study area. The friendly interface sequentially shows the inundated area on Google Maps with the maximum inundation depth and its location, and provides a KMZ download of the results that can be viewed in Google Earth. The developed system provides all the relevant information and on-line forecast results, helping city authorities to make decisions during typhoon events and take actions to mitigate losses.

  6. How Would You Move Mount Fuji - And Why Would You Want To?

    NASA Astrophysics Data System (ADS)

    de Paor, D. G.

    2008-12-01

    According to author William Poundstone, "How Would You Move Mt Fuji?" typifies the kind of question that corporations such as Microsoft are wont to ask job applicants in order to test their lateral thinking skills. One answer (albeit not one that would necessarily secure a job at Microsoft) is: "With Google Earth and a Macintosh or PC." The answer to the more profound follow-up question "Why Would You Want To?" is hinted at by one of the great quotations of earth science, namely Charles Lyell's proposition that "The Present Is Key to the Past." Google Earth is a phenomenally powerful tool for visualizing today's earth, ocean, and atmosphere. With the aid of Google SketchUp, that visualization can be extended to reconstruct the past using relocated samples of present-day landscapes and environments as models of paleo-DEM and paleogeography. Volcanoes are particularly useful models because their self similar growth can be simulated by changing KML altitude tags within a timespan, but numerous other landforms and geologic structures serve as useful keys to the past. Examples range in scale from glaciers and fault scarps to island arcs and mountain ranges. The ability to generate a paleo-terrain model in Google Earth brings us one step closer to a truly four- dimensional, interactive geological map of the world throughout time.

  7. The Snow Data System at NASA JPL

    NASA Astrophysics Data System (ADS)

    Laidlaw, R.; Painter, T. H.; Mattmann, C. A.; Ramirez, P.; Bormann, K.; Brodzik, M. J.; Burgess, A. B.; Rittger, K.; Goodale, C. E.; Joyce, M.; McGibbney, L. J.; Zimdars, P.

    2014-12-01

    NASA JPL's Snow Data System has a data-processing pipeline powered by Apache OODT, an open source software tool. The pipeline has been running for several years and has successfully generated a significant amount of cryosphere data, including MODIS-based products such as MODSCAG, MODDRFS and MODICE, with historical and near-real time windows and covering regions such as the Arctic, Western US, Alaska, Central Europe, Asia, South America, Australia and New Zealand. The team continues to improve the pipeline, using monitoring tools such as Ganglia to give an overview of operations, and improving fault-tolerance with automated recovery scripts. Several alternative adaptations of the Snow Covered Area and Grain size (SCAG) algorithm are being investigated. These include using VIIRS and Landsat TM/ETM+ satellite data as inputs. Parallel computing techniques are being considered for core SCAG processing, such as using the PyCUDA Python API to utilize multi-core GPU architectures. An experimental version of MODSCAG is also being developed for the Google Earth Engine platform, a cloud-based service.

  8. Seismicity map tools for earthquake studies

    NASA Astrophysics Data System (ADS)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools, for use within Google Maps, for earthquake research. We demonstrate this server-based online platform (developed with PHP, JavaScript, and MySQL) and its new tools using a database of earthquake data. The platform allows us to carry out statistical and deterministic analyses of earthquake data with Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw on the map line segments, multiple straight lines (horizontal and vertical), and multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake cluster shift between segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots, such as cumulative earthquake magnitude plots and earthquake magnitude histograms, and calculation of the b-value, etc. What is novel in the platform is the additional set of deterministic tools. Using the newly developed horizontal line, vertical line, and circle tools, we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
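    One of the statistics mentioned, the Gutenberg-Richter b-value, can be estimated with the classic maximum-likelihood formula (Aki, 1965): b = log10(e) / (mean(M) − Mc), where Mc is the magnitude of completeness. The magnitudes below are synthetic.

```python
import math

def b_value(magnitudes, completeness_mag):
    """Aki (1965) maximum-likelihood b-value estimate for events at or
    above the magnitude of completeness."""
    above = [m for m in magnitudes if m >= completeness_mag]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - completeness_mag)

# Synthetic catalog of local magnitudes.
mags = [2.1, 2.4, 2.2, 3.0, 2.6, 2.3, 2.8, 2.5]
b = b_value(mags, completeness_mag=2.0)
```

    In a platform like the one described, this calculation would run server-side over whatever magnitude and time filters the user has applied on the map.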

  9. The New USGS Volcano Hazards Program Web Site

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Graham, S. E.; Parker, T. J.; Snedigar, S. F.

    2008-12-01

    The U.S. Geological Survey's (USGS) Volcano Hazard Program (VHP) has launched a revised web site that uses a map-based interface to display hazards information for U.S. volcanoes. The web site is focused on better communication of hazards and background volcano information to our varied user groups by reorganizing content based on user needs and improving data display. The Home Page provides a synoptic view of the activity level of all volcanoes for which updates are written using a custom Google® Map. Updates are accessible by clicking on one of the map icons or clicking on the volcano of interest in the adjacent color-coded list of updates. The new navigation provides rapid access to volcanic activity information, background volcano information, images and publications, volcanic hazards, information about VHP, and the USGS volcano observatories. The Volcanic Activity section was tailored for emergency managers but provides information for all our user groups. It includes a Google® Map of the volcanoes we monitor, an Elevated Activity Page, a general status page, information about our Volcano Alert Levels and Aviation Color Codes, monitoring information, and links to monitoring data from VHP's volcano observatories: Alaska Volcano Observatory (AVO), Cascades Volcano Observatory (CVO), Long Valley Observatory (LVO), Hawaiian Volcano Observatory (HVO), and Yellowstone Volcano Observatory (YVO). The YVO web site was the first to move to the new navigation system and we are working on integrating the Long Valley Observatory web site next. We are excited to continue to implement new geospatial technologies to better display our hazards and supporting volcano information.

  10. Digital surveying and mapping of forest road network for development of a GIS tool for the effective protection and management of natural ecosystems

    NASA Astrophysics Data System (ADS)

    Drosos, Vasileios C.; Liampas, Sarantis-Aggelos G.; Doukas, Aristotelis-Kosmas G.

    2014-08-01

    Geographic Information Systems (GIS) have become important tools, not only in the geosciences and environmental sciences, but for virtually all research that requires monitoring, planning, or land management. The purpose of this paper was to develop a planning and decision-making tool using AutoCAD Map, ArcGIS, and Google Earth, with emphasis on investigating the suitability of forest road mapping and the range of its implementation in Greece at the prefecture level. Integrating spatial information into a database makes data available throughout an organization, improving quality, productivity, and data management. Working in such an environment, users can access and edit information, integrate and analyze data, and communicate effectively, for example by selecting desirable information such as the forest road network at a very early stage in the planning of silviculture operations, before harvest planning is carried out. AutoCAD Map was used to export the GPS data as shapefiles, ArcGIS (ArcGlobe) to process the shapefiles, and Google Earth with KML (Keyhole Markup Language) files to better visualize and evaluate existing conditions, design in a real-world context, and exchange information with government agencies, utilities, and contractors in both CAD and GIS data formats. The automation of the updating procedure and the transfer of files between agencies and departments is one of the main tasks the integrated GIS tool should address.

  11. [Implementation of Oncomelania hupensis monitoring system based on Baidu Map].

    PubMed

    Zhi-Hua, Chen; Yi-Sheng, Zhu; Zhi-Qiang, Xue; Xue-Bing, Li; Yi-Min, Ding; Li-Jun, Bi; Kai-Min, Gao; You, Zhang

    2017-10-25

    To construct an Oncomelania hupensis snail monitoring system based on Baidu Map, basic environmental information about historical and existing snail environments was collected, together with monitoring data on different kinds of O. hupensis snails, and the monitoring system was then built. Geographic Information System (GIS) technology, electronic fence technology, and the Application Programming Interface (API) were applied to set up electronic fences around the snail surveillance environments, and the electronic fences were connected to the snail surveillance database. The O. hupensis snail monitoring system based on Baidu Map was built with three modules: the O. hupensis Snail Monitoring Environmental Database, the Dynamic Monitoring Platform, and the Electronic Map. Information on monitoring O. hupensis snails can be obtained through both computers and smartphones. The O. hupensis snail monitoring system based on Baidu Map is a visual platform for following the process of snail searching and molluscaciding.
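    The core of an electronic fence is a point-in-polygon test against the fenced environment's boundary. A minimal ray-casting sketch follows, with a made-up fence polygon in (longitude, latitude); the Baidu Map API provides its own geometry utilities, so this only illustrates the underlying idea.

```python
def inside_fence(point, fence):
    """Ray-casting point-in-polygon test.
    point: (lon, lat); fence: list of (lon, lat) polygon vertices."""
    x, y = point
    inside = False
    n = len(fence)
    for i in range(n):
        x1, y1 = fence[i]
        x2, y2 = fence[(i + 1) % n]
        # Count edges whose span crosses the point's latitude...
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            # ...and lie to the east of the point.
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical rectangular fence around a surveillance environment.
fence = [(120.10, 31.50), (120.20, 31.50), (120.20, 31.60), (120.10, 31.60)]
```

    A field worker's smartphone position can then be checked against the fence to decide whether a snail-search record belongs to the monitored environment.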

  12. Google Earth-Based Grand Tours of the World's Ocean Basins and Marine Sediments

    NASA Astrophysics Data System (ADS)

    St John, K. K.; De Paor, D. G.; Suranovic, B.; Robinson, C.; Firth, J. V.; Rand, C.

    2016-12-01

    The GEODE project has produced a collection of Google Earth-based marine geology teaching resources that offer grand tours of the world's ocean basins and marine sediments. We use a map of oceanic crustal ages from Müller et al (2008; doi:10.1029/2007GC001743), and a set of emergent COLLADA models of IODP drill core data as a basis for a Google Earth tour introducing students to the world's ocean basins. Most students are familiar with basic seafloor spreading patterns but teaching experience suggests that few students have an appreciation of the number of abandoned ocean basins on Earth. Students also lack a valid visualization of the west Pacific where the oldest crust forms an isolated triangular patch and the ocean floor becomes younger towards the subduction zones. Our tour links geographic locations to mechanical models of rifting, seafloor spreading, subduction, and transform faulting. Google Earth's built-in earthquake and volcano data are related to ocean floor patterns. Marine sediments are explored in a Google Earth tour that draws on exemplary IODP core samples of a range of sediment types (e.g., turbidites, diatom ooze). Information and links are used to connect location to sediment type. This tour complements a physical core kit of core catcher sections that can be employed for classroom instruction (geode.net/marine-core-kit/). At a larger scale, we use data from IMLGS to explore the distribution of the marine sediment types in the modern global ocean. More than 2,500 sites are plotted with access to original data. Students are guided to compare modern "type sections" of primary marine sediment lithologies, as well as examine site transects to address questions of bathymetric setting, ocean circulation, chemistry (e.g., CCD), and bioproductivity as influences on modern seafloor sedimentation. KMZ files, student exercises, and tips for instructors are available at geode.net/exploring-marine-sediments-using-google-earth.

  13. Towards the Development and Validation of a Global Field Size and Irrigation Map using Crowdsourcing, Mobile Apps and Google Earth Engine in support of GEOGLAM

    NASA Astrophysics Data System (ADS)

    Fritz, S.; Nordling, J.; See, L. M.; McCallum, I.; Perger, C.; Becker-Reshef, I.; Mucher, S.; Bydekerke, L.; Havlik, P.; Kraxner, F.; Obersteiner, M.

    2014-12-01

The International Institute for Applied Systems Analysis (IIASA) has developed a global cropland extent map, which supports the monitoring and assessment activities of GEOGLAM (Group on Earth Observations Global Agricultural Monitoring Initiative). Through the European-funded SIGMA (Stimulating Innovation for Global Monitoring of Agriculture and its Impact on the Environment in support of GEOGLAM) project, IIASA is continuing to support GEOGLAM by providing future cropland projections and modelling environmental impacts on agriculture under various scenarios. In addition, IIASA is focusing on two specific elements within SIGMA: the development of a global field size and irrigation map; and mobile app development for in-situ data collection and validation of remotely-sensed products. Cropland field size is a very useful indicator for agricultural monitoring, yet the information available at a global scale is currently very limited. IIASA has already created a global map of field size at a 1 km resolution using crowdsourced data from Geo-Wiki as a first approximation. Using automatic classification of Landsat imagery and algorithms contained within Google Earth Engine, initial experimentation has shown that circular fields and landscape structures can easily be extracted. Not only will this contribute to improving the global map of field size, it can also be used to create a global map that contains a large proportion of the world's irrigated areas, which will be another useful contribution to GEOGLAM. The field size map will also be used to stratify and develop a global crop map in SIGMA. Mobile app development in support of in-situ data collection is another area in which IIASA is currently working. An Android app has been built using the Open Data Kit (ODK) and extended with spatial mapping capabilities as GeoODK. 
The app allows users to collect data on different crop types and delineate fields on the ground, which can be used to validate the field size map. The app can also cache map data so that high resolution satellite imagery and reference data from the users can be viewed in the field without the need for an internet connection. This app will be used for calibration and validation of the data products in SIGMA, e.g. data collection at JECAM (Joint Experiment of Crop Assessment and Monitoring) sites.
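The field-delineation feature described above amounts to computing the area enclosed by GPS-traced polygon vertices, which can then be compared against the field size map. A minimal sketch (the function name and the local planar approximation are illustrative; a production app such as GeoODK would use proper geodesic area calculations):

```python
import math

def field_area_m2(vertices):
    """Approximate area of a delineated field polygon.

    `vertices` are (lat, lon) pairs traced on the ground. They are
    projected to a local planar frame in metres before applying the
    shoelace formula -- a rough sketch, adequate for small fields.
    """
    lat0 = sum(v[0] for v in vertices) / len(vertices)
    m_per_deg_lat = 111_320.0  # metres per degree of latitude (approx.)
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(lat0))
    pts = [(lon * m_per_deg_lon, lat * m_per_deg_lat) for lat, lon in vertices]
    area = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# A roughly 111 m x 100 m field near the equator, about 1.1 ha
area = field_area_m2([(0.0, 0.0), (0.0, 0.001),
                      (0.0008983, 0.001), (0.0008983, 0.0)])
```

A delineated area of this kind could be binned against the 1 km field-size classes to flag disagreements for review.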

  14. The GLIMS Glacier Database

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Khalsa, S. S.; Armstrong, R.

    2007-12-01

The Global Land Ice Measurements from Space (GLIMS) project has built a geospatial and temporal database of glacier data, composed of glacier outlines and various scalar attributes. These data are being derived primarily from satellite imagery, such as from ASTER and Landsat. Each "snapshot" of a glacier is from a specific time, and the database is designed to store multiple snapshots representative of different times. We have implemented two web-based interfaces to the database; one enables exploration of the data via interactive maps (web map server), while the other allows searches based on text-field constraints. The web map server is an Open Geospatial Consortium (OGC) compliant Web Map Server (WMS) and Web Feature Server (WFS). This means that other web sites can display glacier layers from our site over the Internet, or retrieve glacier features in vector format. All components of the system are implemented using Open Source software: Linux, PostgreSQL, PostGIS (geospatial extensions to the database), MapServer (WMS and WFS), and several supporting components such as Proj.4 (a geographic projection library) and PHP. These tools are robust and provide a flexible and powerful framework for web mapping applications. As a service to the GLIMS community, the database contains metadata on all ASTER imagery acquired over glacierized terrain. Reduced-resolution versions of the images (browse imagery) can be viewed either as a layer in the MapServer application, or overlaid on the virtual globe within Google Earth. The interactive map application allows the user to constrain by time what data appear on the map. For example, ASTER imagery or glacier outlines from 2002 only, or from Autumn in any year, can be displayed. The system allows users to download their selected glacier data in a choice of formats. The results of a query based on spatial selection (using a mouse) or text-field constraints can be downloaded in any of these formats: ESRI shapefiles, KML (Google Earth), MapInfo, GML (Geography Markup Language), and GMT (Generic Mapping Tools). This "clip-and-ship" function allows users to download only the data they are interested in. Our flexible web interfaces to the database, which include various support layers (e.g., a layer to help collaborators identify satellite imagery over their region of expertise), will facilitate enhanced analysis of glacier systems, their distribution, and their impacts on other Earth systems.
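The WMS interface described above is driven by standard GetMap requests, so any client can assemble one from the OGC-specified parameters. A sketch (the endpoint and layer name below are placeholders, not the actual GLIMS service):

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, width, height,
                   srs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    Parameter names follow the WMS specification; the base URL and
    layer name used below are illustrative only.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "SRS": srs,
        "BBOX": ",".join(str(c) for c in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/glims/wms", ["glacier_outlines"],
                     (-180, -90, 180, 90), 800, 400)
```

Any WMS-aware viewer given such a URL receives a rendered map image, which is what lets third-party sites display GLIMS layers without holding the data themselves.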

  15. Reusable Client-Side JavaScript Modules for Immersive Web-Based Real-Time Collaborative Neuroimage Visualization

    PubMed Central

    Bernal-Rusiel, Jorge L.; Rannou, Nicolas; Gollub, Randy L.; Pieper, Steve; Murphy, Shawn; Robertson, Richard; Grant, Patricia E.; Pienaar, Rudolph

    2017-01-01

    In this paper we present a web-based software solution to the problem of implementing real-time collaborative neuroimage visualization. In both clinical and research settings, simple and powerful access to imaging technologies across multiple devices is becoming increasingly useful. Prior technical solutions have used a server-side rendering and push-to-client model wherein only the server has the full image dataset. We propose a rich client solution in which each client has all the data and uses the Google Drive Realtime API for state synchronization. We have developed a small set of reusable client-side object-oriented JavaScript modules that make use of the XTK toolkit, a popular open-source JavaScript library also developed by our team, for the in-browser rendering and visualization of brain image volumes. Efficient realtime communication among the remote instances is achieved by using just a small JSON object, comprising a representation of the XTK image renderers' state, as the Google Drive Realtime collaborative data model. The developed open-source JavaScript modules have already been instantiated in a web-app called MedView, a distributed collaborative neuroimage visualization application that is delivered to the users over the web without requiring the installation of any extra software or browser plugin. This responsive application allows multiple physically distant physicians or researchers to cooperate in real time to reach a diagnosis or scientific conclusion. It also serves as a proof of concept for the capabilities of the presented technological solution. PMID:28507515
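The state-synchronization idea above can be sketched in a few lines: the collaborative data model is just a small JSON object, and a remote change is a partial update merged into the local copy. The field names here are hypothetical, not MedView's actual model, and the sketch is in Python for consistency with the other examples in this collection (the real modules are JavaScript):

```python
import json

# Hypothetical renderer state mirroring the kind of values a viewer
# like XTK tracks; the real MedView data model differs in detail.
state = {"volume": "brain.nii", "camera": [0.0, 0.0, 120.0],
         "slice_index": 42, "opacity": 1.0}

def encode_state(s):
    """Serialize shared state to the compact JSON that would be placed
    in the collaborative data model."""
    return json.dumps(s, separators=(",", ":"), sort_keys=True)

def apply_remote_update(local, update_json):
    """Merge a collaborator's partial state change into a copy of the
    local state, leaving untouched keys as they were."""
    merged = dict(local)
    merged.update(json.loads(update_json))
    return merged

# A collaborator scrolls to a different slice; only the delta is sent.
synced = apply_remote_update(state, '{"slice_index": 57}')
```

Keeping the synchronized object this small is what makes per-keystroke realtime updates cheap: each client re-renders locally from its full copy of the image data.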

  16. Profile-IQ: Web-based data query system for local health department infrastructure and activities.

    PubMed

    Shah, Gulzar H; Leep, Carolyn J; Alexander, Dayna

    2014-01-01

    To demonstrate the use of National Association of County & City Health Officials' Profile-IQ, a Web-based data query system, and how policy makers, researchers, the general public, and public health professionals can use the system to generate descriptive statistics on local health departments. This article is a descriptive account of an important health informatics tool based on information from the project charter for Profile-IQ and the authors' experience and knowledge in design and use of this query system. Profile-IQ is a Web-based data query system that is based on open-source software: MySQL 5.5, Google Web Toolkit 2.2.0, Apache Commons Math library, Google Chart API, and Tomcat 6.0 Web server deployed on an Amazon EC2 server. It supports dynamic queries of National Profile of Local Health Departments data on local health department finances, workforce, and activities. Profile-IQ's customizable queries provide a variety of statistics not available in published reports and support the growing information needs of users who do not wish to work directly with data files for lack of staff skills or time, or to avoid a data use agreement. Profile-IQ also meets the growing demand of public health practitioners and policy makers for data to support quality improvement, community health assessment, and other processes associated with voluntary public health accreditation. It represents a step forward in the recent health informatics movement of data liberation and use of open source information technology solutions to promote public health.

  17. Interfaces to PeptideAtlas: a case study of standard data access systems

    PubMed Central

    Handcock, Jeremy; Robinson, Thomas; Deutsch, Eric W.; Boyle, John

    2012-01-01

Access to public data sets is important to the scientific community as a resource to develop new experiments or validate new data. Projects such as PeptideAtlas, Ensembl, and The Cancer Genome Atlas (TCGA) offer both access to public data and a repository to share their own data. Access to these data sets is often provided through a web page form and a web service API. Access technologies based on web protocols (e.g. http) have been in use for over a decade and are widely adopted across the industry for a variety of functions (e.g. search, commercial transactions, and social media). Each architecture adapts these technologies to provide users with tools to access and share data. Both commonly used web service technologies (e.g. REST and SOAP) and custom-built solutions over HTTP are used to provide access to research data. Providing multiple access points ensures that the community can access the data in the simplest and most effective manner for their particular needs. This article examines three common access mechanisms for web accessible data: BioMart, caBIG, and Google Data Sources. These are illustrated by implementing each over the PeptideAtlas repository and reviewed for their suitability based on specific usages common to research. BioMart, Google Data Sources, and caBIG are each suitable for certain uses. The tradeoffs made in the development of the technology are dependent on the uses each was designed for (e.g. security versus speed). This means that an understanding of specific requirements and tradeoffs is necessary before selecting the access technology. PMID:22941959

  18. A mangrove forest map of China in 2015: Analysis of time series Landsat 7/8 and Sentinel-1A imagery in Google Earth Engine cloud computing platform

    NASA Astrophysics Data System (ADS)

    Chen, Bangqian; Xiao, Xiangming; Li, Xiangping; Pan, Lianghao; Doughty, Russell; Ma, Jun; Dong, Jinwei; Qin, Yuanwei; Zhao, Bin; Wu, Zhixiang; Sun, Rui; Lan, Guoyu; Xie, Guishui; Clinton, Nicholas; Giri, Chandra

    2017-09-01

Due to rapid losses of mangrove forests caused by anthropogenic disturbances and climate change, accurate and contemporary maps of mangrove forests are needed to understand how mangrove ecosystems are changing and establish plans for sustainable management. In this study, a new classification algorithm was developed using the biophysical characteristics of mangrove forests in China. More specifically, these forests were mapped by identifying: (1) greenness, canopy coverage, and tidal inundation from time series Landsat data, and (2) elevation, slope, and an intersection-with-sea criterion. The annual mean Normalized Difference Vegetation Index (NDVI) was found to be a key variable in determining the classification thresholds of greenness, canopy coverage, and tidal inundation of mangrove forests, which are greatly affected by tide dynamics. In addition, the integration of the Sentinel-1A VH band and the modified Normalized Difference Water Index (mNDWI) shows great potential in identifying yearlong tidal and fresh water bodies, which are related to mangrove forests. This algorithm was developed using six typical Regions of Interest (ROIs) for algorithm training and was run on the Google Earth Engine (GEE) cloud computing platform to process 1941 Landsat images (25 Path/Row) and 586 Sentinel-1A images circa 2015. The resultant mangrove forest map of China at 30 m spatial resolution has overall, user's, and producer's accuracies greater than 95% when validated with ground reference data. In 2015, China's mangrove forests had a total area of 20,303 ha, about 92% of which was in the Guangxi Zhuang Autonomous Region, Guangdong, and Hainan Provinces. This study has demonstrated the potential of using the GEE platform and time series Landsat and Sentinel-1A SAR images to identify and map mangrove forests along coastal zones. The resultant mangrove forest maps are likely to be useful for the sustainable management and ecological assessments of mangrove forests in China.
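The greenness and water criteria above rest on two standard band indices. A toy per-pixel version (the thresholds here are illustrative only; the study derives its actual thresholds from annual Landsat time series and tide dynamics):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def mndwi(green, swir):
    """Modified Normalized Difference Water Index."""
    return (green - swir) / (green + swir)

def looks_like_mangrove(pixel, ndvi_min=0.3, mndwi_max=0.0):
    """Toy per-pixel screen: green vegetation that is not open water.

    `pixel` maps band names to surface reflectance. Elevation, slope,
    and intersection-with-sea checks from the full algorithm are
    omitted; thresholds are invented for illustration.
    """
    return (ndvi(pixel["nir"], pixel["red"]) > ndvi_min
            and mndwi(pixel["green"], pixel["swir"]) < mndwi_max)
```

On a platform like GEE, the same per-pixel logic is expressed as image-algebra operations mapped over the full Landsat stack rather than a Python loop.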

  19. Mapping paddy rice planting area in northeastern Asia with Landsat 8 images, phenology-based algorithm and Google Earth Engine

    PubMed Central

    Dong, Jinwei; Xiao, Xiangming; Menarguez, Michael A.; Zhang, Geli; Qin, Yuanwei; Thau, David; Biradar, Chandrashekhar; Moore, Berrien

    2016-01-01

Area and spatial distribution information of paddy rice are important for understanding food security, water use, greenhouse gas emissions, and disease transmission. Due to climatic warming and increasing food demand, paddy rice has been expanding rapidly in high latitude areas in the last decade, particularly in northeastern (NE) Asia. Current knowledge about paddy rice fields in these cold regions is limited. The phenology- and pixel-based paddy rice mapping (PPPM) algorithm, which identifies the flooding signals in the rice transplanting phase, has been effectively applied in tropical areas, but has not yet been tested at large scales in cold regions. Despite the effects of more snow/ice, paddy rice mapping in high latitude areas is assumed to be more encouraging due to fewer clouds, lower cropping intensity, and more observations from Landsat sidelaps. Moreover, the enhanced temporal and geographic coverage from Landsat 8 provides an opportunity to acquire phenology information and map paddy rice. This study evaluated the potential of Landsat 8 images for annual paddy rice mapping in NE Asia, which is dominated by a single cropping system, including Japan, North Korea, South Korea, and NE China. A cloud computing approach was used to process all the available Landsat 8 imagery in 2014 (143 path/rows, ~3290 scenes) with the Google Earth Engine (GEE) platform. The results indicated that Landsat 8, GEE, and the improved PPPM algorithm can effectively support the yearly mapping of paddy rice in NE Asia. The resultant paddy rice map has a high accuracy, with producer (user) accuracy of 73% (92%), based on validation using very high resolution images and intensive field photos. Geographic characteristics of paddy rice distribution were analyzed by country, elevation, latitude, and climate. The resultant 30-m paddy rice map is expected to provide unprecedented details about the area, spatial distribution, and landscape pattern of paddy rice fields in NE Asia, which will contribute to food security assessment, water resource management, estimation of greenhouse gas emissions, and disease control. PMID:28025586
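The PPPM flooding criterion compares a water index against a vegetation index at transplanting time, when a flooded paddy's water signal rises toward its (still sparse) vegetation signal. A simplified per-pixel sketch, with band values as surface reflectances (the threshold form is characteristic of this algorithm family; the exact published parameters and temporal compositing are omitted):

```python
def lswi(nir, swir):
    """Land Surface Water Index."""
    return (nir - swir) / (nir + swir)

def evi(nir, red, blue):
    """Enhanced Vegetation Index."""
    return 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1)

def flooding_signal(nir, red, blue, swir, t=0.05):
    """Flag a flooding/transplanting observation: the water index rises
    to within `t` of the vegetation index (a criterion of the form
    LSWI + T >= EVI). `t` here is illustrative."""
    return lswi(nir, swir) + t >= evi(nir, red, blue)
```

In the full algorithm this test is evaluated across the time series so that pixels flooded specifically during the transplanting window, rather than permanent water bodies, are labeled as paddy rice.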

  20. Mapping paddy rice planting area in northeastern Asia with Landsat 8 images, phenology-based algorithm and Google Earth Engine.

    PubMed

    Dong, Jinwei; Xiao, Xiangming; Menarguez, Michael A; Zhang, Geli; Qin, Yuanwei; Thau, David; Biradar, Chandrashekhar; Moore, Berrien

    2016-11-01

Area and spatial distribution information of paddy rice are important for understanding food security, water use, greenhouse gas emissions, and disease transmission. Due to climatic warming and increasing food demand, paddy rice has been expanding rapidly in high latitude areas in the last decade, particularly in northeastern (NE) Asia. Current knowledge about paddy rice fields in these cold regions is limited. The phenology- and pixel-based paddy rice mapping (PPPM) algorithm, which identifies the flooding signals in the rice transplanting phase, has been effectively applied in tropical areas, but has not yet been tested at large scales in cold regions. Despite the effects of more snow/ice, paddy rice mapping in high latitude areas is assumed to be more encouraging due to fewer clouds, lower cropping intensity, and more observations from Landsat sidelaps. Moreover, the enhanced temporal and geographic coverage from Landsat 8 provides an opportunity to acquire phenology information and map paddy rice. This study evaluated the potential of Landsat 8 images for annual paddy rice mapping in NE Asia, which is dominated by a single cropping system, including Japan, North Korea, South Korea, and NE China. A cloud computing approach was used to process all the available Landsat 8 imagery in 2014 (143 path/rows, ~3290 scenes) with the Google Earth Engine (GEE) platform. The results indicated that Landsat 8, GEE, and the improved PPPM algorithm can effectively support the yearly mapping of paddy rice in NE Asia. The resultant paddy rice map has a high accuracy, with producer (user) accuracy of 73% (92%), based on validation using very high resolution images and intensive field photos. Geographic characteristics of paddy rice distribution were analyzed by country, elevation, latitude, and climate. The resultant 30-m paddy rice map is expected to provide unprecedented details about the area, spatial distribution, and landscape pattern of paddy rice fields in NE Asia, which will contribute to food security assessment, water resource management, estimation of greenhouse gas emissions, and disease control.

  1. A Framework for Building and Reasoning with Adaptive and Interoperable PMESII Models

    DTIC Science & Technology

    2007-11-01

    Description Logic SOA Service Oriented Architecture SPARQL Simple Protocol And RDF Query Language SQL Standard Query Language SROM Stability and...another by providing a more expressive ontological structure for one of the models, e.g., semantic networks can be mapped to first- order logical...Pellet is an open-source reasoner that works with OWL-DL. It accepts the SPARQL protocol and RDF query language ( SPARQL ) and provides a Java API to

  2. Web application and database modeling of traffic impact analysis using Google Maps

    NASA Astrophysics Data System (ADS)

    Yulianto, Budi; Setiono

    2017-06-01

Traffic impact analysis (TIA) is a traffic study that aims at identifying the impact of traffic generated by development or change in land use. In addition to identifying the traffic impact, TIA also includes mitigation measures to minimize the arising traffic impact. TIA has become increasingly important since it was defined in the act as one of the requirements in the proposal of a Building Permit. The act has encouraged a number of TIA studies in various cities in Indonesia, including Surakarta. For that reason, it is necessary to study the development of TIA by adopting the concept of Transportation Impact Control (TIC) in the implementation of the TIA standard document and multimodal modeling. This includes TIA standardization for technical guidelines, databases, and inspection by providing TIA checklists, monitoring, and evaluation. The research was undertaken by collecting historical data on junctions, modeling the data as a relational database, and building a user interface for CRUD (Create, Read, Update and Delete) operations on the TIA data as a web application with the Google Maps libraries. The result of this research is a system that provides information supporting the improvement and revision of existing TIA documents, making them more transparent, reliable, and credible.
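The junction database behind such a web application can be sketched with a relational store whose rows feed map markers. The schema and sample values below are hypothetical, not the study's actual design:

```python
import sqlite3

# Minimal sketch of the CRUD layer; table layout is illustrative.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE junction (
    id INTEGER PRIMARY KEY, name TEXT, lat REAL, lng REAL,
    peak_flow_pcu INTEGER)""")

# Create a junction record
db.execute("INSERT INTO junction (name, lat, lng, peak_flow_pcu) "
           "VALUES (?, ?, ?, ?)",
           ("Pasar Kembang", -7.566, 110.817, 2400))
# Read -- rows like these would be pushed to Google Maps markers
rows = db.execute("SELECT name, lat, lng FROM junction").fetchall()
# Update after a new traffic count
db.execute("UPDATE junction SET peak_flow_pcu = ? WHERE name = ?",
           (2650, "Pasar Kembang"))
# Delete invalid entries
db.execute("DELETE FROM junction WHERE peak_flow_pcu < 0")
flow = db.execute("SELECT peak_flow_pcu FROM junction").fetchone()[0]
```

On the web side, each selected row's lat/lng pair becomes a marker placed through the Google Maps JavaScript library.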

  3. KML Super Overlay to WMS Translator

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2007-01-01

    This translator is a server-based application that automatically generates KML super overlay configuration files required by Google Earth for map data access via the Open Geospatial Consortium WMS (Web Map Service) standard. The translator uses a set of URL parameters that mirror the WMS parameters as much as possible, and it also can generate a super overlay subdivision of any given area that is only loaded when needed, enabling very large areas of coverage at very high resolutions. It can make almost any dataset available as a WMS service visible and usable in any KML application, without the need to reformat the data.
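The core of such a translator is emitting KML whose overlay images point back at WMS GetMap URLs. A minimal, non-tiled sketch (the real translator additionally emits Region/Lod tags so sub-tiles load only when in view; the endpoint below is a placeholder):

```python
from xml.sax.saxutils import escape

def kml_wms_overlay(name, wms_url, north, south, east, west):
    """Emit a minimal KML GroundOverlay whose image is fetched from a
    WMS GetMap URL -- the basic building block a super-overlay
    translator tiles recursively."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>{escape(name)}</name>
    <Icon><href>{escape(wms_url)}</href></Icon>
    <LatLonBox>
      <north>{north}</north><south>{south}</south>
      <east>{east}</east><west>{west}</west>
    </LatLonBox>
  </GroundOverlay>
</kml>"""

doc = kml_wms_overlay("glacier tile",
                      "https://example.org/wms?SERVICE=WMS&REQUEST=GetMap",
                      45.0, 44.0, -109.0, -110.0)
```

Because the image URL is just a WMS request, the same dataset stays usable both in ordinary WMS clients and in Google Earth without reformatting.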

  4. TU-D-201-06: HDR Plan Prechecks Using Eclipse Scripting API

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palaniswaamy, G; Morrow, A; Kim, S

    Purpose: Automate brachytherapy treatment plan quality check using Eclipse v13.6 scripting API based on pre-configured rules to minimize human error and maximize efficiency. Methods: The HDR Precheck system is developed based on a rules-driven approach using Eclipse scripting API. This system checks for critical plan parameters like channel length, first source position, source step size and channel mapping. The planned treatment time is verified independently based on analytical methods. For interstitial or SAVI APBI treatment plans, a Patterson-Parker system calculation is performed to verify the planned treatment time. For endobronchial treatments, an analytical formula from TG-59 is used. Acceptable tolerancesmore » were defined based on clinical experiences in our department. The system was designed to show PASS/FAIL status levels. Additional information, if necessary, is indicated appropriately in a separate comments field in the user interface. Results: The HDR Precheck system has been developed and tested to verify the treatment plan parameters that are routinely checked by the clinical physicist. The report also serves as a reminder or checklist for the planner to perform any additional critical checks such as applicator digitization or scenarios where the channel mapping was intentionally changed. It is expected to reduce the current manual plan check time from 15 minutes to <1 minute. Conclusion: Automating brachytherapy plan prechecks significantly reduces treatment plan precheck time and reduces human errors. When fully developed, this system will be able to perform TG-43 based second check of the treatment planning system’s dose calculation using random points in the target and critical structures. A histogram will be generated along with tabulated mean and standard deviation values for each structure. 
A knowledge database will also be developed for Brachyvision plans which will then be used for knowledge-based plan quality checks to further reduce treatment planning errors and increase confidence in the planned treatment.« less
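A rules-driven precheck of this kind reduces to mapping plan parameters through configured predicates and reporting PASS/FAIL per rule. An illustrative sketch only: the parameter names, tolerances, and values are invented, not clinical settings, and this is not the Eclipse scripting API itself:

```python
# Hypothetical pre-configured rules; each maps a plan parameter value
# to True (within tolerance) or False. All numbers are illustrative.
RULES = {
    "channel_length_mm": lambda v: abs(v - 1300.0) < 1.0,
    "first_source_position_mm": lambda v: 0.0 < v <= 1300.0,
    "step_size_mm": lambda v: v in (2.5, 5.0),
}

def precheck(plan):
    """Return a {parameter: 'PASS'/'FAIL'} report for each rule."""
    return {name: ("PASS" if rule(plan[name]) else "FAIL")
            for name, rule in RULES.items()}

report = precheck({"channel_length_mm": 1300.0,
                   "first_source_position_mm": 1295.0,
                   "step_size_mm": 3.0})
```

Keeping the rules in a declarative table, as here, is what lets tolerances be tuned per department without touching the checking code.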

  5. The Lunar Mapping and Modeling Project

    NASA Technical Reports Server (NTRS)

    Nall, M.; French, R.; Noble, S.; Muery, K.

    2010-01-01

The Lunar Mapping and Modeling Project (LMMP) is managing a suite of lunar mapping and modeling tools and data products that support lunar exploration activities, including the planning, design, development, test, and operations associated with crewed and/or robotic operations on the lunar surface. Although the project was initiated primarily to serve the needs of the Constellation program, it is equally suited for supporting landing site selection and planning for a variety of robotic missions, including NASA science and/or human precursor missions and commercial missions such as those planned by the Google Lunar X-Prize participants. In addition, LMMP should prove to be a convenient and useful tool for scientific analysis and for education and public outreach (E/PO) activities.

  6. Assessing Coupled Social Ecological Flood Vulnerability from Uttarakhand, India, to the State of New York with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Tellman, B.; Schwarz, B.

    2014-12-01

This talk describes the development of a web application to predict and communicate vulnerability to floods given publicly available data, disaster science, and geotech cloud capabilities. The proof of concept in the Google Earth Engine API, with initial testing on case studies in New York and Uttarakhand, India, demonstrates the potential of highly parallelized cloud computing to model socio-ecological disaster vulnerability at high spatial and temporal resolution and in near real time. Cloud computing facilitates statistical modeling with variables derived from large public social and ecological data sets, including census data, nighttime lights (NTL), and WorldPop to derive social parameters, together with elevation, satellite imagery, rainfall, and observed flood data from the Dartmouth Flood Observatory to derive biophysical parameters. While more traditional, physically based hydrological models that rely on flow algorithms and numerical methods are currently unavailable in parallelized computing platforms like Google Earth Engine, there is high potential to explore "data driven" modeling that trades physics for statistics in a parallelized environment. A data-driven approach to flood modeling with geographically weighted logistic regression has been initially tested on Hurricane Irene in southeastern New York. Comparison of model results with observed flood data reveals 97% accuracy in predicting flooded pixels. Testing on multiple storms is required to further validate this initially promising approach. A statistical social-ecological flood model that could produce rapid vulnerability assessments, predicting who might require immediate evacuation and where, could serve as an early warning. This type of early warning system would be especially relevant in data-poor places lacking the computing power, high resolution data such as LiDAR and stream gauges, or hydrologic expertise to run physically based models in real time. As the data-driven model presented relies on globally available data, the only real-time data input required would be typical data from a weather service, e.g. precipitation or coarse resolution flood prediction. However, model uncertainty will vary locally depending upon the resolution and frequency of observed flood and socio-economic damage impact data.
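At each pixel, the data-driven model above is a logistic regression over social and biophysical covariates. A sketch with invented feature names and coefficients (the study fits geographically weighted coefficients to observed flood extents; nothing here reproduces its fitted model):

```python
import math

def flood_probability(features, weights, intercept):
    """Logistic-regression flood probability for one pixel.

    `features` and `weights` are dicts keyed by covariate name.
    The covariates and coefficient values used below are purely
    illustrative.
    """
    z = intercept + sum(weights[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Low-lying pixel near a river after heavy rain (all values invented)
p = flood_probability(
    {"elevation_m": 4.0, "rain_mm_24h": 120.0, "dist_to_river_m": 50.0},
    {"elevation_m": -0.08, "rain_mm_24h": 0.02, "dist_to_river_m": -0.004},
    intercept=-0.5)
```

Because the prediction for each pixel is independent, this form parallelizes trivially, which is exactly what makes it suited to a platform like Earth Engine where flow-routing algorithms are not available.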

  7. Using Webgis and Cloud Tools to Promote Cultural Heritage Dissemination: the Historic up Project

    NASA Astrophysics Data System (ADS)

    Tommasi, A.; Cefalo, R.; Zardini, F.; Nicolaucig, M.

    2017-05-01

    On the occasion of the First World War centennial, GeoSNav Lab (Geodesy and Satellite Navigation Laboratory), Department of Engineering and Architecture, University of Trieste, Italy, in coooperation with Radici&Futuro Association, Trieste, Italy, carried out an educational Project named "Historic Up" involving a group of students from "F. Petrarca" High School of Trieste, Italy. The main goal of the project is to make available to students of Middle and High Schools a set of historical and cultural contents in a simple and immediate way, through the production of a virtual and interactive tour following the event that caused the burst of the First World War: the assassination of Franz Ferdinand and his wife Sofia in Sarajevo occurred on June 28, 1914. A set of Google Apps was used, including Google Earth, Maps, Tour Builder, Street View, Gmail, Drive, and Docs. The Authors instructed the students about software and team-working and supported them along the research. After being checked, all the historical and geographic data have been uploaded on a Google Tour Builder to create a sequence of historical checkpoints. Each checkpoint has texts, pictures and videos that connect the tour-users to 1914. Moreover, GeoSNaV Lab researchers produced a KML (Keyhole Markup Language) file, formed by several polylines and points, representing the itinerary of the funeral procession that has been superimposed on ad-hoc georeferenced historical maps. This tour, freely available online, starts with the arrival of the royals, on June 28th 1914, and follows the couple along the events, from the assassination to the burial in Arstetten (Austria), including their passages through Trieste (Italy), Ljubljana (Slovenia), Graz and Wien (Austria).

  8. TumorMap: Exploring the Molecular Similarities of Cancer Samples in an Interactive Portal.

    PubMed

    Newton, Yulia; Novak, Adam M; Swatloski, Teresa; McColl, Duncan C; Chopra, Sahil; Graim, Kiley; Weinstein, Alana S; Baertsch, Robert; Salama, Sofie R; Ellrott, Kyle; Chopra, Manu; Goldstein, Theodore C; Haussler, David; Morozova, Olena; Stuart, Joshua M

    2017-11-01

Vast amounts of molecular data are being collected on tumor samples, which provide unique opportunities for discovering trends within and between cancer subtypes. Such cross-cancer analyses require computational methods that enable intuitive and interactive browsing of thousands of samples based on their molecular similarity. We created a portal called TumorMap to assist in exploration and statistical interrogation of high-dimensional complex "omics" data in an interactive and easily interpretable way. In the TumorMap, samples are arranged on a hexagonal grid based on their similarity to one another in the original genomic space and are rendered with Google's Map technology. While the important feature of this public portal is the ability for the users to build maps from their own data, we pre-built genomic maps from several previously published projects. We demonstrate the utility of this portal by presenting results obtained from The Cancer Genome Atlas project data. Cancer Res; 77(21); e111-4. ©2017 American Association for Cancer Research.

  9. Urban topography for flood modeling by fusion of OpenStreetMap, SRTM and local knowledge

    NASA Astrophysics Data System (ADS)

    Winsemius, Hessel; Donchyts, Gennadii; Eilander, Dirk; Chen, Jorik; Leskens, Anne; Coughlan, Erin; Mawanda, Shaban; Ward, Philip; Diaz Loaiza, Andres; Luo, Tianyi; Iceland, Charles

    2016-04-01

Topography data is essential for understanding and modeling urban flood hazard. Within urban areas, much of the topography is defined by highly localized man-made features such as roads, channels, ditches, culverts, and buildings. As a result, urban flood models require high resolution topography, and water-conveying connections within the topography must be considered. In recent years, more and more topography information has been collected through LIDAR surveys; however, there are still many cities in the world where high resolution topography data is not available. Furthermore, information on connectivity is required for flood modelling, even when LIDAR data are used. In this contribution, we demonstrate how high resolution terrain data can be synthesized through a fusion between features in OpenStreetMap (OSM) data (including roads, culverts, channels, and buildings) and existing low resolution and noisy SRTM elevation data using the Google Earth Engine platform. Our method uses typical existing OSM properties to estimate heights and topology associated with the features, and uses these to correct noise and burn features on top of the existing low resolution SRTM elevation data. The method has been set up in the Google Earth Engine platform so that local stakeholders and mapping teams can on-the-fly propose, include, and visualize the effect of additional features and properties of features which are deemed important for topography and water conveyance. These features can be included in a workshop environment. We pilot our tool over Dar es Salaam.
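The feature-burning step amounts to raster algebra: apply feature-derived height offsets to the DEM cells each feature covers. A toy grid version (Earth Engine performs the equivalent operation on full-resolution imagery; the cell lists and offsets below are illustrative):

```python
def burn_features(dem, features):
    """Burn OSM-derived feature heights into a DEM grid.

    `dem` is a list-of-lists of elevations in metres; each feature is
    (cells, offset), where `cells` lists the (row, col) cells the
    feature covers and `offset` is its height adjustment -- negative
    for drains and culverts, positive for buildings. The input DEM is
    left unmodified.
    """
    out = [row[:] for row in dem]
    for cells, offset in features:
        for r, c in cells:
            out[r][c] += offset
    return out

dem = [[10.0, 10.0],
       [10.0, 10.0]]
fused = burn_features(dem, [([(0, 0)], -1.5),   # drainage ditch
                            ([(1, 1)], +3.0)])  # building footprint
```

In the actual workflow the offsets come from OSM tags (e.g. feature type and estimated height), and the same mechanism also corrects SRTM noise before burning.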

  10. Machine Learning for Flood Prediction in Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Kuhn, C.; Tellman, B.; Max, S. A.; Schwarz, B.

    2015-12-01

    With the increasing availability of high-resolution satellite imagery, dynamic flood mapping in near real time is becoming a reachable goal for decision-makers. This talk describes a newly developed framework for predicting biophysical flood vulnerability using public data, cloud computing and machine learning. Our objective is to define an approach to flood inundation modeling using statistical learning methods deployed in a cloud-based computing platform. Traditionally, static flood extent maps grounded in physically based hydrologic models can require hours of human expertise to construct, at significant financial cost. In addition, desktop modeling software and limited local server storage can impose constraints on the size and resolution of input datasets. Data-driven, cloud-based processing holds promise for predictive watershed modeling at a wide range of spatio-temporal scales. However, these benefits come with constraints. In particular, parallel computing limits a modeler's ability to simulate the flow of water across a landscape, rendering traditional routing algorithms unusable on this platform. Our project pushes these limits by testing the performance of two machine learning algorithms, Support Vector Machine (SVM) and Random Forests, at predicting flood extent. Constructed in Google Earth Engine, the model mines a suite of publicly available satellite imagery layers to use as algorithm inputs. Results are cross-validated against MODIS-based flood maps created with the Dartmouth Flood Observatory detection algorithm. Model uncertainty highlights the difficulty of deploying unbalanced training datasets based on rare extreme events.
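    The evaluation strategy — training a classifier on labeled pixels and cross-validating against independent flood maps — can be illustrated with a k-fold loop. The sketch below substitutes a trivial one-feature threshold rule for the SVM/Random Forest models and uses synthetic data; it shows only the cross-validation mechanics, not the authors' pipeline:

```python
import random

def threshold_classifier(train):
    """Fit a single threshold on one feature: predict 'flooded' (1) above the
    midpoint between class means. A stand-in for the SVM / Random Forest step."""
    wet = [x for x, y in train if y == 1]
    dry = [x for x, y in train if y == 0]
    t = (sum(wet) / len(wet) + sum(dry) / len(dry)) / 2
    return lambda x: 1 if x > t else 0

def cross_validate(data, k=5, seed=0):
    """k-fold cross-validation: train on k-1 folds, score on the held-out fold."""
    rng = random.Random(seed)
    data = data[:]
    rng.shuffle(data)
    folds = [data[i::k] for i in range(k)]
    accs = []
    for i in range(k):
        test = folds[i]
        train = [s for j, f in enumerate(folds) if j != i for s in f]
        predict = threshold_classifier(train)
        accs.append(sum(predict(x) == y for x, y in test) / len(test))
    return sum(accs) / k

# synthetic pixels: (feature value, label); flooded pixels have higher values
data = [(random.Random(i).gauss(0, 1), 0) for i in range(60)] + \
       [(random.Random(i).gauss(4, 1), 1) for i in range(60)]
acc = cross_validate(data)
```

    In the real framework the features would be Earth Engine imagery bands and the held-out labels would come from the MODIS-derived flood maps rather than a random split.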

  11. Reliable, Memory Speed Storage for Cluster Computing Frameworks

    DTIC Science & Technology

    2014-06-16

    specification API that can capture computations in many of today’s popular data-parallel computing models, e.g., MapReduce and SQL. We also ported the Hadoop ...today’s big data workloads: • Immutable data: Data is immutable once written, since dominant underlying storage systems, such as HDFS [3], only support...network transfers, so reads can be data-local. • Program size vs. data size: In big data processing, the same operation is repeatedly applied on massive

  12. iAnn: an event sharing platform for the life sciences.

    PubMed

    Jimenez, Rafael C; Albar, Juan P; Bhak, Jong; Blatter, Marie-Claude; Blicher, Thomas; Brazas, Michelle D; Brooksbank, Cath; Budd, Aidan; De Las Rivas, Javier; Dreyer, Jacqueline; van Driel, Marc A; Dunn, Michael J; Fernandes, Pedro L; van Gelder, Celia W G; Hermjakob, Henning; Ioannidis, Vassilios; Judge, David P; Kahlem, Pascal; Korpelainen, Eija; Kraus, Hans-Joachim; Loveland, Jane; Mayer, Christine; McDowall, Jennifer; Moran, Federico; Mulder, Nicola; Nyronen, Tommi; Rother, Kristian; Salazar, Gustavo A; Schneider, Reinhard; Via, Allegra; Villaveces, Jose M; Yu, Ping; Schneider, Maria V; Attwood, Teresa K; Corpas, Manuel

    2013-08-01

    We present iAnn, an open source community-driven platform for dissemination of life science events, such as courses, conferences and workshops. iAnn allows automatic visualisation and integration of customised event reports. A central repository lies at the core of the platform: curators add submitted events, and these are subsequently accessed via web services. Thus, once an iAnn widget is incorporated into a website, it permanently shows timely relevant information as if it were native to the remote site. At the same time, announcements submitted to the repository are automatically disseminated to all portals that query the system. To facilitate the visualization of announcements, iAnn provides powerful filtering options and views, integrated in Google Maps and Google Calendar. All iAnn widgets are freely available. Availability: http://iann.pro/iannviewer. Contact: manuel.corpas@tgac.ac.uk.
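    The widget behaviour described above — querying a central repository and filtering announcements for display — might look roughly like this on the client side. The record fields and values below are assumptions for illustration; the actual iAnn web-service schema may differ:

```python
import json
from datetime import date

# assumed record shape; not the documented iAnn schema
events_json = """[
  {"title": "Intro to Proteomics", "category": "course",
   "start": "2013-09-10", "city": "Hinxton"},
  {"title": "ISMB Workshop", "category": "workshop",
   "start": "2013-07-21", "city": "Berlin"}
]"""

def upcoming(events, today, category=None):
    """Filter events the way a widget might: keep future events, optionally
    restricted to one category, sorted by start date."""
    out = [e for e in json.loads(events)
           if date.fromisoformat(e["start"]) >= today
           and (category is None or e["category"] == category)]
    return sorted(out, key=lambda e: e["start"])

hits = upcoming(events_json, date(2013, 8, 1), category="course")
```

    A deployed widget would fetch this JSON from the repository's web service instead of an inline string, then render the filtered list natively in the host page.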

  13. A data colocation grid framework for big data medical image processing: backend design

    NASA Astrophysics Data System (ADS)

    Bao, Shunxing; Huo, Yuankai; Parvathaneni, Prasanna; Plassard, Andrew J.; Bermudez, Camilo; Yao, Yuang; Lyu, Ilwoo; Gokhale, Aniruddha; Landman, Bennett A.

    2018-03-01

    When processing large medical imaging studies, adopting high performance grid computing resources rapidly becomes important. We recently presented a "medical image processing-as-a-service" grid framework that offers promise in utilizing the Apache Hadoop ecosystem and HBase for data colocation by moving computation close to medical image storage. However, the framework has not yet proven to be easy to use in a heterogeneous hardware environment. Furthermore, the system has not yet been validated for the variety of multi-level analyses used in medical imaging. Our target design criteria are (1) improving the framework's performance in a heterogeneous cluster, (2) performing population based summary statistics on large datasets, and (3) introducing a table design scheme for rapid NoSQL query. In this paper, we present a heuristic backend interface application program interface (API) design for Hadoop and HBase for Medical Image Processing (HadoopBase-MIP). The API includes: Upload, Retrieve, Remove, Load balancer (for heterogeneous cluster) and MapReduce templates. A dataset summary statistic model is discussed and implemented with the MapReduce paradigm. We introduce an HBase table scheme for fast data query to better utilize the MapReduce model. Briefly, 5153 T1 images were retrieved from a university secure, shared web database and used to empirically assess an in-house grid with 224 heterogeneous CPU cores. Results from three empirical experiments are presented and discussed: (1) a load balancer wall-time improvement of 1.5-fold compared with a framework with a built-in data allocation strategy, (2) the summary statistic model is empirically verified on the grid framework and compared with the same cluster deployed with a standard Sun Grid Engine (SGE), yielding an 8-fold reduction in wall-clock time and a 14-fold reduction in resource time, and (3) the proposed HBase table scheme improves MapReduce computation with a 7-fold reduction in wall time compared with a naïve scheme when datasets are relatively small. The source code and interfaces have been made publicly available.
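    The MapReduce summary-statistic model lends itself to a compact sketch: each map task emits partial statistics per image, and the reduce step merges them into dataset-level values. The code below is a minimal pure-Python stand-in for such a template (not the HadoopBase-MIP code), assuming mean and variance as the summary statistics:

```python
def map_stats(image_values):
    """Map step: per-image partial statistics (count, sum, sum of squares).
    These partials can be merged in any order, which is what makes the
    computation MapReduce-friendly."""
    n = len(image_values)
    s = sum(image_values)
    ss = sum(v * v for v in image_values)
    return (n, s, ss)

def reduce_stats(partials):
    """Reduce step: merge partials into a dataset-wide mean and variance."""
    n = sum(p[0] for p in partials)
    s = sum(p[1] for p in partials)
    ss = sum(p[2] for p in partials)
    mean = s / n
    var = ss / n - mean * mean  # population variance
    return mean, var

# toy "images" as flat lists of voxel intensities
images = [[1.0, 2.0, 3.0], [4.0, 5.0], [6.0]]
mean, var = reduce_stats([map_stats(img) for img in images])
```

    On the grid, `map_stats` would run where each image is stored (the data-colocation idea) and only the small partial tuples would travel to the reducer.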

  14. A Data Colocation Grid Framework for Big Data Medical Image Processing: Backend Design.

    PubMed

    Bao, Shunxing; Huo, Yuankai; Parvathaneni, Prasanna; Plassard, Andrew J; Bermudez, Camilo; Yao, Yuang; Lyu, Ilwoo; Gokhale, Aniruddha; Landman, Bennett A

    2018-03-01

    When processing large medical imaging studies, adopting high performance grid computing resources rapidly becomes important. We recently presented a "medical image processing-as-a-service" grid framework that offers promise in utilizing the Apache Hadoop ecosystem and HBase for data colocation by moving computation close to medical image storage. However, the framework has not yet proven to be easy to use in a heterogeneous hardware environment. Furthermore, the system has not yet been validated for the variety of multi-level analyses used in medical imaging. Our target design criteria are (1) improving the framework's performance in a heterogeneous cluster, (2) performing population based summary statistics on large datasets, and (3) introducing a table design scheme for rapid NoSQL query. In this paper, we present a heuristic backend interface application program interface (API) design for Hadoop & HBase for Medical Image Processing (HadoopBase-MIP). The API includes: Upload, Retrieve, Remove, Load balancer (for heterogeneous cluster) and MapReduce templates. A dataset summary statistic model is discussed and implemented with the MapReduce paradigm. We introduce an HBase table scheme for fast data query to better utilize the MapReduce model. Briefly, 5153 T1 images were retrieved from a university secure, shared web database and used to empirically assess an in-house grid with 224 heterogeneous CPU cores. Results from three empirical experiments are presented and discussed: (1) a load balancer wall-time improvement of 1.5-fold compared with a framework with a built-in data allocation strategy, (2) the summary statistic model is empirically verified on the grid framework and compared with the same cluster deployed with a standard Sun Grid Engine (SGE), yielding an 8-fold reduction in wall-clock time and a 14-fold reduction in resource time, and (3) the proposed HBase table scheme improves MapReduce computation with a 7-fold reduction in wall time compared with a naïve scheme when datasets are relatively small. The source code and interfaces have been made publicly available.

  15. A Data Colocation Grid Framework for Big Data Medical Image Processing: Backend Design

    PubMed Central

    Huo, Yuankai; Parvathaneni, Prasanna; Plassard, Andrew J.; Bermudez, Camilo; Yao, Yuang; Lyu, Ilwoo; Gokhale, Aniruddha; Landman, Bennett A.

    2018-01-01

    When processing large medical imaging studies, adopting high performance grid computing resources rapidly becomes important. We recently presented a "medical image processing-as-a-service" grid framework that offers promise in utilizing the Apache Hadoop ecosystem and HBase for data colocation by moving computation close to medical image storage. However, the framework has not yet proven to be easy to use in a heterogeneous hardware environment. Furthermore, the system has not yet been validated for the variety of multi-level analyses used in medical imaging. Our target design criteria are (1) improving the framework’s performance in a heterogeneous cluster, (2) performing population based summary statistics on large datasets, and (3) introducing a table design scheme for rapid NoSQL query. In this paper, we present a heuristic backend interface application program interface (API) design for Hadoop & HBase for Medical Image Processing (HadoopBase-MIP). The API includes: Upload, Retrieve, Remove, Load balancer (for heterogeneous cluster) and MapReduce templates. A dataset summary statistic model is discussed and implemented with the MapReduce paradigm. We introduce an HBase table scheme for fast data query to better utilize the MapReduce model. Briefly, 5153 T1 images were retrieved from a university secure, shared web database and used to empirically assess an in-house grid with 224 heterogeneous CPU cores. Results from three empirical experiments are presented and discussed: (1) a load balancer wall-time improvement of 1.5-fold compared with a framework with a built-in data allocation strategy, (2) the summary statistic model is empirically verified on the grid framework and compared with the same cluster deployed with a standard Sun Grid Engine (SGE), yielding an 8-fold reduction in wall-clock time and a 14-fold reduction in resource time, and (3) the proposed HBase table scheme improves MapReduce computation with a 7-fold reduction in wall time compared with a naïve scheme when datasets are relatively small. The source code and interfaces have been made publicly available. PMID:29887668

  16. Alabama Public Scoping Meeting | NOAA Gulf Spill Restoration

    Science.gov Websites

    Location: Mobile, AL. Start Time: 6:30 p.m. Central Time. Description: As part of the public scoping process, doors will open at 6:30 p.m. and the meeting will begin at 7:30 p.m. Location: The Battle House Renaissance Mobile Hotel & Spa, 26 North Royal Street, Mobile, AL 36602 (Google map of location).

  17. Tactical Level Commander and Staff Toolkit

    DTIC Science & Technology

    2010-01-01

    Sites Geodata.gov (for maps) http://gos2.geodata.gov Google Earth for .mil (United States Army Corps of Engineers (USACE) site) https...the eyes, ears, head, hands, back, and feet. When appropriate, personnel should wear protective lenses, goggles, or face shields. Leaders should...Typical hurricanes are about 300 miles wide, although they can vary considerably. Size is not necessarily an indication of hurricane intensity. The

  18. biobambam: tools for read pair collation based algorithms on BAM files

    PubMed Central

    2014-01-01

    Background Sequence alignment data is often ordered by coordinate (id of the reference sequence plus position on the sequence where the fragment was mapped) when stored in BAM files, as this simplifies the extraction of variants between the mapped data and the reference, or of variants within the mapped data. In this order, paired reads are usually separated in the file, which complicates other applications, such as duplicate marking or conversion to the FastQ format, that require access to the full information of the pairs. Results In this paper we introduce biobambam, a set of tools based on the efficient collation of alignments in BAM files by read name. The employed collation algorithm avoids time- and space-consuming sorting of alignments by read name where this is possible without using more than a specified amount of main memory. Using this algorithm, tasks like duplicate marking in BAM files and conversion of BAM files to the FastQ format can be performed very efficiently with limited resources. We also make the collation algorithm available in the form of an API for other projects. This API is part of the libmaus package. Conclusions In comparison with previous approaches to problems involving the collation of alignments by read name, such as BAM to FastQ conversion or duplicate marking utilities, our approach can often perform an equivalent task more efficiently in terms of the required main memory and run-time. Our BAM to FastQ conversion is faster than all widely known alternatives, including Picard and bamUtil. Our duplicate marking is about as fast as the closest competitor, bamUtil, for small data sets and faster than all known alternatives on large and complex data sets.
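    The collation idea at the heart of the approach — matching read pairs by name without a full sort — can be sketched with a streaming dictionary: hold a read until its mate arrives, then emit the pair and free the entry. This is a simplified illustration of the general technique, not biobambam's algorithm (the real implementation also enforces a configurable main-memory bound):

```python
def collate_pairs(records):
    """Stream (read_name, data) records in arbitrary (e.g. coordinate) order
    and emit each complete pair as soon as its mate arrives, keeping only
    currently unmatched reads in memory."""
    pending = {}
    for name, data in records:
        mate = pending.pop(name, None)
        if mate is None:
            pending[name] = data      # first read of the pair seen so far
        else:
            yield name, mate, data    # pair complete; entry already freed

# toy coordinate-sorted stream: mates of a pair are far apart in the file
records = [("r1", "ACGT"), ("r2", "TTGA"), ("r1", "CCAT"), ("r2", "GGCC")]
pairs = list(collate_pairs(records))
```

    Peak memory here is the number of simultaneously "open" pairs, which is why collation can beat a full name-sort when mates are not too far apart.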

  19. Hadoop-BAM: directly manipulating next generation sequencing data in the cloud

    PubMed Central

    Niemenmaa, Matti; Kallio, Aleksi; Schumacher, André; Klemelä, Petri; Korpelainen, Eija; Heljanko, Keijo

    2012-01-01

    Summary: Hadoop-BAM is a novel library for the scalable manipulation of aligned next-generation sequencing data in the Hadoop distributed computing framework. It acts as an integration layer between analysis applications and BAM files that are processed using Hadoop. Hadoop-BAM solves the issues related to BAM data access by presenting a convenient API for implementing map and reduce functions that can directly operate on BAM records. It builds on top of the Picard SAM JDK, so tools that rely on the Picard API are expected to be easily convertible to support large-scale distributed processing. In this article we demonstrate the use of Hadoop-BAM by building a coverage summarizing tool for the Chipster genome browser. Our results show that Hadoop offers good scalability, and one should avoid moving data in and out of Hadoop between analysis steps. Availability: Available under the open-source MIT license at http://sourceforge.net/projects/hadoop-bam/ Contact: matti.niemenmaa@aalto.fi Supplementary information: Supplementary material is available at Bioinformatics online. PMID:22302568

  20. Coordinating complex decision support activities across distributed applications

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.

  1. The E3 ubiquitin ligases β-TrCP and FBXW7 cooperatively mediates GSK3-dependent Mcl-1 degradation induced by the Akt inhibitor API-1, resulting in apoptosis.

    PubMed

    Ren, Hui; Koo, Junghui; Guan, Baoxiang; Yue, Ping; Deng, Xingming; Chen, Mingwei; Khuri, Fadlo R; Sun, Shi-Yong

    2013-11-22

    The novel Akt inhibitor, API-1, induces apoptosis through undefined mechanisms. The current study focuses on revealing the mechanisms by which API-1 induces apoptosis. API-1 rapidly and potently reduced the levels of Mcl-1, primarily in API-1-sensitive lung cancer cell lines. Ectopic expression of Mcl-1 protected cells from induction of apoptosis by API-1. API-1 treatment decreased the half-life of Mcl-1, whereas inhibition of the proteasome with MG132 rescued the Mcl-1 reduction induced by API-1. API-1 decreased Mcl-1 levels accompanied by a rapid increase in Mcl-1 phosphorylation (S159/T163). Moreover, inhibition of GSK3 inhibited the Mcl-1 phosphorylation and reduction induced by API-1 and antagonized the effect of API-1 on induction of apoptosis. Knockdown of either FBXW7 or β-TrCP alone, both of which are E3 ubiquitin ligases involved in Mcl-1 degradation, only partially rescued the Mcl-1 reduction induced by API-1. However, double knockdown of both E3 ubiquitin ligases enhanced the rescue of API-1-induced Mcl-1 reduction. API-1 thus induces GSK3-dependent, β-TrCP- and FBXW7-mediated Mcl-1 degradation, resulting in induction of apoptosis.

  2. The E3 ubiquitin ligases β-TrCP and FBXW7 cooperatively mediates GSK3-dependent Mcl-1 degradation induced by the Akt inhibitor API-1, resulting in apoptosis

    PubMed Central

    2013-01-01

    Background The novel Akt inhibitor, API-1, induces apoptosis through undefined mechanisms. The current study focuses on revealing the mechanisms by which API-1 induces apoptosis. Results API-1 rapidly and potently reduced the levels of Mcl-1, primarily in API-1-sensitive lung cancer cell lines. Ectopic expression of Mcl-1 protected cells from induction of apoptosis by API-1. API-1 treatment decreased the half-life of Mcl-1, whereas inhibition of the proteasome with MG132 rescued the Mcl-1 reduction induced by API-1. API-1 decreased Mcl-1 levels accompanied by a rapid increase in Mcl-1 phosphorylation (S159/T163). Moreover, inhibition of GSK3 inhibited the Mcl-1 phosphorylation and reduction induced by API-1 and antagonized the effect of API-1 on induction of apoptosis. Knockdown of either FBXW7 or β-TrCP alone, both of which are E3 ubiquitin ligases involved in Mcl-1 degradation, only partially rescued the Mcl-1 reduction induced by API-1. However, double knockdown of both E3 ubiquitin ligases enhanced the rescue of API-1-induced Mcl-1 reduction. Conclusions API-1 induces GSK3-dependent, β-TrCP- and FBXW7-mediated Mcl-1 degradation, resulting in induction of apoptosis. PMID:24261825

  3. DyninstAPI Patches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LeGendre, M.

    2012-04-01

    We are seeking a code review of patches against DyninstAPI 8.0. DyninstAPI is an open source binary instrumentation library from the University of Wisconsin and University of Maryland. Our patches port DyninstAPI to the BlueGene/P and BlueGene/Q systems, as well as fix DyninstAPI bugs and implement minor new features in DyninstAPI.

  4. Utility of Mobile phones to support In-situ data collection for Land Cover Mapping

    NASA Astrophysics Data System (ADS)

    Oduor, P.; Omondi, S.; Wahome, A.; Mugo, R. M.; Flores, A.

    2017-12-01

    With the compelling need to create better monitoring tools for our landscapes to enhance decision-making processes, it becomes imperative to do so in sophisticated yet simple ways, making it possible to leverage the untapped potential of lay users while responding to the complexity of the information that must be communicated. SERVIR Eastern and Southern Africa has developed a mobile app that can be used with very little or no prior knowledge to collect spatial information on land cover. This set of in-situ data can be collected by the masses because the tool is very simple to use, and the information can then be fed into classification algorithms that map our ever-changing landscape. The LULC Mapper is a subset of the JiMap system and is able to pull Google Earth imagery and OpenStreetMap data to help users familiarize themselves with their location. It uses the phone's GPS and network information to map location coordinates, and at the same time gives the user sample pictures showing how to categorize their landscape. The system is able to work offline, and when users get access to the internet they can push the information into an Amazon database as bulk data. The location details, including geotagged photos, allow the data to be used in the development of a wide range of spatial information, including land cover data. The app is currently available in the Google Play Store and will soon be uploaded to the App Store for use by a wider community. We foresee a lot of potential in this tool for making data collection cheaper and more affordable, taking advantage of the advances made in phone technology. We envisage a data collection campaign in which the tool is used for crowdsourcing.

  5. The Marine Geoscience Data System and the Global Multi-Resolution Topography Synthesis: Online Resources for Exploring Ocean Mapping Data

    NASA Astrophysics Data System (ADS)

    Ferrini, V. L.; Morton, J. J.; Carbotte, S. M.

    2016-02-01

    The Marine Geoscience Data System (MGDS: www.marine-geo.org) provides a suite of tools and services for free public access to data acquired throughout the global oceans including maps, grids, near-bottom photos, and geologic interpretations that are essential for habitat characterization and marine spatial planning. Users can explore, discover, and download data through a combination of APIs and front-end interfaces that include dynamic service-driven maps, a geospatially enabled search engine, and an easy to navigate user interface for browsing and discovering related data. MGDS offers domain-specific data curation with a team of scientists and data specialists who utilize a suite of back-end tools for introspection of data files and metadata assembly to verify data quality and ensure that data are well-documented for long-term preservation and re-use. Funded by the NSF as part of the multi-disciplinary IEDA Data Facility, MGDS also offers Data DOI registration and links between data and scientific publications. MGDS produces and curates the Global Multi-Resolution Topography Synthesis (GMRT: gmrt.marine-geo.org), a continuously updated Digital Elevation Model that seamlessly integrates multi-resolutional elevation data from a variety of sources including the GEBCO 2014 (~1 km resolution) and International Bathymetric Chart of the Southern Ocean (~500 m) compilations. A significant component of GMRT includes ship-based multibeam sonar data, publicly available through NOAA's National Centers for Environmental Information, that are cleaned and quality controlled by the MGDS Team and gridded at their full spatial resolution (typically 100 m resolution in the deep sea). Additional components include gridded bathymetry products contributed by individual scientists (up to meter scale resolution in places), publicly accessible regional bathymetry, and high-resolution terrestrial elevation data.
New data are added to GMRT on an ongoing basis, with two scheduled releases per year. GMRT is available as both gridded data and images that can be viewed and downloaded directly through the Java application GeoMapApp (www.geomapapp.org) and the web-based GMRT MapTool. In addition, the GMRT GridServer API provides programmatic access to grids, imagery, profiles, and single point elevation values.
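    A programmatic point-elevation query against the GridServer API might be built as below. The endpoint path and parameter names here are assumptions for illustration rather than verified values; consult the GMRT service documentation before relying on them:

```python
from urllib.parse import urlencode

def point_query_url(lat, lon,
                    base="https://www.gmrt.org/services/PointServer"):
    """Build a GMRT single-point elevation request URL.

    NOTE: the base URL and the parameter names below are assumptions based on
    the service description in this abstract, not documented values.
    """
    return base + "?" + urlencode({"latitude": lat,
                                   "longitude": lon,
                                   "format": "text/plain"})

# hypothetical query near the East Pacific Rise
url = point_query_url(9.8, -104.3)
```

    The returned URL would then be fetched with any HTTP client; the same pattern extends to the grid, imagery, and profile endpoints mentioned above.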

  6. Fusion of the C-terminal triskaidecapeptide of hirudin variant 3 to alpha1-proteinase inhibitor M358R increases the serpin-mediated rate of thrombin inhibition

    PubMed Central

    2013-01-01

    Background Alpha-1 proteinase inhibitor (API) is a plasma serpin superfamily member that inhibits neutrophil elastase; variant API M358R inhibits thrombin and activated protein C (APC). Fusing residues 1-75 of another serpin, heparin cofactor II (HCII), to API M358R (in HAPI M358R) was previously shown to accelerate thrombin inhibition over API M358R by conferring thrombin exosite 1 binding properties. We hypothesized that replacing the HCII 1-75 region with the 13 C-terminal residues (triskaidecapeptide) of hirudin variant 3 (HV3 54-66) would further enhance the inhibitory potency of API M358R fusion proteins. We therefore expressed HV3API M358R (HV3 54-66 fused to API M358R) and HV3API RCL5 (HV3 54-66 fused to API F352A/L353V/E354V/A355I/I356A/I460L/M358R) as N-terminally hexahistidine-tagged polypeptides in E. coli. Results HV3API M358R inhibited thrombin 3.3-fold more rapidly than API M358R; for HV3API RCL5 the rate enhancement was 1.9-fold versus API RCL5; neither protein inhibited thrombin as rapidly as HAPI M358R. While the thrombin/Activated Protein C rate constant ratio was 77-fold higher for HV3API RCL5 than for HV3API M358R, most of the increased specificity derived from the API F352A/L353V/E354V/A355I/I356A/I460L (API RCL5) mutations, since API RCL5 remained 3-fold more specific than HV3API RCL5. An HV3 54-66 peptide doubled the Thrombin Clotting Time (TCT) and halved the binding of thrombin to immobilized HCII 1-75 at lower concentrations than free HCII 1-75. HV3API RCL5 bound active site-inhibited FPR-chloromethyl ketone-thrombin more effectively than HAPI RCL5. Transferring the fused HV3 triskaidecapeptide to the C-terminus of API M358R decreased the rate of thrombin inhibition relative to that mediated by HV3API M358R by 11- to 14-fold. Conclusions Fusing the C-terminal triskaidecapeptide of HV3 to API M358R-containing serpins significantly increased their effectiveness as thrombin inhibitors, but the enhancement was less than that seen in HCII 1-75–API M358R fusion proteins. HCII 1-75 was a superior fusion partner, in spite of the greater affinity of the HV3 triskaidecapeptide, manifested both in isolated and API-fused form, for thrombin exosite 1. Our results suggest that HCII 1-75 binds thrombin exosite 1 and orients the attached serpin scaffold for more efficient interaction with the active site of thrombin than the HV3 triskaidecapeptide. PMID:24215622

  7. Cloud Geospatial Analysis Tools for Global-Scale Comparisons of Population Models for Decision Making

    NASA Astrophysics Data System (ADS)

    Hancher, M.; Lieber, A.; Scott, L.

    2017-12-01

    The volume of satellite and other Earth data is growing rapidly. Combined with information about where people are, these data can inform decisions in a range of areas including food and water security, disease and disaster risk management, biodiversity, and climate adaptation. Google's platform for planetary-scale geospatial data analysis, Earth Engine, grants access to petabytes of continually updating Earth data, programming interfaces for analyzing the data without the need to download and manage it, and mechanisms for sharing the analyses and publishing results for data-driven decision making. In addition to data about the planet, data about the human planet - population, settlement and urban models - are now available for global scale analysis. The Earth Engine APIs enable these data to be joined, combined or visualized with economic or environmental indicators such as nighttime lights trends, global surface water, or climate projections, in the browser without the need to download anything. We will present our newly developed application intended to serve as a resource for government agencies, disaster response and public health programs, or other consumers of these data to quickly visualize the different population models, and compare them to ground truth tabular data to determine which model suits their immediate needs. Users can further tap into the power of Earth Engine and other Google technologies to perform a range of analysis from simple statistics in custom regions to more complex machine learning models. We will highlight case studies in which organizations around the world have used Earth Engine to combine population data with multiple other sources of data, such as water resources and roads data, over deep stacks of temporal imagery to model disease risk and accessibility to inform decisions.

  8. Neighbourhood looking glass: 360º automated characterisation of the built environment for neighbourhood effects research.

    PubMed

    Nguyen, Quynh C; Sajjadi, Mehdi; McCullough, Matt; Pham, Minh; Nguyen, Thu T; Yu, Weijun; Meng, Hsien-Wen; Wen, Ming; Li, Feifei; Smith, Ken R; Brunisholz, Kim; Tasdizen, Tolga

    2018-03-01

    Neighbourhood quality has been connected with an array of health issues, but neighbourhood research has been limited by the lack of methods to characterise large geographical areas. This study uses innovative computer vision methods and a new big data source of street view images to automatically characterise neighbourhood built environments. A total of 430 000 images were obtained using Google's Street View Image API for Salt Lake City, Chicago and Charleston. Convolutional neural networks were used to create indicators of street greenness, crosswalks and building type. We implemented log Poisson regression models to estimate associations between built environment features and individual prevalence of obesity and diabetes in Salt Lake City, controlling for individual-level and zip code-level predisposing characteristics. Computer vision models had an accuracy of 86%-93% compared with manual annotations. Charleston had the highest percentage of green streets (79%), while Chicago had the highest percentage of crosswalks (23%) and commercial buildings/apartments (59%). Built environment characteristics were categorised into tertiles, with the highest tertile serving as the referent group. Individuals living in zip codes with the most green streets, crosswalks and commercial buildings/apartments had relative obesity prevalences that were 25%-28% lower and relative diabetes prevalences that were 12%-18% lower than individuals living in zip codes with the least abundance of these neighbourhood features. Neighbourhood conditions may influence chronic disease outcomes. Google Street View images represent an underused data resource for the construction of built environment features. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
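    The tertile comparison described above can be illustrated with a crude, unadjusted calculation: bin a built-environment indicator into tertiles, then compare outcome prevalence between a tertile and the referent group. The numbers below are hypothetical, and the study itself used log Poisson regression with individual- and zip-code-level covariate adjustment rather than this raw ratio:

```python
def tertile(values):
    """Assign each value to a tertile (0 = lowest third, 2 = highest third)."""
    ranked = sorted(values)
    n = len(values)
    cut1, cut2 = ranked[n // 3], ranked[2 * n // 3]
    return [0 if v < cut1 else 1 if v < cut2 else 2 for v in values]

def prevalence_ratio(outcomes, groups, index, referent=2):
    """Crude prevalence ratio of a binary outcome in one tertile versus the
    referent tertile (no covariate adjustment; illustration only)."""
    def prev(g):
        ys = [y for y, grp in zip(outcomes, groups) if grp == g]
        return sum(ys) / len(ys)
    return prev(index) / prev(referent)

greenness = [5, 10, 20, 40, 50, 70, 80, 90, 95]  # hypothetical % green streets
obesity   = [1,  1,  1,  0,  1,  0,  0,  1,  0]  # hypothetical binary outcome
groups = tertile(greenness)
pr = prevalence_ratio(obesity, groups, index=0)  # lowest vs highest tertile
```

    In the study, the indicators themselves (greenness, crosswalks, building type) came from convolutional neural networks applied to the street view images.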

  9. Graph Unification and Tangram Hypothesis Explanation Representation (GATHER) and System and Component Modeling Framework (SCMF)

    DTIC Science & Technology

    2008-08-01

    services, DIDS and DMS, are deployable on the TanGrid system and are accessible via two APIs, a Java client and a servlet based interface. Additionally...but required the user to instantiate an IGraph object with several Java Maps containing the nodes, node attributes, edge types, and the connections...restrictions imposed by the bulk ingest process. Finally, once the bulk ingest process was available in the GraphUnification Java Archives (JAR), DC was

  10. Building and Vegetation Rasterization for the Three-dimensional Wind Field (3DWF) Model

    DTIC Science & Technology

    2010-12-01

    Maps API. By design, JavaScript limits access to local resources. This is done to protect against the execution of malicious code. However, ActiveX ...to only use these types of objects (ActiveX or XPCOM) from a trusted source in order to minimize the exposure of a computer system to malware...Microsoft ActiveX. There is also a need to restructure and rethink the implementation of the JavaScript code. It would be desirable to save the digitized

  11. N-Terminal Ile-Orn- and Trp-Orn-Motif Repeats Enhance Membrane Interaction and Increase the Antimicrobial Activity of Apidaecins against Pseudomonas aeruginosa

    PubMed Central

    Bluhm, Martina E. C.; Schneider, Viktoria A. F.; Schäfer, Ingo; Piantavigna, Stefania; Goldbach, Tina; Knappe, Daniel; Seibel, Peter; Martin, Lisandra L.; Veldhuizen, Edwin J. A.; Hoffmann, Ralf

    2016-01-01

    The Gram-negative bacterium Pseudomonas aeruginosa is a life-threatening nosocomial pathogen due to its generally low susceptibility toward antibiotics. Furthermore, many strains have acquired resistance mechanisms requiring new antimicrobials with novel mechanisms to enhance treatment options. Proline-rich antimicrobial peptides, such as the apidaecin analog Api137, are highly efficient against various Enterobacteriaceae infections in mice, but less active against P. aeruginosa in vitro. Here, we extended our recent work by optimizing lead peptides Api755 (gu-OIORPVYOPRPRPPHPRL-OH; gu = N,N,N′,N′-tetramethylguanidino, O = L-ornithine) and Api760 (gu-OWORPVYOPRPRPPHPRL-OH) by incorporation of Ile-Orn- and Trp-Orn-motifs, respectively. Api795 (gu-O(IO)2RPVYOPRPRPPHPRL-OH) and Api794 (gu-O(WO)3RPVYOPRPRPPHPRL-OH) were highly active against P. aeruginosa with minimal inhibitory concentrations of 8–16 and 8–32 μg/mL against Escherichia coli and Klebsiella pneumoniae. Assessed using a quartz crystal microbalance, these peptides inserted into a membrane layer and the surface activity increased gradually from Api137, over Api795, to Api794. This mode of action was confirmed by transmission electron microscopy indicating some membrane damage only at the high peptide concentrations. Api794 and Api795 were highly stable against serum proteases (half-life times >5 h) and non-hemolytic to human erythrocytes at peptide concentrations of 0.6 g/L. At this concentration, Api795 reduced the cell viability of HeLa cells only slightly, whereas the IC50 of Api794 was 0.23 ± 0.09 g/L. Confocal fluorescence microscopy revealed no colocalization of 5(6)-carboxyfluorescein-labeled Api794 or Api795 with the mitochondria, excluding interactions with the mitochondrial membrane. Interestingly, Api795 was localized in endosomes, whereas Api794 was present in endosomes and the cytosol. 
This was verified using flow cytometry showing a 50% higher uptake of Api794 in HeLa cells compared with Api795. The uptake was reduced for both peptides by 50 and 80%, respectively, after inhibiting endocytotic uptake with dynasore. In summary, Api794 and Api795 were highly active against P. aeruginosa in vitro. Both peptides passed across the bacterial membrane efficiently, most likely then disturbing ribosome assembly and causing further intracellular damage. Api795, with its IOIO-motif, which was particularly active and only slightly toxic in vitro, appears to represent a promising third-generation lead compound for the development of novel antibiotics against P. aeruginosa. PMID:27243004

  12. High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.

    PubMed

    Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue

    2010-11-13

    Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when drugs are approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, was introduced by Google to support large-scale data-intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it by mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, using this pharmacovigilance case as one specific example. The results demonstrated that the MapReduce programming model could improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates.
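    The PRR computation maps naturally onto the MapReduce pattern, sketched below in plain Python (the drug and event names are invented toy data; a real deployment would distribute the map and reduce phases across a cluster):

```python
from collections import Counter
from functools import reduce

# invented toy reports: one (drug, event) pair per spontaneous report
reports = [("drugA", "nausea"), ("drugA", "nausea"), ("drugA", "rash"),
           ("drugB", "nausea"), ("drugB", "rash"), ("drugB", "rash"),
           ("drugB", "rash")]

def mapper(chunk):
    """Map phase: partial (drug, event) counts for one chunk of reports."""
    return Counter(chunk)

def reducer(c1, c2):
    """Reduce phase: merge partial counts."""
    return c1 + c2

# split into chunks, as a MapReduce framework would distribute them
chunks = [reports[:4], reports[4:]]
counts = reduce(reducer, map(mapper, chunks))

def prr(drug, event, counts):
    """Proportional Reporting Ratio: PRR = [a/(a+b)] / [c/(c+d)]."""
    a = counts[(drug, event)]                                     # drug with event
    b = sum(v for (d, e), v in counts.items() if d == drug) - a   # drug, other events
    c = sum(v for (d, e), v in counts.items() if d != drug and e == event)
    d = sum(v for (d, e), v in counts.items() if d != drug) - c
    return (a / (a + b)) / (c / (c + d))
```

    Because the per-chunk counts merge associatively, the reduce phase parallelizes cleanly, which is what yields the near-linear speedup the study reports.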

  13. CALIBRATION, OPTIMIZATION, AND SENSITIVITY AND UNCERTAINTY ALGORITHMS APPLICATION PROGRAMMING INTERFACE (COSU-API)

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...

  14. Impact of Nosema ceranae and Nosema apis on individual worker bees of the two host species (Apis cerana and Apis mellifera) and regulation of host immune response.

    PubMed

    Sinpoo, Chainarong; Paxton, Robert J; Disayathanoowat, Terd; Krongdang, Sasiprapa; Chantawannakul, Panuwan

    Nosema apis and Nosema ceranae are obligate intracellular microsporidian parasites infecting midgut epithelial cells of host adult honey bees, originally Apis mellifera and Apis cerana respectively. Each microsporidian cross-infects the other host, and both now have a worldwide distribution. In this study, cross-infection experiments using both N. apis and N. ceranae in both A. mellifera and A. cerana were carried out to compare pathogen proliferation and impact on hosts, including host immune response. Infection by N. ceranae led to higher spore loads than by N. apis in both host species, and there was greater proliferation of microsporidia in A. mellifera compared to A. cerana. Both N. apis and N. ceranae were pathogenic in both host Apis species. N. ceranae induced subtly, though not significantly, higher mortality than N. apis in both host species, yet survival of A. cerana was no different to that of A. mellifera in response to N. apis or N. ceranae. Infections of both host species with N. apis and N. ceranae caused significant up-regulation of AMP genes and cellular mediated immune genes but did not greatly alter apoptosis-related gene expression. In this study, A. cerana mounted a higher immune response and displayed lower loads of N. apis and N. ceranae spores than A. mellifera, suggesting it may be better able to defend itself against microsporidian infection. We caution against over-interpretation of our results, though, because differences between host and parasite species in survival were insignificant and because size differences between microsporidia species and between host Apis species may alternatively explain the differential proliferation of N. ceranae in A. mellifera. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Comparison of plasma amino acid profile-based index and CA125 in the diagnosis of epithelial ovarian cancers and borderline malignant tumors.

    PubMed

    Miyagi, Etsuko; Maruyama, Yasuyo; Mogami, Tae; Numazaki, Reiko; Ikeda, Atsuko; Yamamoto, Hiroshi; Hirahara, Fumiki

    2017-02-01

    We previously developed a new plasma amino acid profile-based index (API) to detect ovarian, cervical, and endometrial cancers. Here, we compared API to serum cancer antigen 125 (CA125) for distinguishing epithelial ovarian malignant tumors from benign growths. API and CA125 were measured preoperatively in patients with ovarian tumors, which were later classified into 59 epithelial ovarian cancers, 21 epithelial borderline malignant tumors, and 97 benign tumors including 40 endometriotic cysts. The diagnostic accuracy and cutoff points of API were evaluated using receiver operating characteristic (ROC) curves. The area under the ROC curves showed the equivalent performance of API and CA125 to discriminate between malignant/borderline malignant and benign tumors (both 0.77), and API was superior to CA125 for discrimination between malignant/borderline malignant lesions and endometriotic cysts (API, 0.75 vs. CA125, 0.59; p < 0.05). At the API cutoff level of 6.0, API and CA125 had equal positive rates of detecting cancers and borderline malignancies (API, 0.71 vs. CA125, 0.74; p = 0.84) or cancers alone (API, 0.73 vs. CA125, 0.85; p = 0.12). However, API had a significantly lower detection rate of benign endometriotic cysts (0.35; 95 % CI, 0.21-0.52) compared with that of CA125 (0.65; 95 % CI, 0.48-0.79) (p < 0.05). API is an effective new tumor marker to detect ovarian cancers and borderline malignancies with a low false-positive rate for endometriosis. A large-scale prospective clinical study using the cutoff value of API determined in this study is warranted to validate API for practical clinical use.
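    The area under an ROC curve, as used above to compare API and CA125, can be computed directly as a rank statistic; the marker values below are invented toy numbers:

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random diseased case scores higher than
    a random benign case (ties count half)."""
    pos = np.asarray(scores_pos, float)
    neg = np.asarray(scores_neg, float)
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (pos.size * neg.size)

# invented marker values for malignant/borderline vs benign cases
auc = roc_auc([7.1, 6.4, 8.0, 5.9], [5.0, 6.0, 4.2])
```

    An AUC of 0.5 means the marker is uninformative; the paper's reported values of 0.77 and 0.59 can be read on this same scale.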

  16. Simulating a 40-year flood event climatology of Australia with a view to ocean-land teleconnections

    NASA Astrophysics Data System (ADS)

    Schumann, Guy J.-P.; Andreadis, Konstantinos; Stampoulis, Dimitrios; Bates, Paul

    2015-04-01

    We develop, for the first time, a proof-of-concept version of a high-resolution global flood inundation model to generate a flood inundation climatology of the past 40 years (1973-2012) for the entire Australian continent at a native 1 km resolution. The objectives of our study include (1) deriving an inundation climatology for a continent (Australia) as a demonstrator case to understand the requirements for expanding globally; (2) developing a test bed to assess the potential and value of current and future satellite missions (GRACE, SMAP, ICESat-2, AMSR-2, Sentinels and SWOT) in flood monitoring; and (3) answering science questions such as the linking of inundation to ocean circulation teleconnections. We employ the LISFLOOD-FP hydrodynamic model to generate the flood inundation climatology. The model is built from freely available SRTM-derived data (channel widths, bank heights and floodplain topography corrected for vegetation canopy using ICESat canopy heights). Lakes and reservoirs are represented, and channel hydraulics are resolved using actual channel data with bathymetry inferred from hydraulic geometry. Simulations are run with gauged flows, and the simulated floodplain inundation climatology is compared to observations from GRACE and to flood maps from Landsat, SAR, and MODIS. Simulations have been completed for the entire Australian continent. Additionally, changes in flood inundation have been correlated with indices related to global ocean circulation, such as the El Niño Southern Oscillation index. We will produce data layers on flood event climatology and other derived (default) products from the proposed model, including channel and floodplain depths, flow direction, velocity vectors, floodplain water volume, shoreline extent and flooded area. These data layers will be in the form of simple vector and raster formats. Since the outputs will be large, we propose to upload them onto Google Earth under the GEE API license.

  17. Reaching Forward in the War against the Islamic State

    DTIC Science & Technology

    2016-12-07

    every week with U.S.- and Coalition-advised ISOF troops taking the lead in combat operations using cellular communications systems that link them...tions—Offline Maps, Google Earth , and Viber, to name a few—which allowed them to bring tablets and phones on their operations to help communicate ...provided an initial Remote Advise and Assist capability that enabled the special forces advisors to track, communicate , and share limited data with

  18. Interpretation of earthquake-induced landslides triggered by the 12 May 2008, M7.9 Wenchuan earthquake in the Beichuan area, Sichuan Province, China using satellite imagery and Google Earth

    USGS Publications Warehouse

    Sato, H.P.; Harp, E.L.

    2009-01-01

    The 12 May 2008 M7.9 Wenchuan earthquake in the People's Republic of China represented a unique opportunity for the international community to use commonly available GIS (Geographic Information System) tools, like Google Earth (GE), to rapidly evaluate and assess landslide hazards triggered by the destructive earthquake and its aftershocks. In order to map earthquake-triggered landslides, we provide details on the applicability and limitations of publicly available 3-day-post- and pre-earthquake imagery provided by GE from the FORMOSAT-2 (formerly ROCSAT-2; Republic of China Satellite 2). We interpreted landslides on the 8-m-resolution FORMOSAT-2 image in GE; as a result, 257 large landslides were mapped, with the highest concentration along the Beichuan fault. An estimated density of 0.3 landslides/km² represents a minimum bound given the resolution of available imagery; higher resolution data would have identified more landslides. This is a preliminary study, and further study is needed to understand the landslide characteristics in detail. Although it is best to obtain landslide locations and measurements from high-resolution satellite imagery, it was found that GE is an effective and rapid reconnaissance tool. © 2009 Springer-Verlag.

  19. GeneOnEarth: fitting genetic PC plots on the globe.

    PubMed

    Torres-Sánchez, Sergio; Medina-Medina, Nuria; Gignoux, Chris; Abad-Grau, María M; González-Burchard, Esteban

    2013-01-01

    Principal component (PC) plots have become widely used to summarize genetic variation of individuals in a sample. The similarity between genetic distance in PC plots and geographical distance has been shown to be quite impressive. However, in most situations, individual ancestral origins are not precisely known or they are heterogeneously distributed; hence, they are hardly linked to a geographical area. We have developed GeneOnEarth, a user-friendly web-based tool to help geneticists to understand whether a linear isolation-by-distance model may apply to a genetic data set; thus, genetic distances among a set of individuals resemble geographical distances among their origins. Its main goal is to allow users to first apply a by-view Procrustes method to visually learn whether this model holds. To do that, the user can choose the exact geographical area from an online 2D or 3D world map by using, respectively, Google Maps or Google Earth, and rotate, flip, and resize the images. GeneOnEarth can also compute the optimal rotation angle using Procrustes analysis and assess statistical evidence of similarity when a different rotation angle has been chosen by the user. An online version of GeneOnEarth is available for testing and use at http://bios.ugr.es/GeneOnEarth.
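    The optimal Procrustes rotation angle that such a tool reports can be computed via a singular value decomposition; this is a generic sketch with invented coordinates, not GeneOnEarth's actual code:

```python
import numpy as np

def procrustes_rotation(pcs, geo):
    """Rotation best aligning centred PC coordinates with centred
    geographic coordinates (orthogonal Procrustes via SVD).
    Note: U @ Vt may include a reflection if its determinant is -1."""
    X = pcs - pcs.mean(axis=0)
    Y = geo - geo.mean(axis=0)
    U, _, Vt = np.linalg.svd(X.T @ Y)
    R = U @ Vt
    angle = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    return R, angle

# invented check: PC coordinates that are the geographic points rotated by 30°
theta = np.radians(30)
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
geo = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [3.0, 1.0]])
pcs = geo @ rot.T
R, angle = procrustes_rotation(pcs, geo)   # recovers the 30° rotation
```

    A strong fit after this rotation is what visually supports a linear isolation-by-distance model.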

  20. An intelligent and secure system for predicting and preventing Zika virus outbreak using Fog computing

    NASA Astrophysics Data System (ADS)

    Sareen, Sanjay; Gupta, Sunil Kumar; Sood, Sandeep K.

    2017-10-01

    Zika virus is a mosquito-borne disease that spreads very quickly in different parts of the world. In this article, we proposed a system to prevent and control the spread of Zika virus disease using an integration of Fog computing, cloud computing, mobile phones and Internet of things (IoT)-based sensor devices. Fog computing is used as an intermediary layer between the cloud and end users to reduce the latency and extra communication costs that are usually high in cloud-based systems. A fuzzy k-nearest neighbour classifier is used to diagnose possibly infected users, and the Google Maps web service is used to provide geographic positioning system (GPS)-based risk assessment to prevent the outbreak. It represents each Zika virus (ZikaV)-infected user, mosquito-dense site and breeding site on Google Maps, which helps government healthcare authorities control such risk-prone areas effectively and efficiently. The proposed system is deployed on the Amazon EC2 cloud to evaluate its performance and accuracy using a data set of 2 million users. Our system provides a high accuracy of 94.5% for initial diagnosis of different users according to their symptoms and appropriate GPS-based risk assessment.
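    A fuzzy k-nearest neighbour classifier of the kind mentioned above can be sketched as follows (the symptom encoding, training data, and parameters are invented for illustration; the original system's features are not specified in this record):

```python
import numpy as np

def fuzzy_knn(x, X, y, k=3, m=2):
    """Fuzzy k-NN: class membership scores for query x, weighting the
    k nearest training points by inverse distance (fuzzifier m)."""
    d = np.linalg.norm(X - x, axis=1)
    idx = np.argsort(d)[:k]
    # inverse-distance weights; the floor avoids division by zero on exact matches
    w = 1.0 / np.maximum(d[idx] ** (2 / (m - 1)), 1e-12)
    return {c: w[y[idx] == c].sum() / w.sum() for c in np.unique(y)}

# invented binary symptom vectors: (fever, rash, joint pain)
X = np.array([[1, 1, 1], [1, 1, 0], [0, 0, 0], [0, 1, 0]], dtype=float)
y = np.array(["infected", "infected", "healthy", "healthy"])
scores = fuzzy_knn(np.array([1.0, 1.0, 1.0]), X, y)
```

    Unlike a crisp k-NN vote, the graded memberships give the downstream risk-assessment layer a confidence value per class rather than a hard label.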

  1. An Interactive Web System for Field Data Sharing and Collaboration

    NASA Astrophysics Data System (ADS)

    Weng, Y.; Sun, F.; Grigsby, J. D.

    2010-12-01

    A Web 2.0 system is designed and developed to facilitate data collection for field studies in the Geological Sciences department at Ball State University. The system provides a student-centered learning platform that enables users to first upload their collected data in various formats, interact and collaborate dynamically online, and ultimately create a shared digital repository of field experiences. The data types considered for the system and their corresponding format and requirements are listed in the table below. The system has six main functionalities as follows. (1) Only registered users can access the system with confidential identification and password. (2) Each user can upload/revise/delete data in various formats such as image, audio, video, and text files. (3) Interested users are allowed to co-edit the contents and join the collaboration whiteboard for further discussion. (4) The system integrates with Google, Yahoo, or Flickr to search for similar photos with the same tags. (5) Users can search the web system using specific keywords. (6) Photos with recorded GPS readings can be mashed and mapped to Google Maps/Earth for visualization. Application of the system to geology field trips at Ball State University will be demonstrated to assess the usability of the system.
    [Table: Data Requirements]

  2. HCLS 2.0/3.0: health care and life sciences data mashup using Web 2.0/3.0.

    PubMed

    Cheung, Kei-Hoi; Yip, Kevin Y; Townsend, Jeffrey P; Scotch, Matthew

    2008-10-01

    We describe the potential of current Web 2.0 technologies to achieve data mashup in the health care and life sciences (HCLS) domains, and compare that potential to the nascent trend of performing semantic mashup. After providing an overview of Web 2.0, we demonstrate two scenarios of data mashup, facilitated by the following Web 2.0 tools and sites: Yahoo! Pipes, Dapper, Google Maps and GeoCommons. In the first scenario, we exploited Dapper and Yahoo! Pipes to implement a challenging data integration task in the context of DNA microarray research. In the second scenario, we exploited Yahoo! Pipes, Google Maps, and GeoCommons to create a geographic information system (GIS) interface that allows visualization and integration of diverse categories of public health data, including cancer incidence and pollution prevalence data. Based on these two scenarios, we discuss the strengths and weaknesses of these Web 2.0 mashup technologies. We then describe Semantic Web, the mainstream Web 3.0 technology that enables more powerful data integration over the Web. We discuss the areas of intersection of Web 2.0 and Semantic Web, and describe the potential benefits that can be brought to HCLS research by combining these two sets of technologies.

  3. High-Resolution Air Pollution Mapping with Google Street View Cars: Exploiting Big Data.

    PubMed

    Apte, Joshua S; Messier, Kyle P; Gani, Shahzad; Brauer, Michael; Kirchstetter, Thomas W; Lunden, Melissa M; Marshall, Julian D; Portier, Christopher J; Vermeulen, Roel C H; Hamburg, Steven P

    2017-06-20

    Air pollution affects billions of people worldwide, yet ambient pollution measurements are limited for much of the world. Urban air pollution concentrations vary sharply over short distances (≪1 km) owing to unevenly distributed emission sources, dilution, and physicochemical transformations. Accordingly, even where present, conventional fixed-site pollution monitoring methods lack the spatial resolution needed to characterize heterogeneous human exposures and localized pollution hotspots. Here, we demonstrate a measurement approach to reveal urban air pollution patterns at 4-5 orders of magnitude greater spatial precision than possible with current central-site ambient monitoring. We equipped Google Street View vehicles with a fast-response pollution measurement platform and repeatedly sampled every street in a 30-km² area of Oakland, CA, developing the largest urban air quality data set of its type. Resulting maps of annual daytime NO, NO₂, and black carbon at 30-m scale reveal stable, persistent pollution patterns with surprisingly sharp small-scale variability attributable to local sources, up to 5-8× within individual city blocks. Since local variation in air quality profoundly impacts public health and environmental equity, our results have important implications for how air pollution is measured and managed. If validated elsewhere, this readily scalable measurement approach could address major air quality data gaps worldwide.

  4. HCLS 2.0/3.0: Health Care and Life Sciences Data Mashup Using Web 2.0/3.0

    PubMed Central

    Cheung, Kei-Hoi; Yip, Kevin Y.; Townsend, Jeffrey P.; Scotch, Matthew

    2010-01-01

    We describe the potential of current Web 2.0 technologies to achieve data mashup in the health care and life sciences (HCLS) domains, and compare that potential to the nascent trend of performing semantic mashup. After providing an overview of Web 2.0, we demonstrate two scenarios of data mashup, facilitated by the following Web 2.0 tools and sites: Yahoo! Pipes, Dapper, Google Maps and GeoCommons. In the first scenario, we exploited Dapper and Yahoo! Pipes to implement a challenging data integration task in the context of DNA microarray research. In the second scenario, we exploited Yahoo! Pipes, Google Maps, and GeoCommons to create a geographic information system (GIS) interface that allows visualization and integration of diverse categories of public health data, including cancer incidence and pollution prevalence data. Based on these two scenarios, we discuss the strengths and weaknesses of these Web 2.0 mashup technologies. We then describe Semantic Web, the mainstream Web 3.0 technology that enables more powerful data integration over the Web. We discuss the areas of intersection of Web 2.0 and Semantic Web, and describe the potential benefits that can be brought to HCLS research by combining these two sets of technologies. PMID:18487092

  5. Mapping the Americanization of English in space and time

    PubMed Central

    Gonçalves, Bruno; Loureiro-Porto, Lucía; Ramasco, José J.

    2018-01-01

    As global political preeminence gradually shifted from the United Kingdom to the United States, so did the capacity to culturally influence the rest of the world. In this work, we analyze how the world-wide varieties of written English are evolving. We study both the spatial and temporal variations of vocabulary and spelling of English using a large corpus of geolocated tweets and the Google Books datasets corresponding to books published in the US and the UK. The advantage of our approach is that we can address both standard written language (Google Books) and the more colloquial forms of microblogging messages (Twitter). We find that American English is the dominant form of English outside the UK and that its influence is felt even within the UK borders. Finally, we analyze how this trend has evolved over time and the impact that some cultural events have had in shaping it. PMID:29799872

  6. Mapping the Americanization of English in space and time.

    PubMed

    Gonçalves, Bruno; Loureiro-Porto, Lucía; Ramasco, José J; Sánchez, David

    2018-01-01

    As global political preeminence gradually shifted from the United Kingdom to the United States, so did the capacity to culturally influence the rest of the world. In this work, we analyze how the world-wide varieties of written English are evolving. We study both the spatial and temporal variations of vocabulary and spelling of English using a large corpus of geolocated tweets and the Google Books datasets corresponding to books published in the US and the UK. The advantage of our approach is that we can address both standard written language (Google Books) and the more colloquial forms of microblogging messages (Twitter). We find that American English is the dominant form of English outside the UK and that its influence is felt even within the UK borders. Finally, we analyze how this trend has evolved over time and the impact that some cultural events have had in shaping it.

  7. A tutorial for software development in quantitative proteomics using PSI standard formats☆

    PubMed Central

    Gonzalez-Galarza, Faviel F.; Qi, Da; Fan, Jun; Bessant, Conrad; Jones, Andrew R.

    2014-01-01

    The Human Proteome Organisation — Proteomics Standards Initiative (HUPO-PSI) has been working for ten years on the development of standardised formats that facilitate data sharing and public database deposition. In this article, we review three HUPO-PSI data standards — mzML, mzIdentML and mzQuantML, which can be used to design a complete quantitative analysis pipeline in mass spectrometry (MS)-based proteomics. In this tutorial, we briefly describe the content of each data model, sufficient for bioinformaticians to devise proteomics software. We also provide guidance on the use of recently released application programming interfaces (APIs) developed in Java for each of these standards, which makes it straightforward to read and write files of any size. We have produced a set of example Java classes and a basic graphical user interface to demonstrate how to use the most important parts of the PSI standards, available from http://code.google.com/p/psi-standard-formats-tutorial. This article is part of a Special Issue entitled: Computational Proteomics in the Post-Identification Era. Guest Editors: Martin Eisenacher and Christian Stephan. PMID:23584085

  8. Using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.

    2016-12-01

    Cloud-based infrastructure may offer several key benefits of scalability, built-in redundancy and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided through all the leading public (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private (OpenStack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file system based storage to vend earth science data. The system described is not only cost effective, but shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.

  9. Active pharmaceutical ingredients for antiretroviral treatment in low- and middle-income countries: a survey.

    PubMed

    Fortunak, Joseph M; de Souza, Rodrigo O M A; Kulkarni, Amol A; King, Christopher L; Ellison, Tiffany; Miranda, Leandro S M

    2014-01-01

    Active pharmaceutical ingredients (APIs) are the molecular entities that exert the therapeutic effects of medicines. This article provides an overview of the major APIs that are entered into antiretroviral therapy (ART), outlines how APIs are manufactured, and examines the regulatory and cost frameworks of manufacturing ART APIs used in low- and middle-income countries (LMICs). Almost all APIs for ART are prepared by chemical synthesis. Roughly 15 APIs account for essentially all of the ARTs used in LMICs. Nearly all of the ART APIs purchased through the Global Fund for AIDS, TB and Malaria (GFATM) or the United States President's Emergency Plan for AIDS Relief (PEPFAR) are produced by generic companies. API costs are very important because they are the largest contribution to the overall cost of ART. Efficient API production requires substantial investment in chemical manufacturing technologies and the ready availability of raw materials and energy at competitive prices. Generic API production is practiced in only a limited number of countries; the API market for ART is dominated by Indian companies. The quality of these APIs is ensured by manufacturing under good manufacturing practice (GMP), including process validation, testing against previously established specifications and the demonstration of clinical bioequivalence. The investment and personnel costs of a quality management system for GMP contribute significantly to the cost of API production. Chinese companies are the major suppliers for many advanced intermediates in API production. Improved chemistry of manufacturing, economies of scale and optimization of procurement have enabled drastic cost reductions for many ART APIs. The available capacity for global production of quality-assured APIs is likely adequate to meet forecasted demand for 2015. 
The increased use of ART for paediatric treatment, for second-line and salvage therapy, and the introduction of new APIs and combinations are important factors for the future of treatment in LMICs. The introduction of new fixed-dose combinations for ART and use of new drug delivery technologies could plausibly provide robust, durable ART for all patients in need, at an overall cost that is only moderately higher than what is presently being spent.

  10. Active pharmaceutical ingredients for antiretroviral treatment in low- and middle-income countries: a survey

    PubMed Central

    Fortunak, Joseph M; de Souza, Rodrigo OMA; Kulkarni, Amol A; King, Christopher L; Ellison, Tiffany; Miranda, Leandro SM

    2015-01-01

    Active pharmaceutical ingredients (APIs) are the molecular entities that exert the therapeutic effects of medicines. This article provides an overview of the major APIs that are entered into antiretroviral therapy (ART), outlines how APIs are manufactured, and examines the regulatory and cost frameworks of manufacturing ART APIs used in low- and middle-income countries (LMICs). Almost all APIs for ART are prepared by chemical synthesis. Roughly 15 APIs account for essentially all of the ARTs used in LMICs. Nearly all of the ART APIs purchased through the Global Fund for AIDS, TB and Malaria (GFATM) or the United States President’s Emergency Plan for AIDS Relief (PEPFAR) are produced by generic companies. API costs are very important because they are the largest contribution to the overall cost of ART. Efficient API production requires substantial investment in chemical manufacturing technologies and the ready availability of raw materials and energy at competitive prices. Generic API production is practiced in only a limited number of countries; the API market for ART is dominated by Indian companies. The quality of these APIs is ensured by manufacturing under good manufacturing practice (GMP), including process validation, testing against previously established specifications and the demonstration of clinical bioequivalence. The investment and personnel costs of a quality management system for GMP contribute significantly to the cost of API production. Chinese companies are the major suppliers for many advanced intermediates in API production. Improved chemistry of manufacturing, economies of scale and optimization of procurement have enabled drastic cost reductions for many ART APIs. The available capacity for global production of quality-assured APIs is likely adequate to meet forecasted demand for 2015. 
The increased use of ART for paediatric treatment, for second-line and salvage therapy, and the introduction of new APIs and combinations are important factors for the future of treatment in LMICs. The introduction of new fixed-dose combinations for ART and use of new drug delivery technologies could plausibly provide robust, durable ART for all patients in need, at an overall cost that is only moderately higher than what is presently being spent. PMID:25310430

  11. Developing a Global Database of Historic Flood Events to Support Machine Learning Flood Prediction in Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Tellman, B.; Sullivan, J.; Kettner, A.; Brakenridge, G. R.; Slayback, D. A.; Kuhn, C.; Doyle, C.

    2016-12-01

    There is an increasing need to understand flood vulnerability as the societal and economic effects of flooding increase. Risk models from insurance companies and flood models from hydrologists must be calibrated against flood observations in order to make future predictions that can improve planning and help societies reduce future disasters. Specifically, improving both traditional physically based flood models and data-driven techniques, such as machine learning, requires spatial flood observations to validate model outputs and quantify uncertainty. A key dataset that is missing for flood model validation is a global historical geo-database of flood event extents. Currently, the most advanced database of historical flood extent is hosted and maintained at the Dartmouth Flood Observatory (DFO), which has catalogued 4320 floods (1985-2015) but has mapped only 5% of them. We are addressing this data gap by mapping the inventory of floods in the DFO database to create a first-of-its-kind, comprehensive, global and historical geospatial database of flood events. To do so, we combine water detection algorithms on MODIS and Landsat 5, 7, and 8 imagery in Google Earth Engine to map discrete flood events. The created database will be available in the Earth Engine Catalogue for download by country, region, or time period. This dataset can be leveraged for new data-driven hydrologic modeling using machine learning algorithms in Earth Engine's highly parallelized computing environment, and we will show examples for New York and Senegal.
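    The water-detection step described in this record can be illustrated with an index-threshold classifier. The sketch below uses the Normalized Difference Water Index (NDWI) on toy reflectance grids; the band values and the 0.0 threshold are illustrative assumptions, not the pipeline's actual parameters.

```python
import numpy as np

def ndwi_water_mask(green, nir, threshold=0.0):
    """Flag water pixels via NDWI = (green - nir) / (green + nir).

    Values above `threshold` are classified as water. A threshold of
    0.0 is a common starting point, not the study's tuned value.
    """
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    ndwi = (green - nir) / (green + nir + 1e-12)  # guard divide-by-zero
    return ndwi > threshold

# Toy 2x2 scene: only the first pixel reflects like water
# (high green, low near-infrared reflectance).
green = np.array([[0.30, 0.10], [0.08, 0.12]])
nir = np.array([[0.05, 0.40], [0.35, 0.30]])
mask = ndwi_water_mask(green, nir)  # True only at pixel (0, 0)
```

    In Earth Engine the same thresholding would be expressed over image collections rather than arrays, but the per-pixel logic is the same.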

  12. Soil erodibility mapping using the RUSLE model to prioritize erosion control in the Wadi Sahouat basin, North-West of Algeria.

    PubMed

    Toubal, Abderrezak Kamel; Achite, Mohammed; Ouillon, Sylvain; Dehni, Abdelatif

    2018-03-12

    Soil losses must be quantified over watersheds in order to set up protection measures against erosion. The main objective of this paper is to quantify and map soil losses in the Wadi Sahouat basin (2140 km²) in the north-west of Algeria, using the Revised Universal Soil Loss Equation (RUSLE) model assisted by a Geographic Information System (GIS) and remote sensing. The Model Builder of the GIS allowed the automation of the different operations for establishing thematic layers of the model parameters: the erosivity factor (R), the erodibility factor (K), the topographic factor (LS), the crop management factor (C), and the conservation support practice factor (P). The average annual soil loss rate in the Wadi Sahouat basin ranges from 0 to 255 t ha⁻¹ year⁻¹, maximum values being observed over steep slopes of more than 25% and between 600 and 1000 m elevations. 3.4% of the basin is classified as highly susceptible to erosion, 4.9% at medium risk, and 91.6% at low risk. Google Earth imagery shows clear agreement with the mapped zones of erosion sensitivity. Based on the soil loss map, 32 sub-basins were classified into three categories by priority of intervention: high, moderate, and low. This prioritization can sustain a management plan against sediment filling of the Ouizert dam at the basin outlet. The method, combining the RUSLE model with comparison against Google Earth imagery, can be easily adapted to other watersheds.
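    The RUSLE computation itself is a per-cell product of the five factor layers, A = R · K · LS · C · P. A minimal NumPy sketch follows; the factor values and the risk-class break points are illustrative assumptions, not Wadi Sahouat data.

```python
import numpy as np

# RUSLE: A = R * K * LS * C * P, evaluated cell by cell on co-registered
# raster layers. The 1x2 grids below are toy values for illustration.
R  = np.array([[60.0, 80.0]])   # rainfall erosivity
K  = np.array([[0.30, 0.45]])   # soil erodibility
LS = np.array([[1.2,  4.0]])    # topographic (slope length/steepness)
C  = np.array([[0.25, 0.5]])    # crop/cover management
P  = np.array([[1.0,  1.0]])    # conservation support practice

A = R * K * LS * C * P          # soil loss, t ha^-1 year^-1

# Classify into 0=low, 1=medium, 2=high risk; the break values
# (10 and 50 t/ha/yr) are assumptions for the sketch.
risk = np.digitize(A, bins=[10.0, 50.0])
```

    In a GIS workflow each factor would be a full raster derived from rainfall records, soil surveys, a DEM, and land-cover maps, but the cell-wise arithmetic is exactly this product.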

  13. Keemei: cloud-based validation of tabular bioinformatics file formats in Google Sheets.

    PubMed

    Rideout, Jai Ram; Chase, John H; Bolyen, Evan; Ackermann, Gail; González, Antonio; Knight, Rob; Caporaso, J Gregory

    2016-06-13

    Bioinformatics software often requires human-generated tabular text files as input and has specific requirements for how those data are formatted. Users frequently manage these data in spreadsheet programs, which is convenient for researchers who are compiling the requisite information because the spreadsheet programs can easily be used on different platforms including laptops and tablets, and because they provide a familiar interface. It is increasingly common for many different researchers to be involved in compiling these data, including study coordinators, clinicians, lab technicians and bioinformaticians. As a result, many research groups are shifting toward using cloud-based spreadsheet programs, such as Google Sheets, which support the concurrent editing of a single spreadsheet by different users working on different platforms. Most of the researchers who enter data are not familiar with the formatting requirements of the bioinformatics programs that will be used, so validating and correcting file formats is often a bottleneck prior to beginning bioinformatics analysis. We present Keemei, a Google Sheets Add-on, for validating tabular files used in bioinformatics analyses. Keemei is available free of charge from Google's Chrome Web Store. Keemei can be installed and run on any web browser supported by Google Sheets. Keemei currently supports the validation of two widely used tabular bioinformatics formats, the Quantitative Insights into Microbial Ecology (QIIME) sample metadata mapping file format and the Spatially Referenced Genetic Data (SRGD) format, but is designed to easily support the addition of others. Keemei will save researchers time and frustration by providing a convenient interface for tabular bioinformatics file format validation. 
By allowing everyone involved with data entry for a project to easily validate their data, it will reduce the validation and formatting bottlenecks that are commonly encountered when human-generated data files are first used with a bioinformatics system. Simplifying the validation of essential tabular data files, such as sample metadata, will reduce common errors and thereby improve the quality and reliability of research outcomes.
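    The kind of tabular validation Keemei performs can be sketched in a few lines. The checks below (header must start with `#SampleID`, consistent column counts, unique sample IDs) are a simplified subset of QIIME mapping-file rules, written here as an illustrative stand-in rather than Keemei's actual rule set.

```python
def validate_mapping_header(lines):
    """Minimal QIIME-style mapping-file checks; returns error strings.

    Simplified illustration only: real validators enforce many more
    constraints (required columns, allowed characters, etc.).
    """
    errors = []
    header = lines[0].rstrip("\n").split("\t")
    if header[0] != "#SampleID":
        errors.append("first header field must be '#SampleID'")
    seen = set()
    for i, row in enumerate(lines[1:], start=2):
        fields = row.rstrip("\n").split("\t")
        if len(fields) != len(header):
            errors.append(f"line {i}: expected {len(header)} fields, "
                          f"got {len(fields)}")
        if fields[0] in seen:
            errors.append(f"line {i}: duplicate sample ID {fields[0]!r}")
        seen.add(fields[0])
    return errors

good = ["#SampleID\tTreatment\n", "S1\tcontrol\n", "S2\ttest\n"]
bad  = ["SampleID\tTreatment\n", "S1\tcontrol\n", "S1\ttest\n"]
```

    Running the checker on `good` yields no errors; `bad` produces one header error and one duplicate-ID error.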

  14. NeuroVault.org: A repository for sharing unthresholded statistical maps, parcellations, and atlases of the human brain.

    PubMed

    Gorgolewski, Krzysztof J; Varoquaux, Gael; Rivera, Gabriel; Schwartz, Yannick; Sochat, Vanessa V; Ghosh, Satrajit S; Maumet, Camille; Nichols, Thomas E; Poline, Jean-Baptiste; Yarkoni, Tal; Margulies, Daniel S; Poldrack, Russell A

    2016-01-01

    NeuroVault.org is dedicated to storing outputs of analyses in the form of statistical maps, parcellations and atlases, a unique strategy that contrasts with most neuroimaging repositories that store raw acquisition data or stereotaxic coordinates. Such maps are indispensable for performing meta-analyses, validating novel methodology, and deciding on precise outlines for regions of interest (ROIs). NeuroVault is open to maps derived from both healthy and clinical populations, as well as from various imaging modalities (sMRI, fMRI, EEG, MEG, PET, etc.). The repository uses modern web technologies such as interactive web-based visualization, cognitive decoding, and comparison with other maps to provide researchers with efficient, intuitive tools to improve the understanding of their results. Each dataset and map is assigned a permanent Universal Resource Locator (URL), and all of the data is accessible through a REST Application Programming Interface (API). Additionally, the repository supports the NIDM-Results standard and has the ability to parse outputs from popular FSL and SPM software packages to automatically extract relevant metadata. This ease of use, modern web-integration, and pioneering functionality holds promise to improve the workflow for making inferences about and sharing whole-brain statistical maps. Copyright © 2015 Elsevier Inc. All rights reserved.
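    Since every NeuroVault dataset is exposed through a REST API, a client can build endpoint URLs and filter the returned JSON. The sketch below uses the collection-images URL pattern published on neurovault.org and a trimmed, hand-written sample payload standing in for a live response; the exact field names should be checked against the API documentation.

```python
import json

BASE = "https://neurovault.org/api"

def images_url(collection_id, base=BASE):
    """URL listing the statistical maps in one collection."""
    return f"{base}/collections/{collection_id}/images/"

# Illustrative stand-in for a live API response (not real data).
sample_response = json.loads("""
{"count": 2,
 "results": [{"id": 101, "name": "task > baseline", "map_type": "T map"},
             {"id": 102, "name": "group contrast",  "map_type": "Z map"}]}
""")

# Filter the listing down to T maps by their metadata.
t_maps = [img["name"] for img in sample_response["results"]
          if img["map_type"] == "T map"]
```

    A real client would fetch `images_url(...)` over HTTP and page through `results`; only the local parsing step is shown here.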

  15. IgE-Api m 4 Is Useful for Identifying a Particular Phenotype of Bee Venom Allergy.

    PubMed

    Ruiz, B; Serrano, P; Moreno, C

    Different clinical behaviors have been identified in patients allergic to bee venom. Compound-resolved diagnosis could be an appropriate tool for investigating these differences. The aims of this study were to analyze whether specific IgE to Api m 4 (sIgE-Api m 4) can identify a particular kind of bee venom allergy and to describe response to bee venom immunotherapy (bVIT). Prospective study of 31 patients allergic to bee venom who were assigned to phenotype group A (sIgE-Api m 4 <0.98 kU/L), treated with native aqueous (NA) extract, or phenotype group B (sIgE-Api m 4 ≥0.98 kU/L), treated with purified aqueous (PA) extract. Sex, age, cardiovascular risk, severity of preceding sting reaction, exposure to beekeeping, and immunological data (intradermal test, sIgE/sIgG4-Apis-nApi m 1, and sIgE-rApi m 2-Api m 4) were analyzed. Systemic reactions (SRs) during bVIT build-up were analyzed. Immunological and sting challenge outcomes were evaluated in each group after 1 and 2 years of bVIT. Phenotype B patients had more severe reactions (P=.049) and higher skin sensitivity (P=.011), baseline sIgE-Apis (P=.0004), sIgE-nApi m 1 (P=.0004), and sIgG4-Apis (P=.027) than phenotype A patients. Furthermore, 41% of patients in group B experienced SRs during the build-up phase with NA; the sting challenge success rate in this group was 82%. There were no significant reductions in serial intradermal test results, but an intense reduction in sIgE-nApi m 1 (P=.013) and sIgE-Api m 4 (P=.004) was observed after the first year of bVIT. Use of IgE-Api m 4 as the only discrimination criterion demonstrated differences in bee venom allergy. Further investigation with larger populations is necessary.

  16. Chimeras of Bet v 1 and Api g 1 reveal heterogeneous IgE responses in patients with birch pollen allergy

    PubMed Central

    Gepp, Barbara; Lengger, Nina; Bublin, Merima; Hemmer, Wolfgang; Breiteneder, Heimo; Radauer, Christian

    2014-01-01

    Background Characterization of IgE-binding epitopes of allergens and determination of their patient-specific relevance is crucial for the diagnosis and treatment of allergy. Objective We sought to assess the contribution of specific surface areas of the major birch pollen allergen Bet v 1.0101 to binding IgE of individual patients. Methods Four distinct areas of Bet v 1 representing in total 81% of its surface were grafted onto the scaffold of its homolog, Api g 1.0101, to yield the chimeras Api-Bet-1 to Api-Bet-4. The chimeras were expressed in Escherichia coli and purified. IgE binding of 64 sera from Bet v 1–sensitized subjects with birch pollen allergy was determined by using direct ELISA. Specificity was assessed by means of inhibition ELISA. Results rApi g 1.0101, Api-Bet-1, Api-Bet-2, Api-Bet-3, and Api-Bet-4 bound IgE from 44%, 89%, 80%, 78%, and 48% of the patients, respectively. By comparing the amount of IgE binding to the chimeras and to rApi g 1.0101, 81%, 70%, 75%, and 45% of the patients showed significantly enhanced IgE binding to Api-Bet-1, Api-Bet-2, Api-Bet-3, and Api-Bet-4, respectively. The minority (8%) of the sera revealed enhanced IgE binding exclusively to a single chimera, whereas 31% showed increased IgE binding to all 4 chimeras compared with rApi g 1.0101. The chimeras inhibited up to 70% of IgE binding to rBet v 1.0101, confirming the specific IgE recognition of the grafted regions. Conclusion The Bet v 1–specific IgE response is polyclonal, and epitopes are spread across the entire Bet v 1 surface. Furthermore, the IgE recognition profile of Bet v 1 is highly patient specific. PMID:24529686

  17. Predominant Api m 10 sensitization as risk factor for treatment failure in honey bee venom immunotherapy.

    PubMed

    Frick, Marcel; Fischer, Jörg; Helbling, Arthur; Ruëff, Franziska; Wieczorek, Dorothea; Ollert, Markus; Pfützner, Wolfgang; Müller, Sabine; Huss-Marp, Johannes; Dorn, Britta; Biedermann, Tilo; Lidholm, Jonas; Ruecker, Gerta; Bantleon, Frank; Miehe, Michaela; Spillner, Edzard; Jakob, Thilo

    2016-12-01

    Component resolution recently identified distinct sensitization profiles in honey bee venom (HBV) allergy, some of which were dominated by specific IgE to Api m 3 and/or Api m 10, which have been reported to be underrepresented in therapeutic HBV preparations. We performed a retrospective analysis of component-resolved sensitization profiles in HBV-allergic patients and association with treatment outcome. HBV-allergic patients who had undergone controlled honey bee sting challenge after at least 6 months of HBV immunotherapy (n = 115) were included and classified as responder (n = 79) or treatment failure (n = 36) on the basis of absence or presence of systemic allergic reactions upon sting challenge. IgE reactivity to a panel of HBV allergens was analyzed in sera obtained before immunotherapy and before sting challenge. No differences were observed between responders and nonresponders regarding levels of IgE sensitization to Api m 1, Api m 2, Api m 3, and Api m 5. In contrast, Api m 10 specific IgE was moderately but significantly increased in nonresponders. Predominant Api m 10 sensitization (>50% of specific IgE to HBV) was the best discriminator (specificity, 95%; sensitivity, 25%) with an odds ratio of 8.444 (2.127-33.53; P = .0013) for treatment failure. Some but not all therapeutic HBV preparations displayed a lack of Api m 10, whereas Api m 1 and Api m 3 immunoreactivity was comparable to that of crude HBV. In line with this, significant Api m 10 sIgG 4 induction was observed only in those patients who were treated with HBV in which Api m 10 was detectable. Component-resolved sensitization profiles in HBV allergy suggest predominant IgE sensitization to Api m 10 as a risk factor for treatment failure in HBV immunotherapy. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Chimeras of Bet v 1 and Api g 1 reveal heterogeneous IgE responses in patients with birch pollen allergy.

    PubMed

    Gepp, Barbara; Lengger, Nina; Bublin, Merima; Hemmer, Wolfgang; Breiteneder, Heimo; Radauer, Christian

    2014-07-01

    Characterization of IgE-binding epitopes of allergens and determination of their patient-specific relevance is crucial for the diagnosis and treatment of allergy. We sought to assess the contribution of specific surface areas of the major birch pollen allergen Bet v 1.0101 to binding IgE of individual patients. Four distinct areas of Bet v 1 representing in total 81% of its surface were grafted onto the scaffold of its homolog, Api g 1.0101, to yield the chimeras Api-Bet-1 to Api-Bet-4. The chimeras were expressed in Escherichia coli and purified. IgE binding of 64 sera from Bet v 1-sensitized subjects with birch pollen allergy was determined by using direct ELISA. Specificity was assessed by means of inhibition ELISA. rApi g 1.0101, Api-Bet-1, Api-Bet-2, Api-Bet-3, and Api-Bet-4 bound IgE from 44%, 89%, 80%, 78%, and 48% of the patients, respectively. By comparing the amount of IgE binding to the chimeras and to rApi g 1.0101, 81%, 70%, 75%, and 45% of the patients showed significantly enhanced IgE binding to Api-Bet-1, Api-Bet-2, Api-Bet-3, and Api-Bet-4, respectively. The minority (8%) of the sera revealed enhanced IgE binding exclusively to a single chimera, whereas 31% showed increased IgE binding to all 4 chimeras compared with rApi g 1.0101. The chimeras inhibited up to 70% of IgE binding to rBet v 1.0101, confirming the specific IgE recognition of the grafted regions. The Bet v 1-specific IgE response is polyclonal, and epitopes are spread across the entire Bet v 1 surface. Furthermore, the IgE recognition profile of Bet v 1 is highly patient specific. Copyright © 2014 The Authors. Published by Mosby, Inc. All rights reserved.

  19. Synthetic environments

    NASA Astrophysics Data System (ADS)

    Lukes, George E.; Cain, Joel M.

    1996-02-01

    The Advanced Distributed Simulation (ADS) Synthetic Environments Program seeks to create robust virtual worlds from operational terrain and environmental data sources of sufficient fidelity and currency to interact with the real world. While some applications can be met by direct exploitation of standard digital terrain data, more demanding applications -- particularly those supporting operations 'close to the ground' -- are well-served by emerging capabilities for 'value-adding' by the user working with controlled imagery. For users to rigorously refine and exploit controlled imagery within functionally different workstations they must have a shared framework to allow interoperability within and between these environments in terms of passing image and object coordinates and other information using a variety of validated sensor models. The Synthetic Environments Program is now being expanded to address rapid construction of virtual worlds with research initiatives in digital mapping, softcopy workstations, and cartographic image understanding. The Synthetic Environments Program is also participating in a joint initiative for a sensor model application programming interface (API) to ensure that a common controlled imagery exploitation framework is available to all researchers, developers and users. This presentation provides an introduction to ADS and the associated requirements for synthetic environments to support synthetic theaters of war. It provides a technical rationale for exploring applications of image understanding technology to automated cartography in support of ADS and related programs benefiting from automated analysis of mapping, earth resources and reconnaissance imagery. And it provides an overview and status of the joint initiative for a sensor model API.

  20. Next-generation small RNA sequencing for microRNAs profiling in the honey bee Apis mellifera.

    PubMed

    Chen, X; Yu, X; Cai, Y; Zheng, H; Yu, D; Liu, G; Zhou, Q; Hu, S; Hu, F

    2010-12-01

    MicroRNAs (miRNAs) are key regulators in various physiological and pathological processes via post-transcriptional regulation of gene expression. The honey bee (Apis mellifera) is a key model for highly social species, and its complex social behaviour can be interpreted theoretically as changes in gene regulation, in which miRNAs are thought to be involved. We used the SOLiD sequencing system to identify the repertoire of miRNAs in the honey bee by sequencing a mixed small RNA library from different developmental stages. We obtained a total of 36,796,459 raw sequences; of which 5,491,100 short sequences were fragments of mRNA and other noncoding RNAs (ncRNA), and 1,759,346 reads mapped to the known miRNAs. We predicted 267 novel honey bee miRNAs representing 380,182 short reads, including eight miRNAs of other insects in 14,107,583 genome-mapped sequences. We verified 50 of them using stem-loop reverse-transcription PCR (RT-PCR), in which 35 yielded PCR products. Cross-species analyses showed 81 novel miRNAs with homologues in other insects, suggesting that they were authentic miRNAs and have similar functions. The results of this study provide a basis for studies of the miRNA-modulating networks in development and some intriguing phenomena such as caste differentiation in A. mellifera. © 2010 The Authors. Insect Molecular Biology © 2010 The Royal Entomological Society.

  1. Temporal and spatial behavior of pharmaceuticals in ...

    EPA Pesticide Factsheets

    The behavior of active pharmaceutical ingredients (APIs) in urban estuaries is not well understood. In this study, 15 high volume usage APIs were measured over a one year period throughout Narragansett Bay, RI, USA to determine factors controlling their concentration and distribution. Dissolved APIs ranged in concentration from not detected to 310 ng/L, with numerous APIs present at all sites and sampling periods. Eight APIs were present in suspended particulate material, ranging in concentration from <1 ng/g to 44 ng/g. Partitioning coefficients (Kds) were determined for APIs present in both the dissolved and particulate phases, with their range and variability remaining relatively constant during the study. Organic carbon normalization reduced the observed variability of several APIs to a small extent; however, other factors appear to play a role in controlling partitioning behavior. The continuous discharge of wastewater treatment plant effluents into upper Narragansett Bay resulted in sustained API levels, creating a zone of “pseudo-persistence.” For most of the APIs, there was a strong relationship with salinity, indicating conservative behavior within the estuary. Short flushing times in Narragansett Bay coupled with APIs present primarily in the dissolved phase suggests that most APIs will be diluted and transported out of the estuary, with only small amounts of several compounds removed to and sequestered in sediments. This study ide
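    The partitioning coefficient used in this record is the ratio of particulate to dissolved concentration, Kd = C_particulate / C_dissolved, optionally normalized by the organic-carbon fraction (Koc = Kd / f_oc). The numbers below are illustrative, taken only from the concentration ranges quoted above, not actual paired measurements from the study.

```python
def kd(c_particulate_ng_g, c_dissolved_ng_L):
    """Partitioning coefficient in L/g: (ng/g) / (ng/L)."""
    return c_particulate_ng_g / c_dissolved_ng_L

def koc(kd_value, f_oc):
    """Organic-carbon-normalized Kd; f_oc is the mass fraction of
    organic carbon in the suspended particles (assumed here)."""
    return kd_value / f_oc

# Illustrative endpoints of the reported ranges: 44 ng/g particulate,
# 310 ng/L dissolved, with an assumed 4% organic-carbon content.
kd_val = kd(44.0, 310.0)     # ~0.142 L/g
koc_val = koc(kd_val, 0.04)
```

    Normalizing by f_oc is what the study means by "organic carbon normalization": if sorption were driven purely by organic carbon, Koc would vary less across sites than Kd does.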

  2. 49 CFR 195.565 - How do I install cathodic protection on breakout tanks?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...

  3. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...

  4. 49 CFR 195.565 - How do I install cathodic protection on breakout tanks?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...

  5. 49 CFR 195.565 - How do I install cathodic protection on breakout tanks?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...

  6. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...

  7. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...

  8. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...

  9. 78 FR 48738 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-09

    ... depend upon the Application Programming Interface (``API'') a Permit Holder is using.\\4\\ Currently, the Exchange offers two APIs: CBOE Market Interface (``CMi'') API and Financial Information eXchange (``FIX... available APIs, and if applicable, which version, it would like to use. \\4\\ An API is a computer interface...

  10. 49 CFR 195.565 - How do I install cathodic protection on breakout tanks?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...

  11. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...

  12. 49 CFR 195.565 - How do I install cathodic protection on breakout tanks?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...

  13. Near infrared and Raman spectroscopy as Process Analytical Technology tools for the manufacturing of silicone-based drug reservoirs.

    PubMed

    Mantanus, J; Rozet, E; Van Butsele, K; De Bleye, C; Ceccato, A; Evrard, B; Hubert, Ph; Ziémons, E

    2011-08-05

    Using near infrared (NIR) and Raman spectroscopy as PAT tools, 3 critical quality attributes of a silicone-based drug reservoir were studied. First, the Active Pharmaceutical Ingredient (API) homogeneity in the reservoir was evaluated using Raman spectroscopy (mapping): the API distribution within the industrial drug reservoirs was found to be homogeneous while API aggregates were detected in laboratory scale samples manufactured with a non-optimal mixing process. Second, the crosslinking process of the reservoirs was monitored at different temperatures with NIR spectroscopy. Conformity tests and Principal Component Analysis (PCA) were performed on the collected data to find out the relation between the temperature and the time necessary to reach the crosslinking endpoints. An agreement was found between the conformity test results and the PCA results. Compared to the conformity test method, PCA had the advantage of discriminating the heating effect from the crosslinking effect, which occur together during the monitored process. Therefore the 2 approaches were found to be complementary. Third, based on the HPLC reference method, a NIR model able to quantify the API in the drug reservoir was developed and thoroughly validated. Partial Least Squares (PLS) regression on the calibration set was performed to build prediction models, whose ability to quantify accurately was tested with the external validation set. The 1.2% Root Mean Squared Error of Prediction (RMSEP) of the NIR model indicated the global accuracy of the model. The accuracy profile based on tolerance intervals was used to generate a complete validation report. The 95% tolerance interval calculated on the validation results indicated that each future result will have a relative error below ±5% with a probability of at least 95%. In conclusion, 3 critical quality attributes of silicone-based drug reservoirs were quickly and efficiently evaluated by NIR and Raman spectroscopy.
Copyright © 2011 Elsevier B.V. All rights reserved.
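    The calibrate-then-validate pattern behind the RMSEP figure can be sketched with NumPy. The study used PLS regression; for a self-contained example this sketch substitutes ordinary least squares on synthetic "spectra" (the data, noise level, and train/test split are all assumptions), but the external-validation step, RMSEP, is computed the same way.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 40 samples x 6 wavelengths, API content y (% w/w).
# Absorbance is proportional to content plus measurement noise.
y = rng.uniform(8.0, 12.0, size=40)
X = np.outer(y, rng.uniform(0.5, 1.5, size=6)) + rng.normal(0, 0.05, (40, 6))

# External validation split: calibrate on 30 samples, predict 10.
train, test = slice(0, 30), slice(30, 40)
coef, *_ = np.linalg.lstsq(np.c_[X[train], np.ones(30)],
                           y[train], rcond=None)
pred = np.c_[X[test], np.ones(10)] @ coef

# Root Mean Squared Error of Prediction on the held-out set,
# also expressed relative to the mean content (cf. the 1.2% reported).
rmsep = np.sqrt(np.mean((pred - y[test]) ** 2))
rel_rmsep = 100.0 * rmsep / y[test].mean()
```

    Swapping the least-squares fit for a PLS model changes only the calibration step; RMSEP on an untouched validation set remains the accuracy criterion.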

  14. Effects of Spatial Ability, Gender Differences, and Pictorial Training on Children Using 2-D and 3-D Environments to Recall Landmark Locations from Memory

    ERIC Educational Resources Information Center

    Kopcha, Theodore J.; Otumfuor, Beryl A.; Wang, Lu

    2015-01-01

    This study examines the effects of spatial ability, gender differences, and pictorial training on fourth grade students' ability to recall landmark locations from memory. Ninety-six students used Google Earth over a 3-week period to locate landmarks (3-D) and mark their location on a 2-D topographical map. Analysis of covariance on posttest scores…

  15. VizieR Online Data Catalog: Orion Integral Filament ALMA+IRAM30m N2H+(1-0) data (Hacar+, 2018)

    NASA Astrophysics Data System (ADS)

    Hacar, A.; Tafalla, M.; Forbrich, J.; Alves, J.; Meingast, S.; Grossschedl, J.; Teixeira, P. S.

    2018-01-01

    Combined ALMA+IRAM30m large-scale N2H+(1-0) emission in the Orion ISF. Two datasets are presented here in FITS format: (1) the full data cube (spectral resolution = 0.1 km s⁻¹) and (2) the total integrated line intensity (moment 0) map. Units are in Jy/beam. See also: https://sites.google.com/site/orion4dproject/home (2 data files).
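    The moment-0 map in this catalog is the data cube collapsed along its velocity axis (integrated intensity, channel sum times channel width). A minimal sketch on a synthetic cube follows; the cube contents and shape are made up, while the 0.1 km/s channel width matches the catalog entry.

```python
import numpy as np

dv = 0.1                        # channel width, km/s (as in the catalog)
cube = np.zeros((20, 4, 4))     # synthetic (channel, y, x) cube, Jy/beam
cube[8:12, 1, 2] = 1.0          # a 4-channel-wide line at one pixel

# Moment 0: integrate intensity over velocity -> Jy/beam km/s per pixel.
mom0 = cube.sum(axis=0) * dv
```

    Real cubes from FITS files would be loaded with an astronomy library (e.g. astropy) before the same axis-0 integration.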

  16. Nominal 30-M Cropland Extent Map of Continental Africa by Integrating Pixel-Based and Object-Based Algorithms Using Sentinel-2 and Landsat-8 Data on Google Earth Engine

    NASA Technical Reports Server (NTRS)

    Xiong, Jun; Thenkabail, Prasad S.; Tilton, James C.; Gumma, Murali K.; Teluguntla, Pardhasaradhi; Oliphant, Adam; Congalton, Russell G.; Yadav, Kamini; Gorelick, Noel

    2017-01-01

    A satellite-derived cropland extent map at high spatial resolution (30-m or better) is a must for food and water security analysis. Precise and accurate global cropland extent maps, indicating cropland and non-cropland areas, are a starting point to develop high-level products such as crop watering methods (irrigated or rainfed), cropping intensities (e.g., single, double, or continuous cropping), crop types, cropland fallows, as well as assessment of cropland productivity (productivity per unit of land), and crop water productivity (productivity per unit of water). Uncertainties associated with the cropland extent map have cascading effects on all higher-level cropland products. However, precise and accurate cropland extent maps at high spatial resolution over large areas (e.g., continents or the globe) are challenging to produce due to the small-holder-dominated agricultural systems like those found in most of Africa and Asia. Cloud-based geospatial computing platforms and multi-date, multi-sensor satellite image inventories on Google Earth Engine offer opportunities for mapping croplands with precision and accuracy over large areas that satisfy the requirements of a broad range of applications. Such maps are expected to provide highly significant improvements compared to existing products, which tend to be coarser in resolution, and often fail to capture fragmented small-holder farms especially in regions with high dynamic change within and across years. To overcome these limitations, in this research we present an approach for cropland extent mapping at high spatial resolution (30-m or better) using the 10-day, 10 to 20-m, Sentinel-2 data in combination with 16-day, 30-m, Landsat-8 data on Google Earth Engine (GEE). First, nominal 30-m resolution satellite imagery composites were created from 36,924 scenes of Sentinel-2 and Landsat-8 images for the entire African continent in 2015-2016.
These composites were generated as a median mosaic of five bands (blue, green, red, near-infrared, NDVI) for each of two periods (period 1: January-June 2016; period 2: July-December 2015), plus a 30-m slope layer derived from the Shuttle Radar Topography Mission (SRTM) elevation dataset. Second, cropland/non-cropland training samples (sample size 9,791) were selected from various sources in GEE to create pixel-based classifications. Random Forest (RF) was used as the primary supervised classifier because of its efficiency; where RF over-fitted due to noisy training data, a Support Vector Machine (SVM) was applied instead to compensate in those specific areas. Third, the Recursive Hierarchical Segmentation (RHSeg) algorithm was employed to generate an object-oriented segmentation layer from the spectral and spatial properties of the same input data; this layer was merged with the pixel-based classification to improve segmentation accuracy. Accuracies of the merged 30-m cropland extent product were computed using an error-matrix approach with 1,754 independent validation samples. In addition, the product was compared with other available cropland maps as well as with land-use/land-cover (LULC) maps to assess spatial similarity, and the cropland area derived from the map was compared with UN FAO statistics. The independent accuracy assessment showed a weighted overall accuracy of 94%, with a producer's accuracy of 85.9% (omission error of 14.1%) and a user's accuracy of 68.5% (commission error of 31.5%) for the cropland class. The total net cropland area (TNCA) of Africa was estimated at 313 Mha for the nominal year 2015.
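
    The two-stage classification described above (Random Forest as the primary classifier, with an SVM fallback for areas where RF over-fits) can be sketched with scikit-learn. The feature layout, synthetic labels, and fallback threshold below are illustrative assumptions, not the study's actual configuration:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Hypothetical per-pixel feature table: 11 columns (blue, green, red,
    # NIR, NDVI for each of the two compositing periods, plus SRTM slope).
    X = rng.normal(size=(2000, 11))
    # Synthetic cropland (1) / non-cropland (0) labels, for illustration only.
    y = (X[:, 4] + X[:, 9] + rng.normal(scale=0.5, size=2000) > 0).astype(int)
    X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

    # Primary classifier: Random Forest.
    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    rf.fit(X_train, y_train)
    rf_acc = rf.score(X_val, y_val)

    # Fallback: where RF generalizes poorly on held-out samples (a sign of
    # over-fitting to noisy training data), train an RBF-kernel SVM for
    # that area instead. The 0.80 threshold is an invented placeholder.
    classifier = rf
    if rf_acc < 0.80:
        classifier = SVC(kernel="rbf").fit(X_train, y_train)

    predicted = classifier.predict(X_val)
    ```

    In the study this per-area selection was made over the continent's classification zones; the sketch shows only the mechanism of swapping classifiers based on validation performance.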
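
    The producer's and user's accuracies quoted above follow directly from an error (confusion) matrix. As a minimal sketch, with invented class counts rather than the study's actual validation tallies, the quantities are computed as:

    ```python
    import numpy as np

    # Rows = reference (ground truth), columns = map classification.
    # Classes: 0 = non-cropland, 1 = cropland. Counts are hypothetical.
    error_matrix = np.array([
        [1200,  90],   # reference non-cropland
        [  60, 404],   # reference cropland
    ])

    diag = np.diag(error_matrix)
    # Producer's accuracy: correctly mapped fraction of each reference
    # class (1 - omission error).
    producers = diag / error_matrix.sum(axis=1)
    # User's accuracy: correct fraction of each mapped class
    # (1 - commission error).
    users = diag / error_matrix.sum(axis=0)
    overall = diag.sum() / error_matrix.sum()

    print(f"producer's accuracy (cropland): {producers[1]:.3f}")
    print(f"user's accuracy (cropland):     {users[1]:.3f}")
    print(f"omission error (cropland):      {1 - producers[1]:.3f}")
    print(f"commission error (cropland):    {1 - users[1]:.3f}")
    print(f"overall accuracy:               {overall:.3f}")
    ```

    The study additionally weighted the overall accuracy; the unweighted version above shows only the basic error-matrix arithmetic.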

  17. 49 CFR 194.105 - Worst case discharge.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...: Prevention measure | Standard | Credit (percent) — Secondary containment >100% | NFPA 30 | 50; Built/repaired to API standards | API STD 620/650/653 | 10; Overfill protection standards | API RP 2350 | 5; Testing/cathodic protection | API...

  18. 49 CFR 194.105 - Worst case discharge.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...: Prevention measure | Standard | Credit (percent) — Secondary containment >100% | NFPA 30 | 50; Built/repaired to API standards | API STD 620/650/653 | 10; Overfill protection standards | API RP 2350 | 5; Testing/cathodic protection | API...

  19. 49 CFR 194.105 - Worst case discharge.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...: Prevention measure | Standard | Credit (percent) — Secondary containment >100% | NFPA 30 | 50; Built/repaired to API standards | API STD 620/650/653 | 10; Overfill protection standards | API RP 2350 | 5; Testing/cathodic protection | API...

  20. 49 CFR 194.105 - Worst case discharge.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...: Prevention measure | Standard | Credit (percent) — Secondary containment >100% | NFPA 30 | 50; Built/repaired to API standards | API STD 620/650/653 | 10; Overfill protection standards | API RP 2350 | 5; Testing/cathodic protection | API...
