Sample records for global digital database

  1. Global Access to Library of Congress' Digital Resources: National Digital Library and Internet Resources.

    ERIC Educational Resources Information Center

    Chen, Ching-chih

    1996-01-01

    Summarizes how the Library of Congress' digital library collections can be accessed globally via the Internet and World Wide Web. Outlines the resources found in each of the various access points: gopher, online catalog, library and legislative Web sites, legal and copyright databases, and FTP (file transfer protocol) sites. (LAM)

  2. Global GIS database; digital atlas of South Pacific

    USGS Publications Warehouse

    Hearn, P.P.; Hare, T.M.; Schruben, P.; Sherrill, D.; LaMar, C.; Tsushima, P.

    2001-01-01

    This CD-ROM contains a digital atlas of the countries of the South Pacific. This atlas is part of a global database compiled from USGS and other data sources at a nominal scale of 1:1 million and is intended to be used as a regional-scale reference and analytical tool by government officials, researchers, the private sector, and the general public. The atlas includes free GIS software or may be used with ESRI's ArcView software. Customized ArcView tools, specifically designed to make the atlas easier to use, are also included.

  3. Global GIS database; digital atlas of Africa

    USGS Publications Warehouse

    Hearn, P.P.; Hare, T.M.; Schruben, P.; Sherrill, D.; LaMar, C.; Tsushima, P.

    2001-01-01

    This CD-ROM contains a digital atlas of the countries of Africa. This atlas is part of a global database compiled from USGS and other data sources at a nominal scale of 1:1 million and is intended to be used as a regional-scale reference and analytical tool by government officials, researchers, the private sector, and the general public. The atlas includes free GIS software or may be used with ESRI's ArcView software. Customized ArcView tools, specifically designed to make this atlas easier to use, are also included.

  4. Global GIS database; digital atlas of South Asia

    USGS Publications Warehouse

    Hearn, P.P.; Hare, T.M.; Schruben, P.; Sherrill, D.; LaMar, C.; Tsushima, P.

    2001-01-01

    This CD-ROM contains a digital atlas of the countries of South Asia. This atlas is part of a global database compiled from USGS and other data sources at a nominal scale of 1:1 million and is intended to be used as a regional-scale reference and analytical tool by government officials, researchers, the private sector, and the general public. The atlas includes free GIS software or may be used with ESRI's ArcView software. Customized ArcView tools, specifically designed to make the atlas easier to use, are also included.

  5. Hydrologic Derivatives for Modeling and Analysis—A new global high-resolution database

    USGS Publications Warehouse

    Verdin, Kristine L.

    2017-07-17

    The U.S. Geological Survey has developed a new global high-resolution hydrologic derivative database. Loosely modeled on the HYDRO1k database, this new database, entitled Hydrologic Derivatives for Modeling and Analysis, provides comprehensive and consistent global coverage of topographically derived raster layers (digital elevation model data, flow direction, flow accumulation, slope, and compound topographic index) and vector layers (streams and catchment boundaries). The coverage of the data is global, and the underlying digital elevation model is a hybrid of three datasets: HydroSHEDS (Hydrological data and maps based on SHuttle Elevation Derivatives at multiple Scales), GMTED2010 (Global Multi-resolution Terrain Elevation Data 2010), and the SRTM (Shuttle Radar Topography Mission). For most of the globe south of 60°N., the raster resolution of the data is 3 arc-seconds, corresponding to the resolution of the SRTM. For the areas north of 60°N., the resolution is 7.5 arc-seconds (the highest resolution of the GMTED2010 dataset) except for Greenland, where the resolution is 30 arc-seconds. The streams and catchments are attributed with Pfafstetter codes, based on a hierarchical numbering system, that carry important topological information. This database is appropriate for use in continental-scale modeling efforts. The work described in this report was conducted by the U.S. Geological Survey in cooperation with the National Aeronautics and Space Administration Goddard Space Flight Center.
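The Pfafstetter codes mentioned above carry topology in the digits themselves: whether one catchment lies downstream of another can be decided from the two codes alone. A minimal sketch of that test (not part of the database tooling; it assumes equal-length codes without padding zeros):

```python
def is_downstream(a: str, b: str) -> bool:
    """True if the catchment coded a lies downstream of the catchment coded b."""
    for i, (da, db) in enumerate(zip(a, b)):
        if da != db:
            # a carries b's flow only if a sits on the main stem below b:
            # its digit is odd (an interbasin) and lower (closer to the outlet),
            # and all of a's remaining digits are odd (main stem all the way down).
            return int(da) % 2 == 1 and da < db and all(int(d) % 2 == 1 for d in a[i + 1:])
    return False  # identical codes: same unit, not downstream of itself

print(is_downstream("8833", "8835"))  # True: interbasin 3 lies below interbasin 5
print(is_downstream("8834", "8835"))  # False: unit 4 is a tributary, not on the main stem
```

The even/odd convention (tributaries even, main-stem interbasins odd, numbered upward from the outlet) is what makes the comparison purely digit-wise.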

  6. Global GIS database; digital atlas of Central and South America

    USGS Publications Warehouse

    Hearn, Paul P.; Hare, T.; Schruben, P.; Sherrill, D.; LaMar, C.; Tsushima, P.

    2000-01-01

    This CD-ROM contains a digital atlas of the countries of Central and South America. This atlas is part of a global database compiled from USGS and other data sources at the nominal scale of 1:1 million and is intended to be used as a regional-scale reference and analytical tool by government officials, researchers, the private sector, and the general public. The atlas includes free GIS software or may also be used with ESRI's ArcView software. Customized ArcView tools, specifically designed to make the atlas easier to use, are also included. The atlas contains the following datasets: country political boundaries, digital shaded relief map, elevation, slope, hydrology, locations of cities and towns, airfields, roads, railroads, utility lines, population density, geology, ecological regions, historical seismicity, volcanoes, ore deposits, oil and gas fields, climate data, landcover, vegetation index, and lights at night.

  7. A Global Digital Database and Atlas of Quaternary Dune Fields and Sand Seas

    NASA Astrophysics Data System (ADS)

    Lancaster, N.; Halfen, A. F.

    2012-12-01

    Sand seas and dune fields are globally significant sedimentary deposits, which archive the effects of climate and sea level change on a variety of temporal and spatial scales. Dune systems provide a valuable source of information on past climate conditions, including evidence for periods of aridity and unique data on past wind regimes. Researchers have compiled vast quantities of geomorphic and chronological data from these dune systems for nearly half a century; however, these data remain disconnected, making comparisons of dune systems challenging at global and regional scales. The primary goal of this project is to develop a global digital database of chronologic information for periods of desert sand dune accumulation and stabilization, as well as pertinent stratigraphic and geomorphic information. This database can then be used by scientists to 1) document the history of aeolian processes in arid regions, with emphasis on dune systems in low- and mid-latitude deserts, 2) correlate periods of sand accumulation and stability with other terrestrial and marine paleoclimatic proxies and records, and 3) develop an improved understanding of the response of dune systems to climate change. The database currently resides in Microsoft Access format, which allows searching and filtering of data. The database includes four linked tables containing information on the site, chronological control (radiocarbon or luminescence), and the pertinent literature citations. Thus far the database contains information for 838 sites worldwide, comprising 2598 luminescence and radiocarbon ages, though these numbers increase regularly as new data are added. The database is only available on request at this time; however, an online GIS database is being developed and will be available in the near future. Data outputs from the online database will include PDF reports and Google Earth formatted data sets for quick viewing of data. Additionally, data will be available in a gridded format for wider use in data-model comparisons. [Figure: sites in database, August 2012.]
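The linked-table design described above (site, chronological control, literature citation) can be sketched in SQLite; all table and column names here are hypothetical illustrations, not the actual schema of the Access database:

```python
import sqlite3

# Hypothetical sketch of a linked-table layout: sites, dated samples, citations.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE site (site_id INTEGER PRIMARY KEY, name TEXT, lat REAL, lon REAL);
CREATE TABLE citation (cite_id INTEGER PRIMARY KEY, reference TEXT);
CREATE TABLE age (
    age_id  INTEGER PRIMARY KEY,
    site_id INTEGER REFERENCES site(site_id),
    cite_id INTEGER REFERENCES citation(cite_id),
    method  TEXT CHECK (method IN ('radiocarbon', 'luminescence')),
    age_ka  REAL);
""")
con.execute("INSERT INTO site VALUES (1, 'Nebraska Sand Hills', 41.9, -101.5)")
con.execute("INSERT INTO citation VALUES (1, 'Example et al. (2010)')")
con.execute("INSERT INTO age VALUES (1, 1, 1, 'luminescence', 12.4)")

# Filtering by chronological-control method, as the searchable database allows:
rows = con.execute("""SELECT s.name, a.age_ka FROM age a
                      JOIN site s USING (site_id)
                      WHERE a.method = 'luminescence'""").fetchall()
print(rows)
```

The foreign keys are what let one site accumulate many ages drawn from many publications without duplicating site records.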

  8. Local Places, Global Connections: Libraries in the Digital Age. What's Going On Series.

    ERIC Educational Resources Information Center

    Benton Foundation, Washington, DC.

    Libraries have long been pivotal community institutions--public spaces where people can come together to learn, reflect, and interact. Today, information is rapidly spreading beyond books and journals to digital government archives, business databases, electronic sound and image collections, and the flow of electronic impulses over computer…

  9. Abstracts of SIG Sessions.

    ERIC Educational Resources Information Center

    Proceedings of the ASIS Annual Meeting, 1997

    1997-01-01

    Presents abstracts of SIG Sessions. Highlights include digital collections; information retrieval methods; public interest/fair use; classification and indexing; electronic publication; funding; globalization; information technology projects; interface design; networking in developing countries; metadata; multilingual databases; networked…

  10. Digital map databases in support of avionic display systems

    NASA Astrophysics Data System (ADS)

    Trenchard, Michael E.; Lohrenz, Maura C.; Rosche, Henry, III; Wischow, Perry B.

    1991-08-01

    The emergence of computerized mission planning systems (MPS) and airborne digital moving map systems (DMS) has necessitated the development of a global database of raster aeronautical chart data specifically designed for input to these systems. The Naval Oceanographic and Atmospheric Research Laboratory's (NOARL) Map Data Formatting Facility (MDFF) is presently dedicated to supporting these avionic display systems with the development of the Compressed Aeronautical Chart (CAC) database on Compact Disk Read Only Memory (CD-ROM) optical discs. The MDFF is also developing a series of aircraft-specific Write-Once Read Many (WORM) optical discs. NOARL has initiated a comprehensive research program aimed at improving the pilots' moving map displays; current research efforts include the development of an alternate image compression technique and generation of a standard set of color palettes. The CAC database will provide digital aeronautical chart data in six different scales. CAC is derived from the Defense Mapping Agency's (DMA) Equal Arc-second (ARC) Digitized Raster Graphics (ADRG), a series of scanned aeronautical charts. NOARL processes ADRG to tailor the chart image resolution to that of the DMS display while reducing storage requirements through image compression techniques. CAC is being distributed by DMA as a library of CD-ROMs.

  11. Long-Range Atmosphere-Ocean Forecasting in Support of Undersea Warfare Operations in the Western North Pacific

    DTIC Science & Technology

    2009-09-01

    …uses an LTM-based, global ocean climatology database called the Generalized Digital Environment Model (GDEM), in tactical decision aid (TDA) software, such…environment for USW planning. GDEM climatology is derived using temperature and salinity profiles from the Modular Ocean Data Assimilation System

  12. Development of a global land cover characteristics database and IGBP DISCover from 1 km AVHRR data

    USGS Publications Warehouse

    Loveland, Thomas R.; Reed, B.C.; Brown, Jesslyn F.; Ohlen, D.O.; Zhu, Z.; Yang, L.; Merchant, J.W.

    2000-01-01

    Researchers from the U.S. Geological Survey, the University of Nebraska-Lincoln and the European Commission's Joint Research Centre, Ispra, Italy, produced a 1 km resolution global land cover characteristics database for use in a wide range of continental- to global-scale environmental studies. This database provides a unique view of the broad patterns of the biogeographical and ecoclimatic diversity of the global land surface, and presents a detailed interpretation of the extent of human development. The project was carried out as an International Geosphere-Biosphere Programme, Data and Information Systems (IGBP-DIS) initiative. The IGBP DISCover global land cover product is an integral component of the global land cover database. DISCover includes 17 general land cover classes defined to meet the needs of IGBP core science projects. A formal accuracy assessment of the DISCover data layer will be completed in 1998. The 1 km global land cover database was developed through a continent-by-continent unsupervised classification of 1 km monthly Advanced Very High Resolution Radiometer (AVHRR) Normalized Difference Vegetation Index (NDVI) composites covering 1992-1993. Extensive post-classification stratification was necessary to resolve spectral/temporal confusion between disparate land cover types. The complete global database consists of 961 seasonal land cover regions that capture patterns of land cover, seasonality and relative primary productivity. The seasonal land cover regions were aggregated to produce seven separate land cover data sets used for global environmental modelling and assessment. The data sets include IGBP DISCover, U.S. Geological Survey Anderson System, Simple Biosphere Model, Simple Biosphere Model 2, Biosphere-Atmosphere Transfer Scheme, Olson Ecosystems and Running Global Remote Sensing Land Cover. The database also includes all digital sources that were used in the classification.
The complete database can be sourced from the website: http://edcwww.cr.usgs.gov/landdaac/glcc/glcc.html.
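The NDVI underlying those monthly composites is a simple band ratio, (NIR − red) / (NIR + red); a minimal illustration with made-up reflectance values (not actual AVHRR data):

```python
import numpy as np

# NDVI = (NIR - red) / (NIR + red); illustrative reflectances for three pixels:
# dense vegetation, moderate vegetation, bare ground.
red = np.array([0.08, 0.12, 0.30])
nir = np.array([0.40, 0.35, 0.32])
ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 3))  # → [0.667 0.489 0.032]
```

High NDVI marks photosynthetically active cover, which is why a year of monthly composites captures the seasonality the 961 land cover regions are built on.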

  13. Mapping Indigenous Depth of Place

    ERIC Educational Resources Information Center

    Pearce, Margaret Wickens; Louis, Renee Pualani

    2008-01-01

    Indigenous communities have successfully used Western geospatial technologies (GT) (for example, digital maps, satellite images, geographic information systems (GIS), and global positioning systems (GPS)) since the 1970s to protect tribal resources, document territorial sovereignty, create tribal utility databases, and manage watersheds. The use…

  14. Mars Global Digital Dune Database (MGD3): Global dune distribution and wind pattern observations

    USGS Publications Warehouse

    Hayward, Rosalyn K.; Fenton, Lori; Titus, Timothy N.

    2014-01-01

    The Mars Global Digital Dune Database (MGD3) is complete and now extends from 90°N to 90°S latitude. The recently released south pole (SP) portion (MC-30) of MGD3 adds ∼60,000 km2 of medium to large-size dark dune fields and ∼15,000 km2 of sand deposits and smaller dune fields to the previously released equatorial (EQ, ∼70,000 km2) and north pole (NP, ∼845,000 km2) portions of the database, bringing the global total to ∼975,000 km2. Nearly all NP dunes are part of large sand seas, while the majority of EQ and SP dune fields are individual dune fields located in craters. Despite the differences between Mars and Earth, their dune and dune field morphologies are strikingly similar. Bullseye dune fields, named for their concentric ring pattern, are the exception, possibly owing their distinctive appearance to winds that are unique to the crater environment. Ground-based wind directions are derived from slipface (SF) orientation and dune centroid azimuth (DCA), a measure of the relative location of a dune field inside a crater. SF and DCA often preserve evidence of different wind directions, suggesting the importance of local, topographically influenced winds. In general, however, ground-based wind directions are broadly consistent with expected global patterns, such as polar easterlies. Intriguingly, between 40°S and 80°S latitude both SF and DCA preserve their strongest, though different, dominant wind direction, with transport toward the west and east for SF-derived winds and toward the north and west for DCA-derived winds.

  15. Processing of Cloud Databases for the Development of an Automated Global Cloud Climatology

    DTIC Science & Technology

    1991-06-30

    …cloud amounts in each DOE grid box. The actual population values were coded into one- and two-digit codes, primarily for printing purposes. For example…[table of station numbers, coordinates, and names omitted]…According to Lund, Grantham, and Davis (1980), the quality of the whole sky photographs used in producing the WSP digital data ensemble was…

  16. Digital Management and Curation of the National Rock and Ore Collections at NMNH, Smithsonian

    NASA Astrophysics Data System (ADS)

    Cottrell, E.; Andrews, B.; Sorensen, S. S.; Hale, L. J.

    2011-12-01

    The National Museum of Natural History, Smithsonian Institution, is home to the world's largest curated rock collection. The collection houses 160,680 physical rock and ore specimen lots ("samples"), all of which already have a digital record that can be accessed by the public through a searchable web interface (http://collections.mnh.si.edu/search/ms/). In addition, there are 66 accessions pending that, when catalogued, will add approximately 60,000 specimen lots. NMNH's collections are digitally managed on the KE EMu platform, which has emerged as the premier system for managing collections in natural history museums worldwide. In 2010 the Smithsonian released an ambitious 5-year Digitization Strategic Plan. In Mineral Sciences, new digitization efforts in the next five years will focus on integrating various digital resources for volcanic specimens. EMu sample records will link to the corresponding records for physical eruption information housed within the database of Smithsonian's Global Volcanism Program (GVP). Linkages are also planned between our digital records and geochemical databases (like EarthChem or PetDB) maintained by third parties. We anticipate that these linkages will increase the use of NMNH collections as well as engender new scholarly directions for research. Another large project the museum is currently undertaking involves integrating the functionality of in-house designed Transaction Management software with the EMu database. This will allow access to the details (borrower, quantity, date, and purpose) of all loans of a given specimen through its catalogue record. We hope this will enable cross-referencing and fertilization of research ideas while avoiding duplicate efforts. While these digitization efforts are critical, we propose that the greatest challenge to sample curation is not posed by digitization and that a global sample registry alone will not ensure that samples are available for reuse.
We suggest instead that the ability of the Earth science community to identify and preserve important collections and make them available for future study is limited by personnel and space resources from the level of the individual PI to the level of national facilities. Moreover, when it comes to specimen "estate planning," the cultural attitudes of scientists, institutions, and funding agencies are often inadequate to provide for long-term specimen curation - even if specimen discovery is enabled by digital registry. Timely access to curated samples requires that adequate resources be devoted to the physical care of specimens (facilities) and to the personnel costs associated with curation - from the conservation, storage, and inventory management of specimens, to the dispersal of samples for research, education, and exhibition.

  17. ICESat Lidar and Global Digital Elevation Models: Application to DESDynI

    NASA Technical Reports Server (NTRS)

    Carabajal, Claudia C.; Harding, David J.; Suchdeo, Vijay P.

    2010-01-01

    Geodetic control is extremely important in the production and quality control of topographic data sets, enabling elevation results to be referenced to an absolute vertical datum. Global topographic data with improved geodetic accuracy, achieved using global Ground Control Point (GCP) databases, enable more accurate characterization of land topography and its change related to solid Earth processes, natural hazards and climate change. The multiple-beam lidar instrument that will be part of the NASA Deformation, Ecosystem Structure and Dynamics of Ice (DESDynI) mission will provide a comprehensive, global data set that can be used for geodetic control purposes. Here we illustrate that potential using data from NASA's Ice, Cloud and land Elevation Satellite (ICESat), which has acquired single-beam, globally distributed laser altimeter profiles (±86°) since February 2003 [1, 2]. The profiles provide a consistently referenced elevation data set with unprecedented accuracy and quantified measurement errors that can be used to generate GCPs with sub-decimeter vertical accuracy and better than 10 m horizontal accuracy. Like the planned capability for DESDynI, ICESat records a waveform that is the elevation distribution of energy reflected within the laser footprint from vegetation, where present, and the ground where illuminated through gaps in any vegetation cover [3]. The waveform enables assessment of Digital Elevation Models (DEMs) with respect to the highest, centroid, and lowest elevations observed by ICESat and in some cases with respect to the ground identified beneath vegetation cover. Using the ICESat altimetry data we are developing a comprehensive database of consistent, global, geodetic ground control that will enhance the quality of a variety of regional to global DEMs.
Here we illustrate the accuracy assessment of the Shuttle Radar Topography Mission (SRTM) DEM produced for Australia, documenting spatially varying elevation biases of several meters in magnitude.

  18. Creating Access to Data of Worldwide Volcanic Unrest

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Newhall, C. G.; Malone, S. D.

    2003-12-01

    We are creating a pilot database (WOVOdat - the World Organization of Volcano Observatories database) using an open source database and content generation software, allowing web access to data on worldwide volcanic seismicity, ground deformation, fumarolic activity, and other changes within or adjacent to a volcanic system. After three years of discussions with volcano observatories of the WOVO community and institutional databases such as IRIS, UNAVCO, and the Smithsonian's Global Volcanism Program about how to link global data of volcanic unrest for use during crisis situations and for research, we are now developing the pilot database. We have already created the core tables and have written simple queries that access some of the available data using pull-down menus on a website. Over the next year, we plan to complete schema realization, expand querying capabilities, and then open the pilot database for a multi-year data-loading process. Many of the challenges we are encountering are common to multidisciplinary projects and include determining standard data formats, choosing levels of data detail (raw vs. minimally processed data, summary intervals vs. continuous data, etc.), and organizing the extant but variable data into a useable schema. Additionally, we are working on how best to enter the varied data into the database (scripts for digital data and web-entry tools for non-digital data) and what standard sets of queries are most important. An essential query during an evolving volcanic crisis would be: `Has any volcano shown the behavior being observed here and what happened?'. We believe that with a systematic aggregation of all datasets on volcanic unrest, we should be able to find patterns that were previously inaccessible or unrecognized. The second WOVOdat workshop in 2002 provided a recent forum for discussion of data formats, database access, and schemas.
The formats and units for the discussed parameters can be viewed at http://www.wovo.org/WOVOdat/parameters.htm. Comments, suggestions, and participation in all aspects of the WOVOdat project are welcome and appreciated.

  19. eCTG: an automatic procedure to extract digital cardiotocographic signals from digital images.

    PubMed

    Sbrollini, Agnese; Agostinelli, Angela; Marcantoni, Ilaria; Morettini, Micaela; Burattini, Luca; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura

    2018-03-01

    Cardiotocography (CTG), consisting of the simultaneous recording of fetal heart rate (FHR) and maternal uterine contractions (UC), is a popular clinical test to assess fetal health status. Typically, CTG machines provide paper reports that are visually interpreted by clinicians. Consequently, visual CTG interpretation depends on the clinician's experience and has poor reproducibility. The lack of databases containing digital CTG signals has limited the number and scope of retrospective studies aimed at establishing procedures for automatic CTG analysis that could counter the subjectivity of visual CTG interpretation. To help overcome this problem, this study proposes an electronic procedure, termed eCTG, specifically designed to extract digital CTG signals from digital CTG images, such as those obtainable by scanning paper CTG reports. eCTG includes four main steps: pre-processing, Otsu's global thresholding, signal extraction and signal calibration. Its validation was performed by means of the "CTU-UHB Intrapartum Cardiotocography Database" by Physionet, which contains digital signals of 552 CTG recordings. Using MATLAB, each signal was plotted and saved as a digital image that was then submitted to eCTG. Digital CTG signals extracted by eCTG were eventually compared to the corresponding signals directly available in the database. Comparison occurred in terms of signal similarity (evaluated by the correlation coefficient ρ and the mean signal error, MSE) and clinical features (including FHR baseline and variability; number, amplitude and duration of tachycardia, bradycardia, acceleration and deceleration episodes; number of early, variable, late and prolonged decelerations; and UC number, amplitude, duration and period). The value of ρ between eCTG and reference signals was 0.85 (P < 10⁻⁵⁶⁰) for FHR and 0.97 (P < 10⁻⁵⁶⁰) for UC. On average, the MSE was 0.00 for both FHR and UC.
    No CTG feature was found significantly different when measured in eCTG vs. reference signals. The eCTG procedure is a promising tool for accurately extracting digital FHR and UC signals from digital CTG images. Copyright © 2018 Elsevier B.V. All rights reserved.
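The Otsu global-thresholding step named in the abstract selects the grey level that maximizes between-class variance, separating the dark signal trace from the bright paper background. A minimal NumPy sketch on synthetic data (not the authors' MATLAB implementation):

```python
import numpy as np

def otsu_threshold(img: np.ndarray, levels: int = 256) -> int:
    """Return the grey level maximizing between-class variance (Otsu's method)."""
    hist = np.bincount(img.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                   # class-0 probability up to each level
    mu = np.cumsum(p * np.arange(levels))  # cumulative mean grey level
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    return int(np.nanargmax(sigma_b))      # nan at omega = 0 or 1 is ignored

# A synthetic bimodal "image": dark trace pixels near 40, bright paper near 200.
rng = np.random.default_rng(0)
img = np.clip(np.concatenate([rng.normal(40, 10, 5000),
                              rng.normal(200, 10, 5000)]), 0, 255).astype(np.uint8)
t = otsu_threshold(img)
print(t)  # falls between the two modes
```

Thresholding at `t` then yields the binary mask from which the signal trace can be traced and calibrated.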

  20. Global Inventory of Gas Geochemistry Data from Fossil Fuel, Microbial and Burning Sources, version 2017

    NASA Astrophysics Data System (ADS)

    Sherwood, Owen A.; Schwietzke, Stefan; Arling, Victoria A.; Etiope, Giuseppe

    2017-08-01

    The concentration of atmospheric methane (CH4) has more than doubled over the industrial era. To help constrain global and regional CH4 budgets, inverse (top-down) models incorporate data on the concentration and stable carbon (δ13C) and hydrogen (δ2H) isotopic ratios of atmospheric CH4. These models depend on accurate δ13C and δ2H end-member source signatures for each of the main emissions categories. Compared with meticulous measurement and calibration of isotopic CH4 in the atmosphere, there has been relatively less effort to characterize globally representative isotopic source signatures, particularly for fossil fuel sources. Most global CH4 budget models have so far relied on outdated source signature values derived from globally nonrepresentative data. To correct this deficiency, we present a comprehensive, globally representative end-member database of the δ13C and δ2H of CH4 from fossil fuel (conventional natural gas, shale gas, and coal), modern microbial (wetlands, rice paddies, ruminants, termites, and landfills and/or waste) and biomass burning sources. Gas molecular compositional data for fossil fuel categories are also included with the database. The database comprises 10 706 samples (8734 fossil fuel, 1972 non-fossil) from 190 published references. Mean (unweighted) δ13C signatures for fossil fuel CH4 are significantly lighter than values commonly used in CH4 budget models, thus highlighting potential underestimation of fossil fuel CH4 emissions in previous CH4 budget models. This living database will be updated every 2-3 years to provide the atmospheric modeling community with the most complete CH4 source signature data possible. Database digital object identifier (DOI): https://doi.org/10.15138/G3201T.

  1. Status and distribution of mangrove forests of the world using earth observation satellite data

    USGS Publications Warehouse

    Giri, C.; Ochieng, E.; Tieszen, L.L.; Zhu, Z.; Singh, A.; Loveland, T.; Masek, J.; Duke, N.

    2011-01-01

    Aim Our scientific understanding of the extent and distribution of mangrove forests of the world is inadequate. The available global mangrove databases, compiled using disparate geospatial data sources and national statistics, need to be improved. Here, we mapped the status and distributions of global mangroves using recently available Global Land Survey (GLS) data and the Landsat archive. Methods We interpreted approximately 1000 Landsat scenes using hybrid supervised and unsupervised digital image classification techniques. Each image was normalized for variation in solar angle and earth-sun distance by converting the digital number values to the top-of-the-atmosphere reflectance. Ground truth data and existing maps and databases were used to select training samples and also for iterative labelling. Results were validated using existing GIS data and the published literature to map 'true mangroves'. Results The total area of mangroves in the year 2000 was 137,760 km2 in 118 countries and territories in the tropical and subtropical regions of the world. Approximately 75% of world's mangroves are found in just 15 countries, and only 6.9% are protected under the existing protected areas network (IUCN I-IV). Our study confirms earlier findings that the biogeographic distribution of mangroves is generally confined to the tropical and subtropical regions and the largest percentage of mangroves is found between 5° N and 5° S latitude. Main conclusions We report that the remaining area of mangrove forest in the world is less than previously thought. Our estimate is 12.3% smaller than the most recent estimate by the Food and Agriculture Organization (FAO) of the United Nations. We present the most comprehensive, globally consistent and highest resolution (30 m) global mangrove database ever created. We developed and used better mapping techniques and data sources and mapped mangroves with better spatial and thematic details than previous studies. © 2010 Blackwell Publishing Ltd.
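The normalization for solar angle and earth-sun distance described in the mangrove records follows the standard conversion from digital number to top-of-atmosphere reflectance, ρ = π·L·d² / (ESUN·cos θs), where radiance L = gain·DN + bias. A sketch with illustrative coefficients (gain, bias, and ESUN are band-specific; the values below are not from any particular scene):

```python
import math

def toa_reflectance(dn, gain, bias, esun, sun_elev_deg, d_au):
    """Convert a Landsat digital number (DN) to top-of-atmosphere reflectance.

    rho = pi * L * d^2 / (ESUN * cos(theta_s)), with radiance L = gain*DN + bias.
    """
    radiance = gain * dn + bias                   # at-sensor spectral radiance
    theta_s = math.radians(90.0 - sun_elev_deg)   # solar zenith from elevation
    return math.pi * radiance * d_au**2 / (esun * math.cos(theta_s))

# Illustrative coefficients only (not scene metadata):
rho = toa_reflectance(dn=120, gain=0.76, bias=-2.3, esun=1536.0,
                      sun_elev_deg=55.0, d_au=1.01)
print(round(rho, 3))
```

Working in reflectance rather than raw DN is what makes scenes from different dates and sun geometries comparable in the classification.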

  2. Status and distribution of mangrove forests of the world using earth observation satellite data

    USGS Publications Warehouse

    Giri, Chandra; Ochieng, E.; Tieszen, Larry L.; Zhu, Zhi-Liang; Singh, Ashbindu; Loveland, Thomas R.; Masek, Jeffery G.; Duke, Norm

    2011-01-01

    Aim  Our scientific understanding of the extent and distribution of mangrove forests of the world is inadequate. The available global mangrove databases, compiled using disparate geospatial data sources and national statistics, need to be improved. Here, we mapped the status and distributions of global mangroves using recently available Global Land Survey (GLS) data and the Landsat archive. Methods  We interpreted approximately 1000 Landsat scenes using hybrid supervised and unsupervised digital image classification techniques. Each image was normalized for variation in solar angle and earth–sun distance by converting the digital number values to the top-of-the-atmosphere reflectance. Ground truth data and existing maps and databases were used to select training samples and also for iterative labelling. Results were validated using existing GIS data and the published literature to map ‘true mangroves’. Results  The total area of mangroves in the year 2000 was 137,760 km2 in 118 countries and territories in the tropical and subtropical regions of the world. Approximately 75% of world's mangroves are found in just 15 countries, and only 6.9% are protected under the existing protected areas network (IUCN I-IV). Our study confirms earlier findings that the biogeographic distribution of mangroves is generally confined to the tropical and subtropical regions and the largest percentage of mangroves is found between 5° N and 5° S latitude. Main conclusions  We report that the remaining area of mangrove forest in the world is less than previously thought. Our estimate is 12.3% smaller than the most recent estimate by the Food and Agriculture Organization (FAO) of the United Nations. We present the most comprehensive, globally consistent and highest resolution (30 m) global mangrove database ever created. We developed and used better mapping techniques and data sources and mapped mangroves with better spatial and thematic details than previous studies.

  3. Digitizing Consumption Across the Operational Spectrum

    DTIC Science & Technology

    2014-09-01

    [Only figure captions were recoverable from this record: Figure 13, NoSQL (key, value) dictionary example; Figure 14, Java-implemented dictionary and query result; Figure 15, global database architecture.]

  4. Cadastral Positioning Accuracy Improvement: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Hashim, N. M.; Omar, A. H.; Omar, K. M.; Abdullah, N. M.; Yatim, M. H. M.

    2016-09-01

    A cadastral map is a parcel-based information product specifically designed to define the limits of property boundaries. In Malaysia, the cadastral map is under the authority of the Department of Surveying and Mapping Malaysia (DSMM). With the growth of spatially based technology, especially Geographical Information Systems (GIS), DSMM decided to modernize and reform its cadastral legacy datasets by generating an accurate digital representation of cadastral parcels. These legacy databases are usually derived from paper parcel maps known as certified plans. With this modernization, the new cadastral database is no longer based on single, static parcel paper maps but on a global digital map. Despite the strict process of cadastral modernization, the reform has raised unexpected issues that remain essential to address. The main focus of this study is to review the issues generated by this transition. The transformed cadastral database must be further treated to minimize inherent errors and to fit it to the new satellite-based coordinate system with high positional accuracy. The result of this review will serve as a foundation for investigating systematic and effective methods for Positional Accuracy Improvement (PAI) in cadastral database modernization.

  5. Global energy and water cycle experiment (GEWEX) continental-scale international project (GCIP); reference data sets CD-ROM

    USGS Publications Warehouse

    Rea, Alan; Cederstrand, Joel R.

    1994-01-01

    The data sets on this compact disc are a compilation of several geographic reference data sets of interest to the global-change research community. The data sets were chosen with input from the Global Energy and Water Cycle Experiment (GEWEX) Continental-Scale International Project (GCIP) Data Committee and the GCIP Hydrometeorology and Atmospheric Subpanels. The data sets include: locations and periods of record for stream gages, reservoir gages, and meteorological stations; a 500-meter-resolution digital elevation model; grid-node locations for the Eta numerical weather-prediction model; and digital map data sets of geology, land use, streams, large reservoirs, average annual runoff, average annual precipitation, average annual temperature, average annual heating and cooling degree days, hydrologic units, and state and county boundaries. Also included are digital index maps for LANDSAT scenes, and for the U.S. Geological Survey 1:250,000, 1:100,000, and 1:24,000-scale map series. Most of the data sets cover the conterminous United States; the digital elevation model also includes part of southern Canada. The stream and reservoir gage and meteorological station files cover all states having area within the Mississippi River Basin plus that part of the Mississippi River Basin lying within Canada. Several database retrievals were processed by state; therefore, many sites outside the Mississippi River Basin are included.

  6. A global digital elevation model - GTOPO30

    USGS Publications Warehouse

    1999-01-01

    GTOPO30, the U.S. Geological Survey's (USGS) digital elevation model (DEM) of the Earth, provides the first global coverage of moderate-resolution elevation data. The original GTOPO30 data set, which was developed over a 3-year period through a collaborative effort led by the USGS, was completed in 1996 at the USGS EROS Data Center in Sioux Falls, South Dakota. The collaboration involved contributions of staffing, funding, or source data from cooperators including the National Aeronautics and Space Administration (NASA), the United Nations Environment Programme Global Resource Information Database (UNEP/GRID), the U.S. Agency for International Development (USAID), the Instituto Nacional de Estadistica Geografia e Informatica (INEGI) of Mexico, the Geographical Survey Institute (GSI) of Japan, Manaaki Whenua Landcare Research of New Zealand, and the Scientific Committee on Antarctic Research (SCAR). In 1999, work began on an update to the GTOPO30 data set. Additional data sources are being incorporated into GTOPO30, with an enhanced and improved data set planned for release in 2000.

  7. Map and database of Quaternary faults in Venezuela and its offshore regions

    USGS Publications Warehouse

    Audemard, F.A.; Machette, M.N.; Cox, J.W.; Dart, R.L.; Haller, K.M.

    2000-01-01

    As part of the International Lithosphere Program’s “World Map of Major Active Faults,” the U.S. Geological Survey is assisting in the compilation of a series of digital maps of Quaternary faults and folds in Western Hemisphere countries. The maps show the locations, ages, and activity rates of major earthquake-related features such as faults and fault-related folds. They are accompanied by databases that describe these features and document current information on their activity in the Quaternary. The project is a key part of the Global Seismic Hazards Assessment Program (ILP Project II-0) for the International Decade for Natural Disaster Reduction. The project is sponsored by the International Lithosphere Program and funded by the USGS’s National Earthquake Hazards Reduction Program. The primary elements of the project are general supervision and interpretation of geologic/tectonic information, data compilation and entry for the fault catalog, database design and management, and digitization and manipulation of data in ARC/INFO. For the compilation of data, we engaged experts in Quaternary faulting, neotectonics, paleoseismology, and seismology.

  8. Evaluation of Local Media Surveillance for Improved Disease Recognition and Monitoring in Global Hotspot Regions

    PubMed Central

    Schwind, Jessica S.; Wolking, David J.; Brownstein, John S.; Mazet, Jonna A. K.; Smith, Woutrina A.

    2014-01-01

    Digital disease detection tools are technologically sophisticated, but they depend on digital information, which is often unavailable in areas with high disease burdens. In areas where news is reported in local media with no digital counterpart, integrating local news information with digital surveillance systems, such as HealthMap (Boston Children’s Hospital), is critical. Little research has been published regarding the specific contribution of local health-related articles to digital surveillance systems. In response, the USAID PREDICT project implemented a local media surveillance (LMS) pilot study in partner countries to monitor disease events reported in print media. This research assessed the potential of LMS to enhance digital surveillance reach in five low- and middle-income countries. Over 16 weeks, select surveillance system attributes of LMS, such as simplicity, flexibility, acceptability, timeliness, and stability, were evaluated to identify strengths and weaknesses in the surveillance method. Findings revealed that LMS filled gaps in digital surveillance network coverage by contributing valuable localized information on disease events to the global HealthMap database. A total of 87 health events were reported through the LMS pilot in the 16-week monitoring period, including 71 unique reports not found by the HealthMap digital detection tool. Furthermore, HealthMap identified an additional 236 health events outside of LMS. Participants' belief in the importance of the project and proper source selection also proved crucial to the success of the method. The timely identification of disease outbreaks near points of emergence and the recognition of risk factors associated with disease occurrence continue to be important components of any comprehensive surveillance system for monitoring disease activity across populations. The LMS method, with its minimal resource commitment, could be one tool used to address the information gaps seen in global ‘hot spot’ regions. PMID:25333618

  9. The NavTrax fleet management system

    NASA Astrophysics Data System (ADS)

    McLellan, James F.; Krakiwsky, Edward J.; Schleppe, John B.; Knapp, Paul L.

    The NavTrax System, a dispatch-type automatic vehicle location and navigation system, is discussed. Attention is given to its positioning, communication, digital mapping, and dispatch center components. The positioning module is a robust GPS (Global Positioning System)-based system integrated with dead reckoning devices by a decentralized-federated filter, making the module fault tolerant. The error behavior and characteristics of GPS, rate gyro, compass, and odometer sensors are discussed. The communications module, as presently configured, utilizes UHF radio technology, and plans are being made to employ a digital cellular telephone system. Polling and automatic smart vehicle reporting are also discussed. The digital mapping component is an intelligent digital single line road network database stored in vector form with full connectivity and address ranges. A limited form of map matching is performed for the purposes of positioning, but its main purpose is to define location once position is determined.
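    The decentralized-federated filtering mentioned above combines independent position estimates from GPS and dead-reckoning sensors; its core combination step can be illustrated by simple variance-weighted fusion (a simplified sketch with made-up numbers, not the NavTrax implementation):

```python
def fuse_estimates(x1, var1, x2, var2):
    """Variance-weighted fusion of two independent estimates of the same
    quantity: each estimate is weighted inversely to its variance, which is
    the combination step at the heart of a federated filter."""
    w1 = var2 / (var1 + var2)
    w2 = var1 / (var1 + var2)
    fused = w1 * x1 + w2 * x2
    fused_var = (var1 * var2) / (var1 + var2)  # always <= min(var1, var2)
    return fused, fused_var

# A precise GPS fix fused with a noisier dead-reckoning estimate
pos, var = fuse_estimates(x1=100.0, var1=4.0, x2=106.0, var2=12.0)
```

    Because the fused variance is always smaller than either input variance, the combined module degrades gracefully when one sensor is noisy, which is what makes the design fault tolerant.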

  10. Building a Digital Library for Multibeam Data, Images and Documents

    NASA Astrophysics Data System (ADS)

    Miller, S. P.; Staudigel, H.; Koppers, A.; Johnson, C.; Cande, S.; Sandwell, D.; Peckman, U.; Becker, J. J.; Helly, J.; Zaslavsky, I.; Schottlaender, B. E.; Starr, S.; Montoya, G.

    2001-12-01

    The Scripps Institution of Oceanography, the UCSD Libraries and the San Diego Supercomputing Center have joined forces to establish a digital library for accessing a wide range of multibeam and marine geophysical data, serving a community that ranges from MGG researchers to K-12 outreach clients. This digital library collection will include 233 multibeam cruises with grids, plots, photographs, station data, technical reports, planning documents and publications, drawn from the holdings of the Geological Data Center and the SIO Archives. Inquiries will be made through an Ocean Exploration Console, reminiscent of a cockpit display, where a multitude of data may be displayed individually or in two- or three-dimensional projections. These displays will provide access to cruise data as well as global databases such as Global Topography, crustal age, and sediment thickness, thus meeting the day-to-day needs of researchers as well as educators, students, and the public. The prototype contains a few selected expeditions, and a review of the initial approach will be solicited from the user community during the poster session. The search process can be focused by a variety of constraints: geospatial (lat-lon box), temporal (e.g., since 1996), keyword (e.g., cruise, place name, PI, etc.), or expert-level (e.g., K-6 or researcher). The Storage Resource Broker (SRB) software from the SDSC manages the evolving collection as a series of distributed but related archives in various media, from shipboard data through processing and final archiving. The latest version of MB-System provides for the systematic creation of standard metadata, and for the harvesting of metadata from multibeam files. Automated scripts will be used to load the metadata catalog to enable queries with an Oracle database management system.
These new efforts to bridge the gap between libraries and data archives are supported by the NSF Information Technology and National Science Digital Library (NSDL) programs, augmented by UC funds, and closely coordinated with Digital Library for Earth System Education (DLESE) activities.

  11. The Northern Circumpolar Soil Carbon Database: spatially distributed datasets of soil coverage and soil carbon storage in the northern permafrost regions

    NASA Astrophysics Data System (ADS)

    Hugelius, G.; Tarnocai, C.; Broll, G.; Canadell, J. G.; Kuhry, P.; Swanson, D. K.

    2012-08-01

    High-latitude terrestrial ecosystems are key components in the global carbon (C) cycle. Estimates of global soil organic carbon (SOC), however, do not include updated estimates of SOC storage in permafrost-affected soils or representation of the unique pedogenic processes that affect these soils. The Northern Circumpolar Soil Carbon Database (NCSCD) was developed to quantify the SOC stocks in the circumpolar permafrost region (18.7 × 106 km2). The NCSCD is a polygon-based digital database compiled from harmonized regional soil classification maps in which data on soil order coverage have been linked to pedon data (n = 1647) from the northern permafrost regions to calculate SOC content and mass. In addition, new gridded datasets at different spatial resolutions have been generated to facilitate research applications using the NCSCD (standard raster formats for use in geographic information systems and Network Common Data Form files common for applications in numerical models). This paper describes the compilation of the NCSCD spatial framework, the soil sampling and soil analysis procedures used to derive SOC content in pedons from North America and Eurasia and the formatting of the digital files that are available online. The potential applications and limitations of the NCSCD in spatial analyses are also discussed. The database has the doi:10.5879/ecds/00000001. An open access data portal with all the described GIS-datasets is available online at: http://dev1.geo.su.se/bbcc/dev/ncscd/.

  12. The Northern Circumpolar Soil Carbon Database: spatially distributed datasets of soil coverage and soil carbon storage in the northern permafrost regions

    NASA Astrophysics Data System (ADS)

    Hugelius, G.; Tarnocai, C.; Broll, G.; Canadell, J. G.; Kuhry, P.; Swanson, D. K.

    2013-01-01

    High-latitude terrestrial ecosystems are key components in the global carbon (C) cycle. Estimates of global soil organic carbon (SOC), however, do not include updated estimates of SOC storage in permafrost-affected soils or representation of the unique pedogenic processes that affect these soils. The Northern Circumpolar Soil Carbon Database (NCSCD) was developed to quantify the SOC stocks in the circumpolar permafrost region (18.7 × 106 km2). The NCSCD is a polygon-based digital database compiled from harmonized regional soil classification maps in which data on soil order coverage have been linked to pedon data (n = 1778) from the northern permafrost regions to calculate SOC content and mass. In addition, new gridded datasets at different spatial resolutions have been generated to facilitate research applications using the NCSCD (standard raster formats for use in geographic information systems and Network Common Data Form files common for applications in numerical models). This paper describes the compilation of the NCSCD spatial framework, the soil sampling and soil analytical procedures used to derive SOC content in pedons from North America and Eurasia and the formatting of the digital files that are available online. The potential applications and limitations of the NCSCD in spatial analyses are also discussed. The database has the doi:10.5879/ecds/00000001. An open access data portal with all the described GIS-datasets is available online at: http://www.bbcc.su.se/data/ncscd/.

  13. Digital release of the Alaska Quaternary fault and fold database

    NASA Astrophysics Data System (ADS)

    Koehler, R. D.; Farrell, R.; Burns, P.; Combellick, R. A.; Weakland, J. R.

    2011-12-01

    The Alaska Division of Geological & Geophysical Surveys (DGGS) has designed a Quaternary fault and fold database for Alaska in conformance with standards defined by the U.S. Geological Survey for the National Quaternary fault and fold database. Alaska is the most seismically active region of the United States; however, little information exists on the location, style of deformation, and slip rates of Quaternary faults. Thus, to provide an accurate, user-friendly, reference-based fault inventory to the public, we are producing a digital GIS shapefile of Quaternary fault traces and compiling summary information on each fault. Here, we present relevant information pertaining to the digital GIS shapefile and the online access and availability of the Alaska database. This database will be useful for engineering geologic studies; geologic, geodetic, and seismic research; and policy planning. The data will also contribute to the fault source database being constructed by the Global Earthquake Model (GEM) Faulted Earth project, which is developing tools to better assess earthquake risk. We derived the initial list of Quaternary active structures from The Neotectonic Map of Alaska (Plafker et al., 1994) and supplemented it with more recent data where available. Because of the limited level of knowledge on Quaternary faults in Alaska, pre-Quaternary fault traces from the Plafker map are shown as a layer in our digital database so users may view a more complete distribution of mapped faults, and to suggest that some older traces may be active yet unstudied. The database will be updated as new information is developed. We selected each fault by reviewing the literature and georegistered the faults from 1:250,000-scale paper maps contained in 1970s-vintage and earlier bedrock maps; source map scales range from 1:20,000 to 1:500,000.
    Fault parameters in our GIS fault attribute tables include fault name, age, slip rate, slip sense, dip direction, fault line type (i.e., well constrained, moderately constrained, or inferred), and mapped scale. Each fault is assigned a three-integer CODE, based upon age, slip rate, and how well the fault is located. This CODE dictates the line type for the GIS files. To host the database, we are developing an interactive web-map application with ArcGIS for Server and the ArcGIS API for JavaScript from Environmental Systems Research Institute, Inc. (Esri). The web-map application will present the database through a visible scale range, with each fault displayed at the resolution of the original map. Application functionality includes: search by name or location, identification of a fault by manual selection, and choice of base map. Base map options include topographic, satellite imagery, and digital elevation maps available from ArcGIS Online. We anticipate that the database will be publicly accessible from a portal embedded on the DGGS website by the end of 2011.
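    The three-integer CODE attribute can be pictured with a small sketch; the age, slip-rate, and location bins below are hypothetical, since the abstract does not specify the actual DGGS coding scheme:

```python
# Hypothetical three-integer fault CODE (age, slip rate, location quality)
# driving a GIS line-type choice; illustrative bins only.
AGE = {"historic": 1, "holocene": 2, "late_quaternary": 3, "quaternary": 4}
SLIP = {"<0.2": 1, "0.2-1": 2, "1-5": 3, ">5": 4}      # mm/yr bins
LOCATION = {"well": 1, "moderate": 2, "inferred": 3}

def fault_code(age, slip_bin, location):
    """Pack the three attributes into the (age, slip, location) CODE tuple."""
    return (AGE[age], SLIP[slip_bin], LOCATION[location])

def line_type(code):
    # Progressively less certain locations get progressively broken lines
    return {1: "solid", 2: "dashed", 3: "dotted"}[code[2]]

code = fault_code("holocene", "1-5", "moderate")  # -> (2, 3, 2), "dashed"
```

    Storing the CODE as plain integers in the attribute table lets the web-map application select a renderer with a simple lookup rather than re-parsing descriptive text.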

  14. Digital database of channel cross-section surveys, Mount St. Helens, Washington

    USGS Publications Warehouse

    Mosbrucker, Adam R.; Spicer, Kurt R.; Major, Jon J.; Saunders, Dennis R.; Christianson, Tami S.; Kingsbury, Cole G.

    2015-08-06

    Stream-channel cross-section survey data are a fundamental component to studies of fluvial geomorphology. Such data provide important parameters required by many open-channel flow models, sediment-transport equations, sediment-budget computations, and flood-hazard assessments. At Mount St. Helens, Washington, the long-term response of channels to the May 18, 1980, eruption, which dramatically altered the hydrogeomorphic regime of several drainages, is documented by an exceptional time series of repeat stream-channel cross-section surveys. More than 300 cross sections, most established shortly following the eruption, represent more than 100 kilometers of surveyed topography. Although selected cross sections have been published previously in print form, we present a comprehensive digital database that includes geospatial and tabular data. Furthermore, survey data are referenced to a common geographic projection and to common datums. Database design, maintenance, and data dissemination are accomplished through a geographic information system (GIS) platform, which integrates survey data acquired with theodolite, total station, and global navigation satellite system (GNSS) instrumentation. Users can interactively perform advanced queries and geospatial time-series analysis. An accuracy assessment provides users the ability to quantify uncertainty within these data. At the time of publication, this project is ongoing. Regular database updates are expected; users are advised to confirm they are using the latest version.

  15. Internationally coordinated glacier monitoring: strategy and datasets

    NASA Astrophysics Data System (ADS)

    Hoelzle, Martin; Armstrong, Richard; Fetterer, Florence; Gärtner-Roer, Isabelle; Haeberli, Wilfried; Kääb, Andreas; Kargel, Jeff; Nussbaumer, Samuel; Paul, Frank; Raup, Bruce; Zemp, Michael

    2014-05-01

    Internationally coordinated monitoring of long-term glacier changes provides key indicator data about global climate change; it began in 1894 as a coordinated effort to establish standardized observations. Today, world-wide monitoring of glaciers and ice caps is embedded within the Global Climate Observing System (GCOS), in support of the United Nations Framework Convention on Climate Change (UNFCCC), as an important Essential Climate Variable (ECV). The Global Terrestrial Network for Glaciers (GTN-G) was established in 1999 with the task of coordinating measurements and of ensuring the continuous development and adaptation of the international strategies to the long-term needs of users in science and policy. The basic monitoring principles must be relevant, feasible, comprehensive and understandable to the wider scientific community as well as to policy makers and the general public. Data access has to be free and unrestricted, the quality of the standardized and calibrated data must be high, and a combination of detailed process studies at selected field sites with global coverage by satellite remote sensing is envisaged. Recently a GTN-G Steering Committee was established to guide and advise the operational bodies responsible for international glacier monitoring, which are the World Glacier Monitoring Service (WGMS), the US National Snow and Ice Data Center (NSIDC), and the Global Land Ice Measurements from Space (GLIMS) initiative. Several online databases containing a wealth of diverse data types, with different levels of detail and global coverage, provide fast access to continuously updated information on glacier fluctuation and inventory data.
For world-wide inventories, data are now available through (a) the World Glacier Inventory, containing tabular information on about 130,000 glaciers covering an area of around 240,000 km2; (b) the GLIMS database, containing digital outlines of around 118,000 glaciers with different time stamps; and (c) the Randolph Glacier Inventory (RGI), a new and globally complete digital dataset of outlines from about 180,000 glaciers with some meta-information, which has been used for many applications relating to the IPCC AR5 report. Concerning glacier changes, a database (Fluctuations of Glaciers) exists containing information about mass balance, front variations including past reconstructed time series, geodetic changes and special events. Annual mass balance reporting contains information for about 125 glaciers, with a subset of 37 glaciers with continuous observational series since 1980 or earlier. Front variation observations of around 1800 glaciers are available from most of the mountain ranges world-wide. This database was recently updated with 26 glaciers having an unprecedented dataset of length changes from reconstructions of well-dated historical evidence going back as far as the 16th century. Geodetic observations of about 430 glaciers are available. The database is completed by a dataset containing information on special events, including glacier surges, glacier lake outbursts, ice avalanches, and eruptions of ice-clad volcanoes, related to about 200 glaciers. A special database of glacier photographs contains 13,000 pictures from around 500 glaciers, some of them dating back to the 19th century. A key challenge is to combine and extend the traditional observations with fast-evolving datasets from new technologies.

  16. Neighborhood Structural Similarity Mapping for the Classification of Masses in Mammograms.

    PubMed

    Rabidas, Rinku; Midya, Abhishek; Chakraborty, Jayasree

    2018-05-01

    In this paper, two novel feature extraction methods, using neighborhood structural similarity (NSS), are proposed for the characterization of mammographic masses as benign or malignant. Because the gray-level distribution of pixels differs between benign and malignant masses, with more regular and homogeneous patterns visible in benign masses, the proposed method exploits the similarity between neighboring regions of masses by designing two new features, namely, NSS-I and NSS-II, which capture global similarity at different scales. Complementary to these global features, uniform local binary patterns are computed to enhance the classification efficiency by combining with the proposed features. The performance of the features is evaluated using images from the mini Mammographic Image Analysis Society (mini-MIAS) and Digital Database for Screening Mammography (DDSM) databases, where a tenfold cross-validation technique is incorporated with Fisher linear discriminant analysis, after selecting the optimal set of features using a stepwise logistic regression method. The best area under the receiver operating characteristic curve of 0.98 is achieved with the mini-MIAS database, while that for the DDSM database is 0.93.
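    The uniform local binary patterns combined with the NSS features can be illustrated with a minimal sketch of the textbook 8-neighbour LBP code and the "uniform" test (a generic version, not the authors' implementation):

```python
def lbp_code(patch):
    """8-neighbour local binary pattern code for a 3x3 patch (list of lists):
    each neighbour contributes a bit set when it is >= the center pixel."""
    c = patch[1][1]
    # neighbours taken clockwise starting from the top-left corner
    neighbours = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                  patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    return sum((1 << i) for i, p in enumerate(neighbours) if p >= c)

def is_uniform(code):
    """A pattern is 'uniform' if its circular bit string has at most two
    0/1 transitions; uniform patterns correspond to edges, corners, spots
    and flat regions, and dominate natural image texture."""
    bits = [(code >> i) & 1 for i in range(8)]
    transitions = sum(bits[i] != bits[(i + 1) % 8] for i in range(8))
    return transitions <= 2

code = lbp_code([[9, 9, 9], [1, 5, 1], [1, 1, 1]])  # bright top edge -> 7
```

    Histogramming only the uniform codes (and lumping the rest into one bin) gives the compact, rotation-tolerant texture descriptor typically paired with global features like NSS-I/NSS-II.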

  17. Precision global health in the digital age.

    PubMed

    Flahault, Antoine; Geissbuhler, Antoine; Guessous, Idris; Guérin, Philippe; Bolon, Isabelle; Salathé, Marcel; Escher, Gérard

    2017-04-19

    Precision global health is an approach similar to precision medicine, which facilitates, through innovation and technology, better targeting of public health interventions on a global scale, for the purpose of maximising their effectiveness and relevance. Illustrative examples include: the use of remote sensing data to fight vector-borne diseases; large databases of genomic sequences of foodborne pathogens helping to identify the origins of outbreaks; social networks and internet search engines for tracking communicable diseases; cell phone data in humanitarian actions; and drones to deliver healthcare services in remote and secluded areas. Open science and data sharing platforms are proposed for fostering international research programmes under fair, ethical and respectful conditions. Innovative education, such as massive open online courses or serious games, can promote wider access to training in public health and improve health literacy. The world is moving towards learning healthcare systems. Professionals are equipped with data collection and decision support devices. They share information, which is complemented by external sources and analysed in real time using machine learning techniques, allowing for the early detection of anomalies and eventually guiding appropriate public health interventions. This article shows how information-driven approaches, enabled by digital technologies, can help improve global health with greater equity.

  18. POPSCAN: A CNES Geo-Information Study for Re-Entry Risk Assessment

    NASA Astrophysics Data System (ADS)

    Fuentes, N.; Tholey, N.; Battiston, S.; Montabord, M.; Studer, M.

    2013-09-01

    Within the framework of the FSOA, the French Space Operations Act (referred to in French as the "Loi relative aux Opérations Spatiales", or LOS), which includes in particular the monitoring of safety requirements for people and property, one major parameter to consider is Geographic Information (GI) on population distribution, human activity, and land occupation. This article gives an overview of the set of geographic and demographic data examined for CNES control offices, outlining the advantages and limits of each: coverage, precision, update frequency, availability, distribution, ... It focuses on the two major available global population databases: GPW-GRUMP from CIESIN of Columbia University and LandScan from ORNL. The work engaged on POPSCAN integrates digital analysis of these two world population grids, as well as comparisons with other databases such as GLOBAL-INSIGHT, VMAP0, ESRI, DMSP-ISA, GLOBCOVER, OpenFlights, ... for urban areas, communication networks, sensitive human activities and land use.

  19. Offline Signature Verification Using the Discrete Radon Transform and a Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Coetzer, J.; Herbst, B. M.; du Preez, J. A.

    2004-12-01

    We developed a system that automatically authenticates offline handwritten signatures using the discrete Radon transform (DRT) and a hidden Markov model (HMM). Given the robustness of our algorithm and the fact that only global features are considered, satisfactory results are obtained. Using a database of 924 signatures from 22 writers, our system achieves an equal error rate (EER) of 18% when only high-quality forgeries (skilled forgeries) are considered and an EER of 4.5% in the case of only casual forgeries. These signatures were originally captured offline. Using another database of 4800 signatures from 51 writers, our system achieves an EER of 12.2% when only skilled forgeries are considered. These signatures were originally captured online and then digitally converted into static signature images. These results compare well with the results of other algorithms that consider only global features.
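    The discrete Radon transform underlying the system is, at each angle, a set of line-sum projections of the signature image; the simplest instances are the row and column sums (a toy sketch of the idea, not the full multi-angle DRT feature extractor of the paper):

```python
def radon_projections(image):
    """Row and column sums of a binary signature image: the discrete Radon
    transform evaluated at 0 and 90 degrees. A full DRT samples many more
    angles, producing the global feature vectors fed to the HMM."""
    rows = [sum(r) for r in image]                                  # 0 deg
    cols = [sum(r[j] for r in image) for j in range(len(image[0]))] # 90 deg
    return rows, cols

# Tiny binary "signature" image
img = [[0, 1, 1],
       [1, 1, 0],
       [0, 1, 0]]
rows, cols = radon_projections(img)  # rows=[2, 2, 1], cols=[1, 3, 1]
```

    Because each projection integrates over whole lines of pixels, these features are global by construction, which is why the verifier tolerates local pen-stroke variation between genuine signatures.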

  20. Digital Video of Live-Scan Fingerprint Data

    National Institute of Standards and Technology Data Gateway

    NIST Digital Video of Live-Scan Fingerprint Data (PC database for purchase)   NIST Special Database 24 contains MPEG-2 (Moving Picture Experts Group) compressed digital video of live-scan fingerprint data. The database is being distributed for use in developing and testing fingerprint verification systems.

  1. New trends in the virtualization of hospitals--tools for global e-Health.

    PubMed

    Graschew, Georgi; Roelofs, Theo A; Rakowsky, Stefan; Schlag, Peter M; Heinzlreiter, Paul; Kranzlmüller, Dieter; Volkert, Jens

    2006-01-01

    The development of virtual hospitals and digital medicine helps to bridge the digital divide between different regions of the world and enables equal access to high-level medical care. Pre-operative planning, intra-operative navigation and minimally invasive surgery require a digital and virtual environment supporting the perception of the physician. As data and computing resources in a virtual hospital are distributed over many sites, the concept of the Grid should be integrated with other communication networks and platforms. A promising approach is the implementation of service-oriented architectures for an invisible grid, hiding complexity from both application developers and end-users. Examples of promising medical applications of Grid technology are the real-time 3D visualization and manipulation of patient data for individualized treatment planning and the creation of distributed intelligent databases of medical images.

  2. Seasonal land-cover regions of the United States

    USGS Publications Warehouse

    Loveland, Thomas R.; Merchant, James W.; Brown, Jesslyn F.; Ohlen, Donald O.; Reed, Bradley C.; Olson, Paul; Hutchinson, John

    1995-01-01

    Global-change investigations have been hindered by deficiencies in the availability and quality of land-cover data. The U.S. Geological Survey and the University of Nebraska-Lincoln have collaborated on the development of a new approach to land-cover characterization that attempts to address requirements of the global-change research community and others interested in regional patterns of land cover. An experimental 1-kilometer-resolution database of land-cover characteristics for the conterminous U.S. has been prepared to test and evaluate the approach. Using multidate Advanced Very High Resolution Radiometer (AVHRR) satellite data complemented by elevation, climate, ecoregions, and other digital spatial datasets, the authors define 152 seasonal land-cover regions. The regionalization is based on a taxonomy of areas with respect to data on land cover, seasonality or phenology, and relative levels of primary production. The resulting database consists of descriptions of the vegetation, land cover, and seasonal, spectral, and site characteristics for each region. These data are used in the construction of an illustrative 1:7,500,000-scale map of the seasonal land-cover regions as well as of smaller-scale maps portraying general land cover and seasonality. The seasonal land-cover characteristics database can also be tailored to provide a broad range of other landscape parameters useful in national and global-scale environmental modeling and assessment.

  3. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) after fifteen years: Review of global products

    NASA Astrophysics Data System (ADS)

    Abrams, Michael; Tsu, Hiroji; Hulley, Glynn; Iwao, Koki; Pieri, David; Cudahy, Tom; Kargel, Jeffrey

    2015-06-01

    The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) is a 15-channel imaging instrument operating on NASA's Terra satellite. A joint project between the U.S. National Aeronautics and Space Administration and Japan's Ministry of Economy, Trade, and Industry, ASTER has been acquiring data for 15 years, since March 2000. The archive now contains over 2.8 million scenes; for the majority of them, a stereo pair was collected using nadir and backward telescopes imaging in the NIR wavelength. The majority of users require only a few to a few dozen scenes for their work. Studies have ranged over numerous scientific disciplines, and many practical applications have benefited from ASTER's unique data. A few researchers have been able to mine the entire ASTER archive, which is now global in extent due to the long duration of the mission. Six examples of global products are described in this contribution: the ASTER Global Digital Elevation Model (GDEM), the most complete, highest resolution DEM available to all users; the ASTER Emissivity Database (ASTER GED), a global 5-band emissivity map of the land surface; the ASTER Global Urban Area Map (AGURAM), a 15-m resolution database of over 3500 cities; the ASTER Volcano Archive (AVA), an archive of over 1500 active volcanoes; ASTER Geoscience products of the continent of Australia; and the Global Land Ice Measurements from Space (GLIMS) project.

  4. Geographic Information System Data Analysis

    NASA Technical Reports Server (NTRS)

    Billings, Chad; Casad, Christopher; Floriano, Luis G.; Hill, Tracie; Johnson, Rashida K.; Locklear, J. Mark; Penn, Stephen; Rhoulac, Tori; Shay, Adam H.; Taylor, Antone

    1995-01-01

    Data was collected in order to further NASA Langley Research Center's Geographic Information System(GIS). Information on LaRC's communication, electrical, and facility configurations was collected. Existing data was corrected through verification, resulting in more accurate databases. In addition, Global Positioning System(GPS) points were used in order to accurately impose buildings on digitized images. Overall, this project will help the Imaging and CADD Technology Team (ICTT) prove GIS to be a valuable resource for LaRC.

  5. Geologic Map of the Tucson and Nogales Quadrangles, Arizona (Scale 1:250,000): A Digital Database

    USGS Publications Warehouse

    Peterson, J.A.; Berquist, J.R.; Reynolds, S.J.; Page-Nedell, S. S.; Digital database by Oland, Gustav P.; Hirschberg, Douglas M.

    2001-01-01

    The geologic map of the Tucson-Nogales 1:250,000 scale quadrangle (Peterson and others, 1990) was digitized by U.S. Geological Survey staff and University of Arizona contractors at the Southwest Field Office, Tucson, Arizona, in 2000 for input into a geographic information system (GIS). The database was created for use as a basemap in a decision support system designed by the National Industrial Minerals and Surface Processes project. The resulting digital geologic map database can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included; they may be obtained from a variety of commercial and government sources. Additionally, point features, such as strike and dip, were not captured from the original paper map and are not included in the database. This database is not meant to be used or displayed at any scale larger than 1:250,000 (for example, 1:100,000 or 1:24,000). The digital geologic map graphics and plot files that are provided in the digital package are representations of the digital database. They are not designed to be cartographic products.

  6. Mars global digital dune database and initial science results

    USGS Publications Warehouse

    Hayward, R.K.; Mullins, K.F.; Fenton, L.K.; Hare, T.M.; Titus, T.N.; Bourke, M.C.; Colaprete, A.; Christensen, P.R.

    2007-01-01

    A new Mars Global Digital Dune Database (MGD3) constructed using Thermal Emission Imaging System (THEMIS) infrared (IR) images provides a comprehensive and quantitative view of the geographic distribution of moderate- to large-size dune fields (area >1 km2) that will help researchers to understand global climatic and sedimentary processes that have shaped the surface of Mars. MGD3 extends from 65°N to 65°S latitude and includes ~550 dune fields, covering ~70,000 km2, with an estimated total volume of ~3,600 km3. This area, when combined with polar dune estimates, suggests moderate- to large-size dune field coverage on Mars may total ~800,000 km2, ~6 times less than the total areal estimate of ~5,000,000 km2 for terrestrial dunes. Where availability and quality of THEMIS visible (VIS) or Mars Orbiter Camera narrow-angle (MOC NA) images allow, we classify dunes and include dune slipface measurements, which are derived from gross dune morphology and represent the prevailing wind direction at the last time of significant dune modification. For dunes located within craters, the azimuth from crater centroid to dune field centroid (referred to as dune centroid azimuth) is calculated and can provide an accurate method for tracking dune migration within smooth-floored craters. These indicators of wind direction are compared to output from a general circulation model (GCM). Dune centroid azimuth values generally correlate to regional wind patterns. Slipface orientations are less well correlated, suggesting that local topographic effects may play a larger role in dune orientation than regional winds. Copyright 2007 by the American Geophysical Union.
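    The dune centroid azimuth described above reduces to a bearing between two centroids. A toy sketch of that computation under a planar approximation (the coordinates are hypothetical, not MGD3 values, and the published database may use a different projection):

```python
import math

def centroid_azimuth(crater, dune):
    """Azimuth in degrees clockwise from north, from a crater centroid to a
    dune-field centroid. Coordinates are (lat, lon) in degrees; longitude is
    scaled by cos(mean latitude) as a small-area planar approximation."""
    dlat = dune[0] - crater[0]
    mean_lat = math.radians((dune[0] + crater[0]) / 2)
    dlon = (dune[1] - crater[1]) * math.cos(mean_lat)
    return math.degrees(math.atan2(dlon, dlat)) % 360

# A dune field due east of its host crater's centroid gives ~90 degrees
print(round(centroid_azimuth((10.0, 40.0), (10.0, 40.5))))  # -> 90
```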

  7. Development of Elevation and Relief Databases for ICESat-2/ATLAS Receiver Algorithms

    NASA Astrophysics Data System (ADS)

    Leigh, H. W.; Magruder, L. A.; Carabajal, C. C.; Saba, J. L.; Urban, T. J.; McGarry, J.; Schutz, B. E.

    2013-12-01

    The Advanced Topographic Laser Altimeter System (ATLAS) is planned to launch onboard NASA's ICESat-2 spacecraft in 2016. ATLAS operates at a wavelength of 532 nm with a laser repeat rate of 10 kHz and 6 individual laser footprints. The satellite will be in a 500 km, 91-day repeat ground track orbit at an inclination of 92°. A set of onboard Receiver Algorithms has been developed to reduce the data volume and data rate to acceptable levels while still transmitting the relevant ranging data. The onboard algorithms limit the data volume by distinguishing between surface returns and background noise and selecting a small vertical region around the surface return to be included in telemetry. The algorithms make use of signal processing techniques, along with three databases, the Digital Elevation Model (DEM), the Digital Relief Map (DRM), and the Surface Reference Mask (SRM), to find the signal and determine the appropriate dynamic range of vertical data surrounding the surface for downlink. The DEM provides software-based range gating for ATLAS. This approach allows the algorithm to limit the surface signal search to the vertical region between minimum and maximum elevations provided by the DEM (plus some margin to account for uncertainties). The DEM is constructed in a nested, three-tiered grid to account for a hardware constraint limiting the maximum vertical range to 6 km. The DRM is used to select the vertical width of the telemetry band around the surface return. The DRM contains global values of relief calculated along 140 m and 700 m ground track segments consistent with a 92° orbit. The DRM must contain the maximum value of relief seen in any given area, but must be as close to truth as possible as the DRM directly affects data volume. The SRM, which has been developed independently from the DEM and DRM, is used to set parameters within the algorithm and select telemetry bands for downlink. 
Both the DEM and DRM are constructed from publicly available digital elevation models. No elevation models currently exist that provide global coverage at a sufficient resolution, so several regional models have been mosaicked together to produce global databases. In locations where multiple data sets are available, evaluations have been made to determine the optimal source for the databases, primarily based on resolution and accuracy. Separate procedures for calculating relief were developed for high latitude (>60N/S) regions in order to take advantage of polar stereographic projections. An additional method for generating the databases was developed for use over Antarctica, such that high resolution, regional elevation models can be easily incorporated as they become available in the future. The SRM is used to facilitate DEM and DRM production by defining those regions that are ocean and sea ice. Ocean and sea ice elevation values are defined by the geoid, while relief is set to a constant value. Results presented will include the details of data source selection, the methodologies used to create the databases, and the final versions of both the DEM and DRM databases. Companion presentations by McGarry, et al. and Carabajal, et al. describe the ATLAS onboard Receiver Algorithms and the database verification, respectively.
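    The relief values stored in the DRM are maxima of (max - min) elevation over fixed-length along-track segments. A simplified sketch of that aggregation (the profile values and 3-sample segment length are illustrative stand-ins for the 140 m and 700 m segments described above):

```python
def segment_relief(elevations, seg_len):
    """Worst-case relief (max - min elevation) over consecutive
    non-overlapping segments of seg_len samples, mimicking how a
    Digital Relief Map cell must hold the maximum relief in its area."""
    reliefs = []
    for i in range(0, len(elevations) - seg_len + 1, seg_len):
        seg = elevations[i:i + seg_len]
        reliefs.append(max(seg) - min(seg))
    return max(reliefs)

# Hypothetical along-track elevation samples (meters)
profile = [120, 135, 128, 300, 310, 305, 150, 145, 160]
print(segment_relief(profile, 3))  # -> 15
```

    Because the DRM sets the telemetry band width, overestimating relief inflates data volume, which is why the databases aim to stay close to the true maximum.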

  8. New data sources and derived products for the SRER digital spatial database

    Treesearch

    Craig Wissler; Deborah Angell

    2003-01-01

    The Santa Rita Experimental Range (SRER) digital database was developed to automate and preserve ecological data and increase their accessibility. The digital data holdings include a spatial database that is used to integrate ecological data in a known reference system and to support spatial analyses. Recently, the Advanced Resource Technology (ART) facility has added...

  9. S-World: A high resolution global soil database for simulation modelling (Invited)

    NASA Astrophysics Data System (ADS)

    Stoorvogel, J. J.

    2013-12-01

    There is an increasing call for high resolution soil information at the global level. A good example of such a call is the Global Gridded Crop Model Intercomparison carried out within AgMIP. While local studies can make use of surveying techniques to collect additional data, this is practically impossible at the global level. It is therefore important to rely on legacy data like the Harmonized World Soil Database. Several efforts do exist that aim at the development of global gridded soil property databases. These estimates of the variation of soil properties can be used to assess, e.g., global soil carbon stocks. However, they do not allow for simulation runs with, e.g., crop growth simulation models, as these models require a description of the entire pedon rather than a few soil properties. This study provides the required quantitative description of pedons at a 1 km resolution for simulation modelling. It uses the Harmonized World Soil Database (HWSD) for the spatial distribution of soil types, the ISRIC-WISE soil profile database to derive information on soil properties per soil type, and a range of co-variables on topography, climate, and land cover to further disaggregate the available data. The methodology aims to take stock of these available data. The soil database is developed in five main steps. Step 1: All 148 soil types are ordered on the basis of their expected topographic position using, e.g., drainage, salinization, and pedogenesis. Using the topographic ordering and combining the HWSD with a digital elevation model allows for the spatial disaggregation of the composite soil units. This results in a new soil map with homogeneous soil units. Step 2: The ranges of major soil properties for the topsoil and subsoil of each of the 148 soil types are derived from the ISRIC-WISE soil profile database.
    Step 3: A model of soil formation is developed that focuses on the basic conceptual question of where we are within the range of a particular soil property at a particular location, given a specific soil type. The soil properties are predicted for each grid cell based on the soil type, the corresponding ranges of soil properties, and the co-variables. Step 4: Standard depth profiles are developed for each of the soil types using the diagnostic criteria of the soil types and soil profile information from the ISRIC-WISE database. The standard soil profiles are combined with the predicted values for the topsoil and subsoil, yielding unique soil profiles at each location. Step 5: In a final step, additional soil properties are added to the database using averages for the soil types and pedo-transfer functions. The methodology, denominated S-World (Soils of the World), results in readily available global maps with quantitative pedon data for modelling purposes. It forms the basis for the Global Gridded Crop Model Intercomparison carried out within AgMIP.
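    The core of step 3 is placing a grid cell within the per-soil-type property range using co-variables. A deliberately simplified sketch of that idea (a linear scaling of one normalized co-variable; the published S-World model of soil formation is more elaborate, and all numbers below are hypothetical):

```python
def disaggregate_property(prop_min, prop_max, covariate, cov_min, cov_max):
    """Place a soil property within its per-soil-type [prop_min, prop_max]
    range by linearly scaling a normalized co-variable (e.g., relative
    elevation within the soil unit). A toy stand-in for S-World's step 3."""
    frac = (covariate - cov_min) / (cov_max - cov_min)
    return prop_min + frac * (prop_max - prop_min)

# Hypothetical topsoil organic carbon range 1.0-3.0% for one soil type;
# a mid-slope grid cell (relative elevation 55 on a 10-100 scale)
print(disaggregate_property(1.0, 3.0, covariate=55.0, cov_min=10.0, cov_max=100.0))  # -> 2.0
```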

  10. Image Format Conversion to DICOM and Lookup Table Conversion to Presentation Value of the Japanese Society of Radiological Technology (JSRT) Standard Digital Image Database.

    PubMed

    Yanagita, Satoshi; Imahana, Masato; Suwa, Kazuaki; Sugimura, Hitomi; Nishiki, Masayuki

    2016-01-01

    The Japanese Society of Radiological Technology (JSRT) standard digital image database contains many useful cases of chest X-ray images and has been used in much state-of-the-art research. However, the pixel values of all the images are simply digitized as relative density values using a film digitizer. As a result, the pixel values are completely different from the standardized display system input value of digital imaging and communications in medicine (DICOM), called presentation value (P-value), which can maintain visual consistency when observing images on displays of different luminance. Therefore, we converted all the images in the JSRT standard digital image database to DICOM format, followed by conversion of the pixel values to P-values using an original program that we developed. Consequently, the JSRT standard digital image database has been modified so that the visual consistency of images is maintained among displays of different luminance.
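    The conversion above maps film-density pixel values into the P-value input space. A heavily simplified sketch of such a mapping (a linear rescale with density inversion, since high film density is dark; the actual JSRT conversion applies the DICOM Grayscale Standard Display Function lookup, not this toy mapping, and the 12-bit depth is an assumption):

```python
def density_to_pvalue(pixels, bits=12):
    """Map relative film-density values onto the presentation-value range
    [0, 2**bits - 1], inverting so that maximum density maps to 0 (dark).
    Illustrative only; a GSDF-based lookup table is the standard approach."""
    lo, hi = min(pixels), max(pixels)
    top = 2 ** bits - 1
    return [round((hi - p) / (hi - lo) * top) for p in pixels]

# Hypothetical digitizer values: low density (bright) to high density (dark)
print(density_to_pvalue([0, 512, 1023, 2047]))
```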

  11. Forensic Tools to Track and Connect Physical Samples to Related Data

    NASA Astrophysics Data System (ADS)

    Molineux, A.; Thompson, A. C.; Baumgardner, R. W.

    2016-12-01

    Identifiers, such as local sample numbers, are critical to successfully connecting physical samples and related data. However, identifiers must be globally unique. The International Geo Sample Number (IGSN), generated when registering the sample in the System for Earth Sample Registration (SESAR), provides a globally unique alphanumeric code associated with basic metadata, related samples, and their current physical storage location. When registered samples are published, users can link the figured samples to the basic metadata held at SESAR. The use cases we discuss include plant specimens from a Permian core, Holocene corals and derived powders, and thin sections with SEM stubs. Much of this material is now published. The plant taxonomic study from the core is a digital pdf, and samples can be directly linked from the captions to the SESAR record. The study of stable isotopes from the corals is not yet digitally available, but individual samples are accessible. Full data and media records for both studies are located in our database, where higher quality images, field notes, and section diagrams may exist. Georeferences permit mapping in current and deep time plate configurations. Several aspects emerged during this study. First, ensure that adequate and consistent details are registered with SESAR. Second, educate and encourage the researcher to obtain IGSNs. Third, publish the archive numbers, assigned prior to publication, alongside the IGSN. This provides access to further data through an Integrated Publishing Toolkit (IPT), aggregators, or online repository databases, thus placing the initial sample in a much richer context for future studies. Fourth, encourage software developers to customize community software to extract data from a database and use it to register samples in bulk. This would improve workflow and provide a path for registration of large legacy collections.

  12. The Global Genome Biodiversity Network (GGBN) Data Standard specification

    PubMed Central

    Droege, G.; Barker, K.; Seberg, O.; Coddington, J.; Benson, E.; Berendsohn, W. G.; Bunk, B.; Butler, C.; Cawsey, E. M.; Deck, J.; Döring, M.; Flemons, P.; Gemeinholzer, B.; Güntsch, A.; Hollowell, T.; Kelbert, P.; Kostadinov, I.; Kottmann, R.; Lawlor, R. T.; Lyal, C.; Mackenzie-Dodds, J.; Meyer, C.; Mulcahy, D.; Nussbeck, S. Y.; O'Tuama, É.; Orrell, T.; Petersen, G.; Robertson, T.; Söhngen, C.; Whitacre, J.; Wieczorek, J.; Yilmaz, P.; Zetzsche, H.; Zhang, Y.; Zhou, X.

    2016-01-01

    Genomic samples of non-model organisms are becoming increasingly important in a broad range of studies, from developmental biology and biodiversity analyses to conservation. Genomic sample definition, description, quality, voucher information, and metadata all need to be digitized and disseminated across scientific communities. This information needs to be concise and consistent in today's ever-increasing bioinformatic era, so that complementary data aggregators can easily map databases to one another. In order to facilitate exchange of information on genomic samples and their derived data, the Global Genome Biodiversity Network (GGBN) Data Standard is intended to provide a platform based on a documented agreement to promote the efficient sharing and usage of genomic sample material and associated specimen information in a consistent way. The new data standard presented here builds upon existing standards commonly used within the community, extending them with the capability to exchange data on tissue, environmental, and DNA samples as well as sequences. The GGBN Data Standard will reveal and democratize the hidden contents of biodiversity biobanks, for the convenience of everyone in the wider biobanking community. Technical tools exist for data providers to easily map their databases to the standard. Database URL: http://terms.tdwg.org/wiki/GGBN_Data_Standard PMID:27694206

  13. Spatial digital database for the tectonic map of Southeast Arizona

    USGS Publications Warehouse

    map by Drewes, Harald; digital database by Fields, Robert A.; Hirschberg, Douglas M.; Bolm, Karen S.

    2002-01-01

    A spatial database was created for Drewes' (1980) tectonic map of southeast Arizona; this database supersedes Drewes and others (2001, ver. 1.0). Staff and a contractor at the U.S. Geological Survey in Tucson, Arizona completed an interim digital geologic map database for the east part of the map in 2001, made revisions to the previously released digital data for the west part of the map (Drewes and others, 2001, ver. 1.0), merged data files for the east and west parts, and added additional data not previously captured. Digital base map data files (such as topography, roads, towns, rivers, and lakes) are not included; they may be obtained from a variety of commercial and government sources. This digital geospatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information in a geographic information system (GIS) for use in spatial analysis. The resulting digital geologic map database can be queried in many ways to produce a variety of geologic maps and derivative products. Because Drewes' (1980) map sheets include additional text and graphics that were not included in this report, scanned images of his maps (i1109_e.jpg, i1109_w.jpg) are included as a courtesy to the reader. This database should not be used or displayed at any scale larger than 1:125,000 (for example, 1:100,000 or 1:24,000). The digital geologic map plot files (i1109_e.pdf and i1109_w.pdf) that are provided herein are representations of the database (see Appendix A). The map area is located in southeastern Arizona (fig. 1). This report describes the map units (from Drewes, 1980), the methods used to convert the geologic map data into a digital format, and the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. The manuscript and digital data review by Helen Kayser (Information Systems Support, Inc.) is greatly appreciated.

  14. Preliminary geologic map of the Piru 7.5' quadrangle, southern California: a digital database

    USGS Publications Warehouse

    Yerkes, R.F.; Campbell, Russell H.

    1995-01-01

    This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. This digital map database is compiled from previously published sources combined with some new mapping and modifications in nomenclature. The geologic map database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U. S. Geological Survey. For detailed descriptions of the units, their stratigraphic relations and sources of geologic mapping consult Yerkes and Campbell (1995). More specific information about the units may be available in the original sources.

  15. Quaternary Geology and Liquefaction Susceptibility, San Francisco, California 1:100,000 Quadrangle: A Digital Database

    USGS Publications Warehouse

    Knudsen, Keith L.; Noller, Jay S.; Sowers, Janet M.; Lettis, William R.

    1997-01-01

    This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There are no paper maps included in the Open-File report. The report does include, however, PostScript plot files containing the images of the geologic map sheets with explanations, as well as the accompanying text describing the geology of the area. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled 'For Those Who Aren't Familiar With Digital Geologic Map Databases' below. This digital map database, compiled from previously unpublished data, and new mapping by the authors, represents the general distribution of surficial deposits in the San Francisco bay region. Together with the accompanying text file (sf_geo.txt or sf_geo.pdf), it provides current information on Quaternary geology and liquefaction susceptibility of the San Francisco, California, 1:100,000 quadrangle. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:100,000 or smaller. The content and character of the database, as well as three methods of obtaining the database, are described below.

  16. Digital images in the map revision process

    NASA Astrophysics Data System (ADS)

    Newby, P. R. T.

    Progress towards the adoption of digital (or softcopy) photogrammetric techniques for database and map revision is reviewed. Particular attention is given to the Ordnance Survey of Great Britain, the author's former employer, where digital processes are under investigation but have not yet been introduced for routine production. Developments which may lead to increasing automation of database update processes appear promising, but because of the cost and practical problems associated with managing as well as updating large digital databases, caution is advised when considering the transition to softcopy photogrammetry for revision tasks.

  17. Digital data storage systems, computers, and data verification methods

    DOEpatents

    Groeneveld, Bennett J.; Austad, Wayne E.; Walsh, Stuart C.; Herring, Catherine A.

    2005-12-27

    Digital data storage systems, computers, and data verification methods are provided. According to a first aspect of the invention, a computer includes an interface adapted to couple with a dynamic database; and processing circuitry configured to provide a first hash from digital data stored within a portion of the dynamic database at an initial moment in time, to provide a second hash from digital data stored within the portion of the dynamic database at a subsequent moment in time, and to compare the first hash and the second hash.
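    The first-hash/second-hash comparison described in the patent abstract can be sketched in a few lines. The table contents, hashing of sorted JSON, and SHA-256 choice below are illustrative assumptions, not details from the patent:

```python
import hashlib
import json

def snapshot_hash(records):
    """Hash a portion of a database deterministically by serializing it
    as sorted JSON and digesting with SHA-256."""
    blob = json.dumps(records, sort_keys=True).encode("utf-8")
    return hashlib.sha256(blob).hexdigest()

table = {"id1": {"value": 42}, "id2": {"value": 7}}
first = snapshot_hash(table)   # first hash, at the initial moment in time
table["id2"]["value"] = 8      # the dynamic database changes over time
second = snapshot_hash(table)  # second hash, at a subsequent moment
print(first != second)         # -> True: the mismatch reveals the change
```

    Comparing two digests verifies whether the stored portion changed without retaining a full copy of the earlier data.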

  18. Publications - MP 141 | Alaska Division of Geological & Geophysical Surveys

    Science.gov Websites

    DGGS MP 141 Publication Details. Title: Quaternary faults and folds in Alaska: A digital database. Combellick, R.A., 2012, Quaternary faults and folds in Alaska: A digital database, in Koehler, R.D., Quaternary faults, scale 1:3,700,000 (63.0 M). Digital Geospatial Data.

  19. A digital future for the history of psychology?

    PubMed

    Green, Christopher D

    2016-08-01

    This article discusses the role that digital approaches to the history of psychology are likely to play in the near future. A tentative hierarchy of digital methods is proposed. A few examples are briefly described: a digital repository, a simple visualization using ready-made online database and tools, and more complex visualizations requiring the assembly of the database and, possibly, the analytic tools by the researcher. The relationship of digital history to the old "New Economic History" (Cliometrics) is considered. The question of whether digital history and traditional history need be at odds or, instead, might complement each other is woven throughout. The rapidly expanding territory of digital humanistic research outside of psychology is briefly discussed. Finally, the challenging current employment trends in history and the humanities more broadly are considered, along with the role that digital skills might play in mitigating those factors for prospective academic workers. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  20. Dictionary as Database.

    ERIC Educational Resources Information Center

    Painter, Derrick

    1996-01-01

    Discussion of dictionaries as databases focuses on the digitizing of The Oxford English dictionary (OED) and the use of Standard Generalized Mark-Up Language (SGML). Topics include the creation of a consortium to digitize the OED, document structure, relational databases, text forms, sequence, and discourse. (LRW)

  1. [LONI & Co: about the epistemic specificity of digital spaces of knowledge in cognitive neuroscience].

    PubMed

    Huber, Lara

    2011-06-01

    In the neurosciences, digital databases are increasingly becoming important tools for data rendering and distribution. This development is due to the growing impact of imaging-based trial design in cognitive neuroscience, including morphological as well as functional imaging technologies. As the case of the 'Laboratory of Neuro Imaging' (LONI) shows, databases are attributed a specific epistemological power: since the 1990s, databasing has been seen to foster the integration of neuroscientific data, although local regimes of data production, manipulation, and interpretation are also challenging this development. Databasing in the neurosciences goes along with the introduction of new structures for integrating local data, hence establishing digital spaces of knowledge (epistemic spaces). At this stage, inherent norms of digital databases are affecting regimes of imaging-based trial design, for example clinical research into Alzheimer's disease.

  2. Spatial digital database of the geologic map of Catalina Core Complex and San Pedro Trough, Pima, Pinal, Gila, Graham, and Cochise counties, Arizona

    USGS Publications Warehouse

    Dickinson, William R.; digital database by Hirschberg, Douglas M.; Pitts, G. Stephen; Bolm, Karen S.

    2002-01-01

    The geologic map of Catalina Core Complex and San Pedro Trough by Dickinson (1992) was digitized for input into a geographic information system (GIS) by the U.S. Geological Survey staff and contractors in 2000-2001. This digital geospatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information in a geographic information system (GIS) for use in spatial analysis. The resulting digital geologic map database data can be queried in many ways to produce a variety of geologic maps and derivative products. Digital base map data (topography, roads, towns, rivers, lakes, and so forth) are not included; they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:125,000 (for example, 1:100,000 or 1:24,000). The digital geologic map plot files that are provided herein are representations of the database. The map area is located in southern Arizona. This report lists the geologic map units, the methods used to convert the geologic map data into a digital format, the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. The manuscript and digital data review by Lorre Moyer (USGS) is greatly appreciated.

  3. Spatial digital database for the geologic map of the east part of the Pullman 1° x 2° quadrangle, Idaho

    USGS Publications Warehouse

    Rember, William C.; Bennett, Earl H.

    2001-01-01

    The paper geologic map of the east part of the Pullman 1° x 2° quadrangle, Idaho (Rember and Bennett, 1979) was scanned and initially attributed by Optronics Specialty Co., Inc. (Northridge, CA) and remitted to the U.S. Geological Survey for further attribution and publication of the geospatial digital files. The resulting digital geologic map GIS can be queried in many ways to produce a variety of geologic maps. This digital geospatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information in a geographic information system (GIS) for use in spatial analysis. Digital base map data files (topography, roads, towns, rivers and lakes, and others) are not included; they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:250,000 (for example, 1:100,000 or 1:24,000). The digital geologic map graphics and plot files (pull250k.gra/.hp/.eps) that are provided in the digital package are representations of the digital database.

  4. Operative record using intraoperative digital data in neurosurgery.

    PubMed

    Houkin, K; Kuroda, S; Abe, H

    2000-01-01

The purpose of this study was to develop a new method for more efficient and accurate operative records in neurosurgery using intraoperative digital data, covering both macroscopic procedures and microscopic procedures under an operating microscope. Macroscopic procedures were recorded using a digital camera, and microscopic procedures were recorded using a microdigital camera attached to the operating microscope. Operative records were then recorded digitally and filed in a computer using image-retouch software and database software. The time necessary for editing the digital data and completing the record was less than 30 minutes. Once these operative records are digitally filed, they are easily transferred and used as a database. Using digital operative records along with digital photography, neurosurgeons can document their procedures more accurately and efficiently than by the conventional method (handwriting). A complete digital operative record is not only accurate but also time saving. Construction of a database, data transfer, and desktop publishing can be achieved using the intraoperative data, including intraoperative photographs.

  5. Global Drainage Patterns to Modern Terrestrial Sedimentary Basins and its Influence on Large River Systems

    NASA Astrophysics Data System (ADS)

    Nyberg, B.; Helland-Hansen, W.

    2017-12-01

Long-term preservation of alluvial sediments is dependent on the hydrological processes that deposit sediments solely within an area that has available accommodation space and net subsidence, known as a sedimentary basin. An understanding of the river processes contributing to terrestrial sedimentary basins is essential to fundamentally constrain and quantify controls on the modern terrestrial sink. Furthermore, the terrestrial source-to-sink controls place constraints on the entire coastal, shelf, and deep marine sediment routing systems. In addition, the geographical importance of modern terrestrial sedimentary basins for agriculture and human settlements has resulted in significant upstream anthropogenic catchment modification for irrigation and energy needs. Yet, to our knowledge, no global catchment model depicting the drainage patterns to modern terrestrial sedimentary basins has previously been established that may be used to address these challenging issues. Here we present a new database of 180,737 global catchments that show the surface drainage patterns to modern terrestrial sedimentary basins. This is achieved by using high-resolution river networks derived from digital elevation models, in relation to newly acquired maps of global modern sedimentary basins, to identify terrestrial sinks. The results show that active tectonic regimes are typically characterized by larger terrestrial sedimentary basins, numerous smaller source catchments, and a high source-to-sink relief ratio. By contrast, passive margins drain to smaller terrestrial sedimentary basins through fewer, relatively larger source catchments with a lower source-to-sink relief ratio. The different geomorphological characteristics of source catchments by tectonic setting influence the spatial and temporal patterns of fluvial architecture within sedimentary basins and the anthropogenic methods of exploiting those rivers.
The new digital database resource aims to help the geoscientific community contribute further to our quantitative understanding of source-to-sink systems and their allogenic and autogenic controls, geomorphological characteristics, terrestrial sediment transit times, and the anthropogenic impact on those systems.
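The core step described above, tracing which cells drain to a given terrestrial sink through a DEM-derived river network, can be sketched as an upstream walk over a drainage graph. The following toy example is my own illustration, not code or data from the study; cell names and the tiny network are hypothetical.

```python
# Sketch of catchment extraction: given flow directions derived from a DEM
# (each cell drains to exactly one downstream cell), collect every cell
# whose flow path ends at a terminal "sink" cell, here standing in for a
# terrestrial sedimentary basin outlet.

def upstream_catchment(flow_to, sink):
    """flow_to maps each cell to the cell it drains into (None at sinks).
    Returns the set of cells whose flow path ends at `sink`."""
    # Invert the drainage graph: cell -> cells draining directly into it.
    upstream = {}
    for cell, target in flow_to.items():
        upstream.setdefault(target, []).append(cell)
    # Walk upstream from the sink, accumulating contributing cells.
    catchment, frontier = {sink}, [sink]
    while frontier:
        cell = frontier.pop()
        for up in upstream.get(cell, []):
            if up not in catchment:
                catchment.add(up)
                frontier.append(up)
    return catchment

# Toy drainage network: a, b -> c; c, d -> sink s; e drains elsewhere (x).
flow_to = {"a": "c", "b": "c", "c": "s", "d": "s", "e": "x",
           "s": None, "x": None}
print(sorted(upstream_catchment(flow_to, "s")))  # -> ['a', 'b', 'c', 'd', 's']
```

Run over every sink cell in a gridded flow-direction raster, this same walk partitions the surface into the kind of per-sink catchments the database compiles.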

  6. Global rates of habitat loss and implications for amphibian conservation

    USGS Publications Warehouse

    Gallant, Alisa L.; Klaver, R.W.; Casper, G.S.; Lannoo, M.J.

    2007-01-01

A large number of factors are known to affect amphibian population viability, but most authors agree that the principal causes of amphibian declines are habitat loss, alteration, and fragmentation. We provide a global assessment of land use dynamics in the context of amphibian distributions. We accomplished this by compiling global maps of amphibian species richness and recent rates of change in land cover, land use, and human population growth. The amphibian map was developed using a combination of published literature and digital databases. We used an ecoregion framework to help interpret species distributions across environmental, rather than political, boundaries. We mapped rates of land cover and use change with statistics from the World Resources Institute, refined with a global digital dataset on land cover derived from satellite data. Temporal maps of human population were developed from the World Resources Institute database and other published sources. Our resultant map of amphibian species richness illustrates that amphibians are distributed in an uneven pattern around the globe, preferring terrestrial and freshwater habitats in ecoregions that are warm and moist. Spatiotemporal patterns of human population show that, prior to the 20th century, population growth and spread were slower, most extensive in the temperate ecoregions, and largely exclusive of major regions of high amphibian richness. Since the beginning of the 20th century, human population growth has been exponential and has occurred largely in the subtropical and tropical ecoregions favored by amphibians. Population growth has been accompanied by broad-scale changes in land cover and land use, typically in support of agriculture. We merged information on land cover, land use, and human population growth to generate a composite map showing the rates at which humans have been changing the world.
When compared with the map of amphibian species richness, we found that many of the regions of the earth supporting the richest assemblages of amphibians are currently undergoing the highest rates of landscape modification.
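The merging of land-cover change, land-use change, and population-growth layers into a composite rate map can be sketched as a raster overlay. The normalization and equal weighting below are my assumptions for illustration, not the authors' documented weighting scheme, and the numbers are synthetic.

```python
import numpy as np

# Illustrative compositing of per-cell change layers (land-cover change,
# land-use change, population growth) into one landscape-modification-rate
# map: rescale each layer to 0-1, then average. Equal weights are an
# assumption, not the study's method.

def normalize(layer):
    lo, hi = layer.min(), layer.max()
    return (layer - lo) / (hi - lo) if hi > lo else np.zeros_like(layer)

def composite(*layers):
    return np.mean([normalize(layer) for layer in layers], axis=0)

# Synthetic 2x2 rasters (e.g. percent change per decade).
land_cover = np.array([[0.0, 2.0], [4.0, 8.0]])
land_use   = np.array([[1.0, 1.0], [3.0, 5.0]])
pop_growth = np.array([[0.5, 1.5], [2.5, 4.5]])

rates = composite(land_cover, land_use, pop_growth)
print(rates)  # highest composite modification rate in the lower-right cell
```

Overlaying such a composite on a species-richness raster is then a matter of comparing the two grids cell by cell.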

  7. From ecological records to big data: the invention of global biodiversity.

    PubMed

    Devictor, Vincent; Bensaude-Vincent, Bernadette

    2016-12-01

This paper is a critical assessment of the epistemological impact of the systematic quantification of nature, with the accumulation of big datasets, on the practice and orientation of ecological science. We examine the contents of big databases and argue that they are not just accumulated information; records are translated into digital data in a process that changes their meanings. In order to better understand what is at stake in the 'datafication' process, we explore the context for the emergence and quantification of biodiversity in the 1980s, along with the concept of the global environment. In tracing the origin and development of the Global Biodiversity Information Facility (GBIF), we describe big data biodiversity projects as a techno-political construction dedicated to monitoring a new object: global biodiversity. We argue that biodiversity big data became a powerful driver behind the invention of the concept of the global environment, and a way to embed ecological science in the political agenda.

  8. Geology of Point Reyes National Seashore and vicinity, California: a digital database

    USGS Publications Warehouse

Clark, Joseph C.; Brabb, Earl E.

    1997-01-01

    This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. The report does include, however, a PostScript plot file containing an image of the geologic map sheet with explanation, as well as the accompanying text describing the geology of the area. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled 'For Those Who Aren't Familiar With Digital Geologic Map Databases' below. This digital map database, compiled from previously published and unpublished data and new mapping by the authors, represents the general distribution of surficial deposits and rock units in Point Reyes and surrounding areas. Together with the accompanying text file (pr-geo.txt or pr-geo.ps), it provides current information on the stratigraphy and structural geology of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:48,000 or smaller.

  9. The development of digital library system for drug research information.

    PubMed

    Kim, H J; Kim, S R; Yoo, D S; Lee, S H; Suh, O K; Cho, J H; Shin, H T; Yoon, J P

    1998-01-01

The sophistication of computer technology and information transmission on the Internet has made various cyber information repositories available to information consumers. In the era of the information superhighway, the digital library, which can be accessed from remote sites at any time, is considered the prototype of the information repository. Using an object-oriented DBMS, the very first model of a digital library for pharmaceutical researchers and related professionals in Korea has been developed. Published research papers and researchers' personal information were included in the database. For the database of research papers, 13 domestic journals were abstracted and scanned into full-text image files that can be viewed with Internet web browsers. The database of researchers' personal information was also developed and interlinked to the database of research papers. These databases will be continuously updated and will be combined with worldwide information as the unique digital library in the field of pharmacy.

  10. Digital database architecture and delineation methodology for deriving drainage basins, and a comparison of digitally and non-digitally derived numeric drainage areas

    USGS Publications Warehouse

    Dupree, Jean A.; Crowfoot, Richard M.

    2012-01-01

    The drainage basin is a fundamental hydrologic entity used for studies of surface-water resources and during planning of water-related projects. Numeric drainage areas published by the U.S. Geological Survey water science centers in Annual Water Data Reports and on the National Water Information Systems (NWIS) Web site are still primarily derived from hard-copy sources and by manual delineation of polygonal basin areas on paper topographic map sheets. To expedite numeric drainage area determinations, the Colorado Water Science Center developed a digital database structure and a delineation methodology based on the hydrologic unit boundaries in the National Watershed Boundary Dataset. This report describes the digital database architecture and delineation methodology and also presents the results of a comparison of the numeric drainage areas derived using this digital methodology with those derived using traditional, non-digital methods. (Please see report for full Abstract)

  11. Governing Software: Networks, Databases and Algorithmic Power in the Digital Governance of Public Education

    ERIC Educational Resources Information Center

    Williamson, Ben

    2015-01-01

    This article examines the emergence of "digital governance" in public education in England. Drawing on and combining concepts from software studies, policy and political studies, it identifies some specific approaches to digital governance facilitated by network-based communications and database-driven information processing software…

  12. Integrating Digital Images into the Art and Art History Curriculum.

    ERIC Educational Resources Information Center

    Pitt, Sharon P.; Updike, Christina B.; Guthrie, Miriam E.

    2002-01-01

    Describes an Internet-based image database system connected to a flexible, in-class teaching and learning tool (the Madison Digital Image Database) developed at James Madison University to bring digital images to the arts and humanities classroom. Discusses content, copyright issues, ensuring system effectiveness, instructional impact, sharing the…

  13. The NASA ADS Abstract Service and the Distributed Astronomy Digital Library [and] Project Soup: Comparing Evaluations of Digital Collection Efforts [and] Cross-Organizational Access Management: A Digital Library Authentication and Authorization Architecture [and] BibRelEx: Exploring Bibliographic Databases by Visualization of Annotated Content-based Relations [and] Semantics-Sensitive Retrieval for Digital Picture Libraries [and] Encoded Archival Description: An Introduction and Overview.

    ERIC Educational Resources Information Center

    Kurtz, Michael J.; Eichorn, Guenther; Accomazzi, Alberto; Grant, Carolyn S.; Demleitner, Markus; Murray, Stephen S.; Jones, Michael L. W.; Gay, Geri K.; Rieger, Robert H.; Millman, David; Bruggemann-Klein, Anne; Klein, Rolf; Landgraf, Britta; Wang, James Ze; Li, Jia; Chan, Desmond; Wiederhold, Gio; Pitti, Daniel V.

    1999-01-01

    Includes six articles that discuss a digital library for astronomy; comparing evaluations of digital collection efforts; cross-organizational access management of Web-based resources; searching scientific bibliographic databases based on content-based relations between documents; semantics-sensitive retrieval for digital picture libraries; and…

  14. Global relationships in river hydromorphology

    NASA Astrophysics Data System (ADS)

    Pavelsky, T.; Lion, C.; Allen, G. H.; Durand, M. T.; Schumann, G.; Beighley, E.; Yang, X.

    2017-12-01

Since the widespread adoption of digital elevation models (DEMs) in the 1980s, most global and continental-scale analysis of river flow characteristics has focused on measurements derived from DEMs, such as drainage area, elevation, and slope. These variables (especially drainage area) have been related to other quantities of interest, such as river width, depth, and velocity, via empirical relationships that often take the form of power laws. More recently, a number of groups have developed more direct measurements of river location and some aspects of planform geometry from optical satellite imagery on regional, continental, and global scales. However, these satellite-derived datasets often lack many of the qualities that make DEM-derived datasets attractive, including robust network topology. Here, we present analysis of a dataset that combines the Global River Widths from Landsat (GRWL) database of river location, width, and braiding index with a river database extracted from the Shuttle Radar Topography Mission DEM and the HydroSHEDS dataset. Using these combined tools, we present a dataset that includes measurements of river width, slope, braiding index, upstream drainage area, and other variables. The dataset is available everywhere that both source datasets are available, which includes all continental areas south of 60° N with rivers sufficiently large to be observed with Landsat imagery. We use the dataset to examine patterns and frequencies of river form across continental and global scales, as well as global relationships among variables including width, slope, and drainage area. The results demonstrate the complex relationships among different dimensions of river hydromorphology at the global scale.
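The empirical power laws mentioned above (for example, width W = a * A^b for drainage area A) are conventionally fit by linear regression in log-log space. The sketch below uses synthetic values generated from a known exponent purely to illustrate the fitting step; the numbers are not GRWL or HydroSHEDS data.

```python
import numpy as np

# Fit W = a * A**b by least squares on log(W) vs log(A):
# log(W) = log(a) + b * log(A), a straight line in log-log space.

def fit_power_law(area, width):
    b, log_a = np.polyfit(np.log(area), np.log(width), 1)
    return np.exp(log_a), b  # coefficient a, exponent b

area  = np.array([1e2, 1e3, 1e4, 1e5, 1e6])  # drainage areas (km^2)
width = 5.0 * area ** 0.5                    # exact synthetic law, no noise
a, b = fit_power_law(area, width)
print(round(a, 3), round(b, 3))  # recovers a = 5.0, b = 0.5
```

With real measurements the same regression yields the scatter around the fitted line, which is one way the "complex relationships" noted in the abstract show up quantitatively.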

  15. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    NASA Astrophysics Data System (ADS)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard DRMs will be used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions.
A series of verification checks have been implemented, including comparisons against ICESat altimetry for selected regions with tall vegetation and high relief. The extensive verification effort by the Receiver Algorithm team at GSFC is aimed at assuring that the onboard databases are sufficiently accurate. We will present the results of those assessments and verification tests, along with measures taken to implement modifications to the databases to optimize their use by the receiver algorithms. Companion presentations by McGarry et al. and Leigh et al. describe the details on the ATLAS Onboard Receiver Algorithms and databases development, respectively.
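The histogramming logic described above (correlated surface echoes rise above randomly distributed noise) can be sketched in a few lines. This is my illustration of the general technique, not the documented ATLAS algorithm: the Poisson-style threshold of median + 5 sigma and all numbers are assumptions.

```python
import numpy as np

# Simulate photon event times: uniform background noise plus a surface
# echo clustered near t = 42 us, then find histogram bins whose counts
# are statistically significant relative to the background level.
rng = np.random.default_rng(0)
noise  = rng.uniform(0.0, 100.0, 2000)   # background photon times (us)
signal = rng.normal(42.0, 0.3, 300)      # correlated surface echoes
events = np.concatenate([noise, signal])

counts, edges = np.histogram(events, bins=100, range=(0.0, 100.0))
background = np.median(counts)                        # robust noise estimate
threshold  = background + 5.0 * np.sqrt(background)   # illustrative cut
signal_bins = np.flatnonzero(counts > threshold)
print(edges[signal_bins])  # left edges of the bins flagged as signal
```

Only the bins around 42 us exceed the threshold, so the telemetry band can be centered there; the onboard DRM would then set that band's vertical width.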

  16. Mars global digital dune database: MC-30

    USGS Publications Warehouse

    Hayward, R.K.; Fenton, L.K.; Titus, T.N.; Colaprete, A.; Christensen, P.R.

    2012-01-01

    The Mars Global Digital Dune Database (MGD3) provides data and describes the methodology used in creating the global database of moderate- to large-size dune fields on Mars. The database is being released in a series of U.S. Geological Survey Open-File Reports. The first report (Hayward and others, 2007) included dune fields from lat 65° N. to 65° S. (http://pubs.usgs.gov/of/2007/1158/). The second report (Hayward and others, 2010) included dune fields from lat 60° N. to 90° N. (http://pubs.usgs.gov/of/2010/1170/). This report encompasses ~75,000 km2 of mapped dune fields from lat 60° to 90° S. The dune fields included in this global database were initially located using Mars Odyssey Thermal Emission Imaging System (THEMIS) Infrared (IR) images. In the previous two reports, some dune fields may have been unintentionally excluded for two reasons: (1) incomplete THEMIS IR (daytime) coverage may have caused us to exclude some moderate- to large-size dune fields or (2) resolution of THEMIS IR coverage (100 m/pixel) certainly caused us to exclude smaller dune fields. In this report, mapping is more complete. The Arizona State University THEMIS daytime IR mosaic provided complete IR coverage, and it is unlikely that we missed any large dune fields in the South Pole (SP) region. In addition, the increased availability of higher resolution images resulted in the inclusion of more small (~1 km2) sand dune fields and sand patches. To maintain consistency with the previous releases, we have identified the sand features that would not have been included in earlier releases. While the moderate to large dune fields in MGD3 are likely to constitute the largest compilation of sediment on the planet, we acknowledge that our database excludes numerous small dune fields and some moderate to large dune fields as well. Please note that the absence of mapped dune fields does not mean that dune fields do not exist and is not intended to imply a lack of saltating sand in other areas. 
Where availability and quality of THEMIS visible (VIS), Mars Orbiter Camera (MOC) narrow angle, Mars Express High Resolution Stereo Camera, or Mars Reconnaissance Orbiter Context Camera and High Resolution Imaging Science Experiment images allowed, we classified dunes and included some dune slipface measurements, which were derived from gross dune morphology and represent the approximate prevailing wind direction at the last time of significant dune modification. It was beyond the scope of this report to look at the detail needed to discern subtle dune modification. It was also beyond the scope of this report to measure all slipfaces. We attempted to include enough slipface measurements to represent the general circulation (as implied by gross dune morphology) and to give a sense of the complex nature of aeolian activity on Mars. The absence of slipface measurements in a given direction should not be taken as evidence that winds in that direction did not occur. When a dune field was located within a crater, the azimuth from crater centroid to dune field centroid was calculated, as another possible indicator of wind direction. Output from a general circulation model is also included. In addition to polygons locating dune fields, the database includes ~700 of the THEMIS VIS and MOC images that were used to build the database.

  17. Spatial Digital Database for the Geologic Map of Oregon

    USGS Publications Warehouse

    Walker, George W.; MacLeod, Norman S.; Miller, Robert J.; Raines, Gary L.; Connors, Katherine A.

    2003-01-01

This report describes and makes available a geologic digital spatial database (orgeo) representing the geologic map of Oregon (Walker and MacLeod, 1991). The original paper publication was printed as a single map sheet at a scale of 1:500,000, accompanied by a second sheet containing map unit descriptions and ancillary data. A digital version of the Walker and MacLeod (1991) map was included in Raines and others (1996). The dataset provided by this open-file report supersedes the earlier published digital version (Raines and others, 1996). This digital spatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information for use in spatial analysis in a geographic information system (GIS). This database can be queried in many ways to produce a variety of geologic maps. This database is not meant to be used or displayed at any scale larger than 1:500,000 (for example, 1:100,000). This report describes the methods used to convert the geologic map data into a digital format, describes the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. Scanned images of the printed map (Walker and MacLeod, 1991), their correlation of map units, and their explanation of map symbols are also available for download.

  18. Digital food photography: Dietary surveillance and beyond

    USDA-ARS?s Scientific Manuscript database

    The method used for creating a database of approximately 20,000 digital images of multiple portion sizes of foods linked to the USDA's Food and Nutrient Database for Dietary Studies (FNDDS) is presented. The creation of this database began in 2002, and its development has spanned 10 years. Initially...

  19. Global Seismicity: Three New Maps Compiled with Geographic Information Systems

    NASA Technical Reports Server (NTRS)

    Lowman, Paul D., Jr.; Montgomery, Brian C.

    1996-01-01

    This paper presents three new maps of global seismicity compiled from NOAA digital data, covering the interval 1963-1998, with three different magnitude ranges (mb): greater than 3.5, less than 3.5, and all detectable magnitudes. A commercially available geographic information system (GIS) was used as the database manager. Epicenter locations were acquired from a CD-ROM supplied by the National Geophysical Data Center. A methodology is presented that can be followed by general users. The implications of the maps are discussed, including the limitations of conventional plate models, and the different tectonic behavior of continental vs. oceanic lithosphere. Several little-known areas of intraplate or passive margin seismicity are also discussed, possibly expressing horizontal compression generated by ridge push.

  20. Intrusive Rock Database for the Digital Geologic Map of Utah

    USGS Publications Warehouse

    Nutt, C.J.; Ludington, Steve

    2003-01-01

Digital geologic maps offer the promise of rapid and powerful answers to geologic questions using geographic information system (GIS) software. Using modern GIS and database methods, a specialized derivative map can be easily prepared. An important limitation can be shortcomings in the information provided in the database associated with the digital map, a database that is often based on the legend of the original map. The purpose of this report is to show how the compilation of additional information, when prepared as a database that can be used with the digital map, can be used to create some types of derivative maps that are not possible with the original digital map and database. This Open-File Report consists of computer files with information about intrusive rocks in Utah that can be linked to the Digital Geologic Map of Utah (Hintze and others, 2000), an explanation of how to link the databases and map, and a list of references for the databases. The digital map, which represents the 1:500,000-scale Geologic Map of Utah (Hintze, 1980), can be obtained from the Utah Geological Survey (Map 179DM). Each polygon in the map has a unique identification number. We selected the polygons identified on the geologic map as intrusive rock and constructed a database (UT_PLUT.xls) that classifies the polygons into plutonic map units (see tables). These plutonic map units are the key information used to relate the compiled information to the polygons on the map. The map includes a few polygons that were coded as intrusive on the state map but are largely volcanic rock; in these cases we note the volcanic rock names (rhyolite and latite) as used in the original sources. Some polygons identified on the digital state map as intrusive rock were misidentified; these polygons are noted in a separate table of the database, along with some information about their true character.
Fields may be empty because of a lack of information in the references used or difficulty in finding information. The information in the database is from a variety of sources, including geologic maps at scales ranging from 1:500,000 to 1:24,000, and theses. The references are listed twice: alphabetically and by region. The digital geologic map of Utah (Hintze and others, 2000) classifies intrusive rocks into only three categories, distinguished by age: Ti, Tertiary intrusive rock; Ji, Upper to Middle Jurassic granite to quartz monzonite; and pCi, Early Proterozoic to Late Archean intrusive rock. Use of the tables provided in this report will permit selection and classification of those rocks by lithology and age. This database is a pilot study by the Survey and Analysis Project of the U.S. Geological Survey to characterize igneous rocks and link them to a digital map. The database, and others like it, will evolve as the project continues and other states are completed. We release this version now as an example, as a reference, and for those interested in Utah plutonic rocks.
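The linkage described above, relating map polygons to a compiled attribute table through each polygon's unique identification number, is an attribute join. The sketch below is my own minimal illustration; the field names and attribute values are hypothetical stand-ins for the UT_PLUT.xls contents, not its actual records.

```python
# Each map polygon carries a unique ID and a coarse map-unit code (Ti, Ji,
# pCi); a separate compiled table assigns polygons richer attributes.
# Joining on the ID lets you build derivative selections the original
# three-category legend cannot support.

polygons = [
    {"poly_id": 101, "map_unit": "Ti"},
    {"poly_id": 102, "map_unit": "Ti"},
    {"poly_id": 103, "map_unit": "Ji"},
]
plutons = {  # poly_id -> compiled attributes (hypothetical values)
    101: {"pluton": "Pluton A", "lithology": "granodiorite",     "age_ma": 35},
    102: {"pluton": "Pluton B", "lithology": "quartz monzonite", "age_ma": 36},
    103: {"pluton": "Pluton C", "lithology": "granite",          "age_ma": 160},
}

# Derivative selection: granodiorite polygons only, regardless of map unit.
selected = [p["poly_id"] for p in polygons
            if plutons[p["poly_id"]]["lithology"] == "granodiorite"]
print(selected)  # -> [101]
```

In a GIS the same join is done between the polygon layer's attribute table and the spreadsheet, after which the selected polygons can be symbolized as a new derivative map.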

  1. Systems and methods for automatically identifying and linking names in digital resources

    DOEpatents

    Parker, Charles T.; Lyons, Catherine M.; Roston, Gerald P.; Garrity, George M.

    2017-06-06

The present invention provides systems and methods for automatically identifying name-like strings in digital resources, matching these name-like strings against a set of names held in an expertly curated database, and, for those name-like strings found in said database, enhancing the content by associating additional matter with the name, wherein said matter includes information about the names held within said database and pointers to other digital resources that include the same name and its synonyms.
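The pipeline the abstract describes, find name-like strings, look each one up in a curated database, and attach links only to matches, can be sketched as below. The regex, the tiny database, and the example URLs are all illustrative assumptions, not the patented implementation.

```python
import re

# Curated name database: name -> pointer to related resources (hypothetical).
NAME_DB = {
    "Escherichia coli": "https://example.org/names/e-coli",
    "Bacillus subtilis": "https://example.org/names/b-subtilis",
}

# Crude pattern for Latin-binomial-shaped strings: "Capitalized lowercase".
BINOMIAL = re.compile(r"\b[A-Z][a-z]+ [a-z]+\b")

def link_names(text):
    """Return (name, link) pairs for name-like strings found in NAME_DB."""
    return [(m, NAME_DB[m]) for m in BINOMIAL.findall(text) if m in NAME_DB]

doc = "Cultures of Escherichia coli and Salmonella enterica were compared."
print(link_names(doc))  # only the name present in the curated database links
```

Note that the database lookup is what filters out false positives from the loose regex ("Cultures of" matches the pattern but is not a curated name), which mirrors the abstract's split between detecting name-like strings and verifying them against the curated set.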

  2. Automated Bulk Uploading of Images and Metadata to Flickr

    ERIC Educational Resources Information Center

    Michel, Jason Paul; Tzoc, Elias

    2010-01-01

    The Digital Initiatives department at Miami University, like most digital initiatives and special collections departments, has a large number of rich digital image collections, stored primarily in a third-party database. Typically, these databases are not findable to the average Web user. From a desire to expose these collections to the wider Web…

  3. Digital database of the geologic map of the island of Hawai'i [Hawaii

    USGS Publications Warehouse

    Trusdell, Frank A.; Wolfe, Edward W.; Morris, Jean

    2006-01-01

This online publication (DS 144) provides the digital database for the printed map by Edward W. Wolfe and Jean Morris (I-2524-A; 1996). This digital database contains all the information used to publish U.S. Geological Survey Geologic Investigations Series I-2524-A (available only in paper form; see http://pubs.er.usgs.gov/pubs/i/i2524A). The database contains the distribution and relationships of volcanic and surficial-sedimentary deposits on the Island of Hawai‘i. This dataset represents the geologic history of the five volcanoes that make up the Island of Hawai‘i: Kohala, Mauna Kea, Hualalai, Mauna Loa, and Kīlauea. This database of the geologic map contributes to understanding the geologic history of the Island of Hawai‘i and provides the basis for understanding long-term volcanic processes in an intra-plate ocean island volcanic system. In addition, the database serves as a basis for producing volcanic hazards assessments for the Island of Hawai‘i, and as a base layer for interdisciplinary research. This online publication consists of a digital database of the geologic map, an explanatory pamphlet, a description of map units, a correlation of map units diagram, and images for plotting. Geologic mapping was compiled at a scale of 1:100,000 for the entire mapping area and was compiled as a digital geologic database in ArcInfo GIS format.

  4. The Global Genome Biodiversity Network (GGBN) Data Standard specification.

    PubMed

    Droege, G; Barker, K; Seberg, O; Coddington, J; Benson, E; Berendsohn, W G; Bunk, B; Butler, C; Cawsey, E M; Deck, J; Döring, M; Flemons, P; Gemeinholzer, B; Güntsch, A; Hollowell, T; Kelbert, P; Kostadinov, I; Kottmann, R; Lawlor, R T; Lyal, C; Mackenzie-Dodds, J; Meyer, C; Mulcahy, D; Nussbeck, S Y; O'Tuama, É; Orrell, T; Petersen, G; Robertson, T; Söhngen, C; Whitacre, J; Wieczorek, J; Yilmaz, P; Zetzsche, H; Zhang, Y; Zhou, X

    2016-01-01

Genomic samples of non-model organisms are becoming increasingly important in a broad range of studies, from developmental biology and biodiversity analyses to conservation. Genomic sample definition, description, quality, voucher information, and metadata all need to be digitized and disseminated across scientific communities. This information needs to be concise and consistent in today's ever-increasing bioinformatic era, so that complementary data aggregators can easily map databases to one another. In order to facilitate exchange of information on genomic samples and their derived data, the Global Genome Biodiversity Network (GGBN) Data Standard is intended to provide a platform, based on a documented agreement, to promote the efficient sharing and usage of genomic sample material and associated specimen information in a consistent way. The new data standard presented here builds upon existing standards commonly used within the community, extending them with the capability to exchange data on tissue, environmental, and DNA samples as well as sequences. The GGBN Data Standard will reveal and democratize the hidden contents of biodiversity biobanks, for the convenience of everyone in the wider biobanking community. Technical tools exist for data providers to easily map their databases to the standard. Database URL: http://terms.tdwg.org/wiki/GGBN_Data_Standard. © The Author(s) 2016. Published by Oxford University Press.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bocharova, N.Yu.; Scotese, C.R.; Pristavakina, E.I.

    A digital geographic database for the former USSR was compiled using published geologic and geodynamic maps and the unpublished suture map of Lev Zonenshain (1991). The database includes more than 900 tectonic features: strike-slip faults, sutures, thrusts, fossil and active rifts, fossil and active subduction zones, boundaries of the major and minor Precambrian blocks, ophiolites, and various volcanic complexes. The attributes of each structural unit include type of structure, name, age, tectonic setting, and geographic coordinates. Paleozoic and Early Mesozoic reconstructions of the former USSR and adjacent regions were constructed using this tectonic database together with paleomagnetic data and the motions of continents over fixed hot spots. Global apparent polar wander paths in European and Siberian coordinates were calculated back to Cambrian time, using the paleomagnetic pole summaries of Van der Voo (1992) and Khramov (1992) and the global plate tectonic model of the Paleomap Project (Scotese and Becker, 1992). Trajectories of intraplate volcanics in South Siberia, Mongolia, and Scandinavia, and data on the White Mountain plutons and Karoo flood basalts, were also taken into account. Using new data, the authors recalculated the stage and finite poles for the rotation of Siberia and Europe with respect to the hot spot reference frame for the time interval 160 to 450 Ma.

  6. Distribution of late Pleistocene ice-rich syngenetic permafrost of the Yedoma Suite in east and central Siberia, Russia

    USGS Publications Warehouse

    Grosse, Guido; Robinson, Joel E.; Bryant, Robin; Taylor, Maxwell D.; Harper, William; DeMasi, Amy; Kyker-Snowman, Emily; Veremeeva, Alexandra; Schirrmeister, Lutz; Harden, Jennifer

    2013-01-01

    This digital database is the product of collaboration between the U.S. Geological Survey; the Geophysical Institute at the University of Alaska, Fairbanks; the Los Altos Hills Foothill College GeoSpatial Technology Certificate Program; the Alfred Wegener Institute for Polar and Marine Research, Potsdam, Germany; and the Institute of Physicochemical and Biological Problems in Soil Science of the Russian Academy of Sciences. The primary goal for creating this digital database is to enhance current estimates of soil organic carbon stored in deep permafrost, in particular the late Pleistocene syngenetic ice-rich permafrost deposits of the Yedoma Suite. Previous studies estimated that Yedoma deposits cover about 1 million square kilometers of a large region in central and eastern Siberia, but these estimates generally are based on maps with scales smaller than 1:10,000,000. Taking into account this large area, it was estimated that Yedoma may store as much as 500 petagrams of soil organic carbon, a large part of which is vulnerable to thaw and mobilization from thermokarst and erosion. To refine assessments of the spatial distribution of Yedoma deposits, we digitized 11 Russian Quaternary geologic maps. Our study focused on extracting geologic units interpreted by us as late Pleistocene ice-rich syngenetic Yedoma deposits based on lithology, ground ice conditions, stratigraphy, and geomorphological and spatial association. These Yedoma units then were merged into a single data layer across map tiles.
The spatial database provides a useful update of the spatial distribution of this deposit for an approximately 2.32 million square kilometers land area in Siberia that will (1) serve as a core database for future refinements of Yedoma distribution in additional regions, and (2) provide a starting point to revise the size of deep but thaw-vulnerable permafrost carbon pools in the Arctic based on surface geology and the distribution of cryolithofacies types at high spatial resolution. However, we recognize that the extent of Yedoma deposits presented in this database is not complete for a global assessment, because Yedoma deposits also occur in the Taymyr lowlands and Chukotka, and in parts of Alaska and northwestern Canada.

  7. Some thoughts on cartographic and geographic information systems for the 1980's

    USGS Publications Warehouse

    Starr, L.E.; Anderson, Kirk E.

    1981-01-01

    The U.S. Geological Survey is adopting computer techniques to meet the expanding need for cartographic base category data. Digital methods are becoming increasingly important in the mapmaking process, and the demand is growing for physical, social, and economic data. Recognizing these emerging needs, the National Mapping Division began, several years ago, an active program to develop advanced digital methods to support cartographic and geographic data processing. An integrated digital cartographic database would meet the anticipated needs. Such a database would contain data from various sources, and could provide a variety of standard and customized map and digital data file products. This cartographic database soon will be technologically feasible. The present trends in the economics of cartographic and geographic data handling and the growing needs for integrated physical, social, and economic data make such a database virtually mandatory.

  8. [Review of digital ground object spectral library].

    PubMed

    Zhou, Xiao-Hu; Zhou, Ding-Wu

    2009-06-01

    Higher spectral resolution is a main direction in the development of remote sensing technology, and establishing digital ground object reflectance spectral libraries is one of the fundamental research fields in remote sensing application. Remote sensing applications have come to rely increasingly on ground object spectral characteristics, and quantitative analysis has developed to a new stage. The present article summarizes and systematically introduces the research status and development trends of digital ground object reflectance spectral libraries in China and abroad in recent years. The spectral libraries that have been established are introduced, including libraries for desertification, plants, geology, soils, minerals, clouds, snow, the atmosphere, rocks, water, meteorites, moon rocks, man-made materials, mixtures, volatile compounds, and liquids. In the process of establishing these spectral libraries, several problems have arisen, such as the lack of a uniform national spectral database standard, the lack of uniform standards for ground object features, and limited comparability between different databases; in addition, data-sharing mechanisms have not yet been implemented. This article also puts forward some suggestions on these problems.

  9. Patch-based automatic retinal vessel segmentation in global and local structural context.

    PubMed

    Cao, Shuoying; Bharath, Anil A; Parker, Kim H; Ng, Jeffrey

    2012-01-01

    In this paper, we extend our published work [1] and propose an automated system to segment the retinal vessel bed in digital fundus images, with enough adaptability to analyze images from fluorescein angiography. This approach takes into account both the global and local context and enables both vessel segmentation and microvascular centreline extraction. These tools should allow researchers and clinicians to estimate and assess vessel diameter, capillary blood volume, and microvascular topology for early-stage disease detection, monitoring, and treatment. Global vessel bed segmentation is achieved by combining phase-invariant orientation fields with neighbourhood pixel intensities in a patch-based feature vector for supervised learning. This approach is evaluated against benchmarks on the DRIVE database [2]. Local microvascular centrelines within Regions of Interest (ROIs) are segmented by linking the phase-invariant orientation measures with phase-selective local structure features. Our global and local structural segmentation can be used to assess both pathological structural alterations and microemboli occurrence in non-invasive clinical settings in a longitudinal study.

  10. DIGITAL CARTOGRAPHY OF THE PLANETS: NEW METHODS, ITS STATUS, AND ITS FUTURE.

    USGS Publications Warehouse

    Batson, R.M.

    1987-01-01

    A system has been developed that establishes a standardized cartographic database for each of the 19 planets and major satellites that have been explored to date. Compilation of the databases involves both traditional and newly developed digital image processing and mosaicking techniques, including radiometric and geometric corrections of the images. Each database, or digital image model (DIM), is a digital mosaic of spacecraft images that have been radiometrically and geometrically corrected and photometrically modeled. During compilation, ancillary data files such as radiometric calibrations and refined photometric values for all camera lens and filter combinations and refined camera-orientation matrices for all images used in the mapping are produced.
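    The "radiometrically corrected" step that the abstract describes is, at its simplest, a linear calibration applied per pixel. The sketch below illustrates that idea only; the gain and offset values are invented, and real calibrations (per camera, lens, and filter combination, as the abstract notes) are considerably more involved.

    ```python
    # Illustrative sketch of a linear per-pixel radiometric correction:
    # converting raw detector counts (DN) to radiance-like values with
    # radiance = gain * DN + offset. Gain/offset are made-up numbers.
    def radiometric_correct(dn_rows, gain, offset):
        """Apply the linear calibration to every pixel of a 2-D image
        represented as a list of rows."""
        return [[gain * dn + offset for dn in row] for row in dn_rows]

    raw = [[10, 20], [30, 40]]
    print(radiometric_correct(raw, gain=0.5, offset=1.0))
    # [[6.0, 11.0], [16.0, 21.0]]
    ```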

  11. High Tech High School Interns Develop a Mid-Ocean Ridge Database for Research and Education

    NASA Astrophysics Data System (ADS)

    Staudigel, D.; Delaney, R.; Staudigel, H.; Koppers, A. A.; Miller, S. P.

    2004-12-01

    Mid-ocean ridges (MORs) represent some of the most important geographic and geological features on planet Earth. MORs are the locations where plates spread apart and are the sites of the majority of Earth's volcanoes, which harbor some of the most extreme life forms. These concepts attract much research, but mid-ocean ridges are still effectively underrepresented in Earth science classrooms. As two High Tech High School students, we began an internship at Scripps to develop a database for mid-ocean ridges as a resource for science and education. This Ridge Catalog will be accessible via http://earthref.org/databases/RC/ and applies a similar structure, design, and data archival principle as the Seamount Catalog under EarthRef.org. Major research goals of this project include (1) the development of an archival structure for multibeam and sidescan data, standard bathymetric maps (including ODP-DSDP drill site and dredge locations), or any other arbitrary digital objects relating to MORs, and (2) the compilation of a global data set for some of the most defining characteristics of every ridge segment, including ridge segment length, depth, azimuth, and half spreading rates. One of the challenges was the need to make MOR data useful to scientists as well as teachers in the classroom. Since the basic structure follows the design of the Seamount Catalog closely, we could turn our attention to the basic data population of the database. We have pulled together multibeam data for the MOR segments from various public archives (SIOExplorer, SIO-GDC, NGDC, Lamont) and pre-processed it for public use. In particular, we have created individual bathymetric maps for each ridge segment, merging the multibeam data with global satellite bathymetry data from Smith & Sandwell (1997). The global scale of this database will give it the ability to be used for any number of applications, from cruise planning to data

  12. Algorithms and methodology used in constructing high-resolution terrain databases

    NASA Astrophysics Data System (ADS)

    Williams, Bryan L.; Wilkosz, Aaron

    1998-07-01

    This paper presents a top-level description of methods used to generate high-resolution 3D IR digital terrain databases using soft photogrammetry. The 3D IR database is derived from aerial photography and is made up of digital ground plane elevation map, vegetation height elevation map, material classification map, object data (tanks, buildings, etc.), and temperature radiance map. Steps required to generate some of these elements are outlined. The use of metric photogrammetry is discussed in the context of elevation map development; and methods employed to generate the material classification maps are given. The developed databases are used by the US Army Aviation and Missile Command to evaluate the performance of various missile systems. A discussion is also presented on database certification which consists of validation, verification, and accreditation procedures followed to certify that the developed databases give a true representation of the area of interest, and are fully compatible with the targeted digital simulators.

  13. Mapping the Rainforest of the Sea: Global Coral Reef Maps for Global Conservation

    NASA Technical Reports Server (NTRS)

    Robinson, Julie A.

    2006-01-01

    Coral reefs are the center of marine biodiversity, yet they are under threat, with an estimated 60% of coral reef habitats considered at risk by the World Resources Institute. The location and extent of coral reefs in the world are the basic information required for resource management and as a baseline for monitoring change. A NASA-sponsored partnership between remote sensing scientists, international agencies, and NGOs has developed a new generation of global reef maps based on data collected by satellites. The effort, dubbed the Millennium Coral Reef Map, aims to develop new methods for wide distribution of voluminous satellite data of use to the conservation and management communities. We discuss the tradeoffs between remote sensing data sources, mapping objectives, and the needs of conservation and resource management. SeaWiFS data were used to produce a composite global shallow bathymetry map at 1 km resolution. Landsat 7/ETM+ data acquisition plans were modified to collect global reefs, and new operational methods were designed to generate the first-ever global coral reef geomorphology map. We discuss the challenges encountered in building these databases and in implementing the geospatial data distribution strategies. Conservation applications include a new assessment of the distribution of the world's marine protected areas (UNEP-WCMC), improved spatial resolution in the Reefs at Risk analysis for the Caribbean (WRI), and a global basemap for the Census of Marine Life's OBIS database. The Millennium Coral Reef map and digital image archive will pay significant dividends for local and regional conservation projects around the globe. Complete details of the project are available at http://eol.jsc.nasa.gov/reefs.

  14. The Design and Product of National 1:1000000 Cartographic Data of Topographic Map

    NASA Astrophysics Data System (ADS)

    Wang, Guizhi

    2016-06-01

    The National Administration of Surveying, Mapping and Geoinformation launched the project of national fundamental geographic information database dynamic updating in 2012. Under this project, the 1:50000 database is updated once a year, and the 1:250000 database is downsized and linkage-updated on that basis. In 2014, using the latest achievements of the 1:250000 database, the 1:1000000 digital line graph database was comprehensively updated, and cartographic data of the topographic map and digital elevation model data were generated at the same time. This article mainly introduces the national 1:1000000 cartographic data of the topographic map, including feature content, database structure, database-driven mapping technology, workflow, and so on.

  15. Seabird databases and the new paradigm for scientific publication and attribution

    USGS Publications Warehouse

    Hatch, Scott A.

    2010-01-01

    For more than 300 years, the peer-reviewed journal article has been the principal medium for packaging and delivering scientific data. With new tools for managing digital data, a new paradigm is emerging—one that demands open and direct access to data and that enables and rewards a broad-based approach to scientific questions. Ground-breaking papers in the future will increasingly be those that creatively mine and synthesize vast stores of data available on the Internet. This is especially true for conservation science, in which essential data can be readily captured in standard record formats. For seabird professionals, a number of globally shared databases are in the offing, or should be. These databases will capture the salient results of inventories and monitoring, pelagic surveys, diet studies, and telemetry. A number of real or perceived barriers to data sharing exist, but none is insurmountable. Our discipline should take an important stride now by adopting a specially designed markup language for annotating and sharing seabird data.

  16. Preliminary geologic map of the Oat Mountain 7.5' quadrangle, Southern California: a digital database

    USGS Publications Warehouse

    Yerkes, R.F.; Campbell, Russell H.

    1995-01-01

    This database, identified as "Preliminary Geologic Map of the Oat Mountain 7.5' Quadrangle, southern California: A Digital Database," has been approved for release and publication by the Director of the USGS. Although this database has been reviewed and is substantially complete, the USGS reserves the right to revise the data pursuant to further analysis and review. This database is released on condition that neither the USGS nor the U. S. Government may be held liable for any damages resulting from its use. This digital map database is compiled from previously published sources combined with some new mapping and modifications in nomenclature. The geologic map database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U. S. Geological Survey. For detailed descriptions of the units, their stratigraphic relations and sources of geologic mapping consult Yerkes and Campbell (1993). More specific information about the units may be available in the original sources.

  17. A Global Registry for Scientific Collections: Striking a Balance Between Disciplinary Detail and Interdisciplinary Discoverability

    NASA Astrophysics Data System (ADS)

    Graham, E.; Schindel, D. E.

    2014-12-01

    The Global Registry of Scientific Collections (GRSciColl) is an online information resource developed to gather and disseminate basic information on scientific collections. Building on initiatives started for biological collections, GRSciColl expands this framework to encompass all scientific disciplines including earth and space sciences, anthropology, archaeology, biomedicine, and applied fields such as agriculture and technology. The goals of this registry are to (1) provide a single source of synoptic information about the repositories, their component collections, access and use policies, and staff contact information; and (2) facilitate the assignment of identifiers for repositories and their collections that are globally unique across all disciplines. As digitization efforts continue, the importance of globally unique identifiers is paramount to ensuring interoperability across datasets. Search capabilities and web services will significantly increase the web visibility and accessibility of these collections. Institutional records include categorization by governance (e.g., national, state or local governmental, private non-profit) and by scientific discipline (e.g., earth science, biomedical, agricultural). Collection-level metadata categorize the types of contained specimens/samples and modes of preservation. In selecting the level of granularity for these categories, designers sought a compromise that would capture enough information to be useful in searches and inquiries and would complement the detailed archives in specimen-level databases (which are increasingly digital) hosted by discipline-specific groups (e.g., SESAR) or the repositories themselves (e.g., KE EMu).
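    The requirement for identifiers that are "globally unique across all disciplines" can be met by qualifying local codes with an institutional namespace. The sketch below shows one common way to do this with deterministic name-based UUIDs; the namespace URL and collection codes are invented for illustration and are not GRSciColl's actual scheme.

    ```python
    # Sketch: minting globally unique, deterministic identifiers for
    # collections by combining an institution code and a collection code
    # under a registry namespace. The namespace URL is a made-up example.
    import uuid

    REGISTRY_NS = uuid.uuid5(uuid.NAMESPACE_URL, "http://registry.example/ns")

    def collection_id(institution_code, collection_code):
        """Name-based UUID: the same (institution, collection) pair always
        yields the same identifier, and different pairs collide with
        negligible probability."""
        return uuid.uuid5(REGISTRY_NS, f"{institution_code}:{collection_code}")

    a = collection_id("NMNH", "Birds")
    b = collection_id("NMNH", "Birds")
    c = collection_id("NMNH", "Fishes")
    print(a, a == b, a != c)
    ```

    Determinism matters here: two datasets that independently mint the identifier for the same collection end up with the same UUID, which is exactly the interoperability property the registry is after.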

  18. Improving global paleogeographic reconstructions since the Devonian using paleobiology

    NASA Astrophysics Data System (ADS)

    Cao, Wenchao; Zahirovic, Sabin; Williams, Simon; Flament, Nicolas; Müller, Dietmar

    2017-04-01

    Paleogeographic reconstructions are important to understand past eustatic and regional sea level change, the tectonic evolution of the planet, hydrocarbon genesis, and to constrain and interpret the dynamic topography predicted by time-dependent global mantle convection models. Several global paleogeographic compilations have been published, generally presented as static snapshots with varying temporal resolution and fixed spatial resolution. Published paleogeographic compilations are tied to a particular plate motion model, making it difficult to link them to alternative digital plate tectonic reconstructions. In order to address this issue, we developed a workflow to reverse-engineer reconstructed paleogeographies to their present-day coordinates and link them to any reconstruction model. Published paleogeographic compilations are also tied to a given dataset. We used fossil data from the Paleobiology Database to identify inconsistencies between fossil paleoenvironments and paleogeographic reconstructions, and to improve reconstructed terrestrial-marine boundaries by resolving these inconsistencies. We used the improved reconstructed paleogeographies to estimate the surface areas of global paleogeographic features (shallow marine environments, landmasses, mountains and ice sheets), to investigate the global continental flooding history since the late Paleozoic, which has inherent links to global eustasy as well as dynamic topography. Finally, we discuss the relationships between our modeled emerged land area and total continental area through time, continental growth models, and strontium isotope (87Sr/86Sr) signatures in ocean water.
Our study highlights the flexibility of digital paleogeographic models linked to state-of-the-art plate tectonic reconstructions in order to better understand the interplay of continental growth and eustasy, with wider implications for understanding Earth's paleotopography, ocean circulation, and the role of mantle convection in shaping long-wavelength topography.
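    The "reverse-engineering" step above relies on the fact that a finite plate rotation is a rotation about an Euler pole, so applying the inverse (negative-angle) rotation moves reconstructed features back to present-day coordinates. The sketch below demonstrates that invertibility with Rodrigues' rotation formula; the pole location and angle are made-up values for illustration, not from any published rotation model.

    ```python
    # Sketch: rotating a (lat, lon) point about an Euler pole on the unit
    # sphere (Rodrigues' formula), the basic operation behind moving
    # paleogeographic features between reconstructed and present-day
    # coordinates. Pole and angle below are invented for illustration.
    import math

    def to_xyz(lat, lon):
        la, lo = math.radians(lat), math.radians(lon)
        return (math.cos(la) * math.cos(lo),
                math.cos(la) * math.sin(lo),
                math.sin(la))

    def to_latlon(v):
        x, y, z = v
        return (math.degrees(math.asin(z)), math.degrees(math.atan2(y, x)))

    def rotate(latlon, pole, angle_deg):
        """Rotate a point about an Euler pole by angle_deg (Rodrigues)."""
        p, k = to_xyz(*latlon), to_xyz(*pole)
        t = math.radians(angle_deg)
        dot = sum(a * b for a, b in zip(k, p))
        cross = (k[1] * p[2] - k[2] * p[1],
                 k[2] * p[0] - k[0] * p[2],
                 k[0] * p[1] - k[1] * p[0])
        v = tuple(p[i] * math.cos(t) + cross[i] * math.sin(t)
                  + k[i] * dot * (1 - math.cos(t)) for i in range(3))
        return to_latlon(v)

    # A rotation followed by its inverse recovers the original point,
    # which is the property the reverse-engineering workflow depends on.
    recon = rotate((10.0, 20.0), pole=(45.0, -30.0), angle_deg=25.0)
    back = rotate(recon, pole=(45.0, -30.0), angle_deg=-25.0)
    print(back)  # approximately (10.0, 20.0)
    ```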

  19. Access to digital library databases in higher education: design problems and infrastructural gaps.

    PubMed

    Oswal, Sushil K

    2014-01-01

    After defining accessibility and usability, the author offers a broad survey of the research studies on digital content databases, which have thus far primarily depended on data drawn from studies conducted by sighted researchers with non-disabled users employing screen readers and low-vision devices. This article aims at producing a detailed description of the difficulties confronted by blind screen reader users with online library databases, which now hold most of the academic, peer-reviewed journal and periodical content essential for research and teaching in higher education. The approach taken here is borrowed from descriptive ethnography, which allows the author to create a complete picture of the accessibility and usability problems faced by an experienced academic user of digital library databases and screen readers. The author provides a detailed analysis of the different aspects of accessibility issues in digital databases under several headers, with a special focus on full-text PDF files. The author emphasizes that long-term studies with actual blind screen reader users, employing both qualitative and computerized research tools, can yield meaningful data for designers and developers to improve these databases to a level where they begin to provide equal access to blind users.

  20. Concierge: Personal Database Software for Managing Digital Research Resources

    PubMed Central

    Sakai, Hiroyuki; Aoyama, Toshihiro; Yamaji, Kazutsuna; Usui, Shiro

    2007-01-01

    This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp). PMID:18974800

  1. The comparative effectiveness of conventional and digital image libraries.

    PubMed

    McColl, R I; Johnson, A

    2001-03-01

    Before introducing a hospital-wide image database to improve access, navigation and retrieval speed, a comparative study between a conventional slide library and a matching image database was undertaken to assess its relative benefits. Paired time trials and personal questionnaires revealed faster retrieval rates, higher image quality, and easier viewing for the pilot digital image database. Analysis of confidentiality, copyright and data protection exposed similar issues for both systems, thus concluding that the digital image database is a more effective library system. The authors suggest that in the future, medical images will be stored on large, professionally administered, centrally located file servers, allowing specialist image libraries to be tailored locally for individual users. The further integration of the database with web technology will enable cheap and efficient remote access for a wide range of users.

  2. Mars Global Digital Dune Database: MC2-MC29

    USGS Publications Warehouse

    Hayward, Rosalyn K.; Mullins, Kevin F.; Fenton, L.K.; Hare, T.M.; Titus, T.N.; Bourke, M.C.; Colaprete, Anthony; Christensen, P.R.

    2007-01-01

    Introduction: The Mars Global Digital Dune Database presents data and describes the methodology used in creating the database. The database provides a comprehensive and quantitative view of the geographic distribution of moderate- to large-size dune fields from 65° N to 65° S latitude and encompasses ~550 dune fields. The database will be expanded to cover the entire planet in later versions. Although we have attempted to include all dune fields between 65° N and 65° S, some have likely been excluded for two reasons: (1) incomplete THEMIS IR (daytime) coverage may have caused us to exclude some moderate- to large-size dune fields, or (2) the resolution of THEMIS IR coverage (100 m/pixel) certainly caused us to exclude smaller dune fields. The smallest dune fields in the database are ~1 km² in area. While the moderate to large dune fields are likely to constitute the largest compilation of sediment on the planet, smaller stores of dune sediment are likely to be found elsewhere via higher resolution data. Thus, it should be noted that our database excludes all small dune fields and some moderate to large dune fields as well. Therefore the absence of mapped dune fields does not mean that such dune fields do not exist and is not intended to imply a lack of saltating sand in other areas. Where the availability and quality of THEMIS visible (VIS) or Mars Orbiter Camera narrow angle (MOC NA) images allowed, we classified dunes and included dune slipface measurements, which were derived from gross dune morphology and represent the prevailing wind direction at the last time of significant dune modification. For dunes located within craters, the azimuth from crater centroid to dune field centroid was calculated. Output from a general circulation model (GCM) is also included.
In addition to polygons locating dune fields, the database includes over 1800 selected Thermal Emission Imaging System (THEMIS) infrared (IR), THEMIS visible (VIS), and Mars Orbiter Camera Narrow Angle (MOC NA) images that were used to build the database. The database is presented in a variety of formats. It is presented as a series of ArcReader projects which can be opened using the free ArcReader software. The latest version of ArcReader can be downloaded at http://www.esri.com/software/arcgis/arcreader/download.html. The database is also presented in ArcMap projects. The ArcMap projects allow fuller use of the data, but require ESRI ArcMap software. Multiple projects were required to accommodate the large number of images needed. A fuller description of the projects can be found in the Dunes_ReadMe file and the ReadMe_GIS file in the Documentation folder. For users who prefer to create their own projects, the data are available in ESRI shapefile and geodatabase formats, as well as the open Geographic Markup Language (GML) format. A printable map of the dunes and craters in the database is available as a Portable Document Format (PDF) document. The map is also included as a JPEG file. ReadMe files are available in PDF and ASCII (.txt) formats. Tables are available in both Excel (.xls) and ASCII formats.
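    The crater-centroid-to-dune-centroid azimuth mentioned in this record is an initial great-circle bearing between two (lat, lon) points, measured clockwise from north. A minimal sketch of that computation, with illustrative coordinates:

    ```python
    # Sketch: initial great-circle bearing (azimuth) from point 1 to
    # point 2 on a sphere, measured clockwise from north in degrees.
    # Coordinates in the example are invented for illustration.
    import math

    def azimuth(lat1, lon1, lat2, lon2):
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dl = math.radians(lon2 - lon1)
        x = math.sin(dl) * math.cos(p2)
        y = (math.cos(p1) * math.sin(p2)
             - math.sin(p1) * math.cos(p2) * math.cos(dl))
        return math.degrees(math.atan2(x, y)) % 360.0

    # From the equator, a point at the same latitude but greater
    # longitude lies due east (azimuth 90).
    print(azimuth(0.0, 10.0, 0.0, 20.0))  # 90.0
    ```

    Applied per record, this gives the crater-relative direction of each dune field, which the database pairs with slipface-derived wind directions.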

  3. Digital Storytelling for Transformative Global Citizenship Education

    ERIC Educational Resources Information Center

    Truong-White, Hoa; McLean, Lorna

    2015-01-01

    This article explores how digital storytelling offers the potential to support transformative global citizenship education (TGCE) through a case study of the Bridges to Understanding program that connected middle and high school students globally using digital storytelling. Drawing on a TGCE framework, this research project probed the curriculum…

  4. Preliminary Integrated Geologic Map Databases for the United States: Connecticut, Maine, Massachusetts, New Hampshire, New Jersey, Rhode Island and Vermont

    USGS Publications Warehouse

    Nicholson, Suzanne W.; Dicken, Connie L.; Horton, John D.; Foose, Michael P.; Mueller, Julia A.L.; Hon, Rudi

    2006-01-01

    The rapid growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional and national scale digital geologic maps that have standardized information about geologic age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. Although two digital geologic maps (Schruben and others, 1994; Reed and Bush, 2004) of the United States currently exist, their scales (1:2,500,000 and 1:5,000,000) are too general for many regional applications. Most states have digital geologic maps at scales of about 1:500,000, but the databases are not comparably structured and, thus, it is difficult to use the digital database for more than one state at a time. This report describes the results, for a seven-state region, of an effort by the U.S. Geological Survey to produce a series of integrated and standardized state geologic map databases that cover the entire United States. In 1997, the United States Geological Survey's Mineral Resources Program initiated the National Surveys and Analysis (NSA) Project to develop national digital databases. One primary activity of this project was to compile a national digital geologic map database, utilizing state geologic maps, to support studies in the range of 1:250,000- to 1:1,000,000-scale. To accomplish this, state databases were prepared using a common standard for the database structure, fields, attribution, and data dictionaries. For Alaska and Hawaii new state maps are being prepared, and the preliminary work for Alaska is being released as a series of 1:250,000-scale quadrangle reports. This document provides background information and documentation for the integrated geologic map databases of this report. This report is one of a series of such reports releasing preliminary standardized geologic map databases for the United States.
The data products of the project consist of two main parts, the spatial databases and a set of supplemental tables relating to geologic map units. The datasets serve as a data resource to generate a variety of stratigraphic, age, and lithologic maps. This documentation is divided into four main sections: (1) description of the set of data files provided in this report, (2) specifications of the spatial databases, (3) specifications of the supplemental tables, and (4) an appendix containing the data dictionaries used to populate some fields of the spatial database and supplemental tables.

  5. New digital magnetic anomaly database for North America

    USGS Publications Warehouse

    Finn, C.A.; Pilkington, M.; Cuevas, A.; Hernandez, I.; Urrutia, J.

    2001-01-01

    The Geological Survey of Canada (GSC), U.S. Geological Survey (USGS), and Consejo de Recursos Minerales of Mexico (CRM) are compiling an upgraded digital magnetic anomaly database and map for North America. This trinational project is expected to be completed by late 2002.

  6. Digital Equipment Corporation's CDROM Software and Database Publications.

    ERIC Educational Resources Information Center

    Adams, Michael Q.

    1986-01-01

    Acquaints information professionals with Digital Equipment Corporation's compact optical disk read-only-memory (CDROM) search and retrieval software and growing library of CDROM database publications (COMPENDEX, Chemical Abstracts Service). Highlights include MicroBASIS, boolean operators, range operators, word and phrase searching, proximity…

  7. Virtual Global Magnetic Observatory - Concept and Implementation

    NASA Astrophysics Data System (ADS)

    Papitashvili, V.; Clauer, R.; Petrov, V.; Saxena, A.

    2002-12-01

    The existing World Data Centers (WDCs) continue to serve the worldwide scientific community excellently by providing free access to a large number of global geophysical databases. Various institutions at different geographic locations house these Centers, which are organized mainly by scientific discipline. However, populating the Centers requires mandatory or voluntary submission of locally collected data. Recently, many digital geomagnetic datasets have been placed on the World Wide Web, and some of these sets have never been submitted to any data center. This has created an urgent need for more sophisticated search engines capable of identifying geomagnetic data on the Web and then retrieving a certain amount of data for scientific analysis. In this study, we formulate a concept of the virtual global magnetic observatory (VGMO), which currently uses a pre-set list of Web-based geomagnetic data holders (including the WDCs) when retrieving a requested case-study interval. By saving the retrieved data locally over multiple requests, a VGMO user begins to build his or her own data sub-center, which need not search the Web if a newly requested interval falls within the span of earlier retrieved data. At the same time, this self-populated sub-center becomes available to other VGMO users further down the request chain. Some aspects of Web "crawling" that help identify newly "webbed" digital geomagnetic data are also considered.

  8. The ANISEED database: digital representation, formalization, and elucidation of a chordate developmental program.

    PubMed

    Tassy, Olivier; Dauga, Delphine; Daian, Fabrice; Sobral, Daniel; Robin, François; Khoueiry, Pierre; Salgado, David; Fox, Vanessa; Caillol, Danièle; Schiappa, Renaud; Laporte, Baptiste; Rios, Anne; Luxardi, Guillaume; Kusakabe, Takehiro; Joly, Jean-Stéphane; Darras, Sébastien; Christiaen, Lionel; Contensin, Magali; Auger, Hélène; Lamy, Clément; Hudson, Clare; Rothbächer, Ute; Gilchrist, Michael J; Makabe, Kazuhiro W; Hotta, Kohji; Fujiwara, Shigeki; Satoh, Nori; Satou, Yutaka; Lemaire, Patrick

    2010-10-01

    Developmental biology aims to understand how the dynamics of embryonic shapes and organ functions are encoded in linear DNA molecules. Thanks to recent progress in genomics and imaging technologies, systemic approaches are now used in parallel with small-scale studies to establish links between genomic information and phenotypes, often described at the subcellular level. Current model organism databases, however, do not integrate heterogeneous data sets at different scales into a global view of the developmental program. Here, we present a novel, generic digital system, NISEED, and its implementation, ANISEED, to ascidians, which are invertebrate chordates suitable for developmental systems biology approaches. ANISEED hosts an unprecedented combination of anatomical and molecular data on ascidian development. This includes the first detailed anatomical ontologies for these embryos, and quantitative geometrical descriptions of developing cells obtained from reconstructed three-dimensional (3D) embryos up to the gastrula stages. Fully annotated gene model sets are linked to 30,000 high-resolution spatial gene expression patterns in wild-type and experimentally manipulated conditions and to 528 experimentally validated cis-regulatory regions imported from specialized databases or extracted from 160 literature articles. This highly structured data set can be explored via a Developmental Browser, a Genome Browser, and a 3D Virtual Embryo module. We show how integration of heterogeneous data in ANISEED can provide a system-level understanding of the developmental program through the automatic inference of gene regulatory interactions, the identification of inducing signals, and the discovery and explanation of novel asymmetric divisions.

  9. Updating National Topographic Data Base Using Change Detection Methods

    NASA Astrophysics Data System (ADS)

    Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time, and the development of specialized procedures. In many National Mapping and Cadastre Agencies (NMCAs), the updating cycle takes a few years. Today's reality is dynamic and changes occur every day, so users expect the existing database to portray the current situation. Global mapping projects based on community volunteers, such as OSM, update their databases every day through crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results; a typical process involved comparing images from different periods. The success rates in identifying objects were low, and most detections were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technology, advances in image processing and computer vision algorithms, and the arrival of digital aerial cameras with a NIR band and Very High Resolution satellites have made a cost-effective automated process feasible. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis, and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step toward updating the NTDB at the Survey of Israel.

  10. The ANISEED database: Digital representation, formalization, and elucidation of a chordate developmental program

    PubMed Central

    Tassy, Olivier; Dauga, Delphine; Daian, Fabrice; Sobral, Daniel; Robin, François; Khoueiry, Pierre; Salgado, David; Fox, Vanessa; Caillol, Danièle; Schiappa, Renaud; Laporte, Baptiste; Rios, Anne; Luxardi, Guillaume; Kusakabe, Takehiro; Joly, Jean-Stéphane; Darras, Sébastien; Christiaen, Lionel; Contensin, Magali; Auger, Hélène; Lamy, Clément; Hudson, Clare; Rothbächer, Ute; Gilchrist, Michael J.; Makabe, Kazuhiro W.; Hotta, Kohji; Fujiwara, Shigeki; Satoh, Nori; Satou, Yutaka; Lemaire, Patrick

    2010-01-01

    Developmental biology aims to understand how the dynamics of embryonic shapes and organ functions are encoded in linear DNA molecules. Thanks to recent progress in genomics and imaging technologies, systemic approaches are now used in parallel with small-scale studies to establish links between genomic information and phenotypes, often described at the subcellular level. Current model organism databases, however, do not integrate heterogeneous data sets at different scales into a global view of the developmental program. Here, we present a novel, generic digital system, NISEED, and its implementation, ANISEED, to ascidians, which are invertebrate chordates suitable for developmental systems biology approaches. ANISEED hosts an unprecedented combination of anatomical and molecular data on ascidian development. This includes the first detailed anatomical ontologies for these embryos, and quantitative geometrical descriptions of developing cells obtained from reconstructed three-dimensional (3D) embryos up to the gastrula stages. Fully annotated gene model sets are linked to 30,000 high-resolution spatial gene expression patterns in wild-type and experimentally manipulated conditions and to 528 experimentally validated cis-regulatory regions imported from specialized databases or extracted from 160 literature articles. This highly structured data set can be explored via a Developmental Browser, a Genome Browser, and a 3D Virtual Embryo module. We show how integration of heterogeneous data in ANISEED can provide a system-level understanding of the developmental program through the automatic inference of gene regulatory interactions, the identification of inducing signals, and the discovery and explanation of novel asymmetric divisions. PMID:20647237

  11. The integration of digital orthophotographs with GISs in a microcomputer environment

    NASA Technical Reports Server (NTRS)

    Steiner, David R.

    1992-01-01

    The issues involved in the use of orthoimages as a data source for GIS databases are examined. The integration of digital photographs into a GIS is discussed. A prototype PC-based program for the production of GIS databases using orthoimages is described.

  12. Seismic databases of The Caucasus

    NASA Astrophysics Data System (ADS)

    Gunia, I.; Sokhadze, G.; Mikava, D.; Tvaradze, N.; Godoladze, T.

    2012-12-01

    The Caucasus is one of the active segments of the Alpine-Himalayan collision belt. The region needs continuous seismic monitoring for a better understanding of the tectonic processes at work there. The Seismic Monitoring Center of Georgia (Ilia State University) operates the country's digital seismic network and also collects and exchanges data with neighboring countries. The main focus of our study was to create a seismic database that is well organized, easily reachable, and convenient for scientists to use. The seismological database includes information about more than 100,000 earthquakes from the whole Caucasus, drawn from both analog and digital seismic networks. The first analog seismic station in Georgia was installed in 1899 in Tbilisi. The number of analog seismic stations grew over the following decades, and by the 1980s about 100 analog stations were operating across the region. After 1992, owing to the political and economic situation, the number of stations declined, and by 2002 only two analog instruments remained in operation. A new digital seismic network has been developed in Georgia since 2003; the number of digital stations has grown, and today more than 25 digital stations operate in the country. The database includes detailed information about all equipment installed at the seismic stations. The database is available online, which provides a convenient interface for seismic data exchange between the neighboring Caucasus countries. It also simplifies both processing seismic data and transferring them to the database, and it reduces operator mistakes during routine work. The database was built using PHP, MySQL, JavaScript, Ajax, GMT, Gmap, and Hypoinverse.
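
    The abstract does not give the database schema, so the following is only a minimal sketch of what an earthquake-catalog table and query of the kind described might look like. Table and column names are hypothetical, and SQLite stands in for the MySQL back end mentioned above.

```python
import sqlite3

# Hypothetical, simplified catalog schema; names and fields are illustrative,
# not taken from the actual Seismic Monitoring Center database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        event_id    INTEGER PRIMARY KEY,
        origin_time TEXT,   -- ISO 8601, UTC
        latitude    REAL,
        longitude   REAL,
        depth_km    REAL,
        magnitude   REAL,
        network     TEXT    -- 'analog' or 'digital' source network
    )
""")
rows = [
    (1, "1899-12-19T00:00:00Z", 41.72, 44.79, 10.0, 5.3, "analog"),
    (2, "2009-09-08T10:52:00Z", 42.60, 43.50, 15.0, 6.0, "digital"),
]
conn.executemany("INSERT INTO events VALUES (?, ?, ?, ?, ?, ?, ?)", rows)

# A typical catalog query: digital-era events at or above magnitude 5.5.
strong = conn.execute(
    "SELECT event_id, magnitude FROM events "
    "WHERE network = 'digital' AND magnitude >= 5.5"
).fetchall()
print(strong)  # [(2, 6.0)]
```

    A web front end (the PHP/Ajax layer named in the abstract) would issue queries of this shape against the online database.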

  13. Romanian Complex Data Center for Dense Seismic network

    NASA Astrophysics Data System (ADS)

    Neagoe, Cristian; Ionescu, Constantin; Marius Manea, Liviu

    2010-05-01

    Since 2002 the National Institute for Earth Physics (NIEP) has developed its own real-time digital seismic network, consisting of 96 seismic stations (35 equipped with broadband sensors and 24 with short-period sensors) and two seismic arrays, which transmit data in real time to the National Data Center (NDC) and the Eforie Nord (EFOR) Seismic Observatory. EFOR is the back-up for the NDC and also a monitoring center for Black Sea tsunamis. The seismic stations are equipped with Quanterra Q330 and K2 digitizers, broadband seismometers (STS2, CMG40T, CMG 3ESP, CMG3T), and Kinemetrics Episensor acceleration sensors (+/- 2g). SeedLink, part of SeisComP 2.5, and Antelope are the software packages used for real-time (RT) acquisition and for data exchange. Communication from the digital seismic stations to the National Data Center in Bucharest and the Eforie Nord Seismic Observatory is assured by five providers (GPRS, VPN, satellite, radio, and Internet communication). Acquisition and data processing at the two reception and processing centers rely on Antelope 4.11 running on two workstations, one for real-time and one for offline processing, as well as a SeisComP 3 server that acts as back-up for Antelope 4.11. Both the acquisition and the analysis systems produce information about local and global earthquake parameters; in addition, Antelope is used for manual processing (event association, magnitude calculation, database creation, distribution of seismic bulletins, calculation of PGA and PGV, etc.), for generating ShakeMap products, and for interacting with global data centers.
In order to make all this information easily available across the Web, and to lay the grounds for a more modular and flexible development environment, the National Data Center developed tools to centralize data from software such as Antelope, which uses a dedicated database system (Datascope, a database system based on text files), into a more general-purpose database, MySQL. MySQL acts as a hub between the different acquisition and analysis systems used in the data center while providing better connectivity at no expense in security. Mirroring certain data to MySQL also allows the National Data Center to easily share information with the public via a new web application under development, and to mix in data collected from the public (e.g., information about damage observed after an earthquake, which in turn is used to produce macroseismic intensity indices that are stored in the database and also made available via the web application). For internal use there is also a web application that, using data stored in the database, displays earthquake information such as location, magnitude, and depth in semi-real time, thus aiding the personnel on duty. Another use of the collected data is to create and maintain contact lists to which the data center sends notifications (SMS and emails) based on the parameters of an earthquake. For future development, the Data Center plans, among other things, to develop the means to crosscheck the generated data between the different acquisition and analysis systems (e.g., comparing data generated by Antelope with data generated by SeisComP).
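
    The mirroring step described above (text-file relations parsed and upserted into a general-purpose SQL hub) can be sketched as follows. The record layout and field names are invented for illustration; the real Datascope schema is richer, and SQLite stands in for MySQL.

```python
import sqlite3

# Hypothetical fixed-format text relation of the kind Datascope stores:
# lat lon depth_km origin_time magnitude
datascope_origin = """\
45.70 26.50 140.0 2010-05-01T12:00:00 4.1
45.80 26.60 110.0 2010-05-02T08:30:00 3.2
"""

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE origin (
    lat REAL, lon REAL, depth_km REAL, otime TEXT UNIQUE, mag REAL)""")

def mirror(text, conn):
    """Parse whitespace-delimited records and upsert them, keyed on origin time,
    so re-running the mirroring job never duplicates rows."""
    for line in text.strip().splitlines():
        lat, lon, depth, otime, mag = line.split()
        conn.execute(
            "INSERT OR REPLACE INTO origin VALUES (?, ?, ?, ?, ?)",
            (float(lat), float(lon), float(depth), otime, float(mag)),
        )

mirror(datascope_origin, db)
mirror(datascope_origin, db)  # idempotent: a second pass changes nothing
count = db.execute("SELECT COUNT(*) FROM origin").fetchone()[0]
print(count)  # 2
```

    The web applications mentioned in the abstract would then read only from the hub, never from the acquisition systems directly.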

  14. BAO Plate Archive Project: Digitization, Electronic Database and Research Programmes

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Abrahamyan, H. V.; Andreasyan, H. R.; Azatyan, N. M.; Farmanyan, S. V.; Gigoyan, K. S.; Gyulzadyan, M. V.; Khachatryan, K. G.; Knyazyan, A. V.; Kostandyan, G. R.; Mikayelyan, G. A.; Nikoghosyan, E. H.; Paronyan, G. M.; Vardanyan, A. V.

    2016-06-01

    The most important part of the astronomical observational heritage is the astronomical plate archives created on the basis of numerous observations at many observatories. The Byurakan Astrophysical Observatory (BAO) plate archive consists of 37,000 photographic plates and films obtained with the 2.6m telescope, the 1m and 0.5m Schmidt-type telescopes, and other smaller telescopes during 1947-1991. In 2002-2005, the 1874 plates of the famous Markarian Survey (also called the First Byurakan Survey, FBS) were digitized and the Digitized FBS (DFBS) was created. New science projects have been conducted based on this low-dispersion spectroscopic material. A large project on digitization of the whole BAO Plate Archive, creation of an electronic database, and its scientific usage was started in 2015. A Science Program Board was created to evaluate the observing material, to investigate new possibilities, and to propose new projects based on the combined usage of these observations together with other world databases. The Executing Team consists of 11 astronomers and 2 computer scientists and will use 2 EPSON Perfection V750 Pro scanners for the digitization; the Armenian Virtual Observatory (ArVO) database will be used to accommodate all new data. The project will run for 3 years, in 2015-2017, and the final result will be an electronic database and an online interactive sky map to be used for further research projects, mainly involving high proper motion stars, variable objects, and Solar System bodies.

  15. BAO Plate Archive digitization, creation of electronic database and its scientific usage

    NASA Astrophysics Data System (ADS)

    Mickaelian, Areg M.

    2015-08-01

    Astronomical plate archives created on the basis of numerous observations at many observatories are an important part of the astronomical heritage. The Byurakan Astrophysical Observatory (BAO) plate archive consists of 37,500 photographic plates and films obtained with the 2.6m telescope, the 1m and 0.5m Schmidt telescopes, and other smaller ones during 1947-1991. In 2002-2005, the 2000 plates of the famous Markarian Survey (First Byurakan Survey, FBS) were digitized and the Digitized FBS (DFBS, http://www.aras.am/Dfbs/dfbs.html) was created. New science projects have been conducted based on this low-dispersion spectroscopic material. In 2015, we started a project on digitization of the whole BAO Plate Archive, creation of an electronic database, and its scientific usage. A Science Program Board was created to evaluate the observing material, to investigate new possibilities, and to propose new projects based on the combined usage of these observations together with other world databases. The Executing Team consists of 9 astronomers and 3 computer scientists and will use 2 EPSON Perfection V750 Pro scanners for the digitization, as well as the Armenian Virtual Observatory (ArVO) database to accommodate all new data. The project will run for 3 years, in 2015-2017, and the final result will be an electronic database and an online interactive sky map to be used for further research projects.

  16. Programmed database system at the Chang Gung Craniofacial Center: part II--digitizing photographs.

    PubMed

    Chuang, Shiow-Shuh; Hung, Kai-Fong; de Villa, Glenda H; Chen, Philip K T; Lo, Lun-Jou; Chang, Sophia C N; Yu, Chung-Chih; Chen, Yu-Ray

    2003-07-01

    Archival tools for the digital images used in advertising do not fulfill clinical requirements and are only beginning to develop. Storing a large number of conventional photographic slides requires a lot of space and special conditions, and despite special precautions, degradation of the slides still occurs; the most common degradation is the appearance of fungus flecks. With recent advances in digital technology, it is now possible to store voluminous numbers of photographs on a computer hard drive and keep them for a long time. A self-programmed interface has been developed that integrates a database and an image browser system and can build and locate needed archive files in a matter of seconds with the click of a button. The required hardware and software are commercially available. There are 25,200 patients recorded in the database, covering 24,331 procedures. The image files comprise 6,384 patients with 88,366 digital picture files. From 1999 through 2002, NT$400,000 was saved using the new system. Photographs can be managed with the integrated database and browser software for archiving. This allows labeling of individual photographs with demographic information and browsing. Digitized images are not only more efficient and economical than conventional slide images, but they also facilitate clinical studies.

  17. Digital geologic map of the Coeur d'Alene 1:100,000 quadrangle, Idaho and Montana

    USGS Publications Warehouse

    digital compilation by Munts, Steven R.

    2000-01-01

    Between 1961 and 1969, Alan Griggs and others conducted fieldwork to prepare a geologic map of the Spokane 1:250,000 quadrangle (Griggs, 1973). Their field observations were posted on paper copies of 15-minute quadrangle maps. In 1999, the USGS contracted with the Idaho Geological Survey to prepare a digital version of the Coeur d’Alene 1:100,000 quadrangle. To facilitate this work, the USGS obtained the field maps prepared by Griggs and others from the USGS Field Records Library in Denver, Colorado. The Idaho Geological Survey (IGS) digitized these maps and used them in their mapping program. The mapping focused on field checks to resolve problems in poorly known areas and in areas of disagreement between adjoining maps. The IGS is currently in the process of preparing a final digital spatial database for the Coeur d’Alene 1:100,000 quadrangle. However, there was an immediate need for a digital version of the geologic map of the Coeur d’Alene 1:100,000 quadrangle, and the data from the field sheets, along with several other sources, were assembled to produce this interim product: the digital geologic map of the Coeur d’Alene 1:100,000 quadrangle, Idaho and Montana. It was compiled from the preliminary digital files prepared by the Idaho Geological Survey, supplemented by data from Griggs (1973) and from digital databases by Bookstrom and others (1999) and Derkey and others (1996). The resulting digital geologic map (GIS) database can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included; they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:100,000 (e.g., 1:62,500 or 1:24,000). The digital geologic map graphics (of00-135_map.pdf) that are provided are representations of the digital database. The map area is located in north Idaho.
This open-file report describes the geologic map units, the methods used to convert the geologic map data into a digital format, the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet.

  18. Geologic and structure map of the Choteau 1 degree by 2 degrees Quadrangle, western Montana

    USGS Publications Warehouse

    Mudge, Melville R.; Earhart, Robert L.; Whipple, James W.; Harrison, Jack E.

    1982-01-01

    The geologic and structure map of the Choteau 1 x 2 degree quadrangle (Mudge and others, 1982) was originally converted to a digital format by Jeff Silkwood (U.S. Forest Service) and completed by U.S. Geological Survey staff and a contractor at the Spokane Field Office (WA) in 2000 for input into a geographic information system (GIS). The resulting digital geologic map (GIS) database can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included; they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:250,000 (e.g., 1:100,000 or 1:24,000). The digital geologic map graphics and plot files (chot250k.gra/.hp/.eps and chot-map.pdf) that are provided in the digital package are representations of the digital database. They are not designed to be cartographic products.

  19. Image database for digital hand atlas

    NASA Astrophysics Data System (ADS)

    Cao, Fei; Huang, H. K.; Pietka, Ewa; Gilsanz, Vicente; Dey, Partha S.; Gertych, Arkadiusz; Pospiech-Kurkowska, Sywia

    2003-05-01

    Bone age assessment is a procedure frequently performed in pediatric patients to evaluate growth disorders. A commonly used method is atlas matching, a visual comparison of a hand radiograph against the small reference set of the old Greulich-Pyle atlas. We have developed a new digital hand atlas with a large set of clinically normal hand images from diverse ethnic groups. In this paper, we present our system design and the implementation of the digital atlas database to support computer-aided atlas matching for bone age assessment. The system consists of a hand atlas image database, a computer-aided diagnostic (CAD) software module for image processing and atlas matching, and a Web user interface. Users can use a Web browser to push DICOM images, directly or indirectly from PACS, to the CAD server for bone age assessment. Quantitative features that reflect skeletal maturity are extracted from the examined image and compared with patterns from the atlas image database to assess the bone age. The digital atlas method, built on a large image database and current Internet technology, provides an alternative to supplement or replace the traditional method for a quantitative, accurate, and cost-effective assessment of bone age.

  20. BAO Plate Archive Project

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Gigoyan, K. S.; Gyulzadyan, M. V.; Paronyan, G. M.; Abrahamyan, H. V.; Andreasyan, H. R.; Azatyan, N. M.; Kostandyan, G. R.; Samsonyan, A. L.; Mikayelyan, G. A.; Farmanyan, S. V.; Harutyunyan, V. L.

    2017-12-01

    We present the Byurakan Astrophysical Observatory (BAO) Plate Archive Project that is aimed at digitization, extraction and analysis of archival data and building an electronic database and interactive sky map. BAO Plate Archive consists of 37,500 photographic plates and films, obtained with 2.6m telescope, 1m and 0.5m Schmidt telescopes and other smaller ones during 1947-1991. The famous Markarian Survey (or the First Byurakan Survey, FBS) 2000 plates were digitized in 2002-2005 and the Digitized FBS (DFBS, www.aras.am/Dfbs/dfbs.html) was created. New science projects have been conducted based on this low-dispersion spectroscopic material. Several other smaller digitization projects have been carried out as well, such as part of Second Byurakan Survey (SBS) plates, photographic chain plates in Coma, where the blazar ON 231 is located and 2.6m film spectra of FBS Blue Stellar Objects. However, most of the plates and films are not digitized. In 2015, we have started a project on the whole BAO Plate Archive digitization, creation of electronic database and its scientific usage. Armenian Virtual Observatory (ArVO, www.aras.am/Arvo/arvo.htm) database will accommodate all new data. The project runs in collaboration with the Armenian Institute of Informatics and Automation Problems (IIAP) and will continues during 4 years in 2015-2018. The final result will be an Electronic Database and online Interactive Sky map to be used for further research projects. ArVO will provide all standards and tools for efficient usage of the scientific output and its integration in international databases.

  1. Analyzing critical material demand: A revised approach.

    PubMed

    Nguyen, Ruby Thuy; Fishman, Tomer; Zhao, Fu; Imholte, D D; Graedel, T E

    2018-07-15

    Apparent consumption has been widely used as a metric to estimate material demand. However, with technology advancement and complexity of material use, this metric has become less useful in tracking material flows, estimating recycling feedstocks, and conducting life cycle assessment of critical materials. We call for future research efforts to focus on building a multi-tiered consumption database for the global trade network of critical materials. This approach will help track how raw materials are processed into major components (e.g., motor assemblies) and eventually incorporated into complete pieces of equipment (e.g., wind turbines). Foreseeable challenges would involve: 1) difficulty in obtaining a comprehensive picture of trade partners due to business sensitive information, 2) complexity of materials going into components of a machine, and 3) difficulty maintaining such a database. We propose ways to address these challenges such as making use of digital design, learning from the experience of building similar databases, and developing a strategy for financial sustainability. We recommend that, with the advancement of information technology, small steps toward building such a database will contribute significantly to our understanding of material flows in society and the associated human impacts on the environment. Copyright © 2018 Elsevier B.V. All rights reserved.
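
    The multi-tiered accounting the authors call for (raw materials rolling up through components into finished equipment) can be sketched as a recursive bill-of-materials expansion. The item names and quantities below are invented placeholders, not data from the paper.

```python
# Hypothetical multi-tier bill of materials: each entry maps an item to the
# parts (or raw materials, in kg) needed to build one unit of it.
bom = {
    "wind_turbine": {"motor_assembly": 3, "blade": 3},
    "motor_assembly": {"neodymium_kg": 50.0, "copper_kg": 120.0},
    "blade": {"glass_fiber_kg": 800.0},
}

def material_content(item, qty=1.0):
    """Recursively expand an item into its total raw-material demand."""
    if item not in bom:                 # leaf: a raw material
        return {item: qty}
    totals = {}
    for part, n in bom[item].items():
        for mat, amount in material_content(part, qty * n).items():
            totals[mat] = totals.get(mat, 0.0) + amount
    return totals

demand = material_content("wind_turbine")
print(demand)  # {'neodymium_kg': 150.0, 'copper_kg': 360.0, 'glass_fiber_kg': 2400.0}
```

    A trade-network database of the kind proposed would populate tables like `bom` per country and tier, which is exactly where the business-sensitivity and maintenance challenges listed above arise.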

  2. Digital Library Education: Global Trends and Issues

    ERIC Educational Resources Information Center

    Shem, Magaji

    2015-01-01

    The paper examines trends and issues in digital education programmes globally, drawing examples of developmental growth of Library Information Science (LIS), schools and digital education courses in North America, Britain, and Southern Asia, the slow growth of LIS schools and digital education in Nigeria and some countries in Africa and India. The…

  3. Mission in the works promises precise global topographic data

    USGS Publications Warehouse

    Farr, T.; Evans, D.; Zebker, H.; Harding, D.; Bufton, J.; Dixon, T.; Vetrella, S.; Gesch, D.B.

    1995-01-01

    Significant deficiencies in the quality of today's topographic data severely limit scientific applications. Very few available data sets meet the stringent requirements of 10–30 m for global digital topography and 5 m or better vertical accuracy, and existing satellite systems are unlikely to fulfill these requirements. The Joint Topographic Science Working Group, appointed by NASA and the Italian Space Agency, concluded that radar interferometry coupled with a laser altimeter would be the most promising approach for improving data quality. By providing its own illumination at a wavelength long enough (e.g., λ = 25 cm) to penetrate clouds and rain, the interferometer would provide a global, uniform, high-quality topographic data set. One mission under study, TOPSAT, is well positioned to fill this niche and promises to pave the way toward a more standardized and precise topographic database. TOPSAT would be an international mission, designed to make use of recent technology advances in such programs as NASA's New Millennium. It could be ready to launch by the end of this decade.

  4. Map and database of Quaternary faults and folds in Colombia and its offshore regions

    USGS Publications Warehouse

    Paris, Gabriel; Machette, Michael N.; Dart, Richard L.; Haller, Kathleen M.

    2000-01-01

    As part of the International Lithosphere Program’s “World Map of Major Active Faults,” the U.S. Geological Survey (USGS) is assisting in the compilation of a series of digital maps of Quaternary faults and folds in Western Hemisphere countries. The maps show the locations, ages, and activity rates of major earthquake-related features such as faults and fault-related folds. They are accompanied by databases that describe these features and document current information on their activity in the Quaternary. To date, the project has published fault and fold maps for Costa Rica (Montero and others, 1998), Panama (Cowan and others, 1998), Venezuela (Audemard and others, 2000), Bolivia/Chile (Lavenu and others, 2000), and Argentina (Costa and others, 2000). The project is a key part of the Global Seismic Hazards Assessment Program (ILP Project II-0) for the International Decade for Natural Disaster Reduction.

  5. Third molar development in a contemporary Danish 13-25-year-old population.

    PubMed

    Arge, Sara; Boldsen, Jesper Lier; Wenzel, Ann; Holmstrup, Palle; Jensen, Niels Dyrgaard; Lynnerup, Niels

    2018-05-16

    We present a reference database for third molar development based on a contemporary Danish population. A total of 1302 digital panoramic images were evaluated. The images were taken at a known chronological age, ranging from 13 to 25 years. Third molar development was scored according to the Köhler modification of the 10-stage method of Gleiser and Hunt. We found that third molar development was generally advanced in the maxilla compared to the mandible and in males compared to females; in addition, the mandibular third molar mesial roots were generally more advanced in development than were the distal roots. There was no difference in third molar development between the left and right side of the jaws. Establishing global and robust databases on dental development is crucial for further development of forensic methods to evaluate age. Copyright © 2018. Published by Elsevier B.V.

  6. Reinforcing the foundations of ornithological nomenclature: Filling the gaps in Sherborn's and Richmond's historical legacy of bibliographic exploration.

    PubMed

    Dickinson, Edward C

    2016-01-01

    Due to its public popularity, ornithology has a huge corpus of scientific publication for a relatively small number of species. Although there are global checklists of currently recognised taxa, there has been only limited, mainly individual, effort to build a nomenclatural database that the science of ornithology deserves. This is especially true in relation to concise synonymies. With the arrival of ZooBank and the Biodiversity Heritage Library, the time has come to develop synonymies and to add fuller bibliographic detail to databases. The preparation for both began at the start of the 20th century with extensive work by Sherborn and Richmond. I discuss their legacy, offer notes on significant work since then, and provide suggestions for what remains to be done. To make solid the foundations for ornithological nomenclature and taxonomy, especially for synonymies, ornithologists will need to collaborate much more and contribute to the digital infrastructure.

  7. Development of a networked four-million-pixel pathological and radiological digital image presentation system and its application to medical conferences

    NASA Astrophysics Data System (ADS)

    Sakano, Toshikazu; Furukawa, Isao; Okumura, Akira; Yamaguchi, Takahiro; Fujii, Tetsuro; Ono, Sadayasu; Suzuki, Junji; Matsuya, Shoji; Ishihara, Teruo

    2001-08-01

    The widespread adoption of digital technology in the medical field has created demand for high-quality, high-speed, user-friendly digital image presentation systems for daily medical conferences. To fulfill this demand, we developed a presentation system for radiological and pathological images. It is composed of a super-high-definition (SHD) imaging system, a radiological image database (R-DB), a pathological image database (P-DB), and the network interconnecting these three. The R-DB consists of a 270 GB RAID, a database server workstation, and a film digitizer. The P-DB includes an optical microscope, a four-million-pixel digital camera, a 90 GB RAID, and a database server workstation. A 100 Mbps Ethernet LAN interconnects all the sub-systems. Web-based system operation software was developed for easy operation. We installed the whole system in NTT East Kanto Hospital to evaluate it in the weekly case conferences. The SHD system could display digital full-color images of 2048 x 2048 pixels on a 28-inch CRT monitor. The doctors evaluated the image quality and size and found them applicable to actual medical diagnosis. They also appreciated the short image switching time, which contributed to smooth presentation. Thus, we confirmed that the system's characteristics met the requirements.

  8. ISC-GEM: Global Instrumental Earthquake Catalogue (1900-2009), I. Data collection from early instrumental seismological bulletins

    NASA Astrophysics Data System (ADS)

    Di Giacomo, Domenico; Harris, James; Villaseñor, Antonio; Storchak, Dmitry A.; Engdahl, E. Robert; Lee, William H. K.

    2015-02-01

    In order to produce a new global reference earthquake catalogue based on instrumental data covering the last 100+ years of global earthquakes, we collected, digitized and processed an unprecedented amount of printed early instrumental seismological bulletins with fundamental parametric data for relocating and reassessing the magnitude of earthquakes that occurred in the period between 1904 and 1970. This effort was necessary in order to produce an earthquake catalogue with locations and magnitudes as homogeneous as possible. The parametric data obtained and processed during this work fills a large gap in electronic bulletin data availability. This new dataset complements the data publicly available in the International Seismological Centre (ISC) Bulletin starting in 1964. With respect to the amplitude-period data necessary to re-compute magnitude, we searched through the global collection of printed bulletins stored at the ISC and entered relevant station parametric data into the database. As a result, over 110,000 surface and body-wave amplitude-period pairs for re-computing standard magnitudes MS and mb were added to the ISC database. To facilitate earthquake relocation, different sources have been used to retrieve body-wave arrival times. These were entered into the database using optical character recognition methods (International Seismological Summary, 1918-1959) or manually (e.g., British Association for the Advancement of Science, 1913-1917). In total, ∼1,000,000 phase arrival times were added to the ISC database for large earthquakes that occurred in the time interval 1904-1970. The selection of earthquakes for which data was added depends on time period and magnitude: for the early years of last century (until 1917) only very large earthquakes were selected for processing (M ⩾ 7.5), whereas in the periods 1918-1959 and 1960-2009 the magnitude thresholds are 6.25 and 5.5, respectively. Such a selection was mainly dictated by limitations in time and funding. Although the newly available parametric data is only a subset of the station data available in the printed bulletins, its electronic availability will be important for any future study of earthquakes that occurred during the early instrumental period.
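    The magnitude cut-offs described above amount to a simple period-based selection rule. The following sketch illustrates it; the function name and interface are hypothetical, and only the thresholds stated in the abstract are assumed:

```python
def selected_for_processing(year: int, magnitude: float) -> bool:
    """Illustrative sketch of the ISC-GEM event-selection rule.
    Thresholds per time period are those stated in the abstract."""
    if year <= 1917:          # early instrumental period: only very large events
        return magnitude >= 7.5
    if year <= 1959:          # 1918-1959
        return magnitude >= 6.25
    return magnitude >= 5.5   # 1960-2009
```

    Under this rule, a 1910 event would need M ⩾ 7.5 to be processed, while a 1950 event would need only M ⩾ 6.25.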

  9. Global Tsunami Database: Adding Geologic Deposits, Proxies, and Tools

    NASA Astrophysics Data System (ADS)

    Brocko, V. R.; Varner, J.

    2007-12-01

    A result of collaboration between NOAA's National Geophysical Data Center (NGDC) and the Cooperative Institute for Research in the Environmental Sciences (CIRES), the Global Tsunami Database includes instrumental records, human observations, and now, information inferred from the geologic record. Deep Ocean Assessment and Reporting of Tsunamis (DART) data, historical reports, and information gleaned from published tsunami deposit research build a multi-faceted view of tsunami hazards and their history around the world. Tsunami history provides clues to what might happen in the future, including frequency of occurrence and maximum wave heights. However, instrumental and written records commonly span too little time to reveal the full range of a region's tsunami hazard. The sedimentary deposits of tsunamis, identified with the aid of modern analogs, increasingly complement instrumental and human observations. By adding the component of tsunamis inferred from the geologic record, the Global Tsunami Database extends the record of tsunamis backward in time. Deposit locations, their estimated ages, and descriptions of the deposits themselves fill in the tsunami record. Tsunamis inferred from proxies, such as evidence for coseismic subsidence, are included to estimate recurrence intervals, but are flagged to highlight the absence of a physical deposit. Authors may submit their own descriptions and upload digital versions of publications. Users may sort by any populated field, including event, location, region, age of deposit, author, publication type (to extract information from peer-reviewed publications only, if desired), grain size, composition, and presence/absence of plant material. Users may find tsunami deposit references for a given location, event or author; search for particular properties of tsunami deposits; and even identify potential collaborators. Users may also download public-domain documents. Data and information may be viewed using tools designed to extract and display data from the Oracle database (selection forms, Web Map Services, and Web Feature Services). In addition, the historic tsunami archive (along with related earthquakes and volcanic eruptions) is available in KML (Keyhole Markup Language) format for use with Google Earth and similar geo-viewers.

  10. NOAA Data Rescue of Key Solar Databases and Digitization of Historical Solar Images

    NASA Astrophysics Data System (ADS)

    Coffey, H. E.

    2006-08-01

    Over a number of years, the staff at NOAA National Geophysical Data Center (NGDC) has worked to rescue key solar databases by converting them to digital format and making them available via the World Wide Web. NOAA has had several data rescue programs where staff compete for funds to rescue important and critical historical data that are languishing in archives and at risk of being lost due to deteriorating condition, loss of any metadata or descriptive text that describe the databases, lack of interest or funding in maintaining databases, etc. The Solar-Terrestrial Physics Division at NGDC was able to obtain funds to key in some critical historical tabular databases. Recently the NOAA Climate Database Modernization Program (CDMP) funded a project to digitize historical solar images, producing a large online database of historical daily full disk solar images. The images include the wavelengths Calcium K, Hydrogen Alpha, and white light photos, as well as sunspot drawings and the comprehensive drawings of a multitude of solar phenomena on one daily map (Fraunhofer maps and Wendelstein drawings). Included in the digitization are high resolution solar H-alpha images taken at the Boulder Solar Observatory 1967-1984. The scanned daily images document many phases of solar activity, from decadal variation to rotational variation to daily changes. Smaller versions are available online. Larger versions are available by request. See http://www.ngdc.noaa.gov/stp/SOLAR/ftpsolarimages.html. The tabular listings and solar imagery will be discussed.

  11. Geologic Map Database of Texas

    USGS Publications Warehouse

    Stoeser, Douglas B.; Shock, Nancy; Green, Gregory N.; Dumonceaux, Gayle M.; Heran, William D.

    2005-01-01

    The purpose of this report is to release a digital geologic map database for the State of Texas. This database was compiled for the U.S. Geological Survey (USGS) Minerals Program, National Surveys and Analysis Project, whose goal is a nationwide assemblage of geologic, geochemical, geophysical, and other data. This release makes the geologic data from the Geologic Map of Texas available in digital format. Original clear film positives provided by the Texas Bureau of Economic Geology were photographically enlarged onto Mylar film. These films were scanned, georeferenced, digitized, and attributed by Geologic Data Systems (GDS), Inc., Denver, Colorado. Project oversight and quality control were the responsibility of the U.S. Geological Survey. ESRI ArcInfo coverages, AMLs, and shapefiles are provided.

  12. Modernization and multiscale databases at the U.S. geological survey

    USGS Publications Warehouse

    Morrison, J.L.

    1992-01-01

    The U.S. Geological Survey (USGS) has begun a digital cartographic modernization program. Keys to that program are the creation of a multiscale database, a feature-based file structure that is derived from a spatial data model, and a series of "templates" or rules that specify the relationships between instances of entities in reality and features in the database. The database will initially hold data collected from the USGS standard map products at scales of 1:24,000, 1:100,000, and 1:2,000,000. The spatial data model is called the digital line graph-enhanced model, and the comprehensive rule set consists of collection rules, product generation rules, and conflict resolution rules. This modernization program will affect the USGS mapmaking process because both digital and graphic products will be created from the database. In addition, non-USGS map users will have more flexibility in uses of the databases. These remarks are those of the session discussant made in response to the six papers and the keynote address given in the session. © 1992.

  13. Evaluation of Acoustic Propagation Paths into the Human Head

    DTIC Science & Technology

    2005-07-25

    A 3D finite-element solid mesh was constructed using a digital image database of an adult male head. Coupled acoustic-mechanical finite-element analysis (FEA) was used to model the wave propagation through the fluid-solid structures and the air-borne sound pressure amplitude transmitted via the alternate propagation paths.

  14. Digital Geologic Map of the Rosalia 1:100,000 Quadrangle, Washington and Idaho: A Digital Database for the 1990 S.Z. Waggoner Map

    USGS Publications Warehouse

    Derkey, Pamela D.; Johnson, Bruce R.; Lackaff, Beatrice B.; Derkey, Robert E.

    1998-01-01

    The geologic map of the Rosalia 1:100,000-scale quadrangle was compiled in 1990 by S.Z. Waggoner of the Washington state Division of Geology and Earth Resources. This data was entered into a geographic information system (GIS) as part of a larger effort to create regional digital geology for the Pacific Northwest. The intent was to provide a digital geospatial database for a previously published black and white paper geologic map. This database can be queried in many ways to produce a variety of geologic maps. Digital base map data files are not included: they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:100,000 (e.g., 1:62,500 or 1:24,000) as it has been somewhat generalized to fit the 1:100,000 scale map. The map area is located in eastern Washington and extends across the state border into western Idaho. This open-file report describes the methods used to convert the geologic map data into a digital format, documents the file structures, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. We wish to thank J. Eric Schuster of the Washington Division of Geology and Earth Resources for providing the original stable-base mylar and the funding for it to be scanned. We also thank Dick Blank and Barry Moring of the U.S. Geological Survey for reviewing the manuscript and digital files, respectively.

  15. Global health education: a pilot in trans-disciplinary, digital instruction

    PubMed Central

    Wipfli, Heather; Press, David J.; Kuhn, Virginia

    2013-01-01

    Background The development of new global health academic programs provides unique opportunities to create innovative educational approaches within and across universities. Recent evidence suggests that digital media technologies may provide feasible and cost-effective alternatives to traditional classroom instruction; yet, many emerging global health academic programs lag behind in the utilization of modern technologies. Objective We created an inter-departmental University of Southern California (USC) collaboration to develop and implement a course focused on digital media and global health. Design Course curriculum was based on core tenets of modern education: multi-disciplinary, technologically advanced, learner-centered, and professional application of knowledge. Student and university evaluations were reviewed to qualitatively assess course satisfaction and educational outcomes. Results ‘New Media for Global Health’ ran for 18 weeks in the Spring 2012 semester with N=41 students (56.1% global health and 43.9% digital studies students). The course resulted in a number of high-quality global health-related digital media products available at http://iml420.wordpress.com/. Challenges confronted at USC included administrative challenges related to co-teaching and frustration from students conditioned to a rigid system of teacher-led learning within a specific discipline. Quantitative and qualitative course evaluations reflected positive feedback for the course instructors and mixed reviews for the organization of the course. Conclusion The development of innovative educational programs in global health requires on-going experimentation and information sharing across departments and universities. Digital media technologies may have implications for future efforts to improve global health education. PMID:23643297

  16. Global health education: a pilot in trans-disciplinary, digital instruction.

    PubMed

    Wipfli, Heather; Press, David J; Kuhn, Virginia

    2013-05-02

    The development of new global health academic programs provides unique opportunities to create innovative educational approaches within and across universities. Recent evidence suggests that digital media technologies may provide feasible and cost-effective alternatives to traditional classroom instruction; yet, many emerging global health academic programs lag behind in the utilization of modern technologies. We created an inter-departmental University of Southern California (USC) collaboration to develop and implement a course focused on digital media and global health. Course curriculum was based on core tenets of modern education: multi-disciplinary, technologically advanced, learner-centered, and professional application of knowledge. Student and university evaluations were reviewed to qualitatively assess course satisfaction and educational outcomes. 'New Media for Global Health' ran for 18 weeks in the Spring 2012 semester with N=41 students (56.1% global health and 43.9% digital studies students). The course resulted in a number of high-quality global health-related digital media products available at http://iml420.wordpress.com/. Challenges confronted at USC included administrative challenges related to co-teaching and frustration from students conditioned to a rigid system of teacher-led learning within a specific discipline. Quantitative and qualitative course evaluations reflected positive feedback for the course instructors and mixed reviews for the organization of the course. The development of innovative educational programs in global health requires on-going experimentation and information sharing across departments and universities. Digital media technologies may have implications for future efforts to improve global health education.

  17. Archive and Database as Metaphor: Theorizing the Historical Record

    ERIC Educational Resources Information Center

    Manoff, Marlene

    2010-01-01

    Digital media increase the visibility and presence of the past while also reshaping our sense of history. We have extraordinary access to digital versions of books, journals, film, television, music, art and popular culture from earlier eras. New theoretical formulations of database and archive provide ways to think creatively about these changes…

  18. Digital geomorphological landslide hazard mapping of the Alpago area, Italy

    NASA Astrophysics Data System (ADS)

    van Westen, Cees J.; Soeters, Rob; Sijmons, Koert

    Large-scale geomorphological maps of mountainous areas are traditionally made using complex symbol-based legends. They can serve as excellent "geomorphological databases", from which an experienced geomorphologist can extract a large amount of information for hazard mapping. However, these maps are not designed to be used in combination with a GIS, due to their complex cartographic structure. In this paper, two methods are presented for digital geomorphological mapping at large scales using GIS and digital cartographic software. The methods are applied to an area with a complex geomorphological setting in the Borsoia catchment, located in the Alpago region, near Belluno in the Italian Alps. The GIS database set-up is presented with an overview of the data layers that have been generated and how they are interrelated. The GIS database was also converted into a paper map, using a digital cartographic package. The resulting large-scale geomorphological hazard map is attached. The resulting GIS database and cartographic product can be used to analyse the hazard type and hazard degree for each polygon, and to find the reasons for the hazard classification.

  19. Cadastral Database Positional Accuracy Improvement

    NASA Astrophysics Data System (ADS)

    Hashim, N. M.; Omar, A. H.; Ramli, S. N. M.; Omar, K. M.; Din, N.

    2017-10-01

    Positional Accuracy Improvement (PAI) is the process of refining the geometry of features in a geospatial dataset to improve their actual positions. The actual position relates to the absolute position in a specific coordinate system and to the relationships with neighboring features. With the growth of spatial technologies, especially Geographical Information Systems (GIS) and Global Navigation Satellite Systems (GNSS), PAI campaigns are inevitable, especially for legacy cadastral databases. Integrating a legacy dataset with a higher-accuracy dataset, such as GNSS observations, is a potential solution for improving the legacy dataset. However, merely integrating both datasets will distort the relative geometry; the improved dataset should be further treated to minimize inherent errors and to fit the new, accurate dataset. The main focus of this study is to describe a method of angular-based Least Squares Adjustment (LSA) for the PAI process of a legacy dataset. An existing high-accuracy dataset, the National Digital Cadastral Database (NDCDB), is then used as a benchmark to validate the results. It was found that the proposed technique is highly feasible for positional accuracy improvement of legacy spatial datasets.
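    The paper's angular-based LSA is considerably more involved, but the underlying idea of least-squares fitting a legacy dataset to an accurate reference can be illustrated with a much-simplified, translation-only adjustment. All coordinates below are hypothetical and the function is a stand-in, not the authors' method:

```python
def ls_translation(legacy, reference):
    """Translation-only least-squares adjustment: for the model
    reference = legacy + (tx, ty), the least-squares estimate of the
    shift is the mean of the coordinate differences."""
    n = len(legacy)
    tx = sum(r[0] - l[0] for l, r in zip(legacy, reference)) / n
    ty = sum(r[1] - l[1] for l, r in zip(legacy, reference)) / n
    return tx, ty

# Hypothetical control points: legacy cadastral corners vs. GNSS-observed positions.
legacy = [(0.0, 0.0), (100.0, 0.0), (100.0, 100.0)]
gnss = [(2.0, 3.1), (102.1, 3.0), (102.0, 102.9)]
tx, ty = ls_translation(legacy, gnss)  # shift to apply to the legacy dataset
```

    A real PAI campaign would additionally estimate rotation and scale, weight the observations, and check residuals against the NDCDB benchmark.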

  20. Digital photography

    PubMed Central

    Windsor, J S; Rodway, G W; Middleton, P M; McCarthy, S

    2006-01-01

    Objective The emergence of a new generation of “point‐and‐shoot” digital cameras offers doctors a compact, portable and user‐friendly solution to the recording of highly detailed digital photographs and video images. This work highlights the use of such technology, and provides information for those who wish to record, store and display their own medical images. Methods Over a 3‐month period, a digital camera was carried by a doctor in a busy, adult emergency department and used to record a range of clinical images that were subsequently transferred to a computer database. Results In total, 493 digital images were recorded, of which 428 were photographs and 65 were video clips. These were successfully used for teaching purposes, publications and patient records. Conclusions This study highlights the importance of informed consent, the selection of a suitable package of digital technology and the role of basic photographic technique in developing a successful digital database in a busy clinical environment. PMID:17068281

  1. Landscape features, standards, and semantics in U.S. national topographic mapping databases

    USGS Publications Warehouse

    Varanka, Dalia

    2009-01-01

    The objective of this paper is to examine the contrast between local, field-surveyed topographical representation and feature representation in digital, centralized databases and to clarify their ontological implications. The semantics of these two approaches are contrasted by examining the categorization of features by subject domains inherent to national topographic mapping. When comparing five USGS topographic mapping domain and feature lists, results indicate that multiple semantic meanings and ontology rules were applied to the initial digital database, but were lost as databases became more centralized at national scales, and common semantics were replaced by technological terms.

  2. Possible costs associated with investigating and mitigating geologic hazards in rural areas of western San Mateo County, California with a section on using the USGS website to determine the cost of developing property for residences in rural parts of San Mateo County, California

    USGS Publications Warehouse

    Brabb, Earl E.; Roberts, Sebastian; Cotton, William R.; Kropp, Alan L.; Wright, Robert H.; Zinn, Erik N.; Digital database by Roberts, Sebastian; Mills, Suzanne K.; Barnes, Jason B.; Marsolek, Joanna E.

    2000-01-01

    This publication consists of a digital map database on a geohazards web site, http://kaibab.wr.usgs.gov/geohazweb/intro.htm, this text, and 43 digital map images available for downloading at this site. The report is stored as several digital files, in ARC export (uncompressed) format for the database, and Postscript and PDF formats for the map images. Several of the source data layers for the images have already been released in other publications by the USGS and are available for downloading on the Internet. These source layers are not included in this digital database, but rather a reference is given for the web site where the data can be found in digital format. The exported ARC coverages and grids lie in UTM zone 10 projection. The pamphlet, which only describes the content and character of the digital map database, is included as Postscript, PDF, and ASCII text files and is also available on paper as USGS Open-File Report 00-127. The full versatility of the spatial database is realized by importing the ARC export files into ARC/INFO or an equivalent GIS. Other GIS packages, including MapInfo and ARCVIEW, can also use the ARC export files. The Postscript map image can be used for viewing or plotting in computer systems with sufficient capacity, and the considerably smaller PDF image files can be viewed or plotted in full or in part from Adobe ACROBAT software running on Macintosh, PC, or UNIX platforms.

  3. Increasing the efficiency of digitization workflows for herbarium specimens.

    PubMed

    Tulig, Melissa; Tarnowsky, Nicole; Bevans, Michael; Kirchgessner, Anthony; Thiers, Barbara M

    2012-01-01

    The New York Botanical Garden Herbarium has been databasing and imaging its estimated 7.3 million plant specimens for the past 17 years. Due to the size of the collection, we have been selectively digitizing fundable subsets of specimens, making successive passes through the herbarium with each new grant. With this strategy, the average rate for databasing complete records has been 10 specimens per hour. With 1.3 million specimens databased, this effort has taken about 130,000 hours of staff time. At this rate, to complete the herbarium and digitize the remaining 6 million specimens, another 600,000 hours would be needed. Given the current biodiversity and economic crises, there is neither the time nor money to complete the collection at this rate. Through a combination of grants over the last few years, The New York Botanical Garden has been testing new protocols and tactics for increasing the rate of digitization through combinations of data collaboration, field book digitization, partial data entry and imaging, and optical character recognition (OCR) of specimen images. With the launch of the National Science Foundation's new Advancing Digitization of Biological Collections program, we hope to move forward with larger, more efficient digitization projects, capturing data from larger portions of the herbarium at a fraction of the cost and time.
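    The staffing arithmetic in this abstract follows directly from the stated rate of 10 complete records per hour; as a quick check:

```python
# Worked arithmetic from the abstract: 10 complete records databased per hour.
RATE_PER_HOUR = 10
databased = 1_300_000      # specimens databased so far
remaining = 6_000_000      # specimens still to digitize

hours_spent = databased // RATE_PER_HOUR   # staff hours already invested
hours_needed = remaining // RATE_PER_HOUR  # staff hours at the same rate
```

    This reproduces the 130,000 hours spent and 600,000 hours remaining quoted above.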

  4. Increasing the efficiency of digitization workflows for herbarium specimens

    PubMed Central

    Tulig, Melissa; Tarnowsky, Nicole; Bevans, Michael; Kirchgessner, Anthony; Thiers, Barbara M.

    2012-01-01

    The New York Botanical Garden Herbarium has been databasing and imaging its estimated 7.3 million plant specimens for the past 17 years. Due to the size of the collection, we have been selectively digitizing fundable subsets of specimens, making successive passes through the herbarium with each new grant. With this strategy, the average rate for databasing complete records has been 10 specimens per hour. With 1.3 million specimens databased, this effort has taken about 130,000 hours of staff time. At this rate, to complete the herbarium and digitize the remaining 6 million specimens, another 600,000 hours would be needed. Given the current biodiversity and economic crises, there is neither the time nor money to complete the collection at this rate. Through a combination of grants over the last few years, The New York Botanical Garden has been testing new protocols and tactics for increasing the rate of digitization through combinations of data collaboration, field book digitization, partial data entry and imaging, and optical character recognition (OCR) of specimen images. With the launch of the National Science Foundation’s new Advancing Digitization of Biological Collections program, we hope to move forward with larger, more efficient digitization projects, capturing data from larger portions of the herbarium at a fraction of the cost and time. PMID:22859882

  5. Digital Dental X-ray Database for Caries Screening

    NASA Astrophysics Data System (ADS)

    Rad, Abdolvahab Ehsani; Rahim, Mohd Shafry Mohd; Rehman, Amjad; Saba, Tanzila

    2016-06-01

    A standard database is an essential requirement for comparing the performance of image analysis techniques, and the main obstacle in dental image analysis has been the lack of an available image database, which this paper provides. Periapical dental X-ray images that are suitable for analysis and have been approved by many dental experts were collected. This type of dental radiograph is common and inexpensive, and is normally used for diagnosing dental disease and detecting abnormalities. The database contains 120 periapical X-ray images covering the upper and lower jaws. This digital dental database is constructed to provide a common source for researchers to compare image analysis techniques and to improve the performance of each technique.

  6. 15 years of zooming in and zooming out: Developing a new single scale national active fault database of New Zealand

    NASA Astrophysics Data System (ADS)

    Ries, William; Langridge, Robert; Villamor, Pilar; Litchfield, Nicola; Van Dissen, Russ; Townsend, Dougal; Lee, Julie; Heron, David; Lukovic, Biljana

    2014-05-01

    In New Zealand, we are currently reconciling multiple digital coverages of mapped active faults into a national coverage at a single scale (1:250,000). This seems at first glance to be a relatively simple task. However, methods used to capture data, the scale of capture, and the initial purpose of the fault mapping, has produced datasets that have very different characteristics. The New Zealand digital active fault database (AFDB) was initially developed as a way of managing active fault locations and fault-related features within a computer-based spatial framework. The data contained within the AFDB comes from a wide range of studies, from plate tectonic (1:500,000) to cadastral (1:2,000) scale. The database was designed to allow capture of field observations and remotely sourced data without a loss in data resolution. This approach has worked well as a method for compiling a centralised database for fault information but not for providing a complete national coverage at a single scale. During the last 15 years other complementary projects have used and also contributed data to the AFDB, most notably the QMAP project (a national series of geological maps completed over 19 years that include coverage of active and inactive faults at 1:250,000). AFDB linework and attributes was incorporated into this series but simplification of linework and attributes has occurred to maintain map clarity at 1:250,000 scale. Also, during this period on-going mapping of active faults has improved upon these data. Other projects of note that have used data from the AFDB include the National Seismic Hazard Model of New Zealand and the Global Earthquake Model (GEM). The main goal of the current project has been to provide the best digital spatial representation of a fault trace at 1:250,000 scale and combine this with the most up to date attributes. 
In some areas this has required simplification of very finely detailed data, and in some cases new mapping, to provide a complete coverage. Where datasets have conflicting linework and/or attributes, the data were reviewed through consultation with authors or review of published research to ensure the most up-to-date representation was retained. The current project aims to provide a coverage that is consistent between the AFDB and the QMAP digital data, and to provide a free download of these data on the AFDB website (http://data.gns.cri.nz/af/).

  7. Digital hand atlas and computer-aided bone age assessment via the Web

    NASA Astrophysics Data System (ADS)

    Cao, Fei; Huang, H. K.; Pietka, Ewa; Gilsanz, Vicente

    1999-07-01

A frequently used method of bone age assessment is atlas matching, in which a radiological hand image is examined against a reference set of atlas patterns of normal standards. We are in the process of developing a digital hand atlas with a large standard set of normal hand and wrist images that reflect skeletal maturity, race and sex differences, and current child development. The digital hand atlas will be used for computer-aided bone age assessment via the Web. We have designed and partially implemented a computer-aided diagnostic (CAD) system for Web-based bone age assessment. The system consists of a digital hand atlas, a relational image database, and a Web-based user interface. The digital atlas is based on a large standard set of normal hand and wrist images with extracted bone objects and quantitative features. The image database uses content-based indexing to organize the hand images and their attributes and presents them to users in a structured way. The Web-based user interface allows users to interact with the hand image database from browsers. Users can use a Web browser to push a clinical hand image to the CAD server for a bone age assessment. Quantitative features on the examined image, which reflect skeletal maturity, will be extracted and compared with patterns from the atlas database to assess the bone age. The relevant reference images and the final assessment report will be sent back to the user's browser via the Web. The digital atlas will remove the disadvantages of the currently used, out-of-date one and allow bone age assessment to be computerized and done conveniently via the Web. In this paper, we present the system design and Web-based client-server model for computer-assisted bone age assessment and our initial implementation of the digital atlas database.
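The matching step described above, in which quantitative features extracted from the examined image are compared with atlas patterns, can be sketched as a nearest-neighbour lookup. The atlas entries and feature values below are hypothetical placeholders, not the paper's actual features:

```python
import math

# Hypothetical atlas: bone age (years) -> quantitative feature vector
# (e.g. normalised bone-size and ossification measurements; illustrative only)
ATLAS = {
    6.0:  [0.42, 1.10, 0.30],
    8.0:  [0.55, 1.32, 0.41],
    10.0: [0.66, 1.51, 0.52],
    12.0: [0.74, 1.70, 0.63],
}

def assess_bone_age(features):
    """Return the atlas bone age whose feature vector lies nearest
    (Euclidean distance) to the features extracted from the clinical image."""
    return min(ATLAS, key=lambda age: math.dist(ATLAS[age], features))
```

A clinical image whose extracted features are close to the 8-year pattern would be matched to that atlas entry; a real system would also weight features and report a confidence alongside the age.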

  8. Using Geocoded Databases in Teaching Urban Historical Geography.

    ERIC Educational Resources Information Center

    Miller, Roger P.

    1986-01-01

    Provides information regarding hardware and software requirements for using geocoded databases in urban historical geography. Reviews 11 IBM and Apple Macintosh database programs and describes the pen plotter and digitizing table interface used with the databases. (JDH)

  9. The Neotoma Paleoecology Database: An International Community-Curated Resource for Paleoecological and Paleoenvironmental Data

    NASA Astrophysics Data System (ADS)

    Williams, J. W.; Grimm, E. C.; Ashworth, A. C.; Blois, J.; Charles, D. F.; Crawford, S.; Davis, E.; Goring, S. J.; Graham, R. W.; Miller, D. A.; Smith, A. J.; Stryker, M.; Uhen, M. D.

    2017-12-01

The Neotoma Paleoecology Database supports global change research at the intersection of geology and ecology by providing a high-quality, community-curated data repository for paleoecological data. These data are widely used to study biological responses and feedbacks to past environmental change at local to global scales. The Neotoma data model is flexible and can store multiple kinds of fossil, biogeochemical, or physical variables measured from sedimentary archives. Data additions to Neotoma are growing and include >3.5 million observations, >16,000 datasets, and >8,500 sites. Dataset types include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Neotoma data can be found and retrieved in multiple ways, including the Explorer map-based interface, a RESTful Application Programming Interface, the neotoma R package, and digital object identifiers. Neotoma has partnered with the Paleobiology Database to produce a common data portal for paleobiological data, called the Earth Life Consortium. A new embargo management system allows investigators to deposit their data into Neotoma and then make use of Neotoma's value-added services. Neotoma's distributed scientific governance model is flexible and scalable, with many open pathways for welcoming new members, data contributors, stewards, and research communities. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.

  10. SPACEWAY: Providing affordable and versatile communication solutions

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, E. J.

    1995-08-01

    By the end of this decade, Hughes' SPACEWAY network will provide the first interactive 'bandwidth on demand' communication services for a variety of applications. High quality digital voice, interactive video, global access to multimedia databases, and transborder workgroup computing will make SPACEWAY an essential component of the computer-based workplace of the 21st century. With relatively few satellites to construct, insure, and launch -- plus extensive use of cost-effective, tightly focused spot beams on the world's most populated areas -- the high capacity SPACEWAY system can pass its significant cost savings onto its customers. The SPACEWAY network is different from other proposed global networks in that its geostationary orbit location makes it a truly market driven system: each satellite will make available extensive telecom services to hundreds of millions of people within the continuous view of that satellite, providing immediate capacity within a specific region of the world.

  11. An approach to regional wetland digital elevation model development using a differential global positioning system and a custom-built helicopter-based surveying system

    USGS Publications Warehouse

    Jones, J.W.; Desmond, G.B.; Henkle, C.; Glover, R.

    2012-01-01

Accurate topographic data are critical to restoration science and planning for the Everglades region of South Florida, USA. They are needed to monitor and simulate water level, water depth, and hydroperiod and are used in scientific research on hydrologic and biologic processes. Because large wetland environments challenge conventional ground-based and remotely sensed data collection methods, the United States Geological Survey (USGS) adapted a classical data collection instrument to global positioning system (GPS) and geographic information system (GIS) technologies. Data acquired with this instrument were processed using geostatistics to yield sub-water level elevation values with centimetre accuracy (±15 cm). The developed database framework, modelling philosophy, and metadata protocol allow for continued, collaborative model revision and expansion, given additional elevation or other ancillary data. © 2012 Taylor & Francis.
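The geostatistical step described above turns scattered GPS elevation samples into a continuous surface. As a minimal illustration, here is an inverse-distance-weighting sketch, a simpler stand-in for the kriging-style geostatistics used in the actual study; the sample points are hypothetical:

```python
def idw(points, x, y, power=2.0):
    """Inverse-distance-weighted elevation estimate at (x, y) from a list of
    (px, py, elevation) samples. Nearby samples dominate; an exact hit on a
    sample point returns that sample's elevation unchanged."""
    num = den = 0.0
    for px, py, z in points:
        d2 = (px - x) ** 2 + (py - y) ** 2
        if d2 == 0.0:
            return z  # query coincides with a sample point
        w = 1.0 / d2 ** (power / 2)
        num += w * z
        den += w
    return num / den
```

Midway between two equally distant samples the estimate is simply their mean; kriging differs by weighting samples according to a fitted spatial-covariance model rather than raw distance.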

  12. Citizen science, GIS, and the global hunt for landslides

    NASA Astrophysics Data System (ADS)

    Juang, C.; Stanley, T.; Kirschbaum, D.

    2017-12-01

Landslides occur across the United States and around the world, causing much suffering and infrastructure damage. Many of these events have been recorded in the Global Landslide Catalog (GLC), a worldwide record of recent rainfall-triggered landslides. The extent and composition of this database have been affected by the limits of media search tools and available staffing. Citizen scientists could expand the effort exponentially, as well as diversify the knowledge base of the research team. In order to enable this collaboration, the NASA Center for Climate Simulation has created a GIS portal for viewing, editing, and managing the GLC. The data are also exposed through a REST API for easy incorporation into geospatial websites by third parties. Future developments may include the ability to store polygons delineating large landslides, digitization from recent satellite imagery, and the establishment of a community for international landslide research that is open to both lay and academic users.

  13. SPACEWAY: Providing affordable and versatile communication solutions

    NASA Technical Reports Server (NTRS)

    Fitzpatrick, E. J.

    1995-01-01

    By the end of this decade, Hughes' SPACEWAY network will provide the first interactive 'bandwidth on demand' communication services for a variety of applications. High quality digital voice, interactive video, global access to multimedia databases, and transborder workgroup computing will make SPACEWAY an essential component of the computer-based workplace of the 21st century. With relatively few satellites to construct, insure, and launch -- plus extensive use of cost-effective, tightly focused spot beams on the world's most populated areas -- the high capacity SPACEWAY system can pass its significant cost savings onto its customers. The SPACEWAY network is different from other proposed global networks in that its geostationary orbit location makes it a truly market driven system: each satellite will make available extensive telecom services to hundreds of millions of people within the continuous view of that satellite, providing immediate capacity within a specific region of the world.

  14. A long-term study of the impact of solar flares on ionospheric characteristics measured by digisondes and GNSS receivers

    NASA Astrophysics Data System (ADS)

    Tripathi, Sharad Chandra; Haralambous, Haris; Das, Tanmay

    2016-07-01

Solar flares are highly transient phenomena radiating over a wide spectrum of wavelengths, with EUV and X-rays imposing the most significant effect on ionospheric characteristics. This study presents an attempt to examine these effects qualitatively and quantitatively, as measured by digisondes and GNSS receivers on a global scale. For this purpose we have divided the globe into three sectors (American, African-European, and Asian) based on longitude. We have extracted ionospheric characteristics by manually scaling the ionograms provided by DIDBase (Digital Ionogram Database) of the Global Ionospheric Radio Observatory (GIRO) during X-class flares over approximately one solar cycle. We have also used TEC data extracted from GPS observations at collocated IGS stations. Spectral analysis of the solar flares is added to the methodology to compare the effects in terms of spectral characteristics.
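The three-sector division by longitude described above can be sketched as a simple lookup. The boundary values below are illustrative assumptions, not those stated in the study:

```python
def longitude_sector(lon_deg: float) -> str:
    """Assign a station to one of three longitude sectors.
    Sector boundaries are hypothetical, chosen only to illustrate the idea
    of partitioning stations by longitude before comparing flare effects."""
    lon = (lon_deg + 180.0) % 360.0 - 180.0   # normalise to [-180, 180)
    if -130.0 <= lon < -30.0:
        return "American"
    if -30.0 <= lon < 60.0:
        return "African-European"
    return "Asian"
```

Each digisonde or IGS station would then be binned by its longitude before flare responses are compared sector by sector.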

  15. Training system for digital mammographic diagnoses of breast cancer

    NASA Astrophysics Data System (ADS)

    Thomaz, R. L.; Nirschl Crozara, M. G.; Patrocinio, A. C.

    2013-03-01

As technology evolves, analog mammography systems are being replaced by digital systems. The digital system uses video monitors to display mammographic images instead of the screen-film and negatoscope previously used for analog images. This change in the way mammographic images are visualized may require a different approach to training health care professionals in diagnosing breast cancer with digital mammography. Thus, this paper presents a computational approach to training health care professionals that provides a smooth transition between analog and digital technology while also training them to use the advantages of digital image processing tools to diagnose breast cancer. This computational approach consists of software in which it is possible to open, process, and diagnose a full mammogram case from a database containing the digital images of each of the mammographic views. The software communicates with a gold-standard digital mammogram case database. This database contains the digital images in Tagged Image File Format (TIFF) and the respective diagnoses according to BI-RADS™; these files are read by the software and shown to the user as needed. There are also digital image processing tools that can be used to provide better visualization of each single image. The software was built on a minimalist, user-friendly interface concept that might help in the smooth transition. It also has an interface for inputting diagnoses from the professional being trained, providing result feedback. The system has already been completed but has not yet been applied to professional training.

  16. Creating a standardized watersheds database for the Lower Rio Grande/Río Bravo, Texas

    USGS Publications Warehouse

    Brown, J.R.; Ulery, Randy L.; Parcher, Jean W.

    2000-01-01

This report describes the creation of a large-scale watershed database for the lower Rio Grande/Río Bravo Basin in Texas. The watershed database includes watersheds delineated to all 1:24,000-scale mapped stream confluences and other hydrologically significant points, selected watershed characteristics, and hydrologic derivative datasets. Computer technology allows generation of preliminary watershed boundaries in a fraction of the time needed for manual methods. This automated process reduces development time and results in quality improvements in watershed boundaries and characteristics. These data can then be compiled in a permanent database, eliminating the time-consuming step of data creation at the beginning of a project and providing a stable base dataset that can give users greater confidence when further subdividing watersheds. A standardized dataset of watershed characteristics is a valuable contribution to the understanding and management of natural resources. Vertical integration of the input datasets used to automatically generate watershed boundaries is crucial to the success of such an effort. The optimum situation would be to use the digital orthophoto quadrangles as the source of all the input datasets. While the hydrographic data from the digital line graphs can be revised to match the digital orthophoto quadrangles, hypsography data cannot be revised to match the digital orthophoto quadrangles. Revised hydrography from the digital orthophoto quadrangle should be used to create an updated digital elevation model that incorporates the stream channels as revised from the digital orthophoto quadrangle. Computer-generated, standardized watersheds that are vertically integrated with existing digital line graph hydrographic data will continue to be difficult to create until revisions can be made to existing source datasets. 
Until such time, manual editing will be necessary to make adjustments for man-made features and changes in the natural landscape that are not reflected in the digital elevation model data.
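The automated generation of preliminary watershed boundaries mentioned above typically begins with a flow-direction grid derived from the digital elevation model. A toy D8-style sketch follows (each cell drains to its lowest neighbour; real implementations weight diagonal neighbours by distance and resolve pits, and the DEM values here are hypothetical):

```python
def d8_flow_direction(dem):
    """For each cell of a small DEM grid, return the (row, col) of the
    neighbour it drains to, or the cell's own index if it is a pit/outlet.
    A toy version of the DEM-based delineation step described above."""
    rows, cols = len(dem), len(dem[0])
    out = [[(r, c) for c in range(cols)] for r in range(rows)]
    for r in range(rows):
        for c in range(cols):
            lowest = dem[r][c]
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and dem[nr][nc] < lowest:
                        lowest = dem[nr][nc]
                        out[r][c] = (nr, nc)
    return out
```

Tracing these pointers downslope groups cells into watersheds; the report's point is that the quality of such boundaries depends entirely on how well the DEM and hydrography datasets are vertically integrated.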

  17. Creating a standardized watersheds database for the lower Rio Grande/Rio Bravo, Texas

    USGS Publications Warehouse

    Brown, Julie R.; Ulery, Randy L.; Parcher, Jean W.

    2000-01-01

    This report describes the creation of a large-scale watershed database for the lower Rio Grande/Rio Bravo Basin in Texas. The watershed database includes watersheds delineated to all 1:24,000-scale mapped stream confluences and other hydrologically significant points, selected watershed characteristics, and hydrologic derivative datasets. Computer technology allows generation of preliminary watershed boundaries in a fraction of the time needed for manual methods. This automated process reduces development time and results in quality improvements in watershed boundaries and characteristics. These data can then be compiled in a permanent database, eliminating the time-consuming step of data creation at the beginning of a project and providing a stable base dataset that can give users greater confidence when further subdividing watersheds. A standardized dataset of watershed characteristics is a valuable contribution to the understanding and management of natural resources. Vertical integration of the input datasets used to automatically generate watershed boundaries is crucial to the success of such an effort. The optimum situation would be to use the digital orthophoto quadrangles as the source of all the input datasets. While the hydrographic data from the digital line graphs can be revised to match the digital orthophoto quadrangles, hypsography data cannot be revised to match the digital orthophoto quadrangles. Revised hydrography from the digital orthophoto quadrangle should be used to create an updated digital elevation model that incorporates the stream channels as revised from the digital orthophoto quadrangle. Computer-generated, standardized watersheds that are vertically integrated with existing digital line graph hydrographic data will continue to be difficult to create until revisions can be made to existing source datasets. 
Until such time, manual editing will be necessary to make adjustments for man-made features and changes in the natural landscape that are not reflected in the digital elevation model data.

  18. Public Access to Digital Material; A Call to Researchers: Digital Libraries Need Collaboration across Disciplines; Greenstone: Open-Source Digital Library Software; Retrieval Issues for the Colorado Digitization Project's Heritage Database; Report on the 5th European Conference on Digital Libraries, ECDL 2001; Report on the First Joint Conference on Digital Libraries.

    ERIC Educational Resources Information Center

    Kahle, Brewster; Prelinger, Rick; Jackson, Mary E.; Boyack, Kevin W.; Wylie, Brian N.; Davidson, George S.; Witten, Ian H.; Bainbridge, David; Boddie, Stefan J.; Garrison, William A.; Cunningham, Sally Jo; Borgman, Christine L.; Hessel, Heather

    2001-01-01

    These six articles discuss various issues relating to digital libraries. Highlights include public access to digital materials; intellectual property concerns; the need for collaboration across disciplines; Greenstone software for construction and presentation of digital information collections; the Colorado Digitization Project; and conferences…

  19. Conversion of environmental data to a digital-spatial database, Puget Sound area, Washington

    USGS Publications Warehouse

    Uhrich, M.A.; McGrath, T.S.

    1997-01-01

Data and maps from the Puget Sound Environmental Atlas, compiled for the U.S. Environmental Protection Agency, the Puget Sound Water Quality Authority, and the U.S. Army Corps of Engineers, have been converted into a digital-spatial database using a geographic information system. Environmental data for the Puget Sound area, collected from sources other than the Puget Sound Environmental Atlas by different Federal, State, and local agencies, also have been converted into this digital-spatial database. Background on the geographic-information-system planning process, the design and implementation of the geographic-information-system database, and the reasons for conversion to this digital-spatial database are included in this report. The Puget Sound Environmental Atlas data layers include information about seabird nesting areas, eelgrass and kelp habitat, marine mammal and fish areas, and shellfish resources and bed certification. Data layers from sources other than the Puget Sound Environmental Atlas include the Puget Sound shoreline, the water-body system, shellfish growing areas, recreational shellfish beaches, sewage-treatment outfalls, upland hydrography, watershed and political boundaries, and geographic names. The sources of data, descriptions of the data layers, and the steps and errors of processing associated with conversion to a digital-spatial database used in development of the Puget Sound Geographic Information System also are included in this report. The appendixes contain data dictionaries for each of the resource layers and error values for the conversion of Puget Sound Environmental Atlas data.

  20. Improving the Automatic Inversion of Digital Alouette/ISIS Ionogram Reflection Traces into Topside Electron Density Profiles

    NASA Technical Reports Server (NTRS)

    Benson, Robert F.; Truhlik, Vladimir; Huang, Xueqin; Wang, Yongli; Bilitza, Dieter

    2012-01-01

The topside sounders of the International Satellites for Ionospheric Studies (ISIS) program were designed as analog systems. The resulting ionograms were displayed on 35 mm film for analysis by visual inspection. Each of these satellites, launched between 1962 and 1971, produced data for 10 to 20 years. A number of the original telemetry tapes from this large data set have been converted directly into digital records. Software, known as the Topside Ionogram Scaler with True-Height (TOPIST) algorithm, has been produced and used for the automatic inversion of the ionogram reflection traces on more than 100,000 ISIS-2 digital topside ionograms into topside vertical electron density profiles, Ne(h). Here we present some topside ionospheric solar cycle variations deduced from the TOPIST database to illustrate the scientific benefit of improving and expanding the topside ionospheric Ne(h) database. The profile improvements will be based on improvements in the TOPIST software motivated by direct comparisons between TOPIST profiles and profiles produced by manual scaling in the early days of the ISIS program. The database expansion will be based on new software designed to overcome limitations in the original digital topside ionogram database caused by difficulties encountered during the analog-to-digital conversion process in the detection of the ionogram frame sync pulse and/or the frequency markers. This improved and expanded TOPIST topside Ne(h) database will greatly enhance investigations into both short- and long-term ionospheric changes, e.g., the observed topside ionospheric responses to magnetic storms, induced by interplanetary magnetic clouds, and solar cycle variations, respectively.

  1. Enhanced digital mapping project : final report

    DOT National Transportation Integrated Search

    2004-11-19

    The Enhanced Digital Map Project (EDMap) was a three-year effort launched in April 2001 to develop a range of digital map database enhancements that enable or improve the performance of driver assistance systems currently under development or conside...

  2. DAM-ing the Digital Flood

    ERIC Educational Resources Information Center

    Raths, David

    2008-01-01

    With the widespread digitization of art, photography, and music, plus the introduction of streaming video, many colleges and universities are realizing that they must develop or purchase systems to preserve their school's digitized objects; that they must create searchable databases so that researchers can find and share copies of digital files;…

  3. Digital Initiatives and Metadata Use in Thailand

    ERIC Educational Resources Information Center

    SuKantarat, Wichada

    2008-01-01

    Purpose: This paper aims to provide information about various digital initiatives in libraries in Thailand and especially use of Dublin Core metadata in cataloguing digitized objects in academic and government digital databases. Design/methodology/approach: The author began researching metadata use in Thailand in 2003 and 2004 while on sabbatical…

  4. 78 FR 58545 - Global Unique Device Identification Database; Draft Guidance for Industry; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ...] Global Unique Device Identification Database; Draft Guidance for Industry; Availability AGENCY: Food and... the availability of the draft guidance entitled ``Global Unique Device Identification Database (GUDID... manufacturer) will interface with the GUDID, as well as information on the database elements that must be...

  5. Benford's Law for Quality Assurance of Manner of Death Counts in Small and Large Databases.

    PubMed

    Daniels, Jeremy; Caetano, Samantha-Jo; Huyer, Dirk; Stephen, Andrew; Fernandes, John; Lytwyn, Alice; Hoppe, Fred M

    2017-09-01

To assess whether Benford's law, a mathematical law used for quality assurance in accounting, can be applied as a quality assurance measure for the manner of death determination. We examined a regional forensic pathology service's monthly manner of death counts (N = 2352) from 2011 to 2013, and provincial monthly and weekly death counts from 2009 to 2013 (N = 81,831). We tested whether each dataset's leading digits followed Benford's law via the chi-square test. For each database, we assessed whether the number 1 was the most common leading digit. The first digits of the manner of death counts followed Benford's law in all three datasets. Two of the three datasets had 1 as the most frequent leading digit. The manner of death data in this study showed qualities consistent with Benford's law. The law has potential as a quality assurance metric in the manner of death determination for both small and large databases. © 2017 American Academy of Forensic Sciences.
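The test described above, comparing observed leading-digit frequencies against the Benford expectation P(d) = log10(1 + 1/d) via chi-square, can be sketched as follows; the monthly counts below are hypothetical values for illustration, not the study's data:

```python
import math
from collections import Counter

def benford_chi_square(counts):
    """Chi-square statistic comparing the leading-digit frequencies of a
    list of positive counts against Benford's law, P(d) = log10(1 + 1/d)."""
    digits = [int(str(n)[0]) for n in counts if n > 0]
    n = len(digits)
    observed = Counter(digits)
    chi2 = 0.0
    for d in range(1, 10):
        expected = n * math.log10(1 + 1 / d)  # Benford expected count for digit d
        chi2 += (observed.get(d, 0) - expected) ** 2 / expected
    return chi2, observed

# Hypothetical monthly death counts (illustrative only)
monthly_counts = [112, 130, 145, 19, 24, 178, 160, 11, 13, 21, 35, 150]
chi2, observed = benford_chi_square(monthly_counts)
most_common_leading = observed.most_common(1)[0][0]
```

With 8 degrees of freedom (9 digits minus 1), the statistic would be compared against the chi-square critical value (about 15.5 at the 0.05 level) to decide whether the counts deviate from Benford's law; the study's second check, that 1 is the most frequent leading digit, is the `most_common_leading` value.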

  6. Surfacing the deep data of taxonomy

    PubMed Central

    Page, Roderic D. M.

    2016-01-01

    Abstract Taxonomic databases are perpetuating approaches to citing literature that may have been appropriate before the Internet, often being little more than digitised 5 × 3 index cards. Typically the original taxonomic literature is either not cited, or is represented in the form of a (typically abbreviated) text string. Hence much of the “deep data” of taxonomy, such as the original descriptions, revisions, and nomenclatural actions are largely hidden from all but the most resourceful users. At the same time there are burgeoning efforts to digitise the scientific literature, and much of this newly available content has been assigned globally unique identifiers such as Digital Object Identifiers (DOIs), which are also the identifier of choice for most modern publications. This represents an opportunity for taxonomic databases to engage with digitisation efforts. Mapping the taxonomic literature on to globally unique identifiers can be time consuming, but need be done only once. Furthermore, if we reuse existing identifiers, rather than mint our own, we can start to build the links between the diverse data that are needed to support the kinds of inference which biodiversity informatics aspires to support. Until this practice becomes widespread, the taxonomic literature will remain balkanized, and much of the knowledge that it contains will linger in obscurity. PMID:26877663

  7. Databases for the Global Dynamics of Multiparameter Nonlinear Systems

    DTIC Science & Technology

    2014-03-05

AFRL-OSR-VA-TR-2014-0078. DATABASES FOR THE GLOBAL DYNAMICS OF MULTIPARAMETER NONLINEAR SYSTEMS. Konstantin Mischaikow, Rutgers, The State University of New Jersey, ASB III, Rutgers Plaza, New Brunswick, NJ 08807. ...dynamical systems. We refer to the output as a Database for Global Dynamics since it allows the user to query for information about the existence and

  8. Digital asset management.

    PubMed

    Humphrey, Clinton D; Tollefson, Travis T; Kriet, J David

    2010-05-01

    Facial plastic surgeons are accumulating massive digital image databases with the evolution of photodocumentation and widespread adoption of digital photography. Managing and maximizing the utility of these vast data repositories, or digital asset management (DAM), is a persistent challenge. Developing a DAM workflow that incorporates a file naming algorithm and metadata assignment will increase the utility of a surgeon's digital images. Copyright 2010 Elsevier Inc. All rights reserved.
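A file-naming algorithm of the kind the authors recommend could be sketched as below. The specific scheme (patient ID, session date, view, sequence number) is a hypothetical example, not the one proposed in the article:

```python
from datetime import date

def dam_filename(patient_id: str, session_date: date, view: str, seq: int) -> str:
    """Build a sortable, self-describing image filename:
    ID_YYYYMMDD_view_seq.jpg (hypothetical naming convention)."""
    return f"{patient_id}_{session_date:%Y%m%d}_{view.lower()}_{seq:03d}.jpg"

def dam_metadata(filename: str) -> dict:
    """Recover searchable metadata fields from the structured name, so the
    naming algorithm doubles as a minimal metadata-assignment scheme."""
    stem = filename.rsplit(".", 1)[0]
    pid, ymd, view, seq = stem.split("_")
    return {"patient_id": pid, "date": ymd, "view": view, "sequence": int(seq)}
```

Because the date is zero-padded and the sequence fixed-width, plain lexicographic sorting of filenames yields chronological order, which is one practical payoff of a consistent naming algorithm.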

  9. Digital Citizenship within Global Contexts

    ERIC Educational Resources Information Center

    Searson, Michael; Hancock, Marsali; Soheil, Nusrat; Shepherd, Gregory

    2015-01-01

    EduSummIT 2013 featured a working group that examined digital citizenship within a global context. Group members recognized that, given today's international, regional, political, and social dynamics, the notion of "global" might be more aspirational than practical. The development of informed policies and practices serving and involving…

  10. Digital Education Governance: Data Visualization, Predictive Analytics, and "Real-Time" Policy Instruments

    ERIC Educational Resources Information Center

    Williamson, Ben

    2016-01-01

    Educational institutions and governing practices are increasingly augmented with digital database technologies that function as new kinds of policy instruments. This article surveys and maps the landscape of digital policy instrumentation in education and provides two detailed case studies of new digital data systems. The Learning Curve is a…

  11. 21 CFR 830.350 - Correction of information submitted to the Global Unique Device Identification Database.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Unique Device Identification Database. 830.350 Section 830.350 Food and Drugs FOOD AND DRUG... Global Unique Device Identification Database § 830.350 Correction of information submitted to the Global Unique Device Identification Database. (a) If FDA becomes aware that any information submitted to the...

  12. Uniform resolution of compact identifiers for biomedical data

    PubMed Central

    Wimalaratne, Sarala M.; Juty, Nick; Kunze, John; Janée, Greg; McMurry, Julie A.; Beard, Niall; Jimenez, Rafael; Grethe, Jeffrey S.; Hermjakob, Henning; Martone, Maryann E.; Clark, Tim

    2018-01-01

Most biomedical data repositories issue locally unique accession numbers, but do not provide globally unique, machine-resolvable, persistent identifiers for their datasets, as required by publishers wishing to implement data citation in accordance with widely accepted principles. Local accessions may, however, be prefixed with a namespace identifier, providing global uniqueness. Such “compact identifiers” have been widely used in biomedical informatics to support global resource identification with local identifier assignment. We report here on our project to provide robust support for machine-resolvable, persistent compact identifiers in biomedical data citation, by harmonizing the Identifiers.org and N2T.net (Name-To-Thing) meta-resolvers and extending their capabilities. Identifiers.org services hosted at the European Molecular Biology Laboratory - European Bioinformatics Institute (EMBL-EBI), and N2T.net services hosted at the California Digital Library (CDL), can now resolve any given identifier from over 600 source databases to its original source on the Web, using a common registry of prefix-based redirection rules. We believe these services will be of significant help to publishers and others implementing persistent, machine-resolvable citation of research data. PMID:29737976
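Prefix-based redirection of compact identifiers, as described above, amounts to a registry lookup followed by URL construction. The registry entries and URL templates below are illustrative stand-ins; the real shared registry is maintained by the Identifiers.org and N2T.net meta-resolvers:

```python
# Hypothetical local prefix registry: prefix -> URL template
# (illustrative entries only, not the actual Identifiers.org/N2T registry)
REGISTRY = {
    "pdb": "https://www.rcsb.org/structure/{accession}",
    "taxon": "https://www.ncbi.nlm.nih.gov/Taxonomy/Browser/wwwtax.cgi?id={accession}",
}

def resolve(compact_id: str) -> str:
    """Split a 'prefix:accession' compact identifier and apply the
    registry's prefix-based redirection rule to build the target URL."""
    prefix, sep, accession = compact_id.partition(":")
    if not sep or prefix.lower() not in REGISTRY:
        raise ValueError(f"unknown or malformed compact identifier: {compact_id!r}")
    return REGISTRY[prefix.lower()].format(accession=accession)
```

The point of harmonizing the two meta-resolvers is that both apply one common registry of such rules, so `pdb:2gc4` resolves to the same source record whichever resolver a citation goes through.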

  13. Not Just Another Research Paper: Understanding Global Sustainability through Digital Documentary

    ERIC Educational Resources Information Center

    Green, Martha R.; Walters, Lynne Masel; Walters, Timothy; Wang, Liangyan

    2015-01-01

    This article evaluates the impact of extending a traditional written research paper into a digital documentary on students' perception and level of comprehension of a global sustainability issue. An adaptation of Moon's (1999) five-stage map of learning was used to assess the written and digital projects students submitted to a statewide…

  14. Modeling the Historical Flood Events in France

    NASA Astrophysics Data System (ADS)

    Ali, Hani; Blaquière, Simon

    2017-04-01

We will present simulation results for different scenarios based on the flood model developed by the AXA Global P&C CAT Modeling team. The model uses a Digital Elevation Model (DEM) with 75 m resolution, a hydrographic system (DB Carthage), daily rainfall data from Météo France, water levels from "HYDRO Banque", the French hydrological database (www.hydro.eaufrance.fr), for more than 1500 stations, a hydrological model from IRSTEA, and an in-house hydraulic tool. In particular, the model re-simulates the most important and costly flood events that occurred during the past decade in France: we will present the re-simulated meteorological conditions since 1964 and estimate the insurance loss incurred on the current AXA portfolio of individual risks.

  15. Dragon pulse information management system (DPIMS): A unique model-based approach to implementing domain agnostic system of systems and behaviors

    NASA Astrophysics Data System (ADS)

    Anderson, Thomas S.

    2016-05-01

    The Global Information Network Architecture is an information technology based on Vector Relational Data Modeling, a unique computational paradigm, certified on the DoD network by the US Army as the Dragon Pulse Information Management System. It is a network-available environment for modeling models: models are configured using domain-relevant semantics, use network-available systems, sensors, databases, and services as loosely coupled component objects, and are executable applications. Solutions are based on mission tactics, techniques, and procedures, and on subject-matter input. Three recent Army use cases are discussed: a) an ISR system of systems; b) modeling-and-simulation behavior validation; c) a networked digital library with behaviors.

  16. Geologic map of the eastern part of the Challis National Forest and vicinity, Idaho

    USGS Publications Warehouse

    Wilson, A.B.; Skipp, B.A.

    1994-01-01

    The paper version of the Geologic Map of the eastern part of the Challis National Forest and vicinity, Idaho, was compiled by Anna Wilson and Betty Skipp in 1994. The geology was compiled on a 1:250,000-scale topographic base map. TechniGraphic System, Inc., of Fort Collins, Colorado, digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a GIS database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.

  17. Creating of Central Geospatial Database of the Slovak Republic and Procedures of its Revision

    NASA Astrophysics Data System (ADS)

    Miškolci, M.; Šafář, V.; Šrámková, R.

    2016-06-01

    The article describes the creation of an initial three-dimensional geodatabase, from planning and design through the definition of technological and manufacturing processes to the practical use of the Central Geospatial Database (CGD; the official name in Slovak is Centrálna Priestorová Databáza, CPD), and briefly describes the procedures for its revision. CGD ensures the proper collection, processing, storage, transfer, and display of digital geospatial information. CGD is used by the Ministry of Defense (MoD) for defense and crisis-management tasks and by the Integrated Rescue System. For military personnel, CGD runs on the MoD intranet; for users outside the MoD it is transformed into ZbGIS (the Primary Geodatabase of the Slovak Republic) and runs on a public web site. CGD is a global set of geospatial information: a vector computer model that completely covers the entire territory of Slovakia. The seamless CGD is created by digitizing the real world using photogrammetric stereoscopic methods and measurements of object properties. The basic vector model of CGD (from photogrammetric processing) is then taken into the field for inspection and additional gathering of object properties across the whole mapping area. Finally, real-world objects are spatially modeled as entities of a three-dimensional database. CGD makes it possible to know the territory in all three spatial dimensions. Every entity in CGD records its time of collection, which allows users to assess the timeliness of the information. CGD can be used for geographical analysis, geo-referencing, cartographic purposes, and various special-purpose mapping, and has the ambition to cover the needs not only of the MoD but to become a reference model for the national geographical infrastructure.

  18. A Medical Image Backup Architecture Based on a NoSQL Database and Cloud Computing Services.

    PubMed

    Santos Simões de Almeida, Luan Henrique; Costa Oliveira, Marcelo

    2015-01-01

    The use of digital systems for storing medical images generates a huge volume of data. Digital images are commonly stored and managed on a Picture Archiving and Communication System (PACS), under the DICOM standard. However, PACS is limited because it is strongly dependent on the server's physical storage space. Alternatively, cloud computing arises as an extensive, low-cost, and reconfigurable resource. However, medical images contain patient information that cannot be made available in a public cloud. Therefore, a mechanism to anonymize these images is needed. This poster presents a solution for this issue by taking digital images from the PACS, converting the information contained in each image file into a NoSQL database, and using cloud computing to store the digital images.
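    The anonymize-and-convert step might look like the following sketch, where a plain dictionary stands in for both the DICOM header and the NoSQL document. The field names are illustrative assumptions, and real de-identification must follow the DICOM de-identification profiles rather than this toy hash:

```python
import hashlib

def anonymize(record: dict) -> dict:
    """Strip direct patient identifiers, replacing them with a one-way
    hash so related studies can still be grouped in the cloud store."""
    doc = dict(record)
    pid = doc.pop("patient_name", "") + doc.pop("patient_id", "")
    doc["subject_hash"] = hashlib.sha256(pid.encode()).hexdigest()[:16]
    return doc

# A DICOM-like metadata record before upload to the NoSQL store:
record = {"patient_name": "DOE^JANE", "patient_id": "12345",
          "modality": "CT", "study_date": "20150101"}
document = anonymize(record)
```

    The resulting document keeps clinically useful fields while carrying no direct identifiers into the public cloud.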

  19. A digital version of the 1970 U.S. Geological Survey topographic map of the San Francisco Bay region, three sheets, 1:125,000

    USGS Publications Warehouse

    Aitken, Douglas S.

    1997-01-01

    This Open-File report is a digital topographic map database. It contains a digital version of the 1970 U.S. Geological Survey topographic map of the San Francisco Bay Region (3 sheets), at a scale of 1:125,000. These ARC/INFO coverages are in vector format. The vectorization process has distorted characters representing letters and numbers, as well as some road and other symbols, making them difficult to read in some instances. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. The content and character of the database and methods of obtaining it are described herein.

  20. Municipal GIS incorporates database from pipe lines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-05-01

    League City, a coastal-area community of about 35,000 population in Galveston County, Texas, has developed an impressive municipal GIS program. The system represents a textbook example of what a municipal GIS can represent and produce. In 1987, the city engineer was authorized to begin developing the area information system. City survey personnel used state-of-the-art Global Positioning System (GPS) technology to establish a first-order monumentation program with a grid of 78 monuments set over 54 sq mi. Street, subdivision, survey, utilities, taxing criteria, hydrology, topography, environmental, and other concerns were layered into the municipal GIS database program. Today, area developers submit all layout, design, and land-use plan data to the city in digital format without hard copy. Multi-color maps with high-resolution graphics can be quickly generated for cross-referenced queries sensitive to political, environmental, engineering, taxing, and/or utility-capacity jurisdictions. The design of both the GIS and the database system is described.

  1. Toward Soil Spatial Information Systems (SSIS) for global modeling and ecosystem management

    NASA Technical Reports Server (NTRS)

    Baumgardner, Marion F.

    1995-01-01

    The general objective is to conduct research contributing toward the realization of a world soils and terrain (SOTER) database, which can stand alone or be incorporated into a more complete and comprehensive natural-resources digital information system. The work is focused on the following specific objectives: (1) to conduct research related to (a) the translation and correlation of different soil classification systems to the SOTER database legend and (b) the interfacing of disparate data sets in support of the SOTER Project; (2) to examine the potential use of AVHRR (Advanced Very High Resolution Radiometer) data for delineating meaningful soils and terrain boundaries for small-scale soil survey (range of scale: 1:250,000 to 1:1,000,000) and terrestrial ecosystem assessment and monitoring; and (3) to determine the potential use of high-dimensional spectral data (220 reflectance bands with 10 m spatial resolution) for delineating meaningful soil boundaries and conditions for the purpose of detailed soil survey and land management.

  2. USGS national surveys and analysis projects: Preliminary compilation of integrated geological datasets for the United States

    USGS Publications Warehouse

    Nicholson, Suzanne W.; Stoeser, Douglas B.; Wilson, Frederic H.; Dicken, Connie L.; Ludington, Steve

    2007-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional and national digital geologic maps attributed with age and rock-type information. Such spatial data can be conveniently used to generate derivative maps for purposes that include mineral-resource assessment, metallogenic studies, tectonic studies, and human-health and environmental research. In 1997, the United States Geological Survey’s Mineral Resources Program initiated an effort to develop national digital databases for use in mineral-resource and environmental assessments. One primary activity of this effort was to compile a national digital geologic map database, utilizing state geologic maps, to support mineral-resource studies at scales from 1:250,000 to 1:1,000,000. Over the course of the past decade, state databases were prepared using a common standard for the database structure, fields, attributes, and data dictionaries. As of late 2006, standardized geological map databases for all conterminous (CONUS) states have been available on-line as USGS Open-File Reports. For Alaska and Hawaii, new state maps are being prepared, and the preliminary work for Alaska is being released as a series of 1:500,000-scale regional compilations.

  3. GIS-project: geodynamic globe for global monitoring of geological processes

    NASA Astrophysics Data System (ADS)

    Ryakhovsky, V.; Rundquist, D.; Gatinsky, Yu.; Chesalova, E.

    2003-04-01

    A multilayer geodynamic globe at the scale of 1:10,000,000 was created at the end of the 1990s in the GIS Center of the Vernadsky Museum. A special software-and-hardware complex, with a set of multi-target, object-oriented databases, was developed for its visualization. The globe comprises separate thematic covers represented by digital sets of spatial geological, geochemical, and geophysical information (maps, schemes, profiles, stratigraphic columns, structured databases, etc.). At present, the largest databases in the globe program contain petrochemical and isotopic data on magmatic rocks of the World Ocean and on large and superlarge mineral deposits. Software from the Environmental Systems Research Institute (ESRI), USA (ARC/INFO 7.0, 8.0), as well as the ArcScan vectorizer, was used for digitizing the covers and adapting the databases. All layers of the geoinformational project were obtained by scanning separate objects and transferring them to real geographic coordinates in an equidistant conic projection. The covers were then projected onto plane degree-system geographic coordinates. Attributive databases were formed for each thematic layer, and in the last stage all covers were combined into a single information system. Separate digital covers represent mathematical descriptions of geological objects and the relations between them, such as the Earth's altimetry, active fault systems, seismicity, etc. Principles of cartographic generalization were taken into consideration during cover compilation, with projection and coordinate systems precisely matched to the given scale. The globe allows us to build, in an interactive regime, mutually coordinated object-oriented databases and the thematic covers directly connected with them. These can cover the whole Earth and near-Earth space, as well as the best-known parts of the divergent and convergent boundaries of the lithospheric plates.
Such covers and time series reflect in diagrammatic form the total combination and dynamics of data on geological structure, geophysical fields, seismicity, geomagnetism, the composition of rock complexes, and the metallogeny of different areas of the Earth's surface. They allow scaling, detailing, and 3D spatial visualization. The information filling the covers can be replenished with new data, both in existing and in newly formed databases. Integrated analysis of the data allows us to refine our ideas on regularities in the development of lithosphere and mantle inhomogeneities using original technologies. It also enables us to work out 3D digital models of the geodynamic development of tectonic zones at convergent and divergent plate boundaries, with the aims of integrated monitoring of mineral resources and of establishing correlations between seismicity, magmatic activity, and metallogeny in space-time coordinates. The multifold geoinformation system created gives us the chance to carry out integral analyses of geoinformation flows in an interactive regime and, in particular, to establish regularities in the space-time distribution and dynamics of the main structural units of the lithosphere, as well as to illuminate the connection between stages of their development and the epochs of large and superlarge mineral deposit formation. We are now trying to use the system for the prediction of large oil and gas concentrations in the main sedimentary basins. The work was supported by RFBR (grants 93-07-14680, 96-07-89499, 99-07-90030, 00-15-98535, 02-07-90140) and MTC.

  4. Towards a Global Names Architecture: The future of indexing scientific names

    PubMed Central

    Pyle, Richard L.

    2016-01-01

    Abstract For more than 250 years, the taxonomic enterprise has remained almost unchanged. Certainly, the tools of the trade have improved: months-long journeys aboard sailing ships have been reduced to hours aboard jet airplanes; advanced technology allows humans to access environments that were once utterly inaccessible; GPS has replaced crude maps; digital high-resolution imagery provides far more accurate renderings of organisms than even the best commissioned artists of a century ago; and primitive candle-lit microscopes have been replaced by an array of technologies ranging from scanning electron microscopy to DNA sequencing. But the basic paradigm remains the same. Perhaps the most revolutionary change of all – which we are still in the midst of, and which has not yet been fully realized – is the means by which taxonomists manage and communicate the information of their trade. The rapid evolution in recent decades of computer database management software, and of information dissemination via the Internet, have both dramatically improved the potential for streamlining the entire taxonomic process. Unfortunately, the potential still largely exceeds the reality. The vast majority of taxonomic information is either not yet digitized, or digitized in a form that does not allow direct and easy access. Moreover, the information that is easily accessed in digital form is not yet seamlessly interconnected. In an effort to bring reality closer to potential, a loose affiliation of major taxonomic resources, including GBIF, the Encyclopedia of Life, NBII, Catalog of Life, ITIS, IPNI, ICZN, Index Fungorum, and many others have been crafting a “Global Names Architecture” (GNA). The intention of the GNA is not to replace any of the existing taxonomic data initiatives, but rather to serve as a dynamic index to interconnect them in a way that streamlines the entire taxonomic enterprise: from gathering specimens in the field, to publication of new taxa and related data.
PMID:26877664

  5. Towards a Global Names Architecture: The future of indexing scientific names.

    PubMed

    Pyle, Richard L

    2016-01-01

    For more than 250 years, the taxonomic enterprise has remained almost unchanged. Certainly, the tools of the trade have improved: months-long journeys aboard sailing ships have been reduced to hours aboard jet airplanes; advanced technology allows humans to access environments that were once utterly inaccessible; GPS has replaced crude maps; digital high-resolution imagery provides far more accurate renderings of organisms than even the best commissioned artists of a century ago; and primitive candle-lit microscopes have been replaced by an array of technologies ranging from scanning electron microscopy to DNA sequencing. But the basic paradigm remains the same. Perhaps the most revolutionary change of all - which we are still in the midst of, and which has not yet been fully realized - is the means by which taxonomists manage and communicate the information of their trade. The rapid evolution in recent decades of computer database management software, and of information dissemination via the Internet, have both dramatically improved the potential for streamlining the entire taxonomic process. Unfortunately, the potential still largely exceeds the reality. The vast majority of taxonomic information is either not yet digitized, or digitized in a form that does not allow direct and easy access. Moreover, the information that is easily accessed in digital form is not yet seamlessly interconnected. In an effort to bring reality closer to potential, a loose affiliation of major taxonomic resources, including GBIF, the Encyclopedia of Life, NBII, Catalog of Life, ITIS, IPNI, ICZN, Index Fungorum, and many others have been crafting a "Global Names Architecture" (GNA). The intention of the GNA is not to replace any of the existing taxonomic data initiatives, but rather to serve as a dynamic index to interconnect them in a way that streamlines the entire taxonomic enterprise: from gathering specimens in the field, to publication of new taxa and related data.

  6. Map and data for Quaternary faults and folds in New Mexico

    USGS Publications Warehouse

    Machette, M.N.; Personius, S.F.; Kelson, K.I.; Haller, K.M.; Dart, R.L.

    1998-01-01

    The "World Map of Major Active Faults" Task Group is compiling a series of digital maps for the United States and other countries in the Western Hemisphere that show the locations, ages, and activity rates of major earthquake-related features such as faults and fault-related folds; the companion database includes published information on these seismogenic features. The Western Hemisphere effort is sponsored by International Lithosphere Program (ILP) Task Group H-2, whereas the effort to compile a new map and database for the United States is funded by the Earthquake Reduction Program (ERP) through the U.S. Geological Survey. The maps and accompanying databases represent a key contribution to the new Global Seismic Hazards Assessment Program (ILP Task Group II-O) for the International Decade for Natural Disaster Reduction. This compilation, which describes evidence for surface faulting and folding in New Mexico, is the third of many similar State and regional compilations that are planned for the U.S. The compilation for West Texas is available as U.S. Geological Survey Open-File Report 96-002 (Collins and others, 1996) and the compilation for Montana will be released as a Montana Bureau of Mines product (Haller and others, in press).

  7. Technical and Organizational Considerations for the Long-Term Maintenance and Development of Digital Brain Atlases and Web-Based Databases

    PubMed Central

    Ito, Kei

    2010-01-01

    A digital brain atlas is a kind of image database that specifically provides information about neurons and glial cells in the brain. It has various advantages that are unmatched by conventional paper-based atlases. Such advantages, however, may become disadvantages if appropriate care is not taken. Because digital atlases can provide an unlimited amount of data, they should be designed to minimize redundancy and to keep the records consistent, even when records are added incrementally by different staff members. The fact that digital atlases can easily be revised necessitates a system to assure that users can access the previous versions that might have been cited in papers at a particular period. To pass our knowledge on to our descendants, such databases should be maintained for a very long period, well over 100 years, like printed books and papers. Technical and organizational measures to enable long-term archiving should be considered seriously. Compared to the initial development of the database, subsequent efforts to increase the quality and quantity of its contents are not regarded highly, because such tasks do not materialize in the form of publications. This fact strongly discourages continuous expansion of, and external contributions to, digital atlases after their initial launch. To solve these problems, the role of biocurators is vital. Appreciation of the scientific achievements of people who do not write papers, and establishment of a secure academic career path for them, are indispensable for recruiting talent for this very important job. PMID:20661458

  8. Storage and distribution of pathology digital images using integrated web-based viewing systems.

    PubMed

    Marchevsky, Alberto M; Dulbandzhyan, Ronda; Seely, Kevin; Carey, Steve; Duncan, Raymond G

    2002-05-01

    Health care providers have expressed increasing interest in incorporating digital images of gross pathology specimens and photomicrographs in routine pathology reports. We describe the multiple technical and logistical challenges involved in integrating the various components needed to develop a system for Web-based viewing, storage, and distribution of digital images in a large health system. An Oracle version 8.1.6 database was developed to store, index, and deploy pathology digital photographs via our Intranet. The database allows for retrieval of images by patient demographics or by SNOMED code information. The Intranet of the health system is accessible from multiple computers located within the medical center and at distant private physician offices. The images can be viewed from any workstation of the health system that has authorized access to our Intranet, using a standard browser or a browser configured with an external viewer or inexpensive plug-in software, such as Prizm 2.0. The images can be printed on paper or transferred to film using a digital film recorder. Digital images can also be displayed at pathology conferences by using wireless local area network (LAN) and secure remote technologies. The standardization of technologies and the adoption of a Web interface for all our computer systems allow us to distribute digital images from a pathology database to a potentially large group of users distributed in multiple locations throughout a large medical center.
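    The two retrieval paths the abstract mentions (patient demographics and SNOMED codes) amount to indexed queries over an image-metadata table. The sketch below uses sqlite3 as a stand-in for the Oracle 8.1.6 database; the schema and column names are invented for illustration:

```python
import sqlite3

# Hypothetical image-metadata table; column names are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE images (
    accession TEXT, patient_name TEXT, snomed_code TEXT, path TEXT)""")
conn.execute("INSERT INTO images VALUES (?, ?, ?, ?)",
             ("S02-1001", "DOE, JANE", "T-28000", "/images/s02-1001.jpg"))

# Retrieval by SNOMED code ...
by_code = conn.execute("SELECT path FROM images WHERE snomed_code = ?",
                       ("T-28000",)).fetchall()
# ... or by patient demographics.
by_name = conn.execute("SELECT path FROM images WHERE patient_name LIKE ?",
                       ("DOE%",)).fetchall()
```

    The returned paths would then be handed to the Web front end for viewing, printing, or film recording.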

  9. ASTER Global Digital Elevation Model GDEM

    NASA Image and Video Library

    2009-06-29

    NASA and Japan's Ministry of Economy, Trade and Industry (METI) released the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Model (GDEM) to the worldwide public on June 29, 2009.

  10. Publications - DDS 3 | Alaska Division of Geological & Geophysical Surveys

    Science.gov Websites

    Alaska Division of Geological & Geophysical Surveys Digital Data Series 3, http://doi.org/10.14509/qff. Combellick, R.A., 2012, Quaternary faults and folds in Alaska: A digital database, 31 p., 1 sheet, map of Alaska (Plafker and others, 1994), 1 p. Digital Geospatial Data (QFF).

  11. A Dynamic Approach to Make CDS/ISIS Databases Interoperable over the Internet Using the OAI Protocol

    ERIC Educational Resources Information Center

    Jayakanth, F.; Maly, K.; Zubair, M.; Aswath, L.

    2006-01-01

    Purpose: A dynamic approach to making legacy databases, like CDS/ISIS, interoperable with OAI-compliant digital libraries (DLs). Design/methodology/approach: There are many bibliographic databases that are being maintained using legacy database systems. CDS/ISIS is one such legacy database system. It was designed and developed specifically for…
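    OAI-PMH itself is a small HTTP protocol: a harvester issues requests such as ListRecords against a repository's base URL. A sketch of building such a request follows (the base URL is a placeholder, not a real CDS/ISIS gateway):

```python
from urllib.parse import urlencode

def oai_request(base_url: str, verb: str, **params) -> str:
    """Build an OAI-PMH request URL; standard verbs include Identify,
    ListMetadataFormats, ListRecords, and GetRecord."""
    return base_url + "?" + urlencode({"verb": verb, **params})

url = oai_request("http://example.org/oai", "ListRecords",
                  metadataPrefix="oai_dc")
```

    A dynamic gateway in front of a legacy database would answer such requests by translating them into queries on the legacy records and wrapping the results in OAI-PMH XML responses.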

  12. Current issues with standards in the measurement and documentation of human skeletal anatomy.

    PubMed

    Magee, Justin; McClelland, Brian; Winder, John

    2012-09-01

    Digital modeling of human anatomy has become increasingly important and relies on well-documented quantitative anatomy literature. This type of documentation is common for the spine and pelvis; however, significant issues exist due to the lack of standardization in measurement and technique. Existing literature on quantitative anatomy for the spine and pelvis of white adults (aged 18-65 years, separated into decadal categories) was reviewed from the disciplines of anatomy, manipulative therapy, anthropometrics, occupational ergonomics, biomechanics and forensic science. The data were unified into a single normative model of the sub-axial spine. Two-dimensional orthographic drawings were produced from the 590 individual measurements identified, which informed the development of a 3D digital model. A similar review of full range of motion data was conducted as a meta-analysis and the results were applied to the existing model, providing an inter-connected, articulated digital spine. During these data analysis processes several inconsistencies were observed accompanied by an evidential lack of standardization with measurement and recording of data. These have been categorized as: anatomical terminology; scaling of measurements; measurement methodology, dimension and anatomical reference positions; global coordinate systems. There is inconsistency in anatomical terminology where independent researchers use the same terms to describe different aspects of anatomy or different terms for the same anatomy. Published standards exist for measurement methods of the human body regarding spatial interaction, anthropometric databases, automotive applications, clothing industries and for computer manikins, but none exists for skeletal anatomy. Presentation of measurements often lacks formal structure in clinical publications, seldom providing geometric reference points, therefore making digital reconstruction difficult. 
Published quantitative data does not follow existing international published standards relating to engineering drawing and visual communication. Large variations are also evident in standards or guidelines used for global coordinate systems across biomechanics, ergonomics, software systems and 3D software applications. This paper identifies where established good practice exists and suggests additional recommendations, informing an improved communication protocol, to assist reconstruction of skeletal anatomy using 3D digital modeling. © 2012 The Authors. Journal of Anatomy © 2012 Anatomical Society.

  13. Current issues with standards in the measurement and documentation of human skeletal anatomy

    PubMed Central

    Magee, Justin; McClelland, Brian; Winder, John

    2012-01-01

    Digital modeling of human anatomy has become increasingly important and relies on well-documented quantitative anatomy literature. This type of documentation is common for the spine and pelvis; however, significant issues exist due to the lack of standardization in measurement and technique. Existing literature on quantitative anatomy for the spine and pelvis of white adults (aged 18–65 years, separated into decadal categories) was reviewed from the disciplines of anatomy, manipulative therapy, anthropometrics, occupational ergonomics, biomechanics and forensic science. The data were unified into a single normative model of the sub-axial spine. Two-dimensional orthographic drawings were produced from the 590 individual measurements identified, which informed the development of a 3D digital model. A similar review of full range of motion data was conducted as a meta-analysis and the results were applied to the existing model, providing an inter-connected, articulated digital spine. During these data analysis processes several inconsistencies were observed accompanied by an evidential lack of standardization with measurement and recording of data. These have been categorized as: anatomical terminology; scaling of measurements; measurement methodology, dimension and anatomical reference positions; global coordinate systems. There is inconsistency in anatomical terminology where independent researchers use the same terms to describe different aspects of anatomy or different terms for the same anatomy. Published standards exist for measurement methods of the human body regarding spatial interaction, anthropometric databases, automotive applications, clothing industries and for computer manikins, but none exists for skeletal anatomy. Presentation of measurements often lacks formal structure in clinical publications, seldom providing geometric reference points, therefore making digital reconstruction difficult. 
Published quantitative data does not follow existing international published standards relating to engineering drawing and visual communication. Large variations are also evident in standards or guidelines used for global coordinate systems across biomechanics, ergonomics, software systems and 3D software applications. This paper identifies where established good practice exists and suggests additional recommendations, informing an improved communication protocol, to assist reconstruction of skeletal anatomy using 3D digital modeling. PMID:22747678

  14. Digitized Database of Old Seismograms Recorded in Romania

    NASA Astrophysics Data System (ADS)

    Paulescu, Daniel; Rogozea, Maria; Popa, Mihaela; Radulian, Mircea

    2016-08-01

    The aim of this paper is to describe a management system for a unique Romanian database of historical seismograms and complementary documentation (metadata), together with its dissemination and analysis procedures. For this study, 5188 historical seismograms recorded between 1903 and 1957 by the Romanian seismological observatories (Bucharest-Filaret, Focşani, Bacău, Vrincioaia, Câmpulung-Muscel, Iaşi) were used. In order to recover the historical instrumental data, the analog seismograms are converted to digital images and digital waveforms (digitization/vectorization). First, we applied a careful scanning procedure to the seismograms and related material (seismic bulletins, station books, etc.). In a subsequent step, the high-resolution scanned seismograms will be processed to obtain the digital/numeric waveforms. We used a Colortrac SmartLF Cx40 scanner, which provides images in TIFF or JPG format. For digitization, the Teseo2 algorithm, developed by the National Institute of Geophysics and Volcanology in Rome (Italy) within the framework of the SISMOS Project, will be used.

  15. Institutionalizing human-computer interaction for global health

    PubMed Central

    Gulliksen, Jan

    2017-01-01

    ABSTRACT Digitalization is the societal change process in which new ICT-based solutions bring forward completely new ways of doing things, new businesses, and new movements in society. Digitalization also provides completely new ways of addressing issues related to global health. This paper provides an overview of the field of human-computer interaction (HCI) and the ways the field has contributed to international development in different regions of the world. Additionally, it outlines the United Nations’ new sustainability goals from December 2015 and what these could contribute to the development of global health and its relationship to digitalization. Finally, it argues why and how HCI could be adopted and adapted to fit contextual needs, the need for localization, and the development of new digital innovations. The research methodology is mostly qualitative, following an action research paradigm in which the actual change process that digitalization evokes is as important as the scientific conclusions that can be drawn. In conclusion, the paper argues that digitalization is fundamentally changing society through the development and use of digital technologies and may have a profound effect on the digital development of every country in the world. But it needs to be developed based on local practices, it needs international support, and it must not be limited by technological constraints. Digitalization to support global health in particular requires a profound understanding of the users and their context, arguing for user-centred systems design methodologies as particularly suitable. PMID:28838309

  16. Geometric processing of digital images of the planets

    NASA Technical Reports Server (NTRS)

    Edwards, Kathleen

    1987-01-01

    New procedures and software have been developed for geometric transformation of images to support digital cartography of the planets. The procedures involve the correction of spacecraft camera orientation of each image with the use of ground control and the transformation of each image to a Sinusoidal Equal-Area map projection with an algorithm which allows the number of transformation calculations to vary as the distortion varies within the image. When the distortion is low in an area of an image, few transformation computations are required, and most pixels can be interpolated. When distortion is extreme, the location of each pixel is computed. Mosaics are made of these images and stored as digital databases. Completed Sinusoidal databases may be used for digital analysis and registration with other spatial data. They may also be reproduced as published image maps by digitally transforming them to appropriate map projections.
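    The Sinusoidal Equal-Area target projection mentioned above follows standard forward formulas, x = R(λ − λ₀)cos φ and y = Rφ; a minimal sketch of that mapping (not the USGS transformation software itself):

    ```python
    import math

    def sinusoidal(lon_deg, lat_deg, lon0_deg=0.0, radius=1.0):
        """Forward Sinusoidal (equal-area) projection:
        x = R * (lon - lon0) * cos(lat), y = R * lat, angles in radians."""
        lon, lat, lon0 = map(math.radians, (lon_deg, lat_deg, lon0_deg))
        return radius * (lon - lon0) * math.cos(lat), radius * lat

    x, y = sinusoidal(90.0, 60.0)
    print(round(x, 4), round(y, 4))  # → 0.7854 1.0472
    ```

    The adaptive scheme described in the abstract would apply this exact computation only where distortion is high, interpolating pixel locations elsewhere.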

  17. Fossil-Fuel CO2 Emissions Database and Exploration System

    NASA Astrophysics Data System (ADS)

    Krassovski, M.; Boden, T.; Andres, R. J.; Blasing, T. J.

    2012-12-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) quantifies the release of carbon from fossil-fuel use and cement production at global, regional, and national spatial scales. The CDIAC emission time series estimates are based largely on annual energy statistics published at the national level by the United Nations (UN). CDIAC has developed a relational database to house the collected data and information, and a web-based interface to help users worldwide identify, explore and download desired emission data. The available information is divided into two major groups: time series and gridded data. The time series data are offered at global, regional and national scales. Publications containing historical energy statistics make it possible to estimate fossil-fuel CO2 emissions back to 1751. Etemad et al. (1991) published a summary compilation that tabulates coal, brown coal, peat, and crude oil production by nation and year. Footnotes in the Etemad et al. (1991) publication extend the energy statistics time series back to 1751. Summary compilations of fossil-fuel trade were published by Mitchell (1983, 1992, 1993, 1995). Mitchell's work tabulates solid and liquid fuel imports and exports by nation and year. These pre-1950 production and trade data were digitized, and CO2 emission calculations were made following the procedures discussed in Marland and Rotty (1984) and Boden et al. (1995). The gridded data comprise annual and monthly estimates. The annual data are a time series recording 1° latitude by 1° longitude CO2 emissions, in units of million metric tons of carbon per year, from anthropogenic sources for 1751-2008. The monthly fossil-fuel CO2 emission estimates for 1950-2008 provided in this database are derived from the time series of global, regional, and national fossil-fuel CO2 emissions (Boden et al. 2011), the references therein, and the methodology described in Andres et al. (2011).
The data accessible here take these tabular, national, mass-emissions data and distribute them spatially on a one degree latitude by one degree longitude grid. The within-country spatial distribution is achieved through a fixed population distribution as reported in Andres et al. (1996). This presentation introduces the newly built database and web interface, and reflects the present state and functionality of the Fossil-Fuel CO2 Emissions Database and Exploration System as well as future plans for expansion.
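    The within-country distribution step (spreading a national mass-emission total over grid cells in proportion to a fixed population distribution) can be sketched with made-up numbers; this illustrates the principle, not CDIAC's actual code:

    ```python
    import numpy as np

    def distribute_emissions(national_total, cell_population):
        """Spread a national mass-emission total over grid cells in
        proportion to each cell's share of the national population."""
        weights = cell_population / cell_population.sum()
        return national_total * weights

    # Hypothetical country: four 1-degree cells, 100 Mt C national total
    pop = np.array([10.0, 30.0, 40.0, 20.0])   # millions of people per cell
    grid = distribute_emissions(100.0, pop)
    print(grid)        # → [10. 30. 40. 20.]
    print(grid.sum())  # → 100.0 (national mass is conserved)
    ```

    Because the weights sum to one, the gridded cells always add back up to the tabulated national total, which is the key invariant of this approach.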

  18. Durand Neighbourhood Heritage Inventory: Toward a Digital Citywide Survey Approach to Heritage Planning in Hamilton

    NASA Astrophysics Data System (ADS)

    Angel, V.; Garvey, A.; Sydor, M.

    2017-08-01

    In the face of changing economies and patterns of development, the definition of heritage is diversifying, and the role of inventories in local heritage planning is coming to the fore. The Durand neighbourhood is a layered and complex area located in inner-city Hamilton, Ontario, Canada, and the second subject area in a set of pilot inventory studies to develop a new city-wide inventory strategy for the City of Hamilton. This paper presents an innovative digital workflow developed to undertake the Durand Built Heritage Inventory project. An online database was developed to be at the centre of all processes, including digital documentation, record management, analysis and variable outputs. Digital tools were employed for survey work in the field and analytical work in the office, resulting in a GIS-based dataset that can be integrated into Hamilton's larger municipal planning system. Together with digital mapping and digitized historical resources, the Durand database has been leveraged to produce both digital and static outputs to shape recommendations for the protection of Hamilton's heritage resources.

  19. Digital Earth system based river basin data integration

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Li, Wanqing; Lin, Chao

    2014-12-01

    Digital Earth is an integrated approach to building scientific infrastructure. Digital Earth systems provide a three-dimensional visualization and integration platform for river basin data, which include management data, in situ observation data, remote sensing observation data and model output data. This paper studies Digital Earth based river basin data integration technology. First, the construction of the Digital Earth based three-dimensional river basin data integration environment is discussed. Then the river basin management data integration technology is presented, which is realized through a general database access interface, web services and ActiveX controls. Third, in situ data stored as database records are integrated and displayed with three-dimensional models of the corresponding observation apparatus in the Digital Earth system via a shared ID code. In the next two parts, the remote sensing data and model output data integration technologies are discussed in detail. The application in the Digital Zhang River Basin System of China shows that the method can effectively improve the usage efficiency and visualization effect of the data.
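    The shared-ID linking of observation records to their three-dimensional apparatus models can be sketched as a simple join (the station IDs, field names, and model file names below are hypothetical):

    ```python
    # In situ observation records, as they might come from database tables
    records = [
        {"id": "ST01", "discharge_m3s": 42.0},
        {"id": "ST02", "discharge_m3s": 17.5},
    ]

    # 3-D apparatus models keyed by the same ID code
    models = {"ST01": "gauge_station.obj", "ST02": "weir.obj"}

    # Join on the shared ID so each record carries its display model
    linked = [{**r, "model": models[r["id"]]} for r in records]
    print(linked[0]["model"])  # → gauge_station.obj
    ```

    The shared key is what lets the Digital Earth viewer fetch the right 3-D model when a user selects an observation record, and vice versa.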

  20. 77 FR 14523 - Western Digital Corporation; Analysis of Agreement Containing Consent Order to Aid Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-12

    ... of Viviti Technologies Ltd., formerly known as Hitachi Global Storage Technologies Ltd. (``HGST''), a... negotiate the purchase price of desktop HDDs at a global level. The desktop HDD market is highly... Digital'') proposed acquisition of Viviti Technologies Ltd., formerly known as Hitachi Global Storage...

  1. Digital Democracy and Global Citizenship Education: Mutually Compatible or Mutually Complicit?

    ERIC Educational Resources Information Center

    de Oliveira Andreotti, Vanessa; Pashby, Karen

    2013-01-01

    This article uses a critique of modernity to examine the perceived relationship between global citizenship education (GCE) and digital democracy (DD). We review critiques of citizenship education in the global imperative and of the relationship of technology to democratic engagement. An analogy expresses the problematic way that GCE and DD are…

  2. Spatial database for a global assessment of undiscovered copper resources: Chapter Z in Global mineral resource assessment

    USGS Publications Warehouse

    Dicken, Connie L.; Dunlap, Pamela; Parks, Heather L.; Hammarstrom, Jane M.; Zientek, Michael L.; Zientek, Michael L.; Hammarstrom, Jane M.; Johnson, Kathleen M.

    2016-07-13

    As part of the first-ever U.S. Geological Survey global assessment of undiscovered copper resources, data common to several regional spatial databases published by the U.S. Geological Survey, including one report from Finland and one from Greenland, were standardized, updated, and compiled into a global copper resource database. This integrated collection of spatial databases provides location, geologic and mineral resource data, and source references for deposits, significant prospects, and areas permissive for undiscovered deposits of both porphyry copper and sediment-hosted copper. The copper resource database allows for efficient modeling on a global scale in a geographic information system (GIS) and is provided in an Esri ArcGIS file geodatabase format.

  3. GISD

    Science.gov Websites

    The Global Invasive Species Database (GISD) contains authoritative information on invasive species, including a list of 100 of the world's worst invaders. The database was developed and is managed by the Invasive Species Specialist Group.

  4. Development of an Open Global Oil and Gas Infrastructure Inventory and Geodatabase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, Kelly

    This submission contains a technical report describing the development process and visual graphics for the Global Oil and Gas Infrastructure database. Access the GOGI database using the following link: https://edx.netl.doe.gov/dataset/global-oil-gas-features-database

  5. Trustworthy History and Provenance for Files and Databases

    ERIC Educational Resources Information Center

    Hasan, Ragib

    2009-01-01

    In today's world, information is increasingly created, processed, transmitted, and stored digitally. While the digital nature of information has brought enormous benefits, it has also created new vulnerabilities and attacks against data. Unlike physical documents, digitally stored information can be rapidly copied, erased, or modified. The…

  6. Digitized Archival Primary Sources in STEM: A Selected Webliography

    ERIC Educational Resources Information Center

    Jankowski, Amy

    2017-01-01

    Accessibility and findability of digitized archival resources can be a challenge, particularly for students or researchers not familiar with archival formats and digital interfaces, which adhere to different descriptive standards than more widely familiar library resources. Numerous aggregate archival collection databases exist, which provide a…

  7. A Digital Library in the Mid-Nineties, Ahead or On Schedule?

    ERIC Educational Resources Information Center

    Dijkstra, Joost

    1994-01-01

    Discussion of the future possibilities of digital library systems highlights digital projects developed at Tilburg University (Netherlands). Topics addressed include online access to databases; electronic document delivery; agreements between libraries and Elsevier Science publishers to provide journal articles; full text document delivery; and…

  8. DOT Online Database

    Science.gov Websites

    Welcome to the Online Digital Special Collections of interest to the Department of Transportation (DOT).

  9. 78 FR 47004 - Digital Trade in the U.S. and Global Economies, Part 2; Proposed Information Collection; Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 332-540] Digital Trade in the U.S. and Global Economies, Part 2; Proposed Information Collection; Comment Request; Digital Trade 2 Questionnaire AGENCY: United States International Trade Commission ACTION: In accordance with the provisions of the Paperwork Reduction Act of 1995 (44 U.S.C. Chapter...

  10. Geologic database for digital geology of California, Nevada, and Utah: an application of the North American Data Model

    USGS Publications Warehouse

    Bedford, David R.; Ludington, Steve; Nutt, Constance M.; Stone, Paul A.; Miller, David M.; Miller, Robert J.; Wagner, David L.; Saucedo, George J.

    2003-01-01

    The USGS is creating an integrated national database for digital state geologic maps that includes stratigraphic, age, and lithologic information. The majority of the conterminous 48 states have digital geologic base maps available, often at scales of 1:500,000. This product is a prototype, and is intended to demonstrate the types of derivative maps that will be possible with the national integrated database. This database permits the creation of a number of types of maps via simple or sophisticated queries, maps that may be useful in a number of areas, including mineral-resource assessment, environmental assessment, and regional tectonic evolution. This database is distributed in three main parts: a Microsoft Access 2000 database containing geologic map attribute data, an Arc/Info (Environmental Systems Research Institute, Redlands, California) Export format file containing points representing designation of stratigraphic regions for the Geologic Map of Utah, and an ArcView 3.2 (Environmental Systems Research Institute, Redlands, California) project containing scripts and dialogs for performing a series of generalization and mineral resource queries. IMPORTANT NOTE: Spatial data for the respective state geologic maps are not distributed with this report. The digital state geologic maps for the states involved in this report are separate products, and two of them are produced by individual state agencies, which may be legally and/or financially responsible for these data. However, the spatial datasets for maps discussed in this report are available to the public. Questions regarding the distribution, sale, and use of individual state geologic maps should be sent to the respective state agency. We do provide suggestions for obtaining and formatting the spatial data to make it compatible with data in this report. See section ‘Obtaining and Formatting Spatial Data’ in the PDF version of the report.

  11. Mars Global Digital Dune Database; MC-1

    USGS Publications Warehouse

    Hayward, R.K.; Fenton, L.K.; Tanaka, K.L.; Titus, T.N.; Colaprete, A.; Christensen, P.R.

    2010-01-01

    The Mars Global Digital Dune Database presents data and describes the methodology used in creating the global database of moderate- to large-size dune fields on Mars. The database is being released in a series of U.S. Geological Survey (USGS) Open-File Reports. The first release (Hayward and others, 2007) included dune fields from 65 degrees N to 65 degrees S (http://pubs.usgs.gov/of/2007/1158/). The current release encompasses ~ 845,000 km2 of mapped dune fields from 65 degrees N to 90 degrees N latitude. Dune fields between 65 degrees S and 90 degrees S will be released in a future USGS Open-File Report. Although we have attempted to include all dune fields, some have likely been excluded for two reasons: (1) incomplete THEMIS IR (daytime) coverage may have caused us to exclude some moderate- to large-size dune fields, or (2) the resolution of THEMIS IR coverage (100 m/pixel) certainly caused us to exclude smaller dune fields. The smallest dune fields in the database are ~ 1 km2 in area. While the moderate to large dune fields are likely to constitute the largest compilation of sediment on the planet, smaller stores of dune sediment are likely to be found elsewhere via higher-resolution data. Thus, it should be noted that our database excludes all small dune fields and some moderate to large dune fields as well. Therefore, the absence of mapped dune fields does not mean that such dune fields do not exist and is not intended to imply a lack of saltating sand in other areas. Where availability and quality of THEMIS visible (VIS), Mars Orbiter Camera narrow angle (MOC NA), or Mars Reconnaissance Orbiter (MRO) Context Camera (CTX) images allowed, we classified dunes and included some dune slipface measurements, which were derived from gross dune morphology and represent the prevailing wind direction at the last time of significant dune modification. It was beyond the scope of this report to look at the detail needed to discern subtle dune modification.
It was also beyond the scope of this report to measure all slipfaces. We attempted to include enough slipface measurements to represent the general circulation (as implied by gross dune morphology) and to give a sense of the complex nature of aeolian activity on Mars. The absence of slipface measurements in a given direction should not be taken as evidence that winds in that direction did not occur. When a dune field was located within a crater, the azimuth from crater centroid to dune field centroid was calculated, as another possible indicator of wind direction. Output from a general circulation model (GCM) is also included. In addition to polygons locating dune fields, the database includes THEMIS visible (VIS) and Mars Orbiter Camera Narrow Angle (MOC NA) images that were used to build the database. The database is presented in a variety of formats. It is presented as an ArcReader project which can be opened using the free ArcReader software. The latest version of ArcReader can be downloaded at http://www.esri.com/software/arcgis/arcreader/download.html. The database is also presented as an ArcMap project. The ArcMap project allows fuller use of the data, but requires ESRI ArcMap software. A fuller description of the projects can be found in the NP_Dunes_ReadMe file (NP_Dunes_ReadMe folder) and the NP_Dunes_ReadMe_GIS file (NP_Documentation folder). For users who prefer to create their own projects, the data are available in ESRI shapefile and geodatabase formats, as well as the open Geography Markup Language (GML) format. A printable map of the dunes and craters in the database is available as a Portable Document Format (PDF) document. The map is also included as a JPEG file (NP_Documentation folder). Documentation files are available as PDF and ASCII (.txt) files. Tables are available in both Excel and ASCII (.txt) formats.
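    The crater-centroid-to-dune-centroid azimuth mentioned above can be computed as follows (a minimal flat-plane sketch; the database's own calculation may differ, e.g. by using spherical geometry):

    ```python
    import math

    def azimuth(lat1, lon1, lat2, lon2):
        """Approximate azimuth (degrees clockwise from north) from point 1
        (e.g., a crater centroid) to point 2 (e.g., a dune-field centroid),
        using a local flat-plane approximation valid for short distances."""
        dy = lat2 - lat1
        dx = (lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
        return math.degrees(math.atan2(dx, dy)) % 360

    print(azimuth(70.0, 30.0, 71.0, 30.0))               # due north → 0.0
    print(round(azimuth(0.0, 0.0, 0.0, 1.0), 1))         # due east at equator → 90.0
    ```

    The cosine factor corrects for meridian convergence, which matters at the high northern latitudes covered by this release.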

  12. The Digital Literacy Partnership Website: Promoting Interdisciplinary Scholarship between Faculty, Students, and Librarians

    ERIC Educational Resources Information Center

    Tzoc, Elias; Ubbes, Valerie A.

    2017-01-01

    In 2013, the Center for Digital Scholarship at Miami University was established and coincided with the redesign of the Children's Picture Book Database, which had a successful web presence for nearly 20 years. We developed the Digital Literacy Partnership (DLP) website project in order to upgrade the project to Omeka as a new digital management…

  13. Changing State Digital Libraries

    ERIC Educational Resources Information Center

    Pappas, Marjorie L.

    2006-01-01

    Research has shown that state virtual or digital libraries are evolving into websites that are loaded with free resources, subscription databases, and instructional tools. In this article, the author explores these evolving libraries based on the following questions: (1) How user-friendly are the state digital libraries?; (2) How do state digital…

  14. Geologic Communications | Alaska Division of Geological & Geophysical

    Science.gov Websites

    DGGS maintains a database for the Division's digital and map-based geological, geophysical, and geochemical data, along with metadata and digital data distribution interfaces. Geospatial datasets published by DGGS are designed to be compatible with a broad variety of digital mapping software.

  15. Regional Geologic Map of San Andreas and Related Faults in Carrizo Plain, Temblor, Caliente and La Panza Ranges and Vicinity, California; A Digital Database

    USGS Publications Warehouse

    Dibblee, T. W.; Digital database compiled by Graham, S. E.; Mahony, T.M.; Blissenbach, J.L.; Mariant, J.J.; Wentworth, C.M.

    1999-01-01

    This Open-File Report is a digital geologic map database. The report serves to introduce and describe the digital data. There is no paper map included in the Open-File Report. The report includes PostScript and PDF plot files that can be used to plot images of the geologic map sheet and explanation sheet. This digital map database is prepared from a previously published map by Dibblee (1973). The geologic map database delineates map units that are identified by general age, lithology, and clast size following the stratigraphic nomenclature of the U.S. Geological Survey. For descriptions of the units, their stratigraphic relations, and sources of geologic mapping, consult the explanation sheet (of99-14_4b.ps or of99-14_4d.pdf), or the original published paper map (Dibblee, 1973). The scale of the source map limits the spatial resolution (scale) of the database to 1:125,000 or smaller. For those interested in the geology of Carrizo Plain and vicinity who do not use an ARC/INFO compatible Geographic Information System (GIS), but would like to obtain a paper map and explanation, PDF and PostScript plot files containing map images of the data in the digital database, as well as PostScript and PDF plot files of the explanation sheet and explanatory text, have been included in the database package (please see the section 'Digital Plot Files', page 5). The PostScript plot files require a gzip utility to access them. For those without computer capability, we can provide users with the PostScript or PDF files on tape that can be taken to a vendor for plotting. Paper plots can also be ordered directly from the USGS (please see the section 'Obtaining Plots from USGS Open-File Services', page 5). The content and character of the database, methods of obtaining it, and processes of extracting the map database from the tar (tape archive) file are described herein. 
The map database itself, consisting of six ARC/INFO coverages, can be obtained over the Internet or by magnetic tape copy as described below. The database was compiled using ARC/INFO, a commercial Geographic Information System (Environmental Systems Research Institute, Redlands, California), with version 3.0 of the menu interface ALACARTE (Fitzgibbon and Wentworth, 1991, Fitzgibbon, 1991, Wentworth and Fitzgibbon, 1991). The ARC/INFO coverages are stored in uncompressed ARC export format (ARC/INFO version 7.x). All data files have been compressed, and may be uncompressed with gzip, which is available free of charge over the Internet via links from the USGS Public Domain Software page (http://edcwww.cr.usgs.gov/doc/edchome/ndcdb/public.html). ARC/INFO export files (files with the .e00 extension) can be converted into ARC/INFO coverages in ARC/INFO (see below) and can be read by some other Geographic Information Systems, such as MapInfo via ArcLink and ESRI's ArcView.

  16. Global Oil & Gas Features Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly Rose; Jennifer Bauer; Vic Baker

    This submission contains a zip file with the developed Global Oil & Gas Features Database (as an ArcGIS geodatabase). Access the technical report describing how this database was produced using the following link: https://edx.netl.doe.gov/dataset/development-of-an-open-global-oil-and-gas-infrastructure-inventory-and-geodatabase

  17. CHIP Demonstrator: Semantics-Driven Recommendations and Museum Tour Generation

    NASA Astrophysics Data System (ADS)

    Aroyo, Lora; Stash, Natalia; Wang, Yiwen; Gorgels, Peter; Rutledge, Lloyd

    The main objective of the CHIP project is to demonstrate how Semantic Web technologies can be deployed to provide personalized access to digital museum collections. We illustrate our approach with the digital database ARIA of the Rijksmuseum Amsterdam. For the semantic enrichment of the Rijksmuseum ARIA database we collaborated with the CATCH STITCH project to produce mappings to Iconclass, and with the MultimediaN E-culture project to produce the RDF/OWL of the ARIA and Adlib databases. The main focus of CHIP is on exploring the potential of applying adaptation techniques to provide personalized experience for the museum visitors both on the Web site and in the museum.

  18. Monitoring Earth's reservoir and lake dynamics from space

    NASA Astrophysics Data System (ADS)

    Donchyts, G.; Eilander, D.; Schellekens, J.; Winsemius, H.; Gorelick, N.; Erickson, T.; Van De Giesen, N.

    2016-12-01

    Reservoirs and lakes constitute about 90% of the Earth's fresh surface water. They play a major role in the water cycle and are critical for the ever-increasing demands of the world's growing population. Water from reservoirs is used for agricultural, industrial, domestic, and other purposes. Current digital databases of lakes and reservoirs are scarce, mainly providing only descriptive and static properties of the reservoirs. The Global Reservoir and Dam (GRanD) database contains almost 7000 entries, while OpenStreetMap counts more than 500 000 entries tagged as reservoirs. In the last decade several research efforts have focused on accurate estimates of surface water dynamics, mainly using satellite altimetry. However, they are currently limited to fewer than 1000 (mostly large) water bodies. Our approach is based on three main components. First, a novel method allowing automated and accurate estimation of surface area from (partially) cloud-free optical multispectral or radar satellite imagery; the algorithm uses imagery acquired by the Landsat, Sentinel and MODIS missions. Second, a database to store static and dynamic reservoir parameters. Third, a web-based tool built on top of the Google Earth Engine infrastructure. The tool allows estimation of surface area for lakes and reservoirs at planetary scale at high spatial and temporal resolution. A prototype version of the method, database, and tool will be presented, as well as validation using in-situ measurements.
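    The surface-area estimation step can be illustrated with a generic water-index approach (an NDWI sketch with made-up reflectance values, not the authors' actual algorithm): classify pixels as water where NDWI = (green − NIR)/(green + NIR) exceeds a threshold, then multiply the water-pixel count by the pixel area.

    ```python
    import numpy as np

    def water_area(green, nir, pixel_area_m2, threshold=0.0):
        """Estimate open-water surface area from green and near-infrared
        reflectance rasters using NDWI = (green - nir) / (green + nir)."""
        ndwi = (green - nir) / (green + nir)
        return np.count_nonzero(ndwi > threshold) * pixel_area_m2

    green = np.array([[0.30, 0.30], [0.05, 0.05]])  # water reflects green
    nir   = np.array([[0.05, 0.05], [0.30, 0.30]])  # land reflects NIR
    print(water_area(green, nir, pixel_area_m2=900.0))  # two 30 m pixels → 1800.0
    ```

    Running the same classification over an image time series yields the surface-area dynamics the abstract describes; cloud masking and threshold selection are the hard parts in practice.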

  19. Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation.

    PubMed

    Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y

    2016-03-01

    Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need to have analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at scales necessary to identify human-environment interactions, and connecting channels downstream, upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national to regional scale digital river networks that are available in five countries: Canada, China, Russia, Spain, and the United States, using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack network completeness, analytical capabilities, or both. To address this limitation, we outline a general framework to build digital river networks that are as complete as possible and to integrate them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.

  20. Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation

    NASA Astrophysics Data System (ADS)

    Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y.

    2016-03-01

    Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need to have analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at scales necessary to identify human-environment interactions, and connecting channels downstream, upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national to regional scale digital river networks that are available in five countries: Canada, China, Russia, Spain, and the United States, using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack network completeness, analytical capabilities, or both. To address this limitation, we outline a general framework to build digital river networks that are as complete as possible and to integrate them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.

  1. LDEF meteoroid and debris database

    NASA Technical Reports Server (NTRS)

    Dardano, C. B.; See, Thomas H.; Zolensky, Michael E.

    1994-01-01

    The Long Duration Exposure Facility (LDEF) Meteoroid and Debris Special Investigation Group (M&D SIG) database is maintained at the Johnson Space Center (JSC), Houston, Texas, and consists of five data tables containing information about individual features, digitized images of selected features, and LDEF hardware (i.e., approximately 950 samples) archived at JSC. About 4000 penetrations (greater than 300 micron in diameter) and craters (greater than 500 micron in diameter) were identified and photodocumented during the disassembly of LDEF at the Kennedy Space Center (KSC), while an additional 4500 or so have subsequently been characterized at JSC. The database also contains some data that have been submitted by various PI's, yet the amount of such data is extremely limited in its extent, and investigators are encouraged to submit any and all M&D-type data to JSC for inclusion within the M&D database. Digitized stereo-image pairs are available for approximately 4500 features through the database.

  2. A proposal to extend our understanding of the global economy

    NASA Technical Reports Server (NTRS)

    Hough, Robbin R.; Ehlers, Manfred

    1991-01-01

    Satellites acquire information on a global and repetitive basis. They are thus ideal tools when global scale and analysis over time are required. Data from satellites come in digital form, which means they are ideally suited for incorporation in digital databases and can be evaluated using automated techniques. The development of a global multi-source data set is proposed that integrates digital information on some 15,000 major industrial sites worldwide with remotely sensed images of the sites. The resulting data set would provide the basis for a wide variety of studies of the global economy. Preliminary results give promise of a new class of global policy model which is far more detailed and helpful to local policy makers than its predecessors. The central thesis of this proposal is that major industrial sites can be identified and their utilization tracked with the aid of satellite images.

  3. Exploration of operator method digital optical computers for application to NASA

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Digital optical computer design has focused primarily on parallel (single point-to-point interconnection) implementations. This architecture is compared to currently developing VHSIC systems. Using demonstrated multichannel acousto-optic devices, a figure of merit can be formulated; the focus here is on a figure of merit termed the Gate Interconnect Bandwidth Product (GIBP). Conventional parallel optical digital computer architecture demonstrates only marginal competitiveness at best when compared to projected semiconductor implementations. Global, analog global, quasi-digital, and fully digital interconnects are briefly examined as alternatives to parallel digital computer architecture. Digital optical computing is becoming a very tough competitor to semiconductor technology, since it can support a very high density of three-dimensional interconnects and high degrees of fan-in without capacitive loading effects, at very low power consumption levels.
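    The abstract does not define the GIBP formula; purely as an illustration, assuming the figure of merit is the product of gate count, interconnects per gate, and per-channel bandwidth, it might be computed like this (all numbers invented):

```python
# Hypothetical illustration of a Gate Interconnect Bandwidth Product,
# assumed here to be gate count * interconnects per gate * per-channel
# bandwidth (Hz). The example figures below are made up.

def gibp(gates, interconnects_per_gate, bandwidth_hz):
    """GIBP under the stated assumption about its definition."""
    return gates * interconnects_per_gate * bandwidth_hz

# Invented figures for a parallel optical design vs. a semiconductor part:
optical = gibp(gates=1e4, interconnects_per_gate=100, bandwidth_hz=1e9)
semiconductor = gibp(gates=1e6, interconnects_per_gate=3, bandwidth_hz=1e8)
print(optical, semiconductor)
```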

  4. Digitalization and the global technology trends

    NASA Astrophysics Data System (ADS)

    Ignat, V.

    2017-08-01

    Digitalization, connected products and services, and shortening innovation cycles are widely discussed topics in management practice and theory, and they create demand for new concepts. We analysed how companies innovate their business models and what the new technology trends are. We found that companies have a positive approach to digitalization, but their technology strategy still serves the original business model. Digitalization forces a new solution orientation: to master the digital transformation, companies must develop new innovations. Furthermore, digitalization and Industry 4.0, linking the real-life factory with virtual reality, will play an increasingly important role in global manufacturing. Companies have to obtain new digital capabilities in order to make themselves sustainable for the future. Long-term growth and welfare in Europe can be guaranteed only by new technology innovation.

  5. The use of a personal digital assistant for wireless entry of data into a database via the Internet.

    PubMed

    Fowler, D L; Hogle, N J; Martini, F; Roh, M S

    2002-01-01

    Researchers typically record data on a worksheet and at some later time enter it into the database. Wireless data entry and retrieval using a personal digital assistant (PDA) at the site of patient contact can simplify this process and improve efficiency. A surgeon and a nurse coordinator provided the content for the database. The computer programmer created the database, placed the pages of the database on the PDA screen, and researched and installed security measures. Designing the database took 6 months. Meeting Health Insurance Portability and Accountability Act of 1996 (HIPAA) requirements for patient confidentiality, satisfying institutional Information Services requirements, and ensuring connectivity required an additional 8 months before the functional system was complete. It is now possible to achieve wireless entry and retrieval of data using a PDA. Potential advantages include collection and entry of data at the same time, easy entry of data from multiple sites, and retrieval of data at the patient's bedside.

  6. Creation of Norms for the Purpose of Global Talent Management

    ERIC Educational Resources Information Center

    Hedricks, Cynthia A.; Robie, Chet; Harnisher, John V.

    2008-01-01

    Personality scores were used to construct three databases of global norms. The composition of the three databases varied according to percentage of cases by global region, occupational group, applicant status, and gender of the job candidate. Comparison of personality scores across the three norms databases revealed that the magnitude of the…

  7. Large-scale feature searches of collections of medical imagery

    NASA Astrophysics Data System (ADS)

    Hedgcock, Marcus W.; Karshat, Walter B.; Levitt, Tod S.; Vosky, D. N.

    1993-09-01

    Large scale feature searches of accumulated collections of medical imagery are required for multiple purposes, including clinical studies, administrative planning, epidemiology, teaching, quality improvement, and research. To perform a feature search of large collections of medical imagery, one can either search text descriptors of the imagery in the collection (usually the interpretation), or (if the imagery is in digital format) the imagery itself. At our institution, text interpretations of medical imagery are all available in our VA Hospital Information System. These are downloaded daily into an off-line computer. The text descriptors of most medical imagery are usually formatted as free text, and so require a user friendly database search tool to make searches quick and easy for any user to design and execute. We are tailoring such a database search tool (Liveview), developed by one of the authors (Karshat). To further facilitate search construction, we are constructing (from our accumulated interpretation data) a dictionary of medical and radiological terms and synonyms. If the imagery database is digital, the imagery which the search discovers is easily retrieved from the computer archive. We describe our database search user interface, with examples, and compare the efficacy of computer assisted imagery searches from a clinical text database with manual searches. Our initial work on direct feature searches of digital medical imagery is outlined.
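    A minimal sketch (not the authors' Liveview tool) of how a free-text search over interpretation records might expand the query with a synonym dictionary before matching; all terms and report strings here are illustrative:

```python
# Illustrative synonym-expanded free-text search over radiology
# interpretations. The dictionary and reports below are invented.

SYNONYMS = {
    "pneumothorax": {"pneumothorax", "collapsed lung"},
    "fracture": {"fracture", "break", "fx"},
}

def expand(term):
    """Return the set of search terms for a query word, including synonyms."""
    return SYNONYMS.get(term.lower(), {term.lower()})

def search(reports, term):
    """Return reports whose free text mentions the term or any synonym."""
    wanted = expand(term)
    return [r for r in reports if any(w in r.lower() for w in wanted)]

reports = [
    "Small right apical pneumothorax.",
    "No acute fracture identified.",
    "Collapsed lung on the left, unchanged.",
]
print(search(reports, "pneumothorax"))  # matches the synonym as well
```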

  8. An integrated photogrammetric and spatial database management system for producing fully structured data using aerial and remote sensing images.

    PubMed

    Ahmadi, Farshid Farnood; Ebadi, Hamid

    2009-01-01

    3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques are among the most accurate and economical data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning the storage, structuring and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric and spatial database management systems can save time and cost in producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach, which is one of the main obstacles in GISs to using the map products of photogrammetric workstations. By means of these integrated systems, it is also possible to provide structured spatial data, based on OGC (Open GIS Consortium) standards and on topological relations between different feature classes, at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated and different levels of integration are described. Finally, the design, implementation and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) are presented.

  9. František Nábělek's Iter Turcico-Persicum 1909-1910 - database and digitized herbarium collection.

    PubMed

    Kempa, Matúš; Edmondson, John; Lack, Hans Walter; Smatanová, Janka; Marhold, Karol

    2016-01-01

    The Czech botanist František Nábělek (1884-1965) explored the Middle East in 1909-1910, visiting what are now Israel, Palestine, Jordan, Syria, Lebanon, Iraq, Bahrain, Iran and Turkey. He described four new genera, 78 species, 69 varieties and 38 forms of vascular plants, most of these in his work Iter Turcico-Persicum (1923-1929). The main herbarium collection of Iter Turcico-Persicum comprises 4163 collection numbers (some with duplicates), altogether 6465 specimens. It is currently deposited in the herbarium SAV. In addition, some fragments and duplicates are found in B, E, W and WU. The whole collection at SAV was recently digitized and both images and metadata are available via web portal www.nabelek.sav.sk, and through JSTOR Global Plants and the Biological Collection Access Service. Most localities were georeferenced and the web portal provides a mapping facility. Annotation of specimens is available via the AnnoSys facility. For each specimen a CETAF stable identifier is provided enabling the correct reference to the image and metadata.

  10. František Nábělek’s Iter Turcico-Persicum 1909–1910 – database and digitized herbarium collection

    PubMed Central

    Kempa, Matúš; Edmondson, John; Lack, Hans Walter; Smatanová, Janka; Marhold, Karol

    2016-01-01

    Abstract The Czech botanist František Nábělek (1884−1965) explored the Middle East in 1909-1910, visiting what are now Israel, Palestine, Jordan, Syria, Lebanon, Iraq, Bahrain, Iran and Turkey. He described four new genera, 78 species, 69 varieties and 38 forms of vascular plants, most of these in his work Iter Turcico-Persicum (1923−1929). The main herbarium collection of Iter Turcico-Persicum comprises 4163 collection numbers (some with duplicates), altogether 6465 specimens. It is currently deposited in the herbarium SAV. In addition, some fragments and duplicates are found in B, E, W and WU. The whole collection at SAV was recently digitized and both images and metadata are available via web portal www.nabelek.sav.sk, and through JSTOR Global Plants and the Biological Collection Access Service. Most localities were georeferenced and the web portal provides a mapping facility. Annotation of specimens is available via the AnnoSys facility. For each specimen a CETAF stable identifier is provided enabling the correct reference to the image and metadata. PMID:28127245

  11. A New Era in Geodesy and Cartography: Implications for Landing Site Operations

    NASA Technical Reports Server (NTRS)

    Duxbury, T. C.

    2001-01-01

    The Mars Global Surveyor (MGS) Mars Orbiter Laser Altimeter (MOLA) global dataset has ushered in a new era for Mars local and global geodesy and cartography. These data include the global digital terrain model (DTM radii), the global digital elevation model (DEM elevation with respect to the geoid), and the higher-spatial-resolution individual MOLA ground tracks. Currently there are about 500,000,000 MOLA points, and this number continues to grow as MOLA continues successful operations in orbit about Mars. The combined processing of radiometric X-band Doppler and ranging tracking of MGS, together with millions of MOLA orbital crossover points, has produced global geodetic and cartographic control having a spatial (latitude/longitude) accuracy of a few meters and a topographic accuracy of less than 1 meter. This means that the position of an individual MOLA point with respect to the center of mass of Mars is known to an absolute accuracy of a few meters. The positional accuracy of this point in inertial space over time is controlled by the spin-rate uncertainty of Mars, which amounts to less than 1 km over 10 years and will be improved significantly with the next landed mission.

  12. The role of digital sample information within the digital geoscience infrastructure: a pragmatic approach

    NASA Astrophysics Data System (ADS)

    Howe, Michael

    2014-05-01

    Much of the digital geological information on the composition, properties and dynamics of the subsurface is based ultimately on physical samples, many of which are archived to provide a basis for the information. Online metadata catalogues of these collections have now been available for many years. Many of these are institutional and tightly focussed, with UK examples including the British Geological Survey's (BGS) palaeontological samples database, PalaeoSaurus (http://www.bgs.ac.uk/palaeosaurus/), and mineralogical and petrological sample database, Britrocks (http://www.bgs.ac.uk/data/britrocks.html) . There are now a growing number of international sample metadata databases, including The Palaeobiology Database (http://paleobiodb.org/) and SESAR, the IGSN (International Geo Sample Number) database (http://www.geosamples.org/catalogsearch/ ). More recently the emphasis has moved beyond metadata (locality, identification, age, citations, etc) to digital imagery, with the intention of providing the user with at least enough information to determine whether viewing the sample would be worthwhile. Recent BGS examples include high resolution (e.g. 7216 x 5412 pixel) hydrocarbon well core images (http://www.bgs.ac.uk/data/offshoreWells/wells.cfc?method=searchWells) , high resolution rock thin section images (e.g. http://www.largeimages.bgs.ac.uk/iip/britrocks.html?id=290000/291739 ) and building stone images (http://geoscenic.bgs.ac.uk/asset-bank/action/browseItems?categoryId=1547&categoryTypeId=1) . This has been developed further with high resolution stereo images. The Jisc funded GB3D type fossils online project delivers these as red-cyan anaglyphs (http://www.3d-fossils.ac.uk/). More innovatively, the GB3D type fossils project has laser scanned several thousand type fossils and the resulting 3d-digital models are now being delivered through the online portal. 
Importantly, this project also represents collaboration between the BGS, Oxford and Cambridge Universities, the National Museums of Wales, and numerous other national, local and regional museums. The lack of currently accepted international standards and infrastructures for the delivery of high resolution images and 3d-digital models has obliged the BGS to develop or select its own. Most high resolution images have been delivered using the JPEG 2000 format because of its quality and speed. Digital models have been made available in both .PLY and .OBJ formats for their efficient file size and flexibility, respectively. Consideration must now be given to European and international standards and infrastructures for the delivery of high resolution images and 3d-digital models.
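    As an aside on why ASCII .PLY is a simple delivery format for 3d-digital models, here is a minimal, self-contained sketch (illustrative only, unrelated to the BGS pipeline) that writes a one-triangle mesh:

```python
# Write a minimal ASCII PLY mesh using only the standard library.
# The header declares vertex and face elements; the body lists them.

def write_ply(path, vertices, faces):
    """Write an ASCII PLY file (vertices: (x, y, z) tuples, faces: index lists)."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(vertices)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write(f"element face {len(faces)}\n")
        f.write("property list uchar int vertex_indices\n")
        f.write("end_header\n")
        for x, y, z in vertices:
            f.write(f"{x} {y} {z}\n")
        for face in faces:
            f.write(f"{len(face)} {' '.join(map(str, face))}\n")

write_ply("triangle.ply", [(0, 0, 0), (1, 0, 0), (0, 1, 0)], [[0, 1, 2]])
```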

  13. Internationally coordinated glacier monitoring - a timeline since 1894

    NASA Astrophysics Data System (ADS)

    Nussbaumer, Samuel U.; Armstrong, Richard; Fetterer, Florence; Gärtner-Roer, Isabelle; Hoelzle, Martin; Machguth, Horst; Mölg, Nico; Paul, Frank; Raup, Bruce H.; Zemp, Michael

    2016-04-01

    Changes in glaciers and ice caps provide some of the clearest evidence of climate change, with impacts on sea-level variations, regional hydrological cycles, and natural hazard situations. Therefore, glaciers have been recognized as an Essential Climate Variable (ECV). Internationally coordinated collection and distribution of standardized information about the state and change of glaciers and ice caps was initiated in 1894 and is today organized within the Global Terrestrial Network for Glaciers (GTN-G). GTN-G ensures the continuous development and adaptation of the international strategies to the long-term needs of users in science and policy. A GTN-G Steering Committee coordinates, supports and advises the operational bodies responsible for international glacier monitoring: the World Glacier Monitoring Service (WGMS), the US National Snow and Ice Data Center (NSIDC), and the Global Land Ice Measurements from Space (GLIMS) initiative. In this presentation, we trace the development of internationally coordinated glacier monitoring since its beginning in the 19th century. Today, several online databases containing a wealth of diverse data types with different levels of detail and global coverage provide fast access to continuously updated information on glacier fluctuation and inventory data. All glacier datasets are made freely available through the respective operational bodies within GTN-G and can be accessed through the GTN-G Global Glacier Browser (http://www.gtn-g.org/data_browser.html). Glacier inventory data (e.g., digital outlines) are available for about 180,000 glaciers (GLIMS database, RGI - Randolph Glacier Inventory, WGI - World Glacier Inventory). Glacier front variations, with about 45,000 entries since the 17th century, and about 6,200 glaciological and geodetic mass (volume) change observations dating back to the 19th century are available in the Fluctuations of Glaciers (FoG) database. 
These datasets reveal clear evidence that glacier retreat and mass loss are a global phenomenon. Glaciological and geodetic observations show that the rates of 21st-century mass loss are unprecedented on a global scale for the time period observed, and probably also for recorded history, as indicated by glacier reconstructions from written and illustrated documents. The databases are supplemented by specific index datasets (e.g., glacier thickness data) and a dataset containing information on special events, including glacier surges, glacier lake outbursts, ice avalanches and eruptions of ice-clad volcanoes, related to about 200 glaciers. A special database of glacier photographs (GPC - Glacier Photograph Collection) contains more than 15,000 pictures of around 500 glaciers, some of them dating back to the mid-19th century. Current efforts aim to close remaining observational gaps in both in-situ measurements and remote sensing, to establish a well-distributed baseline for sound estimates of climate-related glacier changes and their impacts. Within the framework of dedicated capacity-building and twinning activities, disrupted long-term mass-balance programmes in Central Asia have recently been resumed, and the continuation of mass-balance measurements in the Tropical Andes has been supported. New data also emerge from several research projects using NASA and ESA sensors and are actively integrated into the GTN-G databases. Key tasks for the future include the quantitative assessment of uncertainties in the available measurements and of their representativeness for changes in the respective mountain ranges. For this, a well-considered integration of in-situ measurements, remotely sensed observations, and numerical modelling is required.

  14. Shaping the Curriculum: The Power of a Library's Digital Resources

    ERIC Educational Resources Information Center

    Kirkwood, Patricia

    2011-01-01

    Researchers were the first adopters of digital resources available through the library. Online journals and databases make finding research articles much easier than when this author started as a librarian more than 20 years ago. Speedier interlibrary loan due to digital delivery means research materials are never far away. Making it easier for…

  15. A Digital Library for Education: The PEN-DOR Project.

    ERIC Educational Resources Information Center

    Fullerton, Karen; Greenberg, Jane; McClure, Maureen; Rasmussen, Edie; Stewart, Darin

    1999-01-01

    Describes Pen-DOR (Pennsylvania Education Network Digital Object Repository), a digital library designed to provide K-12 educators with access to multimedia resources and tools to create new lesson plans and modify existing ones via the World Wide Web. Discusses design problems of a distributed, object-oriented database architecture and describes…

  16. ::: American Indians of the Pacific Northwest Collection :::

    Science.gov Websites

    University of Washington Libraries Digital Collections. This site provides an extensive digital collection documenting American Indians of the Pacific Northwest, allowing users to learn from the images and writings of the time. Its digital databases include over 2,300 original photographs as well as over 1,500 pages from the Annual…

  17. Data Services Upgrade: Perfecting the ISIS-I Topside Digital Ionogram Database

    NASA Technical Reports Server (NTRS)

    Wang, Yongli; Benson, Robert F.; Bilitza, Dieter; Fung, Shing. F.; Chu, Philip; Huang, Xueqin; Truhlik, Vladimir

    2015-01-01

    The ionospheric topside sounders of the International Satellites for Ionospheric Studies (ISIS) program were designed as analog systems. More than 16,000 of the original telemetry tapes from three satellites were used to produce topside digital ionograms, via an analog-to-digital (A/D) conversion process, suitable for modern analysis techniques. Unfortunately, many of the resulting digital topside ionogram files could not be auto-processed to produce topside Ne(h) profiles because of problems encountered during the A/D process. Software has been written to resolve these problems, and here we report on (1) the first application of this software to a significant portion of the ISIS-1 digital topside-ionogram database, (2) software improvements motivated by this activity, (3) Ne(h) profiles automatically produced from these corrected ISIS-1 digital ionogram files, and (4) the availability via the Virtual Wave Observatory (VWO) of the corrected ISIS-1 digital topside ionogram files for research. We will also demonstrate the use of these Ne(h) profiles for making refinements to the International Reference Ionosphere (IRI) and for determining transition heights from oxygen ions to hydrogen ions.

  18. New design and facilities for the International Database for Absolute Gravity Measurements (AGrav): A support for the Establishment of a new Global Absolute Gravity Reference System

    NASA Astrophysics Data System (ADS)

    Wziontek, Hartmut; Falk, Reinhard; Bonvalot, Sylvain; Rülke, Axel

    2017-04-01

    After about 10 years of successful joint operation by BGI and BKG, the International Database for Absolute Gravity Measurements "AGrav" (see references below) has undergone a major revision. The outdated web interface was replaced by a responsive, high-level web application framework based on Python and built on top of Pyramid. Functionality was added, such as interactive time-series plots and a report generator, and the interactive map-based station overview was completely updated, now comprising clustering and the classification of stations. Furthermore, the database backend was migrated to PostgreSQL for better support of the application framework and long-term availability. As comparisons of absolute gravimeters (AGs) become essential to realize a precise and uniform gravity standard, the database was extended to document comparison results at international and regional level, including those performed at monitoring stations equipped with superconducting gravimeters (SGs). This makes it possible to link different AGs and to trace their equivalence back to the key comparisons under the auspices of the International Committee for Weights and Measures (CIPM) as the best metrological realization of the absolute gravity standard. In this way the new AGrav database accommodates the demands of the new Global Absolute Gravity Reference System as recommended by IAG Resolution No. 2 adopted in Prague in 2015. The new database will be presented with a focus on the new user interface and functionality, and all institutions involved in absolute gravimetry are called upon to participate and contribute their information, to build up the most complete picture possible of high-precision absolute gravimetry and improve its visibility. A Digital Object Identifier (DOI) will be provided by BGI to contributors to give better traceability and facilitate the referencing of their gravity surveys. 
Links and references: BGI mirror site: http://bgi.obs-mip.fr/data-products/Gravity-Databases/Absolute-Gravity-data/ BKG mirror site: http://agrav.bkg.bund.de/agrav-meta/ Wilmes, H., H. Wziontek, R. Falk, S. Bonvalot (2009). AGrav - the New Absolute Gravity Database and a Proposed Cooperation with the GGP Project. J. of Geodynamics, 48, pp. 305-309. doi:10.1016/j.jog.2009.09.035. Wziontek, H., H. Wilmes, S. Bonvalot (2011). AGrav: An international database for absolute gravity measurements. In Geodesy for Planet Earth (S. Kenyon et al., eds). IAG Symposia, 136, 1035-1040, Springer, Berlin, 2011. doi:10.1007/978-3-642-20338-1_130.

  19. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  20. Tufts Health Sciences Database: Lessons, Issues, and Opportunities.

    ERIC Educational Resources Information Center

    Lee, Mary Y.; Albright, Susan A.; Alkasab, Tarik; Damassa, David A.; Wang, Paul J.; Eaton, Elizabeth K.

    2003-01-01

    Describes a seven-year experience with developing the Tufts Health Sciences Database, a database-driven information management system that combines the strengths of a digital library, content delivery tools, and curriculum management. Identifies major effects on teaching and learning. Also addresses issues of faculty development, copyright and…

  1. Lessons Learned With a Global Graph and Ozone Widget Framework (OWF) Testbed

    DTIC Science & Technology

    2013-05-01

    of operating system and database environments. The following is one example. Requirements are: Java 1.6 + and a Relational Database Management...We originally tried to use MySQL as our database, because we were more familiar with it, but since the database dumps as well as most of the...Global Graph Rest Services In order to set up the Global Graph Rest Services, you will need to have the following dependencies installed: Java 1.6

  2. Computer Storage and Retrieval of Position - Dependent Data.

    DTIC Science & Technology

    1982-06-01

    This thesis covers the design of a new digital database system to replace the merged (observation and geographic location) record, one file per cruise...68 "The Digital Data Library System: Library Storage and Retrieval of Digital Geophysical Data" by Robert C. Groan) provided a relatively simple...dependent, ’geophysical’ data. The system is operational on a Digital Equipment Corporation VAX-11/780 computer. Values of measured and computed

  3. Digital methods for the history of psychology: Introduction and resources.

    PubMed

    Fox Lee, Shayna

    2016-02-01

    At the York University Digital History of Psychology Laboratory, we have been working on projects that explore what digital methodologies have to offer historical research in our field. This piece provides perspective on the history and theory of digital history, as well as introductory resources for those who are curious about incorporating these methods into their own work. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  4. 12-Digit Watershed Boundary Data 1:24,000 for EPA Region 2 and Surrounding States (NAT_HYDROLOGY.HUC12_NRCS_REG2)

    EPA Pesticide Factsheets

    12 digit Hydrologic Units (HUCs) for EPA Region 2 and surrounding states (Northeastern states, parts of the Great Lakes, Puerto Rico and the USVI) downloaded from the Natural Resources Conservation Service (NRCS) Geospatial Gateway and imported into the EPA Region 2 Oracle/SDE database. This layer reflects 2009 updates to the national Watershed Boundary Database (WBD) that included new boundary data for New York and New Jersey.
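    Because hydrologic unit codes nest hierarchically (the first 2 digits give the region, 4 the subregion, 6 the basin, 8 the subbasin, 10 the watershed, and all 12 the subwatershed), coarser units can be selected by simple prefix matching. A sketch with made-up codes:

```python
# Filter 12-digit HUCs by a coarser parent unit via prefix matching.
# The codes below are invented for illustration, not real HUC12s.

def in_unit(huc12, parent):
    """True if a 12-digit HUC falls inside a coarser parent unit code."""
    return huc12.startswith(parent)

hucs = ["020301030504", "020200060101", "050100010203"]
region_02 = [h for h in hucs if in_unit(h, "02")]
print(region_02)  # the two codes beginning with region 02
```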

  5. Digital version of "Open-File Report 92-179: Geologic map of the Cow Cove Quadrangle, San Bernardino County, California"

    USGS Publications Warehouse

    Wilshire, Howard G.; Bedford, David R.; Coleman, Teresa

    2002-01-01

    Plottable map representations of the database at 1:24,000 scale in PostScript and Adobe PDF formats. The plottable files consist of a color geologic map derived from the spatial database, composited with a topographic base map in the form of the USGS Digital Raster Graphic for the map area. Color symbology from each of these datasets is maintained, which can cause plot file sizes to be large.

  6. Optimal Path Planning Program for Autonomous Speed Sprayer in Orchard Using Order-Picking Algorithm

    NASA Astrophysics Data System (ADS)

    Park, T. S.; Park, S. J.; Hwang, K. Y.; Cho, S. I.

    This study was conducted to develop a software program that computes an optimal path for autonomous navigation in an orchard, especially for a speed sprayer. The feasibility of autonomous navigation in orchards has been shown by other studies, which minimized the distance error between the planned and performed paths; however, research on planning an optimal path for a speed sprayer in an orchard is hardly found. In this study, a digital map and a database for the orchard were designed, containing GPS coordinate information (coordinates of trees and the boundary of the orchard) and entity information (heights and widths of trees, radius of the main stem of trees, disease of trees). An order-picking algorithm, of the kind used for warehouse management, was used to calculate the optimum path based on the digital map. The database for the digital map was created using Microsoft Access, and the graphic interface for the database was made using Microsoft Visual C++ 6.0. It was possible to search and display information about the boundary of an orchard, the locations of trees, and the daily plan for scattering chemicals, and to plan an optimal path for different orchards based on the digital map under each circumstance (starting the speed sprayer in a different location, scattering chemicals for only selected trees).
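    As an illustration only (the abstract does not specify which order-picking algorithm was used), a simple nearest-neighbour heuristic of the kind common in warehouse order picking could sequence the selected trees like this:

```python
import math

# Nearest-neighbour routing over tree coordinates: from the current
# position, always visit the closest remaining tree. A common greedy
# heuristic for order picking; not necessarily the paper's algorithm.

def nearest_neighbour_route(start, trees):
    """Visit each (x, y) tree position, always moving to the closest next one."""
    route, remaining, current = [], list(trees), start
    while remaining:
        nxt = min(remaining, key=lambda t: math.dist(current, t))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    return route

trees = [(4.0, 0.0), (1.0, 1.0), (2.0, 0.0)]
print(nearest_neighbour_route((0.0, 0.0), trees))
# visits (1, 1), then (2, 0), then (4, 0)
```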

  7. Emission Database for Global Atmospheric Research (EDGAR).

    ERIC Educational Resources Information Center

    Olivier, J. G. J.; And Others

    1994-01-01

    Presents the objective and methodology chosen for the construction of a global emissions source database called EDGAR and the structural design of the database system. The database estimates on a regional and grid basis, 1990 annual emissions of greenhouse gases, and of ozone depleting compounds from all known sources. (LZ)

  8. Manual therapy for unsettled, distressed and excessively crying infants: a systematic review and meta-analyses.

    PubMed

    Carnes, Dawn; Plunkett, Austin; Ellwood, Julie; Miles, Clare

    2018-01-24

    To conduct a systematic review and meta-analyses to assess the effect of manual therapy interventions for healthy but unsettled, distressed and excessively crying infants and to provide information to help clinicians and parents inform decisions about care. We reviewed published peer-reviewed primary research articles in the last 26 years from nine databases (Medline Ovid, Embase, Web of Science, Physiotherapy Evidence Database, Osteopathic Medicine Digital Repository, Cochrane (all databases), Index of Chiropractic Literature, Open Access Theses and Dissertations and Cumulative Index to Nursing and Allied Health Literature). Our inclusion criteria were: manual therapy (by regulated or registered professionals) of unsettled, distressed and excessively crying infants who were otherwise healthy and treated in a primary care setting. Outcomes of interest were: crying, feeding, sleep, parent-child relations, parent experience/satisfaction and parent-reported global change. Nineteen studies were selected for full review: seven randomised controlled trials, seven case series, three cohort studies, one service evaluation study and one qualitative study. We found moderate strength evidence for the effectiveness of manual therapy on: reduction in crying time (favourable: -1.27 hours per day (95% CI -2.19 to -0.36)), sleep (inconclusive), parent-child relations (inconclusive) and global improvement (no effect). The risk of reported adverse events was low: seven non-serious events per 1000 infants exposed to manual therapy (n=1308) and 110 per 1000 in those not exposed. Some small benefits were found, but whether these are meaningful to parents remains unclear, as do the mechanisms of action. Manual therapy appears relatively safe. CRD42016037353. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
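    For readers unfamiliar with how pooled estimates such as the -1.27 hours/day figure are obtained, a generic inverse-variance (fixed-effect) pooling of mean differences can be sketched as follows; the study inputs below are invented, not taken from the review:

```python
import math

# Generic inverse-variance (fixed-effect) pooling of mean differences.
# Each study contributes weight 1/SE^2; the pooled SE is sqrt(1/sum(weights)).
# The three study results below are invented for illustration.

def pool(estimates, std_errors):
    """Return the pooled estimate and its 95% CI by inverse-variance weighting."""
    weights = [1 / se**2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

est, ci = pool([-1.0, -1.5, -0.8], [0.4, 0.5, 0.3])
print(round(est, 2), [round(x, 2) for x in ci])
```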

  9. Manual therapy for unsettled, distressed and excessively crying infants: a systematic review and meta-analyses

    PubMed Central

    Carnes, Dawn; Plunkett, Austin; Ellwood, Julie; Miles, Clare

    2018-01-01

    Objective To conduct a systematic review and meta-analyses to assess the effect of manual therapy interventions for healthy but unsettled, distressed and excessively crying infants and to provide information to help clinicians and parents inform decisions about care. Methods We reviewed published peer-reviewed primary research articles from the last 26 years in nine databases (Medline Ovid, Embase, Web of Science, Physiotherapy Evidence Database, Osteopathic Medicine Digital Repository, Cochrane (all databases), Index of Chiropractic Literature, Open Access Theses and Dissertations and Cumulative Index to Nursing and Allied Health Literature). Our inclusion criteria were: manual therapy (by regulated or registered professionals) of unsettled, distressed and excessively crying infants who were otherwise healthy and treated in a primary care setting. Outcomes of interest were: crying, feeding, sleep, parent–child relations, parent experience/satisfaction and parent-reported global change. Results Nineteen studies were selected for full review: seven randomised controlled trials, seven case series, three cohort studies, one service evaluation study and one qualitative study. We found moderate strength evidence for the effectiveness of manual therapy on: reduction in crying time (favourable: −1.27 hours per day (95% CI −2.19 to −0.36)), sleep (inconclusive), parent–child relations (inconclusive) and global improvement (no effect). The risk of reported adverse events was low: seven non-serious events per 1000 infants exposed to manual therapy (n=1308) and 110 per 1000 in those not exposed. Conclusions Some small benefits were found, but whether these are meaningful to parents remains unclear, as do the mechanisms of action. Manual therapy appears relatively safe. PROSPERO registration number CRD42016037353. PMID:29371279
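
    A pooled estimate such as the one reported above (−1.27 hours per day, 95% CI −2.19 to −0.36) is the kind of figure a fixed-effect inverse-variance meta-analysis produces. A minimal sketch, with entirely hypothetical per-trial mean differences and standard errors (the review's actual trial data are not reproduced here):

```python
import math

def pool_fixed_effect(estimates, ses):
    """Fixed-effect inverse-variance pooling of per-trial mean differences."""
    weights = [1.0 / se ** 2 for se in ses]                      # precision weights
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# hypothetical mean differences in crying hours/day from three trials
est, ci = pool_fixed_effect([-1.5, -0.9, -1.2], [0.4, 0.5, 0.6])
```

    A random-effects model would additionally widen the interval to absorb between-trial heterogeneity.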

  10. Global Supply Chain Management at Digital Equipment Corporation

    DTIC Science & Technology

    1995-01-01

    Global Supply Chain Management at Digital Equipment Corporation. BRUCE C. ARNTZEN, GERALD G. … answers change; and whether tax havens are worth the extra freight and duty. In designing a global logistics network, they must decide … but is solved with heuristics. Cohen and Lee (1988, p. 216) continue with a set of approximate stochastic sub-models and …

  11. Volcanoes of México: An Interactive CD-ROM From the Smithsonian's Global Volcanism Program

    NASA Astrophysics Data System (ADS)

    Siebert, L.; Kimberly, P.; Calvin, C.; Luhr, J. F.; Kysar, G.

    2002-12-01

    The Smithsonian Institution's Global Volcanism Program is nearing completion of an interactive CD-ROM, the Volcanoes of México. This CD is the second in a series sponsored by the U.S. Department of Energy Office of Geothermal Technologies to collate Smithsonian data on Quaternary volcanism as a resource for the geothermal community. It also has utility for those concerned with volcanic hazard and risk mitigation, as well as serving as an educational tool for those interested in Mexican volcanism. We acknowledge the significant contributions of many Mexican volcanologists to the eruption reports, data, and images contained in this CD, in particular those of the Centro Nacional de Prevencion de Desastres (CENAPRED), the Colima Volcano Observatory of the University of Colima, and the Universidad Nacional Autónoma de México (UNAM). The Volcanoes of México CD has a format similar to that of an earlier Smithsonian CD, the Volcanoes of Indonesia, but also shows Pleistocene volcanic centers and additional data on geothermal sites. A clickable map of México shows both Holocene and Pleistocene volcanic centers and provides access to individual pages on 67 volcanoes ranging from Cerro Prieto in Baja California to Tacaná on the Guatemalan border. These include geographic and geologic data on individual volcanoes (as well as a brief paragraph summarizing the geologic history) along with tabular eruption chronologies, eruptive characteristics, and eruptive volumes, when known. Volcano data are accessible from both geographical and alphabetical searches. A major component of the CD is more than 400 digitized images illustrating the morphology of volcanic centers and eruption processes and deposits, providing a dramatic visual primer to the country's volcanoes. Images of specific eruptions can be accessed directly from the eruption chronology tables.
The Volcanoes of México CD includes monthly reports and associated figures and tables cataloging volcanic activity in México from the Bulletin of the Global Volcanism Network and its predecessor, the Scientific Event Alert Network Bulletin, as well as early event-card notices of the Smithsonian's Center for Short-Lived Phenomena. An extensive petrologic database contains major-element analyses and other petrological and geochemical data for 1776 samples. The user also has access to a database of the Global Volcanism Program's map archives. Another option on the CD displays earthquake hypocenters and volcanic eruptions from 1960 to the present, plotted sequentially on a map of México and Central America. A bibliography of Mexican volcanism and geothermal research includes references cited in the Smithsonian's volcano database as well as those obtained from a search of the GeoRef bibliographic database. For more advanced queries and searches, both the petrologic database and the volcanic activity reports can be loaded from the CD.

  12. 47 CFR 15.713 - TV bands database.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... authorized services operating in the TV bands. In addition, a TV bands database must also verify that the FCC identifier (FCC ID) of a device seeking access to its services is valid; under this requirement the TV bands... information will come from the official Commission database. These services include: (i) Digital television...

  13. Databases for K-8 Students

    ERIC Educational Resources Information Center

    Young, Terrence E., Jr.

    2004-01-01

    Today's elementary school students have been exposed to computers since birth, so it is not surprising that they are so proficient at using them. As a result, they are ready to search databases that include topics and information appropriate for their age level. Subscription databases are digital copies of magazines, newspapers, journals,…

  14. The Stanford MediaServer Project: strategies for building a flexible digital media platform to support biomedical education and research.

    PubMed Central

    Durack, Jeremy C.; Chao, Chih-Chien; Stevenson, Derek; Andriole, Katherine P.; Dev, Parvati

    2002-01-01

    Medical media collections are growing at a pace that exceeds the value they currently provide as research and educational resources. To address this issue, the Stanford MediaServer was designed to promote innovative multimedia-based application development. The nucleus of the MediaServer platform is a digital media database strategically designed to meet the information needs of many biomedical disciplines. Key features include an intuitive web-based interface for collaboratively populating the media database, flexible creation of media collections for diverse and specialized purposes, and the ability to construct a variety of end-user applications from the same database to support biomedical education and research. PMID:12463820

  15. The Stanford MediaServer Project: strategies for building a flexible digital media platform to support biomedical education and research.

    PubMed

    Durack, Jeremy C; Chao, Chih-Chien; Stevenson, Derek; Andriole, Katherine P; Dev, Parvati

    2002-01-01

    Medical media collections are growing at a pace that exceeds the value they currently provide as research and educational resources. To address this issue, the Stanford MediaServer was designed to promote innovative multimedia-based application development. The nucleus of the MediaServer platform is a digital media database strategically designed to meet the information needs of many biomedical disciplines. Key features include an intuitive web-based interface for collaboratively populating the media database, flexible creation of media collections for diverse and specialized purposes, and the ability to construct a variety of end-user applications from the same database to support biomedical education and research.

  16. SIOExplorer: Modern IT Methods and Tools for Digital Library Management

    NASA Astrophysics Data System (ADS)

    Sutton, D. W.; Helly, J.; Miller, S.; Chase, A.; Clarck, D.

    2003-12-01

    With more geoscience disciplines becoming data-driven, it is increasingly important to utilize modern techniques for data, information and knowledge management. SIOExplorer is a new digital library project with 2 terabytes of oceanographic data collected over the last 50 years on 700 cruises by the Scripps Institution of Oceanography. It is built using a suite of information technology tools and methods that allow for an efficient and effective digital library management system. The library consists of a number of independent collections, each with corresponding metadata formats. The system architecture allows each collection to be built and uploaded based on a collection-dependent metadata template file (MTF). This file is used to create the hierarchical structure of the collection, create metadata tables in a relational database, and populate object metadata files and the collection as a whole. Collections are comprised of arbitrary digital objects stored at the San Diego Supercomputer Center (SDSC) High Performance Storage System (HPSS) and managed using the Storage Resource Broker (SRB), data-handling middleware developed at SDSC. SIOExplorer interoperates with other collections as a data provider through the Open Archives Initiative (OAI) protocol. The user services for SIOExplorer are accessed from CruiseViewer, a Java application served using Java Web Start from the SIOExplorer home page. CruiseViewer is an advanced tool for data discovery and access. It implements general keyword and interactive geospatial search methods for the collections, and georeferences search results on user-selected basemaps such as global topography or crustal age. User services include metadata viewing, opening digital objects of selected MIME types (such as images, documents and grid files), and downloading of objects (including the brokering of proprietary hold restrictions).
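
    The collection-dependent metadata template file (MTF) idea can be illustrated with a toy sketch: a template (the real MTF format is not specified in the abstract, so the structure below is hypothetical) drives the creation of a relational metadata table, with SQLite standing in for the actual database.

```python
import sqlite3

# Hypothetical template contents for one collection: a name, a hierarchy,
# and the metadata fields with their SQL types.
TEMPLATE = {
    "collection": "SIO_Cruises",
    "hierarchy": ["cruise", "instrument", "object"],
    "fields": {"cruise_id": "TEXT", "lat": "REAL", "lon": "REAL"},
}

def build_collection(template, db_path=":memory:"):
    """Create a metadata table from the template's field definitions."""
    conn = sqlite3.connect(db_path)
    cols = ", ".join(f"{name} {sqltype}"
                     for name, sqltype in template["fields"].items())
    conn.execute(f"CREATE TABLE {template['collection']} ({cols})")
    return conn

conn = build_collection(TEMPLATE)
conn.execute("INSERT INTO SIO_Cruises VALUES ('NBP9802', -64.5, -65.1)")
rows = conn.execute("SELECT cruise_id FROM SIO_Cruises").fetchall()
```

    The same template could then drive the hierarchical directory layout for the stored objects, keeping the database schema and the archive structure in sync.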

  17. Improving global paleogeography since the late Paleozoic using paleobiology

    NASA Astrophysics Data System (ADS)

    Cao, Wenchao; Zahirovic, Sabin; Flament, Nicolas; Williams, Simon; Golonka, Jan; Dietmar Müller, R.

    2017-12-01

    Paleogeographic reconstructions are important to understand Earth's tectonic evolution, past eustatic and regional sea level change, paleoclimate and ocean circulation, deep Earth resources and to constrain and interpret the dynamic topography predicted by mantle convection models. Global paleogeographic maps have been compiled and published, but they are generally presented as static maps with varying map projections, different time intervals represented by the maps and different plate motion models that underlie the paleogeographic reconstructions. This makes it difficult to convert the maps into a digital form and link them to alternative digital plate tectonic reconstructions. To address this limitation, we develop a workflow to restore global paleogeographic maps to their present-day coordinates and enable them to be linked to a different tectonic reconstruction. We use marine fossil collections from the Paleobiology Database to identify inconsistencies between their indicative paleoenvironments and published paleogeographic maps, and revise the locations of inferred paleo-coastlines that represent the estimated maximum transgression surfaces by resolving these inconsistencies. As a result, the consistency ratio between the paleogeography and the paleoenvironments indicated by the marine fossil collections is increased from an average of 75 % to nearly full consistency (100 %). The paleogeography in the main regions of North America, South America, Europe and Africa is significantly revised, especially in the Late Carboniferous, Middle Permian, Triassic, Jurassic, Late Cretaceous and most of the Cenozoic. The global flooded continental areas since the Early Devonian calculated from the revised paleogeography in this study are generally consistent with results derived from other paleoenvironment and paleo-lithofacies data and with the strontium isotope record in marine carbonates. 
We also estimate the terrestrial areal change over time associated with transferring between plate reconstructions, filling gaps, and modifying the paleogeographic geometries based on the paleobiological test. This indicates that variation in the underlying plate reconstruction is the main factor contributing to the terrestrial areal change, and that the effect of revising paleogeographic geometries based on paleobiology is secondary.
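
    The consistency check described above reduces to a simple fraction: the share of marine fossil collections whose location falls in an area the paleogeographic map classifies as marine. In this sketch the point-in-polygon lookup is abstracted into a callable and the example data are invented:

```python
def consistency_ratio(fossil_points, is_marine):
    """Fraction of marine fossil collections whose mapped paleoenvironment
    is also marine. `is_marine` maps a (lon, lat) point to a bool; in a real
    workflow it would be a point-in-polygon test against the paleogeography."""
    hits = sum(1 for p in fossil_points if is_marine(p))
    return hits / len(fossil_points)

# toy example: a lookup that flags longitudes west of 0 as marine
points = [(-10, 5), (-20, 8), (15, 3), (-5, 0)]
ratio = consistency_ratio(points, lambda p: p[0] < 0)   # 3 of 4 consistent
```

    Revising the inferred paleo-coastlines so that inconsistent points become consistent is what raises this ratio from ~75% toward 100% in the study.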

  18. Tomato Expression Database (TED): a suite of data presentation and analysis tools

    PubMed Central

    Fei, Zhangjun; Tang, Xuemei; Alba, Rob; Giovannoni, James

    2006-01-01

    The Tomato Expression Database (TED) includes three integrated components. The Tomato Microarray Data Warehouse serves as a central repository for raw gene expression data derived from the public tomato cDNA microarray. In addition to expression data, TED stores experimental design and array information in compliance with the MIAME guidelines and provides web interfaces for researchers to retrieve data for their own analysis and use. The Tomato Microarray Expression Database contains normalized and processed microarray data for ten time points with nine pair-wise comparisons during fruit development and ripening in a normal tomato variety and nearly isogenic single gene mutants impacting fruit development and ripening. Finally, the Tomato Digital Expression Database contains raw and normalized digital expression (EST abundance) data derived from analysis of the complete public tomato EST collection containing >150 000 ESTs derived from 27 different non-normalized EST libraries. This last component also includes tools for the comparison of tomato and Arabidopsis digital expression data. A set of query interfaces and analysis and visualization tools has been developed and incorporated into TED, which aid users in identifying and deciphering biologically important information from our datasets. TED can be accessed at http://ted.bti.cornell.edu. PMID:16381976

  19. Tomato Expression Database (TED): a suite of data presentation and analysis tools.

    PubMed

    Fei, Zhangjun; Tang, Xuemei; Alba, Rob; Giovannoni, James

    2006-01-01

    The Tomato Expression Database (TED) includes three integrated components. The Tomato Microarray Data Warehouse serves as a central repository for raw gene expression data derived from the public tomato cDNA microarray. In addition to expression data, TED stores experimental design and array information in compliance with the MIAME guidelines and provides web interfaces for researchers to retrieve data for their own analysis and use. The Tomato Microarray Expression Database contains normalized and processed microarray data for ten time points with nine pair-wise comparisons during fruit development and ripening in a normal tomato variety and nearly isogenic single gene mutants impacting fruit development and ripening. Finally, the Tomato Digital Expression Database contains raw and normalized digital expression (EST abundance) data derived from analysis of the complete public tomato EST collection containing >150,000 ESTs derived from 27 different non-normalized EST libraries. This last component also includes tools for the comparison of tomato and Arabidopsis digital expression data. A set of query interfaces and analysis and visualization tools has been developed and incorporated into TED, which aid users in identifying and deciphering biologically important information from our datasets. TED can be accessed at http://ted.bti.cornell.edu.
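
    Digital expression from EST abundance, as used in TED's third component, amounts to normalizing a gene's per-library EST count by the library's total. A hedged sketch with hypothetical counts (TED's exact normalization procedure is not spelled out in the abstract):

```python
def digital_expression(est_counts, library_sizes):
    """Normalized digital expression: a gene's EST count in each library
    divided by that library's total EST count (a common EST-abundance
    measure; library names and counts here are invented)."""
    return {lib: est_counts[lib] / library_sizes[lib] for lib in est_counts}

counts = {"fruit_ripening": 30, "leaf": 5}        # ESTs for one unigene
sizes = {"fruit_ripening": 15000, "leaf": 10000}  # total ESTs per library
expr = digital_expression(counts, sizes)
```

    Comparing such relative frequencies across libraries (or between tomato and Arabidopsis) is what the database's comparison tools build on.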

  20. Treading Old Paths in New Ways: Upper Secondary Students Using a Digital Tool of the Professional Historian

    ERIC Educational Resources Information Center

    Nygren, Thomas; Vikström, Lotta

    2013-01-01

    This article presents problems and possibilities associated with incorporating into history teaching a digital demographic database made for professional historians. We detect and discuss the outcome of how students in Swedish upper secondary schools respond to a teaching approach involving digitized registers comprising 19th century individuals…

  1. Digital Library Archaeology: A Conceptual Framework for Understanding Library Use through Artifact-Based Evaluation

    ERIC Educational Resources Information Center

    Nicholson, Scott

    2005-01-01

    Archaeologists have used material artifacts found in a physical space to gain an understanding about the people who occupied that space. Likewise, as users wander through a digital library, they leave behind data-based artifacts of their activity in the virtual space. Digital library archaeologists can gather these artifacts and employ inductive…

  2. Digital Tidbits

    ERIC Educational Resources Information Center

    Kumaran, Maha; Geary, Joe

    2011-01-01

    Technology has transformed libraries. There are digital libraries, electronic collections, online databases and catalogs, ebooks, downloadable books, and much more. With free technology such as social websites, newspaper collections, downloadable online calendars, clocks and sticky notes, online scheduling, online document sharing, and online…

  3. Digital geologic map and GIS database of Venezuela

    USGS Publications Warehouse

    Garrity, Christopher P.; Hackley, Paul C.; Urbani, Franco

    2006-01-01

    The digital geologic map and GIS database of Venezuela captures GIS-compatible geologic and hydrologic data from the 'Geologic Shaded Relief Map of Venezuela,' which was released online as U.S. Geological Survey Open-File Report 2005-1038. Digital datasets and corresponding metadata files are stored in ESRI geodatabase format and are accessible via ArcGIS 9.X. Feature classes in the geodatabase include geologic unit polygons, open water polygons, coincident geologic unit linework (contacts, faults, etc.) and non-coincident geologic unit linework (folds, drainage networks, etc.). Geologic unit polygon data were attributed for age, name, and lithologic type following the Lexico Estratigrafico de Venezuela. All digital datasets were captured from source data at 1:750,000. Although users may view and analyze data at varying scales, the authors make no guarantee as to the accuracy of the data at scales larger than 1:750,000.

  4. The role of digital cartographic data in the geosciences

    USGS Publications Warehouse

    Guptill, S.C.

    1983-01-01

    The increasing demand of the Nation's natural resource developers for the manipulation, analysis, and display of large quantities of earth-science data has necessitated the use of computers and the building of geoscience information systems. These systems require, in digital form, the spatial data on map products. The basic cartographic data shown on quadrangle maps provide a foundation for the addition of geological and geophysical data. If geoscience information systems are to realize their full potential, large amounts of digital cartographic base data must be available. A major goal of the U.S. Geological Survey is to create, maintain, manage, and distribute a national cartographic and geographic digital database. This unified database will contain numerous categories (hydrography, hypsography, land use, etc.) that, through the use of standardized data-element definitions and formats, can be used easily and flexibly to prepare cartographic products and perform geoscience analysis. ?? 1983.

  5. BAO plate archive digitization

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Nikoghosyan, E. H.; Gigoyan, K. S.; Paronyan, G. M.; Abrahamyan, H. V.; Andreasyan, H. R.; Azatyan, N. M.; Kostandyan, G. R.; Khachatryan, K. G.; Vardanyan, A. V.; Gyulzadyan, M. V.; Mikayelyan, G. A.; Farmanyan, S. V.; Knyazyan, A. V.

    Astronomical plate archives created on the basis of numerous observations at many observatories are an important part of the astronomical heritage. The Byurakan Astrophysical Observatory (BAO) plate archive consists of 37,000 photographic plates and films obtained with the 2.6 m telescope, the 1 m and 0.5 m Schmidt telescopes, and other smaller ones during 1947-1991. In 2015, we started a project to digitize the whole BAO plate archive, create an electronic database, and put it to scientific use. A Science Program Board has been created to evaluate the observing material, investigate new possibilities, and propose new projects based on the combined use of these observations together with other world databases. The Executing Team consists of 11 astronomers and 2 computer scientists and will use 2 EPSON Perfection V750 Pro scanners for the digitization. The project will run for 3 years, in 2015-2017, and the final result will be an electronic database and an online interactive sky map to be used for further research projects.

  6. The digital divide: Trends in global mobile and broadband Internet access from 2000–2010

    PubMed Central

    Ronquillo, Charlene; Currie, Leanne

    2012-01-01

    The digital divide is described as the gap between those who do and do not have access to digital information and communications technologies (ICT). ICTs are viewed as an indicator of infrastructure and potential for development, and are a growing platform for health information and services delivery. This study compares the penetration of mobile and broadband Internet technologies by global region from 2000 to 2010. Results illustrate the rapid growth of mobile cellular telephone subscriptions in all global regions, with trends suggesting a continued increase. Little to modest gains were made in fixed broadband Internet subscriptions globally. Mobile subscriptions that include data communications are growing in popularity and now exceed the number of fixed Internet subscriptions. This comparison reveals current strengths that can be built on and highlights the importance of awareness of global trends and using such knowledge to inform design and delivery of ICT-based health services. PMID:24199118

  7. Integrated technologies for solid waste bin monitoring system.

    PubMed

    Arebey, Maher; Hannan, M A; Basri, Hassan; Begum, R A; Abdullah, Huda

    2011-06-01

    Communication technologies such as radio frequency identification (RFID), the global positioning system (GPS), the general packet radio service (GPRS), and a geographic information system (GIS) are integrated with a camera to construct a solid waste bin monitoring system. The aim is to improve the way of responding to customer inquiries and emergency cases and to estimate the solid waste amount without any involvement of the truck driver. The proposed system consists of an RFID tag mounted on each bin, an RFID reader in the truck, GPRS/GSM communication to a web server, and GIS map, database, and control servers. The tracking devices mounted in the trucks collect location information in real time via GPS. This information is transferred continuously through GPRS to a central database. Users can view the current location of each truck in the collection stage via a web-based application and thereby manage the fleet. The trucks' positions and trash bin information are displayed on a digital map, which is made available by a map server. Thus, both the bins' solid waste and the trucks are monitored using the developed system.
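
    The tracking backend described above, with GPS fixes streamed over GPRS into a central database and a web application showing each truck's current position, can be sketched with an in-memory SQLite table (illustrative only; the deployed system's schema is not given in the abstract):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fixes (truck TEXT, lat REAL, lon REAL, ts INTEGER)")

def record_fix(truck, lat, lon, ts):
    """Store one GPS fix as it arrives from a truck over GPRS."""
    conn.execute("INSERT INTO fixes VALUES (?, ?, ?, ?)", (truck, lat, lon, ts))

def latest_positions():
    """Latest (lat, lon) per truck, as the web map would display them.
    SQLite returns the bare columns from the row holding MAX(ts)."""
    rows = conn.execute(
        "SELECT truck, lat, lon, MAX(ts) FROM fixes GROUP BY truck").fetchall()
    return {t: (lat, lon) for t, lat, lon, _ in rows}

record_fix("T1", 2.92, 101.78, 100)
record_fix("T1", 2.93, 101.79, 200)   # newer fix supersedes the first
pos = latest_positions()
```

    A production system would add the RFID bin reads and camera images as further tables keyed to the same truck and timestamp.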

  8. STEP-TRAMM - A modeling interface for simulating localized rainfall induced shallow landslides and debris flow runout pathways

    NASA Astrophysics Data System (ADS)

    von Ruette, Jonas; Lehmann, Peter; Fan, Linfeng; Bickel, Samuel; Or, Dani

    2017-04-01

    Landslides and subsequent debris flows initiated by rainfall represent a ubiquitous natural hazard in steep mountainous regions. We integrated a landslide hydro-mechanical triggering model and associated debris flow runout pathways with a graphical user interface (GUI) to represent these natural hazards in a wide range of catchments over the globe. The STEP-TRAMM GUI provides process-based locations and sizes of landslide patterns using digital elevation models (DEM) from the SRTM database (30 m resolution) linked with soil maps from the global SoilGrids database (250 m resolution) and satellite-based information on rainfall statistics for the selected region. In a preprocessing step, STEP-TRAMM models the soil depth distribution and complements the soil information so that together they capture key hydrological and mechanical properties relevant to representing local soil failure. In the presentation we will discuss features of this publicly available platform and compare landslide and debris flow patterns for different regions considering representative intense rainfall events. Model outcomes will be compared for different spatial and temporal resolutions to test the applicability of web-based information on elevation and rainfall for hazard assessment.
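
    STEP-TRAMM's hydro-mechanical triggering model is not detailed in the abstract, but the standard infinite-slope factor-of-safety calculation illustrates how rainfall-driven pore-water pressure can tip a slope from stable to failing (this is a generic textbook formulation, not necessarily STEP-TRAMM's, and all parameter values below are hypothetical):

```python
import math

def factor_of_safety(c, phi_deg, gamma, z, beta_deg, u):
    """Infinite-slope factor of safety: cohesion c (Pa), friction angle
    phi (deg), soil unit weight gamma (N/m^3), failure depth z (m),
    slope angle beta (deg), pore-water pressure u (Pa) at the slip surface."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    tau = gamma * z * math.sin(beta) * math.cos(beta)   # driving shear stress
    sigma_n = gamma * z * math.cos(beta) ** 2           # total normal stress
    return (c + (sigma_n - u) * math.tan(phi)) / tau    # FS < 1 means failure

# same slope, dry vs. after intense rainfall raises pore pressure
fs_dry = factor_of_safety(c=2000, phi_deg=30, gamma=18000, z=2.0, beta_deg=30, u=0)
fs_wet = factor_of_safety(c=2000, phi_deg=30, gamma=18000, z=2.0, beta_deg=30, u=10000)
```

    Evaluating such a balance cell by cell over a DEM, with soil depth and rainfall infiltration supplying z and u, is the essence of process-based landslide susceptibility mapping.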

  9. WOVOdat, A Worldwide Volcano Unrest Database, to Improve Eruption Forecasts

    NASA Astrophysics Data System (ADS)

    Widiwijayanti, C.; Costa, F.; Win, N. T. Z.; Tan, K.; Newhall, C. G.; Ratdomopurbo, A.

    2015-12-01

    WOVOdat is the World Organization of Volcano Observatories' Database of Volcanic Unrest: an international effort to develop common standards for compiling and storing data on volcanic unrest in a centralized database that is freely web-accessible for reference during volcanic crises, comparative studies, and basic research on pre-eruption processes. WOVOdat will be to volcanology as an epidemiological database is to medicine. Despite the large spectrum of monitoring techniques, interpreting monitoring data throughout the evolution of unrest and making timely forecasts remain the most challenging tasks for volcanologists. The field of eruption forecasting is becoming more quantitative, based on an understanding of pre-eruptive magmatic processes and the dynamic interaction between the variables at play in a volcanic system. Such forecasts must also acknowledge and express uncertainties; therefore, most current research in this field focuses on the application of event-tree analysis to reflect multiple possible scenarios and the probability of each. Such forecasts are critically dependent on comprehensive and authoritative global volcano unrest data sets - the very information currently collected in WOVOdat. As the database becomes more complete, Boolean searches, side-by-side digital (and thus scalable) comparisons of unrest, and pattern recognition will generate reliable results. Statistical distributions obtained from WOVOdat can then be used to estimate the probabilities of each scenario after specific patterns of unrest. We have established the main web interface for data submission and visualization, and have now incorporated ~20% of worldwide unrest data into the database, covering more than 100 eruptive episodes.
In the upcoming years we will concentrate on acquiring data from volcano observatories, developing a robust data query interface, optimizing data mining, and creating tools by which WOVOdat can be used for probabilistic eruption forecasting. The more data in WOVOdat, the more useful it will be.
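
    The probabilistic use of WOVOdat sketched above, statistical distributions of outcomes following specific unrest patterns, reduces in its simplest form to empirical frequencies over matched historical episodes (the episode counts below are invented for illustration):

```python
from collections import Counter

def scenario_probabilities(outcomes):
    """Empirical probability of each outcome following a given unrest
    pattern, computed from counted historical episodes."""
    counts = Counter(outcomes)
    total = sum(counts.values())
    return {outcome: n / total for outcome, n in counts.items()}

# hypothetical outcomes of 20 past episodes matching one unrest pattern
episodes = ["eruption"] * 6 + ["intrusion"] * 9 + ["quiescence"] * 5
probs = scenario_probabilities(episodes)
```

    In a full event-tree analysis these frequencies would feed the branch probabilities, with Bayesian updating as new monitoring data arrive during a crisis.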

  10. Final report for DOE Award # DE- SC0010039*: Carbon dynamics of forest recovery under a changing climate: Forcings, feedbacks, and implications for earth system modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson-Teixeira, Kristina J.; DeLucia, Evan H.; Duval, Benjamin D.

    2015-10-29

    To advance understanding of C dynamics of forests globally, we compiled a new database, the Forest C database (ForC-db), which contains data on ground-based measurements of ecosystem-level C stocks and annual fluxes along with disturbance history. This database currently contains 18,791 records from 2009 sites, making it the largest and most comprehensive database of C stocks and flows in forest ecosystems globally. The tropical component of the database will be published in conjunction with a manuscript that is currently under review (Anderson-Teixeira et al., in review). Database development continues, and we hope to maintain a dynamic instance of the entire (global) database.

  11. A uniform database of teleseismic shear wave splitting measurements for the western and central United States

    NASA Astrophysics Data System (ADS)

    Liu, Kelly H.; Elsheikh, Ahmed; Lemnifi, Awad; Purevsuren, Uranbaigal; Ray, Melissa; Refayee, Hesham; Yang, Bin B.; Yu, Youqiang; Gao, Stephen S.

    2014-05-01

    We present a shear wave splitting (SWS) database for the western and central United States as part of a lasting effort to build a uniform SWS database for all of North America. The SWS measurements were obtained by minimizing the energy on the transverse component of the PKS, SKKS, and SKS phases. Each of the individual measurements was visually checked to ensure quality. This version of the database contains 16,105 pairs of splitting parameters. The data used to generate the parameters were recorded by 1774 digital broadband seismic stations over the period 1989-2012, and represent all the available data from both permanent and portable seismic networks archived at the Incorporated Research Institutions for Seismology Data Management Center in the area of 26.00°N to 50.00°N and 125.00°W to 90.00°W. About 10,000 pairs of the measurements were from the 1092 USArray Transportable Array stations. The results show that approximately 2/3 of the fast orientations are within 30° of the absolute plate motion (APM) direction of the North American plate, and most of the largest departures from the APM are located along the eastern boundary of the western US orogenic zone and in the central Great Basins. The splitting times observed in the western US are larger than, and those in the central US are comparable with, the global average of 1.0 s. The uniform database has an unprecedented spatial coverage and can be used for various investigations of the structure and dynamics of the Earth.
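
    Comparing fast orientations with the APM direction involves axial angles (a fast orientation of 10° describes the same axis as 190°), so differences must be folded into the 0-90° range before applying the 30° threshold. A sketch with hypothetical orientations:

```python
def angular_diff(a, b):
    """Smallest angle between two axial orientations in degrees (0-90)."""
    d = abs(a - b) % 180.0
    return min(d, 180.0 - d)

def fraction_within(fast_orients, apm, tol=30.0):
    """Fraction of fast orientations within `tol` degrees of the APM azimuth."""
    n = sum(1 for f in fast_orients if angular_diff(f, apm) <= tol)
    return n / len(fast_orients)

# hypothetical fast orientations (degrees east of north) vs. an APM azimuth of 245°
frac = fraction_within([60, 70, 250, 100, 65, 240], apm=245.0)
```

    Without the axial fold, a station pair like 65° and 245° (identical axes) would be miscounted as 180° apart.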

  12. Evaluation of personal digital assistant drug information databases for the managed care pharmacist.

    PubMed

    Lowry, Colleen M; Kostka-Rokosz, Maria D; McCloskey, William W

    2003-01-01

    Personal digital assistants (PDAs) are becoming a necessity for practicing pharmacists. They offer a time-saving and convenient way to obtain current drug information. Several software companies now offer general drug information databases for use on handheld computers. PDAs priced less than 200 US dollars often have limited memory capacity; therefore, the user must choose from a growing list of general drug information database options in order to maximize utility without exceeding memory capacity. This paper reviews the attributes of available general drug information software databases for the PDA. It provides information on the content, advantages, limitations, pricing, memory requirements, and accessibility of drug information software databases. Ten drug information databases were subjectively analyzed and evaluated based on information from the products' Web sites, vendor Web sites, and our experience. Some of these databases have attractive auxiliary features such as kinetics calculators, disease references, drug-drug and drug-herb interaction tools, and clinical guidelines, which may make them more useful to the PDA user. Not all drug information databases are equal with regard to content, author credentials, frequency of updates, and memory requirements. The user must therefore evaluate databases for completeness, currency, and cost effectiveness before purchase. In addition, consideration should be given to the ease of use and flexibility of individual programs.

  13. Map showing geologic terranes of the Hailey 1 degree x 2 degrees quadrangle and the western part of the Idaho Falls 1 degree x 2 degrees quadrangle, south-central Idaho

    USGS Publications Warehouse

    Worl, R.G.; Johnson, K.M.

    1995-01-01

    The paper version of Map Showing Geologic Terranes of the Hailey 1x2 Quadrangle and the western part of the Idaho Falls 1x2 Quadrangle, south-central Idaho was compiled by Ron Worl and Kate Johnson in 1995. The plate was compiled on a 1:250,000-scale topographic base map. TechniGraphic System, Inc., of Fort Collins, Colorado, digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a geographic information system database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.

  14. OSTMED.DR®, an Osteopathic Medicine Digital Library.

    PubMed

    Fitterling, Lori; Powers, Elaine; Vardell, Emily

    2018-01-01

    The OSTMED.DR® database provides access to both citation and full-text osteopathic literature, including the Journal of the American Osteopathic Association. Currently, it is a free database searchable using basic and advanced search features.

  15. Building the Digital Library Infrastructure: A Primer.

    ERIC Educational Resources Information Center

    Tebbetts, Diane R.

    1999-01-01

    Provides a framework for examining the complex infrastructure needed to successfully implement a digital library. Highlights include database development, online public-access catalogs, interactive technical services, full-text documents, hardware and wiring, licensing, access, and security issues. (Author/LRW)

  16. Database for the geologic map of upper Eocene to Holocene volcanic and related rocks in the Cascade Range, Washington

    USGS Publications Warehouse

    Barron, Andrew D.; Ramsey, David W.; Smith, James G.

    2014-01-01

    This digital database contains information used to produce the geologic map published as Sheet 1 in U.S. Geological Survey Miscellaneous Investigations Series Map I-2005. (Sheet 2 of Map I-2005 shows sources of geologic data used in the compilation and is available separately). Sheet 1 of Map I-2005 shows the distribution and relations of volcanic and related rock units in the Cascade Range of Washington at a scale of 1:500,000. This digital release is produced from stable materials originally compiled at 1:250,000 scale that were used to publish Sheet 1. The database therefore contains more detailed geologic information than is portrayed on Sheet 1. This is most noticeable in the database as expanded polygons of surficial units and the presence of additional strands of concealed faults. No stable compilation materials exist for Sheet 1 at 1:500,000 scale. The main component of this digital release is a spatial database prepared using geographic information systems (GIS) applications. This release also contains links to files to view or print the map sheet, main report text, and accompanying mapping reference sheet from Map I-2005. For more information on volcanoes in the Cascade Range in Washington, Oregon, or California, please refer to the U.S. Geological Survey Volcano Hazards Program website.

  17. PHASE I MATERIALS PROPERTY DATABASE DEVELOPMENT FOR ASME CODES AND STANDARDS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju; Lin, Lianshan

    2013-01-01

    To support the ASME Boiler and Pressure Vessel Codes and Standards (BPVC) in the modern information era, development of a web-based materials property database has been initiated under the supervision of the ASME Committee on Materials. To achieve efficiency, the project draws heavily upon experience from development of the Gen IV Materials Handbook and the Nuclear System Materials Handbook. The effort is divided into two phases. Phase I is planned to deliver a materials data file warehouse that offers a repository for various files containing raw data and background information, and Phase II will provide a relational digital database with advanced features facilitating digital data processing and management. Population of the database will start with materials property data for nuclear applications and expand to data covering the entire ASME Codes and Standards, including the piping codes, as the database structure is continuously optimized. The ultimate goal of the effort is to establish a sound cyber infrastructure that supports ASME Codes and Standards development and maintenance.

  18. Keyless Entry: Building a Text Database Using OCR Technology.

    ERIC Educational Resources Information Center

    Grotophorst, Clyde W.

    1989-01-01

    Discusses the use of optical character recognition (OCR) technology to produce an ASCII text database. A tutorial on digital scanning and OCR is provided, and a systems integration project which used the Calera CDP-3000XF scanner and text retrieval software to construct a database of dissertations at George Mason University is described. (four…

  19. GrainGenes: Changing Times, Changing Databases, Digital Evolution.

    USDA-ARS?s Scientific Manuscript database

    The GrainGenes database is one of few agricultural databases that had an early start on the Internet and that has changed with the times. Initial goals were to collect a wide range of data relating to the developing maps and attributes of small grains crops, and to make them easily accessible. The ...

  20. A "Neogeographical Education"? The Geospatial Web, GIS and Digital Art in Adult Education

    ERIC Educational Resources Information Center

    Papadimitriou, Fivos

    2010-01-01

    Neogeography provides a link between the science of geography and digital art. The carriers of this link are geospatial technologies (global navigational satellite systems such as the global positioning system, Geographical Information System [GIS] and satellite imagery) along with ubiquitous information and communication technologies (such as…

  1. NCSTRL: Design and Deployment of a Globally Distributed Digital Library.

    ERIC Educational Resources Information Center

    Davies, James R.; Lagoze, Carl

    2000-01-01

    Discusses the development of a digital library architecture that allows the creation of digital libraries within the World Wide Web. Describes a digital library, NCSTRL (Networked Computer Science Technical Research Library), within which the work has taken place and explains Dienst, a protocol and architecture for distributed digital libraries.…

  2. Launching the Next Generation IODP Site Survey Data Bank

    NASA Astrophysics Data System (ADS)

    Miller, S. P.; Helly, J.; Clark, D.; Eakins, B.; Sutton, D.; Weatherford, J.; Thatch, G.; Miville, B.; Zelt, B.

    2005-12-01

    The next generation all-digital Site Survey Data Bank (SSDB) became operational on August 15, 2005 as an online resource for Integrated Ocean Drilling Program (IODP) proponents, reviewers, panels and operations, worldwide. There are currently 123 active proposals for drilling at sites distributed across the globe, involving nearly 1000 proponents from more than 40 countries. The goal is to provide an authoritative, persistent, secure, password-controlled and easily used home for contributed data objects, as proposals evolve through their life cycle from preliminary phases to planned drilling expeditions. Proposal status can be monitored graphically by proposal number, data type or date. A Java SSDBviewer allows discovery of all proposal data objects, displayed over a basemap of global topography, crustal age or other custom maps. Data can be viewed or downloaded under password control. Webform interfaces assist with the uploading of data and metadata. Thirty-four different standard data types are currently supported. The system was designed as a fully functioning digital library, not just a database or a web archive, drawing upon the resources of the SIOExplorer Digital Library project. Blocks of metadata are organized to support discovery and use, as appropriate for each data type. The SSDB has been developed by a UCSD team of researchers and computer scientists at the Scripps Institution of Oceanography and the San Diego Supercomputer Center, under contract with IODP Management International Inc., supported by NSF OCE 0432224.

  3. Preliminary Geologic Map of the Topanga 7.5' Quadrangle, Southern California: A Digital Database

    USGS Publications Warehouse

    Yerkes, R.F.; Campbell, R.H.

    1995-01-01

    INTRODUCTION This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. This digital map database is compiled from previously published sources combined with some new mapping and modifications in nomenclature. The geologic map database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U. S. Geological Survey. For detailed descriptions of the units, their stratigraphic relations and sources of geologic mapping consult Yerkes and Campbell (1994). More specific information about the units may be available in the original sources. The content and character of the database and methods of obtaining it are described herein. The geologic map database itself, consisting of three ARC coverages and one base layer, can be obtained over the Internet or by magnetic tape copy as described below. The processes of extracting the geologic map database from the tar file, and importing the ARC export coverages (procedure described herein), will result in the creation of an ARC workspace (directory) called 'topnga.' The database was compiled using ARC/INFO version 7.0.3, a commercial Geographic Information System (Environmental Systems Research Institute, Redlands, California), with version 3.0 of the menu interface ALACARTE (Fitzgibbon and Wentworth, 1991, Fitzgibbon, 1991, Wentworth and Fitzgibbon, 1991). It is stored in uncompressed ARC export format (ARC/INFO version 7.x) in a compressed UNIX tar (tape archive) file. The tar file was compressed with gzip, and may be uncompressed with gzip, which is available free of charge via the Internet from the gzip Home Page (http://w3.teaser.fr/~jlgailly/gzip). A tar utility is required to extract the database from the tar file. 
This utility is included in most UNIX systems, and can be obtained free of charge via the Internet from Internet Literacy's Common Internet File Formats Webpage (http://www.matisse.net/files/formats.html). ARC/INFO export files (files with the .e00 extension) can be converted into ARC/INFO coverages in ARC/INFO (see below) and can be read by some other Geographic Information Systems, such as MapInfo via ArcLink and ESRI's ArcView (version 1.0 for Windows 3.1 to 3.11 is available for free from ESRI's web site: http://www.esri.com). This release differs from the original digital database in three ways: 1. Different base layer - The original digital database included separates clipped out of the Los Angeles 1:100,000 sheet. This release includes a vectorized scan of a scale-stable negative of the Topanga 7.5 minute quadrangle. 2. Map projection - The files in the original release were in polyconic projection. The projection used in this release is state plane, which allows for the tiling of adjacent quadrangles. 3. File compression - The files in the original release were compressed with UNIX compression. The files in this release are compressed with gzip.
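The gzip-and-tar retrieval steps described above can be sketched with Python's standard library; the archive name and destination below are illustrative stand-ins, not the actual file names from the report:

```python
import gzip
import shutil
import tarfile

def extract_database(archive_gz, dest="."):
    """Uncompress a gzip'd tar file and extract it, yielding the ARC
    workspace directory (e.g. 'topnga') containing .e00 export files."""
    tar_path = archive_gz[:-3]  # drop the ".gz" suffix
    with gzip.open(archive_gz, "rb") as src, open(tar_path, "wb") as dst:
        shutil.copyfileobj(src, dst)  # equivalent of `gzip -d`
    with tarfile.open(tar_path) as tf:
        tf.extractall(dest)           # equivalent of extracting with tar
```

The extracted .e00 export files would then be imported into coverages within ARC/INFO itself (or read by the other GIS packages listed), which is outside the scope of this sketch.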

  4. The Digitization of Early English Books: A Database Comparison of Internet Archive and Early English Books Online

    ERIC Educational Resources Information Center

    Brightenburg, Cindy

    2016-01-01

    The use of digital books is diverse, ranging from casual reading to in-depth primary source research. Digitization of early English printed books, in particular, has provided greater access to a previously limited resource for academic faculty and researchers. Internet Archive, a free, internet website and Early English Books Online, a subscription…

  5. Air Weather Service Master Station Catalog: USAFETAC Climatic Database Users Handbook No. 6

    DTIC Science & Technology

    1993-03-01

    FIELD NO. DESCRIPTION OF FIELD AND COMMENTS 01 STN NUM. A 6-digit number with the first 5 digits assigned to a particular weather reporting location IAW WMO rules, plus a sixth digit as follows: 0 = The first five digits are the actual block/station number (WMO number) assigned to... it is considered inactive for that hour. A digit (1-9) tells how many months it has been since a report was received from the station for that hour

  6. Effectiveness of online and mobile telephone applications ('apps') for the self-management of suicidal ideation and self-harm: a systematic review and meta-analysis.

    PubMed

    Witt, Katrina; Spittal, Matthew J; Carter, Gregory; Pirkis, Jane; Hetrick, Sarah; Currier, Dianne; Robinson, Jo; Milner, Allison

    2017-08-15

    Online and mobile telephone applications ('apps') have the potential to improve the scalability of effective interventions for suicidal ideation and self-harm. The aim of this review was therefore to investigate the effectiveness of digital interventions for the self-management of suicidal ideation or self-harm. Seven databases (Applied Science & Technology; CENTRAL; CRESP; Embase; Global Health; PsycARTICLES; PsycINFO; Medline) were searched to 31 March, 2017. Studies that examined the effectiveness of digital interventions for suicidal ideation and/or self-harm, or which reported outcome data for suicidal ideation and/or self-harm, within a randomised controlled trial (RCT), pseudo-RCT, or observational pre-test/post-test design were included in the review. Fourteen non-overlapping studies were included, reporting data from a total of 3,356 participants. Overall, digital interventions were associated with reductions in suicidal ideation scores at post-intervention. There was no evidence of a treatment effect for self-harm or attempted suicide. Most studies were biased in relation to at least one aspect of study design, and particularly the domains of participant, clinical personnel, and outcome assessor blinding. Performance and detection bias therefore cannot be ruled out. Digital interventions for suicidal ideation and self-harm may be more effective than waitlist control. It is unclear whether these reductions would be clinically meaningful at present. Further evidence, particularly with regards to the potential mechanisms of action of these interventions, as well as safety, is required before these interventions could be recommended.

  7. In need of combined topography and bathymetry DEM

    NASA Astrophysics Data System (ADS)

    Kisimoto, K.; Hilde, T.

    2003-04-01

    In many geoscience applications, digital elevation models (DEMs) are now more commonly used at different scales and greater resolution due to the great advancement in computer technology. Increasing the accuracy/resolution of the model and the coverage of the terrain (global model) has been the goal of users as mapping technology has improved and computers get faster and cheaper. The ETOPO5 (5 arc minutes spatial resolution land and seafloor model), initially developed in 1988 by Margo Edwards, then at Washington University, St. Louis, MO, has been the only global terrain model for a long time, and it is now being replaced by three new topographic and bathymetric DEMs, i.e., the ETOPO2 (2 arc minutes spatial resolution land and seafloor model), the GTOPO30 land model with a spatial resolution of 30 arc seconds (ca. 1 km at the equator) and the 'GEBCO 1-MINUTE GLOBAL BATHYMETRIC GRID' ocean floor model with a spatial resolution of 1 arc minute (ca. 2 km at the equator). These DEMs are products of projects through which compilation and reprocessing of existing and/or new datasets were made to meet users' new requirements. These ongoing efforts are valuable and support should be continued to refine and update these DEMs. On the other hand, a different approach to create a global bathymetric (seafloor) database exists. A method to estimate the seafloor topography from satellite altimetry combined with existing ships' conventional sounding data was devised, and a beautiful global seafloor database was created and made public by W.H. Smith and D.T. Sandwell in 1997. The big advantage of this database is the uniformity of coverage, i.e. there is no large area where depths are missing. It has a spatial resolution of 2 arc minutes. Another important effort is found in making regional, not global, seafloor databases with much finer resolutions in many countries. 
The Japan Hydrographic Department has compiled and released a 500m-grid topography database around Japan, J-EGG500, in 1999. Although the coverage of this database is only a small portion of the Earth, the database has been highly appreciated in the academic community, and received with surprise by the general public when the database was displayed in 3D imagery to show its quality. This database could be rather smoothly combined with the finer land DEM of 250m spatial resolution (Japan250m.grd, K. Kisimoto, 2000). One of the most important applications of this combined DEM of topography and bathymetry is tsunami modeling. Understanding of the coastal environment, management and development of the coastal region are other fields in need of these data. There is, however, an important issue to consider when we create a combined DEM of topography and bathymetry at finer resolutions. The problem arises from the discrepancy of the standard datum planes or reference levels used for topographic leveling and bathymetric sounding. Land topography (altitude) is defined by leveling from the single reference point determined by average mean sea level; in other words, land height is measured from the geoid. On the other hand, depth charts are made based on depth measured from a locally determined reference sea surface level, and this value of sea surface level is taken from the long-term average of the lowest tidal height. So, to create a combined DEM of topography and bathymetry at very fine scale, we need to avoid this inconsistency between height and depth across the coastal region. Height and depth should be physically continuous relative to a single reference datum across the coast within such new high-resolution DEMs. (N.B. The coastline is neither the 'altitude-zero line' nor the 'depth-zero line'. It is defined locally as the long-term average of the highest tide level.) All of this said, we still need a lot of work on the ocean side. 
Global coverage with detailed bathymetric mapping is still poor. Seafloor imaging and other geophysical measurements/experiments should be organized and conducted in international and interdisciplinary ways more than ever. We always need greater technological advancement and application of this technology in marine sciences, as well as more enthusiastic seagoing researchers. Recent seafloor mapping technology and quality, both in bathymetry and imagery, is very promising and even compares favorably with land terrain mapping. At the poster session we discuss recent achievements and needs in seafloor mapping, using several of the most up-to-date global and regional DEMs available to the science community.
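The datum inconsistency discussed in this abstract (geoid-referenced land heights versus chart depths measured down from a local lowest-tide datum) can be illustrated with a toy merge; the 1.2 m chart-datum offset and all profile values below are assumed for illustration only:

```python
# Toy illustration of the datum problem described above: land heights are
# referenced to the geoid (mean sea level), while chart depths are measured
# down from a local chart datum that sits below mean sea level. The 1.2 m
# offset and all profile values here are assumed for illustration only.

def merge_dem(land_m, depth_m, land_mask, msl_above_chart_datum=1.2):
    """Merge a land-height profile and a chart-depth profile onto one datum.

    A chart depth d, measured below a datum lying msl_above_chart_datum
    metres below mean sea level, corresponds to an MSL-referenced
    elevation of -(d + offset).
    """
    off = msl_above_chart_datum
    return [h if is_land else -(d + off)
            for h, d, is_land in zip(land_m, depth_m, land_mask)]

# a short coast-crossing transect: two land cells, two sea cells
heights = [12.0, 3.0, 0.0, 0.0]   # geoid-referenced land heights (m)
depths = [0.0, 0.0, 4.0, 30.0]    # chart-datum-referenced depths (m)
profile = merge_dem(heights, depths, [True, True, False, False])
# sea cells become -(depth + offset), continuous with the land heights
```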

  8. Radiology education: a glimpse into the future.

    PubMed

    Scarsbrook, A F; Graham, R N J; Perriss, R W

    2006-08-01

    The digital revolution in radiology continues to advance rapidly. There are a number of interesting developments within radiology informatics which may have a significant impact on education and training of radiologists in the near future. These include extended functionality of handheld computers, web-based skill and knowledge assessment, standardization of radiological procedural training using simulated or virtual patients, worldwide videoconferencing via high-quality health networks such as Internet2 and global collaboration of radiological educational resources via comprehensive, multi-national databases such as the medical imaging resource centre initiative of the Radiological Society of North America. This article will explore the role of e-learning in radiology, highlight a number of useful web-based applications in this area, and explain how the current and future technological advances might best be incorporated into radiological training.

  9. The future of medical diagnostics: large digitized databases.

    PubMed

    Kerr, Wesley T; Lau, Edward P; Owens, Gwen E; Trefler, Aaron

    2012-09-01

    The electronic health record mandate within the American Recovery and Reinvestment Act of 2009 will have a far-reaching effect on medicine. In this article, we provide an in-depth analysis of how this mandate is expected to stimulate the production of large-scale, digitized databases of patient information. There is evidence to suggest that millions of patients and the National Institutes of Health will fully support the mining of such databases to better understand the process of diagnosing patients. This data mining likely will reaffirm and quantify known risk factors for many diagnoses. This quantification may be leveraged to further develop computer-aided diagnostic tools that weigh risk factors and provide decision support for health care providers. We expect that creation of these databases will stimulate the development of computer-aided diagnostic support tools that will become an integral part of modern medicine.

  10. Herbarium data: Global biodiversity and societal botanical needs for novel research.

    PubMed

    James, Shelley A; Soltis, Pamela S; Belbin, Lee; Chapman, Arthur D; Nelson, Gil; Paul, Deborah L; Collins, Matthew

    2018-02-01

    Building on centuries of research based on herbarium specimens gathered through time and around the globe, a new era of discovery, synthesis, and prediction using digitized collections data has begun. This paper provides an overview of how aggregated, open access botanical and associated biological, environmental, and ecological data sets, from genes to the ecosystem, can be used to document the impacts of global change on communities, organisms, and society; predict future impacts; and help to drive the remediation of change. Advocacy for botanical collections and their expansion is needed, including ongoing digitization and online publishing. The addition of non-traditional digitized data fields, user annotation capability, and born-digital field data collection enables rapid access to rich, digitally available data sets for research, education, informed decision-making, and other scholarly and creative activities. Researchers are receiving enormous benefits from data aggregators including the Global Biodiversity Information Facility (GBIF), Integrated Digitized Biocollections (iDigBio), the Atlas of Living Australia (ALA), and the Biodiversity Heritage Library (BHL), but effective collaboration around data infrastructures is needed when working with large and disparate data sets. Tools for data discovery, visualization, analysis, and skills training are increasingly important for inspiring novel research that improves the intrinsic value of physical and digital botanical collections.

  11. The implementation of a modernized Dynamic Digital Map on Gale Crater, Mars

    NASA Astrophysics Data System (ADS)

    McBeck, J.; Condit, C. D.

    2012-12-01

    Currently, geology instructors present information to students via PowerPoint, Word, Excel and other programs that are not designed to parse or present geologic data. More tech-savvy, and perhaps better-funded, instructors use Google Earth or ArcGIS to display geologic maps and other visual information. However, Google Earth lacks the ability to present large portions of text, and ArcGIS restricts such functionality to labels and annotations. The original Dynamic Digital Map, which we have renamed Dynamic Digital Map Classic (DDMC), allows instructors to represent both visual and large portions of textual information to students. This summer we generalized the underlying architecture of DDMC, redesigned the user interface, modernized the analytical functionality, renamed the older version and labeled this new creature Dynamic Digital Map Extended (DDME). With the new DDME instructors can showcase maps, images, articles and movies, and create digital field trips. They can set the scale, coordinate system and caption of maps and images, add symbol links to maps and images that can transport the user to any specified destination—either internally (to data contained within the DDME) or externally (to a website address). Instructors and students can also calculate non-linear distances and irregular areas of maps and images, and create digital field trips with any number of stops—complete with notes and driving directions. DDMEs are perhaps best described as a sort of computerized, self-authored, interactive textbook. To display the vast capabilities of DDME, we created a DDME of Gale Crater (DDME-GC), which is the landing site of the most sophisticated NASA Mars Rover—Curiosity. 
DDME-GC hosts six thematic maps: a detailed geologic map provided by Brad Thompson of the Boston University Center for Remote Sensing (Thompson, et al., 2010), and five maps maintained in ASU's JMARS system, including global mosaics from Mars Global Surveyor's Mars Orbiter Laser Altimeter (MOLA), Mars Odyssey's Thermal Emission Imaging System (THEMIS), and the Mars Digital Image Model. DDME-GC offers a diverse suite of images, with over 40 images captured in the High Resolution Imaging Science Experiment (HiRISE), as well as several global mosaics created from Viking Orbiter, Hubble Telescope, THEMIS, MOLA and HiRISE data. DDME-GC also provides more than 25 articles that span subjects from the possible origins of the mound located in Gale Crater to the goals of NASA's Mars Exploration Program. The movies hosted by DDME-GC describe the difficulties of selecting a landing site for Curiosity, landing Curiosity on Mars and several other dynamic topics. The most significant advantage of the modernized DDME is its easily augmented functionality. In the future, DDME will be able to communicate with databases, import Keyhole Markup Language (KML) files from Google Earth, and be available on the iOS and Android operating systems. (Imagine: a field trip without the burden of notebooks, pens or pencils, paper or clipboards, with this information maintained on a mobile device.) The most recent DDME is a mere skeleton of its full capabilities: a robust architecture upon which myriad functionality can be built.

  12. Unlocking Index Animalium: From paper slips to bytes and bits

    PubMed Central

    Pilsk, Suzanne C.; Kalfatovic, Martin R.; Richard, Joel M.

    2016-01-01

    In 1996 Smithsonian Libraries (SIL) embarked on the digitization of its collections. By 1999, a full-scale digitization center was in place and rare volumes from the natural history collections, often of high illustrative value, were the focus for the first years of the program. The resulting beautiful books made available for online display were successful to a certain extent, but it soon became clear that the data locked within the texts needed to be converted to more usable and re-purposable form via digitization methods that went beyond simple page imaging and included text conversion elements. Library staff met with researchers from the taxonomic community to understand their path to the literature and identified tools (indexes and bibliographies) used to connect to the library holdings. The traditional library metadata describing the titles, which made them easily retrievable from the shelves of libraries, was not meeting the needs of the researcher looking for more detailed and granular data within the texts. The result was to identify proper print tools that could potentially assist researchers in digital form. This paper outlines the project undertaken to convert Charles Davies Sherborn’s Index Animalium into a tool to connect researchers to the library holdings: from a print index to a database to eventually a dataset. Sherborn’s microcitation of a species name and his bibliographies help bridge the gap between taxonomist and literature holdings of libraries. In 2004, SIL received funding from the Smithsonian’s Atherton Seidell Endowment to create an online version of Sherborn’s Index Animalium. The initial project was to digitize the page images and re-key the data into a simple data structure. As the project evolved, a more complex database was developed which enabled quality field searching to retrieve species names and to search the bibliography. 
Problems with inconsistent abbreviations and styling of his bibliographies made the parsing of the data difficult. Coinciding with the development of the Biodiversity Heritage Library (BHL) in 2005, it became obvious there was a need to integrate the database-converted Index Animalium, BHL’s scanned taxonomic literature, and taxonomic intelligence (the algorithmic identification of binomial, Latinate name-strings). The challenges of working with legacy taxonomic citation, computer matching algorithms, and making connections have brought us to today’s goal of making Sherborn available and linked to other datasets. Partnering with others to allow machine-to-machine communications, the data are being examined for possible transformation into RDF markup and meeting the standards of Linked Open Data. SIL staff have partnered with Thomson Reuters and the Global Names Initiative to further enhance the Index Animalium data set. Thomson Reuters’ staff is now working on integrating the species microcitation and species name in the ION: Index to Organism Names project; Richard Pyle (The Bishop Museum) is also working on further parsing of the text. The Index Animalium collaborative project’s ultimate goal is to successfully have researchers go seamlessly from the species name in either ION or the scanned pages of Index Animalium to the digitized original description in BHL - connecting taxonomic researchers to original authored species descriptions with just a click. PMID:26877657

  13. Unlocking Index Animalium: From paper slips to bytes and bits.

    PubMed

    Pilsk, Suzanne C; Kalfatovic, Martin R; Richard, Joel M

    2016-01-01

    In 1996 Smithsonian Libraries (SIL) embarked on the digitization of its collections. By 1999, a full-scale digitization center was in place and rare volumes from the natural history collections, often of high illustrative value, were the focus for the first years of the program. The resulting beautiful books made available for online display were successful to a certain extent, but it soon became clear that the data locked within the texts needed to be converted to more usable and re-purposable form via digitization methods that went beyond simple page imaging and included text conversion elements. Library staff met with researchers from the taxonomic community to understand their path to the literature and identified tools (indexes and bibliographies) used to connect to the library holdings. The traditional library metadata describing the titles, which made them easily retrievable from the shelves of libraries, was not meeting the needs of the researcher looking for more detailed and granular data within the texts. The result was to identify proper print tools that could potentially assist researchers in digital form. This paper outlines the project undertaken to convert Charles Davies Sherborn's Index Animalium into a tool to connect researchers to the library holdings: from a print index to a database to eventually a dataset. Sherborn's microcitation of a species name and his bibliographies help bridge the gap between taxonomist and literature holdings of libraries. In 2004, SIL received funding from the Smithsonian's Atherton Seidell Endowment to create an online version of Sherborn's Index Animalium. The initial project was to digitize the page images and re-key the data into a simple data structure. As the project evolved, a more complex database was developed which enabled quality field searching to retrieve species names and to search the bibliography. 
Problems with inconsistent abbreviations and styling of his bibliographies made the parsing of the data difficult. Coinciding with the development of the Biodiversity Heritage Library (BHL) in 2005, it became obvious there was a need to integrate the database converted Index Animalium, BHL's scanned taxonomic literature, and taxonomic intelligence (the algorithmic identification of binomial, Latinate name-strings). The challenges of working with legacy taxonomic citation, computer matching algorithms, and making connections have brought us to today's goal of making Sherborn available and linked to other datasets. Partnering with others to allow machine-to-machine communications the data is being examined for possible transformation into RDF markup and meeting the standards of Linked Open Data. SIL staff have partnered with Thomson Reuters and the Global Names Initiative to further enhance the Index Animalium data set. Thomson Reuters' staff is now working on integrating the species microcitation and species name in the ION: Index to Organism Names project; Richard Pyle (The Bishop Museum) is also working on further parsing of the text. The Index Animalium collaborative project's ultimate goal is to successful have researchers go seamlessly from the species name in either ION or the scanned pages of Index Animalium to the digitized original description in BHL - connecting taxonomic researchers to original authored species descriptions with just a click.

  14. Application research for 4D technology in flood forecasting and evaluation

    NASA Astrophysics Data System (ADS)

    Li, Ziwei; Liu, Yutong; Cao, Hongjie

    1998-08-01

    To monitor regions of China where disastrous floods occur frequently, to satisfy provincial governments' need for high-accuracy monitoring and damage-evaluation data, and to improve the efficiency of disaster response, a method for flood forecasting and evaluation using satellite and aerial remotely sensed imagery together with ground monitoring data was researched under the Ninth Five-Year National Key Technologies Programme. An effective and practicable flood forecasting and evaluation system was established, with DongTing Lake selected as the test site. The system uses modern digital photogrammetry, remote sensing and GIS technology; disastrous floods can be forecast and losses evaluated based on a '4D' (DEM -- Digital Elevation Model, DOQ -- Digital Orthophoto Quads, DRG -- Digital Raster Graph, DTI -- Digital Thematic Information) disaster background database. This paper describes in detail the methods for gathering and building the '4D' disaster environment background database, the application of the '4D' background data to flood forecasting and evaluation, and experimental results for the DongTing Lake test site.
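    The loss-evaluation step rests on comparing a forecast water level against the DEM layer of the '4D' background database. A minimal sketch of that idea, with an invented elevation grid and water level (the system's actual data structures are not described in the abstract):

```python
# Toy flood-extent estimate: grid cells whose DEM elevation lies below
# the forecast water level are flagged as inundated. The 3x3 grid and
# the water level are invented illustration values, not project data.

def inundated_cells(dem, water_level):
    """Return (row, col) indices of grid cells below the water level."""
    return [(r, c)
            for r, row in enumerate(dem)
            for c, elevation in enumerate(row)
            if elevation < water_level]

dem = [[32.0, 31.5, 33.0],   # elevations in metres (illustrative)
       [30.8, 30.2, 31.9],
       [30.5, 29.9, 31.0]]

flooded = inundated_cells(dem, water_level=31.0)
```

In a real system each flagged cell would be joined against the DTI (thematic) layer, e.g. land use, to price the loss; here the sketch stops at the extent.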

  15. Digital Mapping Techniques '07 - Workshop Proceedings

    USGS Publications Warehouse

    Soller, David R.

    2008-01-01

    The Digital Mapping Techniques '07 (DMT'07) workshop was attended by 85 technical experts from 49 agencies, universities, and private companies, including representatives from 27 state geological surveys. This year's meeting, the tenth in the annual series, was hosted by the South Carolina Geological Survey, from May 20-23, 2007, on the University of South Carolina campus in Columbia, South Carolina. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.

  16. Development of the Global Earthquake Model’s neotectonic fault database

    USGS Publications Warehouse

    Christophersen, Annemarie; Litchfield, Nicola; Berryman, Kelvin; Thomas, Richard; Basili, Roberto; Wallace, Laura; Ries, William; Hayes, Gavin P.; Haller, Kathleen M.; Yoshioka, Toshikazu; Koehler, Richard D.; Clark, Dan; Wolfson-Schwehr, Monica; Boettcher, Margaret S.; Villamor, Pilar; Horspool, Nick; Ornthammarath, Teraphan; Zuñiga, Ramon; Langridge, Robert M.; Stirling, Mark W.; Goded, Tatiana; Costa, Carlos; Yeats, Robert

    2015-01-01

    The Global Earthquake Model (GEM) aims to develop uniform, openly available standards, datasets and tools for worldwide seismic risk assessment through global collaboration, transparent communication and the adoption of state-of-the-art science. GEM Faulted Earth (GFE) is one of GEM's global hazard module projects. This paper describes GFE's development of a modern neotectonic fault database and a unique graphical interface for the compilation of new fault data. A key design principle is that of an electronic field notebook for capturing the observations a geologist would make about a fault. The database is designed to accommodate abundant as well as sparse fault observations. It features two layers: one for capturing neotectonic fault and fold observations, and the other for calculating potential earthquake fault sources from those observations. In order to test the flexibility of the database structure and to start a global compilation, five preexisting databases have been uploaded to the first layer and two to the second. In addition, the GFE project has characterised the world's approximately 55,000 km of subduction interfaces in a globally consistent manner as a basis for generating earthquake event sets for inclusion in earthquake hazard and risk modelling. Following the subduction interface fault schema and including the trace attributes of the GFE database schema, the 2500-km-long frontal thrust fault system of the Himalaya has also been characterised. We propose that the database structure be adopted widely, so that neotectonic fault data can make a more complete and beneficial contribution to seismic hazard and risk characterisation globally.

  17. The Digital Sample: Metadata, Unique Identification, and Links to Data and Publications

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Vinayagamoorthy, S.; Djapic, B.; Klump, J.

    2006-12-01

    A significant part of digital data in the Geosciences refers to physical samples of Earth materials, from igneous rocks to sediment cores to water or gas samples. The application and long-term utility of these sample-based data in research depend critically on (a) the availability of information (metadata) about the samples, such as geographical location and time of sampling, or sampling method, (b) links between the different data types available for individual samples that are dispersed in the literature and in digital data repositories, and (c) access to the samples themselves. Major obstacles include incomplete documentation of samples in publications, the use of ambiguous sample names, and the lack of a central catalog through which a sample's archiving location can be found. The International Geo Sample Number (IGSN), managed by the System for Earth Sample Registration (SESAR), provides solutions to these problems. The IGSN is a unique persistent identifier for samples and other GeoObjects that can be obtained by submitting sample metadata to SESAR (www.geosamples.org). If data in a publication are referenced to an IGSN (rather than an ambiguous sample name), sample metadata can readily be extracted from the SESAR database, which is evolving into a Global Sample Catalog that also makes it possible to locate the owner or curator of a sample. Use of the IGSN in digital data systems allows linkages to be built between distributed data. SESAR is contributing to the development of sample metadata standards. SESAR will integrate the IGSN into persistent, resolvable identifiers based on the handle.net service to advance direct linkages between the digital representation of samples in SESAR (sample profiles) and their related data in the literature and in web-accessible digital data repositories. Technologies outlined by Klump et al. (this session), such as the automatic creation of ontologies by text-mining applications, will be explored for harvesting identifiers of publications and datasets that contain information about a specific sample, in order to establish comprehensive data profiles for samples.
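    The mechanics described here, a unique sample identifier registered with metadata and resolvable through the handle.net service, can be sketched as follows. The metadata field names, the sample number and the coordinates are invented for illustration and are not SESAR's actual schema; the 10273 handle prefix is the one publicly associated with IGSN, used here only as an example.

```python
# Hypothetical sketch of registering a sample under an IGSN and forming
# a handle-style resolvable identifier. Field names, the IGSN string and
# the coordinates are illustrative assumptions; consult SESAR
# (www.geosamples.org) for the real registration schema.

def register_sample(catalog, igsn, metadata):
    """Store sample metadata under its IGSN so later citations can be
    resolved back to location, method and curator information."""
    if igsn in catalog:
        raise ValueError(f"IGSN {igsn} already registered")
    catalog[igsn] = metadata
    return igsn

def resolver_url(igsn, base="http://hdl.handle.net/10273/"):
    """Handle-style URL built from a prefix plus the IGSN suffix."""
    return base + igsn

catalog = {}
register_sample(catalog, "SSH000SUA", {          # invented sample number
    "material": "basalt",
    "latitude": -21.32, "longitude": -175.10,    # invented coordinates
    "collection_method": "dredge",
})
url = resolver_url("SSH000SUA")
```

The point of the design is that the citation in a paper carries only the opaque identifier; everything else lives in, and can later be corrected in, the central catalog.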

  18. Global Digital Revolution and Africa: Transforming Nigerian Universities to World Class Institutions

    ERIC Educational Resources Information Center

    Isah, Emmanuel Aileonokhuoya; Ayeni, A. O.

    2010-01-01

    This study examined the global digital revolution and the transformation of Nigerian universities. The study surveyed university developments worldwide in relation to the situation in Nigeria, and highlighted the several challenges facing Nigerian universities, including poor funding, poor personnel and poor exposure to global…

  19. Technology and the Modern Library.

    ERIC Educational Resources Information Center

    Boss, Richard W.

    1984-01-01

    Overview of the impact of information technology on libraries highlights turnkey vendors, bibliographic utilities, commercial suppliers of records, state and regional networks, computer-to-computer linkages, remote database searching, terminals and microcomputers, building local databases, delivery of information, digital telefacsimile,…

  20. A framework for analysis of large database of old art paintings

    NASA Astrophysics Data System (ADS)

    Da Rugna, Jérôme; Chareyron, Gaël; Pillay, Ruven; Joly, Morwena

    2011-03-01

    For many years, many museums and countries have organized high-definition digitization of their collections, generating massive data for each object. In this paper we focus only on art painting collections. Even so, we face a very large database of heterogeneous data: the image collection includes very old and recent scans of photographic negatives, digital photos, multi- and hyperspectral acquisitions, X-ray acquisitions, and front, back and lateral views. Moreover, art paintings suffer from much degradation: cracks, softening, artifacts, human damage and corruption over time. Considering this, it appears necessary to develop specific approaches and methods dedicated to digital art painting analysis. Consequently, this paper presents a complete framework for evaluating, comparing and benchmarking image-processing algorithms devoted to this task.

  1. Digital Mapping Techniques '11–12 workshop proceedings

    USGS Publications Warehouse

    Soller, David R.

    2014-01-01

    At these meetings, oral and poster presentations and special discussion sessions emphasized: (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase formats; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  2. Fast Multiclass Segmentation using Diffuse Interface Methods on Graphs

    DTIC Science & Technology

    2013-02-01

    …000 28 × 28 images of handwritten digits 0 through 9. Examples of entries can be found in Figure 6. The task is to classify each of the images into the… “database of handwritten digits.” [Online]. Available: http://yann.lecun.com/exdb/mnist/ [36] J. Lellmann, J. H. Kappes, J. Yuan, F. Becker, and C… corresponding digit. The images include digits from 0 to 9; thus, this is a 10-class segmentation problem. To construct the weight matrix, we used N…
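    The excerpt mentions constructing a weight matrix over the images; graph-based segmentation methods of this kind commonly use a Gaussian similarity between feature vectors. The report's exact metric and scaling are not recoverable from the excerpt, so the following is a generic sketch of that standard construction with toy 2-D "feature vectors":

```python
import math

# Gaussian similarity weights w_ij = exp(-d(x_i, x_j)^2 / sigma^2), a
# standard way to turn feature vectors into a weighted graph for
# segmentation. The four 2-D points and sigma are toy values; real use
# would take 784-dimensional MNIST pixel vectors and often keep only
# the N nearest neighbours to sparsify the graph.

def weight_matrix(points, sigma):
    n = len(points)
    w = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            d2 = sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
            w[i][j] = math.exp(-d2 / sigma ** 2)
    return w

points = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
w = weight_matrix(points, sigma=1.0)
```

Nearby points get weights near 1 and distant points weights near 0, so the graph separates into the two clusters a segmentation method would then label.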

  3. The wavelet/scalar quantization compression standard for digital fingerprint images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, J.N.; Brislawn, C.M.

    1994-04-01

    A new digital image compression standard has been adopted by the US Federal Bureau of Investigation for use on digitized gray-scale fingerprint images. The algorithm is based on adaptive uniform scalar quantization of a discrete wavelet transform image decomposition and is referred to as the wavelet/scalar quantization standard. The standard produces archival quality images at compression ratios of around 20:1 and will allow the FBI to replace their current database of paper fingerprint cards with digital imagery.
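    The two stages named in the abstract, a wavelet decomposition followed by uniform scalar quantization, can be illustrated with a toy one-level 1-D Haar transform. This is only a sketch of the principle: the published standard operates on 2-D images with a multi-subband 9/7 filter bank, and the quantization step size here is an arbitrary illustration value.

```python
# Toy sketch of the wavelet/scalar-quantization idea: decompose a signal
# into averages (approximation) and differences (detail), then coarsely
# quantize the detail coefficients. Lossy but close reconstruction.

def haar_1d(signal):
    """One level of a Haar transform: pairwise averages and differences."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def quantize(coeffs, step):
    """Uniform scalar quantization: map coefficients to integer bin indices."""
    return [round(c / step) for c in coeffs]

def dequantize(bins, step):
    return [b * step for b in bins]

signal = [10, 12, 8, 6, 7, 7, 20, 24]
approx, detail = haar_1d(signal)
bins = quantize(detail, step=1.5)        # coarse bins discard detail: lossy
detail_hat = dequantize(bins, step=1.5)
# Invert the transform: each pair is (average + diff, average - diff).
recon = [v for a, d in zip(approx, detail_hat) for v in (a + d, a - d)]
```

The integer bin indices are what an entropy coder would then compress; the reconstruction differs from the input by at most half a quantization step per coefficient, which is the sense in which the standard trades compression ratio against archival image quality.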

  4. Primary Multimedia Objects and 'Educational Metadata' A Fundamental Dilemma for Developers of Multimedia Archives; Evaluation of Digital Library Impact and User Communities by Analysis of Usage Patterns; The KYVL Kentuckiana Digital Library Project: Background and Current Status; DPDx Collection.

    ERIC Educational Resources Information Center

    Shabajee, Paul; Bollen, Johan; Luce, Rick; Weig, Eric

    2002-01-01

    Includes four articles that discuss multimedia educational database systems and the use of metadata, including repurposing; the evaluation of digital library use that analyzes the retrieval habits of users; the Kentucky Virtual Library (KYVL) and digital collection project; and the collection of the Division of Parasitic Diseases, Centers for…

  5. Global building inventory for earthquake loss estimation and risk management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David; Porter, Keith

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat’s demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature.
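    The country-level structure described, a distribution of the building stock over types, lends itself to a simple expected-loss computation: weight each type's vulnerability by its share of the stock. The shares and collapse probabilities below are invented placeholders, not PAGER values.

```python
# Hypothetical country inventory: fraction of the building stock by
# construction type (shares sum to 1), crossed with an invented per-type
# collapse probability at some fixed shaking intensity.

inventory = {"unreinforced_masonry": 0.40,
             "reinforced_concrete": 0.35,
             "wood_frame": 0.25}

collapse_prob = {"unreinforced_masonry": 0.12,
                 "reinforced_concrete": 0.03,
                 "wood_frame": 0.01}

def expected_collapse_fraction(inventory, collapse_prob):
    """Weight each type's collapse probability by its share of the stock."""
    assert abs(sum(inventory.values()) - 1.0) < 1e-9
    return sum(share * collapse_prob[btype]
               for btype, share in inventory.items())

frac = expected_collapse_fraction(inventory, collapse_prob)
```

This is why the inventory distribution, not just total building counts, matters for loss estimation: two countries with identical stock sizes but different type mixes yield very different expected losses.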

  6. Digital Systems Supporting Cognition and Exploratory Learning in Twenty-First Century: Guest Editorial

    ERIC Educational Resources Information Center

    Spector, J. Michael; Ifenthaler, Dirk; Sampson, Demetrios G.

    2016-01-01

    Digital systems and digital technologies are globally investigated for their potential to transform learning, teaching and assessment towards offering unique learning experiences to the twenty-first century learners. This Special Issue on "Digital systems supporting cognition and exploratory learning in twenty-first century" aims to…

  7. Developing Students' Professional Digital Identity

    ERIC Educational Resources Information Center

    Cochrane, Thomas; Antonczak, Laurent

    2015-01-01

    In contrast to the myth of the "Digital Native" and the ubiquity of Facebook use, we have found that students' digital identities are predominantly social with their online activity beyond Facebook limited to being social media consumers rather than producers. Within a global economy students need to learn new digital literacy skills to…

  8. Modis, SeaWIFS, and Pathfinder funded activities

    NASA Technical Reports Server (NTRS)

    Evans, Robert H.

    1995-01-01

    MODIS (Moderate Resolution Imaging Spectrometer), SeaWIFS (Sea-viewing Wide Field Sensor), Pathfinder, and DSP (Digital Signal Processor) objectives are summarized. An overview of current progress is given for the automatic processing database, client/server status, matchup database, and DSP support.

  9. GISD

    Science.gov Websites

    GISD, the Global Invasive Species Database. The Global Invasive Species Database was developed and is managed by the Invasive…

  10. Global hierarchical classification of deepwater and wetland environments from remote sensing products

    NASA Astrophysics Data System (ADS)

    Fluet-Chouinard, E.; Lehner, B.; Aires, F.; Prigent, C.; McIntyre, P. B.

    2017-12-01

    Global surface water maps have improved in spatial and temporal resolution through various remote sensing methods: open water extents from compiled Landsat archives, and inundation from topographically downscaled multi-sensor retrievals. These time series capture variations of open water and inundation through time without discriminating between hydrographic features (e.g. lakes, reservoirs, river channels and wetland types), as other databases have done in static representations. Available data sources present the opportunity to generate a comprehensive map and typology of aquatic environments (deepwater and wetlands) that improves on earlier digitized inventories and maps. The challenge of classifying surface waters globally is to distinguish wetland types with meaningful characteristics or proxies (hydrology, water chemistry, soils, vegetation) while accommodating the limitations of remote sensing data. We present a new wetland classification scheme designed for global application and produce a map of aquatic ecosystem types globally using state-of-the-art remote sensing products. Our classification scheme combines open water extent and expands it with downscaled multi-sensor inundation data to capture the maximal vegetated wetland extent. The hierarchical structure of the classification is modified from the Cowardin (1979) system developed for the USA. The first-level classification is based on a combination of landscape position and water source (e.g. lacustrine, riverine, palustrine, coastal and artificial), while the second level represents the hydrologic regime (e.g. perennial, seasonal, intermittent and waterlogged). Class-specific descriptors can further detail the wetland types with soils and vegetation cover. Our globally consistent nomenclature and top-down mapping allow direct comparison across biogeographic regions, as well as the upscaling of biogeochemical fluxes and other landscape-level functions.
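    The two-level hierarchy described (landscape position/water source at the first level, hydrologic regime at the second) can be sketched as a simple validated lookup. The class-code strings are invented for illustration and are not the authors' actual nomenclature.

```python
# Toy two-level wetland classifier mirroring the structure described:
# level 1 = landscape position / water source, level 2 = hydrologic
# regime. The code strings are invented, not the paper's nomenclature.

LEVEL1 = {"lacustrine", "riverine", "palustrine", "coastal", "artificial"}
LEVEL2 = {"perennial", "seasonal", "intermittent", "waterlogged"}

def classify(position, regime):
    """Return a hierarchical class code like 'riverine/seasonal'."""
    if position not in LEVEL1:
        raise ValueError(f"unknown landscape position: {position}")
    if regime not in LEVEL2:
        raise ValueError(f"unknown hydrologic regime: {regime}")
    return f"{position}/{regime}"
```

The hierarchy is what makes cross-region comparison possible: two maps that disagree at the regime level can still be aggregated and compared at the coarser landscape-position level.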

  11. The global compendium of Aedes aegypti and Ae. albopictus occurrence

    NASA Astrophysics Data System (ADS)

    Kraemer, Moritz U. G.; Sinka, Marianne E.; Duda, Kirsten A.; Mylne, Adrian; Shearer, Freya M.; Brady, Oliver J.; Messina, Jane P.; Barker, Christopher M.; Moore, Chester G.; Carvalho, Roberta G.; Coelho, Giovanini E.; van Bortel, Wim; Hendrickx, Guy; Schaffner, Francis; Wint, G. R. William; Elyazar, Iqbal R. F.; Teng, Hwa-Jen; Hay, Simon I.

    2015-07-01

    Aedes aegypti and Ae. albopictus are the main vectors transmitting dengue and chikungunya viruses. Although these viruses are pathogens of global public health importance, knowledge of their vectors’ global distribution remains patchy and sparse. A global geographic database of known occurrences of Ae. aegypti and Ae. albopictus between 1960 and 2014 was compiled. Herein we present the database, which comprises occurrence data linked to point or polygon locations, derived from peer-reviewed literature and unpublished studies including national entomological surveys and expert networks. We describe all data collection processes, as well as geo-positioning methods, database management and quality-control procedures. This is the first comprehensive global database of Ae. aegypti and Ae. albopictus occurrence, consisting of 19,930 and 22,137 geo-positioned occurrence records respectively. Both datasets can be used for a variety of mapping and spatial analyses of the vectors and, by inference, the diseases they transmit.

  12. The global compendium of Aedes aegypti and Ae. albopictus occurrence

    PubMed Central

    Kraemer, Moritz U. G.; Sinka, Marianne E.; Duda, Kirsten A.; Mylne, Adrian; Shearer, Freya M.; Brady, Oliver J.; Messina, Jane P.; Barker, Christopher M.; Moore, Chester G.; Carvalho, Roberta G.; Coelho, Giovanini E.; Van Bortel, Wim; Hendrickx, Guy; Schaffner, Francis; Wint, G. R. William; Elyazar, Iqbal R. F.; Teng, Hwa-Jen; Hay, Simon I.

    2015-01-01

    Aedes aegypti and Ae. albopictus are the main vectors transmitting dengue and chikungunya viruses. Although these viruses are pathogens of global public health importance, knowledge of their vectors’ global distribution remains patchy and sparse. A global geographic database of known occurrences of Ae. aegypti and Ae. albopictus between 1960 and 2014 was compiled. Herein we present the database, which comprises occurrence data linked to point or polygon locations, derived from peer-reviewed literature and unpublished studies including national entomological surveys and expert networks. We describe all data collection processes, as well as geo-positioning methods, database management and quality-control procedures. This is the first comprehensive global database of Ae. aegypti and Ae. albopictus occurrence, consisting of 19,930 and 22,137 geo-positioned occurrence records respectively. Both datasets can be used for a variety of mapping and spatial analyses of the vectors and, by inference, the diseases they transmit. PMID:26175912

  13. WorldView-2 and the evolution of the DigitalGlobe remote sensing satellite constellation: introductory paper for the special session on WorldView-2

    NASA Astrophysics Data System (ADS)

    Anderson, Neal T.; Marchisio, Giovanni B.

    2012-06-01

    Over the last decade DigitalGlobe (DG) has built and launched a series of remote sensing satellites with steadily increasing capabilities: QuickBird, WorldView-1 (WV-1), and WorldView-2 (WV-2). Today, this constellation acquires over 2.5 million km2 of imagery on a daily basis. This paper presents the configuration and performance capabilities of each of these satellites, with emphasis on the unique spatial and spectral capabilities of WV-2. WV-2 employs high-precision star tracker and inertial measurement units to achieve a geolocation accuracy of 5 m Circular Error, 90% confidence (CE90). The native resolution of WV-2 is 0.5 m GSD in the panchromatic band and 2 m GSD in 8 multispectral bands. Four of the multispectral bands match those of the Landsat series of satellites; four new bands enable novel and expanded applications. We are rapidly establishing and refreshing a global database of very high resolution (VHR) 8-band multispectral imagery. Control moment gyroscopes (CMGs) on both WV-1 and WV-2 improve collection capacity and provide the agility to capture multi-angle sequences in rapid succession. These capabilities result in a rich combination of image features that can be exploited to develop enhanced monitoring solutions. Algorithms for interpretation and analysis can leverage: 1) broader and more continuous spectral coverage at 2 m resolution; 2) textural and morphological information from the 0.5 m panchromatic band; 3) ancillary information from stereo and multi-angle collects, including high precision digital elevation models; 4) frequent revisits and time-series collects; and 5) the global reference image archives. We introduce the topic of creative fusion of image attributes, as this provides a unifying theme for many of the papers in this WV-2 Special Session.

  14. A design for the geoinformatics system

    NASA Astrophysics Data System (ADS)

    Allison, M. L.

    2002-12-01

    Informatics integrates and applies information technologies with scientific and technical disciplines. A geoinformatics system targets the spatially based sciences. The system is not a master database, but will collect pertinent information from disparate databases distributed around the world. Seamless interoperability of databases promises quantum leaps in productivity not only for scientific researchers but also for many areas of society including business and government. The system will incorporate: acquisition of analog and digital legacy data; efficient information and data retrieval mechanisms (via data mining and web services); accessibility to and application of visualization, analysis, and modeling capabilities; online workspace, software, and tutorials; GIS; integration with online scientific journal aggregates and digital libraries; access to real time data collection and dissemination; user-defined automatic notification and quality control filtering for selection of new resources; and application to field techniques such as mapping. In practical terms, such a system will provide the ability to gather data over the Web from a variety of distributed sources, regardless of computer operating systems, database formats, and servers. Search engines will gather data about any geographic location, above, on, or below ground, covering any geologic time, and at any scale or detail. A distributed network of digital geolibraries can archive permanent copies of databases at risk of being discontinued and those that continue to be maintained by the data authors. The geoinformatics system will generate results from widely distributed sources to function as a dynamic data network. Instead of posting a variety of pre-made tables, charts, or maps based on static databases, the interactive dynamic system creates these products on the fly, each time an inquiry is made, using the latest information in the appropriate databases. 
Thus, in the dynamic system, a map generated today may differ from one created yesterday and one to be created tomorrow, because the databases used to make it are constantly (and sometimes automatically) being updated.

  15. Synthesis of phylogeny and taxonomy into a comprehensive tree of life

    PubMed Central

    Hinchliff, Cody E.; Smith, Stephen A.; Allman, James F.; Burleigh, J. Gordon; Chaudhary, Ruchi; Coghill, Lyndon M.; Crandall, Keith A.; Deng, Jiabin; Drew, Bryan T.; Gazis, Romina; Gude, Karl; Hibbett, David S.; Katz, Laura A.; Laughinghouse, H. Dail; McTavish, Emily Jane; Midford, Peter E.; Owen, Christopher L.; Ree, Richard H.; Rees, Jonathan A.; Soltis, Douglas E.; Williams, Tiffani; Cranston, Karen A.

    2015-01-01

    Reconstructing the phylogenetic relationships that unite all lineages (the tree of life) is a grand challenge. The paucity of homologous character data across disparately related lineages currently renders direct phylogenetic inference untenable. To reconstruct a comprehensive tree of life, we therefore synthesized published phylogenies, together with taxonomic classifications for taxa never incorporated into a phylogeny. We present a draft tree containing 2.3 million tips—the Open Tree of Life. Realization of this tree required the assembly of two additional community resources: (i) a comprehensive global reference taxonomy and (ii) a database of published phylogenetic trees mapped to this taxonomy. Our open source framework facilitates community comment and contribution, enabling the tree to be continuously updated when new phylogenetic and taxonomic data become digitally available. Although data coverage and phylogenetic conflict across the Open Tree of Life illuminate gaps in both the underlying data available for phylogenetic reconstruction and the publication of trees as digital objects, the tree provides a compelling starting point for community contribution. This comprehensive tree will fuel fundamental research on the nature of biological diversity, ultimately providing up-to-date phylogenies for downstream applications in comparative biology, ecology, conservation biology, climate change, agriculture, and genomics. PMID:26385966

  16. Synthesis of phylogeny and taxonomy into a comprehensive tree of life.

    PubMed

    Hinchliff, Cody E; Smith, Stephen A; Allman, James F; Burleigh, J Gordon; Chaudhary, Ruchi; Coghill, Lyndon M; Crandall, Keith A; Deng, Jiabin; Drew, Bryan T; Gazis, Romina; Gude, Karl; Hibbett, David S; Katz, Laura A; Laughinghouse, H Dail; McTavish, Emily Jane; Midford, Peter E; Owen, Christopher L; Ree, Richard H; Rees, Jonathan A; Soltis, Douglas E; Williams, Tiffani; Cranston, Karen A

    2015-10-13

    Reconstructing the phylogenetic relationships that unite all lineages (the tree of life) is a grand challenge. The paucity of homologous character data across disparately related lineages currently renders direct phylogenetic inference untenable. To reconstruct a comprehensive tree of life, we therefore synthesized published phylogenies, together with taxonomic classifications for taxa never incorporated into a phylogeny. We present a draft tree containing 2.3 million tips-the Open Tree of Life. Realization of this tree required the assembly of two additional community resources: (i) a comprehensive global reference taxonomy and (ii) a database of published phylogenetic trees mapped to this taxonomy. Our open source framework facilitates community comment and contribution, enabling the tree to be continuously updated when new phylogenetic and taxonomic data become digitally available. Although data coverage and phylogenetic conflict across the Open Tree of Life illuminate gaps in both the underlying data available for phylogenetic reconstruction and the publication of trees as digital objects, the tree provides a compelling starting point for community contribution. This comprehensive tree will fuel fundamental research on the nature of biological diversity, ultimately providing up-to-date phylogenies for downstream applications in comparative biology, ecology, conservation biology, climate change, agriculture, and genomics.

  17. Video Altimeter and Obstruction Detector for an Aircraft

    NASA Technical Reports Server (NTRS)

    Delgado, Frank J.; Abernathy, Michael F.; White, Janis; Dolson, William R.

    2013-01-01

    Video-based altimetric and obstruction-detection systems for aircraft have been partially developed. The hardware of a system of this type includes a downward-looking video camera, a video digitizer, a Global Positioning System receiver or other means of measuring the aircraft's velocity relative to the ground, a gyroscope-based or other attitude-determination subsystem, and a computer running altimetric and/or obstruction-detection software. From the digitized video data, the altimetric software computes the pixel velocity in an appropriate part of the video image and the corresponding angular relative motion of the ground within the field of view of the camera. Then, by use of trigonometric relationships among the aircraft velocity, the attitude of the camera, the angular relative motion, and the altitude, the software computes the altitude. The obstruction-detection software performs somewhat similar calculations as part of a larger task in which it uses the pixel velocity data from the entire video image to compute a depth map, which can be correlated with a terrain map, showing locations of potential obstructions. The depth map can be used as a real-time hazard display and/or to update an obstruction database.
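    For the simplest case, a nadir-pointing camera in level flight, the trigonometric relationship the abstract alludes to reduces to a pinhole-camera proportion; the numeric values below are invented for illustration, and the attitude correction the real system applies for a tilted camera is omitted.

```python
# Pinhole-camera sketch of video altimetry for a nadir-looking camera:
# a ground feature at altitude h sweeps across the image at
#     pixel_velocity = focal_length_px * ground_speed / altitude
# so the altitude follows by rearrangement. Values are illustrative;
# the flight system also folds in camera attitude, omitted here.

def altitude_from_pixel_flow(ground_speed_m_s, pixel_velocity_px_s,
                             focal_length_px):
    """Altitude in metres from ground speed (m/s), observed image flow
    (pixels/s) and focal length expressed in pixels."""
    return focal_length_px * ground_speed_m_s / pixel_velocity_px_s

h = altitude_from_pixel_flow(ground_speed_m_s=50.0,
                             pixel_velocity_px_s=100.0,
                             focal_length_px=1000.0)
```

Note the inverse relation: halving the observed pixel flow at the same ground speed doubles the inferred altitude, which is also why nearby obstructions (fast apparent motion) stand out in the depth map.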

  18. GTN-G, WGI, RGI, DCW, GLIMS, WGMS, GCOS - What's all this about? (Invited)

    NASA Astrophysics Data System (ADS)

    Paul, F.; Raup, B. H.; Zemp, M.

    2013-12-01

    In a large collaborative effort, the glaciological community has compiled a new and spatially complete global dataset of glacier outlines, the so-called Randolph Glacier Inventory or RGI. Despite its regional shortcomings in quality (e.g. in regard to geolocation, generalization, and interpretation), this dataset was heavily used for global-scale modelling applications (e.g. determination of total glacier volume and glacier contribution to sea-level rise) in support of the forthcoming 5th Assessment Report (AR5) of Working Group I of the IPCC. The RGI is a merged dataset that is largely based on the GLIMS database and several new datasets provided by the community (both are mostly derived from satellite data), as well as the Digital Chart of the World (DCW) and glacier attribute information (location, size) from the World Glacier Inventory (WGI). There are now two key tasks to be performed: (1) improving the quality of the RGI in all regions where the outlines do not meet the quality required for local-scale applications, and (2) integrating the RGI into the GLIMS glacier database to improve its spatial completeness. While (1) again requires a huge effort but is already ongoing, (2) is mainly a technical issue that is nearly solved. Apart from this technical dimension, there is also a more political or structural one. While GLIMS is responsible for the remote sensing and glacier inventory part (Tier 5) of the Global Terrestrial Network for Glaciers (GTN-G) within the Global Climate Observing System (GCOS), the World Glacier Monitoring Service (WGMS) is collecting and disseminating the field observations. Given new global products derived from satellite data (e.g. elevation changes and velocity fields) and the community's wish to keep a snapshot dataset such as the RGI available, how to make all these datasets available to the community without duplicating efforts, while making best use of the very limited financial resources available, must now be discussed. This overview presentation describes the currently available datasets, clarifies the terminology and the international framework, and suggests a way forward to serve the community best.

  19. 78 FR 51744 - Digital Trade in the U.S. and Global Economies, Part 2; Scheduling of an Additional Public Hearing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-21

    ...The Commission has scheduled a public hearing in investigation No. 332-540, Digital Trade in the U.S. and Global Economies, Part 2 at the NASA Ames Research Center in Moffett Field, California beginning at 9:30 a.m. on Wednesday, September 25, 2013.

  20. Essays on the Digital Divide--Explorations through Global, National and Individual Lenses

    ERIC Educational Resources Information Center

    Skaletsky, Maria

    2013-01-01

    The Digital Divide has emerged as an important research and policy issue during the past thirty years. The divide exists at different levels, such as global, regional and individual levels. While extensive research already exists on this subject, the complexity of the issue presents opportunities for further research. In particular, there is ample…

  1. Explaining the Global Digital Divide: Economic, Political and Sociological Drivers of Cross-National Internet Use

    ERIC Educational Resources Information Center

    Guillen, Mauro F.; Suarez, Sandra L.

    2005-01-01

    We argue that the global digital divide, as measured by cross-national differences in Internet use, is the result of the economic, regulatory and sociopolitical characteristics of countries and their evolution over time. We predict Internet use to increase with world-system status, privatization and competition in the telecommunications sector,…

  2. Librarians Lead the Growth of Information Literacy and Global Digital Citizens

    ERIC Educational Resources Information Center

    Crockett, Lee Watanabe

    2018-01-01

    Librarians are leaders in growing global digital citizens. The libraries of the future are more than just housing centers for books and media. They are invigorating meeting places and communities where truly meaningful learning and discovery take place. As technology has transformed reading and learning, it has also transformed the vision of the…

  3. ESA personal communications and digital audio broadcasting systems based on non-geostationary satellites

    NASA Technical Reports Server (NTRS)

    Logalbo, P.; Benedicto, J.; Viola, R.

    1993-01-01

    Personal Communications and Digital Audio Broadcasting are two new services that the European Space Agency (ESA) is investigating for future European and Global Mobile Satellite systems. ESA is active in promoting these services in their various mission options including non-geostationary and geostationary satellite systems. A Medium Altitude Global Satellite System (MAGSS) for global personal communications at L and S-band, and a Multiregional Highly inclined Elliptical Orbit (M-HEO) system for multiregional digital audio broadcasting at L-band are described. Both systems are being investigated by ESA in the context of future programs, such as Archimedes, which are intended to demonstrate the new services and to develop the technology for future non-geostationary mobile communication and broadcasting satellites.

  4. A georeferenced Landsat digital database for forest insect-damage assessment

    NASA Technical Reports Server (NTRS)

    Williams, D. L.; Nelson, R. F.; Dottavio, C. L.

    1985-01-01

    In 1869, the gypsy moth caterpillar was introduced in the U.S. in connection with the experiments of a French scientist. Throughout the insect's period of establishment, gypsy moth populations have periodically increased to epidemic proportions. For programs concerned with preventing the insect's spread, it would be highly desirable to be able to employ a survey technique which could provide timely, accurate, and standardized assessments at a reasonable cost. A project was, therefore, initiated with the aim to demonstrate the usefulness of satellite remotely sensed data for monitoring the insect defoliation of hardwood forests in Pennsylvania. A major effort within this project involved the development of a map-registered Landsat digital database. A complete description of the database developed is provided along with information regarding the employed data management system.

  5. Geophysical Log Database for the Mississippi Embayment Regional Aquifer Study (MERAS)

    USGS Publications Warehouse

    Hart, Rheannon M.; Clark, Brian R.

    2008-01-01

    The Mississippi Embayment Regional Aquifer Study (MERAS) is an investigation of ground-water availability and sustainability within the Mississippi embayment as part of the U.S. Geological Survey Ground-Water Resources Program. The MERAS area consists of approximately 70,000 square miles and encompasses parts of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. More than 2,600 geophysical logs of test holes and wells within the MERAS area were compiled into a database and were used to develop a digital hydrogeologic framework from land surface to the top of the Midway Group of upper Paleocene age. The purpose of this report is to document, present, and summarize the geophysical log database, as well as to preserve the geophysical logs in a digital image format for online access.

  6. Automatic Mexican sign language and digits recognition using normalized central moments

    NASA Astrophysics Data System (ADS)

    Solís, Francisco; Martínez, David; Espinosa, Oscar; Toxqui, Carina

    2016-09-01

    This work presents a framework for automatic recognition of Mexican sign language and digits, based on a computer vision system using normalized central moments and artificial neural networks. Images are captured by a digital IP camera, with four LED reflectors and a green background used to reduce computational cost and avoid the need for special gloves. 42 normalized central moments are computed per frame and fed to a Multi-Layer Perceptron to recognize each database. Four versions per sign and digit were used in the training phase. Recognition rates of 93% and 95% were achieved for Mexican sign language and digits, respectively.
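The abstract does not spell out the moment formulation; a minimal sketch of translation- and scale-invariant normalized central moments, as commonly defined in the computer-vision literature, might look like this (the toy image and function name are illustrative, not from the paper):

```python
import numpy as np

def normalized_central_moment(img, p, q):
    """Translation- and scale-invariant moment eta_pq of a binary image."""
    ys, xs = np.nonzero(img)              # foreground pixel coordinates
    x_bar, y_bar = xs.mean(), ys.mean()   # centroid
    mu_pq = ((xs - x_bar) ** p * (ys - y_bar) ** q).sum()
    mu_00 = float(len(xs))                # zeroth central moment = area
    return mu_pq / mu_00 ** (1 + (p + q) / 2.0)

# toy binary silhouette standing in for a segmented hand frame
img = np.zeros((8, 8), dtype=np.uint8)
img[2:6, 3:5] = 1
feature = normalized_central_moment(img, 2, 0)
```

Stacking such moments for several (p, q) orders per frame would yield a feature vector like the 42-element one fed to the perceptron.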

  7. Infectious diseases and global warming: Tracking disease incidence rates globally

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Low, N.C.

    1995-09-01

    Despite the growing importance of the impact of global warming on public health, there is no global database system to monitor infectious disease, and disease in general, against which global data on climate change and environmental factors, such as temperature, greenhouse gases, and human activities (e.g., coastal development, deforestation), can be calibrated, investigated, and correlated. The author proposes that disease incidence rates be adopted as the basic global measure of morbidity of infectious diseases. The importance of a correctly chosen measure of morbidity is presented, and the importance of choosing disease incidence rates as that measure, together with its mathematical foundation, is discussed. The author further proposes the establishment of a global database system to track the incidence rates of infectious diseases. Only such a global system can be used to calibrate and correlate other globally tracked climatic, greenhouse gas, and environmental data. The infrastructure and data sources for building such a global database are discussed.
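As a sketch of the measure the author advocates: an incidence rate is simply the count of new cases divided by the population at risk over the observation period, usually scaled to a standard denominator (all numbers below are hypothetical):

```python
def incidence_rate(new_cases, population_at_risk, per=100_000):
    """Incidence rate: new cases per `per` persons at risk over the period."""
    return new_cases / population_at_risk * per

# hypothetical: 42 new cases in a population of 1.5 million over one year
rate = incidence_rate(new_cases=42, population_at_risk=1_500_000)
```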

  8. Global Ground Motion Prediction Equations Program

    Science.gov Websites

    Task 2: Compile and Critically Review GMPEs. Task 3: Select or Derive a Global Set of GMPEs. Task 6: Design the Specifications to Compile a Global Database of Soil Classification. Task 5: Build a Database of … Update on PEER's Global GMPEs Project from recent workshop in Turkey, posted June 11, 2012.

  9. Geologic Map of the Wenatchee 1:100,000 Quadrangle, Central Washington: A Digital Database

    USGS Publications Warehouse

    Tabor, R.W.; Waitt, R.B.; Frizzell, V.A.; Swanson, D.A.; Byerly, G.R.; Bentley, R.D.

    2005-01-01

    This digital map database has been prepared by R.W. Tabor from the published Geologic map of the Wenatchee 1:100,000 Quadrangle, Central Washington. Together with the accompanying text files as PDF, it provides information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The authors mapped most of the bedrock geology at 1:100,000 scale, but compiled Quaternary units at 1:24,000 scale. The Quaternary contacts and structural data have been much simplified for the 1:100,000-scale map and database. The spatial resolution (scale) of the database is 1:100,000 or smaller. This database depicts the distribution of geologic materials and structures at a regional (1:100,000) scale. The report is intended to provide geologic information for the regional study of materials properties, earthquake shaking, landslide potential, mineral hazards, seismic velocity, and earthquake faults. In addition, the report contains information and interpretations about the regional geologic history and framework. However, the regional scale of this report does not provide sufficient detail for site development purposes.

  10. Soil and Land Resources Information System (SLISYS-Tarim) for Sustainable Management of River Oases along the Tarim River, China

    NASA Astrophysics Data System (ADS)

    Othmanli, Hussein; Zhao, Chengyi; Stahr, Karl

    2017-04-01

    The Tarim River Basin is the largest continental basin in China. The region has an extremely continental desert climate characterized by little rainfall (<50 mm/a) and high potential evaporation (>3000 mm/a). Climate change is severely affecting the basin, causing soil salinization, water shortage, and declining crop production. Therefore, a Soil and Land Resources Information System (SLISYS-Tarim) for the regional simulation of crop yield in the basin was developed. SLISYS-Tarim consists of a database and an agro-ecological simulation model, EPIC (Environmental Policy Integrated Climate). The database comprises relational tables with information about soils, terrain conditions, land use, and climate. The soil data include 50 soil profiles that were dug, analyzed, described, and classified in order to characterize the soils of the region. DEM data were integrated with geological maps to build a digital terrain structure. Remote sensing data from Landsat images were applied for soil mapping and for land use and land cover classification. An additional database of climate data, land management, and crop information was linked to the system as well. The SLISYS-Tarim database was constructed by integrating and overlaying the recommended thematic maps within a geographic information system (GIS) environment to meet the data standard of the global and national SOTER digital database. This database provides appropriate input and output data for crop modelling with the EPIC model at various scales in the Tarim Basin. The EPIC model was run to simulate cotton production under a constructed scenario characterizing current management practices, soil properties, and climate conditions. For calibration, some parameters were adjusted so that the modelled cotton yield fits the measured yield at the field scale. The modelling results were then validated against remote sensing data. The simulated cotton yield varied with field management, soil type, and salinity level, with soil salinity the main limiting factor. Furthermore, the calibrated and validated EPIC model was run under several scenarios of climate conditions and land management practices to estimate the effect of climate change on cotton production and the sustainability of agricultural systems in the basin. The application of SLISYS-Tarim showed that the database is a suitable framework for the storage and retrieval of soil and terrain data at various scales, and that simulation with the EPIC model can assess the impact of climate change and management strategies. SLISYS-Tarim can therefore serve as a tool for regional planning and decision support at regional and national scales.
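The abstract describes adjusting EPIC parameters until the modelled yield matches the measured yield. Assuming a monotonic yield response to a single scalar parameter, that kind of calibration can be sketched as a bisection search (the model function and bounds here are illustrative stand-ins, not EPIC internals):

```python
def calibrate(model, measured, lo, hi, tol=1e-6):
    """Bisection search for a scalar parameter p in [lo, hi] such that
    model(p) matches the measured value, assuming model is monotonically
    increasing -- a stand-in for the manual EPIC parameter adjustment."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if model(mid) < measured:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# toy monotonic yield response to a single hypothetical parameter
p = calibrate(lambda x: 2.0 * x, measured=3.0, lo=0.0, hi=10.0)
```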

  11. A global building inventory for earthquake loss estimation and risk management

    USGS Publications Warehouse

    Jaiswal, K.; Wald, D.; Porter, K.

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. © 2010, Earthquake Engineering Research Institute.

  12. MyLibrary: A Web Personalized Digital Library.

    ERIC Educational Resources Information Center

    Rocha, Catarina; Xexeo, Geraldo; da Rocha, Ana Regina C.

    With the increasing availability of information on Internet information providers, like search engines, digital libraries and online databases, it becomes more important to have personalized systems that help users to find relevant information. One type of personalization that is growing in use is recommender systems. This paper presents…

  13. Preserving the 'Athens of Indiana' through Digitization.

    ERIC Educational Resources Information Center

    Helling, Bill

    2003-01-01

    Describes a digitization project at the public library in Crawfordsville, Indiana that was designed to preserve their local history collection. Highlights include damage to the collection from fire, termites, use, and age; selecting a scanner and software; creating databases; and making information accessible on the Web. (LRW)

  14. Tapping into the Hexagon spy imagery database: A new automated pipeline for geomorphic change detection

    NASA Astrophysics Data System (ADS)

    Maurer, Joshua; Rupper, Summer

    2015-10-01

    Declassified historical imagery from the Hexagon spy satellite database has near-global coverage, yet remains a largely untapped resource for geomorphic change studies. Unavailable satellite ephemeris data make DEM (digital elevation model) extraction difficult in terms of time and accuracy. A new fully-automated pipeline for DEM extraction and image orthorectification is presented which yields accurate results and greatly increases efficiency over traditional photogrammetric methods, making the Hexagon image database much more appealing and accessible. A 1980 Hexagon DEM is extracted and geomorphic change computed for the Thistle Creek Landslide region in the Wasatch Range of North America to demonstrate an application of the new method. Surface elevation changes resulting from the landslide show an average elevation decrease of 14.4 ± 4.3 m in the source area, an increase of 17.6 ± 4.7 m in the deposition area, and a decrease of 30.2 ± 5.1 m resulting from a new roadcut. Two additional applications of the method include volume estimates of material excavated during the Mount St. Helens volcanic eruption and the volume of net ice loss over a 34-year period for glaciers in the Bhutanese Himalayas. These results show the value of Hexagon imagery in detecting and quantifying historical geomorphic change, especially in regions where other data sources are limited.
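The elevation-change figures above come from differencing a historical DEM against a later one over a region of interest. A minimal sketch (with a toy grid and a simple standard error in place of the study's full uncertainty propagation) might be:

```python
import numpy as np

def mean_elevation_change(dem_old, dem_new, mask):
    """Mean surface-elevation change (m) over a masked region of interest."""
    diff = (dem_new - dem_old)[mask]
    mean = diff.mean()
    # simple standard error of the mean; a real study would also propagate
    # co-registration and DEM-extraction uncertainties
    stderr = diff.std(ddof=1) / np.sqrt(diff.size)
    return mean, stderr

# toy 4x4 grids: the top two rows drop 14 m between epochs
dem_1980 = np.full((4, 4), 100.0)
dem_2014 = dem_1980.copy()
dem_2014[:2, :] -= 14.0
source_mask = np.zeros((4, 4), dtype=bool)
source_mask[:2, :] = True
mean, err = mean_elevation_change(dem_1980, dem_2014, source_mask)
```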

  15. Topside Ionogram Scaler With True Height Algorithm (TOPIST): Automated processing of ISIS topside ionograms

    NASA Astrophysics Data System (ADS)

    Bilitza, Dieter; Huang, Xueqin; Reinisch, Bodo W.; Benson, Robert F.; Hills, H. Kent; Schar, William B.

    2004-02-01

    The United States/Canadian ISIS-1 and ISIS-2 satellites collected several million topside ionograms in the 1960s and 1970s with a multinational network of ground stations that provided good global coverage. However, processing of these ionograms into electron density profiles required time-consuming manual scaling of the traces from the analog ionograms, and as a result, only a few percent of the ionograms had been processed into electron density profiles. In recent years an effort began to digitize the analog recordings to prepare the ionograms for computerized analysis. As of November 2002, approximately 390,000 ISIS-1 and ISIS-2 digital topside-sounder ionograms have been produced. The Topside Ionogram Scaler With True Height Algorithm (TOPIST) program was developed for the automated scaling of the echo traces and for the inversion of these traces into topside electron density profiles. The program is based on the techniques that have been successfully applied in the analysis of ground-based Digisonde ionograms. The TOPIST software also includes an "editing option" for manual scaling of the more difficult ionograms, which could not be scaled during the automated TOPIST run. TOPIST is now successfully scaling ~60% of the ISIS ionograms, and the electron density profiles are available through the online archive of the National Space Science Data Center at ftp://nssdcftp.gsfc.nasa.gov/spacecraft_data/isis/topside_sounder. This data restoration effort is producing a unique global database of topside electron densities over more than one solar cycle, which will be of particular importance for improvements of topside ionosphere models, especially the International Reference Ionosphere.

  16. Generation of the 30 M-Mesh Global Digital Surface Model by Alos Prism

    NASA Astrophysics Data System (ADS)

    Tadono, T.; Nagai, H.; Ishida, H.; Oda, F.; Naito, S.; Minakawa, K.; Iwamoto, H.

    2016-06-01

    Topographical information is fundamental to many geospatial applications on Earth. Remote sensing satellites have an advantage in such fields because they are capable of global and repeated observation. Several satellite-based digital elevation datasets have been provided for examining global terrain at medium resolution, e.g. the Shuttle Radar Topography Mission (SRTM) and the global digital elevation model from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER GDEM). A new global digital surface model (DSM) dataset using the archived data of the Panchromatic Remote-sensing Instrument for Stereo Mapping (PRISM) onboard the Advanced Land Observing Satellite (ALOS, nicknamed "Daichi") was completed in March 2016 by the Japan Aerospace Exploration Agency (JAXA) in collaboration with NTT DATA Corp. and the Remote Sensing Technology Center of Japan. This project is called "ALOS World 3D" (AW3D), and its dataset consists of the global DSM with 0.15 arcsec pixel spacing (approx. 5 m mesh) and ortho-rectified PRISM images at 2.5 m resolution. JAXA is also processing the global DSM with 1 arcsec spacing (approx. 30 m mesh) based on the AW3D DSM dataset, and is partially releasing it free of charge as "ALOS World 3D 30 m mesh" (AW3D30). The global AW3D30 dataset will be released in May 2016. This paper describes the processing status, a preliminary validation result for the AW3D30 DSM dataset, and its public release status. In the preliminary validation of the AW3D30 DSM, a height accuracy of 4.40 m (RMSE) was confirmed using 5,121 independent check points distributed around the world.
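The 4.40 m figure is a root-mean-square error over independent check points, computed in the usual way (the check-point values below are hypothetical, not the 5,121 used in the validation):

```python
import numpy as np

def height_rmse(dsm_heights, reference_heights):
    """Root-mean-square height error of a DSM against check points (m)."""
    diff = np.asarray(dsm_heights, float) - np.asarray(reference_heights, float)
    return float(np.sqrt(np.mean(diff ** 2)))

# hypothetical check points
rmse = height_rmse([101.0, 99.0, 102.0, 98.0], [100.0, 100.0, 100.0, 100.0])
```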

  17. Geologic map of outcrop areas of sedimentary units in the eastern part of the Hailey 1 degree x 2 degrees quadrangle and part of the southern part of the Challis 1 degree x 2 degrees quadrangle, south-central Idaho

    USGS Publications Warehouse

    Link, P.K.; Mahoney, J.B.; Bruner, D.J.; Batatian, L.D.; Wilson, Eric; Williams, F.J.C.

    1995-01-01

    The paper version of the Geologic map of outcrop areas of sedimentary units in the eastern part of the Hailey 1x2 Quadrangle and part of the southern part of the Challis 1x2 Quadrangle, south-central Idaho was compiled by Paul Link and others in 1995. The plate was compiled on a 1:100,000-scale topographic base map. TechniGraphic System, Inc., of Fort Collins, Colorado, digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a GIS database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.

  18. Inter-Annual Variability of the Acoustic Propagation in the Mediterranean Sea Identified from a Synoptic Monthly Gridded Database as Compared with GDEM

    DTIC Science & Technology

    2016-12-01

    Temperature and salinity profiles obtained from the synoptic monthly gridded World Ocean Database (SMD-WOD) and the Generalized Digital Environmental Model (GDEM) are used to identify inter-annual variability of the acoustic propagation in the Mediterranean Sea.

  19. Global Digital Image Mosaics of Mars: Assessment of Geodetic Accuracy

    NASA Technical Reports Server (NTRS)

    Kirk, R.; Archinal, B. A.; Lee, E. M.; Davies, M. E.; Colvin, T. R.; Duxbury, T. C.

    2001-01-01

    A revised global image mosaic of Mars (MDIM 2.0) was recently completed by USGS. Comparison with high-resolution gridded Mars Orbiter Laser Altimeter (MOLA) digital image mosaics will allow us to quantify its geodetic errors; linking the next MDIM to the MOLA data will help eliminate those errors. Additional information is contained in the original extended abstract.

  20. Digital Mapping Techniques '08—Workshop Proceedings, Moscow, Idaho, May 18–21, 2008

    USGS Publications Warehouse

    Soller, David R.

    2009-01-01

    The Digital Mapping Techniques '08 (DMT'08) workshop was attended by more than 100 technical experts from 40 agencies, universities, and private companies, including representatives from 24 State geological surveys. This year's meeting, the twelfth in the annual series, was hosted by the Idaho Geological Survey, from May 18-21, 2008, on the University of Idaho campus in Moscow, Idaho. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  1. Council of International Neonatal Nurses (COINN) Global Neonatal Provider Database Initiative (CGNPD): Results From an Implementation Focus Group.

    PubMed

    Eklund, Wakako; Kenner, Carole

    2015-12-01

    Neonatal nurses are a key component of the workforce necessary to address the healthcare needs of infants globally. The paucity of data regarding the availability and training of the neonatal workforce challenges stakeholders at the regional, national, and global levels, and makes strategic planning for initiatives, especially in low-resourced countries, difficult. Up-to-date data are critically needed to describe the role neonatal nurses play in global newborn health outcomes. The purpose of the COINN Global Neonatal Provider Database Initiative (CGNPD) was to develop a workforce database by developing survey questions, conducting a focus group to determine the key reasons such a database was needed and how best to implement it, and incorporating these comments into the workforce survey and launch. Pilot testing of the draft survey instrument was done. This article reports on the findings from the focus group and the development of the survey. A qualitative design using the focus group method was used. The focus group discussions were guided by semi-structured interview questions developed beforehand by neonatal experts. A convenience sample of 14 members from the international delegates and project advisory members who attended COINN 2013 in Belfast, Northern Ireland, participated; these participants represented 10 countries. Thematic analysis was conducted using verbatim transcripts of the focus group data. Four main themes emerged: (1) the invisibility of neonatal nurses, (2) benchmarking needs for quality and standards, (3) the need for partnership to implement the database, and (4) setting priorities for the variables needed for the most salient database. The questionnaire examined participants' perceptions of the significance and future utilization of the workforce database and the elements that should be included in the survey.
The global neonatal workforce database is needed to describe who the neonatal nurses are in each country, what they do, how they are trained, and where they work. The data from the focus group aided in the development of the workforce survey that has been pilot tested and provides critical information to guide COINN's global implementation of the database project.

  2. Users as essential contributors to spatial cyberinfrastructures

    PubMed Central

    Poore, Barbara S.

    2011-01-01

    Current accounts of spatial cyberinfrastructure development tend to overemphasize technologies to the neglect of critical social and cultural issues on which adoption depends. Spatial cyberinfrastructures will have a higher chance of success if users of many types, including nonprofessionals, are made central to the development process. Recent studies in the history of infrastructures reveal key turning points and issues that should be considered in the development of spatial cyberinfrastructure projects. These studies highlight the importance of adopting qualitative research methods to learn how users work with data and digital tools, and how user communities form. The author's empirical research on data sharing networks in the Pacific Northwest salmon crisis at the turn of the 21st century demonstrates that ordinary citizens can contribute critical local knowledge to global databases and should be considered in the design and construction of spatial cyberinfrastructures. PMID:21444825

  3. Users as essential contributors to spatial cyberinfrastructures.

    PubMed

    Poore, Barbara S

    2011-04-05

    Current accounts of spatial cyberinfrastructure development tend to overemphasize technologies to the neglect of critical social and cultural issues on which adoption depends. Spatial cyberinfrastructures will have a higher chance of success if users of many types, including nonprofessionals, are made central to the development process. Recent studies in the history of infrastructures reveal key turning points and issues that should be considered in the development of spatial cyberinfrastructure projects. These studies highlight the importance of adopting qualitative research methods to learn how users work with data and digital tools, and how user communities form. The author's empirical research on data sharing networks in the Pacific Northwest salmon crisis at the turn of the 21st century demonstrates that ordinary citizens can contribute critical local knowledge to global databases and should be considered in the design and construction of spatial cyberinfrastructures.

  4. Users as essential contributors to spatial cyberinfrastructures

    USGS Publications Warehouse

    Poore, B.S.

    2011-01-01

    Current accounts of spatial cyberinfrastructure development tend to overemphasize technologies to the neglect of critical social and cultural issues on which adoption depends. Spatial cyberinfrastructures will have a higher chance of success if users of many types, including nonprofessionals, are made central to the development process. Recent studies in the history of infrastructures reveal key turning points and issues that should be considered in the development of spatial cyberinfrastructure projects. These studies highlight the importance of adopting qualitative research methods to learn how users work with data and digital tools, and how user communities form. The author's empirical research on data sharing networks in the Pacific Northwest salmon crisis at the turn of the 21st century demonstrates that ordinary citizens can contribute critical local knowledge to global databases and should be considered in the design and construction of spatial cyberinfrastructures.

  5. Evaluations of Threshold and Curvature Mixed Layer Depths by Various Mixing Schemes in the Mediterranean Sea

    DTIC Science & Technology

    2010-01-01

    Uses the 1/8° climatological monthly mean temperature and salinity fields from the Generalized Digital Environmental Model (GDEM) climatology (NAVOCEANO, 2003: Database description for the Generalized Digital Environmental Model (GDEM-V), Version 3.0, OAML-DBD-72).

  6. Do "Digital Certificates" Hold the Key to Colleges' On-Line Activities?

    ERIC Educational Resources Information Center

    Olsen, Florence

    1999-01-01

    Examines the increasing use of "digital certificates" to validate computer user identity in various applications on college and university campuses, including letting students register for courses, monitoring access to Internet2, and monitoring access to databases and electronic journals. The methodology has been developed by the…

  7. Saving the Information Commons.

    ERIC Educational Resources Information Center

    Bollier, David

    2003-01-01

    Discusses the control of digital content and the stakes for libraries and our democratic culture. Highlights include copyright term extension, the Digital Millennium Copyright Act, use of contract law to limit the public domain, database legislation, trademarks versus the public domain, the void in our cultural vocabulary, and the concept of the…

  8. Cracking the Egg: The South Carolina Digital Library's New Perspective

    ERIC Educational Resources Information Center

    Vinson, Christopher G.; Boyd, Kate Foster

    2008-01-01

    This article explores the historical foundations of the South Carolina Digital Library, a collaborative statewide program that ties together academic special collections and archives, public libraries, state government archives, and other cultural resource institutions in an effort to provide the state with a comprehensive database of online…

  9. The Digital Workforce: Update, August 2000 [and] The Digital Work Force: State Data & Rankings, September 2000.

    ERIC Educational Resources Information Center

    Sargent, John

    The Office of Technology Policy analyzed Bureau of Labor Statistics' growth projections for the core occupational classifications of IT (information technology) workers to assess future demand in the United States. Classifications studied were computer engineers, systems analysts, computer programmers, database administrators, computer support…

  10. Aboriginal Knowledge Traditions in Digital Environments

    ERIC Educational Resources Information Center

    Christie, Michael

    2005-01-01

    According to Manovich (2001), the database and the narrative are natural enemies, each competing for the same territory of human culture. Aboriginal knowledge traditions depend upon narrative through storytelling and other shared performances. The database objectifies and commodifies distillations of such performances and absorbs them into data…

  11. Cenozoic Antarctic DiatomWare/BugCam: An aid for research and teaching

    USGS Publications Warehouse

    Wise, S.W.; Olney, M.; Covington, J.M.; Egerton, V.M.; Jiang, S.; Ramdeen, D.K.; ,; Schrader, H.; Sims, P.A.; Wood, A.S.; Davis, A.; Davenport, D.R.; Doepler, N.; Falcon, W.; Lopez, C.; Pressley, T.; Swedberg, O.L.; Harwood, D.M.

    2007-01-01

    Cenozoic Antarctic DiatomWare/BugCam© is an interactive, icon-driven digital-image database/software package that displays over 500 illustrated Cenozoic Antarctic diatom taxa along with original descriptions (including over 100 generic and 20 family-group descriptions). This digital catalog is designed primarily for use by micropaleontologists working in the field (at sea or on the Antarctic continent) where hard-copy literature resources are limited. This new package will also be useful for classroom/lab teaching as well as for any paleontologists making or refining taxonomic identifications at the microscope. The database (Cenozoic Antarctic DiatomWare) is displayed via a custom software program (BugCam) written in Visual Basic for use on PCs running Windows 95 or later operating systems. BugCam is a flexible image display program that utilizes an intuitive thumbnail “tree” structure for navigation through the database. The data are stored in Microsoft Excel spreadsheets, hence no separate relational database program is necessary to run the package.

  12. Image query and indexing for digital x rays

    NASA Astrophysics Data System (ADS)

    Long, L. Rodney; Thoma, George R.

    1998-12-01

    The web-based medical information retrieval system (WebMIRS) allows Internet access to databases containing 17,000 digitized x-ray spine images and associated text data from National Health and Nutrition Examination Surveys (NHANES). WebMIRS allows SQL query of the text, and viewing of the returned text records and images using a standard browser. We are now working (1) to determine the utility of data directly derived from the images in our databases, and (2) to investigate the feasibility of computer-assisted or automated indexing of the images to support retrieval of images of interest to biomedical researchers in the field of osteoarthritis. To build an initial database based on image data, we are manually segmenting a subset of the vertebrae, using techniques from vertebral morphometry. From this, we will derive vertebral features and add them to the database. This image-derived data will enhance the user's data access capability by enabling the creation of combined SQL/image-content queries.
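    A combined SQL/image-content query of the kind described above can be sketched as follows; the schema, column names, and wedge-ratio feature are illustrative assumptions, not the actual WebMIRS design:

    ```python
    import sqlite3

    # Hypothetical schema: survey text data joined with image-derived
    # vertebral morphometry features (all names are illustrative only).
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE survey (subject_id INTEGER PRIMARY KEY, age INTEGER, sex TEXT);
    CREATE TABLE vertebrae (subject_id INTEGER, level TEXT,
                            anterior_height_mm REAL, posterior_height_mm REAL);
    """)
    conn.execute("INSERT INTO survey VALUES (1, 67, 'F')")
    conn.execute("INSERT INTO vertebrae VALUES (1, 'L1', 21.5, 27.0)")

    # A text criterion (age) combined with an image-derived criterion
    # (anterior/posterior height ratio suggesting vertebral wedging).
    rows = conn.execute("""
        SELECT s.subject_id, v.level,
               v.anterior_height_mm / v.posterior_height_mm AS wedge_ratio
        FROM survey s JOIN vertebrae v ON s.subject_id = v.subject_id
        WHERE s.age >= 60
          AND v.anterior_height_mm / v.posterior_height_mm < 0.8
    """).fetchall()
    print(rows)
    ```

    The point of the sketch is that once image-derived measurements live in ordinary tables, they compose with the existing text queries at no extra cost to the client.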

  13. Apollo Lunar Sample Photograph Digitization Project Update

    NASA Technical Reports Server (NTRS)

    Todd, N. S.; Lofgren, G. E.

    2012-01-01

    This is an update on the progress of a 4-year data restoration project, funded by the LASER program and undertaken by the Astromaterials Acquisition and Curation Office at JSC, to digitize photographs of the Apollo lunar rock samples and create high-resolution digital images [1]. The project is currently in its last year of funding. We also provide an update on the derived products that make use of the digitized photos, including the Lunar Sample Catalog and Photo Database [2] and Apollo sample data files for Google Moon [3].

  14. The GED4GEM project: development of a Global Exposure Database for the Global Earthquake Model initiative

    USGS Publications Warehouse

    Gamba, P.; Cavalca, D.; Jaiswal, K.S.; Huyck, C.; Crowley, H.

    2012-01-01

    In order to quantify the earthquake risk of any selected region or country of the world within the Global Earthquake Model (GEM) framework (www.globalquakemodel.org/), a systematic compilation of building inventory and population exposure is indispensable. Through a consortium of leading institutions and by engaging domain experts from multiple countries, the GED4GEM project has been working towards the development of a first comprehensive publicly available Global Exposure Database (GED). This geospatial exposure database will eventually facilitate global earthquake risk and loss estimation through GEM’s OpenQuake platform. This paper provides an overview of the GED concepts, aims, datasets, and inference methodology, as well as the current implementation scheme, status, and way forward.

  15. Physiographic rim of the Grand Canyon, Arizona: a digital database

    USGS Publications Warehouse

    Billingsley, George H.; Hampton, Haydee M.

    1999-01-01

    This Open-File report is a digital physiographic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. The report does include, however, PostScript and PDF format plot files, each containing an image of the map. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled "For Those Who Don't Use Digital Geologic Map Databases" below. This physiographic map of the Grand Canyon is modified from previous versions by Billingsley and Hendricks (1989) and Billingsley and others (1997). The boundary is drawn approximately along the topographic rim of the Grand Canyon and its tributary canyons between Lees Ferry and Lake Mead (shown in red). Several isolated small mesas, buttes, and plateaus lie within this area, which encompasses about 2,600 square miles overall. The Grand Canyon lies within the southwestern part of the Colorado Plateaus of northern Arizona between Lees Ferry, Colorado River Mile 0, and Lake Mead, Colorado River Mile 277. The Colorado River is the corridor for raft trips through the Grand Canyon. Limestone rocks of the Kaibab Formation form most of the north and south rims of the Grand Canyon, and a few volcanic rocks form the north rim of parts of the Uinkaret and Shivwits Plateaus. Limestones of the Redwall Limestone and lower Supai Group form the rim of the Hualapai Plateau area, and limestones of Devonian and Cambrian age form the boundary rim near the mouth of the Grand Canyon at Lake Mead. The natural physiographic boundary of the Grand Canyon is roughly the line from which a visitor would first view any part of the Grand Canyon and its tributaries.

  16. A novel method for efficient archiving and retrieval of biomedical images using MPEG-7

    NASA Astrophysics Data System (ADS)

    Meyer, Joerg; Pahwa, Ash

    2004-10-01

    Digital archiving and efficient retrieval of radiological scans have become critical steps in contemporary medical diagnostics. Since more and more images and image sequences (single scans or video) from various modalities (CT/MRI/PET/digital X-ray) are now available in digital formats (e.g., DICOM-3), hospitals and radiology clinics need to implement efficient protocols capable of managing the enormous amounts of data generated daily in a typical clinical routine. We present a method that appears to be a viable way to eliminate the tedious step of manually annotating image and video material for database indexing. MPEG-7 is a new framework that standardizes the way images are characterized in terms of color, shape, and other abstract, content-related criteria. A set of standardized descriptors that are automatically generated from an image is used to compare an image to other images in a database, and to compute the distance between two images for a given application domain. Text-based database queries can be replaced with image-based queries using MPEG-7. Consequently, image queries can be conducted without any prior knowledge of the keys that were used as indices in the database. Since the decoding and matching steps are not part of the MPEG-7 standard, this method also enables searches that were not planned by the time the keys were generated.
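    The descriptor-matching idea can be illustrated with a toy stand-in for an MPEG-7 color descriptor. The per-channel histogram and L1 metric below are illustrative choices only; as the abstract notes, MPEG-7 standardizes the descriptors but leaves the matching step to the application:

    ```python
    import numpy as np

    def color_descriptor(img, bins=8):
        """Toy stand-in for an MPEG-7 color descriptor: a normalized
        per-channel histogram of an RGB image (H x W x 3, values 0-255)."""
        hists = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0]
                 for c in range(3)]
        v = np.concatenate(hists).astype(float)
        return v / v.sum()

    def descriptor_distance(a, b):
        # L1 distance between descriptor vectors; any application-domain
        # metric could be plugged in here.
        return float(np.abs(a - b).sum())

    rng = np.random.default_rng(0)
    img1 = rng.integers(0, 256, (64, 64, 3))
    img2 = img1.copy()                        # identical content
    img3 = rng.integers(0, 256, (64, 64, 3))  # unrelated content

    d_same = descriptor_distance(color_descriptor(img1), color_descriptor(img2))
    d_diff = descriptor_distance(color_descriptor(img1), color_descriptor(img3))
    print(d_same, d_diff)
    ```

    An image-based query then reduces to ranking the database by this distance, with no manually assigned index keys involved.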

  17. Database for volcanic processes and geology of Augustine Volcano, Alaska

    USGS Publications Warehouse

    McIntire, Jacqueline; Ramsey, David W.; Thoms, Evan; Waitt, Richard B.; Beget, James E.

    2012-01-01

    This digital release contains information used to produce the geologic map published as Plate 1 in U.S. Geological Survey Professional Paper 1762 (Waitt and Begét, 2009). The main component of this digital release is a geologic map database prepared using geographic information systems (GIS) applications. This release also contains links to files to view or print the map plate, accompanying measured sections, and main report text from Professional Paper 1762. It should be noted that Augustine Volcano erupted in 2006, after the completion of the geologic mapping shown in Professional Paper 1762 and presented in this database. Information on the 2006 eruption can be found in U.S. Geological Survey Professional Paper 1769. For the most up-to-date information on the status of Alaska volcanoes, please refer to the U.S. Geological Survey Volcano Hazards Program website.

  18. A spatial national health facility database for public health sector planning in Kenya in 2008.

    PubMed

    Noor, Abdisalan M; Alegana, Victor A; Gething, Peter W; Snow, Robert W

    2009-03-06

    Efforts to tackle the enormous burden of ill-health in low-income countries are hampered by weak health information infrastructures that do not support appropriate planning and resource allocation. For health information systems to function well, a reliable inventory of health service providers is critical. The spatial referencing of service providers to allow their representation in a geographic information system is vital if the full planning potential of such data is to be realized. A disparate series of contemporary lists of health service providers were used to update a public health facility database of Kenya last compiled in 2003. These new lists were derived primarily through the national distribution of antimalarial and antiretroviral commodities since 2006. A combination of methods, including global positioning systems, was used to map service providers. These spatially-referenced data were combined with high-resolution population maps to analyze disparity in geographic access to public health care. The updated 2008 database contained 5,334 public health facilities (67% ministry of health; 28% mission and nongovernmental organizations; 2% local authorities; and 3% employers and other ministries). This represented an overall increase of 1,862 facilities compared to 2003. Most of the additional facilities belonged to the ministry of health (79%) and the majority were dispensaries (91%). 93% of the health facilities were spatially referenced, 38% using global positioning systems compared to 21% in 2003. 89% of the population was within 5 km Euclidean distance to a public health facility in 2008 compared to 71% in 2003. Over 80% of the population outside 5 km of public health service providers was in the sparsely settled pastoralist areas of the country. We have shown that, with concerted effort, a relatively complete inventory of mapped health services is possible with enormous potential for improving planning. 
Expansion in public health care in Kenya has resulted in significant increases in geographic access although several areas of the country need further improvements. This information is key to future planning and with this paper we have released the digital spatial database in the public domain to assist the Kenyan Government and its partners in the health sector.
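    The 5 km Euclidean access metric reported above can be sketched as follows, with made-up projected coordinates and populations standing in for the actual Kenyan facility and population data:

    ```python
    import numpy as np

    # Illustrative data only: facility and settlement coordinates in km
    # (projected), with a population count per settlement.
    facilities  = np.array([[0.0, 0.0], [10.0, 0.0]])
    settlements = np.array([[1.0, 1.0], [6.0, 0.0], [30.0, 30.0]])
    population  = np.array([5000, 2000, 1000])

    # Euclidean distance from every settlement to its nearest facility.
    d = np.linalg.norm(settlements[:, None, :] - facilities[None, :, :], axis=2)
    nearest = d.min(axis=1)

    # Share of population within 5 km of a facility.
    covered = population[nearest <= 5.0].sum() / population.sum()
    print(f"{covered:.0%} of population within 5 km")
    ```

    In the study this computation is done over high-resolution population surfaces rather than point settlements, but the distance-to-nearest-facility logic is the same.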

  19. A Conceptual Framework and Classification for the Fluvial-Backwater-Marine Transition in Coastal Rivers Globally

    NASA Astrophysics Data System (ADS)

    Howes, N. C.; Georgiou, I. Y.; Hughes, Z. J.; Wolinsky, M. A.

    2012-12-01

    Channels in fluvio-deltaic and coastal plain settings undergo a progressive series of downstream transitions in hydrodynamics and sediment transport, which is consequently reflected in their morphology and stratigraphic architecture. Conditions progress from uniform fluvial flow to backwater conditions with non-uniform flow, and finally to bi-directional tidal flow or estuarine circulation at the ocean boundary. While significant attention has been given to geomorphic scaling relationships in purely fluvial settings, there have been far fewer studies on the backwater and tidal reaches, and no systematic comparisons. Our study addresses these gaps by analyzing geometric scaling relationships independently in each of the above hydrodynamic regimes and establishes a comparison. To accomplish this goal we have constructed a database of planform geometries including more than 150 channels. In terms of hydrodynamic studies, much of the work on backwater dynamics has concentrated on the Mississippi River, which has very limited tidal influence. We will extend this analysis to include systems with appreciable offshore tidal range, using a numerical hydrodynamic model to study the interaction between backwater dynamics and tides. The database comprises systems with a wide range of tectonic, climatic, and oceanic forcings. The scale of these systems, as measured by bankfull width, ranges over three orders of magnitude from the Amazon River in Brazil to the Palix River in Washington. Channel centerlines are extracted from processed imagery, enabling continuous planform measurements of bankfull width, meander wavelength, and sinuosity. Digital terrain and surface models are used to estimate floodplain slopes. Downstream tidal boundary conditions are obtained from the TOPEX 7.1 global tidal model, while upstream boundary conditions such as basin area, relief, and discharge are obtained by linking the databases of Milliman and Meade (2011) and Syvitski (2005). 
Backwater and tidal length-scales are computed from published data as well as from numerical simulations. An analysis of the database combined with numerical hydrodynamic simulations allows us to organize the results into a process-based classification of coastal rivers. The classification describes the scale, shape, and flow field transitions of coastal rivers as a function of discharge, floodplain slope, and offshore tidal range.

  20. Terrestrial Sediments of the Earth: Development of a Global Unconsolidated Sediments Map Database (GUM)

    NASA Astrophysics Data System (ADS)

    Börker, J.; Hartmann, J.; Amann, T.; Romero-Mujalli, G.

    2018-04-01

    Mapped unconsolidated sediments cover half of the global land surface. They are of considerable importance for many Earth surface processes like weathering, hydrological fluxes or biogeochemical cycles. Ignoring their characteristics or spatial extent may lead to misinterpretations in Earth System studies. Therefore, a new Global Unconsolidated Sediments Map database (GUM) was compiled, using regional maps specifically representing unconsolidated and Quaternary sediments. The new GUM database provides insights into the regional distribution of unconsolidated sediments and their properties. The GUM comprises 911,551 polygons and describes not only sediment types and subtypes, but also parameters like grain size, mineralogy, age and thickness where available. Previous global lithological maps or databases lacked detail for reported unconsolidated sediment areas or missed large areas, and reported a global coverage of 25 to 30%, considering the ice-free land area. Here, alluvial sediments cover about 23% of the mapped total ice-free area, followed by aeolian sediments (~21%), glacial sediments (~20%), and colluvial sediments (~16%). A specific focus during the creation of the database was on the distribution of loess deposits, since loess is highly reactive and relevant to understand geochemical cycles related to dust deposition and weathering processes. An additional layer compiling pyroclastic sediments is added, which merges consolidated and unconsolidated pyroclastic sediments. The compilation shows latitudinal abundances of sediment types related to climate of the past. The GUM database is available at the PANGAEA database (https://doi.org/10.1594/PANGAEA.884822).

  1. NASA Tech Briefs, March 2014

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Topics include: Data Fusion for Global Estimation of Forest Characteristics From Sparse Lidar Data; Debris and Ice Mapping Analysis Tool - Database; Data Acquisition and Processing Software - DAPS; Metal-Assisted Fabrication of Biodegradable Porous Silicon Nanostructures; Post-Growth, In Situ Adhesion of Carbon Nanotubes to a Substrate for Robust CNT Cathodes; Integrated PEMFC Flow Field Design for Gravity-Independent Passive Water Removal; Thermal Mechanical Preparation of Glass Spheres; Mechanistic-Based Multiaxial-Stochastic-Strength Model for Transversely-Isotropic Brittle Materials; Methods for Mitigating Space Radiation Effects, Fault Detection and Correction, and Processing Sensor Data; Compact Ka-Band Antenna Feed with Double Circularly Polarized Capability; Dual-Leadframe Transient Liquid Phase Bonded Power Semiconductor Module Assembly and Bonding Process; Quad First Stage Processor: A Four-Channel Digitizer and Digital Beam-Forming Processor; Protective Sleeve for a Pyrotechnic Reefing Line Cutter; Metabolic Heat Regenerated Temperature Swing Adsorption; CubeSat Deployable Log Periodic Dipole Array; Re-entry Vehicle Shape for Enhanced Performance; NanoRacks-Scale MEMS Gas Chromatograph System; Variable Camber Aerodynamic Control Surfaces and Active Wing Shaping Control; Spacecraft Line-of-Sight Stabilization Using LWIR Earth Signature; Technique for Finding Retro-Reflectors in Flash LIDAR Imagery; Novel Hemispherical Dynamic Camera for EVAs; 360 deg Visual Detection and Object Tracking on an Autonomous Surface Vehicle; Simulation of Charge Carrier Mobility in Conducting Polymers; Observational Data Formatter Using CMOR for CMIP5; Propellant Loading Physics Model for Fault Detection Isolation and Recovery; Probabilistic Guidance for Swarms of Autonomous Agents; Reducing Drift in Stereo Visual Odometry; Future Air-Traffic Management Concepts Evaluation Tool; Examination and A Priori Analysis of a Direct Numerical Simulation Database for High-Pressure 
Turbulent Flows; and Resource-Constrained Application of Support Vector Machines to Imagery.

  2. Novel advancements in colposcopy: historical perspectives and a systematic review of future developments.

    PubMed

    Adelman, Marisa Rachel

    2014-07-01

    To describe novel innovations and techniques for the detection of high-grade dysplasia. Studies were identified through the PubMed database, spanning the last 10 years. The key words (["computerized colposcopy" or "digital colposcopy" or "spectroscopy" or "multispectral digital colposcopy" or "dynamic spectral imaging" or "electrical impedance spectroscopy" or "confocal endomicroscopy" or "confocal microscopy" or "optical coherence tomography"] and ["cervical dysplasia" or "cervical precancer" or "cervix" or "cervical"]) were used. The inclusion criteria were published articles of original research referring to noncolposcopic evaluation of the cervix for the detection of cervical dysplasia. Only English-language articles from the past 10 years were included, in which the technologies were used in vivo, and sensitivities and specificities could be calculated. The single author reviewed the articles for inclusion. Primary search of the database yielded 59 articles, and secondary cross-reference yielded 12 articles. Thirty-two articles met the inclusion criteria. An instrument that globally assesses the cervix, such as computer-assisted colposcopy, optical spectroscopy, and dynamic spectral imaging, would provide the most comprehensive estimate of disease and is therefore best suited when treatment is preferred. Electrical impedance spectroscopy, confocal microscopy, and optical coherence tomography provide information at the cellular level to estimate histology and are therefore best suited when deferment of treatment is preferred. If a device is to eventually replace the colposcope, it will likely combine technologies to best meet the needs of the target population, and as such, no single instrument may prove to be universally appropriate. Analyses of false-positive rates, additional colposcopies and biopsies, cost, and absolute life-savings will be important when considering these technologies and are limited thus far.

  3. Concordance of Commercial Data Sources for Neighborhood-Effects Studies

    PubMed Central

    Schootman, Mario

    2010-01-01

    Growing evidence supports a relationship between neighborhood-level characteristics and important health outcomes. One source of neighborhood data includes commercial databases integrated with geographic information systems to measure availability of certain types of businesses or destinations that may have either favorable or adverse effects on health outcomes; however, the quality of these data sources is generally unknown. This study assessed the concordance of two commercial databases for ascertaining the presence, locations, and characteristics of businesses. Businesses in the St. Louis, Missouri area were selected based on their four-digit Standard Industrial Classification (SIC) codes and classified into 14 business categories. Business listings in the two commercial databases were matched by standardized business name within specified distances. Concordance and coverage measures were calculated using capture–recapture methods for all businesses and by business type, with further stratification by census-tract-level population density, percent below poverty, and racial composition. For matched listings, distance between listings and agreement in four-digit SIC code, sales volume, and employee size were calculated. Overall, the percent agreement was 32% between the databases. Concordance and coverage estimates were lowest for health-care facilities and leisure/entertainment businesses; highest for popular walking destinations, eating places, and alcohol/tobacco establishments; and varied somewhat by population density. The mean distance (SD) between matched listings was 108.2 (179.0) m with varying levels of agreement in four-digit SIC (percent agreement = 84.6%), employee size (weighted kappa = 0.63), and sales volume (weighted kappa = 0.04). Researchers should cautiously interpret findings when using these commercial databases to yield measures of the neighborhood environment. PMID:20480397
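    Capture-recapture coverage estimation of the kind used above can be sketched with the Chapman estimator; the counts below are illustrative, not the study's data:

    ```python
    def capture_recapture_total(n1, n2, m):
        """Chapman estimator of the true number of businesses given two
        independent listings of sizes n1 and n2 with m matched between them."""
        return (n1 + 1) * (n2 + 1) / (m + 1) - 1

    # Illustrative counts: two commercial databases list 400 and 350
    # businesses in some category, with 150 matched by name and location.
    n1, n2, m = 400, 350, 150
    N_hat = capture_recapture_total(n1, n2, m)

    coverage_db1 = n1 / N_hat  # estimated share of all businesses in database 1
    print(round(N_hat), round(coverage_db1, 2))
    ```

    Low match counts inflate the estimated total and thus deflate each database's estimated coverage, which is how sparse concordance between listings translates into the low coverage figures reported for some business types.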

  4. Shaping Our World: Digital Storytelling and the Authoring of Society

    ERIC Educational Resources Information Center

    Brzoska, Karen Lynn

    2009-01-01

    Globalization, networked societies, and a knowledge-based economy engender increasing reliance on digital communication technologies for the dissemination of information and ideas (Castells, Fernandez-Ardevol, Qiu & Sey, 2006). While the technological revolution has broadened access to this digital domain, participants often adopt the passive…

  5. AN ASSESSMENT OF GROUND TRUTH VARIABILITY USING A "VIRTUAL FIELD REFERENCE DATABASE"

    EPA Science Inventory



    A "Virtual Field Reference Database (VFRDB)" was developed from field measurement data that included location and time, physical attributes, flora inventory, and digital imagery (camera) documentation for 1,011 sites in the Neuse River basin, North Carolina. The sampling f...

  6. Completion of the National Land Cover Database (NLCD) 1992-2001 Land Cover Change Retrofit Product

    EPA Science Inventory

    The Multi-Resolution Land Characteristics Consortium has supported the development of two national digital land cover products: the National Land Cover Dataset (NLCD) 1992 and National Land Cover Database (NLCD) 2001. Substantial differences in imagery, legends, and methods betwe...

  7. Worldwide Engagement for Digitizing Biocollections (WeDigBio): The Biocollections Community's Citizen-Science Space on the Calendar.

    PubMed

    Ellwood, Elizabeth R; Kimberly, Paul; Guralnick, Robert; Flemons, Paul; Love, Kevin; Ellis, Shari; Allen, Julie M; Best, Jason H; Carter, Richard; Chagnoux, Simon; Costello, Robert; Denslow, Michael W; Dunckel, Betty A; Ferriter, Meghan M; Gilbert, Edward E; Goforth, Christine; Groom, Quentin; Krimmel, Erica R; LaFrance, Raphael; Martinec, Joann Lacey; Miller, Andrew N; Minnaert-Grote, Jamie; Nash, Thomas; Oboyski, Peter; Paul, Deborah L; Pearson, Katelin D; Pentcheff, N Dean; Roberts, Mari A; Seltzer, Carrie E; Soltis, Pamela S; Stephens, Rhiannon; Sweeney, Patrick W; von Konrat, Matt; Wall, Adam; Wetzer, Regina; Zimmerman, Charles; Mast, Austin R

    2018-02-01

    The digitization of biocollections is a critical task with direct implications for the global community who use the data for research and education. Recent innovations to involve citizen scientists in digitization increase awareness of the value of biodiversity specimens; advance science, technology, engineering, and math literacy; and build sustainability for digitization. In support of these activities, we launched the first global citizen-science event focused on the digitization of biodiversity specimens: Worldwide Engagement for Digitizing Biocollections (WeDigBio). During the inaugural 2015 event, 21 sites hosted events where citizen scientists transcribed specimen labels via online platforms (DigiVol, Les Herbonautes, Notes from Nature, the Smithsonian Institution's Transcription Center, and Symbiota). Many citizen scientists also contributed off-site. In total, thousands of citizen scientists around the world completed over 50,000 transcription tasks. Here, we present the process of organizing an international citizen-science event, an analysis of the event's effectiveness, and future directions: content now foundational to the growing WeDigBio event.

  8. ClearedLeavesDB: an online database of cleared plant leaf images

    PubMed Central

    2014-01-01

    Background Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. Description The Cleared Leaf Image Database (ClearedLeavesDB) is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. Conclusions We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. 
The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org. PMID:24678985

  9. ClearedLeavesDB: an online database of cleared plant leaf images.

    PubMed

    Das, Abhiram; Bucksch, Alexander; Price, Charles A; Weitz, Joshua S

    2014-03-28

    Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. The Cleared Leaf Image Database (ClearedLeavesDB) is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. 
The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org.

  10. The CATDAT damaging earthquakes database

    NASA Astrophysics Data System (ADS)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies from, and expand greatly upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. In the authors' view, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be improved upon. Over 17,000 sources of information have been utilised, primarily in the last few years, to present data from over 12,200 damaging earthquakes historically, with over 7,000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. Comparison of the 1923 Great Kanto earthquake (214 billion USD damage; 2011 HNDECI-adjusted dollars) with the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan and 1995 Kobe earthquakes shows the increasing concern for economic loss in urban areas, and the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  11. Prototype of web-based database of surface wave investigation results for site classification

    NASA Astrophysics Data System (ADS)

    Hayashi, K.; Cakir, R.; Martin, A. J.; Craig, M. S.; Lorenzo, J. M.

    2016-12-01

    As active and passive surface wave methods become popular for evaluating the site response of earthquake ground motion, demand for a database of investigation results is also increasing. Seismic ground motion depends not only on the 1D velocity structure but also on 2D and 3D structures, so spatial information on S-wave velocity must be considered in ground motion prediction; a database can support the construction of 2D and 3D underground models. Inversion in surface wave processing is essentially non-unique, so other information must be combined into the processing. A database of existing geophysical, geological and geotechnical investigation results can provide indispensable information to improve the accuracy and reliability of investigations. Most investigations, however, are carried out by individual organizations, and investigation results are rarely stored in a unified and organized database. To study and discuss an appropriate database and digital standard format for surface wave investigations, we developed a prototype of a web-based database to store observed data and processing results of the surface wave investigations that we have performed at more than 400 sites in the U.S. and Japan. The database was constructed on a web server using MySQL and PHP so that users can access it through the internet from anywhere with any device. All data are registered in the database with location information, and users can search geophysical data through Google Maps. The database stores dispersion curves, horizontal-to-vertical spectral ratios and S-wave velocity profiles at each site, saved as digital data in XML files, so that users can review and reuse them. The database also stores a published 3D deep basin and crustal structure, which users can refer to during the processing of surface wave data.
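    The record structure sketched above (site location plus curves stored as XML) can be illustrated with a minimal parser and bounding-box search; the XML schema and field names here are assumptions, not the prototype's published format.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML site record; the prototype's actual schema is not given in the abstract.
SITE_XML = """
<site id="JP-001" lat="35.68" lon="139.77">
  <dispersion>
    <point freq_hz="2.0" phase_vel_ms="420"/>
    <point freq_hz="5.0" phase_vel_ms="310"/>
    <point freq_hz="10.0" phase_vel_ms="180"/>
  </dispersion>
</site>
"""

def parse_site(xml_text):
    """Parse one site record into a dict with location and dispersion curve."""
    root = ET.fromstring(xml_text)
    curve = [(float(p.get("freq_hz")), float(p.get("phase_vel_ms")))
             for p in root.iter("point")]
    return {"id": root.get("id"),
            "lat": float(root.get("lat")),
            "lon": float(root.get("lon")),
            "dispersion": curve}

def in_bbox(site, lat_min, lat_max, lon_min, lon_max):
    """Location-based search, as a map interface would issue it."""
    return lat_min <= site["lat"] <= lat_max and lon_min <= site["lon"] <= lon_max
```

    Storing curves as self-describing XML, as the abstract notes, is what makes this kind of reuse possible by third parties.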

  12. A web-system of virtual morphometric globes

    NASA Astrophysics Data System (ADS)

    Florinsky, Igor; Garov, Andrei; Karachevtseva, Irina

    2017-04-01

    Virtual globes, programs implementing interactive three-dimensional (3D) models of planets, are increasingly used in geo- and planetary sciences. We are developing a web-system of virtual morphometric globes. As the initial data, we used the following global digital elevation models (DEMs): (1) a DEM of the Earth extracted from the SRTM30_PLUS database; (2) a DEM of Mars extracted from the Mars Orbiter Laser Altimeter (MOLA) gridded data record archive; and (3) a DEM of the Moon extracted from the Lunar Orbiter Laser Altimeter (LOLA) gridded data record archive. From these DEMs, we derived global digital models of the following 16 local, nonlocal, and combined morphometric variables: horizontal curvature, vertical curvature, mean curvature, Gaussian curvature, minimal curvature, maximal curvature, unsphericity curvature, difference curvature, vertical excess curvature, horizontal excess curvature, ring curvature, accumulation curvature, catchment area, dispersive area, topographic index, and stream power index (definitions, formulae, and interpretations can be found elsewhere [1]). To calculate the local morphometric variables, we applied a finite-difference method intended for spheroidal equal angular grids [1]. Digital models of the nonlocal and combined morphometric variables were derived by a method of Martz and de Jong adapted to spheroidal equal angular grids [1]. DEM processing was performed in the software LandLord [1]. The calculated morphometric models were integrated into the testing version of the system. The following main functions are implemented in the system: (1) selection of a celestial body; (2) selection of a morphometric variable; (3) 2D visualization of a calculated global morphometric model (a map in equirectangular projection); (4) 3D visualization of a calculated global morphometric model on the sphere surface (the globe itself); (5) change of the globe scale (zooming); and (6) globe rotation by an arbitrary angle.
    The testing version of the system represents morphometric models at a resolution of 15'. In the final version of the system, we plan to implement multiscale 3D visualization for models of 17 morphometric variables at resolutions from 15' to 30". The web-system of virtual morphometric globes is designed as a separate unit of a 3D web GIS for storage, processing, and access to planetary data [2], which is currently being developed as an extension of an existing 2D web GIS (http://cartsrv.mexlab.ru/geoportal). Free, real-time web access to the system of virtual globes will be provided. The testing version of the system is available at: http://cartsrv.mexlab.ru/virtualglobe. The study is supported by the Russian Foundation for Basic Research, grant 15-07-02484. References 1. Florinsky, I.V., 2016. Digital Terrain Analysis in Soil Science and Geology. 2nd ed. Academic Press, Amsterdam, 486 p. 2. Garov, A.S., Karachevtseva, I.P., Matveev, E.V., Zubarev, A.E., and Florinsky, I.V., 2016. Development of a heterogenic distributed environment for spatial data processing using cloud technologies. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 41(B4): 385-390.
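    The finite-difference idea behind the local morphometric variables can be sketched for mean curvature. This is a simplified planar-grid version (the cited method works on spheroidal equal angular grids), and the sign convention for curvature varies between authors.

```python
def mean_curvature(z, i, j, h):
    """Mean curvature of a gridded surface z at interior node (i, j) using
    second-order central differences on a planar square grid of spacing h.
    Planar simplification for illustration only; the cited method handles
    spheroidal equal angular grids."""
    p = (z[i][j + 1] - z[i][j - 1]) / (2 * h)             # dz/dx
    q = (z[i + 1][j] - z[i - 1][j]) / (2 * h)             # dz/dy
    r = (z[i][j + 1] - 2 * z[i][j] + z[i][j - 1]) / h**2  # d2z/dx2
    t = (z[i + 1][j] - 2 * z[i][j] + z[i - 1][j]) / h**2  # d2z/dy2
    s = (z[i + 1][j + 1] - z[i + 1][j - 1]
         - z[i - 1][j + 1] + z[i - 1][j - 1]) / (4 * h**2)  # d2z/dxdy
    return (((1 + q * q) * r - 2 * p * q * s + (1 + p * p) * t)
            / (2 * (1 + p * p + q * q) ** 1.5))

# Synthetic DEM: paraboloid z = (x^2 + y^2) / 2, whose mean curvature at the
# apex is 1 (up to sign convention).
h = 0.01
grid = [[((ix * h) ** 2 + (iy * h) ** 2) / 2 for ix in range(-1, 2)]
        for iy in range(-1, 2)]
```

    Evaluating `mean_curvature(grid, 1, 1, h)` on this synthetic surface recovers the analytic value, which is the standard sanity check for such stencils.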

  13. 78 FR 57174 - Digital Trade in the U.S. and Global Economies, Part 2; Submission of Questionnaire for OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-17

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 332-540] Digital Trade in the U.S. and Global Economies, Part 2; Submission of Questionnaire for OMB Review AGENCY: United States International Trade Commission. ACTION: Notice of submission of request for approval of a questionnaire to the Office of Management and Budget. This notice is being...

  14. Potash: a global overview of evaporite-related potash resources, including spatial databases of deposits, occurrences, and permissive tracts: Chapter S in Global mineral resource assessment

    USGS Publications Warehouse

    Orris, Greta J.; Cocker, Mark D.; Dunlap, Pamela; Wynn, Jeff C.; Spanski, Gregory T.; Briggs, Deborah A.; Gass, Leila; Bliss, James D.; Bolm, Karen S.; Yang, Chao; Lipin, Bruce R.; Ludington, Stephen; Miller, Robert J.; Słowakiewicz, Mirosław

    2014-01-01

    This report describes a global, evaporite-related potash deposits and occurrences database and a potash tracts database. Chapter 1 summarizes potash resource history and use. Chapter 2 describes a global potash deposits and occurrences database, which contains more than 900 site records. Chapter 3 describes a potash tracts database, which contains 84 tracts with geology permissive for the presence of evaporite-hosted potash resources, including areas with active evaporite-related potash production, areas with known mineralization that has not been quantified or exploited, and areas with potential for undiscovered potash resources. Chapter 4 describes geographic information system (GIS) data files that include (1) potash deposits and occurrences data, (2) potash tract data, (3) reference databases for potash deposit and tract data, and (4) representative graphics of geologic features related to potash tracts and deposits. Summary descriptive models for stratabound potash-bearing salt and halokinetic potash-bearing salt are included in appendixes A and B, respectively. A glossary of salt- and potash-related terms is contained in appendix C and a list of database abbreviations is given in appendix D. Appendix E describes GIS data files, and appendix F is a guide to using the geodatabase.

  15. Mapping soil texture classes and optimization of the result by accuracy assessment

    NASA Astrophysics Data System (ADS)

    Laborczi, Annamária; Takács, Katalin; Bakacsi, Zsófia; Szabó, József; Pásztor, László

    2014-05-01

    There are increasing demands nowadays for spatial soil information to support environment-related and land use management decisions. The GlobalSoilMap.net (GSM) project aims to make a new digital soil map of the world using state-of-the-art and emerging technologies for soil mapping and predicting soil properties at fine resolution. Sand, silt and clay are among the mandatory GSM soil properties. Furthermore, soil texture class information is input data for significant agro-meteorological and hydrological models. Our present work aims to compare and evaluate different digital soil mapping methods and variables for producing the most accurate spatial prediction of texture classes in Hungary. In addition to the Hungarian Soil Information and Monitoring System as our basic data, a digital elevation model and its derived components, a geological database, and physical property maps of the Digital Kreybig Soil Information System have been applied as auxiliary elements. Two approaches have been applied in the mapping process. First, the sand, silt and clay rasters were computed independently using regression kriging (RK); from these rasters, according to the USDA categories, we compiled the texture class map. Different combinations of reference and training soil data and auxiliary covariables have resulted in several different maps. However, these results necessarily include the uncertainty of the three kriged rasters. We have therefore applied data mining methods as the other approach to digital soil mapping: by constructing classification trees and random forests, we obtained the texture class maps directly. In this way the various results can be compared to the RK maps. The performance of the different methods and data has been examined by testing the accuracy of the geostatistically computed and the directly classified results.
    We have used the GSM methodology to assess which approach yields the most predictive and accurate of the several resulting maps. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
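    The step from kriged sand/silt/clay rasters to a texture class map uses the USDA texture triangle. A minimal sketch of that classification, implementing only a simplified subset of the 12 USDA classes (the full triangle has more rules):

```python
def usda_texture(sand, silt, clay):
    """Classify a (sand, silt, clay) composition, in percentages summing to
    100, into a few USDA texture classes. Simplified: only four classes plus
    a catch-all are implemented here."""
    assert abs(sand + silt + clay - 100) < 1e-6
    if clay >= 40 and sand <= 45 and silt < 40:
        return "clay"
    if sand >= 85 and silt + 1.5 * clay < 15:
        return "sand"
    if silt >= 80 and clay < 12:
        return "silt"
    if 7 <= clay < 27 and 28 <= silt < 50 and sand < 52:
        return "loam"
    return "other"
```

    Applying such a rule set cell by cell to the three kriged rasters yields the texture class map; the data mining approach instead predicts the class label directly.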

  16. The Eruption Forecasting Information System (EFIS) database project

    NASA Astrophysics Data System (ADS)

    Ogburn, Sarah; Harpel, Chris; Pesicek, Jeremy; Wellik, Jay; Pallister, John; Wright, Heather

    2016-04-01

    The Eruption Forecasting Information System (EFIS) project is a new initiative of the U.S. Geological Survey-USAID Volcano Disaster Assistance Program (VDAP) with the goal of enhancing VDAP's ability to forecast the outcome of volcanic unrest. The EFIS project seeks to: (1) move away from relying on collective memory toward probability estimation using databases; (2) create databases useful for pattern recognition and for answering common VDAP questions, e.g. how commonly does unrest lead to eruption? How commonly do phreatic eruptions portend magmatic eruptions, and what is the range of antecedence times? (3) create generic probabilistic event trees using global data for different volcano 'types'; (4) create background, volcano-specific, probabilistic event trees for frequently active or particularly hazardous volcanoes in advance of a crisis; and (5) quantify and communicate uncertainty in probabilities. A major component of the project is the global EFIS relational database, which contains multiple modules designed to aid in the construction of probabilistic event trees and to answer common questions that arise during volcanic crises. The primary module contains chronologies of volcanic unrest, including the timing of phreatic eruptions, column heights, eruptive products, etc., and will be initially populated using chronicles of eruptive activity from Alaskan volcanic eruptions in the GeoDIVA database (Cameron et al. 2013). This database module allows us to query across other global databases such as the WOVOdat database of monitoring data and the Smithsonian Institution's Global Volcanism Program (GVP) database of eruptive histories and volcano information. The EFIS database is in the early stages of development and population; thus, this contribution also serves as a request for feedback from the community.
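    The link between such a database and a probabilistic event tree is simple in outline: counts of past outcomes give empirical branch probabilities, and the probability of a leaf is the product along its path. A minimal sketch with invented counts (a real analysis would also quantify uncertainty, per goal 5):

```python
def branch_probability(successes, trials):
    """Empirical conditional branch probability from database counts."""
    return successes / trials

def event_tree_probability(branches):
    """Probability of reaching a leaf = product of conditional branch
    probabilities along the path from the root."""
    p = 1.0
    for successes, trials in branches:
        p *= branch_probability(successes, trials)
    return p

# Hypothetical counts: 60 of 100 unrest episodes led to eruption,
# and 30 of those 60 eruptions were explosive.
p_explosive = event_tree_probability([(60, 100), (30, 60)])
```

    This is exactly the kind of query ("how commonly does unrest lead to eruption?") the chronology module is designed to answer from data rather than collective memory.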

  17. Using Digital Globes to Explore the Deep Sea and Advance Public Literacy in Earth System Science

    ERIC Educational Resources Information Center

    Beaulieu, Stace E.; Emery, Emery; Brickley, Annette; Spargo, Abbey; Patterson, Kathleen; Joyce, Katherine; Silva, Tim; Madin, Katherine

    2015-01-01

    Digital globes are new technologies increasingly used in informal and formal education to display global datasets and show connections among Earth systems. But how effective are digital globes in advancing public literacy in Earth system science? We addressed this question by developing new content for digital globes with the intent to educate and…

  18. A pier-scour database: 2,427 field and laboratory measurements of pier scour

    USGS Publications Warehouse

    Benedict, Stephen T.; Caldwell, Andral W.

    2014-01-01

    The U.S. Geological Survey conducted a literature review to identify potential sources of published pier-scour data, and selected data were compiled into a digital spreadsheet called the 2014 USGS Pier-Scour Database (PSDb-2014) consisting of 569 laboratory and 1,858 field measurements. These data encompass a wide range of laboratory and field conditions and represent field data from 23 States within the United States and from 6 other countries. The digital spreadsheet is available on the Internet and offers a valuable resource to engineers and researchers seeking to understand pier-scour relations in the laboratory and field.

  19. The ERESE Project: Interfacing with the ERDA Digital Archive and ERR Reference Database in EarthRef.org

    NASA Astrophysics Data System (ADS)

    Koppers, A. A.; Staudigel, H.; Mills, H.; Keller, M.; Wallace, A.; Bachman, N.; Helly, J.; Helly, M.; Miller, S. P.; Massell Symons, C.

    2004-12-01

    To bridge the gap between Earth science teachers, librarians, scientists and data archive managers, we have started the ERESE project, which will create, archive and make available "Enduring Resources in Earth Science Education" through information technology (IT) portals. In the first phase of this National Science Digital Library (NSDL) project, we are focusing on the development of these ERESE resources for middle and high school teachers, to be used in lesson plans with "plate tectonics" and "magnetics" as their main themes. In this presentation, we will show how these new ERESE resources are being generated, how they can be uploaded via online web wizards, how they are archived, how we make them available via the EarthRef.org Digital Archive (ERDA) and Reference Database (ERR), and how they relate to the SIOExplorer database containing data objects for all seagoing cruises carried out by the Scripps Institution of Oceanography. The EarthRef.org web resource uses the vision of a "general description" of the Earth as a geological system to provide an IT infrastructure for the Earth sciences. This emphasizes the marriage of the "scientific process" (and its results) with an educational cyber-infrastructure for teaching Earth sciences, on any level, from middle school to college and graduate levels. Eight different databases reside under EarthRef.org; of these, ERDA holds any digital object that has been uploaded free of charge by scientists, teachers and students, while the ERR holds more than 80,000 publications. For more than 1,500 of these publications, the latter database makes JPG/PDF images of the abstracts, data tables, methods and appendices available for download, together with their digitized contents in Microsoft Word and Excel format.
    Both holdings are being used to store the ERESE objects generated by a group of undergraduate students in the Environmental Systems (ESYS) program at UCSD with an emphasis on the Earth Sciences. These students perform library and internet research to design and generate these "Enduring Resources in Earth Science Education", which they test by closely interacting with the research faculty at the Scripps Institution of Oceanography. Typical ERESE resources include diagrams, model cartoons, maps, data sets for analyses, and glossary items and essays explaining Earth Science concepts, and are ready to be used in the classroom.

  20. A review of existing and emerging digital technologies to combat the global trade in fake medicines.

    PubMed

    Mackey, Tim K; Nayyar, Gaurvika

    2017-05-01

    The globalization of the pharmaceutical supply chain has introduced new challenges, chief among them, fighting the international criminal trade in fake medicines. As the manufacture, supply, and distribution of drugs becomes more complex, so does the need for innovative technology-based solutions to protect patients globally. Areas covered: We conducted a multidisciplinary review of the science/health, information technology, computer science, and general academic literature with the aim of identifying cutting-edge existing and emerging 'digital' solutions to combat fake medicines. Our review identified five distinct categories of technology including mobile, radio frequency identification, advanced computational methods, online verification, and blockchain technology. Expert opinion: Digital fake medicine solutions are unifying platforms that integrate different types of anti-counterfeiting technologies as complementary solutions, improve information sharing and data collection, and are designed to overcome existing barriers of adoption and implementation. Investment in this next generation technology is essential to ensure the future security and integrity of the global drug supply chain.

  1. Transcribing and digitizing eighteenth- and nineteenth-century letters for a historical digital repository.

    PubMed

    Dunster, Emily S; Kipnis, Daniel G; Angelo, F Michael

    2014-01-01

    In fall 2011, the Scott Memorial Library purchased 53 letters belonging to an 1841 graduate of Jefferson Medical College, John Plimpton Green. The library staff transcribed and digitized the letters, creating an online collection in the university's institutional repository, Jefferson Digital Commons. This article will detail the process of transcribing and digitizing the collection along with sharing statistics and the benefits of this project to global researchers.

  2. Transnational Feminist Rhetorics in a Digital World

    ERIC Educational Resources Information Center

    Queen, Mary

    2008-01-01

    In this essay, the author examines the digital circulations of representations of one Afghan women's rights organization--the Revolutionary Association of Women of Afghanistan (RAWA)--to demonstrate the importance of a global and digital field for feminist rhetorical analysis. Specifically, this analysis traces how women's self-representations are…

  3. Navigate the Digital Rapids

    ERIC Educational Resources Information Center

    Lindsay, Julie; Davis, Vicki

    2010-01-01

    How can teachers teach digital citizenship when the digital landscape is changing so rapidly? How can teachers teach proper online social interactions when the students are outside their classroom and thus outside their control? Will encouraging students to engage in global collaborative environments land teachers in hot water? These are the…

  4. Essays on the Digital Divide

    ERIC Educational Resources Information Center

    Abdelfattah, Belal M. T.

    2013-01-01

    The digital divide is a phenomenon that is globally persistent, despite rapidly decreasing costs in technology. While much of the variance in the adoption and use of information communication technology (ICT) that defines the digital divide can be explained by socioeconomic and demographic variables, there is still significant unaccounted variance…

  5. The PREDICTS database: a global database of how local terrestrial biodiversity responds to human impacts

    Treesearch

    L.N. Hudson; T. Newbold; S. Contu

    2014-01-01

    Biodiversity continues to decline in the face of increasing anthropogenic pressures such as habitat destruction, exploitation, pollution and introduction of alien species. Existing global databases of species’ threat status or population time series are dominated by charismatic species. The collation of datasets with broad taxonomic and biogeographic extents, and that...

  6. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR GLOBAL CODING USED BY NHEXAS ARIZONA (HAND ENTRY) (UA-D-5.0)

    EPA Science Inventory

    The purpose of this SOP is to define the global coding scheme to be used in the working and master databases. This procedure applies to all of the databases used during the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; databases.

    The National Human Exposu...

  7. Issues Facing Academic Library Consortia and Perceptions of Members of the Illinois Digital Academic Library.

    ERIC Educational Resources Information Center

    Brooks, Sam; Dorst, Thomas J.

    2002-01-01

    Discusses the role of consortia in academic libraries, specifically the Illinois Digital Academic Library (IDAL), and describes a study conducted by the IDAL that investigated issues surrounding full text database research including stability of content, vendor communication, embargo periods, publisher concerns, quality of content, linking and…

  8. Fact or Fiction? Libraries Can Thrive in the Digital Age

    ERIC Educational Resources Information Center

    Harris, Christopher

    2014-01-01

    Today's school library uses an increasing number of digital resources to supplement a print collection that is moving more toward fiction and literary non-fiction. Supplemental resources, including streaming video, online resources, subscription databases, audiobooks, e-books, and even games, round out the new collections. Despite the best…

  9. Development of a globally applicable model for near real-time prediction of seismically induced landslides

    USGS Publications Warehouse

    Nowicki, M. Anna; Wald, David J.; Hamburger, Michael W.; Hearne, Mike; Thompson, Eric M.

    2014-01-01

    Substantial effort has been invested to understand where seismically induced landslides may occur in the future, as they are a costly and frequently fatal threat in mountainous regions. The goal of this work is to develop a statistical model for estimating the spatial distribution of landslides in near real-time around the globe for use in conjunction with the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system. This model uses standardized outputs of ground shaking from the USGS ShakeMap Atlas 2.0 to develop an empirical landslide probability model, combining shaking estimates with broadly available landslide susceptibility proxies, i.e., topographic slope, surface geology, and climate parameters. We focus on four earthquakes for which digitally mapped landslide inventories and well-constrained ShakeMaps are available. The resulting database is used to build a predictive model of the probability of landslide occurrence. The landslide database includes the Guatemala (1976), Northridge (1994), Chi-Chi (1999), and Wenchuan (2008) earthquakes. Performance of the regression model is assessed using statistical goodness-of-fit metrics and a qualitative review to determine which combination of the proxies provides both the optimum prediction of landslide-affected areas and minimizes the false alarms in non-landslide zones. Combined with near real-time ShakeMaps, these models can be used to make generalized predictions of whether or not landslides are likely to occur (and if so, where) for earthquakes around the globe, and eventually to inform loss estimates within the framework of the PAGER system.
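    An empirical landslide probability model of this kind is typically a logistic regression on shaking and susceptibility covariates. A minimal sketch with invented coefficients (the published model's covariates and fitted values are not reproduced here):

```python
import math

def landslide_probability(pga_g, slope_deg, b0=-6.0, b_pga=4.0, b_slope=0.1):
    """Logistic model of landslide probability from peak ground acceleration
    (in g) and topographic slope (degrees). Coefficients are illustrative
    placeholders, not the fitted values of the published model."""
    x = b0 + b_pga * pga_g + b_slope * slope_deg
    return 1.0 / (1.0 + math.exp(-x))
```

    Evaluated over a ShakeMap grid, such a function yields the near-real-time probability maps described above; model selection then compares goodness of fit across candidate covariate combinations.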

  10. Water level ingest, archive and processing system - an integral part of NOAA's tsunami database

    NASA Astrophysics Data System (ADS)

    McLean, S. J.; Mungov, G.; Dunbar, P. K.; Price, D. J.; Mccullough, H.

    2013-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Geophysical Data Center (NGDC) and the collocated World Data Service for Geophysics (WDS) provide long-term archiving, data management, and access to national and global tsunami data. Archive responsibilities include the NOAA Global Historical Tsunami event and runup database and damage photos, as well as other related hazards data. Beginning in 2008, NGDC was given the responsibility of archiving, processing and distributing all tsunami and hazards-related water level data collected from NOAA observational networks in a coordinated and consistent manner. These data include the Deep-ocean Assessment and Reporting of Tsunami (DART) data provided by the National Data Buoy Center (NDBC), coastal tide-gauge data from the National Ocean Service (NOS) network, and tide-gauge data from the regional networks of the two National Weather Service (NWS) Tsunami Warning Centers (TWCs). Taken together, this integrated archive supports the tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Because of the variety of the water level data, the automatic ingest system was redesigned, and the inventory, archive and delivery capabilities were upgraded in line with modern digital data archiving practices. The data processing system was also upgraded and redesigned, focusing on operational data quality assessment. This poster focuses on data availability, highlighting the automation of all steps of data ingest, archive, processing and distribution. Examples are given from recent events such as Hurricane Sandy in October 2012, the February 6, 2013 Solomon Islands tsunami, and the June 13, 2013 meteotsunami along the U.S. East Coast.

  11. The Greater Caucasus Glacier Inventory (Russia, Georgia and Azerbaijan)

    NASA Astrophysics Data System (ADS)

    Tielidze, Levan G.; Wheate, Roger D.

    2018-01-01

    There have been numerous studies of glaciers in the Greater Caucasus, but none that have generated a modern glacier database across the whole mountain range. Here, we present an updated and expanded glacier inventory at three time periods (1960, 1986, 2014) covering the entire Greater Caucasus. Large-scale topographic maps and satellite imagery (Corona, Landsat 5, Landsat 8 and ASTER) were used to conduct a remote-sensing survey of glacier change, and the 30 m resolution Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM; 17 November 2011) was used to determine the aspect, slope and height distribution of glaciers. Glacier margins were mapped manually and reveal that in 1960 the mountains contained 2349 glaciers with a total glacier surface area of 1674.9 ± 70.4 km2. By 1986, glacier surface area had decreased to 1482.1 ± 64.4 km2 (2209 glaciers), and by 2014 to 1193.2 ± 54.0 km2 (2020 glaciers). This represents a 28.8 ± 4.4 % (481 ± 21.2 km2) or 0.53 % yr-1 reduction in total glacier surface area between 1960 and 2014 and an increase in the rate of area loss since 1986 (0.69 % yr-1) compared to 1960-1986 (0.44 % yr-1). Glacier mean size decreased from 0.70 km2 in 1960 to 0.66 km2 in 1986 and to 0.57 km2 in 2014. This new glacier inventory has been submitted to the Global Land Ice Measurements from Space (GLIMS) database and can be used as a baseline data set for future studies.
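    The reported annual loss rates follow directly from the inventory areas, assuming a linear rate expressed as a percentage of the starting area per year:

```python
def annual_loss_rate(area_start, area_end, years):
    """Linear annual rate of area loss, in % of the starting area per year."""
    return (area_start - area_end) / area_start / years * 100.0

# Areas in km^2 from the inventory; periods 1960-2014 (54 yr) and 1960-1986 (26 yr).
rate_1960_2014 = annual_loss_rate(1674.9, 1193.2, 54)
rate_1960_1986 = annual_loss_rate(1674.9, 1482.1, 26)
```

    These reproduce the stated 0.53 % yr-1 and 0.44 % yr-1 figures to two decimal places.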

  12. The Identity Mapping Project: Demographic differences in patterns of distributed identity.

    PubMed

    Gilbert, Richard L; Dionisio, John David N; Forney, Andrew; Dorin, Philip

    2015-01-01

    The advent of cloud computing and a multi-platform digital environment is giving rise to a new phase of human identity called "The Distributed Self." In this conception, aspects of the self are distributed into a variety of 2D and 3D digital personas with the capacity to reflect any number of combinations of now malleable personality traits. In this way, the source of human identity remains internal and embodied, but the expression or enactment of the self becomes increasingly external, disembodied, and distributed on demand. The Identity Mapping Project (IMP) is an interdisciplinary collaboration between psychology and computer science designed to empirically investigate the development of distributed forms of identity. Methodologically, it collects a large database of "identity maps": computerized graphical representations of how active someone is online and how their identity is expressed and distributed across 7 core digital domains: email, blogs/personal websites, social networks, online forums, online dating sites, character based digital games, and virtual worlds. The current paper reports on gender and age differences in online identity based on an initial database of distributed identity profiles.

  13. [Intra-oral digital photography with the non-professional camera--simplicity and effectiveness at a low price].

    PubMed

    Sackstein, M

    2006-10-01

    Over the last five years digital photography has become ubiquitous. For the family photo album, a 4 or 5 megapixel camera costing about 2000 NIS will produce satisfactory results for most people. However, for intra-oral photography the common wisdom holds that only professional photographic equipment is up to the task. Such equipment typically costs around 12,000 NIS and includes the camera body, an attachable macro lens and a ringflash. The following article challenges this conception. Although professional equipment does produce the most exemplary results, a highly effective database of clinical pictures can be compiled even with a "non-professional" digital camera. Since the year 2002, my clinical work has been routinely documented with digital cameras of the Nikon CoolPix series. The advantages are that these digicams are economical both in price and in size and allow easy transport and operation when compared to their expensive and bulky professional counterparts. The details of how to use a non-professional digicam to produce and maintain an effective clinical picture database, for documentation, monitoring, demonstration and professional fulfillment, are described below.

  14. APPLICATION OF A "VIRTUAL FIELD REFERENCE DATABASE" TO ASSESS LAND-COVER MAP ACCURACIES

    EPA Science Inventory

    An accuracy assessment was performed for the Neuse River Basin, NC land-cover/use (LCLU) mapping results using a "Virtual Field Reference Database (VFRDB)". The VFRDB was developed using field measurement and digital imagery (camera) data collected at 1,409 sites over a perio...

  15. Access control based on attribute certificates for medical intranet applications.

    PubMed

    Mavridis, I; Georgiadis, C; Pangalos, G; Khair, M

    2001-01-01

    Clinical information systems frequently use intranet and Internet technologies. However, these technologies have emphasized sharing rather than security, despite the sensitive and private nature of much health information. Digital certificates (electronic documents which certify an entity or its attributes) can be used to control access in clinical intranet applications. Our aims were to outline the need for access control in distributed clinical database systems, to describe the use of digital certificates and security policies, and to propose an architecture for a system using digital certificates, cryptography and security policy to control access to clinical intranet applications. We have previously developed a security policy, DIMEDAC (Distributed Medical Database Access Control), which is compatible with emerging public key and privilege management infrastructures. In our implementation approach we propose the use of digital certificates in conjunction with DIMEDAC. Our proposed access control system consists of two phases: how users gain their security credentials, and how these credentials are used to access medical data. Three types of digital certificates are used: identity certificates for authentication; attribute certificates for authorization; and access-rule certificates for propagation of access control policy. Once a user is identified and authenticated, subsequent access decisions are based on a combination of identity and attribute certificates, with access-rule certificates providing the policy framework. Access control in clinical intranet applications can be successfully and securely managed through the use of digital certificates and the DIMEDAC security policy.
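    The three-certificate decision logic can be sketched in miniature: an identity certificate authenticates, attribute certificates attest roles, and access rules map (role, resource) pairs to permitted actions. The names, rule format, and roles below are invented for illustration; DIMEDAC's actual policy is richer.

```python
# Access-rule content: which role may perform which actions on which resource.
# In the described system this policy travels in access-rule certificates.
ACCESS_RULES = {
    ("physician", "patient_record"): {"read", "write"},
    ("nurse", "patient_record"): {"read"},
}

def decide(identity_cert, attribute_certs, resource, action):
    """Grant access only if the identity certificate is valid and some valid
    attribute certificate attests a role permitting the action on the resource."""
    if not identity_cert.get("valid"):
        return False
    return any(action in ACCESS_RULES.get((ac["role"], resource), set())
               for ac in attribute_certs if ac.get("valid"))
```

    In a real deployment, "valid" would mean cryptographic verification of the certificate chain rather than a boolean flag.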

  16. A personal digital assistant application (MobilDent) for dental fieldwork data collection, information management and database handling.

    PubMed

    Forsell, M; Häggström, M; Johansson, O; Sjögren, P

    2008-11-08

    To develop a personal digital assistant (PDA) application for oral health assessment fieldwork, including back-office and database systems (MobilDent). System design, construction and implementation of PDA, back-office and database systems. System requirements for MobilDent were collected, analysed and translated into system functions. User interfaces were implemented and system architecture was outlined. MobilDent was based on a platform with .NET (Microsoft) components, using SQL Server 2005 (Microsoft) for data storage with the Windows Mobile (Microsoft) operating system. The PDA devices were Dell Axim. System functions and user interfaces were specified for MobilDent. User interfaces for PDA, back-office and database systems were based on .NET programming. The PDA user interface was based on Windows and suited to a PDA display, whereas the back-office interface was designed for a normal-sized computer screen. A synchronisation module (MS Active Sync, Microsoft) was used to enable download of field data from the PDA to the database. MobilDent is a feasible application for oral health assessment fieldwork, and the oral health assessment database may prove a valuable source for care planning, educational and research purposes. Further development of the MobilDent system will include wireless connectivity with download-on-demand technology.

  17. 78 FR 12787 - Digital Trade in the U.S. and Global Economies, Part 2; Institution of Investigation and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-25

    ...In response to a request from the Senate Committee on Finance (Committee) dated December 13, 2012 (received on December 14, 2012) under section 332(g) of the Tariff Act of 1930 (19 U.S.C. 1332(g)), the U.S. International Trade Commission has instituted the second of two investigations, investigation No. 332-540, Digital Trade in the U.S. and Global Economies, Part 2. The Commission's report in this investigation will build upon the approaches outlined in the Commission's report in the first investigation, No. 332-531, Digital Trade in the U.S. and Global Economies, Part 1, which is scheduled to be transmitted to the Committee by July 14, 2013. The Commission has previously announced that it will hold a public hearing in the two investigations on March 7, 2013.

  18. The place-value of a digit in multi-digit numbers is processed automatically.

    PubMed

    Kallai, Arava Y; Tzelgov, Joseph

    2012-09-01

    The automatic processing of the place-value of digits in a multi-digit number was investigated in 4 experiments. Experiment 1 and two control experiments employed a numerical comparison task in which the place-value of a non-zero digit was varied in a string composed of zeros. Experiment 2 employed a physical comparison task in which strings of digits varied in their physical sizes. In both types of tasks, the place-value of the non-zero digit in the string was irrelevant to the task performed. Interference of the place-value information was found in both tasks. When the non-zero digit occupied a lower place-value, it was recognized more slowly as a larger digit or as written in a larger font size. We concluded that place-value in a multi-digit number is processed automatically. These results support the notion of a decomposed representation of multi-digit numbers in memory. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  19. Parallel database search and prime factorization with magnonic holographic memory devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khitun, Alexander

    In this work, we describe the capabilities of Magnonic Holographic Memory (MHM) for parallel database search and prime factorization. MHM is a type of holographic device, which utilizes spin waves for data transfer and processing. Its operation is based on the correlation between the phases and the amplitudes of the input spin waves and the output inductive voltage. The input of MHM is provided by a phased array of spin wave generating elements, allowing the production of phase patterns of arbitrary form. The latter makes it possible to code logic states into the phases of propagating waves and exploit wave superposition for parallel data processing. We present the results of numerical modeling illustrating parallel database search and prime factorization. The results of numerical simulations on the database search are in agreement with the available experimental data. The use of classical wave interference may result in a significant speedup over conventional digital logic circuits in special-task data processing (e.g., √n in database search). Potentially, magnonic holographic devices can be implemented as complementary logic units to digital processors. Physical limitations and technological constraints of the spin wave approach are also discussed.

  20. Parallel database search and prime factorization with magnonic holographic memory devices

    NASA Astrophysics Data System (ADS)

    Khitun, Alexander

    2015-12-01

    In this work, we describe the capabilities of Magnonic Holographic Memory (MHM) for parallel database search and prime factorization. MHM is a type of holographic device, which utilizes spin waves for data transfer and processing. Its operation is based on the correlation between the phases and the amplitudes of the input spin waves and the output inductive voltage. The input of MHM is provided by a phased array of spin wave generating elements, allowing the production of phase patterns of arbitrary form. The latter makes it possible to code logic states into the phases of propagating waves and exploit wave superposition for parallel data processing. We present the results of numerical modeling illustrating parallel database search and prime factorization. The results of numerical simulations on the database search are in agreement with the available experimental data. The use of classical wave interference may result in a significant speedup over conventional digital logic circuits in special-task data processing (e.g., √n in database search). Potentially, magnonic holographic devices can be implemented as complementary logic units to digital processors. Physical limitations and technological constraints of the spin wave approach are also discussed.
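
    The √n figure quoted in this abstract is the Grover-like scaling expected for unsorted search. As a rough, purely numerical illustration of what that scaling means (not a simulation of spin-wave physics), the two operation counts can be compared directly:

```python
# Rough numeric illustration of the quoted speedup: an unsorted database
# of n entries needs O(n) probes with sequential digital logic, while the
# wave-interference approach is quoted as scaling like sqrt(n). This only
# compares the two operation counts; it does not model spin waves.
import math

for n in (10**3, 10**6, 10**9):
    classical = n                  # worst-case sequential probes
    interference = math.isqrt(n)   # quoted sqrt(n) scaling
    print(f"n={n:>10}: classical {classical:>10}, ~sqrt(n) {interference:>6}")
```
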

  1. [Research report of experimental database establishment of digitized virtual Chinese No.1 female].

    PubMed

    Zhong, Shi-zhen; Yuan, Lin; Tang, Lei; Huang, Wen-hua; Dai, Jing-xing; Li, Jian-yi; Liu, Chang; Wang, Xing-hai; Li, Hua; Luo, Shu-qian; Qin, Dulie; Zeng, Shao-qun; Wu, Tao; Zhang, Mei-chao; Wu, Kun-cheng; Jiao, Pei-feng; Lu, Yun-tao; Chen, Hao; Li, Pei-liang; Gao, Yuan; Wang, Tong; Fan, Ji-hong

    2003-03-01

    To establish the digitized virtual Chinese No.1 female (VCH-F1) image database. A 19-year-old female cadaver was scanned by CT and MRI, and perfused with red filling material through the femoral artery before freezing and embedding. The whole body was cut by a JZ1500A vertical milling machine at 0.2 mm inter-spacing. All the images were produced with a Fuji FinePix S2 Pro camera. The body index of VCH-F1 was 94%. We cut 8,556 sections of the whole body; each image was 17.5 MB in size, and the whole database reached 149.7 GB. We have a total of 6 versions of the database for different applications. Compared with other databases, VCH-F1 gives a good representation of the Chinese body shape, with colorful filling material in the blood vessels providing enough information for future registration and segmentation. Vertical embedding and cutting helped to retain a normal human physiological posture, and the image quality and operation efficiency were improved by techniques such as one-time freezing and fixation, a double-temperature icehouse, a large-diameter milling disc and whole-body cutting.

  2. Digital mapping techniques '06 - Workshop proceedings

    USGS Publications Warehouse

    Soller, David R.

    2007-01-01

    The Digital Mapping Techniques '06 (DMT'06) workshop was attended by more than 110 technical experts from 51 agencies, universities, and private companies, including representatives from 27 state geological surveys (see Appendix A of these Proceedings). This workshop was similar in nature to the previous nine meetings, which were held in Lawrence, Kansas (Soller, 1997), Champaign, Illinois (Soller, 1998), Madison, Wisconsin (Soller, 1999), Lexington, Kentucky (Soller, 2000), Tuscaloosa, Alabama (Soller, 2001), Salt Lake City, Utah (Soller, 2002), Millersville, Pennsylvania (Soller, 2003), Portland, Oregon (Soller, 2004), and Baton Rouge, Louisiana (Soller, 2005). This year's meeting was hosted by the Ohio Geological Survey, from June 11-14, 2006, on the Ohio State University campus in Columbus, Ohio. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, the latter of which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). 
The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database - and for the State and Federal geological surveys - to provide more high-quality digital maps to the public.At the 2006 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, "publishing" includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.

  3. Digital Literacy: Tools and Methodologies for Information Society

    ERIC Educational Resources Information Center

    Rivoltella, Pier Cesare, Ed.

    2008-01-01

    Currently in a state of cultural transition, global society is moving from a literary society to a digital one, adopting widespread use of advanced technologies such as the Internet and mobile devices. Digital media has an extraordinary impact on society's formative processes, forcing a pragmatic shift in their management and organization. This…

  4. Learning the Virtual Life: Public Pedagogy in a Digital World

    ERIC Educational Resources Information Center

    Trifonas, Peter Pericles, Ed.

    2011-01-01

    Digital technologies have transformed cultural perceptions of learning and what it means to be literate, expanding the importance of experience alongside interpretation and reflection. "Living the Virtual Life" offers ways to consider the local and global effects of digital media on educational environments, as well as the cultural transformations…

  5. The Profiles in Science Digital Library: Behind the Scenes.

    PubMed

    Gallagher, Marie E; Moffatt, Christie

    2012-01-01

    This demonstration shows the Profiles in Science® digital library. Profiles in Science contains digitized selections from the personal manuscript collections of prominent biomedical researchers, medical practitioners, and those fostering science and health. The Profiles in Science Web site is the delivery mechanism for content derived from the digital library system. The system is designed according to our basic principles for digital library development [1]. The digital library includes the rules and software used for digitizing items, creating and editing database records and performing quality control, as well as serving the digital content to the public. Among the types of data managed by the digital library are detailed item-level, collection-level and cross-collection metadata, digitized photographs, papers, audio clips, movies, born-digital electronic files, optical character recognized (OCR) text, and annotations (see Figure 1). The digital library also tracks the status of each item, including digitization quality, sensitivity of content, and copyright. Only items satisfying all required criteria are released to the public through the World Wide Web. External factors have influenced all aspects of the digital library's infrastructure.

  6. From Bricks and Mortar to the Public Sphere in Cyberspace: Creating a Culture of Caring on the Digital Global Commons

    ERIC Educational Resources Information Center

    Delacruz, Elizabeth M.

    2009-01-01

    This paper is intended as a broad, conceptual and theoretical treatise on the aims of teaching art in the age of global digital media. To contextualize a set of general recommendations for art education technology pedagogy, I first provide an overview of the meteoric rise of on-line social networks, and consider questions about the nature and…

  7. NYC Reservoirs Watershed Areas (HUC 12)

    EPA Pesticide Factsheets

    This NYC Reservoirs Watershed Areas (HUC 12) GIS layer was derived from the 12-Digit National Watershed Boundary Database (WBD) at 1:24,000 for EPA Region 2 and Surrounding States. HUC 12 polygons were selected from the source based on interactively comparing these HUC 12s in our GIS with images of New York City's Water Supply System Map found at http://www.nyc.gov/html/dep/html/drinking_water/wsmaps_wide.shtml. The 12-digit Hydrologic Units (HUCs) for EPA Region 2 and surrounding states (Northeastern states, parts of the Great Lakes, Puerto Rico and the USVI) are a subset of the National Watershed Boundary Database (WBD), downloaded from the Natural Resources Conservation Service (NRCS) Geospatial Gateway and imported into the EPA Region 2 Oracle/SDE database. This layer reflects 2009 updates to the WBD that included new boundary data for New York and New Jersey.

  8. Dangers of Noncritical Use of Historical Plague Data

    PubMed Central

    Roosen, Joris

    2018-01-01

    Researchers have published several articles using historical data sets on plague epidemics using impressive digital databases that contain thousands of recorded outbreaks across Europe over the past several centuries. Through the digitization of preexisting data sets, scholars have unprecedented access to the historical record of plague occurrences. However, although these databases offer new research opportunities, noncritical use and reproduction of preexisting data sets can also limit our understanding of how infectious diseases evolved. Many scholars have performed investigations using Jean-Noël Biraben’s data, which contains information on mentions of plague from various kinds of sources, many of which were not cited. When scholars fail to apply source criticism or do not reflect on the content of the data they use, the reliability of their results becomes highly questionable. Researchers using these databases going forward need to verify and restrict content spatially and temporally, and historians should be encouraged to compile the work.

  9. An Index to PGE-Ni-Cr Deposits and Occurrences in Selected Mineral-Occurrence Databases

    USGS Publications Warehouse

    Causey, J. Douglas; Galloway, John P.; Zientek, Michael L.

    2009-01-01

    Databases of mineral deposits and occurrences are essential to conducting assessments of undiscovered mineral resources. In the USGS's (U.S. Geological Survey) global assessment of undiscovered resources of copper, potash, and the platinum-group elements (PGE), only a few mineral deposit types will be evaluated. For example, only porphyry-copper and sediment-hosted copper deposits will be considered for the copper assessment. To support the global assessment, the USGS prepared comprehensive compilations of the occurrences of these two deposit types in order to develop grade and tonnage models and delineate permissive areas for undiscovered deposits of those types. This publication identifies previously published databases and database records that describe PGE, nickel, and chromium deposits and occurrences. Nickel and chromium were included in this overview because of the close association of PGE with nickel and chromium mineralization. Users of this database will need to refer to the original databases for detailed information about the deposits and occurrences. This information will be used to develop a current and comprehensive global database of PGE deposits and occurrences.

  10. Historical geoscientific collections - requirements on digital cataloging and problems

    NASA Astrophysics Data System (ADS)

    Ehling, A.

    2011-12-01

    The Federal Institute for Geosciences and Natural Resources maintains comprehensive geoscientific collections: the historical collections of the Prussian Geological Survey in Berlin (19th and 20th century; about 2 million specimens) and the geoscientific collections of the 20th century in Hannover (about 800,000 specimens). Nowadays, when financial support is strictly bound to efficiency and profitability on one side, and use of the web for research is soaring (among young people it is nearly exclusive) on the other, it is mandatory to provide information about the available stock of specimens on the web. Digital cataloging has been carried out for 20 years: up to now, about 40% of the stock has been documented in 20 Access databases. The experience of 20 years of digital cataloging, as well as contact with professional users, makes it possible to formulate the requirements on a modern digital database together with the accompanying problems. The main problems are the different kinds of specimens (minerals, rocks, fossils, drill cores) with diverging descriptions; obsolete names of minerals, rocks and geographical sites; generations of various inventory numbers; and inhomogeneous data (in quantity and quality). From this follow requirements for ample, well-educated manpower on the one side and an intelligent digital solution on the other: it should follow an internationally usable standard while accommodating all of the local problems described.

  11. A community effort to construct a gravity database for the United States and an associated Web portal

    USGS Publications Warehouse

    Keller, Gordon R.; Hildenbrand, T.G.; Kucks, R.; Webring, M.; Briesacher, A.; Rujawitz, K.; Hittleman, A.M.; Roman, D.R.; Winester, D.; Aldouri, R.; Seeley, J.; Rasillo, J.; Torres, R.; Hinze, W. J.; Gates, A.; Kreinovich, V.; Salayandia, L.

    2006-01-01

    Potential field data (gravity and magnetic measurements) are both useful and cost-effective tools for many geologic investigations. Significant amounts of these data are traditionally in the public domain. A new magnetic database for North America was released in 2002, and as a result, a cooperative effort between government agencies, industry, and universities to compile an upgraded digital gravity anomaly database, grid, and map for the conterminous United States was initiated and is the subject of this paper. This database is being crafted into a data system that is accessible through a Web portal. This data system features the database, software tools, and convenient access. The Web portal will enhance the quality and quantity of data contributed to the gravity database that will be a shared community resource. The system's totally digital nature ensures that it will be flexible so that it can grow and evolve as new data, processing procedures, and modeling and visualization tools become available. Another goal of this Web-based data system is facilitation of the efforts of researchers and students who wish to collect data from regions currently not represented adequately in the database. The primary goal of upgrading the United States gravity database and this data system is to provide more reliable data that support societal and scientific investigations of national importance. An additional motivation is the international intent to compile an enhanced North American gravity database, which is critical to understanding regional geologic features, the tectonic evolution of the continent, and other issues that cross national boundaries. © 2006 Geological Society of America. All rights reserved.

  12. Digital games in health professions education: Advantages, disadvantages, and game engagement factors.

    PubMed

    Bigdeli, Shoaleh; Kaufman, David

    2017-01-01

    Background: The application of digital educational games in health professions education is expanding and game-based education usage is increasing. Methods: Diverse databases were searched and the related papers were reviewed. Results: Considering the growing popularity of educational games in medical education, we attempted to classify their benefits, flaws, and engaging factors. Conclusion: Advantages, disadvantages, and engagement factors of educational digital games used for health professions education must be the focus of attention in designing games for the health professions discipline.

  13. Digital games in health professions education: Advantages, disadvantages, and game engagement factors

    PubMed Central

    Bigdeli, Shoaleh; Kaufman, David

    2017-01-01

    Background: The application of digital educational games in health professions education is expanding and game-based education usage is increasing. Methods: Diverse databases were searched and the related papers were reviewed. Results: Considering the growing popularity of educational games in medical education, we attempted to classify their benefits, flaws, and engaging factors. Conclusion: Advantages, disadvantages, and engagement factors of educational digital games used for health professions education must be the focus of attention in designing games for the health professions discipline. PMID:29951418

  14. Spatial Digital Database for the Geology of the San Pedro River Basin in Cochise, Gila, Graham, Pima, and Pinal Counties, Arizona

    USGS Publications Warehouse

    Bolm, Karen S.

    2002-01-01

    The map area is located in southeastern Arizona. This report describes the map units, the methods used to convert the geologic map data into a digital format, and the ArcInfo GIS file structures and relationships; and it explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. See figures 2 and 3 for page-size versions of the map compilation.

  15. Trends in the Evolution of the Public Web, 1998-2002; The Fedora Project: An Open-source Digital Object Repository Management System; State of the Dublin Core Metadata Initiative, April 2003; Preservation Metadata; How Many People Search the ERIC Database Each Day?

    ERIC Educational Resources Information Center

    O'Neill, Edward T.; Lavoie, Brian F.; Bennett, Rick; Staples, Thornton; Wayland, Ross; Payette, Sandra; Dekkers, Makx; Weibel, Stuart; Searle, Sam; Thompson, Dave; Rudner, Lawrence M.

    2003-01-01

    Includes five articles that examine key trends in the development of the public Web: size and growth, internationalization, and metadata usage; Flexible Extensible Digital Object and Repository Architecture (Fedora) for use in digital libraries; developments in the Dublin Core Metadata Initiative (DCMI); the National Library of New Zealand Te Puna…

  16. European Geophysical Society (23rd) General Assembly, Annales Geophysicae, Part 4, Nonlinear Geophysics & Natural Hazards, Supplement 4 to Volume 16, Held in Nice, France on 20-24 April 1998

    DTIC Science & Technology

    1998-01-01

    the power spectra of instrumental temperature data from the Global Summary of Day database from time scales of 1 day to 100 years. Maritime stations...continental-type spectrum to a maritime-type spectrum is investigated by averaging spectra from all stations in the database in 2°x2° grid squares...We present global and regional maps of the seismic intensity factor based on data from the NEIC Global Hypocenter Database from 1963-1994. The

  17. The influence of lumber grade on machine productivity in the rough mill

    Treesearch

    Philip H. Steele; Jan Wiedenbeck; Rubin Shmulsky; Anura Perera; Anura Perera

    1999-01-01

    Lumber grade effect on hardwood-part processing time was investigated with a digitally described lumber database in conjunction with a crosscut-first rough mill yield optimization simulator. In this study, the digital lumber sample was subdivided into five hardwood lumber grades. Three cutting bills with varying degrees of difficulty were cut. The three cutting...

  18. Digital Library Services: Perceptions and Expectations of User Communities and Librarians in a New Zealand Academic Library.

    ERIC Educational Resources Information Center

    Xia, Wei

    2003-01-01

    Provides an overview of research conducted at Victoria University of Wellington regarding differing perceptions and expectations of user communities and librarians related to the usability of digital services. Considers access to services, currency of information on the Web site, the online public access catalog, databases, electronic journals,…

  19. Digital hand atlas for web-based bone age assessment: system design and implementation

    NASA Astrophysics Data System (ADS)

    Cao, Fei; Huang, H. K.; Pietka, Ewa; Gilsanz, Vicente

    2000-04-01

    A frequently used method of skeletal age assessment is atlas matching, in which a radiological examination of a hand image is compared against a small set of Greulich-Pyle patterns of normal standards. The method, however, can lead to significant deviation in age assessment, owing to the variety of observers with different levels of training. The Greulich-Pyle atlas, based on middle- and upper-class white populations of the 1950s, is also not fully applicable to children of today, especially regarding standard development in other racial groups. In this paper, we present our system design and initial implementation of a digital hand atlas and computer-aided diagnostic (CAD) system for Web-based bone age assessment. The digital atlas will remove the disadvantages of the currently out-of-date one and allow bone age assessment to be computerized and done conveniently via the Web. The system consists of a hand atlas database, a CAD module and a Java-based Web user interface. The atlas database is based on a large set of clinically normal hand images of diverse ethnic groups. The Java-based Web user interface allows users to interact with the hand image database from browsers. Users can use a Web browser to push a clinical hand image to the CAD server for bone age assessment. Quantitative features of the examined image, which reflect the skeletal maturity, are then extracted and compared with patterns from the atlas database to assess the bone age.
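
    The matching step this record describes (extract features, compare against atlas patterns, report the closest age) can be sketched in miniature. The feature names and values below are invented for illustration; the record does not specify the actual CAD features or distance measure.

```python
# Hypothetical sketch of atlas matching for bone age assessment: the age
# of the atlas pattern whose feature vector is nearest (Euclidean) to the
# examined image's features is returned. All numbers are invented.
import math

# (age_in_years, feature_vector) pairs standing in for atlas patterns
ATLAS = [
    (6.0,  [0.31, 0.12, 0.55]),
    (9.0,  [0.42, 0.25, 0.61]),
    (12.0, [0.58, 0.41, 0.70]),
]

def assess_bone_age(features):
    """Return the atlas age whose feature vector is closest to `features`."""
    age, _ = min(ATLAS, key=lambda p: math.dist(features, p[1]))
    return age

print(assess_bone_age([0.44, 0.27, 0.60]))  # nearest to the 9-year pattern
```
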

  20. Incorporating the APS Catalog of the POSS I and Image Archive in ADS

    NASA Technical Reports Server (NTRS)

    Humphreys, Roberta M.

    1998-01-01

    The primary purpose of this contract was to develop the software to both create and access an on-line database of images from digital scans of the Palomar Sky Survey. This required modifying our DBMS (called Star Base) to create an image database from the actual raw pixel data from the scans. The digitized images are processed into a set of coordinate-reference index and pixel files that are stored in run-length files, thus achieving an efficient lossless compression. For efficiency and ease of referencing, each digitized POSS I plate is then divided into 900 subplates. Our custom DBMS maps each query into the corresponding POSS plate(s) and subplate(s). All images from the appropriate subplates are retrieved from disk with byte-offsets taken from the index files. These are assembled on-the-fly into a GIF image file for browser display, and a FITS format image file for retrieval. The FITS images have a pixel size of 0.33 arcseconds. The FITS header contains astrometric and photometric information. This method keeps the disk requirements manageable while allowing for future improvements. When complete, the APS Image Database will contain over 130 Gb of data. A set of Web query forms is available on-line, as well as an on-line tutorial and documentation. The database is distributed to the Internet by a high-speed SGI server and a high-bandwidth disk system. URL is http://aps.umn.edu/IDB/. The image database software is written in perl and C and has been compiled on SGI computers with IRIX 5.3. A copy of the written documentation is included and the software is on the accompanying exabyte tape.
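
    The plate-to-subplate lookup this record describes can be sketched as follows. The 30x30 grid geometry, nominal plate size, and index layout below are assumptions for illustration only; the APS system's actual file formats are not shown.

```python
# Minimal sketch of the subplate lookup scheme: each plate is divided into
# a 30x30 grid of 900 subplates, and a query position maps to a subplate
# number from which pixel data would be read at a stored byte offset.
# Grid geometry and index layout are assumed, not taken from the source.

GRID = 30                      # 30 x 30 = 900 subplates per plate
PLATE_SIZE = 6.5               # nominal plate width in degrees (assumed)

def subplate_index(dx_deg, dy_deg):
    """Map an offset from the plate corner (degrees) to a subplate number."""
    col = min(int(dx_deg / PLATE_SIZE * GRID), GRID - 1)
    row = min(int(dy_deg / PLATE_SIZE * GRID), GRID - 1)
    return row * GRID + col

# a toy index: subplate number -> byte offset into the run-length pixel file
index = {subplate_index(3.2, 3.2): 1_048_576}

print(subplate_index(0.0, 0.0))   # 0 (corner subplate)
print(subplate_index(3.2, 3.2))   # subplate near the plate centre
```

    Indexing by subplate keeps each query to a handful of small reads at known byte offsets instead of scanning a whole multi-hundred-megabyte plate scan.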

  1. A New Approach to Data Publication in Ocean Sciences

    NASA Astrophysics Data System (ADS)

    Lowry, Roy; Urban, Ed; Pissierssens, Peter

    2009-12-01

    Data are collected from ocean sciences activities that range from a single investigator working in a laboratory to large teams of scientists cooperating on big, multinational, global ocean research projects. What these activities have in common is that all result in data, some of which are used as the basis for publications in peer-reviewed journals. However, two major problems regarding data remain. First, many data valuable for understanding ocean physics, chemistry, geology, biology, and how the oceans operate in the Earth system are never archived or made accessible to other scientists. Data underlying traditional journal articles are often difficult to obtain. Second, when scientists do contribute data to databases, their data become freely available, with little acknowledgment and no contribution to their career advancement. To address these problems, stronger ties must be made between data repositories and academic journals, and a “digital backbone” needs to be created for data related to journal publications.

  2. Multimedia consultation session recording and playback using Java-based browser in global PACS

    NASA Astrophysics Data System (ADS)

    Martinez, Ralph; Shah, Pinkesh J.; Yu, Yuan-Pin

    1998-07-01

    The current version of the Global PACS software system uses a Java-based implementation of the Remote Consultation and Diagnosis (RCD) system. The Java RCD includes a multimedia consultation session between physicians that includes text, static image, image annotation, and audio data. The Java RCD allows 2-4 physicians to collaborate on a patient case. It allows physicians to join the session via Java-enabled WWW browsers or a stand-alone RCD application. The RCD system includes a distributed database archive system for archiving and retrieving patient and session data. The RCD system can be used for store and forward scenarios, case reviews, and interactive RCD multimedia sessions. The RCD system operates over the Internet, telephone lines, or in a private Intranet. A multimedia consultation session can be recorded, and then played back at a later time for review, comments, and education. A session can be played back using Java-enabled WWW browsers on any operating system platform. The Java RCD system shows that a case diagnosis can be captured digitally and played back with the original real-time temporal relationships between data streams. In this paper, we describe the design and implementation of the RCD session playback.

  3. Using digital databases to create geologic maps for the 21st century : a GIS model for geologic, environmental, cultural and transportation data from southern Rhode Island

    DOT National Transportation Integrated Search

    2002-05-01

    Knowledge of surface and subsurface geology is fundamental to the planning and development of new or modified transportation systems. Toward this end, we have compiled a model GIS database consisting of important geologic, cartographic, environment...

  4. Back to the Scriptorium: Database Marketplace 2009

    ERIC Educational Resources Information Center

    Tenopir, Carol; Baker, Gayle; Grogg, Jill E.

    2009-01-01

    The 2009 database marketplace is bounded by two extremes: massive digitization projects to increase access, and retrenchment owing to budget worries. Picture medieval monks hunched over their desks in the scriptorium as they labor to copy manuscripts. A 21st-century version of this activity is being repeated daily in the world's libraries and…

  5. "There's so Much Data": Exploring the Realities of Data-Based School Governance

    ERIC Educational Resources Information Center

    Selwyn, Neil

    2016-01-01

    Educational governance is commonly predicated around the generation, collation and processing of data through digital technologies. Drawing upon an empirical study of two Australian secondary schools, this paper explores the different forms of data-based governance that are being enacted by school leaders, managers, administrators and teachers.…

  6. Computerization of the Arkansas Fishes Database

    Treesearch

    Henry W. Robison; L. Gayle Henderson; Melvin L. Warren; Janet S. Rader

    2004-01-01

    Abstract - Until recently, distributional data for the fishes of Arkansas existed in the form of museum records, field notebooks of various ichthyologists, and published fish survey data; none of which was in a digital format. In 1995, a relational database system was used to design a PC platform data entry module for the capture of information on...

  7. Sentence-Based Metadata: An Approach and Tool for Viewing Database Designs.

    ERIC Educational Resources Information Center

    Boyle, John M.; Gunge, Jakob; Bryden, John; Librowski, Kaz; Hanna, Hsin-Yi

    2002-01-01

    Describes MARS (Museum Archive Retrieval System), a research tool which enables organizations to exchange digital images and documents by means of a common thesaurus structure, and merge the descriptive data and metadata of their collections. Highlights include theoretical basis; searching the MARS database; and examples in European museums.…

  8. A compilation of spatial digital databases for selected U.S. Geological Survey nonfuel mineral resource assessments for parts of Idaho and Montana

    USGS Publications Warehouse

    Carlson, Mary H.; Zientek, Michael L.; Causey, J. Douglas; Kayser, Helen Z.; Spanski, Gregory T.; Wilson, Anna B.; Van Gosen, Bradley S.; Trautwein, Charles M.

    2007-01-01

    This report compiles selected results from 13 U.S. Geological Survey (USGS) mineral resource assessment studies conducted in Idaho and Montana into consistent spatial databases that can be used in a geographic information system. The 183 spatial databases represent areas of mineral potential delineated in these studies and include attributes on mineral deposit type, level of mineral potential, certainty, and a reference. The assessments were conducted for five 1° x 2° quadrangles (Butte, Challis, Choteau, Dillon, and Wallace), several U.S. Forest Service (USFS) National Forests (including Challis, Custer, Gallatin, Helena, and Payette), and one Bureau of Land Management (BLM) Resource Area (Dillon). The data contained in the spatial databases are based on published information: no new interpretations are made. This digital compilation is part of an ongoing effort to provide mineral resource information formatted for use in spatial analysis. In particular, this is one of several reports prepared to address USFS needs for science information as forest management plans are revised in the Northern Rocky Mountains.

  9. Ethical implications of digital images for teaching and learning purposes: an integrative review.

    PubMed

    Kornhaber, Rachel; Betihavas, Vasiliki; Baber, Rodney J

    2015-01-01

    Digital photography has simplified the process of capturing and utilizing medical images. The process of taking high-quality digital photographs has been recognized as efficient, timely, and cost-effective. In particular, the evolution of smartphones and comparable technologies has made them a vital component in the teaching and learning of health care professionals. However, ethical standards in relation to digital photography for teaching and learning have not always been of the highest standard. The inappropriate utilization of digital images within the health care setting has the capacity to compromise patient confidentiality and increase the risk of litigation. Therefore, the aim of this review was to investigate the literature concerning the ethical implications for health professionals utilizing digital photography for teaching and learning. A literature search was conducted in five electronic databases (PubMed, Embase [Excerpta Medica Database], Cumulative Index to Nursing and Allied Health Literature, Educational Resources Information Center, and Scopus), limited to the English language. Studies that endeavored to evaluate the ethical implications of digital photography for teaching and learning purposes in the health care setting were included. The search strategy identified 514 papers, of which nine were retrieved for full review. Four papers were excluded based on the inclusion criteria, leaving five papers for final analysis. Three key themes were developed: knowledge deficit, consent and beyond, and standards driving scope of practice. The assimilation of evidence in this review suggests that there is value for health professionals in utilizing digital photography for teaching purposes in health education. However, there is limited understanding of the processes of obtaining, storing, and using such media for teaching purposes. Disparities in policy and guideline identification and development in clinical practice were also highlighted. Therefore, the implementation of policy to guide practice requires further research.

  10. A perceptive method for handwritten text segmentation

    NASA Astrophysics Data System (ADS)

    Lemaitre, Aurélie; Camillerapp, Jean; Coüasnon, Bertrand

    2011-01-01

    This paper presents a new method for segmenting handwritten text into text lines and words. We propose a method based on the cooperation among points of view that localizes the text lines in a low-resolution image and then associates the pixels at a higher level of resolution. Thanks to this combination of levels of vision, we can detect overlapping characters and re-segment the connected components during the analysis. We then propose a segmentation of lines into words based on the cooperation between digital data and symbolic knowledge. The digital data are distances inside a Delaunay graph, which gives a precise distance between connected components at the pixel level. We introduce structural rules to take into account generic knowledge about the organization of a text page. This cooperation among information sources gives greater expressive power and ensures the global coherence of the recognition. We validate this work using the metrics and the database proposed for the segmentation contest of ICDAR 2009, and we show that our method compares favorably with the other methods in the literature. In particular, we are able to deal with slope and curvature, overlapping text lines, and varied kinds of writing, which are the main difficulties met by the other methods.
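    The Delaunay-graph distance measure the abstract describes can be sketched as follows. This is an illustrative simplification using component centroids (the paper computes distances at the pixel level) and `scipy.spatial.Delaunay`:

    ```python
    import numpy as np
    from scipy.spatial import Delaunay

    def neighbor_distances(centroids):
        """Distance between each pair of connected components that are
        neighbors in the Delaunay triangulation of their centroids."""
        tri = Delaunay(centroids)
        edges = set()
        for simplex in tri.simplices:          # each simplex is a triangle
            for i in range(3):
                a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
                edges.add((a, b))
        return {(a, b): float(np.linalg.norm(centroids[a] - centroids[b]))
                for a, b in edges}

    # Four component centroids in convex position: 2 triangles, 5 edges
    dists = neighbor_distances(np.array([[0.0, 0.0], [1.0, 0.0],
                                         [0.0, 1.0], [1.2, 1.1]]))
    ```

    Restricting word-gap measurements to Delaunay-neighbor pairs avoids comparing every component against every other, while still capturing the locally nearest gaps that matter for word segmentation.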

  11. Visual-spatial abilities relate to mathematics achievement in children with heavy prenatal alcohol exposure.

    PubMed

    Crocker, Nicole; Riley, Edward P; Mattson, Sarah N

    2015-01-01

    The current study examined the relationship between mathematics and attention, working memory, and visual memory in children with heavy prenatal alcohol exposure and controls. Subjects were 56 children (29 AE, 27 CON) who were administered measures of global mathematics achievement (WRAT-3 Arithmetic & WISC-III Written Arithmetic), attention (WISC-III Digit Span forward and Spatial Span forward), working memory (WISC-III Digit Span backward and Spatial Span backward), and visual memory (CANTAB Spatial Recognition Memory and Pattern Recognition Memory). The contribution of cognitive domains to mathematics achievement was analyzed using linear regression techniques. Attention, working memory, and visual memory data were entered together on Step 1, followed by group on Step 2 and the interaction terms on Step 3. Model 1 accounted for a significant amount of variance in both mathematics achievement measures; however, model fit improved with the addition of group on Step 2. Significant predictors of mathematics achievement were Spatial Span forward and backward and Spatial Recognition Memory. These findings suggest that deficits in spatial processing may be related to math impairments seen in FASD. In addition, prenatal alcohol exposure was associated with deficits in mathematics achievement, above and beyond the contribution of general cognitive abilities. PsycINFO Database Record (c) 2015 APA, all rights reserved.
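    The stepwise procedure described here is hierarchical regression: fit nested models and ask how much the fit improves as each block of predictors is added. A minimal sketch with simulated data (variable names and effect sizes below are invented; only the sample size matches the study):

    ```python
    import numpy as np

    def r_squared(X, y):
        """R^2 of an ordinary-least-squares fit with intercept."""
        X1 = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        resid = y - X1 @ beta
        return 1 - resid.var() / y.var()

    rng = np.random.default_rng(0)
    n = 56  # same sample size as the study; the data are simulated
    spatial = rng.normal(size=n)
    digit = rng.normal(size=n)
    group = rng.integers(0, 2, size=n).astype(float)  # 0 = control, 1 = exposed
    math = 0.6 * spatial - 0.5 * group + rng.normal(scale=0.5, size=n)

    # Step 1: cognitive predictors only; Step 2: add group membership
    r2_step1 = r_squared(np.column_stack([spatial, digit]), math)
    r2_step2 = r_squared(np.column_stack([spatial, digit, group]), math)
    delta_r2 = r2_step2 - r2_step1  # fit improvement attributable to group
    ```

    A nonzero `delta_r2` is what "model fit improved with the addition of group on Step 2" refers to: group explains variance beyond the cognitive measures alone.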

  12. Slumdog romance: Facebook love and digital privacy at the margins.

    PubMed

    Arora, Payal; Scheiber, Laura

    2017-04-01

    Facebook has consolidated its position as the one-stop-shop for social activity among the poor in the global South. Sex, romance, and love are key motivations for mobile and Internet technology usage among this demographic, much as in the West. Digital romance is a critical context through which we gain fresh perspectives on Internet governance for an emerging digital and globalizing public. Revenge porn, slut-shaming, and Internet romance scams are a common and growing malady worldwide. Focusing on how it manifests in diverse digital cultures will aid in the shaping of new Internet laws for a more inclusive cross-cultural public. Specifically, this article examines how low-income youth in two of the BRICS (Brazil, Russia, India, China and South Africa) nations - Brazil and India - exercise and express their notions on digital privacy, surveillance, and trust through the lens of romance. This allows for a more thorough investigation of the relationship between sexuality, morality, and governance within the larger Facebook ecology. As Facebook becomes the dominant virtual public sphere for the world's poor, we are compelled to ask whether inclusivity of the digital users comes at the price of diversity of digital platforms.

  13. Slumdog romance: Facebook love and digital privacy at the margins

    PubMed Central

    Arora, Payal; Scheiber, Laura

    2017-01-01

    Facebook has consolidated its position as the one-stop-shop for social activity among the poor in the global South. Sex, romance, and love are key motivations for mobile and Internet technology usage among this demographic, much as in the West. Digital romance is a critical context through which we gain fresh perspectives on Internet governance for an emerging digital and globalizing public. Revenge porn, slut-shaming, and Internet romance scams are a common and growing malady worldwide. Focusing on how it manifests in diverse digital cultures will aid in the shaping of new Internet laws for a more inclusive cross-cultural public. Specifically, this article examines how low-income youth in two of the BRICS (Brazil, Russia, India, China and South Africa) nations – Brazil and India – exercise and express their notions on digital privacy, surveillance, and trust through the lens of romance. This allows for a more thorough investigation of the relationship between sexuality, morality, and governance within the larger Facebook ecology. As Facebook becomes the dominant virtual public sphere for the world’s poor, we are compelled to ask whether inclusivity of the digital users comes at the price of diversity of digital platforms. PMID:29708133

  14. Launching Discovery through a Digital Library Portal: SIOExplorer

    NASA Astrophysics Data System (ADS)

    Miller, S. P.; Staudigel, H.; Johnson, C.; McSherry, K.; Clark, D.; Peckman, U.; Helly, J.; Sutton, D.; Chase, A.; Schottlaender, B. E.; Day, D.; Helly, M.

    2003-12-01

    The launching of an oceanographic expedition has its own brand of excitement, with the sound of the main engines firing up, and the lifting of the gangway in a foreign port, as the team of scientists and crew sets out for a month at sea with only the resources they have aboard. Although this adventure is broadly appealing, very few have the privilege of actually joining an expedition. With the "SIOExplorer" family of projects we are now beginning to open this experience across cyberspace to a wide range of students and teachers. What began two years ago as an effort to stabilize the Scripps Institution of Oceanography (SIO) data archives from more than 700 cruises going back 50 years, has now become an operational component of the National Science Digital Library (NSDL; www.nsdl.org), complete with thousands of historic photographs, full text documents and 3D visualization experiences. Our initial emphasis has been on marine geology and geophysics, in particular multibeam seafloor mapping, including 2 terabytes of digital objects. The IT architecture implemented at the San Diego Supercomputer Center (SDSC) streamlines the integration of additional projects in other disciplines with a suite of metadata management and collection building tools for "arbitrary digital objects." The "CruiseViewer" Java application is the primary portal to the digital library, providing a graphical user and display interface, the interface with the metadata database, and the interface with the SDSC "Storage Resource Broker" for long-term bulk distributed data storage management. It presents the user with a view of the available objects, overlaid on a global topography map. Geospatial objects can be selected interactively, and searches can be constrained by keywords. Metadata can be browsed and objects can be viewed onscreen or downloaded for further analysis, with automatic proprietary-hold request management. 
These efforts will be put to the test with national teacher workshops in the next two summers. Teachers, in collaboration with SIO graduate students, will prepare and field-test learning-experience modules that explore concepts from plate tectonics theory for classroom and web use. Students will design their own personal voyages of discovery through our digital archives, promoting inquiry-based learning tailored to each individual. Future education and outreach efforts will include 1) developing a global registry of seafloor research or education projects (academic, industry, government), allowing at least a URL and a contact for further information, 2) adding new collections, including dredged rocks and cores, 3) interoperating with other international data collections, 4) interacting with education and outreach projects such as the California Center for Ocean Science Education Excellence (COSEE), 5) continued testing of a real-time stand-alone digital library on a laptop shipboard acquisition system, 6) enhanced use of Real-time Observatories, Applications, and Data management Network (ROADnet) satellite links to SIO vessels, and 7) continued construction of a series of museum exhibits based on digital terrain models. Now that SIOExplorer has become operational, we look forward to collaborating with other institutions for data and technology exchange, as well as for education and outreach opportunities. Support is provided by NSF NSDL, ITR and OCE programs, as well as by UCSD funds.

  15. Digitizing Ethiopic: Coding for Linguistic Continuity in the Face of Digital Extinction

    ERIC Educational Resources Information Center

    Zaugg, Isabelle Alice

    2017-01-01

    Despite the growing sophistication of digital technologies, it appears they are contributing to language extinction on a par with devastating losses in biodiversity. With language extinction comes loss of identity, inter-generational cohesion, culture, and a global wealth of knowledge to address future problems facing humanity. Linguists estimate…

  16. Development of Digital Instruction for Environment for Global Warming Alleviation

    ERIC Educational Resources Information Center

    Praneetham, Chuleewan; Thathong, Kongsak

    2016-01-01

    Technological education and instruction are widely used in the present education trend. Using of digital instruction for environmental subject can encourage students in learning and raise their awareness and attitude on environmental issues. The purposes of this research were: 1) to construct and develop the digital instruction for environment for…

  17. Liberating the Arts: Promoting IT Fluency through the Pedagogy of Digital Storytelling

    ERIC Educational Resources Information Center

    Warren, Kenneth Xavier, Jr.

    2011-01-01

    Fluency in information technology (IT Fluency) is a component of life-long learning necessary for a Liberal Arts student's post-graduate success in the global and digital economy. While challenging, promoting IT fluency at Liberal Arts colleges can be achieved through the integration of digital storytelling pedagogy in existing humanities…

  18. Digital Portfolios and Learning: The Students' Voices

    ERIC Educational Resources Information Center

    Donnelly, Brian Francis

    2010-01-01

    The convergence of innovations in digital technologies and expanding global internet connectivity has given rise to an emerging field of study identified as Digital Media and Learning (DML). (Davidson and Goldberg, 2009; Gee, 2009; Ito, Horst and Bittanti, 2008; Jenkins and Purushotma, 2008). In describing his work for the MacArthur Foundation's…

  19. Building Expertise to Support Digital Scholarship: A Global Perspective

    ERIC Educational Resources Information Center

    Lewis, Vivian; Spiro, Lisa; Wang, Xuemao; Cawthorne, Jon E.

    2015-01-01

    This report sheds light on the expertise required to support a robust and sustainable digital scholarship (DS) program. It focuses first on defining and describing the key domain knowledge, skills, competencies, and mindsets at some of the world's most prominent digital scholarship programs. It then identifies the main strategies used to build…

  20. Revisiting the Digital Divide in the Context of a "Flattening" World

    ERIC Educational Resources Information Center

    Subramony, Deepak Prem

    2014-01-01

    This article employs a variety of theoretical lenses to describe the nature and ramifications of the Digital Divide, which, the author states, continues to remain one of the biggest social challenges to confront the human race in modern times--even as technological advances, globalization, and other socioeconomic shifts are rendering digital media…

  1. Education for Agriculture and Rural Development in Low-Income Countries: Implications of the Digital Divide.

    ERIC Educational Resources Information Center

    Gasperini, Lavinia; Mclean, Scott

    The "digital divide" refers to inequitable access to information and communication technologies (ICTs) between wealthy and poor countries and between privileged and underprivileged social groups within all countries. This presentation outlines global parameters of the digital divide, discusses the use of ICTs in education in…

  2. Worldwide Engagement for Digitizing Biocollections (WeDigBio): The Biocollections Community's Citizen-Science Space on the Calendar

    PubMed Central

    Kimberly, Paul; Guralnick, Robert; Flemons, Paul; Love, Kevin; Ellis, Shari; Allen, Julie M; Best, Jason H; Carter, Richard; Chagnoux, Simon; Costello, Robert; Denslow, Michael W; Dunckel, Betty A; Ferriter, Meghan M; Gilbert, Edward E; Goforth, Christine; Groom, Quentin; Krimmel, Erica R; LaFrance, Raphael; Martinec, Joann Lacey; Miller, Andrew N; Minnaert-Grote, Jamie; Nash, Thomas; Oboyski, Peter; Paul, Deborah L; Pearson, Katelin D; Pentcheff, N Dean; Roberts, Mari A; Seltzer, Carrie E; Soltis, Pamela S; Stephens, Rhiannon; Sweeney, Patrick W; von Konrat, Matt; Wall, Adam; Wetzer, Regina; Zimmerman, Charles; Mast, Austin R

    2018-01-01

    Abstract The digitization of biocollections is a critical task with direct implications for the global community who use the data for research and education. Recent innovations to involve citizen scientists in digitization increase awareness of the value of biodiversity specimens; advance science, technology, engineering, and math literacy; and build sustainability for digitization. In support of these activities, we launched the first global citizen-science event focused on the digitization of biodiversity specimens: Worldwide Engagement for Digitizing Biocollections (WeDigBio). During the inaugural 2015 event, 21 sites hosted events where citizen scientists transcribed specimen labels via online platforms (DigiVol, Les Herbonautes, Notes from Nature, the Smithsonian Institution's Transcription Center, and Symbiota). Many citizen scientists also contributed off-site. In total, thousands of citizen scientists around the world completed over 50,000 transcription tasks. Here, we present the process of organizing an international citizen-science event, an analysis of the event's effectiveness, and future directions—content now foundational to the growing WeDigBio event. PMID:29599548

  3. Energy savings opportunities in the global digital television transition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Won Young; Gopal, Anand; Phadke, Amol

    Globally, terrestrial television (TV) broadcasting is in the midst of a complete transition to digital signals. The last analog terrestrial broadcast is expected to be switched off in the early 2020s. This transition presents huge energy savings opportunities that have thus far been ignored. Digital TV switchovers have likely increased energy consumption as countries have completed transitions by providing digital TV converters to analog TV users, which increase energy consumption and extend the life of energy-inefficient analog TVs. We find that if analog TVs were retired at the time of a digital switchover and replaced with super-efficient flat-panel TVs, such as light-emitting diode (LED) backlit liquid crystal display (LCD) TVs, there is a combined electricity savings potential of 32 terawatt hours (TWh) per year in countries that have not yet completed their digital TV transition. In view of these findings as well as the dramatic drops in super-efficient TV prices and the unique early-retirement opportunity resulting from cessation of terrestrial analog broadcasts, TV-exchange programs would easily and substantially advance energy efficiency.
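    The headline savings figure is, at its core, a product of fleet size, per-set power difference, and viewing hours. A back-of-envelope sketch with arbitrary illustrative inputs (these numbers are assumptions for demonstration, not the paper's actual model or data):

    ```python
    # Illustrative inputs only -- assumed, not taken from the study
    n_analog_tvs = 500e6          # analog TVs still in use at switchover
    analog_watts = 100            # assumed average analog CRT power draw
    led_lcd_watts = 40            # assumed average LED-backlit LCD power draw
    viewing_hours_per_day = 5

    saved_wh_per_year = (n_analog_tvs * (analog_watts - led_lcd_watts)
                         * viewing_hours_per_day * 365)
    saved_twh_per_year = saved_wh_per_year / 1e12   # 1 TWh = 1e12 Wh
    ```

    With these toy inputs the savings land in the tens of TWh per year, the same order of magnitude as the paper's 32 TWh estimate; the real analysis rests on country-level fleet and usage data.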

  4. Energy savings opportunities in the global digital television transition

    DOE PAGES

    Park, Won Young; Gopal, Anand; Phadke, Amol

    2016-12-20

    Globally, terrestrial television (TV) broadcasting is in the midst of a complete transition to digital signals. The last analog terrestrial broadcast is expected to be switched off in the early 2020s. This transition presents huge energy savings opportunities that have thus far been ignored. Digital TV switchovers have likely increased energy consumption as countries have completed transitions by providing digital TV converters to analog TV users, which increase energy consumption and extend the life of energy-inefficient analog TVs. We find that if analog TVs were retired at the time of a digital switchover and replaced with super-efficient flat-panel TVs, such as light-emitting diode (LED) backlit liquid crystal display (LCD) TVs, there is a combined electricity savings potential of 32 terawatt hours (TWh) per year in countries that have not yet completed their digital TV transition. In view of these findings as well as the dramatic drops in super-efficient TV prices and the unique early-retirement opportunity resulting from cessation of terrestrial analog broadcasts, TV-exchange programs would easily and substantially advance energy efficiency.

  5. The One Universal Graph — a free and open graph database

    NASA Astrophysics Data System (ADS)

    Ng, Liang S.; Champion, Corbin

    2016-02-01

    Recent developments in graph databases are mostly huge projects involving big organizations, big operations and big capital, as the name Big Data attests. We propose the concept of the One Universal Graph (OUG), which states that all observable and known objects and concepts (physical, conceptual or digitally represented) can be connected in one single graph; furthermore, the OUG can be implemented with a very simple text file format and free software, capable of being executed on Android or smaller devices. As such, the One Universal Graph Data Exchange (GOUDEX) modules can potentially be installed on the hundreds of millions of Android devices and Intel-compatible computers shipped annually. Coupled with its open nature and its ability to connect to the leading search engines and databases currently in operation, GOUDEX has the potential to become the largest and a better interface for users and programmers to interact with data on the Internet. With a web user interface for users to use and program in a native Linux environment, Free Crowdware implemented in GOUDEX can help inexperienced users learn programming with better-organized documentation for free software, and is able to manage programmers' contributions down to a single line of code or a single variable in software projects. It can become the first practically realizable "Internet brain" on which a global artificial intelligence system can be implemented. Being practically free and open, the One Universal Graph can have significant applications in robotics, artificial intelligence and social networks.
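    The abstract's claim that a universal graph "can be implemented with a very simple text file format" can be illustrated with a minimal sketch. The line format and triples below are hypothetical, since the paper does not specify the actual GOUDEX format:

    ```python
    # Hypothetical format: one whitespace-separated triple per line,
    # "subject predicate object". Example data is invented.
    SAMPLE = """\
    alice knows bob
    bob likes graphs
    graphs stored_in text_file
    """

    def parse_graph(text):
        """Parse triple lines into an adjacency dict:
        subject -> list of (predicate, object) edges."""
        graph = {}
        for line in text.splitlines():
            if not line.strip():
                continue
            subj, pred, obj = line.split()
            graph.setdefault(subj, []).append((pred, obj))
        return graph

    g = parse_graph(SAMPLE)
    ```

    A plain-text edge list like this needs no database engine at all, which is what would make such a format viable on small Android-class devices.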

  6. A Database Management System Application for the Graduate Programs Office of the School of Systems and Logistics. Volume 2. Technical Reference Manual

    DTIC Science & Technology

    1988-09-01

    [Garbled OCR excerpt of the manual's database dictionary tables. Recoverable fragments include character fields for the local city, the local state (two-letter abbreviation), a five- or nine-digit ZIP code, and a phone number; a database dictionary for C:\BASE\PAS1E.MF with a two-character field holding the third and fourth digits of the PAS code; and a report library entry (C:\ASE\GPO.RP1) with print options.]

  7. Global Education and Intercultural Awareness in eTwinning

    ERIC Educational Resources Information Center

    Camilleri, Rose-anne

    2016-01-01

    Students today are facing a global society which is interconnected. This necessitates competencies in digital and cultural integration skills to become successful global citizens. This study reviews the benefits and challenges of global education and intercultural interaction amongst students participating in eTwinning projects between various…

  8. Global soil-climate-biome diagram: linking soil properties to climate and biota

    NASA Astrophysics Data System (ADS)

    Zhao, X.; Yang, Y.; Fang, J.

    2017-12-01

    As a critical component of the Earth system, soils interact strongly with both climate and biota and provide fundamental ecosystem services that maintain food, climate, and human security. Despite significant progress in digital soil mapping techniques and the rapidly growing quantity of observed soil information, quantitative linkages between soil properties, climate and biota at the global scale remain unclear. By compiling a large global soil database, we mapped seven major soil properties (bulk density [BD]; sand, silt and clay fractions; soil pH; soil organic carbon [SOC] density [SOCD]; and soil total nitrogen [STN] density [STND]) based on machine learning algorithms (regional random forest [RF] model) and quantitatively assessed the linkage between soil properties, climate and biota at the global scale. Our results demonstrated a global soil-climate-biome diagram, which improves our understanding of the strong correspondence between soils, climate and biomes. Soil pH decreased with greater mean annual precipitation (MAP) and lower mean annual temperature (MAT), and the critical MAP for the transition from alkaline to acidic soil pH decreased with decreasing MAT. Specifically, the critical MAP ranged from 400-500 mm when the MAT exceeded 10 °C but could decrease to 50-100 mm when the MAT was approximately 0 °C. SOCD and STND were tightly linked; both increased in accordance with lower MAT and higher MAP across terrestrial biomes. Global stocks of SOC and STN were estimated to be 788 ± 39.4 Pg (10^15 g, or billion tons) and 63 ± 3.3 Pg in the upper 30-cm soil layer, respectively, but these values increased to 1654 ± 94.5 Pg and 133 ± 7.8 Pg in the upper 100-cm soil layer, respectively. These results reveal quantitative linkages between soil properties, climate and biota at the global scale, suggesting co-evolution of the soil, climate and biota under conditions of global environmental change.
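    The core workflow here, training a random forest on point observations of a soil property against climate covariates and then predicting onto a grid, can be sketched with scikit-learn. The data, the toy pH-climate relationship, and the grid below are all simulated for illustration; they are not the study's data or model:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(42)
    # Simulated point observations: columns = MAT (deg C), MAP (mm)
    climate = np.column_stack([rng.uniform(-5, 25, 300),
                               rng.uniform(50, 2000, 300)])
    # Toy relationship: soil pH is lower where wetter and warmer, plus noise
    ph = (8.0 - 0.002 * climate[:, 1] - 0.03 * climate[:, 0]
          + rng.normal(0, 0.2, 300))

    rf = RandomForestRegressor(n_estimators=100, random_state=0)
    rf.fit(climate, ph)

    # Predict onto a coarse MAT x MAP grid, as in digital soil mapping
    mat_g, map_g = np.meshgrid(np.linspace(-5, 25, 10),
                               np.linspace(50, 2000, 10))
    grid = np.column_stack([mat_g.ravel(), map_g.ravel()])
    ph_map = rf.predict(grid).reshape(mat_g.shape)
    ```

    The real study fits regional RF models to a compiled global database and maps seven properties, but the fit-then-predict-on-grid structure is the same.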

  9. Structure and needs of global loss databases about natural disaster

    NASA Astrophysics Data System (ADS)

    Steuer, Markus

    2010-05-01

    Global loss databases are used for trend analyses and statistics in scientific projects, in studies for governmental and nongovernmental organizations, and by the insurance and finance industry. At the moment three global data sets are established: EM-DAT (CRED), Sigma (Swiss Re) and NatCatSERVICE (Munich Re). Together with the Asian Disaster Reduction Center (ADRC) and the United Nations Development Program (UNDP), the database providers started a collaborative initiative in 2007 with the aim of agreeing on and implementing a common "Disaster Category Classification and Peril Terminology for Operational Databases". This common classification has been established through several technical meetings and working groups and represents a first and important step toward a standardized international classification of disasters and terminology of perils. Concretely, this means setting up a common hierarchy and terminology for all global and regional databases on natural disasters and establishing a common, agreed definition of disaster groups, main types and sub-types of events. Georeferencing, temporal aspects, methodology and sourcing are further issues that have been identified and will be discussed. The new structure for global loss databases has already been implemented for Munich Re NatCatSERVICE. In the following oral session we will show the structure of the global databases as defined and, in addition, give more transparency on the data sets behind published statistics and analyses. The special focus will be on the catastrophe classification, from a moderate loss event up to a great natural catastrophe, on the quality of sources, and on inside information about the assessment of overall and insured losses. Keywords: disaster category classification, peril terminology, overall and insured losses, definition
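    The agreed classification is a three-level hierarchy: disaster group, main type, sub-type. A minimal sketch of how a loss database might validate records against such a hierarchy; the category names below are illustrative placeholders, not the initiative's actual agreed terminology:

    ```python
    # Hypothetical excerpt of a group -> main type -> sub-type hierarchy.
    # Names are illustrative, not the agreed standard terminology.
    HIERARCHY = {
        "geophysical": {"earthquake": ["ground shaking", "tsunami"]},
        "meteorological": {"storm": ["tropical cyclone", "convective storm"]},
        "hydrological": {"flood": ["river flood", "flash flood"]},
    }

    def classify(group, main_type, sub_type):
        """Return True if the triple is a valid path through the hierarchy."""
        return sub_type in HIERARCHY.get(group, {}).get(main_type, [])
    ```

    Validating every record against one shared hierarchy is what makes statistics from EM-DAT, Sigma and NatCatSERVICE comparable across providers.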

  10. Applications of Precipitation Feature Databases from GPM core and constellation Satellites

    NASA Astrophysics Data System (ADS)

    Liu, C.

    2017-12-01

    Using the observations from Global Precipitation Mission (GPM) core and constellation satellites, global precipitation was quantitatively described from the perspective of precipitation systems and their properties. This presentation will introduce the development of precipitation feature databases and several scientific questions that have been tackled using them, including global snow precipitation, extreme intense convection, hail storms, extreme precipitation, and microphysical properties derived with dual-frequency radars at the top of convective cores. As more observations from constellation satellites become available, it is anticipated that the precipitation feature approach will help to address a large variety of scientific questions in the future. For anyone who is interested, all the current precipitation feature databases are freely open to the public at: http://atmos.tamucc.edu/trmm/.

  11. Clinical decision support tools: performance of personal digital assistant versus online drug information databases.

    PubMed

    Clauson, Kevin A; Polen, Hyla H; Marsh, Wallace A

    2007-12-01

    To evaluate personal digital assistant (PDA) drug information databases used to support clinical decision-making, and to compare the performance of PDA databases with their online versions. Prospective evaluation with descriptive analysis. Five drug information databases available for PDAs and online were evaluated according to their scope (inclusion of correct answers), completeness (on a 3-point scale), and ease of use; 158 question-answer pairs across 15 weighted categories of drug information essential to health care professionals were used to evaluate these databases. An overall composite score integrating these three measures was then calculated. Scores for the PDA databases and for each PDA-online pair were compared. Among the PDA databases, composite rankings, from highest to lowest, were as follows: Lexi-Drugs, Clinical Pharmacology OnHand, Epocrates Rx Pro, mobileMicromedex (now called Thomson Clinical Xpert), and Epocrates Rx free version. When we compared database pairs, online databases that had greater scope than their PDA counterparts were Clinical Pharmacology (137 vs 100 answers, p<0.001), Micromedex (132 vs 96 answers, p<0.001), Lexi-Comp Online (131 vs 119 answers, p<0.001), and Epocrates Online Premium (103 vs 98 answers, p=0.001). Only Micromedex online was more complete than its PDA version (p=0.008). Regarding ease of use, the Lexi-Drugs PDA database was superior to Lexi-Comp Online (p<0.001); however, Epocrates Online Premium, Epocrates Online Free, and Micromedex online were easier to use than their PDA counterparts (p<0.001). In terms of composite scores, only the online versions of Clinical Pharmacology and Micromedex demonstrated superiority over their PDA versions (p<0.01). Online and PDA drug information databases assist practitioners in improving their clinical decision-making. Lexi-Drugs performed significantly better than all of the other PDA databases evaluated. No PDA database demonstrated superiority to its online counterpart; however, the online versions of Clinical Pharmacology and Micromedex were superior to their PDA versions in answering questions.

  12. A case study for a digital seabed database: Bohai Sea engineering geology database

    NASA Astrophysics Data System (ADS)

    Tianyun, Su; Shikui, Zhai; Baohua, Liu; Ruicai, Liang; Yanpeng, Zheng; Yong, Wang

    2006-07-01

    This paper discusses the design of an ORACLE-based Bohai Sea engineering geology database, covering requirements analysis, conceptual structure design, logical structure design, physical structure design and security design. In the study, we used the object-oriented Unified Modeling Language (UML) to model the conceptual structure of the database, and used the powerful data management functions of the object-relational database ORACLE to organize and manage the storage space and improve security. By these means, the database can provide rapid and highly effective data storage, maintenance and query performance to satisfy the application requirements of the Bohai Sea Oilfield Paradigm Area Information System.

  13. Global lightning studies

    NASA Technical Reports Server (NTRS)

    Goodman, Steven J.; Wright, Pat; Christian, Hugh; Blakeslee, Richard; Buechler, Dennis; Scharfen, Greg

    1991-01-01

    Global lightning signatures were analyzed from the DMSP Optical Linescan System (OLS) imagery archived at the National Snow and Ice Data Center. The project will transition to analysis of the digital archive as it becomes available and will compare annual, interannual, and seasonal variations with other global data sets. An initial survey of the quality of the existing film archive was completed, and lightning signatures were digitized for the summer months of 1986 to 1987. The relationship is studied between (1) global and regional lightning activity and rainfall, and (2) storm electrical development and environment. Remote sensing data sets obtained from field programs are used in conjunction with satellite/radar/lightning data to develop and improve precipitation estimation algorithms, and to provide a better understanding of the co-evolving electrical, microphysical, and dynamical structure of storms.

  14. Use of globally unique identifiers (GUIDs) to link herbarium specimen records to physical specimens.

    PubMed

    Nelson, Gil; Sweeney, Patrick; Gilbert, Edward

    2018-02-01

    With the advent of the U.S. National Science Foundation's Advancing Digitization of Biodiversity Collections program and related worldwide digitization initiatives, the rate of herbarium specimen digitization in the United States has expanded exponentially. As the number of electronic herbarium records proliferates, the importance of linking these records to the physical specimens they represent as well as to related records from other sources will intensify. Although a rich and diverse literature has developed over the past decade that addresses the use of specimen identifiers for facilitating linking across the internet, few implementable guidelines or recommended practices for herbaria have been advanced. Here we review this literature with the express purpose of distilling a specific set of recommendations especially tailored to herbarium specimen digitization, curation, and management. We argue that associating globally unique identifiers (GUIDs) with physical herbarium specimens and including these identifiers in all electronic records about those specimens is essential to effective digital data curation. We also address practical applications for ensuring these associations.
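    As an illustration of the recommendation, a GUID can be minted once and then carried in every electronic record about the physical specimen. A minimal sketch in Python, assuming a UUID-based `urn:uuid:` identifier stored in the Darwin Core `occurrenceID` field; the institution code and catalog number shown are invented:

    ```python
    import uuid

    def mint_specimen_record(institution_code: str, catalog_number: str) -> dict:
        """Mint a UUID-based GUID and embed it in a minimal electronic
        specimen record (Darwin Core-style field names)."""
        return {
            "occurrenceID": f"urn:uuid:{uuid.uuid4()}",  # the persistent GUID
            "institutionCode": institution_code,
            "catalogNumber": catalog_number,
        }

    record = mint_specimen_record("YU", "YU.000123")
    print(record["occurrenceID"])  # e.g. urn:uuid:9f1c...
    ```

    The same `occurrenceID` string would then appear on the physical specimen label (often as a barcode) and in every downstream database export, so that records from different sources can be linked unambiguously.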

  15. Access Control based on Attribute Certificates for Medical Intranet Applications

    PubMed Central

    Georgiadis, Christos; Pangalos, George; Khair, Marie

    2001-01-01

    Background Clinical information systems frequently use intranet and Internet technologies. However these technologies have emphasized sharing and not security, despite the sensitive and private nature of much health information. Digital certificates (electronic documents which recognize an entity or its attributes) can be used to control access in clinical intranet applications. Objectives To outline the need for access control in distributed clinical database systems, to describe the use of digital certificates and security policies, and to propose the architecture for a system using digital certificates, cryptography and security policy to control access to clinical intranet applications. Methods We have previously developed a security policy, DIMEDAC (Distributed Medical Database Access Control), which is compatible with emerging public key and privilege management infrastructure. In our implementation approach we propose the use of digital certificates, to be used in conjunction with DIMEDAC. Results Our proposed access control system consists of two phases: the ways users gain their security credentials; and how these credentials are used to access medical data. Three types of digital certificates are used: identity certificates for authentication; attribute certificates for authorization; and access-rule certificates for propagation of access control policy. Once a user is identified and authenticated, subsequent access decisions are based on a combination of identity and attribute certificates, with access-rule certificates providing the policy framework. Conclusions Access control in clinical intranet applications can be successfully and securely managed through the use of digital certificates and the DIMEDAC security policy. PMID:11720951
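    The decision logic described in the Results can be sketched with simplified stand-ins for the three certificate types. This is only an illustration of the identity-plus-attribute combination, not the DIMEDAC implementation: real systems verify X.509 signatures and validity periods, and the roles and table names below are hypothetical:

    ```python
    from dataclasses import dataclass

    @dataclass
    class IdentityCert:      # authentication: who the user is
        subject: str
        valid: bool

    @dataclass
    class AttributeCert:     # authorization: what role the user holds
        subject: str
        role: str
        valid: bool

    # stand-in for access-rule certificates: which roles may read which tables
    ACCESS_RULES = {"physician": {"clinical_notes", "lab_results"},
                    "clerk": {"billing"}}

    def access_decision(ident: IdentityCert, attr: AttributeCert,
                        resource: str) -> bool:
        """Grant access only if both certificates are valid, name the same
        subject, and the role's access rules permit the resource."""
        if not (ident.valid and attr.valid and ident.subject == attr.subject):
            return False
        return resource in ACCESS_RULES.get(attr.role, set())

    ident = IdentityCert("dr_smith", valid=True)
    attr = AttributeCert("dr_smith", role="physician", valid=True)
    print(access_decision(ident, attr, "lab_results"))  # True
    ```

    Separating the rule set from the user certificates mirrors the paper's design: policy can be updated by reissuing access-rule certificates without touching each user's identity or attribute credentials.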

  16. RiceAtlas, a spatial database of global rice calendars and production.

    PubMed

    Laborte, Alice G; Gutierrez, Mary Anne; Balanza, Jane Girly; Saito, Kazuki; Zwart, Sander J; Boschetti, Mirco; Murty, M V R; Villano, Lorena; Aunario, Jorrel Khalil; Reinke, Russell; Koo, Jawoo; Hijmans, Robert J; Nelson, Andrew

    2017-05-30

    Knowing where, when, and how much rice is planted and harvested is crucial information for understanding the effects of policy, trade, and global and technological change on food security. We developed RiceAtlas, a spatial database on the seasonal distribution of the world's rice production. It consists of data on rice planting and harvesting dates by growing season and estimates of monthly production for all rice-producing countries. Sources used for planting and harvesting dates include global and regional databases, national publications, online reports, and expert knowledge. Monthly production data were estimated based on annual or seasonal production statistics, and planting and harvesting dates. RiceAtlas has 2,725 spatial units. Compared with available global crop calendars, RiceAtlas is nearly ten times more spatially detailed and has nearly seven times more spatial units, with at least two seasons of calendar data, making RiceAtlas the most comprehensive and detailed spatial database on rice calendar and production.

  17. In Situ Global Sea Surface Salinity and Variability from the NCEI Global Thermosalinograph Database

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Boyer, T.; Zhang, H. M.

    2017-12-01

    Sea surface salinity (SSS) plays an important role in global ocean circulation. Variations in sea surface salinity are key indicators of changes in air-sea water fluxes. Using nearly 30 years of in situ measurements of sea surface salinity from thermosalinographs, we will evaluate the variations of sea surface salinity in the global ocean. The sea surface salinity data used are from our newly developed NCEI Global Thermosalinograph Database (NCEI-TSG). This database provides a comprehensive set of quality-controlled in-situ sea-surface salinity and temperature measurements collected from over 340 vessels during the period 1989 to the present. The NCEI-TSG is the world's most complete TSG dataset, containing all data from the different TSG data assembly centers, e.g., COAPS (SAMOS), IODE (GOSUD) and AOML, with more historical data from NCEI's archive to be added. Using this unique dataset, we will investigate the spatial variations of global SSS and its variability. Annual and interannual variability will also be studied in selected regions.

  18. Studying Venus using a GIS database

    NASA Technical Reports Server (NTRS)

    Price, Maribeth; Suppe, John

    1993-01-01

    A Geographic Information System (GIS) can significantly enhance geological studies on Venus because it facilitates concurrent analysis of many sources of data, as demonstrated by our work on topographic and deformation characteristics of tesserae. We are creating a database of structures referenced to real-world coordinates to encourage the archival of Venusian studies in digital format and to foster quantitative analysis of many combinations of data. Contributions to this database from all aspects of Venusian science are welcome.

  19. Application GIS on university planning: building a spatial database aided spatial decision

    NASA Astrophysics Data System (ADS)

    Miao, Lei; Wu, Xiaofang; Wang, Kun; Nong, Yu

    2007-06-01

    As a university develops and grows in size, its many kinds of resources urgently need effective management. A spatial database is the right tool to assist administrators' spatial decision-making, and it is ready for the digital campus by integrating with the existing OMS. Campus planning is first researched in detail. Then, using South China Agricultural University as an instance, the paper demonstrates in practice how to build a geographic database of campus buildings and housing for university administrators' spatial decisions.

  20. Toward a standard reference database for computer-aided mammography

    NASA Astrophysics Data System (ADS)

    Oliveira, Júlia E. E.; Gueld, Mark O.; de A. Araújo, Arnaldo; Ott, Bastian; Deserno, Thomas M.

    2008-03-01

    Because of the lack of mammography databases with a large amount of codified images and identified characteristics like pathology, type of breast tissue, and abnormality, there is a problem for the development of robust systems for computer-aided diagnosis. Integrated into the Image Retrieval in Medical Applications (IRMA) project, we present an available mammography database developed from the union of: the Mammographic Image Analysis Society Digital Mammogram Database (MIAS), the Digital Database for Screening Mammography (DDSM), the Lawrence Livermore National Laboratory (LLNL), and routine images from the Rheinisch-Westfälische Technische Hochschule (RWTH) Aachen. Using the IRMA code, standardized coding of tissue type, tumor staging, and lesion description was developed according to the American College of Radiology (ACR) tissue codes and the ACR breast imaging reporting and data system (BI-RADS). The import was done automatically using scripts for image download, file format conversion, and browsing of file names, web pages, and information files. Disregarding the resolution, this resulted in a total of 10,509 reference images, of which 6,767 are associated with an IRMA contour information feature file. In accordance with the respective license agreements, the database will be made freely available for research purposes, and may be used for image-based evaluation campaigns such as the Cross Language Evaluation Forum (CLEF). We have also shown that it can be extended easily with further cases imported from a picture archiving and communication system (PACS).

  1. Trainable Cataloging for Digital Image Libraries with Applications to Volcano Detection

    NASA Technical Reports Server (NTRS)

    Burl, M. C.; Fayyad, U. M.; Perona, P.; Smyth, P.

    1995-01-01

    Users of digital image libraries are often not interested in image data per se but in derived products such as catalogs of objects of interest. Converting an image database into a usable catalog is typically carried out manually at present. For many larger image databases the purely manual approach is completely impractical. In this paper we describe the development of a trainable cataloging system: the user indicates the location of the objects of interest for a number of training images, and the system learns to detect and catalog these objects in the rest of the database. In particular, we describe the application of this system to the cataloging of small volcanoes in radar images of Venus. The volcano problem is of interest because of its scale (30,000 images, on the order of 1 million detectable volcanoes), its technical difficulty (the variability of the volcanoes in appearance) and its scientific importance. The problem of uncertain or subjective ground truth is of fundamental importance in cataloging problems of this nature and is discussed in some detail. Experimental results are presented which quantify and compare the detection performance of the system relative to human detection performance. The paper concludes by discussing the limitations of the proposed system and the lessons learned that are of general relevance to the development of digital image libraries.
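    The train-then-detect loop can be sketched with a toy stand-in detector: a nearest-mean classifier over two hypothetical patch features. The actual system trained a far more sophisticated detector on image patches; this only illustrates the "label a few, classify the rest" workflow:

    ```python
    def train(patches, labels):
        """Learn per-class mean feature vectors from user-labeled patches."""
        sums, counts = {}, {}
        for x, y in zip(patches, labels):
            s = sums.setdefault(y, [0.0] * len(x))
            for i, v in enumerate(x):
                s[i] += v
            counts[y] = counts.get(y, 0) + 1
        return {y: [v / counts[y] for v in s] for y, s in sums.items()}

    def classify(model, x):
        """Assign the class whose mean is nearest (squared Euclidean)."""
        return min(model,
                   key=lambda y: sum((a - b) ** 2
                                     for a, b in zip(model[y], x)))

    # toy training set: two invented features per patch
    train_patches = [[0.9, 0.2], [0.8, 0.3], [0.1, 0.1], [0.2, 0.2]]
    train_labels = ["volcano", "volcano", "background", "background"]
    model = train(train_patches, train_labels)
    print(classify(model, [0.85, 0.25]))  # volcano
    ```

    Everything after training is automatic, which is what makes the approach viable for a 30,000-image archive where manual cataloging is impractical.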

  2. EarthChem and SESAR: Data Resources and Interoperability for EarthScope Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Walker, D.; Block, K.; Vinay, S.; Ash, J.

    2008-12-01

    Data management within the EarthScope Cyberinfrastructure needs to pursue two goals in order to advance and maximize the broad scientific application and impact of the large volumes of observational data acquired by EarthScope facilities: (a) to provide access to all data acquired by EarthScope facilities, and to promote their use by broad audiences, and (b) to facilitate discovery of, access to, and integration of multi-disciplinary data sets that complement EarthScope data in support of EarthScope science. EarthChem and SESAR, the System for Earth Sample Registration, are two projects within the Geoinformatics for Geochemistry program that offer resources for EarthScope CI. EarthChem operates a data portal that currently provides access to >13 million analytical values for >600,000 samples, more than half of which are from North America, including data from the USGS and all data from the NAVDAT database, a web-accessible repository for age, chemical and isotopic data from Mesozoic and younger igneous rocks in western North America. The new EarthChem GEOCHRON database will house data collected in association with GeoEarthScope, storing and serving geochronological data submitted by participating facilities. The EarthChem Deep Lithosphere Dataset is a compilation of petrological data for mantle xenoliths, initiated in collaboration with GeoFrame to complement geophysical endeavors within EarthScope science. The EarthChem Geochemical Resource Library provides a home for geochemical and petrological data products and data sets. Parts of the digital data in EarthScope CI refer to physical samples such as drill cores, igneous rocks, or water and gas samples, collected, for example, by SAFOD or by EarthScope science projects and acquired through lab-based analysis. 
Management of sample-based data requires the use of global unique identifiers for samples, so that distributed data for individual samples generated in different labs and published in different papers can be unambiguously linked and integrated. SESAR operates a registry for Earth samples that assigns and administers the International GeoSample Numbers (IGSN) as a global unique identifier for samples. Registration of EarthScope samples with SESAR and use of the IGSN will ensure their unique identification in publications and data systems, thus facilitating interoperability among sample-based data relevant to EarthScope CI and globally. It will also make these samples visible to global audiences via the SESAR Global Sample Catalog.

  3. Digital Mapping Techniques '09-Workshop Proceedings, Morgantown, West Virginia, May 10-13, 2009

    USGS Publications Warehouse

    Soller, David R.

    2011-01-01

    As in the previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  4. Toward an open-access global database for mapping, control, and surveillance of neglected tropical diseases.

    PubMed

    Hürlimann, Eveline; Schur, Nadine; Boutsika, Konstantina; Stensgaard, Anna-Sofie; Laserna de Himpsl, Maiti; Ziegelbauer, Kathrin; Laizer, Nassor; Camenzind, Lukas; Di Pasquale, Aurelio; Ekpo, Uwem F; Simoonga, Christopher; Mushinge, Gabriel; Saarnak, Christopher F L; Utzinger, Jürg; Kristensen, Thomas K; Vounatsou, Penelope

    2011-12-01

    After many years of general neglect, interest has grown and efforts have gotten under way for the mapping, control, surveillance, and eventual elimination of neglected tropical diseases (NTDs). Disease risk estimates are a key feature for targeting control interventions, and serve as a benchmark for monitoring and evaluation. What is currently missing is a georeferenced global database for NTDs providing open access to the available survey data, one that is constantly updated and can be utilized by researchers and disease control managers to support other relevant stakeholders. We describe the steps taken toward the development of such a database that can be employed for spatial disease risk modeling and control of NTDs. With an emphasis on schistosomiasis in Africa, we systematically searched the literature (peer-reviewed journals and 'grey literature') and contacted Ministries of Health and research institutions in schistosomiasis-endemic countries for location-specific prevalence data and survey details (e.g., study population, year of survey and diagnostic techniques). The data were extracted, georeferenced, and stored in a MySQL database with a web interface allowing free database access and data management. At the beginning of 2011, our database contained more than 12,000 georeferenced schistosomiasis survey locations from 35 African countries, available under http://www.gntd.org. Currently, the database is being expanded into a global repository including a host of other NTDs, e.g., soil-transmitted helminthiasis and leishmaniasis. An open-access, spatially explicit NTD database offers unique opportunities for disease risk modeling, targeting of control interventions, disease monitoring, and surveillance. Moreover, it allows for detailed geostatistical analyses of disease distribution in space and time.
With an initial focus on schistosomiasis in Africa, we demonstrate the proof-of-concept that the establishment and running of a global NTD database is feasible and should be expanded without delay.
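    A minimal sketch of the kind of georeferenced survey table and spatial query such a database supports, here in SQLite rather than MySQL, and with invented example rows (the GNTD schema and data are not reproduced here):

    ```python
    import sqlite3

    # build a tiny in-memory survey table (invented rows, for illustration only)
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE survey (
        country TEXT, lat REAL, lon REAL,
        year INTEGER, examined INTEGER, positive INTEGER)""")
    conn.executemany(
        "INSERT INTO survey VALUES (?, ?, ?, ?, ?, ?)",
        [("Kenya", -0.42, 36.95, 2005, 200, 46),
         ("Kenya", -3.22, 40.12, 2008, 150, 60),
         ("Ghana",  6.70, -1.62, 2009, 180, 22)])

    # prevalence per survey location, restricted to a lat/lon bounding box
    rows = conn.execute("""
        SELECT country, 1.0 * positive / examined AS prevalence
        FROM survey
        WHERE lat BETWEEN -5 AND 0 AND lon BETWEEN 35 AND 42""").fetchall()
    print(rows)  # only the two Kenyan surveys fall inside the box
    ```

    Storing one row per survey location with coordinates is what makes the downstream geostatistical risk modeling possible: any spatial or temporal subset reduces to a simple query.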

  5. Therapeutic hypnosis, psychotherapy, and the digital humanities: the narratives and culturomics of hypnosis, 1800-2008.

    PubMed

    Rossi, Ernest; Mortimer, Jane; Rossi, Kathryn

    2013-04-01

    Culturomics is a new scientific discipline of the digital humanities: the use of computer algorithms to search for meaning in large databases of text and media. This new digital discipline is used to explore 200 years of the history of hypnosis and psychotherapy in over five million digitized books from more than 40 university libraries around the world. It graphically compares the frequencies of English words about hypnosis, hypnotherapy, psychoanalysis, psychotherapy, and their founders from 1800 to 2008. This new perspective explores issues such as: Who were the major innovators in the history of therapeutic hypnosis, psychoanalysis, and psychotherapy? How well does this new digital approach to the humanities correspond to traditional histories of hypnosis and psychotherapy?

  6. Digital geologic map of the Spokane 1:100,000 quadrangle, Washington and Idaho: a digital database for the 1990 N.L. Joseph map

    USGS Publications Warehouse

    Johnson, Bruce R.; Derkey, Pamela D.

    1998-01-01

    Geologic data from the geologic map of the Spokane 1:100,000-scale quadrangle compiled by Joseph (1990) were entered into a geographic information system (GIS) as part of a larger effort to create regional digital geology for the Pacific Northwest. The map area is located in eastern Washington and extends across the state border into western Idaho (Fig. 1). This open-file report describes the methods used to convert the geologic map data into a digital format, documents the file structures, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet.

  7. Preliminary integrated geologic map databases for the United States : Central states : Montana, Wyoming, Colorado, New Mexico, Kansas, Oklahoma, Texas, Missouri, Arkansas, and Louisiana

    USGS Publications Warehouse

    Stoeser, Douglas B.; Green, Gregory N.; Morath, Laurie C.; Heran, William D.; Wilson, Anna B.; Moore, David W.; Van Gosen, Bradley S.

    2005-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional and national digital geologic maps attributed with age and lithology information. Such maps can be conveniently used to generate derivative maps for purposes including mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This Open-File Report is a preliminary version of part of a series of integrated state geologic map databases that cover the entire United States. The only national-scale digital geologic maps that portray most or all of the United States for the conterminous U.S. are the digital version of the King and Beikman (1974a, b) map at a scale of 1:2,500,000, as digitized by Schruben and others (1994) and the digital version of the Geologic Map of North America (Reed and others, 2005a, b) compiled at a scale of 1:5,000,000 which is currently being prepared by the U.S. Geological Survey. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. In a few cases, new digital compilations were prepared (e.g. OH, SC, SD) or existing paper maps were digitized (e.g. KY, TX). For Alaska and Hawaii, new regional maps are being compiled and ultimately new state maps will be produced. The digital geologic maps are presented in standardized formats as ARC/INFO (.e00) export files and as ArcView shape (.shp) files. Accompanying these spatial databases are a set of five supplemental data tables that relate the map units to detailed lithologic and age information. The maps for the CONUS have been fitted to a common set of state boundaries based on the 1:100,000 topographic map series of the United States Geological Survey (USGS). 
When the individual state maps are merged, the combined attribute tables can be used directly with the merged maps to make derivative maps. No attempt has been made to reconcile differences in mapped geology across state lines. This is the first version of this product, and it will subsequently be updated to include four additional states (North Dakota, South Dakota, Nebraska, and Iowa).

  8. Abstracts of SIG Sessions.

    ERIC Educational Resources Information Center

    Proceedings of the ASIS Annual Meeting, 1996

    1996-01-01

    Includes abstracts of special interest group (SIG) sessions. Highlights include digital imagery; text summarization; browsing; digital libraries; icons and the Web; information management; curricula planning; interfaces; information systems; theories; scholarly and scientific communication; global development; archives; document delivery;…

  9. Consequences of "going digital" for pathology professionals - entering the cloud.

    PubMed

    Laurinavicius, Arvydas; Raslavicus, Paul

    2012-01-01

    New opportunities and the adoption of digital technologies will transform the way pathology professionals and services work. Many areas of our daily life, as well as other medical professions, have already experienced this change, which has resulted in a paradigm shift in many activities. Pathology is an image-based discipline; therefore, the arrival of digital imaging into this domain promises a major shift in our work and in the mentality it requires. Recognizing the physical and digital duality of the pathology workflow, we can prepare for the imminent increase of the digital component, synergize with it, and enjoy its benefits. Development of a new generation of laboratory information systems, along with seamless integration of digital imaging, decision support, and knowledge databases, will enable pathologists to work in a distributed environment. The paradigm of "cloud pathology" is proposed as an ultimate vision of digital pathology workstations plugged into integrated multidisciplinary patient care systems.

  10. Instructional Suggestions Supporting Science Learning in Digital Environments Based on a Review of Eye-Tracking Studies

    ERIC Educational Resources Information Center

    Yang, Fang-Ying; Tsai, Meng-Jung; Chiou, Guo-Li; Lee, Silvia Wen-Yu; Chang, Cheng-Chieh; Chen, Li-Ling

    2018-01-01

    The main purpose of this study was to provide instructional suggestions for supporting science learning in digital environments based on a review of eye tracking studies in e-learning related areas. Thirty-three eye-tracking studies from 2005 to 2014 were selected from the Social Science Citation Index (SSCI) database for review. Through a…

  11. Digitization and the Creation of Virtual Libraries: The Princeton University Image Card Catalog--Reaping the Benefits of Imaging.

    ERIC Educational Resources Information Center

    Henthorne, Eileen

    1995-01-01

    Describes a project at the Princeton University libraries that converted the pre-1981 public card catalog, using digital imaging and optical character recognition technology, to fully tagged and indexed records of text in MARC format that are available on an online database and will be added to the online catalog. (LRW)

  12. Choosing a Global Positioning System Device for Use in U.S. Army Corps of Engineers Regulatory Districts

    DTIC Science & Technology

    2017-12-01

    Remote Sensing/Geographic Information Systems Center of Expertise (RS/GIS CX) (CEERD-RZR), U.S. Army Engineer Research and Development Center, Cold Regions Research and... Acronyms: GIS = Geographic Information Systems; GPS = Global Positioning System; HH = Handheld; IWR = U.S. Army Engineer Institute for Water Resources; n/a = Not Applicable; NAE = U.S. Army New England Regulatory District; RS/GIS = Remote Sensing/Geographic Information Systems; SD = Secure Digital; SDHC = Secure Digital High...

  13. Research on enhancing the utilization of digital multispectral data and geographic information systems in global habitability studies

    NASA Technical Reports Server (NTRS)

    Martinko, Edward A.; Merchant, James W.

    1988-01-01

    During 1986 to 1987, the Kansas Applied Remote Sensing (KARS) Program continued to build upon long-term research efforts oriented towards enhancement and development of technologies for using remote sensing in the inventory and evaluation of land use and renewable resources (both natural and agricultural). These research efforts directly addressed needs and objectives of NASA's Land-Related Global Habitability Program as well as needs and interests of public agencies and private firms. The KARS Program placed particular emphasis on two major areas: development of intelligent algorithms to improve automated classification of digital multispectral data; and integrating and merging digital multispectral data with ancillary data in spatial modes.

  14. The International Postal Network and Other Global Flows as Proxies for National Wellbeing.

    PubMed

    Hristova, Desislava; Rutherford, Alex; Anson, Jose; Luengo-Oroz, Miguel; Mascolo, Cecilia

    2016-01-01

    The digital exhaust left by flows of physical and digital commodities provides a rich measure of the nature, strength and significance of relationships between countries in the global network. With this work, we examine how these traces and the network structure can reveal the socioeconomic profile of different countries. We take into account multiple international networks of physical and digital flows, including the previously unexplored international postal network. By measuring the position of each country in the Trade, Postal, Migration, International Flights, IP and Digital Communications networks, we are able to build proxies for a number of crucial socioeconomic indicators such as GDP per capita and the Human Development Index ranking along with twelve other indicators used as benchmarks of national well-being by the United Nations and other international organisations. In this context, we have also proposed and evaluated a global connectivity degree measure applying multiplex theory across the six networks that accounts for the strength of relationships between countries. We conclude by showing how countries with shared community membership over multiple networks have similar socioeconomic profiles. Combining multiple flow data sources can help understand the forces which drive economic activity on a global level. Such an ability to infer proxy indicators in a context of incomplete information is extremely timely in light of recent discussions on measurement of indicators relevant to the Sustainable Development Goals.
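    The multiplex connectivity measure is described only at a high level in the abstract. A minimal sketch of one plausible formulation, averaging a country's strength, normalized per layer, across the network layers (the layer names and weights below are toy values, not the paper's actual measure or data):

    ```python
    def multiplex_connectivity(layers: dict, country: str) -> float:
        """Average a country's per-layer strength, each normalized by the
        layer's maximum, across all network layers."""
        scores = []
        for weights in layers.values():
            top = max(weights.values())
            scores.append(weights.get(country, 0.0) / top if top else 0.0)
        return sum(scores) / len(scores)

    layers = {  # toy edge-weight totals per country, not real flow data
        "trade":  {"A": 10.0, "B": 5.0},
        "postal": {"A": 2.0,  "B": 8.0},
    }
    print(multiplex_connectivity(layers, "A"))  # (1.0 + 0.25) / 2 = 0.625
    ```

    Normalizing within each layer before averaging keeps one heavy-tailed network (e.g. trade value) from dominating the combined score.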

  15. The International Postal Network and Other Global Flows as Proxies for National Wellbeing

    PubMed Central

    Hristova, Desislava; Rutherford, Alex; Anson, Jose; Luengo-Oroz, Miguel; Mascolo, Cecilia

    2016-01-01

    The digital exhaust left by flows of physical and digital commodities provides a rich measure of the nature, strength and significance of relationships between countries in the global network. With this work, we examine how these traces and the network structure can reveal the socioeconomic profile of different countries. We take into account multiple international networks of physical and digital flows, including the previously unexplored international postal network. By measuring the position of each country in the Trade, Postal, Migration, International Flights, IP and Digital Communications networks, we are able to build proxies for a number of crucial socioeconomic indicators such as GDP per capita and the Human Development Index ranking along with twelve other indicators used as benchmarks of national well-being by the United Nations and other international organisations. In this context, we have also proposed and evaluated a global connectivity degree measure applying multiplex theory across the six networks that accounts for the strength of relationships between countries. We conclude by showing how countries with shared community membership over multiple networks have similar socioeconomic profiles. Combining multiple flow data sources can help understand the forces which drive economic activity on a global level. Such an ability to infer proxy indicators in a context of incomplete information is extremely timely in light of recent discussions on measurement of indicators relevant to the Sustainable Development Goals. PMID:27248142

  16. Buckets: Smart Objects for Digital Libraries

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.

    2001-01-01

    Current discussion of digital libraries (DLs) is often dominated by the merits of the respective storage, search, and retrieval functionality of archives, repositories, search engines, search interfaces, and database systems. While these technologies are necessary for information management, the information content is more important than the systems used for its storage and retrieval. Digital information should have the same long-term survivability prospects as traditional hardcopy information and should be protected to the extent possible from evolving search engine technologies and vendor vagaries in database management systems. Information content and information retrieval systems should progress on independent paths and make limited assumptions about the status or capabilities of the other. Digital information can achieve independence from archives and DL systems through the use of buckets. Buckets are an aggregative, intelligent construct for publishing in DLs. Buckets allow the decoupling of information content from information storage and retrieval. Buckets exist within the Smart Objects and Dumb Archives model for DLs, in that many of the functionalities and responsibilities traditionally associated with archives are pushed down (making the archives dumber) into the buckets (making them smarter). Among the responsibilities given to buckets are the enforcement of their terms and conditions, and the maintenance and display of their contents.

  17. Digital mining claim density map for federal lands in Nevada: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Nevada as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate Bureau of Land Management (BLM) State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.
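
    The density surface described above is, at heart, a count of claim locations per map cell. Below is a minimal sketch of that binning step in Python; the coordinates and the half-degree cell size are hypothetical illustrations, not the report's actual gridding by meridian, township, range, and section.

```python
from collections import Counter

def claim_density(claims, cell_deg=0.5):
    """Bin claim locations (lon, lat) into grid cells and count claims
    per cell, the essence of a claim-density surface.
    cell_deg is an illustrative cell size, not the USGS's choice."""
    return Counter(
        (int(lon // cell_deg), int(lat // cell_deg)) for lon, lat in claims
    )

# Hypothetical claim coordinates (lon, lat) in Nevada
claims = [(-116.1, 39.2), (-116.2, 39.3), (-117.8, 40.1)]
density = claim_density(claims)
```

    Counts per cell could then be classed and symbolized in a GIS at the intended 1:100,000 or smaller scale.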

  18. Digital mining claim density map for federal lands in Utah: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Utah as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.

  19. Digital mining claim density map for federal lands in Wyoming: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Wyoming as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.

  20. Digital mining claim density map for federal lands in Colorado: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Colorado as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.

  1. Digital mining claim density map for federal lands in California: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in California as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.

  2. Digital mining claim density map for federal lands in New Mexico: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in New Mexico as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.

  3. Digital mining claim density map for federal lands in Washington: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Washington as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.

  4. Digital mining claim density map for federal lands in Arizona: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Arizona as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.

  5. Digital Mapping Techniques '10-Workshop Proceedings, Sacramento, California, May 16-19, 2010

    USGS Publications Warehouse

    Soller, David R.

    2012-01-01

    The Digital Mapping Techniques '10 (DMT'10) workshop was attended by 110 technical experts from 40 agencies, universities, and private companies, including representatives from 19 State geological surveys (see Appendix A). This workshop, hosted by the California Geological Survey, May 16-19, 2010, in Sacramento, California, was similar in nature to the previous 13 meetings (see Appendix B). The meeting was coordinated by the U.S. Geological Survey's (USGS) National Geologic Map Database project. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was again successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products ("publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  6. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.
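
    A one-stop search across category-specific tables can be expressed as a single UNION ALL query that tags each hit with its source category, which maps naturally onto a two-tier result display. Below is a minimal sketch against a hypothetical, much-simplified schema; the table and record names are invented for illustration and are not the actual IRMIS schema.

```python
import sqlite3

# Hypothetical, simplified schema; the real IRMIS tables are richer.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ioc(name TEXT);
    CREATE TABLE pv(name TEXT);
    CREATE TABLE component(name TEXT);
    INSERT INTO ioc VALUES ('iocS35A'), ('iocS36B');
    INSERT INTO pv VALUES ('S35A:current'), ('S36B:temp');
    INSERT INTO component VALUES ('S35A power supply');
""")

def global_search(term):
    """One query spanning every category, tagging each hit with its
    source table: the first tier of a two-tier result display."""
    q = """
    SELECT 'ioc' AS category, name FROM ioc WHERE name LIKE :t
    UNION ALL SELECT 'pv', name FROM pv WHERE name LIKE :t
    UNION ALL SELECT 'component', name FROM component WHERE name LIKE :t
    """
    return conn.execute(q, {"t": f"%{term}%"}).fetchall()

hits = global_search("S35A")
```

    The second tier would then fetch full details for a selected hit from its own category table.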

  7. A DBMS architecture for global change research

    NASA Astrophysics Data System (ADS)

    Hachem, Nabil I.; Gennert, Michael A.; Ward, Matthew O.

    1993-08-01

    The goal of this research is the design and development of an integrated system for the management of very large scientific databases, cartographic/geographic information processing, and exploratory scientific data analysis for global change research. The system will represent both spatial and temporal knowledge about natural and man-made entities on the earth's surface, following an object-oriented paradigm. A user will be able to derive, modify, and apply procedures to perform operations on the data, including comparison, derivation, prediction, validation, and visualization. This work represents an effort to extend database technology with an intrinsic class of operators that is extensible and responds to the growing needs of scientific research. Of significance is the integration of many diverse forms of data into the database, including cartography, geography, hydrography, hypsography, images, and urban planning data. Equally important is the maintenance of metadata, that is, data about the data, such as coordinate transformation parameters, map scales, and audit trails of previous processing operations. This project will impact the fields of geographic information systems and global change research as well as the database community. It will provide an integrated database management testbed for scientific research, and a testbed for the development of analysis tools to understand and predict global change.

  8. Needs assessment for next generation computer-aided mammography reference image databases and evaluation studies.

    PubMed

    Horsch, Alexander; Hapfelmeier, Alexander; Elter, Matthias

    2011-11-01

    Breast cancer is globally a major threat to women's health. Screening and adequate follow-up can significantly reduce mortality from breast cancer. Human second reading of screening mammograms can increase breast cancer detection rates, whereas this has not been proven for current computer-aided detection systems as "second reader". Critical factors include the detection accuracy of the systems and the screening experience and training of the radiologist with the system. When assessing the performance of systems and system components, the choice of evaluation methods is particularly critical. Core assets herein are reference image databases and statistical methods. We have analyzed characteristics and usage of the currently largest publicly available mammography database, the Digital Database for Screening Mammography (DDSM) from the University of South Florida, in literature indexed in Medline, IEEE Xplore, SpringerLink, and SPIE, with respect to type of computer-aided diagnosis (CAD) (detection, CADe, or diagnosis, CADx), selection of database subsets, choice of evaluation method, and quality of descriptions. Fifty-nine publications presenting 106 evaluation studies met our selection criteria. In 54 studies (50.9%), the selection of test items (cases, images, regions of interest) extracted from the DDSM was not reproducible. Only 2 CADx studies, and no CADe studies, used the entire DDSM. The number of test items varies from 100 to 6,000. Different statistical evaluation methods are chosen. Most common are train/test (34.9% of the studies), leave-one-out (23.6%), and N-fold cross-validation (18.9%). Database-related terminology tends to be imprecise or ambiguous, especially regarding the term "case". Overall, both the use of the DDSM as a data source for evaluation of mammography CAD systems and the application of statistical evaluation methods were found to be highly diverse. Results reported from different studies are therefore hardly comparable. Drawbacks of the DDSM (e.g., varying quality of lesion annotations) may contribute to the reasons, but larger bias seems to be caused by authors' own decisions on study design. For future evaluation studies, we derive a set of 13 recommendations concerning the construction and usage of a test database, as well as the application of statistical evaluation methods.
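
    The reproducibility problem identified above (test-item selection not reproducible in roughly half the studies) can be avoided by publishing a deterministic split. A minimal sketch follows, with hypothetical case identifiers standing in for DDSM cases; the seed and fold count are arbitrary choices that a study would simply need to report.

```python
import random

def reproducible_folds(case_ids, n_folds=10, seed=42):
    """Deterministically shuffle case IDs and split them into N folds.

    Publishing the seed, fold count, and canonical ordering makes a
    train/test or cross-validation split reproducible by other groups.
    """
    ids = sorted(case_ids)        # canonical order first
    rng = random.Random(seed)     # fixed seed, so the shuffle is repeatable
    rng.shuffle(ids)
    return [ids[i::n_folds] for i in range(n_folds)]

# Hypothetical case identifiers standing in for DDSM cases
folds = reproducible_folds([f"case_{i:04d}" for i in range(100)], n_folds=5)
```

    Running the function twice with the same inputs yields identical folds, which is exactly the property the reviewed studies often failed to document.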

  9. Development of an Integrated Hydrologic Modeling System for Rainfall-Runoff Simulation

    NASA Astrophysics Data System (ADS)

    Lu, B.; Piasecki, M.

    2008-12-01

    This paper presents the development of an integrated hydrologic modeling system that provides digital watershed processing, online data retrieval, hydrologic simulation, and post-event analysis. The proposed system is intended to work as a back end to the CUAHSI HIS cyberinfrastructure developments. As a first step in developing this system, the physics-based distributed hydrologic model PIHM (Penn State Integrated Hydrologic Model) is wrapped in the OpenMI (Open Modeling Interface and Environment) environment so that it can interact seamlessly with OpenMI-compliant meteorological models. The graphical user interface is being developed from the open-source GIS application MapWindow, which permits functionality expansion through the addition of plug-ins. Modules set up through the GUI workboard include those for retrieving meteorological data from existing databases or meteorological prediction models, obtaining geospatial data from the output of digital watershed processing, and importing initial and boundary conditions. These modules are connected to the OpenMI-compliant PIHM to simulate rainfall-runoff processes, and a further module automatically displays output after the simulation. Online databases are accessed through the WaterOneFlow web services, and the retrieved data are stored either in an observation database (OD) following the schema of the Observation Data Model (ODM), for time-series support, or in a grid-based storage facility, which may be a format like netCDF or a grid-based database schema. Specific development steps include the creation of a bridge to overcome interoperability issues between PIHM and the ODM, as well as the embedding of TauDEM (Terrain Analysis Using Digital Elevation Models) into the model; this module is responsible for delineating the watershed and stream network from digital elevation models. Visualizing and editing geospatial data is achieved by using MapWinGIS, an ActiveX control developed by the MapWindow team. After application to a practical watershed, the performance of the model can be tested with the post-event analysis module.
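
    Watershed and stream-network delineation of the kind TauDEM performs starts from a flow-direction grid; the classic D8 rule assigns each cell the direction of steepest descent among its eight neighbours. Below is a minimal sketch of that rule on a toy DEM; this is an illustrative simplification, not TauDEM's actual implementation.

```python
def d8_flow_direction(dem):
    """Assign each interior cell the direction of steepest descent among
    its 8 neighbours (the classic D8 scheme on which TauDEM builds).
    Returns {(row, col): (drow, dcol)}; cells with no downhill
    neighbour get None. Drops are distance-weighted, so diagonal
    neighbours are divided by sqrt(2)."""
    dirs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    flow = {}
    for r in range(1, len(dem) - 1):
        for c in range(1, len(dem[0]) - 1):
            drops = []
            for dr, dc in dirs:
                dist = (dr * dr + dc * dc) ** 0.5
                drops.append(((dem[r][c] - dem[r + dr][c + dc]) / dist,
                              (dr, dc)))
            best_drop, best_dir = max(drops)
            flow[(r, c)] = best_dir if best_drop > 0 else None
    return flow

# Tiny synthetic DEM sloping down toward the south-east corner
dem = [[9, 8, 7],
       [8, 7, 6],
       [7, 6, 5]]
flow = d8_flow_direction(dem)
```

    Accumulating flow along these directions then yields the contributing-area grid from which stream networks are extracted.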

  10. The Toolbox for Local and Global Plagiarism Detection

    ERIC Educational Resources Information Center

    Butakov, Sergey; Scherbinin, Vladislav

    2009-01-01

    Digital plagiarism is a problem for educators all over the world. There are many software tools on the market for uncovering digital plagiarism. Most of them can work only with text submissions. In this paper, we present a new architecture for a plagiarism detection tool that can work with many different kinds of digital submissions, from plain or…

  11. Creative and Critical Approaches to Language Learning and Digital Technology: Findings from a Multilingual Digital Storytelling Project

    ERIC Educational Resources Information Center

    Anderson, Jim; Chung, Yu-Chiao; Macleroy, Vicky

    2018-01-01

    This article presents findings from the global literacy project, Critical Connections: Multilingual Digital Storytelling (MDST), which provides a means of nurturing and reflecting multiliteracies in practice. It recognises the power of storytelling and the space stories offer both for self-representation and for engaging with otherness. It draws…

  12. Multi-Site Ethnography, Hypermedia and the Productive Hazards of Digital Methods: A Struggle for Liveness

    ERIC Educational Resources Information Center

    Gallagher, Kathleen; Freeman, Barry

    2011-01-01

    This article explores the possibilities and frustrations of using digital methods in a multi-sited ethnographic research project. The project, "Urban School Performances: The interplay, through live and digital drama, of local-global knowledge about student engagement", is a study of youth and teachers in drama classrooms in contexts of…

  13. Use of Knowledge Bases in Education of Database Management

    ERIC Educational Resources Information Center

    Radványi, Tibor; Kovács, Emod

    2008-01-01

    In this article we present a segment of the Sulinet Digital Knowledgebase curriculum system in which you can find the sections of subject matter that aid in teaching database management. You can follow the order of the course from the beginning, when some topics appear and are introduced in elementary school, through the topics accomplished in secondary…

  14. Using Screencasting to Promote Database Trials and Library Resources

    ERIC Educational Resources Information Center

    Emanuel, Michelle

    2013-01-01

    At the University of Mississippi, screencasting was used to promote a database trial to the ARTStor Digital Library. Using Jing, a free product used for recording and posting screencasts, and a Snowball USB microphone, 11 videos averaging 3 minutes in length were posted to an online topic guide. Screencasting was used as a quick, creative, and…

  15. The Cardiac Safety Research Consortium ECG database.

    PubMed

    Kligfield, Paul; Green, Cynthia L

    2012-01-01

    The Cardiac Safety Research Consortium (CSRC) ECG database was initiated to foster research using anonymized, XML-formatted, digitized ECGs with corresponding descriptive variables from placebo- and positive-control arms of thorough QT studies submitted to the US Food and Drug Administration (FDA) by pharmaceutical sponsors. The database can be expanded to other data that are submitted directly to CSRC from other sources, and currently includes digitized ECGs from patients with genotyped varieties of congenital long-QT syndrome; this congenital long-QT database is also linked to ambulatory electrocardiograms stored in the Telemetric and Holter ECG Warehouse (THEW). Thorough QT data sets are available from CSRC for unblinded development of algorithms for analysis of repolarization and for blinded comparative testing of algorithms developed for the identification of moxifloxacin, as used as a positive control in thorough QT studies. Policies and procedures for access to these data sets are available from CSRC, which has developed tools for statistical analysis of blinded new algorithm performance. A recently approved CSRC project will create a data set for blinded analysis of automated ECG interval measurements, whose initial focus will include comparison of four of the major manufacturers of automated electrocardiographs in the United States. CSRC welcomes applications for use of the ECG database for clinical investigation.
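
    Working with XML-formatted digitized ECGs typically begins by parsing lead waveforms out of the markup. Below is a minimal sketch using a hypothetical, much-simplified element layout; the actual FDA submissions follow the far richer HL7 annotated-ECG format, and the tag and attribute names here are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified markup; NOT the actual HL7 aECG schema.
xml = """<ecg subject="S001" arm="placebo">
  <lead name="II" fs="500">0.01 0.02 0.015 0.01</lead>
  <lead name="V5" fs="500">0.02 0.03 0.025 0.02</lead>
</ecg>"""

root = ET.fromstring(xml)
# Map each lead name to its sample values (millivolts, hypothetically)
leads = {
    lead.get("name"): [float(v) for v in lead.text.split()]
    for lead in root.findall("lead")
}
```

    From such a structure, interval-measurement or repolarization algorithms can be run per lead, with the subject and study-arm attributes carried along as descriptive variables.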

  16. A database and tool for boundary conditions for regional air quality modeling: description and evaluation

    NASA Astrophysics Data System (ADS)

    Henderson, B. H.; Akhtar, F.; Pye, H. O. T.; Napelenok, S. L.; Hutzell, W. T.

    2013-09-01

    Transported air pollutants receive increasing attention as regulations tighten and global concentrations increase. The need to represent international transport in regional air quality assessments requires improved representation of boundary concentrations. Currently available observations are too sparse vertically to provide boundary information, particularly for ozone precursors, but global simulations can be used to generate spatially and temporally varying Lateral Boundary Conditions (LBC). This study presents a public database of global simulations designed and evaluated for use as LBC for air quality models (AQMs). The database covers the contiguous United States (CONUS) for the years 2000-2010 and contains hourly varying concentrations of ozone, aerosols, and their precursors. The database is complemented by a tool for configuring the global results as inputs to regional-scale models (e.g., Community Multiscale Air Quality or Comprehensive Air quality Model with extensions). This study also presents an example application based on the CONUS domain, which is evaluated against satellite-retrieved ozone vertical profiles. The results show performance is largely within uncertainty estimates for the Tropospheric Emission Spectrometer (TES), with some exceptions. The major difference is a high bias in the upper troposphere along the southern boundary in January. This publication documents the global simulation database, the tool for conversion to LBC, and the fidelity of concentrations on the boundaries. This documentation is intended to support applications that require representation of long-range transport of air pollutants.
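
    Configuring global results as regional boundary inputs amounts, in its geometric core, to extracting the four lateral edges of the regional window from the global field. A minimal numpy sketch follows; the array shapes, indices, and species are hypothetical, and the distributed tool does considerably more (species mapping, regridding, vertical interpolation) than is shown here.

```python
import numpy as np

# Hypothetical global field: (levels, lat, lon) ozone mixing ratio
global_o3 = np.random.default_rng(0).random((35, 90, 144))

def lateral_boundaries(field, i0, i1, j0, j1):
    """Extract the four lateral edges of a regional window from a global
    (lev, lat, lon) array, the raw material for an LBC file.
    Returns south, north, west, east slices (lev x edge-length)."""
    south = field[:, j0, i0:i1]        # southern row of the window
    north = field[:, j1 - 1, i0:i1]    # northern row
    west = field[:, j0:j1, i0]         # western column
    east = field[:, j0:j1, i1 - 1]     # eastern column
    return south, north, west, east

s, n, w, e = lateral_boundaries(global_o3, i0=40, i1=80, j0=30, j1=60)
```

    Repeating the extraction for each hour and species yields the hourly varying LBC described in the abstract.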

  17. Unified Database for Rejected Image Analysis Across Multiple Vendors in Radiography.

    PubMed

    Little, Kevin J; Reiser, Ingrid; Liu, Lili; Kinsey, Tiffany; Sánchez, Adrian A; Haas, Kateland; Mallory, Florence; Froman, Carmen; Lu, Zheng Feng

    2017-02-01

    Reject rate analysis has been part of radiography departments' quality control since the days of screen-film radiography. In the era of digital radiography, one might expect that reject rate analysis is easily facilitated because of readily available information produced by the modality during the examination procedure. Unfortunately, this is not always the case. The lack of an industry standard and the wide variety of system log entries and formats have made it difficult to implement a robust multivendor reject analysis program, and logs do not always include all relevant information. The increased use of digital detectors exacerbates this problem because of higher reject rates associated with digital radiography compared with computed radiography. In this article, the authors report on the development of a unified database for vendor-neutral reject analysis across multiple sites within an academic institution and share their experience from a team-based approach to reduce reject rates.

  18. Multi-Decadal analysis of Global Trends in Microseism Intensity: A Proxy for Changes in Extremal Storm Activity and Oceanic Wave State

    NASA Astrophysics Data System (ADS)

    Anthony, R. E.; Aster, R. C.; Rowe, C. A.

    2016-12-01

    The Earth's seismic noise spectrum features two globally ubiquitous peaks near 8 and 16 s period (the secondary and primary microseism bands), which arise when storm-generated ocean gravity waves are converted to seismic energy, predominantly Rayleigh waves. Because of its regionally integrative nature, microseism intensity, along with other seismographic data from long-running sites, provides a useful proxy for wave state. Expanding an earlier study of global microseism trends (Aster et al., 2010), we analyze digitally archived, up-to-date (through late 2016) multi-decadal seismic data from stations of global seismographic networks to characterize the spatiotemporal evolution of wave climate over the past >20 years. The IRIS Noise Toolkit (Bahavar et al., 2013) is used to produce ground motion power spectral density (PSD) estimates in overlapping 3-hour time-series segments. The result of this effort is a longer-duration and more broadly geographically distributed PSD database than attained in previous studies, particularly for the primary microseism band. Integrating power within the primary and secondary microseism bands enables regional characterization of spatially integrated trends in wave state and of storm event statistics at varying thresholds. The results of these analyses are then interpreted within the context of recognized modes of atmospheric variability, including the particularly strong 2015-2016 El Niño. We note a number of statistically significant increasing trends in both raw microseism power and storm activity at multiple stations in the Northwest Atlantic and Southeast Pacific, consistent with generally increased wave heights and storminess in these regions. Such trends in wave activity have the potential to significantly influence coastal environments, particularly under rising global sea levels.
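    Integrating PSD estimates over a microseism period band, as described above, can be sketched as follows. The band limits here are illustrative choices (roughly 11-22 s for the primary and 4-10 s for the secondary band), and the PSD is assumed to be in linear units rather than dB:

```python
import numpy as np

def band_power(freqs, psd, period_band):
    """Integrate a power spectral density over a period band.

    freqs       : frequencies in Hz (ascending)
    psd         : PSD in linear units, e.g. (m/s^2)^2/Hz
    period_band : (shortest, longest) period in seconds
    """
    f_lo, f_hi = 1.0 / period_band[1], 1.0 / period_band[0]
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    f, p = freqs[mask], psd[mask]
    # Trapezoidal integration of PSD * df across the band.
    return float(np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(f)))

# Illustrative band choices (seconds):
PRIMARY = (11.0, 22.0)    # ~0.045-0.09 Hz
SECONDARY = (4.0, 10.0)   # ~0.1-0.25 Hz
```

    Applying this to each 3-hour PSD segment yields a band-power time series per station, from which long-term trends and storm-event exceedance statistics can be computed.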

  19. Global Distribution of Outbreaks of Water-Associated Infectious Diseases

    PubMed Central

    Yang, Kun; LeJeune, Jeffrey; Alsdorf, Doug; Lu, Bo; Shum, C. K.; Liang, Song

    2012-01-01

    Background Water plays an important role in the transmission of many infectious diseases, which pose a great burden on global public health. However, the global distribution of these water-associated infectious diseases and their underlying factors remain largely unexplored. Methods and Findings Based on the Global Infectious Disease and Epidemiology Network (GIDEON), a global database of water-associated pathogens and diseases was developed. In this study, reported outbreak events of water-associated infectious diseases from 1991 to 2008 were extracted from the database. The location of each reported outbreak event was identified and geocoded into a GIS database. The GIS database also included geo-referenced socio-environmental information: population density (2000), annual accumulated temperature, surface water area, and average annual precipitation. Poisson models with Bayesian inference were developed to explore the association between these socio-environmental factors and the distribution of reported outbreak events. A total of 1,428 reported outbreak events were retrieved from the database. The analysis suggested that outbreaks of water-associated diseases are significantly correlated with socio-environmental factors. Population density is a significant risk factor for all categories of reported outbreaks of water-associated diseases; water-related diseases (e.g., vector-borne diseases) are associated with accumulated temperature; water-washed diseases (e.g., conjunctivitis) are inversely related to surface water area; and both water-borne and water-related diseases are inversely related to average annual rainfall. Based on the model predictions, a global relative-risk map was generated and "hotspots" of risk for all categories of water-associated diseases were identified.
Conclusions At the global scale, water-associated infectious diseases are significantly correlated with socio-environmental factors; all regions are affected, but disproportionately by different categories of water-associated infectious diseases. PMID:22348158
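    The Poisson regression underlying such an analysis relates outbreak counts to a log-linear combination of covariates. As a simplified, hypothetical stand-in for the Bayesian inference used in the study, a maximum-likelihood fit can be sketched as:

```python
import numpy as np

def fit_poisson(X, y, lr=0.05, steps=3000):
    """Poisson regression y_i ~ Poisson(exp(X_i . beta)), fit by
    gradient ascent on the log-likelihood (a frequentist stand-in
    for the Bayesian fit in the study).

    X : (n, k) design matrix (first column typically all ones)
    y : (n,) observed event counts
    """
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        # Gradient of the Poisson log-likelihood: X^T (y - exp(X beta))
        beta += lr * X.T @ (y - np.exp(X @ beta)) / len(y)
    return beta
```

    Here each row of X would hold the geocoded covariates for one location (population density, accumulated temperature, surface water area, precipitation) and y the count of reported outbreak events; a positive fitted coefficient indicates a risk factor, a negative one an inverse relationship.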

  20. Global, long-term Earth Science Data Records of forest cover, change, and fragmentation from Landsat: the Global Forest Cover Change Project

    NASA Astrophysics Data System (ADS)

    Sexton, J.; Huang, C.; Channan, S.; Feng, M.; Song, X.; Kim, D.; Song, D.; Vermote, E.; Masek, J.; Townshend, J. R.

    2013-12-01

    Monitoring, analysis, and management of forests require measurements of forest cover that are both spatio-temporally consistent and resolved globally at sub-hectare resolution. The Global Forest Cover Change project, a cooperation between the University of Maryland Global Land Cover Facility and NASA Goddard Space Flight Center, is providing the first long-term, sub-hectare, globally consistent data records of forest cover, change, and fragmentation in circa-1975, -1990, -2000, and -2005 epochs. These data are derived from the Global Land Survey collection of Landsat images in the respective epochs. The 1990, 2000, and 2005 images were atmospherically corrected to surface reflectance using the Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) implementation of the 6S radiative transfer algorithm, with ancillary information from MODIS Land products, the ASTER Global Digital Elevation Model (GDEM), and climatological data layers. Forest cover and change were estimated by a novel continuous-field approach, which produced for the 2000 and 2005 epochs the world's first global, 30-m resolution database of tree cover. Surface reflectance estimates were validated against coincident MODIS measurements, results that have been corroborated by subsequent, independent validations against measurements from AERONET sites. Uncertainties in tree- and forest-cover values were estimated in each pixel as a compounding of within-sample uncertainty and accuracy relative to a sample of independent measurements from small-footprint lidar. The accuracy of forest cover and change estimates was further validated against expert-interpreted high-resolution imagery, from which unbiased estimates of forest cover and change have been produced at national and eco-regional scales.
These first-of-their-kind Earth Science Data Records (surface reflectance in 1990, 2000, and 2005, and forest cover, change, and fragmentation in and between 1975, 1990, 2000, and 2005) are hosted at native Landsat resolution for free public access at the Global Land Cover Facility website (www.landcover.org). [Figure caption:] Global mosaic of circa-2000, Landsat-based estimates of tree cover. Gaps due to clouds and/or snow in each scene were filled first with Landsat-based data from overlapping paths, and the remaining gaps were filled with data from the MODIS VCF Tree Cover layer in 2000.
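    The two-stage gap filling described in the mosaic caption (overlapping Landsat paths first, then the coarser MODIS VCF layer) reduces to a per-pixel cascade of fallbacks. A minimal sketch, assuming NaN marks cloud/snow gaps and that the MODIS layer has already been resampled to the Landsat grid:

```python
import numpy as np

def fill_gaps(primary, overlap, modis_vcf):
    """Per-pixel gap filling for a tree-cover mosaic: keep the primary
    Landsat estimate where valid, fall back to overlapping-path data,
    and finally to the (resampled) MODIS VCF tree-cover layer."""
    out = np.where(np.isnan(primary), overlap, primary)
    return np.where(np.isnan(out), modis_vcf, out)
```

    Because the MODIS fallback is only used where both Landsat sources are missing, the mosaic retains 30-m detail everywhere the Landsat record allows.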
