Sample records for "mapping databases agency"

  1. 77 FR 47492 - Thirteenth Meeting: RTCA Special Committee 217, Terrain and Airport Mapping Databases, Joint With...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-08

    ... Committee 217, Terrain and Airport Mapping Databases, Joint With EUROCAE WG-44 AGENCY: Federal Aviation... 217, Terrain and Airport Mapping Databases, Joint with EUROCAE WG-44. SUMMARY: The FAA is issuing this... Mapping Databases, Joint with EUROCAE WG-44. DATES: The meeting will be held September 10-14, 2012, from 9...

  2. 75 FR 10552 - Sixth Meeting-RTCA Special Committee 217: Joint With EUROCAE WG-44 Terrain and Airport Mapping...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-08

... 217: Joint With EUROCAE WG-44 Terrain and Airport Mapping Databases AGENCY: Federal Aviation... Airport Mapping Databases. SUMMARY: The FAA is issuing this notice to advise the public of a meeting of RTCA Special Committee 217: Joint with EUROCAE WG-44 Terrain and Airport Mapping Databases. DATES: The...

  3. 76 FR 27744 - Eighth Meeting-RTCA Special Committee 217: Joint With EUROCAE WG-44 Terrain and Airport Mapping...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-12

    ... Committee 217: Joint With EUROCAE WG-44 Terrain and Airport Mapping Databases AGENCY: Federal Aviation... Airport Mapping Databases. SUMMARY: The FAA is issuing this notice to advise the public of a meeting of RTCA Special Committee 217: Joint with EUROCAE WG-44 Terrain and Airport Mapping Databases. DATES: The...

  4. 76 FR 54527 - Ninth Meeting-RTCA Special Committee 217: Joint With EUROCAE WG-44 Terrain and Airport Mapping...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-01

... 217: Joint With EUROCAE WG-44 Terrain and Airport Mapping Databases AGENCY: Federal Aviation... Airport Mapping Databases. SUMMARY: The FAA is issuing this notice to advise the public of a meeting of RTCA Special Committee 217: Joint with EUROCAE WG-44 Terrain and Airport Mapping Databases. DATES: The...

  5. 76 FR 6179 - Eighth Meeting-RTCA Special Committee 217: Joint With EUROCAE WG-44 Terrain and Airport Mapping...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-03

    ... Committee 217: Joint With EUROCAE WG-44 Terrain and Airport Mapping Databases AGENCY: Federal Aviation... Airport Mapping Databases. SUMMARY: The FAA is issuing this notice to advise the public of a meeting of RTCA Special Committee 217: Joint with EUROCAE WG-44 Terrain and Airport Mapping Databases. DATES: The...

  6. 76 FR 70531 - Tenth Meeting: RTCA Special Committee 217/EUROCAE WG-44: Terrain and Airport Mapping Databases

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... 217/EUROCAE WG-44: Terrain and Airport Mapping Databases AGENCY: Federal Aviation Administration (FAA... Databases: For the tenth meeting DATES: The meeting will be held December 6-9, 2011, from 9 a.m. to 5 p.m... Mapping Databases. The agenda will include the following: December 6, 2011 Open Plenary Session. Chairman...

  7. 77 FR 14584 - Eleventh Meeting: RTCA Special Committee 217, Joint With EUROCAE Working Group-44, Terrain and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-12

... Committee 217, Joint With EUROCAE Working Group-44, Terrain and Airport Mapping Databases AGENCY: Federal... Special Committee 217, Joint with EUROCAE Working Group-44, Terrain and Airport Mapping Databases... Committee 217, Joint with EUROCAE Working Group-44, Terrain and Airport Mapping Databases. DATES: The...

  8. Semantic mediation in the national geologic map database (US)

    USGS Publications Warehouse

    Percy, D.; Richard, S.; Soller, D.

    2008-01-01

    Controlled language is the primary challenge in merging heterogeneous databases of geologic information. Each agency or organization produces databases with different schema, and different terminology for describing the objects within. In order to make some progress toward merging these databases using current technology, we have developed software and a workflow that allows for the "manual semantic mediation" of these geologic map databases. Enthusiastic support from many state agencies (stakeholders and data stewards) has shown that the community supports this approach. Future implementations will move toward a more Artificial Intelligence-based approach, using expert-systems or knowledge-bases to process data based on the training sets we have developed manually.

  9. 77 FR 29749 - Twelfth Meeting: RTCA Special Committee 217, Joint with EUROCAE WG-44, Terrain and Airport...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-18

    ... Committee 217, Joint with EUROCAE WG-44, Terrain and Airport Mapping Databases AGENCY: Federal Aviation... 217, Joint with EUROCAE WG-44, Terrain and Airport Mapping Databases. SUMMARY: The FAA is issuing this..., Terrain and Airport Mapping Databases. DATES: The meeting will be held June 18-22, 2012, from 9:00 a.m.-5...

  10. Geologic database for digital geology of California, Nevada, and Utah: an application of the North American Data Model

    USGS Publications Warehouse

    Bedford, David R.; Ludington, Steve; Nutt, Constance M.; Stone, Paul A.; Miller, David M.; Miller, Robert J.; Wagner, David L.; Saucedo, George J.

    2003-01-01

The USGS is creating an integrated national database for digital state geologic maps that includes stratigraphic, age, and lithologic information. The majority of the conterminous 48 states have digital geologic base maps available, often at scales of 1:500,000. This product is a prototype, and is intended to demonstrate the types of derivative maps that will be possible with the national integrated database. This database permits the creation of a number of types of maps via simple or sophisticated queries, maps that may be useful in a number of areas, including mineral-resource assessment, environmental assessment, and regional tectonic evolution. This database is distributed with three main parts: a Microsoft Access 2000 database containing geologic map attribute data, an Arc/Info (Environmental Systems Research Institute, Redlands, California) Export format file containing points representing designation of stratigraphic regions for the Geologic Map of Utah, and an ArcView 3.2 (Environmental Systems Research Institute, Redlands, California) project containing scripts and dialogs for performing a series of generalization and mineral resource queries. IMPORTANT NOTE: Spatial data for the respective state geologic maps are not distributed with this report. The digital state geologic maps for the states involved in this report are separate products, and two of them are produced by individual state agencies, which may be legally and/or financially responsible for this data. However, the spatial datasets for maps discussed in this report are available to the public. Questions regarding the distribution, sale, and use of individual state geologic maps should be sent to the respective state agency. We do provide suggestions for obtaining and formatting the spatial data to make it compatible with data in this report. See section ‘Obtaining and Formatting Spatial Data’ in the PDF version of the report.

  11. Digital map databases in support of avionic display systems

    NASA Astrophysics Data System (ADS)

    Trenchard, Michael E.; Lohrenz, Maura C.; Rosche, Henry, III; Wischow, Perry B.

    1991-08-01

The emergence of computerized mission planning systems (MPS) and airborne digital moving map systems (DMS) has necessitated the development of a global database of raster aeronautical chart data specifically designed for input to these systems. The Naval Oceanographic and Atmospheric Research Laboratory's (NOARL) Map Data Formatting Facility (MDFF) is presently dedicated to supporting these avionic display systems with the development of the Compressed Aeronautical Chart (CAC) database on Compact Disk Read Only Memory (CDROM) optical discs. The MDFF is also developing a series of aircraft-specific Write-Once Read Many (WORM) optical discs. NOARL has initiated a comprehensive research program aimed at improving the pilots' moving map displays; current research efforts include the development of an alternate image compression technique and generation of a standard set of color palettes. The CAC database will provide digital aeronautical chart data in six different scales. CAC is derived from the Defense Mapping Agency's (DMA) Equal Arc-second (ARC) Digitized Raster Graphics (ADRG), a series of scanned aeronautical charts. NOARL processes ADRG to tailor the chart image resolution to that of the DMS display while reducing storage requirements through image compression techniques. CAC is being distributed by DMA as a library of CDROMs.

  12. GIS applications for military operations in coastal zones

    USGS Publications Warehouse

    Fleming, S.; Jordan, T.; Madden, M.; Usery, E.L.; Welch, R.

    2009-01-01

In order to successfully support current and future US military operations in coastal zones, geospatial information must be rapidly integrated and analyzed to meet ongoing force structure evolution and new mission directives. Coastal zones in a military-operational environment are complex regions that include sea, land and air features that demand high-volume databases of extreme detail within relatively narrow geographic corridors. Static products in the form of analog maps at varying scales traditionally have been used by military commanders and their operational planners. The rapidly changing battlefield of 21st Century warfare, however, demands dynamic mapping solutions. Commercial geographic information system (GIS) software for military-specific applications is now being developed and employed with digital databases to provide customized digital maps of variable scale, content and symbolization tailored to unique demands of military units. Research conducted by the Center for Remote Sensing and Mapping Science at the University of Georgia demonstrated the utility of GIS-based analysis and digital map creation when developing large-scale (1:10,000) products from littoral warfare databases. The methodology employed, including selection of data sources (high-resolution commercial images and lidar), establishment of analysis/modeling parameters, conduct of vehicle mobility analysis, development of models, and generation of products (such as a continuous sea-land DEM and geo-visualization of changing shorelines with tidal levels), is discussed. Based on observations and identified needs from the National Geospatial-Intelligence Agency, formerly the National Imagery and Mapping Agency, and the Department of Defense, prototype GIS models for military operations in sea, land and air environments were created from multiple data sets of a study area at US Marine Corps Base Camp Lejeune, North Carolina.
Results of these models, along with methodologies for developing large-scale littoral warfare databases, aid the National Geospatial-Intelligence Agency in meeting littoral warfare analysis, modeling and map generation requirements for US military organizations. © 2008 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS).

  13. GIS applications for military operations in coastal zones

    NASA Astrophysics Data System (ADS)

    Fleming, S.; Jordan, T.; Madden, M.; Usery, E. L.; Welch, R.

In order to successfully support current and future US military operations in coastal zones, geospatial information must be rapidly integrated and analyzed to meet ongoing force structure evolution and new mission directives. Coastal zones in a military-operational environment are complex regions that include sea, land and air features that demand high-volume databases of extreme detail within relatively narrow geographic corridors. Static products in the form of analog maps at varying scales traditionally have been used by military commanders and their operational planners. The rapidly changing battlefield of 21st Century warfare, however, demands dynamic mapping solutions. Commercial geographic information system (GIS) software for military-specific applications is now being developed and employed with digital databases to provide customized digital maps of variable scale, content and symbolization tailored to unique demands of military units. Research conducted by the Center for Remote Sensing and Mapping Science at the University of Georgia demonstrated the utility of GIS-based analysis and digital map creation when developing large-scale (1:10,000) products from littoral warfare databases. The methodology employed, including selection of data sources (high-resolution commercial images and lidar), establishment of analysis/modeling parameters, conduct of vehicle mobility analysis, development of models, and generation of products (such as a continuous sea-land DEM and geo-visualization of changing shorelines with tidal levels), is discussed. Based on observations and identified needs from the National Geospatial-Intelligence Agency, formerly the National Imagery and Mapping Agency, and the Department of Defense, prototype GIS models for military operations in sea, land and air environments were created from multiple data sets of a study area at US Marine Corps Base Camp Lejeune, North Carolina.
Results of these models, along with methodologies for developing large-scale littoral warfare databases, aid the National Geospatial-Intelligence Agency in meeting littoral warfare analysis, modeling and map generation requirements for US military organizations.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bower, J.C.; Burford, M.J.; Downing, T.R.

The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Nuclear and Chemical Agency (USANCA). The IBS Data Management Guide provides the background, as well as the operations and procedures needed to generate and maintain a site-specific map database. Data and system managers use this guide to manage the data files and database that support the administrative, user-environment, database management, and operational capabilities of the IBS. This document provides a description of the data files and structures necessary for running the IBS software and using the site map database.

  15. Geologic Map and Map Database of the Oakland Metropolitan Area, Alameda, Contra Costa, and San Francisco Counties, California

    USGS Publications Warehouse

    Graymer, R.W.

    2000-01-01

Introduction: This report contains a new geologic map at 1:50,000 scale, derived from a set of geologic map databases containing information at a resolution associated with 1:24,000 scale, and a new description of geologic map units and structural relationships in the mapped area. The map database represents the integration of previously published reports and new geologic mapping and field checking by the author (see Sources of Data index map on the map sheet or the Arc-Info coverage pi-so and the textfile pi-so.txt). The descriptive text (below) contains new ideas about the Hayward fault and other faults in the East Bay fault system, as well as new ideas about the geologic units and their relations. These new data are released in digital form in conjunction with the Federal Emergency Management Agency Project Impact in Oakland. The goal of Project Impact is to use geologic information in land-use and emergency services planning to reduce the losses occurring during earthquakes, landslides, and other hazardous geologic events. The USGS, California Division of Mines and Geology, FEMA, California Office of Emergency Services, and City of Oakland participated in the cooperative project. The geologic data in this report were provided in pre-release form to other Project Impact scientists, and served as one of the basic data layers for the analysis of hazard related to earthquake shaking, liquefaction, earthquake-induced landsliding, and rainfall-induced landsliding. The publication of these data provides an opportunity for regional planners, local, state, and federal agencies, teachers, consultants, and others outside Project Impact who are interested in geologic data to have the new data long before a traditional paper map could be published.
Because the database contains information about both the bedrock and surficial deposits, it has practical applications in the study of groundwater and engineering of hillside materials, as well as the study of geologic hazards and the academic research on the geologic history and development of the region.

  16. Database of the Geologic Map of North America - Adapted from the Map by J.C. Reed, Jr. and others (2005)

    USGS Publications Warehouse

    Garrity, Christopher P.; Soller, David R.

    2009-01-01

    The Geological Society of America's (GSA) Geologic Map of North America (Reed and others, 2005; 1:5,000,000) shows the geology of a significantly large area of the Earth, centered on North and Central America and including the submarine geology of parts of the Atlantic and Pacific Oceans. This map is now converted to a Geographic Information System (GIS) database that contains all geologic and base-map information shown on the two printed map sheets and the accompanying explanation sheet. We anticipate this map database will be revised at some unspecified time in the future, likely through the actions of a steering committee managed by the Geological Society of America (GSA) and staffed by scientists from agencies including, but not limited to, those responsible for the original map compilation (U.S. Geological Survey, Geological Survey of Canada, and Woods Hole Oceanographic Institute). Regarding the use of this product, as noted by the map's compilers: 'The Geologic Map of North America is an essential educational tool for teaching the geology of North America to university students and for the continuing education of professional geologists in North America and elsewhere. In addition, simplified maps derived from the Geologic Map of North America are useful for enlightening younger students and the general public about the geology of the continent.' With publication of this database, the preparation of any type of simplified map is made significantly easier. More important perhaps, the database provides a more accessible means to explore the map information and to compare and analyze it in conjunction with other types of information (for example, land use, soils, biology) to better understand the complex interrelations among factors that affect Earth resources, hazards, ecosystems, and climate.

  17. A 30-meter spatial database for the nation's forests

    Treesearch

    Raymond L. Czaplewski

    2002-01-01

    The FIA vision for remote sensing originated in 1992 with the Blue Ribbon Panel on FIA, and it has since evolved into an ambitious performance target for 2003. FIA is joining a consortium of Federal agencies to map the Nation's land cover. FIA field data will help produce a seamless, standardized, national geospatial database for forests at the scale of 30-m...

  18. The National Map - Orthoimagery Layer

    USGS Publications Warehouse


    2007-01-01

    Many Federal, State, and local agencies use a common set of framework geographic information databases as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and homeland security applications rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy. The U.S. Geological Survey (USGS) is developing The National Map to be a seamless, continually maintained, and nationally consistent set of online, public domain, framework geographic information databases. The National Map will serve as a foundation for integrating, sharing, and using data easily and consistently. The data will be the source of revised paper topographic maps. The National Map includes digital orthorectified imagery; elevation data; vector data for hydrography, transportation, boundary, and structure features; geographic names; and land cover information.

  19. USGS Mineral Resources Program; national maps and datasets for research and land planning

    USGS Publications Warehouse

    Nicholson, S.W.; Stoeser, D.B.; Ludington, S.D.; Wilson, Frederic H.

    2001-01-01

The U.S. Geological Survey, the Nation’s leader in producing and maintaining earth science data, serves as an advisor to Congress, the Department of the Interior, and many other Federal and State agencies. Nationwide datasets that are easily available and of high quality are critical for addressing a wide range of land-planning, resource, and environmental issues. Four types of digital databases (geological, geophysical, geochemical, and mineral occurrence) are being compiled and upgraded by the Mineral Resources Program on regional and national scales to meet these needs. Where existing data are incomplete, new data are being collected to ensure national coverage. Maps and analyses produced from these databases provide basic information essential for mineral resource assessments and environmental studies, as well as fundamental information for regional and national land-use studies. Maps and analyses produced from the databases are instrumental to ongoing basic research, such as the identification of mineral deposit origins, determination of regional background values of chemical elements with known environmental impact, and study of the relationships of toxic elements and mining practices to human health. As datasets are completed or revised, the information is made available through a variety of media, including the Internet. Much of the available information is the result of cooperative activities with State and other Federal agencies. The upgraded Mineral Resources Program datasets make geologic, geophysical, geochemical, and mineral occurrence information at the state, regional, and national scales available to members of Congress, State and Federal government agencies, researchers in academia, and the general public. The status of the Mineral Resources Program datasets is outlined below.

  20. Latent fingerprint matching.

    PubMed

    Jain, Anil K; Feng, Jianjiang

    2011-01-01

    Latent fingerprint identification is of critical importance to law enforcement agencies in identifying suspects: Latent fingerprints are inadvertent impressions left by fingers on surfaces of objects. While tremendous progress has been made in plain and rolled fingerprint matching, latent fingerprint matching continues to be a difficult problem. Poor quality of ridge impressions, small finger area, and large nonlinear distortion are the main difficulties in latent fingerprint matching compared to plain or rolled fingerprint matching. We propose a system for matching latent fingerprints found at crime scenes to rolled fingerprints enrolled in law enforcement databases. In addition to minutiae, we also use extended features, including singularity, ridge quality map, ridge flow map, ridge wavelength map, and skeleton. We tested our system by matching 258 latents in the NIST SD27 database against a background database of 29,257 rolled fingerprints obtained by combining the NIST SD4, SD14, and SD27 databases. The minutiae-based baseline rank-1 identification rate of 34.9 percent was improved to 74 percent when extended features were used. In order to evaluate the relative importance of each extended feature, these features were incrementally used in the order of their cost in marking by latent experts. The experimental results indicate that singularity, ridge quality map, and ridge flow map are the most effective features in improving the matching accuracy.

  1. Geology of the Palo Alto 30 x 60 minute quadrangle, California: A digital database

    USGS Publications Warehouse

    Brabb, Earl E.; Graymer, R.W.; Jones, David Lawrence

    1998-01-01

    This map database represents the integration of previously published and unpublished maps by several workers (see Sources of Data index map on Sheet 2 and the corresponding table below) and new geologic mapping and field checking by the authors with the previously published geologic map of San Mateo County (Brabb and Pampeyan, 1983) and Santa Cruz County (Brabb, 1989, Brabb and others, 1997), and various sources in a small part of Santa Clara County. These new data are released in digital form to provide an opportunity for regional planners, local, state, and federal agencies, teachers, consultants, and others interested in geologic data to have the new data long before a traditional paper map is published. The new data include a new depiction of Quaternary units in the San Francisco Bay plain emphasizing depositional environment, important new observations between the San Andreas and Pilarcitos faults, and a new interpretation of structural and stratigraphic relationships of rock packages (Assemblages).

  2. Digital mapping techniques '00, workshop proceedings - May 17-20, 2000, Lexington, Kentucky

    USGS Publications Warehouse

    Soller, David R.

    2000-01-01

    Introduction: The Digital Mapping Techniques '00 (DMT'00) workshop was attended by 99 technical experts from 42 agencies, universities, and private companies, including representatives from 28 state geological surveys (see Appendix A). This workshop was similar in nature to the first three meetings, held in June, 1997, in Lawrence, Kansas (Soller, 1997), in May, 1998, in Champaign, Illinois (Soller, 1998a), and in May, 1999, in Madison, Wisconsin (Soller, 1999). This year's meeting was hosted by the Kentucky Geological Survey, from May 17 to 20, 2000, on the University of Kentucky campus in Lexington. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. When, based on discussions at the workshop, an attendee adopts or modifies a newly learned technique, the workshop clearly has met that objective. Evidence of learning and cooperation among participating agencies continued to be a highlight of the DMT workshops (see example in Soller, 1998b, and various papers in this volume). The meeting's general goal was to help move the state geological surveys and the USGS toward development of more cost-effective, flexible, and useful systems for digital mapping and geographic information systems (GIS) analysis. Through oral and poster presentations and special discussion sessions, emphasis was given to: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) continued development of the National Geologic Map Database; 3) progress toward building a standard geologic map data model; 4) field data-collection systems; and 5) map citation and authorship guidelines. Four representatives of the GIS hardware and software vendor community were invited to participate. 
The four annual DMT workshops were coordinated by the AASG/USGS Data Capture Working Group, which was formed in August, 1996, to support the Association of American State Geologists and the USGS in their effort to build a National Geologic Map Database (see Soller and Berg, this volume, and http://ncgmp.usgs.gov/ngmdbproject/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed to help the Database, and the State and Federal geological surveys, provide more high-quality digital maps to the public.

  3. Digital Mapping Techniques '07 - Workshop Proceedings

    USGS Publications Warehouse

    Soller, David R.

    2008-01-01

The Digital Mapping Techniques '07 (DMT'07) workshop was attended by 85 technical experts from 49 agencies, universities, and private companies, including representatives from 27 state geological surveys. This year's meeting, the tenth in the annual series, was hosted by the South Carolina Geological Survey, from May 20-23, 2007, on the University of South Carolina campus in Columbia, South Carolina. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.

  4. Scoping of flood hazard mapping needs for Kennebec County, Maine

    USGS Publications Warehouse

    Dudley, Robert W.; Schalk, Charles W.

    2006-01-01

    This report was prepared by the U.S. Geological Survey (USGS) Maine Water Science Center as the deliverable for scoping of flood hazard mapping needs for Kennebec County, Maine, under Federal Emergency Management Agency (FEMA) Inter-Agency Agreement Number HSFE01-05-X-0018. This section of the report explains the objective of the task and the purpose of the report. The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program, began scoping work in 2005 for Kennebec County. Scoping activities included assembling existing data and map needs information for communities in Kennebec County (efforts were made to not duplicate those of pre-scoping completed in March 2005), documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database or its successor with information gathered during the scoping process. The average age of the FEMA floodplain maps in Kennebec County, Maine is 16 years. 
Most of these studies were conducted from the late 1970s to the mid-1980s. However, in the ensuing 20-30 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions nor accurately estimate risk in terms of flood heights. The following is the scope of work as defined in the FEMA/USGS Statement of Work: Task 1: Collect data from a variety of sources including community surveys, other Federal and State Agencies, National Flood Insurance Program (NFIP) State Coordinators, Community Assistance Visits (CAVs) and FEMA archives. Lists of mapping needs will be obtained from the MNUSS database, community surveys, and CAVs, if available. FEMA archives will be inventoried for effective FIRM panels, FIS reports, and other flood-hazard data or existing study data. Best available base map information, topographic data, flood-hazard data, and hydrologic and hydraulic data will be identified. Data from the Maine Floodplain Management Program database also will be utilized. Task 2: Contact communities in Kennebec County to notify them that FEMA and the State have selected them for a map update, and that a project scope will be developed with their input. Topics to be reviewed with the communities include (1) Purpose of the Flood Map Project (for example, the update needs that have prompted the map update); (2) The community's mapping needs; (3) The community's available mapping, hydrologic, hydraulic, and flooding information; (4) Target schedule for completing the project; and (5) The community's engineering, planning, and geographic information system (GIS) capabilities. On the basis of the collected information from Task 1 and community contacts/meetings in Task 2, the USGS will develop a Draft Project Scope for the identified mapping needs of the communities in Kennebec County.
The following items will be addressed in the Draft Project Scope: review of available information, determine if and how effective FIS data can be used in the new project, and identify other data needed to

  5. Scoping of flood hazard mapping needs for Somerset County, Maine

    USGS Publications Warehouse

    Dudley, Robert W.; Schalk, Charles W.

    2006-01-01

    This report was prepared by the U.S. Geological Survey (USGS) Maine Water Science Center as the deliverable for scoping of flood hazard mapping needs for Somerset County, Maine, under Federal Emergency Management Agency (FEMA) Inter-Agency Agreement Number HSFE01-05-X-0018. This section of the report explains the objective of the task and the purpose of the report. FEMA developed a plan in 1997 to modernize its flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update the nation's flood maps to a seamless digital format and to streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The USGS, in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program, began scoping work in 2005 for Somerset County. Scoping activities included assembling existing data and map needs information for communities in Somerset County (efforts were made not to duplicate those of pre-scoping completed in March 2005); documenting data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document); and updating the Mapping Needs Update Support System (MNUSS) Database or its successor with information gathered during the scoping process. The average age of the FEMA floodplain maps in Somerset County, Maine, is 18.1 years. 
Most of these studies were completed in the late 1970s to the mid-1980s. However, in the ensuing 20-30 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights. The following is the scope of work as defined in the FEMA/USGS Statement of Work: Task 1: Collect data from a variety of sources including community surveys, other Federal and State agencies, National Flood Insurance Program (NFIP) State Coordinators, Community Assistance Visits (CAVs), and FEMA archives. Lists of mapping needs will be obtained from the MNUSS database, community surveys, and CAVs, if available. FEMA archives will be inventoried for effective FIRM panels, FIS reports, and other flood-hazard data or existing study data. Best available base map information, topographic data, flood-hazard data, and hydrologic and hydraulic data will be identified. Data from the Maine Floodplain Management Program database also will be utilized. Task 2: Contact communities in Somerset County to notify them that FEMA and the State have selected them for a map update, and that a project scope will be developed with their input. Topics to be reviewed with the communities include (1) the purpose of the Flood Map Project (for example, the update needs that have prompted the map update); (2) the community's mapping needs; (3) the community's available mapping, hydrologic, hydraulic, and flooding information; (4) the target schedule for completing the project; and (5) the community's engineering, planning, and geographic information system (GIS) capabilities. On the basis of the information collected from Task 1 and the community contacts/meetings in Task 2, the USGS will develop a Draft Project Scope for the identified mapping needs of the communities in Somerset County. 
The following items will be addressed in the Draft Project Scope: review of available information, determine if and how effective FIS data can be used in the new project, and identify other data needed to

  6. Scoping of flood hazard mapping needs for Cumberland County, Maine

    USGS Publications Warehouse

    Dudley, Robert W.; Schalk, Charles W.

    2006-01-01

    This report was prepared by the U.S. Geological Survey (USGS) Maine Water Science Center as the deliverable for scoping of flood hazard mapping needs for Cumberland County, Maine, under Federal Emergency Management Agency (FEMA) Inter-Agency Agreement Number HSFE01-05-X-0018. This section of the report explains the objective of the task and the purpose of the report. FEMA developed a plan in 1997 to modernize its flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update the nation's flood maps to a seamless digital format and to streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The USGS, in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program, began scoping work in 2005 for Cumberland County. Scoping activities included assembling existing data and map needs information for communities in Cumberland County; documenting data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document); and updating the Mapping Needs Update Support System (MNUSS) Database or its successor with information gathered during the scoping process. The average age of the FEMA floodplain maps in Cumberland County, Maine, is 21 years. Most of these studies were completed in the early to mid-1980s. 
However, in the ensuing 20-25 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights. The following is the scope of work as defined in the FEMA/USGS Statement of Work: Task 1: Collect data from a variety of sources including community surveys, other Federal and State agencies, National Flood Insurance Program (NFIP) State Coordinators, Community Assistance Visits (CAVs), and FEMA archives. Lists of mapping needs will be obtained from the MNUSS database, community surveys, and CAVs, if available. FEMA archives will be inventoried for effective FIRM panels, FIS reports, and other flood-hazard data or existing study data. Best available base map information, topographic data, flood-hazard data, and hydrologic and hydraulic data will be identified. Data from the Maine Floodplain Management Program database also will be utilized. Task 2: Contact communities in Cumberland County to notify them that FEMA and the State have selected them for a map update, and that a project scope will be developed with their input. Topics to be reviewed with the communities include (1) the purpose of the Flood Map Project (for example, the update needs that have prompted the map update); (2) the community's mapping needs; (3) the community's available mapping, hydrologic, hydraulic, and flooding information; (4) the target schedule for completing the project; and (5) the community's engineering, planning, and geographic information system (GIS) capabilities. On the basis of the information collected from Task 1 and the community contacts/meetings in Task 2, the USGS will develop a Draft Project Scope for the identified mapping needs of the communities in Cumberland County. 
The following items will be addressed in the Draft Project Scope: review of available information, determine if and how effective FIS data can be used in the new project, and identify other data needed to

  7. Digital Mapping Techniques '08—Workshop Proceedings, Moscow, Idaho, May 18–21, 2008

    USGS Publications Warehouse

    Soller, David R.

    2009-01-01

    The Digital Mapping Techniques '08 (DMT'08) workshop was attended by more than 100 technical experts from 40 agencies, universities, and private companies, including representatives from 24 State geological surveys. This year's meeting, the twelfth in the annual series, was hosted by the Idaho Geological Survey, from May 18-21, 2008, on the University of Idaho campus in Moscow, Idaho. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  8. Digital Mapping Techniques '10-Workshop Proceedings, Sacramento, California, May 16-19, 2010

    USGS Publications Warehouse

    Soller, David R.

    2012-01-01

    The Digital Mapping Techniques '10 (DMT'10) workshop was attended by 110 technical experts from 40 agencies, universities, and private companies, including representatives from 19 State geological surveys (see Appendix A). This workshop, hosted by the California Geological Survey, May 16-19, 2010, in Sacramento, California, was similar in nature to the previous 13 meetings (see Appendix B). The meeting was coordinated by the U.S. Geological Survey's (USGS) National Geologic Map Database project. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was again successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products ("publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  9. Geological and technological assessment of artificial reef sites, Louisiana outer continental shelf

    USGS Publications Warehouse

    Pope, D.L.; Moslow, T.F.; Wagner, J.B.

    1993-01-01

    This paper describes the general procedures used to select sites for obsolete oil and gas platforms as artificial reefs on the Louisiana outer continental shelf (OCS). The methods employed incorporate six basic steps designed to resolve multiple-use conflicts that might otherwise arise with daily industry and commercial fishery operations, and to identify and assess both geological and technological constraints that could affect placement of the structures. These steps include: (1) exclusion mapping; (2) establishment of artificial reef planning areas; (3) database compilation; (4) assessment and interpretation of the database; (5) mapping of geological and man-made features within each proposed reef site; and (6) site selection. Nautical charts, bathymetric maps, and offshore oil and gas maps were used for exclusion mapping, and to select nine regional planning areas. Pipeline maps were acquired from federal agencies and private industry to determine their general locations within each planning area, and to establish exclusion fairways along each pipeline route. Approximately 1600 line kilometers of high-resolution geophysical data collected by federal agencies and private industry were acquired for the nine planning areas. These data were interpreted to determine the nature and extent of near-surface geologic features that could affect placement of the structures. Seismic reflection patterns were also characterized to evaluate near-bottom sedimentation processes in the vicinity of each reef site. Geotechnical borings were used to determine the lithological and physical properties of the sediment, and for correlation with the geophysical data. Since 1987, five sites containing 10 obsolete production platforms have been selected on the Louisiana OCS using these procedures. Industry participants have realized a total savings of approximately US $1,500,000 in salvaging costs by converting these structures into artificial reefs. © 1993.

  10. National Land Cover Database 2001 (NLCD01)

    USGS Publications Warehouse

    LaMotte, Andrew E.

    2016-01-01

    This 30-meter data set represents land use and land cover for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (see http://water.usgs.gov/GIS/browse/nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (http://www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004) (see http://www.mrlc.gov/mrlc2k.asp). The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (see http://water.usgs.gov/GIS/browse/nlcd01-mappingzones.jpg) were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.
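
    For users of gridded land-cover products such as NLCD, a common first step after loading a tile is tallying area per class code. The sketch below, using a tiny synthetic array in place of real NLCD data (the class codes and grid values are illustrative only), shows the arithmetic: each 30-meter cell covers 900 square meters, so per-class area is simply cell count times cell area.

    ```python
    import numpy as np

    CELL_AREA_M2 = 30 * 30  # NLCD cells are 30 m on a side -> 900 m^2 each

    # Illustrative 4x4 tile of land-cover class codes (hypothetical values,
    # not actual NLCD data): 11 = open water, 22 = developed, 41 = forest
    tile = np.array([
        [11, 11, 22, 41],
        [11, 22, 22, 41],
        [41, 41, 41, 41],
        [22, 22, 41, 41],
    ])

    # Count cells per class, then convert counts to areas in square meters
    classes, counts = np.unique(tile, return_counts=True)
    areas = {int(c): int(n) * CELL_AREA_M2 for c, n in zip(classes, counts)}
    print(areas)  # -> {11: 2700, 22: 4500, 41: 7200}
    ```

    The same pattern scales to a full NLCD tile read from disk; only the array source changes, not the tallying logic.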

  11. National Land Cover Database 2001 (NLCD01) Imperviousness Layer Tile 1, Northwest United States: IMPV01_1

    USGS Publications Warehouse

    LaMotte, Andrew E.; Wieczorek, Michael

    2010-01-01

    This 30-meter resolution data set represents the imperviousness layer for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (browse graphic: nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004) and http://www.mrlc.gov/mrlc2k.asp. The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (browse graphic: nlcd01-mappingzones.jpg) were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.

  12. National Land Cover Database 2001 (NLCD01) Tree Canopy Layer Tile 2, Northeast United States: CNPY01_2

    USGS Publications Warehouse

    LaMotte, Andrew E.; Wieczorek, Michael

    2010-01-01

    This 30-meter resolution data set represents the tree canopy layer for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (browse graphic: nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004) and http://www.mrlc.gov/mrlc2k.asp. The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (browse graphic: nlcd01-mappingzones.jpg) were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.

  13. National Land Cover Database 2001 (NLCD01) Imperviousness Layer Tile 4, Southeast United States: IMPV01_4

    USGS Publications Warehouse

    Wieczorek, Michael; LaMotte, Andrew E.

    2010-01-01

    This 30-meter resolution data set represents the imperviousness layer for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (browse graphic: nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004) and http://www.mrlc.gov/mrlc2k.asp. The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (browse graphic: nlcd01-mappingzones.jpg) were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.

  14. National Land Cover Database 2001 (NLCD01) Tree Canopy Layer Tile 1, Northwest United States: CNPY01_1

    USGS Publications Warehouse

    LaMotte, Andrew E.; Wieczorek, Michael

    2010-01-01

    This 30-meter resolution data set represents the tree canopy layer for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (browse graphic: nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004) and http://www.mrlc.gov/mrlc2k.asp. The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (browse graphic: nlcd01-mappingzones.jpg) were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.

  15. National Land Cover Database 2001 (NLCD01) Imperviousness Layer Tile 2, Northeast United States: IMPV01_2

    USGS Publications Warehouse

    LaMotte, Andrew E.; Wieczorek, Michael

    2010-01-01

    This 30-meter resolution data set represents the imperviousness layer for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (browse graphic: nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004) and http://www.mrlc.gov/mrlc2k.asp. The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (browse graphic: nlcd01-mappingzones.jpg) were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.

  16. National Land Cover Database 2001 (NLCD01) Tree Canopy Layer Tile 4, Southeast United States: CNPY01_4

    USGS Publications Warehouse

    LaMotte, Andrew E.; Wieczorek, Michael

    2010-01-01

    This 30-meter resolution data set represents the tree canopy layer for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (browse graphic: nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004) and http://www.mrlc.gov/mrlc2k.asp. The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (browse graphic: nlcd01-mappingzones.jpg) were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.

  17. National Land Cover Database 2001 (NLCD01) Imperviousness Layer Tile 3, Southwest United States: IMPV01_3

    USGS Publications Warehouse

    LaMotte, Andrew E.; Wieczorek, Michael

    2010-01-01

    This 30-meter resolution data set represents the imperviousness layer for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (browse graphic: nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004) and http://www.mrlc.gov/mrlc2k.asp. The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (browse graphic: nlcd01-mappingzones.jpg) were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.

  18. National Land Cover Database 2001 (NLCD01) Tree Canopy Layer Tile 3, Southwest United States: CNPY01_3

    USGS Publications Warehouse

    LaMotte, Andrew E.; Wieczorek, Michael

    2010-01-01

    This 30-meter resolution data set represents the tree canopy layer for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (browse graphic: nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004) and http://www.mrlc.gov/mrlc2k.asp. The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (browse graphic: nlcd01-mappingzones.jpg) were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.

  19. Digital Mapping and Environmental Characterization of National Wild and Scenic River Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McManamay, Ryan A; Bosnall, Peter; Hetrick, Shelaine L

    2013-09-01

    Spatially accurate geospatial information is required to support decision-making regarding sustainable future hydropower development. Under a memorandum of understanding among several federal agencies, a pilot study was conducted to map a subset of National Wild and Scenic Rivers (WSRs) at a higher resolution and provide a consistent methodology for mapping WSRs across the United States and across agency jurisdictions. A subset of rivers (segments falling under the jurisdiction of the National Park Service) was mapped at a high resolution using the National Hydrography Dataset (NHD). The spatial extent and representation of river segments mapped at NHD scale were compared with the prevailing geospatial coverage mapped at a coarser scale. Accurately digitized river segments were linked to environmental attribution datasets housed within the Oak Ridge National Laboratory's National Hydropower Asset Assessment Program database to characterize the environmental context of WSR segments. The results suggest that both the spatial scale of hydrography datasets and the adherence to written policy descriptions are critical to accurately mapping WSRs. The environmental characterization provided information to deduce generalized trends in either the uniqueness or the commonness of environmental variables associated with WSRs. Although WSRs occur in a wide range of human-modified landscapes, environmental data layers suggest that they provide habitats important to terrestrial and aquatic organisms and recreation important to humans. Ultimately, the research findings herein suggest that there is a need for accurate, consistent mapping of the National WSRs across the agencies responsible for administering each river. Geospatial applications examining potential landscape and energy development require accurate sources of information, such as data layers that portray realistic spatial representations.
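    The linking step the abstract describes can be sketched as an attribute join on a shared reach identifier (NHD reaches carry common identifiers; the field names, identifiers, and attribute values below are illustrative placeholders, not data from the study).

```python
# Hypothetical WSR segments and an environmental attribute table,
# joined on a shared reach identifier ("comid").
segments = [
    {"comid": 101, "river": "Example River A", "length_km": 8.2},
    {"comid": 102, "river": "Example River B", "length_km": 5.9},
]
attributes = {
    101: {"land_cover": "forest", "dams_upstream": 0},
    102: {"land_cover": "agriculture", "dams_upstream": 1},
}

def link_attributes(segs, attrs):
    """Attach environmental attributes to each river segment."""
    return [{**s, **attrs.get(s["comid"], {})} for s in segs]

linked = link_attributes(segments, attributes)
```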

  20. Improving Land Cover Mapping: a Mobile Application Based on ESA Sentinel 2 Imagery

    NASA Astrophysics Data System (ADS)

    Melis, M. T.; Dessì, F.; Loddo, P.; La Mantia, C.; Da Pelo, S.; Deflorio, A. M.; Ghiglieri, G.; Hailu, B. T.; Kalegele, K.; Mwasi, B. N.

    2018-04-01

    The increasing availability of satellite data is of real value for enhancing environmental knowledge and land management. Possibilities for integrating different sources of geo-data are growing, and methodologies for creating thematic databases are becoming very sophisticated. Moreover, access to internet services, and in particular to web mapping services, is well developed and widespread among both expert users and citizens. Web map services such as Google Maps or OpenStreetMap give access to updated optical imagery or topographic maps, but information on land cover/use is still not provided. As a result, non-specialized users face many obstacles in using and accessing such maps. This issue is particularly felt where digital (web) maps could form the basis for land-use management, as they are more economical and accessible than paper maps. These conditions are well known in many African countries where, while internet access is becoming open to all, the local map agencies and their products are not widespread.

  1. The National Landslide Database of Great Britain: Acquisition, communication and the role of social media

    NASA Astrophysics Data System (ADS)

    Pennington, Catherine; Freeborough, Katy; Dashwood, Claire; Dijkstra, Tom; Lawrie, Kenneth

    2015-11-01

    The British Geological Survey (BGS) is the national geological agency for Great Britain that provides geoscientific information to government, other institutions and the public. The National Landslide Database has been developed by the BGS and is the focus for national geohazard research for landslides in Great Britain. The history and structure of the geospatial database and associated Geographical Information System (GIS) are explained, along with the future developments of the database and its applications. The database is the most extensive source of information on landslides in Great Britain, with over 17,000 records of landslide events to date, each documented as fully as possible for inland, coastal and artificial slopes. Data are gathered through a range of procedures, including: incorporation of other databases; automated trawling of current and historical scientific literature and media reports; new field- and desk-based mapping technologies with digital data capture; and citizen science through social media and other online resources. This information is invaluable for directing the investigation, prevention and mitigation of areas of unstable ground in accordance with Government planning policy guidelines. The national landslide susceptibility map (GeoSure) and a national landslide domains map currently under development, as well as regional mapping campaigns, rely heavily on the information contained within the landslide database. Assessing susceptibility to landsliding requires knowledge of the distribution of failures and an understanding of causative factors, their spatial distribution and likely impacts, whilst understanding the frequency and types of landsliding present is integral to modelling how rainfall will influence the stability of a region.
Communication of landslide data through the Natural Hazard Partnership (NHP) and Hazard Impact Model contributes to national hazard mitigation and disaster risk reduction with respect to weather and climate. Daily reports of landslide potential are published by BGS through the NHP partnership and data collected for the National Landslide Database are used widely for the creation of these assessments. The National Landslide Database is freely available via an online GIS and is used by a variety of stakeholders for research purposes.
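    To make the kind of record-keeping described above concrete, here is a minimal sketch of landslide event records summarized by slope setting and data source. The field names and example records are invented for illustration and do not reflect the actual BGS database schema.

```python
from collections import Counter

# Illustrative records in the spirit of a national landslide database:
# each event documented with a slope setting and the procedure that
# captured it (field names are hypothetical).
records = [
    {"id": 1, "setting": "coastal",    "source": "field mapping"},
    {"id": 2, "setting": "inland",     "source": "media report"},
    {"id": 3, "setting": "inland",     "source": "social media"},
    {"id": 4, "setting": "artificial", "source": "literature"},
]

# Summaries like these feed susceptibility mapping and daily reporting.
by_setting = Counter(r["setting"] for r in records)
by_source = Counter(r["source"] for r in records)
```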

  2. Protection of Marine Mammals.

    PubMed

    Knoll, Michaela; Ciaccia, Ettore; Dekeling, René; Kvadsheim, Petter; Liddell, Kate; Gunnarsson, Stig-Lennart; Ludwig, Stefan; Nissen, Ivor; Lorenzen, Dirk; Kreimeyer, Roman; Pavan, Gianni; Meneghetti, Nello; Nordlund, Nina; Benders, Frank; van der Zwan, Timo; van Zon, Tim; Fraser, Leanne; Johansson, Torbjörn; Garmelius, Martin

    2016-01-01

    Within the European Defence Agency (EDA) Protection of Marine Mammals (PoMM) project, a comprehensive common marine mammal database, essential for risk-mitigation tools, was established. The database, built on an extensive dataset collection focused on areas of operational interest for European navies, consists of annual and seasonal distribution and density maps, random and systematic sightings, an encyclopedia providing knowledge on the characteristics of 126 marine mammal species, data on marine mammal protection areas, and audio information including numerous examples of various vocalizations. Special investigations on marine mammal acoustics were carried out to improve the detection and classification capabilities.

  3. Software development, nomenclature schemes, and mapping strategies for an international pediatric cardiac surgery database system.

    PubMed

    Jacobs, Jeffrey P

    2002-01-01

    The field of congenital heart surgery has the opportunity to create the first comprehensive international database for a medical subspecialty. An understanding of the demographics of congenital heart disease and the rapid growth of computer technology leads to the realization that creating a comprehensive international database for pediatric cardiac surgery represents an important and achievable goal. The evolution of computer-based data analysis creates an opportunity to develop software to manage an international congenital heart surgery database and eventually become an electronic medical record. The same database data set for congenital heart surgery is now being used in Europe and North America. Additional work is under way to involve Africa, Asia, Australia, and South America. The almost simultaneous publication of the European Association for Cardio-thoracic Surgery/Society of Thoracic Surgeons coding system and the Association for European Paediatric Cardiology coding system resulted in the potential for multiple coding. Representatives of the Association for European Paediatric Cardiology, Society of Thoracic Surgeons, European Association for Cardio-thoracic Surgery, and European Congenital Heart Surgeons Foundation agree that these hierarchical systems are complementary and not competitive. An international committee will map the two systems. The ideal coding system will permit a diagnosis or procedure to be coded only one time with mapping allowing this code to be used for patient care, billing, practice management, teaching, research, and reporting to governmental agencies. The benefits of international data gathering and sharing are global, with the long-term goal of the continued upgrade in the quality of congenital heart surgery worldwide. Copyright 2002 by W.B. Saunders Company
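    The one-code-with-mapping idea described above can be sketched as a crosswalk table: a diagnosis is coded once and then resolved into each hierarchical system on demand. The diagnosis keys and code strings below are invented placeholders, not actual EACTS/STS or AEPC identifiers.

```python
# Hypothetical nomenclature crosswalk: one entry per diagnosis,
# mapped to each coding system. All identifiers are illustrative.
crosswalk = {
    "tetralogy_of_fallot": {"sts_eacts": "STS-0101", "aepc": "AEPC-07.01"},
    "vsd_perimembranous":  {"sts_eacts": "STS-0205", "aepc": "AEPC-01.02"},
}

def map_code(diagnosis, system):
    """Return the code for `diagnosis` in the requested coding system.

    A single coded diagnosis can thereby serve patient care, billing,
    research, and reporting without re-coding.
    """
    return crosswalk[diagnosis][system]
```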

  4. EPA GHG certification of medium- and heavy-duty vehicles: Development of road grade profiles representative of US controlled access highways

    DOE PAGES

    Wood, Eric; Duran, Adam; Kelly, Kenneth

    2016-09-27

    In collaboration with the U.S. Environmental Protection Agency and the U.S. Department of Energy, the National Renewable Energy Laboratory has conducted a national analysis of road grade characteristics experienced by U.S. medium- and heavy-duty trucks on controlled access highways. These characteristics have been developed using TomTom's commercially available street map and road grade database. Using the TomTom national road grade database, national statistics on road grade and hill distances were generated for the U.S. network of controlled access highways. These statistical distributions were then weighted using data provided by the U.S. Environmental Protection Agency for activity of medium- and heavy-duty trucks on controlled access highways. Here, the national activity-weighted road grade and hill distance distributions were then used as targets for development of a handful of sample grade profiles potentially to be used in the U.S. Environmental Protection Agency's Greenhouse Gas Emissions Model certification tool as well as in dynamometer testing of medium- and heavy-duty vehicles and their powertrains.
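    The activity-weighting step can be illustrated as a weighted distribution: each highway segment's grade is weighted by the truck activity (e.g. vehicle-miles) on it. The grades and activity figures below are invented for the sketch, not values from the study.

```python
# Hypothetical highway segments: percent grade and truck activity.
segments = [
    {"grade_pct": 0.5, "truck_miles": 400},
    {"grade_pct": 2.0, "truck_miles": 100},
    {"grade_pct": 0.5, "truck_miles": 300},
    {"grade_pct": 4.0, "truck_miles": 200},
]

def weighted_grade_distribution(segs):
    """Fraction of truck activity experienced at each road grade."""
    total = sum(s["truck_miles"] for s in segs)
    dist = {}
    for s in segs:
        dist[s["grade_pct"]] = dist.get(s["grade_pct"], 0) + s["truck_miles"] / total
    return dist

dist = weighted_grade_distribution(segments)
```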

  5. EPA GHG certification of medium- and heavy-duty vehicles: Development of road grade profiles representative of US controlled access highways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Eric; Duran, Adam; Kelly, Kenneth

    In collaboration with the U.S. Environmental Protection Agency and the U.S. Department of Energy, the National Renewable Energy Laboratory has conducted a national analysis of road grade characteristics experienced by U.S. medium- and heavy-duty trucks on controlled access highways. These characteristics have been developed using TomTom's commercially available street map and road grade database. Using the TomTom national road grade database, national statistics on road grade and hill distances were generated for the U.S. network of controlled access highways. These statistical distributions were then weighted using data provided by the U.S. Environmental Protection Agency for activity of medium- and heavy-duty trucks on controlled access highways. Here, the national activity-weighted road grade and hill distance distributions were then used as targets for development of a handful of sample grade profiles potentially to be used in the U.S. Environmental Protection Agency's Greenhouse Gas Emissions Model certification tool as well as in dynamometer testing of medium- and heavy-duty vehicles and their powertrains.

  6. Updating National Topographic Data Base Using Change Detection Methods

    NASA Astrophysics Data System (ADS)

    Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time, and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects based on community volunteers, such as OSM, update their database every day through crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, the development of technologies in mapping, advances in image processing algorithms and computer vision, together with the development of digital aerial cameras with a NIR band and Very High Resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis, and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step toward updating the NTDB at the Survey of Israel.
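    One building block of the pipeline described above, Digital Surface Model analysis, can be sketched as differencing two DSM grids and flagging cells whose height change exceeds a threshold (e.g. a newly built structure). The grids and the 2.5 m threshold are illustrative assumptions, not values from the article.

```python
def detect_changes(dsm_old, dsm_new, threshold=2.5):
    """Return (row, col) cells where |height difference| > threshold.

    `dsm_old` and `dsm_new` are 2-D lists of surface heights in meters
    for the same area at two epochs; a real pipeline would also filter
    flagged cells by spectral classification and object shape.
    """
    flagged = []
    for i, (row_old, row_new) in enumerate(zip(dsm_old, dsm_new)):
        for j, (h_old, h_new) in enumerate(zip(row_old, row_new)):
            if abs(h_new - h_old) > threshold:
                flagged.append((i, j))
    return flagged

dsm_t1 = [[10.0, 10.1], [10.0, 10.2]]
dsm_t2 = [[10.0, 18.4], [10.1, 10.2]]  # a new structure at cell (0, 1)
changes = detect_changes(dsm_t1, dsm_t2)
```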

  7. Geographic information systems for mapping the National Exam Result of Junior High School in 2014 at West Java Province

    NASA Astrophysics Data System (ADS)

    Setiawan Abdullah, Atje; Nurani Ruchjana, Budi; Rejito, Juli; Rosadi, Rudi; Candra Permana, Fahmi

    2017-10-01

    The National Exam at the school level is implemented by the Ministry of Education and Culture for the development of education in Indonesia. The national examinations are centrally evaluated by the National Education Standards Agency, and the implementation of the national exams is expected to describe the success of education at the district, municipal, provincial, or national level. In this study, we evaluate, analyze, and explore the database of national exam results for Junior High Schools in 2014, with the Junior High School (SMP/MTs) as the smallest unit of analysis at the district level. The method used in this study is a data mining approach following the Knowledge Discovery in Databases (KDD) methodology, using descriptive analysis and spatial mapping of national exam results. The classification results of the data mining process applied to the 2014 Junior High School national exams, using data from 6,878 SMP/MTs in West Java, showed that 81.01% of schools were at a moderate level, while the spatial mapping showed that 36.99% of SMP/MTs in West Java were at an unfavorable level. The evaluation results are visualized graphically using ArcGIS to provide positional information on the quality of education at the municipal, provincial, or national level. The results of this study can be used by management to make decisions to improve educational services based on the national exam database in West Java. Keywords: KDD, spatial mapping, national exam.
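    The descriptive classification step can be sketched as binning each school's mean exam score into a level and reporting the share of schools per level. The score bands and example scores below are invented for illustration; the study's actual thresholds are not given in the abstract.

```python
def level(score):
    """Assign a qualitative level to a school's mean exam score
    (hypothetical bands, for illustration only)."""
    if score >= 85:
        return "high"
    if score >= 55:
        return "moderate"
    return "unfavorable"

def level_shares(scores):
    """Percentage of schools at each level."""
    shares = {}
    for s in scores:
        shares[level(s)] = shares.get(level(s), 0) + 1
    return {k: round(100.0 * v / len(scores), 2) for k, v in shares.items()}

school_means = [72.0, 60.5, 88.0, 40.2, 65.3]  # illustrative data
shares = level_shares(school_means)
```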

  8. Digital mapping techniques '06 - Workshop proceedings

    USGS Publications Warehouse

    Soller, David R.

    2007-01-01

    The Digital Mapping Techniques `06 (DMT`06) workshop was attended by more than 110 technical experts from 51 agencies, universities, and private companies, including representatives from 27 state geological surveys (see Appendix A of these Proceedings). This workshop was similar in nature to the previous nine meetings, which were held in Lawrence, Kansas (Soller, 1997), Champaign, Illinois (Soller, 1998), Madison, Wisconsin (Soller, 1999), Lexington, Kentucky (Soller, 2000), Tuscaloosa, Alabama (Soller, 2001), Salt Lake City, Utah (Soller, 2002), Millersville, Pennsylvania (Soller, 2003), Portland, Oregon (Soller, 2004), and Baton Rouge, Louisiana (Soller, 2005). This year's meeting was hosted by the Ohio Geological Survey, from June 11-14, 2006, on the Ohio State University campus in Columbus, Ohio. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, the latter of which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). 
The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database - and for the State and Federal geological surveys - to provide more high-quality digital maps to the public. At the 2006 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, "publishing" includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.

  9. KSC-99pp0331

    NASA Image and Video Library

    1999-03-22

    The Shuttle Radar Topography Mission (SRTM) sits uncovered inside the Multi-Payload Processing Facility. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth.

  10. KSC-99pp0326

    NASA Image and Video Library

    1999-03-24

    The vehicle carrying the Shuttle Radar Topography Mission (SRTM) arrives at the Multi-Payload Processing Facility. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth.

  11. KSC-99pp0327

    NASA Image and Video Library

    1999-03-24

    Inside the Multi-Payload Processing Facility, the lid covering the Shuttle Radar Topography Mission (SRTM) is lifted. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth.

  12. National Land Cover Database 2001 (NLCD01) Tile 2, Northeast United States: NLCD01_2

    USGS Publications Warehouse

    LaMotte, Andrew

    2008-01-01

    This 30-meter data set represents land use and land cover for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (see http://water.usgs.gov/GIS/browse/nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (http://www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004), (see: http://www.mrlc.gov/mrlc2k.asp). The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (see http://water.usgs.gov/GIS/browse/nlcd01-mappingzones.jpg), were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.

  13. National Land Cover Database 2001 (NLCD01) Tile 3, Southwest United States: NLCD01_3

    USGS Publications Warehouse

    LaMotte, Andrew

    2008-01-01

    This 30-meter data set represents land use and land cover for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (see http://water.usgs.gov/GIS/browse/nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (http://www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004), (see: http://www.mrlc.gov/mrlc2k.asp). The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (see http://water.usgs.gov/GIS/browse/nlcd01-mappingzones.jpg), were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.

  14. National Land Cover Database 2001 (NLCD01) Tile 1, Northwest United States: NLCD01_1

    USGS Publications Warehouse

    LaMotte, Andrew

    2008-01-01

    This 30-meter data set represents land use and land cover for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (see http://water.usgs.gov/GIS/browse/nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (http://www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004), (see: http://www.mrlc.gov/mrlc2k.asp). The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (see http://water.usgs.gov/GIS/browse/nlcd01-mappingzones.jpg), were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.

  15. National Land Cover Database 2001 (NLCD01) Tile 4, Southeast United States: NLCD01_4

    USGS Publications Warehouse

    LaMotte, Andrew

    2008-01-01

    This 30-meter data set represents land use and land cover for the conterminous United States for the 2001 time period. The data have been arranged into four tiles to facilitate timely display and manipulation within a Geographic Information System (see http://water.usgs.gov/GIS/browse/nlcd01-partition.jpg). The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (http://www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). One of the primary goals of the project is to generate a current, consistent, seamless, and accurate National Land Cover Database (NLCD) circa 2001 for the United States at medium spatial resolution. For a detailed definition and discussion on MRLC and the NLCD 2001 products, refer to Homer and others (2004), (see: http://www.mrlc.gov/mrlc2k.asp). The NLCD 2001 was created by partitioning the United States into mapping zones. A total of 68 mapping zones (see http://water.usgs.gov/GIS/browse/nlcd01-mappingzones.jpg), were delineated within the conterminous United States based on ecoregion and geographical characteristics, edge-matching features, and the size requirement of Landsat mosaics. Mapping zones encompass the whole or parts of several states. Questions about the NLCD mapping zones can be directed to the NLCD 2001 Land Cover Mapping Team at the USGS/EROS, Sioux Falls, SD (605) 594-6151 or mrlc@usgs.gov.

  16. Conversion of environmental data to a digital-spatial database, Puget Sound area, Washington

    USGS Publications Warehouse

    Uhrich, M.A.; McGrath, T.S.

    1997-01-01

    Data and maps from the Puget Sound Environmental Atlas, compiled for the U.S. Environmental Protection Agency, the Puget Sound Water Quality Authority, and the U.S. Army Corps of Engineers, have been converted into a digital-spatial database using a geographic information system. Environmental data for the Puget Sound area, collected from sources other than the Puget Sound Environmental Atlas by different Federal, State, and local agencies, also have been converted into this digital-spatial database. Background on the geographic-information-system planning process, the design and implementation of the geographic-information-system database, and the reasons for conversion to this digital-spatial database are included in this report. The Puget Sound Environmental Atlas data layers include information about seabird nesting areas, eelgrass and kelp habitat, marine mammal and fish areas, and shellfish resources and bed certification. Data layers from sources other than the Puget Sound Environmental Atlas include the Puget Sound shoreline, the water-body system, shellfish growing areas, recreational shellfish beaches, sewage-treatment outfalls, upland hydrography, watershed and political boundaries, and geographic names. The sources of data, descriptions of the data layers, and the steps and errors of processing associated with conversion to a digital-spatial database used in development of the Puget Sound Geographic Information System also are included in this report. The appendixes contain data dictionaries for each of the resource layers and error values for the conversion of Puget Sound Environmental Atlas data.

  17. A Conceptual Model and Database to Integrate Data and Project Management

    NASA Astrophysics Data System (ADS)

    Guarinello, M. L.; Edsall, R.; Helbling, J.; Evaldt, E.; Glenn, N. F.; Delparte, D.; Sheneman, L.; Schumaker, R.

    2015-12-01

    Data management is critically foundational to doing effective science in our data-intensive research era; done well, it can enhance collaboration, increase the value of research data, and support requirements by funding agencies to make scientific data and other research products available through publicly accessible online repositories. However, there are few examples (but see the Long-term Ecological Research Network Data Portal) of these data being provided in such a manner that allows exploration within the context of the research process: What specific research questions do these data seek to answer? What data were used to answer these questions? What data would have been helpful to answer these questions but were not available? We propose an agile conceptual model and database design, as well as example results, that integrate data management with project management, not only to maximize the value of research data products but to enhance collaboration during the project and the process of project management itself. In our project, which we call 'Data Map,' we used agile principles by adopting a user-focused approach and by designing our database to be simple, responsive, and expandable. We initially designed Data Map for the Idaho EPSCoR project "Managing Idaho's Landscapes for Ecosystem Services (MILES)" (see https://www.idahoecosystems.org//) and will present example results for this work. We consulted with our primary users (project managers, data managers, and researchers) to design the Data Map. Results will be useful to project managers and to funding agencies reviewing progress because they will readily provide answers to the questions "For which research projects/questions are data available and/or being generated by MILES researchers?" and "Which research projects/questions are associated with each of the 3 primary questions from the MILES proposal?" 
To be responsive to the needs of the project, we chose to streamline our design for the prototype database and build it in a way that is modular and can be changed or expanded to meet user needs. Our hope is that others, especially those managing large collaborative research grants, will be able to use our project model and database design to enhance the value of their project and data management both during and following the active research period.
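The question-to-dataset linkage the abstract describes can be sketched as a minimal relational design; the table and column names below are illustrative assumptions, not the actual MILES/Data Map schema:

```python
import sqlite3

# Hypothetical sketch of a question-to-dataset linkage table,
# in the spirit of the Data Map design described above.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE question (id INTEGER PRIMARY KEY, text TEXT);
CREATE TABLE dataset  (id INTEGER PRIMARY KEY, name TEXT, status TEXT);
CREATE TABLE question_dataset (
    question_id INTEGER REFERENCES question(id),
    dataset_id  INTEGER REFERENCES dataset(id)
);
""")
con.execute("INSERT INTO question VALUES (1, 'How does land use affect ecosystem services?')")
con.execute("INSERT INTO dataset VALUES (1, 'land_cover_2014', 'available')")
con.execute("INSERT INTO question_dataset VALUES (1, 1)")

# "For which research questions are data available?"
rows = con.execute("""
    SELECT q.text, d.name FROM question q
    JOIN question_dataset qd ON qd.question_id = q.id
    JOIN dataset d ON d.id = qd.dataset_id
    WHERE d.status = 'available'
""").fetchall()
print(rows)
```

The link table is what makes the design modular: new questions or datasets can be added without restructuring, which matches the expandability goal stated above.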

  18. An Investigation of Automatic Change Detection for Topographic Map Updating

    NASA Astrophysics Data System (ADS)

    Duncan, P.; Smit, J.

    2012-08-01

    Changes to the landscape are constantly occurring, and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting and capturing these changes. These manual methods are time consuming and labour intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to investigate a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The method investigated detects changes through image classification combined with spatial analysis, and is focussed on urban landscapes. The major data inputs to this study are high-resolution aerial imagery and existing topographic vector data. Initial results indicate that traditional pixel-based image classification approaches are unsatisfactory for large-scale land-use mapping and that object-oriented approaches hold more promise. Even with object-oriented image classification, however, generalisation of techniques on a broad scale has produced inconsistent results. A solution may lie in a hybrid approach combining pixel-based and object-oriented techniques.
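The contrast between pixel-based and object-oriented classification can be illustrated with a toy example; the thresholds, object-size rule, and use of `scipy.ndimage` are assumptions for illustration, not the CD: NGI workflow:

```python
import numpy as np
from scipy import ndimage

# Toy scene: a "new building" footprint plus sensor noise.
rng = np.random.default_rng(0)
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0                 # changed area
img += rng.normal(0, 0.3, img.shape)  # noise

# Pixel-based: threshold each pixel independently -> speckled result,
# since isolated noisy pixels also exceed the threshold.
pixel_mask = img > 0.5

# Object-oriented: group connected candidate pixels into objects,
# then keep only objects large enough to be plausible structures.
labels, n = ndimage.label(pixel_mask)
sizes = ndimage.sum(pixel_mask, labels, range(1, n + 1))
object_mask = np.isin(labels, 1 + np.flatnonzero(sizes >= 20))

print(pixel_mask.sum(), object_mask.sum())
```

The object step discards the salt-and-pepper false positives that make per-pixel results unsatisfactory, which is the intuition behind the abstract's preference for object-oriented and hybrid approaches.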

  19. Vegetation classification and distribution mapping report Mesa Verde National Park

    USGS Publications Warehouse

    Thomas, Kathryn A.; McTeague, Monica L.; Ogden, Lindsay; Floyd, M. Lisa; Schulz, Keith; Friesen, Beverly A.; Fancher, Tammy; Waltermire, Robert G.; Cully, Anne

    2009-01-01

    The classification and distribution mapping of the vegetation of Mesa Verde National Park (MEVE) and surrounding environment was achieved through a multi-agency effort between 2004 and 2007. The National Park Service’s Southern Colorado Plateau Network facilitated the team that conducted the work, which comprised the U.S. Geological Survey’s Southwest Biological Science Center, Fort Collins Research Center, and Rocky Mountain Geographic Science Center; Northern Arizona University; Prescott College; and NatureServe. The project team described 47 plant communities for MEVE, 34 of which were described from quantitative classification based on field-relevé data collected in 1993 and 2004. The team derived 13 additional plant communities from field observations during the photointerpretation phase of the project. The National Vegetation Classification Standard served as a framework for classifying these plant communities to the alliance and association level. Eleven of the 47 plant communities were classified as “park specials;” that is, plant communities with insufficient data to describe them as new alliances or associations. The project team also developed a spatial vegetation map database representing MEVE, with three different map-class schemas: base, group, and management map classes. The base map classes represent the finest level of spatial detail. Initial polygons were developed using Definiens Professional (at the time of our use, this software was called eCognition), assisted by interpretation of 1:12,000 true-color digital orthophoto quarter quadrangles (DOQQs). These polygons (base map classes) were labeled using manual photo interpretation of the DOQQs and 1:12,000 true-color aerial photography. Field visits verified interpretation concepts. 
The vegetation map database includes 46 base map classes, which consist of associations, alliances, and park specials classified with quantitative analysis, additional associations and park specials noted during photointerpretation, and non-vegetated land cover, such as infrastructure, land use, and geological land cover. The base map classes consist of 5,007 polygons in the project area. A field-based accuracy assessment of the base map classes showed overall accuracy to be 43.5%. Seven map classes comprise 89.1% of the park vegetated land cover. The group map classes represent aggregations of the base map classes, approximating the group level of the National Vegetation Classification Standard, version 2 (Federal Geographic Data Committee 2007), and reflecting physiognomy and floristics. Terrestrial ecological systems, as described by NatureServe (Comer et al. 2003), were used as the first approximation of the group level. The project team identified 14 group map classes for this project. The overall accuracy of the group map classes was determined using the same accuracy assessment data as for the base map classes. The overall accuracy of the group representation of vegetation was 80.3%. In consultation with park staff, the team developed management map classes, consisting of park-defined groupings of base map classes intended to represent a balance between maintaining required accuracy and providing a focus on vegetation of particular interest or import to park managers. The 23 management map classes had an overall accuracy of 73.3%. While the main products of this project are the vegetation classification and the vegetation map database, a number of ancillary digital geographic information system and database products were also produced that can be used independently or to augment the main products. These products include shapefiles of the locations of field-collected data and relational databases of field-collected data.

  20. The Seismotectonic Map of Africa

    NASA Astrophysics Data System (ADS)

    Meghraoui, Mustapha

    2015-04-01

    We present the Seismotectonic Map of Africa, based on a geological, geophysical, and geodetic database that includes the instrumental seismicity and a re-appraisal of large historical events, with harmonization and homogenization of earthquake parameters in catalogues. Although seismotectonic mapping of the African continent is a difficult task, several previous and ongoing projects provide a wealth of data and outstanding results. The database of large and moderate earthquakes in different geological domains includes the coseismic and Quaternary faulting that reveals the complex nature of active tectonics in Africa. The map also benefits from previous work on local and regional seismotectonic maps, which needed to be integrated into a continental framework together with the lithospheric and upper-mantle structures from tomographic anisotropy and gravity anomalies. The synthesis of earthquake and volcanic studies with the analysis of long-term (late Quaternary) and short-term (last decades and centuries) active deformation observed with geodetic and other approaches, presented along with the seismotectonic map, serves as a basis for hazard calculations and the reduction of seismic risk. The map may also be very useful in the assessment of seismic hazard and the mitigation of earthquake risk for significant infrastructure, with implications for the socio-economic development of Africa. In addition, the constant population increase and infrastructure growth in the continent, which exacerbate earthquake risk, justify the need for continuous updating of the seismotectonic map. The database and related map are prepared in the framework of IGCP Project 601 "Seismotectonics and Seismic Hazards in Africa" of UNESCO-IUGS, funded by the Swedish International Development Agency and UNESCO-Nairobi for a period of 4 years (2011 - 2014), extended to 2016. * Mustapha Meghraoui (Coordinator), EOST - IPG Strasbourg, CNRS-UMR 7516, m.meghraoui@unistra.fr, corresponding author. 
Paulina Amponsah (AECG, Accra), Abdelhakim Ayadi (CRAAG, Algiers), Atalay Ayele (Univ. Addis Ababa), Ateba Bekoa (Bueah Univ. Yaounde), Abdunnur Bensuleman (Tripoli Univ.), Damien Delvaux (MRAC-Tervuren), Mohamed El Gabry (NRIAG, Cairo), Rui-Manuel Fernandes (Beira Univ.), Vunganai Midzi & Magda Roos (CGS, Pretoria), Youssef Timoulali (Univ. Mohamed V, Rabat). Website: http://eost.u-strasbg.fr/igcp601/index.html

  1. Digital Mapping Techniques '05--Workshop Proceedings, Baton Rouge, Louisiana, April 24-27, 2005

    USGS Publications Warehouse

    Soller, David R.

    2005-01-01

    Introduction: The Digital Mapping Techniques '05 (DMT'05) workshop was attended by more than 100 technical experts from 47 agencies, universities, and private companies, including representatives from 25 state geological surveys (see Appendix A). This workshop was similar in nature to the previous eight meetings, held in Lawrence, Kansas (Soller, 1997), in Champaign, Illinois (Soller, 1998), in Madison, Wisconsin (Soller, 1999), in Lexington, Kentucky (Soller, 2000), in Tuscaloosa, Alabama (Soller, 2001), in Salt Lake City, Utah (Soller, 2002), in Millersville, Pennsylvania (Soller, 2003), and in Portland, Oregon (Soller, 2004). This year's meeting was hosted by the Louisiana Geological Survey, from April 24-27, 2005, on the Louisiana State University campus in Baton Rouge, Louisiana. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and to renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller and Berg, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database, and for the State and Federal geological surveys, to provide more high-quality digital maps to the public. 
At the 2005 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; 6) continued development of the National Geologic Map Database; and 7) progress toward building and implementing a standard geologic map data model and standard science language for the U.S. and for North America.

  2. KSC-99pp0313

    NASA Image and Video Library

    1999-03-23

    In the Multi-Payload Processing Facility, Mary Reaves (left) and Richard Rainen, with the Jet Propulsion Laboratory, check out the carrier and horizontal antenna mast for the STS-99 Shuttle Radar Topography Mission (SRTM). The SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during an 11-day mission in September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  3. KSC-99pp0503

    NASA Image and Video Library

    1999-05-07

    Inside the Space Station Processing Facility, the Shuttle Radar Topography Mission (SRTM) is maneuvered into place to prepare it for launch targeted for September 1999. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  4. KSC-99pp0312

    NASA Image and Video Library

    1999-03-23

    In the Multi-Payload Processing Facility, Beverly St. Ange, with the Jet Propulsion Laboratory, wires a biopod, a component of the STS-99 Shuttle Radar Topography Mission (SRTM). The SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during an 11-day mission in September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  5. KSC-99pp0330

    NASA Image and Video Library

    1999-03-24

    The Shuttle Radar Topography Mission (SRTM) sits inside the Multi-Payload Processing Facility after the SRTM's cover was removed. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  6. KSC-99pp0329

    NASA Image and Video Library

    1999-03-24

    Inside the Multi-Payload Processing Facility, the Shuttle Radar Topography Mission (SRTM) is revealed after the lid of its container was removed. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  7. KSC-99pp0328

    NASA Image and Video Library

    1999-03-24

    Inside the Multi-Payload Processing Facility, the lid covering the Shuttle Radar Topography Mission (SRTM) is lifted from the crate. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  8. KSC-99pp0502

    NASA Image and Video Library

    1999-05-07

    The Shuttle Radar Topography Mission (SRTM) is moved into the Space Station Processing Facility to prepare it for launch targeted for September 1999. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  9. Mapping the Rainforest of the Sea: Global Coral Reef Maps for Global Conservation

    NASA Technical Reports Server (NTRS)

    Robinson, Julie A.

    2006-01-01

    Coral reefs are the center of marine biodiversity, yet they are under threat, with an estimated 60% of coral reef habitats considered at risk by the World Resources Institute. The location and extent of the world's coral reefs are the basic information required for resource management and as a baseline for monitoring change. A NASA-sponsored partnership among remote sensing scientists, international agencies, and NGOs has developed a new generation of global reef maps based on data collected by satellites. The effort, dubbed the Millennium Coral Reef Map, aims to develop new methods for wide distribution of voluminous satellite data of use to the conservation and management communities. We discuss the tradeoffs among remote sensing data sources, mapping objectives, and the needs of conservation and resource management. SeaWiFS data were used to produce a composite global shallow bathymetry map at 1 km resolution. Landsat 7/ETM+ data acquisition plans were modified to collect global reefs, and new operational methods were designed to generate the first-ever global coral reef geomorphology map. We discuss the challenges encountered in building these databases and in implementing the geospatial data distribution strategies. Conservation applications include a new assessment of the distribution of the world's marine protected areas (UNEP-WCMC), improved spatial resolution in the Reefs at Risk analysis for the Caribbean (WRI), and a global basemap for the Census of Marine Life's OBIS database. The Millennium Coral Reef map and digital image archive will pay significant dividends for local and regional conservation projects around the globe. Complete details of the project are available at http://eol.jsc.nasa.gov/reefs.

  10. Rationale and operational plan to upgrade the U.S. gravity database

    USGS Publications Warehouse

    Hildenbrand, Thomas G.; Briesacher, Allen; Flanagan, Guy; Hinze, William J.; Hittelman, A.M.; Keller, Gordon R.; Kucks, R.P.; Plouff, Donald; Roest, Walter; Seeley, John; Stith, David A.; Webring, Mike

    2002-01-01

    A concerted effort is underway to prepare a substantially upgraded digital gravity anomaly database for the United States and to make this data set and associated usage tools available on the internet. This joint effort, spearheaded by the geophysics groups at the National Imagery and Mapping Agency (NIMA), University of Texas at El Paso (UTEP), U.S. Geological Survey (USGS), and National Oceanic and Atmospheric Administration (NOAA), is an outgrowth of the new geoscientific community initiative called Geoinformatics (www.geoinformaticsnetwork.org). This dominantly geospatial initiative reflects the realization by Earth scientists that existing information systems and techniques are inadequate to address many complex scientific and societal issues. Currently, inadequate standardization and chaotic distribution of geoscience data, inadequate accompanying documentation, and the lack of easy-to-use access tools and computer codes for analysis are major obstacles for scientists, government agencies, and educators. An example of the type of activities envisioned, within the context of Geoinformatics, is the construction, maintenance, and growth of a public-domain gravity database and development of the software tools needed to access, implement, and expand it. This product is far more than a high-quality database; it is a complete data system for a specific type of geophysical measurement that includes, for example, tools to manipulate the data and tutorials to understand and properly utilize the data. On August 9, 2002, twenty-one scientists from the federal, private, and academic sectors met at a workshop to discuss the rationale for upgrading both the United States and North American gravity databases (including offshore regions) and, more importantly, to begin developing an operational plan to effectively create a new gravity data system. We encourage anyone interested in contributing data or participating in this effort to contact G.R. Keller or T.G. Hildenbrand. 
This workshop was the first step in building a web-based data system for sharing quality gravity data and methodology, and it builds on existing collaborative efforts. This compilation effort will result in significant additions to and major refinement of the U.S. database currently released publicly by NOAA's National Geophysical Data Center, and will also include an additional objective to substantially upgrade the North American database, released over 15 years ago (Committee for the Gravity Anomaly Map of North America, 1987).
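As an example of the kind of processing tool such a gravity data system would bundle, here is a textbook simple-Bouguer reduction sketch. The function name and station values are hypothetical; the free-air (0.3086 mGal/m) and Bouguer-slab (0.04193 mGal per metre per g/cm³) gradients are standard textbook constants:

```python
# Simple Bouguer anomaly: observed gravity minus theoretical gravity,
# corrected for elevation (free-air) and for the rock slab between
# the station and the datum (Bouguer slab, default crustal density
# 2.67 g/cm^3). Values in mGal and metres.
def simple_bouguer_anomaly(g_obs_mgal, g_theor_mgal, elev_m, density=2.67):
    free_air = 0.3086 * elev_m             # free-air correction (mGal)
    slab = 0.04193 * density * elev_m      # Bouguer slab correction (mGal)
    return g_obs_mgal - g_theor_mgal + free_air - slab

# Hypothetical station at 1000 m elevation
print(round(simple_bouguer_anomaly(978500.0, 978700.0, 1000.0), 2))
```

A real data system would add terrain corrections and datum handling, but standardizing even this basic reduction across contributed datasets is exactly the kind of consistency problem the workshop addressed.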

  11. The Design and Product of National 1:1000000 Cartographic Data of Topographic Map

    NASA Astrophysics Data System (ADS)

    Wang, Guizhi

    2016-06-01

    The National Administration of Surveying, Mapping and Geoinformation launched the project of dynamic updating of the national fundamental geographic information database in 2012. Within this project, the 1:50000 database is updated once a year, and the 1:250000 database is downsized and updated in linkage with it. In 2014, using the latest achievements of the 1:250000 database, the 1:1000000 digital line graph database was comprehensively updated; at the same time, cartographic data of the topographic map and digital elevation model data were generated. This article mainly introduces the national 1:1000000 cartographic data of the topographic map, including feature content, database structure, database-driven mapping technology, and workflow.

  12. Stennis Space Center Environmental Geographic Information System

    NASA Technical Reports Server (NTRS)

    Lovely, Janette; Cohan, Tyrus

    2000-01-01

    As NASA's lead center for rocket propulsion testing, the John C. Stennis Space Center (SSC) monitors and assesses the off-site impacts of such testing through its Environmental Office (SSC-EO) using acoustical models and ancillary data. The SSC-EO has developed a geographical database, called the SSC Environmental Geographic Information System (SSC-EGIS), that covers an eight-county area bordering the NASA facility. Through the SSC-EGIS, the Environmental Office inventories, assesses, and manages the nearly 139,000 acres that comprise Stennis Space Center and its surrounding acoustical buffer zone. The SSC-EGIS contains in-house data as well as a wide range of data obtained from outside sources, including private agencies and local, county, state, and U.S. government agencies. The database comprises cadastral/geodetic, hydrology, infrastructure, geo-political, physical geography, and socio-economic vector and raster layers. The imagery contained in the database is varied, including low-resolution imagery, such as Landsat TM and SPOT; high-resolution imagery, such as IKONOS and AVIRIS; and aerial photographs. The SSC-EGIS has been an integral part of several major projects and is the model upon which similar EGISs will be developed for other NASA facilities. The Corps of Engineers utilized the SSC-EGIS in a plan to establish wetland mitigation sites within the SSC buffer zone. Mississippi State University employed the SSC-EGIS in a preliminary study to evaluate public access points within the buffer zone. The SSC-EO has also expressly used the SSC-EGIS for noise pollution modeling, land management/wetland mitigation assessment, environmental hazards mapping, and protected-areas mapping for archaeological sites and for threatened and endangered species habitats. The SSC-EO has several active and planned projects that will also make use of the SSC-EGIS during this and the coming fiscal year.

  13. Databases in the Central Government : State-of-the-art and the Future

    NASA Astrophysics Data System (ADS)

    Ohashi, Tomohiro

    The Management and Coordination Agency of the Prime Minister’s Office conducted a questionnaire survey of all Japanese Ministries and Agencies in November 1985 on the present status of databases produced, or planned to be produced, by the central government. According to the results, 132 databases had been produced by 19 Ministries and Agencies. Many of these databases were held by the Defence Agency, the Ministry of Construction, the Ministry of Agriculture, Forestry & Fisheries, and the Ministry of International Trade & Industry, in the fields of architecture & civil engineering, science & technology, R&D, agriculture, forestry, and fishery. However, only 39 percent of the produced databases were available to other Ministries and Agencies, while 60 percent were unavailable to them, being in-house databases and the like. This paper reports the outline of the survey results and introduces the databases produced by the central government under the headings of (1) databases commonly used by all Ministries and Agencies, (2) integrated databases, (3) statistical databases, and (4) bibliographic databases. Future problems are also described from the viewpoints of technology development and mutual use of databases.

  14. ShakeMap Atlas 2.0: an improved suite of recent historical earthquake ShakeMaps for global hazard analyses and loss model calibration

    USGS Publications Warehouse

    Garcia, D.; Mah, R.T.; Johnson, K.L.; Hearne, M.G.; Marano, K.D.; Lin, K.-W.; Wald, D.J.

    2012-01-01

    We introduce the second version of the U.S. Geological Survey ShakeMap Atlas, an openly available compilation of nearly 8,000 ShakeMaps of the most significant global earthquakes between 1973 and 2011. This revision of the Atlas includes: (1) a new version of the ShakeMap software that improves data usage and uncertainty estimation; (2) an updated earthquake source catalogue that includes regional locations and finite-fault models; (3) a refined strategy for selecting prediction and conversion equations, based on a new seismotectonic regionalization scheme; and (4) vastly more macroseismic intensity and ground-motion data from regional agencies. All these changes make the new Atlas a self-consistent, calibrated ShakeMap catalogue that constitutes an invaluable resource for investigating near-source strong ground motion, as well as for seismic hazard, scenario, risk, and loss-model development. To this end, the Atlas will provide a hazard base layer for PAGER loss calibration and for the Earthquake Consequences Database within the Global Earthquake Model initiative.

  15. AC-DCFS: a toolchain implementation to Automatically Compute Coulomb Failure Stress changes after relevant earthquakes.

    NASA Astrophysics Data System (ADS)

    Alvarez-Gómez, José A.; García-Mayordomo, Julián

    2017-04-01

    We present an automated, free-software-based toolchain that produces Coulomb Failure Stress change maps on fault planes of interest following the occurrence of a relevant earthquake. The system takes as input the focal mechanism of the event and an active-fault database for the region. From the focal mechanism, the orientations of the possible rupture planes, the location of the event, and the size of the earthquake are obtained. From the size of the earthquake, the dimensions of the rupture plane are derived by means of an algorithm based on empirical relations. Using the active-fault database of the area, the stress-receiving planes are obtained, and a verisimilitude index is assigned to the source plane chosen from the two nodal planes of the focal mechanism. The product is a series of layers, in a format compatible with any type of GIS (or a fully edited map in PDF format), showing the possible stress-change maps on the different families of fault planes present in the epicentral zone. Products of this type are generally presented in technical reports developed in the weeks following the event, or in scientific publications; however, they have proven useful for emergency management in the hours and days after a major event, as these stress changes are responsible for aftershocks, in addition to their use in mid-term earthquake forecasting. The automation of the calculation allows its incorporation into the products generated by alert and surveillance agencies shortly after an earthquake occurs. It is now being implemented at the Spanish Geological Survey as one of the products that the agency would provide after the occurrence of relevant seismic series in Spain.
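The magnitude-to-rupture-dimension step can be sketched with empirical scaling relations in the style of Wells and Coppersmith (1994). The abstract does not specify which relations the toolchain uses, and the coefficients below (all-slip-type subsurface rupture length and downdip width regressions, quoted from memory) should be checked against the original regression tables before any real use:

```python
# Empirical magnitude-to-rupture-size scaling, Wells & Coppersmith
# (1994) style: log10(dimension) = a + b * Mw.
def rupture_dimensions_km(mw):
    length = 10 ** (-2.44 + 0.59 * mw)  # subsurface rupture length (km)
    width = 10 ** (-1.01 + 0.32 * mw)   # downdip rupture width (km)
    return length, width

# Hypothetical Mw 6.5 event
L, W = rupture_dimensions_km(6.5)
print(round(L, 1), round(W, 1))
```

With length and width in hand, the toolchain can build a finite source rectangle around the hypocenter for each nodal plane and compute stress changes on the receiver faults from the regional database.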

  16. Translation from the collaborative OSM database to cartography

    NASA Astrophysics Data System (ADS)

    Hayat, Flora

    2018-05-01

    The OpenStreetMap (OSM) database includes original items that are very useful for geographical analysis and for creating thematic maps. Contributors record in the open database various themes regarding amenities, leisure, transport, buildings, and boundaries. The Michelin mapping department develops map prototypes to test the feasibility of mapping based on OSM. A research project is in development to translate the OSM database structure into a database structure that fits Michelin graphic guidelines; it aims at defining the right structure for Michelin's uses. The research project relies on the analysis of semantic and geometric heterogeneities in OSM data. To that end, Michelin implements methods to transform the input geographical database into a cartographic image dedicated to specific uses (routing and tourist maps). The paper focuses on the mapping tools available to produce a personalised spatial database. Based on the processed data, paper and Web maps can be displayed. Two prototypes are described in this article: a vector-tile web map and a mapping method to produce paper maps at a regional scale. The vector-tile mapping method offers easy navigation within the map and within graphic and thematic guidelines. Paper maps can be partly automatically drawn; the drawing automation and data management are part of the map creation, as is the final hand-drawing phase. Both prototypes have been set up using the OSM technical ecosystem.
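The tag-translation step described above can be sketched as a simple rule table. The OSM keys and values below are real tagging conventions, but the target map classes and the rule set are illustrative assumptions, not Michelin's actual schema:

```python
# Ordered rules translating heterogeneous OSM tags into a fixed
# set of cartographic classes; first match wins.
STYLE_RULES = [
    # (OSM key, OSM value, target map class)
    ("highway", "motorway", "road_major"),
    ("highway", "residential", "road_minor"),
    ("tourism", "museum", "poi_tourism"),
    ("amenity", "restaurant", "poi_food"),
]

def classify(tags):
    """Return the first matching map class for an OSM tag dict."""
    for key, value, map_class in STYLE_RULES:
        if tags.get(key) == value:
            return map_class
    return "unclassified"

print(classify({"highway": "motorway", "ref": "A10"}))  # road_major
print(classify({"building": "yes"}))                    # unclassified
```

The "unclassified" fallback is where the semantic heterogeneity of OSM shows up in practice: features with rare or inconsistent tags must either be caught by additional rules or deliberately dropped from the map.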

  17. Geologic map and digital database of the Cougar Buttes 7.5' quadrangle, San Bernardino County, California

    USGS Publications Warehouse

    Powell, R.E.; Matti, J.C.; Cossette, P.M.

    2000-01-01

    The Southern California Areal Mapping Project (SCAMP) of Geologic Division has undertaken regional geologic mapping investigations in the Lucerne Valley area co-sponsored by the Mojave Water Agency and the San Bernardino National Forest. These investigations span the Lucerne Valley basin from the San Bernardino Mountains front northward to the basin axis on the Mojave Desert floor, and from the Rabbit Lake basin east to the Old Woman Springs area. Quadrangles mapped include the Cougar Buttes 7.5' quadrangle, the Lucerne Valley 7.5' quadrangle (Matti and others, in preparation b), the Fawnskin 7.5' quadrangle (Miller and others, 1998), and the Big Bear City 7.5' quadrangle (Matti and others, in preparation a). The Cougar Buttes quadrangle has been mapped previously at scales of 1:62,500 (Dibblee, 1964) and 1:24,000 (Shreve, 1958, 1968; Sadler, 1982a). In line with the goals of the National Cooperative Geologic Mapping Program (NCGMP), our mapping of the Cougar Buttes quadrangle has been directed toward generating a multipurpose digital geologic map database. Guided by the mapping of previous investigators, we have focused on improving our understanding and representation of late Pliocene and Quaternary deposits. In cooperation with the Water Resources Division of the U.S. Geological Survey, we have used our mapping in the Cougar Buttes and Lucerne Valley quadrangles together with well log data to construct cross-sections of the Lucerne Valley basin (R.E. Powell, unpublished data, 1996-1998) and to develop a hydrogeologic framework for the basin. Currently, our mapping in these two quadrangles also is being used as a base for studying soils on various Quaternary landscape surfaces on the San Bernardino piedmont (Eppes and others, 1998). 
In the Cougar Buttes quadrangle, we have endeavored to represent the surficial geology in a way that provides a base suitable for ecosystem assessment, an effort that has entailed differentiating surficial veneers on piedmont and pediment surfaces and distinguishing the various substrates found beneath these veneers.

  18. LOINC, a universal standard for identifying laboratory observations: a 5-year update.

    PubMed

    McDonald, Clement J; Huff, Stanley M; Suico, Jeffrey G; Hill, Gilbert; Leavelle, Dennis; Aller, Raymond; Forrey, Arden; Mercer, Kathy; DeMoor, Georges; Hook, John; Williams, Warren; Case, James; Maloney, Pat

    2003-04-01

    The Logical Observation Identifier Names and Codes (LOINC) database provides a universal code system for reporting laboratory and other clinical observations. Its purpose is to identify observations in electronic messages such as Health Level Seven (HL7) observation messages, so that when hospitals, health maintenance organizations, pharmaceutical manufacturers, researchers, and public health departments receive such messages from multiple sources, they can automatically file the results in the right slots of their medical records, research, and/or public health systems. For each observation, the database includes a code (of which 25 000 are laboratory test observations), a long formal name, a "short" 30-character name, and synonyms. The database comes with a mapping program called Regenstrief LOINC Mapping Assistant (RELMA(TM)) to assist the mapping of local test codes to LOINC codes and to facilitate browsing of the LOINC results. Both LOINC and RELMA are available at no cost from http://www.regenstrief.org/loinc/. The LOINC medical database carries records for >30 000 different observations. LOINC codes are being used by large reference laboratories and federal agencies, e.g., the CDC and the Department of Veterans Affairs, and are part of the Health Insurance Portability and Accountability Act (HIPAA) attachment proposal. Internationally, they have been adopted in Switzerland, Hong Kong, Australia, and Canada, and by the German national standards organization, the Deutsches Institut für Normung. Laboratories should include LOINC codes in their outbound HL7 messages so that clinical and research clients can easily integrate these results into their clinical and research repositories. Laboratories should also encourage instrument vendors to deliver LOINC codes in their instrument outputs and demand LOINC codes in HL7 messages they get from reference laboratories to avoid the need to lump so many referral tests under the "send out lab" code.
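    The local-to-LOINC translation this enables can be pictured as a simple lookup. The sketch below is illustrative only: the two LOINC codes are real published examples, but the local codes and the helper function are hypothetical; in practice the mapping is built with a tool such as RELMA against the full LOINC database.

```python
# Illustrative local-code-to-LOINC lookup. The LOINC codes are real
# (2345-7 = Glucose [Mass/volume] in Serum or Plasma; 718-7 = Hemoglobin
# [Mass/volume] in Blood), but the local codes are made up for this sketch.

LOCAL_TO_LOINC = {
    "GLU": "2345-7",  # Glucose [Mass/volume] in Serum or Plasma
    "HGB": "718-7",   # Hemoglobin [Mass/volume] in Blood
}

def to_loinc(local_code):
    """Translate a local lab test code to its LOINC code, or None if unmapped."""
    return LOCAL_TO_LOINC.get(local_code)

# With such a table in place, an outbound HL7 OBX segment can carry the
# LOINC code instead of (or alongside) the sender's private code.
print(to_loinc("GLU"))
```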

  19. AirBase - A database of 160,000 aerial photos of Greenland 1930-1980s

    NASA Astrophysics Data System (ADS)

    Korsgaard, Niels; Weng, Willy L.; Kjær, Kurt H.

    2017-04-01

    Beginning in the 1930s, Danish survey agencies and US military organizations conducted large-scale aerial photograph surveys of Greenland for mapping purposes (1), resulting in the recording of more than 160,000 photographs. Glaciological researchers have drawn on this remarkable resource of multi-decadal observations of the Greenlandic cryosphere for many decades (e.g. (2), (3), (4)). In recent years this information has been synthesized with modern remote sensing data, resulting in a range of published research and data sets ((5), (6), (7), (8)). Today, the historical aerial photographs are stored at the SDFE (Agency for Data Supply and Efficiency), the successor agency to the institutions that surveyed and mapped Greenland, where the material is accessible to researchers and the general public alike. The digitized flight line maps and databases necessary for the creation of these data were made available by the SDFE, and it is the past and present work with this database that we present here. Based on digitized flight line maps, the database contains geocoded metadata such as recording dates, camera and film roll canister, connecting the database with the analog archive material. Past work concentrated on bulk digitization, while the focus of the current work is to improve positional accuracy and completeness and to refine the database for web publication. (1) Nielsen, A., Olsen, J. & Weng, W. L. Grønlands opmåling og kortlægning. Landinspektøren 37 (1995). (2) Weidick, A. Frontal variations at Upernaviks Isstrøm in the last 100 years. Medd. fra Dansk Geol. Forening 14 (1958). (3) Bauer, A., Baussart, M., Carbonnell, M., Kasser, P., Perroud, P. & Renaud, A. Missions aériennes de reconnaissance au Groenland 1957-1958. Observations aériennes et terrestres, exploitation des photographies aériennes, détermination des vitesses des glaciers vêlant dans Disko Bugt et Umanak Fjord. Meddelelser om Grønland 173(3) (1968a). (4) Rignot, E., Box, J.E., Burgess, E. & Hanna, E.
Mass balance of the Greenland ice sheet from 1958 to 2007. Geophys. Res. Lett. (2008). (5) Kjær, K.H., Khan, S.A., Korsgaard, N.J., Wahr, J., Bamber, J.L., Hurkmans, R., van den Broeke, M., Timm, L.H., Kjeldsen, K.K., Bjørk, A.A., Larsen, N.K., Jørgensen, L.T., Færch-Jensen, A. & Willerslev, E. Aerial Photographs Reveal Late-20th-Century Dynamic Ice Loss in Northwestern Greenland. Science 337 (2012). (6) Bjørk, A.A., Kjær, K.H., Korsgaard, N.J., Khan, S.A., Kjeldsen, K.K., Andresen, C.S., Box, J, Larsen, N.K. & Funder, S.V. An aerial view of 80 years of climate-related glacier fluctuations in southeast Greenland. Nat. Geosci. 5 (2012). (7) Kjeldsen, K.K., Korsgaard, N.J., Bjørk, A.A., Khan, S.A., Box, J.E., Funder, S., Larsen, N.K., Bamber, J.L., Colgan, W., van den Broeke, M., Siggaard-Andersen, M.-L., Nuth, C., Schomacker, A., Andresen, C.S., Willerslev, E. & Kjær, K.H. Spatial and temporal distribution of mass loss from the Greenland Ice Sheet since AD 1900. Nature 528 (2015). (8) Korsgaard, N.J., Nuth, C., Khan, S.A., Kjeldsen, K.K., Bjørk, A.A., Schomacker A. & Kjær, K.H. Digital elevation model and orthophotographs of Greenland based on aerial photographs from 1978-1987. Sci. Data 3:160032 (2016).

  20. Preliminary surficial geologic map database of the Amboy 30 x 60 minute quadrangle, California

    USGS Publications Warehouse

    Bedford, David R.; Miller, David M.; Phelps, Geoffrey A.

    2006-01-01

    The surficial geologic map database of the Amboy 30x60 minute quadrangle presents characteristics of surficial materials for an area of approximately 5,000 km2 in the eastern Mojave Desert of California. This map consists of new surficial mapping conducted between 2000 and 2005, as well as compilations of previous surficial mapping. Surficial geology units are mapped and described based on depositional process and age categories that reflect the mode of deposition, pedogenic effects occurring post-deposition, and, where appropriate, the lithologic nature of the material. The physical properties recorded in the database focus on those that drive hydrologic, biologic, and physical processes, such as particle size distribution (PSD) and bulk density. This version of the database is distributed with point data representing locations of samples for both laboratory-determined physical properties and semi-quantitative field-based information. Future publications will include the field and laboratory data as well as maps of distributed physical properties across the landscape tied to physical process models where appropriate. The database is distributed in three parts: documentation, spatial map-based data, and printable map graphics of the database. Documentation includes this file, which provides a discussion of the surficial geology and describes the format and content of the map data; a database 'readme' file, which describes the database contents; and FGDC metadata for the spatial map information. Spatial data are distributed as an Arc/Info coverage in ESRI interchange (e00) format, or as tabular data in DBF3 (.DBF) format. Map graphics files are distributed as PostScript and Adobe Portable Document Format (PDF) files, and are appropriate for representing a view of the spatial database at the mapped scale.

  1. Geologic map and map database of parts of Marin, San Francisco, Alameda, Contra Costa, and Sonoma counties, California

    USGS Publications Warehouse

    Blake, M.C.; Jones, D.L.; Graymer, R.W.; digital database by Soule, Adam

    2000-01-01

    This digital map database, compiled from previously published and unpublished data, and new mapping by the authors, represents the general distribution of bedrock and surficial deposits in the mapped area. Together with the accompanying text file (mageo.txt, mageo.pdf, or mageo.ps), it provides current information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:62,500 or smaller.

  2. STS-99 Crew Interviews: Janet L. Kavandi

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This NASA JSC video release is one in a series of space shuttle astronaut interviews and was recorded Aug. 9, 1999. Mission Specialist Janet L. Kavandi, Ph.D., provides answers to questions regarding her role in the Shuttle Radar Topography Mission (SRTM), mission objectives, which center on the three-dimensional mapping of the entire Earth's surface, shuttle imaging radar, payload mast deploy and retraction, data recording vs. downlinking, the fly cast maneuver, applications of recorded data, international participation (DLR), the National Imaging and Mapping Agency (NIMA), and EarthCam (educational middle school project). The interview is summed up by Dr. Kavandi explaining that the mission's objective, if successful, will result in the most complete high-resolution digital topographic database of the Earth.

  3. KSC-99pp0311

    NASA Image and Video Library

    1999-03-23

    In the Multi-Payload Processing Facility, Mary Reaves and Richard Rainen, with the Jet Propulsion Laboratory, work on the carrier and horizontal antenna mast for the STS-99 Shuttle Radar Topography Mission (SRTM) while Larry Broms watches. The SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during an 11-day mission in September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth.

  4. KSC-99pp0505

    NASA Image and Video Library

    1999-05-07

    In the Space Station Processing Facility (SSPF), workers (lower right) disconnect the transport vehicle from the Shuttle Radar Topography Mission (SRTM) after moving it into the building for pre-launch preparations. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission targeted for launch in September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth.

  5. Preliminary geologic map of the Oat Mountain 7.5' quadrangle, Southern California: a digital database

    USGS Publications Warehouse

    Yerkes, R.F.; Campbell, Russell H.

    1995-01-01

    This database, identified as "Preliminary Geologic Map of the Oat Mountain 7.5' Quadrangle, southern California: A Digital Database," has been approved for release and publication by the Director of the USGS. Although this database has been reviewed and is substantially complete, the USGS reserves the right to revise the data pursuant to further analysis and review. This database is released on condition that neither the USGS nor the U. S. Government may be held liable for any damages resulting from its use. This digital map database is compiled from previously published sources combined with some new mapping and modifications in nomenclature. The geologic map database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U. S. Geological Survey. For detailed descriptions of the units, their stratigraphic relations and sources of geologic mapping consult Yerkes and Campbell (1993). More specific information about the units may be available in the original sources.

  6. Performance analysis of different database in new internet mapping system

    NASA Astrophysics Data System (ADS)

    Yao, Xing; Su, Wei; Gao, Shuai

    2017-03-01

    In the Mapping System of the New Internet, massive numbers of mapping entries between AID and RID need to be stored, added, updated, and deleted. To cope with a large volume of mapping-entry update and query requests, the Mapping System must use a high-performance database. In this paper, we focus on the performance of three typical databases: Redis, SQLite, and MySQL. The results show that a Mapping System based on different databases can adapt to different needs according to the actual situation.
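    The store/add/update/delete/query workload described above can be sketched against one of the three databases compared, SQLite. This is a minimal illustration, not the paper's benchmark code; the AID/RID values are made up.

```python
# Minimal AID-to-RID mapping store backed by SQLite (one of the three
# databases the paper compares). Illustrative sketch, not benchmark code.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mapping (aid TEXT PRIMARY KEY, rid TEXT NOT NULL)")

def upsert(aid, rid):
    """Add a new mapping entry, or update it if the AID is already registered."""
    conn.execute(
        "INSERT INTO mapping (aid, rid) VALUES (?, ?) "
        "ON CONFLICT(aid) DO UPDATE SET rid = excluded.rid",
        (aid, rid),
    )

def lookup(aid):
    """Resolve an AID to its current RID, or None if absent."""
    row = conn.execute("SELECT rid FROM mapping WHERE aid = ?", (aid,)).fetchone()
    return row[0] if row else None

def delete(aid):
    """Remove a mapping entry."""
    conn.execute("DELETE FROM mapping WHERE aid = ?", (aid,))

upsert("aid-001", "rid-A")
upsert("aid-001", "rid-B")  # re-registration updates the RID
print(lookup("aid-001"))
```

    Swapping the backend for Redis (a hash keyed by AID) or MySQL would keep the same four operations, which is what makes the three databases directly comparable.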

  7. Intrusive Rock Database for the Digital Geologic Map of Utah

    USGS Publications Warehouse

    Nutt, C.J.; Ludington, Steve

    2003-01-01

    Digital geologic maps offer the promise of rapid and powerful answers to geologic questions using Geographic Information System software (GIS). Using modern GIS and database methods, a specialized derivative map can be easily prepared. An important limitation can be shortcomings in the information provided in the database associated with the digital map, a database which is often based on the legend of the original map. The purpose of this report is to show how the compilation of additional information can, when prepared as a database that can be used with the digital map, be used to create some types of derivative maps that are not possible with the original digital map and database. This Open-File Report consists of computer files with information about intrusive rocks in Utah that can be linked to the Digital Geologic Map of Utah (Hintze and others, 2000), an explanation of how to link the databases and map, and a list of references for the databases. The digital map, which represents the 1:500,000-scale Geologic Map of Utah (Hintze, 1980), can be obtained from the Utah Geological Survey (Map 179DM). Each polygon in the map has a unique identification number. We selected the polygons identified on the geologic map as intrusive rock, and constructed a database (UT_PLUT.xls) that classifies the polygons into plutonic map units (see tables). These plutonic map units are the key information that is used to relate the compiled information to the polygons on the map. The map includes a few polygons that were coded as intrusive on the state map but are largely volcanic rock; in these cases we note the volcanic rock names (rhyolite and latite) as used in the original sources. Some polygons identified on the digital state map as intrusive rock were misidentified; these polygons are noted in a separate table of the database, along with some information about their true character.
Fields may be empty because of lack of information from references used or difficulty in finding information. The information in the database is from a variety of sources, including geologic maps at scales ranging from 1:500,000 to 1:24,000, and thesis monographs. The references are shown twice: alphabetically and by region. The digital geologic map of Utah (Hintze and others, 2000) classifies intrusive rocks into only 3 categories, distinguished by age. They are: Ti, Tertiary intrusive rock; Ji, Upper to Middle Jurassic granite to quartz monzonite; and pCi, Early Proterozoic to Late Archean intrusive rock. Use of the tables provided in this report will permit selection and classification of those rocks by lithology and age. This database is a pilot study by the Survey and Analysis Project of the U.S. Geological Survey to characterize igneous rocks and link them to a digital map. The database, and others like it, will evolve as the project continues and other states are completed. We release this version now as an example, as a reference, and for those interested in Utah plutonic rocks.

  8. Geographic Disparities in Access to Agencies Providing Income-Related Social Services.

    PubMed

    Bauer, Scott R; Monuteaux, Michael C; Fleegler, Eric W

    2015-10-01

    Geographic location is an important factor in understanding disparities in access to health-care and social services. The objective of this cross-sectional study is to evaluate disparities in the geographic distribution of income-related social service agencies relative to populations in need within Boston. Agency locations were obtained from a comprehensive database of social services in Boston. Geographic information systems mapped the spatial relationship of the agencies to the population using point density estimation, which was compared to census population data. A multivariate logistic regression was conducted to evaluate factors associated with categories of income-related agency density. Median agency density within census block groups ranged from 0 to 8 agencies per square mile per 100 population below the federal poverty level (FPL). Thirty percent (n = 31,810) of persons living below the FPL have no access to income-related social services within 0.5 miles, and 77% of persons living below the FPL (n = 83,022) have access to 2 or fewer agencies. 27.0% of Blacks, 30.1% of Hispanics, and 41.0% of non-Hispanic Whites with incomes below the FPL have zero access. In conclusion, some neighborhoods in Boston with a high concentration of low-income populations have limited access to income-related social service agencies.
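    The access measure used above ("no agencies within 0.5 miles") can be sketched directly. This is a hedged illustration, not the study's method: the study used GIS point density estimation, and the coordinates below are invented placeholders near downtown Boston.

```python
# Sketch of a 0.5-mile access count using great-circle distance.
# Coordinates are hypothetical; the study itself used GIS point density
# estimation rather than this per-point computation.
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * asin(sqrt(a))  # mean Earth radius ~3958.8 miles

def agencies_within(point, agencies, radius_miles=0.5):
    """Count agencies within radius_miles of a location."""
    return sum(
        1 for a in agencies
        if haversine_miles(point[0], point[1], a[0], a[1]) <= radius_miles
    )

# Hypothetical agency locations and one residence near downtown Boston.
agencies = [(42.3601, -71.0589), (42.3656, -71.0620), (42.3300, -71.0900)]
home = (42.3600, -71.0580)
print(agencies_within(home, agencies))
```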

  9. Development and characterization of a 3D high-resolution terrain database

    NASA Astrophysics Data System (ADS)

    Wilkosz, Aaron; Williams, Bryan L.; Motz, Steve

    2000-07-01

    A top-level description of methods used to generate elements of a high resolution 3D characterization database is presented. The database elements are defined as ground plane elevation map, vegetation height elevation map, material classification map, discrete man-made object map, and temperature radiance map. The paper will cover data collection by means of aerial photography, techniques of soft photogrammetry used to derive the elevation data, and the methodology followed to generate the material classification map. The discussion will feature the development of the database elements covering Fort Greely, Alaska. The developed databases are used by the US Army Aviation and Missile Command to evaluate the performance of various missile systems.

  10. Preliminary geologic map of the Piru 7.5' quadrangle, southern California: a digital database

    USGS Publications Warehouse

    Yerkes, R.F.; Campbell, Russell H.

    1995-01-01

    This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. This digital map database is compiled from previously published sources combined with some new mapping and modifications in nomenclature. The geologic map database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U. S. Geological Survey. For detailed descriptions of the units, their stratigraphic relations and sources of geologic mapping consult Yerkes and Campbell (1995). More specific information about the units may be available in the original sources.

  11. Geologic map and map database of the Palo Alto 30' x 60' quadrangle, California

    USGS Publications Warehouse

    Brabb, E.E.; Jones, D.L.; Graymer, R.W.

    2000-01-01

    This digital map database, compiled from previously published and unpublished data, and new mapping by the authors, represents the general distribution of bedrock and surficial deposits in the mapped area. Together with the accompanying text file (pamf.ps, pamf.pdf, pamf.txt), it provides current information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:62,500 or smaller.

  12. Geologic map and map database of western Sonoma, northernmost Marin, and southernmost Mendocino counties, California

    USGS Publications Warehouse

    Blake, M.C.; Graymer, R.W.; Stamski, R.E.

    2002-01-01

    This digital map database, compiled from previously published and unpublished data, and new mapping by the authors, represents the general distribution of bedrock and surficial deposits in the mapped area. Together with the accompanying text file (wsomf.ps, wsomf.pdf, wsomf.txt), it provides current information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:62,500 or smaller.

  13. Analysis of national and regional landslide inventories in Europe

    NASA Astrophysics Data System (ADS)

    Hervás, J.; Van Den Eeckhaut, M.

    2012-04-01

    A landslide inventory can be defined as a detailed register of the distribution and characteristics of past landslides in an area. Today most landslide inventories have the form of digital databases including landslide distribution maps and associated alphanumeric information for each landslide. While landslide inventories are of the utmost importance for land use planning and risk management through the generation of landslide zonation (susceptibility, hazard and risk) maps, landslide databases are thought to greatly differ from one country to another and often also within the same country. This hampers the generation of comparable, harmonised landslide zonation maps at national and continental scales, which is needed for policy and decision making at EU level as envisaged, for instance, in the INSPIRE Directive and the Thematic Strategy for Soil Protection. In order to have a clear understanding of the landslide inventories available in Europe and their potential to produce landslide zonation maps, as well as to draw recommendations to improve harmonisation and interoperability between landslide databases, we have surveyed 37 countries. In total, information has been collected and analysed for 24 national databases in 22 countries (Albania, Andorra, Austria, Bosnia and Herzegovina, Bulgaria, Czech Republic, Former Yugoslav Republic of Macedonia, France, Greece, Hungary, Iceland, Ireland, Italy, Norway, Poland, Portugal, Slovakia, Slovenia, Spain, Sweden, Switzerland and UK) and 22 regional databases in 10 countries. At the moment, over 633,000 landslides are recorded in national databases, representing on average less than 50% of the landslides estimated to have occurred in these countries. The sample of regional databases included over 103,000 landslides, with an estimated completeness substantially higher than that of national databases, as more attention can be paid to data collection over smaller regions.
Yet, both for national and regional coverage, the data collection methods only occasionally included advanced technologies such as remote sensing. With regard to the inventory maps of most databases, the analysis illustrates the high variability of scales (between 1:10 000 and 1:1 M for national inventories, and from 1:10 000 to 1:25 000 for regional inventories), landslide classification systems and representation symbology. It also shows the difficulty of precisely locating landslides that are referred to only in historical documents. In addition, information on landslide magnitude, geometrical characteristics and age reported in national and regional databases greatly differs, even within the same database, as it strongly depends on the objectives of the database, the data collection methods used, the resources employed and the remaining landslide expression. In particular, landslide initiation and/or reactivation dates are generally estimated in less than 25% of records, thus making hazard and hence risk assessment difficult. In most databases, scarce information on landslide impact (damage and casualties) further hinders risk assessment at regional and national scales. Estimated landslide activity, which is very relevant to early warning and emergency management, is only included in half of the national databases and restricted to part of the landslides registered. Moreover, the availability of this information is not substantially higher in regional databases than in national ones. Most landslide databases further included information on geo-environmental characteristics at the landslide site, which is very important for landslide zonation modelling. Although a number of national and regional agencies provide free web-GIS visualisation services, the potential of existing landslide databases is often not fully exploited as, in many cases, access by the general public and external researchers is restricted.
Additionally, the availability of information only in the national or local language is common to most national and regional databases, thus hampering consultation for most foreigners. Finally, some suggestions for a minimum set of attributes to be collected and made available by European countries for building up a continental landslide database in support of EU policies are presented. This study has been conducted in the framework of the EU-FP7 project SafeLand (Grant Agreement 22647).
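    The "minimum set of attributes" idea suggested above can be pictured as a record schema. The sketch below is purely hypothetical; the field names reflect the attribute categories discussed (location, type, date, activity, impact) and are not an actual INSPIRE or SafeLand specification.

```python
# Hypothetical minimal harmonised landslide record. Field names are
# illustrative of the attribute categories discussed, not a real standard.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LandslideRecord:
    record_id: str
    lat: float
    lon: float
    landslide_type: str               # e.g. "rotational slide", "debris flow"
    date: Optional[str] = None        # initiation/reactivation date, often unknown
    activity: Optional[str] = None    # e.g. "active", "dormant"
    casualties: Optional[int] = None  # impact information, frequently missing

def date_completeness(records):
    """Fraction of records with a known initiation/reactivation date."""
    if not records:
        return 0.0
    return sum(1 for r in records if r.date is not None) / len(records)

records = [
    LandslideRecord("LS-1", 46.1, 7.2, "debris flow", date="1993-09-24"),
    LandslideRecord("LS-2", 45.9, 7.5, "rotational slide"),
    LandslideRecord("LS-3", 46.0, 7.3, "rock fall", activity="dormant"),
]
print(date_completeness(records))
```

    A completeness measure like this makes the "less than 25% of records have dates" finding directly computable on any harmonised inventory.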

  14. Enhancements to Demilitarization Process Maps Program (ProMap)

    DTIC Science & Technology

    2016-10-14

    map tool, ProMap, was improved by implementing new features, and sharing data with MIDAS and AMDIT databases. Specifically, process efficiency was...improved by 1) providing access to APE information contained in the AMDIT database directly from inside ProMap when constructing a process map, 2...what equipment can be efficiently used to demil a particular munition. Associated with this task was the upgrade of the AMDIT database so that

  15. Quaternary Geology and Liquefaction Susceptibility, San Francisco, California 1:100,000 Quadrangle: A Digital Database

    USGS Publications Warehouse

    Knudsen, Keith L.; Noller, Jay S.; Sowers, Janet M.; Lettis, William R.

    1997-01-01

    This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There are no paper maps included in the Open-File report. The report does include, however, PostScript plot files containing the images of the geologic map sheets with explanations, as well as the accompanying text describing the geology of the area. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled 'For Those Who Aren't Familiar With Digital Geologic Map Databases' below. This digital map database, compiled from previously unpublished data, and new mapping by the authors, represents the general distribution of surficial deposits in the San Francisco bay region. Together with the accompanying text file (sf_geo.txt or sf_geo.pdf), it provides current information on Quaternary geology and liquefaction susceptibility of the San Francisco, California, 1:100,000 quadrangle. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:100,000 or smaller. The content and character of the database, as well as three methods of obtaining the database, are described below.

  16. World distribution of uranium deposits

    USGS Publications Warehouse

    Fairclough, M. C.; Irvine, J. A.; Katona, L. F.; Simmon, W. L.; Bruneton, P.; Mihalasky, Mark J.; Cuney, M.; Aranha, M.; Pylypenko, O.; Poliakovska, K.

    2018-01-01

    Deposit data derived from the IAEA UDEPO database (http://infcis.iaea.org/UDEPO/About.cshtml) with assistance from P. Bruneton (France) and M. Mihalasky (U.S.A.). The map is an updated companion to "World Distribution of Uranium Deposits (UDEPO) with Uranium Deposit Classification, IAEA Tech-Doc-1629". Geology was derived from L.B. Chorlton, Generalized Geology of the World, Geological Survey of Canada, Open File 5529, 2007. Map production by M.C. Fairclough (IAEA), J.A. Irvine (Australia), L.F. Katona (Australia) and W.L. Slimmon (Canada). World Distribution of Uranium Deposits, International Atomic Energy Agency, Vienna, Austria. Cartographic assistance was supplied to the IAEA by the Geological Survey of South Australia, the Saskatchewan Geological Survey and the United States Geological Survey. Coastlines, drainage, and country boundaries were obtained from ArcMap, 1:25 000 000 scale, and are copyrighted data containing the intellectual property of Environmental Systems Research Institute (ESRI). The use of particular designations of countries or territories does not imply any judgment by the publisher, the IAEA, as to the legal status of such countries or territories, of their authorities and institutions or of the delimitation of their boundaries. Any revisions or additional geological information known to the user would be welcomed by the International Atomic Energy Agency and the Geological Survey of Canada.

  17. Land-use in Amazonia and the Cerrado of Brazil: State of Knowledge and GIS Database

    NASA Technical Reports Server (NTRS)

    Nepstad, Daniel C.

    1997-01-01

    We have assembled datasets to strengthen the Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA). These datasets can now be accessed through the Woods Hole Research Center homepage (www.whrc.org), and will soon be linked to the Pre-LBA homepages of the Brazilian Space Research Institute's Center for Weather and Climate Prediction (Instituto de Pesquisas Espaciais, Centro de Previsao de Tempo e Estudos Climaticos, INPE/CPTEC) and through the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL/DAAC). Some of the datasets that we are making available involved new field research and/or the digitization of data available in Brazilian government agencies. For example, during the grant period we conducted interviews at 1,100 sawmills across Amazonia to determine their production of sawn timber and their harvest intensities. These data provide the basis for the first quantitative assessment of the area of forest affected each year by selective logging (Nepstad et al., submitted to Nature). We digitized the locations of all of the rural households in the State of Para that have been mapped by the Brazilian malaria combat agency (SUCAM). We also mapped and digitized areas of deforestation in the state of Tocantins, which is comprised largely of savanna (cerrado), an ecosystem that has been routinely excluded from deforestation mapping exercises.

  18. Preliminary Integrated Geologic Map Databases for the United States: Connecticut, Maine, Massachusetts, New Hampshire, New Jersey, Rhode Island and Vermont

    USGS Publications Warehouse

    Nicholson, Suzanne W.; Dicken, Connie L.; Horton, John D.; Foose, Michael P.; Mueller, Julia A.L.; Hon, Rudi

    2006-01-01

    The rapid growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional- and national-scale digital geologic maps that have standardized information about geologic age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. Although two digital geologic maps (Schruben and others, 1994; Reed and Bush, 2004) of the United States currently exist, their scales (1:2,500,000 and 1:5,000,000) are too general for many regional applications. Most states have digital geologic maps at scales of about 1:500,000, but the databases are not comparably structured and, thus, it is difficult to use the digital databases for more than one state at a time. This report describes, for a seven-state region, the results of an effort by the U.S. Geological Survey to produce a series of integrated and standardized state geologic map databases that cover the entire United States. In 1997, the U.S. Geological Survey's Mineral Resources Program initiated the National Surveys and Analysis (NSA) Project to develop national digital databases. One primary activity of this project was to compile a national digital geologic map database, utilizing state geologic maps, to support studies at scales from 1:250,000 to 1:1,000,000. To accomplish this, state databases were prepared using a common standard for the database structure, fields, attribution, and data dictionaries. For Alaska and Hawaii, new state maps are being prepared, and the preliminary work for Alaska is being released as a series of 1:250,000-scale quadrangle reports. This document provides background information and documentation for the integrated geologic map databases of this report. This report is one of a series of such reports releasing preliminary standardized geologic map databases for the United States.
The data products of the project consist of two main parts, the spatial databases and a set of supplemental tables relating to geologic map units. The datasets serve as a data resource to generate a variety of stratigraphic, age, and lithologic maps. This documentation is divided into four main sections: (1) description of the set of data files provided in this report, (2) specifications of the spatial databases, (3) specifications of the supplemental tables, and (4) an appendix containing the data dictionaries used to populate some fields of the spatial database and supplemental tables.
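
    A minimal sketch (in Python, with invented field names and dictionary terms, not the NSA project's actual schema) of how a common database standard and shared data dictionaries let separately prepared state databases be validated and used together:

```python
# Shared data dictionaries used by every state database (terms invented
# for illustration; the real USGS dictionaries are far larger).
LITHOLOGY_DICTIONARY = {"granite", "basalt", "sandstone", "shale", "limestone"}
AGE_DICTIONARY = {"Precambrian", "Paleozoic", "Mesozoic", "Cenozoic"}

def validate_unit(record):
    """Return a list of problems for one map-unit record (empty if valid)."""
    problems = []
    if record.get("lithology") not in LITHOLOGY_DICTIONARY:
        problems.append(f"unknown lithology: {record.get('lithology')}")
    if record.get("age") not in AGE_DICTIONARY:
        problems.append(f"unknown age: {record.get('age')}")
    return problems

# Because both state tables follow the same structure, they merge directly.
connecticut = [{"unit": "Og", "age": "Paleozoic", "lithology": "granite"}]
vermont = [{"unit": "Zs", "age": "Precambrian", "lithology": "shale"}]
merged = connecticut + vermont
errors = [p for rec in merged for p in validate_unit(rec)]
```

    Records that use terms outside the shared dictionaries are flagged rather than silently merged, which is the practical benefit of a common standard.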

  19. Algorithms and methodology used in constructing high-resolution terrain databases

    NASA Astrophysics Data System (ADS)

    Williams, Bryan L.; Wilkosz, Aaron

    1998-07-01

    This paper presents a top-level description of the methods used to generate high-resolution 3D IR digital terrain databases using soft photogrammetry. The 3D IR database is derived from aerial photography and is made up of a digital ground-plane elevation map, a vegetation-height elevation map, a material classification map, object data (tanks, buildings, etc.), and a temperature radiance map. The steps required to generate some of these elements are outlined. The use of metric photogrammetry is discussed in the context of elevation-map development, and the methods employed to generate the material classification maps are given. The developed databases are used by the US Army Aviation and Missile Command to evaluate the performance of various missile systems. A discussion is also presented on database certification, which consists of the validation, verification, and accreditation procedures followed to certify that the developed databases give a true representation of the area of interest and are fully compatible with the targeted digital simulators.
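
    The layered structure described above combines simply: vegetation height added to the ground-plane elevation yields a canopy-top surface, and the material classification indexes into per-material radiance values. A toy sketch with invented 2 x 2 maps and radiance numbers (not the actual database contents):

```python
# Co-registered raster layers of the terrain database (values invented).
ground_elev = [[100.0, 102.0], [101.0, 99.0]]   # metres
veg_height  = [[0.0, 12.0], [5.0, 0.0]]         # metres
materials   = [["grass", "tree"], ["tree", "soil"]]
radiance    = {"grass": 0.92, "tree": 0.95, "soil": 0.88}  # illustrative

# Canopy-top surface = ground elevation + vegetation height, cell by cell.
canopy_top = [[g + v for g, v in zip(g_row, v_row)]
              for g_row, v_row in zip(ground_elev, veg_height)]

# Radiance map derived by looking each cell's material up in a table.
rad_map = [[radiance[m] for m in row] for row in materials]
```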

  20. Evaluating Aggregate Terrestrial Impacts of Road Construction Projects for Advanced Regional Mitigation

    NASA Astrophysics Data System (ADS)

    Thorne, James H.; Girvetz, Evan H.; McCoy, Michael C.

    2009-05-01

    This study presents a GIS-based database framework used to assess aggregate terrestrial habitat impacts from multiple highway construction projects in California, USA. Transportation planners need such impact assessment tools to effectively address additive biological mitigation obligations. Such assessments can reduce costly delays due to protracted environmental review. This project incorporated the best available statewide natural resource data into early project planning and preliminary environmental assessments for single and multiple highway construction projects, and provides an assessment of the 10-year state-wide mitigation obligations for the California Department of Transportation. Incorporation of these assessments will facilitate early and more strategic identification of mitigation opportunities, for single-project and regional mitigation efforts. The data architecture format uses eight spatial scales: six nested watersheds, counties, and transportation planning districts, which were intersected. This resulted in 8058 map planning units statewide, which were used to summarize all subsequent analyses. Range maps and georeferenced locations of federally and state-listed plants and animals and a 55-class landcover map were spatially intersected with the planning units and the buffered spatial footprint of 967 funded projects. Projected impacts were summarized and output to the database. Queries written in the database can sum expected impacts and provide summaries by individual construction project, or by watershed, county, transportation district or highway. The data architecture allows easy incorporation of new information and results in a tool usable without GIS by a wide variety of agency biologists and planners. The data architecture format would be useful for other types of regional planning.
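
    The summing queries described above can be sketched with a small relational example; the table layout and values below are invented for illustration, not the study's actual database:

```python
import sqlite3

# Projected habitat impacts stored per (project, planning unit); queries
# then sum expected impacts grouped by watershed, county, or project.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE impacts (
    project TEXT, planning_unit INTEGER, watershed TEXT,
    landcover TEXT, impacted_hectares REAL)""")
con.executemany("INSERT INTO impacts VALUES (?,?,?,?,?)", [
    ("Hwy-99 widening", 101, "Sacramento", "oak woodland", 12.5),
    ("Hwy-99 widening", 102, "Sacramento", "grassland", 7.0),
    ("I-5 interchange", 205, "San Joaquin", "grassland", 3.2),
])

# Sum expected impacts by watershed (grouping by project or county is the
# same query with a different GROUP BY column).
rows = con.execute("""SELECT watershed, SUM(impacted_hectares)
                      FROM impacts
                      GROUP BY watershed ORDER BY watershed""").fetchall()
```

    Because the summaries are plain SQL over pre-computed intersections, they can be run by planners without GIS software, which is the design point the abstract emphasizes.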

  1. Evaluating aggregate terrestrial impacts of road construction projects for advanced regional mitigation.

    PubMed

    Thorne, James H; Girvetz, Evan H; McCoy, Michael C

    2009-05-01

    This study presents a GIS-based database framework used to assess aggregate terrestrial habitat impacts from multiple highway construction projects in California, USA. Transportation planners need such impact assessment tools to effectively address additive biological mitigation obligations. Such assessments can reduce costly delays due to protracted environmental review. This project incorporated the best available statewide natural resource data into early project planning and preliminary environmental assessments for single and multiple highway construction projects, and provides an assessment of the 10-year state-wide mitigation obligations for the California Department of Transportation. Incorporation of these assessments will facilitate early and more strategic identification of mitigation opportunities, for single-project and regional mitigation efforts. The data architecture format uses eight spatial scales: six nested watersheds, counties, and transportation planning districts, which were intersected. This resulted in 8058 map planning units statewide, which were used to summarize all subsequent analyses. Range maps and georeferenced locations of federally and state-listed plants and animals and a 55-class landcover map were spatially intersected with the planning units and the buffered spatial footprint of 967 funded projects. Projected impacts were summarized and output to the database. Queries written in the database can sum expected impacts and provide summaries by individual construction project, or by watershed, county, transportation district or highway. The data architecture allows easy incorporation of new information and results in a tool usable without GIS by a wide variety of agency biologists and planners. The data architecture format would be useful for other types of regional planning.

  2. Digital database of the geologic map of the island of Hawai'i [Hawaii]

    USGS Publications Warehouse

    Trusdell, Frank A.; Wolfe, Edward W.; Morris, Jean

    2006-01-01

    This online publication (DS 144) provides the digital database for the printed map by Edward W. Wolfe and Jean Morris (I-2524-A; 1996). This digital database contains all the information used to publish U.S. Geological Survey Geologic Investigations Series I-2524-A (available only in paper form; see http://pubs.er.usgs.gov/pubs/i/i2524A). The database contains the distribution and relationships of volcanic and surficial-sedimentary deposits on the island of Hawai‘i. This dataset represents the geologic history of the five volcanoes that make up the Island of Hawai‘i: Kohala, Mauna Kea, Hualalai, Mauna Loa and Kīlauea. This database of the geologic map contributes to understanding the geologic history of the Island of Hawai‘i and provides the basis for understanding long-term volcanic processes in an intra-plate ocean-island volcanic system. In addition, the database serves as a basis for producing volcanic hazard assessments for the island of Hawai‘i. Furthermore, it serves as a base layer for interdisciplinary research. This online publication consists of a digital database of the geologic map, an explanatory pamphlet, a description of map units, a correlation of map units diagram, and images for plotting. Geologic mapping was compiled at a scale of 1:100,000 for the entire mapping area and was compiled as a digital geologic database in ArcInfo GIS format.

  3. Geologic map and map database of northeastern San Francisco Bay region, California, [including] most of Solano County and parts of Napa, Marin, Contra Costa, San Joaquin, Sacramento, Yolo, and Sonoma Counties

    USGS Publications Warehouse

    Graymer, Russell Walter; Jones, David Lawrence; Brabb, Earl E.

    2002-01-01

    This digital map database, compiled from previously published and unpublished data, and new mapping by the authors, represents the general distribution of bedrock and surficial deposits in the mapped area. Together with the accompanying text file (nesfmf.ps, nesfmf.pdf, nesfmf.txt), it provides current information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:62,500 or smaller.

  4. Large-Scale Digital Geologic Map Databases and Reports of the North Coal District in Afghanistan

    USGS Publications Warehouse

    Hare, Trent M.; Davis, Philip A.; Nigh, Devon; Skinner, James A.; SanFilipo, John R.; Bolm, Karen S.; Fortezzo, Corey M.; Galuszka, Donna; Stettner, William R.; Sultani, Shafiqullah; Nader, Billal

    2008-01-01

    This report describes the Afghanistan coal resource maps that have been digitally captured and the maps that have thus far been converted to GIS databases. Several maps by V/O Technoexport, USSR (VOTU) and Bundesanstalt für Bodenforschung (BGR), Hannover, Germany, are captured here. Most of the historical coal exploration is concentrated in north-central Afghanistan, a region referred to as the 'North Coal District', and almost all of the coal-related maps found in the Afghanistan Geological Survey (AGS) archives to date cover various locations within that district, as shown in the index map. Most of the maps included herein were originally scanned during U.S. Geological Survey (USGS) site visits to Kabul in November 2004 and February 2006. The scanning was performed using equipment purchased by the U.S. Agency for International Development (USAID) and the U.S. Trade and Development Agency (USTDA) and installed at the AGS by USGS. Many of these maps and associated reports exist as single unpublished copies in the AGS archives, so these efforts served not only to provide a basis for digital capture, but also as a means of preserving these rare geologic maps and reports. The data included herein represent most of the coal-related reports and maps that are available in the AGS archives. This report excludes the limited cases when a significant portion of a report's text could not be located, but it does not exclude reports with missing plates. The vector files are released in the Environmental Systems Research Institute (ESRI) personal geodatabase format, the ESRI shapefile vector format, and the open Geography Markup Language (GML) format. Scanned images are available in JPEG and, when rectified, GeoTIFF format. The authors wish to acknowledge the contributions made by the staff of the AGS Records and Coal Departments, whose valuable assistance made it possible to locate and catalogue the data provided herein.
We especially acknowledge the efforts of particular members of the coal team: Engineer Saifuddin Aminy (Team Leader); Engineer Gul Pacha Azizi; Engineer Abdul Haq Barakati; Engineer Abdul Basir; Engineer Mohammad Daoud; Engineer Abdullah Ebadi; Engineer Abdul Ahad Omaid; Engineer Spozmy; and Engineer Shapary Tokhi. The ongoing efforts of Engineer Mir M. Atiq Kazimi (Team Leader); Engineer M. Anwar Housinzada; and Engineer Shereen Agha of the AGS Records Department to organize and catalogue the AGS material were invaluable in locating and preserving these data. The efforts of the entire AGS staff to personally preserve these data during wartime, in the absence of virtually any supporting infrastructure, were truly remarkable. The efforts by the British Geological Survey (BGS) to assist the AGS in archiving these data, and the personal assistance provided by BGS (notably Robert McIntosh) to the USGS teams, were also appreciated. The logistical support provided by the U.S. Embassy in Kabul, particularly the Afghanistan Reconstruction Group, was critical to the success of the USGS teams while in Afghanistan. Finally, the efforts of the Minister of the Ministry of Mines and Industries (M. Ibrahim Adel) to support the USGS coal resource assessment in Afghanistan, in both his current role and his former role as President of the Mines Affairs Department, were vital to this effort.

  5. Geology of Point Reyes National Seashore and vicinity, California: a digital database

    USGS Publications Warehouse

    Clark, Joseph C.; Brabb, Earl E.

    1997-01-01

    This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. The report does include, however, a PostScript plot file containing an image of the geologic map sheet with explanation, as well as the accompanying text describing the geology of the area. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled 'For Those Who Aren't Familiar With Digital Geologic Map Databases' below. This digital map database, compiled from previously published and unpublished data and new mapping by the authors, represents the general distribution of surficial deposits and rock units in Point Reyes and surrounding areas. Together with the accompanying text file (pr-geo.txt or pr-geo.ps), it provides current information on the stratigraphy and structural geology of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:48,000 or smaller.

  6. PoMaMo--a comprehensive database for potato genome data.

    PubMed

    Meyer, Svenja; Nagel, Axel; Gebhardt, Christiane

    2005-01-01

    A database for potato genome data (PoMaMo, Potato Maps and More) was established. The database contains molecular maps of all twelve potato chromosomes with about 1000 mapped elements, sequence data, putative gene functions, results from BLAST analysis, SNP and InDel information from different diploid and tetraploid potato genotypes, publication references, links to other public databases like GenBank (http://www.ncbi.nlm.nih.gov/) or SGN (Solanaceae Genomics Network, http://www.sgn.cornell.edu/), etc. Flexible search and data visualization interfaces enable easy access to the data via the Internet (https://gabi.rzpd.de/PoMaMo.html). The Java servlet tool YAMB (Yet Another Map Browser) was designed to interactively display chromosomal maps. Maps can be zoomed in and out, and detailed information about mapped elements can be obtained by clicking on an element of interest. The GreenCards interface allows a text-based data search by marker, sequence, or genotype name, by sequence accession number, gene function, BLAST hit, or publication reference. The PoMaMo database is a comprehensive database for different potato genome data, and to date the only database containing SNP and InDel data from diploid and tetraploid potato genotypes.
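
    A GreenCards-style text search reduces to matching a term against several record fields; the sketch below uses invented records and field names purely for illustration, not PoMaMo's actual data model:

```python
# Toy genome-map records (marker names, chromosomes and functions invented).
records = [
    {"marker": "StM1021", "chromosome": "V", "function": "disease resistance"},
    {"marker": "StM0037", "chromosome": "XII", "function": "starch synthesis"},
]

def greencards_search(records, term):
    """Case-insensitive match of a search term against every record field."""
    term = term.lower()
    return [r for r in records
            if any(term in str(v).lower() for v in r.values())]

hits = greencards_search(records, "starch")
```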

  7. PoMaMo—a comprehensive database for potato genome data

    PubMed Central

    Meyer, Svenja; Nagel, Axel; Gebhardt, Christiane

    2005-01-01

    A database for potato genome data (PoMaMo, Potato Maps and More) was established. The database contains molecular maps of all twelve potato chromosomes with about 1000 mapped elements, sequence data, putative gene functions, results from BLAST analysis, SNP and InDel information from different diploid and tetraploid potato genotypes, publication references, links to other public databases like GenBank (http://www.ncbi.nlm.nih.gov/) or SGN (Solanaceae Genomics Network, http://www.sgn.cornell.edu/), etc. Flexible search and data visualization interfaces enable easy access to the data via the Internet (https://gabi.rzpd.de/PoMaMo.html). The Java servlet tool YAMB (Yet Another Map Browser) was designed to interactively display chromosomal maps. Maps can be zoomed in and out, and detailed information about mapped elements can be obtained by clicking on an element of interest. The GreenCards interface allows a text-based data search by marker, sequence, or genotype name, by sequence accession number, gene function, BLAST hit, or publication reference. The PoMaMo database is a comprehensive database for different potato genome data, and to date the only database containing SNP and InDel data from diploid and tetraploid potato genotypes. PMID:15608284

  8. LexisNexis

    EPA Pesticide Factsheets

    LexisNexis provides access to electronic legal and non-legal research databases to the Agency's attorneys, administrative law judges, law clerks, investigators, and certain non-legal staff (e.g. staff in the Office of Public Affairs). The agency requires access to the following types of electronic databases: Legal databases, Non-legal databases, Public Records databases, and Financial databases.

  9. Integrating Databases with Maps: The Delivery of Cultural Data through TimeMap.

    ERIC Educational Resources Information Center

    Johnson, Ian

    TimeMap is a unique integration of database management, metadata and interactive maps, designed to contextualise and deliver cultural data through maps. TimeMap extends conventional maps with the time dimension, creating and animating maps "on-the-fly"; delivers them as a kiosk application or embedded in Web pages; links flexibly to…

  10. Data management with a landslide inventory of the Franconian Alb (Germany) using a spatial database and GIS tools

    NASA Astrophysics Data System (ADS)

    Bemm, Stefan; Sandmeier, Christine; Wilde, Martina; Jaeger, Daniel; Schwindt, Daniel; Terhorst, Birgit

    2014-05-01

    The area of the Swabian-Franconian cuesta landscape (southern Germany) is highly prone to landslides. This became apparent in the late spring of 2013, when numerous landslides occurred as a consequence of heavy, long-lasting rainfall. The specific climatic situation caused numerous damages with serious impacts on settlements and infrastructure. Knowledge of the spatial distribution, processes, and characteristics of landslides is important for evaluating the potential risk posed by mass movements in these areas. In the frame of two projects, about 400 landslides were mapped and detailed data sets were compiled at the Franconian Alb during the years 2011 to 2014. The studies are related to the project "Slope stability and hazard zones in the northern Bavarian cuesta" (DFG, German Research Foundation) as well as to the LfU (Bavarian Environment Agency) project "Georisks and climate change - hazard indication map Jura". The central goal of the present study is to create a spatial database for landslides. The database should contain all fundamental parameters to characterize the mass movements and should provide the potential for secure data storage and data management, as well as statistical evaluations. The spatial database was created with PostgreSQL, an object-relational database management system, and PostGIS, a spatial database extender for PostgreSQL that makes it possible to store spatial and geographic objects and to connect to several GIS applications, such as GRASS GIS, SAGA GIS, QGIS and the geospatial library GDAL (Obe and Hsu 2011). Database access for querying, importing, and exporting spatial and non-spatial data is ensured through GUI or non-GUI connections. The database allows the use of procedural languages for writing advanced functions in R, Python or Perl. It is possible to work directly with the entire (spatial) content of the database in R.
The inventory of the database includes, among other things, information on location, landslide types and causes, geomorphological positions, geometries, hazards and damages, as well as assessments of landslide activity. Furthermore, spatial objects are stored that represent the components of a landslide, in particular the scarps and the accumulation areas. In addition, waterways, map sheets, contour lines, detailed infrastructure data, digital elevation models, and aspect and slope data are included. Examples of spatial queries to the database are intersections of raster and vector data for calculating slope gradients or aspects of landslide areas, creating multiple overlapping cross-sections for the comparison of slopes, and computing distances to infrastructure or to the nearest receiving drainage. Further queries retrieve information on landslide magnitudes, distribution and clustering, as well as potential correlations with geomorphological or geological conditions. The data management concept in this study can be implemented for any academic, public or private use, because it is independent of any obligatory licenses. The created spatial database offers a platform for interdisciplinary research and socio-economic questions, as well as for landslide susceptibility and hazard indication mapping. Obe, R.O., Hsu, L.S., 2011. PostGIS in Action. Manning Publications, Stamford, 492 pp.
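
    One of the spatial queries mentioned above, the distance from a landslide to the nearest infrastructure, would be a PostGIS ST_Distance call in the actual system; the same idea can be sketched with plain point-to-segment geometry (coordinates invented):

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from point p to line segment ab (2-D coordinates)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:          # degenerate segment: a single point
        return math.hypot(px - ax, py - ay)
    # Clamp the projection of p onto the segment to [0, 1].
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

# A landslide scarp point and a road represented as two segments (invented).
scarp = (2.0, 3.0)
road = [((0.0, 0.0), (10.0, 0.0)), ((10.0, 0.0), (10.0, 8.0))]
nearest = min(point_segment_distance(scarp, a, b) for a, b in road)
```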

  11. Semantics-informed cartography: the case of Piemonte Geological Map

    NASA Astrophysics Data System (ADS)

    Piana, Fabrizio; Lombardo, Vincenzo; Mimmo, Dario; Giardino, Marco; Fubelli, Giandomenico

    2016-04-01

    In modern digital geological maps, namely those supported by a large geo-database and devoted to dynamic, interactive representation on WMS-WebGIS services, there is a need to make explicit the geological assumptions used in the design and compilation of the map database, and to define or adopt semantic representations and taxonomies in order to achieve a formal and interoperable representation of geologic knowledge. These approaches are fundamental for the integration and harmonisation of geological information and services across cultural barriers (e.g. different scientific disciplines) and/or physical barriers (e.g. administrative boundaries). Initiatives such as the GeoScience Markup Language (latest version GeoSciML 4.0, 2015, http://www.geosciml.org) and the INSPIRE "Data Specification on Geology" http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_GE_v3.0rc3.pdf (an operative simplification of GeoSciML, latest version 3.0 rc3, 2013), as well as the recent terminological shepherding of the Geoscience Terminology Working Group (GTWG), have been promoting the exchange of geologic knowledge. Grounded on these standard vocabularies, schemas and data models, we provide a shared semantic classification of geological data for the study case of the synthetic digital geological map of the Piemonte region (NW Italy), named "GEOPiemonteMap", developed by the CNR Institute of Geosciences and Earth Resources, Torino (CNR IGG TO) and hosted as a dynamic interactive map on the geoportal of the ARPA Piemonte Environmental Agency. The Piemonte Geological Map is grounded on a regional-scale geo-database consisting of some hundreds of GeologicUnits, whose thousands of instances (Mapped Features, polygon geometry) occur widely across the Piemonte region, each bounded by GeologicStructures (Mapped Features, line geometry).
GeologicUnits and GeologicStructures have been spatially correlated across the whole region and described using the GeoSciML vocabularies. A hierarchical schema is provided for the Piemonte Geological Map that gives the parental relations between several orders of GeologicUnits, referring to the most recurrent geological objects and main GeologicEvents, in a logical framework compliant with the GeoSciML and INSPIRE data models. The classification criteria and the Hierarchy Schema used to define the GEOPiemonteMap Legend, as well as the intended meanings of the geological concepts used to achieve the overall classification schema, are explicitly described in several WikiGeo pages (implemented with the "MediaWiki" open source software, https://www.mediawiki.org/wiki/MediaWiki). Moreover, a further step toward a formal classification of the contents (both data and interpretation) of the GEOPiemonteMap was taken by setting up an ontological framework, named "OntoGeonous", in order to achieve a thorough semantic characterization of the Map.
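
    A parental relation between orders of GeologicUnits can be sketched as a simple mapping that is walked up to the root; the unit names below are used only for illustration and do not reproduce the actual GEOPiemonteMap hierarchy:

```python
# Parental relation: child GeologicUnit -> parent GeologicUnit
# (names illustrative, not the published Piemonte hierarchy).
parent = {
    "Dora-Maira Unit": "Penninic Domain",
    "Penninic Domain": "Alpine Orogen",
    "Monferrato Succession": "Tertiary Piedmont Basin",
    "Tertiary Piedmont Basin": "Alpine Orogen",
}

def ancestors(unit):
    """Walk the parental relation from a unit up to the root of the hierarchy."""
    chain = []
    while unit in parent:
        unit = parent[unit]
        chain.append(unit)
    return chain

lineage = ancestors("Dora-Maira Unit")
```

    Queries of this kind let a legend group mapped features at any order of the hierarchy, which is what makes the parental schema useful for an interactive map.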

  12. JAMSTEC DARWIN Database Assimilates GANSEKI and COEDO

    NASA Astrophysics Data System (ADS)

    Tomiyama, T.; Toyoda, Y.; Horikawa, H.; Sasaki, T.; Fukuda, K.; Hase, H.; Saito, H.

    2017-12-01

    Introduction: The Japan Agency for Marine-Earth Science and Technology (JAMSTEC) archives data and samples obtained by JAMSTEC research vessels and submersibles. As a common property of human society, the JAMSTEC archive is open to public users with scientific or educational purposes [1]. To publicize its data and samples online, JAMSTEC operates the NUUNKUI data sites [2], a group of several databases for various data and sample types. For years, data and metadata of JAMSTEC rock samples, sediment core samples and cruise/dive observations were publicized through databases named GANSEKI, COEDO, and DARWIN, respectively. However, because they had different user interfaces and data structures, these services were somewhat confusing for unfamiliar users. The maintenance costs of multiple hardware and software systems were also problematic for sustainable services and continuous improvements. Database Integration: In 2017, GANSEKI, COEDO and DARWIN were integrated into DARWIN+ [3]. The update also included the implementation of a map-search function as a substitute for the closed portal site. The major functions of the previous systems were incorporated into the new system: users can perform complex searches by thumbnail browsing, map area, keyword filtering, and metadata constraints. As for data handling, the new system is more flexible, allowing the entry of a variety of additional data types. Data Management: Since the DARWIN major update, the JAMSTEC data and sample team has been dealing with minor issues of individual sample data/metadata, which sometimes need manual modification to be transferred to the new system. Some new data sets, such as onboard sample photos and surface close-up photos of rock samples, are becoming available online. Geochemical data of sediment core samples are expected to be added in the near future. References: [1] http://www.jamstec.go.jp/e/database/data_policy.html [2] http://www.godac.jamstec.go.jp/jmedia/portal/e/ [3] http://www.godac.jamstec.go.jp/darwin/e/
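
    The combined search described above (map area plus keyword) reduces to applying both filters in sequence; the sketch below uses invented sample records and field names, not the DARWIN schema:

```python
# Toy archive records (IDs, coordinates and types invented for illustration).
samples = [
    {"id": "HPD-1800-R01", "lat": 12.7, "lon": 143.5, "type": "rock"},
    {"id": "MR15-E01-PC03", "lat": 38.1, "lon": 142.9, "type": "sediment core"},
]

def search(samples, bbox=None, keyword=None):
    """bbox = (min_lat, max_lat, min_lon, max_lon); keyword matches any field."""
    hits = samples
    if bbox is not None:
        s, n, w, e = bbox
        hits = [r for r in hits if s <= r["lat"] <= n and w <= r["lon"] <= e]
    if keyword is not None:
        k = keyword.lower()
        hits = [r for r in hits if any(k in str(v).lower() for v in r.values())]
    return hits

# Map-area search combined with keyword filtering, as in the integrated system.
found = search(samples, bbox=(30.0, 45.0, 140.0, 145.0), keyword="core")
```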

  13. Version 2.0 of the International Bathymetric Chart of the Arctic Ocean: A new Database for Oceanographers and Mapmakers

    NASA Astrophysics Data System (ADS)

    Jakobsson, M.; Macnab, R.; Edwards, M.; Schenke, H.; Hatzky, J.

    2007-12-01

    The International Bathymetric Chart of the Arctic Ocean (IBCAO) was first released to the public after its introduction at the American Geophysical Union (AGU) Fall Meeting in 1999 (Jakobsson et al., 2000). This first release consisted of a Digital Bathymetric Model (DBM) on a polar stereographic projection with a grid-cell spacing of 2.5 x 2.5 km, derived from an accumulated database of all bathymetric data available at the time of compilation. The IBCAO bathymetric database included soundings collected during past and modern expeditions as well as digitized isobaths and depth soundings from published maps. Compared to previous bathymetric maps of the Arctic Ocean, the first released IBCAO compilation was based upon a significantly enhanced database, particularly in the high Arctic. For example, de-classified echo soundings acquired during US and British submarine cruises between 1958 and 1988 were included, as well as soundings from icebreaker cruises conducted by Sweden and Germany at the end of the last century. Despite the newly available data in 1999, there were still large areas of the Arctic Ocean where publicly available data were completely absent. Some of these areas had been mapped by Russian agencies, and since these observations were not available to IBCAO, depth contours from the bathymetric contour map published by the Head Department of Navigation and Hydrography (HDNO) (Naryshkin, 1999) were digitized and incorporated into the database. The new IBCAO Version 2.0 comprises the largest update since the first release; moreover, the grid spacing has been decreased to 2 x 2 km. Numerous multibeam data sets collected by icebreakers, e.g. USCGC Healy, R/V James Clark Ross, R/V Polarstern and IB Oden, now form part of the database, as do the swath bathymetric observations acquired during the 1999 SCICEX expedition. The portrayal of the Eastern Arctic Basin is vastly improved thanks to, for example,
the Arctic Mid Ocean Ridge Expedition 2001 (AMORE) and the Arctic Gakkel Vents 2007 (AGAVE) expedition, while mapping missions aboard the USCGC Healy have revealed the "real" shape of the sea floor of the central Lomonosov Ridge and in areas off northern Alaska in the western Arctic. This paper presents an overview of the new data included in Version 2.0 as well as a brief discussion of the improvements and their possible implications for IBCAO users. Jakobsson, M., Cherkis, N., Woodward, J., Macnab, R. and Coakley, B., 2000. New grid of Arctic bathymetry aids scientists and mapmakers. EOS, Transactions, American Geophysical Union, 81: 89, 93, 96. Naryshkin, G., 1999. Bottom relief of the Arctic Ocean (bathymetric contour map). Head Department of Navigation and Oceanography and All-Russian Research Institute for Geology and Mineral Resources of the World Ocean (Editors), Russian Academy of Sciences.
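
    At its core, a DBM is soundings combined onto a regular grid. The toy sketch below bin-averages invented (x, y, depth) soundings into 2 x 2 km cells; IBCAO's actual gridding involves projection handling and interpolation between cells, so this shows only the basic idea:

```python
from collections import defaultdict

CELL = 2000.0  # grid spacing in metres (the Version 2.0 spacing of 2 x 2 km)

def grid_soundings(soundings):
    """Average (x, y, depth) soundings per grid cell; returns {cell: depth}."""
    cells = defaultdict(list)
    for x, y, depth in soundings:
        cells[(int(x // CELL), int(y // CELL))].append(depth)
    return {key: sum(depths) / len(depths) for key, depths in cells.items()}

# Invented soundings in projected metres; two fall in the same cell.
dbm = grid_soundings([(500.0, 500.0, -4200.0),
                      (1500.0, 900.0, -4300.0),
                      (2500.0, 500.0, -1000.0)])
```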

  14. Development of ground-water vulnerability database for the U.S. Environmental Protection Agency's hazard ranking system using a geographic information system

    USGS Publications Warehouse

    Clarke, John S.; Sorensen, Jerry W.; Strickland, Henry G.; Collins, George

    1992-01-01

    Geographic information system (GIS) methods were applied to the U.S. Environmental Protection Agency's (EPA) hazard ranking system (HRS) to evaluate the vulnerability of ground water to contamination from actual or potential releases of hazardous materials from waste-disposal sites. Computerized maps of four factors influencing ground-water vulnerability - hydraulic conductivity, sorptive capacity, depth to water, and net precipitation - were derived for the Southeastern United States from digitized copies of published maps and from computerized databases, including the U.S. Geological Survey's (USGS) National Water Information System. To test the accuracy of the derived data coverages used to assess ground-water vulnerability, GIS-derived values for hydraulic conductivity, depth to water, and net precipitation were compared to corresponding values assigned by EPA's field investigation teams (FIT) at 28 hazardous waste sites. For each factor, site data were divided into three physiographic groupings: (1) Coastal Plain, (2) Valley and Ridge-Interior Low Plateaus, and (3) Piedmont-Blue Ridge. The best correlation between the paired data sets was for the net precipitation factor, where most GIS-derived values were within 0 to 40% of the FIT data, and 79% were within the same HRS scoring range. For the hydraulic conductivity factor, the best correlation between GIS and FIT data was for values derived from a published surficial deposits map, where most of the values were within one order of magnitude of the FIT data, and on the average were within 1.24 orders of magnitude of the FIT data. For this map, the best match between data sets was in the Coastal Plain province, where the difference in order of magnitude averaged 0.92. For the depth-to-water factor, most of the GIS-derived values were within 51 to 100% of the FIT data, and only 44 to 50% of the sites were within a common scoring range. 
The best correlation for depth to water was in the Coastal Plain, where GIS-derived values were within 8 to 100% of the FIT data.
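The order-of-magnitude comparison used above for hydraulic conductivity can be sketched as follows; the site values here are hypothetical, not the paper's data.

```python
import math

def mean_order_of_magnitude_diff(gis_values, fit_values):
    """Average absolute difference in order of magnitude (log10)
    between paired GIS-derived and FIT-assigned values."""
    diffs = [abs(math.log10(g) - math.log10(f))
             for g, f in zip(gis_values, fit_values)]
    return sum(diffs) / len(diffs)

# Hypothetical hydraulic conductivities (cm/s) at four sites
gis = [1e-3, 5e-4, 2e-2, 1e-5]
fit = [1e-4, 1e-3, 1e-2, 1e-4]
print(round(mean_order_of_magnitude_diff(gis, fit), 2))  # 0.65
```

A pair agreeing within one order of magnitude contributes at most 1.0 to the average, matching how the paper reports agreement.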

  15. Spatial digital database for the tectonic map of Southeast Arizona

    USGS Publications Warehouse

    map by Drewes, Harald; digital database by Fields, Robert A.; Hirschberg, Douglas M.; Bolm, Karen S.

    2002-01-01

    A spatial database was created for Drewes' (1980) tectonic map of southeast Arizona; this database supersedes Drewes and others (2001, ver. 1.0). Staff and a contractor at the U.S. Geological Survey in Tucson, Arizona, completed an interim digital geologic map database for the east part of the map in 2001, made revisions to the previously released digital data for the west part of the map (Drewes and others, 2001, ver. 1.0), merged data files for the east and west parts, and added data not previously captured. Digital base map data files (such as topography, roads, towns, rivers and lakes) are not included; they may be obtained from a variety of commercial and government sources. This digital geospatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information in a geographic information system (GIS) for use in spatial analysis. The resulting digital geologic map database can be queried in many ways to produce a variety of geologic maps and derivative products. Because Drewes' (1980) map sheets include additional text and graphics that were not included in this report, scanned images of his maps (i1109_e.jpg, i1109_w.jpg) are included as a courtesy to the reader. This database should not be used or displayed at any scale larger than 1:125,000 (for example, 1:100,000 or 1:24,000). The digital geologic map plot files (i1109_e.pdf and i1109_w.pdf) that are provided herein are representations of the database (see Appendix A). The map area is located in southeastern Arizona (fig. 1). This report describes the map units (from Drewes, 1980), the methods used to convert the geologic map data into a digital format, and the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. The manuscript and digital data review by Helen Kayser (Information Systems Support, Inc.) is greatly appreciated.

  16. 78 FR 70569 - Technical Mapping Advisory Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-26

    ... DEPARTMENT OF HOMELAND SECURITY Federal Emergency Management Agency [Docket ID: FEMA-2013-0039] Technical Mapping Advisory Council AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice... Technical Mapping Advisory Council (TMAC). The notice incorrectly stated that contractors and potential...

  17. Prototype Packaged Databases and Software in Health

    PubMed Central

    Gardenier, Turkan K.

    1980-01-01

    This paper describes the recent demand for packaged databases and software for health applications in light of developments in mini- and micro-computer technology. Specific features for defining prospective user groups are discussed; criticisms of the use of large-scale epidemiological data as a means of replacing clinical trials and their associated controls are posed to the reader. Available collaborative efforts for access to and analysis of jointly structured health data are stressed, with recommendations for new analytical techniques specifically geared to monitoring data, such as the Cumulative Transitional State Score (CTSS) generated for tracking ongoing patient status over time in clinical trials. Examples of graphic display are given from the Domestic Information Display System (DIDS), a collaborative multi-agency effort to computerize and make accessible user-specified U.S. and local maps relating to health, environmental, socio-economic, and energy data.

  18. Type 2 Diabetes Research Yield, 1951-2012: Bibliometrics Analysis and Density-Equalizing Mapping

    PubMed Central

    Geaney, Fiona; Scutaru, Cristian; Kelly, Clare; Glynn, Ronan W.; Perry, Ivan J.

    2015-01-01

    The objective of this paper is to provide a detailed evaluation of type 2 diabetes mellitus research output from 1951-2012, using large-scale data analysis, bibliometric indicators and density-equalizing mapping. Data were retrieved from the Science Citation Index Expanded database, one of the seven curated databases within Web of Science. Using Boolean operators "OR", "AND" and "NOT", a search strategy was developed to estimate the total number of published items. Only studies with an English abstract were eligible. Type 1 diabetes and gestational diabetes items were excluded. Specific software developed for the database analysed the data. Information including titles, authors’ affiliations and publication years were extracted from all files and exported to Excel. Density-equalizing mapping was conducted as described by Groneberg-Kloft et al., 2008. A total of 24,783 items were published and cited 476,002 times. The greatest number of outputs was published in 2010 (n=2,139). The United States contributed 28.8% to the overall output, followed by the United Kingdom (8.2%) and Japan (7.7%). Bilateral cooperation was most common between the United States and United Kingdom (n=237). Harvard University produced 2% of all publications, followed by the University of California (1.1%). The leading journals were Diabetes, Diabetologia and Diabetes Care, and they contributed 9.3%, 7.3% and 4.0% of the research yield, respectively. In conclusion, the volume of research is rising in parallel with the increasing global burden of disease due to type 2 diabetes mellitus. Bibliometrics analysis provides useful information to scientists and funding agencies involved in the development and implementation of research strategies to address global health issues. PMID:26208117

  19. Applications of the U.S. Geological Survey's global land cover product

    USGS Publications Warehouse

    Reed, B.

    1997-01-01

    The U.S. Geological Survey (USGS), in partnership with several international agencies and universities, has produced a global land cover characteristics database. The land cover data were created using multitemporal analysis of advanced very high resolution radiometer satellite images in conjunction with other existing geographic data. A translation table permits the conversion of the land cover classes into several conventional land cover schemes that are used by ecosystem modelers, climate modelers, land management agencies, and other user groups. The alternative classification schemes include Global Ecosystems, the Biosphere Atmosphere Transfer Scheme, the Simple Biosphere, the USGS Anderson Level 2, and the International Geosphere Biosphere Programme. The distribution system for these data is through the World Wide Web (the web site address is: http://edcwww.cr.usgs.gov/landdaac/glcc/glcc.html) or by magnetic media upon special request. The availability of the data over the World Wide Web, in conjunction with the flexible database structure, allows easy data access to a wide range of users. The web site contains a user registration form that allows analysis of the diverse applications of large-area land cover data. Currently, applications are divided among mapping (20 percent), conservation (30 percent), and modeling (35 percent).
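The translation-table idea can be illustrated with a small sketch; the scheme names come from the abstract, but the class codes and labels below are illustrative, not the actual USGS table.

```python
# Hypothetical excerpt of a land cover translation table: each
# database class code maps to its label in several classification
# schemes (codes and labels here are illustrative only).
TRANSLATION = {
    1: {"IGBP": "Evergreen Needleleaf Forest",
        "BATS": "Evergreen Needleleaf Tree",
        "Anderson2": "Evergreen Forest Land"},
    2: {"IGBP": "Croplands",
        "BATS": "Crop/Mixed Farming",
        "Anderson2": "Cropland and Pasture"},
}

def reclassify(class_code, scheme):
    """Translate a database class code into the requested scheme."""
    return TRANSLATION[class_code][scheme]

print(reclassify(2, "IGBP"))  # Croplands
```

The same lookup, applied cell by cell to the gridded database, yields a map in any of the supported schemes.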

  20. Geologic Map of the Tucson and Nogales Quadrangles, Arizona (Scale 1:250,000): A Digital Database

    USGS Publications Warehouse

    Peterson, J.A.; Berquist, J.R.; Reynolds, S.J.; Page-Nedell, S. S.; Digital database by Oland, Gustav P.; Hirschberg, Douglas M.

    2001-01-01

    The geologic map of the Tucson-Nogales 1:250,000 scale quadrangle (Peterson and others, 1990) was digitized by U.S. Geological Survey staff and University of Arizona contractors at the Southwest Field Office, Tucson, Arizona, in 2000 for input into a geographic information system (GIS). The database was created for use as a basemap in a decision support system designed by the National Industrial Minerals and Surface Processes project. The resulting digital geologic map database can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included; they may be obtained from a variety of commercial and government sources. Additionally, point features, such as strike and dip, were not captured from the original paper map and are not included in the database. This database is not meant to be used or displayed at any scale larger than 1:250,000 (for example, 1:100,000 or 1:24,000). The digital geologic map graphics and plot files that are provided in the digital package are representations of the digital database. They are not designed to be cartographic products.

  1. Development of a Florida Coastal Mapping Program Through Local and Regional Coordination

    NASA Astrophysics Data System (ADS)

    Hapke, C. J.; Kramer, P. A.; Fetherston-Resch, E.; Baumstark, R.

    2017-12-01

    The State of Florida has the longest coastline in the contiguous United States (2,170 km). The coastal zone is heavily populated and contains 1,900 km of sandy beaches that support economically important recreation and tourism. Florida's waters also host important marine mineral resources, unique ecosystems, and the largest number of recreational boats and saltwater fishermen in the country. There is increasing need and demand for high resolution data of the coast and adjacent seafloor for resource and habitat mapping, understanding coastal vulnerability, evaluating performance of restoration projects, and many other coastal and marine spatial planning efforts. The Florida Coastal Mapping Program (FCMP), initiated in 2017 as a regional collaboration between four federal and three state agencies, has goals of establishing the priorities for high resolution seafloor mapping of Florida's coastal environment, and developing a strategy for leveraging funds to support mapping priorities set by stakeholders. We began by creating a comprehensive digital inventory of existing data (collected by government, the private sector, and academia) from 1 kilometer inland to the 200 meter isobath for a statewide geospatial database and gap analysis. Data types include coastal topography, bathymetry, and acoustic data such as sidescan sonar and subbottom profiles. Next, we will develop appropriate proposals and legislative budget requests in response to opportunities to collect priority data in high priority areas. Data collection will be undertaken by a combination of state and federal agencies. The FCMP effort will provide the critical baseline information that is required for characterizing changes to fragile ecosystems, assessing marine resources, and forecasting the impacts on coastal infrastructure and recreational beaches from future storms and sea-level rise.

  2. Distributed Fast Self-Organized Maps for Massive Spectrophotometric Data Analysis †.

    PubMed

    Dafonte, Carlos; Garabato, Daniel; Álvarez, Marco A; Manteiga, Minia

    2018-05-03

    Analyzing huge amounts of data becomes essential in the era of Big Data, where databases are populated with hundreds of Gigabytes that must be processed to extract knowledge. Hence, classical algorithms must be adapted towards distributed computing methodologies that leverage the underlying computational power of these platforms. Here, a parallel, scalable, and optimized design for self-organized maps (SOM) is proposed in order to analyze massive data gathered by the spectrophotometric sensor of the European Space Agency (ESA) Gaia spacecraft, although it could be extrapolated to other domains. The performance comparison between the sequential implementation and the distributed ones based on Apache Hadoop and Apache Spark is an important part of the work, as well as the detailed analysis of the proposed optimizations. Finally, a domain-specific visualization tool to explore astronomical SOMs is presented.
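A sequential SOM of the kind the distributed design builds on can be sketched as follows; this is a minimal NumPy illustration, not the authors' optimized Spark/Hadoop implementation, and the grid size, learning-rate schedule, and toy data are assumptions.

```python
import numpy as np

def train_som(data, grid=(5, 5), epochs=10, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal sequential self-organized map (SOM) trainer."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    # Grid coordinates, used for neighborhood distances on the map
    ys, xs = np.mgrid[0:h, 0:w]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)          # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 1e-3  # shrinking neighborhood
        for x in data:
            # Best-matching unit: node whose weight vector is closest
            d = np.linalg.norm(weights - x, axis=2)
            by, bx = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighborhood pull toward the sample
            g = np.exp(-((ys - by) ** 2 + (xs - bx) ** 2) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights

data = np.random.default_rng(1).random((100, 3))  # toy stand-in for spectra
som = train_som(data)
print(som.shape)  # (5, 5, 3)
```

The distributed variants described in the paper parallelize the inner loop over data partitions and merge partial weight updates; the per-sample update rule itself is the same.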

  3. Geologic Map and Map Database of Eastern Sonoma and Western Napa Counties, California

    USGS Publications Warehouse

    Graymer, R.W.; Brabb, E.E.; Jones, D.L.; Barnes, J.; Nicholson, R.S.; Stamski, R.E.

    2007-01-01

    Introduction This report contains a new 1:100,000-scale geologic map, derived from a set of geologic map databases (Arc-Info coverages) containing information at 1:62,500-scale resolution, and a new description of the geologic map units and structural relations in the map area. Prepared as part of the San Francisco Bay Region Mapping Project, the study area includes the north-central part of the San Francisco Bay region, and forms the final piece of the effort to generate new, digital geologic maps and map databases for an area which includes Alameda, Contra Costa, Marin, Napa, San Francisco, San Mateo, Santa Clara, Santa Cruz, Solano, and Sonoma Counties. Geologic mapping in Lake County in the north-central part of the map extent was not within the scope of the Project. The map and map database integrates both previously published reports and new geologic mapping and field checking by the authors (see Sources of Data index map on the map sheet or the Arc-Info coverage eswn-so and the textfile eswn-so.txt). This report contains new ideas about the geologic structures in the map area, including the active San Andreas Fault system, as well as the geologic units and their relations. Together, the map (or map database) and the unit descriptions in this report describe the composition, distribution, and orientation of geologic materials and structures within the study area at regional scale. Regional geologic information is important for analysis of earthquake shaking, liquifaction susceptibility, landslide susceptibility, engineering materials properties, mineral resources and hazards, as well as groundwater resources and hazards. These data also assist in answering questions about the geologic history and development of the California Coast Ranges.

  4. Geologic map of the Grand Canyon 30' x 60' quadrangle, Coconino and Mohave Counties, northwestern Arizona

    USGS Publications Warehouse

    Billingsley, G.H.

    2000-01-01

    This digital map database, compiled from previously published and unpublished data as well as new mapping by the author, represents the general distribution of bedrock and surficial deposits in the map area. Together with the accompanying pamphlet, it provides current information on the geologic structure and stratigraphy of the Grand Canyon area. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:100,000 or smaller.

  5. Building MapObjects attribute field in cadastral database based on the method of Jackson system development

    NASA Astrophysics Data System (ADS)

    Chen, Zhu-an; Zhang, Li-ting; Liu, Lu

    2009-10-01

    ESRI's GIS component MapObjects is used in many cadastral information systems because of its small footprint and flexibility. In such systems, some cadastral information is saved directly in the cadastral database in MapObjects' shapefile format. However, MapObjects does not provide a function for building attribute fields in a map layer's attribute data file in the cadastral database, so users cannot save the results of analysis. This paper designs and implements a function for building attribute fields in MapObjects based on the method of Jackson system development.

  6. 75 FR 66115 - Agency Information Collection Activities: Submission for OMB Review; Comment Request; FEMA...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-27

    ... Request; FEMA Mitigation Success Story Database AGENCY: Federal Emergency Management Agency, DHS. ACTION... Database. Type of information collection: Revision of a currently approved information collection. [[Page...

  7. GLIMS Glacier Database: Status and Challenges

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Racoviteanu, A.; Khalsa, S. S.; Armstrong, R.

    2008-12-01

    GLIMS (Global Land Ice Measurements from Space) is an international initiative to map the world's glaciers and to build a GIS database that is usable via the World Wide Web. The GLIMS programme includes 70 institutions and 25 Regional Centers (RCs), which analyze satellite imagery to map glaciers in their regions of expertise. The analysis results are collected at the National Snow and Ice Data Center (NSIDC) and ingested into the GLIMS Glacier Database. The database contains approximately 80 000 glacier outlines, half the estimated total on Earth. In addition, the database contains metadata on approximately 200 000 ASTER images acquired over glacierized terrain. Glacier data and the ASTER metadata can be viewed and searched via interactive maps at http://glims.org/. As glacier mapping with GLIMS has progressed, various hurdles have arisen that have required solutions. For example, the GLIMS community has formulated definitions for how to delineate glaciers with different complicated morphologies and how to deal with debris cover. Experiments have been carried out to assess the consistency of the database, and protocols have been defined for the RCs to follow in their mapping. Hurdles still remain. In June 2008, a workshop was convened in Boulder, Colorado to address issues such as mapping debris-covered glaciers, mapping ice divides, and performing change analysis using two different glacier inventories. This contribution summarizes the status of the GLIMS Glacier Database and steps taken to ensure high data quality.

  8. Early Results from the Global Precipitation Measurement (GPM) Mission in Japan

    NASA Astrophysics Data System (ADS)

    Kachi, Misako; Kubota, Takuji; Masaki, Takeshi; Kaneko, Yuki; Kanemaru, Kaya; Oki, Riko; Iguchi, Toshio; Nakamura, Kenji; Takayabu, Yukari N.

    2015-04-01

    The Global Precipitation Measurement (GPM) mission is an international collaboration to achieve highly accurate and highly frequent global precipitation observations. The GPM mission consists of the GPM Core Observatory, jointly developed by the U.S. and Japan, and Constellation Satellites that carry microwave radiometers and are provided by the GPM partner agencies. The Dual-frequency Precipitation Radar (DPR) was developed by the Japan Aerospace Exploration Agency (JAXA) and the National Institute of Information and Communications Technology (NICT) and installed on the GPM Core Observatory. The GPM Core Observatory uses a non-sun-synchronous orbit to continue the diurnal-cycle observations of rainfall begun by the Tropical Rainfall Measuring Mission (TRMM) satellite, and was successfully launched at 3:37 a.m. on February 28, 2014 (JST). The Constellation Satellites, including JAXA's Global Change Observation Mission - Water (GCOM-W1), or "SHIZUKU," are launched by each partner agency around 2014 and contribute to expanded observation coverage and increased observation frequency. JAXA develops the DPR Level 1 algorithm, and the NASA-JAXA Joint Algorithm Team develops the DPR Level 2 and DPR-GMI combined Level 2 algorithms. JAXA also develops the Global Rainfall Map (GPM-GSMaP) algorithm, the latest version of the Global Satellite Mapping of Precipitation (GSMaP), as a national product that provides an hourly rainfall map at 0.1-degree horizontal resolution. 
Major improvements in the GPM-GSMaP algorithm are: (1) an improved microwave imager algorithm based on the AMSR2 precipitation standard algorithm, including a new land algorithm and a new coast detection scheme; (2) development of an orographic rainfall correction method for warm rainfall in coastal areas (Taniguchi et al., 2012); (3) updates of databases, including rainfall detection over land and the land surface emission database; (4) development of a microwave sounder algorithm over land (Kida et al., 2012); and (5) development of a gauge-calibrated GSMaP algorithm (Ushio et al., 2013). In addition to these algorithm improvements, the number of passive microwave imagers and/or sounders used in GPM-GSMaP was increased compared to the previous version. After the early calibration and validation of the products and confirmation that all products achieved the release criteria, all GPM standard products and the GPM-GSMaP product have been released to the public since September 2014. The GPM products can be downloaded via the Internet through the JAXA G-Portal (https://www.gportal.jaxa.jp).
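As a rough illustration of addressing an hourly, 0.1-degree gridded rainfall map such as GSMaP's, the sketch below converts a latitude/longitude to a (row, col) grid index. The 60N-60S, 0-360E domain with row 0 at the northern edge is an assumption; the actual GPM-GSMaP file layout should be taken from the product format documentation.

```python
def gsmap_grid_index(lat, lon, res=0.1, lat_top=60.0):
    """Map a latitude/longitude to (row, col) in a 0.1-degree grid.
    Assumes a 60N-60S, 0-360E domain with row 0 at the northern
    edge; this layout is illustrative, not taken from the GSMaP
    format specification."""
    if not (-lat_top < lat <= lat_top):
        raise ValueError("latitude outside grid domain")
    row = int((lat_top - lat) / res)   # rows count southward from 60N
    col = int((lon % 360.0) / res)     # columns count eastward from 0E
    return row, col

print(gsmap_grid_index(35.68, 139.77))  # near Tokyo
```

With these assumptions the grid is 1200 rows by 3600 columns, one cell per 0.1 degree.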

  9. Digital surveying and mapping of forest road network for development of a GIS tool for the effective protection and management of natural ecosystems

    NASA Astrophysics Data System (ADS)

    Drosos, Vasileios C.; Liampas, Sarantis-Aggelos G.; Doukas, Aristotelis-Kosmas G.

    2014-08-01

    Geographic Information Systems (GIS) have become important tools, not only in the geosciences and environmental sciences but also in virtually all research that requires monitoring, planning, or land management. The purpose of this paper was to develop a planning and decision-making tool using AutoCAD Map, ArcGIS, and Google Earth, with emphasis on investigating the suitability of forest road mapping and the range of its implementation in Greece at the prefecture level. Integrating spatial information into a database makes data available throughout the organization, improving quality, productivity, and data management. Working in such an environment, users can access and edit information, integrate and analyze data, communicate effectively, and select desired information, such as the forest road network, at a very early stage in the planning of silviculture operations, for example before harvest planning is carried out. AutoCAD Map was used to export the GPS data to shapefiles, ArcGIS (ArcGlobe) to work with the shapefiles, and Google Earth with KML (Keyhole Markup Language) files to better visualize and evaluate existing conditions, design in a real-world context, and exchange information with government agencies, utilities, and contractors in both CAD and GIS data formats. Automating the updating procedure and the transfer of files between agencies and departments is one of the main tasks the integrated GIS tool should address.

  10. The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States

    USGS Publications Warehouse

    Horton, John D.; San Juan, Carma A.; Stoeser, Douglas B.

    2017-06-30

    The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States (https://doi.org/10.5066/F7WH2N65) represents a seamless, spatial database of 48 State geologic maps that range from 1:50,000 to 1:1,000,000 scale. A national digital geologic map database is essential in interpreting other datasets that support numerous types of national-scale studies and assessments, such as those that provide geochemistry, remote sensing, or geophysical data. The SGMC is a compilation of the individual U.S. Geological Survey releases of the Preliminary Integrated Geologic Map Databases for the United States. The SGMC geodatabase also contains updated data for seven States and seven entirely new State geologic maps that have been added since the preliminary databases were published. Numerous errors have been corrected and enhancements added to the preliminary datasets using thorough quality assurance/quality control procedures. The SGMC is not a truly integrated geologic map database because geologic units have not been reconciled across State boundaries. However, the geologic data contained in each State geologic map have been standardized to allow spatial analyses of lithology, age, and stratigraphy at a national scale.

  11. Current Approaches to Improving Marine Geophysical Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Jencks, J. H.; Cartwright, J.; Varner, J. D.; Anderson, C.; Robertson, E.; McLean, S. J.

    2016-02-01

    Exploring, understanding, and managing the global oceans is a challenge when hydrographic maps are available for only 5% of the world's oceans, even less of which have been mapped geologically or to identify benthic habitats. Seafloor mapping is expensive and most government and academic budgets continue to tighten. The first step for any mapping program, before setting out to map uncharted waters, should be to identify if data currently exist in the area of interest. There are many reasons why this seemingly simple suggestion is not commonplace. While certain datasets are accessible online (e.g., NOAA's NCEI, EMODnet, IHO-DCDB), many are not. In some cases, data that are publicly available are difficult to discover and access. No single agency can successfully resolve the complex and pressing demands of ocean and coastal mapping and the associated data stewardship. NOAA partners with other federal agencies to provide an integrated approach to carry out a coordinated and comprehensive ocean and coastal mapping program. In order to maximize the return on their mapping investment, legacy and newly acquired data must be easily discoverable and readily accessible by numerous applications and formats now and well into the future. At NOAA's National Centers for Environmental Information (NCEI), resources are focused on ensuring the security and widespread availability of the Nation's scientific marine geophysical data through long-term stewardship. The public value of these data and products is maximized by streamlining data acquisition and processing operations, minimizing redundancies, facilitating discovery, and developing common standards to promote re-use. For its part, NCEI draws on a variety of software technologies and adheres to international standards to meet this challenge. The result is a geospatial framework built on spatially-enabled databases, standards-based web services, and International Standards Organization (ISO) metadata. 
In order to maximize effectiveness in ocean and coastal mapping, we must be sure that limited funding is not being used to collect data in areas where data already exist. By making data more accessible, NCEI extends the use of, and therefore the value of, these data. Working together, we can ensure that valuable data are made available to the broadest community.

  12. Regional Geologic Map of San Andreas and Related Faults in Carrizo Plain, Temblor, Caliente and La Panza Ranges and Vicinity, California; A Digital Database

    USGS Publications Warehouse

    Dibblee, T. W.; Digital database compiled by Graham, S. E.; Mahony, T.M.; Blissenbach, J.L.; Mariant, J.J.; Wentworth, C.M.

    1999-01-01

    This Open-File Report is a digital geologic map database. The report serves to introduce and describe the digital data. There is no paper map included in the Open-File Report. The report includes PostScript and PDF plot files that can be used to plot images of the geologic map sheet and explanation sheet. This digital map database is prepared from a previously published map by Dibblee (1973). The geologic map database delineates map units that are identified by general age, lithology, and clast size following the stratigraphic nomenclature of the U.S. Geological Survey. For descriptions of the units, their stratigraphic relations, and sources of geologic mapping, consult the explanation sheet (of99-14_4b.ps or of99-14_4d.pdf), or the original published paper map (Dibblee, 1973). The scale of the source map limits the spatial resolution (scale) of the database to 1:125,000 or smaller. For those interested in the geology of Carrizo Plain and vicinity who do not use an ARC/INFO compatible Geographic Information System (GIS), but would like to obtain a paper map and explanation, PDF and PostScript plot files containing map images of the data in the digital database, as well as PostScript and PDF plot files of the explanation sheet and explanatory text, have been included in the database package (please see the section 'Digital Plot Files', page 5). The PostScript plot files require a gzip utility to access them. For those without computer capability, we can provide users with the PostScript or PDF files on tape that can be taken to a vendor for plotting. Paper plots can also be ordered directly from the USGS (please see the section 'Obtaining Plots from USGS Open-File Services', page 5). The content and character of the database, methods of obtaining it, and processes of extracting the map database from the tar (tape archive) file are described herein. 
The map database itself, consisting of six ARC/INFO coverages, can be obtained over the Internet or by magnetic tape copy as described below. The database was compiled using ARC/INFO, a commercial Geographic Information System (Environmental Systems Research Institute, Redlands, California), with version 3.0 of the menu interface ALACARTE (Fitzgibbon and Wentworth, 1991, Fitzgibbon, 1991, Wentworth and Fitzgibbon, 1991). The ARC/INFO coverages are stored in uncompressed ARC export format (ARC/INFO version 7.x). All data files have been compressed, and may be uncompressed with gzip, which is available free of charge over the Internet via links from the USGS Public Domain Software page (http://edcwww.cr.usgs.gov/doc/edchome/ndcdb/public.html). ARC/INFO export files (files with the .e00 extension) can be converted into ARC/INFO coverages in ARC/INFO (see below) and can be read by some other Geographic Information Systems, such as MapInfo via ArcLink and ESRI's ArcView.
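The decompression step described above can be sketched in Python as an alternative to a command-line gzip utility; the directory layout and file names below are illustrative, not the actual coverage names in the database package.

```python
import gzip
import shutil
from pathlib import Path

def decompress_e00(archive_dir):
    """Uncompress gzip-compressed ARC/INFO export files (*.e00.gz)
    in place, leaving plain .e00 files that a GIS can then convert
    into coverages. File names here are illustrative."""
    for gz_path in sorted(Path(archive_dir).glob("*.e00.gz")):
        out_path = gz_path.with_suffix("")  # strip the trailing .gz
        with gzip.open(gz_path, "rb") as src, open(out_path, "wb") as dst:
            shutil.copyfileobj(src, dst)
        print("wrote", out_path.name)
```

The resulting .e00 files can then be imported into ARC/INFO (or read by other GIS packages that understand the export format), as the report describes.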

  13. Spatial Digital Database for the Geologic Map of Oregon

    USGS Publications Warehouse

    Walker, George W.; MacLeod, Norman S.; Miller, Robert J.; Raines, Gary L.; Connors, Katherine A.

    2003-01-01

    Introduction This report describes and makes available a geologic digital spatial database (orgeo) representing the geologic map of Oregon (Walker and MacLeod, 1991). The original paper publication was printed as a single map sheet at a scale of 1:500,000, accompanied by a second sheet containing map unit descriptions and ancillary data. A digital version of the Walker and MacLeod (1991) map was included in Raines and others (1996). The dataset provided by this open-file report supersedes the earlier published digital version (Raines and others, 1996). This digital spatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information for use in spatial analysis in a geographic information system (GIS). This database can be queried in many ways to produce a variety of geologic maps. This database is not meant to be used or displayed at any scale larger than 1:500,000 (for example, 1:100,000). This report describes the methods used to convert the geologic map data into a digital format, describes the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. Scanned images of the printed map (Walker and MacLeod, 1991), their correlation of map units, and their explanation of map symbols are also available for download.

  14. Exploring consumer pathways and patterns of use for ...

    EPA Pesticide Factsheets

    Background: Humans may be exposed to thousands of chemicals through contact in the workplace, home, and via air, water, food, and soil. A major challenge is estimating exposures to these chemicals, which requires understanding potential exposure routes directly related to how chemicals are used. Objectives: We aimed to assign “use categories” to a database of chemicals, including ingredients in consumer products, to help prioritize which chemicals will be given more scrutiny relative to human exposure potential and target populations. The goal was to identify (a) human activities that result in increased chemical exposure while (b) simplifying the dimensionality of hazard assessment for risk characterization. Methods: Major data sources on consumer- and industrial-process based chemical uses were compiled from multiple countries, including from regulatory agencies, manufacturers, and retailers. The resulting categorical chemical use and functional information are presented through the Chemical/Product Categories Database (CPCat). Results: CPCat contains information on 43,596 unique chemicals mapped to 833 terms categorizing their usage or function. Examples presented demonstrate potential applications of the CPCat database, including the identification of chemicals to which children may be exposed (including those that are not identified on product ingredient labels), and prioritization of chemicals for toxicity screening. The CPCat database is availabl

  15. Creating a literature database of low-calorie sweeteners and health studies: evidence mapping.

    PubMed

    Wang, Ding Ding; Shams-White, Marissa; Bright, Oliver John M; Parrott, J Scott; Chung, Mei

    2016-01-05

    Evidence mapping is an emerging tool used to systematically identify, organize and summarize the quantity and focus of scientific evidence on a broad topic, but there are currently no methodological standards. Using the topic of low-calorie sweeteners (LCS) and selected health outcomes, we describe the process of creating an evidence-map database and demonstrate several example descriptive analyses using this database. The process of creating an evidence-map database is described in detail. The steps include: developing a comprehensive literature search strategy, establishing study eligibility criteria and a systematic study selection process, extracting data, developing outcome groups with input from expert stakeholders and tabulating data using descriptive analyses. The database was uploaded onto SRDR™ (Systematic Review Data Repository), an open public data repository. Our final LCS evidence-map database included 225 studies, of which 208 were interventional studies and 17 were cohort studies. An example bubble plot was produced to display the evidence-map data and visualize research gaps according to four parameters: comparison types, population baseline health status, outcome groups, and study sample size. This plot indicated a lack of studies assessing appetite and dietary intake related outcomes using LCS with a sugar intake comparison in people with diabetes. Evidence mapping is an important tool for the contextualization of in-depth systematic reviews within broader literature and identifies gaps in the evidence base, which can be used to inform future research. An open evidence-map database has the potential to promote knowledge translation from nutrition science to policy.

  16. 78 FR 65689 - Technical Mapping Advisory Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-01

    ... DEPARTMENT OF HOMELAND SECURITY Federal Emergency Management Agency [Docket ID FEMA-2013-0039] Technical Mapping Advisory Council AGENCY: Federal Emergency Management Agency, DHS. ACTION: Committee Management; Request for Applicants for Appointment to the Federal Emergency Management Agency's Technical...

  17. 75 FR 41180 - Notice of Order: Revisions to Enterprise Public Use Database

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-15

    ... Database AGENCY: Federal Housing Finance Agency. ACTION: Notice of order. SUMMARY: Section 1323(a)(1) of.... This responsibility to maintain a public use database (PUDB) for such mortgage data was transferred to... purpose of loan data field in these two databases. 4. Single-family Data Field 27 and Multifamily Data...

  18. Database for the geologic map of the Mount Baker 30- by 60-minute quadrangle, Washington (I-2660)

    USGS Publications Warehouse

    Tabor, R.W.; Haugerud, R.A.; Hildreth, Wes; Brown, E.H.

    2006-01-01

    This digital map database has been prepared by R.W. Tabor from the published Geologic map of the Mount Baker 30- by 60-Minute Quadrangle, Washington. Together with the accompanying text files as PDF, it provides information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The authors mapped most of the geology at 1:100,000. The Quaternary contacts and structural data have been much simplified for the 1:100,000-scale map and database. The spatial resolution (scale) of the database is 1:100,000 or smaller. This database depicts the distribution of geologic materials and structures at a regional (1:100,000) scale. The report is intended to provide geologic information for the regional study of materials properties, earthquake shaking, landslide potential, mineral hazards, seismic velocity, and earthquake faults. In addition, the report contains information and interpretations about the regional geologic history and framework. However, the regional scale of this report does not provide sufficient detail for site development purposes.

  19. Database for the geologic map of the Chelan 30-minute by 60-minute quadrangle, Washington (I-1661)

    USGS Publications Warehouse

    Tabor, R.W.; Frizzell, V.A.; Whetten, J.T.; Waitt, R.B.; Swanson, D.A.; Byerly, G.R.; Booth, D.B.; Hetherington, M.J.; Zartman, R.E.

    2006-01-01

    This digital map database has been prepared by R. W. Tabor from the published Geologic map of the Chelan 30-Minute Quadrangle, Washington. Together with the accompanying text files as PDF, it provides information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The authors mapped most of the bedrock geology at 1:100,000 scale, but compiled Quaternary units at 1:24,000 scale. The Quaternary contacts and structural data have been much simplified for the 1:100,000-scale map and database. The spatial resolution (scale) of the database is 1:100,000 or smaller. This database depicts the distribution of geologic materials and structures at a regional (1:100,000) scale. The report is intended to provide geologic information for the regional study of materials properties, earthquake shaking, landslide potential, mineral hazards, seismic velocity, and earthquake faults. In addition, the report contains information and interpretations about the regional geologic history and framework. However, the regional scale of this report does not provide sufficient detail for site development purposes.

  20. Database for the geologic map of the Snoqualmie Pass 30-minute by 60-minute quadrangle, Washington (I-2538)

    USGS Publications Warehouse

    Tabor, R.W.; Frizzell, V.A.; Booth, D.B.; Waitt, R.B.

    2006-01-01

    This digital map database has been prepared by R.W. Tabor from the published Geologic map of the Snoqualmie Pass 30' X 60' Quadrangle, Washington. Together with the accompanying text files as PDF, it provides information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The authors mapped most of the bedrock geology at 1:100,000 scale, but compiled Quaternary units at 1:24,000 scale. The Quaternary contacts and structural data have been much simplified for the 1:100,000-scale map and database. The spatial resolution (scale) of the database is 1:100,000 or smaller. This database depicts the distribution of geologic materials and structures at a regional (1:100,000) scale. The report is intended to provide geologic information for the regional study of materials properties, earthquake shaking, landslide potential, mineral hazards, seismic velocity, and earthquake faults. In addition, the report contains information and interpretations about the regional geologic history and framework. However, the regional scale of this report does not provide sufficient detail for site development purposes.

  1. Geologic Map of the Wenatchee 1:100,000 Quadrangle, Central Washington: A Digital Database

    USGS Publications Warehouse

    Tabor, R.W.; Waitt, R.B.; Frizzell, V.A.; Swanson, D.A.; Byerly, G.R.; Bentley, R.D.

    2005-01-01

    This digital map database has been prepared by R.W. Tabor from the published Geologic map of the Wenatchee 1:100,000 Quadrangle, Central Washington. Together with the accompanying text files as PDF, it provides information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The authors mapped most of the bedrock geology at 1:100,000 scale, but compiled Quaternary units at 1:24,000 scale. The Quaternary contacts and structural data have been much simplified for the 1:100,000-scale map and database. The spatial resolution (scale) of the database is 1:100,000 or smaller. This database depicts the distribution of geologic materials and structures at a regional (1:100,000) scale. The report is intended to provide geologic information for the regional study of materials properties, earthquake shaking, landslide potential, mineral hazards, seismic velocity, and earthquake faults. In addition, the report contains information and interpretations about the regional geologic history and framework. However, the regional scale of this report does not provide sufficient detail for site development purposes.

  2. 77 FR 42744 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-20

    ... information about the request is entered into the appropriate tracking databases. Use of the information in the Agency's tracking databases enables the Agency to monitor progress on the activities attendant to...

  3. Preliminary surficial geologic map of the Newberry Springs 30' x 60' quadrangle, California

    USGS Publications Warehouse

    Phelps, G.A.; Bedford, D.R.; Lidke, D.J.; Miller, D.M.; Schmidt, K.M.

    2012-01-01

    The Newberry Springs 30' x 60' quadrangle is located in the central Mojave Desert of southern California. It is split approximately into northern and southern halves by I-40, with the city of Barstow at its western edge and the town of Ludlow near its eastern edge. The map area spans lat 34°30′ to 35° N. and long 116° to 117° W. and covers over 1,000 km². We integrate the results of surficial geologic mapping conducted during 2002-2005 with compilations of previous surficial mapping and bedrock geologic mapping. Quaternary units are subdivided in detail on the map to distinguish variations in age, process of formation, pedogenesis, lithology, and spatial interdependency, whereas pre-Quaternary bedrock units are grouped into generalized assemblages that emphasize their attributes as hillslope-forming materials and sources of parent material for the Quaternary units. The spatial information in this publication is presented in two forms: a spatial database and a geologic map. The geologic map is a view (the display of an extracted subset of the database at a given time) of the spatial database; it highlights key aspects of the database and necessarily does not show all of the data contained therein. The database contains detailed information about Quaternary geologic unit composition, authorship, and notes regarding geologic units, faults, contacts, and local vegetation. The amount of information contained in the database is too large to show on a single map, so a restricted subset of the information was chosen to summarize the overall nature of the geology. Refer to the database for additional information. Accompanying the spatial data are the map documentation and spatial metadata. The map documentation (this document) describes the geologic setting and history of the Newberry Springs map sheet, summarizes the age and physical character of each map unit, and describes principal faults and folds.
The Federal Geographic Data Committee (FGDC) compliant metadata provides detailed information about the digital files and file structure of the spatial data.

  4. Availability of groundwater data for California, water year 2010

    USGS Publications Warehouse

    Ray, Mary; Orlando, Patricia v.P.

    2011-01-01

    The U.S. Geological Survey, in cooperation with Federal, State, and local agencies, obtains a large amount of data pertaining to the groundwater resources of California each water year (October 1-September 30). These data constitute a valuable database for developing an improved understanding of the water resources of the State. This Fact Sheet serves as an index to groundwater data for Water Year 2010. It contains a map of California showing the number of wells (by county) with available water-level or water-quality data for Water Year 2010 (fig. 1) and instructions for obtaining this and other groundwater information contained in the databases of the U.S. Geological Survey, California Water Science Center. From 1985 to 1993, data were published in the annual report "Water Resources Data for California, Volume 5. Ground-Water Data"; prior to 1985, the data were published in U.S. Geological Survey Water-Supply Papers.

  5. Jobs within a 30-minute transit ride - Service

    EPA Pesticide Factsheets

    This mapping service summarizes the total number of jobs that can be reached within 30 minutes by transit. EPA modeled accessibility via transit by calculating total travel time between block group centroids, inclusive of walking to/from transit stops, wait times, and transfers. Block groups that can be accessed in 30 minutes or less from the origin block group are considered accessible. Values reflect public transit service in December 2012 and employment counts in 2010. Coverage is limited to census block groups within metropolitan regions served by transit agencies that share their service data in a standardized format called GTFS. All variable names refer to variables in EPA's Smart Location Database. For instance, EmpTot10_sum summarizes total employment (EmpTot10) in block groups that are reachable within a 30-minute transit and walking commute. See the Smart Location Database User Guide for full variable descriptions.
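The accessibility measure described above reduces to a threshold-and-sum over block groups. A minimal sketch, with invented block group IDs, travel times, and job counts (EmpTot10-style), might look like:

```python
# Sum employment in all block groups whose door-to-door transit travel time
# (walk + wait + ride + transfers) from the origin is within the threshold.
# All values below are invented for illustration.
THRESHOLD_MIN = 30

def jobs_within_threshold(travel_time_min, employment, threshold=THRESHOLD_MIN):
    """travel_time_min: {block_group_id: minutes from the origin block group}
    employment: {block_group_id: total jobs in that block group}"""
    return sum(
        employment[bg]
        for bg, t in travel_time_min.items()
        if t <= threshold and bg in employment
    )

times = {"bg_a": 12.5, "bg_b": 29.0, "bg_c": 31.0, "bg_d": 45.0}
jobs = {"bg_a": 1200, "bg_b": 300, "bg_c": 5000, "bg_d": 80}
print(jobs_within_threshold(times, jobs))  # bg_a + bg_b -> 1500
```

The published service precomputes this sum for every origin block group; the sketch shows the calculation for a single origin.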

  6. Geologic map of Yosemite National Park and vicinity, California

    USGS Publications Warehouse

    Huber, N.K.; Bateman, P.C.; Wahrhaftig, Clyde

    1989-01-01

    This digital map database represents the general distribution of bedrock and surficial deposits of the Yosemite National Park vicinity. It was produced directly from the file used to create the print version in 1989. The Yosemite National Park region comprises portions of 15 7.5-minute quadrangles. The original 1989 publication included the map, map unit descriptions, correlations, a geologic summary, and references, all on the same sheet. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:125,000 or smaller.

  7. Scoping of Flood Hazard Mapping Needs for Androscoggin County, Maine

    USGS Publications Warehouse

    Schalk, Charles W.; Dudley, Robert W.

    2007-01-01

    Background The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed and as funds allow. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine Floodplain Management Program (MFMP) State Planning Office, began scoping work in 2006 for Androscoggin County. Scoping activities included assembling existing data and map needs information for communities in Androscoggin County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database with information gathered during the scoping process. The average age of the FEMA floodplain maps in Androscoggin County, Maine, is at least 17 years. Most studies were published in the early 1990s, and some towns have partial maps that are more recent than their study date. Since the studies were done, development has occurred in many of the watersheds and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions nor accurately estimate risk in terms of flood heights or flood mapping.

  8. Scoping of Flood Hazard Mapping Needs for Lincoln County, Maine

    USGS Publications Warehouse

    Schalk, Charles W.; Dudley, Robert W.

    2007-01-01

    Background The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine Floodplain Management Program (MFMP) State Planning Office, began scoping work in 2006 for Lincoln County. Scoping activities included assembling existing data and map needs information for communities in Lincoln County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) database with information gathered during the scoping process. The average age of the FEMA floodplain maps in Lincoln County, Maine is at least 17 years. Many of these studies were published in the mid- to late-1980s, and some towns have partial maps that are more recent than their study. However, in the ensuing 15-20 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions nor accurately estimate risk in terms of flood heights or flood mapping.

  9. Digital release of the Alaska Quaternary fault and fold database

    NASA Astrophysics Data System (ADS)

    Koehler, R. D.; Farrell, R.; Burns, P.; Combellick, R. A.; Weakland, J. R.

    2011-12-01

    The Alaska Division of Geological & Geophysical Surveys (DGGS) has designed a Quaternary fault and fold database for Alaska in conformance with standards defined by the U.S. Geological Survey for the National Quaternary fault and fold database. Alaska is the most seismically active region of the United States; however, little information exists on the location, style of deformation, and slip rates of its Quaternary faults. Thus, to provide an accurate, user-friendly, reference-based fault inventory to the public, we are producing a digital GIS shapefile of Quaternary fault traces and compiling summary information on each fault. Here, we present relevant information pertaining to the digital GIS shapefile and the online access and availability of the Alaska database. This database will be useful for engineering geologic studies; geologic, geodetic, and seismic research; and policy planning. The data will also contribute to the fault source database being constructed by the Global Earthquake Model (GEM) Faulted Earth project, which is developing tools to better assess earthquake risk. We derived the initial list of Quaternary active structures from The Neotectonic Map of Alaska (Plafker et al., 1994) and supplemented it with more recent data where available. Because of the limited level of knowledge of Quaternary faults in Alaska, pre-Quaternary fault traces from the Plafker map are shown as a layer in our digital database, both so users may view a more accurate distribution of mapped faults and to suggest that some older traces may be active yet unstudied. The database will be updated as new information is developed. We selected each fault by reviewing the literature and georegistered the faults from 1:250,000-scale paper maps contained in 1970s-vintage and earlier bedrock maps; the paper map scales, however, range from 1:20,000 to 1:500,000.
Fault parameters in our GIS fault attribute tables include fault name, age, slip rate, slip sense, dip direction, fault line type (i.e., well constrained, moderately constrained, or inferred), and mapped scale. Each fault is assigned a three-integer CODE based upon age, slip rate, and how well the fault is located; this CODE dictates the line type for the GIS files. To host the database, we are developing an interactive web-map application with ArcGIS for Server and the ArcGIS API for JavaScript from Environmental Systems Research Institute, Inc. (Esri). The web-map application will present the database through a visible scale range, with each fault displayed at the resolution of the original map. Application functionality includes search by name or location, identification of faults by manual selection, and choice of base map. Base map options include topographic, satellite imagery, and digital elevation maps available from ArcGIS Online. We anticipate that the database will be publicly accessible from a portal embedded on the DGGS website by the end of 2011.

  10. National Map Data Base On Landslide Prerequisites In Clay and Silt Areas - Development of Prototype

    NASA Astrophysics Data System (ADS)

    Viberg, Leif

    The Swedish Geotechnical Institute (SGI), in cooperation with the Swedish geological survey, Lantmäteriet (land surveying) and the Swedish Rescue Services, has developed a theme database on landslide prerequisites in clay and silt areas. The work was carried out on commission from the Swedish government, and a report with suggestions for production of the database has been delivered to the government. The database is a prototype that has been tested in an area in northern Sweden. The recommended presentation map scale is about 1:50,000, and distribution of the database via the Internet is discussed. The aim is to use the database as a modern planning tool in combination with other databases, e.g., databases of flooding prognoses. The main use is expected to be in early planning stages, e.g., for new building and infrastructure development and for risk analyses. The database can also be used in more acute cases, e.g., for risk analyses and rescue operations in connection with flooding over large areas. The intended users are municipal and county planners, rescue services, infrastructure planners, consultants and insurance companies. The database is constructed by combining two existing databases: elevation data and soil map data. The investigation area is divided into three zones with different stability criteria: (1) clay and silt in sloping ground or adjoining water; (2) clay and silt in flat ground; (3) rock and soils other than clay and silt. The geometrical and soil criteria for the zones are specified in an algorithm that sorts the area into the different zones using data from the elevation and soil databases. The investigation area is divided into raster cells of 5 x 5 m side length. Several algorithms had to be developed before a reasonable calculation time was reached. The theme may be presented on screen or as a map plot. A prototype map, with an accompanying description, has been produced for the test area.
It is proposed that the database be produced for landslide-prone areas in Sweden, which would require approximately 200-300 map sheets (25 x 25 km).
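The per-cell zoning algorithm described above can be sketched directly. The slope threshold and the water-adjacency test below are assumptions for illustration; the SGI prototype specifies its own geometrical and soil criteria:

```python
# Classify each 5 m x 5 m raster cell into one of the three stability zones
# from its soil type, local slope, and whether it adjoins water. The 10-degree
# "sloping ground" threshold is an invented stand-in for SGI's criterion.
SLOPE_THRESHOLD_DEG = 10.0

def classify_cell(soil: str, slope_deg: float, adjoins_water: bool) -> int:
    """Zone 1: clay/silt in sloping ground or adjoining water.
    Zone 2: clay/silt in flat ground.
    Zone 3: rock and soils other than clay and silt."""
    if soil not in ("clay", "silt"):
        return 3
    if slope_deg >= SLOPE_THRESHOLD_DEG or adjoins_water:
        return 1
    return 2

# A tiny grid of cells: (soil type, slope in degrees, adjoins water).
grid = [("clay", 15.0, False), ("silt", 2.0, True),
        ("clay", 3.0, False), ("rock", 20.0, False)]
zones = [classify_cell(*cell) for cell in grid]
print(zones)  # [1, 1, 2, 3]
```

In the real database the soil and slope inputs come from the national soil map and elevation databases, and the loop runs over millions of cells, which is why algorithm runtime was a concern.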

  11. Using collaborative tools for energy corridor planning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuiper, J. A.; Cantwell, B.; Hlohowskyj, I.

    2008-01-01

    In November 2007, the Draft Programmatic Environmental Impact Statement on Designation of Energy Corridors on Federal Land in the Western 11 States was released. The draft proposes a network of 6055 miles of energy corridors on lands managed by seven different federal agencies. Determining the proposed locations of the corridors was a large collaborative effort among the agencies and included local, state, and federal land managers. To connect this geographically dispersed group of people, the project team employed a variety of approaches to communicate corridor siting issues, including sharing GIS layers and electronic maps, a downloadable GIS database and ArcReader project, workshops, and Internet webcast teleconferences. This collaborative approach allowed difficult siting issues to be understood and discussed from many perspectives, which resulted in rapid and effective decision making. The result was a proposed corridor system that avoids many sensitive resources and protected lands while accommodating expected energy development.

  12. A simple method for serving Web hypermaps with dynamic database drill-down

    PubMed Central

    Boulos, Maged N Kamel; Roudsari, Abdul V; Carson, Ewart R

    2002-01-01

    Background HealthCyberMap aims at mapping parts of health information cyberspace in novel ways to deliver a semantically superior user experience. This is achieved through "intelligent" categorisation and interactive hypermedia visualisation of health resources using metadata, clinical codes and GIS. HealthCyberMap is an ArcView 3.1 project. WebView, the Internet extension to ArcView, publishes HealthCyberMap ArcView Views as Web client-side imagemaps. The basic WebView set-up does not support any GIS database connection, and published Web maps become disconnected from the original project. A dedicated Internet map server would be the best way to serve HealthCyberMap database-driven interactive Web maps, but is an expensive and complex solution to acquire, run and maintain. This paper describes HealthCyberMap's simple, low-cost method for "patching" WebView to serve hypermaps with dynamic database drill-down functionality on the Web. Results The proposed solution is currently used for publishing HealthCyberMap GIS-generated navigational information maps on the Web while maintaining their links with the underlying resource metadata base. Conclusion The authors believe that their map-serving approach, as adopted in HealthCyberMap, has been very successful, especially in cases when only map attribute data change without a corresponding effect on map appearance. It should also be possible to use the same solution to publish other interactive GIS-driven maps on the Web, e.g., maps of real world health problems. PMID:12437788

  13. Error and Uncertainty in the Accuracy Assessment of Land Cover Maps

    NASA Astrophysics Data System (ADS)

    Sarmento, Pedro Alexandre Reis

    Traditionally, the accuracy assessment of land cover maps is performed by comparing the maps with a reference database intended to represent the "real" land cover, with the results reported as thematic accuracy measures derived from confusion matrices. However, these reference databases are themselves representations of reality: they contain errors due to human uncertainty in assigning the land cover class that best characterizes a given area, which biases the thematic accuracy measures reported to the end users of the maps. The main goal of this dissertation is to develop a methodology that allows the integration of the human uncertainty present in reference databases into the accuracy assessment of land cover maps, and to analyse the impacts that this uncertainty may have on the thematic accuracy measures reported to end users. The utility of including human uncertainty in the accuracy assessment of land cover maps is investigated. Specifically, we studied the utility of fuzzy set theory, more precisely fuzzy arithmetic, for a better understanding of the human uncertainty associated with the elaboration of reference databases and of its impacts on the thematic accuracy measures derived from confusion matrices. For this purpose, linguistic values transformed into fuzzy intervals that express the uncertainty in the elaboration of reference databases were used to compute fuzzy confusion matrices. The proposed methodology is illustrated with a case study assessing the accuracy of a land cover map of Continental Portugal derived from Medium Resolution Imaging Spectrometer (MERIS) imagery. The results demonstrate that including human uncertainty in reference databases provides much more information about the quality of land cover maps than the traditional approach to accuracy assessment.
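A simplified version of this idea replaces each confusion-matrix count with a plain interval reflecting labeller uncertainty, then propagates the intervals into bounds on overall accuracy. This is a two-endpoint reduction of the dissertation's fuzzy-arithmetic formulation, and the matrix values are invented:

```python
# Overall accuracy from an interval-valued confusion matrix: matrix[i][j] is
# an (lo, hi) interval for the count of reference class i mapped to class j.
# Because the diagonal also appears in the total, the bounds pair the fewest
# agreements with the most disagreements (and vice versa), rather than
# applying naive interval division, which would give looser bounds.
def interval_overall_accuracy(matrix):
    n = len(matrix)
    diag_lo = sum(matrix[i][i][0] for i in range(n))
    diag_hi = sum(matrix[i][i][1] for i in range(n))
    off_lo = sum(matrix[i][j][0] for i in range(n) for j in range(n) if i != j)
    off_hi = sum(matrix[i][j][1] for i in range(n) for j in range(n) if i != j)
    return diag_lo / (diag_lo + off_hi), diag_hi / (diag_hi + off_lo)

# Invented two-class example: counts uncertain by a few samples each.
m = [[(40, 50), (5, 10)],
     [(5, 10), (35, 45)]]
lo, hi = interval_overall_accuracy(m)
print(round(lo, 3), round(hi, 3))  # 0.789 0.905
```

Instead of a single accuracy figure, the end user sees a range whose width reflects how uncertain the reference labels were.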

  14. Plant Genome Resources at the National Center for Biotechnology Information

    PubMed Central

    Wheeler, David L.; Smith-White, Brian; Chetvernin, Vyacheslav; Resenchuk, Sergei; Dombrowski, Susan M.; Pechous, Steven W.; Tatusova, Tatiana; Ostell, James

    2005-01-01

    The National Center for Biotechnology Information (NCBI) integrates data from more than 20 biological databases through a flexible search and retrieval system called Entrez. A core Entrez database, Entrez Nucleotide, includes GenBank and is tightly linked to the NCBI Taxonomy database, the Entrez Protein database, and the scientific literature in PubMed. A suite of more specialized databases for genomes, genes, gene families, gene expression, gene variation, and protein domains dovetails with the core databases to make Entrez a powerful system for genomic research. Linked to the full range of Entrez databases is the NCBI Map Viewer, which displays aligned genetic, physical, and sequence maps for eukaryotic genomes including those of many plants. A specialized plant query page allows maps from all plant genomes covered by the Map Viewer to be searched in tandem to produce a display of aligned maps from several species. PlantBLAST searches against the sequences shown in the Map Viewer allow BLAST alignments to be viewed within a genomic context. In addition, precomputed sequence similarities, such as those for proteins offered by BLAST Link, enable fluid navigation from unannotated to annotated sequences, quickening the pace of discovery. NCBI Web pages for plants, such as Plant Genome Central, complete the system by providing centralized access to NCBI's genomic resources as well as links to organism-specific Web pages beyond NCBI. PMID:16010002

  15. A community effort to construct a gravity database for the United States and an associated Web portal

    USGS Publications Warehouse

    Keller, Gordon R.; Hildenbrand, T.G.; Kucks, R.; Webring, M.; Briesacher, A.; Rujawitz, K.; Hittleman, A.M.; Roman, D.R.; Winester, D.; Aldouri, R.; Seeley, J.; Rasillo, J.; Torres, R.; Hinze, W. J.; Gates, A.; Kreinovich, V.; Salayandia, L.

    2006-01-01

    Potential field data (gravity and magnetic measurements) are both useful and cost-effective tools for many geologic investigations. Significant amounts of these data are traditionally in the public domain. A new magnetic database for North America was released in 2002, and as a result, a cooperative effort between government agencies, industry, and universities to compile an upgraded digital gravity anomaly database, grid, and map for the conterminous United States was initiated and is the subject of this paper. This database is being crafted into a data system that is accessible through a Web portal. This data system features the database, software tools, and convenient access. The Web portal will enhance the quality and quantity of data contributed to the gravity database, which will be a shared community resource. The system's totally digital nature ensures that it will be flexible, so that it can grow and evolve as new data, processing procedures, and modeling and visualization tools become available. Another goal of this Web-based data system is facilitation of the efforts of researchers and students who wish to collect data from regions currently not represented adequately in the database. The primary goal of upgrading the United States gravity database and this data system is to provide more reliable data that support societal and scientific investigations of national importance. An additional motivation is the international intent to compile an enhanced North American gravity database, which is critical to understanding regional geologic features, the tectonic evolution of the continent, and other issues that cross national boundaries. © 2006 Geological Society of America. All rights reserved.

  16. Local Integration of the National Atmospheric Release Advisory Center with Cities (LINC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ermak, D L; Tull, J E; Mosley-Rovi, R

    The objective of the ''Local Integration of the National Atmospheric Release Advisory Center with Cities'' (LINC) program is to demonstrate the capability for providing local government agencies with an advanced operational atmospheric plume prediction capability that can be seamlessly integrated with appropriate federal agency support for homeland security applications. LINC is a Domestic Demonstration and Application Program (DDAP) funded by the Chemical and Biological National Security Program (CBNP), which is part of the Department of Energy's (DOE) National Nuclear Security Administration (NNSA). LINC will make use of capabilities that have been developed under the CBNP and integrated into the National Atmospheric Release Advisory Center (NARAC) at Lawrence Livermore National Laboratory (LLNL). NARAC tools and services will be provided to pilot-study cities and counties to map plumes from terrorism threats. Support to these local agencies will include training and customized support for exercises, special events, and general emergencies. NARAC provides tools and services that map the probable spread of hazardous materials that have been accidentally or intentionally released into the atmosphere. Primarily supported by the DOE, NARAC is a national support and resource center for planning, real-time assessment, and detailed studies of incidents involving a wide variety of hazards, including radiological, chemical, or biological releases. NARAC is a distributed system, providing modeling and geographical information tools for use on an end user's computer system, as well as real-time access to global meteorological and geographical databases and advanced three-dimensional model predictions.

  17. Maps for the nation: The current federal mapping establishment

    USGS Publications Warehouse

    North, G.W.

    1983-01-01

    The U.S. Government annually produces an estimated 53,000 new maps and charts and distributes about 160 million copies. A large number of these maps are produced under the national mapping program, a decentralized Federal/State cooperative approach to mapping the country at standard scales. Circular A-16, issued by the Office of Management and Budget in 1953 and revised in 1967, delegates the mapping responsibilities to various federal agencies. The U.S. Department of the Interior's Geological Survey is the principal federal agency responsible for implementing the national mapping program. Other major federal map producing agencies include the Departments of Agriculture, Commerce, Defense, Housing and Urban Development, and Transportation, and the Tennessee Valley Authority. To make maps and mapping information more readily available, the National Cartographic Information Center was established in 1974 and an expanded National Map Library Depository Program in 1981. The most recent of many technological advances made under the mapping program are in the areas of digital cartography and video disc and optical disc information storage systems. Future trends and changes in the federal mapping program will involve expanded information and customer service operations, further developments in the production and use of digital cartographic data, and consideration of a Federal Mapping Agency. © 1983.

  18. Development of a One-Stop Data Search and Discovery Engine using Ontologies for Semantic Mappings (HydroSeek)

    NASA Astrophysics Data System (ADS)

    Piasecki, M.; Beran, B.

    2007-12-01

    Search engines have changed the way we see the Internet. The ability to find information by just typing in keywords was a big contribution to the overall web experience. While the conventional search-engine methodology works well for textual documents, locating scientific data remains a problem, since such data are stored in databases not readily accessible to search-engine bots. Given the different temporal, spatial, and thematic coverage of different databases, interdisciplinary research in particular typically requires working with multiple data sources. These sources can be federal agencies, which generally offer national coverage, or regional sources that cover a smaller area in higher detail. For a given geographic area of interest, there is often more than one database with relevant data. Being able to query multiple databases simultaneously is therefore a desirable feature that would be tremendously useful for scientists. Developing such a search engine requires dealing with various heterogeneity issues. Scientific database systems often impose controlled vocabularies, which ensure that each database is internally homogeneous, but the databases are semantically heterogeneous with respect to one another. This bounds the possible semantic problems, making them easier to solve than in conventional search engines that deal with free text. We have developed a search engine that queries multiple data sources simultaneously and returns data in a standardized output despite the aforementioned heterogeneity between the underlying systems. The application relies mainly on metadata catalogs or indexing databases, ontologies, and web services, with virtual globe and AJAX technologies for the graphical user interface. Users can trigger a search over dozens of different parameters and hundreds of thousands of stations from multiple agencies by providing a keyword, a spatial extent (i.e., a bounding box), and a temporal bracket. As part of this development we have also added an environment that allows users to do some of the semantic tagging themselves, i.e., to link a variable name (which can be anything they desire) to defined concepts in the ontology structure, which in turn provides the backbone of the search engine.
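    The semantic-tagging mechanism can be sketched as follows. All vocabularies, station names, and catalogs here are invented; the sketch only shows the core idea of resolving heterogeneous local variable names to shared ontology concepts before fanning a query out to multiple sources.

```python
# Illustrative sketch: resolve a search keyword to an ontology concept, then
# query several catalogs whose local controlled vocabularies differ but whose
# variables are tagged with the same concepts.

ONTOLOGY = {
    # local variable name -> shared concept
    "streamflow": "discharge",
    "flow_cfs": "discharge",
    "Q": "discharge",
    "water_temp": "temperature",
    "temp_c": "temperature",
}

CATALOGS = {
    "agency_a": [{"station": "A1", "variable": "flow_cfs"},
                 {"station": "A2", "variable": "temp_c"}],
    "agency_b": [{"station": "B7", "variable": "streamflow"}],
}

def search(keyword):
    concept = ONTOLOGY.get(keyword, keyword)
    hits = []
    for source, records in CATALOGS.items():
        for rec in records:
            if ONTOLOGY.get(rec["variable"]) == concept:
                hits.append({"source": source, "concept": concept, **rec})
    return hits

results = search("Q")  # "Q" resolves to the shared concept "discharge"
```

    A real deployment would add the spatial (bounding-box) and temporal filters described above and issue the per-source queries through web services rather than in-memory lists.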

  19. Surficial geologic map of the Amboy 30' x 60' quadrangle, San Bernardino County, California

    USGS Publications Warehouse

    Bedford, David R.; Miller, David M.; Phelps, Geoffrey A.

    2010-01-01

    The surficial geologic map of the Amboy 30' x 60' quadrangle presents characteristics of surficial materials for an area of approximately 5,000 km2 in the eastern Mojave Desert of southern California. The map consists of new surficial mapping conducted between 2000 and 2007, as well as compilations from previous surficial mapping. Surficial geologic units are mapped and described based on depositional process and age categories that reflect the mode of deposition, pedogenic effects following deposition, and, where appropriate, the lithologic nature of the material. Many physical properties were noted and measured during the geologic mapping. This information was used to classify surficial deposits and to understand their ecological importance. We focus on physical properties that drive hydrologic, biologic, and physical processes, such as particle-size distribution (PSD) and bulk density. The database contains point data representing the locations of samples for both laboratory-determined physical properties and semiquantitative field-based information. We include the locations of all field observations and note the type of information collected in the field to assist in assessing the quality of the mapping. The publication is separated into three parts: documentation, spatial data, and printable map graphics of the database. Documentation includes this pamphlet, which discusses the surficial geology, the map units, and the map itself. Spatial data are distributed as an ArcGIS geodatabase in Microsoft Access format and are accompanied by a readme file, which describes the database contents, and FGDC metadata for the spatial map information. Map graphics files are distributed as PostScript and Adobe Portable Document Format (PDF) files that provide a view of the spatial database at the mapped scale.
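    As a toy illustration of how a measured particle-size distribution can drive deposit classification (the thresholds and class names below are invented, not the map's actual scheme):

```python
# Simplified illustration: assign a coarse texture class from sand/silt/clay
# fractions. Thresholds here are hypothetical, chosen only for the example.

def texture_class(sand, silt, clay):
    """Fractions must sum to ~1; returns a coarse, illustrative texture class."""
    if abs((sand + silt + clay) - 1.0) > 0.01:
        raise ValueError("fractions must sum to 1")
    if sand >= 0.7:
        return "sandy"
    if clay >= 0.4:
        return "clayey"
    if silt >= 0.5:
        return "silty"
    return "loamy"

cls = texture_class(0.75, 0.15, 0.10)  # a sand-dominated sample
```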

  20. Official crime data versus collaborative crime mapping at a Brazilian city

    NASA Astrophysics Data System (ADS)

    Brito, P. L.; Jesus, E. G. V.; Sant'Ana, R. M. S.; Martins, C.; Delgado, J. P. M.; Fernandes, V. O.

    2014-11-01

    In July 2013 a group of undergraduate students from the Federal University of Bahia, Brazil, published a collaborative web map called "Where I Was Robbed". Their initial efforts to publicize the web map were restricted to announcing it on a local radio station as a tool of social interest. Within two months the map had almost 10,000 reports, about 155 reports per day, and people from more than 350 cities had reported a crime. The present study investigates the spatial correlation between this collaborative web map and the official robbery data registered in the Secretary of Public Safety database for the city of Salvador, Bahia. Kernel density estimation combined with map algebra was used for the investigation. Initially, no spatial correlation with the official robbery data was found, but after standardizing the collaborative data and mining the official registers, both datasets pointed to very similar areas as the main hot spots for pedestrian robbery. Both areas lie in two of the most economically active parts of the city, although the web map's crime reports were more concentrated in an area with a higher-income population. These results indicate that the collaborative application is used mainly by the middle- and upper-class segments of the city's population, but it can still provide significant information on public-safety priority areas. Therefore, wider publicity for the collaborative crime map application in local papers, radio, and TV, together with partnerships with official agencies, is strongly recommended.
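    The kernel-density-plus-map-algebra comparison can be sketched as below. This is a generic Gaussian KDE on a grid with an illustrative difference surface, not the study's actual data or parameters; all coordinates and the bandwidth are invented.

```python
import numpy as np

# Sketch: kernel density surfaces for two crime point sets, then map algebra
# (here, a difference) to compare their hot spots.

def kde_grid(points, xs, ys, bandwidth=1.0):
    """Gaussian kernel density evaluated on a regular grid."""
    gx, gy = np.meshgrid(xs, ys)
    density = np.zeros_like(gx, dtype=float)
    for px, py in points:
        d2 = (gx - px) ** 2 + (gy - py) ** 2
        density += np.exp(-d2 / (2 * bandwidth ** 2))
    return density / (2 * np.pi * bandwidth ** 2 * len(points))

xs = np.linspace(0, 10, 50)
ys = np.linspace(0, 10, 50)
official = [(2.0, 2.0), (2.5, 2.2), (8.0, 8.0)]       # invented point sets
collaborative = [(2.2, 1.9), (8.1, 7.8), (8.0, 8.3)]

# Positive cells: official density exceeds collaborative density, and vice versa.
diff = kde_grid(official, xs, ys) - kde_grid(collaborative, xs, ys)
```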

  1. 78 FR 46338 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-31

    ... Quality's (AHRQ) Hospital Survey on Patient Safety Culture Comparative Database.'' In accordance with the... Safety Culture Comparative Database Request for information collection approval. The Agency for... on Patient Safety Culture (Hospital SOPS) Comparative Database; OMB NO. 0935-0162, last approved on...

  2. 76 FR 67732 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    ... proposed information collection project: ``Nursing Home Survey on Patient Safety Culture Comparative... Nursing Home Survey on Patient Safety Culture Comparative Database The Agency for Healthcare Research and... Culture (Nursing Home SOPS) Comparative Database. The Nursing Home SOPS Comparative Database consists of...

  3. A GIS-Enabled, Michigan-Specific, Hierarchical Groundwater Modeling and Visualization System

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Li, S.; Mandle, R.; Simard, A.; Fisher, B.; Brown, E.; Ross, S.

    2005-12-01

    Efficient management of groundwater resources relies on a comprehensive database that represents the characteristics of the natural groundwater system, as well as analysis and modeling tools to describe the impacts of decision alternatives. Many agencies in Michigan have spent several years compiling expensive and comprehensive surface water and groundwater inventories and other related spatial data that describe their respective areas of responsibility. Most often, however, this wealth of descriptive data has been utilized only for basic mapping purposes. The benefits of analyzing these data, using GIS analysis functions or externally developed analysis models or programs, have yet to be systematically realized. In this talk, we present a comprehensive software environment that allows Michigan groundwater resource managers and frontline professionals to make more effective use of the available data and improve their ability to manage and protect groundwater resources, address potential conflicts, design cleanup schemes, and prioritize investigation activities. In particular, we take advantage of the Interactive Ground Water (IGW) modeling system and convert it into a customized software environment specifically for analyzing, modeling, and visualizing the Michigan statewide groundwater database. The resulting Michigan IGW modeling system (IGW-M) is completely window-based, fully interactive, and seamlessly integrated with a GIS mapping engine. The system operates in real time (on the fly), providing dynamic, hierarchical mapping, modeling, spatial analysis, and visualization. Specifically, IGW-M allows water resources and environmental professionals in Michigan to: * access and utilize the extensive data from the statewide groundwater database, interactively manipulate GIS objects, and display and query the associated data and attributes; * analyze and model the statewide groundwater database, interactively convert GIS objects into numerical model features, automatically extract data and attributes, and simulate unsteady groundwater flow and contaminant transport in response to water and land management decisions; * visualize and map model simulations and predictions with data from the statewide groundwater database in a seamless interactive environment. IGW-M has the potential to significantly improve the productivity of Michigan groundwater management investigations. It shifts the role of engineers and scientists in modeling and analyzing the statewide groundwater database from heavily physical tasks to cognitive problem-solving and decision-making tasks. The seamless real-time integration, real-time visual interaction, and real-time processing capability allow a user to focus on critical management issues, conflicts, and constraints; to quickly and iteratively examine conceptual approximations, management and planning scenarios, and site-characterization assumptions; to identify dominant processes; to evaluate data worth and sensitivity; and to guide further data-collection activities. We illustrate the power and effectiveness of the IGW-M modeling and visualization system with a real case study and a real-time, live demonstration.

  4. Database for the geologic map of the Sauk River 30-minute by 60-minute quadrangle, Washington (I-2592)

    USGS Publications Warehouse

    Tabor, R.W.; Booth, D.B.; Vance, J.A.; Ford, A.B.

    2006-01-01

    This digital map database has been prepared by R.W. Tabor from the published geologic map of the Sauk River 30- by 60-minute quadrangle, Washington. Together with the accompanying PDF text files, it provides information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The authors mapped most of the bedrock geology at 1:100,000 scale but compiled most Quaternary units at 1:24,000 scale. The Quaternary contacts and structural data have been much simplified for the 1:100,000-scale map and database. The spatial resolution (scale) of the database is 1:100,000 or smaller. This database depicts the distribution of geologic materials and structures at a regional (1:100,000) scale. The report is intended to provide geologic information for the regional study of materials properties, earthquake shaking, landslide potential, mineral hazards, seismic velocity, and earthquake faults. In addition, the report contains information and interpretations about the regional geologic history and framework. However, the regional scale of this report does not provide sufficient detail for site development purposes.

  5. USGS national surveys and analysis projects: Preliminary compilation of integrated geological datasets for the United States

    USGS Publications Warehouse

    Nicholson, Suzanne W.; Stoeser, Douglas B.; Wilson, Frederic H.; Dicken, Connie L.; Ludington, Steve

    2007-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional and national digital geologic maps attributed with age and rock type information. Such spatial data can be conveniently used to generate derivative maps for purposes that include mineral-resource assessment, metallogenic studies, tectonic studies, and human health and environmental research. In 1997, the United States Geological Survey's Mineral Resources Program initiated an effort to develop national digital databases for use in mineral resource and environmental assessments. One primary activity of this effort was to compile a national digital geologic map database, utilizing state geologic maps, to support mineral resource studies at scales from 1:250,000 to 1:1,000,000. Over the course of the past decade, state databases were prepared using a common standard for the database structure, fields, attributes, and data dictionaries. As of late 2006, standardized geological map databases for all conterminous (CONUS) states have been available on-line as USGS Open-File Reports. For Alaska and Hawaii, new state maps are being prepared, and the preliminary work for Alaska is being released as a series of 1:500,000-scale regional compilations. See below for a list of all published databases.

  6. 78 FR 79660 - Agency Information Collection Activities: Proposed Collection; Comment Request-Child Nutrition...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-31

    ... DEPARTMENT OF AGRICULTURE Food and Nutrition Service Agency Information Collection Activities: Proposed Collection; Comment Request--Child Nutrition Database AGENCY: Food and Nutrition Service, USDA... Nutrition Database in support of the Healthy Hunger Free Kids Act. DATES: Written comments on this notice...

  7. 75 FR 41140 - Agency Information Collection Activities: Proposed Collection; Comment Request-Child Nutrition...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-15

    ... DEPARTMENT OF AGRICULTURE Food and Nutrition Service Agency Information Collection Activities: Proposed Collection; Comment Request--Child Nutrition Database AGENCY: Food and Nutrition Service, USDA... nutrient data from the food service industry to update and expand the Child Nutrition Database in support...

  8. GIS Methodic and New Database for Magmatic Rocks. Application for Atlantic Oceanic Magmatism.

    NASA Astrophysics Data System (ADS)

    Asavin, A. M.

    2001-12-01

    Several geochemical databases are now available on the Internet. One of the main peculiarities of the information they store is that each sample carries geographical coordinates. As a rule, however, the database software uses this spatial information only in user-interface search procedures. On the other hand, GIS software (Geographical Information System software) such as ARC/INFO, which is used to create and analyze special geological, geochemical, and geophysical e-maps, is built deeply around the geographical coordinates of samples. We joined the capabilities of GIS systems and a relational geochemical database through special software. Our geochemical information system was created at the Vernadsky State Geological Museum and the Institute of Geochemistry and Analytical Chemistry in Moscow. We have tested the system with geochemical data on oceanic rocks from the Atlantic and Pacific oceans, about 10,000 chemical analyses. The GIS information content consists of e-map covers of the globe. Parts of these maps cover the Atlantic Ocean: a gravity map (with a 2'' grid), oceanic bottom heat flow, altimetric maps, seismic activity, a tectonic map, and a geological map. Combining this information content makes it possible to create new geochemical maps and to combine spatial analysis with numerical geochemical modeling of volcanic processes in an ocean segment. We have tested the information system using thick-client technology. The interface between the GIS system Arc/View and the database resides in a sequence of special SQL queries. The result of these queries is a simple DBF file with geographical coordinates. This file is used at the moment geochemical and other special e-maps of an oceanic region are created. For geophysical data we used a more complex method: from ArcView we created a grid cover for the polygon spatial geophysical information.
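    The GIS-to-database linkage described above (a SQL query whose flat result, with coordinates, becomes a GIS point layer) can be sketched with SQLite standing in for the actual RDBMS; the schema and values below are invented.

```python
import sqlite3

# Sketch: a join pulls analyses together with sample coordinates into one flat
# table, which a GIS could load as a point layer (the authors exported DBF
# files for Arc/View in the same spirit).

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE samples  (id INTEGER PRIMARY KEY, lat REAL, lon REAL);
CREATE TABLE analyses (sample_id INTEGER, element TEXT, ppm REAL);
INSERT INTO samples  VALUES (1, -10.5, -15.2), (2, 0.8, -25.0);
INSERT INTO analyses VALUES (1, 'Sr', 120.0), (2, 'Sr', 340.0);
""")

# One flat row per sample: coordinates plus the queried concentration.
rows = conn.execute("""
    SELECT s.lat, s.lon, a.ppm
    FROM samples s JOIN analyses a ON a.sample_id = s.id
    WHERE a.element = 'Sr'
    ORDER BY s.id
""").fetchall()
```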

  9. Geospatial application for the identification and monitoring of rubber smallholders in the Malaysian state of Negeri Sembilan

    NASA Astrophysics Data System (ADS)

    Hafiz Mohd Hazir, Mohd; Muda, Tuan Mohamad Tuan

    2016-06-01

    The Malaysian rubber industry, especially the upstream sector, depends heavily on smallholders to produce latex or cup lump. Identifying and monitoring rubber smallholders are essential tasks for the Malaysian rubber industry's sustainability, allowing the authorised agencies that support the smallholders to plan, organise, and manage better. This paper introduces a method of calculating the total number of smallholders and identifying the locations of their planted rubber areas. The scope of this study was limited to land owners as rubber smallholders in the selected study area of Negeri Sembilan. The land use map provided by the Department of Agriculture Malaysia gave information on the distribution of rubber areas in Malaysia, while the cadastral map from the Department of Survey and Mapping Malaysia was used specifically to identify the land owner of each rubber parcel or rubber lot. Both datasets were analyzed and processed with ArcGIS software to extract the information, and the results were then compared with the Malaysian Rubber Board smallholders database.
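    A highly simplified stand-in for the overlay analysis: treating parcels and rubber areas as axis-aligned rectangles, a parcel owner counts as a rubber smallholder if the parcel intersects any rubber area. Real polygon overlay (done here with ArcGIS) is more involved; all names and coordinates below are invented.

```python
# Rectangles are (xmin, ymin, xmax, ymax) in arbitrary map units.

def intersects(a, b):
    """True if two axis-aligned rectangles overlap (open intervals)."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

rubber_areas = [(0, 0, 5, 5), (10, 10, 12, 12)]      # from the land use map
parcels = {                                          # from the cadastral map
    "owner_1": (1, 1, 3, 3),     # inside the first rubber area
    "owner_2": (6, 6, 8, 8),     # outside both
    "owner_3": (11, 9, 13, 11),  # overlaps the second rubber area
}

smallholders = sorted(owner for owner, parcel in parcels.items()
                      if any(intersects(parcel, r) for r in rubber_areas))
```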

  10. Database for the geologic map of upper Eocene to Holocene volcanic and related rocks in the Cascade Range, Washington

    USGS Publications Warehouse

    Barron, Andrew D.; Ramsey, David W.; Smith, James G.

    2014-01-01

    This digital database contains information used to produce the geologic map published as Sheet 1 in U.S. Geological Survey Miscellaneous Investigations Series Map I-2005. (Sheet 2 of Map I-2005 shows sources of geologic data used in the compilation and is available separately). Sheet 1 of Map I-2005 shows the distribution and relations of volcanic and related rock units in the Cascade Range of Washington at a scale of 1:500,000. This digital release is produced from stable materials originally compiled at 1:250,000 scale that were used to publish Sheet 1. The database therefore contains more detailed geologic information than is portrayed on Sheet 1. This is most noticeable in the database as expanded polygons of surficial units and the presence of additional strands of concealed faults. No stable compilation materials exist for Sheet 1 at 1:500,000 scale. The main component of this digital release is a spatial database prepared using geographic information systems (GIS) applications. This release also contains links to files to view or print the map sheet, main report text, and accompanying mapping reference sheet from Map I-2005. For more information on volcanoes in the Cascade Range in Washington, Oregon, or California, please refer to the U.S. Geological Survey Volcano Hazards Program website.

  11. Visualizing the semantic content of large text databases using text maps

    NASA Technical Reports Server (NTRS)

    Combs, Nathan

    1993-01-01

    A methodology for generating text map representations of the semantic content of text databases is presented. Text maps provide a graphical metaphor for conceptualizing and visualizing the contents and data interrelationships of large text databases. A set of experiments conducted against the TIPSTER corpora of Wall Street Journal articles is also described. These experiments provide an introduction to current work in the representation and visualization of documents by way of their semantic content.
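    The general idea, though not this paper's specific algorithm, can be illustrated with term-frequency vectors and cosine similarity; a layout step would then place similar documents near each other on a 2-D text map. The documents below are invented.

```python
from collections import Counter
import math

# Sketch: bag-of-words vectors and pairwise cosine similarity as the raw
# material for a text-map layout.

def tf(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

docs = {
    "d1": "stocks fell as markets reacted to rates",
    "d2": "markets rallied and stocks rose on rates",
    "d3": "new museum exhibit opens downtown",
}
vecs = {k: tf(v) for k, v in docs.items()}
sim_12 = cosine(vecs["d1"], vecs["d2"])  # related: shared financial terms
sim_13 = cosine(vecs["d1"], vecs["d3"])  # unrelated: no shared terms
```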

  12. Progressive and self-limiting neurodegenerative disorders in Africa: a new prominent field of research led by South Africa but without strong health policy.

    PubMed

    Poreau, Brice

    2016-01-01

    Neurodegenerative disorders contribute to mortality and morbidity in every country, and a high prevalence is estimated in Africa. Neurodegenerative disorders are defined by a progressive or self-limiting alteration of neurons involved in specific functional and anatomical roles; they encompass a wide range of clinical disorders, from self-limiting to progressive. Focused public health policies and scientific research are needed to understand the underlying mechanisms and reduce this high prevalence. We used bibliometrics and mapping tools to explore the study areas and countries involved in scientific research on neurodegenerative disorders in Africa. We used two databases: Web of Science and PubMed. We analyzed the journals, most-cited articles, authors, publication years, organizations, funding agencies, countries, and keywords in the Web of Science Core Collection database, and the publication years and Medical Subject Headings in the PubMed database. We mapped the data using VOSviewer. We retrieved 44 articles published between 1975 and 2014 from the Web of Science Core Collection database and 669 from the PubMed database, the majority of them published after 2006. The main countries involved in research on neurodegenerative disorders in Africa were the USA, the United Kingdom, France, and South Africa, which form the main collaboration network. Clinical Neurology and Genetics & Heredity are the main Web of Science categories, whereas Neurosciences and Biochemistry & Molecular Biology are the main categories for the general search "neurodegenerative disorders" not restricted to Africa. This is confirmed by the Medical Subject Headings analysis from PubMed, which adds one more study area: treatment. Research on neurodegenerative disorders is led by South Africa within a network involving the USA and the UK, as well as other African countries such as Zambia. The chief fields that emerged concerned patients and heredity, as well as treatment. Public health policies were a lacking field in this research, even though prevalence is estimated to be substantial in every country. The 17 new Sustainable Development Goals of the United Nations could help in this regard.

  13. 32 CFR 1900.21 - Processing of requests for records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Information Act Amendments of 1996. (b) Database of “officially released information.” As an alternative to extensive tasking and as an accommodation to many requesters, the Agency maintains a database of “officially released information” which contains copies of documents released by this Agency. Searches of this database...

  14. 3 CFR - Enhancing Payment Accuracy Through a “Do Not Pay List”

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... are not made. Agencies maintain many databases containing information on a recipient's eligibility to... databases before making payments or awards, agencies can identify ineligible recipients and prevent certain... pre-payment and pre-award procedures and ensure that a thorough review of available databases with...

  15. 76 FR 59186 - Agency Information Collection Activities: Requests for Comments; Clearance of Renewed Approval of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-23

    ... Region Aviation Expo Database AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice and... information collection. The New England Region Aviation Expo database performs conference registration and... Region Aviation Expo Database. Form Numbers: There are no FAA forms associated with this collection. Type...

  16. 76 FR 74841 - Agency Information Collection Activities: Requests for Comments; Clearance of Renewed Approval of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-01

    ... Region Aviation Expo Database AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice and... New England Region Aviation Expo database performs conference registration and helps plan the... Expo Database. Form Numbers: There are no FAA forms associated with this collection. Type of Review...

  17. Spatial digital database of the geologic map of Catalina Core Complex and San Pedro Trough, Pima, Pinal, Gila, Graham, and Cochise counties, Arizona

    USGS Publications Warehouse

    Dickinson, William R.; digital database by Hirschberg, Douglas M.; Pitts, G. Stephen; Bolm, Karen S.

    2002-01-01

    The geologic map of Catalina Core Complex and San Pedro Trough by Dickinson (1992) was digitized for input into a geographic information system (GIS) by the U.S. Geological Survey staff and contractors in 2000-2001. This digital geospatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information in a geographic information system (GIS) for use in spatial analysis. The resulting digital geologic map database data can be queried in many ways to produce a variety of geologic maps and derivative products. Digital base map data (topography, roads, towns, rivers, lakes, and so forth) are not included; they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:125,000 (for example, 1:100,000 or 1:24,000). The digital geologic map plot files that are provided herein are representations of the database. The map area is located in southern Arizona. This report lists the geologic map units, the methods used to convert the geologic map data into a digital format, the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. The manuscript and digital data review by Lorre Moyer (USGS) is greatly appreciated.

  18. Terrestrial Sediments of the Earth: Development of a Global Unconsolidated Sediments Map Database (GUM)

    NASA Astrophysics Data System (ADS)

    Börker, J.; Hartmann, J.; Amann, T.; Romero-Mujalli, G.

    2018-04-01

    Mapped unconsolidated sediments cover half of the global land surface. They are of considerable importance for many Earth surface processes like weathering, hydrological fluxes or biogeochemical cycles. Ignoring their characteristics or spatial extent may lead to misinterpretations in Earth System studies. Therefore, a new Global Unconsolidated Sediments Map database (GUM) was compiled, using regional maps specifically representing unconsolidated and Quaternary sediments. The new GUM database provides insights into the regional distribution of unconsolidated sediments and their properties. The GUM comprises 911,551 polygons and describes not only sediment types and subtypes, but also parameters like grain size, mineralogy, age and thickness where available. Previous global lithological maps or databases lacked detail for reported unconsolidated sediment areas or missed large areas, and reported a global coverage of 25 to 30%, considering the ice-free land area. Here, alluvial sediments cover about 23% of the mapped total ice-free area, followed by aeolian sediments (˜21%), glacial sediments (˜20%), and colluvial sediments (˜16%). A specific focus during the creation of the database was on the distribution of loess deposits, since loess is highly reactive and relevant to understanding geochemical cycles related to dust deposition and weathering processes. An additional layer compiling pyroclastic sediments is added, which merges consolidated and unconsolidated pyroclastic sediments. The compilation shows latitudinal abundances of sediment types related to the climate of the past. The GUM database is available at the PANGAEA database (https://doi.org/10.1594/PANGAEA.884822).

  19. Digital geomorphological landslide hazard mapping of the Alpago area, Italy

    NASA Astrophysics Data System (ADS)

    van Westen, Cees J.; Soeters, Rob; Sijmons, Koert

    Large-scale geomorphological maps of mountainous areas are traditionally made using complex symbol-based legends. They can serve as excellent "geomorphological databases", from which an experienced geomorphologist can extract a large amount of information for hazard mapping. However, these maps are not designed to be used in combination with a GIS, due to their complex cartographic structure. In this paper, two methods are presented for digital geomorphological mapping at large scales using GIS and digital cartographic software. The methods are applied to an area with a complex geomorphological setting in the Borsoia catchment, located in the Alpago region near Belluno in the Italian Alps. The GIS database set-up is presented with an overview of the data layers that have been generated and how they are interrelated. The GIS database was also converted into a paper map, using a digital cartographic package. The resulting large-scale geomorphological hazard map is attached. The resulting GIS database and cartographic product can be used to analyse the hazard type and hazard degree for each polygon, and to find the reasons for the hazard classification.

  20. 77 FR 16235 - Agency Information Collection Activities: Proposed Collection; Comment Request; Guidance for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-20

    ... is entered into the appropriate tracking databases. Use of the information in the Agency's tracking databases enables the Agency to monitor progress on the activities attendant to scheduling and holding a... collection of information on respondents, including through the use of automated collection techniques, when...

  1. 78 FR 69093 - Agency Information Collection Activities; Proposed Collection; Comment Request; Guidance for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-18

    ... request is entered into the appropriate tracking databases. Use of the information in the Agency's tracking databases enables the appropriate Agency official to monitor progress on the evaluation of the... collection of information on respondents, including through the use of automated collection techniques, when...

  2. Landscape features, standards, and semantics in U.S. national topographic mapping databases

    USGS Publications Warehouse

    Varanka, Dalia

    2009-01-01

    The objective of this paper is to examine the contrast between local, field-surveyed topographical representation and feature representation in digital, centralized databases and to clarify their ontological implications. The semantics of these two approaches are contrasted by examining the categorization of features by subject domains inherent to national topographic mapping. When comparing five USGS topographic mapping domain and feature lists, results indicate that multiple semantic meanings and ontology rules were applied to the initial digital database, but were lost as databases became more centralized at national scales, and common semantics were replaced by technological terms.

  3. Advancements in web-database applications for rabies surveillance.

    PubMed

    Rees, Erin E; Gendron, Bruno; Lelièvre, Frédérick; Coté, Nathalie; Bélanger, Denise

    2011-08-02

    Protection of public health from rabies is informed by the analysis of surveillance data from human and animal populations. In Canada, public health, agricultural and wildlife agencies at the provincial and federal level are responsible for rabies disease control, and this has led to multiple agency-specific data repositories. Aggregation of agency-specific data into one database application would enable more comprehensive data analyses and effective communication among participating agencies. In Québec, RageDB was developed to house surveillance data for the raccoon rabies variant, representing the next generation in web-based database applications that provide a key resource for the protection of public health. RageDB incorporates data from, and grants access to, all agencies responsible for the surveillance of raccoon rabies in Québec. Technological advancements of RageDB to rabies surveillance databases include (1) automatic integration of multi-agency data and diagnostic results on a daily basis; (2) a web-based data editing interface that enables authorized users to add, edit and extract data; and (3) an interactive dashboard to help visualize data simply and efficiently, in table, chart, and cartographic formats. Furthermore, RageDB stores data from citizens who voluntarily report sightings of rabies suspect animals. We also discuss how sightings data can indicate public perception of the risk of raccoon rabies and thus aid in directing the allocation of disease control resources for protecting public health. RageDB provides an example in the evolution of spatio-temporal database applications for the storage, analysis and communication of disease surveillance data. The database was fast and inexpensive to develop by using open-source technologies, simple and efficient design strategies, and shared web hosting. The database increases communication among agencies collaborating to protect human health from raccoon rabies.
Furthermore, health agencies have real-time access to a wide assortment of data documenting new developments in the raccoon rabies epidemic and this enables a more timely and appropriate response.

  4. Advancements in web-database applications for rabies surveillance

    PubMed Central

    2011-01-01

    Background Protection of public health from rabies is informed by the analysis of surveillance data from human and animal populations. In Canada, public health, agricultural and wildlife agencies at the provincial and federal level are responsible for rabies disease control, and this has led to multiple agency-specific data repositories. Aggregation of agency-specific data into one database application would enable more comprehensive data analyses and effective communication among participating agencies. In Québec, RageDB was developed to house surveillance data for the raccoon rabies variant, representing the next generation in web-based database applications that provide a key resource for the protection of public health. Results RageDB incorporates data from, and grants access to, all agencies responsible for the surveillance of raccoon rabies in Québec. Technological advancements of RageDB to rabies surveillance databases include 1) automatic integration of multi-agency data and diagnostic results on a daily basis; 2) a web-based data editing interface that enables authorized users to add, edit and extract data; and 3) an interactive dashboard to help visualize data simply and efficiently, in table, chart, and cartographic formats. Furthermore, RageDB stores data from citizens who voluntarily report sightings of rabies suspect animals. We also discuss how sightings data can indicate public perception of the risk of raccoon rabies and thus aid in directing the allocation of disease control resources for protecting public health. Conclusions RageDB provides an example in the evolution of spatio-temporal database applications for the storage, analysis and communication of disease surveillance data. The database was fast and inexpensive to develop by using open-source technologies, simple and efficient design strategies, and shared web hosting. The database increases communication among agencies collaborating to protect human health from raccoon rabies.
Furthermore, health agencies have real-time access to a wide assortment of data documenting new developments in the raccoon rabies epidemic and this enables a more timely and appropriate response. PMID:21810215

  5. Preliminary Geologic Map of the Topanga 7.5' Quadrangle, Southern California: A Digital Database

    USGS Publications Warehouse

    Yerkes, R.F.; Campbell, R.H.

    1995-01-01

    INTRODUCTION This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. This digital map database is compiled from previously published sources combined with some new mapping and modifications in nomenclature. The geologic map database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U. S. Geological Survey. For detailed descriptions of the units, their stratigraphic relations and sources of geologic mapping consult Yerkes and Campbell (1994). More specific information about the units may be available in the original sources. The content and character of the database and methods of obtaining it are described herein. The geologic map database itself, consisting of three ARC coverages and one base layer, can be obtained over the Internet or by magnetic tape copy as described below. The processes of extracting the geologic map database from the tar file, and importing the ARC export coverages (procedure described herein), will result in the creation of an ARC workspace (directory) called 'topnga.' The database was compiled using ARC/INFO version 7.0.3, a commercial Geographic Information System (Environmental Systems Research Institute, Redlands, California), with version 3.0 of the menu interface ALACARTE (Fitzgibbon and Wentworth, 1991, Fitzgibbon, 1991, Wentworth and Fitzgibbon, 1991). It is stored in uncompressed ARC export format (ARC/INFO version 7.x) in a compressed UNIX tar (tape archive) file. The tar file was compressed with gzip, and may be uncompressed with gzip, which is available free of charge via the Internet from the gzip Home Page (http://w3.teaser.fr/~jlgailly/gzip). A tar utility is required to extract the database from the tar file. 
This utility is included in most UNIX systems, and can be obtained free of charge via the Internet from Internet Literacy's Common Internet File Formats Webpage (http://www.matisse.net/files/formats.html). ARC/INFO export files (files with the .e00 extension) can be converted into ARC/INFO coverages in ARC/INFO (see below) and can be read by some other Geographic Information Systems, such as MapInfo via ArcLink and ESRI's ArcView (version 1.0 for Windows 3.1 to 3.11 is available for free from ESRI's web site: http://www.esri.com). Changes from the original release are as follows: 1. Different base layer - The original digital database included separates clipped out of the Los Angeles 1:100,000 sheet. This release includes a vectorized scan of a scale-stable negative of the Topanga 7.5 minute quadrangle. 2. Map projection - The files in the original release were in polyconic projection. The projection used in this release is state plane, which allows for the tiling of adjacent quadrangles. 3. File compression - The files in the original release were compressed with UNIX compression. The files in this release are compressed with gzip.
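The gunzip-then-untar workflow described above can be sketched in Python, whose standard `tarfile` module reads gzip-compressed archives directly, so no separate decompression step is needed. The archive and member names below are placeholders for illustration, not the actual 'topnga' release files.

```python
import os
import tarfile
import tempfile

# Build a tiny demo archive standing in for a gzip-compressed tar file
# like the one described in the report (file names are hypothetical).
workdir = tempfile.mkdtemp()
member_path = os.path.join(workdir, "topnga.e00")
with open(member_path, "w") as f:
    f.write("EXP  0 demo ARC/INFO export placeholder\n")

archive = os.path.join(workdir, "topnga.tar.gz")
with tarfile.open(archive, "w:gz") as tar:  # "w:gz" = write gzip-compressed tar
    tar.add(member_path, arcname="topnga/topnga.e00")

# Extraction step: "r:gz" decompresses and untars in one pass,
# recreating the 'topnga' workspace directory.
outdir = os.path.join(workdir, "extracted")
with tarfile.open(archive, "r:gz") as tar:
    tar.extractall(outdir)

print(os.path.exists(os.path.join(outdir, "topnga", "topnga.e00")))  # True
```

A standalone `tar` utility with gzip support (e.g. GNU tar's `tar -xzf`) accomplishes the same thing from the command line.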

  6. GDR (Genome Database for Rosaceae): integrated web-database for Rosaceae genomics and genetics data

    PubMed Central

    Jung, Sook; Staton, Margaret; Lee, Taein; Blenda, Anna; Svancara, Randall; Abbott, Albert; Main, Dorrie

    2008-01-01

    The Genome Database for Rosaceae (GDR) is a central repository of curated and integrated genetics and genomics data of Rosaceae, an economically important family which includes apple, cherry, peach, pear, raspberry, rose and strawberry. GDR contains annotated databases of all publicly available Rosaceae ESTs, the genetically anchored peach physical map, Rosaceae genetic maps and comprehensively annotated markers and traits. The ESTs are assembled to produce unigene sets of each genus and the entire Rosaceae. Other annotations include putative function, microsatellites, open reading frames, single nucleotide polymorphisms, gene ontology terms and anchored map position where applicable. Most of the published Rosaceae genetic maps can be viewed and compared through CMap, the comparative map viewer. The peach physical map can be viewed using WebFPC/WebChrom, and also through our integrated GDR map viewer, which serves as a portal to the combined genetic, transcriptome and physical mapping information. ESTs, BACs, markers and traits can be queried by various categories and the search result sites are linked to the mapping visualization tools. GDR also provides online analysis tools such as a batch BLAST/FASTA server for the GDR datasets, a sequence assembly server and microsatellite and primer detection tools. GDR is available at http://www.rosaceae.org. PMID:17932055

  7. LMSD: LIPID MAPS structure database

    PubMed Central

    Sud, Manish; Fahy, Eoin; Cotter, Dawn; Brown, Alex; Dennis, Edward A.; Glass, Christopher K.; Merrill, Alfred H.; Murphy, Robert C.; Raetz, Christian R. H.; Russell, David W.; Subramaniam, Shankar

    2007-01-01

    The LIPID MAPS Structure Database (LMSD) is a relational database encompassing structures and annotations of biologically relevant lipids. Structures of lipids in the database come from four sources: (i) LIPID MAPS Consortium's core laboratories and partners; (ii) lipids identified by LIPID MAPS experiments; (iii) computationally generated structures for appropriate lipid classes; (iv) biologically relevant lipids manually curated from LIPID BANK, LIPIDAT and other public sources. All the lipid structures in LMSD are drawn in a consistent fashion. In addition to a classification-based retrieval of lipids, users can search LMSD using either text-based or structure-based search options. The text-based search implementation supports data retrieval by any combination of these data fields: LIPID MAPS ID, systematic or common name, mass, formula, category, main class, and subclass data fields. The structure-based search, in conjunction with optional data fields, provides the capability to perform a substructure search or exact match for the structure drawn by the user. Search results, in addition to structure and annotations, also include relevant links to external databases. The LMSD is publicly available at PMID:17098933
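As an illustration of the text-based search described above, retrieval by any combination of data fields can be reduced to filtering records on exact field matches. The records, field names, and values in this Python sketch are invented for the example and are not actual LMSD entries.

```python
# Hypothetical lipid records mirroring the kinds of fields LMSD exposes
# (ID, common name, formula, mass, category); values are made up.
LIPIDS = [
    {"lm_id": "LMFA01010001", "common_name": "palmitic acid",
     "formula": "C16H32O2", "mass": 256.24, "category": "Fatty Acyls"},
    {"lm_id": "LMGP01010005", "common_name": "PC(16:0/18:1)",
     "formula": "C42H82NO8P", "mass": 759.58,
     "category": "Glycerophospholipids"},
]

def search(records, **criteria):
    """Return records matching every supplied field exactly.

    Any combination of fields may be given, echoing the LMSD
    text-search behavior described in the abstract.
    """
    return [r for r in records
            if all(r.get(k) == v for k, v in criteria.items())]

hits = search(LIPIDS, category="Fatty Acyls")
print([r["lm_id"] for r in hits])  # ['LMFA01010001']
```

A production system would of course push such filters into SQL against the relational database rather than scanning in memory.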

  8. 76 FR 54807 - Notice of Proposed Information Collection: IMLS Museum Web Database: MuseumsCount.gov

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ...: IMLS Museum Web Database: MuseumsCount.gov AGENCY: Institute of Museum and Library Services, National..., and the general public. Information such as name, address, phone, e-mail, Web site, congressional...: IMLS Museum Web Database, MuseumsCount.gov . OMB Number: To be determined. Agency Number: 3137...

  9. 77 FR 37869 - Agency Information Collection Activities: Proposed Collection; Comment Request-National Hunger...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-25

    ...: Proposed Collection; Comment Request--National Hunger Clearinghouse Database Form AGENCY: Food and... Database Form. Form: FNS 543. OMB Number: 0584-0474. Expiration Date: 8/31/2012. Type of Request: Revision... Clearinghouse includes a database (FNS-543) of non- governmental, grassroots programs that work in the areas of...

  10. OriDB, the DNA replication origin database updated and extended.

    PubMed

    Siow, Cheuk C; Nieduszynska, Sian R; Müller, Carolin A; Nieduszynski, Conrad A

    2012-01-01

    OriDB (http://www.oridb.org/) is a database containing collated genome-wide mapping studies of confirmed and predicted replication origin sites. The original database collated and curated Saccharomyces cerevisiae origin mapping studies. Here, we report that the OriDB database and web site have been revamped to improve user accessibility to curated data sets, to greatly increase the number of curated origin mapping studies, and to include the collation of replication origin sites in the fission yeast Schizosaccharomyces pombe. The revised database structure underlies these improvements and will facilitate further expansion in the future. The updated OriDB for S. cerevisiae is available at http://cerevisiae.oridb.org/ and for S. pombe at http://pombe.oridb.org/.

  11. Geologic and topographic maps of the Kabul South 30' x 60' quadrangle, Afghanistan

    USGS Publications Warehouse

    Bohannon, Robert G.

    2010-01-01

    This report consists of two map sheets, this pamphlet, and a collection of database files. Sheet 1 is the geologic map with three highly speculative cross sections, and sheet 2 is a topographic map that comprises all the support data for the geologic map. Both maps (sheets 1 and 2) are produced at 1:100,000-scale and are provided in Geospatial PDF format that preserves the georegistration and original layering. The database files include images of the topographic hillshade (shaded relief) and color-topography files used to create the topographic maps, a copy of the Landsat image, and a gray-scale basemap. Vector data from each of the layers that comprise both maps are provided in the form of Arc/INFO shapefiles. Most of the geologic interpretations and all of the topographic data were derived exclusively from images. A variety of image types were used, and each image type corresponds to a unique view of the geology. The geologic interpretations presented here are the result of comparing and contrasting between the various images and making the best uses of the strengths of each image type. A limited amount of fieldwork, in the spring of 2004 and the fall of 2006, was carried out within the quadrangle, but all the war-related dangers present in Afghanistan restricted its scope, duration, and utility. The maps that are included in this report represent works-in-progress in that they are simply intended to be the best possible product for the time available and conditions that exist during the early phases of reconstruction in Afghanistan. This report has been funded by the United States Agency for International Development (USAID) as a part of several broader programs that USAID designed to stimulate growth in the energy and mineral sectors of the Afghan economy. 
The main objective is to provide maps that will be used by scientists of the Afghan Ministry of Mines, the Afghanistan Geological Survey, and the Afghan Geodesy and Cartography Head Office in their efforts to rebuild the energy and mineral sectors of their economy. The U.S. Geological Survey has also produced a variety of geological, topographic, Landsat natural-color, and Landsat false-color maps covering Afghanistan at the 1:250,000 scale. These maps may be used to complement the information presented here. For more information about USGS activities in Afghanistan, visit the USGS Projects in Afghanistan Web site at http://afghanistan.cr.usgs.gov/. For scientific questions or comments, please send inquiries to Robert G. Bohannon.

  12. Geologic and Topographic Maps of the Kabul North 30' x 60' Quadrangle, Afghanistan

    USGS Publications Warehouse

    Bohannon, Robert G.

    2010-01-01

    This report consists of two map sheets, this pamphlet, and a collection of database files. Sheet 1 is the geologic map with two highly speculative cross sections, and sheet 2 is a topographic map that comprises all the support data for the geologic map. Both maps (sheets 1 and 2) are produced at 1:100,000-scale and are provided in GeoPDF format that preserves the georegistration and original layering. The database files include images of the topographic hillshade (shaded relief) and color-topography files used to create the topographic maps, a copy of the Landsat image, and a gray-scale basemap. Vector data from each of the layers that comprise both maps are provided in the form of Arc/INFO shapefiles. Most of the geologic interpretations and all of the topographic data were derived exclusively from images. A variety of image types were used, and each image type corresponds to a unique view of the geology. The geologic interpretations presented here are the result of comparing and contrasting between the various images and making the best uses of the strengths of each image type. A limited amount of fieldwork, in the spring of 2004 and the fall of 2006, was carried out within the quadrangle, but all the war-related dangers present in Afghanistan restricted its scope, duration, and utility. The maps that are included in this report represent works-in-progress in that they are simply intended to be the best possible product for the time available and conditions that exist during the early phases of reconstruction in Afghanistan. This report has been funded by the United States Agency for International Development (USAID) as a part of several broader programs that USAID designed to stimulate growth in the energy and mineral sectors of the Afghan economy. 
The main objective is to provide maps that will be used by scientists of the Afghan Ministry of Mines, the Afghanistan Geological Survey, and the Afghan Geodesy and Cartography Head Office in their efforts to rebuild the energy and mineral sectors of their economy. The U.S. Geological Survey has also produced a variety of geological, topographic, Landsat natural-color, and Landsat false-color maps covering Afghanistan at the 1:250,000 scale. These maps may be used to complement the information presented here. For more information about USGS activities in Afghanistan, visit the USGS Projects in Afghanistan Web site at http://gisdata.usgs.net/Website/Afghan/. For scientific questions or comments, please send inquiries to Robert G. Bohannon.

  13. Mapping of Florida's Coastal and Marine Resources: Setting Priorities Workshop

    USGS Publications Warehouse

    Robbins, Lisa; Wolfe, Steven; Raabe, Ellen

    2008-01-01

    The importance of mapping habitats and bioregions as a means to improve resource management has become increasingly clear. Large areas of the waters surrounding Florida are unmapped or incompletely mapped, possibly hindering proper management and good decisionmaking. Mapping of these ecosystems is among the top priorities identified by the Florida Oceans and Coastal Council in their Annual Science Research Plan. However, lack of prioritization among the coastal and marine areas and lack of coordination of agency efforts impede efficient, cost-effective mapping. A workshop on Mapping of Florida's Coastal and Marine Resources was sponsored by the U.S. Geological Survey (USGS), Florida Department of Environmental Protection (FDEP), and Southeastern Regional Partnership for Planning and Sustainability (SERPPAS). The workshop was held at the USGS Florida Integrated Science Center (FISC) in St. Petersburg, FL, on February 7-8, 2007. The workshop was designed to provide State, Federal, university, and non-governmental organizations (NGOs) the opportunity to discuss their existing data coverage and create a prioritization of areas for new mapping data in Florida. 
Specific goals of the workshop were multifold, including to: * provide information to agencies on state-of-the-art technology for collecting data; * inform participants of the ongoing mapping programs in waters off Florida; * present the mapping needs and priorities of the State and Federal agencies and entities operating in Florida; * work with State of Florida agencies to establish an overall priority for areas needing mapping; * initiate discussion of a unified classification of habitat and bioregions; * discuss and examine the need to standardize terminology and data collection/storage so that data, in particular habitat data, can be shared; * identify opportunities for partnering and leveraging mapping efforts among agencies and entities; * identify impediments and organizational gaps that hinder collection of data for mapping; * seek innovative solutions to the primary obstacles identified; * identify the steps needed to move mapping of Florida's oceans and coasts forward, in preparation for a better coordinated, more cost-effective mapping program to allow State and Federal agencies to make better decisions on coastal-resource issues. Over 90 invited participants representing more than 30 State and Federal agencies, universities, NGOs, and private industries played a large role in the success of this two-day workshop. State of Florida agency participants created a ranked priority order for mapping 13 different regions around Florida. The data needed for each of the 13 priority regions were outlined. A matrix considering State and Federal priorities was created, utilizing input from all agencies. The matrix showed overlapping interests of the entities and will allow for partnering and leveraging of resources. The five most basic mapping needs were determined to be bathymetry, high-vertical resolution coastline for sea-level rise scenarios, shoreline change, subsurface geology, and benthic habitats at sufficient scale.
There was a clear convergence on the need to coordinate mapping activities around the state. Suggestions for coordination included: * creating a glossary of terms: a standard for specifying agency data-mapping needs; * creating a geographic information officer (GIO) position or permanent organizing group to maintain communications established at this workshop and to maintain progress on the issues identified during the workshop. The person or group could develop a website, maintain a project-status matrix, develop a list of contacts, create links to legislative updates and links to funding sources; * developing a web portal and one-stop/clearinghouse of data. There was general consensus on the need to adopt a single habitat classification system and a strategy to accommodate existing systems smoothly. Unresolve

  14. 77 FR 16237 - Agency Information Collection Activities: Proposed Collection; Comment Request; Guidance for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-20

    ... the appropriate tracking databases. Use of the information in the Agency's tracking databases enables... of information on respondents, including through the use of automated collection techniques, when...

  15. Assessment and application of national environmental databases and mapping tools at the local level to two community case studies.

    PubMed

    Hammond, Davyda; Conlon, Kathryn; Barzyk, Timothy; Chahine, Teresa; Zartarian, Valerie; Schultz, Brad

    2011-03-01

    Communities are concerned over pollution levels and seek methods to systematically identify and prioritize the environmental stressors in their communities. Geographic information system (GIS) maps of environmental information can be useful tools for communities in their assessment of environmental-pollution-related risks. Databases and mapping tools that supply community-level estimates of ambient concentrations of hazardous pollutants, risk, and potential health impacts can provide relevant information for communities to understand, identify, and prioritize potential exposures and risk from multiple sources. An assessment of existing databases and mapping tools was conducted as part of this study to explore the utility of publicly available databases, and three of these databases were selected for use in a community-level GIS mapping application. Queried data from the U.S. EPA's National-Scale Air Toxics Assessment, Air Quality System, and National Emissions Inventory were mapped at the appropriate spatial and temporal resolutions for identifying risks of exposure to air pollutants in two communities. The maps combine monitored and model-simulated pollutant and health risk estimates, along with local survey results, to assist communities with the identification of potential exposure sources and pollution hot spots. Findings from this case study analysis will provide information to advance the development of new tools to assist communities with environmental risk assessments and hazard prioritization. © 2010 Society for Risk Analysis.

  16. BoreholeAR: A mobile tablet application for effective borehole database visualization using an augmented reality technology

    NASA Astrophysics Data System (ADS)

    Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong

    2015-03-01

    Boring logs are widely used in geological field studies since the data describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field, as the conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application to enable its user to search boring logs rapidly and visualize them using the augmented reality (AR) technique. For the development of the application, a standard borehole database appropriate for a mobile-based borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions for large borehole databases for the other modules. A field survey was also carried out using a database of more than 100,000 borehole records.
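A minimal sketch of the kind of lookup the map module needs: a flat borehole table and a nearest-record search by great-circle distance from the user's position. The records, coordinates, and schema below are hypothetical, not the paper's actual database design.

```python
import math

# Hypothetical borehole records: (id, latitude, longitude, depth_m).
BOREHOLES = [
    ("BH-001", 37.551, 126.988, 30.0),
    ("BH-002", 37.560, 127.040, 45.5),
    ("BH-003", 37.500, 126.900, 12.3),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest(lat, lon, records):
    """Borehole closest to the given position (e.g. the tablet's GPS fix)."""
    return min(records, key=lambda b: haversine_km(lat, lon, b[1], b[2]))

print(nearest(37.55, 127.0, BOREHOLES)[0])  # BH-001
```

For 100,000+ records a spatial index (e.g. an R-tree) would replace the linear scan, but the distance logic is the same.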

  17. The GLIMS Glacier Database

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Khalsa, S. S.; Armstrong, R.

    2007-12-01

The Global Land Ice Measurements from Space (GLIMS) project has built a geospatial and temporal database of glacier data, composed of glacier outlines and various scalar attributes. These data are being derived primarily from satellite imagery, such as from ASTER and Landsat. Each "snapshot" of a glacier is from a specific time, and the database is designed to store multiple snapshots representative of different times. We have implemented two web-based interfaces to the database; one enables exploration of the data via interactive maps (web map server), while the other allows searches based on text-field constraints. The web map server is an Open Geospatial Consortium (OGC) compliant Web Map Server (WMS) and Web Feature Server (WFS). This means that other web sites can display glacier layers from our site over the Internet, or retrieve glacier features in vector format. All components of the system are implemented using Open Source software: Linux, PostgreSQL, PostGIS (geospatial extensions to the database), MapServer (WMS and WFS), and several supporting components such as Proj.4 (a geographic projection library) and PHP. These tools are robust and provide a flexible and powerful framework for web mapping applications. As a service to the GLIMS community, the database contains metadata on all ASTER imagery acquired over glacierized terrain. Reduced-resolution versions of the images (browse imagery) can be viewed either as a layer in the MapServer application or overlaid on the virtual globe within Google Earth. The interactive map application allows the user to constrain by time what data appear on the map. For example, ASTER imagery or glacier outlines from 2002 only, or from autumn of any year, can be displayed. The system allows users to download their selected glacier data in a choice of formats. The results of a query based on spatial selection (using a mouse) or text-field constraints can be downloaded in any of these formats: ESRI shapefiles, KML (Google Earth), MapInfo, GML (Geography Markup Language) and GMT (Generic Mapping Tools). This "clip-and-ship" function allows users to download only the data they are interested in. Our flexible web interfaces to the database, which include various support layers (e.g. a layer to help collaborators identify satellite imagery over their region of expertise), will facilitate enhanced analysis of glacier systems, their distribution, and their impacts on other Earth systems.
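Because the web map server is OGC-compliant, any client can request glacier layers with a standard WMS GetMap query. A sketch of building such a request with the Python standard library (the endpoint URL and layer name below are placeholders for illustration, not the actual GLIMS service addresses):

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=800, height=600,
                   crs="EPSG:4326", fmt="image/png"):
    """Build a standard OGC WMS 1.3.0 GetMap request URL.

    bbox is (minx, miny, maxx, maxy) in the units of the requested CRS.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only:
url = wms_getmap_url("https://example.org/glims/wms", "glacier_outlines",
                     (60.0, -150.0, 62.0, -146.0))
```

The same pattern with `REQUEST=GetFeature` against the WFS endpoint would retrieve glacier outlines as vector features rather than a rendered image.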

  18. Preliminary surficial geologic map of a Calico Mountains piedmont and part of Coyote Lake, Mojave desert, San Bernardino County, California

    USGS Publications Warehouse

    Dudash, Stephanie L.

    2006-01-01

This 1:24,000-scale detailed surficial geologic map and digital database of a Calico Mountains piedmont and part of Coyote Lake in south-central California depicts surficial deposits and generalized bedrock units. The mapping is part of a USGS project to investigate the spatial distribution of deposits linked to changes in climate, to provide framework geology for land use management (http://deserts.wr.usgs.gov), to understand the Quaternary tectonic history of the Mojave Desert, and to provide additional information on the history of Lake Manix, of which Coyote Lake is a sub-basin. Mapping is displayed on parts of four USGS 7.5-minute series topographic maps. The map area lies in the central Mojave Desert of California, northeast of Barstow, Calif., and south of Fort Irwin, Calif., and covers 258 sq km (99.5 sq mi). Geologic deposits in the area consist of Paleozoic metamorphic rocks, Mesozoic plutonic rocks, Miocene volcanic rocks, Pliocene-Pleistocene basin fill, and Quaternary surficial deposits. McCulloh (1960, 1965) conducted bedrock mapping, and a generalized version of his maps is compiled into this map. McCulloh's maps contain many bedrock structures within the Calico Mountains that are not shown on the present map. This study resulted in several new findings, including the discovery of previously unrecognized faults, one of which is the Tin Can Alley fault. The north-striking Tin Can Alley fault is part of the Paradise fault zone (Miller and others, 2005), a potentially important feature for studying neotectonic strain in the Mojave Desert. Additionally, many Anodonta shells were collected in Coyote Lake lacustrine sediments for radiocarbon dating. Preliminary results support some of Meek's (1999) conclusions on the timing of Mojave River inflow into the Coyote Basin. The database includes information on geologic deposits, samples, and geochronology.
The database is distributed in three parts: spatial map-based data, documentation, and printable map graphics of the database. Spatial data are distributed as an ArcInfo personal geodatabase, or as tabular data in the form of Microsoft Access Database (MDB) or dBase Format (DBF) file formats. Documentation includes this file, which provides a discussion of the surficial geology and describes the format and content of the map data, and Federal Geographic Data Committee (FGDC) metadata for the spatial map information. Map graphics files are distributed as Postscript and Adobe Acrobat Portable Document Format (PDF) files, and are appropriate for representing a view of the spatial database at the mapped scale.

  19. Spatio Temporal Detection and Virtual Mapping of Landslide Using High-Resolution Airborne Laser Altimetry (lidar) in Densely Vegetated Areas of Tropics

    NASA Astrophysics Data System (ADS)

    Bibi, T.; Azahari Razak, K.; Rahman, A. Abdul; Latif, A.

    2017-10-01

Landslides are an inescapable natural disaster, resulting in massive social, environmental and economic impacts all over the world. The tropical, mountainous landscape found throughout Malaysia, especially on Borneo (East Malaysia), is highly susceptible to landslides because of heavy rainfall and tectonic disturbances. The purpose of landslide hazard mapping is to identify hazardous regions for the execution of mitigation plans that can reduce the loss of life and property from future landslide incidences. Currently, Malaysian research bodies, e.g. academic institutions and government agencies, are trying to develop a landslide hazard and risk database for susceptible areas to support prevention, mitigation, and evacuation planning. However, there is a lack of attention to landslide inventory mapping as an elementary input to landslide susceptibility, hazard and risk mapping. Developing techniques based on remote sensing technologies (satellite, terrestrial and airborne) are promising means of accelerating the production of landslide maps, shrinking the time and resources essential for their compilation and orderly updates. The aim of the study is to provide a better understanding of the use of virtual mapping of landslides with the help of LiDAR technology. The focus of the study is spatio-temporal detection and virtual mapping of a landslide inventory via visualization and interpretation of very-high-resolution (VHR) data in the forested terrain of the Mesilau River, Kundasang. To cope with the challenges of virtual inventory mapping in forested terrain, high-resolution LiDAR derivatives are used. This study indicates that airborne LiDAR technology can be an effective tool for mapping landslide inventories in complex climatic and geological conditions, and a quick way of mapping regional hazards in the tropics.

  20. The Protein Identifier Cross-Referencing (PICR) service: reconciling protein identifiers across multiple source databases.

    PubMed

    Côté, Richard G; Jones, Philip; Martens, Lennart; Kerrien, Samuel; Reisinger, Florian; Lin, Quan; Leinonen, Rasko; Apweiler, Rolf; Hermjakob, Henning

    2007-10-18

    Each major protein database uses its own conventions when assigning protein identifiers. Resolving the various, potentially unstable, identifiers that refer to identical proteins is a major challenge. This is a common problem when attempting to unify datasets that have been annotated with proteins from multiple data sources or querying data providers with one flavour of protein identifiers when the source database uses another. Partial solutions for protein identifier mapping exist but they are limited to specific species or techniques and to a very small number of databases. As a result, we have not found a solution that is generic enough and broad enough in mapping scope to suit our needs. We have created the Protein Identifier Cross-Reference (PICR) service, a web application that provides interactive and programmatic (SOAP and REST) access to a mapping algorithm that uses the UniProt Archive (UniParc) as a data warehouse to offer protein cross-references based on 100% sequence identity to proteins from over 70 distinct source databases loaded into UniParc. Mappings can be limited by source database, taxonomic ID and activity status in the source database. Users can copy/paste or upload files containing protein identifiers or sequences in FASTA format to obtain mappings using the interactive interface. Search results can be viewed in simple or detailed HTML tables or downloaded as comma-separated values (CSV) or Microsoft Excel (XLS) files suitable for use in a local database or a spreadsheet. Alternatively, a SOAP interface is available to integrate PICR functionality in other applications, as is a lightweight REST interface. We offer a publicly available service that can interactively map protein identifiers and protein sequences to the majority of commonly used protein databases. Programmatic access is available through a standards-compliant SOAP interface or a lightweight REST interface. 
The PICR interface, documentation and code examples are available at http://www.ebi.ac.uk/Tools/picr.
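The lightweight REST interface lends itself to simple scripted queries. A sketch of constructing one such query URL (the endpoint path and parameter names below are assumptions for illustration; the abstract does not specify them, so consult the PICR documentation for the actual API):

```python
from urllib.parse import urlencode

def picr_query_url(accession, databases, taxon_id=None,
                   base="http://www.ebi.ac.uk/Tools/picr/rest/getUPIForAccession"):
    """Build a PICR-style REST query URL mapping one protein accession to
    cross-references in the requested source databases.

    NOTE: the endpoint path and parameter names here are hypothetical,
    used only to illustrate the shape of a REST mapping call.
    """
    # Repeated "database" parameters request mappings to several sources.
    params = [("accession", accession)] + [("database", db) for db in databases]
    if taxon_id is not None:
        params.append(("taxonid", str(taxon_id)))  # optional taxonomic filter
    return base + "?" + urlencode(params)

url = picr_query_url("P29375", ["SWISSPROT", "ENSEMBL"], taxon_id=9606)
```

Fetching the URL (e.g. with `urllib.request.urlopen`) would return the cross-reference records, which a client can then parse and join against its local dataset.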

  1. The Protein Identifier Cross-Referencing (PICR) service: reconciling protein identifiers across multiple source databases

    PubMed Central

    Côté, Richard G; Jones, Philip; Martens, Lennart; Kerrien, Samuel; Reisinger, Florian; Lin, Quan; Leinonen, Rasko; Apweiler, Rolf; Hermjakob, Henning

    2007-01-01

Background: Each major protein database uses its own conventions when assigning protein identifiers. Resolving the various, potentially unstable, identifiers that refer to identical proteins is a major challenge. This is a common problem when attempting to unify datasets that have been annotated with proteins from multiple data sources or querying data providers with one flavour of protein identifiers when the source database uses another. Partial solutions for protein identifier mapping exist but they are limited to specific species or techniques and to a very small number of databases. As a result, we have not found a solution that is generic enough and broad enough in mapping scope to suit our needs. Results: We have created the Protein Identifier Cross-Reference (PICR) service, a web application that provides interactive and programmatic (SOAP and REST) access to a mapping algorithm that uses the UniProt Archive (UniParc) as a data warehouse to offer protein cross-references based on 100% sequence identity to proteins from over 70 distinct source databases loaded into UniParc. Mappings can be limited by source database, taxonomic ID and activity status in the source database. Users can copy/paste or upload files containing protein identifiers or sequences in FASTA format to obtain mappings using the interactive interface. Search results can be viewed in simple or detailed HTML tables or downloaded as comma-separated values (CSV) or Microsoft Excel (XLS) files suitable for use in a local database or a spreadsheet. Alternatively, a SOAP interface is available to integrate PICR functionality in other applications, as is a lightweight REST interface. Conclusion: We offer a publicly available service that can interactively map protein identifiers and protein sequences to the majority of commonly used protein databases. Programmatic access is available through a standards-compliant SOAP interface or a lightweight REST interface. 
The PICR interface, documentation and code examples are available at http://www.ebi.ac.uk/Tools/picr. PMID:17945017

  2. 75 FR 39949 - Agency Information Collection Activities; Proposed Collection; Comment Request; Guidance for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-13

    ... entered into the appropriate tracking databases. Use of the information in the agency's tracking databases... respondents, including through the use of automated collection techniques, when appropriate, and other forms...

  3. Digital version of "Open-File Report 92-179: Geologic map of the Cow Cove Quadrangle, San Bernardino County, California"

    USGS Publications Warehouse

    Wilshire, Howard G.; Bedford, David R.; Coleman, Teresa

    2002-01-01

    3. Plottable map representations of the database at 1:24,000 scale in PostScript and Adobe PDF formats. The plottable files consist of a color geologic map derived from the spatial database, composited with a topographic base map in the form of the USGS Digital Raster Graphic for the map area. Color symbology from each of these datasets is maintained, which can cause plot file sizes to be large.

  4. Geologic and aeromagnetic maps of the Fossil Ridge area and vicinity, Gunnison County, Colorado

    USGS Publications Warehouse

    DeWitt, Ed; Zech, R.S.; Chase, C.G.; Zartman, R.E.; Kucks, R.P.; Bartelson, Bruce; Rosenlund, G.C.; Earley, Drummond

    2002-01-01

This data set includes a GIS geologic map database of an Early Proterozoic metavolcanic and metasedimentary terrane extensively intruded by Early and Middle Proterozoic granitic plutons. Laramide to Tertiary deformation and intrusion of felsic plutons have created numerous small mineral deposits that are described in the tables and are shown on the figures in the accompanying text pamphlet. Also included in the pamphlet are numerous chemical analyses of igneous and meta-igneous bodies of all ages in tables and in summary geochemical diagrams. The text pamphlet also contains a detailed description of map units and discussions of the aeromagnetic survey, igneous and metamorphic rocks, and mineral deposits. The printed map sheet and browse graphic pdf file include the aeromagnetic map of the study area, as well as figures and photographs. Purpose: This GIS geologic map database is provided to facilitate the presentation and analysis of earth-science data for this region of Colorado. This digital map database may be displayed at any scale or projection. However, the geologic data in this coverage are not intended for use at a scale other than 1:30,000. Supplemental useful data accompanying the database are extensive geochemical and mineral deposits data, as well as an aeromagnetic map.

  5. A mapping review of the literature on UK-focused health and social care databases.

    PubMed

    Cooper, Chris; Rogers, Morwenna; Bethel, Alison; Briscoe, Simon; Lowe, Jenny

    2015-03-01

Bibliographic databases are a day-to-day tool of the researcher: they offer the researcher easy and organised access to knowledge, but how much is actually known about the databases on offer? The focus of this paper is UK health and social care databases. These databases are often small, specialised by topic, and provide a complementary literature to the large, international databases. There is, however, good evidence that these databases are overlooked in systematic reviews, perhaps because little is known about what they can offer. To systematically locate and map published and unpublished literature on the key UK health and social care bibliographic databases. Systematic searching and mapping. Two hundred and forty-two items were identified which specifically related to 24 of the 34 databases under review. There is little published or unpublished literature specifically analysing the key UK health and social care databases. Since several UK databases have closed, others are at risk, and some are overlooked in reviews, better information is required to enhance our knowledge. Further research on UK health and social care databases is required. This paper suggests the need to develop the evidence base through a series of case studies on each of the databases. © 2014 The authors. Health Information and Libraries Journal © 2014 Health Libraries Journal.

  6. Scoping of Flood Hazard Mapping Needs for Penobscot County, Maine

    USGS Publications Warehouse

    Schalk, Charles W.; Dudley, Robert W.

    2007-01-01

Background: The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program (MFMP), began scoping work in 2006 for Penobscot County. Scoping activities included assembling existing data and map needs information for communities in Penobscot County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) database with information gathered during the scoping process. As of 2007, the average age of the FEMA floodplain maps in Penobscot County, Maine, is 22 years, based on the most recent revisions to the maps. Because the revisions did not affect all the map panels in each town, however, the true average age probably is more than 22 years. Many of the studies were published in the mid-1980s. Since the studies were completed, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights or flood mapping.

  7. Geologic map database of the El Mirage Lake area, San Bernardino and Los Angeles Counties, California

    USGS Publications Warehouse

    Miller, David M.; Bedford, David R.

    2000-01-01

This geologic map database for the El Mirage Lake area describes geologic materials for the dry lake, parts of the adjacent Shadow Mountains and Adobe Mountain, and much of the piedmont extending south from the lake upward toward the San Gabriel Mountains. This area lies within the western Mojave Desert of San Bernardino and Los Angeles Counties, southeastern California. The area is traversed by a few paved highways that service the community of El Mirage, and by numerous dirt roads that lead to outlying properties. An off-highway vehicle area established by the Bureau of Land Management encompasses the dry lake and much of the land north and east of the lake. The physiography of the area consists of the dry lake, flanking mud and sand flats and alluvial piedmonts, and a few sharp craggy mountains. This digital geologic map database, intended for use at 1:24,000 scale, describes and portrays the rock units and surficial deposits of the El Mirage Lake area. The map database was prepared to aid in a water-resource assessment of the area by providing surface geologic information with which deeper groundwater-bearing units may be understood. The area mapped covers the Shadow Mountains SE and parts of the Shadow Mountains, Adobe Mountain, and El Mirage 7.5-minute quadrangles. The map includes detailed geology of surface and bedrock deposits, which represent a significant update from previous bedrock geologic maps by Dibblee (1960) and Troxel and Gunderson (1970), and the surficial geologic map of Ponti and Burke (1980); it incorporates a fringe of the detailed bedrock mapping in the Shadow Mountains by Martin (1992). The map data were assembled as a digital database using ARC/INFO to enable wider applications than traditional paper-product geologic maps and to provide for efficient meshing with other digital databases prepared by the U.S. Geological Survey's Southern California Areal Mapping Project.

  8. Delivering integrated HAZUS-MH flood loss analyses and flood inundation maps over the Web.

    PubMed

    Hearn, Paul P; Longenecker, Herbert E; Aguinaldo, John J; Rahav, Ami N

    2013-01-01

    Catastrophic flooding is responsible for more loss of life and damages to property than any other natural hazard. Recently developed flood inundation mapping technologies make it possible to view the extent and depth of flooding on the land surface over the Internet; however, by themselves these technologies are unable to provide estimates of losses to property and infrastructure. The Federal Emergency Management Agency's (FEMA's) HAZUS-MH software is extensively used to conduct flood loss analyses in the United States, providing a nationwide database of population and infrastructure at risk. Unfortunately, HAZUS-MH requires a dedicated Geographic Information System (GIS) workstation and a trained operator, and analyses are not adapted for convenient delivery over the Web. This article describes a cooperative effort by the US Geological Survey (USGS) and FEMA to make HAZUS-MH output GIS and Web compatible and to integrate these data with digital flood inundation maps in USGS's newly developed Inundation Mapping Web Portal. By running the computationally intensive HAZUS-MH flood analyses offline and converting the output to a Web-GIS compatible format, detailed estimates of flood losses can now be delivered to anyone with Internet access, thus dramatically increasing the availability of these forecasts to local emergency planners and first responders.

  9. JAMSTEC multibeam surveys and submersible dives around the Hawaiian Islands: a collaborative Japan-USA exploration of Hawaii's deep seafloor

    USGS Publications Warehouse

    Robinson, Joel E.; Eakins, Barry W.; Kanamatsu, Toshiya; Naka, Jiro; Takahashi, Eiichi; Satake, Kenji; Smith, John R.; Clague, David A.; Yokose, Hisayoshi

    2006-01-01

    This database release, USGS Data Series 171, contains data collected during four Japan-USA collaborative cruises that characterize the seafloor around the Hawaiian Islands. The Japan Agency for Marine-Earth Science and Technology (JAMSTEC) sponsored cruises in 1998, 1999, 2001, and 2002, to build a greater understanding of the deep marine geology around the Hawaiian Islands. During these cruises, scientists surveyed over 600,000 square kilometers of the seafloor with a hull-mounted multibeam seafloor-mapping sonar system (SEA BEAM® 2112), observed the seafloor and collected samples using robotic and manned submersible dives, collected dredge and piston-core samples, and performed single-channel seismic surveys.

  10. Integrating disparate lidar data at the national scale to assess the relationships between height above ground, land cover and ecoregions

    USGS Publications Warehouse

    Stoker, Jason M.; Cochrane, Mark A.; Roy, David P.

    2013-01-01

With the acquisition of lidar data for over 30 percent of the US, it is now possible to assess the three-dimensional distribution of features at the national scale. This paper integrates over 350 billion lidar points from 28 disparate datasets into a national-scale database and evaluates if height above ground is an important variable in the context of other national-scale layers, such as the US Geological Survey National Land Cover Database and the US Environmental Protection Agency ecoregions maps. While the results were not homoscedastic and the available data did not allow for a complete height census in any of the classes, it does appear that where lidar data were used, there were detectable differences in heights among many of these national classification schemes. This study supports the hypothesis that there were real, detectable differences in heights in certain national-scale classification schemes, despite height not being a variable used in any of the classification routines.

  11. Scoping of Flood Hazard Mapping Needs for Hancock County, Maine

    USGS Publications Warehouse

    Schalk, Charles W.; Dudley, Robert W.

    2007-01-01

Background: The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program (MFMP), began scoping work in 2006 for Hancock County. Scoping activities included assembling existing data and map needs information for communities in Hancock County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) database with information gathered during the scoping process. The average age of the FEMA floodplain maps (all types) in Hancock County, Maine, is at least 19 years. Most of these studies were published in the late 1980s and early 1990s, and no study is more recent than 1992. Some towns have partial maps that are more recent than their study, indicating that the true average age of the data is probably more than 19 years. Since the studies were done, development has occurred in some of the watersheds and the characteristics of the watersheds have changed. 
Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights or flood mapping.

  12. Exploring consumer exposure pathways and patterns of use for chemicals in the environment.

    PubMed

    Dionisio, Kathie L; Frame, Alicia M; Goldsmith, Michael-Rock; Wambaugh, John F; Liddell, Alan; Cathey, Tommy; Smith, Doris; Vail, James; Ernstoff, Alexi S; Fantke, Peter; Jolliet, Olivier; Judson, Richard S

    2015-01-01

Humans are exposed to thousands of chemicals in the workplace, home, and via air, water, food, and soil. A major challenge in estimating chemical exposures is to understand which chemicals are present in these media and microenvironments. Here we describe the Chemical/Product Categories Database (CPCat), a new, publicly available (http://actor.epa.gov/cpcat) database of information on chemicals mapped to "use categories" describing the usage or function of the chemical. CPCat was created by combining multiple and diverse sources of data on consumer- and industrial-process-based chemical uses from regulatory agencies, manufacturers, and retailers in various countries. The database uses a controlled vocabulary of 833 terms and a novel nomenclature to capture and streamline descriptors of chemical use for 43,596 chemicals from the various sources. Examples of potential applications of CPCat are provided, including identifying chemicals to which children may be exposed and supporting prioritization of chemicals for toxicity screening. CPCat is expected to be a valuable resource for regulators, risk assessors, and exposure scientists to identify potential sources of human exposures and exposure pathways, particularly for use in high-throughput chemical exposure assessment.
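The chemical-to-use-category mapping that CPCat provides can be pictured as a two-way index between chemicals and controlled-vocabulary terms. A toy sketch of that structure (the identifiers and category terms below are invented placeholders; CPCat's real vocabulary has 833 terms covering 43,596 chemicals):

```python
from collections import defaultdict

# Illustrative (chemical, use-category) pairs; real CPCat records carry
# CAS numbers and controlled-vocabulary terms from multiple source datasets.
records = [
    ("CAS-0000-00-1", "personal_care"),
    ("CAS-0000-00-1", "cleaning_product"),
    ("CAS-0000-00-2", "pesticide"),
]

# Build both directions of the index: chemical -> categories and
# category -> chemicals, so either side can drive a query.
by_chemical = defaultdict(set)
by_category = defaultdict(set)
for chem, cat in records:
    by_chemical[chem].add(cat)
    by_category[cat].add(chem)

# e.g. all chemicals annotated with a given use category:
pesticides = sorted(by_category["pesticide"])
```

The category-to-chemicals direction is what supports prioritization use cases, such as pulling every chemical tagged with a child-exposure-relevant category for toxicity screening.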

  13. RatMap--rat genome tools and data.

    PubMed

    Petersen, Greta; Johnson, Per; Andersson, Lars; Klinga-Levan, Karin; Gómez-Fabre, Pedro M; Ståhl, Fredrik

    2005-01-01

    The rat genome database RatMap (http://ratmap.org or http://ratmap.gen.gu.se) has been one of the main resources for rat genome information since 1994. The database is maintained by CMB-Genetics at Goteborg University in Sweden and provides information on rat genes, polymorphic rat DNA-markers and rat quantitative trait loci (QTLs), all curated at RatMap. The database is under the supervision of the Rat Gene and Nomenclature Committee (RGNC); thus much attention is paid to rat gene nomenclature. RatMap presents information on rat idiograms, karyotypes and provides a unified presentation of the rat genome sequence and integrated rat linkage maps. A set of tools is also available to facilitate the identification and characterization of rat QTLs, as well as the estimation of exon/intron number and sizes in individual rat genes. Furthermore, comparative gene maps of rat in regard to mouse and human are provided.

  14. MapApp: A Java(TM) Applet for Accessing Geographic Databases

    NASA Astrophysics Data System (ADS)

Haxby, W.; Carbotte, S.; Ryan, W. B.; O'Hara, S.

    2001-12-01

MapApp (http://coast.ldeo.columbia.edu/help/MapApp.html) is a prototype Java(TM) applet that is intended to give easy and versatile access to geographic data sets through a web browser. It was developed initially to interface with the RIDGE Multibeam Synthesis. Subsequently, interfaces with other geophysical databases were added. At present, multibeam bathymetry grids, underway geophysics along ship tracks, and the LDEO Borehole Research Group's ODP well logging database are accessible through MapApp. We plan to add an interface with the Ridge Petrology Database in the near future. The central component of MapApp is a world physiographic map. Users may navigate around the map (zoom/pan) without waiting for HTTP requests to a remote server to be processed. A focus request loads image tiles from the server to compose a new map at the current viewing resolution. Areas in which multibeam grids are available may be focused to a pixel resolution of about 200 m. These areas may be identified by toggling a mask. Databases may be accessed through menus, and selected data objects may be loaded into MapApp by selecting items from tables. Once loaded, a bathymetry grid may be contoured or used to create bathymetric profiles; ship tracks and ODP sites may be overlain on the map and their geophysical data plotted in X-Y graphs. The advantage of applets over traditional web pages is that they permit dynamic interaction with data sets while limiting time-consuming interaction with a remote server. Users may customize the graphics display by modifying the scale, the symbol or line characteristics of rendered data, the contour interval, etc. The ease with which users can select areas, view the physiography of areas, and preview data sets and evaluate them for quality and applicability makes MapApp a valuable tool for education and research.
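Tile-based map browsers of this kind typically index pre-rendered image tiles by zoom level, column, and row, so a focus request only fetches the handful of tiles covering the current view. A generic Web-Mercator ("slippy map") tile computation illustrates the idea (the abstract does not describe MapApp's actual tiling scheme, so this is the standard scheme, not necessarily MapApp's):

```python
import math

def tile_for(lon, lat, zoom):
    """Column/row indices of the Web-Mercator map tile containing
    (lon, lat) at the given zoom level, where zoom z yields a 2^z x 2^z
    grid of tiles covering the world."""
    n = 2 ** zoom
    # Column: linear in longitude.
    x = int((lon + 180.0) / 360.0 * n)
    # Row: Mercator projection of latitude, row 0 at the top (north).
    lat_r = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_r) + 1.0 / math.cos(lat_r)) / math.pi)
            / 2.0 * n)
    return x, y
```

A client composes the visible map by requesting every tile whose indices fall inside the viewport's bounding box, which keeps each pan or zoom down to a few small image fetches.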

  15. Coastal resource and sensitivity mapping of Vietnam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Odin, L.M.

    1997-08-01

    This paper describes a project to establish a relationship between environmental sensitivity (primarily to oil pollution) and response planning and prevention priorities for Vietnamese coastal regions. An inventory of coastal environmental sensitivity and the creation of index mapping was performed. Satellite and geographical information system data were integrated and used for database creation. The database was used to create a coastal resource map, coastal sensitivity map, and a field inventory base map. The final coastal environment sensitivity classification showed that almost 40 percent of the 7448 km of mapped shoreline has a high to medium high sensitivity to oil pollution.

  16. Geologic map of the eastern part of the Challis National Forest and vicinity, Idaho

    USGS Publications Warehouse

    Wilson, A.B.; Skipp, B.A.

    1994-01-01

    The paper version of the Geologic Map of the eastern part of the Challis National Forest and vicinity, Idaho was compiled by Anna Wilson and Betty Skipp in 1994. The geology was compiled on a 1:250,000-scale topographic base map. TechniGraphic System, Inc., of Fort Collins, Colorado, digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a GIS database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.

  17. The future of bibliographic standards in a networked information environment

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The main mission of the CENDI Cataloging Working Group is to provide guidelines for cataloging practices that support the sharing of database records among the CENDI agencies, and that incorporate principles based on cost effectiveness and efficiency. Recent efforts include the extension of COSATI Guidelines for the Cataloging of Technical Reports to include non-print materials, and the mapping of each agency's export file structure to USMARC. Of primary importance is the impact of electronic documents and the distributed nature of the networked information environment. Topics discussed during the workshop include the following: Trade-offs in Cataloging and Indexing Internet Information; The Impact on Current and Future Standards; A Look at WWW Metadata Initiatives; Standards for Electronic Journals; The Present and Future Search Engines; The Roles for Text Analysis Software; Advanced Search Engine Meets Metathesaurus; Locator Schemes for Internet Resources; Identifying and Cataloging Web Document Types; In Search of a New Bibliographic Record. The videos in this set include viewgraphs of charts and related materials of the workshop.

  18. TOXMAP: A GIS-Based Gateway to Environmental Health Resources

    PubMed Central

    Hochstein, Colette; Szczur, Marti

    2009-01-01

    The National Library of Medicine (NLM) has an extensive collection of environmental health information, including bibliographic and technical data on hazardous chemical substances, in its TOXNET databases. TOXNET also provides access to the United States Environmental Protection Agency's (EPA) Toxics Release Inventory (TRI) data, which cover releases of specific chemicals via air, water, and land, and by underground injection, as reported by industrial facilities around the United States. NLM has developed a Web-based geographic information system (GIS), TOXMAP, which allows users to create dynamic maps that show where TRI chemicals are released and that provides direct links to information about the chemicals in TOXNET. By extracting the associated regional geographic text terms from the displayed map (e.g., rivers, towns, county, state), TOXMAP also provides customized chemical- and/or region-specific searches of NLM's bibliographic biomedical resources. This paper focuses on TOXMAP's features, data accuracy issues, challenges, user feedback techniques, and future directions. PMID:16893844

  19. History and use of remote sensing for conservation and management of federal lands in Alaska, USA

    USGS Publications Warehouse

    Markon, Carl

    1995-01-01

    Remote sensing has been used to aid land use planning efforts for federal public lands in Alaska since the 1940s. Four federal land management agencies (the U.S. Fish and Wildlife Service, U.S. Bureau of Land Management, U.S. National Park Service, and U.S. Forest Service) have used aerial photography and satellite imagery to document the extent, type, and condition of Alaska's natural resources. Aerial photographs have been used to collect detailed information over small to medium-sized areas. This standard management tool is obtainable using equipment ranging from hand-held 35-mm cameras to precision metric mapping cameras. Satellite data, equally important, provide synoptic views of landscapes, can be manipulated digitally, and are easily merged with other digital databases. To date, over 109.2 million ha (72%) of Alaska's land cover has been mapped via remote sensing. This information has provided a base for conservation, management, and planning on federal public lands in Alaska.

  20. Candidate gene database and transcript map for peach, a model species for fruit trees.

    PubMed

    Horn, Renate; Lecouls, Anne-Claire; Callahan, Ann; Dandekar, Abhaya; Garay, Lilibeth; McCord, Per; Howad, Werner; Chan, Helen; Verde, Ignazio; Main, Doreen; Jung, Sook; Georgi, Laura; Forrest, Sam; Mook, Jennifer; Zhebentyayeva, Tatyana; Yu, Yeisoo; Kim, Hye Ran; Jesudurai, Christopher; Sosinski, Bryon; Arús, Pere; Baird, Vance; Parfitt, Dan; Reighard, Gregory; Scorza, Ralph; Tomkins, Jeffrey; Wing, Rod; Abbott, Albert Glenn

    2005-05-01

    Peach (Prunus persica) is a model species for the Rosaceae, which includes a number of economically important fruit tree species. To develop an extensive Prunus expressed sequence tag (EST) database for identifying and cloning the genes important to fruit and tree development, we generated 9,984 high-quality ESTs from a peach cDNA library of developing fruit mesocarp. After assembly and annotation, a putative peach unigene set consisting of 3,842 ESTs was defined. Gene ontology (GO) classification was assigned based on the annotation of the single "best hit" match against the Swiss-Prot database. No significant homology could be found in the GenBank nr databases for 24.3% of the sequences. Using core markers from the general Prunus genetic map, we anchored bacterial artificial chromosome (BAC) clones on the genetic map, thereby providing a framework for the construction of a physical and transcript map. A transcript map was developed by hybridizing 1,236 ESTs from the putative peach unigene set and an additional 68 peach cDNA clones against the peach BAC library. Hybridizing ESTs to genetically anchored BACs immediately localized 11.2% of the ESTs on the genetic map. ESTs showed a clustering of expressed genes in defined regions of the linkage groups. [The data were built into a regularly updated Genome Database for Rosaceae (GDR), available at http://www.genome.clemson.edu/gdr/.]

  1. Cadastral Positioning Accuracy Improvement: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Hashim, N. M.; Omar, A. H.; Omar, K. M.; Abdullah, N. M.; Yatim, M. H. M.

    2016-09-01

    A cadastral map is a parcel-based information product specifically designed to define the limits of property boundaries. In Malaysia, the cadastral map is under the authority of the Department of Surveying and Mapping Malaysia (DSMM). With the growth of spatial technology, especially geographic information systems (GIS), DSMM decided to modernize and reform its cadastral legacy datasets by generating an accurate digital representation of cadastral parcels. These legacy databases are usually derived from paper parcel maps known as certified plans. As a result of this modernization, the new cadastral database is no longer based on single, static parcel paper maps but on a global digital map. Despite the strict process of cadastral modernization, the reform has raised unexpected questions that remain essential to address. The main focus of this study is to review the issues generated by this transition. The transformed cadastral database must be further treated to minimize inherent errors and to fit the parcels to the new satellite-based coordinate system with high positional accuracy. The results of this review will serve as a foundation for investigating a systematic and effective method of Positional Accuracy Improvement (PAI) in cadastral database modernization.

  2. Spatial digital database for the geologic map of the east part of the Pullman 1° x 2° quadrangle, Idaho

    USGS Publications Warehouse

    Rember, William C.; Bennett, Earl H.

    2001-01-01

    The paper geologic map of the east part of the Pullman 1° x 2° quadrangle, Idaho (Rember and Bennett, 1979) was scanned and initially attributed by Optronics Specialty Co., Inc. (Northridge, CA) and submitted to the U.S. Geological Survey for further attribution and publication of the geospatial digital files. The resulting digital geologic map GIS can be queried in many ways to produce a variety of geologic maps. This digital geospatial database is one of many being created by the U.S. Geological Survey as part of an ongoing effort to provide geologic information in a geographic information system (GIS) for use in spatial analysis. Digital base map data files (topography, roads, towns, rivers and lakes, and others) are not included; they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:250,000 (for example, 1:100,000 or 1:24,000). The digital geologic map graphics and plot files (pull250k.gra/.hp/.eps) that are provided in the digital package are representations of the digital database.

  3. The Handling of Hazard Data on a National Scale: A Case Study from the British Geological Survey

    NASA Astrophysics Data System (ADS)

    Royse, Katherine R.

    2011-11-01

    This paper reviews how hazard data and geological map data have been combined by the British Geological Survey (BGS) to produce a set of GIS-based national-scale hazard susceptibility maps for the UK. This work has been carried out over the last 9 years and as such reflects the combined outputs of a large number of researchers at BGS. The paper details the inception of these datasets from the development of the seamless digital geological map in 2001 through to the deterministic 2D hazard models produced today. These datasets currently include landslides, shrink-swell, soluble rocks, compressible and collapsible deposits, groundwater flooding, geological indicators of flooding, radon potential and potentially harmful elements in soil. These models have been created using a combination of expert knowledge (from both within BGS and from outside bodies such as the Health Protection Agency), national databases (which contain data collected over the past 175 years), multi-criteria analysis within geographical information systems and a flexible rule-based approach for each individual geohazard. By using GIS in this way, it has been possible to model the distribution and degree of geohazards across the whole of Britain.

  4. Shared patients: multiple health and social care contact.

    PubMed

    Keene, J; Swift, L; Bailey, S; Janacek, G

    2001-07-01

    The paper describes results from the 'Tracking Project', a new method for examining agency overlap, repeat service use and shared clients/patients amongst social and health care agencies in the community. This is the first project in this country to combine total population databases from a range of social care, health care and criminal justice agencies into a multidisciplinary database for one county (n = 97,162 cases), through standardised anonymisation of agency databases using SOUNDEX, a phonetic name-encoding program. A range of 20 community social and health care agencies were shown to have a large overlap with each other over a two-year period, indicating high proportions of shared patients/clients. Accident and Emergency is used as an example of major overlap: 16.2% (n = 39,992) of persons who attended a community agency had attended Accident and Emergency, compared to 8.2% of the total population of the county (n = 775,000). Of those who had attended seven or more different community agencies, 96% had also attended Accident and Emergency. Further statistical analysis of Accident and Emergency attendance as a characteristic of community agency populations (n = 39,992) revealed that increasing frequency of attendance at Accident and Emergency was very strongly associated with increasing use of other services. That is, patients who repeatedly attend Accident and Emergency are much more likely to attend other agencies as well, suggesting that agencies share their more problematic or difficult patients. Research questions arising from these data are discussed, and future research methods are suggested for deriving predictors from the database and developing screening instruments to identify multiple agency attenders for targeting or multidisciplinary working. It is suggested that Accident and Emergency attendance might serve as an important predictor of multiple agency attendance.
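    The standardised anonymisation step relies on Soundex, which maps similar-sounding surnames to the same four-character key so records can be matched across agency databases without storing names in clear text. A minimal sketch of the standard American Soundex algorithm follows; the paper does not describe its exact implementation, so this is an illustration of the general technique, not the project's software.

    ```python
    def soundex(name: str) -> str:
        """Standard American Soundex: first letter plus three digits.
        Vowels separate duplicate codes; 'h' and 'w' do not."""
        codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
                 **dict.fromkeys("dt", "3"), "l": "4",
                 **dict.fromkeys("mn", "5"), "r": "6"}
        name = "".join(ch for ch in name.lower() if ch.isalpha())
        if not name:
            return ""
        first = name[0].upper()
        digits = []
        prev = codes.get(name[0], "")
        for ch in name[1:]:
            if ch in "hw":
                continue  # h/w are skipped and do not reset the previous code
            code = codes.get(ch, "")
            if code and code != prev:
                digits.append(code)
            prev = code  # vowels reset prev, so repeats after a vowel count
        return (first + "".join(digits) + "000")[:4]
    ```

    Because "Robert" and "Rupert" both encode to R163, two agencies can link the same person's records by comparing codes rather than raw names, which is the basis of the anonymised matching described above.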

  5. Collaborative and Multilingual Approach to Learn Database Topics Using Concept Maps

    PubMed Central

    Calvo, Iñaki

    2014-01-01

    The authors report on a study using the concept mapping technique in computer engineering education for learning theoretical introductory database topics. In addition, the learning of multilingual technical terminology by means of the collaborative drawing of a concept map is also pursued in this experiment. The main characteristics of a study carried out in the database subject at the University of the Basque Country during the 2011/2012 academic year are described. This study contributes to the field of concept mapping, as these kinds of cognitive tools have proved valid for supporting learning in computer engineering education. It also contributes to the field of computer engineering education, providing a technique that can be incorporated for several educational purposes within the discipline. Results reveal the potential that a collaborative concept map editor offers to fulfil the above-mentioned objectives. PMID:25538957

  6. Sequencing of cDNA Clones from the Genetic Map of Tomato (Lycopersicon esculentum)

    PubMed Central

    Ganal, Martin W.; Czihal, Rosemarie; Hannappel, Ulrich; Kloos, Dorothee-U.; Polley, Andreas; Ling, Hong-Qing

    1998-01-01

    The dense RFLP linkage map of tomato (Lycopersicon esculentum) contains >300 anonymous cDNA clones. Of those clones, 272 were partially or completely sequenced. The sequences were compared at the DNA and protein level to known genes in databases. For 57% of the clones, a significant match to previously described genes was found. The information will permit the conversion of those markers to STS markers and allow their use in PCR-based mapping experiments. Furthermore, it will facilitate the comparative mapping of genes across distantly related plant species by direct comparison of DNA sequences and map positions. [cDNA sequence data reported in this paper have been submitted to the EMBL database under accession nos. AA824695–AA825005 and the dbEST_Id database under accession nos. 1546519–1546862.] PMID:9724330

  7. Monitoring and analysis of the change process in curriculum mapping compared to the National Competency-based Learning Objective Catalogue for Undergraduate Medical Education (NKLM) at four medical faculties. Part I: Conducive resources and structures

    PubMed Central

    Lammerding-Koeppel, Maria; Giesler, Marianne; Gornostayeva, Maryna; Narciss, Elisabeth; Wosnik, Annette; Zipfel, Stephan; Griewatz, Jan; Fritze, Olaf

    2017-01-01

    Objective: After passage of the National Competency-based Learning Objectives Catalogue in Medicine (Nationaler Kompetenzbasierter Lernzielkatalog Medizin [NKLM, retrieved on 22.03.2016]), the German medical faculties must take inventory and develop their curricula. NKLM contents are expected to be present in locally grown curricula, but not linked well or sensibly enough. Learning and examination formats must be reviewed for appropriateness and coverage of the competences. The necessary curricular transparency is best achieved by systematic curriculum mapping combined with effective change management. Mapping a complex existing curriculum, and convincing a faculty that this will have benefits, is not easy. Headed by Tübingen, the faculties of Freiburg, Heidelberg, Mannheim and Tübingen took inventory by mapping their curricula against the NKLM, using the dedicated web-based MERLIN database. This two-part article analyses and summarises how NKLM curriculum mapping could succeed in spite of resistance at the faculties. The aim is to convey the widest possible overview of beneficial framework conditions, strategies and results. Part I of the article shows the beneficial resources and structures required for implementation of curriculum mapping at the faculties. Part II describes key factors relevant for motivating faculties and teachers during the mapping process. Method: The network project was systematically planned in advance according to the steps of project and change management, and regularly reflected on and adjusted together in workshops and semi-annual project meetings.
From the beginning of the project, a grounded-theory approach was used to systematically collect detailed information on structures, measures and developments at the faculties using various sources and methods, to continually analyse them and to draw a final conclusion (sources: surveys among the project participants with questionnaires, semi-structured group interviews and discussions, guideline-supported individual interviews, informal surveys, evaluation of target agreements and protocols, openly discernible local, regional or over-regional structure-relevant events). Results: The following resources and structures support implementation of curriculum mapping at a faculty: Setting up a coordination agency (≥50% of a full position; support by student assistants), systematic project management, and development of organisation and communication structures with integration of the dean of study and teaching and pilot departments, as well as development of a user-friendly web-based mapping instrument. Acceptance of the mapping was increased particularly by visualisation of the results and early insight into indicative results relevant for the department. Conclusion: Successful NKLM curriculum mapping requires trained staff for coordination, resilient communication structures and a user-oriented mapping database. In alignment with literature, recommendations can be derived to support other faculties that want to map their curriculum. PMID:28293674

  8. Monitoring and analysis of the change process in curriculum mapping compared to the National Competency-based Learning Objective Catalogue for Undergraduate Medical Education (NKLM) at four medical faculties. Part I: Conducive resources and structures.

    PubMed

    Lammerding-Koeppel, Maria; Giesler, Marianne; Gornostayeva, Maryna; Narciss, Elisabeth; Wosnik, Annette; Zipfel, Stephan; Griewatz, Jan; Fritze, Olaf

    2017-01-01

    Objective: After passage of the National Competency-based Learning Objectives Catalogue in Medicine (Nationaler Kompetenzbasierter Lernzielkatalog Medizin [NKLM, retrieved on 22.03.2016]), the German medical faculties must take inventory and develop their curricula. NKLM contents are expected to be present in locally grown curricula, but not linked well or sensibly enough. Learning and examination formats must be reviewed for appropriateness and coverage of the competences. The necessary curricular transparency is best achieved by systematic curriculum mapping combined with effective change management. Mapping a complex existing curriculum, and convincing a faculty that this will have benefits, is not easy. Headed by Tübingen, the faculties of Freiburg, Heidelberg, Mannheim and Tübingen took inventory by mapping their curricula against the NKLM, using the dedicated web-based MERLIN database. This two-part article analyses and summarises how NKLM curriculum mapping could succeed in spite of resistance at the faculties. The aim is to convey the widest possible overview of beneficial framework conditions, strategies and results. Part I of the article shows the beneficial resources and structures required for implementation of curriculum mapping at the faculties. Part II describes key factors relevant for motivating faculties and teachers during the mapping process. Method: The network project was systematically planned in advance according to the steps of project and change management, and regularly reflected on and adjusted together in workshops and semi-annual project meetings.
From the beginning of the project, a grounded-theory approach was used to systematically collect detailed information on structures, measures and developments at the faculties using various sources and methods, to continually analyse them and to draw a final conclusion (sources: surveys among the project participants with questionnaires, semi-structured group interviews and discussions, guideline-supported individual interviews, informal surveys, evaluation of target agreements and protocols, openly discernible local, regional or over-regional structure-relevant events). Results: The following resources and structures support implementation of curriculum mapping at a faculty: Setting up a coordination agency (≥50% of a full position; support by student assistants), systematic project management, and development of organisation and communication structures with integration of the dean of study and teaching and pilot departments, as well as development of a user-friendly web-based mapping instrument. Acceptance of the mapping was increased particularly by visualisation of the results and early insight into indicative results relevant for the department. Conclusion: Successful NKLM curriculum mapping requires trained staff for coordination, resilient communication structures and a user-oriented mapping database. In alignment with literature, recommendations can be derived to support other faculties that want to map their curriculum.

  9. USEPA Grants

    EPA Pesticide Factsheets

    This is a provisional dataset that contains point locations for all grants given out by the USEPA from the 1960s through today. There are many limitations to the data, so it is advised that these metadata be read carefully before use. Although the records for these grant locations are drawn directly from the official EPA grants repository (IGMS, the Integrated Grants Management System), it is important to know that IGMS was designed for purposes that did not include accurately portraying a grant's place of performance on a map. Instead, the IGMS grant recipient's mailing address is the primary source for grant locations. Particularly for statewide grants that are administered via state and regional headquarters, the grant location data should not be interpreted as the grant's place of performance. In 2012, a policy was established to begin collecting the place of performance as a pilot for newly awarded grants deemed "community-based" in nature; for these, the grant location depicted in this database will be a more reliable indicator of the actual place of performance. As for the locational accuracy of these points, there is no programmatic certification process; however, they are entered by the Grant Project Officers, who are most familiar with the details of the grants apart from the grantees themselves. Limitations notwithstanding, this is a first attempt to map all of the Agency's grants, using the best available information.

  10. GIS INTERNET MAP SERVICE FOR DISPLAYING SELENIUM CONTAMINATION DATA IN THE SOUTHEASTERN IDAHO PHOSPHATE MINING RESOURCE AREA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roger Mayes; Sera White; Randy Lee

    2005-04-01

    Selenium is present in waste rock/overburden that is removed during phosphate mining in southeastern Idaho. Waste rock piles or rock used during reclamation can be a source of selenium (and other metals) to streams and vegetation. Some instances (in 1996) of selenium toxicity in grazing sheep and horses caused public health and environmental concerns, leading to Idaho Department of Environmental Quality (DEQ) involvement. The Selenium Information System Project is a collaboration among the DEQ, the United States Forest Service (USFS), the Bureau of Land Management (BLM), the Idaho Mining Association (IMA), Idaho State University (ISU), and the Idaho National Laboratory (INL). The Selenium Information System is a centralized data repository for southeastern Idaho selenium data. The data repository combines information that was previously scattered across numerous agency, mining company, and consultants' databases and web sites. These data include selenium concentrations in soil, water, sediment, vegetation and other environmental media, as well as comprehensive mine information. The Idaho DEQ spearheaded a selenium area-wide investigation through voluntary agreements with the mining companies and interagency participants. The Selenium Information System contains the results of that area-wide investigation and many other background documents. As studies are conducted and remedial action decisions are made, the resulting data and documentation will be stored within the information system. Potential users of the information system include agency officials, students, lawmakers, mining company personnel, teachers, researchers, and the general public. The system, available from a central website, consists of a database that contains the area-wide sampling information and an ESRI ArcIMS map server. The user can easily acquire information pertaining to the area-wide study as well as the final area-wide report.
Future work on this project includes creating custom tools to simplify the website and increasing the amount of information available from site-specific studies at 15 mines.

  11. 76 FR 72929 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-28

    ... proposed information collection project: ``Medical Office Survey on Patient Safety Culture Comparative... Medical Office Survey on Patient Safety Culture Comparative Database. The Agency for Healthcare Research... Patient Safety Culture (Medical Office SOPS) Comparative Database. The Medical Office SOPS Comparative...

  12. 75 FR 24723 - Agency Information Collection Activities: Proposed Collection; Comment Request, 1660-0037...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ... Insurance Program Maps (Spanish). SUMMARY: The Federal Emergency Management Agency, as part of its... Flood Insurance Program Maps (Spanish). Abstract: FEMA Forms 086-0-22 and 086-0-22A are designed to...,389 Single Residential Lot or Structure Amendments to National Flood Insurance Program Maps (Spanish...

  13. Digital geologic map of the Coeur d'Alene 1:100,000 quadrangle, Idaho and Montana

    USGS Publications Warehouse

    digital compilation by Munts, Steven R.

    2000-01-01

    Between 1961 and 1969, Alan Griggs and others conducted fieldwork to prepare a geologic map of the Spokane 1:250,000 quadrangle (Griggs, 1973). Their field observations were posted on paper copies of 15-minute quadrangle maps. In 1999, the USGS contracted with the Idaho Geological Survey to prepare a digital version of the Coeur d’Alene 1:100,000 quadrangle. To facilitate this work, the USGS obtained the field maps prepared by Griggs and others from the USGS Field Records Library in Denver, Colorado. The Idaho Geological Survey (IGS) digitized these maps and used them in its mapping program. The mapping focused on field checks to resolve problems in poorly known areas and in areas of disagreement between adjoining maps. The IGS is currently preparing a final digital spatial database for the Coeur d’Alene 1:100,000 quadrangle. However, there was an immediate need for a digital version of the geologic map of the Coeur d’Alene 1:100,000 quadrangle, so the data from the field sheets, along with several other sources, were assembled to produce this interim product: the digital geologic map of the Coeur d’Alene 1:100,000 quadrangle, Idaho and Montana. It was compiled from the preliminary digital files prepared by the Idaho Geological Survey and supplemented by data from Griggs (1973) and from digital databases by Bookstrom and others (1999) and Derkey and others (1996). The resulting digital geologic map (GIS) database can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included; they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:100,000 (e.g., 1:62,500 or 1:24,000). The digital geologic map graphics (of00-135_map.pdf) that are provided are representations of the digital database. The map area is located in north Idaho.
This open-file report describes the geologic map units, the methods used to convert the geologic map data into a digital format, the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet.
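    The claim that such a database "can be queried in many ways" can be illustrated with a minimal attribute query over map-unit records, the kind of selection used to derive a thematic map. The feature records, unit codes, and field names below are hypothetical examples, not contents of the Coeur d’Alene database.

    ```python
    # Each feature is a record with a unit code and attributes; a GIS stores
    # these alongside polygon geometry, which is omitted here for brevity.
    features = [
        {"unit": "Yb",  "name": "Burke Formation",    "age": "Mesoproterozoic"},
        {"unit": "Yp",  "name": "Prichard Formation", "age": "Mesoproterozoic"},
        {"unit": "Qal", "name": "Alluvium",           "age": "Quaternary"},
    ]

    def select_units(records, age):
        """Return the unit codes of all map units of a given age."""
        return [r["unit"] for r in records if r["age"] == age]
    ```

    Swapping the predicate (age, lithology, unit name) yields a different derivative map from the same database, which is the practical payoff of digitizing the paper map.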

  14. Researchermap: a tool for visualizing author locations using Google maps.

    PubMed

    Rastegar-Mojarad, Majid; Bales, Michael E; Yu, Hong

    2013-01-01

    We present ResearcherMap, a tool to visualize the locations of authors of scholarly papers. In response to a query, the system returns a map of author locations. To develop the system, we first populated a database of author locations, geocoding institution locations for all available institutional affiliation data in our database. The database includes all authors of Medline papers from 1990 to 2012. We conducted a formative heuristic usability evaluation of the system and measured the system's accuracy and performance. The system identifies the correct address with 97.5% accuracy.
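    The core pipeline, resolving each institutional affiliation to coordinates and then plotting them, can be sketched as a lookup against a geocoded gazetteer. The gazetteer entries, function names, and URL form below are illustrative assumptions, not ResearcherMap's actual data or interface.

    ```python
    # Hypothetical stand-in for a geocoded institution database:
    # affiliation string -> (latitude, longitude).
    GAZETTEER = {
        "Columbia University": (40.8075, -73.9626),
        "University of Wisconsin-Milwaukee": (43.0766, -87.8824),
    }

    def locate_authors(affiliations):
        """Map each known affiliation string to (lat, lon); skip unknowns."""
        return {a: GAZETTEER[a] for a in affiliations if a in GAZETTEER}

    def marker_url(lat, lon):
        """Build a map URL centered on one author location (Google Maps
        query-string form, used here as an illustrative target)."""
        return f"https://www.google.com/maps?q={lat},{lon}"
    ```

    Affiliations missing from the gazetteer simply drop out of the result, which mirrors why reported accuracy depends on how completely the affiliation data could be geocoded.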

  15. User Generated Spatial Content Sources for Land Use/Land Cover Validation Purposes: Suitability Analysis and Integration Model

    NASA Astrophysics Data System (ADS)

    Estima, Jacinto Paulo Simoes

    Traditional geographic information has been produced by mapping agencies and corporations, using highly skilled people as well as expensive precision equipment and procedures, in a very costly approach. The production of land use and land cover databases is just one example of this traditional approach. On the other hand, the amount of geographic information created and shared by citizens through the Web has increased exponentially during the last decade, resulting from the emergence and popularization of technologies such as Web 2.0, cloud computing, GPS, and smartphones. Such a comprehensive amount of free geographic data may contain valuable information to extract, opening great possibilities for significantly improving the production of land use and land cover databases. In this thesis we explored the feasibility of using geographic data from different user-generated spatial content initiatives in the process of land use and land cover database production. Data from Panoramio, Flickr and OpenStreetMap were explored in terms of their spatial and temporal distribution, and their distribution over the different land use and land cover classes. We then proposed a conceptual model to integrate data from suitable user-generated spatial content initiatives, based on dissimilarities identified among a comprehensive list of initiatives. Finally, we developed a prototype implementing the proposed integration model, which was validated by using it to solve four identified use cases. We concluded that data from user-generated spatial content initiatives have great value but should be integrated to increase their potential. The feasibility of integrating data from such initiatives in an integration model was demonstrated, and the relevance of the integration model was shown for different use cases.

  16. Bedrock geologic map of the Worcester South quadrangle, Worcester County, Massachusetts

    USGS Publications Warehouse

    Walsh, Gregory J.; Merschat, Arthur J.

    2015-09-29

The bedrock geology was mapped to study the tectonic history of the area and to provide a framework for ongoing hydrogeologic characterization of the fractured bedrock of Massachusetts. This report presents mapping by Gregory J. Walsh and Arthur J. Merschat from 2008 to 2010. The report consists of a map and GIS database, both of which are available for download at http://dx.doi.org/10.3133/sim3345. The database includes contacts of bedrock geologic units, faults, outcrop locations, structural information, and photographs.

  17. 76 FR 72931 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-28

    ... Systems (CAHPS) Clinician and Group Survey Comparative Database.'' In accordance with the Paperwork... Providers and Systems (CAHPS) Clinician and Group Survey Comparative Database The Agency for Healthcare..., and provided critical data illuminating key aspects of survey design and administration. In July 2007...

  18. 77 FR 4038 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-26

    ... proposed information collection project: ``Nursing Home Survey on Patient Safety Culture Comparative...: Proposed Project Nursing Home Survey on Patient Safety Culture Comparative Database The Agency for... Nursing Home Survey on Patient Safety Culture (Nursing Home SOPS) Comparative Database. The Nursing Home...

  19. 76 FR 15953 - Agency Information Collection Activities; Announcement of Office of Management and Budget...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-22

    ... CONSUMER PRODUCT SAFETY COMMISSION Agency Information Collection Activities; Announcement of Office of Management and Budget Approval; Publicly Available Consumer Product Safety Information Database... Product Safety Information Database has been approved by the Office of Management and Budget (OMB) under...

  20. The status of soil mapping for the Idaho National Engineering Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olson, G.L.; Lee, R.D.; Jeppesen, D.J.

This report discusses the production of a revised version of the general soil map of the 2,304-km² (890-mi²) Idaho National Engineering Laboratory (INEL) site in southeastern Idaho and the production of a geographic information system (GIS) soil map and supporting database. The revised general soil map replaces an INEL soil map produced in 1978 and incorporates the most current information on INEL soils. The general soil map delineates large soil associations based on Natural Resources Conservation Service [formerly the Soil Conservation Service (SCS)] principles of soil mapping. The GIS map incorporates detailed information that could not be presented on the general soil map and is linked to a database that contains the soil map unit descriptions, surficial geology codes, and other pertinent information.

  1. RatMap—rat genome tools and data

    PubMed Central

    Petersen, Greta; Johnson, Per; Andersson, Lars; Klinga-Levan, Karin; Gómez-Fabre, Pedro M.; Ståhl, Fredrik

    2005-01-01

    The rat genome database RatMap (http://ratmap.org or http://ratmap.gen.gu.se) has been one of the main resources for rat genome information since 1994. The database is maintained by CMB–Genetics at Göteborg University in Sweden and provides information on rat genes, polymorphic rat DNA-markers and rat quantitative trait loci (QTLs), all curated at RatMap. The database is under the supervision of the Rat Gene and Nomenclature Committee (RGNC); thus much attention is paid to rat gene nomenclature. RatMap presents information on rat idiograms, karyotypes and provides a unified presentation of the rat genome sequence and integrated rat linkage maps. A set of tools is also available to facilitate the identification and characterization of rat QTLs, as well as the estimation of exon/intron number and sizes in individual rat genes. Furthermore, comparative gene maps of rat in regard to mouse and human are provided. PMID:15608244

  2. Preliminary Geologic Map of the Buxton 7.5' Quadrangle, Washington County, Oregon

    USGS Publications Warehouse

    Dinterman, Philip A.; Duvall, Alison R.

    2009-01-01

    This map, compiled from previously published and unpublished data, and new mapping by the authors, represents the general distribution of bedrock and surficial deposits of the Buxton 7.5-minute quadrangle. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:24,000 or smaller. This plot file and accompanying database depict the distribution of geologic materials and structures at a regional (1:24,000) scale. The report is intended to provide geologic information for the regional study of materials properties, earthquake shaking, landslide potential, mineral hazards, seismic velocity, and earthquake faults. In addition, the report contains new information and interpretations about the regional geologic history and framework. However, the regional scale of this report does not provide sufficient detail for site development purposes.

  3. Monitoring Colonias Development along the United States-Mexico Border: A Process Application using GIS and Remote Sensing in Douglas, Arizona, and Agua Prieta, Sonora

    USGS Publications Warehouse

    Norman, Laura M.; Donelson, Angela J.; Pfeifer, Edwin L.; Lam, Alven H.; Osborn, Kenneth J.

    2004-01-01

    The U.S. Department of Housing and Urban Development (HUD) and the U.S. Geological Survey (USGS) have developed a joint project to create Internet-enabled geographic information systems (GIS) that will help cities along the United States-Mexico border deal with issues related to colonias. HUD defines colonias as rural neighborhoods in the United States-Mexico border region that lack adequate infrastructure or housing and other basic services. They typically have high poverty rates that make it difficult for residents to pay for roads, sanitary water and sewer systems, decent housing, street lighting, and other services through assessment. Many Federal agencies recognize colonias designations and provide funding assistance. It is the intention of this project to empower Arizona-Sonora borderland neighborhoods and community members by recognizing them as colonias. This recognition will result in eligibility for available economic subsidies and accessibility to geospatial tools and information for urban planning. The steps to achieve this goal include delineation of colonia-like neighborhoods, identification of their urbanization over time, development of geospatial databases describing their infrastructure, and establishment of a framework for distributing Web-based GIS decision support systems. A combination of imagery and infrastructure information was used to help delineate colonia boundaries. A land-use change analysis, focused on urbanization in the cities over a 30-year timeframe, was implemented. The results of this project are being served over the Internet, providing data to the public as well as to participating agencies. One of the initial study areas for this project was the City of Douglas, Ariz., and its Mexican sister-city Agua Prieta, Sonora, which are described herein. 
Because of its location on the border, this twin-cities area is especially well suited to international manufacturing and commerce, which has, in turn, led to an uncontrolled spread of colonias. The USGS worked with local organizations in developing the Web-based GIS database. Community involvement ensured that the database and map server would meet the current and long-term needs of the communities and end users. Partners include Federal agencies, State agencies, county officials, town representatives, universities, and youth organizations, as well as interested local advocacy groups and individuals. A significant component of this project was development of relationships and partnerships in the border towns for facilitating binational approaches to land management.

  4. Geologic and structure map of the Choteau 1 degree by 2 degrees Quadrangle, western Montana

    USGS Publications Warehouse

    Mudge, Melville R.; Earhart, Robert L.; Whipple, James W.; Harrison, Jack E.

    1982-01-01

The geologic and structure map of the Choteau 1 x 2 degree quadrangle (Mudge and others, 1982) was originally converted to a digital format by Jeff Silkwood (U.S. Forest Service) and completed by U.S. Geological Survey staff and a contractor at the Spokane Field Office (WA) in 2000 for input into a geographic information system (GIS). The resulting digital geologic map (GIS) database can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included; they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:250,000 (e.g., 1:100,000 or 1:24,000). The digital geologic map graphics and plot files (chot250k.gra/.hp/.eps and chot-map.pdf) that are provided in the digital package are representations of the digital database. They are not designed to be cartographic products.

  5. Geothermal NEPA Database on OpenEI (Poster)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, K. R.; Levine, A.

    2014-09-01

The National Renewable Energy Laboratory (NREL) developed the Geothermal National Environmental Policy Act (NEPA) Database as a platform for government agencies and industry to access and maintain information related to geothermal NEPA documents. The data were collected to inform analyses of NEPA timelines, and the collected data were made publicly available via this tool in case others might find the data useful. NREL staff and contractors collected documents from agency websites, during visits to the two busiest Bureau of Land Management (BLM) field offices for geothermal development, and through email and phone call requests to other BLM field offices. They then entered the information into the database, hosted by Open Energy Information (http://en.openei.org/wiki/RAPID/NEPA). The long-term success of the project will depend on the willingness of federal agencies, industry, and others to populate the database with NEPA and related documents, and to use the data for their own analyses. As the information and capabilities of the database expand, developers and agencies can save time on new NEPA reports by accessing a single location to research related activities, their potential impacts, and previously proposed and imposed mitigation measures. NREL used a wiki platform to allow industry and agencies to maintain the content in the future so that it continues to provide relevant and accurate information to users.

  6. A generic method for improving the spatial interoperability of medical and ecological databases.

    PubMed

    Ghenassia, A; Beuscart, J B; Ficheur, G; Occelli, F; Babykina, E; Chazard, E; Genin, M

    2017-10-03

The availability of big data in healthcare and the intensive development of data reuse and georeferencing have opened up perspectives for health spatial analysis. However, fine-scale spatial studies of ecological and medical databases are limited by the change-of-support problem and thus a lack of spatial unit interoperability. The use of spatial disaggregation methods to solve this problem introduces errors into the spatial estimations. Here, we present a generic, two-step method for merging medical and ecological databases that avoids the use of spatial disaggregation methods while maximizing the spatial resolution. Firstly, a mapping table is created after one or more transition matrices have been defined. The latter link the spatial units of the original databases to the spatial units of the final database. Secondly, the mapping table is validated by (1) comparing the covariates contained in the two original databases, and (2) checking the spatial validity with a spatial continuity criterion and a spatial resolution index. We used our novel method to merge a medical database (the French national diagnosis-related group database, containing 5644 spatial units) with an ecological database (produced by the French National Institute of Statistics and Economic Studies, and containing 36,594 spatial units). The mapping table yielded 5632 final spatial units. The mapping table's validity was evaluated by comparing the number of births in the medical and ecological databases in each final spatial unit. The median [interquartile range] relative difference was 2.3% [0; 5.7]. The spatial continuity criterion was low (2.4%), and the spatial resolution index was greater than that of most French administrative areas. Our innovative approach improves interoperability between medical and ecological databases and facilitates fine-scale spatial analyses.
We have shown that disaggregation models and large aggregation techniques are not necessarily the best ways to tackle the change of support problem.
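The two-step procedure this record describes (a mapping table built from transition matrices linking original spatial units to final units, then validation by comparing a shared covariate per final unit) can be sketched roughly as follows. All unit identifiers and counts here are invented for illustration; they are not taken from the study, and the real method operates on thousands of units.

```python
# Hedged sketch of the two-step merge described in the abstract:
# (1) build a mapping table from transition "matrices" linking each
#     source database's spatial units to shared final units,
# (2) validate by comparing a covariate (e.g. birth counts) per final unit.
# All unit IDs and figures below are illustrative, not from the study.

# Transition mappings: original spatial unit -> final spatial unit
medical_to_final = {"M1": "F1", "M2": "F1", "M3": "F2"}
ecological_to_final = {"E1": "F1", "E2": "F2", "E3": "F2"}

# The same covariate observed independently in each original database
births_medical = {"M1": 120, "M2": 80, "M3": 200}
births_ecological = {"E1": 195, "E2": 90, "E3": 105}

def aggregate(counts, mapping):
    """Sum a covariate over the final spatial units via the mapping table."""
    out = {}
    for unit, value in counts.items():
        final_unit = mapping[unit]
        out[final_unit] = out.get(final_unit, 0) + value
    return out

med = aggregate(births_medical, medical_to_final)        # totals per final unit
eco = aggregate(births_ecological, ecological_to_final)  # totals per final unit

# Step 2: relative difference per final unit as a validity check (percent)
rel_diff = {u: abs(med[u] - eco[u]) / med[u] * 100 for u in med}
print(rel_diff)  # {'F1': 2.5, 'F2': 2.5}
```

In the study itself this check yielded a median relative difference of 2.3% across 5632 final units; the toy figures above are chosen only to land in a similar range.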

  7. Database and Map of Quaternary Faults and Folds in Peru and its Offshore Region

    USGS Publications Warehouse

    Machare, Jose; Fenton, Clark H.; Machette, Michael N.; Lavenu, Alain; Costa, Carlos; Dart, Richard L.

    2003-01-01

This publication consists of a main map of Quaternary faults and folds of Peru, a table of Quaternary fault data, a regional inset map showing relative plate motion, and a second inset map of an enlarged area of interest in southern Peru. These maps and data compilations show evidence for activity of Quaternary faults and folds in Peru and its offshore regions of the Pacific Ocean. The maps show the locations, ages, and activity rates of major earthquake-related features such as faults and fault-related folds. These data are accompanied by text databases that describe these features and document current information on their activity in the Quaternary.

  8. 77 FR 5021 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-01

    ...) Clinician and Group Survey Comparative Database.'' In accordance with the Paperwork Reduction Act, 44 U.S.C... Providers and Systems (CAHPS) Clinician and Group Survey Comparative Database The Agency for Healthcare..., and provided critical data illuminating key aspects of survey design and administration. In July 2007...

  9. 75 FR 16134 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-31

    ... Survey Comparative Database.'' In accordance with the Paperwork Reduction Act, 44 U.S.C. 3501-3520, AHRQ... Comparative Database The Agency for Healthcare Research and Quality (AHRQ) requests that the Office of..., purchasers, and the Centers for Medicare & Medicaid Services (CMS) to provide comparative data to support...

  10. 78 FR 69088 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-18

    ... Comparative Database.'' In accordance with the Paperwork Reduction Act, 44 U.S.C. 3501-3521, AHRQ invites the... Comparative Database Request for information collection approval. The Agency for Healthcare Research and..., purchasers, and the Centers for Medicare & Medicaid Services (CMS) to provide comparative data to support...

  11. 78 FR 51809 - Seventeenth Meeting: RTCA Special Committee 217-Aeronautical Databases Joint With EUROCAE WG-44...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-21

    ... Committee 217--Aeronautical Databases Joint With EUROCAE WG-44--Aeronautical Databases AGENCY: Federal... Committee 217--Aeronautical Databases Joint with EUROCAE WG-44--Aeronautical Databases. SUMMARY: The FAA is... Databases being held jointly with EUROCAE WG-44--Aeronautical Databases. DATES: The meeting will be held...

  12. 78 FR 8684 - Fifteenth Meeting: RTCA Special Committee 217-Aeronautical Databases Joint with EUROCAE WG-44...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-06

    ... Committee 217--Aeronautical Databases Joint with EUROCAE WG-44--Aeronautical Databases AGENCY: Federal... Committee 217--Aeronautical Databases Joint with EUROCAE WG-44--Aeronautical Databases. SUMMARY: The FAA is... Databases being held jointly with EUROCAE WG-44--Aeronautical Databases. DATES: The meeting will be held...

  13. 78 FR 25134 - Sixteenth Meeting: RTCA Special Committee 217-Aeronautical Databases Joint With EUROCAE WG-44...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-29

    ... Committee 217--Aeronautical Databases Joint With EUROCAE WG-44--Aeronautical Databases AGENCY: Federal... Committee 217--Aeronautical Databases Joint with EUROCAE WG-44--Aeronautical Databases. SUMMARY: The FAA is... Databases being held jointly with EUROCAE WG-44--Aeronautical Databases. DATES: The meeting will be held...

  14. 78 FR 66418 - Eighteenth Meeting: RTCA Special Committee 217-Aeronautical Databases Joint With EUROCAE WG-44...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-05

    ... Committee 217--Aeronautical Databases Joint With EUROCAE WG-44--Aeronautical Databases AGENCY: Federal... Committee 217--Aeronautical Databases Joint with EUROCAE WG-44--Aeronautical Databases. SUMMARY: The FAA is... Databases being held jointly with EUROCAE WG-44--Aeronautical Databases. DATES: The meeting will be held...

  15. Updating road databases from shape-files using aerial images

    NASA Astrophysics Data System (ADS)

    Häufel, Gisela; Bulatov, Dimitri; Pohl, Melanie

    2015-10-01

Road databases are an important part of geodata infrastructure. Knowledge of their characteristics and course is essential for urban planning, navigation, and evacuation tasks. Starting from OpenStreetMap (OSM) shape-file data for street networks, we introduce an algorithm to enrich these available road maps with new maps based on other airborne sensor technology. In our case, these are results of our context-based urban terrain reconstruction process. We wish to enhance the use of road databases by computing additional junctions, narrow passages, and other items which may emerge due to changes in the terrain. This is relevant for various military and civil applications.

  16. Single-edition quadrangle maps

    USGS Publications Warehouse


    1998-01-01

    In August 1993, the U.S. Geological Survey's (USGS) National Mapping Division and the U.S. Department of Agriculture's Forest Service signed an Interagency Agreement to begin a single-edition joint mapping program. This agreement established the coordination for producing and maintaining single-edition primary series topographic maps for quadrangles containing National Forest System lands. The joint mapping program saves money by eliminating duplication of effort by the agencies and results in a more frequent revision cycle for quadrangles containing national forests. Maps are revised on the basis of jointly developed standards and contain normal features mapped by the USGS, as well as additional features required for efficient management of National Forest System lands. Single-edition maps look slightly different but meet the content, accuracy, and quality criteria of other USGS products. The Forest Service is responsible for the land management of more than 191 million acres of land throughout the continental United States, Alaska, and Puerto Rico, including 155 national forests and 20 national grasslands. These areas make up the National Forest System lands and comprise more than 10,600 of the 56,000 primary series 7.5-minute quadrangle maps (15-minute in Alaska) covering the United States. The Forest Service has assumed responsibility for maintaining these maps, and the USGS remains responsible for printing and distributing them. Before the agreement, both agencies published similar maps of the same areas. The maps were used for different purposes, but had comparable types of features that were revised at different times. Now, the two products have been combined into one so that the revision cycle is stabilized and only one agency revises the maps, thus increasing the number of current maps available for National Forest System lands. 
This agreement has improved service to the public by requiring that the agencies share the same maps and that the maps meet a common standard, as well as by significantly reducing duplication of effort.

  17. The U.S. Geological Survey mapping and cartographic database activities, 2006-2010

    USGS Publications Warehouse

    Craun, Kari J.; Donnelly, John P.; Allord, Gregory J.

    2011-01-01

The U.S. Geological Survey (USGS) began systematic topographic mapping of the United States in the 1880s, beginning with scales of 1:250,000 and 1:125,000 in support of geological mapping. Responding to the need for higher resolution and more detail, the USGS began the 1:62,500-scale, 15-minute topographic map series early in the 20th century. Finally, in the 1950s the USGS adopted the 1:24,000-scale, 7.5-minute topographic map series to portray even more detail, completing coverage of the conterminous 48 states with this series in 1992. In 2001, the USGS developed the vision and concept of The National Map, a topographic database for the 21st century and the source for a new generation of topographic maps (http://nationalmap.gov/). In 2008, initial production of those maps began with a 1:24,000-scale digital product. In a separate but related project, the USGS began scanning the existing inventory of historical topographic maps at all scales to accompany the new topographic maps. The USGS also developed a digital database of The National Atlas of the United States. The digital version of the Atlas is now available on the Web and supports a mapping engine for small-scale maps of the United States and North America. These three efforts define the topographic mapping activities of the USGS during the last few years and are discussed below.

  18. Ridge 2000 Data Management System

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.; Carbotte, S. M.; Arko, R. A.; Haxby, W. F.; Ryan, W. B.; Chayes, D. N.; Lehnert, K. A.; Shank, T. M.

    2005-12-01

Hosted at Lamont by the marine geoscience Data Management group, mgDMS, the NSF-funded Ridge 2000 electronic database, http://www.marine-geo.org/ridge2000/, is a key component of the Ridge 2000 multi-disciplinary program. The database covers each of the three Ridge 2000 Integrated Study Sites: Endeavour Segment, Lau Basin, and 8-11N Segment. It promotes the sharing of information with the broader community, facilitates integration of the suite of information collected at each study site, and enables comparisons between sites. The Ridge 2000 data system provides easy web access to a relational database that is built around a catalogue of cruise metadata. Any web browser can be used to perform a versatile text-based search which returns basic cruise and submersible dive information, sample and data inventories, navigation, and other relevant metadata such as shipboard personnel and links to NSF program awards. In addition, non-proprietary data files, images, and derived products which are hosted locally or in national repositories, as well as science and technical reports, can be freely downloaded. On the Ridge 2000 database page, our Data Link allows users to search the database using a broad range of parameters including data type, cruise ID, chief scientist, and geographical location. The first Ridge 2000 field programs sailed in 2004 and, in addition to numerous data sets collected prior to the Ridge 2000 program, the database currently contains information on fifteen Ridge 2000-funded cruises and almost sixty Alvin dives. Track lines can be viewed using a recently implemented Web Map Service button labelled Map View. The Ridge 2000 database is fully integrated with databases hosted by the mgDMS group for MARGINS and the Antarctic multibeam and seismic reflection data initiatives. Links are provided to partner databases including PetDB, SIOExplorer, and the ODP Janus system.
Improved inter-operability with existing and new partner repositories continues to be strengthened. One major effort involves the gradual unification of the metadata across these partner databases. Standardised electronic metadata forms that can be filled in at sea are available from our web site. Interactive map-based exploration and visualisation of the Ridge 2000 database is provided by GeoMapApp, a freely-available Java(tm) application being developed within the mgDMS group. GeoMapApp includes high-resolution bathymetric grids for the 8-11N EPR segment and allows customised maps and grids for any of the Ridge 2000 ISS to be created. Vent and instrument locations can be plotted and saved as images, and Alvin dive photos are also available.

  19. Database and online map service on unstable rock slopes in Norway - From data perpetuation to public information

    NASA Astrophysics Data System (ADS)

    Oppikofer, Thierry; Nordahl, Bobo; Bunkholt, Halvor; Nicolaisen, Magnus; Jarna, Alexandra; Iversen, Sverre; Hermanns, Reginald L.; Böhme, Martina; Yugsi Molina, Freddy X.

    2015-11-01

The unstable rock slope database is developed and maintained by the Geological Survey of Norway as part of the systematic mapping of unstable rock slopes in Norway. This mapping aims to detect catastrophic rock slope failures before they occur. More than 250 unstable slopes with post-glacial deformation have been detected to date. The main aims of the unstable rock slope database are (1) to serve as a national archive for unstable rock slopes in Norway; (2) to serve for data collection and storage during field mapping; (3) to provide decision-makers with hazard zones and other necessary information on unstable rock slopes for land-use planning and mitigation; and (4) to inform the public through an online map service. The database is organized hierarchically, with a main point for each unstable rock slope to which several feature classes and tables are linked. This main point feature class includes several general attributes of the unstable rock slopes, such as site name, general and geological descriptions, executed works, recommendations, technical parameters (volume, lithology, mechanism and others), displacement rates, possible consequences, as well as hazard and risk classification. Feature classes and tables linked to the main feature class include different scenarios of an unstable rock slope, field observation points, sampling points for dating, displacement measurement stations, lineaments, unstable areas, run-out areas, areas affected by secondary effects, along with tables for hazard and risk classification and URL links to further documentation and references. The database on unstable rock slopes in Norway will be publicly consultable through an online map service. Factsheets with key information on unstable rock slopes can be automatically generated and downloaded for each site.
Areas of possible rock avalanche run-out and their secondary effects displayed in the online map service, along with hazard and risk assessments, will become important tools for land-use planning. The present database will further evolve in the coming years as the systematic mapping progresses and as available techniques and tools evolve.

  20. The Iranian National Geodata Revision Strategy and Realization Based on Geodatabase

    NASA Astrophysics Data System (ADS)

    Haeri, M.; Fasihi, A.; Ayazi, S. M.

    2012-07-01

In recent years, the use of spatial databases for storing and managing spatial data has become a hot topic in the field of GIS. Accordingly, the National Cartographic Center of Iran (NCC) periodically produces spatial data that is usually stored in databases. One of NCC's major projects was the design of the National Topographic Database (NTDB). NCC decided to create a National Topographic Database of the entire country based on 1:25,000 coverage maps. The standard for the NTDB was published in 1994 and its database was created at the same time. In the NTDB, geometric data was stored in MicroStation design format (DGN), with each feature linked to its attribute data (stored in a Microsoft Access file). NTDB files were also produced in a sheet-wise mode and stored in a file-based style. Besides map compilation, revision of existing maps has already started. The key problems for NCC are the revision strategy, the NTDB's file-based storage, and operator challenges (NCC operators mostly prefer to edit and revise geometric data in CAD environments). A geodatabase solution for the national geodata, based on NTDB map files and operators' revision preferences, is introduced and released herein. The proposed solution extends the traditional methods to a seamless spatial database that can be revised in CAD and GIS environments simultaneously. The proposed system is the common data framework for creating a central data repository for spatial data storage and management.

  1. Thematic Accuracy Assessment of the 2011 National Land Cover Database (NLCD)

    EPA Science Inventory

    Accuracy assessment is a standard protocol of National Land Cover Database (NLCD) mapping. Here we report agreement statistics between map and reference labels for NLCD 2011, which includes land cover for ca. 2001, ca. 2006, and ca. 2011. The two main objectives were assessment o...

  2. APPLICATION OF A "VIRTUAL FIELD REFERENCE DATABASE" TO ASSESS LAND-COVER MAP ACCURACIES

    EPA Science Inventory

    An accuracy assessment was performed for the Neuse River Basin, NC land-cover/use
    (LCLU) mapping results using a "Virtual Field Reference Database (VFRDB)". The VFRDB was developed using field measurement and digital imagery (camera) data collected at 1,409 sites over a perio...

  3. Automated Database Mediation Using Ontological Metadata Mappings

    PubMed Central

    Marenco, Luis; Wang, Rixin; Nadkarni, Prakash

    2009-01-01

    Objective To devise an automated approach for integrating federated database information using database ontologies constructed from their extended metadata. Background One challenge of database federation is that the granularity of representation of equivalent data varies across systems. Dealing effectively with this problem is analogous to dealing with precoordinated vs. postcoordinated concepts in biomedical ontologies. Model Description The authors describe an approach based on ontological metadata mapping rules defined with elements of a global vocabulary, which allows a query specified at one granularity level to fetch data, where possible, from databases within the federation that use different granularities. This is implemented in OntoMediator, a newly developed production component of our previously described Query Integrator System. OntoMediator's operation is illustrated with a query that accesses three geographically separate, interoperating databases. An example based on SNOMED also illustrates the applicability of high-level rules to support the enforcement of constraints that can prevent inappropriate curator or power-user actions. Summary A rule-based framework simplifies the design and maintenance of systems where categories of data must be mapped to each other, for the purpose of either cross-database query or for curation of the contents of compositional controlled vocabularies. PMID:19567801
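The granularity problem this record describes, where one database stores a precoordinated concept and another stores only its finer postcoordinated components, can be illustrated with a toy rule-based mapper. The concept names, rule format, and databases below are invented for illustration; they are not OntoMediator's actual rule syntax or vocabulary.

```python
# Toy illustration of granularity-mapping rules (names and rule format
# invented, not OntoMediator's actual syntax): a query phrased at a coarse
# granularity is rewritten into the finer-grained terms a federated
# database actually stores, so one query can fetch from both databases.

# Rule set: coarse (precoordinated) concept -> finer (postcoordinated) terms
rules = {
    "hypertensive_heart_disease": ["hypertension", "heart_disease"],
}

# Two federated databases representing the same data at different granularities
db_coarse = {"hypertensive_heart_disease": ["patient_1"]}
db_fine = {"hypertension": ["patient_2"],
           "heart_disease": ["patient_2", "patient_3"]}

def query(concept, db):
    """Fetch records for a concept, rewriting it via the mapping rules
    when the database stores only the finer-grained components."""
    if concept in db:
        return set(db[concept])
    # Expand via rules: a record matches the precoordinated concept only
    # if it is tagged with every one of the finer component terms.
    parts = rules.get(concept, [])
    if parts and all(p in db for p in parts):
        result = set(db[parts[0]])
        for p in parts[1:]:
            result &= set(db[p])
        return result
    return set()

print(query("hypertensive_heart_disease", db_coarse))  # {'patient_1'}
print(query("hypertensive_heart_disease", db_fine))    # {'patient_2'}
```

The design point the sketch tries to capture is that the rewrite happens in the mediator, so each member database keeps its own granularity while a single query spans the federation.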

  4. The Arctic Observing Viewer: A Web-mapping Application for U.S. Arctic Observing Activities

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Manley, W. F.; Gaylord, A. G.; Kassin, A.; Villarreal, S.; Barba, M.; Dover, M.; Escarzaga, S. M.; Habermann, T.; Kozimor, J.; Score, R.; Tweedie, C. E.

    2015-12-01

    Although a great deal of progress has been made with various arctic observing efforts, it can be difficult to assess that progress when so many agencies, organizations, research groups, and others are working across such a large expanse of the Arctic. To help meet the strategic needs of the U.S. SEARCH-AON program and facilitate the development of SAON and other related initiatives, the Arctic Observing Viewer (AOV; http://ArcticObservingViewer.org) has been developed. This web mapping application compiles detailed information pertaining to U.S. Arctic Observing efforts. Contributing partners include the U.S. NSF, USGS, ACADIS, ADIwg, AOOS, a2dc, AON, ARMAP, BAID, IASOA, INTERACT, and others. Over 7700 observation sites are currently in the AOV database, and the application allows users to visualize, navigate, select, run advanced searches, draw, print, and more. During 2015, the web mapping application was enhanced with a query builder that allows users to create rich and complex queries. AOV is founded on principles of software and data interoperability and includes an emerging "Project" metadata standard, which uses ISO 19115-1 and compatible web services. Substantial efforts have focused on maintaining and centralizing all database information. To keep up with emerging technologies, the AOV data set has been structured and centralized within a relational database, and the application front end has been ported to HTML5 to enable mobile access. Other enhancements include an embedded Apache Solr search platform that lets users perform advanced searches, and a web-based administration system that allows administrators to add, update, and delete information in real time. We encourage all collaborators to use AOV tools and services for their own purposes and to help us extend the impact of our efforts and ensure AOV complements other cyber-resources. Reinforcing dispersed but interoperable resources in this way will help to ensure improved capacities for assessing the status of arctic observing efforts, optimizing logistic operations, and quickly accessing external and project-focused web resources for more detailed information and access to scientific data and derived products.

  5. The National Map: from geography to mapping and back again

    USGS Publications Warehouse

    Kelmelis, John A.; DeMulder, Mark L.; Ogrosky, Charles E.; Van Driel, J. Nicholas; Ryan, Barbara J.

    2003-01-01

    When the means of production for national base mapping were capital intensive, required large production facilities, and had ill-defined markets, Federal Government mapping agencies were the primary providers of the spatial data needed for economic development, environmental management, and national defense. With desktop geographic information systems now ubiquitous, source data available as a commodity from private industry, and the realization that many complex problems faced by society need far more and different kinds of spatial data for their solutions, national mapping organizations must realign their business strategies to meet growing demand and anticipate the needs of a rapidly changing geographic information environment. The National Map of the United States builds on a sound historic foundation of describing and monitoring the land surface and adds a focused effort to produce improved understanding, modeling, and prediction of land-surface change. These added dimensions bring to bear a broader spectrum of geographic science to address extant and emerging issues. Within the overarching construct of The National Map, the U.S. Geological Survey (USGS) is making a transition from data collector to guarantor of national data completeness; from producing paper maps to supporting an online, seamless, integrated database; and from simply describing the Nation’s landscape to linking these descriptions with increased scientific understanding. Implementing the full spectrum of geographic science addresses a myriad of public policy issues, including land and natural resource management, recreation, urban growth, human health, and emergency planning, response, and recovery. Neither these issues nor the science and technologies needed to deal with them are static. A robust research agenda is needed to understand these changes and realize The National Map vision. Initial successes have been achieved. These accomplishments demonstrate the utility of The National Map to the Nation and give confidence in evolving its future applications.

  6. Digital Geologic Map of the Rosalia 1:100,000 Quadrangle, Washington and Idaho: A Digital Database for the 1990 S.Z. Waggoner Map

    USGS Publications Warehouse

    Derkey, Pamela D.; Johnson, Bruce R.; Lackaff, Beatrice B.; Derkey, Robert E.

    1998-01-01

    The geologic map of the Rosalia 1:100,000-scale quadrangle was compiled in 1990 by S.Z. Waggoner of the Washington Division of Geology and Earth Resources. The map data were entered into a geographic information system (GIS) as part of a larger effort to create regional digital geology for the Pacific Northwest. The intent was to provide a digital geospatial database for a previously published black-and-white paper geologic map. This database can be queried in many ways to produce a variety of geologic maps. Digital base map data files are not included; they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:100,000 (e.g., 1:62,500 or 1:24,000), as it has been somewhat generalized to fit the 1:100,000-scale map. The map area is located in eastern Washington and extends across the state border into western Idaho. This open-file report describes the methods used to convert the geologic map data into a digital format, documents the file structures, and explains how to download the digital files from the U.S. Geological Survey public-access World Wide Web site on the Internet. We wish to thank J. Eric Schuster of the Washington Division of Geology and Earth Resources for providing the original stable-base mylar and the funding for it to be scanned. We also thank Dick Blank and Barry Moring of the U.S. Geological Survey for reviewing the manuscript and digital files, respectively.

  7. Database for volcanic processes and geology of Augustine Volcano, Alaska

    USGS Publications Warehouse

    McIntire, Jacqueline; Ramsey, David W.; Thoms, Evan; Waitt, Richard B.; Beget, James E.

    2012-01-01

    This digital release contains information used to produce the geologic map published as Plate 1 in U.S. Geological Survey Professional Paper 1762 (Waitt and Begét, 2009). The main component of this digital release is a geologic map database prepared using geographic information systems (GIS) applications. This release also contains links to files to view or print the map plate, accompanying measured sections, and main report text from Professional Paper 1762. It should be noted that Augustine Volcano erupted in 2006, after the completion of the geologic mapping shown in Professional Paper 1762 and presented in this database. Information on the 2006 eruption can be found in U.S. Geological Survey Professional Paper 1769. For the most up-to-date information on the status of Alaska volcanoes, please refer to the U.S. Geological Survey Volcano Hazards Program website.

  8. 78 FR 43890 - Privacy Act of 1974; Department of Homeland Security, Federal Emergency Management Agency-006...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-22

    ... titled, ``Department of Homeland Security/Federal Emergency Management Agency--006 Citizen Corps Database...) authorities; (5) purpose; (6) routine uses of information; (7) system manager and address; (8) notification... Database'' and retitle it ``DHS/FEMA--006 Citizen Corps Program System of Records.'' FEMA administers the...

  9. 45 CFR 30.13 - Debt reporting and use of credit reporting agencies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... agencies. 30.13 Section 30.13 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION... over $100 to credit bureaus or other automated databases. Debts arising under the Social Security Act..., any subsequent reporting to or updating of a credit bureau or other automated database may be handled...

  10. 45 CFR 30.13 - Debt reporting and use of credit reporting agencies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... agencies. 30.13 Section 30.13 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION... over $100 to credit bureaus or other automated databases. Debts arising under the Social Security Act..., any subsequent reporting to or updating of a credit bureau or other automated database may be handled...

  11. 45 CFR 30.13 - Debt reporting and use of credit reporting agencies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... agencies. 30.13 Section 30.13 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION... over $100 to credit bureaus or other automated databases. Debts arising under the Social Security Act..., any subsequent reporting to or updating of a credit bureau or other automated database may be handled...

  12. 45 CFR 30.13 - Debt reporting and use of credit reporting agencies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... agencies. 30.13 Section 30.13 Public Welfare Department of Health and Human Services GENERAL ADMINISTRATION... over $100 to credit bureaus or other automated databases. Debts arising under the Social Security Act..., any subsequent reporting to or updating of a credit bureau or other automated database may be handled...

  13. Geologic Surface Effects of Underground Nuclear Testing, Buckboard Mesa, Climax Stock, Dome Mountain, Frenchman Flat, Rainier/Aqueduct Mesa, and Shoshone Mountain, Nevada Test Site, Nevada

    USGS Publications Warehouse

    Grasso, Dennis N.

    2003-01-01

    Surface effects maps were produced for 72 of 89 underground detonations conducted at the Frenchman Flat, Rainier Mesa and Aqueduct Mesa, Climax Stock, Shoshone Mountain, Buckboard Mesa, and Dome Mountain testing areas of the Nevada Test Site between August 10, 1957 (Saturn detonation, Area 12) and September 18, 1992 (Hunters Trophy detonation, Area 12). The 'Other Areas' Surface Effects Map Database, which was used to construct the maps shown in this report, contains digital reproductions of these original maps. The database is provided in both ArcGIS (v. 8.2) geodatabase format and ArcView (v. 3.2) shapefile format. It contains sinks, cracks, faults, and other surface effects having a combined (cumulative) length of 136.38 km (84.74 mi). In GIS digital format, the user can view all surface effects maps simultaneously, select and view the surface effects of one or more sites of interest, or view specific surface effects by area or site. The database comprises three map layers: (1) the surface effects maps layer (oase_n27f), (2) the bar symbols layer (oase_bar_n27f), and (3) the ball symbols layer (oase_ball_n27f). Additionally, an annotation layer, named 'Ball_and_Bar_Labels,' and a polygon features layer, named 'Area12_features_poly_n27f,' are contained in the geodatabase version of the database. The annotation layer automatically labels all 295 ball-and-bar symbols shown on these maps. The polygon features layer displays areas of ground disturbance, such as rock spall and disturbed ground caused by the detonations. Shapefile versions of the polygon features layer in Nevada State Plane and Universal Transverse Mercator projections, named 'area12_features_poly_n27f.shp' and 'area12_features_poly_u83m.shp,' are also provided in the archive.

  14. TabSQL: a MySQL tool to facilitate mapping user data to public databases.

    PubMed

    Xia, Xiao-Qin; McClelland, Michael; Wang, Yipeng

    2010-06-23

    With advances in high-throughput genomics and proteomics, it is challenging for biologists to deal with large data files and to map their data to annotations in public databases. We developed TabSQL, a MySQL-based application tool, for viewing, filtering and querying data files with large numbers of rows. TabSQL provides functions for downloading and installing table files from public databases including the Gene Ontology database (GO), the Ensembl databases, and genome databases from the UCSC genome bioinformatics site. Any other database that provides tab-delimited flat files can also be imported. The downloaded gene annotation tables can be queried together with users' data in TabSQL using either a graphic interface or command line. TabSQL allows queries across the user's data and public databases without programming. It is a convenient tool for biologists to annotate and enrich their data.
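    TabSQL itself runs on MySQL; as a rough sketch of the same workflow (import a tab-delimited annotation table, then join it against the user's own data with plain SQL), here is a self-contained example using Python's built-in sqlite3 module. The gene identifiers and table contents are invented for illustration.

```python
import csv, io, sqlite3

# Hypothetical tab-delimited annotation file; in TabSQL this would be a
# downloaded GO/Ensembl/UCSC table, here it is inlined for the sketch.
annotation_tsv = ("gene_id\tsymbol\tdescription\n"
                  "ENSG01\tTP53\ttumor protein p53\n"
                  "ENSG02\tBRCA1\tBRCA1 DNA repair associated\n")

# Hypothetical user data (e.g. expression fold changes), also tab-delimited.
user_tsv = "gene_id\tfold_change\nENSG01\t2.4\nENSG02\t0.5\n"

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE annotation (gene_id TEXT, symbol TEXT, description TEXT)")
con.execute("CREATE TABLE user_data (gene_id TEXT, fold_change REAL)")

def load(table, text):
    """Import a tab-delimited flat file into a table, skipping the header."""
    rows = list(csv.reader(io.StringIO(text), delimiter="\t"))[1:]
    placeholders = ",".join("?" * len(rows[0]))
    con.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)

load("annotation", annotation_tsv)
load("user_data", user_tsv)

# Query across the user's data and the public annotation table in one SQL
# statement -- the core convenience a tool like TabSQL provides.
result = con.execute(
    "SELECT a.symbol, u.fold_change FROM user_data u "
    "JOIN annotation a ON a.gene_id = u.gene_id WHERE u.fold_change > 1"
).fetchall()
print(result)  # [('TP53', 2.4)]
```

    The same join, issued through TabSQL's graphic interface or command line, is what lets biologists annotate their data without writing any glue code.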

  15. TabSQL: a MySQL tool to facilitate mapping user data to public databases

    PubMed Central

    2010-01-01

    Background With advances in high-throughput genomics and proteomics, it is challenging for biologists to deal with large data files and to map their data to annotations in public databases. Results We developed TabSQL, a MySQL-based application tool, for viewing, filtering and querying data files with large numbers of rows. TabSQL provides functions for downloading and installing table files from public databases including the Gene Ontology database (GO), the Ensembl databases, and genome databases from the UCSC genome bioinformatics site. Any other database that provides tab-delimited flat files can also be imported. The downloaded gene annotation tables can be queried together with users' data in TabSQL using either a graphic interface or command line. Conclusions TabSQL allows queries across the user's data and public databases without programming. It is a convenient tool for biologists to annotate and enrich their data. PMID:20573251

  16. Map showing geologic terranes of the Hailey 1 degree x 2 degrees quadrangle and the western part of the Idaho Falls 1 degree x 2 degrees quadrangle, south-central Idaho

    USGS Publications Warehouse

    Worl, R.G.; Johnson, K.M.

    1995-01-01

    The paper version of the Map Showing Geologic Terranes of the Hailey 1x2 Quadrangle and the western part of the Idaho Falls 1x2 Quadrangle, south-central Idaho was compiled by Ron Worl and Kate Johnson in 1995. The plate was compiled on a 1:250,000-scale topographic base map. TechniGraphic System, Inc., of Fort Collins, Colorado, digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a geographic information system database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.

  17. Flood inundation map library, Fort Kent, Maine

    USGS Publications Warehouse

    Lombard, Pamela J.

    2012-01-01

    Severe flooding occurred in northern Maine from April 28 to May 1, 2008, and damage was extensive in the town of Fort Kent (Lombard, 2010). Aroostook County was declared a Federal disaster area on May 9, 2008. The extent of flooding on both the Fish and St. John Rivers during this event showed that the current Federal Emergency Management Agency (FEMA) Flood Insurance Study (FIS) and Flood Insurance Rate Map (FIRM) (Federal Emergency Management Agency, 1979) were out of date. The U.S. Geological Survey (USGS) conducted a study to develop a flood inundation map library showing the areas and depths for a range of flood stages from bankfull to the flood of record for Fort Kent to complement an updated FIS (Federal Emergency Management Agency, in press). Hydrologic analyses that support the maps include computer models with and without the levee and with various depths of backwater on the Fish River. This fact sheet describes the methods used to develop the maps and describes how the maps can be accessed.

  18. Geologic Map Database of Texas

    USGS Publications Warehouse

    Stoeser, Douglas B.; Shock, Nancy; Green, Gregory N.; Dumonceaux, Gayle M.; Heran, William D.

    2005-01-01

    The purpose of this report is to release a digital geologic map database for the State of Texas. This database was compiled for the U.S. Geological Survey (USGS) Minerals Program, National Surveys and Analysis Project, whose goal is a nationwide assemblage of geologic, geochemical, geophysical, and other data. This release makes the geologic data from the Geologic Map of Texas available in digital format. Original clear film positives provided by the Texas Bureau of Economic Geology were photographically enlarged onto Mylar film. These films were scanned, georeferenced, digitized, and attributed by Geologic Data Systems (GDS), Inc., Denver, Colorado. Project oversight and quality control were the responsibility of the U.S. Geological Survey. ESRI ArcInfo coverages, AMLs, and shapefiles are provided.

  19. Preliminary investigation of submerged aquatic vegetation mapping using hyperspectral remote sensing.

    PubMed

    William, David J; Rybicki, Nancy B; Lombana, Alfonso V; O'Brien, Tim M; Gomez, Richard B

    2003-01-01

    The use of airborne hyperspectral remote sensing imagery for automated mapping of submerged aquatic vegetation (SAV) in the tidal Potomac River was investigated for near-real-time resource assessment and monitoring. Airborne hyperspectral imagery and field spectrometer measurements were obtained in October 2000. A spectral library database containing selected ground-based and airborne sensor spectra was developed for use in image processing. The spectral library is used to automate the processing of hyperspectral imagery for potential real-time material identification and mapping. Field-based spectra were compared to the airborne imagery using the database to identify and map two species of SAV (Myriophyllum spicatum and Vallisneria americana). Overall accuracy of the vegetation maps derived from hyperspectral imagery was determined by comparison to a product that combined aerial photography and field-based sampling at the end of the SAV growing season. The algorithms and databases developed in this study will be useful with current and forthcoming space-based hyperspectral remote sensing systems.
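    The abstract does not specify the matching algorithm, but comparing a pixel spectrum against entries in a spectral library is commonly done with the spectral angle (smaller angle means more similar shape, independent of overall brightness). A minimal sketch with invented four-band reflectance values:

```python
import math

# Hypothetical spectral library: species name -> reflectance per band.
# (Illustrative numbers only; real libraries hold hundreds of narrow bands.)
library = {
    "M. spicatum":  [0.12, 0.18, 0.35, 0.40],
    "V. americana": [0.10, 0.15, 0.20, 0.45],
}

def spectral_angle(a, b):
    """Angle in radians between two spectra treated as vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.acos(dot / (na * nb))

pixel = [0.11, 0.16, 0.33, 0.41]  # one image pixel's spectrum
best = min(library, key=lambda name: spectral_angle(pixel, library[name]))
print(best)  # M. spicatum
```

    Applying such a match per pixel is what turns a spectral library plus a hyperspectral scene into an automated species map.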

  20. Bedrock geologic map of the Grafton quadrangle, Worcester County, Massachusetts

    USGS Publications Warehouse

    Walsh, Gregory J.; Aleinikoff, John N.; Dorais, Michael J.

    2011-01-01

    The bedrock geology of the 7.5-minute Grafton, Massachusetts, quadrangle consists of deformed Neoproterozoic to early Paleozoic crystalline metamorphic and intrusive igneous rocks. Neoproterozoic intrusive, metasedimentary, and metavolcanic rocks crop out in the Avalon zone, and Cambrian to Silurian intrusive, metasedimentary, and metavolcanic rocks crop out in the Nashoba zone. Rocks of the Avalon and Nashoba zones, or terranes, are separated by the Bloody Bluff fault. The bedrock geology was mapped to study the tectonic history of the area and to provide a framework for ongoing hydrogeologic characterization of the fractured bedrock of Massachusetts. This report presents mapping by G.J. Walsh, geochronology by J.N. Aleinikoff, and geochemistry by M.J. Dorais, and consists of a map, a text pamphlet, and a GIS database. The map and text pamphlet are available in paper format or as downloadable files. The GIS database is available for download and includes contacts of bedrock geologic units, faults, outcrops, structural geologic information, and photographs.

  1. Urban Groundwater Mapping - Bucharest City Area Case Study

    NASA Astrophysics Data System (ADS)

    Gaitanaru, Dragos; Radu Gogu, Constantin; Bica, Ioan; Anghel, Leonard; Amine Boukhemacha, Mohamed; Ionita, Angela

    2013-04-01

    Urban Groundwater Mapping (UGM) is a generic term for a collection of procedures and techniques used to create targeted cartographic representations of groundwater-related aspects of urban areas. The urban environment alters the physical and chemical characteristics of the underlying aquifers, and the scale of this pressure is controlled by urban development in time and space. To obtain a clear image of the spatial and temporal distribution of the various interactions between groundwater and urban structures, a set of thematic maps is needed. The present study describes the methodological approach used to obtain a reliable cartographic product for the Bucharest City area. The first step was to identify the groundwater-related problems and aspects (changes in the groundwater table, infiltration and seepage from and to the city sewer network, contamination spread across all three aquifer systems located in Quaternary sedimentary formations, the impact of dewatering for large underground structures, and management and policy drawbacks). The second step was data collection and validation. In urban areas there is a wide spectrum of groundwater-related data providers; because data are produced and distributed by different types of organizations (national agencies, private companies, the municipal water regulator, etc.), a validation and cross-check process is mandatory. The data are stored and managed in a geospatial database whose design follows an object-oriented paradigm and is easily extensible. The third step consists of a set of procedures, based on a multi-criteria assessment, that creates the specific setup for the thematic maps. The assessment is based on the following criteria: (1) scale effect, (2) time, (3) vertical distribution, and (4) type of groundwater-related problem. The final step is the cartographic representation, in which the urban groundwater maps are created. All the methodological steps are supported by programmed procedures developed in a groundwater management platform for urban areas; the core of these procedures is a set of well-defined hydrogeological geospatial queries. The cartographic products (urban groundwater maps) can be used by different types of users: civil engineers, urban planners, and scientists, as well as decision and policy makers.

  2. International Energy: Subject Thesaurus. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The International Energy: Subject Thesaurus contains the standard vocabulary of indexing terms (descriptors) developed and structured to build and maintain energy information databases. Involved in this cooperative task are (1) the technical staff of the USDOE Office of Scientific and Technical Information (OSTI), in cooperation with the member countries of the International Energy Agency's Energy Technology Data Exchange (ETDE), and (2) the International Atomic Energy Agency's International Nuclear Information System (INIS) staff, representing the more than 100 countries and organizations that record and index information for the international nuclear information community. ETDE member countries are also members of INIS. Nuclear information prepared for INIS by ETDE member countries is included in the ETDE Energy Database, which contains the online equivalent of the printed INIS Atomindex. Indexing terminology is therefore cooperatively standardized for use in both information systems. This structured vocabulary reflects the scope of international energy research, development, and technological programs. The terminology of this thesaurus aids in subject searching on commercial systems, such as "Energy Science & Technology" by DIALOG Information Services, "Energy" by STN International, and the "ETDE Energy Database" by SilverPlatter. It is also the thesaurus for the Integrated Technical Information System (ITIS) online databases of the US Department of Energy.

  3. MareyMap Online: A User-Friendly Web Application and Database Service for Estimating Recombination Rates Using Physical and Genetic Maps.

    PubMed

    Siberchicot, Aurélie; Bessy, Adrien; Guéguen, Laurent; Marais, Gabriel A B

    2017-10-01

    Given the importance of meiotic recombination in biology, there is a need to develop robust methods to estimate meiotic recombination rates. A popular approach, called the Marey map approach, relies on comparing genetic and physical maps of a chromosome to estimate local recombination rates. In the past, we implemented this approach in an R package called MareyMap, which includes many functionalities useful for obtaining reliable recombination rate estimates in a semi-automated way. MareyMap has been used repeatedly in studies looking at the effect of recombination on genome evolution. Here, we propose a simpler, user-friendly web service version of MareyMap, called MareyMap Online, which allows users to obtain recombination rates from their own data, or from a publicly available database that we offer, in a few clicks. When the analysis is done, users are asked whether their curated data can be placed in the database and shared with other users, which we hope will make future meta-analyses of recombination rates across many species easy. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
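    The core of the Marey map approach reduces to a derivative: with markers positioned on both a genetic map (cM) and a physical map (Mb), the local recombination rate (cM/Mb) is the slope of genetic position as a function of physical position. A toy sketch with invented marker positions; MareyMap itself fits smoothing curves (loess, splines, cubic fits) before differentiating, so finite differences here only illustrate the idea:

```python
# Hypothetical markers on one chromosome, sorted by physical position.
physical_mb = [0.0, 5.0, 10.0, 20.0, 40.0]   # physical map (Mb)
genetic_cm  = [0.0, 2.0,  8.0, 14.0, 18.0]   # genetic map (cM)

def local_rates(phys, gen):
    """Crude local recombination rate: finite-difference slope (cM/Mb)
    between consecutive markers of the Marey map."""
    return [(gen[i + 1] - gen[i]) / (phys[i + 1] - phys[i])
            for i in range(len(phys) - 1)]

rates = local_rates(physical_mb, genetic_cm)
print(rates)  # [0.4, 1.2, 0.6, 0.2] -- a hotspot between 5 and 10 Mb
```

    Smoothing matters in practice because genotyping error and sparse markers make raw slopes noisy; the web service's value is doing that curation and fitting behind a few clicks.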

  4. The Global Evidence Mapping Initiative: scoping research in broad topic areas.

    PubMed

    Bragge, Peter; Clavisi, Ornella; Turner, Tari; Tavender, Emma; Collie, Alex; Gruen, Russell L

    2011-06-17

    Evidence mapping describes the quantity, design and characteristics of research in broad topic areas, in contrast to systematic reviews, which usually address narrowly-focused research questions. The breadth of evidence mapping helps to identify evidence gaps, and may guide future research efforts. The Global Evidence Mapping (GEM) Initiative was established in 2007 to create evidence maps providing an overview of existing research in Traumatic Brain Injury (TBI) and Spinal Cord Injury (SCI). The GEM evidence mapping method involved three core tasks: 1. Setting the boundaries and context of the map: Definitions for the fields of TBI and SCI were clarified, the prehospital, acute inhospital and rehabilitation phases of care were delineated and relevant stakeholders (patients, carers, clinicians, researchers and policymakers) who could contribute to the mapping were identified. Researchable clinical questions were developed through consultation with key stakeholders and a broad literature search. 2. Searching for and selection of relevant studies: Evidence search and selection involved development of specific search strategies, development of inclusion and exclusion criteria, searching of relevant databases and independent screening and selection by two researchers. 3. Reporting on yield and study characteristics: Data extraction was performed at two levels - 'interventions and study design' and 'detailed study characteristics'. The evidence map and commentary reflected the depth of data extraction. One hundred and twenty-nine researchable clinical questions in TBI and SCI were identified. These questions were then prioritised into high (n = 60) and low (n = 69) importance by the stakeholders involved in question development. Since 2007, 58 263 abstracts have been screened, 3 731 full text articles have been reviewed and 1 644 relevant neurotrauma publications have been mapped, covering fifty-three high priority questions. GEM Initiative evidence maps have a broad range of potential end-users including funding agencies, researchers and clinicians. Evidence mapping is at least as resource-intensive as systematic reviewing. The GEM Initiative has made advancements in evidence mapping, most notably in the area of question development and prioritisation. Evidence mapping complements other review methods for describing existing research, informing future research efforts, and addressing evidence gaps.

  5. The Global Evidence Mapping Initiative: Scoping research in broad topic areas

    PubMed Central

    2011-01-01

    Background Evidence mapping describes the quantity, design and characteristics of research in broad topic areas, in contrast to systematic reviews, which usually address narrowly-focused research questions. The breadth of evidence mapping helps to identify evidence gaps, and may guide future research efforts. The Global Evidence Mapping (GEM) Initiative was established in 2007 to create evidence maps providing an overview of existing research in Traumatic Brain Injury (TBI) and Spinal Cord Injury (SCI). Methods The GEM evidence mapping method involved three core tasks: 1. Setting the boundaries and context of the map: Definitions for the fields of TBI and SCI were clarified, the prehospital, acute inhospital and rehabilitation phases of care were delineated and relevant stakeholders (patients, carers, clinicians, researchers and policymakers) who could contribute to the mapping were identified. Researchable clinical questions were developed through consultation with key stakeholders and a broad literature search. 2. Searching for and selection of relevant studies: Evidence search and selection involved development of specific search strategies, development of inclusion and exclusion criteria, searching of relevant databases and independent screening and selection by two researchers. 3. Reporting on yield and study characteristics: Data extraction was performed at two levels - 'interventions and study design' and 'detailed study characteristics'. The evidence map and commentary reflected the depth of data extraction. Results One hundred and twenty-nine researchable clinical questions in TBI and SCI were identified. These questions were then prioritised into high (n = 60) and low (n = 69) importance by the stakeholders involved in question development. Since 2007, 58 263 abstracts have been screened, 3 731 full text articles have been reviewed and 1 644 relevant neurotrauma publications have been mapped, covering fifty-three high priority questions. Conclusions GEM Initiative evidence maps have a broad range of potential end-users including funding agencies, researchers and clinicians. Evidence mapping is at least as resource-intensive as systematic reviewing. The GEM Initiative has made advancements in evidence mapping, most notably in the area of question development and prioritisation. Evidence mapping complements other review methods for describing existing research, informing future research efforts, and addressing evidence gaps. PMID:21682870

  6. Preliminary geologic map of the eastern Willapa Hills, Cowlitz, Lewis, and Wahkiakum Counties, Washington

    USGS Publications Warehouse

    Wells, Ray E.; Sawlan, Michael G.

    2014-01-01

    This digital map database and the PDF derived from the database were created from the analog geologic map: Wells, R.E. (1981), “Geologic map of the eastern Willapa Hills, Cowlitz, Lewis, and Wahkiakum Counties, Washington.” The geodatabase replicates the geologic mapping of the 1981 report with minor exceptions along water boundaries and also along the north and south map boundaries. Slight adjustments to contacts along water boundaries were made to correct differences between the topographic base map used in the 1981 compilation (analog USGS 15-minute series quadrangle maps at 1:62,500 scale) and the base map used for this digital compilation (scanned USGS 7.5-minute series quadrangle maps at 1:24,000 scale). These minor adjustments, however, did not materially alter the geologic map. No new field mapping was performed to create this digital map database, and no attempt was made to fit geologic contacts to the new 1:24,000 topographic base, except as noted above. We corrected typographical errors, formatting errors, and attribution errors (for example, the name change of Goble Volcanics to Grays River Volcanics following current State of Washington usage; Walsh and others, 1987). We also updated selected references, substituted published papers for abstracts, and cited published radiometric ages for the volcanic and plutonic rocks. The reader is referred to Magill and others (1982), Wells and Coe (1985), Walsh and others (1987), Moothart (1993), Payne (1998), Kleibacker (2001), McCutcheon (2003), Wells and others (2009), Chan and others (2012), and Wells and others (in press) for subsequent interpretations of the Willapa Hills geology.

  7. Assessing Hydrologic Impacts of Future Land Cover Change ...

    EPA Pesticide Factsheets

Long‐term land‐use and land cover change and their associated impacts pose critical challenges to sustaining vital hydrological ecosystem services for future generations. In this study, a methodology was developed on the San Pedro River Basin to characterize hydrologic impacts from future urban growth through time. This methodology was then expanded and utilized to characterize the changing hydrology of the South Platte River Basin. Future urban growth is represented by housing density maps generated in decadal intervals from 2010 to 2100, produced by the U.S. Environmental Protection Agency (EPA) Integrated Climate and Land‐Use Scenarios (ICLUS) project. ICLUS developed future housing density maps by adapting the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) social, economic, and demographic storylines to the conterminous United States. To characterize hydrologic impacts from future growth, the housing density maps were reclassified to National Land Cover Database (NLCD) 2006 land cover classes and used to parameterize the Soil and Water Assessment Tool (SWAT) using the Automated Geospatial Watershed Assessment (AGWA) tool. The objectives of this project were to 1) develop and describe a methodology for adapting the ICLUS data for use in AGWA as an approach to evaluate basin‐wide impacts of development on water quantity and quality, and 2) present initial results from the application of the methodology to
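The reclassification step described above can be sketched as a simple lookup from housing-density values to NLCD developed-class codes. The density break-points below are illustrative assumptions, not the actual ICLUS-to-NLCD crosswalk used by the project.

```python
# Hedged sketch: reclassify housing-density values (units/acre) to NLCD 2006
# developed-class codes. The thresholds are hypothetical stand-ins for the
# project's actual crosswalk table.

NLCD_OPEN_SPACE, NLCD_LOW, NLCD_MEDIUM, NLCD_HIGH = 21, 22, 23, 24

def reclassify_density(units_per_acre):
    """Map a housing-density value to an NLCD developed-class code."""
    if units_per_acre < 0.1:
        return NLCD_OPEN_SPACE   # developed, open space
    elif units_per_acre < 1.0:
        return NLCD_LOW          # developed, low intensity
    elif units_per_acre < 4.0:
        return NLCD_MEDIUM       # developed, medium intensity
    else:
        return NLCD_HIGH         # developed, high intensity

# Apply to a small synthetic raster of decadal housing densities.
density_2100 = [[0.05, 0.5], [2.0, 8.0]]
nlcd_2100 = [[reclassify_density(v) for v in row] for row in density_2100]
print(nlcd_2100)  # [[21, 22], [23, 24]]
```

In practice this lookup would be applied cell-by-cell to each decadal raster before the result is handed to AGWA/SWAT for parameterization.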

  8. Neighborhood scale quantification of ecosystem goods and ...

    EPA Pesticide Factsheets

Ecosystem goods and services (EGS) are those ecological structures and functions that humans can directly relate to their state of well-being. Ecosystem goods and services include, but are not limited to, a sufficient fresh water supply, fertile lands to produce agricultural products, shading, air and water of sufficient quality for designated uses, flood water retention, and places to recreate. The US Environmental Protection Agency (USEPA) Office of Research and Development's Tampa Bay Ecosystem Services Demonstration Project (TBESDP) modeling efforts organized existing literature values for biophysical attributes and processes related to EGS. The goal was to develop a database for informing map-based EGS assessments for current and future land cover/use scenarios at multiple scales. This report serves as a demonstration of applying an EGS assessment approach at the large neighborhood scale (~1,000 acres of residential parcels plus common areas). Here, we present mapped inventories of ecosystem goods and services production at a neighborhood scale within the Tampa Bay, FL region. Comparisons of the inventory between two alternative neighborhood designs are presented as an example of how one might apply EGS concepts at this scale.

  9. Research on computer virus database management system

    NASA Astrophysics Data System (ADS)

    Qi, Guoquan

    2011-12-01

The growing proliferation of computer viruses has become a lethal threat and a research focus of network information security. New viruses continually emerge, the total number of viruses keeps growing, and virus classification becomes increasingly complex. Virus naming cannot be unified because agencies capture samples at different times. Although each agency has its own virus database, communication between agencies is lacking, virus information is often incomplete, and sample information may be sparse. This paper introduces the current state of virus database construction at home and abroad, analyzes how to standardize and completely describe virus characteristics, and then presents a computer virus database design scheme that provides information integrity, storage security, and manageability.

  10. 77 FR 46104 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-02

    ... DEPARTMENT OF HOMELAND SECURITY Federal Emergency Management Agency [Internal Agency Docket No... inspection at both the online location and the respective Community Map Repository address listed in the... online through the FEMA Map Service Center at www.msc.fema.gov for comparison. You may submit comments...

  11. 77 FR 18766 - Proposed Flood Elevation Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... DEPARTMENT OF HOMELAND SECURITY Federal Emergency Management Agency 44 CFR Part 67 [Docket ID FEMA-2010-0003; Internal Agency Docket No. FEMA-B-1114] Proposed Flood Elevation Determinations Correction... locations above. Please refer to the revised Flood Insurance Rate Map located at the community map...

  12. 42 CFR 488.68 - State Agency responsibilities for OASIS collection and data base requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... operating the OASIS system: (a) Establish and maintain an OASIS database. The State agency or other entity designated by CMS must— (1) Use a standard system developed or approved by CMS to collect, store, and analyze..., system back-up, and monitoring the status of the database; and (3) Obtain CMS approval before modifying...

  13. Semi Automated Land Cover Layer Updating Process Utilizing Spectral Analysis and GIS Data Fusion

    NASA Astrophysics Data System (ADS)

    Cohen, L.; Keinan, E.; Yaniv, M.; Tal, Y.; Felus, A.; Regev, R.

    2018-04-01

Technological improvements in mass data gathering and analysis in recent years have influenced the traditional methods of updating and maintaining the national topographic database, and have brought a significant increase in the number of use cases and in the demand for detailed geo-information. Processes intended to replace traditional data collection methods have been developed in many National Mapping and Cadastre Agencies, and there has been significant progress in semi-automated methodologies aimed at facilitating the updating of a national topographic geodatabase. Implementing these is expected to allow a considerable reduction in updating costs and operation times. Our previous activity focused on automatic extraction (Keinan, Zilberstein et al., 2015). Before semi-automatic updating methods, it was common for the interpreter's identification to be as detailed as possible so that the resulting database would be as reliable as possible. When semi-automatic updating methodologies are used, the ability to incorporate human insight is limited. Our motivation was therefore to reduce this gap by allowing end-users to add their own data inputs to the basic geometric database. In this article, we present a simple land cover database updating method that combines insights extracted from the analyzed image with given vector layers. The main stages of the method are multispectral image segmentation and supervised classification, together with geometric fusion of the given vector data, while keeping shape editing work to a minimum. All coding was done using open source software components.

  14. Rapid Damage Mapping for the 2015 M7.8 Gorkha Earthquake using Synthetic Aperture Radar Data from COSMO-SkyMed and ALOS-2 Satellites

    NASA Astrophysics Data System (ADS)

    Yun, S. H.; Hudnut, K. W.; Owen, S. E.; Webb, F.; Simons, M.; Macdonald, A.; Sacco, P.; Gurrola, E. M.; Manipon, G.; Liang, C.; Fielding, E. J.; Milillo, P.; Hua, H.; Coletta, A.

    2015-12-01

The April 25, 2015 M7.8 Gorkha earthquake caused more than 8,000 fatalities and widespread building damage in central Nepal. Four days after the earthquake, the Italian Space Agency's (ASI's) COSMO-SkyMed Synthetic Aperture Radar (SAR) satellite acquired data over the Kathmandu area. Nine days after the earthquake, the Japan Aerospace Exploration Agency's (JAXA's) ALOS-2 SAR satellite covered a larger area. Using these radar observations, we rapidly produced damage proxy maps derived from temporal changes in Interferometric SAR (InSAR) coherence. These maps were qualitatively validated through comparison with independent damage analyses by the National Geospatial-Intelligence Agency (NGA) and the United Nations Institute for Training and Research's (UNITAR's) Operational Satellite Applications Programme (UNOSAT), and through our own visual inspection of DigitalGlobe's WorldView optical pre- vs. post-event imagery. Our maps were quickly released to responding agencies and the public, and used for damage assessment, determining inspection/imaging priorities, and reconnaissance fieldwork.
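The core of a coherence-change damage proxy can be sketched in a few lines: subtract co-event coherence from pre-event coherence and threshold the loss. The threshold value and the tiny synthetic arrays below are illustrative assumptions, not the mission's actual processing chain.

```python
# Hedged sketch of a coherence-change damage proxy: pixels whose InSAR
# coherence drops sharply between a pre-event pair and a co-event pair are
# flagged as possible damage. Threshold and data are illustrative.
import numpy as np

def damage_proxy(coh_pre, coh_co, threshold=0.3):
    """Return (coherence loss, boolean damage mask)."""
    loss = np.clip(coh_pre - coh_co, 0.0, 1.0)  # coherence loss in [0, 1]
    return loss, loss > threshold

coh_pre = np.array([[0.90, 0.80], [0.85, 0.40]])  # pre-event pair: stable scene
coh_co  = np.array([[0.85, 0.30], [0.20, 0.35]])  # co-event pair: damage decorrelates
loss, damaged = damage_proxy(coh_pre, coh_co)
print(damaged)  # [[False  True] [ True False]]
```

Real products additionally mask vegetation and water (which decorrelate naturally) and normalize against the pre-event coherence history.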

  15. 32 CFR Appendix F to Part 286 - DoD Freedom of Information Act Program Components

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Defense Information Systems Agency Defense Contract Audit Agency Defense Intelligence Agency Defense Security Service Defense Logistics Agency National Imagery and Mapping Agency Defense Special Weapons Agency National Security Agency Office of the Inspector General, Department of Defense Defense Finance...

  16. Integrating Radar Image Data with Google Maps

    NASA Technical Reports Server (NTRS)

    Chapman, Bruce D.; Gibas, Sarah

    2010-01-01

A public Web site has been developed as a method for displaying the multitude of radar imagery collected by NASA's Airborne Synthetic Aperture Radar (AIRSAR) instrument during its 16-year mission. Utilizing NASA's internal AIRSAR site, the new Web site features more sophisticated visualization tools that enable the general public to have access to these images. The site was originally maintained at NASA on six computers: one that held the Oracle database, two that took care of the software for the interactive map, and three that were for the Web site itself. Several tasks were involved in moving this complicated setup to just one computer. First, the AIRSAR database was migrated from Oracle to MySQL. Then the back-end of the AIRSAR Web site was updated in order to access the MySQL database. To do this, a few of the scripts needed to be modified, specifically three Perl scripts that query the database. The database connections were then updated from Oracle to MySQL, numerous syntax errors were corrected, and a query was implemented that replaced one of the stored Oracle procedures. Lastly, the interactive map was designed, implemented, and tested so that users could easily browse and access the radar imagery through the Google Maps interface.
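The migration pattern described above, replacing a stored Oracle procedure with an explicit parameterized query through a generic database API, can be sketched as follows. Here sqlite3 stands in for MySQL so the example is self-contained, and the table and column names are hypothetical, not the actual AIRSAR schema.

```python
# Hedged sketch: an explicit parameterized query replacing a stored
# procedure, written against the generic DB-API so the same logic works
# after an Oracle-to-MySQL migration. sqlite3 is a stand-in backend;
# the schema below is invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flights (flight_id TEXT, region TEXT, year INTEGER)")
conn.executemany(
    "INSERT INTO flights VALUES (?, ?, ?)",
    [("CM6659", "Amazon", 1993), ("TS1235", "Alaska", 1995)],
)

def flights_in_region(conn, region):
    """Query image-acquisition flights by region; the placeholder syntax
    replaces an Oracle PL/SQL procedure call."""
    cur = conn.execute(
        "SELECT flight_id, year FROM flights WHERE region = ?", (region,)
    )
    return cur.fetchall()

print(flights_in_region(conn, "Alaska"))  # [('TS1235', 1995)]
```

Keeping queries parameterized and API-generic is what makes this kind of back-end swap mostly a matter of changing the connection setup.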

  17. An integrated photogrammetric and spatial database management system for producing fully structured data using aerial and remote sensing images.

    PubMed

    Ahmadi, Farshid Farnood; Ebadi, Hamid

    2009-01-01

3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques is one of the most accurate and economic data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning the storage, structuring, and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric systems and SDBMSs can save time and cost in producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach; this management approach is one of the main problems in GISs when using map products of photogrammetric workstations. These integrated systems also make it possible to provide structured spatial data, based on OGC (Open GIS Consortium) standards and topological relations between different feature classes, during the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated and different levels of integration are described. Finally, the design, implementation, and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) are presented.

  18. Relational Database for the Geology of the Northern Rocky Mountains - Idaho, Montana, and Washington

    USGS Publications Warehouse

    Causey, J. Douglas; Zientek, Michael L.; Bookstrom, Arthur A.; Frost, Thomas P.; Evans, Karl V.; Wilson, Anna B.; Van Gosen, Bradley S.; Boleneus, David E.; Pitts, Rebecca A.

    2008-01-01

    A relational database was created to prepare and organize geologic map-unit and lithologic descriptions for input into a spatial database for the geology of the northern Rocky Mountains, a compilation of forty-three geologic maps for parts of Idaho, Montana, and Washington in U.S. Geological Survey Open File Report 2005-1235. Not all of the information was transferred to and incorporated in the spatial database due to physical file limitations. This report releases that part of the relational database that was completed for that earlier product. In addition to descriptive geologic information for the northern Rocky Mountains region, the relational database contains a substantial bibliography of geologic literature for the area. The relational database nrgeo.mdb (linked below) is available in Microsoft Access version 2000, a proprietary database program. The relational database contains data tables and other tables used to define terms, relationships between the data tables, and hierarchical relationships in the data; forms used to enter data; and queries used to extract data.

  19. BiKEGG: a COBRA toolbox extension for bridging the BiGG and KEGG databases.

    PubMed

    Jamialahmadi, Oveis; Motamedian, Ehsan; Hashemi-Najafabadi, Sameereh

    2016-10-18

    Development of an interface tool between the Biochemical, Genetic and Genomic (BiGG) and KEGG databases is necessary for simultaneous access to the features of both databases. For this purpose, we present the BiKEGG toolbox, an open source COBRA toolbox extension providing a set of functions to infer the reaction correspondences between the KEGG reaction identifiers and those in the BiGG knowledgebase using a combination of manual verification and computational methods. Inferred reaction correspondences using this approach are supported by evidence from the literature, which provides a higher number of reconciled reactions between these two databases compared to the MetaNetX and MetRxn databases. This set of equivalent reactions is then used to automatically superimpose the predicted fluxes using COBRA methods on classical KEGG pathway maps or to create a customized metabolic map based on the KEGG global metabolic pathway, and to find the corresponding reactions in BiGG based on the genome annotation of an organism in the KEGG database. Customized metabolic maps can be created for a set of pathways of interest, for the whole KEGG global map or exclusively for all pathways for which there exists at least one flux carrying reaction. This flexibility in visualization enables BiKEGG to indicate reaction directionality as well as to visualize the reaction fluxes for different static or dynamic conditions in an animated manner. BiKEGG allows the user to export (1) the output visualized metabolic maps to various standard image formats or save them as a video or animated GIF file, and (2) the equivalent reactions for an organism as an Excel spreadsheet.

  20. A search map for organic additives and solvents applicable in high-voltage rechargeable batteries.

    PubMed

    Park, Min Sik; Park, Insun; Kang, Yoon-Sok; Im, Dongmin; Doo, Seok-Gwang

    2016-09-29

Chemical databases store information such as molecular formulas, chemical structures, and the physical and chemical properties of compounds. Although massive databases of organic compounds exist, the search for target materials is constrained by a lack of the physical and chemical properties necessary for specific applications. With increasing interest in the development of energy storage systems such as high-voltage rechargeable batteries, it is critical to find new electrolytes efficiently. Here we build a search map to screen organic additives and solvents with novel core and functional groups, and thus establish a database of electrolytes to identify the most promising electrolyte for high-voltage rechargeable batteries. This search map is generated by the MAssive Molecular Map BUilder (MAMMBU), which combines a high-throughput quantum chemical simulation with an artificial neural network algorithm. MAMMBU is designed to predict the oxidation and reduction potentials of organic compounds in the massive organic compound database PubChem. We develop a search map composed of ∼1,000,000 redox potentials and elucidate the quantitative relationship between the redox potentials and functional groups. Finally, we screen a quinoxaline compound as an anode additive, apply it to electrolytes, and improve the capacity retention from 64.3% to 80.8% near 200 cycles for a lithium ion battery in experiments.

  1. Vegetation database for land-cover mapping, Clark and Lincoln Counties, Nevada

    USGS Publications Warehouse

    Charlet, David A.; Damar, Nancy A.; Leary, Patrick J.

    2014-01-01

    Floristic and other vegetation data were collected at 3,175 sample sites to support land-cover mapping projects in Clark and Lincoln Counties, Nevada, from 2007 to 2013. Data were collected at sample sites that were selected to fulfill mapping priorities by one of two different plot sampling approaches. Samples were described at the stand level and classified into the National Vegetation Classification hierarchy at the alliance level and above. The vegetation database is presented in geospatial and tabular formats.

  2. Geologic map and digital database of the Porcupine Wash 7.5 minute Quadrangle, Riverside County, southern California

    USGS Publications Warehouse

    Powell, Robert E.

    2001-01-01

This data set maps and describes the geology of the Porcupine Wash 7.5 minute quadrangle, Riverside County, southern California. The quadrangle, situated in Joshua Tree National Park in the eastern Transverse Ranges physiographic and structural province, encompasses parts of the Hexie Mountains, Cottonwood Mountains, northern Eagle Mountains, and south flank of Pinto Basin. It is underlain by a basement terrane comprising Proterozoic metamorphic rocks, Mesozoic plutonic rocks, and Mesozoic and Mesozoic or Cenozoic hypabyssal dikes. The basement terrane is capped by a widespread Tertiary erosion surface preserved in remnants in the Eagle and Cottonwood Mountains and buried beneath Cenozoic deposits in Pinto Basin. Locally, Miocene basalt overlies the erosion surface. A sequence of at least three Quaternary pediments is planed into the north piedmont of the Eagle and Hexie Mountains, each in turn overlain by successively younger residual and alluvial deposits. The Tertiary erosion surface is deformed and broken by north-northwest-trending, high-angle, dip-slip faults and an east-west trending system of high-angle dip- and left-slip faults. East-west trending faults are younger than and perhaps in part coeval with faults of the northwest-trending set. The Porcupine Wash database was created using ARCVIEW and ARC/INFO, which are geographical information system (GIS) software products of Environmental Systems Research Institute (ESRI). The database consists of the following items: (1) a map coverage showing faults and geologic contacts and units, (2) a separate coverage showing dikes, (3) a coverage showing structural data, (4) a scanned topographic base at a scale of 1:24,000, and (5) attribute tables for geologic units (polygons and regions), contacts (arcs), and site-specific data (points).
The database, accompanied by a pamphlet file and this metadata file, also includes the following graphic and text products: (1) A portable document file (.pdf) containing a navigable graphic of the geologic map on a 1:24,000 topographic base. The map is accompanied by a marginal explanation consisting of a Description of Map and Database Units (DMU), a Correlation of Map and Database Units (CMU), and a key to point- and line-symbols. (2) Separate .pdf files of the DMU and CMU, individually. (3) A PostScript graphic file containing the geologic map on a 1:24,000 topographic base accompanied by the marginal explanation. (4) A pamphlet that describes the database and how to access it. Within the database, geologic contacts, faults, and dikes are represented as lines (arcs), geologic units as polygons and regions, and site-specific data as points. Polygon, arc, and point attribute tables (.pat, .aat, and .pat, respectively) uniquely identify each geologic datum and link it to other tables (.rel) that provide more detailed geologic information.

  3. National Rehabilitation Information Center

    MedlinePlus

    ... search the NARIC website or one of our databases Select a database or search for a webpage A NARIC webpage ... Projects conducting research and/or development (NIDILRR Program Database). Organizations, agencies, and online resources that support people ...

  4. Possible costs associated with investigating and mitigating geologic hazards in rural areas of western San Mateo County, California with a section on using the USGS website to determine the cost of developing property for residences in rural parts of San Mateo County, California

    USGS Publications Warehouse

    Brabb, Earl E.; Roberts, Sebastian; Cotton, William R.; Kropp, Alan L.; Wright, Robert H.; Zinn, Erik N.; Digital database by Roberts, Sebastian; Mills, Suzanne K.; Barnes, Jason B.; Marsolek, Joanna E.

    2000-01-01

    This publication consists of a digital map database on a geohazards web site, http://kaibab.wr.usgs.gov/geohazweb/intro.htm, this text, and 43 digital map images available for downloading at this site. The report is stored as several digital files, in ARC export (uncompressed) format for the database, and Postscript and PDF formats for the map images. Several of the source data layers for the images have already been released in other publications by the USGS and are available for downloading on the Internet. These source layers are not included in this digital database, but rather a reference is given for the web site where the data can be found in digital format. The exported ARC coverages and grids lie in UTM zone 10 projection. The pamphlet, which only describes the content and character of the digital map database, is included as Postscript, PDF, and ASCII text files and is also available on paper as USGS Open-File Report 00-127. The full versatility of the spatial database is realized by importing the ARC export files into ARC/INFO or an equivalent GIS. Other GIS packages, including MapInfo and ARCVIEW, can also use the ARC export files. The Postscript map image can be used for viewing or plotting in computer systems with sufficient capacity, and the considerably smaller PDF image files can be viewed or plotted in full or in part from Adobe ACROBAT software running on Macintosh, PC, or UNIX platforms.

  5. Review and critical appraisal of studies mapping from quality of life or clinical measures to EQ-5D: an online database and application of the MAPS statement.

    PubMed

    Dakin, Helen; Abel, Lucy; Burns, Richéal; Yang, Yaling

    2018-02-12

    The Health Economics Research Centre (HERC) Database of Mapping Studies was established in 2013, based on a systematic review of studies developing mapping algorithms predicting EQ-5D. The Mapping onto Preference-based measures reporting Standards (MAPS) statement was published in 2015 to improve reporting of mapping studies. We aimed to update the systematic review and assess the extent to which recently-published studies mapping condition-specific quality of life or clinical measures to the EQ-5D follow the guidelines published in the MAPS Reporting Statement. A published systematic review was updated using the original inclusion criteria to include studies published by December 2016. We included studies reporting novel algorithms mapping from any clinical measure or patient-reported quality of life measure to either the EQ-5D-3L or EQ-5D-5L. Titles and abstracts of all identified studies and the full text of papers published in 2016 were assessed against the MAPS checklist. The systematic review identified 144 mapping studies reporting 190 algorithms mapping from 110 different source instruments to EQ-5D. Of the 17 studies published in 2016, nine (53%) had titles that followed the MAPS statement guidance, although only two (12%) had abstracts that fully addressed all MAPS items. When the full text of these papers was assessed against the complete MAPS checklist, only two studies (12%) were found to fulfil or partly fulfil all criteria. Of the 141 papers (across all years) that included abstracts, the items on the MAPS statement checklist that were fulfilled by the largest number of studies comprised having a structured abstract (95%) and describing target instruments (91%) and source instruments (88%). The number of published mapping studies continues to increase. Our updated database provides a convenient way to identify mapping studies for use in cost-utility analysis. Most recent studies do not fully address all items on the MAPS checklist.

  6. Engineering geological mapping in Wallonia (Belgium) : present state and recent computerized approach

    NASA Astrophysics Data System (ADS)

    Delvoie, S.; Radu, J.-P.; Ruthy, I.; Charlier, R.

    2012-04-01

An engineering geological map can be defined as a geological map with a generalized representation of all the components of a geological environment that are required for spatial planning, design, construction, and maintenance of civil engineering works. In Wallonia (Belgium), 24 engineering geological maps were developed between the 1970s and the 1990s at 1:5,000 or 1:10,000 scale, covering some areas of the most industrialized and urbanized cities (Liège, Charleroi, and Mons). They were based on soil and subsoil data points (borings, drillings, penetration tests, geophysical tests, outcrops, …). Some of the displayed data present the depth (with isoheights) or the thickness (with isopachs) of the different subsoil layers down to about 50 m depth. Information about the geomechanical properties of each subsoil layer, useful for engineers and urban planners, is also synthesized. However, these maps existed only on paper and progressively needed to be updated with new soil and subsoil data. The Public Service of Wallonia and the University of Liège have recently initiated a study to evaluate the feasibility of developing engineering geological mapping with a computerized approach. Numerous and varied data (about soil and subsoil) are stored in a georelational database (the geotechnical database, using Access, Microsoft®). All the data are geographically referenced, and the database is linked to a GIS project (using ArcGIS, ESRI®). Together, the database and the GIS project constitute a powerful tool for spatial data management and analysis. This approach involves a methodology using interpolation methods to update the previous maps and to extend the coverage to new areas. The location (x, y, z) of each subsoil layer is then computed from the data points. The geomechanical data of these layers are synthesized in an explanatory booklet accompanying the maps.
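The interpolation step, estimating a subsoil layer's depth at arbitrary locations from scattered borehole points, can be sketched with inverse-distance weighting. The sample points and the power parameter below are illustrative assumptions; the actual project may use kriging or other interpolators.

```python
# Hedged sketch: inverse-distance weighting (IDW) of layer depth from
# scattered borehole observations. Points and parameters are invented
# for illustration.
import math

def idw_depth(points, x, y, power=2.0):
    """points: list of (x, y, depth). Return interpolated depth at (x, y)."""
    num = den = 0.0
    for px, py, depth in points:
        d = math.hypot(x - px, y - py)
        if d == 0.0:
            return depth          # exactly on a borehole: use its value
        w = 1.0 / d ** power      # nearer boreholes weigh more
        num += w * depth
        den += w
    return num / den

boreholes = [(0.0, 0.0, 10.0), (10.0, 0.0, 20.0), (0.0, 10.0, 30.0)]
print(round(idw_depth(boreholes, 5.0, 0.0), 1))  # ≈16.4; the two nearest boreholes dominate
```

Evaluating this on a regular grid yields the isoheight/isopach surfaces the maps display.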

  7. Ten years of change: National Library of Medicine TOXMAP gets a new look.

    PubMed

    Hochstein, Colette; Gemoets, Darren; Goshorn, Jeanne

    2014-01-01

The United States National Library of Medicine (NLM) TOXNET® databases < http://toxnet.nlm.nih.gov > provide broad coverage of environmental health information covering a wide variety of topics, including access to the U.S. Environmental Protection Agency (EPA)'s Toxics Release Inventory (TRI) data. The NLM web-based geographic information system (GIS), TOXMAP® < http://toxmap.nlm.nih.gov/ >, provides interactive maps which show where TRI chemicals are released into the environment and links to TOXNET for information about these chemicals. TOXMAP also displays locations of Superfund sites on the EPA National Priority List, as well as information about the chemical contaminants at these sites. This column focuses on a new version of TOXMAP which brings it up to date with current web GIS technologies and user expectations.

  8. Accurate atom-mapping computation for biochemical reactions.

    PubMed

    Latendresse, Mario; Malerich, Jeremiah P; Travers, Mike; Karp, Peter D

    2012-11-26

The complete atom mapping of a chemical reaction is a bijection of the reactant atoms to the product atoms that specifies the terminus of each reactant atom. Atom mapping of biochemical reactions is useful for many applications of systems biology, in particular for metabolic engineering, where synthesizing new biochemical pathways has to take into account the number of carbon atoms from a source compound that are conserved in the synthesis of a target compound. Rapid, accurate computation of the atom mapping(s) of a biochemical reaction remains elusive despite significant work on this topic. In particular, past researchers did not validate the accuracy of mapping algorithms. We introduce a new method for computing atom mappings called the minimum weighted edit-distance (MWED) metric. The metric is based on bond propensity to react and computes biochemically valid atom mappings for a large percentage of biochemical reactions. MWED models can be formulated efficiently as Mixed-Integer Linear Programs (MILPs). We have demonstrated this approach on 7501 reactions of the MetaCyc database, for which 87% of the models could be solved in less than 10 s. For 2.1% of the reactions, we found multiple optimal atom mappings. We show that the error rate is 0.9% (22 reactions) by comparing these atom mappings to 2446 atom mappings of the manually curated Kyoto Encyclopedia of Genes and Genomes (KEGG) RPAIR database. To our knowledge, our computational atom-mapping approach is the most accurate and among the fastest published to date. The atom-mapping data will be available in the MetaCyc database later in 2012; the atom-mapping software will be available within the Pathway Tools software later in 2012.
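The underlying optimization, choosing the bijection of reactant atoms onto product atoms that minimizes a total edit-distance cost, can be illustrated with a toy brute-force search. The paper formulates this as a MILP over bond edit distances; here an exhaustive scan over permutations stands in, with a hypothetical cost matrix.

```python
# Hedged toy illustration of minimum-weight atom mapping: enumerate all
# bijections of n reactant atoms to n product atoms and keep the cheapest.
# The cost matrix is invented; MWED derives its costs from bond propensities
# and solves the real problem as a MILP, not by enumeration.
from itertools import permutations

def min_weight_mapping(cost):
    """cost[i][j]: penalty of mapping reactant atom i to product atom j.
    Returns (minimum total cost, assignment tuple p with i -> p[i])."""
    n = len(cost)
    return min(
        (sum(cost[i][p[i]] for i in range(n)), p)
        for p in permutations(range(n))
    )

# 3 reactant atoms vs 3 product atoms (illustrative costs).
cost = [
    [0, 5, 9],
    [5, 0, 4],
    [9, 4, 1],
]
total, assignment = min_weight_mapping(cost)
print(total, assignment)  # 1 (0, 1, 2)
```

Enumeration is factorial in the atom count, which is exactly why the paper's MILP formulation matters for reactions of realistic size.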

  9. Brassica ASTRA: an integrated database for Brassica genomic research.

    PubMed

    Love, Christopher G; Robinson, Andrew J; Lim, Geraldine A C; Hopkins, Clare J; Batley, Jacqueline; Barker, Gary; Spangenberg, German C; Edwards, David

    2005-01-01

    Brassica ASTRA is a public database for genomic information on Brassica species. The database incorporates expressed sequences with Swiss-Prot and GenBank comparative sequence annotation as well as secondary Gene Ontology (GO) annotation derived from the comparison with Arabidopsis TAIR GO annotations. Simple sequence repeat molecular markers are identified within resident sequences and mapped onto the closely related Arabidopsis genome sequence. Bacterial artificial chromosome (BAC) end sequences derived from the Multinational Brassica Genome Project are also mapped onto the Arabidopsis genome sequence enabling users to identify candidate Brassica BACs corresponding to syntenic regions of Arabidopsis. This information is maintained in a MySQL database with a web interface providing the primary means of interrogation. The database is accessible at http://hornbill.cspp.latrobe.edu.au.

  10. 76 FR 5106 - Deposit Requirements for Registration of Automated Databases That Predominantly Consist of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-28

    ... Registration of Automated Databases That Predominantly Consist of Photographs AGENCY: Copyright Office, Library... regarding electronic registration of automated databases that consist predominantly of photographs and group... applications for automated databases that consist predominantly of photographs. The proposed amendments would...

  11. 77 FR 66617 - HIT Policy and Standards Committees; Workgroup Application Database

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-06

    ... Database AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of New ONC HIT FACA Workgroup Application Database. The Office of the National Coordinator (ONC) has launched a new Health Information Technology Federal Advisory Committee Workgroup Application Database...

  12. Genome databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Courteau, J.

    1991-10-11

    Since the Genome Project began several years ago, a plethora of databases have been developed or are in the works. They range from the massive Genome Data Base at Johns Hopkins University, the central repository of all gene mapping information, to small databases focusing on single chromosomes or organisms. Some are publicly available, others are essentially private electronic lab notebooks. Still others limit access to a consortium of researchers working on, say, a single human chromosome. An increasing number incorporate sophisticated search and analytical software, while others operate as little more than data lists. In consultation with numerous experts in the field, a list has been compiled of some key genome-related databases. The list was not limited to map and sequence databases but also included the tools investigators use to interpret and elucidate genetic data, such as protein sequence and protein structure databases. Because a major goal of the Genome Project is to map and sequence the genomes of several experimental animals, including E. coli, yeast, fruit fly, nematode, and mouse, the available databases for those organisms are listed as well. The author also includes several databases that are still under development - including some ambitious efforts that go beyond data compilation to create what are being called electronic research communities, enabling many users, rather than just one or a few curators, to add or edit the data and tag it as raw or confirmed.

  13. National Cartographic Information Center

    USGS Publications Warehouse

    ,

    1984-01-01

    The National Cartographic Information Center (NCIC) exists to help you find maps of all kinds and much of the data and materials used to compile and to print them. NCIC collects, sorts and describes all types of cartographic information from Federal, State and local government agencies and, where possible, from private companies in the mapping business. It is the public's primary source for cartographic information. (See partial list of Federal agencies and their map and other cartographic products.)

  14. 77 FR 3455 - Privacy Act of 1974; System of Records-Migrant Education Bypass Program Student Database

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-24

    ... Student Database AGENCY: Office of Elementary and Secondary Education, Department of Education. ACTION... entitled ``Migrant Education Bypass Program Student Database (MEBPSD)'' (18-14-06). The Secretary has...

  15. Characterization of Sedimentary Deposits Using usSEABED for Large-scale Mapping, Modeling and Research of U.S.Continental Margins

    NASA Astrophysics Data System (ADS)

    Williams, S. J.; Reid, J. A.; Arsenault, M. A.; Jenkins, C.

    2006-12-01

    Geologic maps of offshore areas containing detailed morphologic features and sediment character can serve many scientific and operational purposes. Such maps have been lacking, but recent computer technology and software to capture diverse marine data are offering promise. Continental margins, products of complex geologic history and dynamic oceanographic processes, dominated by the Holocene marine transgression, contain landforms which provide a variety of important functions: critical habitats for fish, ship navigation, national defense, and engineering activities (i.e., oil and gas platforms, pipeline and cable routes, wind-energy sites) and contain important sedimentary records. Some shelf areas also contain sedimentary deposits such as sand and gravel, regarded as potential aggregate resources for mitigating coastal erosion, reducing vulnerability to hazards, and restoring ecosystems. Because coastal and offshore areas are increasingly important, knowledge of the framework geology and marine processes is useful to many. Especially valuable are comprehensive and integrated digital databases based on data from original sources in the marine community. Products of interest are GIS maps containing thematic information such as seafloor physiography, geology, sediment character and texture, seafloor roughness, and geotechnical engineering properties. These map products are useful to scientists modeling nearshore and shelf processes as well as planners and managers. The USGS with partners is leading a Nation-wide program to gather a wide variety of extant marine geologic data into the usSEABED system (http://walrus.wr.usgs/usseabed). This provides a centralized, fully integrated digital database of marine geologic data collected over the past 50 years by USGS, other federal and state agencies, universities and private companies. To date, approximately 325,000 data points from the U.S. EEZ reside in usSEABED. 
The usSEABED, which combines a broad array of physical data and information (both analytical and descriptive) about the sea floor, including sediment textural, statistical, geochemical, geophysical, and compositional information, is available to the marine community through USGS Data Series publications. Three DS reports for the Atlantic (DS-118), Gulf of Mexico (DS-146) and Pacific (DS-182) were published in 2006, and reports for HI and AK are forthcoming. The use of usSEABED and derivative map products is part of ongoing USGS efforts to conduct regional assessments of potential marine sand and gravel resources, map benthic habitats, and support research in understanding seafloor character and mobility, transport processes and natural resources.

  16. KIDFamMap: a database of kinase-inhibitor-disease family maps for kinase inhibitor selectivity and binding mechanisms

    PubMed Central

    Chiu, Yi-Yuan; Lin, Chih-Ta; Huang, Jhang-Wei; Hsu, Kai-Cheng; Tseng, Jen-Hu; You, Syuan-Ren; Yang, Jinn-Moon

    2013-01-01

    Kinases play central roles in signaling pathways and are promising therapeutic targets for many diseases. Designing selective kinase inhibitors is an emergent and challenging task, because kinases share an evolutionary conserved ATP-binding site. KIDFamMap (http://gemdock.life.nctu.edu.tw/KIDFamMap/) is the first database to explore kinase-inhibitor families (KIFs) and kinase-inhibitor-disease (KID) relationships for kinase inhibitor selectivity and mechanisms. This database includes 1208 KIFs, 962 KIDs, 55 603 kinase-inhibitor interactions (KIIs), 35 788 kinase inhibitors, 399 human protein kinases, 339 diseases and 638 disease allelic variants. Here, a KIF can be defined as follows: (i) the kinases in the KIF share significant sequence similarity, (ii) the inhibitors in the KIF share significant topology similarity and (iii) the KIIs in the KIF share significant interaction similarity. The KIIs within a KIF are often conserved on some consensus KIDFamMap anchors, which represent conserved interactions between the kinase subsites and consensus moieties of their inhibitors. Our experimental results reveal that the members of a KIF often possess similar inhibition profiles. The KIDFamMap anchors can reflect kinase conformation types, kinase functions and kinase inhibitor selectivity. We believe that KIDFamMap provides biological insights into kinase inhibitor selectivity and binding mechanisms. PMID:23193279

  17. A new edition of the Mars 1:5,000,000 map series

    NASA Technical Reports Server (NTRS)

    Batson, R. M.; Mcewen, Alfred S.; Wu, Sherman S. C.

    1991-01-01

    A new edition of the Mars 1:5,000,000 scale map series is in preparation. Two sheets will be made for each quadrangle. Sheet one will show shaded relief, contours, and nomenclature. Sheet 2 will be a full-color photomosaic prepared on the Mars digital image model (MDIM) base co-registered with the Mars low-resolution color database. The latter will have an abbreviated graticule (latitude/longitude ticks only) and no other line overprint. The four major databases used to assemble this series are now virtually complete. These are: (1) Viking-revised shaded relief maps at 1:5,000,000 scale; (2) contour maps at 1:2,000,000 scale; (3) the Mars digital image model; and (4) a color image mosaic of Mars. Together, these databases form the most complete planetwide cartographic definition of Mars that can be compiled with existing data. The new edition will supersede the published Mars 1:5,000,000 scale maps, including the original shaded relief and topographic maps made primarily with Mariner 9 data and the Viking-revised shaded relief and controlled photomosaic series. Publication of the new series will begin in late 1991 or early 1992, and it should be completed in two years.

  18. 76 FR 30997 - National Transit Database: Amendments to Urbanized Area Annual Reporting Manual

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-27

    ... Transit Database: Amendments to Urbanized Area Annual Reporting Manual AGENCY: Federal Transit Administration (FTA), DOT. ACTION: Notice of Amendments to 2011 National Transit Database Urbanized Area Annual... Administration's (FTA) 2011 National Transit Database (NTD) Urbanized Area Annual Reporting Manual (Annual Manual...

  19. 77 FR 40268 - Deposit Requirements for Registration of Automated Databases That Predominantly Consist of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-09

    ... Registration of Automated Databases That Predominantly Consist of Photographs AGENCY: Copyright Office, Library... the deposit requirements for applications for automated databases that consist predominantly of... authorship, the deposits for such databases include the image of each photograph in which copyright is...

  20. Database for the geologic map of Upper Geyser Basin, Yellowstone National Park, Wyoming

    USGS Publications Warehouse

    Abendini, Atosa A.; Robinson, Joel E.; Muffler, L. J. Patrick; White, D. E.; Beeson, Melvin H.; Truesdell, A. H.

    2015-01-01

    This dataset contains contacts, geologic units, and map boundaries from Miscellaneous Investigations Series Map I-1371, "The Geologic map of Upper Geyser Basin, Yellowstone National Park, Wyoming". This dataset was constructed to produce a digital geologic map as a basis for ongoing studies of hydrothermal processes.

  1. The National Deep-Sea Coral and Sponge Database: A Comprehensive Resource for United States Deep-Sea Coral and Sponge Records

    NASA Astrophysics Data System (ADS)

    Dornback, M.; Hourigan, T.; Etnoyer, P.; McGuinn, R.; Cross, S. L.

    2014-12-01

    Research on deep-sea corals has expanded rapidly over the last two decades, as scientists began to realize their value as long-lived structural components of high biodiversity habitats and archives of environmental information. The NOAA Deep Sea Coral Research and Technology Program's National Database for Deep-Sea Corals and Sponges is a comprehensive resource for georeferenced data on these organisms in U.S. waters. The National Database currently includes more than 220,000 deep-sea coral records representing approximately 880 unique species. Database records from museum archives, commercial and scientific bycatch, and from journal publications provide baseline information with relatively coarse spatial resolution dating back as far as 1842. These data are complemented by modern, in-situ submersible observations with high spatial resolution, from surveys conducted by NOAA and NOAA partners. Management of high volumes of modern high-resolution observational data can be challenging. NOAA is working with our data partners to incorporate this occurrence data into the National Database, along with images and associated information related to geoposition, time, biology, taxonomy, environment, provenance, and accuracy. NOAA is also working to link associated datasets collected by our program's research, to properly archive them to the NOAA National Data Centers, to build a robust metadata record, and to establish a standard protocol to simplify the process. Access to the National Database is provided through an online mapping portal. The map displays point-based records from the database. Records can be refined by taxon, region, time, and depth. The queries and extent used to view the map can also be used to download subsets of the database. The database, map, and website are already in use by NOAA, regional fishery management councils, and regional ocean planning bodies, but we envision it as a model that can expand to accommodate data on a global scale.

  2. Kazusa Marker DataBase: a database for genomics, genetics, and molecular breeding in plants.

    PubMed

    Shirasawa, Kenta; Isobe, Sachiko; Tabata, Satoshi; Hirakawa, Hideki

    2014-09-01

    In order to provide useful genomic information for agronomical plants, we have established a database, the Kazusa Marker DataBase (http://marker.kazusa.or.jp). This database includes information on DNA markers, e.g., SSR and SNP markers, genetic linkage maps, and physical maps, that were developed at the Kazusa DNA Research Institute. Keyword searches for the markers, sequence data used for marker development, and experimental conditions are also available through this database. Currently, 10 plant species have been targeted: tomato (Solanum lycopersicum), pepper (Capsicum annuum), strawberry (Fragaria × ananassa), radish (Raphanus sativus), Lotus japonicus, soybean (Glycine max), peanut (Arachis hypogaea), red clover (Trifolium pratense), white clover (Trifolium repens), and eucalyptus (Eucalyptus camaldulensis). In addition, the number of plant species registered in this database will be increased as our research progresses. The Kazusa Marker DataBase will be a useful tool for both basic and applied sciences, such as genomics, genetics, and molecular breeding in crops.

  3. Arctic Research Mapping Application (ARMAP): visualize project-level information for U.S. funded research in the Arctic

    NASA Astrophysics Data System (ADS)

    Kassin, A.; Cody, R. P.; Barba, M.; Escarzaga, S. M.; Score, R.; Dover, M.; Gaylord, A. G.; Manley, W. F.; Habermann, T.; Tweedie, C. E.

    2015-12-01

    The Arctic Research Mapping Application (ARMAP; http://armap.org/) is a suite of online applications and data services that support Arctic science by providing project tracking information (who's doing what, when and where in the region) for United States Government funded projects. In collaboration with 17 research agencies, project locations are displayed in a visually enhanced web mapping application. Key information about each project is presented along with links to web pages that provide additional information. The mapping application includes new reference data layers and an updated ship tracks layer. Visual enhancements are achieved by redeveloping the front-end from FLEX to HTML5 and JavaScript, which now provide access to mobile users utilizing tablets and cell phone devices. New tools have been added that allow users to navigate, select, draw, measure, print, use a time slider, and more. Other module additions include a back-end Apache SOLR search platform that provides users with the capability to perform advance searches throughout the ARMAP database. Furthermore, a new query builder interface has been developed in order to provide more intuitive controls to generate complex queries. These improvements have been made to increase awareness of projects funded by numerous entities in the Arctic, enhance coordination for logistics support, help identify geographic gaps in research efforts and potentially foster more collaboration amongst researchers working in the region. Additionally, ARMAP can be used to demonstrate past, present, and future research efforts supported by the U.S. Government.

  4. Maps of Quaternary Deposits and Liquefaction Susceptibility in the Central San Francisco Bay Region, California

    USGS Publications Warehouse

    Witter, Robert C.; Knudsen, Keith L.; Sowers, Janet M.; Wentworth, Carl M.; Koehler, Richard D.; Randolph, Carolyn E.; Brooks, Suzanna K.; Gans, Kathleen D.

    2006-01-01

    This report presents a map and database of Quaternary deposits and liquefaction susceptibility for the urban core of the San Francisco Bay region. It supersedes the equivalent area of U.S. Geological Survey Open-File Report 00-444 (Knudsen and others, 2000), which covers the larger 9-county San Francisco Bay region. The report consists of (1) a spatial database, (2) two small-scale colored maps (Quaternary deposits and liquefaction susceptibility), (3) a text describing the Quaternary map and liquefaction interpretation (part 3), and (4) a text introducing the report and describing the database (part 1). All parts of the report are digital; part 1 describes the database and digital files and how to obtain them by downloading across the internet. The nine counties surrounding San Francisco Bay straddle the San Andreas fault system, which exposes the region to serious earthquake hazard (Working Group on California Earthquake Probabilities, 1999). Much of the land adjacent to the Bay and the major rivers and streams is underlain by unconsolidated deposits that are particularly vulnerable to earthquake shaking and liquefaction of water-saturated granular sediment. This new map provides a consistent detailed treatment of the central part of the 9-county region in which much of the mapping of Open-File Report 00-444 was either at smaller (less detailed) scale or represented only preliminary revision of earlier work. Like Open-File Report 00-444, the current mapping uses geomorphic expression, pedogenic soils, inferred depositional environments, and geologic age to define and distinguish the map units. Further scrutiny of the factors controlling liquefaction susceptibility has led to some changes relative to Open-File Report 00-444: particularly the reclassification of San Francisco Bay mud (Qhbm) to have only MODERATE susceptibility and the rating of artificial fills according to the Quaternary map units inferred to underlie them (other than dams - adf). 
The two colored maps provide a regional summary of the new mapping at a scale of 1:200,000, a scale that is sufficient to show the general distribution and relationships of the map units but not to distinguish the more detailed elements that are present in the database. The report is the product of cooperative work by the National Earthquake Hazards Reduction Program (NEHRP) and National Cooperative Geologic Mapping Program of the U.S. Geological Survey, William Lettis & Associates, Inc. (WLA), and the California Geological Survey. An earlier version was submitted to the U.S. Geological Survey by WLA as a final report for a NEHRP grant (Witter and others, 2005). The mapping has been carried out by WLA geologists under contract to the NEHRP Earthquake Program (Grant 99-HQ-GR-0095) and by the California Geological Survey.

  5. Guidelines for establishing and maintaining construction quality databases : tech brief.

    DOT National Transportation Integrated Search

    2006-12-01

    Construction quality databases contain a variety of construction-related data that characterize the quality of materials and workmanship. The primary purpose of construction quality databases is to help State highway agencies (SHAs) assess the qualit...

  6. Map-Based Querying for Multimedia Database

    DTIC Science & Technology

    2014-09-01

    existing assets in a custom multimedia database based on an area of interest. It also describes the augmentation of an Android Tactical Assault Kit (ATAK......for Multimedia Database Somiya Metu Computational and Information Sciences Directorate, ARL

  7. A Free Database of Auto-detected Full-sun Coronal Hole Maps

    NASA Astrophysics Data System (ADS)

    Caplan, R. M.; Downs, C.; Linker, J.

    2016-12-01

    We present a 4-yr (06/10/2010 to 08/18/14 at 6-hr cadence) database of full-sun synchronic EUV and coronal hole (CH) maps made available on a dedicated web site (http://www.predsci.com/chd). The maps are generated using STEREO/EUVI A&B 195Å and SDO/AIA 193Å images through an automated pipeline (Caplan et al. 2016, ApJ 823, 53). Specifically, the original data is preprocessed with PSF-deconvolution, a nonlinear limb-brightening correction, and a nonlinear inter-instrument intensity normalization. Coronal holes are then detected in the preprocessed images using a GPU-accelerated region growing segmentation algorithm. The final results from all three instruments are then merged and projected to form full-sun sine-latitude maps. All the software used in processing the maps is provided, which can easily be adapted for use with other instruments and channels. We describe the data pipeline and show examples from the database. We also detail recent CH-detection validation experiments using synthetic EUV emission images produced from global thermodynamic MHD simulations.
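The region-growing segmentation step can be sketched under simplifying assumptions (a single intensity threshold and 4-connectivity on a plain 2D list; the published pipeline is GPU-accelerated and uses a more elaborate detection scheme):

```python
from collections import deque

def region_grow(img, seed, thresh):
    """Grow a coronal-hole-like region from `seed`: breadth-first search
    that includes 4-connected pixels with intensity below `thresh`
    (coronal holes appear dark in EUV images). Returns a boolean mask."""
    rows, cols = len(img), len(img[0])
    mask = [[False] * cols for _ in range(rows)]
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        if 0 <= r < rows and 0 <= c < cols and not mask[r][c] and img[r][c] < thresh:
            mask[r][c] = True
            queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return mask

# Tiny synthetic image: a dark patch in the upper-left corner grows from
# the seed, while the isolated dark pixel at (2, 2) is not reached.
dark_mask = region_grow([[1, 1, 9],
                         [1, 9, 9],
                         [9, 9, 1]], seed=(0, 0), thresh=5)
```

Threshold value, connectivity, and the toy image are illustrative assumptions; they are not parameters of the actual pipeline.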

  8. SModelS v1.1 user manual: Improving simplified model constraints with efficiency maps

    NASA Astrophysics Data System (ADS)

    Ambrogi, Federico; Kraml, Sabine; Kulkarni, Suchita; Laa, Ursula; Lessa, Andre; Magerl, Veronika; Sonneveld, Jory; Traub, Michael; Waltenberger, Wolfgang

    2018-06-01

    SModelS is an automated tool for the interpretation of simplified model results from the LHC. It allows one to decompose models of new physics obeying a Z2 symmetry into simplified model components, and to compare these against a large database of experimental results. The first release of SModelS, v1.0, used only cross section upper limit maps provided by the experimental collaborations. In this new release, v1.1, we extend the functionality of SModelS to efficiency maps. This increases the constraining power of the software, as efficiency maps allow one to combine contributions to the same signal region from different simplified models. Other new features of version 1.1 include likelihood and χ2 calculations, extended information on the topology coverage, an extended database of experimental results as well as major speed upgrades for both the code and the database. We describe in detail the concepts and procedures used in SModelS v1.1, explaining in particular how upper limits and efficiency map results are dealt with in parallel. Detailed instructions for code usage are also provided.
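The reason efficiency maps increase constraining power, as described above, is that contributions from several simplified-model topologies can be summed into one signal region prediction and compared against that region's observed upper limit. A minimal sketch with entirely hypothetical numbers (this is not the SModelS API):

```python
def signal_region_events(contributions, lumi_fb):
    """Predicted event count in one signal region, combining several
    simplified-model topologies: N = L * sum(sigma_i * eff_i).
    `contributions` is a list of (cross_section_fb, efficiency) pairs."""
    return lumi_fb * sum(xs * eff for xs, eff in contributions)

def r_value(contributions, lumi_fb, upper_limit_events):
    """r > 1 means this signal region excludes the model point."""
    return signal_region_events(contributions, lumi_fb) / upper_limit_events

# Hypothetical inputs: two topologies feed the same signal region.
r = r_value([(10.0, 0.02), (5.0, 0.01)], lumi_fb=20.3, upper_limit_events=4.0)
```

With upper-limit maps alone, each topology would be tested separately against its own cross-section limit, losing the combined sensitivity that the summed prediction provides.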

  9. Predictive landslide susceptibility mapping using spatial information in the Pechabun area of Thailand

    NASA Astrophysics Data System (ADS)

    Oh, Hyun-Joo; Lee, Saro; Chotikasathien, Wisut; Kim, Chang Hwan; Kwon, Ju Hyoung

    2009-04-01

    For predictive landslide susceptibility mapping, this study applied and verified a probability model (the frequency ratio) and a statistical model (logistic regression) at Pechabun, Thailand, using a geographic information system (GIS) and remote sensing. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys, and maps of the topography, geology and land cover were compiled into a spatial database. The factors that influence landslide occurrence, such as slope gradient, slope aspect, curvature of topography and distance from drainage, were calculated from the topographic database. Lithology and distance from faults were extracted and calculated from the geology database. Land cover was classified from a Landsat TM satellite image. The frequency ratios and logistic regression coefficients were overlaid as each factor's ratings to produce landslide susceptibility maps. The landslide susceptibility maps were then verified and compared using the known landslide locations. In the verification, the frequency ratio model showed 76.39% prediction accuracy and the logistic regression model 70.42%. The method can be used to reduce hazards associated with landslides and to support land-cover planning.
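The frequency ratio used in such studies has a standard definition: the share of landslide cells falling in a factor class divided by the share of the study area occupied by that class. A minimal sketch with hypothetical counts (the study's actual class bins are not given in the abstract):

```python
def frequency_ratio(landslides_in_class, landslides_total,
                    class_cells, total_cells):
    """FR = (share of landslide cells in a factor class)
          / (share of study-area cells in that class).
    FR > 1 marks a class more landslide-prone than average."""
    return (landslides_in_class / landslides_total) / (class_cells / total_cells)

# Hypothetical counts: 30 of 100 landslide cells fall in a slope class
# that covers 1000 of 10000 study-area cells.
fr = frequency_ratio(30, 100, 1000, 10000)  # approximately 3.0
```

Summing each cell's frequency ratios over all factor maps gives the susceptibility index that is then mapped and verified against known landslide locations.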

  10. EMAP and EMAGE: a framework for understanding spatially organized data.

    PubMed

    Baldock, Richard A; Bard, Jonathan B L; Burger, Albert; Burton, Nicolas; Christiansen, Jeff; Feng, Guanjie; Hill, Bill; Houghton, Derek; Kaufman, Matthew; Rao, Jianguo; Sharpe, James; Ross, Allyson; Stevenson, Peter; Venkataraman, Shanmugasundaram; Waterhouse, Andrew; Yang, Yiya; Davidson, Duncan R

    2003-01-01

    The Edinburgh Mouse Atlas Project (EMAP) is a time-series of mouse-embryo volumetric models. The models provide a context-free spatial framework onto which structural interpretations and experimental data can be mapped. This enables collation, comparison, and query of complex spatial patterns with respect to each other and with respect to known or hypothesized structure. The atlas also includes a time-dependent anatomical ontology and mapping between the ontology and the spatial models in the form of delineated anatomical regions or tissues. The models provide a natural, graphical context for browsing and visualizing complex data. The Edinburgh Mouse Atlas Gene-Expression Database (EMAGE) is one of the first applications of the EMAP framework and provides a spatially mapped gene-expression database with associated tools for data mapping, submission, and query. In this article, we describe the underlying principles of the Atlas and the gene-expression database, and provide a practical introduction to the use of the EMAP and EMAGE tools, including use of new techniques for whole body gene-expression data capture and mapping.

  11. Digital images in the map revision process

    NASA Astrophysics Data System (ADS)

    Newby, P. R. T.

    Progress towards the adoption of digital (or softcopy) photogrammetric techniques for database and map revision is reviewed. Particular attention is given to the Ordnance Survey of Great Britain, the author's former employer, where digital processes are under investigation but have not yet been introduced for routine production. Developments which may lead to increasing automation of database update processes appear promising, but because of the cost and practical problems associated with managing as well as updating large digital databases, caution is advised when considering the transition to softcopy photogrammetry for revision tasks.

  12. 42 CFR 455.436 - Federal database checks.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Federal database checks. 455.436 Section 455.436....436 Federal database checks. The State Medicaid agency must do all of the following: (a) Confirm the... databases. (b) Check the Social Security Administration's Death Master File, the National Plan and Provider...

  13. 78 FR 58545 - Global Unique Device Identification Database; Draft Guidance for Industry; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ...] Global Unique Device Identification Database; Draft Guidance for Industry; Availability AGENCY: Food and... the availability of the draft guidance entitled ``Global Unique Device Identification Database (GUDID... manufacturer) will interface with the GUDID, as well as information on the database elements that must be...

  14. 75 FR 18255 - Passenger Facility Charge Database System for Air Carrier Reporting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-09

    ... Facility Charge Database System for Air Carrier Reporting AGENCY: Federal Aviation Administration (FAA... the Passenger Facility Charge (PFC) database system to report PFC quarterly report information. In... developed a national PFC database system in order to more easily track the PFC program on a nationwide basis...

  15. 42 CFR 455.436 - Federal database checks.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Federal database checks. 455.436 Section 455.436....436 Federal database checks. The State Medicaid agency must do all of the following: (a) Confirm the... databases. (b) Check the Social Security Administration's Death Master File, the National Plan and Provider...

  16. 42 CFR 455.436 - Federal database checks.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Federal database checks. 455.436 Section 455.436....436 Federal database checks. The State Medicaid agency must do all of the following: (a) Confirm the... databases. (b) Check the Social Security Administration's Death Master File, the National Plan and Provider...

  17. 42 CFR 455.436 - Federal database checks.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 4 2013-10-01 2013-10-01 false Federal database checks. 455.436 Section 455.436....436 Federal database checks. The State Medicaid agency must do all of the following: (a) Confirm the... databases. (b) Check the Social Security Administration's Death Master File, the National Plan and Provider...

  18. Geologic map of outcrop areas of sedimentary units in the eastern part of the Hailey 1 degree x 2 degrees quadrangle and part of the southern part of the Challis 1 degree x 2 degrees quadrangle, south-central Idaho

    USGS Publications Warehouse

    Link, P.K.; Mahoney, J.B.; Bruner, D.J.; Batatian, L.D.; Wilson, Eric; Williams, F.J.C.

    1995-01-01

    The paper version of the Geologic map of outcrop areas of sedimentary units in the eastern part of the Hailey 1x2 Quadrangle and part of the southern part of the Challis 1x2 Quadrangle, south-central Idaho was compiled by Paul Link and others in 1995. The plate was compiled on a 1:100,000 scale topographic base map. TechniGraphic System, Inc. of Fort Collins, Colorado digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a GIS database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.

  19. Preliminary northeast Asia geodynamics map

    USGS Publications Warehouse

    Parfenov, Leonid M.; Khanchuk, Alexander I.; Badarch, Gombosuren; Miller, Robert J.; Naumova, Vera V.; Nokleberg, Warren J.; Ogasawara, Masatsugu; Prokopiev, Andrei V.; Yan, Hongquan

    2003-01-01

    This map portrays the geodynamics of Northeast Asia at a scale of 1:5,000,000 using the concepts of plate tectonics and analysis of terranes and overlap assemblages. The map is the result of a detailed compilation and synthesis at 5 million scale and is part of a major international collaborative study of the Mineral Resources, Metallogenesis, and Tectonics of Northeast Asia conducted from 1997 through 2002 by geologists from earth science agencies and universities in Russia, Mongolia, Northeastern China, South Korea, Japan, and the USA. This map is the result of extensive geologic mapping and associated tectonic studies in Northeast Asia in the last few decades and is the first collaborative compilation of the geology of the region at a scale of 1:5,000,000 by geologists from Russia, Mongolia, Northeastern China, South Korea, Japan, and the USA. The map was compiled by a large group of international geologists using agreed concepts and definitions developed during collaborative workshops over a six-year period. The map is a major new compilation and re-interpretation of pre-existing geologic maps of the region. The map is designed to be used for several purposes, including regional tectonic analyses, mineral resource and metallogenic analysis, petroleum resource analysis, neotectonic analysis, and analysis of seismic hazards and volcanic hazards. The map consists of two sheets. Sheet 1 displays the map at a scale of 1:5,000,000 and the explanation. Sheet 2 displays the introduction, list of map units, and source references. Detailed descriptions of map units and stratigraphic columns are being published separately. This map is one of a series of publications on the mineral resources, metallogenesis, and geodynamics of Northeast Asia. 
Companion studies, related articles and maps, and various detailed reports are: (1) a compilation of major mineral deposit models (Rodionov and Nokleberg, 2000; Rodionov and others, 2000; Obolenskiy and others, in press a); (2) a series of metallogenic belt maps (Obolenskiy and others, 2001; in press b); (3) a lode mineral deposits and placer districts location map for Northeast Asia (Ariunbileg and others, in press b); (4) descriptions of metallogenic belts (Rodionov and others, in press); and (5) a database on significant metalliferous and selected nonmetalliferous lode deposits, and selected placer districts (Ariunbileg and others, in press a).

  20. Holocene = Anthropocene? The HYDE database for integrated global change research over the past 12,000 years

    NASA Astrophysics Data System (ADS)

    Klein Goldewijk, K.

    2008-12-01

    More and more studies of global (climate) change are focusing on the past. Hundreds and thousands of years of land use, driven by population growth, have left their mark on the Earth's surface. We are only beginning to understand the complex relationship between human-induced disturbances of the global environment and the consequences for future climate. It is therefore essential that we gain a clear understanding of past relationships between population growth, land use, and climate. To help climate modelers examine these relationships, the HYDE database has been updated and extended. The update of HYDE described here (Klein Goldewijk et al. 2006; Klein Goldewijk et al. 2007) includes several improvements over its predecessor: (i) the HYDE 2 version used a Boolean approach at 30-minute resolution, while HYDE 3 uses fractional land use at 5-minute resolution; (ii) more and better sub-national (population) data (Klein Goldewijk, 2005) improve the historical (urban and rural) population maps used as a basis for allocation of land cover; (iii) different allocation algorithms with time-dependent weighting maps for cropland and grassland have been implemented; (iv) the period covered has been extended from the emergence of agriculture (10,000 B.C.) to the present (2000 A.D.), with varying time intervals. One example of (future) use of the database is to help test the 'Ruddiman hypothesis': Ruddiman (2003) proposed that mankind altered the global atmosphere long before the start of the Industrial Revolution in the 18th century, which raises the research question of whether a pre-Industrial Revolution anthropogenic signal can be detected, and how strong that signal is. References: Klein Goldewijk, K., A.F. Bouwman and G. van Drecht, 2007. Mapping current global cropland and grassland distributions on a 5 by 5 minute resolution, Journal of Land Use Science, 2(3): 167-190.
Klein Goldewijk, K. and G. van Drecht, 2006. HYDE 3: Current and historical population and land cover. In: A.F. Bouwman, T. Kram and K. Klein Goldewijk (eds.), Integrated modelling of global environmental change. An overview of IMAGE 2.4. Netherlands Environmental Assessment Agency (MNP), Bilthoven, The Netherlands. Klein Goldewijk, K., 2005. Three centuries of global population growth: A spatial referenced population density database for 1700-2000, Population and Environment, 26(5): 343-367. Ruddiman, W.F., 2003. The anthropogenic greenhouse era began thousands of years ago, Climatic Change, 61(3): 261-293.
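The shift from HYDE 2's Boolean grid to HYDE 3's fractional allocation with weighting maps can be illustrated with a minimal sketch. The function and variable names here are hypothetical and the real HYDE allocation algorithms are more involved; the point is only the change of representation, from an all-or-nothing mask to a per-cell cropland fraction:

```python
import numpy as np

def allocate_cropland(total_km2, weights, cell_area_km2):
    """Spread a regional cropland total over grid cells in proportion to a
    weighting map, returning a fractional (0..1) cropland share per cell
    instead of an all-or-nothing Boolean mask."""
    share_km2 = total_km2 * weights / weights.sum()
    # cap each cell at 100% of its area (any surplus would need re-allocation)
    return np.clip(share_km2 / cell_area_km2, 0.0, 1.0)

# three cells of 10 km2 each, the last twice as attractive for cropland
fractions = allocate_cropland(20.0, np.array([1.0, 1.0, 2.0]), 10.0)
```

In HYDE 3 the weighting maps are time-dependent, so the same regional totals produce different spatial patterns in different periods.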

  1. Mapping Research in the Field of Special Education on the Island of Ireland since 2000

    ERIC Educational Resources Information Center

    Travers, Joseph; Savage, Rosie; Butler, Cathal; O'Donnell, Margaret

    2018-01-01

    This paper describes the process of building a database mapping research and policy in the field of special education on the island of Ireland from 2000 to 2013. The field of study includes special educational needs, disability and inclusion. The database contains 3188 references organised thematically and forms a source for researchers to access…

  2. MAPS: The Organization of a Spatial Database System Using Imagery, Terrain, and Map Data

    DTIC Science & Technology

    1983-06-01

    segments which share the same pixel position. Finally, in any large system, a logical partitioning of the database must be performed in order to avoid... entries for "crossover" for "theodore roosevelt memorial": entry 0; entry 1: Virginia, "northwest Washington"

  3. Mapping PDB chains to UniProtKB entries.

    PubMed

    Martin, Andrew C R

    2005-12-01

    UniProtKB/SwissProt is the main resource for detailed annotations of protein sequences. This database provides a jumping-off point to many other resources through the links it provides. Among others, these include other primary databases, secondary databases, the Gene Ontology and OMIM. While a large number of links are provided to Protein Data Bank (PDB) files, obtaining a regularly updated mapping between UniProtKB entries and PDB entries at the chain or residue level is not straightforward. In particular, there is no regularly updated resource which allows a UniProtKB/SwissProt entry to be identified for a given residue of a PDB file. We have created a completely automatically maintained database which maps PDB residues to residues in UniProtKB/SwissProt and UniProtKB/trEMBL entries. The protocol uses links from PDB to UniProtKB, from UniProtKB to PDB and a brute-force sequence scan to resolve PDB chains for which no annotated link is available. Finally the sequences from PDB and UniProtKB are aligned to obtain a residue-level mapping. The resource may be queried interactively or downloaded from http://www.bioinf.org.uk/pdbsws/.
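The final alignment step described above, turning a chain-level link into a residue-level mapping, can be sketched in a few lines. This is a simplified stand-in using Python's difflib rather than a proper sequence aligner, and the function name is illustrative, not the resource's actual API:

```python
from difflib import SequenceMatcher

def map_residues(pdb_seq, uniprot_seq):
    """Return a dict mapping 1-based PDB residue positions to 1-based
    UniProt residue positions, based on identical aligned stretches."""
    mapping = {}
    sm = SequenceMatcher(None, pdb_seq, uniprot_seq, autojunk=False)
    for a, b, size in sm.get_matching_blocks():
        for k in range(size):
            mapping[a + k + 1] = b + k + 1
    return mapping

# PDB chain missing the UniProt entry's first and last residues
mapping = map_residues("ACDEFG", "XACDEFGY")
```

A production mapping would use a real alignment algorithm (e.g. Needleman-Wunsch) and would also have to handle modified residues and gaps or insertions in PDB residue numbering.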

  4. Producing remote sensing-based emission estimates of prescribed burning in the contiguous United States for the U.S. Environmental Protection Agency 2011 National Emissions Inventory

    NASA Astrophysics Data System (ADS)

    McCarty, J. L.; Pouliot, G. A.; Soja, A. J.; Miller, M. E.; Rao, T.

    2013-12-01

    Prescribed fires in agricultural landscapes generally produce smaller burned areas than wildland fires but are important contributors to emissions impacting air quality and human health. Currently, there are a variety of available satellite-based estimates of crop residue burning, including the NOAA/NESDIS Hazard Mapping System (HMS), the Satellite Mapping Automated Reanalysis Tool for Fire Incident Reconciliation (SMARTFIRE 2), the Moderate Resolution Imaging Spectroradiometer (MODIS) Official Burned Area Product (MCD45A1), the MODIS Direct Broadcast Burned Area Product (MCD64A1), the MODIS Active Fire Product (MCD14ML), and a regionally-tuned 8-day cropland differenced Normalized Burn Ratio product for the contiguous U.S. The purpose of this NASA-funded research was to refine the regionally-tuned product, utilizing higher spatial resolution crop type data from the USDA NASS Cropland Data Layer and burned area training data from field work and high-resolution commercial satellite data, to improve the U.S. Environmental Protection Agency's (EPA) National Emissions Inventory (NEI). The final product delivered to the EPA included a detailed database of 25 different atmospheric emissions at the county level, emission distributions by crop type and seasonality, and GIS data. The resulting emission databases were shared with the U.S. EPA and regional offices, the National Wildfire Coordinating Group (NWCG) Smoke Committee, and all 48 states in the contiguous U.S., with detailed error estimations for Wyoming and Indiana and detailed analyses of results for Florida, Minnesota, North Dakota, Oklahoma, and Oregon. This work also provided opportunities to discover the different needs of federal and state partners, including the various geospatial abilities and platforms across the many users, and ways to incorporate expert air quality, policy, and land management knowledge into quantitative earth observation-based estimations of prescribed fire emissions.
Finally, this work created direct communication paths between federal and state partners and the scientists creating the remote sensing-based products, further improving the geospatial products and the understanding of air quality impacts of prescribed burning at the state, regional, and national scales.
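Burned-area emission inventories of this kind typically follow the standard bottom-up Seiler-Crutzen formulation, E = A × B × C × EF. A minimal sketch, with illustrative input values that are not taken from the study:

```python
def fire_emissions_kg(area_ha, fuel_load_t_ha, completeness, ef_g_kg):
    """Bottom-up estimate E = A * B * C * EF.
    area_ha: burned area (ha); fuel_load_t_ha: fuel load (t dry matter/ha);
    completeness: combustion completeness (0-1);
    ef_g_kg: emission factor (g of species per kg dry matter burned)."""
    dry_matter_kg = area_ha * fuel_load_t_ha * 1000.0 * completeness
    return dry_matter_kg * ef_g_kg / 1000.0

# illustrative numbers only: 100 ha of crop residue at 5 t/ha, 90% combusted
emitted = fire_emissions_kg(100.0, 5.0, 0.9, 1500.0)
```

Each of the 25 county-level species in the delivered database would correspond to its own emission factor in this formulation.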

  5. Digital geologic map of the Thirsty Canyon NW quadrangle, Nye County, Nevada

    USGS Publications Warehouse

    Minor, S.A.; Orkild, P.P.; Sargent, K.A.; Warren, R.G.; Sawyer, D.A.; Workman, J.B.

    1998-01-01

    This digital geologic map compilation presents new polygon (i.e., geologic map unit contacts), line (i.e., fault, fold axis, dike, and caldera wall), and point (i.e., structural attitude) vector data for the Thirsty Canyon NW 7 1/2' quadrangle in southern Nevada. The map database, which is at 1:24,000-scale resolution, provides geologic coverage of an area of current hydrogeologic and tectonic interest. The Thirsty Canyon NW quadrangle is located in southern Nye County about 20 km west of the Nevada Test Site (NTS) and 30 km north of the town of Beatty. The map area is underlain by extensive layers of Neogene (about 14 to 4.5 million years old [Ma]) mafic and silicic volcanic rocks that are temporally and spatially associated with transtensional tectonic deformation. Mapped volcanic features include part of a late Miocene (about 9.2 Ma) collapse caldera, a Pliocene (about 4.5 Ma) shield volcano, and two Pleistocene (about 0.3 Ma) cinder cones. Also documented are numerous normal, oblique-slip, and strike-slip faults that reflect regional transtensional deformation along the southern part of the Walker Lane belt. The Thirsty Canyon NW map provides new geologic information for modeling groundwater flow paths that may enter the map area from underground nuclear testing areas located in the NTS about 25 km to the east. The geologic map database comprises six component ArcINFO map coverages that can be accessed after decompressing and unbundling the data archive file (tcnw.tar.gz). These six coverages (tcnwpoly, tcnwflt, tcnwfold, tcnwdike, tcnwcald, and tcnwatt) are formatted here in ArcINFO EXPORT format. Bundled with this database are two PDF files for readily viewing and printing the map, accessory graphics, and a description of map units and compilation methods.
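Unbundling the archive described above is a routine step; a minimal Python sketch, assuming tcnw.tar.gz sits in the working directory (the helper name is illustrative):

```python
import tarfile

def unpack_coverages(archive_path="tcnw.tar.gz", dest="."):
    """Decompress and unbundle the data archive; the six ArcINFO EXPORT
    coverages (tcnwpoly, tcnwflt, tcnwfold, tcnwdike, tcnwcald, tcnwatt)
    and the two accompanying PDFs land in `dest`. Returns member names."""
    with tarfile.open(archive_path, "r:gz") as tar:
        tar.extractall(path=dest)
        return [m.name for m in tar.getmembers()]
```

The extracted EXPORT (.e00) files would then be imported into ArcINFO with its own import tooling.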

  6. Landslide databases for applied landslide impact research: the example of the landslide database for the Federal Republic of Germany

    NASA Astrophysics Data System (ADS)

    Damm, Bodo; Klose, Martin

    2014-05-01

    This contribution presents an initiative to develop a national landslide database for the Federal Republic of Germany. It highlights the structure and contents of the landslide database and outlines its major data sources and the strategy of information retrieval. Furthermore, the contribution exemplifies the database's potential in applied landslide impact research, including statistics of landslide damage, repair, and mitigation. Owing to systematic regional data compilation, the landslide database offers a differentiated data pool of more than 5,000 data sets and over 13,000 single data files. It dates back to 1137 AD and covers landslide sites throughout Germany. In seven main data blocks, the landslide database stores information on landslide types, dimensions, and processes, along with additional data on soil and bedrock properties, geomorphometry, and climatic or other major triggering events. A peculiarity of this landslide database is its storage of data sets on land use effects, damage impacts, hazard mitigation, and landslide costs. Compilation of landslide data is based on a two-tier strategy of data collection. The first step of information retrieval includes systematic web content mining and exploration of online archives of emergency agencies, fire and police departments, and news organizations. Using web and RSS feeds, and soon also a focused web crawler, this enables effective nationwide data collection for recent landslides. On the basis of this information, in-depth data mining is performed to deepen and diversify the data pool in key landslide areas. This makes it possible to gather detailed landslide information from, amongst others, agency records, geotechnical reports, climate statistics, maps, and satellite imagery. Landslide data are extracted from these information sources using a mix of methods, including statistical techniques, imagery analysis, and qualitative text interpretation.
The landslide database is currently being migrated to a spatial database system running on PostgreSQL/PostGIS. This provides advanced functionality for spatial data analysis and forms the basis for future data provision and visualization using a WebGIS application. Analysis of the landslide database contents shows that in most parts of Germany landslides primarily affect transportation infrastructure. Although at distinctly lower frequency, recent landslides are also recorded to cause serious damage to hydraulic facilities and waterways, supply and disposal infrastructure, sites of cultural heritage, as well as forest, agricultural, and mining areas. The main types of landslide damage are failure of cut and fill slopes; destruction of retaining walls, street lights, and forest stocks; burial of roads, backyards, and garden areas; and crack formation in foundations, sewer lines, and building walls. Landslide repair and mitigation at transportation infrastructure is dominated by simple solutions such as catch barriers or rockfall drapery. These solutions are often undersized and fail under stress. The use of costly slope stabilization or protection systems has proven to reduce these risks effectively over longer maintenance cycles. The right balancing of landslide mitigation is thus a crucial problem in managing landslide risks. Development and analysis of such landslide databases helps to support decision-makers in finding efficient solutions to minimize landslide risks for human beings, infrastructure, and financial assets.
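The per-event record structure, a core event with linked damage entries, can be pictured in heavily simplified form with a couple of dataclasses. All field names here are hypothetical illustrations, not the database's actual schema, which spans seven data blocks:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DamageRecord:
    """One damage impact tied to a landslide event (illustrative fields)."""
    target: str                      # e.g. "road", "retaining wall"
    damage_type: str                 # e.g. "cut-slope failure", "burial"
    repair_cost_eur: Optional[float] = None

@dataclass
class LandslideEvent:
    """Core record; the real database links seven data blocks per event."""
    site: str
    year: int                        # earliest entries date back to 1137 AD
    landslide_type: str
    trigger: Optional[str] = None    # climatic or other triggering event
    damages: List[DamageRecord] = field(default_factory=list)

event = LandslideEvent("example site", 1981, "rockfall", trigger="heavy rain")
event.damages.append(DamageRecord("road", "burial"))
```

In the PostgreSQL/PostGIS migration, each of these classes would map naturally to a table, with the event carrying a geometry column for spatial queries.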

  7. Geologic map of the Republic of Armenia

    USGS Publications Warehouse

    Maldonado, Florian; Castellanos, Esther S.

    2000-01-01

    This map is a product that resulted from a project by the U.S. Agency for International Development (Participating Agency Service Agreement No. CCN-0002-P-ID-3097-00) to conduct an evaluation of coal and other fossil fuels in the Republic of Armenia. The original map has been translated to English from Russian (Marlen Satian, Academy of Sciences, Armenian Institute of Geological Sciences, written commun., 1994), digitized, and slightly modified in some areas. The original format has been modified to follow the U.S. Geological Survey's format. The map projection is not known. Latitude and longitude tics are approximately located.

  8. Technology Used for Realization of the Reform in Informal Areas.

    NASA Astrophysics Data System (ADS)

    Qirko, K.

    2008-12-01

    ORGANIZATION OF STRUCTURE AND ADMINISTRATION OF ALUIZNI. Law no. 9482 of 03.03.2006, "On legalization, urban planning and integration of unauthorized buildings", entered into force on May 15, 2006. The Council of Ministers, with its decision no. 289 of 17.05.2006, established the Agency for the Legalization, Urbanization, and Integration of the Informal Zones/Buildings (ALUIZNI), with its twelve local bodies. ALUIZNI began its activity under Law no. 9482 in July 2006. The administration of this agency was completed during this period; it is composed of a General Directorate and twelve regional directorates. As of today, this institution has 300 employees. The administrative structure of ALUIZNI is organized to achieve the objectives of the reform and to solve the problems arising during its completion. The following sectors have been established to achieve these objectives: the sector of compensation of owners; the sector of cartography; the sector of geographic information system (GIS) data elaboration and Information Technology; the sector of urban planning; the sector of registration of legalized properties; and the Human Resources sector. Following this vision, digital aerial photography of the Republic of Albania is in the process of realization, from which we will receive, for the first time, an orthophoto and a digital map covering the entire territory of the country. This cartographic product will serve all government and private institutions. All other systems, such as the system of territory management, the system of property registration, the system of population registration, the system of addresses, urban planning studies and systems, and the definition of boundaries of administrative and touristic zones, will be established on the basis of this cartographic system. The cartographic product will have the parameters mentioned below, divided into lots (2.3 MEuro): 1. Lot I: the urban zone, 1,200 km2.
It will have a resolution of 8 cm/pixel and will be produced as an orthophoto and a digital vectorized map. 2. Lot II: the rural zone, 12,000 km2; an orthophoto with a resolution of 8 cm/pixel will be produced. 3. Lot III: the mountainous zone, 15,000 km2; an orthophoto with a resolution of 30 cm/pixel will be produced. All the technical documentation of the process will be produced digitally, based on the digital map, and it will constitute the main database. We have established the sector of geographic information system (GIS) data elaboration and Information Technology with the purpose of assuring transparency and correctness of the process, and of providing permanently useful information for various purposes (1.1 MEuro). GIS is a modern technology which elaborates and makes connections among different kinds of information. The main objective of this sector is the establishment of a self-declaration database, with 30 characteristics for each declaration, and a process database with 40 characteristics for each property, including cartographic, geographic, and construction data.

  9. Text Mining the Biomedical Literature

    DTIC Science & Technology

    2007-11-05

    activities, and repeating past mistakes, or 3) agencies not participating in joint efforts that would fully exploit each agency’s strengths...research and joint projects (multi- department, multi-agency, multi-national, and government-industry) appropriate? • Is the balance among single...overall database taxonomy, i.e., are there any concepts missing from any of the databases, and even if not, do all the concepts bear the same

  10. pseudoMap: an innovative and comprehensive resource for identification of siRNA-mediated mechanisms in human transcribed pseudogenes.

    PubMed

    Chan, Wen-Ling; Yang, Wen-Kuang; Huang, Hsien-Da; Chang, Jan-Gowth

    2013-01-01

    RNA interference (RNAi) is a gene silencing process within living cells, which is controlled by the RNA-induced silencing complex in a sequence-specific manner. In flies and mice, pseudogene transcripts can be processed into short interfering RNAs (siRNAs) that regulate protein-coding genes through the RNAi pathway. Following these findings, we constructed an innovative and comprehensive database to elucidate siRNA-mediated mechanisms in human transcribed pseudogenes (TPGs). To investigate TPGs producing siRNAs that regulate protein-coding genes, we mapped the TPGs to small RNAs (sRNAs) supported by publicly available deep sequencing data from various sRNA libraries and constructed the TPG-derived siRNA-target interactions. In addition, we also show that TPGs can act as targets for miRNAs that actually regulate the parental gene. To enable the systematic compilation and updating of these results and additional information, we have developed a database, pseudoMap, capturing various types of information, including sequence data, TPG and cognate annotation, deep sequencing data, RNA-folding structure, gene expression profiles, miRNA annotation and target prediction. To our knowledge, pseudoMap is the first database to demonstrate two mechanisms of human TPGs: encoding siRNAs and decoying miRNAs that target the parental gene. pseudoMap is freely accessible at http://pseudomap.mbc.nctu.edu.tw/. Database URL: http://pseudomap.mbc.nctu.edu.tw/
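Mapping sRNA reads onto a transcribed pseudogene sequence, the first step in building the TPG-derived siRNA-target interactions, reduces in its simplest form to locating read matches. A toy exact-match sketch (real pipelines use dedicated short-read aligners and tolerate mismatches; the function name and sequences are invented for illustration):

```python
def map_srnas_to_tpg(tpg_seq, srna_reads):
    """Exact-match mapping of short RNA reads onto a transcribed pseudogene
    sequence; returns read -> list of 0-based match positions."""
    hits = {}
    for read in srna_reads:
        positions, start = [], tpg_seq.find(read)
        while start != -1:
            positions.append(start)
            start = tpg_seq.find(read, start + 1)
        if positions:
            hits[read] = positions
    return hits

hits = map_srnas_to_tpg("AUGGCUAGCUAGGCU", ["GCUAG", "UUUU"])
```

Reads with deep-sequencing support that map within a TPG are then candidate TPG-derived siRNAs to test against protein-coding targets.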

  11. Improvements in Dynamic GPS Positions Using Track Averaging

    DTIC Science & Technology

    1999-08-01

    Imagery and Mapping Agency 2 NIMA/SOEMD, Mail Stop D-85 Attn: Larry Kunz Kenneth Croisant 4600 Sangamore Road Bethesda, MD 20816 -5003 6. National...Imagery and Mapping Agency NIMA/ARR, Mail Stop D-82 Attn: William Wooden 4600 Sangamore Road Bethesda, MD 20816 -5003 7. Michael J. Full USAF SMC/CZD 2435

  12. 75 FR 10552 - Noise Exposure Map Notice for Chandler Municipal Airport, Chandler, AZ

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-08

    ... shown in Table 6.6. Flight tracks for the existing and the five-year Noise Exposure Maps are found in... from the ultimate land use control and planning responsibilities of local government. These local... those public agencies and planning agencies with which consultation is required under section 47503 of...

  13. Vegetation and terrain mapping in Alaska using Landsat MSS and digital terrain data

    USGS Publications Warehouse

    Shasby, Mark; Carneggie, David M.

    1986-01-01

    During the past 5 years, the U.S. Geological Survey's (USGS) Earth Resources Observation Systems (EROS) Data Center Field Office in Anchorage, Alaska has worked cooperatively with Federal and State resource management agencies to produce land-cover and terrain maps for 245 million acres of Alaska. The need for current land-cover information in Alaska comes principally from the mandates of the Alaska National Interest Lands Conservation Act (ANILCA), December 1980, which requires major land management agencies to prepare comprehensive management plans. The land-cover mapping projects integrate digital Landsat data, terrain data, aerial photographs, and field data. The resultant land-cover and terrain maps and associated data bases are used for resource assessment, management, and planning by many Alaskan agencies including the U.S. Fish and Wildlife Service, U.S. Forest Service, Bureau of Land Management, and Alaska Department of Natural Resources. Applications addressed through use of the digital land-cover and terrain data bases range from comprehensive refuge planning to multiphased sampling procedures designed to inventory vegetation statewide. The land-cover mapping programs in Alaska demonstrate the operational utility of digital Landsat data and have resulted in a new land-cover mapping program by the USGS National Mapping Division to compile 1:250,000-scale land-cover maps in Alaska using a common statewide land-cover map legend.

  14. Publications of the Western Geologic Mapping Team 1997-1998

    USGS Publications Warehouse

    Stone, Paul; Powell, C.L.

    1999-01-01

    The Western Geologic Mapping Team (WGMT) of the U.S. Geological Survey, Geologic Division (USGS, GD), conducts geologic mapping and related topical earth-science studies in the western United States. This work is focused on areas where modern geologic maps and associated earth-science data are needed to address key societal and environmental issues such as ground-water quality, potential geologic hazards, and land-use decisions. Areas of primary emphasis currently include southern California, the San Francisco Bay region, the Pacific Northwest, the Las Vegas urban corridor, and selected National Park lands. The team has its headquarters in Menlo Park, California, and maintains smaller field offices at several other locations in the western United States. The results of research conducted by the WGMT are released to the public as a variety of databases, maps, text reports, and abstracts, both through the internal publication system of the USGS and in diverse external publications such as scientific journals and books. This report lists publications of the WGMT released in calendar years 1997 and 1998. Most of the publications listed were authored or coauthored by WGMT staff. However, the list also includes some publications authored by formal non-USGS cooperators with the WGMT, as well as some authored by USGS staff outside the WGMT in cooperation with WGMT projects. Several of the publications listed are available on the World Wide Web; for these, URL addresses are provided. Most of these Web publications are USGS open-file reports that contain large digital databases of geologic map and related information. For these, the bibliographic citation refers specifically to an explanatory pamphlet containing information about the content and accessibility of the database, not to the actual map or related information comprising the database itself.

  15. Geologic map of the Reyes Peak quadrangle, Ventura County, California

    USGS Publications Warehouse

    Minor, Scott A.

    2004-01-01

    New 1:24,000-scale geologic mapping in the Cuyama 30' x 60' quadrangle, in support of the USGS Southern California Areal Mapping Project (SCAMP), is contributing to a more complete understanding of the stratigraphy, structure, and tectonic evolution of the complex junction area between the NW-trending Coast Ranges and EW-trending western Transverse Ranges. The 1:24,000-scale geologic map of the Reyes Peak quadrangle, located in the eastern part of the Cuyama map area, is the final of six contiguous 7.5' quadrangle geologic maps compiled for a more detailed portrayal and reevaluation of geologic structures and rock units shown on previous maps of the region (Carman, 1964; Dibblee, 1972; Vedder and others, 1973). SCAMP digital geologic maps of the five other contiguous quadrangles have recently been published (Minor, 1999; Kellogg, 1999, 2003; Stone and Cossette, 2000; Kellogg and Miggins, 2002). This digital compilation presents a new geologic map database for the Reyes Peak 7.5' quadrangle, which is located in southern California about 75 km northwest of Los Angeles. The map database is at 1:24,000-scale resolution.

  16. Preliminary geologic map of the Fontana 7.5' quadrangle, Riverside and San Bernardino Counties, California

    USGS Publications Warehouse

    Morton, Douglas M.; Digital preparation by Bovard, Kelly R.

    2003-01-01

    Open-File Report 03-418 is a digital geologic data set that maps and describes the geology of the Fontana 7.5' quadrangle, Riverside and San Bernardino Counties, California. The Fontana quadrangle database is one of several 7.5' quadrangle databases that are being produced by the Southern California Areal Mapping Project (SCAMP). These maps and databases are, in turn, part of the nation-wide digital geologic map coverage being developed by the National Cooperative Geologic Map Program of the U.S. Geological Survey (USGS). Open-File Report 03-418 contains a digital geologic map database of the Fontana 7.5' quadrangle, Riverside and San Bernardino Counties, California, that includes: 1. ARC/INFO (Environmental Systems Research Institute, http://www.esri.com) version 7.2.1 coverages of the various elements of the geologic map. 2. A PostScript file (fon_map.ps) to plot the geologic map on a topographic base, containing a Correlation of Map Units diagram (CMU), a Description of Map Units (DMU), and an index map. 3. An Encapsulated PostScript (EPS) file (fon_grey.eps), created in Adobe Illustrator 10.0, to plot the geologic map on a grey topographic base, containing a Correlation of Map Units (CMU), a Description of Map Units (DMU), and an index map. 4. Portable Document Format (.pdf) files of: (a) the Readme file, which includes in Appendix I the data contained in fon_met.txt; and (b) the same graphics as plotted in 2 and 3 above. Test plots have not produced precise 1:24,000-scale map sheets; the Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units are in the editorial format of USGS Geologic Investigations Series (I-series) maps but have not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology.
Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (4b above) or plotting the postscript files (2 or 3 above).
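The grain-size suffix convention described above lends itself to simple decoding. A short sketch, where the suffix table is taken directly from the report's letter definitions and the function name is illustrative (compound symbols with digits, such as Qyf2sc, would need the digit stripped first):

```python
GRAIN_SUFFIXES = {
    "lg": "large boulders", "b": "boulder", "g": "gravel",
    "a": "arenaceous", "s": "silt", "c": "clay",
}

def grain_sizes(suffix):
    """Decode the grain-size letters appended to a map-unit symbol,
    e.g. the "a" in Qyfa or the "sa" in Qfysa (silty sand)."""
    sizes, i = [], 0
    while i < len(suffix):
        if suffix[i:i + 2] == "lg":       # two-letter code checked first
            sizes.append(GRAIN_SUFFIXES["lg"])
            i += 2
        else:
            sizes.append(GRAIN_SUFFIXES[suffix[i]])
            i += 1
    return sizes
```

Splitting a full symbol like "Qyfa" into base unit and suffix requires knowing the set of base unit symbols, which the map's Description of Map Units provides.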

  17. NASA Enterprise Architecture and Its Use in Transition of Research Results to Operations

    NASA Astrophysics Data System (ADS)

    Frisbie, T. E.; Hall, C. M.

    2006-12-01

    Enterprise architecture describes the design of the components of an enterprise, their relationships and how they support the objectives of that enterprise. NASA Stennis Space Center leads several projects involving enterprise architecture tools used to gather information on research assets within NASA's Earth Science Division. In the near future, enterprise architecture tools will link and display the relevant requirements, parameters, observatories, models, decision systems, and benefit/impact information relationships and map to the Federal Enterprise Architecture Reference Models. Components configured within the enterprise architecture serving the NASA Applied Sciences Program include the Earth Science Components Knowledge Base, the Systems Components database, and the Earth Science Architecture Tool. The Earth Science Components Knowledge Base systematically catalogues NASA missions, sensors, models, data products, model products, and network partners appropriate for consideration in NASA Earth Science applications projects. The Systems Components database is a centralized information warehouse of NASA's Earth Science research assets and a critical first link in the implementation of enterprise architecture. The Earth Science Architecture Tool is used to analyze potential NASA candidate systems that may be beneficial to decision-making capabilities of other Federal agencies. Use of the current configuration of NASA enterprise architecture (the Earth Science Components Knowledge Base, the Systems Components database, and the Earth Science Architecture Tool) has far exceeded its original intent and has tremendous potential for the transition of research results to operational entities.

  18. Database on unstable rock slopes in Norway

    NASA Astrophysics Data System (ADS)

    Oppikofer, Thierry; Nordahl, Bo; Bunkholt, Halvor; Nicolaisen, Magnus; Hermanns, Reginald L.; Böhme, Martina; Yugsi Molina, Freddy X.

    2014-05-01

    Several large rockslides have occurred in historic times in Norway, causing many casualties. Most of these casualties are due to displacement waves triggered by a rock avalanche, affecting the coastlines of entire lakes and fjords. The Geological Survey of Norway performs systematic mapping of unstable rock slopes in Norway and has so far detected more than 230 unstable slopes with significant postglacial deformation. This systematic mapping aims to detect future rock avalanches before they occur. The registered unstable rock slopes are stored in a database on unstable rock slopes developed and maintained by the Geological Survey of Norway. The main aims of this database are (1) to serve as a national archive for unstable rock slopes in Norway; (2) to support data collection and storage during field mapping; (3) to provide decision-makers with hazard zones and other necessary information on unstable rock slopes for land-use planning and mitigation; and (4) to inform the public through an online map service. The database is organized hierarchically, with a main point for each unstable rock slope to which several feature classes and tables are linked. This main point feature class includes several general attributes of the unstable rock slopes, such as site name, general and geological descriptions, executed works, recommendations, technical parameters (volume, lithology, mechanism and others), displacement rates, possible consequences, and hazard and risk classification. Feature classes and tables linked to the main feature class include the run-out area, the area affected by secondary effects, the hazard and risk classification, subareas and scenarios of an unstable rock slope, field observation points, displacement measurement stations, URL links for further documentation, and references. The database on unstable rock slopes in Norway will be publicly consultable through the online map service on www.skrednett.no in 2014.
Only publicly relevant parts of the database will be shown in the online map service (e.g. processed results of displacement measurements), while more detailed data will not (e.g. raw data of displacement measurements). Factsheets with key information on unstable rock slopes can be automatically generated and downloaded for each site, a municipality, a county or the entire country. Selected data will also be downloadable free of charge. The present database on unstable rock slopes in Norway will further evolve in the coming years as the systematic mapping conducted by the Geological Survey of Norway progresses and as available techniques and tools evolve.
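The hierarchical layout described above (one main record per unstable rock slope, with linked feature classes and tables) can be sketched as a small relational schema. This is a minimal illustration only; all table and column names below are hypothetical, not the Geological Survey of Norway's actual schema:

```python
import sqlite3

# Hypothetical relational sketch of the hierarchical layout described above:
# one main record per unstable rock slope, with linked child tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE unstable_slope (
    slope_id     INTEGER PRIMARY KEY,
    site_name    TEXT NOT NULL,
    lithology    TEXT,
    volume_m3    REAL,
    mechanism    TEXT,
    hazard_class TEXT
);
CREATE TABLE displacement_station (
    station_id       INTEGER PRIMARY KEY,
    slope_id         INTEGER REFERENCES unstable_slope(slope_id),
    rate_mm_per_year REAL
);
CREATE TABLE scenario (
    scenario_id INTEGER PRIMARY KEY,
    slope_id    INTEGER REFERENCES unstable_slope(slope_id),
    description TEXT
);
""")
cur.execute("INSERT INTO unstable_slope VALUES "
            "(1, 'Example fjord slope', 'gneiss', 2.5e6, 'planar sliding', 'high')")
cur.execute("INSERT INTO displacement_station VALUES (1, 1, 4.2)")

# A factsheet-style query: each slope with its mean displacement rate.
row = cur.execute("""
    SELECT s.site_name, AVG(d.rate_mm_per_year)
    FROM unstable_slope s JOIN displacement_station d USING (slope_id)
    GROUP BY s.slope_id
""").fetchone()
print(row)  # ('Example fjord slope', 4.2)
```

The factsheet generation mentioned in the abstract would amount to joins of this kind, aggregated per site, municipality, county, or country.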

  19. Online bibliographic sources in hydrology

    USGS Publications Warehouse

    Wild, Emily C.; Havener, W. Michael

    2001-01-01

    Traditional commercial bibliographic databases and indexes provide some access to hydrology materials produced by the government; however, these sources do not provide comprehensive coverage of relevant hydrologic publications. This paper discusses bibliographic information available from the federal government and state geological surveys, water resources agencies, and depositories. In addition to information in these databases, the paper describes the scope, styles of citing, subject terminology, and the ways these information sources are currently being searched, formally and informally, by hydrologists. Information available from the federal and state agencies and from the state depositories might be missed by limiting searches to commercially distributed databases.

  20. An intermediary's perspective of online databases for local governments

    NASA Technical Reports Server (NTRS)

    Jack, R. F.

    1984-01-01

    Numerous public administration studies have indicated that local government agencies for a variety of reasons lack access to comprehensive information resources; furthermore, such entities are often unwilling or unable to share information regarding their own problem-solving innovations. The NASA/University of Kentucky Technology Applications Program devotes a considerable effort to providing scientific and technical information and assistance to local agencies, relying on its access to over 500 distinct online databases offered by 20 hosts. The author presents a subjective assessment, based on his own experiences, of several databases which may prove useful in obtaining information for this particular end-user community.

  1. 77 FR 39269 - Submission for OMB Review, Comment Request, Proposed Collection: IMLS Museum Web Database...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-02

    ..., Proposed Collection: IMLS Museum Web Database: MuseumsCount.gov AGENCY: Institute of Museum and Library... general public. Information such as name, address, phone, email, Web site, staff size, program details... Museum Web Database: MuseumsCount.gov collection. The 60-day notice for the IMLS Museum Web Database...

  2. 78 FR 2363 - Notification of Deletion of a System of Records; Automated Trust Funds Database

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-11

    ... Database AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice of deletion of a system... establishing the Automated Trust Funds (ATF) database system of records. The Federal Information Security... Integrity Act of 1982, Public Law 97-255, provided authority for the system. The ATF database has been...

  3. Monitoring the Mesoamerican Biological Corridor: A NASA/CCAD Cooperative Research Project

    NASA Technical Reports Server (NTRS)

    Sever, Thomas; Irwin, Daniel; Sader, Steven A.; Saatchi, Sassan

    2004-01-01

To foster scientific cooperation under a Memorandum of Understanding between NASA and the Central American countries, the research project developed regional databases to monitor forest condition and environmental change throughout the region. Of particular interest is the Mesoamerican Biological Corridor (MBC), a chain of protected areas and proposed conservation areas that will link segments of natural habitats in Central America from the borders of northern Colombia to southern Mexico. The first and second years of the project focused on the development of regional satellite databases (JERS-1, MODIS, and Landsat TM), training of Central American cooperators, and forest cover and change analysis. The three regional satellite mosaics were developed and distributed on CD-ROM to cooperators and regional outlets. Four regional remote sensing training courses were conducted in three countries, with participants from all seven Central American countries and Mexico. In year 3, a regional forest change assessment in reference to the Mesoamerican Biological Corridor was completed, and land cover maps (from Landsat TM) were developed for seven Landsat scenes and their accuracy assessed. These maps are being used to support validation of MODIS forest/non-forest maps and to examine forest fragmentation and forest cover change in selected study sites. A no-cost time extension (2003-2004) allowed the completion of an M.S. thesis by a Costa Rican student and the preparation of manuscripts for future submission to peer-reviewed outlets. Proposals initiated at the end of the project have generated external funding from the U.S. Forest Service (to U. Maine), NASA-ESSF (Oregon State U.), and USAID and EPA (to NASA-MSFC-GHCC) to test MODIS capabilities to detect forest change, conduct a literature review on biomass estimation and carbon stocks, and develop a regional remote sensing monitoring center in Central America. The success of the project has led to continued cooperation between NASA, other federal agencies, and scientists from all seven Central American countries (see the SERVIR web site for this ongoing work: servir.nsstc.nasa.gov).

  4. Enhanced digital mapping project : final report

    DOT National Transportation Integrated Search

    2004-11-19

    The Enhanced Digital Map Project (EDMap) was a three-year effort launched in April 2001 to develop a range of digital map database enhancements that enable or improve the performance of driver assistance systems currently under development or conside...

  5. Recently active traces of the Bartlett Springs Fault, California: a digital database

    USGS Publications Warehouse

    Lienkaemper, James J.

    2010-01-01

The purpose of this map is to show the location of, and evidence for, recent movement on active fault traces within the Bartlett Springs Fault Zone, California. The location and recency of the mapped traces are primarily based on geomorphic expression of the fault as interpreted from large-scale aerial photography. In a few places, evidence of fault creep and offset Holocene strata in trenches and natural exposures has confirmed the activity of some of these traces. This publication is formatted both as a digital database for use within a geographic information system (GIS) and, for broader public access, as map images that may be browsed online or downloaded as a summary map. The report text describes the types of scientific observations used to make the map, gives references pertaining to the fault and the evidence of faulting, and provides guidance on the use and limitations of the map.

  6. Mars Global Digital Dune Database: MC2-MC29

    USGS Publications Warehouse

    Hayward, Rosalyn K.; Mullins, Kevin F.; Fenton, L.K.; Hare, T.M.; Titus, T.N.; Bourke, M.C.; Colaprete, Anthony; Christensen, P.R.

    2007-01-01

Introduction: The Mars Global Digital Dune Database presents data and describes the methodology used in creating the database. The database provides a comprehensive and quantitative view of the geographic distribution of moderate- to large-size dune fields from 65° N to 65° S latitude and encompasses ~550 dune fields. The database will be expanded to cover the entire planet in later versions. Although we have attempted to include all dune fields between 65° N and 65° S, some have likely been excluded for two reasons: (1) incomplete THEMIS IR (daytime) coverage may have caused us to exclude some moderate- to large-size dune fields, or (2) the resolution of THEMIS IR coverage (100 m/pixel) certainly caused us to exclude smaller dune fields. The smallest dune fields in the database are ~1 km² in area. While the moderate to large dune fields are likely to constitute the largest compilation of sediment on the planet, smaller stores of dune sediment are likely to be found elsewhere in higher-resolution data. Thus, our database excludes all small dune fields and some moderate to large dune fields as well; the absence of mapped dune fields does not mean that such dune fields do not exist, and is not intended to imply a lack of saltating sand in other areas. Where the availability and quality of THEMIS visible (VIS) or Mars Orbiter Camera narrow angle (MOC NA) images allowed, we classified dunes and included dune slipface measurements, which were derived from gross dune morphology and represent the prevailing wind direction at the last time of significant dune modification. For dunes located within craters, the azimuth from crater centroid to dune field centroid was calculated. Output from a general circulation model (GCM) is also included.
In addition to polygons locating dune fields, the database includes over 1,800 selected Thermal Emission Imaging System (THEMIS) infrared (IR), THEMIS visible (VIS), and Mars Orbiter Camera Narrow Angle (MOC NA) images that were used to build the database. The database is presented in a variety of formats. It is presented as a series of ArcReader projects, which can be opened using the free ArcReader software. The latest version of ArcReader can be downloaded at http://www.esri.com/software/arcgis/arcreader/download.html. The database is also presented in ArcMap projects. The ArcMap projects allow fuller use of the data, but require ESRI ArcMap software. Multiple projects were required to accommodate the large number of images needed. A fuller description of the projects can be found in the Dunes_ReadMe file and the ReadMe_GIS file in the Documentation folder. For users who prefer to create their own projects, the data are available in ESRI shapefile and geodatabase formats, as well as the open Geography Markup Language (GML) format. A printable map of the dunes and craters in the database is available as a Portable Document Format (PDF) document. The map is also included as a JPEG file. ReadMe files are available in PDF and ASCII (.txt) formats. Tables are available in both Excel (.xls) and ASCII formats.
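The crater-centroid-to-dune-centroid azimuth mentioned above is a bearing computation between two points. The database's exact formula is not given; a minimal sketch using the common initial great-circle bearing is:

```python
import math

def azimuth_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 (e.g. a crater centroid)
    to point 2 (e.g. a dune-field centroid), in degrees clockwise from
    north. A standard formula, assumed here for illustration; not
    necessarily the one used by the database authors."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

# A dune field due east of the crater centre gives a bearing of 90 degrees.
print(round(azimuth_deg(0.0, 10.0, 0.0, 11.0), 1))  # → 90.0
```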

  7. What is an evidence map? A systematic review of published evidence maps and their definitions, methods, and products.

    PubMed

    Miake-Lye, Isomi M; Hempel, Susanne; Shanman, Roberta; Shekelle, Paul G

    2016-02-10

    The need for systematic methods for reviewing evidence is continuously increasing. Evidence mapping is one emerging method. There are no authoritative recommendations for what constitutes an evidence map or what methods should be used, and anecdotal evidence suggests heterogeneity in both. Our objectives are to identify published evidence maps and to compare and contrast the presented definitions of evidence mapping, the domains used to classify data in evidence maps, and the form the evidence map takes. We conducted a systematic review of publications that presented results with a process termed "evidence mapping" or included a figure called an "evidence map." We identified publications from searches of ten databases through 8/21/2015, reference mining, and consulting topic experts. We abstracted the research question, the unit of analysis, the search methods and search period covered, and the country of origin. Data were narratively synthesized. Thirty-nine publications met inclusion criteria. Published evidence maps varied in their definition and the form of the evidence map. Of the 31 definitions provided, 67 % described the purpose as identification of gaps and 58 % referenced a stakeholder engagement process or user-friendly product. All evidence maps explicitly used a systematic approach to evidence synthesis. Twenty-six publications referred to a figure or table explicitly called an "evidence map," eight referred to an online database as the evidence map, and five stated they used a mapping methodology but did not present a visual depiction of the evidence. The principal conclusion of our evaluation of studies that call themselves "evidence maps" is that the implied definition of what constitutes an evidence map is a systematic search of a broad field to identify gaps in knowledge and/or future research needs that presents results in a user-friendly format, often a visual figure or graph, or a searchable database. 
Foundational work is needed to better standardize the methods and products of an evidence map so that researchers and policymakers will know what to expect of this new type of evidence review. Although an a priori protocol was developed, no registration was completed; this review did not fit the PROSPERO format.

  8. Interagency collaboration models for people with mental ill health in contact with the police: a systematic scoping review

    PubMed Central

    Scantlebury, Arabella; Booth, Alison; MacBryde, Jillian Catherine; Scott, William J; Wright, Kath

    2018-01-01

    Objective To identify existing evidence on interagency collaboration between law enforcement, emergency services, statutory services and third sector agencies regarding people with mental ill health. Design Systematic scoping review. Scoping reviews map particular research areas to identify research gaps. Data sources and eligibility ASSIA, CENTRAL, the Cochrane Library databases, Criminal Justice Abstracts, ERIC, Embase, MEDLINE, PsycINFO, PROSPERO and Social Care Online and Social Sciences Citation Index were searched up to 2017, as were grey literature and hand searches. Eligible articles were empirical evaluations or descriptions of models of interagency collaboration between the police and other agencies. Study appraisal and synthesis Screening and data extraction were undertaken independently by two researchers. Arksey’s framework was used to collate and map included studies. Results One hundred and twenty-five studies were included. The majority of articles were of descriptions of models (28%), mixed methods evaluations of models (18%) and single service evaluations (14%). The most frequently reported outcomes (52%) were ‘organisational or service level outcomes’ (eg, arrest rates). Most articles (53%) focused on adults with mental ill health, whereas others focused on adult offenders with mental ill health (17.4%). Thirteen models of interagency collaboration were described, each involving between 2 and 13 agencies. Frequently reported models were ‘prearrest diversion’ of people with mental ill health (34%), ‘coresponse’ involving joint response by police officers paired with mental health professionals (28.6%) and ‘jail diversion’ following arrest (23.8%). Conclusions We identified 13 different interagency collaboration models catering for a range of mental health-related interactions. All but one of these models involved the police and mental health services or professionals. 
Several models have sufficient literature to warrant full systematic reviews of their effectiveness, whereas others need robust evaluation, by randomised controlled trial where appropriate. Future evaluations should focus on health-related outcomes and the impact on key stakeholders. PMID:29588323

  9. Interagency collaboration models for people with mental ill health in contact with the police: a systematic scoping review.

    PubMed

    Parker, Adwoa; Scantlebury, Arabella; Booth, Alison; MacBryde, Jillian Catherine; Scott, William J; Wright, Kath; McDaid, Catriona

    2018-03-27

    To identify existing evidence on interagency collaboration between law enforcement, emergency services, statutory services and third sector agencies regarding people with mental ill health. Systematic scoping review. Scoping reviews map particular research areas to identify research gaps. ASSIA, CENTRAL, the Cochrane Library databases, Criminal Justice Abstracts, ERIC, Embase, MEDLINE, PsycINFO, PROSPERO and Social Care Online and Social Sciences Citation Index were searched up to 2017, as were grey literature and hand searches. Eligible articles were empirical evaluations or descriptions of models of interagency collaboration between the police and other agencies. Screening and data extraction were undertaken independently by two researchers. Arksey's framework was used to collate and map included studies. One hundred and twenty-five studies were included. The majority of articles were of descriptions of models (28%), mixed methods evaluations of models (18%) and single service evaluations (14%). The most frequently reported outcomes (52%) were 'organisational or service level outcomes' (eg, arrest rates). Most articles (53%) focused on adults with mental ill health, whereas others focused on adult offenders with mental ill health (17.4%). Thirteen models of interagency collaboration were described, each involving between 2 and 13 agencies. Frequently reported models were 'prearrest diversion' of people with mental ill health (34%), 'coresponse' involving joint response by police officers paired with mental health professionals (28.6%) and 'jail diversion' following arrest (23.8%). We identified 13 different interagency collaboration models catering for a range of mental health-related interactions. All but one of these models involved the police and mental health services or professionals. 
Several models have sufficient literature to warrant full systematic reviews of their effectiveness, whereas others need robust evaluation, by randomised controlled trial where appropriate. Future evaluations should focus on health-related outcomes and the impact on key stakeholders. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  10. 49 CFR 535.8 - Reporting requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... must submit information electronically through the EPA database system as the single point of entry for... agencies are not prepared to receive information through the EPA database system, manufacturers are... applications for certificates of conformity in accordance through the EPA database including both GHG emissions...

  11. 49 CFR 535.8 - Reporting requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... information. (2) Manufacturers must submit information electronically through the EPA database system as the... year 2012 the agencies are not prepared to receive information through the EPA database system... applications for certificates of conformity in accordance through the EPA database including both GHG emissions...

  12. 49 CFR 535.8 - Reporting requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... must submit information electronically through the EPA database system as the single point of entry for... agencies are not prepared to receive information through the EPA database system, manufacturers are... applications for certificates of conformity in accordance through the EPA database including both GHG emissions...

  13. 49 CFR 535.8 - Reporting requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... must submit information electronically through the EPA database system as the single point of entry for... agencies are not prepared to receive information through the EPA database system, manufacturers are... applications for certificates of conformity in accordance through the EPA database including both GHG emissions...

  14. A web-based system architecture for ontology-based data integration in the domain of IT benchmarking

    NASA Astrophysics Data System (ADS)

    Pfaff, Matthias; Krcmar, Helmut

    2018-03-01

    In the domain of IT benchmarking (ITBM), a variety of data and information are collected. Although these data serve as the basis for business analyses, no unified semantic representation of such data yet exists. Consequently, data analysis across different distributed data sets and different benchmarks is almost impossible. This paper presents a system architecture and prototypical implementation for an integrated data management of distributed databases based on a domain-specific ontology. To preserve the semantic meaning of the data, the ITBM ontology is linked to data sources and functions as the central concept for database access. Thus, additional databases can be integrated by linking them to this domain-specific ontology and are directly available for further business analyses. Moreover, the web-based system supports the process of mapping ontology concepts to external databases by introducing a semi-automatic mapping recommender and by visualizing possible mapping candidates. The system also provides a natural language interface to easily query linked databases. The expected result of this ontology-based approach of knowledge representation and data access is an increase in knowledge and data sharing in this domain, which will enhance existing business analysis methods.
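The semi-automatic mapping recommender is described only at a high level. As an illustration of the idea, candidate database columns for an ontology concept can be ranked by simple label similarity; the column names and the similarity measure below are assumptions, not the ITBM system's actual method:

```python
from difflib import SequenceMatcher

def mapping_candidates(concept, columns, threshold=0.5):
    """Rank database columns as mapping candidates for an ontology concept
    by plain label similarity -- a stand-in for the semi-automatic
    recommender described in the abstract (whose actual method is not given)."""
    scored = [(col, SequenceMatcher(None, concept.lower(), col.lower()).ratio())
              for col in columns]
    # Keep candidates above the threshold, best match first.
    return sorted([(c, round(s, 2)) for c, s in scored if s >= threshold],
                  key=lambda cs: cs[1], reverse=True)

# Hypothetical benchmarking-database columns:
columns = ["server_cost", "maintenance_cost", "staff_count", "srv_cost_total"]
print(mapping_candidates("ServerCost", columns))
```

The visualization of candidates described in the paper would then present such a ranked list to the user for confirmation before the mapping is stored.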

  15. Geologic and geophysical maps of the eastern three-fourths of the Cambria 30' x 60' quadrangle, central California Coast Ranges

    USGS Publications Warehouse

    Graymer, R.W.; Langenheim, V.E.; Roberts, M.A.; McDougall, Kristin

    2014-01-01

The Cambria 30′ x 60′ quadrangle comprises southwestern Monterey County and northwestern San Luis Obispo County. The land area includes rugged mountains of the Santa Lucia Range, extending from the northwest to the southeast part of the map; the southern part of the Big Sur coast in the northwest; broad marine terraces along the southwest coast; and broad valleys, rolling hills, and modest mountains in the northeast. This report contains geologic, gravity anomaly, and aeromagnetic anomaly maps of the eastern three-fourths of the 1:100,000-scale Cambria quadrangle and the associated geologic and geophysical databases (ArcMap databases), as well as complete descriptions of the geologic map units and the structural relations in the mapped area. A cross section is based on both the geologic map and potential-field geophysical data. The maps are presented as an interactive, multilayer PDF rather than as more traditional preformatted map-sheet PDFs. Various geologic, geophysical, paleontological, and base-map elements are placed on separate layers, which allows the user to combine elements interactively to create map views beyond the traditional map sheets. Four traditional map sheets (geologic map, gravity map, aeromagnetic map, paleontological locality map) are easily compiled by choosing the associated data layers or by choosing the desired map under Bookmarks.

  16. 7 CFR 1.7 - Agency response to requests for records.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Archives and Records Administration (“NARA”), the agency shall inform the requester of this fact and shall...) Database at http://www/nara.gov/nara.nail.html, or by calling NARA at (301) 713-6800. If the agency has no...

  17. ReactionMap: an efficient atom-mapping algorithm for chemical reactions.

    PubMed

    Fooshee, David; Andronico, Alessio; Baldi, Pierre

    2013-11-25

    Large databases of chemical reactions provide new data-mining opportunities and challenges. Key challenges result from the imperfect quality of the data and the fact that many of these reactions are not properly balanced or atom-mapped. Here, we describe ReactionMap, an efficient atom-mapping algorithm. Our approach uses a combination of maximum common chemical subgraph search and minimization of an assignment cost function derived empirically from training data. We use a set of over 259,000 balanced atom-mapped reactions from the SPRESI commercial database to train the system, and we validate it on random sets of 1000 and 17,996 reactions sampled from this pool. These large test sets represent a broad range of chemical reaction types, and ReactionMap correctly maps about 99% of the atoms and about 96% of the reactions, with a mean time per mapping of 2 s. Most correctly mapped reactions are mapped with high confidence. Mapping accuracy compares favorably with ChemAxon's AutoMapper, versions 5 and 6.1, and the DREAM Web tool. These approaches correctly map 60.7%, 86.5%, and 90.3% of the reactions, respectively, on the same data set. A ReactionMap server is available on the ChemDB Web portal at http://cdb.ics.uci.edu .
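The combination of subgraph matching and assignment-cost minimization can be illustrated with a deliberately tiny sketch. This is not the ReactionMap algorithm itself (which combines maximum common subgraph search with an empirically trained cost function and scales to full molecules); the atom descriptors, costs, and brute-force search below are illustrative assumptions:

```python
from itertools import permutations

def map_atoms(reactant, product):
    """Toy illustration of atom mapping as assignment-cost minimization.
    Each molecule is a list of (element, neighbor_count) tuples; we try
    every assignment of reactant atoms to product atoms and keep the
    cheapest. Brute force, so only sensible for a handful of atoms."""
    def cost(r_atom, p_atom):
        elem_penalty = 0 if r_atom[0] == p_atom[0] else 10  # element mismatch is expensive
        return elem_penalty + abs(r_atom[1] - p_atom[1])    # a degree change is cheap
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(len(product))):
        total = sum(cost(reactant[i], product[j]) for i, j in enumerate(perm))
        if total < best_cost:
            best_perm, best_cost = perm, total
    return best_perm, best_cost

# Hydration-like toy case: every atom keeps its element, one C changes degree.
reactant = [("C", 3), ("C", 3), ("O", 1)]
product  = [("O", 2), ("C", 4), ("C", 3)]
print(map_atoms(reactant, product))
```

Even in this toy form, the oxygen maps to the oxygen because the element-mismatch penalty dominates the degree-change cost, which is the intuition behind weighting the assignment cost function empirically.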

  18. A Framework for Mapping User-Designed Forms to Relational Databases

    ERIC Educational Resources Information Center

    Khare, Ritu

    2011-01-01

    In the quest for database usability, several applications enable users to design custom forms using a graphical interface, and forward engineer the forms into new databases. The path-breaking aspect of such applications is that users are completely shielded from the technicalities of database creation. Despite this innovation, the process of…

  19. Alaska Interim Land Cover Mapping Program; final report

    USGS Publications Warehouse

    Fitzpatrick-Lins, Katherine; Doughty, E.F.; Shasby, Mark; Benjamin, Susan

    1989-01-01

In 1985, the U.S. Geological Survey initiated a research project to develop an interim land cover data base for Alaska as an alternative to the nationwide Land Use and Land Cover Mapping Program. The Alaska Interim Land Cover Mapping Program was subsequently created to develop methods for producing a series of land cover maps that utilized the existing Landsat digital land cover classifications produced by and for the major land management agencies for mapping the vegetation of Alaska. The program was successful in producing digital land cover classifications and statistical summaries using a common statewide classification and in reformatting these data to produce 1:250,000-scale quadrangle-based maps directly from the Scitex laser plotter. A Federal and State agency review of these products found considerable user support for the maps. Presently the Geological Survey is committed to digital processing of six to eight quadrangles each year.

  20. Exploring the potential offered by legacy soil databases for ecosystem services mapping of Central African soils

    NASA Astrophysics Data System (ADS)

    Verdoodt, Ann; Baert, Geert; Van Ranst, Eric

    2014-05-01

Central African soil resources are characterised by a large variability, ranging from stony, shallow or sandy soils with poor life-sustaining capabilities to highly weathered soils that recycle and support large amounts of biomass. Socio-economic drivers within this largely rural region foster inappropriate land use and management, threaten soil quality, and ultimately culminate in declining soil productivity and increasing food insecurity. For the development of sustainable land use strategies targeting development planning and natural hazard mitigation, decision makers often rely on legacy soil maps and soil profile databases. Recent projects financed through development cooperation led to the design of soil information systems for Rwanda, D.R. Congo, and (ongoing) Burundi. A major challenge is to exploit these existing soil databases and convert them into soil inference systems through an optimal combination of digital soil mapping techniques, land evaluation tools, and biogeochemical models. This presentation aims at (1) highlighting some key characteristics of typical Central African soils, (2) assessing the positional, geographic and semantic quality of the soil information systems, and (3) revealing the potential impact of that quality on the use of these datasets for thematic mapping of soil ecosystem services (e.g. organic carbon storage, pH buffering capacity). Soil map quality is assessed considering positional and semantic quality, as well as geographic completeness. Descriptive statistics, decision tree classification and linear regression techniques are used to mine the soil profile databases. Both geo-matching and class-matching approaches are considered when developing thematic maps. Variability in inherent as well as dynamic soil properties within the soil taxonomic units is highlighted. It is hypothesized that within-unit variation in soil properties strongly affects the use and interpretation of thematic maps for ecosystem services mapping.
Results will mainly be based on analyses done in Rwanda, but can be complemented with ongoing research results or prospects for Burundi.

  1. From 20th century metabolic wall charts to 21st century systems biology: database of mammalian metabolic enzymes

    PubMed Central

    Corcoran, Callan C.; Grady, Cameron R.; Pisitkun, Trairak; Parulekar, Jaya

    2017-01-01

    The organization of the mammalian genome into gene subsets corresponding to specific functional classes has provided key tools for systems biology research. Here, we have created a web-accessible resource called the Mammalian Metabolic Enzyme Database (https://hpcwebapps.cit.nih.gov/ESBL/Database/MetabolicEnzymes/MetabolicEnzymeDatabase.html) keyed to the biochemical reactions represented on iconic metabolic pathway wall charts created in the previous century. Overall, we have mapped 1,647 genes to these pathways, representing ~7 percent of the protein-coding genome. To illustrate the use of the database, we apply it to the area of kidney physiology. In so doing, we have created an additional database (Database of Metabolic Enzymes in Kidney Tubule Segments: https://hpcwebapps.cit.nih.gov/ESBL/Database/MetabolicEnzymes/), mapping mRNA abundance measurements (mined from RNA-Seq studies) for all metabolic enzymes to each of 14 renal tubule segments. We carry out bioinformatics analysis of the enzyme expression pattern among renal tubule segments and mine various data sources to identify vasopressin-regulated metabolic enzymes in the renal collecting duct. PMID:27974320

  2. 5 CFR 9.1 - Definition.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Code, but does not include the Federal Bureau of Investigation, the Central Intelligence Agency, the Defense Intelligence Agency, the National Imagery and Mapping Agency, the National Security Agency, and... Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE RULES WORKFORCE INFORMATION (RULE IX) § 9.1...

  3. Querying XML Data with SPARQL

    NASA Astrophysics Data System (ADS)

    Bikakis, Nikos; Gioldasis, Nektarios; Tsinaraki, Chrisa; Christodoulakis, Stavros

SPARQL is today the standard access language for Semantic Web data. In recent years, XML databases have also acquired industrial importance due to the widespread applicability of XML on the Web. In this paper we present a framework that bridges the heterogeneity gap and creates an interoperable environment where SPARQL queries are used to access XML databases. Our approach assumes that fairly generic mappings between ontology constructs and XML Schema constructs have been automatically derived or manually specified. The mappings are used to automatically translate SPARQL queries into semantically equivalent XQuery queries, which are used to access the XML databases. We present the algorithms and the implementation of the SPARQL2XQuery framework, which is used for answering SPARQL queries over XML databases.
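The core idea, rewriting SPARQL patterns into XQuery via predefined ontology-to-XML-Schema mappings, can be sketched for a single triple pattern. The mapping structure and names below are hypothetical; the actual SPARQL2XQuery framework handles full graph patterns, filters, and more:

```python
def sparql_triple_to_xquery(var, predicate, mapping):
    """Minimal sketch of mapping-driven rewriting: one SPARQL triple
    pattern (?var <predicate> ?value) becomes an XQuery FLWOR expression,
    using a hand-specified ontology-to-XML-Schema mapping. Illustrative
    only; not the SPARQL2XQuery translation algorithm itself."""
    xpath = mapping[predicate]  # ontology property -> XPath locations
    return (f"for ${var} in doc('data.xml'){xpath['class']}\n"
            f"return ${var}{xpath['property']}")

# Hypothetical mapping: ontology class for foaf:name's subjects lives at
# /people/person; the property itself maps to the child element <name>.
mapping = {"foaf:name": {"class": "/people/person", "property": "/name/text()"}}
query = sparql_triple_to_xquery("x", "foaf:name", mapping)
print(query)
```

The generated FLWOR expression would then be evaluated by any XQuery engine against the underlying XML database, returning results that can be rebound to the SPARQL variables.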

  4. New Tsunami Inundation Maps for California

    NASA Astrophysics Data System (ADS)

    Barberopoulou, Aggeliki; Borrero, Jose; Uslu, Burak; Kanoglu, Utku; Synolakis, Costas

    2010-05-01

California is the first US state to complete its tsunami inundation mapping. A new generation of tsunami inundation maps is now available for 17 coastal counties. The new maps offer improved coverage for many areas, are based on the most recent descriptions of potential farfield and nearfield tsunami sources, and use the best available bathymetric and topographic data for modelling. The need for new tsunami maps for California became clear after Synolakis et al. (1998) showed that inundation projections derived with models that fully calculate the wave evolution over dry land can be as high as twice the values predicted with earlier threshold models, for tsunamis originating from tectonic sources. After the 1998 Papua New Guinea tsunami, when the hazard from offshore submarine landslides became better understood (Bardet et al., 2003), the State of California funded the development of the first generation of maps, based on local tectonic and landslide sources. Most of the hazard was dominated by offshore landslides, whose return period remains unknown but is believed to be longer than 1,000 years for any given locale, at least in Southern California. The new generation of maps incorporates both local and distant scenarios. The partnership between the Tsunami Research Center at USC, the California Emergency Management Agency, and the California Seismic Safety Commission made the State the first among all US states to complete the maps (exceptions include the offshore islands and Newport Beach, where higher resolution maps are under way). The maps were produced at the lowest cost per mile of coastline, per resident, and per map of any state, owing to the seamless integration of the USC and NOAA databases and the use of the MOST model. They are a significant improvement over earlier map generations.
As part of continuous improvement in response, mitigation, planning and community education, the California inundation maps can contribute to reducing tsunami risk. References: Bardet, J.P., et al. (2003), Landslide tsunamis: Recent findings and research directions, Pure and Applied Geophysics, 160(10-11), 1793-1809. Eisner, R., Borrero, C., Synolakis, C.E. (2001), Inundation Maps for the State of California, International Tsunami Symposium, ITS 2001 Proceedings, NTHMP Review Paper #4, 67-81. Synolakis, C.E., McCarthy, D., Titov, V.V., Borrero, J.C. (1998), Evaluating the Tsunami Risk in California, California and the World Ocean '97, Proceedings ASCE, 1225-1236, ISBN 0-7844-0297-3.

  5. Assessment and mapping of water pollution indices in Zone-III of Municipal Corporation of Hyderabad using remote sensing and geographic information system.

    PubMed

    Asadi, S S; Vuppala, Padmaja; Reddy, M Anji

    2005-01-01

    A preliminary survey of the area under Zone-III of MCH was undertaken to assess the groundwater quality, demonstrate its spatial distribution and correlate it with land use patterns using advanced techniques of remote sensing and geographical information systems (GIS). Twenty-seven groundwater samples were collected and chemically analysed to form the attribute database. A water quality index was calculated from the measured parameters, based on which the study area was classified into five groups with respect to the suitability of the water for drinking. Thematic maps, viz. base map, road network, drainage and land use/land cover, were prepared from IRS 1D PAN + LISS III merged satellite imagery, forming the spatial database. The attribute database was integrated with the spatial sampling-locations map in Arc/Info, and maps showing the spatial distribution of water quality parameters were prepared in ArcView. Results indicated high concentrations of total dissolved solids (TDS), nitrates, fluorides and total hardness in a few industrial and densely populated areas, indicating deteriorated water quality, while the other areas exhibited moderate to good water quality.
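The abstract does not specify which water quality index formulation was used; a common choice is the weighted-arithmetic WQI, in which each parameter's unit weight is inversely proportional to its permissible limit. A minimal sketch, with hypothetical concentrations and limits:

```python
# Weighted-arithmetic water quality index (WQI) sketch. The parameter set,
# measured values and permissible limits below are hypothetical examples,
# not data from the study.

def wqi(measured, standards):
    """measured/standards: dicts of parameter -> concentration / permissible limit."""
    # Unit weight of each parameter is inversely proportional to its standard.
    k = 1.0 / sum(1.0 / s for s in standards.values())
    weights = {p: k / s for p, s in standards.items()}
    # Quality rating: percentage of the permissible limit actually observed.
    ratings = {p: 100.0 * measured[p] / standards[p] for p in standards}
    return sum(weights[p] * ratings[p] for p in standards) / sum(weights.values())

sample = {"TDS": 900.0, "nitrate": 50.0, "fluoride": 1.2}   # mg/L (hypothetical)
limits = {"TDS": 500.0, "nitrate": 45.0, "fluoride": 1.0}   # permissible limits
print(round(wqi(sample, limits), 1))
```

A WQI above 100 under this convention indicates water unfit for drinking, which is how such an index supports classifying an area into suitability groups.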

  6. Map and data for Quaternary faults and folds in New Mexico

    USGS Publications Warehouse

    Machette, M.N.; Personius, S.F.; Kelson, K.I.; Haller, K.M.; Dart, R.L.

    1998-01-01

    The "World Map of Major Active Faults" Task Group is compiling a series of digital maps for the United States and other countries in the Western Hemisphere that show the locations, ages, and activity rates of major earthquake-related features, such as faults and fault-related folds; the companion database includes published information on these seismogenic features. The Western Hemisphere effort is sponsored by the International Lithosphere Program (ILP) Task Group H-2, whereas the effort to compile a new map and database for the United States is funded by the Earthquake Reduction Program (ERP) through the U.S. Geological Survey. The maps and accompanying databases represent a key contribution to the new Global Seismic Hazards Assessment Program (ILP Task Group II-O) for the International Decade for Natural Disaster Reduction. This compilation, which describes evidence for surface faulting and folding in New Mexico, is the third of many similar State and regional compilations planned for the U.S. The compilation for West Texas is available as U.S. Geological Survey Open-File Report 96-002 (Collins and others, 1996) and the compilation for Montana will be released as a Montana Bureau of Mines product (Haller and others, in press).

  7. BrainMap VBM: An environment for structural meta-analysis.

    PubMed

    Vanasse, Thomas J; Fox, P Mickle; Barron, Daniel S; Robertson, Michaela; Eickhoff, Simon B; Lancaster, Jack L; Fox, Peter T

    2018-05-02

    The BrainMap database is a community resource that curates the peer-reviewed, coordinate-based human neuroimaging literature. By pairing the results of neuroimaging studies with their relevant meta-data, BrainMap facilitates coordinate-based meta-analysis (CBMA) of the neuroimaging literature en masse or at the level of experimental paradigm, clinical disease, or anatomic location. Initially dedicated to the functional, task-activation literature, BrainMap is now expanding to include voxel-based morphometry (VBM) studies in a separate sector, titled BrainMap VBM. VBM is a whole-brain, voxel-wise method that measures significant structural differences between or within groups, reported as standardized peak x-y-z coordinates. Here we describe BrainMap VBM, including the meta-data structure, current data volume, and automated reverse-inference functions (region-to-disease profile) of this new community resource. CBMA offers a robust methodology for retaining true-positive and excluding false-positive findings across studies in the VBM literature. As with BrainMap's functional database, BrainMap VBM may be synthesized en masse or at the level of clinical disease or anatomic location. As a use-case scenario for BrainMap VBM, we illustrate a trans-diagnostic data-mining procedure wherein we explore the underlying network structure of 2,002 experiments representing over 53,000 subjects through independent components analysis (ICA). To reduce the data-redundancy effects inherent to any database, we demonstrate two data-filtering approaches that proved helpful to ICA. Finally, we apply hierarchical clustering analysis (HCA) to measure network- and disease-specificity. This procedure distinguished psychiatric from neurological diseases. We invite the neuroscientific community to further exploit BrainMap VBM with other modeling approaches. © 2018 Wiley Periodicals, Inc.
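The ICA step mentioned above can be illustrated on synthetic data. The following is a toy FastICA sketch in plain numpy (tanh nonlinearity, deflation), not the BrainMap VBM pipeline; the sources, mixing matrix and dimensions are invented for illustration:

```python
import numpy as np

# Two synthetic independent sources, linearly mixed (stand-ins for the
# spatial patterns ICA would separate in a real meta-analytic dataset).
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
S = np.vstack([np.sin(2 * t), np.sign(np.sin(3 * t))])   # independent sources
A = np.array([[1.0, 0.5], [0.5, 1.0]])                   # mixing matrix
X = A @ S                                                # observed mixtures

# Center and whiten the observations.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = (E @ np.diag(d ** -0.5) @ E.T) @ X

# FastICA with tanh nonlinearity and deflation (Gram-Schmidt on found rows).
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    for _ in range(200):
        g = np.tanh(w @ Z)
        w_new = (Z * g).mean(axis=1) - (1 - g ** 2).mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)   # remove already-found components
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1) < 1e-9
        w = w_new
        if converged:
            break
    W[i] = w
recovered = W @ Z   # estimated sources (up to sign and permutation)
```

ICA recovers the sources only up to sign and ordering, which is why real analyses inspect and label each component afterwards.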

  8. Improvements in the Protein Identifier Cross-Reference service.

    PubMed

    Wein, Samuel P; Côté, Richard G; Dumousseau, Marine; Reisinger, Florian; Hermjakob, Henning; Vizcaíno, Juan A

    2012-07-01

    The Protein Identifier Cross-Reference (PICR) service is a tool that allows users to map protein identifiers, protein sequences and gene identifiers across more than 100 different source databases. PICR takes input through an interactive website as well as through Representational State Transfer (REST) and Simple Object Access Protocol (SOAP) services. It returns the results as HTML pages, XLS and CSV files. It has been in production since 2007 and has recently been enhanced to add new functionality and increase the number of databases it covers. Protein subsequences can now be searched with the Basic Local Alignment Search Tool (BLAST) against the UniProt Knowledgebase (UniProtKB) to provide an entry point to the standard PICR mapping algorithm. In addition, gene identifiers from UniProtKB and Ensembl can now be submitted as input or mapped to as output from PICR. We have also implemented a 'best-guess' mapping algorithm for UniProt. In this article, we describe the usefulness of PICR, how these changes have been implemented, and the corresponding additions to the web services. Finally, we explain how the number of source databases covered by PICR has increased from the initial 73 to the current 102. New resources include several new species-specific Ensembl databases as well as the Ensembl Genomes databases. PICR can be accessed at http://www.ebi.ac.uk/Tools/picr/.
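A REST mapping service of this kind is typically queried with one request parameter per target database. As a hypothetical illustration only (the endpoint path, method name and parameter names below are assumptions, not taken from the article), a client might build the query URL like this:

```python
from urllib.parse import urlencode

# Hypothetical sketch of a PICR-style REST query URL; the path
# "rest/getUPIForAccession" and the parameter names are illustrative
# assumptions, not documented here.
BASE = "http://www.ebi.ac.uk/Tools/picr/rest/getUPIForAccession"

def picr_url(accession, target_dbs):
    # Repeat the "database" parameter once per requested target database.
    params = [("accession", accession)] + [("database", db) for db in target_dbs]
    return BASE + "?" + urlencode(params)

url = picr_url("P29375", ["SWISSPROT", "ENSEMBL_HUMAN"])
print(url)
```

Passing a sequence of pairs (rather than a dict) to `urlencode` is what allows the same parameter name to appear multiple times.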

  9. The iMars WebGIS - Spatio-Temporal Data Queries and Single Image Map Web Services

    NASA Astrophysics Data System (ADS)

    Walter, Sebastian; Steikert, Ralf; Schreiner, Bjoern; Muller, Jan-Peter; van Gasselt, Stephan; Sidiropoulos, Panagiotis; Lanz-Kroechert, Julia

    2017-04-01

    Introduction: Web-based planetary image dissemination platforms usually show outline coverages of the data and offer metadata queries as well as preview and download, e.g. the HRSC Mapserver (Walter & van Gasselt, 2014). Here we introduce a new approach for a system dedicated to change detection through the simultaneous visualisation of single-image time series in a multi-temporal context. While the usual way of presenting multi-orbit datasets is to merge the data into a larger mosaic, we retain the single image as an important snapshot of the planetary surface at a specific time. In the context of the EU FP7 iMars project we process and ingest vast amounts of automatically co-registered (ACRO) images. The basis of the co-registration is the high-precision HRSC multi-orbit quadrangle image mosaics, which are based on bundle-block-adjusted multi-orbit HRSC DTMs. Additionally, we make use of the existing bundle-adjusted HRSC single images available in the PDS archives. A prototype demonstrating the presented features is available at http://imars.planet.fu-berlin.de. Multi-temporal database: In order to locate areas of multiple image coverage and to select images through spatio-temporal queries, we consolidate the available coverage catalogs of various NASA imaging missions into a relational database management system with geometry support. We harvest the available metadata entries during our processing pipeline using the Integrated Software for Imagers and Spectrometers (ISIS) software. Currently, this database contains image outlines from the MGS/MOC, MRO/CTX and MO/THEMIS instruments, with imaging dates ranging from 1996 to the present. For the MEx/HRSC data, we already maintain a database which we update automatically with custom software based on the VICAR environment. Web Map Service with time support: The MapServer software is connected to the database and provides Web Map Services (WMS) with time support based on the START_TIME image attribute. 
It allows temporal WMS GetMap requests by setting an additional TIME parameter in the request, whose value represents an interval defined by its lower and upper bounds. As the WMS time standard supports only one time variable, only the start times of the images are considered. If no time values are submitted with the request, the full time range of all images is assumed as the default. Dynamic single image WMS: To compare images from different acquisition times at sites of multiple coverage, we have to load every image as a single WMS layer. Due to the vast number of single images, we need a way to set up the layers dynamically: the map server does not know beforehand which images are to be served. We use the MapScript interface to dynamically access MapServer's objects and configure the file name and path of the requested image in the map configuration. The layers are created on the fly, each representing a single image. On the frontend side, the vendor-specific WMS request parameter (PRODUCTID) has to be appended to the regular set of WMS parameters. The request is then passed on to the MapScript instance. Web Map Tile Cache: In order to speed up the WMS requests, a MapCache instance has been integrated into the pipeline. As it is not aware of the available PDS product IDs which will be queried, the PRODUCTID parameter is configured as an additional dimension of the cache. The WMS request is received by the Apache webserver configured with the MapCache module. If the tile is available in the tile cache, it is immediately committed to the client. If not, the tile request is forwarded to Apache and the MapScript module. The Python script intercepts the WMS request and extracts the product ID from the parameter chain. It loads the layer object from the map file and appends the file name and path of the inquired image. 
After possible further image processing inside the script (stretching, color matching), the request is submitted to the MapServer backend, which in turn delivers the response back to the MapCache instance. Web frontend: We have implemented a web-GIS frontend based on various OpenLayers components. The basemap is a global color-hillshaded HRSC bundle-adjusted DTM mosaic with a resolution of 50 m per pixel. The new bundle-block-adjusted quadrangle mosaics of the MC-11 quadrangle, both image and DTM, are included with opacity slider options. The layer user interface has been adapted from the ol3-layerswitcher and extended by foldable and switchable groups, layer sorting (by resolution, by time and alphabetically) and reordering (drag-and-drop). A collapsible time panel accommodates a time slider interface where the user can filter the visible data by a range of Mars or Earth dates and/or by solar longitudes. The visualisation of time series of single images is controlled by a specific toolbar enabling a workflow of image selection (by point or bounding box), dynamic image loading and playback of single images in a video-player-like environment. During a stress-test campaign we demonstrated that the system is capable of serving up to 10 simultaneous users on its current lightweight development hardware. It is planned to relocate the software to more powerful hardware by the time of this conference. Conclusions/Outlook: The iMars webGIS is an expert tool for the detection and visualization of surface changes. We demonstrate a technique to dynamically retrieve and display single images based on the time-series structure of the data. Together with the multi-temporal database and its MapServer/MapCache backend, it provides a stable and high-performance environment for the dissemination of the various iMars products. 
Acknowledgements: This research has received funding from the EU's FP7 Programme under iMars 607379 and by the German Space Agency (DLR Bonn), grant 50 QM 1301 (HRSC on Mars Express).
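The temporal GetMap request with the vendor-specific PRODUCTID parameter described above can be sketched as a simple URL builder. The server URL, layer name and product ID below are hypothetical examples; the parameter layout follows the standard WMS 1.3.0 GetMap request:

```python
from urllib.parse import urlencode

# Sketch of a temporal WMS 1.3.0 GetMap request with a vendor-specific
# PRODUCTID parameter, as described in the abstract. Server URL, layer
# name and product ID are invented examples.
def getmap_url(base, layer, bbox, time_range, product_id=None,
               size=(512, 512), crs="EPSG:4326"):
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0], "HEIGHT": size[1],
        "FORMAT": "image/png",
        # Interval given as lower/upper bound; only image start times are used.
        "TIME": "/".join(time_range),
    }
    if product_id is not None:
        params["PRODUCTID"] = product_id   # vendor-specific cache dimension
    return base + "?" + urlencode(params)

url = getmap_url("http://example.org/mars/wms", "ctx_single",
                 (-5.0, 130.0, 0.0, 135.0),
                 ("2007-01-01", "2010-12-31"),
                 product_id="EXAMPLE_PRODUCT_ID")
print(url)
```

Configuring PRODUCTID as a MapCache dimension means each product gets its own tile set in the cache, keyed by this extra parameter.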

  10. DSSTOX WEBSITE LAUNCH: IMPROVING PUBLIC ACCESS TO DATABASES FOR BUILDING STRUCTURE-TOXICITY PREDICTION MODELS

    EPA Science Inventory

    DSSTox Website Launch: Improving Public Access to Databases for Building Structure-Toxicity Prediction Models
    Ann M. Richard
    US Environmental Protection Agency, Research Triangle Park, NC, USA

    Distributed: Decentralized set of standardized, field-delimited databases,...

  11. Landsat ETM+ False-Color Image Mosaics of Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.

    2007-01-01

    In 2005, the U.S. Agency for International Development and the U.S. Trade and Development Agency contracted with the U.S. Geological Survey to perform assessments of the natural resources within Afghanistan. The assessments concentrate on the resources that are related to the economic development of that country. Therefore, assessments were initiated in oil and gas, coal, mineral resources, water resources, and earthquake hazards. All of these assessments require geologic, structural, and topographic information throughout the country at a finer scale and better accuracy than that provided by the existing maps, which were published in the 1970s by the Russians and Germans. The very rugged terrain in Afghanistan, the large scale of these assessments, and the terrorist threat in Afghanistan indicated that the best approach for the preliminary assessments was to use remotely sensed satellite image data, although this may also apply to subsequent phases of the assessments. Therefore, the first step in the assessment process was to produce satellite image mosaics of Afghanistan that would be useful for these assessments. This report discusses the production of the Landsat false-color image database for these assessments, which was derived from the calibrated Landsat ETM+ image mosaics described by Davis (2006).

  12. Calibrated Landsat ETM+ nonthermal-band image mosaics of Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.

    2006-01-01

    In 2005, the U.S. Agency for International Development and the U.S. Trade and Development Agency contracted with the U.S. Geological Survey to perform assessments of the natural resources within Afghanistan. The assessments concentrate on the resources that are related to the economic development of that country. Therefore, assessments were initiated in oil and gas, coal, mineral resources, water resources, and earthquake hazards. All of these assessments require geologic, structural, and topographic information throughout the country at a finer scale and better accuracy than that provided by the existing maps, which were published in the 1970s by the Russians and Germans. The very rugged terrain in Afghanistan, the large scale of these assessments, and the terrorist threat in Afghanistan indicated that the best approach to provide the preliminary assessments was to use remotely sensed, satellite image data, although this may also apply to subsequent phases of the assessments. Therefore, the first step in the assessment process was to produce satellite image mosaics of Afghanistan that would be useful for these assessments. This report discusses the production and characteristics of the fundamental satellite image databases produced for these assessments, which are calibrated image mosaics of all six Landsat nonthermal (reflected) bands.

  13. A digital version of the 1970 U.S. Geological Survey topographic map of the San Francisco Bay region, three sheets, 1:125,000

    USGS Publications Warehouse

    Aitken, Douglas S.

    1997-01-01

    This Open-File report is a digital topographic map database. It contains a digital version of the 1970 U.S. Geological Survey topographic map of the San Francisco Bay Region (3 sheets), at a scale of 1:125,000. These ARC/INFO coverages are in vector format. The vectorization process has distorted characters representing letters and numbers, as well as some road and other symbols, making them difficult to read in some instances. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. The content and character of the database and methods of obtaining it are described herein.

  14. Understanding the productive author who published papers in medicine using National Health Insurance Database: A systematic review and meta-analysis.

    PubMed

    Chien, Tsair-Wei; Chang, Yu; Wang, Hsien-Yi

    2018-02-01

    Many researchers have used the National Health Insurance database to publish medical papers, which are often retrospective, population-based cohort studies. However, these authors' research domains and academic characteristics are still unclear. By searching the PubMed database (Pubmed.com) with the keywords [Taiwan] and [National Health Insurance Research Database], we downloaded 2913 articles published from 1995 to 2017. Social network analysis (SNA), the Gini coefficient, and Google Maps were applied to these data to visualize: the most productive author; the pattern of coauthor collaboration teams; and the author's research domain, denoted by abstract keywords and PubMed MeSH (medical subject heading) terms. Utilizing the 2913 papers from Taiwan's National Health Insurance database, we chose the top 10 research teams shown on Google Maps and analyzed one author (Dr. Kao), who published 149 papers in the database in 2015. In the past 15 years, we found Dr. Kao had 2987 connections with other coauthors from 13 research teams. The co-occurring abstract keywords with the highest frequency were cohort study and National Health Insurance Research Database. The most coexistent MeSH terms were tomography, X-ray computed, and positron-emission tomography. The distinctness of the author's research domain was very low (Gini < 0.40). SNA incorporated with Google Maps and the Gini coefficient provides insight into the relationships between entities. The results obtained in this study can be applied to gain a comprehensive understanding of other productive authors in academia.
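The Gini coefficient used above to gauge how concentrated an author's research domain is can be computed from, for example, keyword frequency counts. A minimal sketch with hypothetical counts (0 means the counts are spread perfectly evenly, values toward 1 mean concentration in few topics):

```python
def gini(values):
    """Gini coefficient of a non-negative distribution.

    0 = perfectly even spread, values toward 1 = highly concentrated.
    Uses the standard sorted-rank formula.
    """
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    # Rank-weighted cumulative sum (ranks are 1-based after sorting).
    cum = sum(i * x for i, x in enumerate(xs, start=1))
    return (2.0 * cum) / (n * total) - (n + 1.0) / n

# Hypothetical keyword counts for one author's papers:
counts = [40, 30, 20, 10]
print(round(gini(counts), 2))
```

Under this reading, a Gini below 0.40 (as reported for Dr. Kao) indicates keyword usage spread fairly evenly across topics rather than concentrated in one distinct domain.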

  15. The National Landslide Database and GIS for Great Britain: construction, development, data acquisition, application and communication

    NASA Astrophysics Data System (ADS)

    Pennington, Catherine; Dashwood, Claire; Freeborough, Katy

    2014-05-01

    The National Landslide Database has been developed by the British Geological Survey (BGS) and is the focus for national geohazard research for landslides in Great Britain. The history and structure of the geospatial database and associated Geographical Information System (GIS) are explained, along with the future developments of the database and its applications. The database is the most extensive source of information on landslides in Great Britain with over 16,500 records of landslide events, each documented as fully as possible. Data are gathered through a range of procedures, including: incorporation of other databases; automated trawling of current and historical scientific literature and media reports; new field- and desk-based mapping technologies with digital data capture, and crowd-sourcing information through social media and other online resources. This information is invaluable for the investigation, prevention and mitigation of areas of unstable ground in accordance with Government planning policy guidelines. The national landslide susceptibility map (GeoSure) and a national landslide domain map currently under development rely heavily on the information contained within the landslide database. Assessing susceptibility to landsliding requires knowledge of the distribution of failures and an understanding of causative factors and their spatial distribution, whilst understanding the frequency and types of landsliding present is integral to modelling how rainfall will influence the stability of a region. Communication of landslide data through the Natural Hazard Partnership (NHP) contributes to national hazard mitigation and disaster risk reduction with respect to weather and climate. Daily reports of landslide potential are published by BGS through the NHP and data collected for the National Landslide Database is used widely for the creation of these assessments. 
The National Landslide Database is freely available via an online GIS and is used by a variety of stakeholders for research purposes.

  16. Mapping habitat for multiple species in the Desert Southwest

    USGS Publications Warehouse

    Inman, Richard D.; Nussear, Kenneth E.; Esque, Todd C.; Vandergast, Amy G.; Hathaway, Stacie A.; Wood, Dustin A.; Barr, Kelly R.; Fisher, Robert N.

    2014-01-01

    Many utility-scale renewable energy projects are currently proposed across the Mojave Ecoregion. Agencies that manage biological resources throughout this region need to understand the potential impacts of these renewable energy projects and their associated infrastructure (for example, transmission corridors, substations, and access roads) on species movement, genetic exchange among populations, and species' abilities to adapt to changing environmental conditions. Understanding these factors will help managers select appropriate project sites and possibly mitigate the anticipated effects of management activities. We used species distribution models to map habitat for 15 species across the Mojave Ecoregion to aid regional land-use management planning. Models were developed at a common 1 × 1 kilometer resolution using maximum entropy and generalized additive models. Occurrence data were compiled from multiple sources, including VertNet (http://vertnet.org/), HerpNET (http://www.herpnet.org), and MaNIS (http://manisnet.org), as well as from internal U.S. Geological Survey databases and other biologists. Background data included 20 environmental covariates representing terrain, vegetation, and climate. This report summarizes these environmental covariates and the species distribution models used to predict habitat for the 15 species across the Mojave Ecoregion.

  17. KSC-99pp0522

    NASA Image and Video Library

    1999-05-13

    Inside the Space Station Processing Facility, the Shuttle Radar Topography Mission (SRTM) is maneuvered by an overhead crane toward a workstand below. The SRTM, which is the primary payload on mission STS-99, consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for launch in September 1999. The objective of this radar system is to obtain the most complete high-resolution digital topographic database of the Earth. It will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. SRTM will be making use of a technique called radar interferometry, wherein two radar images are taken from slightly different locations. Differences between these images allow for the calculation of surface elevation, or change. To get two radar images taken from different locations, the SRTM hardware will consist of one radar antenna in the shuttle payload bay and a second radar antenna attached to the end of a mast extended 60 meters (195 feet) out from the shuttle

  18. KSC-99pp0524

    NASA Image and Video Library

    1999-05-13

    The move of the Shuttle Radar Topography Mission (SRTM) is nearly complete as it is lowered onto the workstand in the Space Station Processing Facility. The SRTM, which is the primary payload on mission STS-99, consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for launch in September 1999. The objective of this radar system is to obtain the most complete high-resolution digital topographic database of the Earth. It will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. SRTM will be making use of a technique called radar interferometry, wherein two radar images are taken from slightly different locations. Differences between these images allow for the calculation of surface elevation, or change. To get two radar images taken from different locations, the SRTM hardware will consist of one radar antenna in the shuttle payload bay and a second radar antenna attached to the end of a mast extended 60 meters (195 feet) out from the shuttle

  19. KSC-99pp0521

    NASA Image and Video Library

    1999-05-13

    After being lifted off the transporter (lower right) in the Space Station Processing Facility, the Shuttle Radar Topography Mission (SRTM) moves across the floor toward a workstand. The SRTM, which is the primary payload on mission STS-99, consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for launch in September 1999. The objective of this radar system is to obtain the most complete high-resolution digital topographic database of the Earth. It will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. SRTM will be making use of a technique called radar interferometry, wherein two radar images are taken from slightly different locations. Differences between these images allow for the calculation of surface elevation, or change. To get two radar images taken from different locations, the SRTM hardware will consist of one radar antenna in the shuttle payload bay and a second radar antenna attached to the end of a mast extended 60 meters (195 feet) out from the shuttle

  20. KSC-99pp0523

    NASA Image and Video Library

    1999-05-13

    Inside the Space Station Processing Facility, workers at each end of a workstand watch as the Shuttle Radar Topography Mission (SRTM) begins its descent onto it. The SRTM, which is the primary payload on mission STS-99, consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for launch in September 1999. The objective of this radar system is to obtain the most complete high-resolution digital topographic database of the Earth. It will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. SRTM will be making use of a technique called radar interferometry, wherein two radar images are taken from slightly different locations. Differences between these images allow for the calculation of surface elevation, or change. To get two radar images taken from different locations, the SRTM hardware will consist of one radar antenna in the shuttle payload bay and a second radar antenna attached to the end of a mast extended 60 meters (195 feet) out from the shuttle

  1. A new automatic SAR-based flood mapping application hosted on the European Space Agency's Grid Processing on Demand Fast Access to Imagery environment

    NASA Astrophysics Data System (ADS)

    Hostache, Renaud; Chini, Marco; Matgen, Patrick; Giustarini, Laura

    2013-04-01

    There is a clear need for developing innovative processing chains based on earth observation (EO) data to generate products supporting emergency response and flood management at a global scale. Here, an automatic flood mapping application is introduced, currently hosted on the Grid Processing on Demand (G-POD) Fast Access to Imagery (Faire) environment of the European Space Agency. The main objective of the online application is to deliver maps of flooded areas using both recent and historical acquisitions of SAR data in an operational framework. The method can be applied to both medium- and high-resolution SAR images. The flood mapping application consists of two main blocks: 1) a set of query tools for selecting the "crisis image" and the optimal corresponding pre-flood "reference image" from the G-POD archive; 2) an algorithm for extracting flooded areas using the previously selected "crisis image" and "reference image". The proposed method is a hybrid methodology combining histogram thresholding, region growing and change detection into an approach enabling automatic, objective and reliable flood extent extraction from SAR images. The method is based on the calibration of a statistical distribution of "open water" backscatter values inferred from SAR images of floods. Change detection with respect to a pre-flood reference image helps reduce over-detection of inundated areas. The algorithms are computationally efficient and operate with minimum data requirements, considering as input data only a flood image and a reference image. Stakeholders in flood management and service providers are able to log onto the flood mapping application to get support for the retrieval, from the rolling archive, of the most appropriate pre-flood reference image. Potential users will also be able to apply the implemented flood delineation algorithm. Case studies of several recent high-magnitude flooding events (e.g. 
July 2007 Severn River flood, UK and March 2010 Red River flood, US) observed by high-resolution SAR sensors as well as airborne photography highlight advantages and limitations of the online application. A mid-term target is the exploitation of ESA SENTINEL 1 SAR data streams. In the long term it is foreseen to develop a potential extension of the application for systematically extracting flooded areas from all SAR images acquired on a daily, weekly or monthly basis. On-going research activities investigate the usefulness of the method for mapping flood hazard at global scale using databases of historic SAR remote sensing-derived flood inundation maps.
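The threshold-plus-change-detection idea can be sketched on synthetic backscatter grids: flag pixels that are dark in the crisis image (smooth open water backscatters little) and significantly darker than in the pre-flood reference image. This is a toy illustration, not the article's calibrated algorithm; the threshold, minimum-drop value and synthetic scenes are invented:

```python
import numpy as np

# Toy sketch of SAR flood detection by thresholding plus change detection.
# All values are synthetic; a real system would calibrate the threshold
# from the statistical distribution of open-water backscatter.
rng = np.random.default_rng(1)
reference = rng.normal(-8.0, 1.0, (100, 100))            # land backscatter, dB
crisis = reference.copy()
crisis[40:60, 40:60] = rng.normal(-17.0, 1.0, (20, 20))  # smooth open water

water_threshold = -13.0   # dB; invented stand-in for a calibrated threshold
min_drop = 3.0            # required decrease relative to the reference image

# Flooded = dark in the crisis image AND much darker than in the reference.
flood = (crisis < water_threshold) & ((reference - crisis) > min_drop)
print(flood.sum())        # number of pixels mapped as flooded
```

The change-detection term is what suppresses permanently dark surfaces (shadow, smooth bare ground), mirroring the abstract's point that comparison with a pre-flood reference reduces over-detection.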

  2. Let the IRIS Bloom: Regrowing the Integrated Risk Information System (IRIS) of the U.S. Environmental Protection Agency.

    PubMed

    Dourson, Michael L

    2018-05-03

    The Integrated Risk Information System (IRIS) of the U.S. Environmental Protection Agency (EPA) has an important role in protecting public health. Originally it provided a single database listing official risk values equally valid for all Agency offices, and was an important tool for risk assessment communication across EPA. Started in 1986, IRIS achieved full standing in 1990 when it listed 500 risk values, the effort of two senior EPA groups over 5 years of monthly face-to-face meetings to assess combined risk data from multiple Agency offices. Those groups were disbanded in 1995, and without continuing face-to-face meetings IRIS was no longer EPA's comprehensive database of risk values or their latest evaluations. As a remedy, a work group of the Agency's senior scientists should be re-established to evaluate new risks and to update older ones. Risk values to be reviewed would come from the same EPA offices now developing such information on their own. Still, this senior group would have the final authority on posting a risk value in IRIS, independently of individual EPA offices. This approach could also lay the groundwork for an all-government IRIS database, especially needed as more government agencies, industries and non-governmental organizations are addressing evolving risk characterizations. Copyright © 2018. Published by Elsevier Inc.

  3. Make Your Own Mashup Maps

    ERIC Educational Resources Information Center

    Lucking, Robert A.; Christmann, Edwin P.; Whiting, Mervyn J.

    2008-01-01

    "Mashup" is a new technology term used to describe a web application that combines data or technology from several different sources. You can apply this concept in your classroom by having students create their own mashup maps. Google Maps provides you with the simple tools, map databases, and online help you'll need to quickly master this…

  4. Geologic map and digital database of the Conejo Well 7.5 minute quadrangle, Riverside County, Southern California

    USGS Publications Warehouse

    Powell, Robert E.

    2001-01-01

    This data set maps and describes the geology of the Conejo Well 7.5 minute quadrangle, Riverside County, southern California. The quadrangle, situated in Joshua Tree National Park in the eastern Transverse Ranges physiographic and structural province, encompasses part of the northern Eagle Mountains and part of the south flank of Pinto Basin. It is underlain by a basement terrane comprising Proterozoic metamorphic rocks, Mesozoic plutonic rocks, and Mesozoic and Mesozoic or Cenozoic hypabyssal dikes. The basement terrane is capped by a widespread Tertiary erosion surface preserved in remnants in the Eagle Mountains and buried beneath Cenozoic deposits in Pinto Basin. Locally, Miocene basalt overlies the erosion surface. A sequence of at least three Quaternary pediments is planed into the north piedmont of the Eagle Mountains, each in turn overlain by successively younger residual and alluvial deposits. The Tertiary erosion surface is deformed and broken by north-northwest-trending, high-angle, dip-slip faults in the Eagle Mountains and an east-west trending system of high-angle dip- and left-slip faults. In and adjacent to the Conejo Well quadrangle, faults of the northwest-trending set displace Miocene sedimentary rocks and basalt deposited on the Tertiary erosion surface and Pliocene and (or) Pleistocene deposits that accumulated on the oldest pediment. Faults of this system appear to be overlain by Pleistocene deposits that accumulated on younger pediments. East-west trending faults are younger than and perhaps in part coeval with faults of the northwest-trending set. The Conejo Well database was created using ARCVIEW and ARC/INFO, which are geographical information system (GIS) software products of Environmental Systems Research Institute (ESRI).
The database consists of the following items: (1) a map coverage showing faults and geologic contacts and units, (2) a separate coverage showing dikes, (3) a coverage showing structural data, (4) a point coverage containing line ornamentation, and (5) a scanned topographic base at a scale of 1:24,000. The coverages include attribute tables for geologic units (polygons and regions), contacts (arcs), and site-specific data (points). The database, accompanied by a pamphlet file and this metadata file, also includes the following graphic and text products: (1) A portable document file (.pdf) containing a navigable graphic of the geologic map on a 1:24,000 topographic base. The map is accompanied by a marginal explanation consisting of a Description of Map and Database Units (DMU), a Correlation of Map and Database Units (CMU), and a key to point- and line-symbols. (2) Separate .pdf files of the DMU and CMU, individually. (3) A PostScript graphic file containing the geologic map on a 1:24,000 topographic base accompanied by the marginal explanation. (4) A pamphlet that describes the database and how to access it. Within the database, geologic contacts, faults, and dikes are represented as lines (arcs), geologic units as polygons and regions, and site-specific data as points. Polygon, arc, and point attribute tables (.pat, .aat, and .pat, respectively) uniquely identify each geologic datum and link it to other tables (.rel) that provide more detailed geologic information.

  5. 77 FR 12234 - Changes in Hydric Soils Database Selection Criteria

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-29

    ... Conservation Service [Docket No. NRCS-2011-0026] Changes in Hydric Soils Database Selection Criteria AGENCY... Changes to the National Soil Information System (NASIS) Database Selection Criteria for Hydric Soils of the United States. SUMMARY: The National Technical Committee for Hydric Soils (NTCHS) has updated the...

  6. 76 FR 77504 - Notice of Submission for OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ... of Review: Extension. Title of Collection: Charter Schools Program Grant Award Database. OMB Control... collect data necessary for the Charter Schools Program (CSP) Grant Award Database. The CSP is authorized... award information from grantees (State agencies and some schools) for a database of current CSP-funded...

  7. TRlCARE Controls Over Claims Prepared by Third-Party Billing Agencies

    DTIC Science & Technology

    2008-12-31

    of the HHS-excluded billing agencies to the TRICARE claims database and saw that payments were sent to the addresses of three billing agencies...contractors and subcontractors responsible for claims processing, including TriWest, Wisconsin Physicians Services, HealthNet, Palmetto Government

  8. 76 FR 59185 - Agency Information Collection Activities: Requests for Comments; Clearance of Renewed Approval of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-23

    ... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration Agency Information Collection... Registration Renewal AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice and request for... 8050-1 (approved under OMB control number 2120- 0042). The updated registration database will then be...

  9. Database resources of the National Center for Biotechnology Information: 2002 update

    PubMed Central

    Wheeler, David L.; Church, Deanna M.; Lash, Alex E.; Leipe, Detlef D.; Madden, Thomas L.; Pontius, Joan U.; Schuler, Gregory D.; Schriml, Lynn M.; Tatusova, Tatiana A.; Wagner, Lukas; Rapp, Barbara A.

    2002-01-01

    In addition to maintaining the GenBank nucleic acid sequence database, the National Center for Biotechnology Information (NCBI) provides data analysis and retrieval resources that operate on the data in GenBank and a variety of other biological data made available through NCBI’s web site. NCBI data retrieval resources include Entrez, PubMed, LocusLink and the Taxonomy Browser. Data analysis resources include BLAST, Electronic PCR, OrfFinder, RefSeq, UniGene, HomoloGene, Database of Single Nucleotide Polymorphisms (dbSNP), Human Genome Sequencing, Human MapViewer, Human–Mouse Homology Map, Cancer Chromosome Aberration Project (CCAP), Entrez Genomes, Clusters of Orthologous Groups (COGs) database, Retroviral Genotyping Tools, SAGEmap, Gene Expression Omnibus (GEO), Online Mendelian Inheritance in Man (OMIM), the Molecular Modeling Database (MMDB) and the Conserved Domain Database (CDD). Augmenting many of the web applications are custom implementations of the BLAST program optimized to search specialized data sets. All of the resources can be accessed through the NCBI home page at http://www.ncbi.nlm.nih.gov. PMID:11752242

  10. Quaternary Geology and Liquefaction Susceptibility, Napa, California 1:100,000 Quadrangle: A Digital Database

    USGS Publications Warehouse

    Sowers, Janet M.; Noller, Jay S.; Lettis, William R.

    1998-01-01

    Earthquake-induced ground failures such as liquefaction have historically brought loss of life and damage to property and infrastructure. Observations of the effects of historical large-magnitude earthquakes show that the distribution of liquefaction phenomena is not random. Liquefaction is restricted to areas underlain by loose, cohesionless sands and silts that are saturated with water. These areas can be delineated on the basis of thorough geologic, geomorphic, and hydrologic mapping and map analysis (Tinsley and Holzer, 1990; Youd and Perkins, 1987). Once potential liquefaction zones are delineated, appropriate public and private agencies can prepare for and mitigate seismic hazard in these zones. In this study, we create a liquefaction susceptibility map of the Napa 1:100,000 quadrangle using Quaternary geologic mapping, analysis of historical liquefaction information, groundwater data, and data from other studies. The study is patterned after state-of-the-art studies by Youd (1973), Dupre and Tinsley (1980), and Dupre (1990) in the Monterey-Santa Cruz area, Tinsley and others (1985) in the Los Angeles area, and Youd and Perkins (1987) in San Mateo County, California. The study area comprises the northern San Francisco Metropolitan Area, including the cities of Santa Rosa, Vallejo, Napa, Novato, Martinez, and Fairfield (Figure 1). Holocene estuarine deposits, Holocene stream deposits, eolian sands, and artificial fill are widely present in the region (Helley and Lajoie, 1979) and are the geologic materials of greatest concern. Six major faults capable of producing large earthquakes cross the study area, including the San Andreas, Rodgers Creek, Hayward, West Napa, Concord, and Green Valley faults (Figure 1).

  11. Regulations in the field of Geo-Information

    NASA Astrophysics Data System (ADS)

    Felus, Y.; Keinan, E.; Regev, R.

    2013-10-01

    The geomatics profession has gone through a major revolution during the last two decades with the emergence of advanced GNSS, GIS and Remote Sensing technologies. These technologies have changed the core principles and working procedures of geomatics professionals. For this reason, surveying and mapping regulations, standards and specifications should be updated to reflect these changes. In Israel, the "Survey Regulations" is the principal document that regulates professional activities in four key areas: geodetic control, mapping, cadastre, and geographic information systems. Licensed Surveyors and mapping professionals in Israel are required to work according to those regulations. This year a new set of regulations has been published, including a few major amendments, as follows. In the Geodesy chapter, horizontal control is officially based on the Israeli network of Continuously Operating GNSS Reference Stations (CORS). The regulations were phrased in a manner that will allow minor datum changes to the CORS stations due to earth crustal movements. Moreover, the regulations permit the use of GNSS for low-accuracy height measurements. In the Cadastre chapter, the most critical change is the move to Coordinate Based Cadastre (CBC). Each parcel corner point is ranked according to its quality (accuracy and clarity of definition). The highest ranking for a parcel corner is 1. A point with a rank of 1 is defined by its coordinates alone; any other contradicting evidence is inferior to the coordinate values. Cadastral information is stored and managed via the National Cadastral Databases. In the Mapping and GIS chapter, the traditional paper maps (ranked by scale) are replaced by digital maps or spatial databases. These spatial databases are ranked by their quality level. Quality level is determined (similarly to the ISO 19157 standard) by logical consistency, completeness, positional accuracy, attribute accuracy, temporal accuracy and usability.
Metadata is another critical component of any spatial database. Every component in a map should have a metadata identification, even if the map was compiled from multiple resources. The regulations permit the use of advanced sensors and mapping techniques, including LIDAR and digital cameras that have been certified and meet the defined criteria. The article reviews these new regulations and the decisions that led to them.

  12. Geologic Map of the State of Hawai`i

    USGS Publications Warehouse

    Sherrod, David R.; Sinton, John M.; Watkins, Sarah E.; Brunt, Kelly M.

    2007-01-01

    About This Map The State's geology is presented on eight full-color map sheets, one for each of the major islands. These map sheets, the illustrative meat of the publication, can be downloaded in pdf format, ready to print. Map scale is 1:100,000 for most of the islands, so that each map is about 27 inches by 36 inches. The Island of Hawai`i, largest of the islands, is depicted at a smaller scale, 1:250,000, so that it, too, can be shown on 36-inch-wide paper. The new publication isn't limited strictly to its map depictions. Twenty years have passed since David Clague and Brent Dalrymple published a comprehensive report that summarized the geology of all the islands, and it has been even longer since the last edition of Gordon Macdonald's book, Volcanoes in the Sea, was revised. Therefore the new statewide geologic map includes an 83-page explanatory pamphlet that revisits many of the concepts that have evolved in our geologic understanding of the eight main islands. The pamphlet includes simplified page-size geologic maps for each island, summaries of all the radiometric ages that have been gathered since about 1960, generalized depictions of geochemical analyses for each volcano's eruptive stages, and discussion of some outstanding topics that remain controversial or deserving of additional research. The pamphlet also contains a complete description of map units, which enumerates the characteristics for each of the state's many stratigraphic formations shown on the map sheets. Since the late 1980s, the audience for geologic maps has grown as desktop computers and map-based software have become increasingly powerful. Those who prefer the convenience and access offered by Geographic Information Systems (GIS) can also feast on this publication. An electronic database, suitable for most GIS software applications, is available for downloading.
The GIS database is in an Earth projection widely employed throughout the State of Hawai`i, using the North American datum of 1983 and the Universal Transverse Mercator system projection to zone 4. 'This digital statewide map allows engineers, consultants, and scientists from many different fields to take advantage of the geologic database,' said John Sinton, a geology professor at the University of Hawai`i, whose new mapping of the Wai`anae Range (West O`ahu) appears on the map. Indeed, when a testing version was first made available, most requests came from biologists, archaeologists, and soil scientists interested in applying the map's GIS database to their ongoing investigations. Another area newly depicted on the map, in addition to the Wai`anae Range, is Haleakala volcano, East Maui. So too for the active lava flows of Kilauea volcano, Island of Hawai`i, where the landscape has continued to evolve in the ten years since publication of the Big Island's revised geologic map. For the other islands, much of the map is compiled from mapping published in the 1930-1960s. This reliance stems partly from shortage of funding to undertake entirely new mapping but is warranted by the exemplary mapping of those early experts. The boundaries of all map units are digitized to show correctly on modern topographic maps.

  13. PineElm_SSRdb: a microsatellite marker database identified from genomic, chloroplast, mitochondrial and EST sequences of pineapple (Ananas comosus (L.) Merrill).

    PubMed

    Chaudhary, Sakshi; Mishra, Bharat Kumar; Vivek, Thiruvettai; Magadum, Santoshkumar; Yasin, Jeshima Khan

    2016-01-01

    Simple Sequence Repeats or microsatellites are resourceful molecular genetic markers. There are only a few reports of SSR identification and development in pineapple. The complete genome sequence of pineapple available in the public domain can be used to develop numerous novel SSRs. Therefore, an attempt was made to identify SSRs from genomic, chloroplast, mitochondrial and EST sequences of pineapple, which will help in deciphering the genetic makeup of its germplasm resources. A total of 359,511 SSRs were identified in pineapple (356,385 from the genome sequence, 45 from the chloroplast sequence, 249 from the mitochondrial sequence and 2,832 from EST sequences). The list of EST-SSR markers and their details are available in the database. PineElm_SSRdb is an open source database available for non-commercial academic purposes at http://app.bioelm.com/ with a mapping tool that can produce circular maps of a selected marker set. This database will be of immense use to breeders, researchers and graduates working on Ananas spp. and to others working on cross-species transferability of markers, investigating diversity, mapping and DNA fingerprinting.
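    As a minimal illustration of how perfect microsatellites can be pulled from a sequence with a back-referencing regular expression: the unit-length and repeat-count thresholds below are assumptions for the sketch, not the criteria actually used to build PineElm_SSRdb.

```python
import re

def find_ssrs(seq, min_unit=2, max_unit=6, min_repeats=5):
    """Report perfect tandem repeats as (start, unit, n_repeats) tuples
    for unit lengths min_unit..max_unit occurring >= min_repeats times.
    Thresholds are illustrative only."""
    hits = []
    for k in range(min_unit, max_unit + 1):
        # ([ACGT]{k}) captures the repeat unit; \1{n,} demands n more copies
        pat = re.compile(r"([ACGT]{%d})\1{%d,}" % (k, min_repeats - 1))
        for m in pat.finditer(seq.upper()):
            hits.append((m.start(), m.group(1), len(m.group(0)) // k))
    return hits
```

    A production pipeline (e.g. one based on a tool such as MISA) would also handle compound and imperfect repeats and per-unit-length minimums, which this sketch omits.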

  14. Geology of the Cape Mendocino, Eureka, Garberville, and Southwestern Part of the Hayfork 30 x 60 Minute Quadrangles and Adjacent Offshore Area, Northern California

    USGS Publications Warehouse

    McLaughlin, Robert J.; Ellen, S.D.; Blake, M.C.; Jayko, Angela S.; Irwin, W.P.; Aalto, K.R.; Carver, G.A.; Clarke, S.H.; Barnes, J.B.; Cecil, J.D.; Cyr, K.A.

    2000-01-01

    Introduction: These geologic maps and accompanying structure sections depict the geology and structure of much of northwestern California and the adjacent continental margin. The map area includes the Mendocino triple junction, which is the juncture of the North American continental plate with two plates of the Pacific ocean basin. The map area also encompasses major geographic and geologic provinces of northwestern California. The maps incorporate much previously unpublished geologic mapping done between 1980 and 1995, as well as published mapping done between about 1950 and 1978. To construct structure sections to mid-crustal depths, we integrate the surface geology with interpretations of crustal structure based on seismicity, gravity and aeromagnetic data, offshore structure, and seismic reflection and refraction data. In addition to describing major geologic and structural features of northwestern California, the geologic maps have the potential to address a number of societally relevant issues, including hazards from earthquakes, landslides, and floods and problems related to timber harvest, wildlife habitat, and changing land use. All of these topics will continue to be of interest in the region, as changing land uses and population density interact with natural conditions. In these interactions, it is critical that the policies and practices affecting man and the environment integrate an adequate understanding of the geology. This digital map database, compiled from previously published and unpublished data, and new mapping by the authors, represents the general distribution of bedrock and surficial deposits in the mapped area. Together with the accompanying text file (ceghmf.ps, ceghmf.pdf, ceghmf.txt), it provides current information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey.
The scale of the source maps limits the spatial resolution (scale) of the database to 1:100,000 or smaller.

  15. Preliminary metallogenic belt and mineral deposit maps for northeast Asia

    USGS Publications Warehouse

    Obolenskiy, Alexander A.; Rodionov, Sergey M.; Dejidmaa, Gunchin; Gerel, Ochir; Hwang, Duk-Hwan; Distanov, Elimir G.; Badarch, Gombosuren; Khanchuk, Alexander I.; Ogasawara, Masatsugu; Nokleberg, Warren J.; Parfenov, Leonid M.; Prokopiev, Andrei V.; Seminskiy, Zhan V.; Smelov, Alexander P.; Yan, Hongquan; Birul'kin, Gennandiy V.; Davydov, Yuriy V.V.; Fridovskiy, Valeriy Yu.; Gamyanin, Gennandiy N.; Kostin, Alexei V.; Letunov, Sergey A.; Li, Xujun; Nikitin, Valeriy M.; Sotnikov, Sadahisa; Sudo, Vitaly I.; Spiridonov, Alexander V.; Stepanov, Vitaly A.; Sun, Fengyue; Sun, Jiapeng; Sun, Weizhi; Supletsov, Valeriy M.; Timofeev, Vladimir F.; Tyan, Oleg A.; Vetluzhskikh, Valeriy G.; Wakita, Koji; Yakovlev, Yakov V.; Zorina, Lydia M.

    2003-01-01

    The metallogenic belts and locations of major mineral deposits of Northeast Asia are portrayed on Sheets 1-4. Sheet 1 portrays the location of significant lode deposits and placer districts at a scale of 1:7,500,000. Sheets 2-4 portray the metallogenic belts of the region in a series of 12 time-slices from the Archean through the Quaternary at a scale of 1:15,000,000. For all four map sheets, a generalized geodynamics base map, derived from a more detailed map by Parfenov and others (2003), is used as an underlay for the metallogenic belt maps. This geodynamics map underlay depicts the major geologic units and structures that host the metallogenic belts. Four tables are included in this report. A hierarchical ranking of mineral deposit models is listed in Table 1, and summary features of lode deposits, placer districts, and metallogenic belts are described in Tables 2, 3, and 4, respectively. The metallogenic belts for Northeast Asia are synthesized, compiled, described, and interpreted with the use of modern concepts of plate tectonics, analysis of terranes and overlap assemblages, and synthesis of mineral deposit models. The data supporting the compilation are: (1) comprehensive descriptions of mineral deposits; (2) compilation and synthesis of a regional geodynamics map of the region at 1:5,000,000 scale with detailed explanations and cited references; and (3) compilation and synthesis of metallogenic belt maps at 1:15,000,000 scale with detailed explanations and cited references. These studies are part of a major international collaborative study of the Mineral Resources, Metallogenesis, and Tectonics of Northeast Asia that is being conducted from 1997 through 2002 by geologists from earth science agencies and universities in Russia, Mongolia, Northeastern China, South Korea, Japan, and the USA.
Companion studies and previous publications are: (1) a detailed geodynamics map of Northeast Asia (Parfenov and others, 2003); (2) a compilation of major mineral deposit models (Rodionov and Nokleberg, 2000; Rodionov and others, 2000; Obolenskiy and others, 2003); and (3) a database on significant metalliferous and selected nonmetalliferous lode deposits, and selected placer districts (Ariunbileg and others, 2003).

  16. Active, capable, and potentially active faults - a paleoseismic perspective

    USGS Publications Warehouse

    Machette, M.N.

    2000-01-01

    Maps of faults (geologically defined source zones) may portray seismic hazards in a wide range of completeness depending on which types of faults are shown. Three fault terms - active, capable, and potential - are used in a variety of ways for different reasons or applications. Nevertheless, to be useful for seismic-hazards analysis, fault maps should encompass a time interval that includes several earthquake cycles. For example, if the common recurrence in an area is 20,000-50,000 years, then maps should include faults that are 50,000-100,000 years old (two to five typical earthquake cycles), thus allowing for temporal variability in slip rate and recurrence intervals. Conversely, in more active areas such as plate boundaries, maps showing faults that are <10,000 years old should include those with at least 2 to as many as 20 paleoearthquakes. For the International Lithosphere Program's Task Group II-2 Project on Major Active Faults of the World, our maps and database will show five age categories and four slip-rate categories that allow one to select differing time spans and activity rates for seismic-hazard analysis depending on tectonic regime. The maps are accompanied by a database that describes evidence for Quaternary faulting, geomorphic expression, and paleoseismic parameters (slip rate, recurrence interval, and time of most recent surface faulting). These maps and databases provide an inventory of faults that would be defined as active, capable, and potentially active for seismic-hazard assessments.
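    The age-window selection described above (include only faults young enough to span several earthquake cycles for the chosen tectonic regime) can be illustrated with a minimal sketch; the fault records and cutoff values below are invented for demonstration, not taken from the Task Group II-2 database.

```python
# Hypothetical fault inventory: time of most recent surface faulting (ka)
# and slip rate (mm/yr), two of the paleoseismic parameters the database
# records for each fault.
faults = [
    {"name": "A", "last_event_ka": 5,   "slip_mm_yr": 1.20},
    {"name": "B", "last_event_ka": 40,  "slip_mm_yr": 0.05},
    {"name": "C", "last_event_ka": 300, "slip_mm_yr": 0.01},
]

def select_active(faults, max_age_ka):
    """Return fault names whose most recent event falls inside the chosen
    hazard time window, e.g. a short window (<10 ka) near plate boundaries
    and a longer one (<=100 ka) in slower intraplate settings."""
    return [f["name"] for f in faults if f["last_event_ka"] <= max_age_ka]
```

    Widening the window from 10 ka to 100 ka picks up fault B as well as A, which is the point of the paper: the definition of "active" must track the local earthquake cycle, not a single fixed age.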

  17. Implementation of Policies to Bridge the Gap Between Police Officer Line of Duty Deaths and Agency Resiliency

    DTIC Science & Technology

    2015-12-01

    List-of-figures excerpt: Figure 1. ... by Year and Category; Figure 2. Map of Florida; Figure 3. Map of St. Petersburg; Figure 4. Method of Line of ...; Figure 7. Map of Eastern United States; Figure 8. Virginia State Police Division Map

  18. 1986 Year End Report for Road Following at Carnegie-Mellon

    DTIC Science & Technology

    1987-05-01

    how to make them work efficiently. We designed a hierarchical structure and a monitor module which manages all parts of the hierarchy (see figure 1...database, called the Local Map, is managed by a program known as the Local Map Builder (LMB). Each module stores and retrieves information in the...knowledge-intensive modules, and a database manager that synchronizes the modules-is characteristic of a traditional blackboard system. Such a system is

  19. Geologic map of Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.; Hults, Chad P.; Mull, Charles G.; Karl, Susan M.

    2015-12-31

    This Alaska compilation is unique in that it is integrated with a rich database of information provided in the spatial datasets and standalone attribute databases. Within the spatial files every line and polygon is attributed to its original source; the references to these sources are contained in related tables, as well as in stand-alone tables. Additional attributes include typical lithology, geologic setting, and age range for the map units. Also included are tables of radiometric ages.

  20. Computer-Aided Clinical Trial Recruitment Based on Domain-Specific Language Translation: A Case Study of Retinopathy of Prematurity

    PubMed Central

    2017-01-01

    Reusing the data from healthcare information systems can effectively facilitate clinical trials (CTs). How to select candidate patients eligible under CT recruitment criteria is a central task. Related work either depends on a DBA (database administrator) to convert the recruitment criteria to native SQL queries or involves data mapping between a standard ontology/information model and each individual data source schema. This paper proposes an alternative computer-aided CT recruitment paradigm based on syntax translation between different DSLs (domain-specific languages). In this paradigm, the CT recruitment criteria are first formally represented as production rules, with the referenced rule variables all drawn from the underlying database schema. The production rule is then translated to an intermediate query-oriented DSL (e.g., LINQ). Finally, the intermediate DSL is mapped directly to native database queries (e.g., SQL), automated by ORM (object-relational mapping). PMID:29065644
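    The rule-to-query translation idea can be sketched as follows. The paper's actual pipeline goes through LINQ and an ORM; this simplified sketch instead maps an invented (column, operator, value) rule representation straight to a parameterised SQL string, and the table and column names are made up for the example.

```python
# Map abstract comparison operators in a recruitment rule to SQL operators.
OPS = {"lt": "<", "le": "<=", "gt": ">", "ge": ">=", "eq": "="}

def rule_to_sql(table, criteria):
    """Translate a list of (column, op, value) recruitment criteria into a
    parameterised SQL query over the source schema (hypothetical format)."""
    clauses, params = [], []
    for column, op, value in criteria:
        clauses.append("%s %s ?" % (column, OPS[op]))
        params.append(value)
    sql = "SELECT patient_id FROM %s WHERE %s" % (table, " AND ".join(clauses))
    return sql, params

# e.g. illustrative ROP screening rule: gestational age < 32 weeks
# AND birth weight <= 1500 g (column names invented)
sql, params = rule_to_sql("infants", [("gest_age_weeks", "lt", 32),
                                      ("birth_weight_g", "le", 1500)])
```

    Keeping values as bound parameters rather than interpolated literals mirrors what an ORM generates and avoids SQL injection, one practical reason the paper delegates the final DSL-to-SQL step to ORM machinery.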

  1. e23D: database and visualization of A-to-I RNA editing sites mapped to 3D protein structures.

    PubMed

    Solomon, Oz; Eyal, Eran; Amariglio, Ninette; Unger, Ron; Rechavi, Gidi

    2016-07-15

    e23D, a database of A-to-I RNA editing sites from human, mouse and fly mapped to evolutionarily related protein 3D structures, is presented. Genomic coordinates of A-to-I RNA editing sites are converted to protein coordinates and mapped onto 3D structures from PDB or theoretical models from ModBase. e23D allows visualization of the protein structure, modeling of recoding events and orientation of the editing site with respect to nearby genomic functional sites from databases of disease-causing mutations and genomic polymorphisms. Availability: http://www.sheba-cancer.org.il/e23D. Contact: oz.solomon@live.biu.ac.il or Eran.Eyal@sheba.health.gov.il. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  2. Computer-Aided Clinical Trial Recruitment Based on Domain-Specific Language Translation: A Case Study of Retinopathy of Prematurity.

    PubMed

    Zhang, Yinsheng; Zhang, Guoming; Shang, Qian

    2017-01-01

    Reusing the data from healthcare information systems can effectively facilitate clinical trials (CTs). How to select candidate patients eligible under CT recruitment criteria is a central task. Related work either depends on a DBA (database administrator) to convert the recruitment criteria to native SQL queries or involves data mapping between a standard ontology/information model and each individual data source schema. This paper proposes an alternative computer-aided CT recruitment paradigm based on syntax translation between different DSLs (domain-specific languages). In this paradigm, the CT recruitment criteria are first formally represented as production rules, with the referenced rule variables all drawn from the underlying database schema. The production rule is then translated to an intermediate query-oriented DSL (e.g., LINQ). Finally, the intermediate DSL is mapped directly to native database queries (e.g., SQL), automated by ORM (object-relational mapping).

  3. GEOGRAPHIC INFORMATION SYSTEM APPROACH FOR PLAY PORTFOLIOS TO IMPROVE OIL PRODUCTION IN THE ILLINOIS BASIN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beverly Seyler; John Grube

    2004-12-10

    Oil and gas have been commercially produced in Illinois for over 100 years. Existing commercial production is from more than fifty-two named pay horizons in Paleozoic rocks ranging in age from Middle Ordovician to Pennsylvanian. Over 3.2 billion barrels of oil have been produced. Recent calculations indicate that remaining mobile resources in the Illinois Basin may be on the order of several billion barrels. Thus, large quantities of oil, potentially recoverable using current technology, remain in Illinois oil fields despite a century of development. Many opportunities for increased production may have been missed due to complex development histories, multiple stacked pays, and commingled production which makes thorough exploitation of pays and the application of secondary or improved/enhanced recovery strategies difficult. Access to data, and the techniques required to evaluate and manage large amounts of diverse data, are major barriers to increased production of critical reserves in the Illinois Basin. These constraints are being alleviated by the development of a database access system using a Geographic Information System (GIS) approach for evaluation and identification of underdeveloped pays. The Illinois State Geological Survey has developed a methodology that is being used by industry to identify underdeveloped areas (UDAs) in and around petroleum reservoirs in Illinois using a GIS approach. This project utilizes a statewide oil and gas Oracle® database to develop a series of Oil and Gas Base Maps with well location symbols that are color-coded by producing horizon. Producing horizons are displayed as layers and can be selected as separate or combined layers that can be turned on and off. Map views can be customized to serve individual needs and page size maps can be printed. A core analysis database with over 168,000 entries has been compiled and assimilated into the ISGS Enterprise Oracle database.
    Maps of wells with core data have been generated. Data from over 1,700 Illinois waterflood units and waterflood areas have been entered into an Access® database. The waterflood area data have also been assimilated into the ISGS Oracle database for mapping and dissemination on the ArcIMS website. Formation depths for the Beech Creek Limestone, Ste. Genevieve Limestone, and New Albany Shale across the entire oil-producing region of Illinois have been calculated and entered into a digital database. Digital contoured structure maps have been constructed, edited, and added to the ILoil website as map layers. This technology/methodology addresses the long-standing constraints related to information access and data management in Illinois by significantly simplifying the laborious process that industry presently must use to identify underdeveloped pay zones in Illinois.

  4. De-MA: a web Database for electron Microprobe Analyses to assist EMP lab manager and users

    NASA Astrophysics Data System (ADS)

    Allaz, J. M.

    2012-12-01

    Lab managers and users of electron microprobe (EMP) facilities require comprehensive yet flexible documentation structures, as well as an efficient scheduling mechanism. A single on-line database system for managing reservations and providing information on standards, quantitative and qualitative setups (element mapping, etc.), and X-ray data has been developed for this purpose. This system is particularly useful in multi-user facilities where experience ranges from beginner to highly experienced. New users and occasional facility users will find these tools extremely useful in developing and maintaining high-quality, reproducible, and efficient analyses. This user-friendly database is available through the web, and uses MySQL as the database engine and PHP/HTML as the scripting languages (dynamic website). The database includes several tables for standards information, X-ray lines, X-ray element mapping, PHA, element setups, and the agenda. It is configurable for up to five different EMPs in a single lab, each of them having up to five spectrometers and as many diffraction crystals as required. The installation should be done on a web server supporting PHP/MySQL, although installation on a personal computer is possible using third-party freeware to create a local Apache server and to enable PHP/MySQL. Since it is web-based, any user outside the EMP lab can access this database anytime through any web browser and on any operating system. Access can be secured using general password protection (e.g., htaccess). The web interface consists of six main menus. (1) "Standards" lists the standards defined in the database and displays detailed information on each (e.g., material type, name, reference, comments, and analyses). Images such as EDS spectra or BSE can be associated with a standard.
    (2) "Analyses" lists typical setups to use for quantitative analyses, and allows calculation of mineral composition based on a mineral formula, or calculation of a mineral formula based on a fixed amount of oxygen or of cations (using an analysis in element or oxide weight-%); the latter includes re-calculation of H2O/CO2 based on stoichiometry, and oxygen correction for F and Cl. Another option offers a list of available standards and possible peak or background interferences for a series of elements. (3) "X-ray maps" lists the different setups recommended for element mapping using WDS, and a map calculator to facilitate map setups and to estimate the total mapping time. (4) "X-ray data" lists all X-ray lines for a specific element (K, L, M, absorption edges, and satellite peaks) in terms of energy, wavelength, and peak position. A check for possible interferences on peak or background is also possible. Theoretical X-ray peak positions for each crystal are calculated based on the 2d spacing of each crystal and the wavelength of each line. (5) "Agenda" displays the reservation dates for each month and for each EMP lab defined. It also offers a reservation request option, the request being sent by email to the EMP manager for approval. (6) Finally, "Admin" is password restricted, and contains all necessary options to manage the database through user-friendly forms. The installation of this database is made easy, and knowledge of HTML, PHP, or MySQL is unnecessary to install, configure, manage, or use it. A working database is accessible at http://cub.geoloweb.ch.
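    The theoretical peak-position calculation described for the "X-ray data" menu is an application of Bragg's law. A minimal sketch of that idea follows; the crystal 2d spacings and the Fe Kα wavelength are nominal illustrative values, not figures taken from the De-MA database:

```python
import math

# Illustrative 2d spacings (angstroms) for common WDS diffraction crystals.
# These are nominal textbook values, not entries from De-MA.
CRYSTALS = {"TAP": 25.745, "PET": 8.742, "LiF": 4.0267}

def bragg_angle_deg(wavelength, two_d, order=1):
    """Bragg angle theta (degrees) for an X-ray line of the given wavelength
    on a crystal with the given 2d spacing (both in angstroms):
    n * lambda = 2d * sin(theta)."""
    s = order * wavelength / two_d
    if s >= 1.0:
        raise ValueError("line cannot be diffracted by this crystal")
    return math.degrees(math.asin(s))

# Fe K-alpha (~1.937 angstroms) measured on LiF:
theta = bragg_angle_deg(1.937, CRYSTALS["LiF"])  # roughly 29 degrees
```

    A real spectrometer reports a mechanical L-position proportional to sin(theta), but the conversion constant is instrument-specific, so only the Bragg angle is computed here.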

  5. Map and database of Quaternary faults in Venezuela and its offshore regions

    USGS Publications Warehouse

    Audemard, F.A.; Machette, M.N.; Cox, J.W.; Dart, R.L.; Haller, K.M.

    2000-01-01

    As part of the International Lithosphere Program’s “World Map of Major Active Faults,” the U.S. Geological Survey is assisting in the compilation of a series of digital maps of Quaternary faults and folds in Western Hemisphere countries. The maps show the locations, ages, and activity rates of major earthquake-related features such as faults and fault-related folds. They are accompanied by databases that describe these features and document current information on their activity in the Quaternary. The project is a key part of the Global Seismic Hazards Assessment Program (ILP Project II-0) for the International Decade for Natural Hazard Disaster Reduction. The project is sponsored by the International Lithosphere Program and funded by the USGS’s National Earthquake Hazards Reduction Program. The primary elements of the project are general supervision and interpretation of geologic/tectonic information, data compilation and entry for the fault catalog, database design and management, and digitization and manipulation of data in ARC/INFO. For the compilation of data, we engaged experts in Quaternary faulting, neotectonics, paleoseismology, and seismology.

  6. NADM Conceptual Model 1.0 -- A Conceptual Model for Geologic Map Information

    USGS Publications Warehouse

    ,

    2004-01-01

    Executive Summary -- The NADM Data Model Design Team was established in 1999 by the North American Geologic Map Data Model Steering Committee (NADMSC) with the purpose of drafting a geologic map data model for consideration as a standard for developing interoperable geologic map-centered databases by state, provincial, and federal geological surveys. The model is designed to be a technology-neutral conceptual model that can form the basis for a web-based interchange format using evolving information technology (e.g., XML, RDF, OWL), and guide implementation of geoscience databases in a common conceptual framework. The intended purpose is to allow geologic information sharing between geologic map data providers and users, independent of local information system implementation. The model emphasizes geoscience concepts and relationships related to information presented on geologic maps. Design has been guided by an informal requirements analysis, documentation of existing databases, technology developments, and other standardization efforts in the geoscience and computer-science communities. A key aspect of the model is the notion that representation of the conceptual framework (ontology) that underlies geologic map data must be part of the model, because this framework changes with time and understanding, and varies between information providers. The top level of the model distinguishes geologic concepts, geologic representation concepts, and metadata. The geologic representation part of the model provides a framework for representing the ontology that underlies geologic map data through a controlled vocabulary, and for establishing the relationships between this vocabulary and a geologic map visualization or portrayal. Top-level geologic classes in the model are Earth material (substance), geologic unit (parts of the Earth), geologic age, geologic structure, fossil, geologic process, geologic relation, and geologic event.

  7. CAP: Mobile App

    Science.gov Websites


  8. Geologic and geophysical maps of the El Casco 7.5′ quadrangle, Riverside County, southern California, with accompanying geologic-map database

    USGS Publications Warehouse

    Matti, J.C.; Morton, D.M.; Langenheim, V.E.

    2015-01-01

    Geologic information contained in the El Casco database is general-purpose data applicable to land-related investigations in the earth and biological sciences. The term “general-purpose” means that all geologic-feature classes have minimal information content adequate to characterize their general geologic characteristics and to interpret their general geologic history. However, no single feature class has enough information to definitively characterize its properties and origin. For this reason the database cannot be used for site-specific geologic evaluations, although it can be used to plan and guide investigations at the site-specific level.

  9. 76 FR 1137 - Publicly Available Consumer Product Safety Information Database: Notice of Public Web Conferences

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-07

    ...: Notice of Public Web Conferences AGENCY: Consumer Product Safety Commission. ACTION: Notice. SUMMARY: The Consumer Product Safety Commission (``Commission,'' ``CPSC,'' or ``we'') is announcing two Web conferences... database (``Database''). The Web conferences will be webcast live from the Commission's headquarters in...

  10. 75 FR 49869 - Changes to Standard Numbering System, Vessel Identification System, and Boating Accident Report...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-16

    ... Boating Accident Report Database AGENCY: Coast Guard, DHS. ACTION: Reopening of public comment period... Boating Accident Report Database. DATES: Comments and related material must either be submitted to our... Database that, collectively, are intended to improve recreational boating safety efforts, enhance law...

  11. 48 CFR 504.602-71 - Federal Procurement Data System-Public access to data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Procurement Data System—Public access to data. (a) The FPDS database. The General Services Administration awarded a contract for creation and operation of the Federal Procurement Data System (FPDS) database. That database includes information reported by departments and agencies as required by Federal Acquisition...

  12. 48 CFR 504.602-71 - Federal Procurement Data System-Public access to data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Procurement Data System—Public access to data. (a) The FPDS database. The General Services Administration awarded a contract for creation and operation of the Federal Procurement Data System (FPDS) database. That database includes information reported by departments and agencies as required by Federal Acquisition...

  13. 75 FR 8392 - Low Income Housing Tax Credit Tenant Database

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5376-N-11] Low Income Housing Tax Credit Tenant Database AGENCY: Office of the Chief Information Officer, HUD. ACTION: Notice. SUMMARY: The... Lists the Following Information Title Of Proposal: Low Income Housing Tax Credit Tenant Database. Omb...

  14. Map and database of Quaternary faults and folds in Colombia and its offshore regions

    USGS Publications Warehouse

    Paris, Gabriel; Machette, Michael N.; Dart, Richard L.; Haller, Kathleen M.

    2000-01-01

    As part of the International Lithosphere Program’s “World Map of Major Active Faults,” the U.S. Geological Survey (USGS) is assisting in the compilation of a series of digital maps of Quaternary faults and folds in Western Hemisphere countries. The maps show the locations, ages, and activity rates of major earthquake-related features such as faults and fault-related folds. They are accompanied by databases that describe these features and document current information on their activity in the Quaternary. To date, the project has published fault and fold maps for Costa Rica (Montero and others, 1998), Panama (Cowan and others, 1998), Venezuela (Audemard and others, 2000), Bolivia/Chile (Lavenu and others, 2000), and Argentina (Costa and others, 2000). The project is a key part of the Global Seismic Hazards Assessment Program (ILP Project II-0) for the International Decade for Natural Hazard Disaster Reduction.

  15. Geologic map of Chickasaw National Recreation Area, Murray County, Oklahoma

    USGS Publications Warehouse

    Blome, Charles D.; Lidke, David J.; Wahl, Ronald R.; Golab, James A.

    2013-01-01

    This 1:24,000-scale geologic map is a compilation of previous geologic maps and new geologic mapping of areas in and around Chickasaw National Recreation Area. The geologic map includes revisions of numerous unit contacts and faults and a number of previously “undifferentiated” rock units were subdivided in some areas. Numerous circular-shaped hills in and around Chickasaw National Recreation Area are probably the result of karst-related collapse and may represent the erosional remnants of large, exhumed sinkholes. Geospatial registration of existing, smaller scale (1:72,000- and 1:100,000-scale) geologic maps of the area and construction of an accurate Geographic Information System (GIS) database preceded 2 years of fieldwork wherein previously mapped geology (unit contacts and faults) was verified and new geologic mapping was carried out. The geologic map of Chickasaw National Recreation Area and this pamphlet include information pertaining to how the geologic units and structural features in the map area relate to the formation of the northern Arbuckle Mountains and its Arbuckle-Simpson aquifer. The development of an accurate geospatial GIS database and the use of a handheld computer in the field greatly increased both the accuracy and efficiency in producing the 1:24,000-scale geologic map.

  16. Map and map database of susceptibility to slope failure by sliding and earthflow in the Oakland area, California

    USGS Publications Warehouse

    Pike, R.J.; Graymer, R.W.; Roberts, Sebastian; Kalman, N.B.; Sobieszczyk, Steven

    2001-01-01

    Map data that predict the varying likelihood of landsliding can help public agencies make informed decisions on land use and zoning. This map, prepared in a geographic information system from a statistical model, estimates the relative likelihood of local slopes to fail by two processes common to an area of diverse geology, terrain, and land use centered on metropolitan Oakland. The model combines the following spatial data: (1) 120 bedrock and surficial geologic-map units, (2) ground slope calculated from a 30-m digital elevation model, (3) an inventory of 6,714 old landslide deposits (not distinguished by age or type of movement and excluding debris flows), and (4) the locations of 1,192 post-1970 landslides that damaged the built environment. The resulting index of likelihood, or susceptibility, plotted as a 1:50,000-scale map, is computed as a continuous variable over a large area (872 km2) at a comparatively fine (30 m) resolution. This new model complements landslide inventories by estimating susceptibility between existing landslide deposits, and improves upon prior susceptibility maps by quantifying the degree of susceptibility within those deposits. Susceptibility is defined for each geologic-map unit as the spatial frequency (areal percentage) of terrain occupied by old landslide deposits, adjusted locally by steepness of the topography. Susceptibility of terrain between the old landslide deposits is read directly from a slope histogram for each geologic-map unit, as the percentage (0.00 to 0.90) of 30-m cells in each one-degree slope interval that coincides with the deposits. Susceptibility within landslide deposits (0.00 to 1.33) is this same percentage raised by a multiplier (1.33) derived from the comparative frequency of recent failures within and outside the old deposits. Positive results from two evaluations of the model encourage its extension to the 10-county San Francisco Bay region and elsewhere. 
    A similar map could be prepared for any area where the three basic constituents, a geologic map, a landslide inventory, and a slope map, are available in digital form. Added predictive power of the new susceptibility model may reside in attributes that remain to be explored, among them seismic shaking, distance to nearest road, and terrain elevation, aspect, relief, and curvature.
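    The susceptibility index described above reduces, for any one slope bin of a geologic-map unit, to an areal frequency with a fixed multiplier inside old deposits. A minimal sketch of that rule, using invented cell counts rather than data from the Oakland model:

```python
# Susceptibility for one slope bin of one geologic-map unit: the fraction of
# 30-m cells in the bin that coincide with old landslide deposits (0.00-0.90
# in the published model), raised by the 1.33 multiplier inside deposits
# (giving the model's 0.00-1.33 range). Counts below are hypothetical.

def susceptibility(cells_in_bin, deposit_cells_in_bin, inside_deposit):
    """Susceptibility index for one one-degree slope interval."""
    if cells_in_bin == 0:
        return 0.0
    frequency = deposit_cells_in_bin / cells_in_bin
    return frequency * 1.33 if inside_deposit else frequency

# Example: a slope bin with 400 cells, 120 of which fall on old deposits.
outside = susceptibility(400, 120, inside_deposit=False)  # 0.30
inside = susceptibility(400, 120, inside_deposit=True)    # 0.399
```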

  17. Linking NCBI to Wikipedia: a wiki-based approach.

    PubMed

    Page, Roderic D M

    2011-03-31

    The NCBI Taxonomy underpins many bioinformatics and phyloinformatics databases, but by itself provides limited information on the taxa it contains. One readily available source of information on many taxa is Wikipedia. This paper describes iPhylo Linkout, a Semantic wiki that maps taxa in NCBI's taxonomy database onto corresponding pages in Wikipedia. Storing the mapping in a wiki makes it easy to edit, correct, or otherwise annotate the links between NCBI and Wikipedia. The mapping currently comprises some 53,000 taxa, and is available at http://iphylo.org/linkout. The links between NCBI and Wikipedia are also made available to NCBI users through the NCBI LinkOut service.

  18. Design and implementation of a CORBA-based genome mapping system prototype.

    PubMed

    Hu, J; Mungall, C; Nicholson, D; Archibald, A L

    1998-01-01

    CORBA (Common Object Request Broker Architecture), as an open standard, is considered to be a good solution for the development and deployment of applications in distributed heterogeneous environments. This technology can be applied in the bioinformatics area to enhance utilization, management and interoperation between biological resources. This paper investigates issues in developing CORBA applications for genome mapping information systems in the Internet environment with emphasis on database connectivity and graphical user interfaces. The design and implementation of a CORBA prototype for an animal genome mapping database are described. The prototype demonstration is available via: http://www.ri.bbsrc.ac.uk/ark_corba/. jian.hu@bbsrc.ac.uk

  19. 44 CFR 70.5 - Letter of Map Amendment.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true Letter of Map Amendment. 70.5 Section 70.5 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program PROCEDURE FOR MAP CORRECTION Mapping Deficiencies Unrelated to...

  20. 44 CFR 70.5 - Letter of Map Amendment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Letter of Map Amendment. 70.5 Section 70.5 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program PROCEDURE FOR MAP CORRECTION Mapping Deficiencies Unrelated to...

  1. 44 CFR 70.5 - Letter of Map Amendment.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Letter of Map Amendment. 70.5 Section 70.5 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program PROCEDURE FOR MAP CORRECTION Mapping Deficiencies Unrelated to...

  2. 44 CFR 70.5 - Letter of Map Amendment.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Letter of Map Amendment. 70.5 Section 70.5 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program PROCEDURE FOR MAP CORRECTION Mapping Deficiencies Unrelated to...

  3. 44 CFR 70.5 - Letter of Map Amendment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Letter of Map Amendment. 70.5 Section 70.5 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program PROCEDURE FOR MAP CORRECTION Mapping Deficiencies Unrelated to...

  4. 75 FR 16719 - Information Collection; Forest Landscape Value and Special Place Mapping for National Forest...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-02

    ... Collection; Forest Landscape Value and Special Place Mapping for National Forest Planning AGENCY: Forest... on the new information collection, Forest Landscape Value and Special Place Mapping for National... holidays. SUPPLEMENTARY INFORMATION: Title: Forest Landscape Value and Special Place Mapping for National...

  5. 76 FR 78015 - Revised Analysis and Mapping Procedures for Non-Accredited Levees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-15

    ...] Revised Analysis and Mapping Procedures for Non-Accredited Levees AGENCY: Federal Emergency Management... comments on the proposed solution for Revised Analysis and Mapping Procedures for Non-Accredited Levees. This document proposes a revised procedure for the analysis and mapping of non-accredited levees on...

  6. Maps of the United States

    USGS Publications Warehouse

    ,

    1998-01-01

    The U.S. Geological Survey (USGS) sells a variety of maps of the United States.  Who needs these maps?  Students, land planners, politicians, teachers, marketing specialists, delivery companies, authors and illustrators, attorneys, railroad enthusiasts, travelers, Government agencies, military recruiters, newspapers, map collectors, truckers, boaters, hikers, sales representatives, communication specialists.  Everybody.

  7. 44 CFR 72.5 - Exemptions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) Requests for map changes based on mapping or study analysis errors; (b) Requests for map changes based on... and hydraulic studies conducted by Federal, State, or local agencies to replace approximate studies... information meant to improve upon that shown on the flood map or within the flood study will be exempt from...

  8. 44 CFR 72.5 - Exemptions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) Requests for map changes based on mapping or study analysis errors; (b) Requests for map changes based on... and hydraulic studies conducted by Federal, State, or local agencies to replace approximate studies... information meant to improve upon that shown on the flood map or within the flood study will be exempt from...

  9. A new catalog of planetary maps

    NASA Technical Reports Server (NTRS)

    Batson, R. M.; Inge, J. L.

    1991-01-01

    A single, concise reference to all existing planetary maps, including lunar ones, is being prepared that will allow map users to identify and locate maps of their areas of interest. This will be the first such comprehensive listing of planetary maps. Although the USGS shows index maps on the collar of each map sheet, periodically publishes index maps of Mars, and provides informal listings of the USGS map database, no tabulation exists that identifies all planetary maps, including those published by DMA and other organizations. The catalog will consist of a booklet containing small-scale image maps with superimposed quadrangle boundaries and map data tabulations.

  10. A digital geologic map database for the state of Oklahoma

    USGS Publications Warehouse

    Heran, William D.; Green, Gregory N.; Stoeser, Douglas B.

    2003-01-01

    This dataset is a composite of part or all of the twelve 1:250,000-scale quadrangles that make up Oklahoma. The result resembles a geologic map of the State of Oklahoma, but it is only an Oklahoma-shaped map clipped from the 1:250,000-scale geologic maps. This is not a new geologic map; no new mapping took place. The geologic information from each quadrangle is available within the composite dataset.

  11. From 20th century metabolic wall charts to 21st century systems biology: database of mammalian metabolic enzymes.

    PubMed

    Corcoran, Callan C; Grady, Cameron R; Pisitkun, Trairak; Parulekar, Jaya; Knepper, Mark A

    2017-03-01

    The organization of the mammalian genome into gene subsets corresponding to specific functional classes has provided key tools for systems biology research. Here, we have created a web-accessible resource called the Mammalian Metabolic Enzyme Database (https://hpcwebapps.cit.nih.gov/ESBL/Database/MetabolicEnzymes/MetabolicEnzymeDatabase.html) keyed to the biochemical reactions represented on iconic metabolic pathway wall charts created in the previous century. Overall, we have mapped 1,647 genes to these pathways, representing ~7 percent of the protein-coding genome. To illustrate the use of the database, we apply it to the area of kidney physiology. In so doing, we have created an additional database (Database of Metabolic Enzymes in Kidney Tubule Segments: https://hpcwebapps.cit.nih.gov/ESBL/Database/MetabolicEnzymes/), mapping mRNA abundance measurements (mined from RNA-Seq studies) for all metabolic enzymes to each of 14 renal tubule segments. We carry out bioinformatics analysis of the enzyme expression pattern among renal tubule segments and mine various data sources to identify vasopressin-regulated metabolic enzymes in the renal collecting duct. Copyright © 2017 the American Physiological Society.

  12. Archetype relational mapping - a practical openEHR persistence solution.

    PubMed

    Wang, Li; Min, Lingtong; Wang, Rui; Lu, Xudong; Duan, Huilong

    2015-11-05

    One of the primary obstacles to the widespread adoption of openEHR methodology is the lack of practical persistence solutions for future-proof electronic health record (EHR) systems as described by the openEHR specifications. This paper presents an archetype relational mapping (ARM) persistence solution for the archetype-based EHR systems to support healthcare delivery in the clinical environment. First, the data requirements of the EHR systems are analysed and organized into archetype-friendly concepts. The Clinical Knowledge Manager (CKM) is queried for matching archetypes; when necessary, new archetypes are developed to reflect concepts that are not encompassed by existing archetypes. Next, a template is designed for each archetype to apply constraints related to the local EHR context. Finally, a set of rules is designed to map the archetypes to data tables and provide data persistence based on the relational database. A comparison study was conducted to investigate the differences among the conventional database of an EHR system from a tertiary Class A hospital in China, the generated ARM database, and the Node + Path database. Five data-retrieving tests were designed based on clinical workflow to retrieve exams and laboratory tests. Additionally, two patient-searching tests were designed to identify patients who satisfy certain criteria. The ARM database achieved better performance than the conventional database in three of the five data-retrieving tests, but was less efficient in the remaining two tests. The time difference of query executions conducted by the ARM database and the conventional database is less than 130 %. The ARM database was approximately 6-50 times more efficient than the conventional database in the patient-searching tests, while the Node + Path database requires far more time than the other two databases to execute both the data-retrieving and the patient-searching tests. 
The ARM approach is capable of generating relational databases using archetypes and templates for archetype-based EHR systems, thus successfully adapting to changes in data requirements. ARM performance is similar to that of conventionally-designed EHR systems, and can be applied in a practical clinical environment. System components such as ARM can greatly facilitate the adoption of openEHR architecture within EHR systems.
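    The core ARM idea, mapping each archetype to a relational table with one column per leaf node, can be sketched as below. The archetype structure, column names, and type mapping are invented for illustration; the actual mapping rules in the paper are more elaborate:

```python
# Hypothetical mapping from openEHR-style data-value types to SQL column
# types. Real ARM rules also handle nesting, multiplicity, and constraints.
TYPE_MAP = {
    "DV_TEXT": "VARCHAR(255)",
    "DV_QUANTITY": "DOUBLE",
    "DV_DATE_TIME": "TIMESTAMP",
}

def archetype_to_ddl(table_name, leaves):
    """Build a CREATE TABLE statement for one archetype.
    `leaves` is a list of (column_name, rm_type) pairs taken from the
    archetype's leaf nodes."""
    cols = ", ".join(f"{col} {TYPE_MAP[rm_type]}" for col, rm_type in leaves)
    return f"CREATE TABLE {table_name} (id BIGINT PRIMARY KEY, {cols});"

# Illustrative archetype for a laboratory test result:
ddl = archetype_to_ddl(
    "lab_test_result",
    [("analyte", "DV_TEXT"), ("value", "DV_QUANTITY"),
     ("sampled_at", "DV_DATE_TIME")],
)
```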

  13. DBMap: a TreeMap-based framework for data navigation and visualization of brain research registry

    NASA Astrophysics Data System (ADS)

    Zhang, Ming; Zhang, Hong; Tjandra, Donny; Wong, Stephen T. C.

    2003-05-01

    The purpose of this study is to investigate and apply a new, intuitive, and space-conscious visualization framework to facilitate efficient data presentation and exploration of large-scale data warehouses. We have implemented the DBMap framework for the UCSF Brain Research Registry. Such a utility would help medical specialists and clinical researchers better explore and evaluate the many attributes organized in the brain research registry. The current UCSF Brain Research Registry consists of a federation of disease-oriented database modules, including Epilepsy, Brain Tumor, Intracerebral Hemorrhage, and CJD (Creutzfeldt-Jakob disease). These database modules organize large volumes of imaging and non-imaging data to support Web-based clinical research. While the data warehouse supports general information retrieval and analysis, it lacks an effective way to visualize and present the voluminous and complex data stored. This study investigates whether the TreeMap algorithm can be adapted to display and navigate a categorical biomedical data warehouse or registry. TreeMap is a space-constrained graphical representation of large hierarchical data sets, mapped to a matrix of rectangles whose size and color represent database fields of interest. It allows the display of a large amount of numerical and categorical information in the limited real estate of a computer screen with an intuitive user interface. The paper describes DBMap, the proposed new data visualization framework for large biomedical databases. Built upon XML, Java, and JDBC technologies, the prototype system includes a set of software modules that reside in the application server tier and provide interfaces to the back-end database tier and the front-end Web tier of the brain registry.
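    The treemap layout idea, hierarchy nodes mapped to nested rectangles whose areas are proportional to a size field, can be sketched with the classic slice-and-dice variant. This is a generic illustration, not DBMap's Java implementation, and the registry-like data are invented:

```python
# Slice-and-dice treemap: split the rectangle horizontally at even depths
# and vertically at odd depths, each child receiving area in proportion to
# its "size" field. Leaf rectangles are collected as (name, x, y, w, h).

def treemap(node, x, y, w, h, depth=0, out=None):
    """Lay out a hierarchy of {'name', 'size', optional 'children'} dicts."""
    if out is None:
        out = []
    children = node.get("children")
    if not children:
        out.append((node["name"], x, y, w, h))
        return out
    total = sum(c["size"] for c in children)
    offset = 0.0
    for child in children:
        frac = child["size"] / total
        if depth % 2 == 0:  # horizontal split
            treemap(child, x + offset * w, y, w * frac, h, depth + 1, out)
        else:               # vertical split
            treemap(child, x, y + offset * h, w, h * frac, depth + 1, out)
        offset += frac
    return out

# Invented registry-like hierarchy, laid out in a 100 x 100 viewport:
registry = {"name": "registry", "size": 100, "children": [
    {"name": "Epilepsy", "size": 60},
    {"name": "Brain Tumor", "size": 40},
]}
rects = treemap(registry, 0, 0, 100, 100)
```

    The two leaf rectangles end up side by side with widths proportional to their sizes, which is the property that lets a treemap show an entire registry on one screen.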

  14. Chapter 4 - The LANDFIRE Prototype Project reference database

    Treesearch

    John F. Caratti

    2006-01-01

    This chapter describes the data compilation process for the Landscape Fire and Resource Management Planning Tools Prototype Project (LANDFIRE Prototype Project) reference database (LFRDB) and explains the reference data applications for LANDFIRE Prototype maps and models. The reference database formed the foundation for all LANDFIRE tasks. All products generated by the...

  15. Transport Statistics - Transport - UNECE

    Science.gov Websites

    Two new datasets have been added to the UNECE transport statistics database: bus and coach statistics.

  16. [Effects of soil data and map scale on assessment of total phosphorus storage in upland soils].

    PubMed

    Li, Heng Rong; Zhang, Li Ming; Li, Xiao di; Yu, Dong Sheng; Shi, Xue Zheng; Xing, Shi He; Chen, Han Yue

    2016-06-01

    Accurate assessment of total phosphorus storage in farmland soils is of great significance to sustainable agriculture and non-point source pollution control. However, previous studies have not considered the estimation errors arising from mapping scales and from databases built on different sources of soil profile data. In this study, a total of 393×10⁴ hm² of upland in the 29 counties (or cities) of North Jiangsu was taken as a case study. Analysis was performed of how the four sources of soil profile data, namely "Soils of County", "Soils of Prefecture", "Soils of Province", and "Soils of China", and the six scales, i.e., 1:50000, 1:250000, 1:500000, 1:1000000, 1:4000000, and 1:10000000, used in the 24 soil databases established from the four soil journals, affected assessment of soil total phosphorus. Compared with the most detailed 1:50000 soil database, established with 983 upland soil profiles, the relative deviation of estimates of soil total phosphorus density (STPD) and soil total phosphorus storage (STPS) from the other soil databases varied from 4.8% to 48.9% and from 1.6% to 48.4%, respectively. The estimated STPD and STPS based on the 1:50000 database of "Soils of County" and most of the estimates based on the databases of each scale in "Soils of County" and "Soils of Prefecture" differed, at significance levels of P<0.001 or P<0.05. Extremely significant differences (P<0.001) existed between the estimates based on the 1:50000 database of "Soils of County" and the estimates based on the databases of each scale in "Soils of Province" and "Soils of China". This study demonstrates the importance of appropriate soil data sources and appropriate mapping scales in estimating STPS.
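    The relative deviations quoted above compare each coarser-scale estimate against the 1:50000 "Soils of County" baseline. A sketch of that comparison, with invented density values rather than the study's data:

```python
# Relative deviation of an estimate from a reference value, in percent,
# as used to compare coarser-scale STPD/STPS estimates with the 1:50000
# baseline. The density values below are hypothetical.

def relative_deviation_pct(estimate, baseline):
    """abs(estimate - baseline) / baseline, expressed in percent."""
    return abs(estimate - baseline) / baseline * 100.0

baseline_stpd = 0.95  # hypothetical baseline total-P density
coarse_stpd = 1.12    # hypothetical estimate from a coarser-scale database
dev = relative_deviation_pct(coarse_stpd, baseline_stpd)  # ~17.9 %
```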

  17. 28 CFR 802.29 - Exemption of the Pretrial Services Agency System.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... THE DISTRICT OF COLUMBIA DISCLOSURE OF RECORDS Exemption of Records Systems Under the Privacy Act § 802.29 Exemption of the Pretrial Services Agency System. The Privacy Act permits specific systems of... Bail Agency Database (ABADABA) (CSOSA/PSA-1). (ii) Drug Test Management System (DTMS) (CSOSA/PSA-2...

  18. 28 CFR 802.29 - Exemption of the Pretrial Services Agency System.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... THE DISTRICT OF COLUMBIA DISCLOSURE OF RECORDS Exemption of Records Systems Under the Privacy Act § 802.29 Exemption of the Pretrial Services Agency System. The Privacy Act permits specific systems of... Bail Agency Database (ABADABA) (CSOSA/PSA-1). (ii) Drug Test Management System (DTMS) (CSOSA/PSA-2...

  19. 28 CFR 802.29 - Exemption of the Pretrial Services Agency System.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... THE DISTRICT OF COLUMBIA DISCLOSURE OF RECORDS Exemption of Records Systems Under the Privacy Act § 802.29 Exemption of the Pretrial Services Agency System. The Privacy Act permits specific systems of... Bail Agency Database (ABADABA) (CSOSA/PSA-1). (ii) Drug Test Management System (DTMS) (CSOSA/PSA-2...

  20. Advertising Agency Libraries: 30 Years of Change.

    ERIC Educational Resources Information Center

    Christianson, Elin B.; Waldron, Anne M.

    1988-01-01

    Reports on a survey of advertising agency libraries and compares results of the current study with similar surveys from 1954 and 1969. Characteristics of the parent agency, organizational status and location, budgets, users, staff, collections, indexes and databases, reference books, and library services are the areas addressed. Data are presented…

  1. 78 FR 50435 - Agency Information Collection Activities; Proposed Collection; Comment Request: FEMA Mitigation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    ... Mitigation Success Story Database AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY... (GPRA) (Pub. L. 103-62, Section 2) FEMA has established the FEMA Mitigation Best Practices success story... stories incorporate mitigation strategies that have been successfully implemented and provide real-world...

  2. 48 CFR 32.1110 - Solicitation provision and contract clauses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... System for Award Management (SAM) database and maintain registration until final payment, unless— (i..., or a similar agency clause that requires the contractor to be registered in the SAM database. (ii)(A...

  3. 48 CFR 32.1110 - Solicitation provision and contract clauses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... System for Award Management (SAM) database and maintain registration until final payment, unless— (i..., or a similar agency clause that requires the contractor to be registered in the SAM database. (ii)(A...

  4. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  5. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  6. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  7. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  8. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  9. 75 FR 28024 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-19

    ... the data-capturing process. SAMHSA will place Web site registration information into a Knowledge Management database and will place email subscription information into a database maintained by a third-party...

  10. Pattern-based, multi-scale segmentation and regionalization of EOSD land cover

    NASA Astrophysics Data System (ADS)

    Niesterowicz, Jacek; Stepinski, Tomasz F.

    2017-10-01

    The Earth Observation for Sustainable Development of Forests (EOSD) map is a 25 m resolution thematic map of Canadian forests. Because of its large spatial extent and relatively high resolution, the EOSD is difficult to analyze using standard GIS methods. In this paper we propose multi-scale segmentation and regionalization as new methods for analyzing the EOSD on large spatial scales. Segments, which we refer to as forest land units (FLUs), are delineated as tracts of forest characterized by cohesive patterns of EOSD categories; depending on the selected pattern scale, we delineated from 727 to 91,885 FLUs within the spatial extent of the EOSD. The pattern of EOSD categories within each FLU is described by 1037 landscape metrics. A shapefile containing the boundaries of all FLUs, together with an attribute table listing the landscape metrics, makes up an SQL-searchable spatial database providing detailed information on the composition and pattern of land cover types in Canadian forests. The shapefile format and the extensive attribute table covering the entire EOSD legend are designed to facilitate a broad range of investigations that require assessing the composition and pattern of forest over large areas. We calculated four such databases using different spatial scales of pattern. We illustrate the use of the FLU database by producing forest regionalization maps of two Canadian provinces, Quebec and Ontario. Such maps capture the broad-scale variability of forest at the spatial scale of an entire province. We also demonstrate how the FLU database can be used to map the variability of landscape metrics, and thus the character of the landscape, across all of Canada.
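    An SQL-searchable attribute table of landscape metrics keyed to segment IDs, as described above, can be sketched with sqlite3; the table and column names here (flu, flu_id, conifer_pct, patch_density) are hypothetical stand-ins for the database's 1037 metrics:

```python
import sqlite3

# Sketch of querying an FLU attribute table by landscape metrics.
# Table/column names are hypothetical, not the database's actual schema.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE flu (flu_id INTEGER PRIMARY KEY, conifer_pct REAL, patch_density REAL)"
)
con.executemany(
    "INSERT INTO flu VALUES (?, ?, ?)",
    [(1, 72.5, 0.8), (2, 15.0, 2.4), (3, 64.1, 1.9)],
)

# Select conifer-dominated, highly fragmented forest land units.
rows = con.execute(
    "SELECT flu_id FROM flu WHERE conifer_pct > 60 AND patch_density > 1.5"
).fetchall()
print(rows)  # segment IDs matching both criteria
```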

  11. Preliminary maps of Quaternary deposits and liquefaction susceptibility, nine-county San Francisco Bay region, California: a digital database

    USGS Publications Warehouse

    Knudsen, Keith L.; Sowers, Janet M.; Witter, Robert C.; Wentworth, Carl M.; Helley, Edward J.; Nicholson, Robert S.; Wright, Heather M.; Brown, Katherine H.

    2000-01-01

    This report presents a preliminary map and database of Quaternary deposits and liquefaction susceptibility for the nine-county San Francisco Bay region, together with a digital compendium of ground effects associated with past earthquakes in the region. The report consists of (1) a spatial database of five data layers (Quaternary deposits, quadrangle index, and three ground effects layers) and two text layers (a labels and leaders layer for Quaternary deposits and for ground effects), (2) two small-scale colored maps (Quaternary deposits and liquefaction susceptibility), (3) a text describing the Quaternary map, liquefaction interpretation, and the ground effects compendium, and (4) the database description pamphlet. The nine counties surrounding San Francisco Bay straddle the San Andreas fault system, which exposes the region to serious earthquake hazard (Working Group on California Earthquake Probabilities, 1999). Much of the land adjacent to the Bay and the major rivers and streams is underlain by unconsolidated deposits that are particularly vulnerable to earthquake shaking and liquefaction of water-saturated granular sediment. This new map provides a modern and regionally consistent treatment of Quaternary surficial deposits that builds on the pioneering mapping of Helley and Lajoie (Helley and others, 1979) and such intervening work as Atwater (1982), Helley and others (1994), and Helley and Graymer (1997a and b). Like these earlier studies, the current mapping uses geomorphic expression, pedogenic soils, and inferred depositional environments to define and distinguish the map units. In contrast to the twelve map units of Helley and Lajoie, however, this new map uses a complex stratigraphy of some forty units, which permits a more realistic portrayal of the Quaternary depositional system.
The two colored maps provide a regional summary of the new mapping at a scale of 1:275,000, a scale that is sufficient to show the general distribution and relationships of the map units but cannot distinguish the more detailed elements that are present in the database. The report is the product of years of cooperative work by the USGS National Earthquake Hazards Reduction Program (NEHRP) and National Cooperative Geologic Mapping Program, William Lettis & Associates, Inc. (WLA) and, more recently, by the California Division of Mines and Geology as well. An earlier version was submitted to the Geological Survey by WLA as a final report for a NEHRP grant (Knudsen and others, 2000). The mapping has been carried out by WLA geologists under contract to the NEHRP Earthquake Program (Grants #14-08-0001-G2129, 1434-94-G-2499, 1434-HQ-97-GR-03121, and 99-HQ-GR-0095) and with other limited support from the County of Napa, and recently also by the California Division of Mines and Geology. The current map consists of this new mapping and revisions of previous USGS mapping.

  12. Real-time terrain storage generation from multiple sensors towards mobile robot operation interface.

    PubMed

    Song, Wei; Cho, Seoungjae; Xi, Yulong; Cho, Kyungeun; Um, Kyhyun

    2014-01-01

    A mobile robot mounted with multiple sensors is used to rapidly collect 3D point clouds and video images so as to allow accurate terrain modeling. In this study, we develop a real-time terrain storage generation and representation system including a nonground point database (PDB), ground mesh database (MDB), and texture database (TDB). A voxel-based flag map is proposed for incrementally registering large-scale point clouds in a terrain model in real time. We quantize the 3D point clouds into 3D grids of the flag map as a comparative table in order to remove the redundant points. We integrate the large-scale 3D point clouds into a nonground PDB and a node-based terrain mesh using the CPU. Subsequently, we program a graphics processing unit (GPU) to generate the TDB by mapping the triangles in the terrain mesh onto the captured video images. Finally, we produce a nonground voxel map and a ground textured mesh as a terrain reconstruction result. Our proposed methods were tested in an outdoor environment. Our results show that the proposed system was able to rapidly generate terrain storage and provide high resolution terrain representation for mobile mapping services and a graphical user interface between remote operators and mobile robots.
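    The voxel-based flag map used above to drop redundant points can be sketched as follows; the 0.1 m voxel size is an illustrative assumption, not a parameter from the paper:

```python
# Sketch of the voxel-based flag-map idea: quantize incoming 3D points
# to a voxel grid and keep only the first point per occupied voxel.
VOXEL = 0.1  # meters; hypothetical grid resolution, chosen for illustration

def voxel_key(p, size=VOXEL):
    """Quantize a 3D point to its integer voxel coordinates."""
    return (int(p[0] // size), int(p[1] // size), int(p[2] // size))

def register(points, flag_map=None):
    """Incrementally register points, discarding ones whose voxel is already flagged."""
    if flag_map is None:
        flag_map = {}
    kept = []
    for p in points:
        k = voxel_key(p)
        if k not in flag_map:  # voxel not yet occupied: keep the point, flag the voxel
            flag_map[k] = True
            kept.append(p)
    return kept, flag_map

scan = [(0.01, 0.02, 0.0), (0.03, 0.04, 0.0), (0.25, 0.0, 0.0)]
kept, flags = register(scan)
print(len(kept))  # the second point falls in an already-flagged voxel and is removed
```

    Because the flag map persists between calls, subsequent scans can be registered incrementally against the same dictionary, mirroring the real-time accumulation described in the abstract.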

  13. ReMap 2018: an updated atlas of regulatory regions from an integrative analysis of DNA-binding ChIP-seq experiments

    PubMed Central

    Chèneby, Jeanne; Gheorghe, Marius; Artufel, Marie

    2018-01-01

    Abstract With this latest release of ReMap (http://remap.cisreg.eu), we present a unique collection of regulatory regions in human, resulting from a large-scale integrative analysis of ChIP-seq experiments for hundreds of transcriptional regulators (TRs) such as transcription factors, transcriptional co-activators and chromatin regulators. In 2015, we introduced the ReMap database to capture the genome regulatory space by integrating public ChIP-seq datasets, covering 237 TRs across 13 million (M) peaks. In this release, we have extended this catalog to constitute a unique collection of regulatory regions. Specifically, we have collected, analyzed and retained after quality control a total of 2829 ChIP-seq datasets available from public sources, covering a total of 485 TRs with a catalog of 80M peaks. Additionally, the updated database includes new search features for TR names as well as aliases, including cell line names, and the ability to navigate the data directly within genome browsers via public track hubs. Finally, full access to this catalog is available online together with a TR binding enrichment analysis tool. ReMap 2018 provides a significant update of the ReMap database, providing an in-depth view of the complexity of the regulatory landscape in human. PMID:29126285

  14. Real-Time Terrain Storage Generation from Multiple Sensors towards Mobile Robot Operation Interface

    PubMed Central

    Cho, Seoungjae; Xi, Yulong; Cho, Kyungeun

    2014-01-01

    A mobile robot mounted with multiple sensors is used to rapidly collect 3D point clouds and video images so as to allow accurate terrain modeling. In this study, we develop a real-time terrain storage generation and representation system including a nonground point database (PDB), ground mesh database (MDB), and texture database (TDB). A voxel-based flag map is proposed for incrementally registering large-scale point clouds in a terrain model in real time. We quantize the 3D point clouds into 3D grids of the flag map as a comparative table in order to remove the redundant points. We integrate the large-scale 3D point clouds into a nonground PDB and a node-based terrain mesh using the CPU. Subsequently, we program a graphics processing unit (GPU) to generate the TDB by mapping the triangles in the terrain mesh onto the captured video images. Finally, we produce a nonground voxel map and a ground textured mesh as a terrain reconstruction result. Our proposed methods were tested in an outdoor environment. Our results show that the proposed system was able to rapidly generate terrain storage and provide high resolution terrain representation for mobile mapping services and a graphical user interface between remote operators and mobile robots. PMID:25101321

  15. Geographical Distribution of Woody Biomass Carbon in Tropical Africa: An Updated Database for 2000 (NDP-055.2007, NDP-055b))

    DOE Data Explorer

    Gibbs, Holly K. [Center for Sustainability and the Global Environment (SAGE), University of Wisconsin, Madison, WI (USA); Brown, Sandra [Winrock International, Arlington, VA (USA); Olsen, L. M. [Carbon Dioxide Information Analysis Center (CDIAC), Oak Ridge National Laboratory, Oak Ridge, TN (USA); Boden, Thomas A. [Carbon Dioxide Information Analysis Center (CDIAC), Oak Ridge National Laboratory, Oak Ridge, TN (USA)

    2007-09-01

    Maps of biomass density are critical inputs for estimating carbon emissions from deforestation and degradation of tropical forests. Brown and Gaston (1996) pioneered methods using GIS analysis to map forest biomass based on forest inventory data (ndp055). This database is an update of ndp055 (which represents conditions circa 1980) and accounts for land cover changes occurring up to the year 2000.

  16. CADASTER QSPR Models for Predictions of Melting and Boiling Points of Perfluorinated Chemicals.

    PubMed

    Bhhatarai, Barun; Teetz, Wolfram; Liu, Tao; Öberg, Tomas; Jeliazkova, Nina; Kochev, Nikolay; Pukalov, Ognyan; Tetko, Igor V; Kovarich, Simona; Papa, Ester; Gramatica, Paola

    2011-03-14

    Quantitative structure-property relationship (QSPR) studies of the melting point (MP) and boiling point (BP) of per- and polyfluorinated chemicals (PFCs) are presented. The training and prediction chemicals used for developing and validating the models were selected from the Syracuse PhysProp database and the literature. The available experimental data sets were split in two different ways: (a) random selection on response value, and (b) structural similarity verified by self-organizing map (SOM), in order to propose reliable predictive models, developed only on the training sets and externally verified on the prediction sets. Individual models based on linear and non-linear approaches, developed by different CADASTER partners using 0D-2D Dragon descriptors, E-state descriptors and fragment-based descriptors, as well as a consensus model and their predictions, are presented. In addition, the predictive performance of the developed models was verified on a blind external validation set (EV-set) prepared using the PERFORCE database, comprising 15 MP and 25 BP data points, respectively. This database contains only long-chain perfluoroalkylated chemicals, which are particularly monitored by regulatory agencies such as US-EPA and EU-REACH. QSPR models with internal and external validation on two different external prediction/validation sets, and a study of the applicability domain highlighting the robustness and high accuracy of the models, are discussed. Finally, MPs for an additional 303 PFCs and BPs for 271 PFCs, for which experimental measurements are unknown, were predicted. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. CardioScape mapping the cardiovascular funding landscape in Europe.

    PubMed

    Pries, Axel Radlach; Naoum, Anastasia; Habazettl, Helmut; Dunkel, Mathias; Preissner, Robert; Coats, Caroline J; Tornada, Ana; Orso, Francesco; Van de Werf, Frans; Wood, David A

    2017-04-25

    The burden of cardiovascular disease is increasing worldwide, which has to be reflected by cardiovascular (CV) research in Europe. CardioScape, an FP7-funded project initiated by the European Society of Cardiology (ESC), identified where CV research is performed, how it is funded and by whom. It could be transformed into an on-line and up-to-date resource of great relevance for researchers, funding bodies and policymakers, and could be a role model for mapping CV research funding in Europe and beyond. Relevant funding bodies in 28 European Union (EU) countries were identified by a multistep process involving experts in each country. Projects above a funding threshold of 100 k€ during the period 2010-2012 were included using a standard questionnaire. Results were classified by experts and adaptive text-analysis software according to a CV-research taxonomy integrating existing schemes from ESC journals and congresses. An on-line query portal was set up to allow different users to interrogate the database according to their specific viewpoints. CV-research funding varies strongly between nations, with the EU providing 37% of total available project funding, and clear geographical gradients exist. The data allow in-depth comparison of funding for different research areas and led to a number of recommendations by the consortium. CardioScape can support CV research by aiding researchers, funding agencies and policymakers in their strategic decisions, thus improving research quality, if the CardioScape strategy and technology become the basis of a continuously updated and expanded Europe-wide, publicly accessible database. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2017. For permissions, please email: journals.permissions@oup.com.

  18. CMD: a Cotton Microsatellite Database resource for Gossypium genomics

    PubMed Central

    Blenda, Anna; Scheffler, Jodi; Scheffler, Brian; Palmer, Michael; Lacape, Jean-Marc; Yu, John Z; Jesudurai, Christopher; Jung, Sook; Muthukumar, Sriram; Yellambalase, Preetham; Ficklin, Stephen; Staton, Margaret; Eshelman, Robert; Ulloa, Mauricio; Saha, Sukumar; Burr, Ben; Liu, Shaolin; Zhang, Tianzhen; Fang, Deqiu; Pepper, Alan; Kumpatla, Siva; Jacobs, John; Tomkins, Jeff; Cantrell, Roy; Main, Dorrie

    2006-01-01

    Background The Cotton Microsatellite Database (CMD) is a curated and integrated web-based relational database providing centralized access to publicly available cotton microsatellites, an invaluable resource for basic and applied research in cotton breeding. Description At present CMD contains publication, sequence, primer, mapping and homology data for nine major cotton microsatellite projects, collectively representing 5,484 microsatellites. In addition, CMD displays data for three of the microsatellite projects that have been screened against a panel of core germplasm. The standardized panel consists of 12 diverse genotypes including genetic standards, mapping parents, BAC donors, subgenome representatives, unique breeding lines, exotic introgression sources, and contemporary Upland cottons with significant acreage. A suite of online microsatellite data mining tools is accessible at CMD. These include an SSR server, which identifies microsatellites, primers, open reading frames, and GC content of uploaded sequences; BLAST and FASTA servers, providing sequence similarity searches against the existing cotton SSR sequences and primers; a CAP3 server to assemble EST sequences into longer transcripts prior to mining for SSRs; and CMap, a viewer for comparing cotton SSR maps. Conclusion The collection of publicly available cotton SSR markers in a centralized, readily accessible and curated web-enabled database provides more efficient utilization of microsatellite resources and will help accelerate basic and applied research in molecular breeding and genetic mapping in Gossypium spp. PMID:16737546

  19. Spatial Databases for CalVO Volcanoes: Current Status and Future Directions

    NASA Astrophysics Data System (ADS)

    Ramsey, D. W.

    2013-12-01

    The U.S. Geological Survey (USGS) California Volcano Observatory (CalVO) aims to advance scientific understanding of volcanic processes and to lessen the harmful impacts of volcanic activity in California and Nevada. Within CalVO's area of responsibility, ten volcanoes or volcanic centers have been identified as posing moderate, high, or very high threats to surrounding communities, based on their recent eruptive histories and their proximity to vulnerable people, property, and infrastructure, by a national volcanic threat assessment conducted in support of developing the U.S. National Volcano Early Warning System (NVEWS). To better understand the extent of potential hazards at these and other volcanoes and volcanic centers, the USGS Volcano Science Center (VSC) is continually compiling spatial databases of volcano information, including geologic mapping, hazard assessment maps, locations of geochemical and geochronological samples, and the distribution of volcanic vents. This digital mapping effort has been ongoing for over 15 years, and early databases are being converted to match recent datasets compiled with new data models designed for use in: 1) generating hazard zones, 2) evaluating risk to population and infrastructure, 3) numerical hazard modeling, and 4) display and query on the CalVO as well as other VSC and USGS websites. In these capacities, spatial databases of CalVO volcanoes and their derivative map products provide an integrated and readily accessible framework of VSC hazards science for colleagues, emergency managers, and the general public.

  20. Rice Annotation Project Database (RAP-DB): an integrative and interactive database for rice genomics.

    PubMed

    Sakai, Hiroaki; Lee, Sung Shin; Tanaka, Tsuyoshi; Numa, Hisataka; Kim, Jungsok; Kawahara, Yoshihiro; Wakimoto, Hironobu; Yang, Ching-chia; Iwamoto, Masao; Abe, Takashi; Yamada, Yuko; Muto, Akira; Inokuchi, Hachiro; Ikemura, Toshimichi; Matsumoto, Takashi; Sasaki, Takuji; Itoh, Takeshi

    2013-02-01

    The Rice Annotation Project Database (RAP-DB, http://rapdb.dna.affrc.go.jp/) has been providing a comprehensive set of gene annotations for the genome sequence of rice, Oryza sativa (japonica group) cv. Nipponbare. Since the first release in 2005, RAP-DB has been updated several times along with the genome assembly updates. Here, we present our newest RAP-DB based on the latest genome assembly, Os-Nipponbare-Reference-IRGSP-1.0 (IRGSP-1.0), which was released in 2011. We detected 37,869 loci by mapping transcript and protein sequences of 150 monocot species. To provide plant researchers with highly reliable and up-to-date rice gene annotations, we have been incorporating literature-based manually curated data, and 1,626 loci currently incorporate literature-based annotation data, including commonly used gene names or gene symbols. Transcriptional activities are shown at the nucleotide level by mapping RNA-Seq reads derived from 27 samples. We also mapped the Illumina reads of a leading Japanese japonica cultivar, Koshihikari, and a Chinese indica cultivar, Guangluai-4, to the genome and show the alignments together with single nucleotide polymorphisms (SNPs) and gene functional annotations through a newly developed browser, the Short-Read Assembly Browser (S-RAB). We have developed two satellite databases, the Plant Gene Family Database (PGFD) and the Integrative Database of Cereal Gene Phylogeny (IDCGP), which display gene family and homologous gene relationships among diverse plant species. RAP-DB and the satellite databases offer simple and user-friendly web interfaces, enabling plant and genome researchers to access the data easily and facilitating a broad range of plant research topics.

  1. Regional water table (2016) in the Mojave River and Morongo groundwater basins, southwestern Mojave Desert, California

    USGS Publications Warehouse

    Dick, Meghan; Kjos, Adam

    2017-12-07

    From January to April 2016, the U.S. Geological Survey (USGS), the Mojave Water Agency, and other local water districts made approximately 1,200 water-level measurements in about 645 wells located within 15 separate groundwater basins, collectively referred to as the Mojave River and Morongo groundwater basins. These data document recent conditions and, when compared with older data, changes in groundwater levels. A water-level contour map was drawn using data measured in 2016 that shows the elevation of the water table and general direction of groundwater movement for most of the groundwater basins. Historical water-level data stored in the USGS National Water Information System (https://waterdata.usgs.gov/nwis/) database were used in conjunction with data collected for this study to construct 37 hydrographs to show long-term (1930–2016) and short-term (1990–2016) water-level changes in the study area.

  2. Georgia's Surface-Water Resources and Streamflow Monitoring Network, 2006

    USGS Publications Warehouse

    Nobles, Patricia L.; ,

    2006-01-01

    The U.S. Geological Survey (USGS) network of 223 real-time monitoring stations, the 'Georgia HydroWatch,' provides real-time water-stage data, with streamflow computed at 198 locations, and rainfall recorded at 187 stations. These sites continuously record data on 15-minute intervals and transmit the data via satellite to be incorporated into the USGS National Water Information System database. These data are automatically posted to the USGS Web site for public dissemination (http://waterdata.usgs.gov/ga/nwis/nwis). The real-time capability of this network provides information to help emergency-management officials protect human life and property during floods, and mitigate the effects of prolonged drought. The map at right shows the USGS streamflow monitoring network for Georgia and major watersheds. Streamflow is monitored at 198 sites statewide, more than 80 percent of which include precipitation gages. Various Federal, State, and local agencies fund these streamflow monitoring stations.

  3. Supporting the Establishment of Climate-Resilient Rural Livelihoods in Mongolia with EO Services

    NASA Astrophysics Data System (ADS)

    Grosso, Nuno; Patinha, Carla; Sainkhuu, Tserendash; Bataa, Mendbayar; Doljinsuren, Nyamdorj

    2016-08-01

    The work presented here shows the results from the project "Climate-Resilient Rural Livelihoods in Mongolia", included in the EOTAP (Earth Observation for a Transforming Asia Pacific) initiative, a collaboration between the European Space Agency (ESA) and the Asian Development Bank (ADB), developed in cooperation with the Ministry of Food and Agriculture of Mongolia.The EO services developed within this EOTAP project primarily aimed at enriching the existing environmental database maintained by the National Remote Sensing Center (NRSC) in Mongolia and sustaining the collaborative pasture management practices introduced by the teams within the Ministry of Food and Agriculture of Mongolia. The geographic area covered by the EOTAP services is Bayankhongor province, in western Mongolia region, with two main services: drought monitoring at the provincial level for the year 2014 and Land Use/Land Cover (LULC) and changes mapping for three districts of this province (Buutsagaan, Dzag and Khureemaral) for the years 2013, 2014.

  4. Software Engineering Laboratory (SEL) database organization and user's guide, revision 2

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Bristow, John

    1992-01-01

    The organization of the Software Engineering Laboratory (SEL) database is presented. Included are definitions and detailed descriptions of the database tables and views, the SEL data, and system support data. The mapping from the SEL and system support data to the base table is described. In addition, techniques for accessing the database through the Database Access Manager for the SEL (DAMSEL) system and via the ORACLE structured query language (SQL) are discussed.

  5. Software Engineering Laboratory (SEL) database organization and user's guide

    NASA Technical Reports Server (NTRS)

    So, Maria; Heller, Gerard; Steinberg, Sandra; Spiegel, Douglas

    1989-01-01

    The organization of the Software Engineering Laboratory (SEL) database is presented. Included are definitions and detailed descriptions of the database tables and views, the SEL data, and system support data. The mapping from the SEL and system support data to the base tables is described. In addition, techniques for accessing the database, through the Database Access Manager for the SEL (DAMSEL) system and via the ORACLE structured query language (SQL), are discussed.

  6. Geologic map of the Patagonia Mountains, Santa Cruz County, Arizona

    USGS Publications Warehouse

    Graybeal, Frederick T.; Moyer, Lorre A.; Vikre, Peter; Dunlap, Pamela; Wallis, John C.

    2015-01-01

    Several spatial databases provide data for the geologic map of the Patagonia Mountains in Arizona. The data can be viewed and queried in ArcGIS 10, a geographic information system; a geologic map is also available in PDF format. All products are available online only.

  7. Staff - April M. Woolery | Alaska Division of Geological & Geophysical Surveys

    Science.gov Websites

  8. GlobCorine- A Joint EEA-ESA Project for Operational Land Cover and Land Use Mapping at Pan-European Scale

    NASA Astrophysics Data System (ADS)

    Bontemps, S.; Defourny, P.; Van Bogaert, E.; Weber, J. L.; Arino, O.

    2010-12-01

    Regular and global land cover mapping contributes to evaluating the impact of human activities on the environment. Jointly supported by the European Space Agency and the European Environment Agency, the GlobCorine project builds on the GlobCover findings and aims to make full use of the MERIS time series for frequent land cover monitoring. The GlobCover automated classification approach has been tuned to the pan-European continent and adjusted towards a classification compatible with the Corine typology. The GlobCorine 2005 land cover map has been completed, validated and made available to a broad-level stakeholder community from the ESA website. A first version of the GlobCorine 2009 map has also been produced, demonstrating the possibility of an operational production of frequent and updated global land cover maps.

  9. The Protein Disease Database of human body fluids: II. Computer methods and data issues.

    PubMed

    Lemkin, P F; Orr, G A; Goldstein, M P; Creed, G J; Myrick, J E; Merril, C R

    1995-01-01

    The Protein Disease Database (PDD) is a relational database of proteins and diseases. With this database it is possible to screen for quantitative protein abnormalities associated with disease states. These quantitative relationships use data drawn from the peer-reviewed biomedical literature. Assays may also include those observed in high-resolution electrophoretic gels that offer the potential to quantitate many proteins in a single test as well as data gathered by enzymatic or immunologic assays. We are using the Internet World Wide Web (WWW) and the Web browser paradigm as an access method for wide distribution and querying of the Protein Disease Database. The WWW hypertext transfer protocol and its Common Gateway Interface make it possible to build powerful graphical user interfaces that can support easy-to-use data retrieval using query specification forms or images. The details of these interactions are totally transparent to the users of these forms. Using a client-server SQL relational database, user query access, initial data entry and database maintenance are all performed over the Internet with a Web browser. We discuss the underlying design issues, mapping mechanisms and assumptions that we used in constructing the system, data entry, access to the database server, security, and synthesis of derived two-dimensional gel image maps and hypertext documents resulting from SQL database searches.
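    The form-to-SQL mapping the abstract describes can be sketched in miniature: a query string submitted from a WWW form is parsed and bound as a parameter into a relational query. This is a hedged illustration only; SQLite stands in for the actual client-server database, and the table and field names are invented, not the PDD's real schema.

```python
import sqlite3
from urllib.parse import parse_qs

# Hypothetical table of protein abundance ratios by disease; the real
# PDD schema is not reproduced here, so names and columns are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE protein_disease ("
             " protein TEXT, disease TEXT, ratio REAL)")
conn.executemany(
    "INSERT INTO protein_disease VALUES (?, ?, ?)",
    [("transferrin", "anemia", 1.8),
     ("albumin", "nephrosis", 0.4)],
)

def handle_query(query_string):
    """Turn a CGI-style query string into a parameterized SQL search,
    as a form-based WWW gateway to a relational database might."""
    form = parse_qs(query_string)
    disease = form.get("disease", [""])[0]
    cur = conn.execute(
        "SELECT protein, ratio FROM protein_disease WHERE disease = ?",
        (disease,),  # parameter binding keeps user input out of the SQL text
    )
    return cur.fetchall()

print(handle_query("disease=anemia"))  # [('transferrin', 1.8)]
```

    The parameter binding in `handle_query` is also the standard defense against the injection risk that comes with exposing a database to Web query forms.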

  10. Levelling and merging of two discrete national-scale geochemical databases: A case study showing the surficial expression of metalliferous black shales

    USGS Publications Warehouse

    Smith, Steven M.; Neilson, Ryan T.; Giles, Stuart A.

    2015-01-01

    Government-sponsored, national-scale, soil and sediment geochemical databases are used to estimate regional and local background concentrations for environmental issues, identify possible anthropogenic contamination, estimate mineral endowment, explore for new mineral deposits, evaluate nutrient levels for agriculture, and establish concentration relationships with human or animal health. Because of these different uses, it is difficult for any single database to accommodate all the needs of each client. Smith et al. (2013, p. 168) reviewed six national-scale soil and sediment geochemical databases for the United States (U.S.) and, for each, evaluated “its appropriateness as a national-scale geochemical database and its usefulness for national-scale geochemical mapping.” Each of the evaluated databases has strengths and weaknesses that were listed in that review. Two of these U.S. national-scale geochemical databases are similar in their sample media and collection protocols but have different strengths—primarily sampling density and analytical consistency. This project was implemented to determine whether those databases could be merged to produce a combined dataset that could be used for mineral resource assessments. The utility of the merged database was tested to see whether mapped distributions could identify metalliferous black shales at a national scale.
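    Levelling in this sense means adjusting one survey's concentration values so that systematic offsets between surveys do not masquerade as geochemical anomalies in the merged map. A minimal sketch of one simple scheme, matching medians and robust spreads, is shown below; the study's actual levelling method may differ, and the survey values are toy data.

```python
import statistics

def _mad(values, center):
    # median absolute deviation: a robust measure of spread
    return statistics.median(abs(x - center) for x in values)

def level(values_b, values_a):
    """Level dataset B onto dataset A by matching medians and robust
    spreads (a simple robust z-score transform; real levelling schemes
    can be more elaborate, e.g. quantile mapping by geologic unit)."""
    med_a = statistics.median(values_a)
    med_b = statistics.median(values_b)
    scale = _mad(values_a, med_a) / _mad(values_b, med_b)
    return [med_a + (x - med_b) * scale for x in values_b]

# Toy Ni concentrations (ppm) from two surveys: survey B shows the same
# spatial pattern as survey A but with a systematic shift and scale.
survey_a = [10.0, 12.0, 14.0, 16.0, 18.0]
survey_b = [20.0, 24.0, 28.0, 32.0, 36.0]

merged = survey_a + level(survey_b, survey_a)
print(sorted(merged))
```

    After levelling, the two surveys share a common baseline, so the combined values can be gridded and mapped as one dataset.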

  11. SPECIATE Version 4.5 Database Development Documentation

    EPA Science Inventory

    This product updated SPECIATE 4.4 with new emission profiles to address high-priority Agency data gaps and to include new, more accurate emission profiles generated by research underway within and outside the Agency.

  12. Engineering With Nature Geographic Project Mapping Tool (EWN ProMap)

    DTIC Science & Technology

    2015-07-01

    The EWN ProMap database provides numerous case studies for infrastructure projects such as breakwaters, river engineering dikes, and seawalls that have... the EWN Project Mapping Tool (EWN ProMap) is to assist users in their search for case study information that can be valuable for developing EWN ideas... Essential elements of EWN include: (1) using science and engineering to produce operational efficiencies supporting sustainable delivery of...

  13. Using a spatial and tabular database to generate statistics from terrain and spectral data for soil surveys

    USGS Publications Warehouse

    Horvath, E.A.; Fosnight, E.A.; Klingebiel, A.A.; Moore, D.G.; Stone, J.E.; Reybold, W.U.; Petersen, G.W.

    1987-01-01

    A methodology has been developed to create a spatial database by referencing digital elevation, Landsat multispectral scanner data, and digitized soil premap delineations of a number of adjacent 7.5-min quadrangle areas to a 30-m Universal Transverse Mercator projection. Slope and aspect transformations are calculated from elevation data and grouped according to field office specifications. An unsupervised classification is performed on a brightness and greenness transformation of the spectral data. The resulting spectral, slope, and aspect maps of each of the 7.5-min quadrangle areas are then plotted and submitted to the field office to be incorporated into the soil premapping stages of a soil survey. A tabular database is created from the spatial data by generating descriptive statistics for each data layer within each soil premap delineation. The tabular database is then entered into a database management system to be accessed by field office personnel during the soil survey and to be used for subsequent resource management decisions.

    Large amounts of data are collected and archived during resource inventories for public land management. Often these data are stored as stacks of maps or folders in a file system in someone's office, with the maps in a variety of formats and scales and with various standards of accuracy depending on their purpose. This system of information storage and retrieval is cumbersome at best when several categories of information are needed simultaneously for analysis or as input to resource management models. Computers now provide the resource scientist with the opportunity to design increasingly complex models that require even more categories of resource-related information, thus compounding the problem.

    Recently there has been much emphasis on the use of geographic information systems (GIS) as an alternative method for map data archives and as a resource management tool. Considerable effort has been devoted to the generation of tabular databases, such as the U.S. Department of Agriculture's SCS/S015 (Soil Survey Staff, 1983), to archive, in an easily retrievable manner, the large amounts of information that are collected in conjunction with mapping of natural resources. During the past 4 years the U.S. Geological Survey's EROS Data Center, in a cooperative effort with the Bureau of Land Management (BLM) and the Soil Conservation Service (SCS), developed a procedure that uses spatial and tabular databases to generate elevation, slope, aspect, and spectral map products that can be used during soil premapping. The procedure results in tabular data, residing in a database management system, that are indexed to the final soil delineations and help quantify soil map unit composition.

    The procedure was developed and tested on soil surveys covering over 600 000 ha in Wyoming, Nevada, and Idaho. A transfer of technology from the EROS Data Center to the BLM will enable the Denver BLM Service Center to use this procedure in soil survey operations on BLM lands. Also underway is a cooperative effort between the EROS Data Center and SCS to define and evaluate maps that can be produced as derivatives of digital elevation data for 7.5-min quadrangle areas, such as those used during the premapping stage of the soil surveys mentioned above, the idea being to make such products routinely available.

    The procedure emphasizes the application of digital elevation and spectral data to order-three soil surveys on rangelands, and will: (1) incorporate digital terrain and spectral data into a spatial database for soil surveys; (2) provide hardcopy products (generated from digital elevation model and spectral data) that are useful during the soil premapping process; (3) incorporate soil premaps into a spatial database that can be accessed during the soil survey process along with terrain and spectral data; and (4) summarize useful quantitative information for soil mapping and for making interpretations for resource management.
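    The slope and aspect transformations calculated from elevation data can be sketched as central differences on an elevation grid. The 30-m cell size below follows the UTM grid spacing described in the abstract; everything else (edge handling, the row-0-is-north convention, degree units) is a simplifying assumption for illustration.

```python
import math

def slope_aspect(dem, cell=30.0):
    """Slope (degrees) and aspect (degrees clockwise from north, facing
    downslope) for the interior cells of an elevation grid, computed by
    central differences. Row 0 is assumed to be the northern edge."""
    rows, cols = len(dem), len(dem[0])
    out = {}
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            # elevation change per meter, eastward and northward
            dz_dx = (dem[r][c + 1] - dem[r][c - 1]) / (2 * cell)
            dz_dy = (dem[r - 1][c] - dem[r + 1][c]) / (2 * cell)
            slope = math.degrees(math.atan(math.hypot(dz_dx, dz_dy)))
            # aspect points down the steepest descent (negative gradient)
            aspect = math.degrees(math.atan2(-dz_dx, -dz_dy)) % 360
            out[(r, c)] = (slope, aspect)
    return out

grid = [[0.0, 10.0, 20.0],
        [0.0, 10.0, 20.0],
        [0.0, 10.0, 20.0]]        # rises 10 m per cell toward the east
slope, aspect = slope_aspect(grid)[(1, 1)]
print(round(slope, 1), aspect)    # steepest descent faces west: 18.4 270.0
```

    In the procedure described above, values like these would then be grouped into slope and aspect classes per field office specifications before plotting.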

  14. 48 CFR 504.605-70 - Federal Procurement Data System-Public access to data.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Procurement Data System—Public access to data. (a) The FPDS database. The General Services Administration awarded a contract for creation and operation of the Federal Procurement Data System (FPDS) database. That database includes information reported by departments and agencies as required by FAR subpart 4.6. One of...

  15. 76 FR 77533 - Notice of Order: Revisions to Enterprise Public Use Database Incorporating High-Cost Single...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ... FEDERAL HOUSING FINANCE AGENCY [No. 2011-N-13] Notice of Order: Revisions to Enterprise Public Use Database Incorporating High-Cost Single-Family Securitized Loan Data Fields and Technical Data Field..., regarding FHFA's adoption of an Order revising FHFA's Public Use Database matrices to include certain data...

  16. Database Security: What Students Need to Know

    ERIC Educational Resources Information Center

    Murray, Meg Coffin

    2010-01-01

    Database security is a growing concern evidenced by an increase in the number of reported incidents of loss of or unauthorized exposure to sensitive data. As the amount of data collected, retained and shared electronically expands, so does the need to understand database security. The Defense Information Systems Agency of the US Department of…

  17. 76 FR 10044 - Notice of Proposed Information Collection for Public Comment Public Housing Assessment System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-23

    ... Database Adjustments AGENCY: Office of the Assistant Secretary for Public and Indian Housing, HUD. ACTION...: Public Housing Assessment System Appeals, Technical Reviews and Database Adjustments. OMB Control Number..., at Sec. 902.24, a database adjustment if certain conditions are present. A technical review of the...

  18. 48 CFR 504.605-70 - Federal Procurement Data System-Public access to data.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Procurement Data System—Public access to data. (a) The FPDS database. The General Services Administration awarded a contract for creation and operation of the Federal Procurement Data System (FPDS) database. That database includes information reported by departments and agencies as required by FAR subpart 4.6. One of...

  19. 48 CFR 504.605-70 - Federal Procurement Data System-Public access to data.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Procurement Data System—Public access to data. (a) The FPDS database. The General Services Administration awarded a contract for creation and operation of the Federal Procurement Data System (FPDS) database. That database includes information reported by departments and agencies as required by FAR subpart 4.6. One of...

  20. 76 FR 39315 - Privacy Act of 1974: Implementation of Exemptions; Department of Homeland Security/ALL-030 Use of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-06

    ... Terrorist Screening Database System of Records AGENCY: Privacy Office, DHS. ACTION: Notice of proposed... Use of the Terrorist Screening Database System of Records'' and this proposed rulemaking. In this... Use of the Terrorist Screening Database (TSDB) System of Records.'' DHS is maintaining a mirror copy...

Top