Sample records for geologic computer process-based

  1. A Geospatial Information Grid Framework for Geological Survey.

    PubMed

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron-mine resource forecast and evaluation service is introduced in this paper.

  2. A Geospatial Information Grid Framework for Geological Survey

    PubMed Central

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron-mine resource forecast and evaluation service is introduced in this paper. PMID:26710255

  3. Constructing a Geology Ontology Using a Relational Database

    NASA Astrophysics Data System (ADS)

    Hou, W.; Yang, L.; Yin, S.; Ye, J.; Clarke, K.

    2013-12-01

    In the geology community, the creation of a common geology ontology has become a useful means to solve problems of data integration, knowledge transformation and the interoperation of multi-source, heterogeneous and multiple-scale geological data. Currently, human-computer interaction methods and relational database-based methods are the primary ontology construction methods. Some human-computer interaction methods, such as the Geo-rule based method, the ontology life cycle method and the module design method, have been proposed for applied geological ontologies. Essentially, the relational database-based method is a reverse engineering of abstracted semantic information from an existing database. The key is to construct rules for the transformation of database entities into the ontology. Relative to the human-computer interaction methods, relational database-based methods can use existing resources and the stated semantic relationships among geological entities. However, two problems challenge their development and application. One is the transformation of multiple inheritance and nested relationships and their representation in an ontology. The other is that most of these methods do not measure the semantic retention of the transformation process. In this study, we focused on constructing a rule set to convert the semantics in a geological database into a geological ontology. According to the relational schema of a geological database, a conversion approach is presented to convert a geological spatial database into an OWL-based geological ontology, based on identifying semantics such as entities, relationships, inheritance relationships, nested relationships and cluster relationships. The semantic integrity of the transformation was verified using an inverse mapping process. In the geological ontology, inheritance and union operations between superclass and subclass were used to represent the nested relationships in a geochronology and the multiple inheritance relationships. Based on a Quaternary database of the downtown of Foshan City, Guangdong Province, in southern China, a geological ontology was constructed using the proposed method. To measure the retention of semantics in the conversion process and its results, an inverse mapping from the ontology to a relational database was tested based on a proposed conversion rule. The comparison of schemas and entities and the reduction of tables between the inverse database and the original database illustrated that the proposed method retains the semantic information well during the conversion process. An application for abstracting sandstone information showed that semantic relationships among concepts in the geological database were successfully reorganized in the constructed ontology. Key words: geological ontology; geological spatial database; multiple inheritance; OWL. Acknowledgement: This research is jointly funded by the Specialized Research Fund for the Doctoral Program of Higher Education of China (RFDP) (20100171120001), NSFC (41102207) and the Fundamental Research Funds for the Central Universities (12lgpy19).
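
    The conversion rules themselves are not given in the abstract, but the core table-to-ontology mapping that such methods rely on can be sketched in a few lines. The schema, class names and base IRI below are hypothetical illustrations, not the authors' rule set: a table becomes an owl:Class, a plain column a datatype property, and a foreign key an object property, emitted here as Turtle.

```python
# Hypothetical geological schema: table -> owl:Class, column ->
# owl:DatatypeProperty, foreign key -> owl:ObjectProperty. A self-referencing
# foreign key (parent_unit) illustrates how a nested relationship surfaces.
schema = {
    "StratigraphicUnit": {
        "columns": {"name": "string", "age_ma": "float"},
        "foreign_keys": {"parent_unit": "StratigraphicUnit"},
    },
    "Borehole": {
        "columns": {"depth_m": "float"},
        "foreign_keys": {"intersects_unit": "StratigraphicUnit"},
    },
}

XSD = {"string": "xsd:string", "float": "xsd:float"}

def schema_to_turtle(schema, base="http://example.org/geo#"):
    """Emit a minimal OWL ontology (Turtle syntax) from a relational schema."""
    lines = [f"@prefix : <{base}> .",
             "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
             "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .",
             "@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .", ""]
    for table, spec in schema.items():
        lines.append(f":{table} a owl:Class .")
        for col, dtype in spec["columns"].items():
            lines += [f":{col} a owl:DatatypeProperty ;",
                      f"    rdfs:domain :{table} ;",
                      f"    rdfs:range {XSD[dtype]} ."]
        for fk, target in spec["foreign_keys"].items():
            lines += [f":{fk} a owl:ObjectProperty ;",
                      f"    rdfs:domain :{table} ;",
                      f"    rdfs:range :{target} ."]
    return "\n".join(lines)

print(schema_to_turtle(schema))
```

    A real rule set also has to decide how multiple inheritance and cluster relationships map onto subclass axioms, which is exactly where the paper's inverse-mapping check earns its keep.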

  4. The Cyborg Astrobiologist: testing a novelty detection algorithm on two mobile exploration systems at Rivas Vaciamadrid in Spain and at the Mars Desert Research Station in Utah

    NASA Astrophysics Data System (ADS)

    McGuire, P. C.; Gross, C.; Wendt, L.; Bonnici, A.; Souza-Egipsy, V.; Ormö, J.; Díaz-Martínez, E.; Foing, B. H.; Bose, R.; Walter, S.; Oesker, M.; Ontrup, J.; Haschke, R.; Ritter, H.

    2010-01-01

    In previous work, a platform was developed for testing computer-vision algorithms for robotic planetary exploration. This platform consisted of a digital video camera connected to a wearable computer for real-time processing of images at geological and astrobiological field sites. The real-time processing included image segmentation and the generation of interest points based upon uncommonness in the segmentation maps. Also in previous work, this platform for testing computer-vision algorithms has been ported to a more ergonomic alternative platform, consisting of a phone camera connected via the Global System for Mobile Communications (GSM) network to a remote-server computer. The wearable-computer platform has been tested at geological and astrobiological field sites in Spain (Rivas Vaciamadrid and Riba de Santiuste), and the phone camera has been tested at a geological field site in Malta. In this work, we (i) apply a Hopfield neural-network algorithm for novelty detection based upon colour, (ii) integrate a field-capable digital microscope on the wearable computer platform, (iii) test this novelty detection with the digital microscope at Rivas Vaciamadrid, (iv) develop a Bluetooth communication mode for the phone-camera platform, in order to allow access to a mobile processing computer at the field sites, and (v) test the novelty detection on the Bluetooth-enabled phone camera connected to a netbook computer at the Mars Desert Research Station in Utah. This systems engineering and field testing have together allowed us to develop a real-time computer-vision system that is capable, for example, of identifying lichens as novel within a series of images acquired in semi-arid desert environments. We acquired sequences of images of geologic outcrops in Utah and Spain consisting of various rock types and colours to test this algorithm. The algorithm robustly recognized previously observed units by their colour, while requiring only a single image or a few images to learn colours as familiar, demonstrating its fast learning capability.
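
    The published system uses a Hopfield neural network for the colour-based novelty detection; as an illustrative stand-in only, the same idea -- flag pixels whose colours were never seen in earlier images -- can be sketched with a familiar-colour memory over a coarsely quantised RGB space. All data below are made up:

```python
# Illustrative sketch, NOT the paper's Hopfield network: novelty as the
# fraction of pixels whose coarse colour bin was never seen before.

def quantise(pixel, levels=8):
    """Map an (r, g, b) pixel in 0..255 to a coarse colour-bin index."""
    r, g, b = (c * levels // 256 for c in pixel)
    return (r, g, b)

class ColourNoveltyDetector:
    def __init__(self, levels=8):
        self.levels = levels
        self.familiar = set()   # colour bins seen in past images

    def learn(self, image):
        """Mark every colour bin present in `image` as familiar."""
        self.familiar.update(quantise(p, self.levels) for p in image)

    def novelty(self, image):
        """Fraction of pixels whose colour bin has never been seen."""
        novel = sum(1 for p in image
                    if quantise(p, self.levels) not in self.familiar)
        return novel / len(image)

# Fast learning from a single image, mirroring the behaviour reported above:
basalt = [(40, 40, 45), (55, 50, 52)] * 50       # dark grey outcrop pixels
lichen = [(180, 190, 60), (170, 185, 70)] * 50   # yellow-green lichen pixels

detector = ColourNoveltyDetector()
detector.learn(basalt)
print(detector.novelty(basalt))   # previously observed unit -> low novelty
print(detector.novelty(lichen))   # unseen colours -> high novelty
```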

  5. On the upscaling of process-based models in deltaic applications

    NASA Astrophysics Data System (ADS)

    Li, L.; Storms, J. E. A.; Walstra, D. J. R.

    2018-03-01

    Process-based numerical models are increasingly used to study the evolution of marine and terrestrial depositional environments. Whilst a detailed description of small-scale processes provides an accurate representation of reality, application on geological timescales is restrained by the associated increase in computational time. In order to reduce the computational time, a number of acceleration methods are combined and evaluated for a schematic supply-driven delta (static base level) and an accommodation-driven delta (variable base level). The performance of the combined acceleration methods is evaluated by comparing morphological indicators, such as distributary channel networking and delta volumes, derived from the model predictions at various levels of acceleration. The results of the accelerated models are compared to the outcomes from a series of simulations designed to capture autogenic variability. Autogenic variability is quantified by re-running identical models on an initial bathymetry with 1 cm of added noise. The overall results show that the variability of the accelerated models falls within the autogenic variability range, suggesting that the application of acceleration methods does not significantly affect the simulated delta evolution. The Time-scale compression method (the acceleration method introduced in this paper) increases computational efficiency by 75% without adversely affecting the simulated delta evolution compared to a base case. The combination of the Time-scale compression method with the existing acceleration methods has the potential to extend the application range of process-based models towards geologic timescales.
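
    The abstract does not detail the Time-scale compression method itself, so the sketch below illustrates only the general principle behind morphological acceleration (in the spirit of the widely used MORFAC approach, not the paper's method): the bed change computed during one hydrodynamic step is multiplied by a factor, so fewer steps span the same morphological time. Rates and sizes are made up:

```python
# Toy 1D bed evolution with a morphological acceleration factor `morfac`:
# per-step bed change is scaled up, so N morphological years need only
# N / morfac hydrodynamic steps.

def run_delta(bed, deposition_rate, years, dt=1.0, morfac=1.0):
    steps = int(years / (dt * morfac))    # accelerated runs take fewer steps
    for _ in range(steps):
        # bed change per step is scaled by the acceleration factor
        bed = [z + deposition_rate * dt * morfac for z in bed]
    return bed

bed0 = [0.0] * 5
base = run_delta(bed0, deposition_rate=0.01, years=100, morfac=1.0)   # 100 steps
fast = run_delta(bed0, deposition_rate=0.01, years=100, morfac=10.0)  # 10 steps
print(base[0], fast[0])  # same cumulative bed change at a tenth of the steps
```

    The validation question the paper addresses is precisely whether the difference such shortcuts introduce stays inside the autogenic variability band.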

  6. Woods Hole Image Processing System Software implementation; using NetCDF as a software interface for image processing

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    The Branch of Atlantic Marine Geology has been involved in the collection, processing and digital mosaicking of high-, medium- and low-resolution side-scan sonar data during the past 6 years. In the past, processing and digital mosaicking were accomplished with a dedicated, shore-based computer system. With the need to process side-scan data in the field, and with the increased power and reduced cost of workstations, a need was identified for an image-processing package on a UNIX-based computer system that could be used in the field as well as be more generally available to Branch personnel. This report describes the initial development of that package, referred to as the Woods Hole Image Processing System (WHIPS). The software was developed using the Unidata NetCDF software interface to allow data to be more readily portable between different computer operating systems.

  7. Advantages of Computer Simulation in Enhancing Students' Learning about Landform Evolution: A Case Study Using the Grand Canyon

    ERIC Educational Resources Information Center

    Luo, Wei; Pelletier, Jon; Duffin, Kirk; Ormand, Carol; Hung, Wei-chen; Shernoff, David J.; Zhai, Xiaoming; Iverson, Ellen; Whalley, Kyle; Gallaher, Courtney; Furness, Walter

    2016-01-01

    The long geological time needed for landform development and evolution poses a challenge for understanding and appreciating the processes involved. The Web-based Interactive Landform Simulation Model--Grand Canyon (WILSIM-GC, http://serc.carleton.edu/landform/) is an educational tool designed to help students better understand such processes,…

  8. Computer-assisted photogrammetric mapping systems for geologic studies-A progress report

    USGS Publications Warehouse

    Pillmore, C.L.; Dueholm, K.S.; Jepsen, H.S.; Schuch, C.H.

    1981-01-01

    Photogrammetry has played an important role in geologic mapping for many years; however, only recently have attempts been made to automate mapping functions for geology. Computer-assisted photogrammetric mapping systems for geologic studies have been developed and are currently in use in offices of the Geological Survey of Greenland at Copenhagen, Denmark, and the U.S. Geological Survey at Denver, Colorado. Though differing somewhat, the systems are similar in that they integrate Kern PG-2 photogrammetric plotting instruments and small desk-top computers that are programmed to perform special geologic functions and operate flat-bed plotters by means of specially designed hardware and software. A z-drive capability, in which stepping motors control the z-motions of the PG-2 plotters, is an integral part of both systems. This feature enables the computer to automatically position the floating mark on computer-calculated, previously defined geologic planes, such as contacts or the base of coal beds, throughout the stereoscopic model in order to improve the mapping capabilities of the instrument and to aid in correlation and tracing of geologic units. The common goal is to enhance the capabilities of the PG-2 plotter and provide a means by which geologists can make conventional geologic maps more efficiently and explore ways to apply computer technology to geologic studies. © 1981.
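
    The abstract does not give the plane computation, but the geometric core of the z-drive -- evaluating the elevation of a previously defined geologic plane, such as the base of a coal bed, at any (x, y) in the stereoscopic model -- is a few lines of trigonometry. The sketch below (with x = east, y = north, and a hypothetical reference point, strike and dip) is illustrative only, not the PG-2 system's code:

```python
import math

def plane_z(x, y, x0, y0, z0, strike_deg, dip_deg):
    """Elevation of a dipping plane at map position (x, y).

    (x0, y0, z0): known point on the plane; x = east, y = north.
    strike_deg: strike azimuth, degrees clockwise from north.
    dip_deg: dip angle; the plane dips 90 degrees clockwise of strike
             (right-hand rule).
    """
    a = math.radians(strike_deg)
    # Signed down-dip distance of (x, y) from the reference point:
    # down-dip azimuth is strike + 90, i.e. map vector (cos a, -sin a).
    downdip = (x - x0) * math.cos(a) - (y - y0) * math.sin(a)
    return z0 - downdip * math.tan(math.radians(dip_deg))

# Hypothetical coal-bed base: strike due north, dipping 45 degrees east,
# passing through (0, 0) at 500 m elevation.
print(plane_z(100.0, 0.0, 0.0, 0.0, 500.0, 0.0, 45.0))  # ~400 m, 100 m east
print(plane_z(0.0, 250.0, 0.0, 0.0, 500.0, 0.0, 45.0))  # 500 m along strike
```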

  9. Use of electronic microprocessor-based instrumentation by the U.S. geological survey for hydrologic data collection

    USGS Publications Warehouse

    Shope, William G.; ,

    1991-01-01

    The U.S. Geological Survey is acquiring a new generation of field computers and communications software to support hydrologic data-collection at field locations. The new computer hardware and software mark the beginning of the Survey's transition from the use of electromechanical devices and paper tapes to electronic microprocessor-based instrumentation. Software is being developed for these microprocessors to facilitate the collection, conversion, and entry of data into the Survey's National Water Information System. The new automated data-collection process features several microprocessor-controlled sensors connected to a serial digital multidrop line operated by an electronic data recorder. Data are acquired from the sensors in response to instructions programmed into the data recorder by the user through small portable lap-top or hand-held computers. The portable computers, called personal field computers, also are used to extract data from the electronic recorders for transport by courier to the office computers. The Survey's alternative to manual or courier retrieval is the use of microprocessor-based remote telemetry stations. Plans have been developed to enhance the Survey's use of the Geostationary Operational Environmental Satellite telemetry by replacing the present network of direct-readout ground stations with less expensive units. Plans also provide for computer software that will support other forms of telemetry such as telephone or land-based radio.

  10. Application of Remote Sensing in Geological Mapping, Case Study al Maghrabah Area - Hajjah Region, Yemen

    NASA Astrophysics Data System (ADS)

    Al-Nahmi, F.; Saddiqi, O.; Hilali, A.; Rhinane, H.; Baidder, L.; El arabi, H.; Khanbari, K.

    2017-11-01

    Remote sensing technology plays an important role today in geological surveying, mapping, analysis and interpretation, providing a unique opportunity to investigate the geological characteristics of remote areas of the earth's surface without the need to gain access to the area on the ground. The aim of this study is to produce a geological map of the study area. The data used are Sentinel-2 imagery. Several processes were applied in this study. The Optimum Index Factor (OIF) is a statistic that can be used to select the optimum combination of three bands in a satellite image; it is based on the total variance within bands and the correlation coefficients between bands. Independent Component Analysis (ICA) (3, 4, 6) is a statistical and computational technique for revealing hidden factors that underlie sets of random variables, measurements, or signals. The Minimum Noise Fraction (MNF) (1, 2, 3) is used to determine the inherent dimensionality of image data, to segregate noise in the data and to reduce the computational requirements for subsequent processing. The Optimum Index Factor is a good method for choosing the best bands for lithological mapping; ICA and MNF are also practical ways to extract structural geology maps. The results in this paper indicate that the studied area can be divided into four main geological units: basement rocks (metavolcanics, metasediments), sedimentary rocks, intrusive rocks and volcanic rocks. The method used in this study offers great potential for lithological mapping using Sentinel-2 imagery; the results were compared with existing geologic maps, were superior to them and could be used to update the existing maps.
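
    The OIF itself is simple to compute: for each three-band combination, the sum of the band standard deviations is divided by the sum of the absolute pairwise correlation coefficients, and the combination with the highest score carries the most information with the least redundancy. A minimal sketch on made-up toy bands (real use would take Sentinel-2 band arrays):

```python
import itertools, math

def stdev(xs):
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

def corr(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (len(xs) * stdev(xs) * stdev(ys))

def best_oif(bands):
    """bands: dict of band name -> flattened pixel values.
    Returns (oif, combo) for the highest-scoring 3-band combination."""
    scored = []
    for combo in itertools.combinations(bands, 3):
        sd = sum(stdev(bands[b]) for b in combo)
        rc = sum(abs(corr(bands[a], bands[b]))
                 for a, b in itertools.combinations(combo, 2))
        scored.append((sd / rc, combo))
    return max(scored)

# Hypothetical toy bands; B3 nearly duplicates B2 and so is redundant.
bands = {
    "B2": [10, 12, 11, 13, 9],
    "B3": [10, 12, 11, 13, 10],
    "B4": [30, 5, 22, 14, 40],
    "B8": [1, 50, 3, 44, 7],
}
oif, combo = best_oif(bands)
print(combo)  # the winning combo avoids pairing the redundant B2 and B3
```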

  11. Images of Kilauea East Rift Zone eruption, 1983-1993

    USGS Publications Warehouse

    Takahashi, Taeko Jane; Abston, C.C.; Heliker, C.C.

    1995-01-01

    This CD-ROM disc contains 475 scanned photographs from the U.S. Geological Survey Hawaii Observatory Library. The collection represents a comprehensive range of the best photographic images of volcanic phenomena for Kilauea's East Rift eruption, which continues as of September 1995. Captions of the images present information on location, geologic feature or process, and date. Short documentations of work by the USGS Hawaiian Volcano Observatory in geology, seismology, ground deformation, geophysics, and geochemistry are also included, along with selected references. The CD-ROM was produced in accordance with the ISO 9660 standard; however, it is intended for use only on DOS-based computer systems.

  12. Geologic Measurements using Rover Images: Lessons from Pathfinder with Application to Mars 2001

    NASA Technical Reports Server (NTRS)

    Bridges, N. T.; Haldemann, A. F. C.; Herkenhoff, K. E.

    1999-01-01

    The Pathfinder Sojourner rover successfully acquired images that provided important and exciting information on the geology of Mars. This included the documentation of rock textures, barchan dunes, soil crusts, wind tails, and ventifacts. It is expected that the Marie Curie rover cameras will also successfully return important information on landing site geology. Critical to a proper analysis of these images will be a rigorous determination of rover location and orientation. Here, the methods that were used to compute rover position for Sojourner image analysis are reviewed. Based on this experience, specific recommendations are made that should improve this process on the '01 mission.

  13. Real-Time Mapping alert system; user's manual

    USGS Publications Warehouse

    Torres, L.A.

    1996-01-01

    The U.S. Geological Survey has an extensive hydrologic network that records and transmits precipitation, stage, discharge, and other water-related data on a real-time basis to an automated data processing system. Data values are recorded on electronic data collection platforms at field monitoring sites. These values are transmitted by means of orbiting satellites to receiving ground stations, and by way of telecommunication lines to a U.S. Geological Survey office where they are processed on a computer system. Data that exceed predefined thresholds are identified as alert values. These alert values can help keep water-resource specialists informed of current hydrologic conditions. The current alert status at monitoring sites is of critical importance during floods, hurricanes, and other extreme hydrologic events where quick analysis of the situation is needed. This manual provides instructions for using the Real-Time Mapping software, a series of computer programs developed by the U.S. Geological Survey for quick analysis of hydrologic conditions, and guides users through a basic interactive session. The software provides interactive graphics display and query of real-time information in a map-based, menu-driven environment.
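
    The alert-value idea reduces to comparing each incoming reading against a per-site, per-parameter threshold. A minimal sketch with hypothetical sites and thresholds (not the Survey's actual configuration):

```python
# Made-up thresholds keyed by (site, parameter); unknown pairs never alert.
thresholds = {("site01", "stage"): 12.0, ("site01", "precip"): 2.5}

def alert_values(readings):
    """readings: list of (site, parameter, value). Returns flagged readings."""
    return [r for r in readings
            if r[2] > thresholds.get((r[0], r[1]), float("inf"))]

incoming = [("site01", "stage", 13.4),   # above 12.0 -> alert
            ("site01", "stage", 9.8),
            ("site01", "precip", 1.0)]
print(alert_values(incoming))  # -> [('site01', 'stage', 13.4)]
```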

  14. 3D Geological Model for "LUSI" - a Deep Geothermal System

    NASA Astrophysics Data System (ADS)

    Sohrabi, Reza; Jansen, Gunnar; Mazzini, Adriano; Galvan, Boris; Miller, Stephen A.

    2016-04-01

    Geothermal applications require the correct simulation of flow and heat-transport processes in porous media, and many of these media, like deep volcanic hydrothermal systems, host a certain degree of fracturing. This work aims to understand the heat and fluid transport within a newborn sediment-hosted geothermal system, termed Lusi, that began erupting in 2006 in East Java, Indonesia. Our goal is to develop conceptual and numerical models capable of simulating multiphase flow within large-scale fractured reservoirs such as the Lusi region, with fractures of arbitrary size, orientation and shape. Additionally, these models can also address a number of other applications, including Enhanced Geothermal Systems (EGS), CO2 sequestration (Carbon Capture and Storage, CCS) and nuclear waste isolation. Fractured systems are ubiquitous, with a wide range of lengths and scales, making it difficult to develop a general model that can easily handle this complexity. We are developing a flexible continuum approach with an efficient, accurate numerical simulator based on an appropriate 3D geological model representing the structure of the deep geothermal reservoir. Using previous studies, borehole information and seismic data obtained in the framework of the Lusi Lab project (ERC grant n°308126), we present here the first 3D geological model of Lusi. This model is calculated using implicit 3D potential fields or multi-potential fields, depending on the geological context and complexity. The method is based on a geological pile containing the geological history of the area and the relationships between geological bodies, allowing automatic computation of intersections and volume reconstruction. Based on the 3D geological model, we developed a new meshing algorithm to create hexahedral octree meshes that transfer the structural geological information to 3D numerical simulations for quantifying Thermal-Hydraulic-Mechanical-Chemical (THMC) physical processes.
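
    As a toy illustration of the meshing step (not the authors' algorithm), octree refinement can be driven by a geological interface: any cubic cell straddling the contact is split into eight children until a minimum cell size is reached. The contact position and sizes below are made up:

```python
# Toy hexahedral octree refinement: split cells that straddle a hypothetical
# horizontal geological contact at z = 0.4 until they reach `min_size`.
CONTACT_Z = 0.4

def straddles_contact(cell):
    (x, y, z), size = cell
    return z < CONTACT_Z < z + size

def refine(cell, min_size):
    (x, y, z), size = cell
    if size <= min_size or not straddles_contact(cell):
        return [cell]                      # keep as a leaf cell
    h = size / 2                           # split into 8 children (octree)
    children = [((x + i * h, y + j * h, z + k * h), h)
                for i in (0, 1) for j in (0, 1) for k in (0, 1)]
    return [leaf for c in children for leaf in refine(c, min_size)]

mesh = refine(((0.0, 0.0, 0.0), 1.0), min_size=0.25)
print(len(mesh))  # fine cells concentrate around the contact
```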

  15. Real-Time Mapping alert system; characteristics and capabilities

    USGS Publications Warehouse

    Torres, L.A.; Lambert, S.C.; Liebermann, T.D.

    1995-01-01

    The U.S. Geological Survey has an extensive hydrologic network that records and transmits precipitation, stage, discharge, and other water-related data on a real-time basis to an automated data processing system. Data values are recorded on electronic data collection platforms at field sampling sites. These values are transmitted by means of orbiting satellites to receiving ground stations, and by way of telecommunication lines to a U.S. Geological Survey office where they are processed on a computer system. Data that exceed predefined thresholds are identified as alert values. The current alert status at monitoring sites within a state or region is of critical importance during floods, hurricanes, and other extreme hydrologic events. This report describes the characteristics and capabilities of a series of computer programs for real-time mapping of hydrologic data. The software provides interactive graphics display and query of hydrologic information from the network in a real-time, map-based, menu-driven environment.

  16. Methods of training the graduate level and professional geologist in remote sensing technology

    NASA Technical Reports Server (NTRS)

    Kolm, K. E.

    1981-01-01

    Requirements for a basic course in remote sensing to accommodate the needs of the graduate level and professional geologist are described. The course should stress the general topics of basic remote sensing theory, the theory and data types relating to different remote sensing systems, an introduction to the basic concepts of computer image processing and analysis, the characteristics of different data types, the development of methods for geological interpretations, the integration of all scales and data types of remote sensing in a given study, the integration of other data bases (geophysical and geochemical) into a remote sensing study, and geological remote sensing applications. The laboratories should stress hands-on experience to reinforce the concepts and procedures presented in the lecture. The geologist should then be encouraged to pursue a second course in computer image processing and analysis of remotely sensed data.

  17. Geological research for public outreach and education in Lithuania

    NASA Astrophysics Data System (ADS)

    Skridlaite, Grazina; Guobyte, Rimante

    2013-04-01

    Successful IYPE activities and the implementation of Geoheritage Day in Lithuania increased public awareness of geology. A series of projects introducing geology to the general public and youth, supported by EU funds and local communities, was initiated. Researchers from the scientific and applied geology institutions of Lithuania participated in these projects and provided the geological data. In one case, the Lithuanian Survey of Protected Areas supported the installation of a series of geological exhibitions in several regional and national parks. An animation demonstrating glacial processes was chosen for most of these because the Lithuanian surface is largely covered with sedimentary deposits of the Nemunas (Weichselian) glaciation. Researchers from the Lithuanian Geological Survey used the mapping results to demonstrate real glacial processes for every chosen area. In another case, 3D models showing underground structures of different localities were based on detailed geological maps and profiles obtained for each area. In the case of the Sartai regional park, the results of previous geological research projects made it possible to create a movie depicting the ca. 2 Ga geological evolution of the region. The movie starts with the accretion of volcanic island arcs onto the earlier continental margin at ca. 2 Ga and deciphers later Precambrian tectonic and magmatic events. The reconstruction is based on numerous scientific articles and the interpretation of geophysical data. Later Paleozoic activity and subsequent erosion sculptured the surface, which was covered by several ice sheets in the Quaternary. For educational purposes, a collection of minerals and rocks at the Forestry Institute was used to create an exhibition called "Cycle of geological processes". Forestry scientists and their students are able to study the interactions of geodiversity and biodiversity and to understand ancient and modern geological processes leading to soil formation. An aging exposition at the Museum of Erratic Boulders in NW Lithuania is being rearranged for educational purposes, to show the major rock types and their origins more clearly. The new exhibition is supplemented with computer portals presenting geological processes, geological quizzes, animations, etc. Magmatism, metamorphism, sedimentation and other geological processes are demonstrated using erratic boulders brought by glaciers from Scandinavia and northern Russia. A part of the exhibition is devoted to glaciation processes and the arrival of ice sheets in Lithuania. Visitors are able to examine large groups of erratic boulders in the surrounding park and to enjoy the beautiful environment. The exhibition also demonstrates the mineral resources of Lithuania, different fossils and stones from a human body. In all cases it was recognised that a lack of geological information limits the use of geology for public outreach. Ongoing scientific research is essential in many places, as is a mediator's job of interpreting the results of highly specialised research and adapting them for public consumption.

  18. User's manual for computer program BASEPLOT

    USGS Publications Warehouse

    Sanders, Curtis L.

    2002-01-01

    The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.
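
    The abstract names the pieces BASEPLOT plots -- rating curves, discharge measurements and V-shifts -- without formulas. As background (standard stage-discharge practice with made-up numbers, not BASEPLOT code), a shift is simply applied to the gauge height before the power-law rating is evaluated:

```python
def rating_discharge(stage, C, e, b, shift=0.0):
    """Discharge from a power-law rating, Q = C * (h - e)**b.

    stage: gauge height (ft); shift: V-shift applied to the stage (ft);
    e: gauge height of zero flow (ft).
    """
    effective = stage + shift - e
    if effective <= 0:
        return 0.0                 # at or below the gauge height of zero flow
    return C * effective ** b

# Hypothetical rating: C = 120, zero-flow stage e = 1.0 ft, exponent b = 2.0
q_measured = rating_discharge(4.0, C=120, e=1.0, b=2.0)               # 1080.0
q_shifted  = rating_discharge(4.0, C=120, e=1.0, b=2.0, shift=-0.5)   # 750.0
print(q_measured, q_shifted)
```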

  19. Predictive uncertainty analysis of plume distribution for geological carbon sequestration using sparse-grid Bayesian method

    NASA Astrophysics Data System (ADS)

    Shi, X.; Zhang, G.

    2013-12-01

    Because of the extensive computational burden, parametric uncertainty analyses are rarely conducted for process-based multi-phase models of geological carbon sequestration (GCS). The difficulty of predictive uncertainty analysis for the CO2 plume migration in realistic GCS models is due not only to the spatial distribution of the caprock and reservoir (i.e., heterogeneous model parameters), but also to the fact that the GCS optimization estimation problem has multiple local minima, owing to the complex nonlinear multi-phase (gas and aqueous), multi-component (water, CO2, salt) transport equations. The geological model built by Doughty and Pruess (2004) for the Frio pilot site (Texas) was selected and assumed to represent the 'true' system, which was composed of seven different facies (geological units) distributed among 10 layers. We chose to calibrate the permeabilities of these facies. Pressure and gas-saturation values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. Each simulation of the model takes about 2 hours. In this study, we develop a new approach that improves the computational efficiency of Bayesian inference by constructing a surrogate system based on an adaptive sparse-grid stochastic collocation method. This surrogate response-surface global optimization algorithm is first used to calibrate the model parameters; then the prediction uncertainty of the CO2 plume position, propagated from the parametric uncertainty, is quantified in the numerical experiments and compared to the actual plume from the 'true' model. Results show that the approach is computationally efficient for multi-modal optimization and prediction uncertainty quantification for computationally expensive simulation models. Both our inverse methodology and our findings are broadly applicable to GCS in heterogeneous storage formations.
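
    The paper's adaptive sparse-grid collocation is more sophisticated, but the surrogate idea itself can be shown in one dimension. Everything below is a made-up stand-in, including the 'simulator': evaluate the expensive model only at collocation points, interpolate, and calibrate against an observation on the cheap interpolant instead of the simulator.

```python
import bisect

def simulator(log_perm):
    """Stand-in for a 2-hour multiphase run: response vs log-permeability."""
    return 3.0 * log_perm ** 2 + 2.0 * log_perm + 1.0

# 1. Run the expensive model only at a handful of collocation points.
nodes = [-1.0, -0.5, 0.0, 0.5, 1.0]
values = [simulator(x) for x in nodes]

def surrogate(x):
    """Piecewise-linear interpolant standing in for the simulator."""
    i = min(max(bisect.bisect_right(nodes, x), 1), len(nodes) - 1)
    x0, x1 = nodes[i - 1], nodes[i]
    t = (x - x0) / (x1 - x0)
    return values[i - 1] * (1 - t) + values[i] * t

# 2. Calibrate on the cheap surrogate: match a (noise-free) observation.
observed = simulator(0.4)                      # 'true' parameter is 0.4
candidates = [k / 100 - 1.0 for k in range(201)]
best = min(candidates, key=lambda x: (surrogate(x) - observed) ** 2)
print(best)  # close to 0.4, found without further simulator calls
```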

  20. The application of structure from motion (SfM) to identify the geological structure and outcrop studies

    NASA Astrophysics Data System (ADS)

    Saputra, Aditya; Rahardianto, Trias; Gomez, Christopher

    2017-07-01

    Adequate knowledge of geological structure is essential for most studies in geoscience, mineral exploration, geo-hazards, and disaster management. The geological map is still one of the most commonly used datasets for obtaining information about geological structures such as faults, joints, folds, and unconformities; however, in rural areas such as Central Java, data are still sparse. Recent progress in data-acquisition technologies and computing has increased interest in how to capture high-resolution geological data effectively and at relatively low cost. Methods such as Airborne Laser Scanning (ALS), Terrestrial Laser Scanning (TLS), and Unmanned Aerial Vehicles (UAVs) have been widely used to obtain this information; however, these methods require significant investment in hardware, software, and time. Structure from motion (SfM), an image-based photogrammetric method, resolves some of these issues by providing solutions equivalent to laser technologies at relatively low cost, with minimal time, specialization, and financial investment. Using SfM photogrammetry, it is possible to generate high-resolution 3D models of rock surfaces and outcrops, in order to improve the geological understanding of Indonesia. In the present contribution, it is shown that information about faults and joints can be obtained at high resolution and in a shorter time than with conventional grid mapping and remotely sensed topographic surveying. The SfM method produces a point cloud through image matching and computation; this task can be run with open-source or commercial image-processing and 3D-reconstruction software. Because the point cloud carries 3D coordinates as well as RGB values, it allows further analysis such as DEM extraction and image orthorectification. 
The present paper describes some examples of using SfM to identify faults in outcrops and also highlights future possibilities in terms of earthquake hazard assessment, based on fieldwork in the south of Yogyakarta City.
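
    The DEM-extraction step mentioned above can be sketched by rasterising a point cloud to a grid. The coordinates below are synthetic stand-ins for a real SfM point cloud, which would come from the image-matching software.

```python
import numpy as np

# Synthetic stand-in for an SfM point cloud: an (N, 3) array of
# easting, northing, elevation values in metres.
rng = np.random.default_rng(0)
pts = np.column_stack([
    rng.uniform(0.0, 10.0, 5000),   # easting
    rng.uniform(0.0, 10.0, 5000),   # northing
    rng.uniform(0.0, 2.0, 5000),    # elevation
])

# Rasterise to a 1 m DEM by averaging the elevations falling in each cell.
res = 1.0
ix = (pts[:, 0] // res).astype(int)
iy = (pts[:, 1] // res).astype(int)
sums = np.zeros((10, 10))
counts = np.zeros((10, 10))
np.add.at(sums, (iy, ix), pts[:, 2])   # unbuffered accumulation per cell
np.add.at(counts, (iy, ix), 1)
dem = np.full((10, 10), np.nan)
filled = counts > 0
dem[filled] = sums[filled] / counts[filled]
print(dem.shape)
```

    Cells with no points stay NaN, which is the usual convention for data gaps in a DEM.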

  1. Creation of a full color geologic map by computer: A case history from the Port Moller project resource assessment, Alaska Peninsula: A section in Geologic studies in Alaska by the U.S. Geological Survey, 1988

    USGS Publications Warehouse

    Wilson, Frederic H.

    1989-01-01

    Graphics programs on computers can facilitate the compilation and production of geologic maps, including full-color maps of publication quality. This paper describes the application of two different programs, GSMAP and ARC/INFO, to the production of a geologic map of the Port Moller and adjacent 1:250,000-scale quadrangles on the Alaska Peninsula. GSMAP was used at first because of easy digitizing on inexpensive computer hardware. Limitations in its editing capability led to transfer of the digital data to ARC/INFO, a geographic information system with better editing and added data-analysis capability. Although these improved capabilities are accompanied by increased complexity, ARC/INFO's data-analysis capability provides unanticipated advantages: it allows digital map data to be processed as one of multiple data layers for mineral resource assessment. As a result of the development of both software packages, it is now easier to apply them to geologic map production. Both systems accelerate the drafting and revision of maps and enhance the compilation process. Additionally, ARC/INFO's analysis capability enhances the geologist's ability to develop answers to questions of interest that were previously difficult or impossible to obtain.

  2. The computer treatment of remotely sensed data: An introduction to techniques which have geologic applications. [image enhancement and thematic classification in Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Paradella, W. R.; Vitorello, I.

    1982-01-01

    Several aspects of computer-assisted analysis techniques for image enhancement and thematic classification, by which LANDSAT MSS imagery may be treated quantitatively, are explained. For geological applications, computer processing of digital data allows possibly the fullest use of LANDSAT data, by displaying enhanced and corrected data for visual analysis and by evaluating and assigning each pixel's spectral information to a given class.

  3. The periodic structure of the natural record, and nonlinear dynamics.

    USGS Publications Warehouse

    Shaw, H.R.

    1987-01-01

    This paper addresses how nonlinear dynamics can contribute to interpretations of the geologic record and evolutionary processes. Background is given to explain why nonlinear concepts are important. A resume of personal research is offered to illustrate why I think nonlinear processes fit with observations on geological and cosmological time-series data. The fabric of universal periodicity arrays generated by nonlinear processes is illustrated by means of a simple computer model. I conclude with implications concerning patterns of evolution, stratigraphic boundary events, and close correlations of major geologically instantaneous events (such as impacts or massive volcanic episodes) with any sharply defined boundary in the geologic column. - from Author
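
    As one concrete example of periodic structure emerging from a nonlinear process, the logistic map is a textbook choice (not necessarily the model used in the paper): for suitable parameter values the orbit locks onto an exactly periodic cycle.

```python
# The logistic map x -> r*x*(1-x), a minimal nonlinear model that
# produces periodic windows as the parameter r varies.
def logistic_orbit(r, x0=0.5, transient=500, n=16):
    x = x0
    for _ in range(transient):      # discard transient behaviour
        x = r * x * (1 - x)
    orbit = []
    for _ in range(n):              # record the settled orbit
        x = r * x * (1 - x)
        orbit.append(round(x, 6))
    return orbit

# r = 3.2 lies in the period-2 window: the orbit alternates between
# exactly two values.
orbit = logistic_orbit(3.2)
print(sorted(set(orbit)))
```

    Sweeping r from 3 to 4 reproduces the period-doubling cascade into chaos, the kind of "universal periodicity array" the abstract alludes to.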

  4. Exploration for fossil and nuclear fuels from orbital altitudes

    NASA Technical Reports Server (NTRS)

    Short, N. M.

    1975-01-01

    A review of satellite-based photographic (optical and infrared) and microwave exploration and large-area mapping of the earth's surface in the ERTS program. Synoptic cloud-free coverage of large areas has been achieved with planimetric vertical views of the earth's surface useful in compiling close-to-orthographic mosaics. Radar penetration of cloud cover and infrared penetration of forest cover have been successful to some extent. Geological applications include map editing (with corrections in scale and computer processing of images), landforms analysis, structural geology studies, lithological identification, and exploration for minerals and fuels. Limitations of the method are noted.

  5. Assessment of effectiveness of geologic isolation systems. Geologic-simulation model for a hypothetical site in the Columbia Plateau. Volume 2: results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foley, M.G.; Petrie, G.M.; Baldwin, A.J.

    1982-06-01

    This report contains the input data and computer results for the Geologic Simulation Model. This model is described in detail in the following report: Petrie, G.M., et al., 1981, Geologic Simulation Model for a Hypothetical Site in the Columbia Plateau, Pacific Northwest Laboratory, Richland, Washington. The Geologic Simulation Model is a quasi-deterministic process-response model which simulates, for a million years into the future, the development of the geologic and hydrologic systems of the ground-water basin containing the Pasco Basin. Effects of natural processes on the ground-water hydrologic system are modeled principally by rate equations. The combined effects and synergistic interactions of different processes are approximated by linear superposition of their effects during discrete time intervals in a stepwise-integration approach.
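
    The stepwise superposition approach can be sketched as follows. The processes, rates, and numbers here are hypothetical illustrations, not taken from the report: each process contributes a rate of change, and the combined effect over a discrete interval is approximated by summing the individual contributions.

```python
# Stepwise integration with linear superposition of process effects.
def step(state, rates, dt):
    # state: dict of system variables; rates: list of rate functions,
    # each returning its contribution to d(state)/dt.
    delta = {k: 0.0 for k in state}
    for rate in rates:                          # superpose each process
        for k, v in rate(state).items():
            delta[k] += v * dt
    return {k: state[k] + delta[k] for k in state}

# Two hypothetical processes acting on land-surface elevation (m/yr).
uplift = lambda s: {"elevation_m": 0.0005}
erosion = lambda s: {"elevation_m": -0.0002 * s["elevation_m"] / 100.0}

state = {"elevation_m": 100.0}
for _ in range(1000):                           # 1000 steps of 1000 yr
    state = step(state, [uplift, erosion], dt=1000.0)
print(round(state["elevation_m"], 1))
```

    Over the million-year run the elevation approaches the equilibrium where the two superposed rates cancel (250 m for these invented coefficients).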

  6. Planetary Geologic Mapping Handbook - 2010. Appendix

    NASA Technical Reports Server (NTRS)

    Tanaka, K. L.; Skinner, J. A., Jr.; Hare, T. M.

    2010-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces. Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962. Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. 
Terrestrial geologic maps published by the USGS now are primarily digital products using geographic information system (GIS) software and file formats. GIS mapping tools permit easy spatial comparison, generation, importation, manipulation, and analysis of multiple raster image, gridded, and vector data sets. GIS software has also permitted the development of project-specific tools and the sharing of geospatial products among researchers. GIS approaches are now being used in planetary geologic mapping as well. Guidelines or handbooks on techniques in planetary geologic mapping have been developed periodically. As records of the heritage of mapping methods and data, these remain extremely useful guides. However, many of the fundamental aspects of earlier mapping handbooks have evolved significantly, and a comprehensive review of currently accepted mapping methodologies is now warranted. As documented in this handbook, such a review incorporates additional guidelines developed in recent years for planetary geologic mapping by the NASA Planetary Geology and Geophysics (PGG) Program's Planetary Cartography and Geologic Mapping Working Group's (PCGMWG) Geologic Mapping Subcommittee (GEMS) on the selection and use of map bases as well as map preparation, review, publication, and distribution. In light of the current boom in planetary exploration and the ongoing rapid evolution of available data for planetary mapping, this handbook is especially timely.

  7. U.S. Geological Survey national computer technology meeting; program and abstracts, New Orleans, Louisiana, April 10-15, 1994

    USGS Publications Warehouse

    Balthrop, B. H.; Baker, E.G.

    1994-01-01

    This report contains some of the abstracts of papers that were presented at the National Computer Technology Meeting that was held in April 1994. This meeting was sponsored by the Water Resources Division of the U.S. Geological Survey, and was attended by more than 200 technical and managerial personnel representing all the Divisions of the U.S. Geological Survey. Computer-related information from all Divisions of the U.S. Geological Survey is discussed in this compilation of abstracts. Some of the topics addressed are data transfer, data-base management, hydrologic applications, national water information systems, and geographic information systems applications and techniques.

  8. Interagency Report: Astrogeology 58, television cartography

    USGS Publications Warehouse

    Batson, Raymond M.

    1973-01-01

    The purpose of this paper is to illustrate the processing of digital television pictures into base maps. In this context, a base map is defined as a pictorial representation of planetary surface morphology accurately reproduced on standard map projections. Topographic contour lines, albedo or geologic overprints may be superimposed on these base maps. The compilation of geodetic map controls, the techniques of mosaic compilation, computer processing and airbrush enhancement, and the compilation of contour lines are discussed elsewhere by the originators of these techniques. A bibliography of applicable literature is included for readers interested in more detailed discussions.

  9. An innovative computationally efficient hydromechanical coupling approach for fault reactivation in geological subsurface utilization

    NASA Astrophysics Data System (ADS)

    Adams, M.; Kempka, T.; Chabab, E.; Ziegler, M.

    2018-02-01

    Estimating the efficiency and sustainability of geological subsurface utilization, e.g., Carbon Capture and Storage (CCS), requires an integrated risk assessment approach that considers the coupled processes occurring, among others the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to elaborate reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time due to the high number of realizations that have to be carried out. Because of the high computational costs of two-way coupled simulations in large-scale 3D multiphase fluid flow systems, these are not applicable for the purpose of uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation is introduced. This approach determines the void-ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void-ratio development is then approximated and related to one reference pressure at the base of the fault. The parametrization of the resulting functions is directly implemented in a multiphase fluid flow simulator to carry out the semi-analytical coupling for the simulation of hydromechanical processes. In this way, the iterative parameter exchange between the multiphase and mechanical simulators is omitted, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure is capable of reducing the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.
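
    A minimal sketch of the coupling idea follows. The linear void-ratio/pressure relation and the cubic void-ratio/permeability law below are invented stand-ins for the functions parametrized from the base simulation; only the workflow (fit once, evaluate inside the flow simulator) mirrors the abstract.

```python
import numpy as np

# 1. "Base simulation" output (hypothetical): void ratio e at the fault
#    as a function of the reference pore pressure p at the fault base.
p_base = np.linspace(10.0, 30.0, 11)          # MPa
e_base = 0.30 + 0.004 * (p_base - 10.0)       # void ratio grows with p

# 2. Parametrize the relation once (a linear fit here).
coeffs = np.polyfit(p_base, e_base, 1)

# 3. Inside the flow simulator: update porosity and permeability of the
#    fault elements from the single reference pressure, with no call to
#    a separate mechanical simulator.
def update_fault(p_ref, k0=1e-15, e0=0.30):
    e = np.polyval(coeffs, p_ref)
    phi = e / (1.0 + e)                        # porosity from void ratio
    k = k0 * (e / e0) ** 3                     # assumed cubic e-k relation
    return phi, k

phi, k = update_fault(p_ref=25.0)
print(round(phi, 4), f"{k:.3e}")
```

    Replacing the iterative two-way parameter exchange with this one-shot function evaluation is what yields the reported speed-up.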

  10. Opportunities and Needs for Mobile-Computing Technology to Support U.S. Geological Survey Fieldwork

    USGS Publications Warehouse

    Wood, Nathan J.; Halsing, David L.

    2006-01-01

    To assess the opportunities and needs for mobile-computing technology at the U.S. Geological Survey (USGS), we conducted an internal, Internet-based survey of bureau scientists whose research includes fieldwork. In summer 2005, 144 survey participants answered 65 questions about fieldwork activities and conditions, technology to support field research, and postfieldwork data processing and analysis. Results suggest that some types of mobile-computing technology are already commonplace, such as digital cameras and Global Positioning System (GPS) receivers, whereas others are not, such as personal digital assistants (PDAs) and tablet-based personal computers (tablet PCs). The potential for PDA use in the USGS is high: 97 percent of respondents record field observations (primarily environmental conditions and water-quality data), and 87 percent take field samples (primarily water-quality data, water samples, and sediment/soil samples). The potential for tablet PC use in the USGS is also high: 59 percent of respondents map environmental features in the field, primarily by sketching in field notebooks, on aerial photographs, or on topographic-map sheets. Results also suggest that efficient mobile-computing-technology solutions could benefit many USGS scientists because most respondents spend at least 1 week per year in the field, conduct field sessions that are at least 1 week in duration, have field crews of one to three people, and typically travel on foot about 1 mi from their field vehicles. By allowing researchers to enter data directly into digital databases while in the field, mobile-computing technology could also minimize postfieldwork data processing: 93 percent of respondents enter collected field data into their office computers, and more than 50 percent spend at least 1 week per year on postfieldwork data processing. Reducing postfieldwork data processing could free up additional time for researchers and result in cost savings for the bureau. 
Generally, respondents support greater use of mobile-computing technology at the USGS and are interested in training opportunities and further discussions related to data archiving, access to additional digital data types, and technology development.

  11. How semantics can inform the geological mapping process and support intelligent queries

    NASA Astrophysics Data System (ADS)

    Lombardo, Vincenzo; Piana, Fabrizio; Mimmo, Dario

    2017-04-01

    The geologic mapping process requires organizing data according to general knowledge about the objects, namely the geologic units, and to the objectives of a graphic representation of those objects in a map, following an established model of geotectonic evolution. Semantics can greatly help such a process in two respects: on the one hand, the provision of a terminological base to name and classify the objects of the map; on the other, a machine-readable encoding of the geologic knowledge base that supports the application of reasoning mechanisms and the derivation of novel properties and relations among the objects of the map. The OntoGeonous initiative has built a terminological base of geological knowledge in a machine-readable format, following the Semantic Web tenets and the Linked Data paradigm. The major knowledge sources of the OntoGeonous initiative are the GeoScience Markup Language schemata and vocabularies (through its latest version, GeoSciML 4, 2015, published by the IUGS CGI Commission) and the INSPIRE "Data Specification on Geology" directives (an operative simplification of GeoSciML, published by the INSPIRE Thematic Working Group Geology of the European Commission). The Linked Data paradigm has been exploited by linking (without replicating, to avoid inconsistencies) the already existing machine-readable encodings for some specific domains, such as the lithology domain (vocabulary Simple Lithology) and the geochronologic time scale (ontology "gts"). Finally, for the upper-level knowledge shared across several geologic domains, we have resorted to the NASA SWEET ontology. The OntoGeonous initiative has also produced a wiki that explains how the geologic knowledge has been encoded from shared geoscience vocabularies (https://www.di.unito.it/wikigeo/). In particular, the sections dedicated to axiomatization will support the construction of an appropriate database schema that can then be filled with the objects of the map. 
This contribution discusses how the formal encoding of geological knowledge opens new perspectives for the analysis and representation of geological systems. Once the major concepts are defined, the resulting formal conceptual model of the geologic system can hold across different technical and scientific communities. Furthermore, this would allow a semi-automatic or automatic classification of the cartographic database, where a significant number of properties (attributes) of the recorded instances could be inferred through computational reasoning. For example, the system can be queried to show the instances that satisfy some property (e.g., "Retrieve all the lithostratigraphic units composed of clastic sedimentary rock") or to classify some unit according to the properties holding for that unit (e.g., "What is the class of the geologic unit composed of siltstone material?").
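
    The classification queries quoted above can be mimicked with a toy taxonomy. The terms loosely follow Simple Lithology, but the mini-hierarchy and unit names here are invented for illustration; a real system would reason over the full OWL ontologies.

```python
# Invented mini-taxonomy: each term maps to its parent class.
subclass_of = {
    "siltstone": "clastic_sedimentary_rock",
    "sandstone": "clastic_sedimentary_rock",
    "clastic_sedimentary_rock": "sedimentary_rock",
    "sedimentary_rock": "rock",
}

def is_a(term, ancestor):
    """Walk the subclass chain to decide whether term is a kind of ancestor."""
    while term is not None:
        if term == ancestor:
            return True
        term = subclass_of.get(term)
    return False

# Hypothetical map units and their recorded lithology.
units = {"Unit_A": "siltstone", "Unit_B": "granite", "Unit_C": "sandstone"}

# "Retrieve all the lithostratigraphic units composed of clastic
# sedimentary rock": siltstone and sandstone qualify by inference.
hits = [u for u, lith in units.items()
        if is_a(lith, "clastic_sedimentary_rock")]
print(hits)
```

    The point is that `Unit_A` never states "clastic sedimentary rock" explicitly; the property is inferred through the subclass axioms, which is exactly the kind of reasoning the abstract describes.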

  12. Computer image processing: Geologic applications

    NASA Technical Reports Server (NTRS)

    Abrams, M. J.

    1978-01-01

    Computer image processing of digital data was performed to support several geological studies. The specific goals were to: (1) relate the mineral content to the spectral reflectance of certain geologic materials, (2) determine the influence of environmental factors, such as atmosphere and vegetation, and (3) improve image processing techniques. For detection of spectral differences related to mineralogy, the technique of band ratioing was found to be the most useful. The influence of atmospheric scattering and methods to correct for the scattering were also studied. Two techniques were used to correct for atmospheric effects: (1) dark object subtraction, and (2) normalization using ground spectral measurements. Of the two, the first technique proved to be the most successful for removing the effects of atmospheric scattering. A digital mosaic was produced from two side-lapping LANDSAT frames. The advantages were that the same enhancement algorithm could be applied to both frames, and there is no seam where the two images are joined.
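
    The two correction and enhancement steps named above can be sketched on synthetic digital numbers (the band names and values are illustrative only): dark-object subtraction removes the additive path radiance, and band ratioing then emphasizes spectral differences between materials.

```python
import numpy as np

# Synthetic digital numbers standing in for two LANDSAT MSS bands.
rng = np.random.default_rng(1)
band4 = rng.integers(30, 200, size=(50, 50)).astype(float)
band7 = rng.integers(10, 180, size=(50, 50)).astype(float)

# 1. Dark-object subtraction: the darkest pixel in a scene should be
#    near zero, so its value estimates the additive atmospheric path
#    radiance for that band.
band4c = band4 - band4.min()
band7c = band7 - band7.min()

# 2. Band ratioing, guarding against division by zero.
ratio = band4c / np.maximum(band7c, 1.0)
print(ratio.shape, round(float(ratio.mean()), 2))
```

    Subtracting the additive term before ratioing matters: a ratio of uncorrected bands mixes the atmospheric offset into the spectral signal, which is why the abstract pairs the two techniques.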

  13. Coal-seismic computer programs in BASIC: Part I; Store, plot, and edit array data

    USGS Publications Warehouse

    Hasbrouck, Wilfred P.

    1979-01-01

    Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in an extended BASIC language specially augmented for acceptance by the Tektronix 4051 Graphic System. This report presents five computer programs used to store, plot, and edit array data for the line, cross, and triangle arrays commonly employed in our coal-seismic investigations. * Use of brand names in this report is for descriptive purposes only and does not constitute endorsement by the U.S. Geological Survey.

  14. Planetary Geologic Mapping Handbook - 2009

    NASA Technical Reports Server (NTRS)

    Tanaka, K. L.; Skinner, J. A.; Hare, T. M.

    2009-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces (e.g., Varnes, 1974). Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962 (Hackman, 1962). Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. 
Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. Terrestrial geologic maps published by the USGS now are primarily digital products using geographic information system (GIS) software and file formats. GIS mapping tools permit easy spatial comparison, generation, importation, manipulation, and analysis of multiple raster image, gridded, and vector data sets. GIS software has also permitted the development of project-specific tools and the sharing of geospatial products among researchers. GIS approaches are now being used in planetary geologic mapping as well (e.g., Hare and others, 2009). Guidelines or handbooks on techniques in planetary geologic mapping have been developed periodically (e.g., Wilhelms, 1972, 1990; Tanaka and others, 1994). As records of the heritage of mapping methods and data, these remain extremely useful guides. However, many of the fundamental aspects of earlier mapping handbooks have evolved significantly, and a comprehensive review of currently accepted mapping methodologies is now warranted. As documented in this handbook, such a review incorporates additional guidelines developed in recent years for planetary geologic mapping by the NASA Planetary Geology and Geophysics (PGG) Program's Planetary Cartography and Geologic Mapping Working Group's (PCGMWG) Geologic Mapping Subcommittee (GEMS) on the selection and use of map bases as well as map preparation, review, publication, and distribution. In light of the current boom in planetary exploration and the ongoing rapid evolution of available data for planetary mapping, this handbook is especially timely.

  15. Computation of Flow Through Water-Control Structures Using Program DAMFLO.2

    USGS Publications Warehouse

    Sanders, Curtis L.; Feaster, Toby D.

    2004-01-01

    As part of its mission to collect, analyze, and store streamflow data, the U.S. Geological Survey computes flow through several dam structures throughout the country. Flows are computed using hydraulic equations that describe flow through sluice and Tainter gates, crest gates, lock gates, spillways, locks, pumps, and siphons, which are calibrated using flow measurements. The program DAMFLO.2 was written to compute, tabulate, and plot flow through dam structures using data that describe the physical properties of dams and various hydraulic parameters and ratings that use time-varying data, such as lake elevations or gate openings. The program uses electronic computer files of time-varying data, such as lake elevation or gate openings, retrieved from the U.S. Geological Survey Automated Data Processing System. Computed time-varying flow data from DAMFLO.2 are output in flat files, which can be entered into the Automated Data Processing System database. All computations are made in units of feet and seconds. DAMFLO.2 uses the procedures and language developed by the SAS Institute Inc.
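
    As an illustration of the kind of hydraulic relation such programs calibrate, here is a textbook submerged-orifice form for flow through a sluice gate. The discharge coefficient and dimensions are hypothetical; this is not DAMFLO.2 code, which implements its own calibrated ratings.

```python
import math

# Orifice-type relation for a submerged sluice gate, in the program's
# units of feet and seconds: Q = Cd * A * sqrt(2 * g * h), where Cd is
# a discharge coefficient calibrated against flow measurements.
def sluice_gate_flow(cd, gate_width_ft, gate_opening_ft, head_ft, g=32.2):
    area = gate_width_ft * gate_opening_ft      # flow area under the gate
    return cd * area * math.sqrt(2.0 * g * head_ft)   # discharge, cfs

q = sluice_gate_flow(cd=0.6, gate_width_ft=20.0, gate_opening_ft=3.0,
                     head_ft=12.0)
print(round(q, 1))
```

    In operation, the program would evaluate such a rating at each time step using the logged gate opening and lake elevation, which is the time-varying input the abstract describes.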

  16. Sensitivity analysis of 1-D dynamical model for basin analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, S.

    1987-01-01

    Geological processes related to petroleum generation, migration, and accumulation are very complicated in terms of time and the variables involved, and it is very difficult to simulate these processes in laboratory experiments. For this reason, many mathematical/computer models have been developed to simulate these geological processes based on geological, geophysical, and geochemical principles. The sensitivity analysis in this study is a comprehensive examination of how geological, geophysical, and geochemical parameters influence the reconstruction of geohistory, thermal history, and hydrocarbon-generation history using the 1-D fluid flow/compaction model developed by the Basin Modeling Group at the University of South Carolina. This study shows the effects of commonly used parameters such as depth, age, lithology, porosity, permeability, unconformity (eroded thickness and erosion time), temperature at the sediment surface, bottom-hole temperature, present-day heat flow, thermal gradient, thermal conductivity, and kerogen type and content on the evolution of formation thickness, porosity, permeability, and pressure with time and depth; heat flow with time; temperature with time and depth; vitrinite reflectance (Ro) and TTI with time and depth; the oil window in terms of time and depth; and the amount of hydrocarbons generated with time and depth. Lithology, present-day heat flow, and thermal conductivity are the most sensitive parameters in the reconstruction of temperature history.
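
    A one-at-a-time sensitivity check of the sort described can be sketched on a toy temperature-at-depth relation. The relation and values are illustrative only, not the 1-D basin model itself: each parameter is perturbed in turn while the others stay at their base values.

```python
# Toy thermal relation: temperature at depth from surface temperature
# and a linear geothermal gradient (hypothetical values).
def temp_at_depth(t_surface_c, depth_km, gradient_c_per_km):
    return t_surface_c + depth_km * gradient_c_per_km

base = {"t_surface_c": 15.0, "depth_km": 3.0, "gradient_c_per_km": 30.0}
t0 = temp_at_depth(**base)

# Perturb each parameter by +10% and record the change in the output.
sensitivity = {}
for name in base:
    bumped = dict(base)
    bumped[name] *= 1.10
    sensitivity[name] = temp_at_depth(**bumped) - t0

print(t0, sensitivity)
```

    Ranking the entries of `sensitivity` identifies the most influential parameters, which is how the study arrives at lithology, present-day heat flow, and thermal conductivity as the dominant controls on temperature history.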

  17. Understanding volcanic geomorphology from derivatives and wavelet analysis: A case study at Miyakejima Volcano, Izu Islands, Japan

    NASA Astrophysics Data System (ADS)

    Gomez, C.

    2018-04-01

    From feature recognition to multiscale analysis, the human brain performs such computation almost instantaneously, but reproducing this process for effective computation is still a challenge. Although computational geomorphology is a growing field, there has been only limited investigation of those issues on volcanoes. For the present study, we investigated Miyakejima, a volcanic island in the Izu archipelago, located 200 km south of Tokyo City (Japan). The island has experienced numerous Quaternary and historical eruptions, which have been recorded in detail and therefore provide a solid foundation for experimenting with remote-sensing methods and comparing the results to existing data. In the present study, the author examines the use of DEM derivatives and wavelet decomposition; a 5 m DEM available from the Geographic Authority of Japan was used. It was pre-processed to generate grid data with QGIS, and the data were then analyzed with remote-sensing techniques and wavelet analysis in ENVI and Matlab. Results have shown that the combination of 'Elevation' with 'Local Data Range Variation' and 'Relief Mapping' as an RGB image composite provides a powerful visual interpretation tool, although feature separation remains a subjective analysis; wavelet decomposition provided a more appropriate dataset for computer-based analysis, information extraction, and the understanding of topographic features at different scales. To confirm the usefulness of these topographic derivatives, the results were compared to known geological features and found to be in accordance with the data provided by geological maps, topographic maps, and field research at Miyakejima. The protocol presented in the discussion can therefore be re-used at other volcanoes worldwide where less information is available on past eruptions and geology, in order to explain the volcanic geomorphology.
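
    Two of the DEM derivatives mentioned, slope and local data range, can be sketched on a synthetic 5 m grid. A Gaussian hill stands in for the real DEM; the derivative definitions are standard, not taken from the paper.

```python
import numpy as np

# Synthetic 5 m DEM: a 100 m high Gaussian hill on a 500 m square.
x, y = np.meshgrid(np.arange(0, 500, 5.0), np.arange(0, 500, 5.0))
dem = 100.0 * np.exp(-((x - 250) ** 2 + (y - 250) ** 2) / (2 * 120.0 ** 2))

# Slope (degrees) from finite differences at 5 m spacing.
dz_dy, dz_dx = np.gradient(dem, 5.0)
slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Local data range: max minus min elevation within a 3x3 moving window,
# built from shifted copies of an edge-padded grid.
pad = np.pad(dem, 1, mode="edge")
stack = np.stack([pad[i:i + dem.shape[0], j:j + dem.shape[1]]
                  for i in range(3) for j in range(3)])
local_range = stack.max(axis=0) - stack.min(axis=0)
print(round(float(slope.max()), 1), round(float(local_range.max()), 2))
```

    Combining `dem`, `local_range`, and a relief-style derivative as the R, G, and B channels of one composite reproduces the visual-interpretation product the abstract describes.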

  18. U.S. Geological Survey National Computer Technology Meeting; Program and abstracts, May 7-11, 1990

    USGS Publications Warehouse

    Balthrop, B. H.; Baker, E.G.

    1990-01-01

    Computer-related information from all Divisions of the U.S. Geological Survey is discussed in this compilation of abstracts. Some of the topics addressed are system administration; distributed information systems and data bases, both current (1990) and proposed; hydrologic applications; national water information systems; and geographic information systems applications and techniques. The report contains some of the abstracts that were presented at the National Computer Technology Meeting that was held in May 1990. The meeting was sponsored by the Water Resources Division and was attended by more than 200 technical and managerial personnel representing all the Divisions of the U.S. Geological Survey. (USGS)

  19. Integrating 3D geological information with a national physically-based hydrological modelling system

    NASA Astrophysics Data System (ADS)

    Lewis, Elizabeth; Parkin, Geoff; Kessler, Holger; Whiteman, Mark

    2016-04-01

    Robust numerical models are an essential tool for informing flood and water management and policy around the world. Physically-based hydrological models have traditionally not been used for such applications due to prohibitively large data, time and computational resource requirements. Given recent advances in computing power and data availability, a robust, physically-based hydrological modelling system for Great Britain using the SHETRAN model and national datasets has been created. Such a model has several advantages over less complex systems. Firstly, compared with conceptual models, a national physically-based model is more readily applicable to ungauged catchments, in which hydrological predictions are also required. Secondly, the results of a physically-based system may be more robust under changing conditions such as climate and land cover, as physical processes and relationships are explicitly accounted for. Finally, a fully integrated surface and subsurface model such as SHETRAN offers a wider range of applications compared with simpler schemes, such as assessments of groundwater resources, sediment and nutrient transport and flooding from multiple sources. As such, SHETRAN provides a robust means of simulating numerous terrestrial system processes which will add physical realism when coupled to the JULES land surface model. 306 catchments spanning Great Britain have been modelled using this system. The standard configuration of this system performs satisfactorily (NSE > 0.5) for 72% of catchments and well (NSE > 0.7) for 48%. Many of the remaining 28% of catchments that performed relatively poorly (NSE < 0.5) are located in the chalk in the south east of England. As such, the British Geological Survey 3D geology model for Great Britain (GB3D) has been incorporated, for the first time in any hydrological model, to pave the way for improvements to be made to simulations of catchments with important groundwater regimes. 
This coupling has involved development of software to allow for easy incorporation of geological information into SHETRAN for any model setup. The addition of more realistic subsurface representation following this approach is shown to greatly improve model performance in areas dominated by groundwater processes. The resulting modelling system has great potential to be used as a resource at national, regional and local scales in an array of different applications, including climate change impact assessments, land cover change studies and integrated assessments of groundwater and surface water resources.
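    The performance cut-offs quoted above (satisfactory NSE > 0.5, good NSE > 0.7) refer to the standard Nash-Sutcliffe efficiency. A minimal implementation, with made-up flow values, might look like:

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    is no better than predicting the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

obs = np.array([3.0, 5.0, 9.0, 6.0, 2.0])   # invented discharge series
print(nse(obs, obs))                # -> 1.0 (perfect fit)
print(nse(obs, obs * 0.9) > 0.5)    # -> True ("satisfactory" by the cut-off above)
```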

  20. Thermo-hydro-mechanical-chemical processes in fractured-porous media: Benchmarks and examples

    NASA Astrophysics Data System (ADS)

    Kolditz, O.; Shao, H.; Görke, U.; Kalbacher, T.; Bauer, S.; McDermott, C. I.; Wang, W.

    2012-12-01

    The book comprises an assembly of benchmarks and examples for porous media mechanics collected over the last twenty years. Analysis of thermo-hydro-mechanical-chemical (THMC) processes is essential to many applications in environmental engineering, such as geological waste deposition, geothermal energy utilisation, carbon capture and storage, water resources management, hydrology, and even climate change. In order to assess the feasibility as well as the safety of geotechnical applications, process-based modelling is the only tool able to put numbers on, i.e. to quantify, future scenarios. This places a huge responsibility on the reliability of computational tools. Benchmarking is an appropriate methodology to verify the quality of modelling tools based on best practices. Moreover, benchmarking and code comparison foster community efforts. The benchmark book is part of the OpenGeoSys initiative, an open source project to share knowledge and experience in environmental analysis and scientific computation.

  1. Application of a simple cerebellar model to geologic surface mapping

    USGS Publications Warehouse

    Hagens, A.; Doveton, J.H.

    1991-01-01

    Neurophysiological research into the structure and function of the cerebellum has inspired computational models that simulate information processing associated with coordination and motor movement. The cerebellar model arithmetic computer (CMAC) has a design structure which makes it readily applicable as an automated mapping device that "senses" a surface, based on a sample of discrete observations of surface elevation. The model operates as an iterative learning process, where cell weights are continuously modified by feedback to improve surface representation. The storage requirements are substantially less than those of a conventional memory allocation, and the model is extended easily to mapping in multidimensional space, where the memory savings are even greater. © 1991.
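    The CMAC idea, overlapping coarse tilings whose cell weights are nudged by feedback until the stored surface matches the samples, can be sketched as below. The tiling counts, learning rate and test surface are illustrative choices, not the authors' configuration:

```python
import numpy as np

class TinyCMAC:
    """Minimal CMAC-style learner: several offset coarse tilings over the
    unit square, one weight per tile, LMS feedback updates."""

    def __init__(self, n_tilings=8, tiles=16, lr=0.2, seed=0):
        rng = np.random.default_rng(seed)
        self.n_tilings, self.tiles, self.lr = n_tilings, tiles, lr
        self.offsets = rng.random((n_tilings, 2)) / tiles  # per-tiling shift
        self.w = np.zeros((n_tilings, tiles, tiles))       # cell weights

    def _cells(self, x, y):
        ix = ((x + self.offsets[:, 0]) * self.tiles).astype(int) % self.tiles
        iy = ((y + self.offsets[:, 1]) * self.tiles).astype(int) % self.tiles
        return np.arange(self.n_tilings), ix, iy

    def predict(self, x, y):
        t, ix, iy = self._cells(x, y)
        return self.w[t, ix, iy].mean()   # surface "sensed" as weight average

    def learn(self, x, y, z):
        t, ix, iy = self._cells(x, y)
        self.w[t, ix, iy] += self.lr * (z - self.predict(x, y))  # feedback

# Learn a smooth surface from scattered elevation samples.
rng = np.random.default_rng(1)
pts = rng.random((500, 2))
z = np.sin(2 * np.pi * pts[:, 0]) * np.cos(2 * np.pi * pts[:, 1])
cmac = TinyCMAC()
for _ in range(20):                       # iterative learning passes
    for (x, y), zi in zip(pts, z):
        cmac.learn(x, y, zi)
```

    Storage here is 8 × 16 × 16 weights regardless of how many samples are seen, which is the memory saving the abstract refers to.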

  2. Metamodeling-based approach for risk assessment and cost estimation: Application to geological carbon sequestration planning

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Jeong, Hoonyoung; González-Nicolás, Ana; Templeton, Thomas C.

    2018-04-01

    Carbon capture and storage (CCS) is being evaluated globally as a geoengineering measure for significantly reducing greenhouse gas emissions. However, long-term liability associated with potential leakage from these geologic repositories is perceived as a main barrier to entry for site operators. Risk quantification and impact assessment help CCS operators to screen candidate sites for suitability for CO2 storage. Leakage risks are highly site dependent, and a quantitative understanding and categorization of these risks can only be made possible through broad participation and deliberation of stakeholders, with site-specific, process-based models as the decision basis. Online decision making, however, requires that scenarios be run in real time. In this work, a Python-based Leakage Assessment and Cost Estimation (PyLACE) web application was developed for quantifying financial risks associated with potential leakage from geologic carbon sequestration sites. PyLACE aims to assist a collaborative, analytic-deliberative decision-making process by automating metamodel creation, knowledge sharing, and online collaboration. In PyLACE, metamodeling, the process of developing faster-to-run surrogates of process-level models, is enabled using a special stochastic response surface method and Gaussian process regression. Both methods allow consideration of model parameter uncertainties and the use of that information to generate confidence intervals on model outputs. Training of the metamodels is delegated to a high-performance computing cluster and is orchestrated by a set of asynchronous job-scheduling tools for job submission and result retrieval. As a case study, the workflow and main features of PyLACE are demonstrated using a multilayer carbon storage model.
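    As a rough sketch of the Gaussian-process-regression metamodeling mentioned above, assuming a single uncertain parameter and a toy analytic function standing in for the process-level leakage model:

```python
import numpy as np

def rbf(a, b, length=0.3):
    """Squared-exponential (RBF) covariance between 1D input arrays."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

# "Training runs" of a hypothetical simulator: leakage vs. one parameter.
x_train = np.linspace(0.0, 1.0, 8)
y_train = np.sin(3.0 * x_train)          # stand-in for simulator output

noise = 1e-6                             # jitter for numerical stability
K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
alpha = np.linalg.solve(K, y_train)

# The metamodel now predicts new scenarios without re-running the simulator,
# and its predictive variance supports confidence intervals on the output.
x_new = np.array([0.25, 0.55])
k_star = rbf(x_new, x_train)
mean = k_star @ alpha                                   # surrogate prediction
var = rbf(x_new, x_new).diagonal() - np.einsum(
    "ij,ji->i", k_star, np.linalg.solve(K, k_star.T))   # predictive variance
```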

  3. NADM Conceptual Model 1.0 -- A Conceptual Model for Geologic Map Information

    USGS Publications Warehouse

    ,

    2004-01-01

    Executive Summary -- The NADM Data Model Design Team was established in 1999 by the North American Geologic Map Data Model Steering Committee (NADMSC) with the purpose of drafting a geologic map data model for consideration as a standard for developing interoperable geologic map-centered databases by state, provincial, and federal geological surveys. The model is designed to be a technology-neutral conceptual model that can form the basis for a web-based interchange format using evolving information technology (e.g., XML, RDF, OWL), and guide implementation of geoscience databases in a common conceptual framework. The intended purpose is to allow geologic information sharing between geologic map data providers and users, independent of local information system implementation. The model emphasizes geoscience concepts and relationships related to information presented on geologic maps. Design has been guided by an informal requirements analysis, documentation of existing databases, technology developments, and other standardization efforts in the geoscience and computer-science communities. A key aspect of the model is the notion that representation of the conceptual framework (ontology) that underlies geologic map data must be part of the model, because this framework changes with time and understanding, and varies between information providers. The top level of the model distinguishes geologic concepts, geologic representation concepts, and metadata. The geologic representation part of the model provides a framework for representing the ontology that underlies geologic map data through a controlled vocabulary, and for establishing the relationships between this vocabulary and a geologic map visualization or portrayal. Top-level geologic classes in the model are Earth material (substance), geologic unit (parts of the Earth), geologic age, geologic structure, fossil, geologic process, geologic relation, and geologic event.

  4. Geologic characterization of shelf areas using usSEABED for GIS mapping, modeling processes and assessing marine sand and gravel resources

    USGS Publications Warehouse

    Williams, S.J.; Bliss, J.D.; Arsenault, M.A.; Jenkins, C.J.; Goff, J.A.

    2007-01-01

    Geologic maps depicting offshore sedimentary features serve many scientific and applied purposes. Such maps have been lacking, but recent computer technology and software offer promise in the capture and display of diverse marine data. Continental margins contain landforms which provide a variety of important functions and contain important sedimentary records. Some shelf areas also contain deposits regarded as potential aggregate resources. Because proper management of coastal and offshore areas is increasingly important, knowledge of the framework geology and marine processes is critical. Especially valuable are comprehensive and integrated digital databases based on high-quality information from original sources. Products of interest are GIS maps containing thematic information, such as sediment character and texture. These products are useful to scientists modeling nearshore and shelf processes as well as to planners and managers. The U.S. Geological Survey is leading a national program to gather a variety of extant marine geologic data into the usSEABED database system. This provides centralized, integrated marine geologic data collected over the past 50 years. To date, over 340,000 sediment data points from the U.S. reside in usSEABED, which combines an array of physical data and analytical and descriptive information about the sea floor. The data are available to the marine community through three USGS data reports for the Atlantic, Gulf of Mexico, and Pacific published in 2006, and through the project web sites (http://woodshole.er.usgs.gov/project-pages/aggregates/ and http://walrus.wr.usgs.gov/usseabed/).

  5. Physically-enhanced data visualisation: towards real time solution of Partial Differential Equations in 3D domains

    NASA Astrophysics Data System (ADS)

    Zlotnik, Sergio

    2017-04-01

    Information provided by visualisation environments can be greatly enriched if the data shown are combined with relevant physical processes and the user is allowed to interact with those processes. This is particularly interesting in VR environments, where the user has a deep interplay with the data. For example, a geological seismic line in a 3D "cave" shows information on the geological structure of the subsoil. The available information could be enhanced with the thermal state of the region under study, with water-flow patterns in porous rocks, or with rock displacements under some stress conditions. The information added by the physical processes is usually the output of a numerical technique applied to solve a Partial Differential Equation (PDE) that describes the underlying physics. Many techniques are available to obtain numerical solutions of PDEs (e.g. Finite Elements, Finite Volumes, Finite Differences). However, all these traditional techniques require very large computational resources (particularly in 3D), making them useless in a real-time visualisation environment such as VR, because the time required to compute a solution is measured in minutes or even hours. We present here a novel alternative for the resolution of PDE-based problems that is able to provide 3D solutions for a very large family of problems in real time. That is, the solution is evaluated in a thousandth of a second, making the solver ideal for embedding into VR environments. Based on Model Order Reduction ideas, the proposed technique divides the computational work into a computationally intensive "offline" phase, which is run only once, and an "online" phase that allows the real-time evaluation of any solution within a family of problems. Preliminary examples of real-time solutions of complex PDE-based problems will be presented, including thermal problems, flow problems, wave problems and some simple coupled problems.
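    The offline/online split described above is the core of Model Order Reduction. A minimal proper-orthogonal-decomposition (POD) sketch, with an analytic snapshot family standing in for an expensive PDE solver:

```python
import numpy as np

# --- offline (run once): snapshots of an "expensive solver" over parameters ---
x = np.linspace(0.0, 1.0, 200)
params = np.linspace(1.0, 5.0, 30)
snapshots = np.array([np.exp(-p * x) for p in params]).T   # (200, 30)

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :5]          # a few POD modes span the whole solution family

# --- online (real time): any new solution lives in the 5-dim reduced space ---
u_true = np.exp(-3.3 * x)            # "unseen" parameter value
coeffs = basis.T @ u_true            # 5 numbers instead of 200 unknowns
u_rom = basis @ coeffs               # near-instant reconstruction
err = np.linalg.norm(u_rom - u_true) / np.linalg.norm(u_true)
```

    In a real solver the online phase would determine the 5 coefficients from the reduced equations rather than by projecting a known solution, but the cost argument is the same: the online work scales with the number of modes, not the mesh size.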

  6. A Computer-Based Subduction-Zone-Earthquake Exercise for Introductory-Geology Classes.

    ERIC Educational Resources Information Center

    Shea, James Herbert

    1991-01-01

    Describes the author's computer-based program for a subduction-zone-earthquake exercise. Instructions for conducting the activity and obtaining the program from the author are provided. Written in IBM QuickBasic. (PR)

  7. Basic exploration geophysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson, E.S.

    1988-01-01

    An introduction to geophysical methods used to explore for natural resources and to survey the earth's geology is presented in this volume. It is suitable for second- and third-year undergraduate students majoring in geology or engineering and for professional engineers and earth scientists without formal instruction in geophysics. The author assumes the reader is familiar with geometry, algebra, and trigonometry. Geophysical exploration includes seismic refraction and reflection surveying, electrical resistivity and electromagnetic field surveying, and geophysical well logging. Surveying operations are described in step-by-step procedures and are illustrated by practical examples. Computer-based methods of processing and interpreting data as well as geographical methods are introduced.

  8. Integrated analysis of remote sensing products from basic geological surveys. [Brazil

    NASA Technical Reports Server (NTRS)

    Dasilvafagundesfilho, E. (Principal Investigator)

    1984-01-01

    Recent advances in remote sensing have led to the development of several techniques for obtaining image information. These techniques are analyzed as effective tools in geological mapping. A strategy for optimizing the use of images in basic geological surveying is presented. It embraces an integrated analysis of spatial, spectral, and temporal data through photoptic (color additive viewer) and computer processing at different scales, allowing large areas to be surveyed in a fast, precise, and low-cost manner.

  9. Coal-seismic, desktop computer programs in BASIC; Part 7, Display and compute shear-pair seismograms

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report discusses and presents five computer programs used to display and compute shear-pair seismograms.

  10. Geodatabase model for global geologic mapping: concept and implementation in planetary sciences

    NASA Astrophysics Data System (ADS)

    Nass, Andrea

    2017-04-01

    One aim of the NASA Dawn mission is to generate global geologic maps of the asteroid Vesta and the dwarf planet Ceres. To accomplish this, the Dawn Science Team followed the technical recommendations for cartographic basemap production. The geological mapping campaign of Vesta has been completed and published, but mapping of the dwarf planet Ceres is still ongoing. The tiling schema for the geological mapping is the same for both planetary bodies; for Ceres it is divided into two parts: four overview quadrangles (Survey Orbit, 415 m/pixel) and 15 more detailed quadrangles (High Altitude Mapping Orbit, HAMO, 140 m/pixel). The first global geologic map was based on Survey images (415 m/pixel). The four combined Survey quadrangles, completed by HAMO data, served as the basis for generating a more detailed view of the geologic history and for defining the chronostratigraphy and time scale of the dwarf planet. The most detailed view can be expected within the 15 mapping quadrangles based on HAMO resolution and completed by Low Altitude Mapping Orbit (LAMO) data at 35 m/pixel. One responsible mapper was assigned to the interpretative mapping of each quadrangle. Unifying the geological mapping of each quadrangle and bringing the results together into regionally and globally valid statements is already a very time-intensive task. A further challenge is to consider how the 15 individual mappers can generate one homogeneous GIS-based project (w.r.t. geometrical and visual character) and thus produce a geologically consistent final map. Our approach to this challenge was already discussed for the mapping of Vesta. To accommodate the map requirements regarding rules for data storage and database management, the computer-based GIS environment used for the interpretative mapping process must be designed so that it can be adjusted to the unique features of the individual investigation areas. 
Within this contribution, a template is presented that uses standards for digitizing, visualization, data merging and synchronization in interpretative mapping projects. Following new technological innovations within GIS software and the individual requirements for mapping Ceres, the template was developed based on the symbology and framework. The template for GIS-based mapping presented here directly links the generically descriptive attributes of planetary objects to a predefined and standardized symbology in one data structure. Using this template, the map results are more comparable and better controllable. Furthermore, merging and synchronization of the individual maps, map projects and sheets will be far more efficient. The template can be adapted to any other planetary body and to future discovery missions (e.g., Lucy and Psyche, which were selected by NASA to explore the early solar system) for generating reusable map results.

  11. Parameters on reconstructions of geohistory, thermal history, and hydrocarbon generation history in a sedimentary basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, S.; Lerche, I.

    1988-01-01

    Geological processes related to petroleum generation, migration, and accumulation are very complicated in terms of the time and variables involved, and are very difficult to simulate by laboratory experiments. For this reason, many mathematical/computer models have been developed to simulate these geological processes based on geological, geophysical, and geochemical principles. Unfortunately, none of these models can exactly simulate the processes because of the assumptions and simplifications made in the models and the errors in their input. Sensitivity analysis is a comprehensive examination of how geological, geophysical, and geochemical parameters affect the reconstructions of geohistory, thermal history, and hydrocarbon generation history. In this study, a one-dimensional fluid flow/compaction model has been used to run the sensitivity analysis. The authors show the effects of some commonly used parameters such as depth, age, lithology, porosity, permeability, unconformity (time and eroded thickness), temperature at the sediment surface, bottom-hole temperature, present-day heat flow, thermal gradient, thermal conductivity, kerogen type, and content on the evolution of formation thickness, porosity, permeability, and pressure with time and depth; heat flow with time; temperature with time and depth; vitrinite reflectance (R0) and TTI with time and depth; the oil window in terms of time and depth; and the amount of hydrocarbon generated with time and depth.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huff, Kathryn D.

    Component-level and system-level abstraction of detailed computational geologic repository models has resulted in four rapid computational models of hydrologic radionuclide transport at varying levels of detail. Those models are described, as is their implementation in Cyder, a software library of interchangeable radionuclide transport models appropriate for representing natural and engineered barrier components of generic geologic repository concepts. A proof-of-principle demonstration was also conducted in which these models were used to represent the natural and engineered barrier components of a repository concept in a reducing, homogeneous, generic geology. This base case demonstrates integration of the Cyder open source library with the Cyclus computational fuel cycle systems analysis platform to facilitate calculation of repository performance metrics with respect to fuel cycle choices. (authors)

  13. Geology and Design: Formal and Rational Connections

    NASA Astrophysics Data System (ADS)

    Eriksson, S. C.; Brewer, J.

    2016-12-01

    Geological forms and the manmade environment have always been inextricably linked. From the time that Upper Paleolithic man created drawings in the Lascaux Caves in the southwest of France, geology has provided a critical and dramatic foil for human creativity. This inspiration has manifested itself in many different ways, and the history of architecture is rife with examples of geologically derived buildings. During the early 20th century, German Expressionist art and architecture were heavily influenced by the natural and often translucent quality of minerals. Architects like Bruno Taut drew and built crystalline forms that would go on to inspire the more restrained Bauhaus movement. Even within the context of contemporary architecture, geology has been a fertile source of inspiration. Architectural practices across the globe leverage the rationality and grounding found in geology to inform a process that is otherwise dominated by computer-driven parametric design. The connection between advanced design technology and beautifully realized natural geologic forms ensures that geology will be a relevant source of architectural inspiration well into the 21st century. The sometimes hidden relationship of geology to the various sub-disciplines of design, such as Architecture, Interiors, Landscape Architecture, and Historic Preservation, is explored in relation to curriculum and the practice of design. Topics such as materials, form, history, the cultural and physical landscape, natural hazards, and global design enrich and inform curriculum across the college. Commonly, these help define place-based education.

  14. US GEOLOGICAL SURVEY'S NATIONAL SYSTEM FOR PROCESSING AND DISTRIBUTION OF NEAR REAL-TIME HYDROLOGICAL DATA.

    USGS Publications Warehouse

    Shope, William G.; ,

    1987-01-01

    The US Geological Survey is utilizing a national network of more than 1000 satellite data-collection stations, four satellite-relay direct-readout ground stations, and more than 50 computers linked together in a private telecommunications network to acquire, process, and distribute hydrological data in near real-time. The four Survey offices operating a satellite direct-readout ground station provide near real-time hydrological data to computers located in other Survey offices through the Survey's Distributed Information System. The computerized distribution system permits automated data processing and distribution to be carried out in a timely manner under the control and operation of the Survey office responsible for the data-collection stations and for the dissemination of hydrological information to the water-data users.

  15. A Machine-Learning and Filtering Based Data Assimilation Framework for Geologic Carbon Sequestration Monitoring Optimization

    NASA Astrophysics Data System (ADS)

    Chen, B.; Harp, D. R.; Lin, Y.; Keating, E. H.; Pawar, R.

    2017-12-01

    Monitoring is a crucial aspect of geologic carbon sequestration (GCS) risk management. It has gained importance as a means to ensure CO2 is safely and permanently stored underground throughout the lifecycle of a GCS project. Three issues are often involved in a monitoring project: (i) where is the optimal location to place the monitoring well(s), (ii) what type of data (pressure, rate and/or CO2 concentration) should be measured, and (iii) what is the optimal frequency to collect the data. In order to address these important issues, a filtering-based data assimilation procedure is developed to perform the monitoring optimization. The optimal monitoring strategy is selected based on the uncertainty reduction of the objective of interest (e.g., cumulative CO2 leakage) across all potential monitoring strategies. To reduce the computational cost of the filtering-based data assimilation process, two machine-learning algorithms, Support Vector Regression (SVR) and Multivariate Adaptive Regression Splines (MARS), are used to develop computationally efficient reduced-order models (ROMs) from full numerical simulations of CO2 and brine flow. The proposed framework for GCS monitoring optimization is demonstrated with two examples: a simple 3D synthetic case and a real field case, the Rock Springs Uplift carbon storage site in southwestern Wyoming.
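    The ROM idea, replacing the expensive simulator with a cheap surrogate trained on a few full runs, can be sketched with a simple polynomial response surface standing in for SVR/MARS; the "simulator" and its inputs are invented for illustration:

```python
import numpy as np

# Stand-in for a full CO2/brine flow simulation: leakage vs. one parameter
# (in reality each evaluation would take hours).
def simulator(perm):
    return 0.1 * perm ** 2 + 0.5 * perm

# Offline: a handful of expensive runs trains the surrogate.
perm_train = np.linspace(0.0, 2.0, 10)
leak_train = simulator(perm_train)
coeffs = np.polyfit(perm_train, leak_train, deg=2)   # cheap response surface

# Online: the surrogate screens thousands of monitoring scenarios instantly.
perm_scenarios = np.linspace(0.0, 2.0, 10_000)
leak_fast = np.polyval(coeffs, perm_scenarios)
worst = perm_scenarios[np.argmax(leak_fast)]         # highest-leakage scenario
```

    SVR or MARS would replace the polynomial fit when the response is not smoothly polynomial; the offline/online structure is unchanged.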

  16. Selected papers in the applied computer sciences 1992

    USGS Publications Warehouse

    Wiltshire, Denise A.

    1992-01-01

    This compilation of short papers reports on technical advances in the applied computer sciences. The papers describe computer applications in support of earth science investigations and research. This is the third volume in the series "Selected Papers in the Applied Computer Sciences." The topics addressed in the compilation are: integration of geographic information systems and expert systems for resource management; visualization of topography using digital image processing; development of a ground-water data base for the southeastern United States using a geographic information system; integration and aggregation of stream-drainage data using a geographic information system; procedures used in production of digital geologic coverage using compact disc read-only memory (CD-ROM) technology; and automated methods for producing a technical publication on estimated water use in the United States.

  17. Where do students struggle in the field? Computer-aided evaluation of mapping errors from an undergraduate Field Geology summer course

    NASA Astrophysics Data System (ADS)

    Lang, K. A.; Petrie, G.

    2014-12-01

    Extended field-based summer courses provide an invaluable field experience for undergraduate majors in the geosciences. These courses often utilize the construction of geological maps and structural cross sections as the primary pedagogical tool to teach basic map orientation, rock identification and structural interpretation. However, advances in the usability and ubiquity of Geographic Information Systems in these courses present new opportunities to evaluate student work. In particular, computer-based quantification of systematic mapping errors elucidates the factors influencing student success in the field. We present a case example from a mapping exercise conducted in a summer Field Geology course at a popular field location near Dillon, Montana. We use a computer algorithm to automatically compare the placement and attribution of unit contacts with spatial variables including topographic slope, aspect, bedding attitude, ground cover and distance from the starting location. We complement the analyses with anecdotal and survey data that suggest both physical factors (e.g. steep topographic slope) and structural nuance (e.g. low-angle bedding) may dominate student frustration, particularly in courses with a high student-to-instructor ratio. We propose mechanisms to improve the student experience by allowing students to practice skills with orientation games and broadening student background with tangential lessons (e.g. on colluvial transport processes). We also suggest low-cost ways to decrease the student-to-instructor ratio by supporting returning undergraduates from previous years or staging mapping over smaller areas. Future applications of this analysis might include a rapid and objective system for evaluating student maps (including point data, such as attitude measurements) and quantifying temporal trends in student work as class sizes, pedagogical approaches or environmental variables change. 
Long-term goals include understanding and characterizing stochasticity in geological mapping beyond the undergraduate classroom, and better quantifying uncertainty in published map products.

  18. The Role of Visualization in Learning from Computer-Based Images. Research Report

    ERIC Educational Resources Information Center

    Piburn, Michael D.; Reynolds, Stephen J.; McAuliffe, Carla; Leedy, Debra E.; Birk, James P.; Johnson, Julia K.

    2005-01-01

    Among the sciences, the practice of geology is especially visual. To assess the role of spatial ability in learning geology, we designed an experiment using: (1) web-based versions of spatial visualization tests, (2) a geospatial test, and (3) multimedia instructional modules built around QuickTime Virtual Reality movies. Students in control and…

  19. Landslide susceptibility estimations in the Gerecse hills (Hungary).

    NASA Astrophysics Data System (ADS)

    Gerzsenyi, Dávid; Gáspár, Albert

    2017-04-01

    Surface movement processes constantly pose a threat to property in populated and agricultural areas of the Gerecse hills (Hungary). The affected geological formations are mainly unconsolidated sediments. Pleistocene loess and alluvial terrace sediments are overwhelmingly present, but fluvio-lacustrine sediments of the latest Miocene, and consolidated Eocene and Mesozoic limestones and marls, can also be found in the area. Landslides and other surface movement processes have been studied in the area for a long time, but a comprehensive GIS-based geostatistical analysis has not yet been made for the whole area, which is why the Gerecse was chosen as the focus area of the study. The base data of the study are freely accessible from online servers, so the method can be applied to other regions in Hungary. Qualitative data were acquired from the landslide-inventory map of the Hungarian Surface Movement Survey and from the Geological Map of Hungary (1 : 100 000). Morphometric parameters derived from the SRTM-1 DEM were used as quantitative variables. Using these parameters, the distributions of elevation, slope gradient, aspect and categorized geological features were computed, both for areas affected and not affected by slope movements. Likelihood values were then computed for each parameter by comparing its distribution in the two areas. By combining the likelihood values of the four parameters, relative hazard values were computed for each cell. This method is known as "empirical probability estimation", originally published by Chung (2005). The map created this way shows each cell's place in the ranking based on the relative hazard values, as a percentage, for the whole study area (787 km2). These values provide information about how similar a certain area is to the areas already affected by landslides, based on the four predictor variables. 
This map can also serve as a base for more complex landslide vulnerability studies involving economic factors. The landslide-inventory database used in the research provides information regarding the state of activity of the past surface movements; however, the activity of many sites is recorded as unknown. A complementary field survey has been carried out to categorize these areas - near the villages of Dunaszentmiklós and Neszmély - in one of the most landslide-affected parts of the Gerecse. Reference: Chung, C. (2005). Using likelihood ratio functions for modeling the conditional probability of occurrence of future landslides for risk assessment. Computers & Geosciences, 32, 1052-1068.
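The likelihood-ratio combination described above can be sketched in code. This is an illustrative reconstruction of the empirical probability estimation of Chung (2005), assuming each predictor raster has already been reclassified into integer categories; all function and variable names are ours, not the authors':

```python
import numpy as np

def likelihood_ratios(param, affected_mask, n_bins):
    """Per-class likelihood ratios: P(class | landslide) / P(class | stable)."""
    aff = np.bincount(param[affected_mask], minlength=n_bins) + 1e-9
    sta = np.bincount(param[~affected_mask], minlength=n_bins) + 1e-9
    return (aff / aff.sum()) / (sta / sta.sum())

def relative_hazard(params, affected_mask, n_bins=16):
    """Combine per-parameter likelihood ratios multiplicatively, then express
    each cell's rank as a percentage of the whole study area (0-100)."""
    combined = np.ones(affected_mask.shape)
    for p in params:
        # map each cell's class to that class's likelihood ratio
        combined *= likelihood_ratios(p, affected_mask, n_bins)[p]
    # percentile rank of each cell's combined likelihood value
    order = combined.ravel().argsort().argsort()
    return (order / (order.size - 1) * 100).reshape(combined.shape)
```

A cell scoring 95 is therefore more similar to already-affected terrain than 95% of the study area, under the chosen predictors.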

  20. Peak data for U.S. Geological Survey gaging stations, Texas network and computer program to estimate peak-streamflow frequency

    USGS Publications Warehouse

    Slade, R.M.; Asquith, W.H.

    1996-01-01

    About 23,000 annual peak streamflows and about 400 historical peak streamflows exist for about 950 stations in the surface-water data-collection network of Texas. These data are presented on a computer diskette along with the corresponding dates, gage heights, and information concerning the basin and the nature or cause of the flood. Also on the computer diskette is a U.S. Geological Survey computer program that estimates peak-streamflow frequency based on annual and historical peak streamflows. The program estimates peak streamflow for 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals and is based on guidelines established by the Interagency Advisory Committee on Water Data. Explanations are presented for installing the program, and an example is presented with discussion of its options.
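The Interagency Advisory Committee guidelines referenced above (Bulletin 17B) prescribe a log-Pearson Type III fit to the annual peak series. A bare-bones sketch of that fit, deliberately ignoring the guidelines' historical-peak weighting, outlier tests, and regional skew adjustment, might look like:

```python
import numpy as np
from scipy.stats import pearson3

def peak_flow_frequency(annual_peaks_cfs, recurrence_years=(2, 5, 10, 25, 50, 100)):
    """Bare-bones log-Pearson Type III fit to an annual peak series (cfs)."""
    logq = np.log10(np.asarray(annual_peaks_cfs, dtype=float))
    mean, std = logq.mean(), logq.std(ddof=1)
    n = logq.size
    # station skew of the log-transformed peaks (no regional weighting)
    skew = (n * np.sum((logq - mean) ** 3)) / ((n - 1) * (n - 2) * std ** 3)
    out = {}
    for t in recurrence_years:
        p = 1.0 - 1.0 / t  # annual non-exceedance probability
        out[t] = 10 ** pearson3.ppf(p, skew, loc=mean, scale=std)
    return out
```

The T-year peak is the (1 − 1/T) quantile of the fitted distribution; a full Bulletin 17B implementation would also weight the station skew against a regional value.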

  1. Software Development for a Three-Dimensional Gravity Inversion and Application to Study of the Border Ranges Fault System, South-Central Alaska

    NASA Astrophysics Data System (ADS)

    Cardenas, R.; Doser, D. I.; Baker, M. R.

    2011-12-01

    Summary The Border Ranges Fault System (BRFS) bounds the Cook Inlet and Susitna Basins, an important petroleum province within south-central Alaska. An initial research goal is to test several plausible models of structure along the Border Ranges Fault System by developing a novel 3D inversion software package. The inversion utilizes gravity data constrained with geophysical, borehole, and surface geological information. The novel inversion approach involves directly modeling known geology, initially free-air corrected data, and revising a priori uncertainties on the geologic model to allow comparisons to alternative interpretations. This technique for evaluating 3D structure in regions of highly complex geology can be applied in other studies of energy resources. The software reads an ASCII text file containing the latitude, longitude, elevation, and free-air anomaly of each gravity station, as well as gridded surface files of known topography. The contribution of each node in the grid is computed in order to compare the theoretical gravity calculations from a forward model to the gravity observations. The computation of solutions to the "linearized" inversion yields a range of plausible densities. The user will have the option of varying body proportions and dimensions to compare variations in density for changing depths of the gridded surface. Introduction Previous modeling of the BRFS using geophysical data has been limited due to the complexity of local geology and structure, both of shallow crustal features and of the deeper subduction zone. Since the inversion is based on a sequence of gridded surfaces, it is feasible to develop software to help build these gridded geologic models. Without a way to modify grid surface elevations, density, and magnetic susceptibility in real time, the inversion process for the geologist would be highly nonlinear and poorly constrained, especially in structural geology as complex as this. 
Without a basic understanding of the geometry of the BRFS, its role in the formation and petroleum generation processes of the upper Cook Inlet and Susitna Basins remains poorly understood. Model Generation The gravitational contributions are computed using a geophysics formulation, namely the vertical line element: g = πR²Gρ(x² + y² + z²)^(-1/2). Each line element is semi-infinite and extends from the top to the bottom of each structural layer. The user may define a three-dimensional body at a location on the surface. Each vertex of the body will be represented as a separate node in the grid. The contribution of the body to the gravity value will be computed as a volume integral and added to the overall gravity contributions of the other nodes on the surface. The user will also be able to modify the elevation and density of the defined body in real time. The most notable strengths of the software are the user-defined a priori information, which facilitates real-time interpretation, and the computational efficiency of the model solution, achieved by using vertical line elements to represent structural bodies with complex geometry.
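The vertical line-element forward model can be sketched as follows. This is an illustrative implementation, not the authors' software: the abstract's semi-infinite expression g = πR²Gρ(x² + y² + z²)^(-1/2) is integrated between a layer's top and bottom depths, and contributions are summed over grid nodes:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def line_element_gz(dx, dy, z_top, z_bot, radius, density):
    """Vertical attraction of a finite vertical line element (a thin cylinder
    of the given radius) between depths z_top and z_bot, observed at
    horizontal offset (dx, dy). Integrating the semi-infinite result between
    the two depths gives the difference of two inverse-distance terms."""
    r2 = dx ** 2 + dy ** 2
    lam = np.pi * radius ** 2 * density  # mass per unit length
    return G * lam * (1.0 / np.sqrt(r2 + z_top ** 2)
                      - 1.0 / np.sqrt(r2 + z_bot ** 2))

def forward_gravity(stations, nodes, z_top, z_bot, radius, density):
    """Sum the contributions of every grid node at every station, in mGal."""
    gz = np.zeros(len(stations))
    for i, (sx, sy) in enumerate(stations):
        for nx, ny in nodes:
            gz[i] += line_element_gz(sx - nx, sy - ny, z_top, z_bot,
                                     radius, density)
    return gz * 1e5  # m/s^2 -> mGal
```

The efficiency claim in the abstract follows from this form: each node contributes a closed-form term, so no volume discretization of the layer is needed.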

  2. Techniques and strategies for data integration in mineral resource assessment

    USGS Publications Warehouse

    Trautwein, Charles M.; Dwyer, John L.

    1991-01-01

    The Geologic and National Mapping divisions of the U.S. Geological Survey have been formally involved in cooperative research and development of computer-based geographic information systems (GISs) applied to mineral-resource assessment objectives since 1982. Experience in the Conterminous United States Mineral Assessment Program (CUSMAP) projects, including the Rolla, Missouri; Dillon, Montana; Butte, Montana; and Tonopah, Nevada 1° × 2° quadrangles, has resulted in the definition of processing requirements for geographic and mineral-resource data that are common to these studies. The diverse formats of data sets collected and compiled for regional mineral-resource assessments necessitate capabilities for digitally encoding and entering data into appropriate tabular, vector, and raster subsystems of the GIS. Although many of the required data sets are either available or can be provided in a digital format suitable for direct entry, their utility is largely dependent on the original intent and consequent preprocessing of the data. In this respect, special care must be taken to ensure that the digital data type, encoding, and format will meet assessment objectives. Data processing within the GIS is directed primarily toward the development and application of models that can be used to spatially describe geological, geophysical, and geochemical environments either known or inferred to be associated with specific types of mineral deposits. Consequently, capabilities to spatially analyze, aggregate, and display relations between data sets are principal processing requirements. To facilitate the development of these models within the GIS, interfaces must be developed among vector-, raster-, and tabular-based processing subsystems to reformat resident data sets for comparative analyses and multivariate display of relations.

  3. Visualizations and Mental Models - The Educational Implications of GEOWALL

    NASA Astrophysics Data System (ADS)

    Rapp, D.; Kendeou, P.

    2003-12-01

    Work in the earth sciences has outlined many of the faulty beliefs that students possess concerning particular geological systems and processes. Evidence from educational and cognitive psychology has demonstrated that students often have difficulty overcoming their naïve beliefs about science. Prior knowledge is often remarkably resistant to change, particularly when students' existing mental models for geological principles may be faulty or inaccurate. Figuring out how to help students revise their mental models to include appropriate information is a major challenge. Up until this point, research has tended to focus on whether 2-dimensional computer visualizations are useful tools for helping students develop scientifically correct models. Research suggests that when students are given the opportunity to use dynamic computer-based visualizations, they are more likely to recall the learned information, and are more likely to transfer that knowledge to novel settings. Unfortunately, 2-dimensional visualization systems are often inadequate representations of the material that educators would like students to learn. For example, a 2-dimensional image of the Earth's surface does not adequately convey particular features that are critical for visualizing the geological environment. This may limit the models that students can construct following these visualizations. GEOWALL is a stereo projection system that has attempted to address this issue. It can display multidimensional static geologic images and dynamic geologic animations in a 3-dimensional format. Our current research examines whether multidimensional visualization systems such as GEOWALL may facilitate learning by helping students to develop more complex mental models. This talk will address some of the cognitive issues that influence the construction of mental models, and the difficulty of updating existing mental models. 
We will also discuss our current work that seeks to examine whether GEOWALL is an effective tool for helping students to learn geological information (and potentially restructure their naïve conceptions of geologic principles).

  4. Definitions of database files and fields of the Personal Computer-Based Water Data Sources Directory

    USGS Publications Warehouse

    Green, J. Wayne

    1991-01-01

    This report describes the data-base files and fields of the personal computer-based Water Data Sources Directory (WDSD). The personal computer-based WDSD was derived from the U.S. Geological Survey (USGS) mainframe computer version. The mainframe version of the WDSD is a hierarchical data-base design. The personal computer-based WDSD is a relational data-base design. This report describes the data-base files and fields of the relational data-base design in dBASE IV (the use of brand names in this abstract is for identification purposes only and does not constitute endorsement by the U.S. Geological Survey) for the personal computer. The WDSD contains information on (1) the type of organization, (2) the major orientation of water-data activities conducted by each organization, (3) the names, addresses, and telephone numbers of offices within each organization from which water data may be obtained, (4) the types of data held by each organization and the geographic locations within which these data have been collected, (5) alternative sources of an organization's data, (6) the designation of liaison personnel in matters related to water-data acquisition and indexing, (7) the volume of water data indexed for the organization, and (8) information about other types of data and services available from the organization that are pertinent to water-resources activities.

  5. A campus-based course in field geology

    NASA Astrophysics Data System (ADS)

    Richard, G. A.; Hanson, G. N.

    2009-12-01

    GEO 305: Field Geology offers students practical experience in the field and in the computer laboratory conducting geological field studies on the Stony Brook University campus. Computer laboratory exercises feature mapping techniques and field studies of glacial and environmental geology, and include geophysical and hydrological analysis, interpretation, and mapping. Participants learn to use direct measurement and mathematical techniques to compute the location and geometry of features and gain practical experience in representing raster imagery and vector geographic data as features on maps. Data collecting techniques in the field include the use of hand-held GPS devices, compasses, ground-penetrating radar, tape measures, pacing, and leveling devices. Assignments that utilize these skills and techniques include mapping campus geology with GPS, using Google Earth to explore our geologic context, data file management and ArcGIS, tape and compass mapping of woodland trails, pace and compass mapping of woodland trails, measuring elevation differences on a hillside, measuring geologic sections and cores, drilling through glacial deposits, using ground penetrating radar on glaciotectonic topography, mapping the local water table, and the identification and mapping of boulders. Two three-hour sessions are offered per week, apportioned as needed between lecture; discussion; guided hands-on instruction in geospatial and other software such as ArcGIS, Google Earth, spreadsheets, and custom modules such as an arc intersection calculator; outdoor data collection and mapping; and writing of illustrated reports.
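The "arc intersection calculator" custom module mentioned above presumably locates a point from taped distances to two known control points, i.e., a circle-circle intersection. A minimal sketch of that computation (our formulation, not the course module itself):

```python
import math

def arc_intersection(p1, r1, p2, r2):
    """Locate a feature from two control points and taped distances
    (circle-circle intersection). Returns the two candidate positions."""
    (x1, y1), (x2, y2) = p1, p2
    d = math.hypot(x2 - x1, y2 - y1)
    if d > r1 + r2 or d < abs(r1 - r2) or d == 0:
        raise ValueError("arcs do not intersect")
    # distance from p1 along the baseline to the chord of intersection
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(r1 ** 2 - a ** 2)  # half-length of the chord
    mx, my = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    ux, uy = (y2 - y1) / d, -(x2 - x1) / d  # unit vector along the chord
    return (mx + h * ux, my + h * uy), (mx - h * ux, my - h * uy)
```

In the field, the ambiguity between the two candidates is resolved by noting which side of the baseline the feature lies on.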

  6. Structural control of landslides. A regional approach based on a developed ArcGIS tool

    NASA Astrophysics Data System (ADS)

    Ilinca, Viorel; Sandric, Ionut; Chitu, Zenaida; Jurchescu, Marta

    2016-04-01

    The relationship between bedding planes and topographic slopes plays a major role in controlling landslide mechanisms. The catastrophic nature of many landslides around the globe has proved to have a relevant structural background. This paper aims at analyzing the relationship between the spatial distribution of landslides and geological structure and lithology at a regional scale (1:50,000). Moreover, by automating a well-known method to assess the influence of bedding planes on landslide occurrence, this study provides a GIS-based tool useful for speeding up regional analyses when study areas extend over hundreds or thousands of square kilometers. Three areas with different geological and geomorphological features and extents ranging from 70 to 179 km² were selected as case studies. The sites are located in the Southern Carpathians, the Curvature Subcarpathians, and the Getic Subcarpathians of Romania. Computation of the topography - bedding plane relation required three phases: i) data acquisition, ii) developing a tool for easy data processing and analysis, and iii) testing the tool on the selected sites with different geological and geomorphological settings. Three categories of spatial data were acquired: i) landslide inventory data; ii) detailed lithological data; and iii) data related to geological structure (dip angle and dip direction point data). The landslide database was built based on interpretation of aerial images and field mapping over a period of more than 8 years. Lithology was extracted from geological maps at a 1:50,000 scale, while dip angle and dip direction data were obtained both from geological maps and from direct field measurements intended to increase the level of detail. In order to rapidly identify the type of slope in relation to the geological structure (anaclinal, cataclinal, and orthoclinal), a tool was developed which integrates a well-known index called TOBIA. 
This custom GIS tool was developed using the Python programming language and the Numpy library, and is available both as an ArcGIS Toolbox and as a standalone Python script; both can be downloaded from http://www.github.com/sandricionut/tobia. Preliminary results for the three analysed areas stress the influence of geological structure on landslide occurrence. In monoclinal areas, the relationship between the geological structure and the spatial distribution of landslides is very obvious. In slightly folded areas the relationship does not appear to be as evident; nevertheless, the influence of the structure can be seen on the flanks of some anticline and syncline structures. In faulted areas, landslide occurrence does not seem to be influenced by structure, and landslides occur in a diversity of directions. Even if landslides are a common process in all of these areas, their occurrence depends strictly on the presence of lithological formations in a clayey or marly facies. The new ArcGIS tool is a useful instrument, facilitating the work involved in the TOBIA computation by reducing investigation time. The resulting classified slopes can be rapidly incorporated as a favorability factor in landslide susceptibility prediction.
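The TOBIA index that the tool integrates is, in Meentemeyer and Moody's formulation, the cosine of the angle between the terrain normal and the bedding pole. A minimal sketch of the computation (raster inputs in degrees; the classification threshold is illustrative, not the authors' value):

```python
import numpy as np

def tobia(slope_deg, aspect_deg, dip_deg, dipdir_deg):
    """Topographic/Bedding-plane Intersection Angle (TOBIA) index:
    cosine of the angle between the terrain normal and the bedding pole.
    +1 ~ cataclinal (bedding dips with the slope), -1 ~ anaclinal."""
    s, a = np.radians(slope_deg), np.radians(aspect_deg)
    d, dd = np.radians(dip_deg), np.radians(dipdir_deg)
    return np.cos(d) * np.cos(s) + np.sin(d) * np.sin(s) * np.cos(dd - a)

def classify(index, tol=0.25):
    """Coarse three-class slope typing from the continuous index."""
    return np.where(index > tol, "cataclinal",
                    np.where(index < -tol, "anaclinal", "orthoclinal"))
```

Because the formula is element-wise, it applies unchanged to whole slope/aspect grids interpolated against dip/dip-direction surfaces.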

  7. Geodiversity of the Umbria region (central Italy): a GIS-based quantitative index

    NASA Astrophysics Data System (ADS)

    Melelli, Laura; Pica, Alessia; Del Monte, Maurizio

    2014-05-01

    The measurement of the natural range of geological bedrock, landforms, and geomorphological processes is the necessary starting point for geodiversity evaluation. Geodiversity plays a strategic role in landscape management. Whereas geotourism and geosites are identified as a driving power for the scientific and economic promotion of an area, knowledge of geodiversity is required for complete and accurate research. For example, high values of this abiotic parameter identify and support the foundation of geoparks. According to this perspective, geodiversity is the unifying factor for these areas of interest. While a subjective and qualitative approach may be adequate for geosite definition, identification, and cultural promotion, the geodiversity concept needs a different evaluation method. A quantitative procedure yields an objective and repeatable process exportable to different geographic units. Geographical Information Systems and spatial analysis techniques are the basis for quantitative evaluation involving topographic, geological, and geomorphological data. Therefore, the assessment of a numerical index derived from the overlay of spatial parameters can be conveniently computed in a GIS environment. In this study, a geodiversity index is proposed in which geological, geomorphological, and landcover factors derive mainly from maps and field surveys, while topographic factors are derived from DEMs and remotely sensed data. Each abiotic parameter is modelled in a grid format; focal functions provide neighbourhood analysis and compute variety statistics. Particular attention is dedicated to topographic information and terrain roughness, which are strictly related to the efficiency of geomorphological processes and generally correspond to the variability of the abiotic components. The study area is located in central Italy and is characterized by a well-known natural heritage. 
Thirty-seven geosites are identified in the Umbria region, where seven regional parks and one natural park are present. The whole area shows a strong correlation between the geological setting and the relief energy associated with the topography. Three main outcrop complexes are present: a fluvio-lacustrine complex, where the lowest slope values and plain areas are widespread; a terrigenous one, with medium slope values; and a calcareous complex corresponding to the mountain areas and the highest amplitude of relief. This partition matches different geomorphological processes and landforms, ensuring a widespread distribution of geodiversity. The final map is a digital dataset that localizes areas with, respectively, null or minimum, medium, and high geodiversity values. The highest class overlaps with geosite areas, with high values of amplitude of relief, and with areas where the geomorphological processes are most effective and varied. This confirms the accuracy of the method. The results obtained represent an important advancement in geodiversity research and a significant instrument for economic development and conservation management.
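The focal "variety" statistics described above can be sketched with a moving-window filter. This is a simplified illustration of the idea; the window size and the additive combination of layers are our assumptions, not the authors' exact weighting:

```python
import numpy as np
from scipy.ndimage import generic_filter

def focal_variety(categorical_grid, size=3):
    """Count of distinct classes inside a moving size x size window,
    mirroring a GIS focal-variety statistic."""
    return generic_filter(categorical_grid, lambda w: len(np.unique(w)),
                          size=size, mode="nearest")

def geodiversity_index(layers, size=3):
    """Sum of per-layer focal varieties (geology, landforms, land cover...),
    one crude way to combine abiotic variety into a single grid."""
    return sum(focal_variety(layer, size) for layer in layers)
```

The resulting grid can then be reclassified into the null/minimum, medium, and high bands used in the final map.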

  8. Visualization and prediction of supercritical CO2 distribution in sandstones during drainage: An in situ synchrotron X-ray micro-computed tomography study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voltolini, Marco; Kwon, Tae-Hyuk; Ajo-Franklin, Jonathan

    Pore-scale distribution of supercritical CO2 (scCO2) exerts significant control on a variety of key hydrologic and geochemical processes, including residual trapping and dissolution. Despite such importance, only a small number of experiments have directly characterized the three-dimensional distribution of scCO2 in geologic materials during the invasion (drainage) process. Here, we present a study which couples dynamic high-resolution synchrotron X-ray micro-computed tomography imaging of a scCO2/brine system at in situ pressure/temperature conditions with quantitative pore-scale modeling to allow direct validation of a pore-scale description of scCO2 distribution. The experiment combines high-speed synchrotron radiography with tomography to characterize the brine-saturated sample, the scCO2 breakthrough process, and the partially saturated state of a sandstone sample from the Domengine Formation, a regionally extensive unit within the Sacramento Basin (California, USA). The availability of a 3D dataset allowed us to examine correlations between grain and pore morphometric parameters and the actual distribution of scCO2 in the sample, including the examination of the role of small-scale sedimentary structure on CO2 distribution. The segmented scCO2/brine volume was also used to validate a simple computational model based on the local thickness concept, able to accurately simulate the distribution of scCO2 after drainage. The same method was also used to simulate Hg capillary pressure curves, with satisfactory results when compared to the measured ones. Finally, this predictive approach, requiring only a tomographic scan of the dry sample, proved to be an effective route for studying processes related to CO2 invasion structure in geological samples at the pore scale.
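The local-thickness idea behind the drainage model can be illustrated with a morphological sketch: assume the non-wetting phase occupies pore voxels whose maximal inscribed sphere exceeds the capillary entry radius. This simplified version ignores connectivity to the inlet and is our illustration, not the authors' code:

```python
import numpy as np
from scipy import ndimage

def sphere(radius):
    """Boolean spherical structuring element of the given voxel radius."""
    r = int(radius)
    z, y, x = np.ogrid[-r:r + 1, -r:r + 1, -r:r + 1]
    return x ** 2 + y ** 2 + z ** 2 <= r ** 2

def invaded_fraction(pore_mask, entry_radius_vox):
    """Fraction of the pore space open to the non-wetting phase at a given
    capillary entry radius: the voxels surviving a morphological opening with
    a sphere of that radius (i.e., local thickness >= radius). Connectivity
    to the inlet face is deliberately ignored in this sketch."""
    opened = ndimage.binary_opening(pore_mask,
                                    structure=sphere(entry_radius_vox))
    return opened.sum() / pore_mask.sum()
```

Sweeping the entry radius from large to small traces out a drainage-style capillary curve, since capillary entry pressure scales as 1/r; this is the sense in which a single dry-sample scan supports both the scCO2 and Hg predictions.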

  9. Visualization and prediction of supercritical CO2 distribution in sandstones during drainage: An in situ synchrotron X-ray micro-computed tomography study

    DOE PAGES

    Voltolini, Marco; Kwon, Tae-Hyuk; Ajo-Franklin, Jonathan

    2017-10-21

    Pore-scale distribution of supercritical CO2 (scCO2) exerts significant control on a variety of key hydrologic and geochemical processes, including residual trapping and dissolution. Despite such importance, only a small number of experiments have directly characterized the three-dimensional distribution of scCO2 in geologic materials during the invasion (drainage) process. Here, we present a study which couples dynamic high-resolution synchrotron X-ray micro-computed tomography imaging of a scCO2/brine system at in situ pressure/temperature conditions with quantitative pore-scale modeling to allow direct validation of a pore-scale description of scCO2 distribution. The experiment combines high-speed synchrotron radiography with tomography to characterize the brine-saturated sample, the scCO2 breakthrough process, and the partially saturated state of a sandstone sample from the Domengine Formation, a regionally extensive unit within the Sacramento Basin (California, USA). The availability of a 3D dataset allowed us to examine correlations between grain and pore morphometric parameters and the actual distribution of scCO2 in the sample, including the examination of the role of small-scale sedimentary structure on CO2 distribution. The segmented scCO2/brine volume was also used to validate a simple computational model based on the local thickness concept, able to accurately simulate the distribution of scCO2 after drainage. The same method was also used to simulate Hg capillary pressure curves, with satisfactory results when compared to the measured ones. Finally, this predictive approach, requiring only a tomographic scan of the dry sample, proved to be an effective route for studying processes related to CO2 invasion structure in geological samples at the pore scale.

  10. Implicit Three-Dimensional Geo-Modelling Based on HRBF Surface

    NASA Astrophysics Data System (ADS)

    Gou, J.; Zhou, W.; Wu, L.

    2016-10-01

    Three-dimensional (3D) geological models are important representations of the results of regional geological surveys. However, the process of constructing 3D geological models from two-dimensional (2D) geological elements remains difficult and time-consuming. This paper proposes a method of migrating from 2D elements to 3D models. First, the geological interfaces were constructed using the Hermite Radial Basis Function (HRBF) to interpolate the boundaries and attitude data. Then, the subsurface geological bodies were extracted from the spatial map area using the Boolean method between the HRBF surface and the fundamental body. Finally, the top surfaces of the geological bodies were constructed by coupling the geological boundaries to digital elevation models. Based on this workflow, a prototype system was developed, and typical geological structures (e.g., folds, faults, and strata) were simulated. Geological models were then constructed through this workflow based on realistic regional geological survey data. For extended applications in 3D modelling of other kinds of geo-objects, mining ore body models and urban geotechnical engineering stratum models were constructed by this method from drill-hole data. The model construction process was rapid, and the resulting models accorded with the constraints of the original data.
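True HRBF interpolation matches both contact values and gradient (attitude) data within the basis itself. A commonly used simplification, sketched below, feeds orientations in as signed off-surface points along the unit normals and interpolates with an ordinary radial basis function; this illustrates the implicit-surface idea only and is not the paper's method:

```python
import numpy as np

def rbf_implicit(points, normals, eps=0.1):
    """Fit an implicit surface f(x) = 0 through contact points. Attitude
    data enter as signed off-surface points at +/- eps along the unit
    normals (a Carr-style stand-in for true Hermite RBF), interpolated
    with the triharmonic kernel phi(r) = r**3."""
    pts = np.vstack([points,
                     points + eps * normals,
                     points - eps * normals])
    vals = np.concatenate([np.zeros(len(points)),
                           np.full(len(points), eps),
                           np.full(len(points), -eps)])
    # pairwise distance matrix and kernel system (tiny ridge for stability)
    r = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    w = np.linalg.solve(r ** 3 + 1e-9 * np.eye(len(pts)), vals)

    def f(x):
        d = np.linalg.norm(pts - np.asarray(x), axis=-1)
        return w @ d ** 3
    return f
```

The zero level set of f is the interpolated geological interface; Boolean operations against a fundamental body, as in the paper's workflow, then amount to sign tests of f.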

  11. Real Time Flood Alert System (RTFAS) for Puerto Rico

    USGS Publications Warehouse

    Lopez-Trujillo, Dianne

    2010-01-01

    The Real Time Flood Alert System is a web-based computer program, developed as a data integration tool, and designed to increase the ability of emergency managers to rapidly and accurately predict flooding conditions of streams in Puerto Rico. The system includes software and a relational database to determine the spatial and temporal distribution of rainfall, water levels in streams and reservoirs, and associated storms to determine hazardous and potential flood conditions. The computer program was developed as part of a cooperative agreement between the U.S. Geological Survey Caribbean Water Science Center and the Puerto Rico Emergency Management Agency, and integrates information collected and processed by these two agencies and the National Weather Service.

  12. Machine intelligence and robotics: Report of the NASA study group. Executive summary

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A brief overview of applications of machine intelligence and robotics in the space program is given. These include space exploration robots; global service robots that collect data from Earth orbit for public service use on soil conditions, sea states, global crop conditions, weather, geology, disasters, etc.; space industrialization and processing technologies; and the construction of large structures in space. Program options for research, advanced development, and implementation of machine intelligence and robot technology for use in program planning are discussed. A vigorous and long-range program to incorporate and keep pace with state-of-the-art developments in computer technology, in both spaceborne and ground-based computer systems, is recommended.

  13. Coal-seismic, desktop computer programs in BASIC; Part 6, Develop rms velocity functions and apply mute and normal moveout

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report presents computer programs used to develop rms velocity functions and apply mute and normal moveout to a 12-trace seismogram.
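The normal-moveout step can be sketched in Python (rather than Tektronix 4051 BASIC): each zero-offset time t0 takes its sample from the input trace at t(x) = sqrt(t0² + (x/v_rms)²). A constant rms velocity is assumed in this sketch, whereas the actual programs develop a time-varying rms velocity function and also apply a mute:

```python
import numpy as np

def nmo_correct(traces, offsets, dt, v_rms):
    """Apply normal moveout: for each output zero-offset time t0, pull the
    input sample at t(x) = sqrt(t0^2 + (x / v_rms)^2) by linear
    interpolation. traces has shape (n_traces, n_samples)."""
    n_samp = traces.shape[1]
    t0 = np.arange(n_samp) * dt       # output zero-offset times
    t_in = np.arange(n_samp) * dt     # input sample times
    out = np.zeros_like(traces)
    for i, x in enumerate(offsets):
        tx = np.sqrt(t0 ** 2 + (x / v_rms) ** 2)
        out[i] = np.interp(tx, t_in, traces[i], right=0.0)
    return out
```

After correction, a reflection that traced a hyperbola across the 12 offsets flattens to a common time, ready for stacking.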

  14. A web-based 3D geological information visualization system

    NASA Astrophysics Data System (ADS)

    Song, Renbo; Jiang, Nan

    2013-03-01

    Construction of 3D geological visualization system has attracted much more concern in GIS, computer modeling, simulation and visualization fields. It not only can effectively help geological interpretation and analysis work, but also can it can help leveling up geosciences professional education. In this paper, an applet-based method was introduced for developing a web-based 3D geological information visualization system. The main aims of this paper are to explore a rapid and low-cost development method for constructing a web-based 3D geological system. First, the borehole data stored in Excel spreadsheets was extracted and then stored in SQLSERVER database of a web server. Second, the JDBC data access component was utilized for providing the capability of access the database. Third, the user interface was implemented with applet component embedded in JSP page and the 3D viewing and querying functions were implemented with PickCanvas of Java3D. Last, the borehole data acquired from geological survey were used for test the system, and the test results has shown that related methods of this paper have a certain application values.

  15. Beam-hardening correction by a surface fitting and phase classification by a least square support vector machine approach for tomography images of geological samples

    NASA Astrophysics Data System (ADS)

    Khan, F.; Enzmann, F.; Kersten, M.

    2015-12-01

    In X-ray computed microtomography (μXCT), image processing is the most important operation prior to image analysis. Such processing mainly involves artefact reduction and image segmentation. We propose a new two-stage post-reconstruction procedure for an image of a geological rock core obtained by polychromatic cone-beam μXCT technology. In the first stage, the beam-hardening (BH) artefact is removed by applying a best-fit quadratic surface algorithm to a given image data set (reconstructed slice), which minimizes the BH offsets of the attenuation data points from that surface. The final BH-corrected image is extracted from the residual data, i.e., the difference between the surface elevation values and the original grey-scale values. For the second stage, we propose using a least-squares support vector machine (a non-linear classifier algorithm) to segment the BH-corrected data as a pixel-based multi-classification task. A combination of the two approaches was used to classify a complex multi-mineral rock sample. The Matlab code for this approach is provided in the Appendix. A minor drawback is that the proposed segmentation algorithm may become computationally demanding in the case of a high-dimensional training data set.
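The first-stage correction can be sketched as an ordinary least-squares fit of a quadratic surface to the slice, with the residual taken as the BH-corrected image. This is a Python sketch of the idea only; the paper's implementation is in Matlab:

```python
import numpy as np

def remove_beam_hardening(slice_img):
    """Fit a quadratic surface a*x^2 + b*y^2 + c*x*y + d*x + e*y + f to the
    grey values of a reconstructed slice and return the residual image, a
    first-order correction for the beam-hardening (cupping) trend."""
    h, w = slice_img.shape
    y, x = np.mgrid[0:h, 0:w].astype(float)
    A = np.column_stack([x.ravel() ** 2, y.ravel() ** 2,
                         (x * y).ravel(), x.ravel(), y.ravel(),
                         np.ones(x.size)])
    coeffs, *_ = np.linalg.lstsq(A, slice_img.ravel(), rcond=None)
    surface = (A @ coeffs).reshape(h, w)
    return slice_img - surface
```

The residual keeps local grey-value contrasts between phases while suppressing the smooth radial cupping, which is what the second-stage classifier then segments.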

  16. Cartographic production for the Florida Shelf Habitat (FLaSH) map study: generation of surface grids, contours, and KMZ files

    USGS Publications Warehouse

    Robbins, Lisa L.; Hansen, Mark; Raabe, Ellen; Knorr, Paul O.; Browne, Joseph

    2007-01-01

    The Florida shelf represents a finite source of economic resources, including commercial and recreational fisheries, tourism, recreation, sand and gravel resources, phosphate, and freshwater reserves. Yet the basic information needed to locate resources, or to interpret and utilize existing data, comes from many sources, dates, and formats. A multi-agency effort is underway to coordinate and prioritize the compilation of suitable datasets for an integrated information system of Florida’s coastal and ocean resources. This report and the associated data files represent part of the effort to make data accessible and usable with computer-mapping systems, web-based technologies, and user-friendly visualization tools. Among the datasets compiled and developed are seafloor imagery, marine sediment data, and existing bathymetric data. A U.S. Geological Survey-sponsored workshop in January 2007 resulted in the establishment of mapping priorities for the state. Bathymetry was identified as a common priority among agencies and researchers. State-of-the-art computer-mapping techniques and data-processing tools were used to develop shelf-wide raster and vector data layers. The Florida Shelf Habitat (FLaSH) Mapping Project (http://coastal.er.usgs.gov/flash) endeavors to locate available data, identify data gaps, synthesize existing information, and expand our understanding of geologic processes in our dynamic coastal and marine systems.

  17. The Australian Computational Earth Systems Simulator

    NASA Astrophysics Data System (ADS)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government, together with a consortium of universities and research institutions, has funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator, or computational virtual earth, will provide the Australian earth systems science community with the research infrastructure required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five-year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete element/lattice solid models, particle-in-cell large-deformation finite-element methods, stress reconstruction models, multi-scale continuum models, etc.) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies.
When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents a part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives including the Japanese Earth Simulator and US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.

  18. Development of seismic tomography software for hybrid supercomputers

    NASA Astrophysics Data System (ADS)

    Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton

    2015-04-01

    Seismic tomography is a technique for computing a velocity model of a geologic structure from the first-arrival travel times of seismic waves. The technique is used in processing of regional and global seismic data, in seismic exploration for prospecting and exploration of mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of the development of seismic monitoring systems and the increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for seismic tomography applications, with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high-performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs but also accelerators and co-processors for computation. The goal of this research is the development of parallel seismic tomography algorithms and a software package for such systems, to be used in processing large volumes of seismic data (hundreds of gigabytes and more). These algorithms and the software package will be optimized for the most common computing devices used in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using an eikonal equation solver, arrival times of seismic waves are computed based on an assumed velocity model of the geologic structure being analyzed. In order to solve the linearized inverse problem, a tomographic matrix is computed that connects model adjustments with travel-time residuals, and the resulting system of linear equations is regularized and solved to adjust the model. The effectiveness of parallel implementations of existing algorithms on target architectures is considered.
During the first stage of this work, algorithms were developed for execution on supercomputers using multicore CPUs only, with preliminary performance tests showing good parallel efficiency on large numerical grids. Porting of the algorithms to hybrid supercomputers is currently ongoing.
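The linearized inversion step of the scheme above (a tomographic matrix connecting model adjustments to travel-time residuals, with the system regularized before solving) can be sketched as a Tikhonov-damped least-squares solve. This is an illustrative sketch under assumed names; it is not code from the package under development.

```python
import numpy as np

def tomo_update(G, residuals, lam=0.1):
    """One linearized tomography update: solve (G^T G + lam*I) dm = G^T r,
    where G is the tomographic (sensitivity) matrix, r the travel-time
    residuals, and lam a Tikhonov damping factor (all names illustrative)."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ residuals)
```

The damping factor trades resolution against stability; in practice it is tuned to the noise level of the picked travel times.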

  19. Processing of multispectral thermal IR data for geologic applications

    NASA Technical Reports Server (NTRS)

    Kahle, A. B.; Madura, D. P.; Soha, J. M.

    1979-01-01

    Multispectral thermal IR data were acquired with a 24-channel scanner flown in an aircraft over the East Tintic mining district, Utah. These digital image data required extensive computer processing in order to put the information into a format useful for a geologic photointerpreter. Simple enhancement procedures were not sufficient to reveal the total information content because the data were highly correlated in all channels. The data were shown to be dominated by temperature variations across the scene, while the much more subtle spectral variations between the different rock types were of interest. The image processing techniques employed to analyze these data are described.
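Because the channels share a dominant temperature signal, a principal-component rotation is one standard way to isolate the subtle inter-band (spectral) variation; the sketch below illustrates the idea in Python/NumPy and is not the paper's exact processing chain.

```python
import numpy as np

def principal_components(bands):
    """Rotate multichannel data (shape [n_channels, n_pixels]) onto its
    principal components: the shared (e.g. temperature) signal concentrates
    in PC1, subtler inter-band differences in later components."""
    X = bands - bands.mean(axis=1, keepdims=True)
    cov = X @ X.T / (X.shape[1] - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalue order
    order = np.argsort(eigvals)[::-1]        # re-sort descending
    return eigvecs[:, order].T @ X           # PCs as rows
```

Contrast-stretching the lower-variance components (a decorrelation stretch) is a common follow-on step for making rock-type differences visible.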

  20. Development of teaching modules for geology and engineering coursework using terrestrial LiDAR scanning systems

    NASA Astrophysics Data System (ADS)

    Yarbrough, L. D.; Katzenstein, K.

    2012-12-01

    Exposing students to active and local examples of physical geologic processes is beneficial to the learning process. Students typically respond with interest to examples that use state-of-the-art technologies to investigate local or regional phenomena. For the lower cognitive levels of learning (e.g. knowledge, comprehension, and application), the use of "close-to-home" examples ensures that students better understand concepts, because students may already have a familiarity with the location or can easily visit it. Furthermore, these local and regional examples help students quickly offer other examples of similar phenomena. Investigation of these examples using normal photographic techniques, as well as a more sophisticated 3-D Light Detection And Ranging (LiDAR) system (also known as Terrestrial Laser Scanning, or TLS), allows students to gain a better understanding of the scale and the mechanics of the geologic processes and hazards. The systems are used for research, teaching and outreach efforts and, depending on departmental policies, can be accessible to students at various learning levels. TLS systems can yield scans at sub-centimeter resolution and capture the surface reflectance of targets. These systems can serve a number of learning goals that are essential for training geoscientists and engineers. While querying the data to answer geotechnical or geomorphologic questions, students develop skills using large spatial databases. The upper cognitive levels of learning (e.g. analysis, synthesis, and evaluation) are also promoted by using a subset of the data and correlating the physical geologic processes of stream bank erosion and rock slope failure with mathematical and computer models using the scanned data. Students use the examples and laboratory exercises to help build their engineering judgment skills with Earth materials.
The students learn not only applications of math and engineering science but also the economic and social implication of designed engineering solutions. These course learning modules were developed for traditional geological engineering courses delivered on campus, for more intensive field work courses and online-based asynchronous course delivery.

  1. Education and Outreach Programs Offered by the Center for High Pressure Research and the Consortium for Materials Properties Research in Earth Sciences

    NASA Astrophysics Data System (ADS)

    Richard, G. A.

    2003-12-01

    Major research facilities and organizations provide an effective venue for developing partnerships with educational organizations in order to offer a wide variety of educational programs, because they constitute a base where the culture of scientific investigation can flourish. The Consortium for Materials Properties Research in Earth Sciences (COMPRES) conducts education and outreach programs through the Earth Science Educational Resource Center (ESERC), in partnership with other groups that offer research and education programs. ESERC initiated its development of education programs in 1994 under the administration of the Center for High Pressure Research (CHiPR), which was funded as a National Science Foundation Science and Technology Center from 1991 to 2002. Programs developed during ESERC's association with CHiPR and COMPRES have targeted a wide range of audiences, including pre-K, K-12 students and teachers, undergraduates, and graduate students. Since 1995, ESERC has offered inquiry-based programs to Project WISE (Women in Science and Engineering) students at a high school and undergraduate level. Activities have included projects that investigated earthquakes, high pressure mineral physics, and local geology. Through a practicum known as Project Java, undergraduate computer science students have developed interactive instructional tools for several of these activities. For K-12 teachers, a course on Long Island geology is offered each fall, which includes an examination of the role that processes in the Earth's interior have played in the geologic history of the region. ESERC has worked with Stony Brook's Department of Geosciences faculty to offer courses on natural hazards, computer modeling, and field geology to undergraduate students, and on computer programming for graduate students. 
Each summer, a four-week residential college-level environmental geology course is offered to rising tenth graders from the Brentwood, New York schools in partnership with Stony Brook's Department of Technology and Society. During the academic year, a college-level Earth science course is offered to tenth graders from Sayville, New York. In both programs, students conduct research projects as one of their primary responsibilities. In collaboration with the Museum of Long Island Natural Sciences on the Stony Brook campus, two programs have been developed that enable visiting K-12 school classes to investigate earthquakes and phenomena that operate in the Earth's deep interior. From 1997 to 1999, the weekly activity-based Science Enrichment for the Early Years (SEEY) program, focusing on common Earth materials and fundamental Earth processes, was conducted at a local pre-K school. Since 2002, ESERC has worked with the Digital Library for Earth System Education (DLESE) to organize the Skills Workshops for their Annual Meeting and with EarthScope for the development of their Education and Outreach Program Plan. Future education programs and tools developed through COMPRES partnerships will place an increased emphasis on deep Earth materials and phenomena.

  2. Process for structural geologic analysis of topography and point data

    DOEpatents

    Eliason, Jay R.; Eliason, Valerie L. C.

    1987-01-01

    A quantitative method of geologic structural analysis of digital terrain data is described for implementation on a computer. Assuming selected valley segments are controlled by the underlying geologic structure, topographic lows in the terrain data, defining valley bottoms, are detected, filtered and accumulated into a series of line segments defining contiguous valleys. The line segments are then vectorized to produce vector segments, defining valley segments, which may be indicative of the underlying geologic structure. Coplanar analysis is performed on vector segment pairs to determine which vectors produce planes that represent underlying geologic structure. Point data, such as fracture phenomena that can be related to fracture planes in 3-dimensional space, can be analyzed to define common plane orientations and locations. The vectors, points, and planes are displayed in various formats for interpretation.
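The coplanar-analysis idea can be illustrated by computing the plane defined by a pair of vector segments: the cross product gives the plane normal, from which a dip and dip direction follow. This Python/NumPy sketch is illustrative only and is not the patented procedure.

```python
import numpy as np

def plane_from_vectors(v1, v2):
    """Plane defined by two (assumed coplanar, non-parallel) 3-D
    valley-segment vectors: unit normal via the cross product, plus
    dip and dip direction in degrees (x = east, y = north, z = up)."""
    n = np.cross(v1, v2)
    n = n / np.linalg.norm(n)
    if n[2] < 0:                 # orient the normal upward
        n = -n
    dip = np.degrees(np.arccos(n[2]))                           # tilt from horizontal
    dip_direction = np.degrees(np.arctan2(n[0], n[1])) % 360.0  # azimuth of steepest descent
    return n, dip, dip_direction
```

Clustering many such normals then reveals common plane orientations shared by multiple valley segments.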

  3. A personal computer-based nuclear magnetic resonance spectrometer

    NASA Astrophysics Data System (ADS)

    Job, Constantin; Pearson, Robert M.; Brown, Michael F.

    1994-11-01

    Nuclear magnetic resonance (NMR) spectroscopy using personal computer-based hardware has the potential of enabling the application of NMR methods to fields where conventional state-of-the-art equipment is either impractical or too costly. With such a strategy for data acquisition and processing, disciplines including civil engineering, agriculture, geology, archaeology, and others have the possibility of utilizing magnetic resonance techniques within the laboratory or conducting applications directly in the field. Another aspect is the possibility of utilizing existing NMR magnets which may be in good condition but unused because of outdated or nonrepairable electronics. Moreover, NMR applications based on personal computer technology may open up teaching possibilities at the college or even secondary school level. The goal of developing such a personal computer (PC)-based NMR standard is facilitated by existing technologies including logic cell arrays, direct digital frequency synthesis, use of PC-based electrical engineering software tools to fabricate electronic circuits, and the use of permanent magnets based on neodymium-iron-boron alloy. Utilizing such an approach, we have been able to place essentially an entire NMR spectrometer console on two printed circuit boards, with the exception of the receiver and radio frequency power amplifier. Future upgrades to include the deuterium lock and the decoupler unit are readily envisioned. The continued development of such PC-based NMR spectrometers is expected to benefit from the fast-growing, practical, and low-cost personal computer market.
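Among the enabling technologies listed, direct digital frequency synthesis is easy to sketch: a wide phase accumulator advances by a tuning word each clock, and its top bits index a sine lookup table. The parameter names below are illustrative, not taken from the spectrometer described.

```python
import math

def dds_samples(tuning_word, n_samples, acc_bits=32, lut_bits=10):
    """Direct digital synthesis sketch: output frequency is
    f_out = tuning_word / 2**acc_bits * f_clock."""
    lut = [math.sin(2 * math.pi * k / 2 ** lut_bits) for k in range(2 ** lut_bits)]
    out, phase = [], 0
    for _ in range(n_samples):
        out.append(lut[phase >> (acc_bits - lut_bits)])  # top bits address the table
        phase = (phase + tuning_word) % 2 ** acc_bits    # accumulator wraps naturally
    return out
```

For example, a tuning word of 2**29 with a 32-bit accumulator yields one sine cycle every 8 output samples.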

  4. Coal-seismic, desktop computer programs in BASIC; Part 5, Perform X-square T-square analysis and plot normal moveout lines on seismogram overlay

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language used by the Tektronix 4051 Graphic System. This report presents computer programs to perform X-square/T-square analyses and to plot normal moveout lines on a seismogram overlay.
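The X-square/T-square method itself is compact: the normal-moveout hyperbola t^2 = t0^2 + x^2/v^2 becomes a straight line when t^2 is plotted against x^2, so a linear fit yields the velocity and zero-offset time. The Python sketch below shows the calculation; it is a modern illustration, not the Tektronix 4051 BASIC program.

```python
import numpy as np

def x2t2_velocity(offsets, times):
    """X-square/T-square analysis: regress t^2 on x^2; the slope is 1/v^2
    and the intercept is t0^2. Returns (v, t0)."""
    slope, intercept = np.polyfit(np.asarray(offsets, float) ** 2,
                                  np.asarray(times, float) ** 2, 1)
    return 1.0 / np.sqrt(slope), np.sqrt(intercept)
```

The fitted v and t0 then define the normal-moveout line to be plotted on the seismogram overlay.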

  5. US Geological Survey National Computer Technology Meeting; Proceedings, Phoenix, Arizona, November 14-18, 1988

    USGS Publications Warehouse

    Balthrop, Barbara H.; Terry, J.E.

    1991-01-01

    The U.S. Geological Survey National Computer Technology Meetings (NCTM) are sponsored by the Water Resources Division and provide a forum for the presentation of technical papers and the sharing of ideas or experiences related to computer technology. This report serves as a proceedings of the meeting held in November 1988 at the Crescent Hotel in Phoenix, Arizona. The meeting was attended by more than 200 technical and managerial people representing all Divisions of the U.S. Geological Survey. Scientists in every Division of the U.S. Geological Survey rely heavily upon state-of-the-art computer technology (both hardware and software). Today the goals of each Division are pursued in an environment where high-speed computers, distributed communications, distributed data bases, high-technology input/output devices, and very sophisticated simulation tools are used regularly. Therefore, information transfer and the sharing of advances in technology are very important issues that must be addressed regularly. This report contains complete papers and abstracts of papers that were presented at the 1988 NCTM. The report is divided into topical sections that reflect common areas of interest and application. In each section, papers are presented first, followed by abstracts. For these proceedings, the publication of a complete paper or only an abstract was at the discretion of the author, although complete papers were encouraged. Some papers presented at the 1988 NCTM are not published in these proceedings.

  6. Human Centered Computing for Mars Exploration

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2005-01-01

    The science objectives are to determine the aqueous, climatic, and geologic history of a site on Mars where conditions may have been favorable to the preservation of evidence of prebiotic or biotic processes. Human Centered Computing is a development process that starts with users and their needs, rather than with technology. The goal is a system design that serves the user, where the technology fits the task and the complexity is that of the task not of the tool.

  7. Earth science photographs from the U.S. Geological Survey Library

    USGS Publications Warehouse

    McGregor, Joseph K.; Abston, Carl C.

    1995-01-01

    This CD-ROM set contains 1,500 scanned photographs from the U.S. Geological Survey Library for use as a photographic glossary of elementary geologic terms. Scholars are encouraged to copy these public domain images into their reports or databases to enhance their presentations. High-quality prints and (or) slides are available upon request from the library. This CD-ROM was produced in accordance with the ISO 9660 standard; however, it is intended for use on DOS-based computer systems only.

  8. Surface Water Quality-Assurance Plan for the North Florida Program Office of the U.S. Geological Survey

    USGS Publications Warehouse

    Franklin, Marvin A.

    2000-01-01

    The U.S. Geological Survey, Water Resources Division, has a policy that requires each District office to prepare a Surface Water Quality-Assurance Plan. The plan for each District describes the policies and procedures that ensure high quality in the collection, processing, analysis, computer storage, and publication of surface-water data. The North Florida Program Office Surface Water Quality-Assurance Plan documents the standards, policies, and procedures used by the North Florida Program office for activities related to the collection, processing, storage, analysis, and publication of surface-water data.

  9. Total Petroleum Systems and Geologic Assessment of Oil and Gas Resources in the Powder River Basin Province, Wyoming and Montana

    USGS Publications Warehouse

    Anna, L.O.

    2009-01-01

    The U.S. Geological Survey completed an assessment of the undiscovered oil and gas potential of the Powder River Basin in 2006. The assessment of undiscovered oil and gas used the total petroleum system concept, which includes mapping the distribution of potential source rocks and known petroleum accumulations and determining the timing of petroleum generation and migration. Geologically based, it focuses on source and reservoir rock stratigraphy, timing of tectonic events and the configuration of resulting structures, formation of traps and seals, and burial history modeling. The total petroleum system is subdivided into assessment units based on similar geologic characteristics, accumulation, and petroleum type. In chapter 1 of this report, five total petroleum systems, eight conventional assessment units, and three continuous assessment units were defined, and the undiscovered oil and gas resources within each assessment unit were quantitatively estimated. Chapter 2 describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report, and archival data that permit the user to perform further analyses, are available elsewhere on this CD-ROM. The reader can import the data into computer software directly from the Portable Document Format (.pdf) files of the text, without transcription. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in raw form as tab-delimited text files (.tab files).

  10. A parallel-processing approach to computing for the geographic sciences

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.

  11. Planetary image conversion task

    NASA Technical Reports Server (NTRS)

    Martin, M. D.; Stanley, C. L.; Laughlin, G.

    1985-01-01

    The Planetary Image Conversion Task group processed 12,500 magnetic tapes containing raw imaging data from JPL planetary missions and produced an image data base in consistent format on 1,200 fully packed 6250-bpi tapes. The output tapes will remain at JPL. A copy of the entire tape set was delivered to the US Geological Survey, Flagstaff, Ariz. A secondary task converted computer datalogs, which had been stored in project-specific MARK IV File Management System data types and structures, to a flat-file text format that is processable on any modern computer system. The conversion processing took place at JPL's Image Processing Laboratory on an IBM 370-158 with existing software modified slightly to meet the needs of the conversion task. More than 99% of the original digital image data was successfully recovered by the conversion task. However, processing of data tapes recorded before 1975 was destructive. This discovery is of critical importance to facilities responsible for maintaining digital archives, since normal periodic random sampling techniques would be unlikely to detect this phenomenon, and entire data sets could be wiped out in the act of generating seemingly positive sampling results. Recommended follow-on activities are also included.

  12. Review of edgematching procedures for digital cartographic data used in Geographic Information Systems (GIS)

    USGS Publications Warehouse

    Nebert, D.D.

    1989-01-01

    In the process of developing a continuous hydrographic data layer for water resources applications in the Pacific Northwest, map-edge discontinuities in the U.S. Geological Survey 1:100,000-scale digital data that required application of computer-assisted edgematching procedures were identified. The spatial data sets required by the project must have line features that match closely enough across map boundaries to ensure full line topology when adjacent files are joined by the computer. Automated edgematching techniques are evaluated as to their effects on positional accuracy. Interactive methods such as selective node-matching and on-screen editing are also reviewed. Interactive procedures complement automated methods by allowing supervision of edgematching in a cartographic and hydrologic context. Common edge conditions encountered in the preparation of the Northwest Rivers data base are described, as are recommended processing solutions. Suggested edgematching procedures for 1:100,000-scale hydrography data are included in an appendix to encourage consistent processing of this theme on a national scale. (USGS)

  13. Potential-Field Geophysical Software for the PC

    USGS Publications Warehouse

    ,

    1995-01-01

    The computer programs of the Potential-Field Software Package run under the DOS operating system on IBM-compatible personal computers. They are used for the processing, display, and interpretation of potential-field geophysical data (gravity- and magnetic-field measurements) and other data sets that can be represented as grids or profiles. These programs have been developed on a variety of computer systems over a period of 25 years by the U.S. Geological Survey.

  14. The Role of Computer Simulation in an Inquiry-Based Learning Environment: Reconstructing Geological Events as Geologists

    ERIC Educational Resources Information Center

    Lin, Li-Fen; Hsu, Ying-Shao; Yeh, Yi-Fen

    2012-01-01

    Several researchers have investigated the effects of computer simulations on students' learning. However, few have focused on how simulations with authentic contexts influence students' inquiry skills. Therefore, for the purposes of this study, we developed a computer simulation (FossilSim) embedded in an authentic inquiry lesson. FossilSim…

  15. Measuring novices' field mapping abilities using an in-class exercise based on expert task analysis

    NASA Astrophysics Data System (ADS)

    Caulkins, J. L.

    2010-12-01

    We are interested in developing a model of expert-like behavior for improving the teaching methods of undergraduate field geology. Our aim is to assist students in mastering the process of field mapping more efficiently and effectively and to improve their ability to think creatively in the field. To examine expert-mapping behavior, a cognitive task analysis was conducted with expert geologic mappers in an attempt to define the process of geologic mapping (i.e. to understand how experts carry out geological mapping). The task analysis indicates that expert mappers have a wealth of geologic scenarios at their disposal that they compare against examples seen in the field, experiences that most undergraduate mappers will not have had. While presenting students with many geological examples in class may increase their understanding of geologic processes, novices still struggle when presented with a novel field situation. Based on the task analysis, a short (45-minute) paper-map-based exercise was designed and tested with 14 pairs of 3rd year geology students. The exercise asks students to generate probable geologic models based on a series of four (4) data sets. Each data set represents a day’s worth of data; after the first “day,” new sheets simply include current and previously collected data (e.g. “Day 2” data set includes data from “Day 1” plus the new “Day 2” data). As the geologic complexity increases, students must adapt, reject or generate new geologic models in order to fit the growing data set. Preliminary results of the exercise indicate that students who produced more probable geologic models, and produced higher ratios of probable to improbable models, tended to go on to do better on the mapping exercises at the 3rd year field school. These results suggest that those students with more cognitively available geologic models may be more able to use these models in field settings than those who are unable to draw on these models for whatever reason. 
Giving students practice at generating geologic models to explain data may be useful in preparing our students for field mapping exercises.

  16. Reviews.

    ERIC Educational Resources Information Center

    Repak, Arthur J.; And Others

    1988-01-01

    Computer software, audiovisuals, and books are reviewed. Includes topics on interfacing, ionic equilibrium, space, the classification system, Acquired Immune Deficiency Syndrome, evolution, human body processes, energy, pesticides, teaching school, cells, and geological aspects. Availability, price, and a description of each are provided. (RT)

  17. Adaptive Conditioning of Multiple-Point Geostatistical Facies Simulation to Flow Data with Facies Probability Maps

    NASA Astrophysics Data System (ADS)

    Khodabakhshi, M.; Jafarpour, B.

    2013-12-01

    Characterization of complex geologic patterns that create preferential flow paths in certain reservoir systems requires higher-order geostatistical modeling techniques. Multipoint statistics (MPS) provides a flexible grid-based approach for simulating such complex geologic patterns from a conceptual prior model known as a training image (TI). In this approach, a stationary TI that encodes the higher-order spatial statistics of the expected geologic patterns is used to represent the shape and connectivity of the underlying lithofacies. While MPS is quite powerful for describing complex geologic facies connectivity, the nonlinear and complex relation between the flow data and facies distribution makes flow data conditioning quite challenging. We propose an adaptive technique for conditioning facies simulation from a prior TI to nonlinear flow data. Non-adaptive strategies for conditioning facies simulation to flow data can involve many forward flow model solutions and can be computationally very demanding. To improve the conditioning efficiency, we develop an adaptive sampling approach through a data feedback mechanism based on the sampling history. In this approach, after a short period of sampling burn-in time where unconditional samples are generated and passed through an acceptance/rejection test, an ensemble of accepted samples is identified and used to generate a facies probability map. This facies probability map contains the common features of the accepted samples and provides conditioning information about facies occurrence in each grid block, which is used to guide the conditional facies simulation process. As the sampling progresses, the initial probability map is updated according to the collective information about the facies distribution in the chain of accepted samples to increase the acceptance rate and efficiency of the conditioning.
This conditioning process can be viewed as an optimization approach where each new sample is proposed based on the sampling history to improve the data mismatch objective function. We extend the application of this adaptive conditioning approach to the case where multiple training images are proposed to describe the geologic scenario in a given formation. We discuss the advantages and limitations of the proposed adaptive conditioning scheme and use numerical experiments from fluvial channel formations to demonstrate its applicability and performance compared to non-adaptive conditioning techniques.
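A toy version of the feedback loop described above can make the idea concrete: binary facies samples pass an acceptance/rejection test against a data misfit, and accepted samples are averaged into a probability map that biases later proposals. Everything here (names, the acceptance rule, the proposal scheme) is an illustrative assumption, not the authors' algorithm.

```python
import numpy as np

def adaptive_condition(misfit, tol, shape=(8, 8), n_iter=3000, burn_in=300, seed=0):
    """Sketch of adaptive conditioning: after an unconditional burn-in,
    proposals are drawn from the running facies probability map built
    from previously accepted samples."""
    rng = np.random.default_rng(seed)
    prob = np.full(shape, 0.5)                # initial facies probability map
    n_acc = 0
    for i in range(n_iter):
        p = 0.5 if i < burn_in else prob      # burn-in: unconditional proposals
        sample = (rng.random(shape) < p).astype(float)
        if misfit(sample) < tol:              # acceptance/rejection test
            n_acc += 1
            prob = prob + (sample - prob) / n_acc  # running mean of accepted samples
    return prob, n_acc
```

As acceptances accumulate, the probability map concentrates on features common to the accepted samples, raising the acceptance rate in the spirit of the adaptive scheme.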

  18. Fast laboratory-based micro-computed tomography for pore-scale research: Illustrative experiments and perspectives on the future

    NASA Astrophysics Data System (ADS)

    Bultreys, Tom; Boone, Marijn A.; Boone, Matthieu N.; De Schryver, Thomas; Masschaele, Bert; Van Hoorebeke, Luc; Cnudde, Veerle

    2016-09-01

    Over the past decade, the widespread implementation of laboratory-based X-ray micro-computed tomography (micro-CT) scanners has revolutionized both experimental and numerical research on pore-scale transport in geological materials. The availability of these scanners has made it possible for many researchers to image a rock's pore space in 3D almost routinely. While challenges do persist in this field, we treat the next frontier in laboratory-based micro-CT scanning: in-situ, time-resolved imaging of dynamic processes. Extremely fast (even sub-second) micro-CT imaging has become possible at synchrotron facilities over the last few years; however, the restricted accessibility of synchrotrons limits the number of experiments that can be performed. The much smaller X-ray flux of laboratory-based systems bounds the time resolution that can be attained with them. Nevertheless, progress is being made to improve the quality of measurements performed on the sub-minute time scale. We illustrate this by presenting cutting-edge pore-scale experiments visualizing two-phase flow and solute transport in real time with a lab-based environmental micro-CT set-up. To outline the current state of this young field and its relevance to pore-scale transport research, we critically examine its current bottlenecks and their possible solutions, on both the hardware and the software level. Further developments in laboratory-based, time-resolved imaging could prove greatly beneficial to our understanding of transport behavior in geological materials and to the improvement of pore-scale modeling by providing valuable validation.

  19. Revisiting Frazier's subdeltas: enhancing datasets with dimensionality, better to understand geologic systems

    USGS Publications Warehouse

    Flocks, James

    2006-01-01

    Scientific knowledge from the past century is commonly represented by two-dimensional figures and graphs, as presented in manuscripts and maps. Using today's computer technology, this information can be extracted and projected into three- and four-dimensional perspectives. Computer models can be applied to datasets to provide additional insight into complex spatial and temporal systems. This process can be demonstrated by applying digitizing and modeling techniques to valuable information within widely used publications. The seminal paper by D. Frazier, published in 1967, identified 16 separate delta lobes formed by the Mississippi River during the past 6,000 years. The paper includes stratigraphic descriptions through geologic cross sections, and provides distributions and chronologies of the delta lobes. The data from Frazier's publication are extensively referenced in the literature. Additional information can be extracted from the data through computer modeling. Digitizing and geo-rectifying Frazier's geologic cross sections produces a three-dimensional perspective of the delta lobes. Adding the chronological data included in the report provides the fourth dimension of the delta cycles, which can be visualized through computer-generated animation. Supplemental information can be added to the model, such as post-abandonment subsidence of the delta-lobe surface. Analyzing the regional, net surface-elevation balance between delta progradations and land subsidence is computationally intensive. By visualizing this process during the past 4,500 years through multi-dimensional animation, the importance of sediment compaction in influencing both the shape and direction of subsequent delta progradations becomes apparent. Visualization enhances a classic dataset, can be further refined using additional data, and provides a guide for identifying future areas of study.

  20. National geochemical data base; PLUTO geochemical data base for the United States

    USGS Publications Warehouse

    Baedecker, Philip A.; Grossman, Jeffrey N.; Buttleman, Kim P.

    1998-01-01

    The PLUTO CD-ROM data base contains inorganic geochemical data obtained by the analytical laboratories of the Geologic Division of the U.S. Geological Survey (USGS) for the United States, including Hawaii and Alaska, in support of USGS program activities requiring chemical data. This CD-ROM was produced in accordance with the ISO 9660 standard and can be accessed by any computer system that has the appropriate software to read ISO 9660 discs; however, the disc is intended for use in a DOS environment.

  1. 3D Stratigraphic Modeling of Central Aachen

    NASA Astrophysics Data System (ADS)

    Dong, M.; Neukum, C.; Azzam, R.; Hu, H.

    2010-05-01

    Since the 1980s, advanced computer hardware and software technologies, as well as multidisciplinary research, have made it possible to develop advanced three-dimensional (3D) simulation software for geoscience applications. Some countries, such as the USA1) and Canada2) 3), have built regional 3D geological models based on archival geological data. Such models have played major roles in engineering geology2), hydrogeology2) 3), the geothermal industry1) and so on. In cooperation with the Municipality of Aachen, the Department of Engineering Geology of RWTH Aachen University has built a computer-based 3D stratigraphic model, to a depth of 50 meters, for the center of Aachen, a geologically complex area of 5 km by 7 km. The uncorrelated data from multiple sources, the discontinuous nature of the units and their unconformable connections are the main challenges for geological modeling in this area. The reliability of 3D geological models largely depends on the quality and quantity of data. Existing 1D and 2D geological data were collected, including 1) approximately 6970 borehole records of different depths compiled in Microsoft Access and MapInfo databases; 2) a Digital Elevation Model (DEM); 3) geological cross sections; and 4) stratigraphic maps at 1 m, 2 m and 5 m depth. Since the acquired data are of variable origins, they were processed step by step. The main steps are described below: 1) Typing errors in the borehole data were identified, and the corrected data were exported to Variowin 2.2 to detect duplicate points; 2) The surface elevations of the borehole data were compared to the DEM, and records differing by more than 3 m were eliminated. Moreover, where elevation data were missing, they were read from the DEM; 3) Considerable data came from municipal construction projects, such as residential buildings, factories, and roads.
Therefore, many boreholes are spatially clustered, and only one or two representative points were picked out in such areas. After the above procedures, 5839 boreholes with x, y, z coordinates, down-hole depth, and stratigraphic information were available. 4) We grouped the stratigraphic units into four main layers based on an analysis of the geological setting of the modeling area. The stratigraphic units extend from the Quaternary and Cretaceous through the Carboniferous to the Devonian. To facilitate the determination of each unit's boundaries, a standard coding scheme was used to integrate data with different descriptive attributes. 5) The Quaternary and Cretaceous units are characterized by subhorizontal layers. Kriging interpolation was applied to the borehole data to estimate the data distribution and surface relief of these layers. 6) The Carboniferous and Devonian units are folded. The lack of software support for simulating folds, together with the shallow depth of the boreholes and cross sections, constrained the determination of geological boundaries. A strategy of digitizing the fold surfaces from cross sections and establishing them as inclined strata was followed. The modeling was subdivided into two steps. The first step consisted of importing data into the modeling software. The second step involved the construction of the subhorizontal layers and folds, constrained by geological maps, cross sections and outcrops. The construction of the 3D stratigraphic model is of high relevance to further simulation and application, such as 1) lithological modeling; 2) answering simple questions such as "At which unit is the water table?" and calculating the volume of groundwater storage when assessing aquifer vulnerability to contamination; and 3) assigning geotechnical properties to grid cells and providing them for user-required applications. Acknowledgements: Borehole data were kindly provided by the Municipality of Aachen. References: 1. Janet T. Watt, Jonathan M.G.
Glen, David A. John and David A. Ponce (2007) Three-dimensional geologic model of the northern Nevada rift and the Beowawe geothermal system, north-central Nevada. Geosphere, v. 3; no. 6; p. 667-682 2. Martin Ross, Michel Parent and René Lefebvre (2005) 3D geologic framework models for regional hydrogeology and land-use management: a case study from a Quaternary basin of southwestern Quebec, Canada. Hydrogeology Journal, 13:690-707 3. Martin Ross, Richard Martel, René Lefebvre, Michel Parent and Martine M. Savard (2004) Assessing rock aquifer vulnerability using downward advective times from a 3D model of surficial geology: A case study from the St. Lawrence Lowlands, Canada. Geofísica Internacional Vol. 43, Num. 4, pp. 591-602
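
The borehole screening in steps 1) and 2) above (typo and duplicate checks, then a 3 m consistency test against the DEM, with missing elevations filled from the DEM) can be sketched as follows. The records and the analytic DEM stand-in are hypothetical; a real workflow would sample a raster DEM and run a geostatistical duplicate check such as the Variowin step described in the abstract.

```python
import numpy as np

# Hypothetical borehole records: x, y, reported surface elevation (NaN = missing).
boreholes = np.array([
    [2.0, 3.0, 182.4],
    [5.0, 1.0, np.nan],   # missing elevation: to be read from the DEM
    [4.0, 4.0, 190.0],    # differs from the DEM by more than 3 m
])

def dem(x, y):
    # Analytic stand-in for a DEM lookup; a real model samples a raster.
    return 180.0 + 0.5 * x + 0.3 * y

cleaned = []
for x, y, z in boreholes:
    z_dem = dem(x, y)
    if np.isnan(z):
        z = z_dem                # fill a missing elevation from the DEM
    if abs(z - z_dem) > 3.0:     # the 3 m consistency test
        continue                 # eliminate the inconsistent record
    cleaned.append((x, y, z))

print(len(cleaned))  # 2: one record filled from the DEM, one rejected
```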

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narr, W.; Currie, J.B.

    The occurrence of natural fracture systems in subsurface rock can be predicted if careful evaluation is made of the geologic processes that affect sedimentary strata during their cycle of burial, diagenesis, uplift, and erosional unloading. Variations in the state of stress within rock arise, for example, from changes in temperature, pore pressure, weight of overburden, or tectonic loading. Hence geologic processes acting on a sedimentary unit should be analyzed for their several contributions to the state of stress, and this information used to compute a stress history. From this stress history, predictions may be made as to when in the burial cycle to expect fracture (joint) formation, what type of fractures (extension or shear) may occur, and which geologic factors are most favorable to the development of fractures. A stress history is computed for strata of the naturally fractured Altamont oil field in Utah's Uinta basin. Calculations suggest that fractures formed in extension, that the well-cemented rocks are those most likely to be fractured, and that fractures began to develop only after strata were uplifted and denuded of overburden. Geologic evidence on fracture genesis and development is in accord with the stress history prediction. Stress history can be useful in evaluating a sedimentary basin for naturally fractured reservoir exploration plays.
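
A stress-history analysis of this kind reduces, in caricature, to tracking the effective stress through the burial cycle and checking it against a tensile-failure criterion at each stage. The stage names below echo the abstract's burial cycle, but every number is illustrative rather than taken from the Altamont calculations.

```python
# Illustrative stress history: effective horizontal stress (MPa, tension
# negative) at four stages of the burial cycle named in the abstract.
stages = [
    ("deposition",   5.0),
    ("deep burial", 12.0),
    ("uplift",       1.0),
    ("denudation",  -4.0),   # unloading leaves the unit in tension
]
tensile_strength = -3.0  # MPa; hypothetical value for a well-cemented rock

for name, sigma_h in stages:
    fractured = sigma_h <= tensile_strength   # extension-fracture criterion
    print(f"{name}: sigma_h = {sigma_h:+.1f} MPa, fracture = {fractured}")
```

With these illustrative numbers, the criterion is met only at the denudation stage, mirroring the abstract's finding that fractures developed only after uplift and removal of overburden.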

  3. A Fast Full Tensor Gravity computation algorithm for High Resolution 3D Geologic Interpretations

    NASA Astrophysics Data System (ADS)

    Jayaram, V.; Crain, K.; Keller, G. R.

    2011-12-01

    We present an algorithm to rapidly calculate the vertical gravity and full tensor gravity (FTG) values due to a 3-D geologic model. This algorithm can be implemented on single-core and multi-core CPU and graphical processing unit (GPU) architectures. Our technique is based on the line-element approximation with a constant density within each grid cell. This type of parameterization is well suited for high-resolution elevation datasets with grid sizes typically in the range of 1 m to 30 m. The large high-resolution data grids in our studies employ a pre-filtered mipmap-pyramid representation of the grid data known as a geometry clipmap. The clipmap was first introduced by Microsoft Research in 2004 for fly-through terrain visualization. This method caches nested rectangular extents of down-sampled data layers in the pyramid to create a view-dependent calculation scheme. Together with the simple grid structure, this allows the gravity to be computed conveniently on the fly, or stored in a highly compressed format. Neither of these capabilities has previously been available. Our approach can perform rapid calculations on large topographies, including crustal-scale models derived from complex geologic interpretations. For example, we used a 1 km sphere model consisting of 105,000 cells at 10 m resolution with 100,000 gravity stations. The line-element approach took less than 90 seconds to compute the FTG and vertical gravity on an Intel Core i7 CPU at 3.07 GHz, utilizing just a single core. Also, unlike traditional gravity computational algorithms, the line-element approach can calculate gravity effects at locations interior or exterior to the model. The only condition that must be met is that the observation point cannot be located directly above the line element. Therefore, we perform a location test and then apply the appropriate formulation to those data points.
We will present and compare the computational performance of the traditional prism method versus the line element approach on different CPU-GPU system configurations. The algorithm calculates the expected gravity at station locations where the observed gravity and FTG data were acquired. This algorithm can be used for all fast forward model calculations of 3D geologic interpretations for data from airborne, space and submarine gravity, and FTG instrumentation.
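
The line-element approximation itself is compact: the vertical attraction of a vertical line mass running from depth z1 to z2 at horizontal offset r has a closed form, and a model is a sum of such elements, one per grid cell. A minimal sketch follows; the density, cell size, and station geometry are hypothetical, and none of the clipmap or GPU machinery is included.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def line_element_gz(rho, cell_area, r, z1, z2):
    """Vertical gravity at the origin due to a vertical line element of
    linear density rho*cell_area extending from depth z1 to z2 at
    horizontal offset r (r must be nonzero: no station directly above)."""
    lam = rho * cell_area  # kg/m, mass per unit length of the element
    return G * lam * (1.0 / math.hypot(r, z1) - 1.0 / math.hypot(r, z2))

# One 10 m x 10 m cell of 2670 kg/m^3 rock, 100-110 m deep, 1 km away:
gz = line_element_gz(2670.0, 100.0, 1000.0, 100.0, 110.0)
print(gz)  # on the order of 1e-11 m/s^2
```

For a short, distant element this agrees closely with treating the cell as a point mass at its mid-depth, which is a convenient sanity check on the formula.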

  4. Guidelines and standard procedures for continuous water-quality monitors: Station operation, record computation, and data reporting

    USGS Publications Warehouse

    Wagner, Richard J.; Boulger, Robert W.; Oblinger, Carolyn J.; Smith, Brett A.

    2006-01-01

    The U.S. Geological Survey uses continuous water-quality monitors to assess the quality of the Nation's surface water. A common monitoring-system configuration for water-quality data collection is the four-parameter monitoring system, which collects temperature, specific conductance, dissolved oxygen, and pH data. Such systems also can be configured to measure other properties, such as turbidity or fluorescence. Data from sensors can be used in conjunction with chemical analyses of samples to estimate chemical loads. The sensors that are used to measure water-quality field parameters require careful field observation, cleaning, and calibration procedures, as well as thorough procedures for the computation and publication of final records. This report provides guidelines for site- and monitor-selection considerations; sensor inspection and calibration methods; field procedures; data evaluation, correction, and computation; and record-review and data-reporting processes, which supersede the guidelines presented previously in U.S. Geological Survey Water-Resources Investigations Report WRIR 00-4252. These procedures have evolved over the past three decades, and the process continues to evolve with newer technologies.
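
The data-correction step of record computation commonly prorates the error found at a servicing visit linearly over the time since the previous visit. A simplified sketch of that idea follows; the dates, readings, and errors are invented, and the full USGS procedure additionally separates fouling corrections from calibration-drift corrections.

```python
from datetime import datetime

# Two calibration visits bracketing a month of hypothetical
# specific-conductance record (units: microsiemens per centimeter).
t0 = datetime(2006, 6, 1)
t1 = datetime(2006, 7, 1)
error_t0 = 0.0     # sensor read true at the first visit
error_t1 = -12.0   # sensor read 12 uS/cm low at the second visit

def corrected(value, t):
    frac = (t - t0) / (t1 - t0)                  # fraction of interval elapsed
    drift = error_t0 + frac * (error_t1 - error_t0)
    return value - drift                          # remove the prorated error

print(corrected(500.0, datetime(2006, 6, 16)))   # 506.0 at the midpoint
```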

  5. Offshore survey provides answers to coastal stability and potential offshore extensions of landslides into Abalone Cove, Palos Verdes peninsula, Calif

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dill, R.F.; Slosson, J.E.

    1993-04-01

    The configuration and stability of the present coast line near Abalone Cove, on the south side of Palos Verdes Peninsula, California is related to the geology, oceanographic conditions, and recent and ancient landslide activity. This case study utilizes offshore high resolution seismic profiles, side-scan sonar, diving, and coring, to relate marine geology to the stability of a coastal region with known active landslides utilizing a desk top computer and off-the-shelf software. Electronic navigation provided precise positioning that when applied to computer generated charts permitted correlation of survey data needed to define the offshore geology and sea floor sediment patterns. Amore » mackintosh desk-top computer and commercially available off-the-shelf software provided the analytical tools for constructing a base chart and a means to superimpose template overlays of topography, isopachs or sediment thickness, bottom roughness and sediment distribution patterns. This composite map of offshore geology and oceanography was then related to an extensive engineering and geological land study of the coastal zone forming Abalone Cove, an area of active landslides. Vibrocoring provided ground sediment data for high resolution seismic traverses. This paper details the systems used, present findings relative to potential landslide movements, coastal erosion and discuss how conclusions were reached to determine whether or not onshore landslide failures extend offshore.« less

  6. Environmental mapping and monitoring of Iceland by remote sensing (EMMIRS)

    NASA Astrophysics Data System (ADS)

    Pedersen, Gro B. M.; Vilmundardóttir, Olga K.; Falco, Nicola; Sigurmundsson, Friðþór S.; Rustowicz, Rose; Belart, Joaquin M.-C.; Gísladóttir, Gudrun; Benediktsson, Jón A.

    2016-04-01

    Iceland is exposed to rapid and dynamic landscape changes caused by natural processes and man-made activities, which impact and challenge the country. Fast and reliable mapping and monitoring techniques are needed at a large spatial scale. However, there is currently a lack of the operational, advanced information-processing techniques that end-users need to incorporate remote sensing (RS) data from multiple data sources. Hence, the potential of the recent RS data explosion is not being fully exploited. The project Environmental Mapping and Monitoring of Iceland by Remote Sensing (EMMIRS) bridges the gap between advanced information-processing capabilities and end-user mapping of the Icelandic environment. This is done through a multidisciplinary assessment of two selected remote sensing supersites, Hekla and Öræfajökull, which encompass many of the rapid natural and man-made landscape changes to which Iceland is exposed. An open-access benchmark repository of the two remote sensing supersites is under construction, providing high-resolution LIDAR topography and hyperspectral data for land-cover and landform classification. Furthermore, a multi-temporal and multi-source archive stretching back to 1945 allows a decadal evaluation of landscape and ecological changes at the two supersites through the development of automated change detection techniques. The development of innovative pattern recognition and machine learning-based approaches to image classification and change detection is one of the main tasks of the EMMIRS project, aiming to extract and compute earth observation variables as automatically as possible. Ground reference data collected through a field campaign will be used to validate the implemented methods, whose outputs are then integrated with geological and vegetation models. Here, preliminary results of an automatic land-cover classification based on hyperspectral image analysis are reported.
Furthermore, the EMMIRS project investigates the complex landscape dynamics between geological and ecological processes. This is done through cross-correlation of mapping results and the implementation of modelling techniques that simulate geological and ecological processes in order to extrapolate the landscape evolution.

  7. Cross-disciplinary Undergraduate Research: A Case Study in Digital Mapping, western Ireland

    NASA Astrophysics Data System (ADS)

    Whitmeyer, S. J.; de Paor, D. G.; Nicoletti, J.; Rivera, M.; Santangelo, B.; Daniels, J.

    2008-12-01

    As digital mapping technology becomes ever more advanced, field geologists spend a greater proportion of time learning digital methods relative to analyzing rocks and structures. To explore potential solutions to the time commitment implicit in learning digital field methods, we paired James Madison University (JMU) geology majors (experienced in traditional field techniques) with Worcester Polytechnic Institute (WPI) engineering students (experienced in computer applications) during a four week summer mapping project in Connemara, western Ireland. The project consisted of approximately equal parts digital field mapping (directed by the geology students), and lab-based map assembly, evaluation and formatting for virtual 3D terrains (directed by the engineering students). Students collected geologic data in the field using ruggedized handheld computers (Trimble GeoExplorer® series) with ArcPAD® software. Lab work initially focused on building geologic maps in ArcGIS® from the digital field data and then progressed to developing Google Earth-based visualizations of field data and maps. Challenges included exporting GIS data, such as locations and attributes, to KML tags for viewing in Google Earth, which we accomplished using a Linux bash script written by one of our engineers - a task outside the comfort zone of the average geology major. We also attempted to expand the scope of Google Earth by using DEMs of present-day geologically-induced landforms as representative models for paleo-geographic reconstructions of the western Ireland field area. As our integrated approach to digital field work progressed, we found that our digital field mapping produced data at a faster rate than could be effectively managed during our allotted time for lab work. This likely reflected the more developed methodology for digital field data collection, as compared with our lab-based attempts to develop new methods for 3D visualization of geologic maps. 
However, this experiment in cross-disciplinary undergraduate research was a big success, with an enthusiastic interchange of expertise between undergraduate geology and engineering students that produced new, cutting-edge methods for visualizing geologic data and maps.

  8. Catalog of Computer Programs Used in Undergraduate Geological Education. Second Edition. Installment 4.

    ERIC Educational Resources Information Center

    Burger, H. Robert

    1984-01-01

    Describes 70 computer programs related to (1) structural geology; (2) sedimentology and stratigraphy; and (3) the environment, groundwater, glacial geology, and oceanography. Potential use(s), language, required hardware, and sources are included. (JM)

  9. Streamstats: U.S. Geological Survey Web Application for Streamflow Statistics for Connecticut

    USGS Publications Warehouse

    Ahearn, Elizabeth A.; Ries, Kernell G.; Steeves, Peter A.

    2006-01-01

    Introduction An important mission of the U. S. Geological Survey (USGS) is to provide information on streamflow in the Nation's rivers. Streamflow statistics are used by water managers, engineers, scientists, and others to protect people and property during floods and droughts, and to manage land, water, and biological resources. Common uses for streamflow statistics include dam, bridge, and culvert design; water-supply planning and management; water-use appropriations and permitting; wastewater and industrial discharge permitting; hydropower-facility design and regulation; and flood-plain mapping for establishing flood-insurance rates and land-use zones. In an effort to improve access to published streamflow statistics, and to make the process of computing streamflow statistics for ungaged stream sites easier, more accurate, and more consistent, the USGS and the Environmental Systems Research Institute, Inc. (ESRI) developed StreamStats (Ries and others, 2004). StreamStats is a Geographic Information System (GIS)-based Web application for serving previously published streamflow statistics and basin characteristics for USGS data-collection stations, and computing streamflow statistics and basin characteristics for ungaged stream sites. The USGS, in cooperation with the Connecticut Department of Environmental Protection and the Connecticut Department of Transportation, has implemented StreamStats for Connecticut.

  10. StreamStats in Oklahoma - Drainage-Basin Characteristics and Peak-Flow Frequency Statistics for Ungaged Streams

    USGS Publications Warehouse

    Smith, S. Jerrod; Esralew, Rachel A.

    2010-01-01

    The USGS Streamflow Statistics (StreamStats) Program was created to make geographic information systems-based estimation of streamflow statistics easier, faster, and more consistent than previously used manual techniques. The StreamStats user interface is a map-based internet application that allows users to easily obtain streamflow statistics, basin characteristics, and other information for user-selected U.S. Geological Survey data-collection stations and ungaged sites of interest. The application relies on the data collected at U.S. Geological Survey streamflow-gaging stations, computer-aided computations of drainage-basin characteristics, and published regression equations for several geographic regions comprising the United States. The StreamStats application interface allows the user to (1) obtain information on features in selected map layers, (2) delineate drainage basins for ungaged sites, (3) download drainage-basin polygons to a shapefile, (4) compute selected basin characteristics for delineated drainage basins, (5) estimate selected streamflow statistics for ungaged points on a stream, (6) print map views, (7) retrieve information for U.S. Geological Survey streamflow-gaging stations, and (8) get help on using StreamStats. StreamStats was designed for national application, with each state, territory, or group of states responsible for creating unique geospatial datasets and regression equations to compute selected streamflow statistics. With the cooperation of the Oklahoma Department of Transportation, StreamStats has been implemented for Oklahoma and is available at http://water.usgs.gov/osw/streamstats/. The Oklahoma StreamStats application covers 69 processed hydrologic units and most of the state of Oklahoma.
Basin characteristics available for computation include contributing drainage area, contributing drainage area that is unregulated by Natural Resources Conservation Service floodwater retarding structures, mean-annual precipitation at the drainage-basin outlet for the period 1961-1990, 10-85 channel slope (slope between points located at 10 percent and 85 percent of the longest flow-path length upstream from the outlet), and percent impervious area. The Oklahoma StreamStats application interacts with the National Streamflow Statistics database, which contains the peak-flow regression equations in a previously published report. Fourteen peak-flow (flood) frequency statistics are available for computation in the Oklahoma StreamStats application. These statistics include the peak flow at 2-, 5-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals for rural, unregulated streams; and the peak flow at 2-, 5-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals for rural streams that are regulated by Natural Resources Conservation Service floodwater retarding structures. Basin characteristics and streamflow statistics cannot be computed for locations in playa basins (mostly in the Oklahoma Panhandle) and along main stems of the largest river systems in the state, namely the Arkansas, Canadian, Cimarron, Neosho, Red, and Verdigris Rivers, because parts of the drainage areas extend outside of the processed hydrologic units.
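
Regional regression equations of the kind StreamStats evaluates for ungaged sites generally take a power-law form in the basin characteristics. The sketch below shows only the evaluation mechanics; the coefficient, exponents, and characteristic values are invented for illustration and are not the published Oklahoma equations.

```python
# Evaluate a power-law regional regression Q_T = a * X1^b1 * X2^b2 * ...
# where the X are basin characteristics (drainage area, precipitation, ...).
def peak_flow(a, exponents, characteristics):
    q = a
    for b, x in zip(exponents, characteristics):
        q *= x ** b
    return q

# Hypothetical 100-year equation: Q100 = 120 * A^0.6 * P^0.3,
# with A = contributing drainage area (mi^2), P = mean annual precip (in).
q100 = peak_flow(120.0, [0.6, 0.3], [50.0, 40.0])
print(round(q100))  # a few thousand ft^3/s
```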

  11. Caprock Breach: A Threat to Secure Geologic Sequestration

    NASA Astrophysics Data System (ADS)

    Selvadurai, A. P.; Dong, W.

    2013-12-01

    The integrity of caprock in providing a reliable barrier is crucial to several environmental geoscience endeavours related to the geologic sequestration of CO2 and the deep geologic disposal of hazardous wastes and contaminants. The integrity of geologic barriers can be compromised by several factors. The re-activation of dormant fractures and the development of new fractures in the caprock during the injection process are regarded as effects that can pose a threat to storage security. Other poromechanical influences, such as pore-structure collapse due to chemically induced erosion of the porous fabric resulting in wormhole-type features, can also compromise storage security. The assessment of the rate of steady or transient seepage through defects in the caprock can allow geoscientists to make prudent evaluations of the effectiveness of a sequestration strategy. While complicated computational simulations can be used to calculate leakage through defects, it is useful to explore alternative analytical results that can provide preliminary estimates of leakage rates through defects in the caprock in a storage setting. The relevance of such developments is underscored by the fact that the permeability characteristics of the storage formation, the fracture and the surficial rocks overlying the caprock can rarely be quantified with certainty. This paper presents the problem of a crack in a caprock that connects a storage formation to an overburden rock or surficial soil formation. The geologic media are maintained at constant far-field flow potentials, and leakage takes place under either steady or transient conditions. The paper develops an analytical result that can be used to estimate the steady seepage through the crack. The analytical result can also be used to estimate the leakage through hydraulically non-intersecting cracks and leakage from caprock-well casing interfaces.
The analytical result is used to estimate the accuracy of a computational procedure based on a finite element procedure.
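
The paper's analytical solution is not reproduced here, but the kind of preliminary screening estimate it argues for can be illustrated with a plain Darcy calculation of leakage through a caprock defect. All parameter values below are invented.

```python
# First-order Darcy estimate of steady leakage through a caprock defect;
# this is NOT the paper's analytical result, only the style of preliminary
# screening calculation the abstract motivates.
def darcy_leakage(k, area, dphi, mu, length):
    """Volumetric flow rate Q = k * A * (delta phi) / (mu * L)."""
    return k * area * dphi / (mu * length)

Q = darcy_leakage(
    k=1e-13,      # m^2, permeability of the fracture fill (hypothetical)
    area=0.05,    # m^2, cross-sectional area of the crack
    dphi=2.0e6,   # Pa, flow-potential difference across the caprock
    mu=1.0e-3,    # Pa*s, brine viscosity
    length=50.0,  # m, caprock thickness
)
print(Q)  # m^3/s
```

Even such a crude estimate makes the abstract's point: with the permeabilities uncertain by orders of magnitude, a closed-form expression that can be re-evaluated instantly is more useful for screening than a full numerical simulation.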

  12. Modeling coupled Thermo-Hydro-Mechanical processes including plastic deformation in geological porous media

    NASA Astrophysics Data System (ADS)

    Kelkar, S.; Karra, S.; Pawar, R. J.; Zyvoloski, G.

    2012-12-01

    There has been increasing interest in recent years in developing computational tools for analyzing coupled thermal, hydrological and mechanical (THM) processes that occur in geological porous media. This is mainly due to their importance in applications including carbon sequestration, enhanced geothermal systems, oil and gas production from unconventional sources, degradation of Arctic permafrost, and nuclear waste isolation. Large changes in pressures, temperatures and saturation can result from the injection/withdrawal of fluids or from emplaced heat sources. These can potentially lead to large changes in the fluid flow and mechanical behavior of the formation, including shear and tensile failure on pre-existing or induced fractures and the associated permeability changes. Consequently, plastic deformation and large changes in material properties such as permeability and porosity can be expected to play an important role in these processes. We describe a general-purpose computational code, FEHM, that has been developed for modeling coupled THM processes during multi-phase fluid flow and transport in fractured porous media. The code uses a continuum mechanics approach based on the control-volume finite element method. It is designed to address spatial scales on the order of tens of centimeters to tens of kilometers. While large deformations are important in many situations, we have adopted the small-strain formulation, as useful insight can be obtained in many problems of practical interest with this approach while remaining computationally manageable. Nonlinearities in the equations and the material properties are handled using a full-Jacobian Newton-Raphson technique. Stress-strain relationships are assumed to follow linear elastic/plastic behavior.
The code incorporates several plasticity models such as von Mises, Drucker-Prager, and also a large suite of models for coupling flow and mechanical deformation via permeability and stresses/deformations. In this work we present several example applications of such models.
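
The elastic/plastic stress-strain behavior mentioned above can be illustrated with a minimal one-dimensional elastic predictor/plastic corrector (return-mapping) update, the general technique behind von Mises-type models. This is a sketch of the technique, not FEHM's implementation, and the modulus and yield stress are arbitrary.

```python
# 1-D elastic/perfectly-plastic return mapping: elastic trial stress,
# then projection back onto the yield surface if the trial exceeds it.
E = 10e9          # Pa, Young's modulus (hypothetical)
sigma_y = 50e6    # Pa, yield stress (hypothetical)

def stress_update(sigma_trial):
    if abs(sigma_trial) <= sigma_y:
        return sigma_trial                            # elastic step
    return sigma_y if sigma_trial > 0 else -sigma_y   # return to yield

strain_increments = [1e-3, 2e-3, 3e-3, 8e-3]
sigma = 0.0
eps_p = 0.0       # accumulated plastic strain
history = []
for deps in strain_increments:
    trial = sigma + E * deps       # elastic predictor
    sigma = stress_update(trial)   # plastic corrector
    eps_p += (trial - sigma) / E   # strain not carried elastically
    history.append(sigma)

print(history, eps_p)  # stress capped at 50 MPa once yield is reached
```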

  13. A program for mass spectrometer control and data processing analyses in isotope geology; written in BASIC for an 8K Nova 1120 computer

    USGS Publications Warehouse

    Stacey, J.S.; Hope, J.

    1975-01-01

    A system is described which uses a minicomputer to control a surface ionization mass spectrometer in the peak switching mode, with the object of computing isotopic abundance ratios of elements of geologic interest. The program uses the BASIC language and is sufficiently flexible to be used for multiblock analyses of any spectrum containing from two to five peaks. In the case of strontium analyses, ratios are corrected for rubidium content and normalized for mass spectrometer fractionation. Although almost any minicomputer would be suitable, the model used was the Data General Nova 1210 with 8K memory. Assembly language driver program and interface hardware-descriptions for the Nova 1210 are included.
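
The normalization for mass-spectrometer fractionation mentioned in the abstract is still routine; a common modern form is the exponential-law correction that forces the measured 86Sr/88Sr to the canonical 0.1194. The sketch below round-trips a simulated fractionated measurement. The 1975 BASIC program used its own scheme, and the beta value and the 0.71034 ratio here are illustrative.

```python
import math

# Exponential-law mass-bias correction for Sr isotope ratios.
M86, M87, M88 = 85.9093, 86.9089, 87.9056   # atomic masses (amu)
TRUE_86_88 = 0.1194                          # canonical normalizing ratio

def correct_87_86(meas_87_86, meas_86_88):
    # Solve for the fractionation exponent from the 86/88 pair, then
    # apply it to 87/86 with the appropriate mass ratio.
    beta = math.log(TRUE_86_88 / meas_86_88) / math.log(M86 / M88)
    return meas_87_86 * (M87 / M86) ** beta

# Simulate an instrument with per-mass fractionation and correct it back:
beta_true = 1.8
meas_86_88 = TRUE_86_88 * (M86 / M88) ** beta_true
meas_87_86 = 0.71034 * (M87 / M86) ** beta_true
print(correct_87_86(meas_87_86, meas_86_88))  # recovers ~0.71034
```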

  14. Acoustic reverse-time migration using GPU card and POSIX thread based on the adaptive optimal finite-difference scheme and the hybrid absorbing boundary condition

    NASA Astrophysics Data System (ADS)

    Cai, Xiaohui; Liu, Yang; Ren, Zhiming

    2018-06-01

    Reverse-time migration (RTM) is a powerful tool for imaging geologically complex structures such as steep dips and subsalt features. However, its implementation is computationally expensive. Recently, as a low-cost solution, the graphics processing unit (GPU) was introduced to improve the efficiency of RTM. In this paper, we develop three ameliorative strategies to implement RTM on a GPU card. First, given the high accuracy and efficiency of the adaptive optimal finite-difference (FD) method based on least squares (LS) on the central processing unit (CPU), we study the optimal LS-based FD method on the GPU. Second, we extend the CPU-based hybrid absorbing boundary condition (ABC) to a GPU-based one by addressing two issues that arise when the former is ported to a GPU card: long run times and disordered threads. Third, for large-scale data, a combined strategy of optimal checkpointing and efficient boundary storage is introduced to trade memory against recomputation. To save communication time between host and disk, a portable operating system interface (POSIX) thread is used to occupy an additional CPU core at the checkpoints. Applications of the three strategies on a GPU with the compute unified device architecture (CUDA) programming language in RTM demonstrate their efficiency and validity.
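The memory-versus-recomputation trade-off behind checkpointing can be illustrated with a minimal sketch: store the forward wavefield only at checkpoints and rebuild intermediate states by re-stepping from the nearest one. The `forward_step` function here is a hypothetical stand-in for a finite-difference wavefield update, not the authors' CUDA code.

```python
def forward_step(state):
    # Toy stand-in for one finite-difference time step
    return state * 1.01 + 1.0

def run_with_checkpoints(x0, nsteps, interval):
    """Run forward, storing the state only every `interval` steps."""
    checkpoints = {0: x0}
    x = x0
    for t in range(1, nsteps + 1):
        x = forward_step(x)
        if t % interval == 0:
            checkpoints[t] = x
    return checkpoints

def recompute_state(checkpoints, t, interval):
    """Recover the state at time t by re-stepping from the nearest checkpoint,
    as done during the backward (imaging) pass."""
    t0 = (t // interval) * interval
    x = checkpoints[t0]
    for _ in range(t0, t):
        x = forward_step(x)
    return x
```

Storing every `interval`-th state cuts memory by that factor at the cost of at most `interval - 1` extra recomputed steps per requested state.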

  15. Machine processing of remotely sensed data - quantifying global process: Models, sensor systems, and analytical methods; Proceedings of the Eleventh International Symposium, Purdue University, West Lafayette, IN, June 25-27, 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mengel, S.K.; Morrison, D.B.

    1985-01-01

    Consideration is given to global biogeochemical issues, image processing, remote sensing of tropical environments, global processes, geology, landcover hydrology, and ecosystems modeling. Topics discussed include multisensor remote sensing strategies, geographic information systems, radars, and agricultural remote sensing. Papers are presented on fast feature extraction; a computational approach for adjusting TM imagery terrain distortions; the segmentation of a textured image by a maximum likelihood classifier; analysis of MSS Landsat data; sun angle and background effects on spectral response of simulated forest canopies; an integrated approach for vegetation/landcover mapping with digital Landsat images; geological and geomorphological studies using an image processing technique; and wavelength intensity indices in relation to tree conditions and leaf-nutrient content.

  16. Geologic process studies using Synthetic Aperture Radar (SAR) data

    NASA Technical Reports Server (NTRS)

    Evans, Diane L.

    1992-01-01

    The use of SAR data to study geologic processes for better understanding of recent tectonic activity and climate change, as well as for the mitigation of geologic hazards and exploration for nonrenewable resources, is discussed. The geologic processes that are particularly amenable to SAR-based study include volcanism; soil erosion, degradation, and redistribution; coastal erosion and inundation; glacier fluctuations; permafrost; and crustal motions. When SAR data are combined with data from other planned spaceborne sensors, including ESA ERS, the Japanese Earth Resources Satellite, and the Canadian Radarsat, it will be possible to build a time-series view of temporal changes over many regions of the Earth.

  17. Venus geology and tectonics - Hotspot and crustal spreading models and questions for the Magellan mission

    NASA Technical Reports Server (NTRS)

    Head, James W.; Crumpler, L. S.

    1990-01-01

    Spacecraft and ground-based observations of Venus have revealed a geologically young and active surface - with volcanoes, rift zones, orogenic belts and evidence for hotspots and crustal spreading - yet the processes responsible for these features cannot be identified from the available data. The Magellan spacecraft will acquire an unprecedented global data set which will provide a comprehensive and well resolved view of the planet. This will permit global geological mapping, an assessment of the style and relative importance of geological processes, and will help in the understanding of links between the surface geology and mantle dynamics of this earth-like planet.

  18. Novice Interpretations of Visual Representations of Geosciences Data

    NASA Astrophysics Data System (ADS)

    Burkemper, L. K.; Arthurs, L.

    2013-12-01

    Past cognitive research on individuals' perception and comprehension of bar and line graphs is substantive enough to have resulted in graph design principles and graph comprehension theories; however, gaps remain in our understanding of how people process visual representations of data, especially of geologic and atmospheric data. This pilot project serves to build on others' prior research and begin filling the existing gaps. Its primary objectives are to: (i) design a novel data collection protocol based on a combination of paper-based surveys, think-aloud interviews, and eye-tracking tasks to investigate student data handling skills for simple to complex visual representations of geologic and atmospheric data; (ii) demonstrate that the protocol yields results that shed light on student data handling skills; and (iii) generate preliminary findings on which tentative but perhaps helpful recommendations can be based for presenting these data more effectively to the non-scientist community and teaching essential data handling skills. An effective protocol for the combined use of paper-based surveys, think-aloud interviews, and computer-based eye-tracking tasks to investigate the cognitive processes involved in perceiving, comprehending, and interpreting visual representations of geologic and atmospheric data is instrumental to future research in this area. The outcomes of this pilot study provide the foundation on which future, more in-depth and scaled-up investigations can build. 
Furthermore, the findings of this pilot project are sufficient for making at least tentative recommendations that can help inform (i) the design of physical attributes of visual representations of data, especially more complex representations, that may aid in improving students' data handling skills, and (ii) instructional approaches that have the potential to aid students in more effectively handling visual representations of geologic and atmospheric data that they might encounter in a course, television news, newspapers and magazines, and websites. Such recommendations would also be potential subjects of future investigations, and they have the potential to shape design features when data are presented to the public as well as instructional strategies not only in geoscience courses but also in other science, technology, engineering, and mathematics (STEM) courses.

  19. Ground-water modeling of the Death Valley Region, Nevada and California

    USGS Publications Warehouse

    Belcher, W.R.; Faunt, C.C.; Sweetkind, D.S.; Blainey, J.B.; San Juan, C. A.; Laczniak, R.J.; Hill, M.C.

    2006-01-01

    The Death Valley regional ground-water flow system (DVRFS) of southern Nevada and eastern California covers an area of about 100,000 square kilometers and contains very complex geology and hydrology. Using a computer model to represent the complex system, the U.S. Geological Survey simulated ground-water flow in the Death Valley region for use with U.S. Department of Energy projects in southern Nevada. The model was created to help address contaminant cleanup activities associated with the underground nuclear testing conducted from 1951 to 1992 at the Nevada Test Site and to support the licensing process for the proposed geologic repository for high-level nuclear waste at Yucca Mountain, Nevada.

  20. Modelling surface water-groundwater interaction with a conceptual approach: model development and application in New Zealand

    NASA Astrophysics Data System (ADS)

    Yang, J.; Zammit, C.; McMillan, H. K.

    2016-12-01

    As in most countries worldwide, water management in lowland areas is a major concern for New Zealand because of the economic importance of water-related human activities. As a result, the estimation of available water resources in these areas (e.g. for irrigation and water supply purposes) is crucial and often requires an understanding of complex hydrological processes, which are often characterized by strong interactions between surface water and groundwater (usually expressed as losing and gaining rivers). These processes are often represented and simulated using integrated, physically based hydrological models. However, models with physically based groundwater modules typically require large amounts of geologic and aquifer information that are not readily available, and they are computationally intensive. Instead, this paper presents a conceptual groundwater model that is fully integrated into New Zealand's national hydrological model TopNet, which is based on TopModel concepts (Beven, 1992). Within this conceptual framework, the integrated model can simulate not only surface processes but also groundwater processes and surface water-groundwater interaction processes (including groundwater flow, river-groundwater interaction, and groundwater interaction with external watersheds). The developed model was applied to two New Zealand catchments with different hydro-geological and climate characteristics (the Pareora catchment in the Canterbury Plains and the Grey catchment on the West Coast). Previous studies have documented strong interactions between the river and groundwater in these catchments, based on the analysis of a large number of concurrent flow measurements and associated information along the river main stem. Application of the integrated hydrological model indicates that flow simulations during low-flow conditions are significantly improved (compared to the original hydrological model conceptualisation) and that further insights into local river dynamics are gained. 
Due to its conceptual character and low data requirements, the integrated model could be used at local and national scales to improve the simulation of hydrological processes in non-topographically driven areas (where groundwater processes are important), and to assess the impact of climate change on the integrated hydrological cycle in these areas.
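A conceptual groundwater store of the general kind described here can be sketched as a linear reservoir whose outflow feeds the river as baseflow. This is a generic illustration, not the actual TopNet formulation, and all parameter values are hypothetical.

```python
def simulate_groundwater(recharge, k=0.05, s0=100.0):
    """Linear-reservoir groundwater store: discharge to the river = k * storage.

    recharge : list of per-step recharge volumes from the soil zone
    k        : outflow coefficient (hypothetical)
    s0       : initial storage (hypothetical)
    Returns the baseflow series and the final storage.
    """
    storage, baseflow = s0, []
    for r in recharge:
        storage += r          # recharge from the soil zone
        q = k * storage       # groundwater discharge to a gaining reach
        storage -= q
        baseflow.append(q)
    return baseflow, storage

# Five steps of constant recharge
baseflow, final_storage = simulate_groundwater([10.0] * 5)
```

Because outflow is removed from the same store it is computed from, the scheme conserves mass exactly: initial storage plus total recharge equals total baseflow plus final storage.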

  1. MODPATH-LGR; documentation of a computer program for particle tracking in shared-node locally refined grids by using MODFLOW-LGR

    USGS Publications Warehouse

    Dickinson, Jesse; Hanson, R.T.; Mehl, Steffen W.; Hill, Mary C.

    2011-01-01

    The computer program described in this report, MODPATH-LGR, is designed to allow simulation of particle tracking in locally refined grids. The locally refined grids are simulated by using MODFLOW-LGR, which is based on MODFLOW-2005, the three-dimensional groundwater-flow model published by the U.S. Geological Survey. The documentation includes brief descriptions of the methods used and detailed descriptions of the required input files and how the output files are typically used. The code for this model is available for downloading from the World Wide Web from a U.S. Geological Survey software repository. The repository is accessible from the U.S. Geological Survey Water Resources Information Web page at http://water.usgs.gov/software/ground_water.html. The performance of the MODPATH-LGR program has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program by using the email address available on the Web site. Updates might occasionally be made to this document and to the MODPATH-LGR program, and users should check the Web site periodically.
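Particle tracking in MODFLOW-family grids is built on Pollock's semi-analytical scheme, in which velocity varies linearly across each cell and the travel time to a cell face has a closed form. A one-axis sketch of that idea (not the MODPATH-LGR code itself) is:

```python
import math

def pollock_exit_time(v1, v2, xp, dx):
    """Semi-analytical travel time to a cell face along one axis.

    v1, v2 : velocities at the cell's two faces (x = 0 and x = dx)
    xp     : particle position within the cell, 0 <= xp <= dx
    Assumes velocity varies linearly inside the cell (Pollock's assumption),
    so t = ln(v_exit / v_p) / A with A the velocity gradient.
    """
    A = (v2 - v1) / dx               # velocity gradient inside the cell
    vp = v1 + A * xp                 # velocity at the particle position
    if abs(A) < 1e-15:               # uniform velocity: straight-line travel
        return (dx - xp) / vp if vp > 0 else (0.0 - xp) / vp
    v_exit = v2 if vp > 0 else v1
    return math.log(v_exit / vp) / A

# Particle entering the left face of a 10 m cell, accelerating from 1 to 2 m/d
t_exit = pollock_exit_time(1.0, 2.0, 0.0, 10.0)
```

In a full tracker this time is computed for each axis and the minimum picks the exit face; here t_exit = ln(2)/0.1, about 6.93 days.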

  2. A parallel-processing approach to computing for the geographic sciences; applications and systems enhancements

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost, personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.

  3. User's manual for the National Water Information System of the U.S. Geological Survey: Automated Data Processing System (ADAPS)

    USGS Publications Warehouse

    ,

    2003-01-01

    The Automated Data Processing System (ADAPS) was developed for the processing, storage, and retrieval of water data, and is part of the National Water Information System (NWIS) developed by the U.S. Geological Survey. NWIS is a distributed water database in which data can be processed over a network of computers at U.S. Geological Survey offices throughout the United States. NWIS comprises four subsystems: ADAPS, the Ground-Water Site Inventory System (GWSI), the Water-Quality System (QWDATA), and the Site-Specific Water-Use Data System (SWUDS). This section of the NWIS User's Manual describes the automated data processing of continuously recorded water data, which primarily are surface-water data; however, the system also allows for the processing of water-quality and ground-water data. This manual describes various components and features of ADAPS, and provides an overview of the data processing system and a description of the system framework. The components and features included are: (1) data collection and processing, (2) ADAPS menus and programs, (3) command line functions, (4) steps for processing station records, (5) postprocessor program control files, (6) the standard format for transferring and entering unit and daily values, and (7) relational database (RDB) formats.
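The RDB format mentioned in item (7) is a tab-delimited text layout with '#' comment lines, a column-name row, and a column-format row. A minimal reader is sketched below; the embedded fragment is simplified and its data values are made up for illustration.

```python
import csv
import io

# A minimal RDB-style fragment: '#' comments, tab-separated column names,
# a column-format row (e.g. "15s" = 15-char string), then data rows.
# The discharge values are hypothetical.
rdb_text = """\
# U.S. Geological Survey RDB-style example (hypothetical values)
site_no\tdatetime\tdischarge_cfs
15s\t20d\t10n
06191500\t2003-01-01\t1240
06191500\t2003-01-02\t1180
"""

def read_rdb(text):
    """Parse an RDB-style table into a list of dicts keyed by column name."""
    rows = [r for r in csv.reader(io.StringIO(text), delimiter="\t")
            if r and not r[0].startswith("#")]
    header, _format_row, data = rows[0], rows[1], rows[2:]
    return [dict(zip(header, rec)) for rec in data]

records = read_rdb(rdb_text)
```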

  4. Integration of Geophysical Data into Structural Geological Modelling through Bayesian Networks

    NASA Astrophysics Data System (ADS)

    de la Varga, Miguel; Wellmann, Florian; Murdie, Ruth

    2016-04-01

    Structural geological models are widely used to represent the spatial distribution of relevant geological features. Several techniques exist to construct these models on the basis of different assumptions and different types of geological observations (e.g. Jessell et al., 2014). However, two problems are prevalent when constructing models: (i) observations and assumptions, and therefore also the constructed model, are subject to uncertainties, and (ii) additional information, such as geophysical data, is often available but cannot be considered directly in the geological modelling step. In our work, we propose the integration of all available data into a Bayesian network, including the generation of the implicit geological model by means of interpolation functions (Mallet, 1992; Lajaunie et al., 1997; Carr et al., 2001; Mallet, 2004; Hillier et al., 2014). As a result, we are able to increase the certainty of the resultant models as well as potentially learn features of our regional geology through data mining and information theory techniques. MCMC methods are used to optimize computational time and ensure the validity of the results. Here, we apply these concepts in a 3-D model of the Sandstone Greenstone Belt in the Archean Yilgarn Craton in Western Australia. The example defines the uncertainty in the thickness of the greenstone as limited by the Bouguer anomaly, and the internal structure of the greenstone as limited by the magnetic signature of a banded iron formation. The incorporation of the additional data, especially the gravity, provides an important reduction of the possible outcomes and therefore of the overall uncertainty. References: Carr, J. C., R. K. Beatson, J. B. Cherrie, T. J. Mitchell, W. R. Fright, B. C. McCallum, and T. R. Evans, 2001, Reconstruction and representation of 3D objects with radial basis functions: Proceedings of the 28th Annual Conference on Computer Graphics and Interactive Techniques, 67-76. 
Jessell, M., Aillères, L., de Kemp, E., Lindsay, M., Wellmann, F., Hillier, M., ... and Martin, R., 2014, Next generation three-dimensional geologic modeling and inversion. Lajaunie, C., G. Courrioux, and L. Manuel, 1997, Foliation fields and 3D cartography in geology: principles of a method based on potential interpolation: Mathematical Geology, 29, 571-584. Mallet, J.-L., 1992, Discrete smooth interpolation in geometric modelling: Computer-Aided Design, 24, 178-191. Mallet, J.-L., 2004, Space-time mathematical framework for sedimentary geology: Mathematical Geology, 36, 1-32.
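The role of MCMC in work like this can be illustrated with a deliberately tiny Metropolis sampler: a single hypothetical parameter (a greenstone thickness) is explored under a Gaussian stand-in for a gravity likelihood. This sketches the general technique only, not the authors' Bayesian network.

```python
import math
import random

def metropolis(log_post, x0, step, n, seed=0):
    """Plain Metropolis sampler with a Gaussian random-walk proposal."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:   # accept/reject
            x, lp = xp, lpp
        samples.append(x)
    return samples

# Hypothetical target: greenstone thickness (km) with a Gaussian "gravity"
# likelihood centred on 2.0 km (sd 0.3 km) and an implicit flat prior.
log_post = lambda h: -0.5 * ((h - 2.0) / 0.3) ** 2
samples = metropolis(log_post, 1.0, 0.2, 20000)
```

After burn-in the chain's histogram approximates the posterior, which is how the "reduction of the possible outcomes" by the gravity data would be quantified.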

  5. Voxel inversion of airborne electromagnetic data

    NASA Astrophysics Data System (ADS)

    Auken, E.; Fiandaca, G.; Kirkegaard, C.; Vest Christiansen, A.

    2013-12-01

    Inversion of electromagnetic data usually refers to a model space linked to the actual observation points, and for airborne surveys the spatial discretization of the model space reflects the flight lines. In contrast, geological and groundwater models most often refer to a regular voxel grid that is not correlated to the geophysical model space. This means that incorporating the geophysical data into the geological and/or hydrological modelling grids involves a spatial relocation of the models, which in itself is a subtle process in which valuable information is easily lost. The integration of prior information, e.g. from boreholes, is also difficult when the observation points do not coincide with the position of the prior information, as is the joint inversion of airborne and ground-based surveys. We developed a geophysical inversion algorithm working directly in a voxel grid disconnected from the actual measuring points, which allows geological/hydrogeological models to be informed directly, prior information to be incorporated more easily, and different data types to be integrated straightforwardly in joint inversion. The new voxel model space defines the soil properties (such as resistivity) on a set of nodes, and the distribution of the properties is computed everywhere by means of an interpolation function f (e.g. inverse distance or kriging). The position of the nodes is fixed during the inversion and is chosen to sample the soil taking into account topography and inversion resolution. Given this definition of the voxel model space, both 1D and 2D/3D forward responses can be computed. The 1D forward responses are computed as follows: A) a 1D model subdivision, in terms of model thicknesses and direction of the "virtual" horizontal stratification, is defined for each 1D data set. For EM soundings the "virtual" horizontal stratification is set up parallel to the topography at the sounding position. 
B) the "virtual" 1D models are constructed by interpolating the soil properties at the midpoints of the "virtual" layers. For 2D/3D forward responses the algorithm operates similarly, simply filling the 2D/3D meshes of the forward responses by computing the interpolation values at the centres of the mesh cells. This definition of the voxel model space allows the geophysical information to be incorporated straightforwardly into geological and/or hydrological models, simply by using a voxel (hydro)geological grid to define the geophysical model space. This also simplifies the propagation of the uncertainty of the geophysical parameters into the (hydro)geological models. Furthermore, prior information from boreholes, such as resistivity logs, can be applied directly to the voxel model space even if the borehole positions do not coincide with the actual observation points; the prior information is constrained to the model parameters through the interpolation function at the borehole locations. The presented algorithm is a further development of the AarhusInv program package developed at Aarhus University (formerly em1dinv), which handles both large-scale AEM surveys and ground-based data. This work has been carried out as part of the HyGEM project, supported by the Danish Council for Strategic Research under grant number DSF 11-116763.
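One simple choice for the interpolation function f described above is inverse-distance weighting from the voxel nodes. The sketch below uses hypothetical resistivity values on four nodes; it is an illustration of the general idea, not the AarhusInv implementation.

```python
def idw(nodes, values, query, power=2.0):
    """Inverse-distance-weighted interpolation of node properties in 2D."""
    num = den = 0.0
    for (x, y), v in zip(nodes, values):
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return v                      # query coincides with a node
        w = d2 ** (-power / 2.0)          # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den

# Resistivity (ohm-m, hypothetical) defined on four nodes;
# evaluate at the centre of the cell, as for a "virtual" layer midpoint.
nodes = [(0, 0), (1, 0), (0, 1), (1, 1)]
rho = [10.0, 10.0, 40.0, 40.0]
r_centre = idw(nodes, rho, (0.5, 0.5))
```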

  6. Map showing contours on top of the upper Cretaceous Mowry Shale, Powder River basin, Wyoming and Montana

    USGS Publications Warehouse

    Crysdale, B.L.

    1991-01-01

    This map is one in a series of U.S. Geological Survey Miscellaneous Field Studies (MF) maps showing computer-generated structure contours, isopachs, and cross sections of selected formations in the Powder River basin, Wyoming and Montana. The map and cross sections were constructed from information stored in a U.S. Geological Survey Evolution of Sedimentary Basins data base. This data base contains picks of geologic formation and (or) unit tops and bases determined from electric resistivity and gamma-ray logs of 8,592 wells penetrating Tertiary and older rocks in the Powder River basin. Well completion cards (scout tickets) were reviewed and compared with copies of all logs, and formation or unit contacts were determined by N.M. Denson, D.L. Macke, R.R. Schumann, and others. This map is based on information from 4,926 of these wells that penetrate the Minnelusa Formation and equivalents.

  7. Map showing structure contours on the top of the upper Jurassic Morrison Formation, Powder River basin, Wyoming and Montana

    USGS Publications Warehouse

    Crysdale, B.L.

    1991-01-01

    This map is one in a series of U.S. Geological Survey Miscellaneous Field Studies (MF) maps showing computer-generated structure contours, isopachs, and cross sections of selected formations in the Powder River basin, Wyoming and Montana. The map and cross sections were constructed from information stored in a U.S. Geological Survey Evolution of Sedimentary Basins data base. This data base contains picks of geologic formation and (or) unit tops and bases determined from electric resistivity and gamma-ray logs of 8,592 wells penetrating Tertiary and older rocks in the Powder River basin. Well completion cards (scout tickets) were reviewed and compared with copies of all logs, and formation or unit contacts were determined by N.M. Denson, D.L. Macke, R.R. Schumann, and others. This map is based on information from 2,429 of these wells that penetrate the Minnelusa Formation and equivalents.

  8. Catalog of Computer Programs Used in Undergraduate Geology Education (Second Edition): Installment 3.

    ERIC Educational Resources Information Center

    Burger, H. Robert

    1983-01-01

    Presents an annotated list of computer programs related to geophysics, geomorphology, paleontology, economic geology, petroleum geology, and miscellaneous topics. Entries include description, instructional use(s), programming language, and availability. Programs described in previous installments (found in SE 533 635 and 534 182) focused on…

  9. A Graphical Approach to Quantitative Structural Geology.

    ERIC Educational Resources Information Center

    De Paor, Declan G.

    1986-01-01

    Describes how computer graphic methods can be used in teaching structural geology. Describes the design of a graphics workstation for the Apple microcomputer. Includes a listing of commands used with software to plot structures in a digitized form. Argues for the establishment of computer laboratories for structural geology classes. (TW)
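The kind of plotting described here, digitized structural data rendered by computer graphics, reduces to a small amount of trigonometry. Below is a sketch of lower-hemisphere, equal-area (Schmidt) coordinates for the pole to a plane; it illustrates the standard projection, not the catalogued software itself.

```python
import math

def pole_to_xy(strike, dip):
    """Equal-area (Schmidt) plot coordinates of the pole to a plane.

    strike, dip : right-hand-rule strike and dip in degrees.
    Returns (x, y) with x = east, y = north, and the primitive circle at r = 1.
    """
    trend = (strike + 270.0) % 360.0       # pole trend: opposite the dip direction
    plunge = 90.0 - dip                    # pole plunge
    # Equal-area radial scaling: r = sqrt(2) * sin(theta / 2),
    # with theta the angle from vertical (90 - plunge); r = 1 at plunge 0.
    r = math.sqrt(2.0) * math.sin(math.radians((90.0 - plunge) / 2.0))
    x = r * math.sin(math.radians(trend))
    y = r * math.cos(math.radians(trend))
    return x, y

# A horizontal plane plots at the centre; a vertical north-striking plane
# plots on the primitive circle, due west.
cx, cy = pole_to_xy(0.0, 0.0)
rx, ry = pole_to_xy(0.0, 90.0)
```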

  10. The Coastal Ocean Prediction Systems program: Understanding and managing our coastal ocean

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eden, H.F.; Mooers, C.N.K.

    1990-06-01

    The goal of COPS is to couple a program of regular observations to numerical models, through techniques of data assimilation, in order to provide a predictive capability for the US coastal ocean including the Great Lakes, estuaries, and the entire Exclusive Economic Zone (EEZ). The objectives of the program include: determining the predictability of the coastal ocean and the processes that govern the predictability; developing efficient prediction systems for the coastal ocean based on the assimilation of real-time observations into numerical models; and coupling the predictive systems for the physical behavior of the coastal ocean to predictive systems for biological, chemical, and geological processes to achieve an interdisciplinary capability. COPS will provide the basis for effective monitoring and prediction of coastal ocean conditions by optimizing the use of increased scientific understanding, improved observations, advanced computer models, and computer graphics to make the best possible estimates of sea level, currents, temperatures, salinities, and other properties of entire coastal regions.

  11. Stochastic simulation by image quilting of process-based geological models

    NASA Astrophysics Data System (ADS)

    Hoffimann, Júlio; Scheidt, Céline; Barfod, Adrian; Caers, Jef

    2017-09-01

    Process-based modeling offers a way to represent realistic geological heterogeneity in subsurface models. The main limitation lies in conditioning such models to data. Multiple-point geostatistics can use these process-based models as training images and address the data conditioning problem. In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e. parameter tuning) and at the same time conditioning them to a variety of data. In particular, we develop a new probabilistic data aggregation method for image quilting that bypasses the traditional ad hoc weighting of auxiliary variables. In addition, we propose a novel criterion for template design in image quilting that generalizes the entropy plot for continuous training images. The criterion is based on the new concept of voxel reuse, a stochastic and quilting-aware function of the training image. We compare our proposed method with other established simulation methods on a set of process-based training images of varying complexity, including a real-case example of stochastic simulation of the buried-valley groundwater system in Denmark.
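The core step of image quilting, choosing a training-image patch whose overlap best matches what has already been simulated, can be sketched with an exhaustive sum-of-squared-differences search. Real implementations also cut an irregular seam through the overlap and sample among near-optimal candidates; this sketch simply takes the minimum.

```python
import numpy as np

def best_patch(ti, left_block, patch, overlap):
    """Find the training-image patch whose left overlap region best matches
    the right edge of the previously quilted block (SSD criterion)."""
    h, w = patch
    target = left_block[:, -overlap:]          # right edge of the placed block
    best, best_err = None, np.inf
    for i in range(ti.shape[0] - h + 1):
        for j in range(ti.shape[1] - w + 1):
            cand = ti[i:i + h, j:j + w]
            err = np.sum((cand[:, :overlap] - target) ** 2)
            if err < best_err:
                best, best_err = cand, err
    return best, best_err

# Toy training image: the patch at column 2 overlaps the seed block exactly.
ti = np.arange(100, dtype=float).reshape(10, 10)
seed_block = ti[0:4, 0:4]
patch_found, err = best_patch(ti, seed_block, (4, 4), 2)
```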

  12. Delimitation of volcanic edifices for landscape characterization and planning

    NASA Astrophysics Data System (ADS)

    Melis, Maria Teresa; Mundula, Filippo; Dessì, Francesco; Danila Patta, Elisabetta; Funedda, Antonio; Cioni, Raffaello

    2014-05-01

    The European Landscape Convention, recently adopted in Italy, indicates specific landforms to be selected as special protected sites. Active and inactive volcanic edifices, defined as the products of the evolution of aggradational (lava effusion, pyroclastic deposition, magma intrusion) and degradational (erosion, deformation, gravitational phenomena) processes, are one of the specific landforms to be protected. In order to protect these sites, management and planning measures must be defined and shared with the local communities. In the framework of the Regional Landscape Management Plan of Sardinia (Italy), a detailed study aimed at identifying and delimiting Cenozoic volcanic edifices was performed. The large geological and morphological variability of the volcanic edifices of Sardinia in terms of type, dimension, age, integrity (a measure of the wholeness and intactness of the volcanic edifice), and geology and paleomorphology of the substrate does not allow the definition of an automatic procedure for extracting the boundaries that delimit the volcanic edifices. In addition, quantitative geomorphological studies in the field of volcanology are confined to specific volcano types, and the landscape literature does not suggest any universal criterion for delimiting volcanic edifices, except for the use of the concave breaks in slope at their base (Euillades et al., Computers and Geosciences, 2013). As this simple criterion can be unequivocally applied only in the ideal case of symmetric cones or domes built up on a planar surface, we developed a multidisciplinary methodology based on the integrated analysis of geological, geomorphological and morphometric data for each edifice. 
The process of selection and delimitation of the volcanic edifices is the result of the following steps: i) a literature-based delimitation of the volcanic edifice; ii) a preliminary delimitation through photo-interpretation and the use of geological criteria; and iii) a final refinement based on DEM-based quantitative elaborations. This final step consists of the construction of maps of the angle of slope and of the surface curvature (concavity, convexity) generated from 1:10,000 digital topographic maps. In addition, morphological parameters were combined following the method proposed by Grosse et al. (Geomorphology, 2012) and a new algorithm based on a different combination of the morphometric parameters. The edifice boundaries are manually defined by cross-checking all the available data, and the results are discussed through the use of some examples.
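The slope maps used in step (iii) can be derived from a DEM by finite differences. The sketch below uses simple central differences on a synthetic inclined plane; production workflows typically use the exact operators of a GIS package rather than this minimal version.

```python
import numpy as np

def slope_degrees(dem, cell):
    """Slope map (degrees) from a DEM by central differences.

    dem  : 2D array of elevations
    cell : grid spacing in the same units as elevation
    Edge rows/columns are dropped, so the output is 2 smaller in each axis.
    """
    dzdx = (dem[1:-1, 2:] - dem[1:-1, :-2]) / (2.0 * cell)
    dzdy = (dem[2:, 1:-1] - dem[:-2, 1:-1]) / (2.0 * cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

# Synthetic DEM: an inclined plane z = 0.5 * x has uniform slope atan(0.5)
x = np.arange(6, dtype=float)
dem = np.tile(0.5 * x, (6, 1))
slopes = slope_degrees(dem, cell=1.0)
```

Curvature maps follow the same pattern with second differences of the elevation field.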

  13. Project APhiD: A Lorenz-gauged A-Φ decomposition for parallelized computation of ultra-broadband electromagnetic induction in a fully heterogeneous Earth

    NASA Astrophysics Data System (ADS)

    Weiss, Chester J.

    2013-08-01

    An essential element for computational hypothesis testing, data inversion and experiment design for electromagnetic geophysics is a robust forward solver, capable of easily and quickly evaluating the electromagnetic response of arbitrary geologic structure. The usefulness of such a solver hinges on the balance among competing desires like ease of use, speed of forward calculation, scalability to large problems or compute clusters, parsimonious use of memory access, accuracy and by necessity, the ability to faithfully accommodate a broad range of geologic scenarios over extremes in length scale and frequency content. This is indeed a tall order. The present study addresses recent progress toward the development of a forward solver with these properties. Based on the Lorenz-gauged Helmholtz decomposition, a new finite volume solution over Cartesian model domains endowed with complex-valued electrical properties is shown to be stable over the frequency range 10^-2 to 10^10 Hz and length scales of 10^-3 to 10^5 m. Benchmark examples are drawn from magnetotellurics, exploration geophysics, geotechnical mapping and laboratory-scale analysis, showing excellent agreement with reference analytic solutions. Computational efficiency is achieved through use of a matrix-free implementation of the quasi-minimum-residual (QMR) iterative solver, which eliminates explicit storage of finite volume matrix elements in favor of "on the fly" computation as needed by the iterative Krylov sequence. Further efficiency is achieved through sparse coupling matrices between the vector and scalar potentials, whose non-zero elements arise only in those parts of the model domain where the conductivity gradient is non-zero. Multi-thread parallelization in the QMR solver through OpenMP pragmas is used to reduce the computational cost of its most expensive step: the single matrix-vector product at each iteration.
High-level MPI communicators farm independent processes to available compute nodes for simultaneous computation of multi-frequency or multi-transmitter responses.
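
    The matrix-free idea described above can be sketched in a few lines: the Krylov solver only ever calls a matvec routine, so operator entries are recomputed on the fly rather than stored. The sketch below uses a plain conjugate-gradient iteration on a symmetric 1-D model problem (the paper's solver is QMR on a complex-valued finite-volume system); the operator, sizes, and tolerances are illustrative assumptions.

```python
import numpy as np

def apply_A(x):
    """Matrix-free matvec for a 1-D tridiagonal operator (-1, 3, -1),
    a stand-in for the finite-volume operator: entries are computed
    on the fly instead of being stored as an explicit matrix."""
    y = 3.0 * x
    y[:-1] -= x[1:]
    y[1:] -= x[:-1]
    return y

def cg_matrix_free(apply_A, b, tol=1e-10, maxiter=500):
    """Conjugate gradients driven only by the matvec callback."""
    x = np.zeros_like(b)
    r = b - apply_A(x)
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Ap = apply_A(p)                 # the single matvec per iteration
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

b = np.ones(50)
x = cg_matrix_free(apply_A, b)
print(np.linalg.norm(apply_A(x) - b))   # residual near zero
```

    The matvec is also the natural place for thread parallelism (OpenMP in the paper), since it dominates the per-iteration cost.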

  14. MAPTEACH | Alaska Division of Geological & Geophysical Surveys

    Science.gov Websites

    MAPTEACH (… Alaska's Cultural Heritage) is a hands-on education program for middle and high school students in Alaska centered on computer-based maps of scientific, cultural and personal significance. The project emphasizes the…

  15. A 3D object-based model to simulate highly-heterogeneous, coarse, braided river deposits

    NASA Astrophysics Data System (ADS)

    Huber, E.; Huggenberger, P.; Caers, J.

    2016-12-01

    There is a critical need in hydrogeological modeling for geologically more realistic representation of the subsurface. Indeed, widely-used representations of the subsurface heterogeneity based on smooth basis functions such as cokriging or the pilot-point approach fail at reproducing the connectivity of high permeable geological structures that control subsurface solute transport. To realistically model the connectivity of high permeable structures of coarse, braided river deposits, multiple-point statistics and object-based models are promising alternatives. We therefore propose a new object-based model that, according to a sedimentological model, mimics the dominant processes of floodplain dynamics. Contrary to existing models, this object-based model possesses the following properties: (1) it is consistent with field observations (outcrops, ground-penetrating radar data, etc.), (2) it allows different sedimentological dynamics to be modeled that result in different subsurface heterogeneity patterns, (3) it is light in memory and computationally fast, and (4) it can be conditioned to geophysical data. In this model, the main sedimentological elements (scour fills with open-framework-bimodal gravel cross-beds, gravel sheet deposits, open-framework and sand lenses) and their internal structures are described by geometrical objects. Several spatial distributions are proposed that allow simulation of the horizontal position of the objects on the floodplain as well as the net rate of sediment deposition. The model is grid-independent and any vertical section can be computed algebraically. Furthermore, model realizations can serve as training images for multiple-point statistics. The significance of this model is shown by its impact on the subsurface flow distribution, which strongly depends on the sedimentological dynamics modeled. The code will be provided as a free and open-source R-package.
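
    The grid-independent, object-based idea can be sketched as follows: geometrical objects (here, idealized ellipsoidal scour fills) are dropped at random horizontal positions while the floodplain aggrades, and lithology is evaluated analytically at any query point with the youngest object taking precedence. All geometry parameters, facies names, and rates below are illustrative assumptions, not the authors' calibrated values (and the actual code is an R package).

```python
import random
from dataclasses import dataclass

@dataclass
class Trough:
    """Idealized scour-fill object: the lower half of an ellipsoid."""
    x: float
    y: float
    z_top: float    # elevation of the erosion surface when deposited
    rx: float
    ry: float
    rz: float       # semi-axes
    facies: str

    def contains(self, px, py, pz):
        if pz > self.z_top:        # only the part below the scour surface
            return False
        d = ((px - self.x) / self.rx) ** 2 + ((py - self.y) / self.ry) ** 2 \
            + ((pz - self.z_top) / self.rz) ** 2
        return d <= 1.0

def simulate(n_objects, extent=100.0, aggradation=0.05, seed=0):
    """Drop trough objects at uniform horizontal positions while the
    floodplain aggrades at a constant net deposition rate."""
    rng = random.Random(seed)
    return [Trough(x=rng.uniform(0, extent), y=rng.uniform(0, extent),
                   z_top=i * aggradation,
                   rx=rng.uniform(5, 15), ry=rng.uniform(3, 10),
                   rz=rng.uniform(0.5, 2.0),
                   facies=rng.choice(["OW/BM gravel", "sand lens"]))
            for i in range(n_objects)]

def lithology(objs, px, py, pz, background="gravel sheet"):
    """Grid-independent query: the youngest (last deposited) object wins."""
    for o in reversed(objs):
        if o.contains(px, py, pz):
            return o.facies
    return background

objs = simulate(200)
print(lithology(objs, 50.0, 50.0, 2.0))
```

    Because `lithology` is an analytic function of position, any vertical section (or training image for multiple-point statistics) can be rendered at arbitrary resolution without committing to a grid.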

  16. Conversion of the Bayou Choctaw geological site characterization report to a three-dimensional model.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stein, Joshua S.; Rautman, Christopher Arthur

    2004-02-01

    The geologic model implicit in the original site characterization report for the Bayou Choctaw Strategic Petroleum Reserve Site near Baton Rouge, Louisiana, has been converted to a numerical, computer-based three-dimensional model. The original site characterization model was successfully converted with minimal modifications and use of new information. The geometries of the salt diapir, selected adjacent sedimentary horizons, and a number of faults have been modeled. Models of a partial set of the several storage caverns that have been solution-mined within the salt mass are also included. Collectively, the converted model appears to be a relatively realistic representation of the geologymore » of the Bayou Choctaw site as known from existing data. A small number of geometric inconsistencies and other problems inherent in 2-D vs. 3-D modeling have been noted. Most of the major inconsistencies involve faults inferred from drill hole data only. Modem computer software allows visualization of the resulting site model and its component submodels with a degree of detail and flexibility that was not possible with conventional, two-dimensional and paper-based geologic maps and cross sections. The enhanced visualizations may be of particular value in conveying geologic concepts involved in the Bayou Choctaw Strategic Petroleum Reserve site to a lay audience. A Microsoft WindowsTM PC-based viewer and user-manipulable model files illustrating selected features of the converted model are included in this report.« less

  17. A hierarchical network-based algorithm for multi-scale watershed delineation

    NASA Astrophysics Data System (ADS)

    Castronova, Anthony M.; Goodall, Jonathan L.

    2014-11-01

    Watershed delineation is a process for defining a land area that contributes surface water flow to a single outlet point. It is commonly used in water resources analysis to define the domain in which hydrologic process calculations are applied. There has been a growing effort over the past decade to improve surface elevation measurements in the U.S., which has had a significant impact on the accuracy of hydrologic calculations. Traditional watershed processing on these elevation rasters, however, becomes more burdensome as data resolution increases. As a result, processing of these datasets can be troublesome on standard desktop computers. This challenge has resulted in numerous works that aim to provide high performance computing solutions to large data, high resolution data, or both. This work proposes an efficient watershed delineation algorithm for use in desktop computing environments that leverages existing data, the U.S. Geological Survey (USGS) National Hydrography Dataset Plus (NHD+), and open source software tools to construct watershed boundaries. This approach makes use of U.S. national-level hydrography data that has been precomputed using raster processing algorithms coupled with quality control routines. Our approach uses carefully arranged data and mathematical graph theory to traverse river networks and identify catchment boundaries. We demonstrate this new watershed delineation technique, compare its accuracy with traditional algorithms that derive watersheds solely from digital elevation models, and then extend our approach to address subwatershed delineation. Our findings suggest that the open-source hierarchical network-based delineation procedure presented in this work is a promising approach to watershed delineation that can be used to summarize publicly available datasets for hydrologic model input pre-processing. Through our analysis, we explore the benefits of reusing the NHD+ datasets for watershed delineation, and find that our technique offers greater flexibility and extensibility than traditional raster algorithms.
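
    The graph-traversal core of such a network-based delineation can be sketched with a plain upstream walk: given a table mapping each precomputed catchment to the one it drains into (as in an NHD+-style flow table), the watershed of an outlet is the union of everything reachable upstream. The toy network and IDs below are assumptions for illustration, not NHD+ identifiers.

```python
from collections import defaultdict, deque

def upstream_catchments(downstream_of, outlet):
    """Collect every catchment draining to `outlet` by walking the flow
    network upstream, breadth-first.  `downstream_of` maps each catchment
    ID to the ID of the catchment it drains into."""
    upstream = defaultdict(list)            # invert the downstream relation
    for cid, down in downstream_of.items():
        upstream[down].append(cid)
    watershed, queue = {outlet}, deque([outlet])
    while queue:
        for up in upstream[queue.popleft()]:
            if up not in watershed:
                watershed.add(up)
                queue.append(up)
    return watershed

# toy network: 1->3, 2->3, 3->5, 4->5, 5->0 (outlet to the coast)
downstream = {1: 3, 2: 3, 3: 5, 4: 5, 5: 0}
print(sorted(upstream_catchments(downstream, 5)))   # [1, 2, 3, 4, 5]
```

    The watershed boundary is then the dissolved union of the selected catchment polygons; subwatershed delineation amounts to repeating the walk from interior outlet nodes.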

  18. Spreadsheet log analysis in subsurface geology

    USGS Publications Warehouse

    Doveton, J.H.

    2000-01-01

    Most of the direct knowledge of the geology of the subsurface is gained from the examination of core and drill-cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes generally have been restricted to tasks of lithostratigraphic correlation and the location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed to estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.
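
    The spreadsheet-style transforms in question are typically simple cell formulas applied row by row down the log. Two standard examples are sketched below: the linear gamma-ray shale-volume index and density porosity. The cutoff values and the sample rows are illustrative assumptions; in practice they are picked per formation.

```python
def shale_volume(gr, gr_clean=15.0, gr_shale=150.0):
    """Linear gamma-ray index: Vsh = (GR - GRclean) / (GRshale - GRclean),
    clipped to [0, 1].  GR in API units; endpoints are formation-specific."""
    v = (gr - gr_clean) / (gr_shale - gr_clean)
    return min(max(v, 0.0), 1.0)

def density_porosity(rhob, rho_matrix=2.71, rho_fluid=1.0):
    """phi_D = (rho_ma - rho_b) / (rho_ma - rho_f); 2.71 g/cc assumes
    a calcite matrix."""
    return (rho_matrix - rhob) / (rho_matrix - rho_fluid)

# one "spreadsheet row" per logged depth: (depth_ft, GR, bulk density)
rows = [(4000, 20.0, 2.55), (4001, 95.0, 2.40), (4002, 140.0, 2.30)]
for depth, gr, rhob in rows:
    print(depth, round(shale_volume(gr), 3), round(density_porosity(rhob), 3))
```

    In a spreadsheet these are literally two columns of formulas dragged down the depth column, which is what makes digital logs so accessible on a desktop computer.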

  19. MyDTW - Dynamic Time Warping program for stratigraphical time series

    NASA Astrophysics Data System (ADS)

    Kotov, Sergey; Paelike, Heiko

    2017-04-01

    One of the general tasks in many geological disciplines is the matching of one time or space signal to another. It can be a classical correlation between two cores or cross-sections in sedimentology or marine geology. For example, tuning a paleoclimatic signal to a target curve driven by variations in the astronomical parameters is a powerful technique for constructing accurate time scales. However, these methods can be rather time-consuming and can take hours of routine work even with the help of special semi-automatic software. Therefore, different approaches to automate the process have been developed during the last decades. Some of them are based on classical statistical cross-correlation, such as the 'Correlator' of Olea [1]. Others use modern ideas of dynamic programming; good examples are the algorithm developed by Lisiecki and Lisiecki [2] and the dynamic-time-warping-based algorithm of Pälike [3]. We introduce here an algorithm and computer program that also stem from the dynamic time warping class of algorithms. Unlike the algorithm of Lisiecki and Lisiecki, MyDTW does not lean on a set of penalties to follow geological logic, but on a special internal structure and specific constraints. It also differs from [3] in the basic ideas of implementation and constraint design. The algorithm is implemented as a computer program with a graphical user interface using Free Pascal and the Lazarus IDE, and is available for Windows, Mac OS, and Linux. Examples with synthetic and real data are demonstrated. The program is available for free download at http://www.marum.de/Sergey_Kotov.html . References: 1. Olea, R.A. Expert systems for automated correlation and interpretation of wireline logs // Math Geol (1994) 26: 879. doi:10.1007/BF02083420 2. Lisiecki L. and Lisiecki P. Application of dynamic programming to the correlation of paleoclimate records // Paleoceanography (2002), Volume 17, Issue 4, pp. 1-1, CiteID 1049, doi:10.1029/2001PA000733 3. Pälike, H. Extending the astronomical calibration of the Geological Time Scale. PhD thesis, University of Cambridge (2002)
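
    For readers unfamiliar with the underlying machinery, a textbook dynamic time warping distance is sketched below, with an optional Sakoe-Chiba band as one simple form of constraint. This is the generic algorithm class only; MyDTW's internal structure and constraint design are its own and are not reproduced here.

```python
import math

def dtw(a, b, window=None):
    """Plain dynamic-time-warping distance between two series, with an
    optional Sakoe-Chiba band `window` limiting how far the alignment
    may stray from the diagonal (a generic constraint; MyDTW's own
    constraints are more specialized)."""
    n, m = len(a), len(b)
    w = max(window or max(n, m), abs(n - m))
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - w), min(m, i + w) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible predecessor paths
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

x = [0, 1, 2, 1, 0]
y = [0, 0, 1, 2, 1, 0]       # same shape, stretched at the start
print(dtw(x, y))             # 0.0 -- the warping absorbs the stretch
```

    The stretch-absorbing behaviour shown in the example is exactly what makes DTW attractive for correlating stratigraphic records deposited at varying rates.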

  20. 3D fast adaptive correlation imaging for large-scale gravity data based on GPU computation

    NASA Astrophysics Data System (ADS)

    Chen, Z.; Meng, X.; Guo, L.; Liu, G.

    2011-12-01

    In recent years, large-scale gravity data sets have been collected and employed to enhance the gravity problem-solving abilities of tectonics studies in China. Aiming at large-scale data and the requirement of rapid interpretation, previous authors have carried out a great deal of work, including fast gradient-module inversion and Euler deconvolution depth inversion, 3-D physical property inversion using stochastic subspaces and equivalent storage, and fast inversion using wavelet transforms and a logarithmic barrier method. Thus it can be said that 3-D gravity inversion has been greatly improved in the last decade. Many authors have added different kinds of a priori information and constraints to deal with non-uniqueness, using models composed of a large number of contiguous cells of unknown property, and obtained good results. However, due to long computation times, instability and other shortcomings, 3-D physical property inversion has not yet been widely applied to large-scale data. In order to achieve 3-D interpretation with high efficiency and precision for geological and ore bodies and obtain their subsurface distribution, there is an urgent need for a fast and efficient inversion method for large-scale gravity data. As a relatively new geophysical inversion method, 3D correlation imaging has developed rapidly thanks to the advantages of requiring no a priori information and demanding a small amount of computer memory. This method was proposed to image the distribution of equivalent excess masses of anomalous geological bodies with high resolution both longitudinally and transversely. In order to transform the equivalent excess masses into real density contrasts, we adopt adaptive correlation imaging for gravity data. After each 3D correlation imaging step, we convert the equivalent masses into density contrasts according to the linear relationship, and then carry out a forward gravity calculation for each rectangular cell. Next, we compare the forward gravity data with the real data, and continue to perform 3D correlation imaging on the residual gravity data. After several iterations, we obtain a satisfactory result. Newly developed general-purpose computing technology for the Nvidia GPU (Graphics Processing Unit) has been put into practice and received widespread attention in many areas. Based on the GPU programming model and two parallel levels, five CPU loops for the main computation of 3D correlation imaging are converted into three loops in GPU kernel functions, thus achieving GPU/CPU collaborative computing. The two inner loops are defined as the dimensions of blocks and the three outer loops are defined as the dimensions of threads, thus realizing the double-loop block calculation. Theoretical and real gravity data tests show that the results are reliable and the computing time is greatly reduced. Acknowledgments We acknowledge the financial support of the Sinoprobe project (201011039 and 201011049-03), the Fundamental Research Funds for the Central Universities (2010ZY26 and 2011PY0183), the National Natural Science Foundation of China (41074095) and the Open Project of the State Key Laboratory of Geological Processes and Mineral Resources (GPMR0945).
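
    One common form of the correlation-imaging kernel is a normalized cross-correlation between the observed gravity profile and the response of a unit mass in each subsurface cell; cells whose kernel best matches the data light up. The 2-D point-mass geometry, station layout, and cell grid below are illustrative assumptions for a CPU sketch only; the paper's adaptive scheme adds the equivalence-to-density conversion, the residual iteration, and the GPU mapping.

```python
import numpy as np

def gravity_kernel(stations, cells):
    """Vertical attraction at each station from a unit point mass in each
    cell (constant factors dropped -- they cancel in the correlation)."""
    G = np.empty((len(stations), len(cells)))
    for j, (cx, cz) in enumerate(cells):
        dx = stations - cx
        r2 = dx**2 + cz**2
        G[:, j] = cz / r2**1.5
    return G

def correlation_image(g_obs, G):
    """Normalized cross-correlation of the data with each cell's kernel
    column; values near +1 flag likely excess-mass locations."""
    num = G.T @ g_obs
    den = np.linalg.norm(g_obs) * np.linalg.norm(G, axis=0)
    return num / den

stations = np.linspace(-50.0, 50.0, 41)
cells = [(float(x), z) for z in (5.0, 10.0, 20.0)
         for x in np.linspace(-40.0, 40.0, 17)]
G = gravity_kernel(stations, cells)
g_obs = G[:, cells.index((0.0, 10.0))]   # anomaly from one buried cell
c = correlation_image(g_obs, G)
print(cells[int(np.argmax(c))])          # peak coincides with the true source
```

    The iterative part of the adaptive scheme then converts the imaged equivalent masses to density contrasts, forward-models them, and re-images the residual data until the misfit is acceptable.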

  1. Property Grids for the Kansas High Plains Aquifer from Water Well Drillers' Logs

    NASA Astrophysics Data System (ADS)

    Bohling, G.; Adkins-Heljeson, D.; Wilson, B. B.

    2017-12-01

    Like a number of state and provincial geological agencies, the Kansas Geological Survey hosts a database of water well drillers' logs, containing the records of sediments and lithologies characterized during drilling. At the moment, the KGS database contains records associated with over 90,000 wells statewide. Over 60,000 of these wells are within the High Plains aquifer (HPA) in Kansas, with the corresponding logs containing descriptions of over 500,000 individual depth intervals. We will present grids of hydrogeological properties for the Kansas HPA developed from this extensive, but highly qualitative, data resource. The process of converting the logs into quantitative form consists of first translating the vast number of unique (and often idiosyncratic) sediment descriptions into a fairly comprehensive set of standardized lithology codes and then mapping the standardized lithologies into a smaller number of property categories. A grid is superimposed on the region and the proportion of each property category is computed within each grid cell, with category proportions in empty grid cells computed by interpolation. Grids of properties such as hydraulic conductivity and specific yield are then computed based on the category proportion grids and category-specific property values. A two-dimensional grid is employed for this large-scale, regional application, with category proportions averaged between two surfaces, such as bedrock and the water table at a particular time (to estimate transmissivity at that time) or water tables at two different times (to estimate specific yield over the intervening time period). We have employed a sequence of water tables for different years, based on annual measurements from an extensive network of wells, providing an assessment of temporal variations in the vertically averaged aquifer properties resulting from water level variations (primarily declines) over time.
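
    The core bookkeeping of the workflow, thickness-weighted category proportions per grid cell and a property estimate built from category-specific values, can be sketched as below. The category K values, interval depths, and the simple transmissivity sum T = Σ K_c·thickness_c are illustrative assumptions, not the KGS calibration.

```python
def cell_properties(intervals, k_by_category):
    """Thickness-weighted category proportions for one grid cell, plus a
    transmissivity estimate T = sum(K_c * thickness_c) over the column.
    `intervals` are (top_depth, bottom_depth, category) tuples from the
    translated drillers' logs."""
    thickness = {}
    for top, bot, cat in intervals:
        thickness[cat] = thickness.get(cat, 0.0) + (bot - top)
    total = sum(thickness.values())
    proportions = {c: t / total for c, t in thickness.items()}
    transmissivity = sum(k_by_category[c] * t for c, t in thickness.items())
    return proportions, transmissivity

# hypothetical hydraulic conductivities (m/day) per property category
K = {"clay": 0.001, "sand": 10.0, "gravel": 100.0}
# one cell's standardized log: 0-10 m clay, 10-25 m sand, 25-30 m gravel
logs = [(0.0, 10.0, "clay"), (10.0, 25.0, "sand"), (25.0, 30.0, "gravel")]
props, T = cell_properties(logs, K)
print(props)   # clay 1/3, sand 1/2, gravel 1/6
print(T)       # 0.01 + 150 + 500 ~= 650.01 m^2/day
```

    Restricting the summed interval to the saturated column between bedrock and a given year's water table gives the time-dependent transmissivity described in the abstract.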

  2. Web services in the U.S. geological survey streamstats web application

    USGS Publications Warehouse

    Guthrie, J.D.; Dartiguenave, C.; Ries, Kernell G.

    2009-01-01

    StreamStats is a U.S. Geological Survey Web-based GIS application developed as a tool for water-resources planning and management, engineering design, and other applications. StreamStats' primary functionality allows users to obtain drainage-basin boundaries, basin characteristics, and streamflow statistics for gaged and ungaged sites. Recently, Web services have been developed that provide remote users and applications with the capability to access comprehensive GIS tools that are available in StreamStats, including delineating drainage-basin boundaries, computing basin characteristics, estimating streamflow statistics for user-selected locations, and determining point features that coincide with a National Hydrography Dataset (NHD) reach address. For the state of Kentucky, a web service also has been developed that provides users the ability to estimate daily time series of drainage-basin average values of daily precipitation and temperature. The use of web services allows the user to take full advantage of the datasets and processes behind the StreamStats application without having to develop and maintain them. © 2009 IEEE.

  3. The quest for the perfect gravity anomaly: Part 2 - Mass effects and anomaly inversion

    USGS Publications Warehouse

    Keller, Gordon R.; Hildenbrand, T.G.; Hinze, W. J.; Li, X.; Ravat, D.; Webring, M.

    2006-01-01

    Gravity anomalies have become an important tool for geologic studies since the widespread use of high-precision gravimeters after the Second World War. More recently the development of instrumentation for airborne gravity observations, procedures for acquiring data from satellite platforms, the readily available Global Positioning System for precise vertical and horizontal control, improved global databases, and enhancement of computational hardware and software have accelerated the use of the gravity method. As a result, efforts are being made to improve the gravity databases that are made available to the geoscience community by broadening their observational holdings and increasing the accuracy and precision of the included data. Currently the North American Gravity Database as well as the individual databases of Canada, Mexico, and the United States of America are being revised using new formats and standards. The objective of this paper is to describe the use of the revised standards for gravity data processing and modeling and their impact on geological interpretations. © 2005 Society of Exploration Geophysicists.

  4. Identification of different geologic units using fuzzy constrained resistivity tomography

    NASA Astrophysics Data System (ADS)

    Singh, Anand; Sharma, S. P.

    2018-01-01

    Different geophysical inversion strategies are utilized as a component of an interpretation process that tries to separate geologic units based on the resistivity distribution. In the present study, we present the results of separating different geologic units using fuzzy constrained resistivity tomography. This was accomplished using fuzzy c-means, a clustering procedure to improve the 2D resistivity image and geologic separation within the iterative minimization through inversion. First, we developed a Matlab-based inversion technique to obtain a reliable resistivity image using different geophysical data sets (electrical resistivity and electromagnetic data). Following this, the recovered resistivity model was converted into a fuzzy constrained resistivity model by assigning the highest probability value of each model cell to the cluster, using the fuzzy c-means clustering procedure during the iterative process. The efficacy of the algorithm is demonstrated using three synthetic plane wave electromagnetic data sets and one electrical resistivity field dataset. The presented approach improves on the conventional inversion approach in differentiating between geologic units, provided the correct number of geologic units is identified. Further, fuzzy constrained resistivity tomography was performed to examine the augmentation of uranium mineralization in the Beldih open cast mine as a case study. We also compared geologic units identified by fuzzy constrained resistivity tomography with geologic units interpreted from the borehole information.
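
    The clustering step at the heart of the method is the standard fuzzy c-means algorithm, sketched below for a 1-D property (e.g. log-resistivity of model cells). The data values, two-cluster setup, and fuzziness exponent m = 2 are illustrative assumptions; the paper embeds this update inside the inversion iterations rather than running it stand-alone.

```python
import numpy as np

def fuzzy_c_means(x, n_clusters, m=2.0, n_iter=100, seed=0):
    """Basic fuzzy c-means on a 1-D property.  Returns cluster centers and
    the membership matrix U (rows sum to 1); each cell can then be nudged
    toward the center of its highest-membership cluster."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(x), n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        w = U ** m
        centers = (w.T @ x) / w.sum(axis=0)          # weighted means
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        # standard FCM membership update: u_ik = 1 / sum_j (d_ik/d_jk)^(2/(m-1))
        U = 1.0 / (d ** (2 / (m - 1))
                   * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
    return centers, U

# two "geologic units": conductive overburden vs resistive basement
x = np.array([1.0, 1.1, 0.9, 1.05, 3.0, 3.2, 2.9, 3.1])  # log10(ohm-m)
centers, U = fuzzy_c_means(x, 2)
print(np.sort(centers))      # one center near each unit
print(U.sum(axis=1))         # memberships sum to 1 per cell
```

    In the fuzzy-constrained inversion, the recovered resistivities are pulled toward these cluster centers at each iteration, which is what sharpens the boundaries between geologic units.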

  5. Analysis of GPS Data Using Near Real-Time Data from the Volcano Exploration Project in the Community College Classroom (Invited)

    NASA Astrophysics Data System (ADS)

    House, M.; Nagy-Shadman, E.; Wilbur, B.

    2010-12-01

    Using real-time data or near-real-time data in the classroom is an exciting prospect in Introductory Physical Geology courses, especially since it promises to offer students a chance to experience the excitement and uncertainty associated with the study of the natural world that appeals to so many of their instructors. However, there are several obstacles to this approach in the community college. Namely, many introductory level community college earth science courses have no mathematics prerequisites; as such, a typical classroom may include a wide range of mathematical skills and many students may be unable to participate in the analysis of “real” data. Further, reliable computer access to websites offering real-time data can be spotty at some institutions and for some students on home computers. In response to this problem we have created a multipart volcano monitoring activity based on the USGS Volcano Exploration Project: Pu`u `O`o (VEPP) website. This activity is designed for freshman or sophomore level courses in Introductory Geology or Geological Hazards for non-majors. No prior math skills are assumed; the activity can be completed without prior knowledge of GPS data, volcano monitoring or Hawaiian geology. The activity consists of three parts: (1) a background lecture on basic geology of volcanoes like Kilauea and use of GPS in volcano monitoring; (2) a lab activity or a homework assignment based on near real-time data downloaded from the VEPP website; and (3) a group wrap-up that focuses on real-time data by exploring other aspects of the VEPP website. The lab activity requires examination of downloaded GPS time series data for a specified time period (this can be modified as desired by the instructor), computation of displacements, graphing of displacement vectors for identified time intervals and determination of actual motion vectors, followed by a discussion of the displacements observed. These activities are interspersed with guided questions.
This activity will be tested for the first time in Introductory Physical Geology courses at Pasadena City College during Fall 2010.
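
    The displacement arithmetic the students perform on the downloaded time series amounts to a vector subtraction and a magnitude/azimuth conversion, as in the sketch below. The east/north offsets are hypothetical numbers for illustration, not VEPP station data.

```python
import math

def displacement(e0, n0, e1, n1):
    """Horizontal displacement between two GPS positions given as
    east/north offsets in mm: returns (magnitude_mm, azimuth in degrees
    clockwise from north)."""
    de, dn = e1 - e0, n1 - n0
    magnitude = math.hypot(de, dn)
    azimuth = math.degrees(math.atan2(de, dn)) % 360.0
    return magnitude, azimuth

# hypothetical east/north offsets (mm) at the start and end of an interval
mag, az = displacement(12.0, -5.0, 42.0, -45.0)
print(round(mag, 1), round(az, 1))   # 50.0 mm toward azimuth 143.1
```

    Plotting one such arrow per time interval on a map of the station network is exactly the vector-graphing exercise in part (2) of the activity.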

  6. The role of visualization in learning from computer-based images

    NASA Astrophysics Data System (ADS)

    Piburn, Michael D.; Reynolds, Stephen J.; McAuliffe, Carla; Leedy, Debra E.; Birk, James P.; Johnson, Julia K.

    2005-05-01

    Among the sciences, the practice of geology is especially visual. To assess the role of spatial ability in learning geology, we designed an experiment using: (1) web-based versions of spatial visualization tests, (2) a geospatial test, and (3) multimedia instructional modules built around QuickTime Virtual Reality movies. Students in control and experimental sections were administered measures of spatial orientation and visualization, as well as a content-based geospatial examination. All subjects improved significantly in their scores on spatial visualization and the geospatial examination. There was no change in their scores on spatial orientation. A three-way analysis of variance, with the geospatial examination as the dependent variable, revealed significant main effects favoring the experimental group and a significant interaction between treatment and gender. These results demonstrate that spatial ability can be improved through instruction, that learning of geological content will improve as a result, and that differences in performance between the genders can be eliminated.

  7. The Impact of Solid Surface Features on Fluid-Fluid Interface Configuration

    NASA Astrophysics Data System (ADS)

    Araujo, J. B.; Brusseau, M. L. L.

    2017-12-01

    Pore-scale fluid processes in geological media are critical for a broad range of applications such as radioactive waste disposal, carbon sequestration, soil moisture distribution, subsurface pollution, land stability, and oil and gas recovery. The continued improvement of high-resolution image acquisition and processing has provided a means to test the usefulness of theoretical models developed to simulate pore-scale fluid processes, through the direct quantification of interfaces. High-resolution synchrotron X-ray microtomography is used in combination with advanced visualization tools to characterize fluid distributions in natural geologic media. The studies revealed the presence of fluid-fluid interfaces associated with macroscopic features on the surfaces of the solids such as pits and crevices. These features and their respective fluid interfaces, which are not included in current theoretical or computational models, may have a significant impact on accurate simulation and understanding of multi-phase flow, energy, heat and mass transfer processes.

  8. Probabilistic risk assessment for CO2 storage in geological formations: robust design and support for decision making under uncertainty

    NASA Astrophysics Data System (ADS)

    Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang

    2010-05-01

    CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate dependence on uncertain parameters (porosity, permeability etc.) and design parameters (injection rate, depth etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al. Computational Geosciences 13, 2009). A reasonable compromise between computational efforts and precision was already reached with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation.
    We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis leads to a systematic and significant shift of predicted leakage rates towards higher values compared with deterministic simulations, affecting both risk estimates and the design of injection scenarios. This implies that neglecting uncertainty can be a strong simplification for modeling CO2 injection, and the consequences can be stronger than when neglecting several physical phenomena (e.g. phase transition, convective mixing, capillary forces etc.). The authors would like to thank the German Research Foundation (DFG) for financial support of the project within the Cluster of Excellence in Simulation Technology (EXC 310/1) at the University of Stuttgart. Keywords: polynomial chaos; CO2 storage; multiphase flow; porous media; risk assessment; uncertainty; integrative response surfaces
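
    A minimal non-intrusive sketch of the probabilistic collocation idea, for a single standard-normal parameter: evaluate the (expensive) model only at Gauss-Hermite collocation points, project onto probabilists' Hermite polynomials, and read mean and variance off the coefficients. The toy exponential "leakage" response and all numbers below are assumptions for illustration, not the paper's benchmark model.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as H

def pce_fit(model, order, n_nodes):
    """Project model(xi), xi ~ N(0,1), onto probabilists' Hermite
    polynomials He_k using Gauss-Hermite quadrature: a_k = E[f He_k] / k!"""
    x, w = H.hermegauss(n_nodes)        # nodes/weights for exp(-x^2/2)
    w = w / np.sqrt(2.0 * np.pi)        # normalize to standard-normal E[.]
    f = model(x)
    return np.array([np.sum(w * f * H.hermeval(x, [0.0] * k + [1.0]))
                     / math.factorial(k) for k in range(order + 1)])

# toy "leakage rate" response to one uncertain N(0,1) parameter
model = lambda xi: np.exp(0.3 * xi)
a = pce_fit(model, order=2, n_nodes=6)
mean = a[0]                                            # E[f] = a_0
var = sum(a[k] ** 2 * math.factorial(k) for k in range(1, 3))
print(mean)   # close to the analytic mean exp(0.045) ~= 1.0460
```

    Once the coefficients are fitted from a handful of model runs, the polynomial surrogate can be sampled millions of times at negligible cost, which is the source of the reported speedup over Monte Carlo.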

  9. Improving the Laboratory Experience for Introductory Geology Students Using Active Learning and Evidence-Based Reform

    NASA Astrophysics Data System (ADS)

    Oien, R. P.; Anders, A. M.; Long, A.

    2014-12-01

    We present the initial results of transitioning laboratory activities in an introductory physical geology course from passive to active learning. Educational research demonstrates that student-driven investigations promote increased engagement and better retention of material. Surveys of students in introductory physical geology helped us identify lab activities which do not engage students. We designed new lab activities to be more collaborative, open-ended and "hands-on". Student feedback was most negative for lab activities which are computer-based. In response, we have removed computers from the lab space and increased the length and number of activities involving physical manipulation of samples and models. These changes required investment in lab equipment and supplies. New lab activities also include student-driven exploration of data with open-ended responses. Student evaluations of the new lab activities will be compiled during Fall 2014 and Spring 2015 to allow us to measure the impact of the changes on student satisfaction, and we will report on our findings to date. Modification of this course has been sponsored by NSF's Widening Implementation & Demonstration of Evidence Based Reforms (WIDER) program through grant #1347722 to the University of Illinois. The overall goal of the grant is to increase retention and satisfaction of STEM students in introductory courses.

  10. Working with Specify in a Paleo-Geological Context

    NASA Astrophysics Data System (ADS)

    Molineux, A.; Thompson, A. C.; Appleton, L.

    2014-12-01

    For geological collections with limited funding an open source relational database provides an opportunity to digitize specimens and related data. At the Non-vertebrate Paleontology Lab, a large mixed paleo and geological repository on a restricted budget, we opted for one such database, Specify. Initially created at the University of Kansas for neontological collections and based on a single computer, Specify has moved into the networked scene and will soon be web-based as Specify 7. We currently use the server version of Specify 6, networked to all computers in the lab each running a desktop client, often with six users at any one time. Along with improved access there have been great efforts to broaden the applicability of this database to other disciplines. Current developments are of great importance to us because they focus on the geological aspects of lithostratigraphy and chronostratigraphy and their relationship to other variables. Adoption of this software has required constant change as we move to take advantage of the great improvements. We enjoy the interaction with the developers and their willingness to listen and consider our issues. Here we discuss some of the ways in which we have fashioned Specify into a database that provides us with the flexibility that we need without removing the ability to share our data with other aggregators through accepted protocols. We discuss the customization of forms, the attachment of media and tracking of original media files, our efforts to incorporate geological specimens, and our plans to link the individual specimen record GUIDs to IGSN numbers and thence to future connections to data derived from our specimens.

  11. Biological and geochemical data along Indian Point, Vermilion Bay, Louisiana

    USGS Publications Warehouse

    Richwine, Kathryn A.; Marot, Marci E.; Smith, Christopher G.; Osterman, Lisa E.; Adams, C. Scott

    2015-09-14

    This publication was prepared by an agency of the United States Government. Although these data were processed successfully on a computer system at the U.S. Geological Survey, no warranty expressed or implied is made regarding the display or utility of the data on any other system, or for general or scientific purposes, nor shall the act of distribution imply any such warranty. The U.S. Geological Survey shall not be held liable for improper or incorrect use of the data described and (or) contained herein. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof.

  12. Taking geoscience to the IMAX: 3D and 4D insight into geological processes using micro-CT

    NASA Astrophysics Data System (ADS)

    Dobson, Katherine; Dingwell, Don; Hess, Kai-Uwe; Withers, Philip; Lee, Peter; Pistone, Mattia; Fife, Julie; Atwood, Robert

    2015-04-01

    Geology is inherently dynamic, and full understanding of any geological system can only be achieved by considering the processes by which change occurs. Analytical limitations mean understanding has largely developed from ex situ analyses of the products of geological change, rather than of the processes themselves. Most methods essentially utilise "snapshot" sampling: from thin section petrography to high-resolution crystal chemical stratigraphy and field volcanology, we capture an incomplete view of a spatially and temporally variable system. Even with detailed experimental work, we can usually only analyse samples before and after we perform an experiment, as routine analysis methods are destructive. Serial sectioning and quenched experiments stopped at different stages can give some insight into the third and fourth dimensions, but the true scaling of the processes from the laboratory to the 4D (3D + time) geosphere is still poorly understood. X-ray micro-computed tomography (XMT) can visualise the internal structures and spatial associations within geological samples non-destructively. With image resolutions of between 200 microns and 50 nanometres, tomography has the ability to provide a detailed sample assessment in 3D, and quantification of mineral associations, porosity, grain orientations, fracture alignments and many other features. This allows better understanding of the role of the complex geometries and associations within the samples, but the challenge of capturing the processes that generate and modify these structures remains. To capture processes, recent work has focused on developing experimental capability for in situ experiments on geological materials. Data presented will showcase examples from recent experiments where high-speed synchrotron X-ray tomography has been used to acquire each 3D image in under 2 seconds.
We present a suite of studies that showcase how it is now possible to take quantification of many geological processes into 3D and 4D. This will include tracking the interactions between bubbles and crystals in a deforming magma, the dissolution of individual mineral grains from low-grade ores, and quantification of three-phase flow in sediments and soils. Our aim is to demonstrate how XMT can provide new insight into dynamic processes in all geoscience disciplines, and to give some insight into where 4D geoscience could take us next.

  13. Littoral transport rates in the Santa Barbara Littoral Cell: a process-based model analysis

    USGS Publications Warehouse

    Elias, E. P. L.; Barnard, Patrick L.; Brocatus, John

    2009-01-01

    Identification of sediment transport patterns and pathways is essential for sustainable coastal zone management of the heavily modified coastline of Santa Barbara and Ventura County (California, USA). A process-based model application, based on Delft3D Online Morphology, is used to investigate the littoral transport potential along the Santa Barbara Littoral Cell (between Point Conception and Mugu Canyon). An advanced optimization procedure is applied to enable annual sediment transport computations by reducing the ocean wave climate into 10 wave height-direction classes. Modeled littoral transport rates compare well with observed dredging volumes, and erosion or sedimentation hotspots coincide with the modeled divergence and convergence of the transport gradients. Sediment transport rates are strongly dependent on the alongshore variation in wave height due to wave sheltering, diffraction and focusing by the Northern Channel Islands, and the local orientation of the geologically controlled coastline. Local transport gradients exceed the net eastward littoral transport, and are considered a primary driver for hot-spot erosion.
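
    The wave-climate reduction described above (schematizing a long wave record into a small number of height-direction classes, each carrying an occurrence weight) can be sketched as follows. The binning scheme, class statistics, and all variable names are illustrative assumptions, not the paper's actual optimization procedure.

```python
import numpy as np

def reduce_wave_climate(heights, directions, h_bins, d_bins):
    """Schematize a wave record into height-direction classes:
    each class keeps mean conditions plus its fraction of the record."""
    classes = []
    for i in range(len(h_bins) - 1):
        for j in range(len(d_bins) - 1):
            mask = ((heights >= h_bins[i]) & (heights < h_bins[i + 1]) &
                    (directions >= d_bins[j]) & (directions < d_bins[j + 1]))
            if mask.any():
                classes.append({
                    "Hs": heights[mask].mean(),      # representative height (m)
                    "dir": directions[mask].mean(),  # representative direction (deg)
                    "weight": mask.mean(),           # occurrence fraction of record
                })
    return classes

# toy record: 1000 hourly sea states (heights and directions are invented)
rng = np.random.default_rng(0)
H = rng.rayleigh(1.0, 1000)
D = rng.uniform(240, 300, 1000)
classes = reduce_wave_climate(H, D, h_bins=[0, 1, 2, 10], d_bins=[240, 270, 300])
```

    Annual transport is then approximated by running the morphological model once per class and summing the results weighted by occurrence, rather than simulating the full record.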

  14. Accessible Earth: Enhancing diversity in the Geosciences through accessible course design and Experiential Learning Theory

    NASA Astrophysics Data System (ADS)

    Bennett, Rick; Lamb, Diedre

    2017-04-01

    The tradition of field-based instruction in the geoscience curriculum, which culminates in a capstone geological field camp, presents an insurmountable barrier to many disabled students who might otherwise choose to pursue geoscience careers. There is a widespread perception that success as a practicing geoscientist requires direct access to outcrops and vantage points available only to those able to traverse inaccessible terrain. Yet many modern geoscience activities are based on remotely sensed geophysical data, data analysis, and computation that take place entirely from within the laboratory. To challenge the perception of geoscience as a career option only for the able bodied, we have created the capstone Accessible Earth Study Abroad Program, an alternative to geologic field camp with a focus on modern geophysical observation systems, computational thinking, and data science. In this presentation, we will report on the theoretical bases for developing the course, our experiences in teaching the course to date, and our plan for ongoing assessment, refinement, and dissemination of the effectiveness of our efforts.

  15. Using Google Earth and Satellite Imagery to Foster Place-Based Teaching in an Introductory Physical Geology Course

    ERIC Educational Resources Information Center

    Monet, Julie; Greene, Todd

    2012-01-01

    Students in an introductory physical geology course often have difficulty making connections between basic course topics and assembling key concepts (beyond textbook examples) to interpret how geologic processes shape the characteristics of the local and regional natural environment. As an approach to address these issues, we designed and…

  16. Mapping NEHRP VS30 site classes

    USGS Publications Warehouse

    Holzer, T.L.; Padovani, A.C.; Bennett, M.J.; Noce, T.E.; Tinsley, J. C.

    2005-01-01

    Site-amplification potential in a 140-km2 area on the eastern shore of San Francisco Bay, California, was mapped with data from 210 seismic cone penetration test (SCPT) soundings. NEHRP VS30 values were computed on a 50-m grid from the thicknesses and mean values of locally measured shear-wave velocities of shallow geologic units. The resulting map of NEHRP VS30 site classes differs from other published maps that (1) do not include unit thickness and (2) are based on regional compilations of velocity. Although much of the area in the new map is now classified as NEHRP Site Class D, the velocities of the geologic deposits within this area lie near either the upper or the lower VS30 boundary of Class D. If maps of NEHRP site classes are to be based on geologic maps, velocity distributions of geologic units may need to be considered in the definition of the VS30 boundaries of NEHRP site classes. © 2005, Earthquake Engineering Research Institute.
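
    VS30 is the time-averaged shear-wave velocity over the upper 30 m, VS30 = 30 / sum(h_i / v_i), which is why both unit thickness and unit velocity matter. A minimal sketch follows; the NEHRP class boundaries are the standard published values, but the example profile is invented for illustration.

```python
def vs30(thicknesses_m, velocities_mps):
    """Time-averaged shear-wave velocity of the upper 30 m:
    VS30 = 30 / sum(h_i / v_i), truncating the profile at 30 m depth."""
    depth, travel_time = 0.0, 0.0
    for h, v in zip(thicknesses_m, velocities_mps):
        h = min(h, 30.0 - depth)   # clip the layer at 30 m total depth
        if h <= 0:
            break
        travel_time += h / v
        depth += h
    return 30.0 / travel_time

def nehrp_class(v):
    """NEHRP site class from VS30 (m/s)."""
    if v > 1500: return "A"
    if v > 760:  return "B"
    if v > 360:  return "C"
    if v > 180:  return "D"
    return "E"

# hypothetical profile: 10 m of soft mud over stiffer alluvium
v = vs30([10, 25], [150, 400])
print(round(v, 1), nehrp_class(v))
```

    Note that the slow surface layer dominates the travel time, pulling VS30 toward the lower boundary of Class D even though most of the column is faster material — the effect the abstract highlights.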

  17. Computer vision enhances mobile eye-tracking to expose expert cognition in natural-scene visual-search tasks

    NASA Astrophysics Data System (ADS)

    Keane, Tommy P.; Cahill, Nathan D.; Tarduno, John A.; Jacobs, Robert A.; Pelz, Jeff B.

    2014-02-01

    Mobile eye-tracking provides a rare opportunity to record and elucidate cognition in action. In our research, we are searching for patterns in, and distinctions between, the visual-search performance of experts and novices in the geosciences. Traveling to regions shaped by various geological processes as part of an introductory field studies course in geology, we record the prima facie gaze patterns of experts and novices when they are asked to determine the modes of geological activity that have formed the scene-view presented to them. Recording eye video and scene video in natural settings generates complex imagery that requires advanced applications of computer vision research to generate registrations and mappings between the views of separate observers. By developing such mappings, we could then place many observers into a single mathematical space where we can spatio-temporally analyze inter- and intra-subject fixations, saccades, and head motions. While working towards perfecting these mappings, we developed an updated experiment setup that allows us to statistically analyze intra-subject eye-movement events without the need for a common domain. Through such analyses we are finding statistical differences between novices and experts in these visual-search tasks. In the course of this research we have developed a unified, open-source software framework for processing, visualization, and interaction with mobile eye-tracking and high-resolution panoramic imagery.
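
    Registering fixations from different observers' views into a common space is often done with planar homographies estimated from matched scene features. The sketch below shows a generic direct-linear-transform (DLT) estimate and point mapping; it is an assumed stand-in for, not a description of, the authors' actual pipeline.

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct Linear Transform: fit a 3x3 H so that dst ~ H @ src,
    from >= 4 point correspondences given as rows of (x, y)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # the homography is the null vector of A (smallest singular value)
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_points(H, pts):
    """Apply a homography to (N, 2) fixation coordinates."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]
```

    Once every observer's fixations are mapped into one reference frame, inter-subject comparisons of fixation density become straightforward array operations.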

  18. Co-optimization of CO2-EOR and Storage Processes under Geological Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ampomah, William; Balch, Robert; Will, Robert

    This paper presents an integrated numerical framework to co-optimize EOR and CO2 storage performance in the Farnsworth field unit (FWU), Ochiltree County, Texas. The framework includes a field-scale compositional reservoir flow model, an uncertainty quantification model and a neural network optimization process. The reservoir flow model has been constructed based on the field geophysical, geological, and engineering data. A laboratory fluid analysis was tuned to an equation of state and subsequently used to predict the thermodynamic minimum miscibility pressure (MMP). A history match of primary and secondary recovery processes was conducted to estimate the reservoir and multiphase flow parameters as the baseline case for analyzing the effect of recycling produced gas, infill drilling and water alternating gas (WAG) cycles on oil recovery and CO2 storage. A multi-objective optimization model was defined for maximizing both oil recovery and CO2 storage. The uncertainty quantification model, comprising Latin Hypercube sampling, Monte Carlo simulation, and sensitivity analysis, was used to study the effects of uncertain variables on the defined objective functions. Uncertain variables such as bottom hole injection pressure, WAG cycle, injection and production group rates, and gas-oil ratio, among others, were selected. The most significant variables were selected as control variables to be used for the optimization process. A neural network optimization algorithm was utilized to optimize the objective function both with and without geological uncertainty. The vertical permeability anisotropy (Kv/Kh) was selected as one of the uncertain parameters in the optimization process. The simulation results were compared to a baseline scenario case that predicted CO2 storage of 74%. The results showed an improved approach for optimizing oil recovery and CO2 storage in the FWU.
The optimization process predicted more than 94% CO2 storage and, most importantly, about 28% incremental oil recovery. The sensitivity analysis reduced the number of control variables to decrease computational time. A risk aversion factor was used to represent results at various confidence levels to assist management in the decision-making process. The defined objective functions proved to be a robust approach to co-optimizing oil recovery and CO2 storage. The Farnsworth CO2 project will serve as a benchmark for future CO2-EOR or CCUS projects in the Anadarko basin or geologically similar basins throughout the world.
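
    Latin Hypercube sampling, one component of the uncertainty quantification model, splits each uncertain variable's range into equal-probability strata and draws exactly once per stratum. A minimal sketch follows; the parameter names and bounds are invented for illustration and are not the study's actual values.

```python
import numpy as np

def latin_hypercube(n, bounds, rng=None):
    """Latin Hypercube sample: each variable's range is divided into n
    equal-probability strata, one draw per stratum, with the strata
    randomly paired across variables."""
    rng = rng or np.random.default_rng()
    d = len(bounds)
    # stratified uniforms in [0, 1): stratum k contributes one point
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])   # decouple the variables
    lo = np.array([b[0] for b in bounds], float)
    hi = np.array([b[1] for b in bounds], float)
    return lo + u * (hi - lo)

# e.g. three hypothetical uncertain controls: BHP (psi), WAG cycle (months), Kv/Kh
samples = latin_hypercube(50, [(2500, 4500), (3, 12), (0.01, 0.3)],
                          rng=np.random.default_rng(1))
```

    Compared with plain Monte Carlo, this guarantees coverage of each variable's full range with far fewer (expensive) reservoir simulations.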

  19. Co-optimization of CO2-EOR and Storage Processes under Geological Uncertainty

    DOE PAGES

    Ampomah, William; Balch, Robert; Will, Robert; ...

    2017-07-01

    This paper presents an integrated numerical framework to co-optimize EOR and CO2 storage performance in the Farnsworth field unit (FWU), Ochiltree County, Texas. The framework includes a field-scale compositional reservoir flow model, an uncertainty quantification model and a neural network optimization process. The reservoir flow model has been constructed based on the field geophysical, geological, and engineering data. A laboratory fluid analysis was tuned to an equation of state and subsequently used to predict the thermodynamic minimum miscibility pressure (MMP). A history match of primary and secondary recovery processes was conducted to estimate the reservoir and multiphase flow parameters as the baseline case for analyzing the effect of recycling produced gas, infill drilling and water alternating gas (WAG) cycles on oil recovery and CO2 storage. A multi-objective optimization model was defined for maximizing both oil recovery and CO2 storage. The uncertainty quantification model, comprising Latin Hypercube sampling, Monte Carlo simulation, and sensitivity analysis, was used to study the effects of uncertain variables on the defined objective functions. Uncertain variables such as bottom hole injection pressure, WAG cycle, injection and production group rates, and gas-oil ratio, among others, were selected. The most significant variables were selected as control variables to be used for the optimization process. A neural network optimization algorithm was utilized to optimize the objective function both with and without geological uncertainty. The vertical permeability anisotropy (Kv/Kh) was selected as one of the uncertain parameters in the optimization process. The simulation results were compared to a baseline scenario case that predicted CO2 storage of 74%. The results showed an improved approach for optimizing oil recovery and CO2 storage in the FWU.
The optimization process predicted more than 94% CO2 storage and, most importantly, about 28% incremental oil recovery. The sensitivity analysis reduced the number of control variables to decrease computational time. A risk aversion factor was used to represent results at various confidence levels to assist management in the decision-making process. The defined objective functions proved to be a robust approach to co-optimizing oil recovery and CO2 storage. The Farnsworth CO2 project will serve as a benchmark for future CO2-EOR or CCUS projects in the Anadarko basin or geologically similar basins throughout the world.

  20. Characterization of Sedimentary Deposits Using usSEABED for Large-scale Mapping, Modeling and Research of U.S. Continental Margins

    NASA Astrophysics Data System (ADS)

    Williams, S. J.; Reid, J. A.; Arsenault, M. A.; Jenkins, C.

    2006-12-01

    Geologic maps of offshore areas containing detailed morphologic features and sediment character can serve many scientific and operational purposes. Such maps have been lacking, but recent computer technology and software to capture diverse marine data offer promise. Continental margins, products of complex geologic history and dynamic oceanographic processes, dominated by the Holocene marine transgression, contain landforms which provide a variety of important functions: critical habitats for fish, ship navigation, national defense, and engineering activities (i.e., oil and gas platforms, pipeline and cable routes, wind-energy sites), and contain important sedimentary records. Some shelf areas also contain sedimentary deposits such as sand and gravel, regarded as potential aggregate resources for mitigating coastal erosion, reducing vulnerability to hazards, and restoring ecosystems. Because coastal and offshore areas are increasingly important, knowledge of the framework geology and marine processes is useful to many. Especially valuable are comprehensive and integrated digital databases based on data from original sources in the marine community. Products of interest are GIS maps containing thematic information such as seafloor physiography, geology, sediment character and texture, seafloor roughness, and geotechnical engineering properties. These map products are useful to scientists modeling nearshore and shelf processes as well as to planners and managers. The USGS, with partners, is leading a nationwide program to gather a wide variety of extant marine geologic data into the usSEABED system (http://walrus.wr.usgs/usseabed). This provides a centralized, fully integrated digital database of marine geologic data collected over the past 50 years by the USGS, other federal and state agencies, universities and private companies. To date, approximately 325,000 data points from the U.S. EEZ reside in usSEABED.
usSEABED, which combines a broad array of physical data and information (both analytical and descriptive) about the sea floor, including sediment textural, statistical, geochemical, geophysical, and compositional information, is available to the marine community through USGS Data Series publications. Three DS reports for the Atlantic (DS-118), Gulf of Mexico (DS-146) and Pacific (DS-182) were published in 2006, and reports for HI and AK are forthcoming. The use of usSEABED and derivative map products is part of ongoing USGS efforts to conduct regional assessments of potential marine sand and gravel resources, map benthic habitats, and support research in understanding seafloor character and mobility, transport processes and natural resources.

  1. Comparison of methods used to estimate conventional undiscovered petroleum resources: World examples

    USGS Publications Warehouse

    Ahlbrandt, T.S.; Klett, T.R.

    2005-01-01

    Various methods for assessing undiscovered oil, natural gas, and natural gas liquid resources were compared in support of the USGS World Petroleum Assessment 2000. Discovery process, linear fractal, parabolic fractal, engineering estimates, PETRIMES, Delphi, and the USGS 2000 methods were compared. Three comparisons of these methods were made in: (1) the Neuquen Basin province, Argentina (different assessors, same input data); (2) provinces in North Africa, Oman, and Yemen (same assessors, different methods); and (3) the Arabian Peninsula, Arabian (Persian) Gulf, and North Sea (different assessors, different methods). A fourth comparison (same assessors, same assessment methods but different geologic models), between results from structural and stratigraphic assessment units in the North Sea, used only the USGS 2000 method, and hence compared the type of assessment unit rather than the method. In comparing methods, differences arise from inherent differences in assumptions regarding: (1) the underlying distribution of the parent field population (all fields, discovered and undiscovered); (2) the population of fields being estimated, that is, the entire parent distribution or the undiscovered resource distribution; (3) inclusion or exclusion of large outlier fields; (4) inclusion or exclusion of field (reserve) growth; (5) deterministic or probabilistic models; (6) data requirements; and (7) scale and time frame of the assessment. Discovery process, Delphi subjective consensus, and the USGS 2000 method yield comparable results because similar procedures are employed. In mature areas such as the Neuquen Basin province in Argentina, the linear and parabolic fractal and engineering methods were conservative compared to the other five methods and relative to new reserve additions there since 1995. The PETRIMES method gave the most optimistic estimates in the Neuquen Basin.
In less mature areas, the linear fractal method yielded larger estimates relative to other methods. A geologically based model, such as one using the total petroleum system approach, is preferred in that it combines the elements of petroleum source, reservoir, trap and seal, and the tectono-stratigraphic history of basin evolution, with petroleum resource potential. Care must be taken to demonstrate that homogeneous populations in terms of geology, geologic risk, exploration, and discovery processes are used in the assessment process. The USGS 2000 method (7th Approximation Model, EMC computational program) is robust; that is, it can be used in both mature and immature areas, and provides comparable results when using different geologic models (e.g. stratigraphic or structural) with differing numbers of subdivisions (assessment units) within the total petroleum system. © 2005 International Association for Mathematical Geology.

  2. Isopach map of the interval from surface elevation to the top of the Pennsylvanian and Permian Minnelusa Formation and equivalents, Powder River basin, Wyoming and Montana

    USGS Publications Warehouse

    Crysdale, B.L.

    1990-01-01

    This map is one in a series of U.S. Geological Survey Miscellaneous Field Studies (MF) maps showing computer-generated structure contours, isopachs, and cross sections of selected formations in the Powder River basin, Wyoming and Montana. The map and cross sections were constructed from information stored in a U.S. Geological Survey Evolution of Sedimentary Basins data base. This data base contains picks of geologic formation and (or) unit tops and bases determined from electric resistivity and gamma-ray logs of 8,592 wells penetrating Tertiary and older rocks in the Powder River basin. Well completion cards (scout tickets) were reviewed and compared with copies of all logs, and formation or unit contacts determined by N. M. Denson, D.L. Macke, R. R. Schumann and others. This isopach map is based on information from 1,480 of these wells that penetrate the Minnelusa Formation and equivalents.

  3. Map showing contours on the top of the Pennsylvanian and Permian Minnelusa Formation and equivalents, Powder River basin, Wyoming and Montana

    USGS Publications Warehouse

    Crysdale, B.L.

    1990-01-01

    This map is one in a series of U.S. Geological Survey Miscellaneous Field Studies (MF) maps showing computer-generated structure contours, isopachs, and cross sections of selected formations in the Powder River basin, Wyoming and Montana. The map and cross sections were constructed from information stored in a U.S. Geological Survey Evolution of Sedimentary Basins data base. This data base contains picks of geologic formation and (or) unit tops and bases determined from electric resistivity and gamma-ray logs of 8,592 wells penetrating Tertiary and older rocks in the Powder River basin. Well completion cards (scout tickets) were reviewed and compared with copies of all logs, and formation or unit contacts determined by N. M. Denson, D.L. Macke, R. R. Schumann and others. This isopach map is based on information from 1,480 of these wells that penetrate the Minnelusa Formation and equivalents.

  4. A Computer-Assisted Laboratory Sequence for Petroleum Geology.

    ERIC Educational Resources Information Center

    Lumsden, David N.

    1979-01-01

    Describes a competitive oil-play game for petroleum geology students, accompanied by a computer program written in interactive Fortran. The program is not essential, but is useful for adding interest. (SA)

  5. Fractional Steps methods for transient problems on commodity computer architectures

    NASA Astrophysics Data System (ADS)

    Krotkiewski, M.; Dabrowski, M.; Podladchikov, Y. Y.

    2008-12-01

    Fractional Steps methods are suitable for modeling transient processes that are central to many geological applications. Low memory requirements and modest computational complexity facilitate calculations on high-resolution three-dimensional models. An efficient implementation of Alternating Direction Implicit/Locally One-Dimensional schemes for an Opteron-based shared memory system is presented. The memory bandwidth usage, the main bottleneck on modern computer architectures, is specially addressed. High efficiency of above 2 GFlops per CPU is sustained for problems with 1 billion degrees of freedom. The optimized sequential implementation of all 1D sweeps is comparable in execution time to copying the used data in memory. Scalability of the parallel implementation on up to 8 CPUs is close to perfect. Performing one timestep of the Locally One-Dimensional scheme on a system of 1000^3 unknowns on 8 CPUs takes only 11 s. We validate the LOD scheme using a computational model of an isolated inclusion subject to a constant far-field flux. Next, we study numerically the evolution of a diffusion front and the effective thermal conductivity of composites consisting of multiple inclusions, and compare the results with predictions based on the differential effective medium approach. Finally, application of the developed parabolic solver is suggested for a real-world problem of fluid transport and reactions inside a reservoir.
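
    A Locally One-Dimensional step splits a multi-dimensional implicit solve into cheap tridiagonal sweeps along each axis, which is what keeps memory traffic and operation counts low. The sketch below shows one LOD timestep for 2D diffusion with a Thomas-algorithm tridiagonal solver; it is a generic textbook formulation, not the authors' optimized implementation, and it assumes a square grid with fixed Dirichlet boundaries.

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal)."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                 # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def lod_step(T, kappa, dt, dx):
    """One LOD timestep for 2D diffusion: an implicit 1D sweep in x,
    then in y. Assumes a square grid; boundary values are held fixed."""
    r = kappa * dt / dx**2
    n = T.shape[0]
    a = np.full(n, -r); b = np.full(n, 1 + 2 * r); c = np.full(n, -r)
    a[0] = c[0] = a[-1] = c[-1] = 0.0     # boundary rows: identity
    b[0] = b[-1] = 1.0
    for j in range(1, T.shape[1] - 1):    # sweep 1: implicit in x
        T[:, j] = thomas(a, b, c, T[:, j].copy())
    for i in range(1, T.shape[0] - 1):    # sweep 2: implicit in y
        T[i, :] = thomas(a, b, c, T[i, :].copy())
    return T
```

    Each sweep is O(n) per line and touches memory in long contiguous runs, which is why the paper's optimized sweeps approach the cost of a plain memory copy.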

  6. Use of microcomputer in mapping depth of stratigraphic horizons in National Petroleum Reserve in Alaska

    USGS Publications Warehouse

    Payne, Thomas G.

    1982-01-01

    REGIONAL MAPPER is a menu-driven system in the BASIC language for computing and plotting (1) time, depth, and average velocity to geologic horizons, (2) interval time, thickness, and interval velocity of stratigraphic intervals, and (3) subcropping and onlapping intervals at unconformities. The system consists of three programs: FILER, TRAVERSER, and PLOTTER. A control point is a shot point with velocity analysis or a shot point at or near a well with a velocity check-shot survey. Reflection time to and code number of seismic horizons are filed by digitizing tablet from record sections. TRAVERSER starts at a point of geologic control and, in traversing to another, parallels seismic events, records loss of horizons by onlap and truncation, and stores reflection time for geologic horizons at traversed shot points. TRAVERSER is basically a phantoming procedure. Permafrost thickness and velocity variations, buried canyons with low-velocity fill, and error in seismically derived velocity cause velocity anomalies that complicate depth mapping. Two depths to the top of the pebble shale are computed for each control point. One depth, designated Zs, is based on seismically derived velocity. The other (Zw) is based on interval velocity interpolated linearly between wells and multiplied by interval time (isochron) to give interval thickness. Zw is computed for all geologic horizons by downward summation of interval thickness. Unknown true depth (Z) to the pebble shale may be expressed as Z = Zs + es and Z = Zw + ew, where the e terms represent error. Equating the two expressions gives the depth difference D = Zs - Zw = ew - es. A plot of D for the top of the pebble shale is readily contourable, but smoothing is required to produce a reasonably simple surface. Seismically derived velocity used in computing Zs includes the effect of velocity anomalies but is subject to some large randomly distributed errors resulting in depth errors (es).
Well-derived velocity used in computing Zw does not include the effect of velocity anomalies, but the error (ew) should reflect these anomalies and should be contourable (non-random). The D surface as contoured with smoothing is assumed to represent ew, that is, the depth effect of variations in permafrost thickness and velocity and buried canyon depth. Estimated depth (Zest) to each geologic horizon is the sum of Zw for that horizon and a constant ew as contoured for the pebble shale, which is the first highly continuous seismic horizon below the zone of anomalous velocity. Results of this 'depthing' procedure are compared with those of Tetra Tech, Inc., the subcontractor responsible for geologic and geophysical interpretation and mapping.
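
    The depthing procedure above reduces to Zest = Zw + ew, where ew is the smoothed depth-difference surface D = Zs - Zw mapped for the pebble shale. A toy sketch, in which a 3x3 mean filter stands in for hand contouring with smoothing (the grids and values are invented):

```python
import numpy as np

def smooth(field, passes=2):
    """3x3 moving-average smoothing: a crude stand-in for contouring
    the D surface by hand with smoothing."""
    f = field.astype(float)
    for _ in range(passes):
        p = np.pad(f, 1, mode="edge")
        f = sum(p[i:i + f.shape[0], j:j + f.shape[1]]
                for i in range(3) for j in range(3)) / 9.0
    return f

def estimated_depth(Zs_pebble, Zw_pebble, Zw_horizon):
    """Zest = Zw + ew: the smoothed difference D = Zs - Zw for the
    pebble shale approximates the contourable error ew, applied as a
    correction to the well-derived depth of each horizon."""
    D = Zs_pebble - Zw_pebble     # D = Zs - Zw = ew - es
    ew = smooth(D)                # smoothing suppresses the random es
    return Zw_horizon + ew
```

    With no random seismic error (es = 0) and a constant velocity-anomaly effect, the correction recovers the true depth exactly, which is the noise-free limit of the procedure.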

  7. Computer-aided boundary delineation of agricultural lands

    NASA Technical Reports Server (NTRS)

    Cheng, Thomas D.; Angelici, Gary L.; Slye, Robert E.; Ma, Matt

    1989-01-01

    The National Agricultural Statistics Service of the United States Department of Agriculture (USDA) presently uses labor-intensive aerial photographic interpretation techniques to divide large geographical areas into manageable-sized units for estimating domestic crop and livestock production. Prototype software, the computer-aided stratification (CAS) system, was developed to automate the procedure, and currently runs on a Sun-based image processing system. With a background display of LANDSAT Thematic Mapper and United States Geological Survey Digital Line Graph data, the operator uses a cursor to delineate agricultural areas, called sampling units, which are assigned to strata of land-use and land-cover types. The resultant stratified sampling units are used as input into subsequent USDA sampling procedures. As a test, three counties in Missouri were chosen for application of the CAS procedures. Subsequent analysis indicates that CAS was five times faster in creating sampling units than the manual techniques were.

  8. Modelling of processes occurring in deep geological repository - Development of new modules in the GoldSim environment

    NASA Astrophysics Data System (ADS)

    Vopálka, D.; Lukin, D.; Vokál, A.

    2006-01-01

    Three new modules modelling the processes that occur in a deep geological repository have been prepared in the GoldSim computer code environment (using its Transport Module). These modules help to clarify the role of selected parameters in the near-field region of the final repository and to support preparation of one's own complex model of repository behaviour. The source term module includes radioactive decay and ingrowth in the canister, first-order degradation of the fuel matrix, solubility limitation of the concentration of the studied nuclides, and diffusive migration through the surrounding bentonite layer, controlled by the output boundary condition formulated with respect to the rate of water flow in the rock. The corrosion module describes corrosion of canisters made of carbon steel and transport of corrosion products in the near-field region. This module computes balance equations between dissolving species and species transported by diffusion and/or advection from the surface of a solid material. The diffusion module, which also includes a non-linear form of the interaction isotherm, can be used for the evaluation of small-scale diffusion experiments.
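
    The decay-and-ingrowth part of the source term module corresponds to the analytic Bateman solution for a decay chain. A minimal sketch of just that piece for a two-member chain (solubility limits, matrix degradation, and diffusive transport are omitted); it assumes distinct decay constants:

```python
import math

def bateman_pair(n1_0, n2_0, lam1, lam2, t):
    """Analytic two-member decay chain (parent -> daughter):
        dN1/dt = -lam1*N1,   dN2/dt = lam1*N1 - lam2*N2.
    Requires lam1 != lam2 (the degenerate case needs a separate formula)."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = (n1_0 * lam1 / (lam2 - lam1)
          * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
          + n2_0 * math.exp(-lam2 * t))
    return n1, n2
```

    In a full source-term model this analytic chain would feed the solubility-limited dissolved concentration, which in turn sets the boundary condition for diffusion through the bentonite.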

  9. A GIS-based susceptibility map for landslides at the Franconian Alb, Germany

    NASA Astrophysics Data System (ADS)

    Jaeger, Daniel; Wilde, Martina; Lorenz, Michael; Terhorst, Birgit; Neuhäuser, Bettina; Damm, Bodo; Bemm, Stefan

    2014-05-01

    In general, slopes of cuesta scarps like the Franconian Alb are highly prone to slide activity due to susceptible geological and geomorphological conditions. The geological setting, with alternating permeable and non-permeable bedrock, results in the characteristic cuesta landforms of almost flat backslopes and steeper front slopes. Furthermore, this bipartite structure leads to a strong disposition for mass movements. The slopes of the study area near the town of Ebermannstadt are affected by different types of mass movements, such as topples, slides, lateral spreads and flows, either in single or in combined occurrence. In the years 1625, 1957, 1961 and 1979, four large landslides took place in the area of Ebermannstadt, reaching close to the town limits and causing major destruction to traffic facilities. In the study area, slopes are covered by debris and slide masses, and thus are prone to remobilization and further mass movements. In order to assess hazardous areas, a GIS-based susceptibility model was generated for the study area. The susceptibility modelling was carried out with the slope stability model SINMAP (Stability Index Mapping), developed by TARBOTON (1997) and PACK et al. (1999). As SINMAP was particularly designed to model shallow translational slides, it should describe the conditions of the study area well. SINMAP is based on the "infinite slope stability model" of HAMMOND et al. (1992) and MONTGOMERY & DIETRICH (1994), which focuses on the relation of stabilizing (cohesion, friction angle) and destabilizing (gravitation) factors on a planar surface. By adding the slope gradient, as well as soil mechanical and climatic data, indices of slope stability are calculated. For a more detailed modeling of the slope conditions, SINMAP computes different "calibration regions", which merge similar parameters of soil, land-use, vegetation, and geology.
Because vegetation, land use, and soils show only minor differences on the slopes of the study area, and therefore have no significant impact on slope stability in the applied modelling, geology becomes the most important input factor. First calculations were therefore based on the main geological units drawn in the geological map, such as limestone, clay, sandstone and debris. However, the results obtained were not sufficient, as several areas of known instability were calculated as rather stable slope areas. This was due to an underrepresentation of debris and slide masses in the geological map and hence in the model's calculations. In order to improve the modelling process, the standard geological units were further differentiated, with the debris cover and its soil-mechanical parameters considered in greater detail. These adjustments not only led to significantly improved modelling results in the study area, but also created a more realistic basis for SINMAP calculations in all cuesta landscapes. HAMMOND, C., HALL, D., MILLER, S., SWETIK, P. (1992): Level I Stability Analysis (LISA) Documentation for Version 2.0. General Technical Report INT-285, USDA Forest Service Intermountain Research Station, 190 p. MONTGOMERY, D. R. & DIETRICH, W. E. (1994): A physically based model for the topographic control on shallow landsliding. Water Resources Research 30/4, p. 1153-1171. PACK, R. T., TARBOTON, D. G. & GOODWIN, C. N. (1999): SINMAP - A Stability Index Approach to Terrain Stability Hazard Mapping, User's Manual. Forest Renewal B.C., 65 p. TARBOTON, D. G. (1997): A new method for the determination of flow directions and upslope areas in grid digital elevation models. Water Resources Research 33/2, p. 309-319.
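The infinite-slope formulation that SINMAP builds on reduces, in its dimensionless form, to a ratio of stabilizing to destabilizing terms. A minimal sketch of that factor-of-safety calculation (the function and all parameter values are illustrative, not taken from the study):

```python
import math

def factor_of_safety(cohesion, slope_deg, wetness, density_ratio, phi_deg):
    """Dimensionless infinite-slope factor of safety, in the spirit of the
    SINMAP formulation: FS = [C + cos(theta)*(1 - w*r)*tan(phi)] / sin(theta).

    cohesion      -- dimensionless combined root/soil cohesion C
    slope_deg     -- slope angle theta in degrees
    wetness       -- relative wetness w (0..1)
    density_ratio -- water-to-soil density ratio r (~0.5)
    phi_deg       -- internal friction angle in degrees
    """
    theta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    return (cohesion + math.cos(theta) * (1.0 - wetness * density_ratio)
            * math.tan(phi)) / math.sin(theta)
```

Values of FS below 1 flag potentially unstable cells; note how increasing the wetness term lowers FS, which is why the debris cover's hydrologic parameters matter so much in the abstract's calibration.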

  10. Building awareness of their origins from the knowledge of the territory

    NASA Astrophysics Data System (ADS)

    Gigante, Francesco

    2017-04-01

    Brindisi (the main town in the Salento peninsula, Puglia, Italy) is an important natural harbor, known for centuries as the "Valigia delle Indie". The whole coastline, which extends to the NW and SE, deserves close attention for its structural characteristics, values, physical-environmental relationships, nature, history, settlement, and landscape. A stronger sense of citizenship in young people aged 11-13 is built starting from knowledge of the territory, understood as a set of experiences and habits that climbs from individual perception to the collective. Over the three years of junior high school, students are offered a general overview of the landscape of the province: the study of the coastal environment as a result of geological processes and ecosystem relationships is a key part of the curriculum. The teaching programs of the 2nd and 3rd years include the study of ecosystems and a general overview of the geology of planet Earth. The extracurricular part of the program consists of projects aimed at knowledge of the area through hikes to the sites and the use of various technologies, both physical-chemical (environmental analysis) and computational (for geographical classification). In particular, in the coming months the students will examine the lagoon environment of the Torre Guaceto biotope, a wetland protected under the 1971 Ramsar Convention. For the computing component, a literacy course is organized every year covering the use of GIS and image processing. During the winter months, some classes will also prepare to take part in a national competition based on knowledge of ecosystems.

  11. Estimates of ground-water recharge based on streamflow-hydrograph methods: Pennsylvania

    USGS Publications Warehouse

    Risser, Dennis W.; Conger, Randall W.; Ulrich, James E.; Asmussen, Michael P.

    2005-01-01

    This study, completed by the U.S. Geological Survey (USGS) in cooperation with the Pennsylvania Department of Conservation and Natural Resources, Bureau of Topographic and Geologic Survey (T&GS), provides estimates of ground-water recharge for watersheds throughout Pennsylvania computed by use of two automated streamflow-hydrograph-analysis methods--PART and RORA. The PART computer program uses a hydrograph-separation technique to divide the streamflow hydrograph into components of direct runoff and base flow. Base flow can be a useful approximation of recharge if losses and interbasin transfers of ground water are minimal. The RORA computer program uses a recession-curve displacement technique to estimate ground-water recharge from each storm period indicated on the streamflow hydrograph. Recharge estimates were made using streamflow records collected during 1885-2001 from 197 active and inactive streamflow-gaging stations in Pennsylvania where streamflow is relatively unaffected by regulation. Estimates of mean-annual recharge in Pennsylvania computed by the use of PART ranged from 5.8 to 26.6 inches; estimates from RORA ranged from 7.7 to 29.3 inches. Estimates from the RORA program were about 2 inches greater than those derived from the PART program. Mean-monthly recharge was computed from the RORA program and was reported as a percentage of mean-annual recharge. On the basis of this analysis, the major ground-water recharge period in Pennsylvania typically is November through May; the greatest monthly recharge typically occurs in March.
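PART separates base flow from the streamflow hydrograph using antecedent-recession logic; the exact algorithm is documented by the USGS. As a rough, hypothetical illustration of the idea of hydrograph separation (a simple running minimum, emphatically not the actual PART procedure):

```python
import numpy as np

def baseflow_sliding_min(q, window=5):
    """Crude base-flow estimate: running minimum of daily streamflow,
    clipped so base flow never exceeds total flow. A toy stand-in for
    the USGS PART program, which instead uses antecedent-recession
    requirements to decide when streamflow is all base flow.
    """
    q = np.asarray(q, dtype=float)
    n = len(q)
    half = window // 2
    bf = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        bf[i] = q[lo:hi].min()
    return np.minimum(bf, q)
```

Summing the separated base-flow series over a year and dividing by drainage area gives the kind of annual recharge depth (inches over the basin) reported in the study, under the stated assumption that base flow approximates recharge.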

  12. Methods for computing water-quality loads at sites in the U.S. Geological Survey National Water Quality Network

    USGS Publications Warehouse

    Lee, Casey J.; Murphy, Jennifer C.; Crawford, Charles G.; Deacon, Jeffrey R.

    2017-10-24

    The U.S. Geological Survey publishes information on concentrations and loads of water-quality constituents at 111 sites across the United States as part of the U.S. Geological Survey National Water Quality Network (NWQN). This report details historical and updated methods for computing water-quality loads at NWQN sites. The primary updates to historical load estimation methods include (1) an adaptation to methods for computing loads to the Gulf of Mexico; (2) the inclusion of loads computed using the Weighted Regressions on Time, Discharge, and Season (WRTDS) method; and (3) the inclusion of loads computed using continuous water-quality data. Loads computed using WRTDS and continuous water-quality data are provided along with those computed using historical methods. Various aspects of method updates are evaluated in this report to help users of water-quality loading data determine which estimation methods best suit their particular application.
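WRTDS models the logarithm of concentration as a function of time, discharge, and season. A sketch of the underlying regression form, fitted globally by ordinary least squares for simplicity (function names are ours; the production method fits this surface locally with distance-based weights and applies a retransformation bias correction):

```python
import numpy as np

def fit_wrtds_surface(t, q, c):
    """Fit the WRTDS regression form by ordinary least squares:
        ln(c) = b0 + b1*t + b2*ln(q) + b3*sin(2*pi*t) + b4*cos(2*pi*t)
    with t in decimal years, q discharge, c concentration.
    """
    t = np.asarray(t, float); q = np.asarray(q, float); c = np.asarray(c, float)
    X = np.column_stack([np.ones_like(t), t, np.log(q),
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    beta, *_ = np.linalg.lstsq(X, np.log(c), rcond=None)
    return beta

def predict_conc(beta, t, q):
    """Back-transformed concentration from the fitted coefficients
    (no retransformation bias correction applied here)."""
    t = np.asarray(t, float); q = np.asarray(q, float)
    X = np.column_stack([np.ones_like(t), t, np.log(q),
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    return np.exp(X @ beta)
```

Multiplying predicted concentration by daily discharge (with unit conversion) yields the daily loads that are then aggregated, which is the quantity the NWQN report tabulates.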

  13. The identity of the North East of England has been shaped by the rocks beneath our feet

    NASA Astrophysics Data System (ADS)

    Shields, Deborah

    2017-04-01

    Geology and Geography students in England learn about the Earth's processes and human processes; however, it is not always easy for them to see the links between the two, or to their own lives. Changes to the A-level Geography specification have placed an emphasis on how processes are linked to students' own lives and the local area. I am fortunate to teach both Geography and Geology, and I want my students who study both subjects to appreciate the links between them. I also want them to appreciate the local geology and see how it has shaped the North East of England. I have therefore created a series of lessons to help them explore the local geology and place identity of the North East of England, and to develop an understanding of how the local geology influences place identity. I have used an enquiry-based approach with a KWL chart and a concept map for students to demonstrate their understanding. These lessons are structured using the learning cycle and are differentiated through the use of cheat sheets, different levels of hand-outs, and grouping of students. The learning objectives are: 1. Describe the geology of the North East of England. 2. Explain at least one process which has formed the local geology. 3. Define place identity. 4. Discuss the North East of England's identity. 5. Discuss how the local geology has influenced the North East of England's identity. The North East of England's geology mainly consists of coal and limestone. The North East has a rich industrial heritage based around coal mining, which has therefore had a great impact on the identity of the region. There are also a number of SSSIs owing to the Magnesian Limestone in the area, which has helped to shape the identity of the region, and a number of areas of outstanding natural beauty due to the local geology, which have helped to create a positive identity for the North East of England.

  14. Researchers Mine Information from Next-Generation Subsurface Flow Simulations

    DOE PAGES

    Gedenk, Eric D.

    2015-12-01

    A research team based at Virginia Tech leveraged computing resources at the US Department of Energy's (DOE's) Oak Ridge National Laboratory to explore subsurface multiphase flow phenomena that cannot be experimentally observed. Using the Cray XK7 Titan supercomputer at the Oak Ridge Leadership Computing Facility, the team used micro-CT images of subsurface geologic systems to create two-phase flow simulations. The team's model development has implications for computational research pertaining to carbon sequestration, oil recovery, and contaminant transport.

  15. Geology, structure, and statistics of multi-ring basins on Mars

    NASA Technical Reports Server (NTRS)

    Schultz, Richard A.; Frey, Herbert V.

    1990-01-01

    Available data on Martian multi-ring basins were compiled and evaluated using the new 1:15-million-scale geologic maps of Mars and revised global topography as base maps. Published center coordinates and ring diameters of Martian basins were plotted by computer and superimposed onto the base maps. In many cases, basin centers, ring diameters, or both had to be adjusted to achieve a better fit to the revised maps. It was also found that additional basins can explain subcircular topographic lows as well as map patterns of old Noachian materials, volcanic plains units, and channels in the Tharsis region.
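Adjusting published basin centers and ring diameters to fit revised base maps is, in essence, a circle-fitting problem: given digitized points along a ring segment, recover the best center and radius. A standard algebraic (Kåsa) least-squares circle fit, offered as a generic illustration rather than the authors' actual procedure:

```python
import numpy as np

def fit_ring(x, y):
    """Algebraic (Kasa) least-squares circle fit: returns (xc, yc, radius).
    Rewrites x^2 + y^2 = 2*xc*x + 2*yc*y + (r^2 - xc^2 - yc^2) and solves
    for the three unknowns by ordinary least squares.
    """
    x = np.asarray(x, float); y = np.asarray(y, float)
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    (xc, yc, k), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(k + xc**2 + yc**2)
    return xc, yc, r
```

The fit works even when only an arc of the ring is preserved, which suits partially degraded basins, though the solution becomes less stable as the arc shortens.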

  16. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    USGS Publications Warehouse

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office, which manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as those performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology.
The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously. The new approaches and expanded use of computers will require substantial increases in the quantity and sophistication of the Division's computer resources. The requirements presented in this report will be used to develop technical specifications that describe the computer resources needed during the 1990s. (USGS)

  17. Introducing students to digital geological mapping: A workflow based on cheap hardware and free software

    NASA Astrophysics Data System (ADS)

    Vrabec, Marko; Dolžan, Erazem

    2016-04-01

    The undergraduate field course in Geological Mapping at the University of Ljubljana involves 20-40 students per year, which precludes the use of specialized rugged digital field equipment, as the costs would be well beyond the capabilities of the Department. A different mapping area is selected each year with the aim of providing the typical conditions that a professional geologist might encounter when doing fieldwork in Slovenia: rugged relief, dense tree cover, and moderately-well- to poorly-exposed bedrock due to vegetation and urbanization. It is therefore mandatory that digital tools and workflows be combined with classical methods of fieldwork, since, for example, full-time precise GNSS positioning is not viable under such circumstances. Additionally, due to the prevailing combination of complex geological structure with generally poor exposure, students cannot be expected to produce line (vector) maps of geological contacts on the go, so there is no need for such functionality in the hardware and software that we use in the field. Our workflow therefore still relies on paper base maps, but is strongly complemented with digital tools to provide robust positioning, track recording, and acquisition of various point-based data. The primary field hardware is students' Android-based smartphones and, optionally, tablets. For our purposes, the built-in GNSS chips provide adequate positioning precision most of the time, particularly if they are GLONASS-capable. We use Oruxmaps, a powerful free offline map viewer for the Android platform, which facilitates the use of custom-made geopositioned maps. For digital base maps, which we prepare in the free QGIS software under Windows, we use scanned topographic maps provided by the National Geodetic Authority, but also other maps such as aerial imagery, processed Digital Elevation Models, scans of existing geological maps, etc.
Point data, such as important outcrop locations or structural measurements, are entered into Oruxmaps as waypoints. Students are also encouraged to measure structural data directly with specialized Android apps such as the MVE FieldMove Clino. Digital field data are exported from Oruxmaps to Windows computers primarily in the ubiquitous GPX data format and then integrated in the QGIS environment. Recorded GPX tracks are also used with the free Geosetter Windows software to geoposition and tag any digital photographs taken in the field. At minimal expense, our workflow provides the students with basic familiarity and experience in using digital field tools and methods. The workflow is also practical enough for the prevailing field conditions of Slovenia that faculty staff use it in geological mapping for scientific research and consultancy work.
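The GPX waypoints exported from Oruxmaps are plain XML and can be read with standard tools before loading into QGIS. A minimal sketch using Python's standard library (element names follow the GPX 1.1 schema; the helper function itself is ours, not part of the workflow described above):

```python
import xml.etree.ElementTree as ET

# GPX 1.1 documents live in this XML namespace.
GPX_NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def read_waypoints(gpx_text):
    """Extract (name, lat, lon, elevation) tuples from a GPX 1.1 document.
    Elevation and name are None when the optional elements are absent.
    """
    root = ET.fromstring(gpx_text)
    points = []
    for wpt in root.findall("gpx:wpt", GPX_NS):
        name_el = wpt.find("gpx:name", GPX_NS)
        ele_el = wpt.find("gpx:ele", GPX_NS)
        points.append((
            name_el.text if name_el is not None else None,
            float(wpt.get("lat")),
            float(wpt.get("lon")),
            float(ele_el.text) if ele_el is not None else None,
        ))
    return points
```

The resulting tuples can be written to CSV or GeoPackage for direct use as a QGIS point layer.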

  18. Performance evaluation and geologic utility of LANDSAT 4 TM and MSS scanners

    NASA Technical Reports Server (NTRS)

    Paley, H. N.

    1983-01-01

    Experiments using artificial targets (polyethylene sheets) to help calibrate and evaluate atmospheric effects, as well as the radiometric precision and spatial characteristics of the NS-001 and TM sensor systems, demonstrated the technical feasibility of using plastic targets for such studies, although weather precluded successful TM data acquisition. Tapes for six LANDSAT 4 TM scenes were acquired and data processing began. Computer-enhanced TM simulator and LANDSAT 4 TM data were compared for a porphyry copper deposit in southern Arizona. Preliminary analyses performed on two TM scenes acquired in the CCT-PT format show that the TM data appear to contain a marked increase in geologically useful information; however, a number of instrumental processing artifacts may limit the ability of the geologist to fully extract this information.

  19. Archive of digital boomer seismic reflection data collected offshore northeast Florida during USGS cruise 02FGS01, October 2002

    USGS Publications Warehouse

    Subino, Janice A.; Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Calderon, Karynna

    2012-01-01

    This Digital Versatile Disc (DVD) publication was prepared by an agency of the United States Government. Although these data have been processed successfully on a computer system at the U.S. Geological Survey, no warranty expressed or implied is made regarding the display or utility of the data on any other system, nor shall the act of distribution imply any such warranty. The U.S. Geological Survey shall not be held liable for improper or incorrect use of the data described and (or) contained herein. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof.

  20. Tomography in Geology: 3D Modeling and Analysis of Structural Features of Rocks Using Computed MicroTomography

    NASA Astrophysics Data System (ADS)

    Ponomarev, A. A.; Mamadaliev, R. A.; Semenova, T. V.

    2016-10-01

    The article presents a brief overview of the current state of computed tomography in the oil and gas industry in Russia and worldwide. The operation of the SkyScan 1172 computed microtomograph is also described, together with the authors' own examples of its application to solving geological problems.

  1. Review of the Water Resources Information System of Argentina

    USGS Publications Warehouse

    Hutchison, N.E.

    1987-01-01

    A representative of the U.S. Geological Survey traveled to Buenos Aires, Argentina, in November 1986, to discuss water information systems and data bank implementation in the Argentine Government Center for Water Resources Information. Software has been written by Center personnel for a minicomputer to be used to manage inventory (index) data and water quality data. Additional hardware and software have been ordered to upgrade the existing computer. Four microcomputers, statistical and data base management software, and network hardware and software for linking the computers have also been ordered. The Center plans to develop a nationwide distributed data base for Argentina that will include the major regional offices as nodes. Needs for continued development of the water resources information system for Argentina were reviewed. Identified needs include: (1) conducting a requirements analysis to define the content of the data base and ensure that all user requirements are met, (2) preparing a plan for the development, implementation, and operation of the data base, and (3) developing a conceptual design to inform all development personnel and users of the basic functionality planned for the system. A quality assurance and configuration management program to provide oversight to the development process was also discussed. (USGS)

  2. The Challenges and Benefits of Using Computer Technology for Communication and Teaching in the Geosciences

    NASA Astrophysics Data System (ADS)

    Fairley, J. P.; Hinds, J. J.

    2003-12-01

    The advent of the World Wide Web in the early 1990s not only revolutionized the exchange of ideas and information within the scientific community, but also provided educators with a new array of teaching, informational, and promotional tools. Use of computer graphics and animation to explain concepts and processes can stimulate classroom participation and student interest in the geosciences, which has historically attracted students with strong spatial and visualization skills. In today's job market, graduates are expected to have knowledge of computers and the ability to use them for acquiring, processing, and visually analyzing data. Furthermore, in addition to promoting visibility and communication within the scientific community, computer graphics and the Internet can be informative and educational for the general public. Although computer skills are crucial for earth science students and educators, many pitfalls exist in implementing computer technology and web-based resources into research and classroom activities. Learning to use these new tools effectively requires a significant time commitment and careful attention to the source and reliability of the data presented. Furthermore, educators have a responsibility to ensure that students and the public understand the assumptions and limitations of the materials presented, rather than allowing them to be overwhelmed by "gee-whiz" aspects of the technology. We present three examples of computer technology in the earth sciences classroom: 1) a computer animation of water table response to well pumping, 2) a 3-D fly-through animation of a fault controlled valley, and 3) a virtual field trip for an introductory geology class. These examples demonstrate some of the challenges and benefits of these new tools, and encourage educators to expand the responsible use of computer technology for teaching and communicating scientific results to the general public.

  3. A new method for mapping multidimensional data to lower dimensions

    NASA Technical Reports Server (NTRS)

    Gowda, K. C.

    1983-01-01

    A multispectral mapping method is proposed based on the new concept of BEND (Bidimensional Effective Normalised Difference). The method, which involves taking one sample point at a time and finding the interrelationships between its features, is found to be very economical in terms of storage and processing time. It has good dimensionality-reduction and clustering properties and is highly suitable for computer analysis of large amounts of data. The transformed values obtained by this procedure are suitable either for a planar 2-space mapping of geological sample points or for making grayscale and color images of geo-terrains. A few examples are given to demonstrate the efficacy of the proposed procedure.
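The abstract does not spell out the BEND transform itself. As a generic illustration of the normalized-difference idea that such methods build on, one can compute pairwise normalized differences between the features of a single sample point (this helper is our own sketch, assuming positive feature values such as reflectances, and is not the BEND formula):

```python
import numpy as np

def normalized_differences(features):
    """Pairwise normalized differences (f_i - f_j) / (f_i + f_j) between
    the features of one multispectral sample point. Each entry lies in
    (-1, 1) for positive inputs, giving a scale-invariant measure of the
    interrelationship between two bands.
    """
    f = np.asarray(features, float)
    num = f[:, None] - f[None, :]   # all pairwise differences
    den = f[:, None] + f[None, :]   # all pairwise sums
    return num / den
```

Processing one point at a time in this way is what keeps the storage footprint small: only a single feature vector and its small pairwise matrix are held in memory at once.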

  4. Diverse Geological Applications For Basil: A 2d Finite-deformation Computational Algorithm

    NASA Astrophysics Data System (ADS)

    Houseman, Gregory A.; Barr, Terence D.; Evans, Lynn

    Geological processes are often characterised by large finite-deformation continuum strains, on the order of 100% or greater. Microstructural processes cause deformation that may be represented by a viscous constitutive mechanism, with viscosity that may depend on temperature, pressure, or strain-rate. We have developed an effective computational algorithm for the evaluation of 2D deformation fields produced by Newtonian or non-Newtonian viscous flow. With the implementation of this algorithm as a computer program, Basil, we have applied it to a range of diverse applications in Earth Sciences. Viscous flow fields in 2D may be defined for the thin-sheet case or, using a velocity-pressure formulation, for the plane-strain case. Flow fields are represented using 2D triangular elements with quadratic interpolation for velocity components and linear for pressure. The main matrix equation is solved by an efficient and compact conjugate gradient algorithm with iteration for non-Newtonian viscosity. Regular grids may be used, or grids based on a random distribution of points. Definition of the problem requires that velocities, tractions, or some combination of the two, are specified on all external boundary nodes. Compliant boundaries may also be defined, based on the idea that traction is opposed to and proportional to boundary displacement rate. Internal boundary segments, allowing fault-like displacements within a viscous medium, have also been developed, and we find that the computed displacement field around the fault tip is accurately represented for Newtonian and non-Newtonian viscosities, in spite of the stress singularity at the fault tip.
Basil has been applied by us and colleagues to problems that include: thin-sheet calculations of continental collision, Rayleigh-Taylor instability of the continental mantle lithosphere, deformation fields around fault terminations at the outcrop scale, stress and deformation fields in and around porphyroblasts, and deformation of the subducted oceanic slab. Application of Basil to a diverse range of topics is facilitated by the use of command syntax input files that allow most aspects of the calculation to be controlled easily, together with a post-processing package, Sybil, for display and interpretation of the results. Sybil uses a menu-driven graphical interface to access a powerful combination of commands, together with log files that allow repetitive tasks to be more automated.
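The compact conjugate-gradient solver mentioned for Basil's main matrix equation can be sketched in a few lines for the symmetric positive-definite case (this is the textbook algorithm, not Basil's implementation; in the non-Newtonian case it would sit inside an outer iteration that updates the viscosity):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Minimal conjugate-gradient solver for a symmetric positive-definite
    system A x = b. Each iteration needs one matrix-vector product, which
    is what makes the method compact for large sparse FEM systems.
    """
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs = r @ r
    for _ in range(max_iter or 10 * n):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # conjugate direction update
        rs = rs_new
    return x
```

In exact arithmetic the method converges in at most n iterations; in practice a tolerance on the residual norm, as above, terminates it much earlier for well-conditioned systems.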

  5. Topographic attributes as a guide for automated detection or highlighting of geological features

    NASA Astrophysics Data System (ADS)

    Viseur, Sophie; Le Men, Thibaud; Guglielmi, Yves

    2015-04-01

    Photogrammetry, or LIDAR technology combined with photography, allows geoscientists to obtain high-resolution 3D numerical representations of outcrops, generally termed Digital Outcrop Models (DOMs). For over a decade, these numerical outcrops have served as supports for precise and accurate interpretation of geological features such as fracture traces or planes, strata, facies mapping, etc. These interpretations have the benefit of being directly georeferenced and embedded in 3D space. They are then easily integrated into GIS or geomodeling software for 3D modelling of subsurface geological structures. However, numerical outcrops generally represent huge data sets that are cumbersome to manipulate and hence to interpret. This can be particularly tedious when several scales of geological features must be investigated, or when geological features are very dense and imbricated. Automated tools for interpreting geological features from DOMs would therefore be a significant help in processing these kinds of data. Such technologies are commonly used for interpreting seismic or medical data. However, while many efforts have been devoted to acquiring 3D topographic point clouds and photos easily and accurately, and to visualizing accurate 3D textured DOMs, little attention has been paid to the development of algorithms for automated detection of geological structures from DOMs. The automatic detection of objects in numerical data generally assumes that signals or attributes computed from the data allow recognition of the targeted object boundaries. The first step then consists of defining attributes that highlight the objects or their boundaries. For DOM interpretation, some authors have proposed using differential operators computed on the surface, such as normals or curvatures. These methods generally extract polylines corresponding to fracture traces or bed limits. Other approaches rely on PCA to segregate different topographic plans.
This approach assumes that structural or sedimentary features coincide with parts of the topographic surface. In this work, several topographic attributes are proposed to highlight geological features on outcrops. Among them, differential operators are used, but they are also combined and processed to display particular topographic shapes. Moreover, two kinds of attributes are used: unsupervised and supervised. The supervised attributes integrate a priori knowledge about the objects to be extracted (e.g., a preferential orientation of fracture surfaces). This strategy may be compared to the one used in seismic interpretation. Indeed, many seismic attributes have been proposed to highlight geological structures that are hard to observe due to data noise. The same issue exists with topographic data: vegetation, erosion, etc., generate noise that sometimes makes interpretation difficult. The proposed approach has been applied to real case studies to show how it can help the interpretation of geological features. The resulting 'topographic attributes' are shown and discussed.
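The PCA approach mentioned for segregating topographic plans amounts to fitting a plane to a local neighborhood of points: the covariance eigenvector with the smallest eigenvalue approximates the surface normal, and the eigenvalue ratios measure how planar the patch is. A minimal sketch (our own helper, not the authors' code):

```python
import numpy as np

def pca_plane_normal(points):
    """Fit a plane to a neighborhood of 3D points by PCA. The eigenvector
    of the covariance matrix with the smallest eigenvalue is taken as the
    surface normal, the usual local-plane attribute for point clouds.
    Returns (centroid, unit_normal).
    """
    p = np.asarray(points, float)
    centroid = p.mean(axis=0)
    cov = np.cov((p - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    normal = eigvecs[:, 0]                   # direction of least variance
    return centroid, normal / np.linalg.norm(normal)
```

A supervised attribute in the spirit of the abstract could then be, for instance, the angular distance between each local normal and a preferred fracture-set orientation supplied by the interpreter.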

  6. Thermohydrology of fractured geologic materials

    NASA Astrophysics Data System (ADS)

    Esh, David Whittaker

    1998-11-01

    Thermohydrological and thermohydrochemical modeling as applied to the disposal of radioactive materials in a geologic repository is presented. Site hydrology, chemistry, and mineralogy were summarized, and conceptual models of the fundamental system processes were developed. The numerical model TOUGH2 was used to complete computer simulations of thermohydrological processes in fractured geologic media. Sensitivity studies were developed to investigate the impact on the thermohydrological response of dimensionality and of the different conceptual models used to represent fractures (ECM, DK, MINC). Sensitivity to parameter variation within a given conceptual model was also considered. The sensitivity of the response was examined against thermohydrological metrics derived from the flow and redistribution of moisture. A simple thermohydrochemical model to investigate a three-process coupling (thermal-hydrological-chemical) was presented. The redistribution of chloride was evaluated because its chemical behavior is well known and defensible; in addition, chloride is very important to overall system performance. For all of the simulations completed, chloride was found to be extremely concentrated in the fluids that eventually return to the engineered barrier system. Chloride concentration and mass flux were increased from ambient by over a factor of 1000 for some simulations. Thermohydrology was found to have the potential to significantly alter chemistry from ambient conditions.

  7. Recovery Act: Web-based CO{sub 2} Subsurface Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paolini, Christopher; Castillo, Jose

    2012-11-30

    The Web-based CO{sub 2} Subsurface Modeling project focused primarily on extending an existing text-only, command-line driven, isothermal and isobaric, geochemical reaction-transport simulation code, developed and donated by Sienna Geodynamics, into an easier-to-use Web-based application for simulating long-term storage of CO{sub 2} in geologic reservoirs. The Web-based interface developed through this project, publically accessible via URL http://symc.sdsu.edu/, enables rapid prototyping of CO{sub 2} injection scenarios and allows students without advanced knowledge of geochemistry to setup a typical sequestration scenario, invoke a simulation, analyze results, and then vary one or more problem parameters and quickly re-run a simulation to answer what-if questions.more » symc.sdsu.edu has 2x12 core AMD Opteron™ 6174 2.20GHz processors and 16GB RAM. The Web-based application was used to develop a new computational science course at San Diego State University, COMP 670: Numerical Simulation of CO{sub 2} Sequestration, which was taught during the fall semester of 2012. The purpose of the class was to introduce graduate students to Carbon Capture, Use and Storage (CCUS) through numerical modeling and simulation, and to teach students how to interpret simulation results to make predictions about long-term CO{sub 2} storage capacity in deep brine reservoirs. In addition to the training and education component of the project, significant software development efforts took place. Two computational science doctoral and one geological science masters student, under the direction of the PIs, extended the original code developed by Sienna Geodynamics, named Sym.8. New capabilities were added to Sym.8 to simulate non-isothermal and non-isobaric flows of charged aqueous solutes in porous media, in addition to incorporating HPC support into the code for execution on many-core XSEDE clusters. 
A successful outcome of this project was the funding and training of three new computational science students and one geological science student in technologies relevant to carbon sequestration and problems involving flow in subsurface media. The three computational science students are currently finishing their doctoral studies on different aspects of modeling CO2 sequestration, while the geological science student completed his master's thesis on modeling the thermal response of CO2 injection into brine and, as a direct result of participation in this project, is now employed at ExxonMobil as a full-time staff geologist.

  8. Lithospheric magnetic field modelling of the African continent

    NASA Astrophysics Data System (ADS)

    Hemant, K.; Maus, S.

    2003-04-01

New magnetic satellite missions in low-Earth orbit are providing increasingly accurate maps of the lithospheric magnetic field. These maps can be used to infer the geological structure of regions hidden by Phanerozoic cover, taking into account our knowledge of crustal structure from surface geology and seismic methods. A GIS-based modelling technique has been developed to model the various geological units of the continents using the UNESCO geological map of the world, supported by background geological information from various sources. Geological units of each region are assigned a susceptibility value based on laboratory values of the constituent rock types. Then, using the 3SMAC seismic crustal structure, a vertically integrated susceptibility (VIS) model is computed at each point of the region. Starting with this VIS model, the total field anomaly is computed at an altitude of 400 km and compared with the MF2 lithospheric magnetic field model derived from CHAMP data. The modelling results of the Precambrian units of the West African cratons agree well with MF2. The anomaly in the Central African cratonic region also correlates well, although part of it is as yet unaccounted for. Furthermore, the anomalies over the Tanzanian craton and surrounding region agree very well. Most of the regions around the South African cratons are hidden by Phanerozoic cover, yet the results above the Kaapvaal craton and the southern Zimbabwe craton around the Limpopo belt show good correspondence with the observed anomaly map. The results also suggest a probable extension of the Precambrian units below the younger sediments. In general, the lower crust is likely to be more mafic than presumed in our current understanding of Central Africa. Deviations in the magnitude of the anomalies in some regions are likely due to incomplete seismic information there.
Thus, the thickness of crustal layers derived from magnetic anomalies for these locations may help to constrain future geophysical models in the less explored regions of Africa.
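The VIS computation described above reduces, at each map point, to integrating susceptibility over the crustal column. A minimal Python sketch of that step, with a purely illustrative layered column (the susceptibilities and thicknesses below are placeholders, not 3SMAC or laboratory data):

```python
# Vertically integrated susceptibility (VIS) at one map point, assuming a
# simple layered crustal column above the Curie depth. The layer values
# are illustrative placeholders, not actual 3SMAC or laboratory data.
def vertically_integrated_susceptibility(layers):
    """Sum of susceptibility (SI) times thickness (m) over the layers."""
    return sum(k * t for k, t in layers)

# (susceptibility, thickness in m) for upper, middle, and lower crust
column = [(0.001, 10_000.0), (0.005, 15_000.0), (0.02, 10_000.0)]
vis = vertically_integrated_susceptibility(column)  # SI * m
```

Repeating this at every grid point yields the VIS model from which the total-field anomaly at 400 km altitude is then computed.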

  9. Development of a Web Based Simulating System for Earthquake Modeling on the Grid

    NASA Astrophysics Data System (ADS)

    Seber, D.; Youn, C.; Kaiser, T.

    2007-12-01

Existing cyberinfrastructure-based information, data and computational networks now allow development of state-of-the-art, user-friendly simulation environments that democratize access to high-end computational environments and provide new research opportunities for many research and educational communities. Within the Geosciences cyberinfrastructure network, GEON, we have developed the SYNSEIS (SYNthetic SEISmogram) toolkit to enable efficient computations of 2D and 3D seismic waveforms for a variety of research purposes, especially for helping to analyze EarthScope's USArray seismic data in a speedy and efficient environment. The underlying simulation software in SYNSEIS is a finite difference code, E3D, developed by LLNL (S. Larsen). The code is embedded within the SYNSEIS portlet environment and is used by our toolkit to simulate seismic waveforms of earthquakes at regional distances (<1000 km). Architecturally, SYNSEIS uses both Web Service and Grid computing resources in a portal-based work environment and has a built-in access mechanism to connect to national supercomputer centers as well as to a dedicated, small-scale compute cluster for its runs. Even though Grid computing is well established in many computing communities, its use among domain scientists is still not trivial because of the multiple levels of complexity encountered. We grid-enabled E3D using our own XML input dialect, which includes geological models that are accessible through standard Web services within the GEON network. The XML inputs for this application contain structural geometries, source parameters, seismic velocity, density, attenuation values, the number of time steps to compute, and the number of stations. By enabling portal-based access to such a computational environment, coupled with its dynamic user interface, we enable a large user community to take advantage of such high-end calculations in their research and educational activities.
Our system can be used to promote an efficient and effective modeling environment to help scientists as well as educators in their daily activities and speed up the scientific discovery process.
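The abstract describes XML inputs carrying geometry, source, material, and station parameters for the E3D runs. A hedged sketch of how such an input document might be assembled in Python; the element and attribute names below are invented for illustration and are not the actual SYNSEIS schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical builder for an E3D-style run description. All element and
# attribute names are our own invention, not the real SYNSEIS dialect.
def build_input(event, stations, vel_model, nt):
    root = ET.Element("synseis_run")
    ET.SubElement(root, "source",
                  lat=str(event["lat"]), lon=str(event["lon"]),
                  depth_km=str(event["depth_km"]))
    ET.SubElement(root, "time_steps").text = str(nt)
    for name, (la, lo) in stations.items():
        ET.SubElement(root, "station", name=name, lat=str(la), lon=str(lo))
    for vp, vs, rho, thick in vel_model:  # one flat layer per entry
        ET.SubElement(root, "layer", vp=str(vp), vs=str(vs),
                      rho=str(rho), thickness_km=str(thick))
    return ET.tostring(root, encoding="unicode")

xml_doc = build_input({"lat": 36.1, "lon": -120.4, "depth_km": 8.0},
                      {"STA1": (36.5, -120.0)},
                      [(5.5, 3.2, 2.6, 10.0), (6.5, 3.7, 2.8, 20.0)],
                      4000)
```

In the real system the geological-model elements would be filled in from the standard Web services on the GEON network rather than hand-typed.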

  10. A Review of High-Performance Computational Strategies for Modeling and Imaging of Electromagnetic Induction Data

    NASA Astrophysics Data System (ADS)

    Newman, Gregory A.

    2014-01-01

Many geoscientific applications exploit electrostatic and electromagnetic fields to interrogate and map subsurface electrical resistivity—an important geophysical attribute for characterizing mineral, energy, and water resources. In complex three-dimensional geologies, where many of these resources remain to be found, resistivity mapping requires large-scale modeling and imaging capabilities, as well as the ability to treat significant data volumes, which can easily overwhelm single-core and modest multicore computing hardware. Treating such problems requires large-scale parallel computational resources, necessary for reducing the time to solution to a time frame acceptable to the exploration process. The recognition that significant parallel computing resources must be brought to bear on these problems gives rise to choices that must be made in parallel computing hardware and software. In this review, some of these choices are presented, along with the resulting trade-offs. We also discuss future trends in high-performance computing and the anticipated impact on electromagnetic (EM) geophysics. Topics discussed in this review include a survey of parallel computing platforms, ranging from graphics processing units to multicore CPUs with a fast interconnect, along with effective parallel solvers and associated solver libraries for inductive EM modeling and imaging.

  11. Some thoughts on cartographic and geographic information systems for the 1980's

    USGS Publications Warehouse

    Starr, L.E.; Anderson, Kirk E.

    1981-01-01

    The U.S. Geological Survey is adopting computer techniques to meet the expanding need for cartographic base category data. Digital methods are becoming increasingly important in the mapmaking process, and the demand is growing for physical, social, and economic data. Recognizing these emerging needs, the National Mapping Division began, several years ago, an active program to develop advanced digital methods to support cartographic and geographic data processing. An integrated digital cartographic database would meet the anticipated needs. Such a database would contain data from various sources, and could provide a variety of standard and customized map and digital data file products. This cartographic database soon will be technologically feasible. The present trends in the economics of cartographic and geographic data handling and the growing needs for integrated physical, social, and economic data make such a database virtually mandatory.

  12. The U.S. geological survey rass-statpac system for management and statistical reduction of geochemical data

    USGS Publications Warehouse

    VanTrump, G.; Miesch, A.T.

    1977-01-01

RASS is an acronym for Rock Analysis Storage System and STATPAC, for Statistical Package. The RASS and STATPAC computer programs are integrated into the RASS-STATPAC system for the management and statistical reduction of geochemical data. The system, in its present form, has been in use for more than 9 yr by scores of U.S. Geological Survey geologists, geochemists, and other scientists engaged in a broad range of geologic and geochemical investigations. The principal advantage of the system is the flexibility afforded the user both in data searches and retrievals and in the manner of statistical treatment of data. The statistical programs provide for most types of statistical reduction normally used in geochemistry and petrology, but also contain bridges to other program systems for statistical processing and automatic plotting. © 1977.

  13. Large Scale Geologic Controls on Hydraulic Stimulation

    NASA Astrophysics Data System (ADS)

    McLennan, J. D.; Bhide, R.

    2014-12-01

When simulating hydraulic fracturing, the analyst has historically prescribed a single planar fracture. Originally (in the 1950s through the 1970s) this was necessitated by computational restrictions. In the latter part of the twentieth century, hydraulic fracture simulation evolved to incorporate vertical propagation controlled by modulus, fluid loss, and the minimum principal stress. With improvements in software, computational capacity, and recognition that in-situ discontinuities are relevant, fully three-dimensional hydraulic fracture simulation is now becoming possible. Advances in simulation capabilities enable coupling structural geologic data (three-dimensional representations of stresses, natural fractures, and stratigraphy) with decision-making processes for stimulation: volumes, rates, fluid types, and completion zones. Without this interaction between simulation capabilities and geological information, low-permeability formation exploitation may linger on the fringes of real economic viability. Comparative simulations have been undertaken in varying structural environments where the stress contrast and the frequency of natural discontinuities cause varying patterns of multiple, hydraulically generated or reactivated flow paths. Stress conditions and the nature of the discontinuities are selected as variables and are used to simulate how fracturing can vary in different structural regimes. The basis of the simulations is commercial distinct element software (Itasca Corporation's 3DEC).

  14. 10 CFR 960.4-2-7 - Tectonics.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... activity within the geologic setting during the Quaternary Period. (2) Historical earthquakes within the... isolation. (3) Indications, based on correlations of earthquakes with tectonic processes and features, that either the frequency of occurrence or the magnitude of earthquakes within the geologic setting may...

  15. 10 CFR 960.5-2-11 - Tectonics.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... of active faulting within the geologic setting. (2) Historical earthquakes or past man-induced... design limits. (3) Evidence, based on correlations of earthquakes with tectonic processes and features, (e.g., faults) within the geologic setting, that the magnitude of earthquakes at the site during...

  16. 10 CFR 960.4-2-7 - Tectonics.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... activity within the geologic setting during the Quaternary Period. (2) Historical earthquakes within the... isolation. (3) Indications, based on correlations of earthquakes with tectonic processes and features, that either the frequency of occurrence or the magnitude of earthquakes within the geologic setting may...

  17. 10 CFR 960.4-2-7 - Tectonics.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... activity within the geologic setting during the Quaternary Period. (2) Historical earthquakes within the... isolation. (3) Indications, based on correlations of earthquakes with tectonic processes and features, that either the frequency of occurrence or the magnitude of earthquakes within the geologic setting may...

  18. 10 CFR 960.5-2-11 - Tectonics.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... of active faulting within the geologic setting. (2) Historical earthquakes or past man-induced... design limits. (3) Evidence, based on correlations of earthquakes with tectonic processes and features, (e.g., faults) within the geologic setting, that the magnitude of earthquakes at the site during...

  19. 10 CFR 960.4-2-7 - Tectonics.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... activity within the geologic setting during the Quaternary Period. (2) Historical earthquakes within the... isolation. (3) Indications, based on correlations of earthquakes with tectonic processes and features, that either the frequency of occurrence or the magnitude of earthquakes within the geologic setting may...

  20. 10 CFR 960.5-2-11 - Tectonics.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... of active faulting within the geologic setting. (2) Historical earthquakes or past man-induced... design limits. (3) Evidence, based on correlations of earthquakes with tectonic processes and features, (e.g., faults) within the geologic setting, that the magnitude of earthquakes at the site during...

  1. 10 CFR 960.5-2-11 - Tectonics.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... of active faulting within the geologic setting. (2) Historical earthquakes or past man-induced... design limits. (3) Evidence, based on correlations of earthquakes with tectonic processes and features, (e.g., faults) within the geologic setting, that the magnitude of earthquakes at the site during...

  2. 10 CFR 960.5-2-11 - Tectonics.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of active faulting within the geologic setting. (2) Historical earthquakes or past man-induced... design limits. (3) Evidence, based on correlations of earthquakes with tectonic processes and features, (e.g., faults) within the geologic setting, that the magnitude of earthquakes at the site during...

  3. MATLAB as an incentive for student learning of skills

    NASA Astrophysics Data System (ADS)

    Bank, C. G.; Ghent, R. R.

    2016-12-01

Our course "Computational Geology" takes a holistic approach to student learning by using MATLAB as a focal point to increase students' computing, quantitative reasoning, data analysis, report writing, and teamwork skills. The course, taught since 2007 with recent enrollments around 35 and aimed at 2nd- to 3rd-year students, is required for the Geology and Earth and Environmental Systems major programs, and can be chosen as an elective in our other programs, including Geophysics. The course is divided into five projects: Pacific plate velocity from the Hawaiian hotspot track, predicting CO2 concentration in the atmosphere, volume of Earth's oceans and sea-level rise, comparing wind directions for Vancouver and Squamish, and groundwater flow. Each project is based on real data, focuses on a mathematical concept (linear interpolation, gradients, descriptive statistics, differential equations) and highlights a programming task (arrays, functions, text file input/output, curve fitting). Working in teams of three, students need to develop a conceptual model to explain the data and write MATLAB code to visualize the data and match it to their conceptual model. The programming is guided, and students work individually on different aspects (for example: reading the data, fitting a function, unit conversion) which they need to put together to solve the problem. They then synthesize their thought process in a paper. Anecdotal evidence shows that students continue using MATLAB in other courses.
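As an illustration of the kind of guided exercise described (the course itself uses MATLAB), here is a Python sketch of the curve-fitting core of the "predicting CO2 concentration" project: a least-squares straight line through a handful of approximate Mauna Loa annual-mean values.

```python
# Least-squares straight-line fit, the kind of task students assemble
# piecewise (read data, fit, convert units). CO2 values are approximate
# Mauna Loa annual means in ppm, for illustration only.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

years = [2000, 2005, 2010, 2015, 2020]
co2_ppm = [369.5, 379.8, 389.9, 400.8, 414.2]
m, b = fit_line(years, co2_ppm)   # roughly +2.2 ppm per year
```

Students would then extrapolate the fit forward and discuss where the linear conceptual model breaks down against the data.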

  4. MICHIGAN SOIL VAPOR EXTRACTION REMEDIATION (MISER) MODEL: A COMPUTER PROGRAM TO MODEL SOIL VAPOR EXTRACTION AND BIOVENTING OF ORGANIC CHEMICALS IN UNSATURATED GEOLOGICAL MATERIAL

    EPA Science Inventory

    Soil vapor extraction (SVE) and bioventing (BV) are proven strategies for remediation of unsaturated zone soils. Mathematical models are powerful tools that can be used to integrate and quantify the interaction of physical, chemical, and biological processes occurring in field sc...

  5. TOUGH2Biot - A simulator for coupled thermal-hydrodynamic-mechanical processes in subsurface flow systems: Application to CO2 geological storage and geothermal development

    NASA Astrophysics Data System (ADS)

    Lei, Hongwu; Xu, Tianfu; Jin, Guangrong

    2015-04-01

Coupled thermal-hydrodynamic-mechanical processes have become increasingly important in studying the issues affecting subsurface flow systems, such as CO2 sequestration in deep saline aquifers and geothermal development. In this study, a mechanical module based on the extended Biot consolidation model was developed and incorporated into the well-established thermal-hydrodynamic simulator TOUGH2, resulting in an integrated numerical THM simulation program, TOUGH2Biot. A finite element method was employed to discretize space for the rock mechanical calculation, and the Mohr-Coulomb failure criterion was used to determine whether the rock undergoes shear-slip failure. Mechanics is partly coupled with the thermal-hydrodynamic processes and gives feedback to flow through stress-dependent porosity and permeability. TOUGH2Biot was verified against analytical solutions for 1D Terzaghi consolidation and cooling-induced subsidence. TOUGH2Biot was applied to evaluate the thermal, hydrodynamic, and mechanical responses of CO2 geological sequestration at the Ordos CCS Demonstration Project, China, and geothermal exploitation at the Geysers geothermal field, California. The results demonstrate that TOUGH2Biot is capable of analyzing changes in pressure and temperature, displacement, stress, and potential shear-slip failure caused by large-scale man-made underground activity in subsurface flow systems. TOUGH2Biot can also be easily extended to complex coupled-process problems in fractured media and be conveniently updated to parallel versions on different platforms to take advantage of high-performance computing.
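The 1D Terzaghi consolidation benchmark mentioned above has a classical series solution, which is what such verifications typically compare against. A sketch using our own variable names and conventions (drainage at z = 0, impervious base at z = H):

```python
import math

# Analytical 1D Terzaghi consolidation solution (drained top, impervious
# base), the standard benchmark for poroelastic simulators. Variable
# names and conventions are ours, not from the TOUGH2Biot paper.
def pore_pressure_ratio(z_over_H, Tv, n_terms=2000):
    """Normalized excess pore pressure u/u0 at dimensionless depth z/H
    and time factor Tv = c_v * t / H**2."""
    u = 0.0
    for m in range(n_terms):
        M = math.pi * (2 * m + 1) / 2.0
        u += (2.0 / M) * math.sin(M * z_over_H) * math.exp(-M * M * Tv)
    return u

early = pore_pressure_ratio(1.0, 0.001)  # base almost undissipated
late = pore_pressure_ratio(1.0, 1.0)     # base mostly dissipated
```

Comparing a simulator's pore-pressure profile against this series at several time factors Tv is the usual form of the check.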

  6. Goal-Oriented Intelligence in Optimization of Distributed Parameter Systems

    DTIC Science & Technology

    2004-08-01

Yarus and R.L. Chambers, editors, Stochastic Modeling and Geostatistics – Principles, Methods, and Case Studies, AAPG Computer Applications in Geology, No. 3, The American Association of Petroleum Geologists, Tulsa, OK, USA.

  7. San Mateo County Geographic Information Systems (GIS) project

    USGS Publications Warehouse

    Brabb, E.E.

    1986-01-01

Earthquakes and ground failures in the United States cause billions of dollars of damage each year, but techniques for predicting and reducing these hazardous geologic processes remain elusive. Geologists, geophysicists, hydrologists, engineers, cartographers, and computer specialists from the U.S. Geological Survey in Menlo Park, California, are working together on a project involving GIS techniques to determine how to predict the consequences of earthquakes and landslides, using San Mateo County as a subject area. Together with members of the Planning and Emergency Services Departments of San Mateo County and the Association of Bay Area Governments, they are also determining how to reduce the losses caused by these hazards.

  8. Stochastic simulation of spatially correlated geo-processes

    USGS Publications Warehouse

    Christakos, G.

    1987-01-01

In this study, developments in the theory of stochastic simulation are discussed. The unifying element is the notion of Radon projection in Euclidean spaces. This notion provides a natural way of reconstructing the real process from a corresponding process observable on a reduced-dimensionality space, where analysis is theoretically easier and computationally tractable. Within this framework, the concept of space transformation is defined and several of its properties, which are of significant importance within the context of spatially correlated processes, are explored. The turning bands operator is shown to follow from this. This strengthens considerably the theoretical background of the geostatistical method of simulation, and some new results are obtained in both the space and frequency domains. The inverse problem is solved generally and the applicability of the method is extended to anisotropic as well as integrated processes. Some ill-posed problems of the inverse operator are discussed. Effects of the measurement error and impulses at origin are examined. Important features of the simulated process as described by geomechanical laws, the morphology of the deposit, etc., may be incorporated in the analysis. The simulation may become a model-dependent procedure and this, in turn, may provide numerical solutions to spatial-temporal geologic models. Because the spatial simulation may be technically reduced to unidimensional simulations, various techniques of generating one-dimensional realizations are reviewed. To link theory and practice, an example is computed in detail. © 1987 International Association for Mathematical Geology.
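Since the spatial simulation reduces to one-dimensional simulations, a small example helps fix ideas. Below is a hedged Python sketch of one such one-dimensional technique, randomized spectral (cosine) superposition for a process with approximately exponential covariance; this is our illustration of the kind of 1D building block that turning-bands methods spread along lines in higher dimensions, not the paper's own formulation:

```python
import math, random

# Randomized spectral superposition: frequencies are drawn from the
# (Cauchy) spectral density of an exponential covariance with correlation
# length corr_len, so the ensemble covariance of the sum matches the
# target model. Parameter values are illustrative.
def simulate_1d(n, dx=1.0, corr_len=10.0, n_harmonics=400, seed=1):
    rng = random.Random(seed)
    omegas = [math.tan(math.pi * (rng.random() - 0.5)) / corr_len
              for _ in range(n_harmonics)]
    phases = [2.0 * math.pi * rng.random() for _ in range(n_harmonics)]
    amp = math.sqrt(2.0 / n_harmonics)  # makes ensemble variance 1
    return [amp * sum(math.cos(w * i * dx + p)
                      for w, p in zip(omegas, phases)) for i in range(n)]

field = simulate_1d(500)
```

With enough harmonics the realization is approximately Gaussian by the central limit theorem, and adjacent samples are strongly correlated when dx is much smaller than corr_len.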

  9. Dissolution on Saturn's Moon Titan: A 3D Karst Landscape Evolution Model

    NASA Astrophysics Data System (ADS)

    Cornet, Thomas; Fleurant, Cyril; Seignovert, Benoît; Cordier, Daniel; Bourgeois, Olivier; Le Mouélic, Stéphane; Rodriguez, Sebastien; Lucas, Antoine

    2017-04-01

Titan is an Earth-like world possessing a nitrogen-rich atmosphere that covers a surface with signs of lacustrine (lakes, seas, depressions), fluvial (channels, valleys) and aeolian (dunes) activity [1]. The chemistry implied in the geological processes is, however, strikingly different from that on Earth. Titan's extremely cold environment (T ≈ -180°C) allows water to exist only in the form of icy "bedrock". The presence of methane as the second major constituent of the atmosphere, as well as an active nitrogen-methane photochemistry, allows methane and ethane to drive a hydrocarbon cycle similar to the terrestrial hydrological cycle. A plethora of organic solids, more or less soluble in liquid hydrocarbons, is also produced in the atmosphere and can lead, by atmospheric sedimentation over geological timescales, to the formation of an organic sedimentary layer. Based on comparisons between Titan's landscapes seen in the Cassini spacecraft data and terrestrial analogues, karst-like dissolution and evaporitic crystallization have been suggested in various instances to take part in the landscape development on Titan. Dissolution has been invoked, for instance, for the development of the so-called "labyrinthic terrain", located at high latitudes and resembling terrestrial cockpit or polygonal karst terrain. In this work, we aim to test this hypothesis by comparing the natural landscapes visible in the Cassini/RADAR images of Titan's surface with those inferred from the use of a 3D Landscape Evolution Model (LEM) based on the Channel-Hillslope Integrated Landscape Development (CHILD) model [2], modified to include karstic dissolution as the major geological process [3]. Digital Elevation Models (DEMs) are generated from an initial quasi-planar surface for a set of dissolution rates, diffusion coefficients (solute transport), and sink densities of the mesh. The landscape evolves over millions of years.
Synthetic SAR images are generated from these DEMs in order to be compared with Titan's landforms seen in the Cassini SAR data. Inferences on the possible thickness and degree of maturation of the Titan karst will be discussed. [1] Lopes R.M.C. et al. (2010), Icarus; [2] Tucker et al. (2001), Computers & Geosciences; [3] Fleurant C. et al. (2008), Geomorph., Rel., Proc., Envir.
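A drastically simplified sketch of the dissolution-plus-diffusion mechanics such a karst landscape evolution model iterates, on a tiny grid in pure Python; the rates, the sink multiplier, and the grid are invented for illustration and bear no relation to CHILD's actual numerics:

```python
# Toy karst surface evolution: uniform dissolution lowers every cell,
# designated sinks dissolve faster, and explicit linear diffusion smooths
# slopes. All parameter values are illustrative placeholders.
def evolve_karst(elev, sinks, n_steps, k_diss=0.01, k_diff=0.1):
    ny, nx = len(elev), len(elev[0])
    for _ in range(n_steps):
        new = [row[:] for row in elev]
        for i in range(ny):
            for j in range(nx):
                rate = k_diss * (5.0 if (i, j) in sinks else 1.0)
                lap, cnt = 0.0, 0
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < ny and 0 <= jj < nx:
                        lap += elev[ii][jj] - elev[i][j]
                        cnt += 1
                new[i][j] = elev[i][j] - rate + k_diff * lap / cnt
        elev = new
    return elev

surface = evolve_karst([[100.0] * 5 for _ in range(5)], {(2, 2)}, 50)
```

Even this toy version shows the qualitative behavior invoked for cockpit-like terrain: closed depressions deepen at sinks while diffusion keeps the intervening slopes smooth.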

  10. Rupture Dynamics and Seismic Radiation on Rough Faults for Simulation-Based PSHA

    NASA Astrophysics Data System (ADS)

    Mai, P. M.; Galis, M.; Thingbaijam, K. K. S.; Vyas, J. C.; Dunham, E. M.

    2017-12-01

Simulation-based ground-motion predictions may augment PSHA studies in data-poor regions or provide additional shaking estimates, including seismic waveforms, for critical facilities. Validation and calibration of such simulation approaches, based on observations and GMPEs, is important for engineering applications, while seismologists push to include the precise physics of the earthquake rupture process and seismic wave propagation in the 3D heterogeneous Earth. Geological faults comprise both large-scale segmentation and small-scale roughness that determine the dynamics of the earthquake rupture process and its radiated seismic wavefield. We investigate how different parameterizations of fractal fault roughness affect the rupture evolution and resulting near-fault ground motions. Rupture incoherence induced by fault roughness generates realistic ω^-2 decay for high-frequency displacement amplitude spectra. Waveform characteristics and GMPE-based comparisons corroborate that these rough-fault rupture simulations generate realistic synthetic seismograms for subsequent engineering application. Since dynamic rupture simulations are computationally expensive, we develop kinematic approximations that emulate the observed dynamics. Simplifying the rough-fault geometry, we find that perturbations in local moment tensor orientation are important, while perturbations in local source location are not. Thus, a planar fault can be assumed if the local strike, dip, and rake are maintained. The dynamic rake angle variations are anti-correlated with the local dip angles. Based on a dynamically consistent Yoffe source-time function, we show that the seismic wavefield of the approximated kinematic rupture reproduces well the seismic radiation of the full dynamic source process. Our findings provide an innovative pseudo-dynamic source characterization that captures fault roughness effects on rupture dynamics.
Including the correlations between kinematic source parameters, we present a new pseudo-dynamic rupture modeling approach for computing broadband ground-motion time histories for simulation-based PSHA.
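Fractal (self-affine) fault roughness of the kind discussed is commonly synthesized as a superposition of harmonics whose amplitudes fall off as a power law in wavenumber. A hedged sketch; the Hurst exponent, mode count, and seed are illustrative choices, not parameters from the study:

```python
import math, random

# Self-affine 1D roughness profile: random-phase harmonics with amplitude
# k**-(H + 0.5), where H is the Hurst exponent. Parameter values are
# illustrative, not taken from the paper.
def rough_profile(n=512, hurst=0.8, n_modes=100, seed=7):
    rng = random.Random(seed)
    phases = [2.0 * math.pi * rng.random() for _ in range(n_modes)]
    return [sum(k ** -(hurst + 0.5)
                * math.cos(2.0 * math.pi * k * i / n + phases[k - 1])
                for k in range(1, n_modes + 1)) for i in range(n)]

profile = rough_profile()
```

Because amplitude decays with wavenumber, long wavelengths dominate the profile while the short-wavelength components supply the incoherence that produces the high-frequency radiation.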

  11. Image enhancements of Landsat 8 (OLI) and SAR data for preliminary landslide identification and mapping applied to the central region of Kenya

    NASA Astrophysics Data System (ADS)

    Mwaniki, M. W.; Kuria, D. N.; Boitt, M. K.; Ngigi, T. G.

    2017-04-01

Image enhancements lead to improved performance and increased accuracy of feature extraction, recognition, identification, classification and hence change detection. This increases the utility of remote sensing for environmental applications and aids disaster monitoring of geohazards involving large areas. The main aim of this study was to compare the effect of image enhancement applied to synthetic aperture radar (SAR) data and Landsat 8 imagery in landslide identification and mapping. The methodology involved pre-processing Landsat 8 imagery, image co-registration, and despeckling of the SAR data, after which the Landsat 8 imagery was enhanced by Principal and Independent Component Analysis (PCA and ICA), a spectral index involving bands 7 and 4, and a False Colour Composite (FCC) of the components bearing the most geologic information. The SAR data were processed using textural and edge filters and computation of SAR incoherence. The enhanced spatial, textural and edge information from the SAR data was incorporated into the spectral information from the Landsat 8 imagery during the knowledge-based classification. The methodology was tested in the central highlands of Kenya, characterized by rugged terrain and frequent rainfall-induced landslides. The results showed that the SAR data complemented the Landsat 8 data, whose spectral information was enriched by the FCC of geologically enhanced components. The SAR classification depicted landslides along the ridges and lineaments, important information lacking in the Landsat 8 image classification. The success of landslide identification and classification was attributed to the enhancement of geologic features by spectral, textural and roughness properties.
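For two bands, the principal-component enhancement used above has a closed form via the 2x2 covariance matrix. A sketch with synthetic band values (not Landsat 8 data):

```python
# Closed-form PCA for two image bands: eigenvalues of the 2x2 covariance
# matrix give the variance captured by PC1 and PC2. Band values below are
# synthetic, chosen so the bands are perfectly correlated.
def pca_2band(b1, b2):
    n = len(b1)
    m1, m2 = sum(b1) / n, sum(b2) / n
    a = sum((x - m1) ** 2 for x in b1) / n                     # var(b1)
    c = sum((y - m2) ** 2 for y in b2) / n                     # var(b2)
    b = sum((x - m1) * (y - m2) for x, y in zip(b1, b2)) / n   # cov
    disc = ((a - c) ** 2 + 4.0 * b * b) ** 0.5
    lam1 = (a + c + disc) / 2.0   # variance captured by PC1
    lam2 = (a + c - disc) / 2.0   # variance left for PC2
    return lam1, lam2

lam1, lam2 = pca_2band([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```

With real imagery the same eigendecomposition is applied to the full band covariance matrix; PC1 concentrates the shared signal, and later components isolate the decorrelated detail that enhancement exploits.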

  12. Semantics-informed geological maps: Conceptual modeling and knowledge encoding

    NASA Astrophysics Data System (ADS)

    Lombardo, Vincenzo; Piana, Fabrizio; Mimmo, Dario

    2018-07-01

This paper introduces a novel, semantics-informed geologic mapping process, whose application domain is the production of a synthetic geologic map of a large administrative region. A number of approaches concerning the expression of geologic knowledge through UML schemata and ontologies have been around for more than a decade. These approaches have yielded resources that concern specific domains, such as lithology. We develop a conceptual model that aims at building a digital encoding of several domains of geologic knowledge, in order to support the interoperability of the sources. We apply the devised terminological base to the classification of the elements of a geologic map of the Italian Western Alps and northern Apennines (Piemonte region). The digitally encoded knowledge base is a merged set of ontologies, called OntoGeonous. The encoding process identifies the objects of the semantic encoding, the geologic units; gathers the relevant information about such objects from authoritative resources, such as GeoSciML (giving priority to the application schemata reported in the INSPIRE Encoding Cookbook); and expresses the statements by means of axioms encoded in the Web Ontology Language (OWL). To support interoperability, OntoGeonous interlinks the general concepts by referring to the upper level of the SWEET ontology (developed by NASA), and imports knowledge that is already encoded in ontological format (e.g., the Simple Lithology ontology). Machine-readable knowledge allows for consistency checking and for classification of the geological map data through automatic reasoning algorithms.

  13. Semantic mediation in the national geologic map database (US)

    USGS Publications Warehouse

    Percy, D.; Richard, S.; Soller, D.

    2008-01-01

Controlled language is the primary challenge in merging heterogeneous databases of geologic information. Each agency or organization produces databases with different schemas and different terminology for describing the objects within. In order to make some progress toward merging these databases using current technology, we have developed software and a workflow that allow for the "manual semantic mediation" of these geologic map databases. Enthusiastic support from many state agencies (stakeholders and data stewards) has shown that the community supports this approach. Future implementations will move toward a more artificial-intelligence-based approach, using expert systems or knowledge bases to process data based on the training sets we have developed manually.
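In practice, "manual semantic mediation" amounts to curating lookup tables from each agency's local vocabulary to a shared controlled one. A minimal sketch; the agencies, local terms, and target vocabulary below are invented for illustration:

```python
# Hypothetical mediation table: (agency, local term) -> controlled term.
# All names are invented; a real table would be curated per database.
MEDIATION_TABLE = {
    ("state_a", "granite rock"): "granite",
    ("state_a", "qal"): "alluvium",
    ("state_b", "granitic"): "granite",
}

def mediate(agency, local_term, table=MEDIATION_TABLE):
    """Return the controlled-vocabulary term, or None if unmapped."""
    return table.get((agency, local_term.strip().lower()))
```

Unmapped terms returning None are exactly the cases routed back to data stewards, and the accumulated tables double as the manually developed training sets mentioned for the future knowledge-based approach.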

  14. Maps showing aeromagnetic survey and geologic interpretation of the Chignik and Sutwik Island quadrangles, Alaska

    USGS Publications Warehouse

    Case, J.E.; Cox, D.P.; Detra, D.E.; Detterman, R.L.; Wilson, Frederic H.

    1981-01-01

An aeromagnetic survey over part of the Chignik and Sutwik Island quadrangles, on the southern Alaska Peninsula, was flown in 1977 as part of the Alaska Mineral Resource Assessment Program (AMRAP). Maps at scales of 1:250,000 and 1:63,360 have been released on open file (U.S. Geological Survey, 1978a, 1978b). This report includes the aeromagnetic map superimposed on the topographic base (sheet 1) and an interpretation map superimposed on the topographic and simplified geologic base (sheet 2). This discussion provides an interpretation of the aeromagnetic data with respect to regional geology, the occurrence of ore deposits and prospects, and potential oil and gas resources. The survey was flown along northwest-southeast lines, spaced about 1.6 km apart, at a nominal elevation of about 300 m above the land surface. A proton-precession magnetometer was used for the survey, and the resulting digital data were computer-contoured at intervals of 10 and 50 gammas (sheet 1). The International Geomagnetic Reference Field (IGRF) of 1965, updated to 1977, was removed from the total-field data.

  15. Industry and Academic Consortium for Computer Based Subsurface Geology Laboratory

    NASA Astrophysics Data System (ADS)

    Brown, A. L.; Nunn, J. A.; Sears, S. O.

    2008-12-01

    Twenty-two licenses for Petrel software acquired through a grant from Schlumberger are being used to redesign the laboratory portion of Subsurface Geology at Louisiana State University. The course redesign is a cooperative effort between LSU's Geology and Geophysics and Petroleum Engineering Departments and Schlumberger's Technical Training Division. In spring 2008, two laboratory sections were taught with 22 students in each section. The class contained geology majors, petroleum engineering majors, and geology graduate students. Limited enrollments and 3-hour labs make it possible to incorporate hands-on visualization, animation, manipulation of data and images, and access to geological data available online. 24/7 access to the laboratory and step-by-step instructions for Petrel exercises strongly promoted peer instruction and individual learning. Goals of the course redesign include: enhancing visualization of earth materials; strengthening students' ability to acquire, manage, and interpret multifaceted geological information; fostering critical thinking and the scientific method; improving student communication skills; providing cross-training between geologists and engineers; and increasing the quantity, quality, and diversity of students pursuing Earth Science and Petroleum Engineering careers. IT resources available in the laboratory provide students with sophisticated visualization tools, allowing them to switch between 2-D and 3-D reconstructions more seamlessly and to manipulate larger integrated data sets, thus permitting more time for critical thinking and hypothesis testing. IT resources also enable faculty and students to work with the software simultaneously, visually interrogating a 3-D data set and immediately testing hypotheses formulated in class. Preliminary evaluation of class results indicates that students found the MS-Windows-based Petrel easy to learn.
By the end of the semester, students were able not only to map horizons and faults using seismic and well data but also to compute volumetrics. Exam results indicated that while students could complete sophisticated exercises using the software, their understanding of key concepts, such as conservation of volume in a palinspastic reconstruction or the association of structures with a particular stress regime, was limited. Future classes will incorporate more paper-and-pencil exercises to illustrate basic concepts. The equipment, software, and exercises developed will be used in additional upper-level undergraduate and graduate classes.

  16. MODFLOW-2000, the U.S. Geological Survey Modular Ground-Water Model--Documentation of the SEAWAT-2000 Version with the Variable-Density Flow Process (VDF) and the Integrated MT3DMS Transport Process (IMT)

    USGS Publications Warehouse

    Langevin, Christian D.; Shoemaker, W. Barclay; Guo, Weixing

    2003-01-01

    SEAWAT-2000 is the latest release of the SEAWAT computer program for simulation of three-dimensional, variable-density, transient ground-water flow in porous media. SEAWAT-2000 was designed by combining a modified version of MODFLOW-2000 and MT3DMS into a single computer program. The code was developed using the MODFLOW-2000 concept of a process, which is defined as "part of the code that solves a fundamental equation by a specified numerical method." SEAWAT-2000 contains all of the processes distributed with MODFLOW-2000 and also includes the Variable-Density Flow Process (as an alternative to the constant-density Ground-Water Flow Process) and the Integrated MT3DMS Transport Process. Processes may be active or inactive, depending on simulation objectives; however, not all processes are compatible. For example, the Sensitivity and Parameter Estimation Processes are not compatible with the Variable-Density Flow and Integrated MT3DMS Transport Processes. The SEAWAT-2000 computer code was tested with the common variable-density benchmark problems and also with problems representing evaporation from a salt lake and rotation of immiscible fluids.
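    A central idea behind variable-density flow codes of this kind is the equivalent freshwater head, which relates the head measured in water of arbitrary density to the head an equivalent column of freshwater would produce. A minimal sketch of that standard conversion (the function name and example values are illustrative and are not taken from the SEAWAT code):

    ```python
    def freshwater_head(h, z, rho, rho_f=1000.0):
        """Equivalent freshwater head h_f = (rho/rho_f)*h - ((rho-rho_f)/rho_f)*z
        for a point at elevation z [m] where water of density rho [kg/m^3]
        stands at head h [m]; rho_f is the freshwater reference density."""
        return (rho / rho_f) * h - ((rho - rho_f) / rho_f) * z

    # Seawater (rho ~ 1025 kg/m^3) standing at a 10 m head at datum (z = 0)
    # corresponds to an equivalent freshwater head of 10.25 m.
    hf = freshwater_head(10.0, 0.0, 1025.0)
    ```

    For freshwater itself (rho = rho_f) the conversion reduces to h_f = h, as expected.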

  17. Geological hazard monitoring system in Georgia

    NASA Astrophysics Data System (ADS)

    Gaprindashvili, George

    2017-04-01

    Georgia belongs to one of world's most complex mountainous regions according to the scale and frequency of Geological processes and damage caused to population, farmlands, and Infrastructure facilities. Geological hazards (landslide, debrisflow/mudflow, rockfall, erosion and etc.) are affecting many populated areas, agricultural fields, roads, oil and gas pipes, high-voltage electric power transmission towers, hydraulic structures, and tourist complexes. Landslides occur almost in all geomorphological zones, resulting in wide differentiation in the failure types and mechanisms and in the size-frequency distribution. In Georgia, geological hazards triggered by: 1. Activation of highly intense earthquakes; 2. Meteorological events provoking the disaster processes on the background of global climatic change; 3. Large-scale Human impact on the environment. The prediction and monitoring of Geological Hazards is a very wide theme, which involves different researchers from different spheres. Geological hazard monitoring is essential to prevent and mitigate these hazards. In past years in Georgia several monitoring system, such as Ground-based geodetic techniques, Debrisflow Early Warning System (EWS) were installed on high sensitive landslide and debrisflow areas. This work presents description of Geological hazard monitoring system in Georgia.

  18. Multi-phase classification by a least-squares support vector machine approach in tomography images of geological samples

    NASA Astrophysics Data System (ADS)

    Khan, Faisal; Enzmann, Frieder; Kersten, Michael

    2016-03-01

    Image processing of X-ray-computed polychromatic cone-beam micro-tomography (μXCT) data of geological samples mainly involves artefact reduction and phase segmentation. For the former, the main beam-hardening (BH) artefact is removed by applying a best-fit quadratic surface algorithm to a given image data set (reconstructed slice), which minimizes the BH offsets of the attenuation data points from that surface. A Matlab code for this approach is provided in the Appendix. The final BH-corrected image is extracted from the residual data, that is, from the difference between the surface elevation values and the original grey-scale values. For the segmentation, we propose a novel least-squares support vector machine (LS-SVM) approach, an algorithm for pixel-based multi-phase classification. A receiver operating characteristic (ROC) analysis was performed on BH-corrected and uncorrected samples to show that BH correction is in fact an important prerequisite for accurate multi-phase classification. The combination of the two approaches was then used to successfully classify three multi-phase rock core samples of varying complexity.
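    The best-fit quadratic surface step can be illustrated with a minimal NumPy sketch. The authors provide a Matlab code in their Appendix; the following is an independent illustration of the idea, not their implementation:

    ```python
    import numpy as np

    def quadratic_surface_correction(img):
        """Remove a beam-hardening trend from a reconstructed slice by
        least-squares fitting a quadratic surface
            a + b*x + c*y + d*x^2 + e*x*y + f*y^2
        to the grey values and returning the residual image."""
        ny, nx = img.shape
        y, x = np.mgrid[0:ny, 0:nx].astype(float)
        A = np.column_stack([np.ones(img.size), x.ravel(), y.ravel(),
                             (x**2).ravel(), (x*y).ravel(), (y**2).ravel()])
        coef, *_ = np.linalg.lstsq(A, img.ravel().astype(float), rcond=None)
        surface = (A @ coef).reshape(ny, nx)
        return img - surface

    # A purely quadratic synthetic "slice" should be flattened to ~zero:
    ny, nx = 32, 32
    yy, xx = np.mgrid[0:ny, 0:nx].astype(float)
    slice_img = 3.0 + 0.5*xx - 0.2*yy + 0.01*xx**2 + 0.005*xx*yy + 0.002*yy**2
    residual = quadratic_surface_correction(slice_img)
    ```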

  19. Gravity and magnetic anomaly modeling and correlation using the SPHERE program and Magsat data

    NASA Technical Reports Server (NTRS)

    Braile, L. W.; Hinze, W. J. (Principal Investigator); Vonfrese, R. R. B.

    1980-01-01

    The spherical Earth inversion, modeling, and contouring software were tested and modified for processing data in the Southern Hemisphere. Preliminary geologic/tectonic maps and selected cross sections for South and Central America and the Caribbean region are being compiled, as well as gravity and magnetic models for the major geological features of the area. A preliminary gravity model of the Andean Benioff zone was constructed so that the density columns east and west of the subducted plate are in approximate isostatic equilibrium. The magnetic anomaly for the corresponding magnetic model of the zone is being computed with the SPHERE program. A test tape containing global magnetic measurements was converted to a tape compatible with Purdue's CDC system. NOO data were screened for periods of high diurnal activity and reduced to anomaly form using the IGS-75 model. Magnetic intensity anomaly profiles were plotted on the conterminous U.S. map using the track lines as the anomaly base level. The transcontinental magnetic high seen in POGO and MAGSAT data is also represented in the NOO data.

  20. Tectonic evaluation of the Nubian shield of Northeastern Sudan using thematic mapper imagery

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Bechtel is nearing completion of a one-year program that uses digitally enhanced LANDSAT Thematic Mapper (TM) data to compile the first comprehensive regional tectonic map of the Proterozoic Nubian Shield exposed in the northern Red Sea Hills of northeastern Sudan. The status of the study's significant objectives is given here. Pertinent published and unpublished geologic literature and maps of the northern Red Sea Hills were reviewed to establish the geologic framework of the region. Thematic Mapper imagery was processed for optimal base-map enhancements. Photomosaics of enhanced images to serve as base maps for compilation of geologic information were completed. Interpretation of TM imagery to define and delineate structural and lithologic provinces was completed. Geologic information (petrologic and radiometric data) was compiled from the literature review onto base-map overlays. Evaluation of the tectonic evolution of the Nubian Shield based on the image interpretation and the compiled tectonic maps is continuing.

  1. Environmental flow allocation and statistics calculator

    USGS Publications Warehouse

    Konrad, Christopher P.

    2011-01-01

    The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file, or will generate a synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files and writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft® Visual Basic® for Applications and implemented as a macro in Microsoft® Office Excel 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.
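    EFASC's own algorithms live in its VBA source; the general shape of such daily-flow statistics (mean daily flow, the 7-day low-flow minimum, annual means) can be sketched in Python. Function and key names here are hypothetical, and the statistics shown are common conventions rather than EFASC's exact definitions:

    ```python
    import numpy as np

    def daily_flow_statistics(dates, flows):
        """Basic hydrologic statistics from a daily streamflow series.

        dates: sequence of (year, month, day) tuples; flows: daily values.
        Returns the mean daily flow, the minimum 7-day running mean
        (a classic low-flow statistic), and annual means keyed by year.
        """
        flows = np.asarray(flows, float)
        stats = {"mean": flows.mean()}
        if flows.size >= 7:
            # 7-day running means via convolution with a uniform kernel
            running = np.convolve(flows, np.ones(7) / 7.0, mode="valid")
            stats["min_7day"] = running.min()
        years = np.array([d[0] for d in dates])
        stats["annual_means"] = {int(y): flows[years == y].mean()
                                 for y in np.unique(years)}
        return stats

    # Ten hypothetical days of record in one year:
    dates = [(2000, 1, d) for d in range(1, 11)]
    stats = daily_flow_statistics(dates, list(range(1, 11)))
    ```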

  2. A nodal discontinuous Galerkin approach to 3-D viscoelastic wave propagation in complex geological media

    NASA Astrophysics Data System (ADS)

    Lambrecht, L.; Lamert, A.; Friederich, W.; Möller, T.; Boxberg, M. S.

    2018-03-01

    A nodal discontinuous Galerkin (NDG) approach is developed and implemented for the computation of viscoelastic wavefields in complex geological media. The NDG approach combines unstructured tetrahedral meshes with an element-wise, high-order spatial interpolation of the wavefield based on Lagrange polynomials. Numerical fluxes are computed from an exact solution of the heterogeneous Riemann problem. Our implementation offers capabilities for modelling viscoelastic wave propagation in 1-D, 2-D and 3-D settings of very different spatial scale with little logistical overhead. It allows the import of external tetrahedral meshes provided by independent meshing software and can be run in a parallel computing environment. Computation of adjoint wavefields and an interface for the computation of waveform sensitivity kernels are offered. The method is validated in 2-D and 3-D by comparison to analytical solutions and results from a spectral element method. The capabilities of the NDG method are demonstrated through a 3-D example case taken from tunnel seismics which considers high-frequency elastic wave propagation around a curved underground tunnel cutting through inclined and faulted sedimentary strata. The NDG method was coded into the open-source software package NEXD and is available from GitHub.
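    The element-wise Lagrange interpolation at the heart of a nodal DG scheme can be illustrated in one dimension. This is a generic sketch of nodal interpolation, not code from NEXD:

    ```python
    import numpy as np

    def lagrange_basis(nodes, j, x):
        """Value at x of the j-th Lagrange basis polynomial on the given
        nodes: the product of (x - x_m)/(x_j - x_m) over all m != j."""
        terms = [(x - nodes[m]) / (nodes[j] - nodes[m])
                 for m in range(len(nodes)) if m != j]
        return np.prod(terms, axis=0)

    def interpolate(nodes, values, x):
        """Nodal interpolation: sum of nodal values times basis polynomials.
        Exact for polynomials of degree < len(nodes)."""
        return sum(values[j] * lagrange_basis(nodes, j, x)
                   for j in range(len(nodes)))

    # Three nodes support degree-2 interpolation, so u(x) = x^2 is exact:
    nodes = np.array([-1.0, 0.0, 1.0])
    vals = nodes**2                      # sample u at the element's nodes
    u_half = interpolate(nodes, vals, 0.5)
    ```

    In an actual NDG solver the same construction is applied on tetrahedral elements with multidimensional node sets, but the degree-exactness property carries over.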

  3. Three Dimensional Simulation of the Baneberry Nuclear Event

    NASA Astrophysics Data System (ADS)

    Lomov, Ilya N.; Antoun, Tarabay H.; Wagoner, Jeff; Rambo, John T.

    2004-07-01

    Baneberry, a 10-kiloton nuclear event, was detonated at a depth of 278 m at the Nevada Test Site on December 18, 1970. Shortly after detonation, radioactive gases emanating from the cavity were released into the atmosphere through a shock-induced fissure near surface ground zero. Extensive geophysical investigations, coupled with a series of 1D and 2D computational studies were used to reconstruct the sequence of events that led to the catastrophic failure. However, the geological profile of the Baneberry site is complex and inherently three-dimensional, which meant that some geological features had to be simplified or ignored in the 2D simulations. This left open the possibility that features unaccounted for in the 2D simulations could have had an important influence on the eventual containment failure of the Baneberry event. This paper presents results from a high-fidelity 3D Baneberry simulation based on the most accurate geologic and geophysical data available. The results are compared with available data, and contrasted against the results of the previous 2D computational studies.

  4. ecological geological maps: GIS-based evaluation of the Geo-Ecological Quality Index (GEQUI) in Sicily (Central Mediterranean)

    NASA Astrophysics Data System (ADS)

    Nigro, Fabrizio; Arisco, Giuseppe; Perricone, Marcella; Renda, Pietro; Favara, Rocco

    2010-05-01

    The condition of landscapes, and of the ecological communities within them, is strongly related to levels of human activity. Consequently, determining status and trends in the pattern of human-dominated landscapes can be useful for understanding the overall condition of geo-ecological resources. Ecological geological maps are recent tools that provide useful information about abiotic and biotic features worldwide. These maps represent a new generation of geological maps and depict the surface conditions of lithospheric components, where ecological dynamics (functions and properties) and human activities develop. They are therefore also a fundamental policy tool for planning the management of human activities in relation to the territorial and environmental patterns of a given region. Different types of ecological geological maps can be developed: condition (situation) maps, zoning maps, prognosis maps, and recommendation maps. Ecological geological condition maps reflect the complex of parameters, or individual characteristics of the lithosphere, that characterize the potential influence of lithosphere components on the biota (man, fauna, flora, and ecosystems). Ecological geological zoning maps are the fundamental basis for prognosis estimation and nature-protection measures; on these maps, an estimation of the comfort and safety of human life and of ecosystem function is given. Ecological geological prognosis maps reflect spatial-temporal forecasts of changes in ecological geological conditions, both during the natural dynamics of the environment and, chiefly, during the economic development of a territory and its natural-technical systems. Finally, ecological geological recommendation maps are based on ecological geological and socio-economic information and aim at regulating a territory through the regulation of economic activities and the defense of the bio- and socio-sphere.
Each of these maps may be computed in either an analytic or a synthetic way. In the first, one or several indexes of ecological geological conditions are characterized, estimated, or forecast. In the second, the whole complex that defines the present or forecast ecological geological situation is reflected. For ecological geological zoning maps, the contemporary state of ecological geological conditions may be evaluated by grouping a range of parameters into classes of conditions and, on this basis, estimating the comfort and safety of human life and ecosystem function. More broadly, the concept of geoecological land evaluation has become established in landscape and environmental planning in recent years. It requires different thematic data sets, derived from analyses of natural, social, and amenity-environmental resources, that may be translated into environmental (vulnerability/quality) indexes. There have been attempts to develop integrated indices related to various aspects of the environment within the framework of sustainable development (e.g., by the United Nations Commission on Sustainable Development, the World Economic Forum, and the Advisory Board on Indicators of Sustainable Development of the International Institute for Sustainable Development, and through the Living Planet Index established by the World Wide Fund for Nature). Ecological geological maps thus represent the basic tool for geoecological land evaluation policies and may be computed as index maps. On this basis, a GIS application for assessing ecological geological zoning is presented for Sicily (Central Mediterranean). The Geo-Ecological Quality Index (GEQUI) map was computed from a set of variables.
Ten variables (lithology, climate, landslide distribution, erosion rate, soil type, land cover, habitat, groundwater pollution, road density, and building density), generated from available data, were used in the model, with a weighting value assigned to each informative layer. An overlay analysis was carried out, allowing the region to be classified into five classes: bad, poor, moderate, good, and high.
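    The weighted overlay itself can be sketched as follows. The layer names, weights, normalization, and equal-width five-class scheme are illustrative assumptions, since the paper's actual weights and class breaks are not given here:

    ```python
    import numpy as np

    def weighted_overlay(layers, weights):
        """Weighted sum of min-max-normalized raster layers, rescaled to [0, 1]."""
        total = np.zeros_like(np.asarray(next(iter(layers.values())), float))
        for name, w in weights.items():
            layer = np.asarray(layers[name], float)
            rng = layer.max() - layer.min()
            norm = (layer - layer.min()) / rng if rng else np.zeros_like(layer)
            total += w * norm
        return total / sum(weights.values())

    def classify(index, labels=("bad", "poor", "moderate", "good", "high")):
        """Slice a [0, 1] index into five equal-width quality classes."""
        bins = np.linspace(0, 1, len(labels) + 1)[1:-1]
        return np.vectorize(lambda v: labels[int(np.digitize(v, bins))])(index)

    # Two toy layers with equal weights over a 1x2 raster:
    layers = {"erosion": np.array([[0.0, 1.0]]),
              "land_cover": np.array([[1.0, 0.0]])}
    gequi = weighted_overlay(layers, {"erosion": 1.0, "land_cover": 1.0})
    ```

    In a real GIS workflow each layer would be a co-registered raster, but the per-pixel arithmetic is the same.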

  5. A working environment for digital planetary data processing and mapping using ISIS and GRASS GIS

    USGS Publications Warehouse

    Frigeri, A.; Hare, T.; Neteler, M.; Coradini, A.; Federico, C.; Orosei, R.

    2011-01-01

    Since the beginning of planetary exploration, mapping has been fundamental to summarizing observations returned by scientific missions. Sensor-based mapping has been used to highlight specific features of planetary surfaces by means of processing. Interpretative mapping makes use of instrumental observations to produce thematic maps that summarize observations of actual data into a specific theme. Geologic maps, for example, are thematic interpretative maps that focus on the representation of materials and processes and their relative timing. The advancements in technology of the last 30 years have allowed us to develop specialized systems where the mapping process can be carried out entirely in the digital domain. The spread of networked computers on a global scale allowed the rapid propagation of software and digital data, such that every researcher can now access digital mapping facilities on their desktop. The efforts to keep planetary mission data accessible to the scientific community have led to the creation of standardized digital archives that facilitate access to different data sets by software capable of processing these data from the raw level to the map-projected one. Geographic Information Systems (GIS) have been developed to optimize the storage, analysis, and retrieval of spatially referenced Earth-based environmental geodata; over the last decade these computer programs have become popular in the planetary science community, and recent mission data have started to be distributed in formats compatible with these systems. Among all the systems developed for the analysis of planetary and spatially referenced data, we have created a working environment combining two software suites that have similar characteristics in their modular design, their development history, their policy of distribution, and their support system.
The first, the Integrated Software for Imagers and Spectrometers (ISIS), developed by the United States Geological Survey, represents the state of the art for processing planetary remote sensing data, from the raw unprocessed state to the map-projected product. The second, the Geographic Resources Analysis Support System (GRASS), is a Geographic Information System developed by an international team of developers and one of the core projects promoted by the Open Source Geospatial Foundation (OSGeo). We have worked on enabling the combined use of these software systems through the set-up of a common user interface, the unification of the cartographic reference system nomenclature, and the minimization of data conversion. Both software packages are distributed with free open-source licenses, as are the source code, scripts, and configuration files presented hereafter. In this paper we describe our work to merge these working environments into a common one, where the user benefits from the functionalities of both systems without the need to switch or transfer data from one software suite to the other. Thereafter we provide an example of its usage in the handling of planetary data and the crafting of a digital geologic map. © 2010 Elsevier Ltd. All rights reserved.

  6. Tidal Simulations of an Incised-Valley Fluvial System with a Physics-Based Geologic Model

    NASA Astrophysics Data System (ADS)

    Ghayour, K.; Sun, T.

    2012-12-01

    Physics-based geologic modeling approaches use fluid flow in conjunction with sediment transport and deposition models to devise evolutionary geologic models that focus on underlying physical processes and attempt to resolve them at pertinent spatial and temporal scales. Physics-based models are particularly useful when the evolution of a depositional system is driven by the interplay of autogenic processes and their response to allogenic controls. This interplay can potentially create complex reservoir architectures with high-permeability sedimentary bodies bounded by a hierarchy of shales that can effectively impede flow in the subsurface. The complex stratigraphy of tide-influenced fluvial systems is an example of such co-existing and interacting environments of deposition. The focus of this talk is a novel formulation of boundary conditions for hydrodynamics-driven models of sedimentary systems. In tidal simulations, a time-accurate boundary treatment is essential for proper imposition of tidal forcing and fluvial inlet conditions, where the flow may reverse at times within a tidal cycle. As such, the boundary treatment at the inlet has to accommodate a smooth transition from inflow to outflow and vice versa without creating numerical artifacts. Our numerical experiments showed that boundary condition treatments based on a local (frozen) one-dimensional approach along the boundary normal, which does not account for the variation of flow quantities in the tangential direction, often lead to unsatisfactory results corrupted by numerical artifacts. In this talk, we propose a new boundary treatment that retains all spatial and temporal terms in the model and as such is capable of accounting for nonlinearities and sharp variations of model variables near boundaries. The proposed approach borrows heavily from the idea set forth by J. Sesterhenn [1] for the compressible Navier-Stokes equations.
The methodology is successfully applied to a tide-influenced incised-valley fluvial system, and the resulting stratigraphy is shown and discussed for different tide amplitudes. [1] Sesterhenn, J.: "A characteristic-type formulation of the Navier-Stokes equations for high-order upwind schemes", Computers & Fluids 30 (1) 37-67, 2001.

  7. Archive of Sediment Data Collected from Sandy Point to Belle Pass, Louisiana, 1983 through 2000 (Vibracore Surveys: 00SCC, CR83, P86, and USACE Borehole Cores)

    USGS Publications Warehouse

    Dreher, Chandra A.; Flocks, James G.; Ferina, Nick F.; Kulp, Mark A.

    2008-01-01

    This CD-ROM publication was prepared by an agency of the U.S. Government. Although these data have been processed successfully on a computer system at the U.S. Geological Survey, no warranty expressed or implied is made regarding the display or utility of data on any other system, or for general or scientific purposes, nor shall the act of distribution imply any such warranty. The U.S. Geological Survey shall not be held liable for improper or incorrect use of the data described and (or) contained herein. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not constitute or imply its endorsement, recommendation, or favoring by the U.S. Government nor any agency thereof.

  8. Archive of side scan sonar and bathymetry data collected during USGS Cruise 06FSH01 offshore of Siesta Key, Florida, May 2006

    USGS Publications Warehouse

    Ciembronowicz, Katherine T.; Flocks, James G.; Wiese, Dana S.; DeWitt, Nancy T.; Ferina, Nick F.; Robbins, Lisa L.; Harrison, Arnell S.

    2007-01-01

    This DVD publication was prepared by an agency of the United States Government. Although these data have been processed successfully on a computer system at the U.S. Geological Survey, no warranty expressed or implied is made regarding the display or utility of the data on any other system, or for general or scientific purposes, nor shall the act of distribution imply any such warranty. The U.S. Geological Survey shall not be held liable for improper or incorrect use of the data described and (or) contained herein. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof.

  9. Archive of digital Chirp subbottom profile data collected during USGS cruises 10CCT01, 10CCT02, and 10CCT03, Mississippi and Alabama Gulf Islands, March and April 2010

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; DeWitt, Nancy T.; Pfeiffer, William R.; Kelso, Kyle W.; Thompson, Phillip R.

    2011-01-01

    This Digital Versatile Disc (DVD) publication was prepared by an agency of the United States Government. Although these data have been processed successfully on a computer system at the U.S. Geological Survey, no warranty expressed or implied is made regarding the display or utility of the data on any other system, nor shall the act of distribution imply any such warranty. The U.S. Geological Survey shall not be held liable for improper or incorrect use of the data described and (or) contained herein. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof.

  10. Archive of digital boomer seismic reflection data collected offshore east-central Florida during USGS cruises 96FGS01 and 97FGS01 in November of 1996 and May of 1997

    USGS Publications Warehouse

    Subino, Janice A.; Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Calderon, Karynna

    2012-01-01

    This Digital Versatile Disc (DVD) publication was prepared by an agency of the United States Government. Although these data have been processed successfully on a computer system at the U.S. Geological Survey, no warranty expressed or implied is made regarding the display or utility of the data on any other system, nor shall the act of distribution imply any such warranty. The U.S. Geological Survey shall not be held liable for improper or incorrect use of the data described and (or) contained herein. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof.

  11. Archive of Digital Chirp Sub-bottom profile data collected during USGS cruise 09CCT01 offshore of Sabine Pass and Galveston, Texas, March 2009

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Dellapenna, Timothy M.; Sanford, Jordan M.; Wiese, Dana S.

    2010-01-01

    This Digital Versatile Disc (DVD) publication was prepared by an agency of the United States Government. Although these data have been processed successfully on a computer system at the U.S. Geological Survey, no warranty expressed or implied is made regarding the display or utility of the data on any other system, nor shall the act of distribution imply any such warranty. The U.S. Geological Survey shall not be held liable for improper or incorrect use of the data described and (or) contained herein. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof.

  12. Geologic Map and Digital Data Base of the Almo Quadrangle and City of Rocks National Reserve, Cassia County, Idaho

    USGS Publications Warehouse

    Miller, David M.; Armstrong, Richard L.; Bedford, David R.; Davis, Marsha

    2008-01-01

    This geologic map describes the geology of the City of Rocks National Reserve and environs, located in the Albion Mountains of south-central Idaho. The most prominent geologic features of the Reserve are the spectacular rock spires that attracted visitors, beginning with commentary in the journals of travelers to California during the Gold Rush of 1849. The tectonic history is outlined, and descriptions of landscape processes, a newly discovered Quaternary fault, and features of the pinnacles are presented.

  13. A proposed new framework for valorization of geoheritage in Norway

    NASA Astrophysics Data System (ADS)

    Dahl, Rolv; Bergengren, Anna; Heldal, Tom

    2015-04-01

    The geological history of Norway is a complex one, and the exploitation of geological resources of different kinds has always provided the backbone of the Norwegian community. Nevertheless, the geology and the geological processes that created the landscape are little appreciated compared to biodiversity and cultural heritage. Some geological localities play an important role in our perception and scientific understanding of the landscape. Others are, or could be, important tourist destinations; others can be important for geoscience education at all levels; and still others play a major role in the understanding of geodiversity and geoheritage and should be protected as natural monuments. A database built on old registrations has been compiled, and a web mapping server based on old and new registrations was recently launched. However, no systematic classification and identification of important sites has been done for the last thirty years. We are now calling for a crowdsourcing process in the geological community to validate and valorize the registrations, as well as to define new points and areas of interest. Furthermore, we are developing a valorization system for these localities. The framework for this system is based on studies of inventories in other countries, as well as suggestions from ProGeo. The aim is to raise awareness of important sites and of how they are treated and utilized for scientific or educational purposes, as tourist destinations, or as heritage sites. Our presentation will focus on the development of the framework and its implications.

  14. Cognitive factors affecting student understanding of geologic time

    NASA Astrophysics Data System (ADS)

    Dodick, Jeff; Orion, Nir

    2003-04-01

    A critical element of the earth sciences is reconstructing geological structures and systems that have developed over time. A survey of the science education literature shows that there has been little attention given to this concept. In this study, we present a model, based on Montagnero's (1996) model of diachronic thinking, which describes how students reconstruct geological transformations over time. For geology, three schemes of diachronic thinking are relevant: 1. Transformation, which is a principle of change; in geology it is understood through actualistic thinking (the idea that present processes can be used to model the past). 2. Temporal organization, which defines the sequential order of a transformation; in geology it is based on the three-dimensional relationship among strata. 3. Interstage linkage, which is the connections between successive stages of a transformation; in geology it is based on both actualism and causal reasoning. Three specialized instruments were designed to determine the factors which influence reconstructive thinking: (a) the GeoTAT, which tests diachronic thinking skills, (b) the TST, which tests the relationship between spatial thinking and temporal thinking, and (c) the SFT, which tests the influence of dimensional factors on temporal awareness. Based on the model constructed in this study, we define the critical factors influencing reconstructive thinking: (a) the transformation scheme, which influences the other diachronic schemes, (b) knowledge of geological processes, and (c) extracognitive factors. Among the students tested, there was a significant difference between Grade 9-12 students and Grade 7-8 students in their ability to reconstruct geological phenomena using diachronic thinking. This suggests that somewhere between Grades 7 and 8 it is possible to start teaching some of the logical principles used in geology to reconstruct geological structures.

  15. Geologic processes influence the effects of mining on aquatic ecosystems

    USGS Publications Warehouse

    Schmidt, Travis S.; Clements, William H.; Wanty, Richard B.; Verplanck, Philip L.; Church, Stan E.; San Juan, Carma A.; Fey, David L.; Rockwell, Barnaby W.; DeWitt, Ed H.; Klein, Terry L.

    2012-01-01

    Geologic processes strongly influence water and sediment quality in aquatic ecosystems, but geologic principles are rarely incorporated into routine biomonitoring studies. We tested whether elevated concentrations of metals in water and sediment are restricted to streams downstream of mines or of areas that may discharge mine wastes. We surveyed 198 catchments, classified as “historically mined” or “unmined” on the basis of mineral-deposit criteria, to determine whether water and sediment quality were influenced by naturally occurring mineralized rock, by historical mining, or by a combination of both. By accounting for different geologic sources of metals to the environment, we were able to distinguish aquatic ecosystems limited by metals derived from natural processes from those limited by mining. Elevated concentrations of metals in water and sediment were not restricted to mined catchments; depauperate aquatic communities were also found in unmined catchments. The type and intensity of hydrothermal alteration and the mineral deposit type were important determinants of water and sediment quality, as well as of the aquatic community, in both mined and unmined catchments. This study distinguished the effects of different rock types and geologic sources of metals on ecosystems by incorporating basic geologic processes into reference and baseline site selection, resulting in a refined assessment. Our results indicate that biomonitoring studies should account for natural sources of metals in some geologic environments as contributors to the effect of mines on aquatic ecosystems, recognizing that in mining-impacted drainages there may have been high pre-mining background metal concentrations.

  16. The roles of humans and robots as field geologists on the Moon

    NASA Technical Reports Server (NTRS)

    Spudis, Paul D.; Taylor, G. Jeffrey

    1992-01-01

    The geologic exploration of the Moon will be one of the primary scientific functions of any lunar base program. Geologic reconnaissance, the broad-scale characterization of processes and regions, is an ongoing effort that has already started and will continue after base establishment. Such reconnaissance is best done by remote sensing from lunar orbit and simple, automated, sample return missions of the Soviet Luna class. Field study, in contrast, requires intensive work capabilities and the guiding influence of human intelligence. We suggest that the most effective way to accomplish the goals of geologic field study on the Moon is through the use of teleoperated robots, under the direct control of a human geologist who remains at the lunar base, or possibly on Earth. These robots would have a global traverse range, could possess sensory abilities optimized for geologic field work, and would accomplish surface exploration goals without the safety and life-support concerns attendant on the use of human geologists on the Moon. By developing the capability to explore any point on the Moon immediately after base establishment, the use of such teleoperated, robotic field geologists makes the single-site lunar base into a 'global' base from the viewpoint of geologic exploration.

  17. Advances in Geologic Disposal System Modeling and Application to Crystalline Rock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mariner, Paul E.; Stein, Emily R.; Frederick, Jennifer M.

    The Used Fuel Disposition Campaign (UFDC) of the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE), Office of Fuel Cycle Technology (OFCT) is conducting research and development (R&D) on geologic disposal of used nuclear fuel (UNF) and high-level nuclear waste (HLW). Two of the high priorities for UFDC disposal R&D are design concept development and disposal system modeling (DOE 2011). These priorities are directly addressed in the UFDC Generic Disposal Systems Analysis (GDSA) work package, which is charged with developing a disposal system modeling and analysis capability for evaluating disposal system performance for nuclear waste in geologic media (e.g., salt, granite, clay, and deep borehole disposal). This report describes specific GDSA activities in fiscal year 2016 (FY 2016) toward the development of the enhanced disposal system modeling and analysis capability for geologic disposal of nuclear waste. The GDSA framework employs the PFLOTRAN thermal-hydrologic-chemical multi-physics code and the Dakota uncertainty sampling and propagation code. Each code is designed for massively parallel processing in a high-performance computing (HPC) environment. Multi-physics representations in PFLOTRAN are used to simulate various coupled processes including heat flow, fluid flow, waste dissolution, radionuclide release, radionuclide decay and ingrowth, precipitation and dissolution of secondary phases, and radionuclide transport through engineered barriers and natural geologic barriers to the biosphere. Dakota is used to generate sets of representative realizations and to analyze parameter sensitivity.

  18. Probabilistic assessment methodology for continuous-type petroleum accumulations

    USGS Publications Warehouse

    Crovelli, R.A.

    2003-01-01

    The analytic resource assessment method, called ACCESS (Analytic Cell-based Continuous Energy Spreadsheet System), was developed to calculate estimates of petroleum resources for the geologic assessment model, called FORSPAN, in continuous-type petroleum accumulations. The ACCESS method is based upon mathematical equations derived from probability theory in the form of a computer spreadsheet system. © 2003 Elsevier B.V. All rights reserved.
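
    The abstract does not reproduce the ACCESS equations themselves, but the analytic (non-Monte-Carlo) flavor of cell-based aggregation can be sketched as follows: if per-cell resource estimates are treated as independent random variables, the mean and variance of the total follow directly from probability theory. The function name and sample numbers below are illustrative assumptions, not the actual ACCESS/FORSPAN formulas.

```python
# Simplified illustration of analytic aggregation of per-cell resource
# estimates; NOT the actual ACCESS/FORSPAN equations.
def aggregate_cells(cell_means, cell_variances):
    """Aggregate independent per-cell estimates analytically.

    For independent cells, the mean and variance of the total resource
    are simply the sums of the per-cell means and variances -- no
    Monte Carlo sampling is required.
    """
    total_mean = sum(cell_means)
    total_variance = sum(cell_variances)
    return total_mean, total_variance

# Hypothetical per-cell means (e.g., BCFG) and variances:
mean, var = aggregate_cells([2.0, 3.0, 5.0], [0.5, 0.5, 1.0])
```

    This closed-form aggregation is what makes a spreadsheet implementation feasible: each cell contributes two numbers, and the totals update instantly when any cell changes.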

  19. A connectivity-based modeling approach for representing hysteresis in macroscopic two-phase flow properties

    DOE PAGES

    Cihan, Abdullah; Birkholzer, Jens; Trevisan, Luca; ...

    2014-12-31

    During CO2 injection and storage in deep reservoirs, the injected CO2 enters an initially brine-saturated porous medium; after injection stops, natural groundwater flow eventually displaces the injected mobile-phase CO2, leaving behind residual non-wetting fluid. Accurate modeling of two-phase flow processes is needed for predicting the fate and transport of injected CO2, evaluating environmental risks and designing more effective storage schemes. The entrapped non-wetting fluid saturation is typically a function of the spatially varying maximum saturation at the end of injection. At the pore scale, the distribution of void sizes and the connectivity of void space play a major role in the macroscopic hysteresis behavior and capillary entrapment of wetting and non-wetting fluids. This paper presents the development of an approach based on the connectivity of void space for modeling hysteretic capillary pressure-saturation-relative permeability relationships. The new approach uses the void-size distribution and a measure of void-space connectivity to compute the hysteretic constitutive functions and to predict entrapped fluid-phase saturations. Two functions, the drainage connectivity function and the wetting connectivity function, are introduced to characterize the connectivity of fluids in void space during drainage and wetting processes. These functions can be estimated through pore-scale simulations in computer-generated porous media or from traditional experimental measurements of primary drainage and main wetting curves. The hysteresis model for saturation-capillary pressure is tested successfully by comparing the model-predicted residual saturation and scanning curves with data sets from column experiments found in the literature.
A numerical two-phase model simulator with the new hysteresis functions is tested against laboratory experiments conducted in a quasi-two-dimensional flow cell (91.4 cm × 5.6 cm × 61 cm), packed with homogeneous and heterogeneous sands. Initial results show that the model can predict the spatial and temporal distribution of injected fluid during the experiments reasonably well. However, further analyses are needed to comprehensively test the ability of the model to predict transient two-phase flow processes and capillary entrapment in geological reservoirs during geological carbon sequestration.

  20. Customized Geological Map Patterns for the Macintosh Computer.

    ERIC Educational Resources Information Center

    Boyer, Paul Slayton

    1986-01-01

    Describes how the graphics capabilities of the Apple Macintosh computer can be used in geological teaching by customizing fill patterns with lithologic symbols. Presents two methods for doing this: creating a dummy document, or by changing the pattern resource resident in the operating system. Special symbols can also replace fonts. (TW)

  1. Analysis of geologic terrain models for determination of optimum SAR sensor configuration and optimum information extraction for exploration of global non-renewable resources. Pilot study: Arkansas Remote Sensing Laboratory, part 1, part 2, and part 3

    NASA Technical Reports Server (NTRS)

    Kaupp, V. H.; Macdonald, H. C.; Waite, W. P.; Stiles, J. A.; Frost, F. S.; Shanmugam, K. S.; Smith, S. A.; Narayanan, V.; Holtzman, J. C. (Principal Investigator)

    1982-01-01

    Computer-generated radar simulations and mathematical geologic terrain models were used to establish the optimum radar sensor operating parameters for geologic research. An initial set of mathematical geologic terrain models was created for three basic landforms and families of simulated radar images were prepared from these models for numerous interacting sensor, platform, and terrain variables. The tradeoffs between the various sensor parameters and the quantity and quality of the extractable geologic data were investigated as well as the development of automated techniques of digital SAR image analysis. Initial work on a texture analysis of SEASAT SAR imagery is reported. Computer-generated radar simulations are shown for combinations of two geologic models and three SAR angles of incidence.

  2. MICHIGAN SOIL VAPOR EXTRACTION REMEDIATION (MISER) MODEL: A COMPUTER PROGRAM TO MODEL SOIL VAPOR EXTRACTION AND BIOVENTING OF ORGANIC MATERIALS IN UNSATURATED GEOLOGICAL MATERIAL (EPA/600/SR-97/099)

    EPA Science Inventory

    Soil vapor extraction (SVE) and bioventing (BV) are proven strategies for remediation of unsaturated zone soils. Mathematical models are powerful tools that can be used to integrate and quantify the interaction of physical, chemical, and biological processes occurring in field sc...

  3. Geological terrain models

    NASA Technical Reports Server (NTRS)

    Kaupp, V. H.; Macdonald, H. C.; Waite, W. P.

    1981-01-01

    The initial phase of a program to determine the best interpretation strategy and sensor configuration for a radar remote sensing system for geologic applications is discussed. In this phase, terrain modeling and radar image simulation were used to perform parametric sensitivity studies. A relatively simple computer-generated terrain model is presented, and the data base, backscatter file, and transfer function for digital image simulation are described. Sets of images are presented that simulate the results obtained with an X-band radar from an altitude of 800 km and at three different terrain-illumination angles. The simulations include power maps, slant-range images, ground-range images, and ground-range images with statistical noise incorporated. It is concluded that digital image simulation and computer modeling provide cost-effective methods for evaluating terrain variations and sensor parameter changes, for predicting results, and for defining optimum sensor parameters.

  4. Semi-Infinite Geology Modeling Algorithm (SIGMA): a Modular Approach to 3D Gravity

    NASA Astrophysics Data System (ADS)

    Chang, J. C.; Crain, K.

    2015-12-01

    Conventional 3D gravity computations can take days, weeks, or even months, depending on the size and resolution of the data being modeled. Additional modeling runs, due to technical malfunctions or data modifications, compound computation times even further. We propose a new modeling algorithm that utilizes vertical line elements to approximate mass, and non-gridded (point) gravity observations. This algorithm is (1) orders of magnitude faster than conventional methods, (2) accurate to within 0.1% error, and (3) modular. The modularity of this methodology means that researchers can modify their geology/terrain or gravity data, and only the modified component needs to be re-run. Additionally, land-, sea-, and air-based platforms can be modeled at their observation points, without having to filter data into a synthesized grid.
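
    The closed-form attraction of a buried vertical line element is standard potential theory, so the building block such an algorithm rests on can be sketched as follows. This is an independent illustration, not SIGMA itself; the function name and demo values are assumptions.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def vline_gz(lam, r, z1, z2):
    """Vertical gravity (m/s^2) at a surface observation point due to a
    vertical line element of linear density lam (kg/m), buried between
    depths z1 and z2 (z2 > z1 >= 0), at horizontal distance r (m).

    Integrating G*lam*z/(r^2+z^2)^(3/2) dz from z1 to z2 gives the
    closed form: g_z = G*lam*(1/sqrt(r^2+z1^2) - 1/sqrt(r^2+z2^2)).
    Singular only when r = z1 = 0 (observation on top of the element).
    """
    return G * lam * (1.0 / math.hypot(r, z1) - 1.0 / math.hypot(r, z2))

# Demo: the same element seen from 100 m vs 200 m away (values assumed).
g_near = vline_gz(1e6, 100.0, 10.0, 1000.0)
g_far = vline_gz(1e6, 200.0, 10.0, 1000.0)
```

    Because each element's contribution is an independent closed-form term, a changed block of geology only requires re-summing the affected elements, which is the essence of the modularity claimed above.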

  5. Geologic application of thermal inertia imaging using HCMM data. [Walker Lane, Nevada; San Rafael, Utah; and Death Valley and Pisgah Crater, Lavic Lake Region, California

    NASA Technical Reports Server (NTRS)

    Kahle, A. B.; Schieldge, J. P.; Abrams, M. J.; Alley, R. E.; Levine, C. J. (Principal Investigator)

    1981-01-01

    Three test sites in the western US were selected to discriminate among surface geologic materials on the basis of their thermal properties as determined from HCMM data. Attempts to determine quantitatively accurate thermal inertia values from HCMM digital data met with only partial success due to the effects of sensor miscalibrations, radiative transfer in the atmosphere, and varying meteorology and elevation across a scene. In most instances, apparent thermal inertia was found to be an excellent qualitative representation of true thermal inertia. Computer processing of digital day and night HCMM data allowed construction of geologically useful images. At some test sites, more information was provided by the HCMM data than by LANDSAT data. Soil moisture effects and differences in spectrally dark materials were more effectively displayed using the thermal data.

  6. Tabular data and graphical images in support of the U.S. Geological Survey National Oil and Gas Assessment--San Juan Basin Province (5022): Chapter 7 in Total petroleum systems and geologic assessment of undiscovered oil and gas resources in the San Juan Basin Province, exclusive of Paleozoic rocks, New Mexico and Colorado

    USGS Publications Warehouse

    Klett, T.R.; Le, P.A.

    2013-01-01

    This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report, along with archival data that permit the user to perform further analyses, are available elsewhere on this CD-ROM. The data may be imported into computers and software without transcription from the Portable Document Format (.pdf) files of the text. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in raw form as tab-delimited text files (.tab files).

  7. Geoillustrator - fast sketching of geological illustrations and animations

    NASA Astrophysics Data System (ADS)

    Patel, Daniel; Langeland, Tor; Solteszova, Veronika

    2014-05-01

    We present our research results from the Geoillustrator project. The project has been running for four years and ends in March. It was aimed at developing a rapid sketching tool for generating geological illustrations and animations for understanding the processes that have led to a current subsurface configuration. The sketching tool facilitates effective dissemination of ideas, e.g. through generation of interactive geo-scientific illustrations for interdisciplinary communication and communication to decision makers, media and lay persons. This can improve work processes in early phases of oil and gas exploration, where critical decisions have to be taken based on limited information. It is a challenge for the specialists involved in early exploration phases to externalize their ideas and effectively achieve consensus in multidisciplinary working groups. In these work processes, a tool for rapid sketching of geology would be very useful for expressing geological hypotheses and for creating and comparing different evolution scenarios. Often, decisions are influenced by factors that are not relevant; e.g., geologists who produce the most polished illustrations of their hypotheses have a higher probability of getting their theories through to decision makers, as they are more clearly communicated. This results in a competitive advantage for geologists who are skilled in creating illustrations. A tool that lifted the ability of all geologists to express their ideas to an equal level would result in more alternatives and a better foundation for decision making. Digital sketching will also allow capturing otherwise lost material, which can constitute a large amount of mental work and ideas. The results of sketching are currently scrapped as paper, erased from the blackboard, or exist only as rough personal sketches. By using a digital sketching tool, the sketches can be exported to a form usable in the modelling tools used in later phases of exploration.
Currently, no digital tool exists supporting the above mentioned requirements. However, in the Geoillustrator project, relevant visualization and sketching methods have been researched, and prototypes have been developed which demonstrate a set of the mentioned functionalities. Our published results in the project which we will present can be found on our website http://www.cmr.no/cmr_computing/index.cfm?id=313109

  8. Animation-Based Learning in Geology: Impact of Animations Coupled with Seductive Details

    ERIC Educational Resources Information Center

    Clayton, Rodney L.

    2016-01-01

    Research is not clear on how to address the difficulty students have conceptualizing geologic processes and phenomena. This study investigated how animations coupled with seductive details affect learners' situational interest and emotions. A quantitative quasi-experimental study employing an independent-measures factorial design was used. The…

  9. A Web-based Visualization System for Three Dimensional Geological Model using Open GIS

    NASA Astrophysics Data System (ADS)

    Nemoto, T.; Masumoto, S.; Nonogaki, S.

    2017-12-01

    A three-dimensional geological model is important information in various fields such as environmental assessment, urban planning, resource development, waste management and disaster mitigation. In this study, we have developed a web-based visualization system for 3D geological models using free and open-source software. The system has been successfully implemented by integrating the web mapping engine MapServer and the geographic information system GRASS. MapServer plays the role of mapping horizontal cross sections of the 3D geological model and a topographic map. GRASS provides the core components for management, analysis and image processing of the geological model. Online access to GRASS functions has been enabled using PyWPS, an implementation of the Open Geospatial Consortium (OGC) WPS (Web Processing Service) standard. The system has two main functions. The two-dimensional visualization function allows users to generate horizontal and vertical cross sections of the 3D geological model. These images are delivered via the WMS (Web Map Service) and WPS OGC standards. Horizontal cross sections are overlaid on the topographic map. A vertical cross section is generated by clicking a start point and an end point on the map. The three-dimensional visualization function allows users to visualize geological boundary surfaces and a panel diagram. The user can visualize them from various angles by mouse operation. WebGL is utilized for 3D visualization. WebGL is a web technology that brings hardware-accelerated 3D graphics to the browser without installing additional software. The geological boundary surfaces can be downloaded to incorporate the geologic structure into CAD designs and models for various simulations. This study was supported by JSPS KAKENHI Grant Number JP16K00158.
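
    As an illustration of how a client might request one horizontal cross-section image from such a WMS-based system, here is a hedged sketch that builds a standard WMS 1.1.1 GetMap URL. The endpoint and layer name are hypothetical, not taken from the abstract.

```python
from urllib.parse import urlencode

# Hypothetical MapServer endpoint; the real service URL is not given
# in the record.
WMS_BASE = "https://example.org/cgi-bin/mapserv"

def getmap_url(layer, bbox, width=512, height=512):
    """Build a WMS 1.1.1 GetMap request for one horizontal cross-section
    layer of a 3-D geological model.

    bbox is (min_lon, min_lat, max_lon, max_lat) in EPSG:4326;
    TRANSPARENT=TRUE lets the section overlay a topographic base map.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
        "TRANSPARENT": "TRUE",
    }
    return WMS_BASE + "?" + urlencode(params)

# Hypothetical layer representing the model sliced at one elevation:
url = getmap_url("geology_z100", (139.6, 35.4, 139.9, 35.7))
```

    A browser client (or OpenLayers/Leaflet layer) would fetch this URL for each slice and stack or swap the returned PNGs over the base map.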

  10. Automatic mapping of the base of aquifer — A case study from Morrill, Nebraska

    USGS Publications Warehouse

    Gulbrandsen, Mats Lundh; Ball, Lyndsay B.; Minsley, Burke J.; Hansen, Thomas Mejer

    2017-01-01

    When a geologist sets up a geologic model, various types of disparate information may be available, such as exposures, boreholes, and (or) geophysical data. In recent years, the amount of geophysical data available has been increasing, a trend that is only expected to continue. It is nontrivial (and often, in practice, impossible) for the geologist to take all the details of the geophysical data into account when setting up a geologic model. We have developed an approach that allows for the objective quantification of information from geophysical data and borehole observations in a way that is easy to integrate in the geologic modeling process. This will allow the geologist to make a geologic interpretation that is consistent with the geophysical information at hand. We have determined that automated interpretation of geologic layer boundaries using information from boreholes and geophysical data alone can provide a good geologic layer model, even before manual interpretation has begun. The workflow is implemented on a set of boreholes and airborne electromagnetic (AEM) data from Morrill, Nebraska. From the borehole logs, information about the depth to the base of aquifer (BOA) is extracted and used together with the AEM data to map a surface that represents this geologic contact. Finally, a comparison between our automated approach and a previous manual mapping of the BOA in the region validates the quality of the proposed method and suggests that this workflow will allow a much faster and objective geologic modeling process that is consistent with the available data.
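
    The abstract does not spell out the automated workflow, but its core step of turning scattered borehole/AEM picks into a boundary surface can be illustrated with a simple inverse-distance-weighted interpolation. This is a stand-in for whatever estimation the authors actually use; the function and demo values are ours.

```python
import numpy as np

def idw_surface(xy_obs, z_obs, xy_query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate of a layer-boundary elevation
    (e.g., base of aquifer) at query points from scattered picks.

    xy_obs:   (n, 2) pick locations from boreholes or AEM soundings
    z_obs:    (n,)   boundary elevations at those locations
    xy_query: (m, 2) locations where the surface is wanted
    """
    xy_obs = np.asarray(xy_obs, float)
    z_obs = np.asarray(z_obs, float)
    out = []
    for q in np.atleast_2d(np.asarray(xy_query, float)):
        d = np.hypot(*(xy_obs - q).T)          # distances to all picks
        if d.min() < eps:                      # query sits on an observation
            out.append(float(z_obs[d.argmin()]))
            continue
        w = 1.0 / d**power                     # closer picks weigh more
        out.append(float(w @ z_obs / w.sum()))
    return np.array(out)

# Demo: two picks at 10 and 20 units; midpoint should average to 15.
boa = idw_surface([(0.0, 0.0), (1.0, 0.0)],
                  [10.0, 20.0],
                  [(0.5, 0.0), (0.0, 0.0)])
```

    In a real workflow the AEM-derived picks would come with uncertainty estimates, which argues for a geostatistical method (e.g., kriging) rather than plain IDW; the point here is only the scattered-picks-to-surface step.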

  11. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020; (supplement three to U.S. Geological Survey Water-resources investigations report 94-4251)

    USGS Publications Warehouse

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) and revised by Kernodle (Kernodle, J.M., 1998, Simulation of ground-water flow in the Albuquerque Basin, 1901-95, with projections to 2020 (supplement two to U.S. Geological Survey Water-Resources Investigations Report 94-4251): U.S. Geological Survey Open-File Report 96-209, 54 p.). Output files resulting from the computer simulations are included for reference.

  12. Experiments with microcomputer-based artificial intelligence environments

    USGS Publications Warehouse

    Summers, E.G.; MacDonald, R.A.

    1988-01-01

    The U.S. Geological Survey (USGS) has been experimenting with the use of relatively inexpensive microcomputers as artificial intelligence (AI) development environments. Several AI languages are available that perform fairly well on desk-top personal computers, as are low-to-medium cost expert system packages. Although performance of these systems is respectable, their speed and capacity limitations are questionable for serious earth science applications foreseen by the USGS. The most capable artificial intelligence applications currently are concentrated on what is known as the "artificial intelligence computer," and include Xerox D-series, Tektronix 4400 series, Symbolics 3600, VAX, LMI, and Texas Instruments Explorer. The artificial intelligence computer runs expert system shells and Lisp, Prolog, and Smalltalk programming languages. However, these AI environments are expensive. Recently, inexpensive 32-bit hardware has become available for the IBM/AT microcomputer. USGS has acquired and recently completed Beta-testing of the Gold Hill Systems 80386 Hummingboard, which runs Common Lisp on an IBM/AT microcomputer. Hummingboard appears to have the potential to overcome many of the speed/capacity limitations observed with AI-applications on standard personal computers. USGS is a Beta-test site for the Gold Hill Systems GoldWorks expert system. GoldWorks combines some high-end expert system shell capabilities in a medium-cost package. This shell is developed in Common Lisp, runs on the 80386 Hummingboard, and provides some expert system features formerly available only on AI-computers including frame and rule-based reasoning, on-line tutorial, multiple inheritance, and object-programming. © 1988 International Association for Mathematical Geology.

  13. A fast, parallel algorithm to solve the basic fluvial erosion/transport equations

    NASA Astrophysics Data System (ADS)

    Braun, J.

    2012-04-01

    Quantitative models of landform evolution are commonly based on the solution of a set of equations representing the processes of fluvial erosion, transport and deposition, which predicts the geometry of a river channel network and its evolution through time. The river network is often regarded as the backbone of any surface processes model (SPM) that might include other physical processes acting at a range of spatial and temporal scales along hill slopes. The basic laws of fluvial erosion require the computation of local (slope) and non-local (drainage area) quantities at every point of a given landscape, a computationally expensive operation which limits the resolution of most SPMs. I present here an algorithm to compute the various components required in the parameterization of fluvial erosion (and transport) and thus solve the basic fluvial geomorphic equation, that is very efficient because it is O(n) (the number of required arithmetic operations is linearly proportional to the number of nodes defining the landscape), and is fully parallelizable (the computation cost decreases in direct inverse proportion to the number of processors used to solve the problem). The algorithm is ideally suited for use on the latest multi-core processors. Using this new technique, geomorphic problems can be solved at an unprecedented resolution (typically of the order of 10,000 × 10,000 nodes) while keeping the computational cost reasonable (on the order of 1 s per time step).
I will demonstrate that such an efficient algorithm is ideally suited to produce a fully predictive SPM that links observationally based parameterizations of small-scale processes to the evolution of large-scale features of the landscapes on geological time scales. It can also be used to model surface processes at the continental or planetary scale and be linked to lithospheric or mantle flow models to predict the potential interactions between tectonics driving surface uplift in orogenic areas, mantle flow producing dynamic topography on continental scales and surface processes.
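
    The abstract does not give the algorithm itself, but an O(n) drainage-area accumulation in this spirit (the receiver/stack approach) can be sketched as follows; the single-flow-direction data layout is an assumption for illustration.

```python
def drainage_area(receiver, cell_area=1.0):
    """O(n) drainage-area accumulation on a single-flow-direction mesh.

    receiver[i] is the index of the node that node i drains to; a
    base-level node has receiver[i] == i.  Nodes are first ordered from
    base level outward (the 'stack'), then areas are accumulated in
    reverse order so every donor is processed before its receiver.
    Every node is visited a fixed number of times, hence O(n).
    """
    n = len(receiver)
    donors = [[] for _ in range(n)]        # invert the receiver relation
    for i, r in enumerate(receiver):
        if r != i:
            donors[r].append(i)
    # Build the stack by depth-first traversal from each base-level node.
    stack = []
    for i in range(n):
        if receiver[i] == i:
            work = [i]
            while work:
                node = work.pop()
                stack.append(node)
                work.extend(donors[node])
    # Accumulate areas from the channel heads down to base level.
    area = [cell_area] * n
    for node in reversed(stack):
        r = receiver[node]
        if r != node:
            area[r] += area[node]
    return area

# Demo: a three-node chain 2 -> 1 -> 0, where node 0 is base level.
areas = drainage_area([0, 0, 1])
```

    The same node ordering also makes the implicit time-marching mentioned above possible: solving for erosion node-by-node down the stack never needs a value that has not already been updated.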

  14. Web-client based distributed generalization and geoprocessing

    USGS Publications Warehouse

    Wolf, E.B.; Howe, K.

    2009-01-01

    Generalization and geoprocessing operations on geospatial information were once the domain of complex software running on high-performance workstations. Currently, these computationally intensive processes are the domain of desktop applications. Recent efforts have been made to move geoprocessing operations server-side in a distributed, web accessible environment. This paper initiates research into portable client-side generalization and geoprocessing operations as part of a larger effort in user-centered design for the US Geological Survey's The National Map. An implementation of the Ramer-Douglas-Peucker (RDP) line simplification algorithm was created in the open source OpenLayers geoweb client. This algorithm implementation was benchmarked using differing data structures and browser platforms. The implementation and results of the benchmarks are discussed in the general context of client-side geoprocessing. (Abstract).
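
    For reference, the Ramer-Douglas-Peucker algorithm the authors implemented in OpenLayers has a compact recursive form. The sketch below is an independent Python illustration of the algorithm, not the benchmarked JavaScript implementation.

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker line simplification (recursive sketch).

    Keeps the endpoints; recursively retains the interior point farthest
    from the chord whenever its perpendicular distance exceeds epsilon.
    points: list of (x, y) tuples; returns the simplified list.
    """
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy) or 1.0       # guard against coincident ends
    # Find the interior point farthest from the chord.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        d = abs(dy * (px - x1) - dx * (py - y1)) / norm
        if d > dmax:
            dmax, index = d, i
    if dmax <= epsilon:                    # whole span is within tolerance
        return [points[0], points[-1]]
    left = rdp(points[: index + 1], epsilon)
    right = rdp(points[index:], epsilon)
    return left[:-1] + right               # drop the duplicated split point
```

    A client-side port of this is attractive precisely because the work is proportional to the number of vertices, and the benchmark comparisons in the paper come down to how cheaply each browser's JavaScript engine handles this recursion and the underlying array structure.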

  15. How to Make a Virtual Landscape with Outcrops for Use in Geoscience Teaching

    NASA Astrophysics Data System (ADS)

    Houghton, J.; Gordon, C.; Craven, B.; Robinson, A.; Lloyd, G. E. E.; Morgan, D. J.

    2016-12-01

    We are using screen-based virtual reality landscapes to augment the teaching of basic geological field skills and to enhance 3D visualisation skills. Here we focus on the processes of creating these landscapes, both imagined and real, in the Unity 3D game engine. The virtual landscapes are terrains with embedded data for mapping exercises, or draped geological maps for understanding the 3D interaction of the geology with the topography. The nature of the landscapes built depends on the learning outcomes of the intended teaching exercise. For example, a simple model of two hills and a valley over which to drape a series of different geological maps can be used to enhance the understanding of the 3D interaction of the geology with the topography. A more complex topography reflecting the underlying geology can be used for geological mapping exercises. The process starts with a contour image or DEM, which needs to be converted into RAW files to be imported into Unity. Within Unity itself, there are a series of steps needed to create a world around the terrain (the setting of cameras, lighting, skyboxes, etc.) before the terrain can be painted with vegetation and populated with assets or before a splatmap of the geology can be added. We discuss how additional features such as a GPS unit or compass can be included. We are also working to create landscapes based on real localities, both in response to the demand for greater realism and to support students unable to access the field due to health or mobility issues. This includes adding 3D photogrammetric images of outcrops into the worlds. This process uses the open source/freeware tools VisualSFM and MeshLab to create files suitable to be imported into Unity. This project is a collaboration between the University of Leeds and Leeds College of Art, UK, and all our virtual landscapes are freely available online at www.see.leeds.ac.uk/virtual-landscapes/.
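
    As a sketch of the "convert to RAW for Unity" step mentioned above, the following writes a grid of normalized heights as a 16-bit RAW heightmap, the format Unity's terrain importer reads. The byte-order convention and the tiny demo grid are assumptions for illustration, not details from the abstract.

```python
import os
import struct
import tempfile

def write_unity_raw(heights, path, byte_order="<"):
    """Write a grid of normalized heights (0.0-1.0) as a 16-bit RAW
    heightmap suitable for Unity's terrain importer.

    heights: list of rows, each a list of floats in [0, 1]
    byte_order: "<" little-endian or ">" big-endian, matching the
    byte-order option selected in Unity's import dialog (assumed).
    """
    flat = [max(0.0, min(1.0, h)) for row in heights for h in row]
    data = struct.pack(byte_order + "%dH" % len(flat),
                       *(int(round(h * 65535)) for h in flat))
    with open(path, "wb") as f:
        f.write(data)

# Demo: a 2x2 heightmap written to a temporary file and read back.
path = os.path.join(tempfile.gettempdir(), "demo_heightmap.raw")
write_unity_raw([[0.0, 1.0], [0.5, 0.25]], path)
raw_bytes = open(path, "rb").read()
```

    In practice the DEM would first be resampled to the grid resolution Unity expects and its elevations rescaled to [0, 1] against the terrain's height range before writing.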

  16. Integrating Field-Centered, Project Based Activities with Academic Year Coursework: A Curriculum Wide Approach

    NASA Astrophysics Data System (ADS)

    Kelso, P. R.; Brown, L. M.

    2015-12-01

    Based upon constructivist principles and the recognition that many students are motivated by hands-on activities and field experiences, we designed a new undergraduate curriculum at Lake Superior State University. One of our major goals was to develop stand-alone field projects in most of the academic year courses. Examples of courses impacted include structural geology, geophysics, and geotectonics. Students learn geophysical concepts in the context of near-surface field-based geophysical studies, while students in structural geology learn about structural processes through outcrop study of fractures, folds and faults. In geotectonics students learn about collisional and rifting processes through on-site field studies of specific geologic provinces. Another goal was to integrate data and samples collected by students in our sophomore-level introductory field course along with stand-alone field projects in our clastic systems and sequence stratigraphy courses. Our emphasis on active learning helps students develop a meaningful geoscience knowledge base and complex reasoning skills in authentic contexts. We simulate the activities of practicing geoscientists by engaging students in all aspects of a project, for example: field-oriented project planning and design; acquiring, analyzing, and interpreting data; incorporating supplemental material and background data; and preparing oral and written project reports. We find through anecdotal evidence, including student comments and personal observation, that the projects stimulate interest, provide motivation for learning new concepts, integrate skill and concept acquisition vertically through the curriculum, apply concepts from multiple geoscience subdisciplines, and develop soft skills such as teamwork, problem solving, critical thinking and communication. Through this project-centered Lake Superior State University geology curriculum, students practice our motto of "learn geology by doing geology."

  17. Use of remote sensing and GIS in mapping the environmental sensitivity areas for desertification of Egyptian territory

    NASA Astrophysics Data System (ADS)

    Gad, A.; Lotfy, I.

    2008-06-01

    Desertification is defined in the first article of the Convention to Combat Desertification as "land degradation in arid, semi-arid and dry sub-humid areas resulting from climatic variations and human activities". Its consequences include a set of important processes which are active in arid and semi-arid environments, where water is the main limiting factor of land use performance. Desertification indicators, or groups of associated indicators, should each be focused on a single process, and should be based on available, reliable information sources, including remotely sensed images, topographic data (maps or DEMs), and climate, soil and geological data. The current work aims to map the Environmentally Sensitive Areas (ESAs) to desertification over the whole territory of Egypt at a scale of 1:1,000,000. ETM satellite images and geologic and soil maps were used as the main sources for calculating the Environmental Sensitivity Area Index (ESAI) for desertification. The algorithm is adopted from the MEDALUS methodology as follows: ESAI = (SQI × CQI × VQI)^(1/3), where SQI is the soil quality index, CQI is the climate quality index and VQI is the vegetation quality index. The SQI is based on rating the parent material, slope, soil texture, and soil depth. The VQI is computed by rating three categories (erosion protection, drought resistance and plant cover). The CQI is based on the aridity index, derived from annual rainfall and potential evapotranspiration. Arc-GIS 9 software was used for the computation and for producing the sensitivity maps. The results show that the soils of the Nile Valley are characterized by a moderate SQI, whereas those in the interference zone have a low soil quality index. The dense vegetation of the valley raises its VQI to good, while coastal areas are average and interference zones are low.
The maps of ESAs for desertification show that 86.1% of the Egyptian territory is classified as very sensitive, 9.6% as sensitive, and 4.3% as moderately sensitive. It can be concluded that maps of sensitivity to desertification are very useful in arid and semi-arid areas, as they give a quantitative indication of the distribution of sensitive areas, and that integrating the different factors contributing to desertification sensitivity can support the planning of successful combating measures. The use of space data and GIS proved suitable for reliable estimation and for meeting the large computational requirements; they are also useful for visualizing the sensitivity status of the different desertification parameters.
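
    The ESAI formula above is a per-cell geometric mean of the three quality indices, which is straightforward to apply to co-registered raster layers. A minimal sketch (the index values below are hypothetical, not taken from the study):

```python
import numpy as np

def esai(sqi, cqi, vqi):
    """Environmental Sensitivity Area Index: the geometric mean of the
    soil (SQI), climate (CQI) and vegetation (VQI) quality indices,
    applied cell-wise over raster layers."""
    return (np.asarray(sqi) * np.asarray(cqi) * np.asarray(vqi)) ** (1.0 / 3.0)

# Hypothetical quality-index rasters (MEDALUS convention: 1 = best, 2 = worst)
sqi = np.array([[1.0, 1.6], [2.0, 1.2]])
cqi = np.array([[1.1, 1.8], [1.9, 1.0]])
vqi = np.array([[1.0, 1.5], [2.0, 1.3]])
print(esai(sqi, cqi, vqi))
```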

  18. Policy and procedures for the management and archival storage of data collected for hydrologic investigations, U.S. Geological Survey, Indiana District

    USGS Publications Warehouse

    Martin, Jeffrey D.; Cohen, David A.

    1993-01-01

    This report describes the policy and procedures used by the Indiana District of the U.S. Geological Survey, Water Resources Division, to manage and store data collected during hydrologic investigations. It is the policy of the Indiana District that data collected to meet the objectives of projects for hydrologic investigations be documented, organized, and archived in a manner that (1) facilitates retrieval, evaluation, and use by other District personnel, and (2) enables verification of data contained in all reports and computer databases.

  19. Radiometric age file for Alaska: A section in The United States Geological Survey in Alaska: Accomplishments during 1980

    USGS Publications Warehouse

    Shew, Nora B.; Wilson, Frederic H.

    1982-01-01

    The Alaska radiometric age file of the Branch of Alaskan Geology is a computer-based compilation of radiometric dates from the state of Alaska and the western parts of the Yukon Territory and British Columbia. More than 1800 age determinations from over 250 references have been entered in the file. References date back to 1958 and include both published and unpublished sources. The file is the outgrowth of an original radiometric age file compiled by Don Grybeck and students at the University of Alaska-Fairbanks (Turner and others, 1975).

  20. Augmented Topological Descriptors of Pore Networks for Material Science.

    PubMed

    Ushizima, D; Morozov, D; Weber, G H; Bianchi, A G C; Sethian, J A; Bethel, E W

    2012-12-01

    One potential solution to reduce the concentration of carbon dioxide in the atmosphere is the geologic storage of captured CO2 in underground rock formations, also known as carbon sequestration. There is ongoing research to guarantee that this process is both efficient and safe. We describe tools that provide measurements of media porosity and permeability estimates, including visualization of pore structures. Existing standard algorithms make limited use of geometric information in calculating the permeability of complex microstructures. This quantity is important for the analysis of biomineralization, a subsurface process that can affect physical properties of porous media. This paper introduces geometric and topological descriptors that enhance the estimation of material permeability. Our analysis framework includes the processing of experimental data, segmentation, and feature extraction, and makes novel use of multiscale topological analysis to quantify maximum flow through porous networks. We illustrate our results using synchrotron-based X-ray computed microtomography of glass beads during biomineralization. We also benchmark the proposed algorithms using simulated data sets modeling jammed beds of packed beads of a monodisperse material.
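
    As a toy illustration of the porosity measurement mentioned above, the pore fraction of a binary segmented micro-CT volume is simply the proportion of pore-labelled voxels. This sketch uses a small synthetic volume, not the paper's synchrotron data:

```python
import numpy as np

def porosity(segmented):
    """Fraction of voxels labelled as pore space (True) in a binary
    segmented micro-CT volume."""
    seg = np.asarray(segmented, dtype=bool)
    return seg.sum() / seg.size

# Synthetic 20^3 volume: one spherical grain inside a cube of pore space
z, y, x = np.mgrid[:20, :20, :20]
grain = (x - 10) ** 2 + (y - 10) ** 2 + (z - 10) ** 2 <= 64  # radius-8 sphere
pore = ~grain
print(round(porosity(pore), 3))
```

    The topological descriptors in the paper go well beyond this scalar, but every pipeline of this kind starts from such a segmented pore/grain labelling.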

  1. The graphic cell method: a new look at digitizing geologic maps

    USGS Publications Warehouse

    Hanley, J.T.

    1982-01-01

    The graphic cell method is an alternative method of digitizing areal geologic information. It involves a discrete-point sampling scheme in which the computer establishes a matrix of cells over the map, and each cell, in its entirety, is assigned the identity or value of the geologic information recognized at its center. Cell size may be changed to suit the needs of the user. The computer program resolves the matrix and identifies potential errors such as multiple assignments. Input includes the digitized boundaries of each geologic formation. This method should eliminate a primary bottleneck in the creation and testing of geomathematical models in such disciplines as resource appraisal. © 1982.
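
    The cell-centre sampling at the heart of the graphic cell method can be sketched directly: test each cell centre against the digitized formation boundaries and assign the whole cell the identity found there. The helper names below are illustrative, and cells matching no formation are flagged with `None` as potential errors:

```python
def point_in_polygon(px, py, poly):
    """Ray-casting containment test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > py) != (y2 > py):  # edge straddles the horizontal ray
            xcross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < xcross:
                inside = not inside
    return inside

def graphic_cell(formations, extent, ncells):
    """Assign each cell of a regular grid the identity of the formation
    whose digitized boundary contains the cell centre."""
    xmin, ymin, xmax, ymax = extent
    dx = (xmax - xmin) / ncells
    dy = (ymax - ymin) / ncells
    grid = []
    for j in range(ncells):
        row = []
        cy = ymin + (j + 0.5) * dy
        for i in range(ncells):
            cx = xmin + (i + 0.5) * dx
            ident = None  # None marks a potential digitizing error
            for name, poly in formations.items():
                if point_in_polygon(cx, cy, poly):
                    ident = name
                    break
            row.append(ident)
        grid.append(row)
    return grid
```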

  2. Object-Oriented Programming When Developing Software in Geology and Geophysics

    NASA Astrophysics Data System (ADS)

    Ahmadulin, R. K.; Bakanovskaya, L. N.

    2017-01-01

    The paper reviews the role of object-oriented programming when developing software in geology and geophysics. The main stages are identified at which it is worthwhile to apply principles of object-oriented programming in such software. The research was based on a number of problems solved at the Geology and Petroleum Production Institute; distinctive features of these problems are given and areas of application of the object-oriented approach are identified. Developing applications in geology and geophysics has shown that object-oriented programming simplifies the process of creating such products, particularly when designing structures for data storage and graphical user interfaces.
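
    As an illustration of the kind of data-storage design the authors describe, a small class hierarchy lets different geological entities share one storage and display interface. This is a generic sketch, not code from the institute's applications:

```python
from dataclasses import dataclass, field

@dataclass
class GeoObject:
    """Common base for stored geological entities (illustrative sketch)."""
    name: str

    def describe(self):
        return f"{type(self).__name__}: {self.name}"

@dataclass
class Well(GeoObject):
    depth_m: float = 0.0
    logs: dict = field(default_factory=dict)  # e.g. {"gamma": [...]}

@dataclass
class SeismicSection(GeoObject):
    traces: int = 0

# One inventory, many entity types: GUI and storage code only need GeoObject
inventory = [Well("W-101", depth_m=2350.0), SeismicSection("Line-7", traces=480)]
for obj in inventory:
    print(obj.describe())  # polymorphic dispatch on the shared interface
```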

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, G.A.; Commer, M.

    Three-dimensional (3D) geophysical imaging is now receiving considerable attention for electrical conductivity mapping of potential offshore oil and gas reservoirs. The imaging technology employs controlled source electromagnetic (CSEM) and magnetotelluric (MT) fields and treats geological media exhibiting transverse anisotropy. Moreover, when combined with established seismic methods, direct imaging of reservoir fluids is possible. Because of the size of the 3D conductivity imaging problem, strategies exploiting computational parallelism and optimal meshing are required. The algorithm thus developed has been shown to scale to tens of thousands of processors. In one imaging experiment, 32,768 tasks/processors on the IBM Watson Research Blue Gene/L supercomputer were successfully utilized. Over a 24-hour period we were able to image a large-scale field data set that previously required over four months of processing time on distributed clusters based on Intel or AMD processors utilizing 1024 tasks on an InfiniBand fabric. Electrical conductivity imaging using massively parallel computational resources produces results that cannot be obtained otherwise and are consistent with timeframes required for practical exploration problems.

  4. High Performance Geostatistical Modeling of Biospheric Resources

    NASA Astrophysics Data System (ADS)

    Pedelty, J. A.; Morisette, J. T.; Smith, J. A.; Schnase, J. L.; Crosier, C. S.; Stohlgren, T. J.

    2004-12-01

    We are using parallel geostatistical codes to study spatial relationships among biospheric resources in several study areas. For example, spatial statistical models based on large- and small-scale variability have been used to predict species richness of both native and exotic plants (hot spots of diversity) and patterns of exotic plant invasion. However, broader use of geostatistics in natural resource modeling, especially at regional and national scales, has been limited due to the large computing requirements of these applications. To address this problem, we implemented parallel versions of the kriging spatial interpolation algorithm. The first uses the Message Passing Interface (MPI) in a master/slave paradigm on an open-source Linux Beowulf cluster, while the second is implemented with the new proprietary Xgrid distributed processing system on an Xserve G5 cluster from Apple Computer, Inc. These techniques are proving effective and provide the basis for a national decision support capability for invasive species management that is being jointly developed by NASA and the US Geological Survey.
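
    The kriging computation that these parallel codes distribute can be shown in serial form: each prediction point requires solving a dense kriging system, and the loop over prediction points is what a master/slave MPI version farms out to workers. A minimal ordinary-kriging sketch with an assumed exponential variogram (not the authors' code):

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=10.0):
    """Exponential semivariogram model (assumed parameters)."""
    return sill * (1.0 - np.exp(-3.0 * h / rng))

def ordinary_kriging(xy, values, targets, sill=1.0, rng=10.0):
    """Serial ordinary kriging: one dense linear solve per target point.
    A parallel version would simply partition `targets` across workers."""
    n = len(xy)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    # Ordinary kriging system: variogram block plus Lagrange row/column.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, sill, rng)
    A[n, n] = 0.0
    out = []
    for t in targets:
        b = np.ones(n + 1)
        b[:n] = exp_variogram(np.linalg.norm(xy - t, axis=1), sill, rng)
        w = np.linalg.solve(A, b)
        out.append(w[:n] @ values)  # weighted mean of the data
    return np.array(out)
```

    Because the variogram is zero at lag zero, the interpolator is exact: predicting at a data location returns the observed value.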

  5. Example-based super-resolution for single-image analysis from the Chang'e-1 Mission

    NASA Astrophysics Data System (ADS)

    Wu, Fan-Lu; Wang, Xiang-Jun

    2016-11-01

    Due to the low spatial resolution of images taken from the Chang'e-1 (CE-1) orbiter, details of the lunar surface are blurred or lost. Considering the limited spatial resolution of the image data obtained by the CCD camera on CE-1, an example-based super-resolution (SR) algorithm is employed to obtain high-resolution (HR) images. In this article, a novel example-based algorithm is proposed to implement SR reconstruction from single-image analysis, at a reduced computational cost compared to other example-based SR methods. The results show that the method can enhance image resolution and recover detailed information about the lunar surface, so it can be used for surveying HR terrain and geological features. Moreover, the algorithm is relevant for the HR processing of remotely sensed images obtained by other imaging systems.

  6. Computational and Experimental Investigations of the Molecular Scale Structure and Dynamics of Geologically Important Fluids and Mineral-Fluid Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowers, Geoffrey

    United States Department of Energy grant DE-FG02-10ER16128, “Computational and Spectroscopic Investigations of the Molecular Scale Structure and Dynamics of Geologically Important Fluids and Mineral-Fluid Interfaces” (Geoffrey M. Bowers, P.I.) focused on developing a molecular-scale understanding of processes that occur in fluids and at solid-fluid interfaces using the combination of spectroscopic, microscopic, and diffraction studies with molecular dynamics computer modeling. The work is intimately tied to the twin proposal at Michigan State University (DOE DE-FG02-08ER15929; same title: R. James Kirkpatrick, P.I. and A. Ozgur Yazaydin, co-P.I.).

  7. Coupling of geochemical and multiphase flow processes for validation of the MUFITS reservoir simulator against TOUGHREACT

    NASA Astrophysics Data System (ADS)

    De Lucia, Marco; Kempka, Thomas; Afanasyev, Andrey; Melnik, Oleg; Kühn, Michael

    2016-04-01

    Coupled reactive transport simulations, especially in heterogeneous settings considering multiphase flow, are extremely time consuming and suffer from significant numerical issues compared to purely hydrodynamic simulations. This represents a major hurdle in the assessment of geological subsurface utilization, since it constrains the practical application of reactive transport modelling to coarse spatial discretizations or oversimplified geological settings. In order to overcome such limitations, De Lucia et al. [1] developed and validated a one-way coupling approach between geochemistry and hydrodynamics, which is particularly well suited for CO2 storage simulations while being of general validity. In the present study, the models used for the validation of the one-way coupling approach introduced by De Lucia et al. [1], originally performed with the TOUGHREACT simulator, are transferred to and benchmarked against the multiphase reservoir simulator MUFITS [2]. The geological model is loosely inspired by an existing CO2 storage site. Its grid comprises 2,950 elements enclosed in a single layer, but reflecting a realistic three-dimensional anticline geometry. For the purpose of this comparison, homogeneous and heterogeneous scenarios in terms of porosity and permeability were investigated. In both cases, the results of the MUFITS simulator are in excellent agreement with those produced with the fully-coupled TOUGHREACT simulator, while benefiting from significantly higher computational performance. This study demonstrates how a computationally efficient simulator such as MUFITS can be successfully included in a coupled process simulation framework, and also suggests improvements and specific strategies for the coupling of chemical processes with hydrodynamics and heat transport, aiming at tackling geoscientific problems beyond the storage of CO2. References:
[1] De Lucia, M., Kempka, T., and Kühn, M.: A coupling alternative to reactive transport simulations for long-term prediction of chemical reactions in heterogeneous CO2 storage systems, Geosci. Model Dev., 8, 279-294, 2015, doi:10.5194/gmd-8-279-2015. [2] Afanasyev, A.A.: Application of the reservoir simulator MUFITS for 3D modelling of CO2 storage in geological formations, Energy Procedia, 40, 365-374, 2013, doi:10.1016/j.egypro.2013.08.042.

  8. Synthetic Earthquake Statistics From Physical Fault Models for the Lower Rhine Embayment

    NASA Astrophysics Data System (ADS)

    Brietzke, G. B.; Hainzl, S.; Zöller, G.

    2012-04-01

    As of today, seismic risk and hazard estimates mostly use purely empirical, stochastic models of earthquake fault systems, tuned specifically to the vulnerable areas of interest. Although such models allow for reasonable risk estimates, they fail to provide a link between the observed seismicity and the underlying physical processes. Solving a state-of-the-art, fully dynamic description of all relevant physical processes in an earthquake fault system is likely not useful, since it comes with a large number of degrees of freedom, poorly constrained model parameters and a huge computational effort. Here, quasi-static and quasi-dynamic physical fault simulators provide a compromise between physical completeness and computational affordability, and aim at providing a link between basic physical concepts and the statistics of seismicity. Within this framework we investigate a model of the Lower Rhine Embayment (LRE) that is based upon seismological and geological data. We present and discuss statistics of the spatio-temporal behavior of the generated synthetic earthquake catalogs with respect to simplification (e.g., simple two-fault cases) as well as to complication (e.g., hidden faults, geometric complexity, heterogeneities of constitutive parameters).

  9. Finite-difference time-domain modelling of through-the-Earth radio signal propagation

    NASA Astrophysics Data System (ADS)

    Ralchenko, M.; Svilans, M.; Samson, C.; Roper, M.

    2015-12-01

    This research seeks to extend the knowledge of how a very low frequency (VLF) through-the-Earth (TTE) radio signal behaves as it propagates underground, by calculating and visualizing the strength of the electric and magnetic fields for an arbitrary geology through numeric modelling. To achieve this objective, a new software tool has been developed using the finite-difference time-domain method. This technique is particularly well suited to visualizing the distribution of electromagnetic fields in an arbitrary geology. The frequency range of TTE radio (400-9000 Hz) and geometrical scales involved (1 m resolution for domains a few hundred metres in size) involves processing a grid composed of millions of cells for thousands of time steps, which is computationally expensive. Graphics processing unit acceleration was used to reduce execution time from days and weeks, to minutes and hours. Results from the new modelling tool were compared to three cases for which an analytic solution is known. Two more case studies were done featuring complex geologic environments relevant to TTE communications that cannot be solved analytically. There was good agreement between numeric and analytic results. Deviations were likely caused by numeric artifacts from the model boundaries; however, in a TTE application in field conditions, the uncertainty in the conductivity of the various geologic formations will greatly outweigh these small numeric errors.
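
    The finite-difference time-domain method underlying the tool can be illustrated with its standard one-dimensional reduction: leapfrog updates of the electric and magnetic fields on staggered grids. This free-space sketch (normalized units, magic time step, no absorbing boundary) is only a cartoon of the 3D conductive-earth solver described in the paper:

```python
import numpy as np

def fdtd_1d(nx=200, nt=120, src=100):
    """Minimal 1D FDTD (Yee) scheme for Ez/Hy in free space, normalized so
    the Courant number is 1; a soft Gaussian source drives the grid."""
    ez = np.zeros(nx)
    hy = np.zeros(nx)
    for n in range(nt):
        hy[:-1] += ez[1:] - ez[:-1]                  # update H from curl of E
        ez[1:] += hy[1:] - hy[:-1]                   # update E from curl of H
        ez[src] += np.exp(-((n - 30) / 10.0) ** 2)   # soft Gaussian source
    return ez
```

    The production code adds the third dimension, conductive media (which introduce loss terms in the E update), and absorbing boundaries, and it is this per-cell update loop that maps so well onto GPU acceleration.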

  10. Analysis and Application of Lineaments Extraction Using GF-1 Satellite Images in Loess Covered

    NASA Astrophysics Data System (ADS)

    Han, L.; Liu, Z.; Zhao, Z.; Ning, Y.

    2018-04-01

    Faults, folds and other tectonic structures are zones of geological weakness that erosion carves into linear geomorphology, which appears as lineaments on the earth surface. Lineaments control the distribution of regional formations, groundwater, geothermal resources, etc., so they are an important indicator for evaluating the strength and stability of geological structures. Current methods rely mostly on visual interpretation or semi-automatic computer extraction, which are both time-consuming and labour-intensive, and their accuracy is difficult to guarantee because it depends on the expert's knowledge and experience and on the computer hardware and software. Therefore, an integrated algorithm based on GF-1 satellite image data is proposed, taking the loess area in the northern part of the Jinlinghe basin as an example. Firstly, the best band combination (4-3-2) is chosen using the optimum index factor (OIF). Secondly, linear edges are highlighted by a Gaussian high-pass filter and tensor voting. Finally, the Hough transform is used to detect the geologic lineaments. Thematic maps of the geological structure in this area are produced from the extracted lineaments. The experimental results show that, influenced by the northern margin of the Qinling Mountains and the subsiding Weihe Basin, the lineaments are mostly distributed along terrain lines, mainly in the NW, NE, NNE, and ENE directions. The agreement with the existing regional geological survey provides a reliable basis for analysing tectonic stress trends. The algorithm is practical, robust, and little disturbed by human factors.
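
    The final Hough-transform step can be sketched as vote accumulation in (rho, theta) space over the edge pixels produced by the filtering stages; peaks in the accumulator correspond to straight lineaments. A minimal version on a synthetic edge image:

```python
import numpy as np

def hough_lines(edges, n_theta=180):
    """Accumulate votes in (rho, theta) space for a binary edge image,
    using rho = x*cos(theta) + y*sin(theta) at 1-degree resolution."""
    ys, xs = np.nonzero(edges)
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))           # max possible |rho|
    thetas = np.deg2rad(np.arange(n_theta))
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    for x, y in zip(xs, ys):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1        # one vote per (point, theta)
    return acc, thetas, diag

# Synthetic vertical lineament at x = 12 in a 32x32 edge image
img = np.zeros((32, 32), dtype=bool)
img[:, 12] = True
acc, thetas, diag = hough_lines(img)
rho_idx, theta_idx = np.unravel_index(acc.argmax(), acc.shape)
print(rho_idx - diag, np.rad2deg(thetas[theta_idx]))  # -> 12 0.0
```

    In practice the accumulator peaks are thresholded and converted back to line segments; the OIF band selection and tensor voting stages only serve to produce a cleaner edge image for this step.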

  11. Comparative study of large scale simulation of underground explosions in alluvium and in fractured granite using stochastic characterization

    NASA Astrophysics Data System (ADS)

    Vorobiev, O.; Ezzedine, S. M.; Antoun, T.; Glenn, L.

    2014-12-01

    This work describes a methodology used for large scale modeling of wave propagation from underground explosions conducted at the Nevada Test Site (NTS) in two different geological settings: a fractured granitic rock mass and alluvium deposits. We show that the discrete nature of rock masses as well as the spatial variability of the fabric of alluvium is very important to understand ground motions induced by underground explosions. In order to build a credible conceptual model of the subsurface we integrated the geological, geomechanical and geophysical characterizations conducted during recent tests at the NTS as well as historical data from the characterization during the underground nuclear tests conducted at the NTS. Because detailed site characterization is limited, expensive and, in some instances, impossible, we have numerically investigated the effects of the characterization gaps on the overall response of the system. We performed several computational studies to identify the key geologic features specific to fractured media, mainly the joints, and those specific to alluvium porous media, mainly the spatial variability of geological alluvium facies characterized by their variances and their integral scales. We have also explored key features common to both geological environments, such as saturation and topography, and assessed which characteristics most affect the ground motion in the near-field and in the far-field. Stochastic representations of these features based on the field characterizations have been implemented in the Geodyn and GeodynL hydrocodes. Both codes were used to guide site characterization efforts in order to provide the essential data to the modeling community. We validate our computational results by comparing the measured and computed ground motion at various ranges. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  12. Supercomputing with TOUGH2 family codes for coupled multi-physics simulations of geologic carbon sequestration

    NASA Astrophysics Data System (ADS)

    Yamamoto, H.; Nakajima, K.; Zhang, K.; Nanai, S.

    2015-12-01

    Powerful numerical codes that are capable of modeling complex coupled processes of physics and chemistry have been developed for predicting the fate of CO2 in reservoirs as well as its potential impacts on groundwater and subsurface environments. However, they are often computationally demanding for solving highly non-linear models at sufficient spatial and temporal resolution. Geological heterogeneity and uncertainties further increase the challenges in modeling. Two-phase flow simulations in heterogeneous media usually require much longer computational times than those in homogeneous media, and uncertainties in reservoir properties may necessitate stochastic simulations with multiple realizations. Recently, massively parallel supercomputers with thousands of processors have become available to scientific and engineering communities. Such supercomputers may attract attention from geoscientists and reservoir engineers for solving large, non-linear models at higher resolution within a reasonable time. However, to make them a useful tool, several practical obstacles must be tackled so that general-purpose reservoir simulators can use large numbers of processors effectively. We have implemented massively parallel versions of two TOUGH2 family codes (the multi-phase flow simulator TOUGH2 and the chemically reactive transport simulator TOUGHREACT) on two different types (vector and scalar) of supercomputers with a thousand to tens of thousands of processors. After completing the implementation and extensive tune-up on the supercomputers, the computational performance was measured for three simulations with multi-million-cell grid models, including a simulation of the dissolution-diffusion-convection process, which requires high spatial and temporal resolution to simulate the growth of small convective fingers of CO2-dissolved water into larger ones at reservoir scale.
The performance measurements confirmed that both simulators exhibit excellent scalability, showing almost linear speedup against the number of processors up to over ten thousand cores. Generally this allows us to perform coupled multi-physics (THC) simulations on high-resolution geologic models with multi-million-cell grids in a practical time (e.g., less than a second per time step).

  13. Estimation of reservoir storage capacity using multibeam sonar and terrestrial lidar, Randy Poynter Lake, Rockdale County, Georgia, 2012

    USGS Publications Warehouse

    Lee, K.G.

    2013-01-01

    The U.S. Geological Survey, in cooperation with the Rockdale County Department of Water Resources, conducted a bathymetric and topographic survey of Randy Poynter Lake in northern Georgia in 2012. The Randy Poynter Lake watershed drains surface area from Rockdale, Gwinnett, and Walton Counties. The reservoir serves as the water supply for the Conyers-Rockdale Big Haynes Impoundment Authority. The Randy Poynter reservoir was surveyed to prepare a current bathymetric map and determine storage capacities at specified water-surface elevations. Topographic and bathymetric data were collected using a marine-based mobile mapping unit to estimate storage capacity. The marine-based mobile mapping unit operates with several components: multibeam echosounder, singlebeam echosounder, light detection and ranging system, navigation and motion-sensing system, and data acquisition computer. All data were processed and combined to develop a triangulated irregular network, a reservoir capacity table, and a bathymetric contour map.
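
    The reservoir capacity table derived from such a merged bathymetric/topographic surface amounts to summing water depth over all submerged cells at each specified water-surface elevation. A minimal sketch with a hypothetical bed-elevation grid (not the Randy Poynter Lake data):

```python
import numpy as np

def capacity_table(bed_elev, cell_area, stages):
    """Storage volume at each water-surface elevation: water depth summed
    over cells where the stage exceeds the bed, times the cell area."""
    bed = np.asarray(bed_elev, dtype=float)
    return {s: float(np.clip(s - bed, 0.0, None).sum() * cell_area)
            for s in stages}

# Hypothetical bed surface gridded from a merged TIN (elevations in metres)
bed = np.array([[100.0, 101.0],
                [102.0, 103.0]])
print(capacity_table(bed, cell_area=1.0, stages=[101.0, 103.0]))
```

    A production workflow would integrate over the triangulated irregular network directly rather than a coarse raster, but the stage-storage relationship is built the same way.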

  14. Linear programming model to develop geodiversity map using utility theory

    NASA Astrophysics Data System (ADS)

    Sepehr, Adel

    2015-04-01

    In this article, the classification and mapping of geodiversity based on a quantitative methodology was accomplished using linear programming, the central idea being that geosites and geomorphosites, as the main indicators of geodiversity, can be evaluated by utility theory. A linear programming method was applied for geodiversity mapping over Khorasan-Razavi province, in northeastern Iran. The main criteria for distinguishing geodiversity potential in the study area were rock type (lithology), fault position (tectonic processes), karst areas (dynamic processes), the frequency of Aeolian landforms, and surface river forms. These parameters were investigated using thematic maps, including geology, topography and geomorphology maps at scales of 1:100,000, 1:50,000 and 1:250,000; imagery data from SPOT and ETM+ (Landsat 7); and direct field operations. The geological thematic layer was simplified from the original map using a practical lithologic criterion based on a primary genetic classification of rocks into metamorphic, igneous and sedimentary. The geomorphology map was produced using a 30-m DEM extracted from ASTER data, the geology map and Google Earth images. The geology map shows tectonic status, while the geomorphology map indicates dynamic processes and landforms (karst, Aeolian and river). Then, following utility theory, we proposed a linear program to classify the degree of geodiversity in the study area based on the geology/morphology parameters. The algorithm consisted of a linear objective function maximizing geodiversity subject to constraints in the form of linear equations. The results indicate three classes of geodiversity potential: low, medium and high. The geodiversity potential is highest in the karstic areas and the Aeolian landscape.
The utility theory used in the research also decreased the uncertainty of the evaluations.
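
    The linear-programming formulation described above, a linear utility objective maximized subject to linear constraints, can be illustrated on a toy two-variable problem. The indicator weights and constraints below are hypothetical, and the tiny vertex-enumeration solver stands in for a real LP library:

```python
import itertools
import numpy as np

def solve_lp_2d(c, A, b):
    """Maximize c.x subject to A x <= b and x >= 0 for two variables, by
    enumerating vertices of the feasible polygon (fine for tiny models)."""
    # Append the non-negativity constraints as extra rows of A x <= b.
    A_full = np.vstack([A, [[-1.0, 0.0], [0.0, -1.0]]])
    b_full = np.concatenate([b, [0.0, 0.0]])
    best_x, best_v = None, -np.inf
    for i, j in itertools.combinations(range(len(b_full)), 2):
        M = A_full[[i, j]]
        if abs(np.linalg.det(M)) < 1e-12:
            continue                                   # parallel constraints
        x = np.linalg.solve(M, b_full[[i, j]])         # candidate vertex
        if np.all(A_full @ x <= b_full + 1e-9):        # keep feasible ones
            v = c @ x
            if v > best_v:
                best_v, best_x = v, x
    return best_x, best_v

# Hypothetical utility weights for two geodiversity indicators (e.g. karst
# and Aeolian landform frequency) under two linear resource constraints
c = np.array([3.0, 2.0])
A = np.array([[1.0, 1.0], [2.0, 1.0]])
b = np.array([4.0, 6.0])
x, v = solve_lp_2d(c, A, b)
print(x, v)  # optimum at x = [2, 2], utility 10
```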

  15. Geologic uncertainty in a regulatory environment: An example from the potential Yucca Mountain nuclear waste repository site

    NASA Astrophysics Data System (ADS)

    Rautman, C. A.; Treadway, A. H.

    1991-11-01

    Regulatory geologists are concerned with predicting the performance of sites proposed for waste disposal or for remediation of existing pollution problems. Geologic modeling of these sites requires large-scale expansion of knowledge obtained from very limited sampling. This expansion induces considerable uncertainty into the geologic models of rock properties that are required for modeling the predicted performance of the site. One method for assessing this uncertainty is through nonparametric geostatistical simulation. Simulation can produce a series of equiprobable models of a rock property of interest. Each model honors measured values at sampled locations, and each can be constructed to emulate both the univariate histogram and the spatial covariance structure of the measured data. Computing a performance model for a number of geologic simulations allows evaluation of the effects of geologic uncertainty. A site may be judged acceptable if the number of failures to meet a particular performance criterion produced by these computations is sufficiently low. A site that produces too many failures may be either unacceptable or simply inadequately described. The simulation approach to addressing geologic uncertainty is being applied to the potential high-level nuclear waste repository site at Yucca Mountain, Nevada, U.S.A. Preliminary geologic models of unsaturated permeability have been created that reproduce observed statistical properties reasonably well. A spread of unsaturated groundwater travel times has been computed that reflects the variability of those geologic models. Regions within the simulated models exhibiting the greatest variability among multiple runs are candidates for obtaining the greatest reduction in uncertainty through additional site characterization.
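
    The simulation approach described above can be sketched in one dimension: generate unconditional Gaussian realizations with the target covariance via Cholesky factorization, then condition them so each honors the measured values at the sampled locations. This is a generic conditioning-by-kriging sketch with assumed covariance parameters, not the nonparametric (indicator) method applied at Yucca Mountain:

```python
import numpy as np

rng_ = np.random.default_rng(0)

def cov(d, sill=1.0, rng=8.0):
    """Exponential covariance model (assumed parameters)."""
    return sill * np.exp(-3.0 * d / rng)

def conditional_realizations(x_data, z_data, x_grid, n_real=4):
    """Equiprobable 1D Gaussian realizations that honor the data:
    unconditional simulation via Cholesky, corrected by simple kriging
    of the mismatch at the data locations."""
    xg = np.asarray(x_grid, float)
    xd = np.asarray(x_data, float)
    z_data = np.asarray(z_data, float)
    C_gg = cov(np.abs(xg[:, None] - xg[None, :]))
    C_dd = cov(np.abs(xd[:, None] - xd[None, :]))
    C_gd = cov(np.abs(xg[:, None] - xd[None, :]))
    L = np.linalg.cholesky(C_gg + 1e-10 * np.eye(len(xg)))
    W = np.linalg.solve(C_dd, C_gd.T).T           # kriging weights (grid x data)
    idx = [int(np.argmin(np.abs(xg - v))) for v in xd]  # data nodes on the grid
    reals = []
    for _ in range(n_real):
        z_sim = L @ rng_.standard_normal(len(xg))
        z_cond = z_sim + W @ (z_data - z_sim[idx])  # force agreement at data
        reals.append(z_cond)
    return np.array(reals)
```

    The spread among such realizations away from the data points is exactly the uncertainty the paper propagates into the performance calculations; nodes where the realizations diverge most are the candidates for further characterization.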

  16. Performance of a process-based hydrodynamic model in predicting shoreline change

    NASA Astrophysics Data System (ADS)

    Safak, I.; Warner, J. C.; List, J. H.

    2012-12-01

Shoreline change is controlled by a complex combination of processes that include waves, currents, sediment characteristics and availability, geologic framework, human interventions, and sea level rise. A comprehensive data set of shoreline position (14 shorelines between 1978 and 2002) along the continuous and relatively uninterrupted North Carolina Coast from Oregon Inlet to Cape Hatteras (65 km) reveals a spatial pattern of alternating erosion and accretion, with an erosional average shoreline change rate of -1.6 m/yr and up to -8 m/yr in some locations. This data set provides a unique opportunity to study long-term shoreline change in an area hit by frequent storm events while relatively uninfluenced by human interventions and the effects of tidal inlets. Accurate predictions of long-term shoreline change may require a model that accurately resolves surf zone processes and sediment transport patterns. Conventional methods for predicting shoreline change, such as one-line models and regression of shoreline positions, have been designed for computational efficiency. These methods, however, not only carry several underlying restrictions (validity for small angles of wave approach, the assumption that bottom contours and the shoreline are parallel, a depth of closure, etc.), but their empirical estimates of sediment transport rates in the surf zone have also been shown to vary greatly from the calculations of process-based hydrodynamic models. We focus on hindcasting long-term shoreline change using components of the process-based, three-dimensional Coupled Ocean-Atmosphere-Wave-Sediment Transport modeling system (COAWST). COAWST is forced with historical predictions of atmospheric and oceanographic data from public-domain global models. 
Through a coupled, concurrent grid-refinement approach in COAWST, the finest grid, with a resolution of O(10 m), which covers the surf zone along the section of interest, is forced at its spatial boundaries with waves and currents computed on grids that cover the U.S. East Coast at resolutions as coarse as O(1 km). The computed patterns of the gradients of surf-zone-integrated longshore sediment transport rates are compared with the observed shoreline change.
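
    The final comparison rests on the sand-volume balance that links gradients in longshore transport to shoreline change, dy/dt = -(1/d)·dQs/dx. A minimal sketch of that balance follows; the transport curve, grid spacing, and closure depth are invented illustrative values, not COAWST output.

```python
import numpy as np

# Alongshore positions (m) and longshore sediment transport rate Qs (m^3/yr)
# at each position -- hypothetical values for illustration.
x = np.linspace(0.0, 65000.0, 131)   # 65 km of coast, 500 m spacing
qs = 2.0e5 * (1.0 + 0.3 * np.sin(2 * np.pi * x / 20000.0))

d_closure = 8.0  # assumed active-profile (closure) depth, m

# Conservation of sand volume: dy/dt = -(1/d) * dQs/dx
dqs_dx = np.gradient(qs, x)              # m^3/yr per m of coast
shoreline_rate = -dqs_dx / d_closure     # m/yr; negative means erosion

print(f"max erosion rate:   {shoreline_rate.min():.2f} m/yr")
print(f"max accretion rate: {shoreline_rate.max():.2f} m/yr")
```

    A sinusoidal transport curve produces exactly the alternating erosion/accretion pattern the abstract describes; the study's point is that a process-based model supplies more defensible Qs gradients than one-line empiricism.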

  17. Accessible Earth: Enhancing diversity in the Geosciences through accessible course design

    NASA Astrophysics Data System (ADS)

    Bennett, R. A.; Lamb, D. A.

    2017-12-01

The tradition of field-based instruction in the geoscience curriculum, which culminates in a capstone geological field camp, presents an insurmountable barrier to many disabled students who might otherwise choose to pursue geoscience careers. There is a widespread perception that success as a practicing geoscientist requires direct access to outcrops and vantage points available only to those able to traverse inaccessible terrain. Yet many modern geoscience activities are based on remotely sensed geophysical data, data analysis, and computation that take place entirely from within the laboratory. To challenge the perception of geoscience as a career option only for the non-disabled, we have created the capstone Accessible Earth Study Abroad Program, an alternative to geologic field camp for all students, with a focus on modern geophysical observation systems, computational thinking, data science, and professional development. In this presentation, we will review common pedagogical approaches in geosciences and current efforts to make the field more inclusive. We will review curricular access and inclusivity relative to a wide range of learners and provide examples of accessible course design based on our experiences in teaching a study abroad course in central Italy, and our plans for ongoing assessment, refinement, and dissemination of the effectiveness of our efforts.

  18. QADATA user's manual; an interactive computer program for the retrieval and analysis of the results from the external blind sample quality-assurance project of the U.S. Geological Survey

    USGS Publications Warehouse

    Lucey, K.J.

    1990-01-01

The U.S. Geological Survey conducts an external blind sample quality-assurance project for its National Water Quality Laboratory in Denver, Colorado, based on the analysis of reference water samples. Reference samples containing selected inorganic and nutrient constituents are disguised as environmental samples at the Survey's office in Ocala, Florida, and are sent periodically through other Survey offices to the laboratory. The results of this blind sample project indicate the quality of analytical data produced by the laboratory. This report provides instructions on the use of QADATA, an interactive, menu-driven program that allows users to retrieve the results of the blind sample quality-assurance project. The QADATA program, which is available on the U.S. Geological Survey's national computer network, accesses a blind sample database that contains more than 50,000 determinations from the last five water years for approximately 40 constituents at various concentrations. The data can be retrieved from the database for any user-defined time period and for any or all available constituents. After the user defines the retrieval, the program prepares statistical tables, control charts, and precision plots and generates a report which can be transferred to the user's office through the computer network. A discussion of the interpretation of the program output is also included. This quality-assurance information will permit users to document the quality of the analytical results received from the laboratory. The blind sample data are entered into the database within weeks after being produced by the laboratory and can be retrieved to meet the needs of specific projects or programs. (USGS)

  19. Selected Streamflow Statistics for Streamgaging Stations in Delaware, 2003

    USGS Publications Warehouse

    Ries, Kernell G.

    2004-01-01

    Flow-duration and low-flow frequency statistics were calculated for 15 streamgaging stations in Delaware, in cooperation with the Delaware Geological Survey. The flow-duration statistics include the 1-, 2-, 5-, 10-, 20-, 30-, 40-, 50-, 60-, 70-, 80-, 90-, 95-, 98-, and 99-percent duration discharges. The low-flow frequency statistics include the average discharges for 1, 7, 14, 30, 60, 90, and 120 days that recur, on average, once in 1.01, 2, 5, 10, 20, 50, and 100 years. The statistics were computed using U.S. Geological Survey computer programs that can be downloaded from the World Wide Web at no cost. The computer programs automate standard U.S. Geological Survey methods for computing the statistics. Documentation is provided at the Web sites for the individual programs. The computed statistics are presented in tabular format on a separate page for each station, along with the station name, station number, the location, the period of record, and remarks.
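
    The two families of statistics computed here — flow-duration percentiles and n-day low-flow minima — reduce to simple operations on daily discharge records. A minimal sketch using synthetic data follows; the lognormal flows, 10-year record length, and parameter values are illustrative, not Delaware records, and the USGS programs additionally fit frequency curves (such as log-Pearson Type III) to the annual minima to obtain recurrence intervals.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 10 years of daily mean discharge (cfs)
n_years, days = 10, 365
flows = rng.lognormal(mean=np.log(50.0), sigma=0.6, size=(n_years, days))

def duration_discharge(q, p):
    """Discharge equaled or exceeded p percent of the time."""
    return np.percentile(q, 100 - p)

q90 = duration_discharge(flows.ravel(), 90)   # exceeded 90% of the time

# Annual minimum 7-day mean flow, the basis of low-flow statistics like 7Q10
kernel = np.ones(7) / 7.0
min7 = np.array([np.convolve(year, kernel, mode="valid").min()
                 for year in flows])
print(f"90-percent duration discharge: {q90:.1f} cfs")
print(f"annual 7-day minimum flows: {np.round(min7, 1)}")
```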

  20. Use of Groundwater Lifetime Expectancy for the Performance Assessment of Deep Geologic Radioactive Waste Repositories.

    NASA Astrophysics Data System (ADS)

    Cornaton, F.; Park, Y.; Normani, S.; Sudicky, E.; Sykes, J.

    2005-12-01

Long-term solutions for the disposal of toxic wastes usually involve isolation of the wastes in a deep subsurface geologic environment. In the case of spent nuclear fuel, the safety of the host repository depends on two main barriers: the engineered barrier and the natural geological barrier. If radionuclide leakage occurs from the engineered barrier, the geological medium represents the ultimate barrier that is relied upon to ensure safety. Consequently, an evaluation of radionuclide travel times from the repository to the biosphere is critically important in a performance assessment analysis. In this study, we develop a travel time framework based on the concept of groundwater lifetime expectancy as a safety indicator. Lifetime expectancy characterizes the time radionuclides will spend in the subsurface after their release from the repository and prior to discharging into the biosphere. The probability density function of lifetime expectancy is computed throughout the host rock by solving the backward-in-time solute transport equation subject to a properly posed set of boundary conditions. It can then be used to define optimal repository locations. In a second step, the risk associated with selected sites can be evaluated by simulating an appropriate contaminant release history. The proposed methodology is applied in the context of a typical Canadian Shield environment. Based on a statistically generated three-dimensional network of fracture zones embedded in the granitic host rock, the sensitivity and uncertainty of lifetime expectancy to the hydraulic and dispersive properties of the fracture network, including the impact of conditioning via their surface expressions, are computed in order to demonstrate the utility of the methodology.
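
    The paper obtains the lifetime-expectancy distribution by solving a backward-in-time transport equation; a crude forward analogue — random-walk particle tracking of travel times in one dimension — conveys the same idea of a travel-time probability density. All parameter values below (distance, velocity, dispersion, particle count) are invented for illustration, not Canadian Shield site values.

```python
import numpy as np

rng = np.random.default_rng(1)

# 1-D advective-dispersive travel from a repository to the biosphere,
# approximated by random-walk particle tracking.
L = 100.0   # distance to the discharge zone, m (assumed)
v = 0.5     # mean pore velocity, m/yr (assumed)
D = 2.0     # dispersion coefficient, m^2/yr (assumed)
dt = 0.5    # time step, yr

def travel_time(rng):
    x, t = 0.0, 0.0
    while x < L:
        # Advective drift plus a dispersive random step
        x += v * dt + rng.normal(0.0, np.sqrt(2.0 * D * dt))
        x = max(x, 0.0)   # reflect particles back at the repository wall
        t += dt
    return t

times = np.array([travel_time(rng) for _ in range(500)])
print(f"median lifetime expectancy: {np.median(times):.0f} yr")
```

    The empirical histogram of `times` plays the role of the lifetime-expectancy PDF; the backward-in-time formulation in the paper delivers that PDF at every point of the domain in a single solve instead of one particle ensemble per location.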

  1. New evidence for long-distance fluid migration within the Earth's crust

    NASA Astrophysics Data System (ADS)

    Person, M.; Baumgartner, L.

    1995-07-01

During the past decade, geologists have come to appreciate the interconnectedness of hydrologic, tectonic, thermal, and geochemical processes operating within the Earth's continental crust [Oliver, 1992]. This has led to a new geologically based conceptual model of hydrology which is crustal-scale and is centered in plate tectonics theory (Fig. 1). From a geological perspective, the tectonic and thermal processes which drive plate motion are also responsible, either directly or indirectly, for inducing fluid motion across and through the continents. Supporting evidence for this emerging paradigm is based on observations of pervasive rock-water interactions associated with geologic processes as diverse as the chemical alteration of crustal rocks [Shelton et al., 1992; Elliott and Aronson, 1993; McManus and Hanor, 1993; Ague, 1991, 1994], devolatilization of minerals during burial and consequent metamorphism [Cox and Etheridge, 1989], the formation of energy and mineral deposits [Garven et al., 1993; Cathles et al., 1993], remagnetization of ancient sedimentary rocks [McCabe and Elmore, 1989], the tectonic deformation of sedimentary basins [Oliver, 1992; Ge and Garven, 1992], and the regulation of global climate [Caldeira et al., 1993; Kerrick and Caldeira, 1993, 1994]. This paper summarizes the many recent lines of theoretical, laboratory, and field evidence from diverse disciplines within the Earth Sciences supporting this emerging view of crustal-scale hydrology. Evidence for two types of long-distance fluid migration is highlighted: vertical pore water movement through crystalline rocks to depths greater than six km and lateral groundwater movement through sedimentary basins over hundreds of km. Also emphasized are the many driving mechanisms for fluid motion which are not typically considered in water quality and water supply investigations. 
Some geologic terms used in this paper, which may be unfamiliar to the reader, are defined in geologic dictionaries [American Geologic Institute, 1976].

  2. SAM 2.1—A computer program for plotting and formatting surveying data for estimating peak discharges by the slope-area method

    USGS Publications Warehouse

    Hortness, J.E.

    2004-01-01

The U.S. Geological Survey (USGS) measures discharge in streams using several methods. However, measurement of peak discharges is often impossible or impractical due to difficult access, the inherent danger of making measurements during flood events, and the timing often associated with flood events. Thus, many peak discharge values often are calculated after the fact by use of indirect methods. The most common indirect method for estimating peak discharges in streams is the slope-area method. This, like other indirect methods, requires measuring the flood profile through detailed surveys. Processing the survey data for efficient entry into computer streamflow models can be time-consuming; SAM 2.1 is a program designed to expedite that process. The SAM 2.1 computer program is designed to be run in the field on a portable computer. The program processes digital surveying data obtained from an electronic surveying instrument during slope-area measurements. After all measurements have been completed, the program generates files to be input into the SAC (Slope-Area Computation program; Fulford, 1994) or HEC-RAS (Hydrologic Engineering Center-River Analysis System; Brunner, 2001) computer streamflow models so that an estimate of the peak discharge can be calculated.
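
    At its core, the slope-area method applies Manning's equation to surveyed cross sections. A single-section sketch is below; the cross-section area, wetted perimeter, slope, and roughness coefficient are invented for illustration, and the real SAC and HEC-RAS computations do considerably more (multiple sections, expansion/contraction losses, energy balancing).

```python
import math

def slope_area_discharge(area_ft2, wetted_perimeter_ft, slope, n):
    """Peak discharge (ft^3/s) from Manning's equation (US units),
    the relation underlying the slope-area method for one section."""
    r = area_ft2 / wetted_perimeter_ft            # hydraulic radius, ft
    return (1.486 / n) * area_ft2 * r ** (2.0 / 3.0) * math.sqrt(slope)

# Hypothetical surveyed high-water cross section
q = slope_area_discharge(area_ft2=450.0, wetted_perimeter_ft=120.0,
                         slope=0.002, n=0.035)
print(f"estimated peak discharge: {q:.0f} ft^3/s")
```

    Everything in SAM 2.1's output feeds numbers like these into SAC or HEC-RAS; the program's contribution is organizing the survey shots into the required input files, not the hydraulics itself.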

  3. The Geo Data Portal an Example Physical and Application Architecture Demonstrating the Power of the "Cloud" Concept.

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Booth, N.; Walker, J.; Kunicki, T.

    2012-12-01

The U.S. Geological Survey Center for Integrated Data Analytics (CIDA), in keeping with the President's Digital Government Strategy and the Department of Interior's IT Transformation initiative, has evolved its data center and application architecture toward the "cloud" paradigm. In this case, "cloud" refers to a goal of developing services that may be distributed to infrastructure anywhere on the Internet. This transition has taken place across the entire data management spectrum, from data center location to physical hardware configuration to software design and implementation. In CIDA's case, physical hardware resides in Madison at the Wisconsin Water Science Center, in South Dakota at the Earth Resources Observation and Science Center (EROS), and in the near future at a DOI-approved commercial vendor. Tasks normally conducted on desktop-based GIS software with local copies of data in proprietary formats are now done using browser-based interfaces to web processing services drawing on a network of standard data-source web services. Organizations are gaining economies of scale through data center consolidation and the creation of private cloud services, as well as taking advantage of the commoditization of data processing services. Leveraging open standards for data and data management takes advantage of this commoditization and provides the means to reliably build distributed, service-based systems. This presentation will use CIDA's experience as an illustration of the benefits and hurdles of moving to the cloud. Replicating, reformatting, and processing large data sets, such as downscaled climate projections, traditionally present a substantial challenge to environmental science researchers who need access to data subsets and derived products. The USGS Geo Data Portal (GDP) project uses cloud concepts to help earth system scientists access subsets, spatial summaries, and derivatives of commonly needed very large data sets. 
The GDP project has developed a reusable architecture and advanced processing services that currently access archives hosted at Lawrence Livermore National Lab, Oregon State University, the University Corporation for Atmospheric Research, and the U.S. Geological Survey, among others. Several examples of how the GDP project uses cloud concepts will be highlighted in this presentation: 1) The high-bandwidth network connectivity of large data centers reduces the need for data replication and storage local to processing services. 2) Standard data-serving web services, like OPeNDAP, Web Coverage Services, and Web Feature Services, allow GDP services to remotely access custom subsets of data in a variety of formats, further reducing the need for data replication and reformatting. 3) The GDP services use standard web service APIs to allow browser-based user interfaces to run complex and compute-intensive processes for users from any computer with an Internet connection. The combination of physical infrastructure and application architecture implemented for the Geo Data Portal project offers an operational example of how distributed data and processing on the cloud can be used to aid earth system science.

  4. Teaching Tectonics to Undergraduates with Web GIS

    NASA Astrophysics Data System (ADS)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curricula can support spatial learning. We present Web-based visualization and analysis tools developed with Javascript APIs to enhance tectonics curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in exploration of earthquake and volcano data sets, a subduction and elevation profile tool which facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality on the Web GIS. The Web GIS is platform independent and can be used on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation using two- and three-dimensional visualization and analytical software. Coverages which allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class and are freely available on the Web.

  5. A Physically Based Coupled Chemical and Physical Weathering Model for Simulating Soilscape Evolution

    NASA Astrophysics Data System (ADS)

    Willgoose, G. R.; Welivitiya, D.; Hancock, G. R.

    2015-12-01

A critical missing link in existing landscape evolution models is a dynamic soil evolution model in which soils co-evolve with the landform. Work by the authors over the last decade has demonstrated a computationally manageable model for soil profile evolution (soilscape evolution) based on physical weathering. For chemical weathering it is clear that full geochemistry models such as CrunchFlow and PHREEQC are too computationally intensive to couple to existing soilscape and landscape evolution models. This paper presents a simplification of CrunchFlow chemistry and physics that makes the task feasible, and generalises it for hillslope geomorphology applications. Results from this simplified model will be compared with field data for soil pedogenesis. Other researchers have previously proposed a number of very simple weathering functions (e.g. exponential, humped, reverse exponential) as conceptual models of the in-profile weathering process. The paper will show that all of these functions are possible for specific combinations of in-soil environmental, geochemical and geologic conditions, and the presentation will outline the key variables controlling which of these conceptual models can be realistic models of in-profile processes and under what conditions. The presentation will finish by discussing the coupling of this model with a physical weathering model, and will show sample results from our SSSPAM soilscape evolution model to illustrate the implications of including chemical weathering in the soilscape evolution model.
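
    The conceptual weathering functions named above (exponential, humped, reverse exponential) are simple curves of weathering rate versus depth. A sketch with invented parameter values shows their shapes; these functional forms and decay scales are illustrative guesses, and the paper's point is that each emerges only under specific environmental, geochemical, and geologic conditions.

```python
import numpy as np

# Three conceptual in-profile weathering-rate functions of depth z (m),
# normalized to a peak rate of 1. All parameters are illustrative.
def exponential(z, z0=0.5):
    # Fastest weathering at the surface, decaying with depth
    return np.exp(-z / z0)

def reverse_exponential(z, z0=0.5, zmax=2.0):
    # Fastest weathering at the base of the profile
    return np.exp(-(zmax - z) / z0)

def humped(z, zpeak=0.3, z0=0.5):
    # Peak weathering just below the surface, decaying above and below
    return np.exp(-np.abs(z - zpeak) / z0)

z = np.linspace(0.0, 2.0, 201)
print("depth of peak humped rate:", z[np.argmax(humped(z))], "m")
```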

  6. Field estimates of gravity terrain corrections and Y2K-compatible method to convert from gravity readings with multiple base stations to tide- and long-term drift-corrected observations

    USGS Publications Warehouse

    Plouff, Donald

    2000-01-01

Gravity observations are directly made or are obtained from other sources by the U.S. Geological Survey in order to prepare maps of the anomalous gravity field and consequently to interpret the subsurface distribution of rock densities and associated lithologic or geologic units. Observations are made in the field with gravity meters at new locations and at reoccupations of previously established gravity "stations." This report illustrates an interactively-prompted series of steps needed to convert gravity "readings" to values that are tied to established gravity datums and includes computer programs to implement those steps. Inasmuch as individual gravity readings have small variations, gravity-meter (instrument) drift may not be smoothly variable, and accommodations may be needed for ties to previously established stations, the reduction process is iterative. Decision-making by the program user is prompted by lists of best values and graphical displays. Notes about irregularities of topography, which affect the value of observed gravity but are not shown in sufficient detail on topographic maps, must be recorded in the field. This report illustrates ways to record field notes (distances, heights, and slope angles) and includes computer programs to convert field notes to gravity terrain corrections. This report includes approaches that may serve as models for other applications, for example: portrayal of system flow; style of quality control to document and validate computer applications; lack of dependence on proprietary software except source code compilation; method of file-searching with a dwindling list; interactive prompting; computer code to write directly in the PostScript (Adobe Systems Incorporated) printer language; and highlighting the four-digit year on the first line of time-dependent data sets for assured Y2K compatibility. Computer source codes provided are written in the Fortran scientific language. 
In order for the programs to operate, they first must be converted (compiled) into an executable form on the user's computer. Although program testing was done in a UNIX (tradename of American Telephone and Telegraph Company) computer environment, it is anticipated that only a system-dependent date-and-time function may need to be changed for adaptation to other computer platforms that accept standard Fortran code.
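
    The long-term drift correction at the heart of the reduction can be sketched independently of the report's Fortran codes: assume drift is linear between two occupations of the same base station and remove it from the intervening readings. In this toy loop the times, meter readings, and base-station value are invented, and only linear drift is treated; the report's iterative procedure also accommodates tides, non-smooth drift, and ties among multiple bases.

```python
# Linear instrument-drift correction for one gravity-meter loop.
# Times are in hours, readings and gravity values in mGal.
def drift_corrected(readings, base_value):
    """readings: list of (time, observed) pairs whose FIRST and LAST
    entries are reoccupations of the same base station."""
    t0, g0 = readings[0]
    t1, g1 = readings[-1]
    drift_rate = (g1 - g0) / (t1 - t0)          # mGal per hour
    out = []
    for t, g in readings:
        corrected = g - drift_rate * (t - t0)   # remove accumulated drift
        out.append((t, base_value + (corrected - g0)))
    return out

# Hypothetical loop: base, two field stations, base again
loop = [(0.0, 2534.120), (1.5, 2531.840), (3.0, 2533.015), (4.0, 2534.160)]
stations = drift_corrected(loop, base_value=979_534.210)
for t, g in stations:
    print(f"t = {t:4.1f} h   g = {g:.3f} mGal")
```

    Because the correction removes exactly the misclosure between the two base occupations, the loop "closes" under the linear-drift assumption: the first and last entries of `stations` agree.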

  7. A computer software system for integration and analysis of grid-based remote sensing data with other natural resource data. Remote Sensing Project

    NASA Technical Reports Server (NTRS)

    Tilmann, S. E.; Enslin, W. R.; Hill-Rowley, R.

    1977-01-01

A computer-based information system is described that is designed to assist in the integration of commonly available spatial data for regional planning and resource analysis. The Resource Analysis Program (RAP) provides a variety of analytical and mapping phases for single-factor or multi-factor analyses. The unique analytical and graphic capabilities of RAP are demonstrated with a study conducted in Windsor Township, Eaton County, Michigan. Soil, land cover/use, topographic and geological maps were used as a database to develop an eleven-map portfolio. The major themes of the portfolio are land cover/use, non-point water pollution, waste disposal, and ground water recharge.

  8. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.

  9. Culvert analysis program for indirect measurement of discharge

    USGS Publications Warehouse

    Fulford, Janice M.; ,

    1993-01-01

A program based on the U.S. Geological Survey (USGS) methods for indirectly computing peak discharges through culverts allows users to employ input data formats used by the water surface profile program (WSPRO). The program can be used to compute discharge rating surfaces or curves that describe the behavior of flow through a particular culvert, or to compute discharges from measurements of upstream water-surface elevations. It is based on the gradually varied flow equations and has been adapted slightly to provide solutions that minimize the need for the user to distinguish between different flow regimes. The program source is written in Fortran 77 and has been run on minicomputers and personal computers. The program does not use or require graphics capability, a color monitor, or a mouse.

  10. Insights on WWW-based geoscience teaching: Climbing the first year learning cliff

    NASA Astrophysics Data System (ADS)

    Lamberson, Michelle N.; Johnson, Mark; Bevier, Mary Lou; Russell, J. Kelly

    1997-06-01

In early 1995, The University of British Columbia Department of Geological Sciences (now Earth and Ocean Sciences) initiated a project that explored the effectiveness of the World Wide Web as a teaching and learning medium. Four decisions made at the outset of the project have guided the department's educational technology plan: (1) over 90% of funding received from educational technology grants was committed to personnel; (2) materials developed are modular in design; (3) a database approach was taken to resource development; and (4) a strong commitment was made to student involvement in courseware development. The project comprised development of a web site for an existing core course: Geology 202, Introduction to Petrology. The web site is a gateway to course information, content, resources, exercises, and several searchable databases (images, petrologic definitions, and minerals in thin section). Material was developed on either an IBM or UNIX machine, ported to a UNIX platform, and is accessed using the Netscape browser. The resources consist primarily of HTML files or CGI scripts with associated text, images, sound, digital movies, and animations. Students access the web site from the departmental student computer facility, from home, or from a computer station in the petrology laboratory. Results of a survey of the Geol 202 students indicate that they found the majority of the resources useful, and the site is being expanded. The Geology 202 project had a "trickle-up" effect throughout the department: prior to this project, there was minimal use of Internet resources in lower-level geology courses. 
By the end of the 1996-1997 academic year, we anticipate that at least 17 Earth and Ocean Science courses will have a WWW site for one or all of the following uses: (1) presenting basic information; (2) accessing lecture images; (3) providing a jumping-off point for exploring related WWW sites; (4) conducting on-line exercises; and/or (5) providing a communications forum for students and faculty via a Hypernews group. Url http://www.science.ubc.ca/

  11. Tabular data and graphical images in support of the U.S. Geological Survey National Oil and Gas Assessment -- San Joaquin Basin (5010): Chapter 28 in Petroleum systems and geologic assessment of oil and gas in the San Joaquin Basin Province, California

    USGS Publications Warehouse

    Klett, T.R.; Le, P.A.

    2007-01-01

This chapter describes data used in support of the assessment process. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on this CD–ROM. The data can be imported by computers and software without transcription by the reader from the portable document format (.pdf) files of the text. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in raw form as tab-delimited text files (.tab files).

  12. Chapter 2: Tabular Data and Graphical Images in Support of the U.S. Geological Survey National Oil and Gas Assessment - The Wind River Basin Province

    USGS Publications Warehouse

    Klett, T.R.; Le, P.A.

    2007-01-01

This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on this CD-ROM. The data can be imported by computers and software without transcription by the reader from the Portable Document Format files (.pdf files) of the text. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in raw form as tab-delimited text files (.tab files).

  13. Balancing Accuracy and Computational Efficiency for Ternary Gas Hydrate Systems

    NASA Astrophysics Data System (ADS)

    White, M. D.

    2011-12-01

Geologic accumulations of natural gas hydrates hold vast organic carbon reserves, which have the potential of meeting global energy needs for decades. Estimates of vast amounts of global natural gas hydrate deposits make them an attractive unconventional energy resource. As with other unconventional energy resources, the challenge is to economically produce the natural gas fuel. The gas hydrate challenge is principally technical. Meeting that challenge will require innovation, but more importantly, scientific research to understand the resource and its characteristics in porous media. Producing natural gas from gas hydrate deposits requires releasing CH4 from solid gas hydrate. The conventional way to release CH4 is to dissociate the hydrate by changing the pressure and temperature conditions to those where the hydrate is unstable. The guest-molecule exchange technology releases CH4 by replacing it with a more thermodynamically stable molecule (e.g., CO2, N2). This technology has three advantages: 1) it sequesters greenhouse gas, 2) it releases energy via an exothermic reaction, and 3) it retains the hydraulic and mechanical stability of the hydrate reservoir. Numerical simulation of the production of gas hydrates from geologic deposits requires accounting for coupled processes: multifluid flow, mobile and immobile phase appearances and disappearances, heat transfer, and multicomponent thermodynamics. The ternary gas hydrate system comprises five components (i.e., H2O, CH4, CO2, N2, and salt) and the potential for six phases (i.e., aqueous, liquid CO2, gas, hydrate, ice, and precipitated salt). The equation of state for ternary hydrate systems has three requirements: 1) phase occurrence, 2) phase composition, and 3) phase properties. 
Numerical simulation of the production of geologic accumulations of gas hydrates has historically suffered from relatively slow execution times, compared with other multifluid, porous media systems, due to strong nonlinearities and phase transitions. This paper describes and demonstrates a numerical solution scheme for ternary hydrate systems that seeks a balance between accuracy and computational efficiency. This scheme uses a generalized cubic equation of state, functional forms for the hydrate equilibria and cage occupancies, a variable-switching scheme for phase transitions, and kinetic exchange of hydrate formers (i.e., CH4, CO2, and N2) between the mobile phases (i.e., aqueous, liquid CO2, and gas) and the hydrate phase. Accuracy of the scheme will be evaluated by comparing property values and phase equilibria against experimental data. Computational efficiency of the scheme will be evaluated by comparing the base scheme against variants. The application of interest will be the production of natural gas from a hydrate deposit in a geologic formation using the guest-molecule exchange process, in which a mixture of CO2 and N2 is injected into the formation. During the guest-molecule exchange, CO2 and N2 will predominantly replace CH4 in the large and small cages of the sI structure, respectively.
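One building block named above, a generalized cubic equation of state, can be sketched generically. The snippet below finds the largest compressibility root of the Peng-Robinson cubic by scanning and bisection; it is a textbook illustration using standard CO2 critical constants, not the paper's actual solution scheme:

```python
def pr_z_factor(T, P, Tc, Pc, omega, R=8.314):
    """Largest real root (vapor/supercritical Z) of the Peng-Robinson cubic EOS."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - (T / Tc) ** 0.5)) ** 2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A = a * P / (R * T) ** 2
    B = b * P / (R * T)

    def f(Z):  # cubic in the compressibility factor Z
        return Z**3 - (1 - B) * Z**2 + (A - 3*B*B - 2*B) * Z - (A*B - B*B - B**3)

    # Scan for sign changes above the co-volume limit B, then bisect each bracket.
    roots = []
    zs = [B + i * (3.0 - B) / 3000 for i in range(3001)]
    for lo, hi in zip(zs, zs[1:]):
        if f(lo) * f(hi) < 0:
            for _ in range(80):
                mid = 0.5 * (lo + hi)
                if f(lo) * f(mid) <= 0:
                    hi = mid
                else:
                    lo = mid
            roots.append(0.5 * (lo + hi))
    return max(roots)

# CO2 slightly above its critical point (Tc = 304.13 K, Pc = 7.377 MPa, omega = 0.224).
z_co2 = pr_z_factor(310.0, 8.0e6, 304.13, 7.377e6, 0.224)
```

A production code would use an analytical cubic solver and handle liquid/vapor root selection; the scan-and-bisect loop is kept here only for transparency.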

  14. GDA (Geologic Data Assistant), an ArcPad extension for geologic mapping: code, prerequisites, and instructions

    USGS Publications Warehouse


    2006-01-01

    GDA (Geologic Data Assistant) is an extension to ArcPad, a mobile mapping software program by Environmental Systems Research Institute (ESRI) designed to run on personal digital assistant (PDA) computers. GDA and ArcPad allow a PDA to replace the paper notebook and field map traditionally used for geologic mapping. GDA allows easy collection of field data.

  15. Comparison of Pore-Network and Lattice Boltzmann Models for Pore-Scale Modeling of Geological Storage of CO2 in Natural Reservoir Rocks

    NASA Astrophysics Data System (ADS)

    Kohanpur, A. H.; Chen, Y.; Valocchi, A. J.; Tudek, J.; Crandall, D.

    2016-12-01

CO2-brine flow in deep natural rocks is the focus of attention in geological storage of CO2. Understanding rock and flow properties at the pore scale is a vital component of field-scale modeling and prediction of the fate of injected CO2. There are many challenges in working at the pore scale, such as the size and selection of a representative elementary volume (REV), particularly for material with complex geometry and heterogeneity, and the high computational costs. These issues factor into trade-offs that need to be made in choosing and applying pore-scale models. On one hand, pore-network modeling (PNM) simplifies the geometry and flow equations but can provide characteristic curves on fairly large samples. On the other hand, the lattice Boltzmann method (LBM) solves the Navier-Stokes equations on the real geometry but is limited to small samples due to its high computational costs. Thus, both methods have advantages but also face challenges, which warrants a more detailed comparison and evaluation. In this study, we used industrial and micro-CT scans of actual reservoir rock samples to characterize pore structure at different resolutions. We ran LBM models directly on the characterized geometry and PNM on the equivalent extracted 3D network to determine single- and two-phase flow properties during drainage and imbibition processes. Specifically, connectivity, absolute permeability, relative permeability curves, capillary pressure curves, and interface locations are compared between the models. We also performed simulations on several subsamples from different locations, with different domain sizes and orientations, to encompass analysis of heterogeneity and isotropy. This work is primarily supported as part of the Center for Geologic Storage of CO2, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science, and partially supported by the International Institute for Carbon-Neutral Energy Research (WPI-I2CNER) based at Kyushu University, Japan.
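For the absolute permeability comparison, both models ultimately reduce a simulated steady flow through a subsample to Darcy's law; a minimal sketch with hypothetical sample values:

```python
def absolute_permeability(Q, mu, L, A, dP):
    """Darcy's law rearranged for permeability: k = Q*mu*L / (A*dP), SI units (m^2).

    Q : volumetric flow rate (m^3/s), mu : viscosity (Pa*s),
    L : sample length (m), A : cross-sectional area (m^2), dP : pressure drop (Pa).
    """
    return Q * mu * L / (A * dP)

M2_PER_DARCY = 9.869233e-13  # unit conversion from m^2 to darcy

# Hypothetical 1 mm cubic subsample, water-like fluid, steady flow under a 1 kPa drop.
k = absolute_permeability(Q=1.0e-9, mu=1.0e-3, L=1.0e-3, A=1.0e-6, dP=1.0e3)
k_darcy = k / M2_PER_DARCY
```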

  16. 3D Voronoi grid dedicated software for modeling gas migration in deep layered sedimentary formations with TOUGH2-TMGAS

    NASA Astrophysics Data System (ADS)

    Bonduà, Stefano; Battistelli, Alfredo; Berry, Paolo; Bortolotti, Villiam; Consonni, Alberto; Cormio, Carlo; Geloni, Claudio; Vasini, Ester Maria

    2017-11-01

A full three-dimensional (3D) unstructured grid permits a great degree of flexibility when performing accurate numerical reservoir simulations. However, when the Integral Finite Difference Method (IFDM) is used for spatial discretization, constraints (arising from the required orthogonality between the segment connecting block nodes and the interface area between blocks) pose difficulties in the creation of grids with irregularly shaped blocks. The full 3D Voronoi approach guarantees respect of the IFDM constraints and allows generation of grids that conform to geological formations and structural objects while, at the same time, providing higher grid resolution in volumes of interest. In this work, we present dedicated pre- and post-processing gridding software tools for the TOUGH family of numerical reservoir simulators, developed by the Geothermal Research Group of the DICAM Department, University of Bologna. VORO2MESH is a new software coded in C++, based on the voro++ library, allowing computation of the 3D Voronoi tessellation for a given domain and the creation of a ready-to-use TOUGH2 MESH file. If a set of geological surfaces is available, the software can directly generate the set of Voronoi seed points used for tessellation. In order to reduce the number of connections and so decrease computation time, VORO2MESH can produce a mixed grid with regular blocks (orthogonal prisms) and irregular blocks (polyhedral Voronoi blocks) at the points of contact between different geological formations. In order to visualize 3D Voronoi grids together with the results of numerical simulations, the functionality of the TOUGH2Viewer post-processor has been extended. We describe an application of VORO2MESH and TOUGH2Viewer to validate the two tools. The case study deals with the simulation of the migration of gases in deep layered sedimentary formations at basin scale using TOUGH2-TMGAS.
A comparison between the simulation performances of unstructured and structured grids is presented.
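The reason a Voronoi grid satisfies the IFDM orthogonality constraint is that each block interface lies on the perpendicular bisector of the segment joining two seed nodes. The sketch below approximates a 2D Voronoi tessellation by nearest-seed assignment on a raster and recovers block areas (the 2D analogue of block volumes in a TOUGH2 MESH file); it is an illustration, not the voro++ algorithm used by VORO2MESH:

```python
def voronoi_areas(seeds, nx=200, ny=200, lx=1.0, ly=1.0):
    """Approximate 2D Voronoi cell areas by assigning each raster cell
    center to its nearest seed point."""
    dx, dy = lx / nx, ly / ny
    areas = [0.0] * len(seeds)
    for i in range(nx):
        x = (i + 0.5) * dx
        for j in range(ny):
            y = (j + 0.5) * dy
            nearest = min(range(len(seeds)),
                          key=lambda s: (x - seeds[s][0])**2 + (y - seeds[s][1])**2)
            areas[nearest] += dx * dy
    return areas

# Two symmetric seeds: the interface is the perpendicular bisector x = 0.5,
# so each block gets half of the unit-square domain.
seeds = [(0.25, 0.5), (0.75, 0.5)]
areas = voronoi_areas(seeds)
```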

  17. Remote sensing and GIS-based prediction and assessment of copper-gold resources in Thailand

    NASA Astrophysics Data System (ADS)

    Yang, Shasha; Wang, Gongwen; Du, Wenhui; Huang, Luxiong

    2014-03-01

Quantitative integration of geological information is a frontier and hotspot of prospecting decision research worldwide. The forming process of large-scale Cu-Au deposits is influenced by complicated geological events and restricted by various geological factors (stratum, structure and alteration). In this paper, the copper-gold deposit district of Thailand is used as a case study: geological anomaly theory is applied together with a typical copper-gold metallogenic model, and ETM+ remote sensing images, geological maps and a mineral geology database for the study area are combined using GIS techniques. This yields ore-forming information such as geological information (strata, linear and ring faults, intrusions) and remote sensing information (hydroxyl alteration, iron alteration, linear-ring structures), from which Cu-Au prospect targets were identified using a weights-of-evidence model. The research results show that remote sensing and geological data can be combined to quickly predict and assess mineral resource exploration targets in a regional metallogenic belt.
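The weights-of-evidence calculation behind such target selection can be sketched for a single binary evidence layer; the deposit and unit-cell counts below are hypothetical:

```python
import math

def weights_of_evidence(dep_on, dep_off, cells_on, cells_off):
    """W+, W- and contrast C for one binary evidence layer.

    dep_on/dep_off     : known deposits on / off the layer
    cells_on/cells_off : unit cells on / off the layer
    """
    n_dep = dep_on + dep_off
    n_cells = cells_on + cells_off
    p_b_given_d = dep_on / n_dep                            # P(B | deposit)
    p_b_given_nd = (cells_on - dep_on) / (n_cells - n_dep)  # P(B | no deposit)
    w_plus = math.log(p_b_given_d / p_b_given_nd)
    w_minus = math.log((1.0 - p_b_given_d) / (1.0 - p_b_given_nd))
    return w_plus, w_minus, w_plus - w_minus

# Hypothetical layer: 8 of 10 known deposits fall on 200 of 1000 unit cells.
w_plus, w_minus, contrast = weights_of_evidence(8, 2, 200, 800)
```

A positive W+ with a large contrast C marks the layer as favorable evidence; summing the weights of all layers (under conditional independence) ranks candidate cells as prospect targets.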

  18. Building a base map with AutoCAD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flarity, S.J.

    1989-12-01

The fundamental step in the exploration process is building a base map. Consequently, any serious computer exploration program should be capable of providing base maps. Data used in constructing base maps are available from commercial sources such as Tobin and Petroleum Information. These data sets include line and well data, the line data being latitude-longitude vectors and the well data being identifying text information for wells and their locations. AutoCAD is a commercial program useful in building base maps. Its features include infinite zoom and pan capability, layering, block definition, text dialog boxes, and a command language, AutoLisp. AutoLisp provides more power by allowing the geologist to modify the way the program works. Three AutoLisp routines presented here allow geologists to construct a geologic base map from raw Tobin data. The first program, WELLS.LSP, sets up the map environment for the subsequent programs, WELLADD.LSP and LINEADD.LSP. WELLADD.LSP reads the Tobin data and spots the well symbols and the identifying information. LINEADD.LSP performs the same task on line and textual information contained within the data set.

  19. The Cyborg Astrobiologist: scouting red beds for uncommon features with geological significance

    NASA Astrophysics Data System (ADS)

    McGuire, Patrick Charles; Díaz-Martínez, Enrique; Ormö, Jens; Gómez-Elvira, Javier; Rodríguez-Manfredi, José Antonio; Sebastián-Martínez, Eduardo; Ritter, Helge; Haschke, Robert; Oesker, Markus; Ontrup, Jörg

    2005-04-01

    The `Cyborg Astrobiologist' has undergone a second geological field trial, at a site in northern Guadalajara, Spain, near Riba de Santiuste. The site at Riba de Santiuste is dominated by layered deposits of red sandstones. The Cyborg Astrobiologist is a wearable computer and video camera system that has demonstrated a capability to find uncommon interest points in geological imagery in real time in the field. In this second field trial, the computer vision system of the Cyborg Astrobiologist was tested at seven different tripod positions, on three different geological structures. The first geological structure was an outcrop of nearly homogeneous sandstone, which exhibits oxidized-iron impurities in red areas and an absence of these iron impurities in white areas. The white areas in these `red beds' have turned white because the iron has been removed. The iron removal from the sandstone can proceed once the iron has been chemically reduced, perhaps by a biological agent. In one instance the computer vision system found several (iron-free) white spots to be uncommon and therefore interesting, as well as several small and dark nodules. The second geological structure was another outcrop some 600 m to the east, with white, textured mineral deposits on the surface of the sandstone, at the bottom of the outcrop. The computer vision system found these white, textured mineral deposits to be interesting. We acquired samples of the mineral deposits for geochemical analysis in the laboratory. This laboratory analysis of the crust identifies a double layer, consisting of an internal millimetre-size layering of calcite and an external centimetre-size efflorescence of gypsum. The third geological structure was a 50 cm thick palaeosol layer, with fossilized root structures of some plants. The computer vision system also found certain areas of these root structures to be interesting. 
A quasi-blind comparison of the Cyborg Astrobiologist's interest points for these images with the interest points determined afterwards by a human geologist shows that the Cyborg Astrobiologist concurred with the human geologist 68% of the time (true-positive rate), with a 32% false-positive rate and a 32% false-negative rate. The performance of the Cyborg Astrobiologist's computer vision system was by no means perfect, so there is plenty of room for improvement. However, these tests validate the image-segmentation and uncommon-mapping technique that we first employed at a different geological site (Rivas Vaciamadrid) with somewhat different properties for the imagery.
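The reported rates are simple count ratios; the counts below are hypothetical, chosen only to reproduce the quoted 68% true-positive and 32% false-positive/false-negative figures:

```python
def detection_rates(tp, fp, fn, tn):
    """True-positive, false-positive and false-negative rates from raw counts."""
    tpr = tp / (tp + fn)   # fraction of the geologist's interest points also found
    fnr = fn / (tp + fn)   # fraction the system missed
    fpr = fp / (fp + tn)   # fraction of non-interesting points flagged anyway
    return tpr, fpr, fnr

# Hypothetical counts consistent with the rates reported above.
tpr, fpr, fnr = detection_rates(tp=17, fp=8, fn=8, tn=17)
```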

  20. Global geological mapping of Ganymede

    NASA Astrophysics Data System (ADS)

    Patterson, G. Wesley; Collins, Geoffrey C.; Head, James W.; Pappalardo, Robert T.; Prockter, Louise M.; Lucchitta, Baerbel K.; Kay, Jonathan P.

    2010-06-01

We have compiled a global geological map of Ganymede that represents the most recent understanding of the satellite based on Galileo mission results. This contribution builds on important previous accomplishments in the study of Ganymede utilizing Voyager data and incorporates the many new discoveries that were brought about by examination of Galileo data. We discuss the material properties of geological units defined utilizing a global mosaic of the surface with a nominal resolution of 1 km/pixel assembled by the USGS with the best available Voyager and Galileo regional coverage and high resolution imagery (100-200 m/pixel) of characteristic features and terrain types obtained by the Galileo spacecraft. We also use crater density measurements obtained from our mapping efforts to examine age relationships amongst the various defined units. These efforts have resulted in a more complete understanding of the major geological processes operating on Ganymede, especially the roles of cryovolcanic and tectonic processes in the formation of light materials. They have also clarified the characteristics of the geological units that comprise the satellite's surface, the stratigraphic relationships of those geological units and structures, and the geological history inferred from those relationships. For instance, the characteristics and stratigraphic relationships of dark lineated material and reticulate material suggest they represent an intermediate stage between dark cratered material and light material units.

1. Evaluation of three electronic report processing systems for preparing hydrologic reports of the U.S. Geological Survey, Water Resources Division

    USGS Publications Warehouse

    Stiltner, G.J.

    1990-01-01

In 1987, the Water Resources Division of the U.S. Geological Survey undertook three pilot projects to evaluate electronic report processing systems as a means to improve the quality and timeliness of reports pertaining to water resources investigations. The three projects selected for study included the use of the following configuration of software and hardware: Ventura Publisher software on an IBM model AT personal computer, PageMaker software on a Macintosh computer, and FrameMaker software on a Sun Microsystems workstation. The following assessment criteria were to be addressed in the pilot studies: The combined use of text, tables, and graphics; analysis of time; ease of learning; compatibility with the existing minicomputer system; and technical limitations. It was considered essential that the camera-ready copy produced be in a format suitable for publication. Visual improvement alone was not a consideration. This report consolidates and summarizes the findings of the electronic report processing pilot projects. Text and table files originating on the existing minicomputer system were successfully transferred to the electronic report processing systems in American Standard Code for Information Interchange (ASCII) format. Graphics prepared using a proprietary graphics software package were transferred to all the electronic report processing software through the use of Computer Graphic Metafiles. Graphics from other sources were entered into the systems by scanning paper images. Comparative analysis of time needed to process text and tables by the electronic report processing systems and by conventional methods indicated that, although more time is invested in creating the original page composition for an electronically processed report, substantial time is saved in producing subsequent reports because the format can be stored and re-used by electronic means as a template.
Because of the more compact page layouts, costs of printing the reports were 15% to 25% less than costs of printing the reports prepared by conventional methods. Because the largest report workload in the offices conducting water resources investigations is preparation of Water-Resources Investigations Reports, Open-File Reports, and annual State Data Reports, the pilot studies only involved these projects. (USGS)

  2. PropBase Query Layer: a single portal to UK subsurface physical property databases

    NASA Astrophysics Data System (ADS)

    Kingdon, Andrew; Nayembil, Martin L.; Richardson, Anne E.; Smith, A. Graham

    2013-04-01

Until recently, the delivery of geological information to industry and the public was achieved by geological mapping. Now that computers are pervasively available, 3D geological models can deliver realistic representations of the geometric location of geological units, represented as shells or volumes. The next phase of this process is to populate these models with physical property data that describe subsurface heterogeneity and its associated uncertainty. Achieving this requires the capture and serving of physical, hydrological and other property information from diverse sources. The British Geological Survey (BGS) holds large volumes of subsurface property data, derived both from its own research data collection and from other, often commercially derived, data sources. These data can be voxelated for incorporation into the models to demonstrate property variation within the subsurface geometry. All property data held by BGS have for many years been stored in relational databases to ensure their long-term continuity. However, these databases have, by necessity, complex structures: each contains positional reference data and model information, as well as metadata such as sample identification information and attributes that define the source and processing. Whilst this metadata is critical to assessing the analyses, it also greatly complicates the understanding of variability of the property under assessment, and studying related datasets requires multiple queries, making extraction of physical properties from these databases difficult. The PropBase Query Layer has therefore been created to allow simplified aggregation and extraction of all related data, presenting complex data in simple, mostly denormalized tables that combine information from multiple databases into a single system.
The structure of each relational database is denormalized into a generalised structure, so that all datasets can be viewed together in a common format through a simple interface. Data are re-engineered to facilitate easy loading. The query layer structure comprises tables, procedures, functions, triggers, views and materialised views. The structure contains a main table, PRB_DATA, which holds all of the data with the following attribution:
• a unique identifier
• the data source
• the unique identifier from the parent database, for traceability
• the 3D location
• the property type
• the property value
• the units
• necessary qualifiers
• precision information and an audit trail
Data sources, property types and units are constrained by dictionaries, a key component of the structure that defines which properties and inheritance hierarchies are to be coded and also guides what is extracted from the structure and how. Data types served by the Query Layer include site-investigation-derived geotechnical data, hydrogeology datasets, regional geochemistry and geophysical logs, as well as lithological and borehole metadata. The size and complexity of the data sets, with multiple parent structures, require a technically robust approach to keep the layer synchronised. This is achieved through Oracle procedures written in PL/SQL containing the logic required to carry out the data manipulation (inserts, updates, deletes) that keeps the layer synchronised with the underlying databases, either as regularly scheduled jobs (weekly, monthly, etc.) or invoked on demand. The PropBase Query Layer's implementation has enabled rapid data discovery, visualisation and interpretation of geological data with greater ease, simplifying the parametrisation of 3D model volumes and facilitating the study of intra-unit heterogeneity.
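A miniature imitation of such a denormalized layer can be built with an in-memory relational table; the column set, source names and rows below are hypothetical simplifications of the PRB_DATA attribution described in the abstract:

```python
import sqlite3

# Minimal sketch of a PropBase-style denormalized layer. The columns and rows
# are hypothetical; the real PRB_DATA attribution is richer and
# dictionary-constrained.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE PRB_DATA (
    id INTEGER PRIMARY KEY,
    source TEXT, parent_id TEXT,
    x REAL, y REAL, z REAL,
    property_type TEXT, value REAL, units TEXT)""")
rows = [
    (1, "geotech_db", "BH001/S1", 451200.0, 289300.0, -12.5,
     "porosity", 0.21, "fraction"),
    (2, "hydro_db", "BH002/W3", 452100.0, 288900.0, -30.0,
     "hydraulic_conductivity", 1.3e-6, "m/s"),
    (3, "geotech_db", "BH001/S2", 451200.0, 289300.0, -18.0,
     "porosity", 0.17, "fraction"),
]
con.executemany("INSERT INTO PRB_DATA VALUES (?,?,?,?,?,?,?,?,?)", rows)

# One simple query now spans what were several parent databases.
porosities = [v for (v,) in con.execute(
    "SELECT value FROM PRB_DATA WHERE property_type = 'porosity' ORDER BY z DESC")]
```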

  3. A Lithology Based Map Unit Schema For Onegeology Regional Geologic Map Integration

    NASA Astrophysics Data System (ADS)

    Moosdorf, N.; Richard, S. M.

    2012-12-01

A system of lithogenetic categories for a global lithological map (GLiM, http://www.ifbm.zmaw.de/index.php?id=6460&L=3) has been compiled based on analysis of lithology/genesis categories for regional geologic maps for the entire globe. The scheme is presented for discussion and comment. Analysis of units on a variety of regional geologic maps indicates that units are defined based on assemblages of rock types, as well as their genetic type. In this compilation of continental geology, outcropping surface materials are dominantly sediment/sedimentary rock; major subdivisions of the sedimentary category include clastic sediment, carbonate sedimentary rocks, clastic sedimentary rocks, mixed carbonate and clastic sedimentary rock, colluvium and residuum. Significant areas of mixed igneous and metamorphic rock are also present. A system of global categories to characterize the lithology of regional geologic units is important for Earth System models of matter fluxes to soils, ecosystems, rivers and oceans, and for regional analysis of Earth surface processes at global scale. Because different applications of the classification scheme will focus on different lithologic constituents in mixed units, an ontology-type representation of the scheme that assigns properties to the units in an analyzable manner will be pursued. The OneGeology project is promoting deployment of geologic map services at million scale for all nations. Although initial efforts are commonly simple scanned map WMS services, the intention is to move towards data-based map services that categorize map units with standard vocabularies to allow use of a common map legend for better visual integration of the maps (e.g. see OneGeology Europe, http://onegeology-europe.brgm.fr/geoportal/viewer.jsp).
Current categorization of regional units with a single lithology from the CGI SimpleLithology vocabulary (http://resource.geosciml.org/201202/Vocab2012html/SimpleLithology201012.html) poorly captures the lithologic character of such units. A lithogenetic unit category scheme, accessible as a GeoSciML-portrayal-based OGC Styled Layer Descriptor resource, is key to enabling OneGeology (http://oneGeology.org) geologic map services to achieve a high degree of visual harmonization.

  4. The digital geologic map of Colorado in ARC/INFO format, Part A. Documentation

    USGS Publications Warehouse

    Green, Gregory N.

    1992-01-01

This geologic map was prepared as a part of a study of digital methods and techniques as applied to complex geologic maps. The geologic map was digitized from the original scribe sheets used to prepare the published Geologic Map of Colorado (Tweto 1979). Consequently, the digital version is at 1:500,000 scale using the Lambert Conformal Conic map projection parameters of the state base map. Stable base contact prints of the scribe sheets were scanned on a Tektronix 4991 digital scanner. The scanner automatically converts the scanned image to an ASCII vector format. These vectors were transferred to a VAX minicomputer, where they were then loaded into ARC/INFO. Each vector and polygon was given attributes derived from the original 1979 geologic map. This database was developed on a MicroVAX computer system using VAX V 5.4 and ARC/INFO 5.0 software. UPDATE: April 1995. The update was done solely for the purpose of adding the ability to plot to an HP650C plotter. Two new ARC/INFO plot AMLs, along with a lineset and shadeset for the HP650C DesignJet plotter, have been included. These new files are COLORADO.650, INDEX.650, TWETOLIN.E00 and TWETOSHD.E00. These files were created on a UNIX platform with ARC/INFO 6.1.2. Updated versions of the INDEX.E00, CONTACT.E00, LINE.E00, DECO.E00 and BORDER.E00 files that include the newly defined HP650C items are also included. * Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government. Descriptors: The Digital Geologic Map of Colorado in ARC/INFO Format, Open-File Report 92-050

  5. The digital geologic map of Colorado in ARC/INFO format, Part B. Common files

    USGS Publications Warehouse

    Green, Gregory N.

    1992-01-01

This geologic map was prepared as a part of a study of digital methods and techniques as applied to complex geologic maps. The geologic map was digitized from the original scribe sheets used to prepare the published Geologic Map of Colorado (Tweto 1979). Consequently, the digital version is at 1:500,000 scale using the Lambert Conformal Conic map projection parameters of the state base map. Stable base contact prints of the scribe sheets were scanned on a Tektronix 4991 digital scanner. The scanner automatically converts the scanned image to an ASCII vector format. These vectors were transferred to a VAX minicomputer, where they were then loaded into ARC/INFO. Each vector and polygon was given attributes derived from the original 1979 geologic map. This database was developed on a MicroVAX computer system using VAX V 5.4 and ARC/INFO 5.0 software. UPDATE: April 1995. The update was done solely for the purpose of adding the ability to plot to an HP650C plotter. Two new ARC/INFO plot AMLs, along with a lineset and shadeset for the HP650C DesignJet plotter, have been included. These new files are COLORADO.650, INDEX.650, TWETOLIN.E00 and TWETOSHD.E00. These files were created on a UNIX platform with ARC/INFO 6.1.2. Updated versions of the INDEX.E00, CONTACT.E00, LINE.E00, DECO.E00 and BORDER.E00 files that include the newly defined HP650C items are also included. * Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government. Descriptors: The Digital Geologic Map of Colorado in ARC/INFO Format, Open-File Report 92-050

  6. User guide for MODPATH Version 7—A particle-tracking model for MODFLOW

    USGS Publications Warehouse

    Pollock, David W.

    2016-09-26

MODPATH is a particle-tracking post-processing program designed to work with MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. MODPATH version 7 is the fourth major release since its original publication. Previous versions were documented in USGS Open-File Reports 89–381 and 94–464 and in USGS Techniques and Methods 6–A41. MODPATH version 7 works with MODFLOW-2005 and MODFLOW–USG. Support for unstructured grids in MODFLOW–USG is limited to smoothed, rectangular-based quadtree and quadpatch grids. A software distribution package containing the computer program and supporting documentation, such as input instructions, output file descriptions, and example problems, is available from the USGS over the Internet (http://water.usgs.gov/ogw/modpath/).
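MODPATH's particle tracking is based on Pollock's semi-analytical method: within a cell, each velocity component is interpolated linearly between the two face values, so the face-crossing time has a closed form. A one-dimensional sketch (the program itself applies this per coordinate direction on 3D grids):

```python
import math

def pollock_exit_time(x1, x2, v1, v2, xp):
    """Time for a particle at xp to reach face x2 of cell [x1, x2], with the
    velocity interpolated linearly between face values v1 and v2 (both > 0).
    This is the 1D form of Pollock's semi-analytical scheme."""
    A = (v2 - v1) / (x2 - x1)      # intra-cell velocity gradient
    vp = v1 + A * (xp - x1)        # velocity at the particle position
    if abs(A) < 1e-30:             # uniform-velocity limit
        return (x2 - xp) / vp
    return math.log(v2 / vp) / A

# Uniform flow: v = 0.5 across a unit cell, particle starting at the inlet face.
t_uniform = pollock_exit_time(0.0, 1.0, 0.5, 0.5, 0.0)
# Accelerating flow: velocity doubles across the same cell.
t_accel = pollock_exit_time(0.0, 1.0, 0.5, 1.0, 0.0)
```

In the full 3D method the candidate exit times for all faces are computed this way and the smallest one selects the exit face; the units here are arbitrary.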

  7. Analysis of the Geologic Structure and Compilation of the Geologic Map of the Northern Part of Planet Venus

    NASA Astrophysics Data System (ADS)

    Basilevsky, A. T.; Burba, G. A.; Ivanov, M. A.; Bobina, N. N.; Shashkina, V. P.; Head, J. W.

Based on an analysis of the images of the Venusian surface obtained by the side-looking radar of the Magellan orbiter, a geologic map of the northern part of Venus (the region extending to the north of the 35°N latitude) at 1 : 10 000 000 scale is compiled. The map of this vast territory, comprising one-fifth of the planet surface, was compiled using only 12 geologic units, which implies a uniform character of terrains and landforms on the investigated territory and, therefore, the uniformity of geologic processes that occurred on this planet. These units are the products of four main groups of geologic processes that occurred on Venus during the last 0.5-1 Gyr: (1) basaltic volcanism; (2) tectonic compression and tensile deformation; (3) impact cratering; and (4) wind-related mobilization, transportation, and deposition of loose fine-grained materials. Basaltic volcanism is the main process that supplies new material to the surface of Venus. Tectonic deformation structures, superposed on the material of different geologic units, determined the morphology of the units and formed the surfaces of unconformity between neighboring units. Ten of the 12 geologic units form an age sequence that is virtually identical over the entire mapped territory of the planet. The possible inconsistency of this sequence caused by anomalous relations existing between smooth plains (Ps) in the southeastern part of Lakshmi Planum and wrinkle ridged plains (Pwr) in the northern part of Sedna Planitia does not destroy this sequence as a whole. The results of our mapping support the model of global stratigraphy of Venus proposed by Basilevsky and Head (1995, 1998) and provide evidence of the quasi-synchronous character of single-type geologic units in different areas of Venus rather than of the absence of synchronism.
An analysis of the distribution of impact craters on different geologic units has shown that the mean absolute ages of the surface material of the Pwr plains, of the entire studied territory, and of the entire Venusian surface are close to one another. The results of our analysis suggest that, within the area under study, the intensity of the leading geologic processes was relatively high at the beginning of the studied segment of geologic history but decreased dramatically later.
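Relative ages from crater distribution analyses come down to counts per unit area with a Poisson uncertainty; a minimal sketch with hypothetical numbers:

```python
import math

def crater_density(n_craters, area_km2, per_km2=1.0e6):
    """Crater density (craters per `per_km2` km^2) with sqrt-N Poisson uncertainty."""
    scale = per_km2 / area_km2
    return n_craters * scale, math.sqrt(n_craters) * scale

# Hypothetical count: 120 craters over 4.0e7 km^2 of plains.
density, sigma = crater_density(120, 4.0e7)
```

Two units whose densities agree within their sigma intervals cannot be distinguished in age, which is why the mean surface ages quoted above come out close to one another.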

  8. Geologic interpretation and multibeam bathymetry of the sea floor in southeastern Long Island Sound

    USGS Publications Warehouse

    Poppe, Lawrence J.; Ackerman, Seth D.; Doran, Elizabeth F.; Moser, Marc S.; Stewart, Helen F.; Forfinski, Nicholas A.; Gardner, Uther L.; Keene, Jennifer A.

    2006-01-01

Digital terrain models (DTMs) produced from multibeam echosounder (MBES) bathymetric data provide valuable base maps for marine geological interpretations (e.g. Todd and others, 1999; Mosher and Thomson, 2002; ten Brink and others, 2004; Poppe and others, 2006a,b). These maps help define the geological variability of the sea floor (one of the primary controls of benthic habitat diversity); improve our understanding of the processes that control the distribution and transport of bottom sediments, the distribution of benthic habitats and associated infaunal community structures; and provide a detailed framework for future research, monitoring, and management activities. The bathymetric survey interpreted herein (National Oceanic and Atmospheric Administration (NOAA) survey H11255) covers roughly 95 km² of sea floor in southeastern Long Island Sound (fig. 1). This bathymetry has been examined in relation to seismic reflection data collected concurrently, as well as archived seismic profiles acquired as part of a long-standing geologic mapping partnership between the State of Connecticut and the U.S. Geological Survey (USGS). The objective of this work was to use these geophysical data sets to interpret geomorphological attributes of the sea floor in terms of the Quaternary geologic history and modern sedimentary processes within Long Island Sound.

  9. An iPad and Android-based Application for Digitally Recording Geologic Field Data

    NASA Astrophysics Data System (ADS)

    Malinconico, L. L.; Sunderlin, D.; Liew, C.; Ho, A. S.; Bekele, K. A.

    2011-12-01

Field experience is a significant component in most geology courses, especially sed/strat and structural geology. Increasingly, the spatial presentation, analysis and interpretation of geologic data are done using digital methodologies (GIS, Google Earth, stereonet and spreadsheet programs). However, students and professionals continue to collect field data manually on paper maps and in the traditional "orange field notebooks". Upon returning from the field, data are then manually transferred into digital formats for processing, mapping and interpretation. The transfer process is both cumbersome and prone to transcription error. In conjunction with the computer science department, we are in the process of developing an application (App) for iOS (the iPad) and Android platforms that can be used to digitally record data measured in the field. This is not a mapping program, but rather a way of bypassing the field book step to acquire digital data directly that can then be used in various analysis and display programs. Currently, the application allows the user to select from five different structural data situations: contact, bedding, fault, joints and "other". The user can define a folder for the collection and separation of data for each project. Observations are stored as individual records of field measurements in each folder. The exact information gathered depends on the nature of the observation, but common to all pages is the ability to log date, time, and lat/long directly from the tablet. Information like strike and dip are entered using scroll wheels, and formation names are also entered using scroll wheels that access easy-to-modify lists of the area's stratigraphic units. This ensures uniformity in the creation of the digital records from day to day and across field teams. Pictures can also be taken using the tablet's camera and are linked to each record.
Once the field collection is complete, the data (including images) can be easily exported to a .csv file that can be opened in Excel for digital preparation for use in other programs. We will be field-testing the App in the fall of 2011 with weekly exercises and a week-long mapping project in Wyoming. We will then share the beta version of the software (at the meeting) with professional geologists and students in geology programs at other academic institutions to truly test the stability of the software and to solicit suggestions for improvements and additions.
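As a rough illustration of the export step described above, a field-record CSV could be written and re-read like this (the column names and sample values are hypothetical; the abstract does not specify the App's actual schema):

```python
import csv
import io

# Hypothetical record layout for illustration only; the abstract mentions
# date/time, lat/long, strike/dip, formation names, observation type, and
# linked photos, but not the actual export schema.
FIELDS = ["date", "time", "lat", "lon", "type", "strike", "dip", "formation", "photo"]

records = [
    {"date": "2011-09-14", "time": "10:32", "lat": "44.5210", "lon": "-109.0570",
     "type": "bedding", "strike": "215", "dip": "32",
     "formation": "Chugwater", "photo": "IMG_0041.jpg"},
]

# Write the records out as CSV text, as the App's export step would.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(records)
csv_text = buf.getvalue()

# Read the export back, e.g. before loading into a stereonet program.
rows = list(csv.DictReader(io.StringIO(csv_text)))
print(rows[0]["strike"], rows[0]["dip"])  # 215 32
```

Because each record is a flat row of named fields, the same file opens directly in Excel or feeds downstream plotting and stereonet software without re-keying.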

  10. Modelling fully-coupled Thermo-Hydro-Mechanical (THM) processes in fractured reservoirs using GOLEM: a massively parallel open-source simulator

    NASA Astrophysics Data System (ADS)

    Jacquey, Antoine; Cacace, Mauro

    2017-04-01

    Utilization of the underground for energy-related purposes has received increasing attention in recent decades, both as a source of carbon-free energy and for safe storage solutions. Understanding the key processes controlling fluid and heat flow around geological discontinuities such as faults and fractures, as well as their mechanical behaviour, is therefore of interest for designing safe and sustainable reservoir operations. These processes occur in naturally complex geological settings comprising natural or engineered discrete heterogeneities such as faults and fractures; they span a relatively large spectrum of temporal and spatial scales and interact in a highly non-linear fashion. In this regard, numerical simulators have become necessary in geological studies to model coupled processes and complex geological geometries. In this study, we present GOLEM, a new simulator that uses multiphysics coupling to characterize geological reservoirs. In particular, special attention is given to discrete geological features such as faults and fractures. GOLEM is based on the Multiphysics Object-Oriented Simulation Environment (MOOSE). The MOOSE framework provides a powerful and flexible platform for solving multiphysics problems implicitly and in a tightly coupled manner on unstructured meshes, which is of interest in the non-linear context considered here. Governing equations in 3D for fluid flow, heat transfer (conductive and advective), saline transport and deformation (elastic and plastic) have been implemented in the GOLEM application. Coupling between rock deformation and fluid and heat flow is treated using the theories of poroelasticity and thermoelasticity. Furthermore, treating material properties such as density and viscosity, and transport properties such as porosity, as dependent on the state variables (based on the International Association for the Properties of Water and Steam models) increases the coupling complexity of the problem. 
The GOLEM application therefore aims to integrate more of the physical processes observed in the field or in the laboratory in order to simulate more realistic scenarios. The use of high-level nonlinear solver technology allows us to tackle these complex multiphysics problems in three dimensions. The basic concepts behind the GOLEM simulator are presented in this study, along with a few application examples that illustrate its main features.
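For reference, the coupled THM system described above can be written in a standard textbook poro-thermo-elastic form (symbols: alpha the Biot coefficient, M the Biot modulus, k permeability, mu_f fluid viscosity, lambda_b bulk thermal conductivity, q the Darcy flux); GOLEM's exact formulation may differ:

```latex
% Fluid mass balance (poroelastic storage + Darcy flow)
\frac{1}{M}\frac{\partial p}{\partial t}
  + \alpha \frac{\partial \varepsilon_{v}}{\partial t}
  - \nabla\cdot\left[\frac{k}{\mu_{f}}\left(\nabla p - \rho_{f}\mathbf{g}\right)\right] = q_{f}

% Heat transfer (conduction + advection by the Darcy flux \mathbf{q})
(\rho c)_{b}\,\frac{\partial T}{\partial t}
  + \rho_{f} c_{f}\,\mathbf{q}\cdot\nabla T
  - \nabla\cdot\left(\lambda_{b}\nabla T\right) = q_{T}

% Quasi-static mechanical equilibrium with Biot effective stress
\nabla\cdot\left(\boldsymbol{\sigma}' - \alpha\, p\, \mathbf{I}\right) + \rho_{b}\,\mathbf{g} = \mathbf{0}
```

The non-linearity arises because the coefficients (rho_f, mu_f, porosity) in the first two equations themselves depend on p and T through the IAPWS-type models mentioned above, which is why a tightly coupled implicit solve is attractive.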

  11. Development of performance assessment methodology for nuclear waste isolation in geologic media

    NASA Astrophysics Data System (ADS)

    Bonano, E. J.; Chu, M. S. Y.; Cranwell, R. M.; Davis, P. A.

    The burial of nuclear wastes in deep geologic formations as a means for their disposal is an issue of significant technical and social impact. Because the associated time scales are on the order of tens of thousands of years, the analysis of the processes involved can be performed only with reliable mathematical models and computer codes, as opposed to conducting experiments. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium: ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the U.S. Nuclear Regulatory Commission.

  12. Mid-Atlantic multichannel seismic-reflection profiles 14, 15, 16, and 17

    USGS Publications Warehouse

    Schlee, John Stevens

    1980-01-01

    The U.S. Geological Survey (USGS) is making available four multichannel profiles collected by Teledyne Exploration in 1977 by means of a 48-channel streamer (3600 m long) and four airguns (2160 in). Profiles 15 and 16 were processed by Teledyne Exploration, and profiles 14 and 17 were processed on the Phoenix "I" computer by the USGS. The processing included standard demultiplexing, deconvolution before and after stack, Common Depth Point (CDP) gathers, velocity analyses every 3 km, move-out correction, stacking, time-variant filtering, and time-variant scaling. The released lines are over the outer edge of the Continental Shelf in the northern part of the Baltimore Canyon trough (Line 14: 140 km long and Line 15: 157 km long), over the Long Island platform (Line 16: 313 km long), and over the Carolina platform (Line 17: 186 km long). These profiles were collected as part of a regional grid over offshore Atlantic sedimentary basins in a continuing program to assess the resource potential by means of nonproprietary data. These profiles, plus the velocity scans and shotpoint maps, may be viewed at the U.S. Geological Survey, Quissett Campus, Woods Hole, MA 02543, and the U.S. Geological Survey, Bldg. 25, Denver Federal Center, Denver, CO. Copies of maps, scans, and profiles can be purchased only from the National Geophysical and Solar-Terrestrial Data Center, Environmental Data Service (NOAA), Code D 621, Boulder, CO 80302.

  13. Fundamentals of Structural Geology

    NASA Astrophysics Data System (ADS)

    Pollard, David D.; Fletcher, Raymond C.

    2005-09-01

    Fundamentals of Structural Geology provides a new framework for the investigation of geological structures by integrating field mapping and mechanical analysis. Assuming a basic knowledge of physical geology, introductory calculus and physics, it emphasizes the observational data, modern mapping technology, principles of continuum mechanics, and the mathematical and computational skills necessary to quantitatively map, describe, model, and explain deformation in Earth's lithosphere. By starting from the fundamental conservation laws of mass and momentum, the constitutive laws of material behavior, and the kinematic relationships for strain and rate of deformation, the authors demonstrate the relevance of solid and fluid mechanics to structural geology. This book offers a modern quantitative approach to structural geology for advanced students and researchers in structural geology and tectonics. The book integrates field mapping using modern technology with the analysis of structures based on a complete mechanics. MATLAB is used to visualize physical fields and analytical results, and MATLAB scripts can be downloaded from the supporting website to recreate textbook graphics and enable students to explore their choice of parameters and boundary conditions. The website hosts color images of outcrop photographs used in the text, supplementary color images, and images of textbook figures for classroom presentations. It also includes student exercises designed to instill the fundamental relationships and to encourage visualization of the evolution of geological structures; solutions to the exercises are available to instructors.

  14. New Data Bases and Standards for Gravity Anomalies

    NASA Astrophysics Data System (ADS)

    Keller, G. R.; Hildenbrand, T. G.; Webring, M. W.; Hinze, W. J.; Ravat, D.; Li, X.

    2008-12-01

    Ever since high-precision gravimeters came into use in the 1950s, gravity surveys have been an important tool for geologic studies. Recent developments that make geologically useful measurements possible from airborne and satellite platforms, the ready availability of the Global Positioning System for precise vertical and horizontal control, improved global data bases, and the increased availability of processing and modeling software have accelerated the use of the gravity method. As a result, efforts are being made to improve the gravity databases publicly available to the geoscience community by expanding their holdings and increasing the accuracy and precision of the data in them. Specifically, the North American Gravity Database, as well as the individual databases of Canada, Mexico, and the United States, is being revised using new formats and standards to improve coverage, standardization, and accuracy. An important part of this effort is the revision of procedures and standards for calculating gravity anomalies, taking into account the enhanced computational power now available, modern satellite-based positioning technology, improved terrain databases, and increased interest in more accurately defining the different components of gravity anomalies. The most striking revision is the use of a single internationally accepted reference ellipsoid for the horizontal and vertical datums of gravity stations, as well as for the computation of theoretical gravity. The new standards hardly affect the interpretation of local anomalies, but they do improve regional anomalies by removing long-wavelength artifacts. Most importantly, the new standards can be applied consistently to gravity database compilations of nations, continents, and even the entire world. Although many types of gravity anomalies have been described, they fall into three main classes. 
The primary class incorporates planetary effects, which are analytically prescribed, to derive the predicted or modeled gravity; anomalies of this class are therefore termed planetary. The most primitive version of a gravity anomaly is simply the difference between the observed gravity and the value of gravity predicted from the reference ellipsoid. As the height of the gravity station increases, the ellipsoidal gravity anomaly decreases because of the increased distance of the measurement from the anomaly-producing masses. The two primary anomalies in geophysics, both appropriately classified as planetary anomalies, are the Free-air and Bouguer gravity anomalies. They employ models that account for planetary effects on gravity, including the topography of the Earth. A second class, geological anomalies, includes the modeled gravity effect of known or assumed masses, using geological data such as densities and crustal thickness to obtain the predicted gravity. The third class, filtered anomalies, removes gravity effects of largely unknown sources that are empirically or analytically determined from the nature of the gravity anomalies by filtering.
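A minimal sketch of the planetary-anomaly computation described above, assuming the GRS80 reference ellipsoid (Somigliana closed form) and the conventional free-air gradient; the exact standards adopted by the revised databases may differ in detail:

```python
import math

# GRS80 constants (standard published values).
GAMMA_E = 978032.67715   # mGal, normal gravity at the equator
K = 0.001931851353       # Somigliana constant
E2 = 0.00669438002290    # first eccentricity squared
FREE_AIR = 0.3086        # mGal per metre, conventional free-air gradient

def theoretical_gravity(lat_deg):
    """Normal gravity on the GRS80 ellipsoid at geodetic latitude, in mGal."""
    s2 = math.sin(math.radians(lat_deg)) ** 2
    return GAMMA_E * (1.0 + K * s2) / math.sqrt(1.0 - E2 * s2)

def free_air_anomaly(g_obs_mgal, lat_deg, height_m):
    """Observed gravity minus normal gravity, corrected for station height."""
    return g_obs_mgal - theoretical_gravity(lat_deg) + FREE_AIR * height_m

print(round(theoretical_gravity(0.0), 2))                 # 978032.68
print(round(free_air_anomaly(978100.0, 0.0, 100.0), 2))   # 98.18
```

The Bouguer anomaly would additionally subtract the attraction of the topographic slab (about 0.04193 * density * height in mGal for density in g/cm3); the key point of the revised standards is that both anomalies are referenced to the same ellipsoid for position, height, and theoretical gravity.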

  15. Geochemical database of feed coal and coal combustion products (CCPs) from five power plants in the United States

    USGS Publications Warehouse

    Affolter, Ronald H.; Groves, Steve; Betterton, William J.; William, Benzel; Conrad, Kelly L.; Swanson, Sharon M.; Ruppert, Leslie F.; Clough, James G.; Belkin, Harvey E.; Kolker, Allan; Hower, James C.

    2011-01-01

    The principal mission of the U.S. Geological Survey (USGS) Energy Resources Program (ERP) is to (1) understand the processes critical to the formation, accumulation, occurrence, and alteration of geologically based energy resources; (2) conduct scientifically robust assessments of those resources; and (3) study the impacts of energy resource occurrence and (or) their production and use on both the environment and human health. The ERP promotes and supports research resulting in original, geology-based, non-biased energy information products for policy and decision makers, land and resource managers, other Federal and State agencies, the domestic energy industry, foreign governments, non-governmental groups, and academia. Investigations include research on the geology of oil, gas, and coal, and the impacts associated with energy resource occurrence, production, quality, and utilization. The ERP's focus on coal is to support investigations into current issues pertaining to coal production, beneficiation and (or) conversion, and the environmental impact of the coal combustion process and coal combustion products (CCPs). To accomplish these studies, the USGS combines its activities with other organizations to address domestic and international issues that relate to the development and use of energy resources.

  16. Megascale processes: Natural disasters and human behavior

    USGS Publications Warehouse

    Kieffer, S.W.; Barton, P.; Chesworth, W.; Palmer, A.R.; Reitan, P.; Zen, E.-A.

    2009-01-01

    Megascale geologic processes, such as earthquakes, tsunamis, volcanic eruptions, floods, and meteoritic impacts have occurred intermittently throughout geologic time, and perhaps on several planets. Unlike other catastrophes discussed in this volume, a unique process is unfolding on Earth, one in which humans may be the driving agent of megadisasters. Although local effects on population clusters may have been catastrophic in the past, human societies have never been interconnected globally at the scale that currently exists. We review some megascale processes and their effects in the past, and compare present conditions and possible outcomes. We then propose that human behavior itself is having effects on the planet that are comparable to, or greater than, these natural disasters. Yet, unlike geologic processes, human behavior is potentially under our control. Because the effects of our behavior threaten the stability, or perhaps even existence, of a civilized society, we call for the creation of a body to institute coherent, global, credible, scientifically based action that is sensitive to political, economic, religious, and cultural values. The goal would be to institute aggressive monitoring, identify and understand trends, predict their consequences, and suggest and evaluate alternative actions to attempt to rescue ourselves and our ecosystems from catastrophe. We provide a template modeled after several existing national and international bodies. © 2009 The Geological Society of America.

  17. Plate motions and deformations from geologic and geodetic data

    NASA Technical Reports Server (NTRS)

    Jordan, Thomas H.

    1989-01-01

    The very long baseline interferometry (VLBI) measurements made in the western U.S. since 1979 provide discrete samples of the temporal and spatial deformation field. The interpretation of the VLBI-derived rates of deformation requires an examination of geologic information and more densely sampled ground-based geodetic data. Triangulation and trilateration data measured on two regional networks, one in the central Mojave Desert and one in the Coast Ranges east of the San Andreas fault, were processed. At the spatial scales spanned by these local geodetic networks, auxiliary geologic and geophysical data were utilized to examine the relation between measured incremental strain and the accommodation of strain seen in local geologic structures, strain release in earthquakes, and principal stress directions inferred from in situ measurements. VLBI data were also processed from stations distributed across the Pacific-North America plate boundary zone in the western U.S. The VLBI data were used to constrain the integrated rate of deformation across portions of the continental plate boundary in California and to provide a tectonic framework for interpreting regional geodetic and geologic studies.

  18. SEAWAT: A Computer Program for Simulation of Variable-Density Groundwater Flow and Multi-Species Solute and Heat Transport

    USGS Publications Warehouse

    Langevin, Christian D.

    2009-01-01

    SEAWAT is a MODFLOW-based computer program designed to simulate variable-density groundwater flow coupled with multi-species solute and heat transport. The program has been used for a wide variety of groundwater studies including saltwater intrusion in coastal aquifers, aquifer storage and recovery in brackish limestone aquifers, and brine migration within continental aquifers. SEAWAT is relatively easy to apply because it uses the familiar MODFLOW structure. Thus, most commonly used pre- and post-processors can be used to create datasets and visualize results. SEAWAT is a public domain computer program distributed free of charge by the U.S. Geological Survey.
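The variable-density coupling at the heart of SEAWAT rests on a linear equation of state relating fluid density to solute concentration. The sketch below uses the documented default density/concentration slope (DRHODC = 0.7143) and assumes a freshwater reference density of 1000 kg/m3; an actual SEAWAT run reads these from its input files and can also include a temperature term:

```python
RHO_FRESH = 1000.0   # kg/m^3, assumed freshwater reference density
DRHODC = 0.7143      # SEAWAT's documented default density/concentration slope

def fluid_density(conc):
    """Linear equation of state: density from salt concentration (kg/m^3)."""
    return RHO_FRESH + DRHODC * conc

# Seawater (about 35 kg/m^3 total dissolved solids) comes out near 1025 kg/m^3,
# the density contrast that drives saltwater intrusion in coastal aquifers.
print(round(fluid_density(35.0), 1))  # 1025.0
```

That ~2.5% density contrast between seawater and freshwater is small, yet it is exactly what the coupled flow-transport solve must track to reproduce the shape and movement of the saltwater interface.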

  19. Catalog of Computer Programs Used in Undergraduate Geological Education, Second Edition: Installment 2.

    ERIC Educational Resources Information Center

    Burger, H. Robert

    1983-01-01

    Part 1 (SE 533 635) presented programs for use in mineralogy, petrology, and geochemistry. This part presents an annotated list of 64 additional programs, focusing on introductory geology, mapping, and statistical packages for geological analyses. A brief description, source, suggested use(s), programing language, and other information are…

  20. Community Decadal Panel for Terrestrial Analogs to Mars

    NASA Astrophysics Data System (ADS)

    Barlow, N. G.; Farr, T.; Baker, V. R.; Bridges, N.; Carsey, F.; Duxbury, N.; Gilmore, M. S.; Green, J. R.; Grin, E.; Hansen, V.; Keszthelyi, L.; Lanagan, P.; Lentz, R.; Marinangeli, L.; Morris, P. A.; Ori, G. G.; Paillou, P.; Robinson, C.; Thomson, B.

    2001-11-01

    It is well recognized that interpretations of Mars must begin with the Earth as a reference. The most successful comparisons have focused on understanding geologic processes on the Earth well enough to extrapolate to Mars' environment. Several facets of terrestrial analog studies have been pursued and are continuing. These studies include field workshops, characterization of terrestrial analog sites for Mars, instrument tests, laboratory measurements (including analysis of martian meteorites), and computer and laboratory modeling. The combination of all these activities allows scientists to constrain the processes operating in specific terrestrial environments and extrapolate how similar processes could affect Mars. The Terrestrial Analogs for Mars Community Panel is considering the following two key questions: (1) How do terrestrial analog studies tie in to the MEPAG science questions about life, past climate, and geologic evolution of Mars, and (2) How can future instrumentation be used to address these questions. The panel is considering the issues of data collection, value of field workshops, data archiving, laboratory measurements and modeling, human exploration issues, association with other areas of solar system exploration, and education and public outreach activities.

  1. Terrestrial Analogs to Mars

    NASA Astrophysics Data System (ADS)

    Farr, T. G.; Arcone, S.; Arvidson, R. W.; Baker, V.; Barlow, N. G.; Beaty, D.; Bell, M. S.; Blankenship, D. D.; Bridges, N.; Briggs, G.; Bulmer, M.; Carsey, F.; Clifford, S. M.; Craddock, R. A.; Dickerson, P. W.; Duxbury, N.; Galford, G. L.; Garvin, J.; Grant, J.; Green, J. R.; Gregg, T. K. P.; Guinness, E.; Hansen, V. L.; Hecht, M. H.; Holt, J.; Howard, A.; Keszthelyi, L. P.; Lee, P.; Lanagan, P. D.; Lentz, R. C. F.; Leverington, D. W.; Marinangeli, L.; Moersch, J. E.; Morris-Smith, P. A.; Mouginis-Mark, P.; Olhoeft, G. R.; Ori, G. G.; Paillou, P.; Reilly, J. F., II; Rice, J. W., Jr.; Robinson, C. A.; Sheridan, M.; Snook, K.; Thomson, B. J.; Watson, K.; Williams, K.; Yoshikawa, K.

    2002-08-01

    It is well recognized that interpretations of Mars must begin with the Earth as a reference. The most successful comparisons have focused on understanding geologic processes on the Earth well enough to extrapolate to Mars' environment. Several facets of terrestrial analog studies have been pursued and are continuing. These studies include field workshops, characterization of terrestrial analog sites, instrument tests, laboratory measurements (including analysis of Martian meteorites), and computer and laboratory modeling. The combination of all these activities allows scientists to constrain the processes operating in specific terrestrial environments and extrapolate how similar processes could affect Mars. The Terrestrial Analogs for Mars Community Panel has considered the following two key questions: (1) How do terrestrial analog studies tie in to the Mars Exploration Payload Assessment Group science questions about life, past climate, and geologic evolution of Mars, and (2) How can future instrumentation be used to address these questions. The panel has considered the issues of data collection, value of field workshops, data archiving, laboratory measurements and modeling, human exploration issues, association with other areas of solar system exploration, and education and public outreach activities.

  2. Agent Based Modeling Applications for Geosciences

    NASA Astrophysics Data System (ADS)

    Stein, J. S.

    2004-12-01

    Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in the fields of economics and sociology, but more recently it has been introduced to various disciplines in the geosciences. This technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets when they exist, and simulate the effects of extreme events on system-wide behavior. Some of the challenges associated with this modeling method include the significant computational requirements of tracking thousands to millions of agents, the lack of methods and strategies for model validation, and the absence of a formal methodology for evaluating model uncertainty. Challenges specific to the geosciences include how to define agents that control water, contaminant fluxes, climate forcing and other physical processes, and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics, economics and regulations. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of what factors influence the demand for these resources on a regional and temporal scale. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and to determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications. 
A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in a thermodynamic framework as a set of reactions that roll up the integrated effect that diverse biological communities exert on a geological system. This approach may work well for predicting the effect of certain biological communities in specific environments for which experimental data are available. However, it does not further our knowledge of how the geobiological system actually functions on a micro scale. Agent-based techniques may provide a framework to explore the fundamental interactions required to explain the system-wide behavior. This presentation will survey several promising applications of agent-based modeling approaches to problems in the geosciences and describe specific contributions to some of the inherent challenges facing this approach.
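To make the "bottom-up" idea concrete, here is a deliberately minimal agent-based sketch (not from the abstract): agents withdraw water from a shared aquifer and cut back their demand when the resource runs low. The rule, parameters, and numbers are all illustrative assumptions:

```python
import random

random.seed(42)  # reproducible run

class Agent:
    """One water user with a fixed base demand and a simple adaptive rule."""
    def __init__(self):
        self.demand = random.uniform(0.5, 1.5)

    def withdraw(self, stock, capacity):
        # Rule: use proportionally less when the aquifer falls below half capacity.
        factor = 1.0 if stock > 0.5 * capacity else stock / capacity
        return self.demand * factor

CAPACITY = 1000.0   # aquifer storage (arbitrary units)
RECHARGE = 60.0     # natural recharge per step
agents = [Agent() for _ in range(100)]
stock = CAPACITY

history = []
for step in range(200):
    withdrawals = sum(a.withdraw(stock, CAPACITY) for a in agents)
    stock = min(CAPACITY, max(0.0, stock - withdrawals + RECHARGE))
    history.append(stock)

# The stock settles into a band where adaptive demand roughly balances
# recharge rather than crashing to zero -- an emergent, system-wide outcome
# that no single agent's rule encodes directly.
print(round(min(history), 1), round(history[-1], 1))
```

Even this toy version illustrates the challenges the abstract names: the run tracks every agent individually (computational cost), and nothing in the code itself says how to validate the behavioral rule or quantify the uncertainty it introduces.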

  3. Physics-Based Hazard Assessment for Critical Structures Near Large Earthquake Sources

    NASA Astrophysics Data System (ADS)

    Hutchings, L.; Mert, A.; Fahjan, Y.; Novikova, T.; Golara, A.; Miah, M.; Fergany, E.; Foxall, W.

    2017-09-01

    We argue that for critical structures near large earthquake sources: (1) the ergodic assumption, recent history, and simplified descriptions of the hazard are not appropriate to rely on for earthquake ground motion prediction and can lead to a mis-estimation of the hazard and risk to structures; (2) a physics-based approach can address these issues; (3) a physics-based source model must be provided to generate realistic phasing effects from finite rupture and to model near-source ground motion correctly; (4) wave propagation and site response should be site specific; (5) a much wider search of possible sources of ground motion can be achieved computationally with a physics-based approach; (6) unless one utilizes a physics-based approach, the hazard and risk to structures have unknown uncertainties; (7) uncertainties can be reduced with a physics-based approach, but not with an ergodic approach; (8) computational power and computer codes have advanced to the point that risk to structures can be calculated directly from source- and site-specific ground motions. Spanning the variability of potential ground motion in a predictive situation is especially difficult for near-source areas, but that is the distance at which the hazard is the greatest. The basis of a physics-based approach is ground-motion synthesis derived from physics and an understanding of the earthquake process. This is an overview paper, and results from previous studies are used to make the case for these conclusions. Our premise is that 50 years of strong motion records is insufficient to capture all possible ranges of site and propagation path conditions, rupture processes, and spatial geometric relationships between source and site. 
Predicting future earthquake scenarios is necessary; models that have little or no physical basis but have been tested and adjusted to fit available observations can only "predict" what happened in the past, which should be considered description as opposed to prediction. We have developed a methodology for synthesizing physics-based broadband ground motion that incorporates the effects of realistic earthquake rupture along specific faults and the actual geology between the source and site.

  4. Performance of the Landsat-Data Collection System in a Total System Context

    NASA Technical Reports Server (NTRS)

    Paulson, R. W. (Principal Investigator); Merk, C. F.

    1975-01-01

    The author has identified the following significant results. This experiment was, and continues to be, an integration of the LANDSAT-DCS with the data collection and processing system of the Geological Survey. Although an experimental demonstration, it was a successful integration of a satellite relay system capable of continental data collection with an existing governmental nationwide operational data processing and distribution network. The Survey's data processing system uses a large general-purpose computer with insufficient redundancy for 24-hour a day, 7-day a week operation. This is a significant, but soluble, obstacle to converting the experimental integration of the system to an operational integration.

  5. Developing, deploying and reflecting on a web-based geologic simulation tool

    NASA Astrophysics Data System (ADS)

    Cockett, R.

    2015-12-01

    Geoscience is visual. It requires geoscientists to think and communicate about processes and events in three spatial dimensions and variations through time. This is hard(!), and students often have difficulty when learning and visualizing the three dimensional and temporal concepts. Visible Geology is an online geologic block modelling tool that is targeted at students in introductory and structural geology. With Visible Geology, students are able to combine geologic events in any order to create their own geologic models and ask 'what-if' questions, as well as interrogate their models using cross sections, boreholes and depth slices. Instructors use it as a simulation and communication tool in demonstrations, and students use it to explore concepts of relative geologic time, structural relationships, as well as visualize abstract geologic representations such as stereonets. The level of interactivity and creativity inherent in Visible Geology often results in a sense of ownership and encourages engagement, leading learners to practice visualization and interpretation skills and discover geologic relationships. Through its development over the last five years, Visible Geology has been used by over 300K students worldwide as well as in multiple targeted studies at the University of Calgary and at the University of British Columbia. The ease of use of the software has made this tool practical for deployment in classrooms of any size as well as for individual use. In this presentation, I will discuss the thoughts behind the implementation and layout of the tool, including a framework used for the development and design of new educational simulations. I will also share some of the surprising and unexpected observations on student interaction with the 3D visualizations, and other insights that are enabled by web-based development and deployment.

  6. FY 1999 Laboratory Directed Research and Development annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PJ Hughes

    2000-06-13

    A short synopsis of each project is given covering the following main areas of research and development: Atmospheric sciences; Biotechnology; Chemical and instrumentation analysis; Computer and information science; Design and manufacture engineering; Ecological science; Electronics and sensors; Experimental technology; Health protection and dosimetry; Hydrologic and geologic science; Marine sciences; Materials science; Nuclear science and engineering; Process science and engineering; Sociotechnical systems analysis; Statistics and applied mathematics; and Thermal and energy systems.

  7. Approach of automatic 3D geological mapping: the case of the Kovdor phoscorite-carbonatite complex, NW Russia.

    PubMed

    Kalashnikov, A O; Ivanyuk, G Yu; Mikhailova, J A; Sokharev, V A

    2017-07-31

    We have developed an approach for automatic 3D geological mapping based on the conversion of the chemical composition of rocks to mineral composition by logical computation. It makes it possible to calculate mineral composition from bulk-rock chemistry, interpolate the mineral composition in the same way as chemical composition, and, finally, build a 3D geological model. The approach was developed for the Kovdor phoscorite-carbonatite complex, which contains the Kovdor baddeleyite-apatite-magnetite deposit. We used four bulk-rock chemistry variables: Fe(magn), P2O5, CO2 and SiO2. We tested four techniques for predicting rock types: calculation of normative mineral compositions (norms), multiple regression, an artificial neural network, and a method we developed based on logical evaluation. The latter two performed best. As a result, we distinguished 14 types of phoscorite (forsterite-apatite-magnetite-carbonate rock), carbonatite and host rocks. The results show good agreement with our petrographic studies of the deposit and with recent manually built maps. The proposed approach can be used as a tool for reconstructing deposit genesis and for preliminary geometallurgical modelling.
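The "logical evaluation" idea can be sketched as a rule-based classifier over the four bulk-chemistry variables. The thresholds and rock labels below are purely illustrative assumptions, not the published Kovdor scheme:

```python
# Hypothetical threshold rules for illustration only; the actual published
# classification of the Kovdor rocks uses different criteria and 14+ classes.
def classify(fe_magn, p2o5, co2, sio2):
    """Return a coarse rock label from four bulk-chemistry variables (wt%)."""
    if co2 > 20.0:
        return "carbonatite"
    if fe_magn > 15.0 and p2o5 > 5.0:
        return "apatite-magnetite phoscorite"
    if sio2 > 30.0:
        return "host silicate rock"
    return "phoscorite (undivided)"

samples = [
    {"fe_magn": 25.0, "p2o5": 8.0, "co2": 5.0, "sio2": 10.0},
    {"fe_magn": 2.0, "p2o5": 1.0, "co2": 35.0, "sio2": 3.0},
]
labels = [classify(**s) for s in samples]
print(labels)  # ['apatite-magnetite phoscorite', 'carbonatite']
```

The appeal of such rules over a regression or neural network is transparency: because the four chemical variables are what gets interpolated between drill holes, applying the same rules at every grid node yields a 3D rock-type model whose logic a geologist can audit directly.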

  8. Complex path flows in geological media imaged by X-Ray computed tomography

    NASA Astrophysics Data System (ADS)

    Neuville, Amélie; Ebner, Marcus; Toussaint, Renaud; Renard, François; Koehn, Daniel; Flekkøy, Eirik; Cochard, Alain

    2013-04-01

    Stylolites as well as fractures are reported as major conduits in geological media (1, 2). The flow circulation has a strong influence on hydro-mechanico-chemical processes, in particular on crystallization and dissolution (3, 4). For instance, hydrothermal ore deposits are frequently located in stylolites and fractures at depth. The fluid pressure also intervenes as a thermodynamic parameter in chemical reactions and is, in addition, responsible for elastic deformations of the medium. Using three-dimensional numerical simulations, we aim to better characterize the flow circulation in complex structures and to investigate how the flow modifies the geological medium. First, X-ray computed tomography scans were performed of a complete stylolite structure (i.e., calcareous matrix, clay layering in the aperture, and the very thin aperture itself) and of a fractured sandstone sample. Then, image processing is required to extract the geometry of the porous medium of each sample. The geometries are more complicated than those of classical fractures because of the existence of non-connected, or barely connected, void spaces. We report on the influence of this image processing on the aperture geometry and on the computed permeability. This is addressed by first performing a numerical simulation of the three-dimensional velocity field, using a coupled lattice Boltzmann method that solves the complete Navier-Stokes equation. After calculating the velocity field, we then examine the link between the geometry of complex stylolites and fractures and the spatial autocorrelation of the velocity field. This correlation may be important for dispersion processes. A first approach is to compute the correlation from the simulated velocity field. Another approach is to compute the correlation function analytically from knowledge of the aperture correlation. 
The analytical approach, however, is developed in the perturbative limit of small aperture variations, which may not hold for the apertures found in stylolites. We then present the pressure field obtained within these complex structures and offer preliminary indications of how variations in pressure might be responsible for transformations of the medium that affect its mechanical and transport properties. (1) Neuville, A., R. Toussaint, and J. Schmittbuhl (2010) Hydro-thermal flows in a self-affine rough fracture. Physical Review E, 82, 036317. (2) André, G., C. Hibsch, S. Fourcade, M. Cathelineau and S. Buschaert (2010) Chronology of fracture sealing under a meteoric fluid environment: Microtectonic and isotopic evidence of major Cainozoic events in the eastern Paris Basin (France). Tectonophysics, 490, 214-228. (3) Laronne Ben-Itzhak, L., E. Aharonov, R. Toussaint and A. Sagy (2012) Upper bound on stylolite roughness as indicator for the duration and amount of dissolution. Earth and Planetary Science Letters, 337-338, 186-196. (4) Angheluta, L., J. Mathiesen, E. Aharonov (2012) Compaction of porous rock by dissolution on discrete stylolites: A one-dimensional model. Journal of Geophysical Research -- Solid Earth, 117, B08203.
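The first approach, estimating the spatial autocorrelation directly from a simulated velocity field, can be sketched via the Wiener-Khinchin theorem (autocorrelation as the inverse FFT of the power spectrum). The velocity field below is synthetic noise standing in for lattice Boltzmann output, not data from the study.

```python
import numpy as np

# Sketch: spatial autocorrelation of a 2D velocity field via FFT
# (Wiener-Khinchin theorem). The field here is synthetic random noise,
# standing in for a simulated lattice-Boltzmann velocity field.

def autocorrelation_2d(v):
    """Normalized spatial autocorrelation of a 2D scalar field."""
    v = v - v.mean()                       # remove the mean component
    f = np.fft.fft2(v)
    acf = np.fft.ifft2(f * np.conj(f)).real  # inverse FFT of power spectrum
    return acf / acf.flat[0]               # normalize so zero lag equals 1

rng = np.random.default_rng(0)
velocity = rng.normal(size=(64, 64))       # stand-in velocity component
acf = autocorrelation_2d(velocity)
print(acf[0, 0])  # 1.0 at zero lag
```

For a correlated aperture field the same function would reveal the correlation length of the flow, which is the quantity relevant to dispersion.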

  9. Mini-batch optimized full waveform inversion with geological constrained gradient filtering

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Jia, Junxiong; Wu, Bangyu; Gao, Jinghuai

    2018-05-01

    High computation cost and the generation of solutions without geological sense have hindered the wide application of Full Waveform Inversion (FWI). Source encoding is a way to dramatically reduce the cost of FWI, but it is subject to a fixed-spread acquisition requirement and converges slowly because of the need to suppress cross-talk. Traditionally, gradient regularization or preconditioning is applied to mitigate the ill-posedness. An isotropic smoothing filter applied to the gradient generally gives non-geological inversion results and can also introduce artifacts. In this work, we propose to address both the efficiency and the ill-posedness of FWI with a geologically constrained mini-batch gradient optimization method. Mini-batch gradient descent reduces computation time by choosing a subset of the entire set of shots for each iteration. By jointly applying structure-oriented smoothing to the mini-batch gradient, the inversion converges faster and gives results with more geological meaning. A stylized Marmousi model is used to show the performance of the proposed method on a realistic synthetic model.
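The mini-batch iteration described above can be sketched as follows. This is a toy under stated assumptions: the per-shot gradient and the smoothing filter are placeholder callables, not the paper's wave-equation gradients or structure-oriented filter.

```python
import random

# Minimal sketch of a mini-batch FWI loop: a random subset of shots is
# drawn each iteration, the stacked gradient is filtered, and the model is
# updated. `shot_gradient` and `smooth` are placeholder operators.

def minibatch_fwi_step(model, shots, batch_size, shot_gradient, smooth, step):
    batch = random.sample(shots, batch_size)            # subset of all shots
    grad = sum(shot_gradient(model, s) for s in batch)  # stacked batch gradient
    grad = smooth(grad)          # stand-in for structure-oriented smoothing
    return model - step * grad   # gradient-descent update

# Toy 1-D usage with the same quadratic misfit (m - 5)^2 for every shot:
shots = list(range(10))
g = lambda m, s: 2.0 * (m - 5.0)   # gradient of (m - 5)^2
m = 0.0
random.seed(1)
for _ in range(50):
    m = minibatch_fwi_step(m, shots, batch_size=3, shot_gradient=g,
                           smooth=lambda x: x, step=0.01)
print(round(m, 2))  # approaches the minimizer m = 5.0
```

Using three of ten shots per iteration cuts the per-iteration cost roughly threefold, which is the efficiency argument made in the abstract.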

  10. On-Board, Real-Time Preprocessing System for Optical Remote-Sensing Imagery

    PubMed Central

    Qi, Baogui; Zhuang, Yin; Chen, He; Chen, Liang

    2018-01-01

    With the development of remote-sensing technology, optical remote-sensing imagery processing has played an important role in many application fields, such as geological exploration and natural disaster prevention. However, relative radiation correction and geometric correction are key steps in preprocessing because raw image data without preprocessing will cause poor performance during application. Traditionally, remote-sensing data are downlinked to the ground station, preprocessed, and distributed to users. This process generates long delays, which is a major bottleneck in real-time applications for remote-sensing data. Therefore, on-board, real-time image preprocessing is greatly desired. In this paper, a real-time processing architecture for on-board imagery preprocessing is proposed. First, a hierarchical optimization and mapping method is proposed to realize the preprocessing algorithm in a hardware structure, which can effectively reduce the computation burden of on-board processing. Second, a co-processing system using a field-programmable gate array (FPGA) and a digital signal processor (DSP; altogether, FPGA-DSP) based on optimization is designed to realize real-time preprocessing. The experimental results demonstrate the potential application of our system to an on-board processor, for which resources and power consumption are limited. PMID:29693585

  11. Integrated water flow model and modflow-farm process: A comparison of theory, approaches, and features of two integrated hydrologic models

    USGS Publications Warehouse

    Dogrul, Emin C.; Schmid, Wolfgang; Hanson, Randall T.; Kadir, Tariq; Chung, Francis

    2016-01-01

    Effective modeling of conjunctive use of surface and subsurface water resources requires simulation of land use-based root zone and surface flow processes as well as groundwater flows, streamflows, and their interactions. Recently, two computer models developed for this purpose, the Integrated Water Flow Model (IWFM) from the California Department of Water Resources and the MODFLOW with Farm Process (MF-FMP) from the U.S. Geological Survey, have been applied to complex basins such as the Central Valley of California. As both IWFM and MF-FMP are publicly available for download and can be applied to other basins, there is a need to objectively compare the main approaches and features used in both models. This paper compares the concepts, as well as the method and simulation features of each hydrologic model pertaining to groundwater, surface water, and landscape processes. The comparison is focused on the integrated simulation of water demand and supply, water use, and the flow between coupled hydrologic processes. The differences in the capabilities and features of these two models could affect the outcome and types of water resource problems that can be simulated.

  12. On-Board, Real-Time Preprocessing System for Optical Remote-Sensing Imagery.

    PubMed

    Qi, Baogui; Shi, Hao; Zhuang, Yin; Chen, He; Chen, Liang

    2018-04-25

    With the development of remote-sensing technology, optical remote-sensing imagery processing has played an important role in many application fields, such as geological exploration and natural disaster prevention. However, relative radiation correction and geometric correction are key steps in preprocessing because raw image data without preprocessing will cause poor performance during application. Traditionally, remote-sensing data are downlinked to the ground station, preprocessed, and distributed to users. This process generates long delays, which is a major bottleneck in real-time applications for remote-sensing data. Therefore, on-board, real-time image preprocessing is greatly desired. In this paper, a real-time processing architecture for on-board imagery preprocessing is proposed. First, a hierarchical optimization and mapping method is proposed to realize the preprocessing algorithm in a hardware structure, which can effectively reduce the computation burden of on-board processing. Second, a co-processing system using a field-programmable gate array (FPGA) and a digital signal processor (DSP; altogether, FPGA-DSP) based on optimization is designed to realize real-time preprocessing. The experimental results demonstrate the potential application of our system to an on-board processor, for which resources and power consumption are limited.

  13. Significant achievements in the Planetary Geology Program. [geologic processes, comparative planetology, and solar system evolution

    NASA Technical Reports Server (NTRS)

    Head, J. W. (Editor)

    1978-01-01

    Developments reported at a meeting of principal investigators for NASA's Planetary Geology Program are summarized. Topics covered include: constraints on solar system formation; asteroids, comets, and satellites; constraints on planetary interiors; volatiles and regoliths; instrument development techniques; planetary cartography; geological and geochemical constraints on planetary evolution; fluvial processes and channel formation; volcanic processes; eolian processes; radar studies of planetary surfaces; cratering as a process, landform, and dating method; and the Tharsis region of Mars. Activities at a planetary geology field conference on eolian processes are reported, and techniques recommended for the presentation and analysis of crater size-frequency data are included.

  14. Flow characteristics at U.S. Geological Survey streamgages in the conterminous United States

    USGS Publications Warehouse

    Wolock, David

    2003-01-01

    This dataset represents point locations and flow characteristics for current (as of November 20, 2001) and historical U.S. Geological Survey (USGS) streamgages in the conterminous United States. The flow characteristics were computed from the daily streamflow data recorded at each streamgage for the period of record. The attributes associated with each streamgage include: station number; station name; station latitude (decimal degrees in North American Datum of 1983, NAD 83); station longitude (decimal degrees in NAD 83); first date (year, month, day) of streamflow data; last date (year, month, day) of streamflow data; number of days of streamflow data; minimum and maximum daily flow for the period of record (cubic feet per second); percentiles (1, 5, 10, 20, 25, 50, 75, 80, 90, 95, 99) of daily flow for the period of record (cubic feet per second); average and standard deviation of daily flow for the period of record (cubic feet per second); mean annual base-flow index (BFI; see supplemental information) computed for the period of record (fraction, ranging from 0 to 1); year-to-year standard deviation of the annual base-flow index computed for the period of record (fraction); number of years of data used to compute the base-flow index (years); reported drainage area (square miles); reported contributing drainage area (square miles); National Water Information System (NWIS)-Web page URL for the streamgage; Hydrologic Unit Code (HUC, 8 digit); hydrologic landscape region (HLR); and River Reach File 1 (RF1) segment identification number (E2RF1##). Station numbers, names, locations, and drainage areas were acquired through the National Water Information System (NWIS)-Web (http://water.usgs.gov/nwis) on November 20, 2001. The streamflow data used to compute flow characteristics were copied from the Water server (water.usgs.gov:/www/htdocs/nwisweb/data1/discharge/) on November 2, 2001. The missing-value indicator for all attributes is -99. 
Some streamflow characteristics are missing for: (1) streamgages measuring flow subject to tidal effects, which cause flow to reverse directions, (2) streamgages with site information but no streamflow data at the time the data were retrieved, and (3) streamgages with record length too short to compute the base-flow index.
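The flow statistics listed above can be sketched from a daily-flow series. Note the base-flow separation below is a crude rolling-minimum stand-in for illustration, not the block-minimum algorithm behind the published BFI values.

```python
import numpy as np

# Sketch of streamgage flow characteristics from a daily-flow record.
# The base-flow separation is a simplistic rolling-minimum stand-in,
# not the USGS BFI program's actual block-minimum algorithm.

def flow_characteristics(daily_flow, window=5):
    q = np.asarray(daily_flow, dtype=float)
    pct = np.percentile(q, [1, 5, 10, 20, 25, 50, 75, 80, 90, 95, 99])
    # crude base flow: rolling minimum over the trailing `window` days
    base = np.array([q[max(0, i - window):i + 1].min() for i in range(len(q))])
    bfi = base.sum() / q.sum()   # fraction of total flow that is base flow
    return {"min": q.min(), "max": q.max(), "mean": q.mean(),
            "percentiles": pct, "bfi": bfi}

stats = flow_characteristics([10, 12, 50, 30, 15, 11, 10, 9, 40, 20])
print(round(stats["bfi"], 2))  # a fraction between 0 and 1
```

Real records also need handling of the -99 missing-value indicator and of gaps noted in the dataset description, which the sketch omits.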

  15. The Formation of Life-sustaining Planets in Extrasolar Systems

    NASA Technical Reports Server (NTRS)

    Chambers, J. E.

    2003-01-01

    Space exploration is providing us with a large quantity of information about the composition of planetary and satellite crusts. However, most of the exercises proposed in Planetary Geology activity guides are based exclusively on the use of images: photographs, maps, models or artistic reconstructions [1,2]. These help us to recognize shapes and to deduce geological processes, but they say little about the materials involved. To avoid this dichotomy between shapes and materials, we have designed an exercise in which, using rocks and landscapes from our nearest geological environment, pupils can carry out a comparative planetology exercise analyzing the shapes, processes and materials of several planetary bodies of the Solar System.

  16. Chapter 3: Tabular Data and Graphical Images in Support of the U.S. Geological Survey National Oil and Gas Assessment - Western Gulf Province, Smackover-Austin-Eagle Ford Composite Total Petroleum System (504702)

    USGS Publications Warehouse

    Klett, T.R.; Le, P.A.

    2006-01-01

    This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on this CD-ROM. The data can be imported by computers and software without requiring the reader to transcribe them from the Portable Document Format (.pdf) files of the text. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in raw form as tab-delimited text files (.tab files).

  17. Application of ERTS images and image processing to regional geologic problems and geologic mapping in northern Arizona

    NASA Technical Reports Server (NTRS)

    Goetz, A. F. H. (Principal Investigator); Billingsley, F. C.; Gillespie, A. R.; Abrams, M. J.; Squires, R. L.; Shoemaker, E. M.; Lucchitta, I.; Elston, D. P.

    1975-01-01

    The author has identified the following significant results. Computer image processing was shown to be both valuable and necessary in the extraction of the proper subset of the 200 million bits of information in an ERTS image to be applied to a specific problem. Spectral reflectivity information obtained from the four MSS bands can be correlated with in situ spectral reflectance measurements after path radiance effects have been removed and a proper normalization has been made. A detailed map of the major fault systems in a 90,000 sq km area in northern Arizona was compiled from high altitude photographs and pre-existing published and unpublished map data. With the use of ERTS images, three major fault systems, the Sinyala, Bright Angel, and Mesa Butte, were identified and their full extent measured. A byproduct of the regional studies was the identification of possible sources of shallow ground water, a scarce commodity in these regions.

  18. User guide for MODPATH version 6 - A particle-tracking model for MODFLOW

    USGS Publications Warehouse

    Pollock, David W.

    2012-01-01

    MODPATH is a particle-tracking post-processing model that computes three-dimensional flow paths using output from groundwater flow simulations based on MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. This report documents MODPATH version 6. Previous versions were documented in USGS Open-File Reports 89-381 and 94-464. The program uses a semianalytical particle-tracking scheme that allows an analytical expression of a particle's flow path to be obtained within each finite-difference grid cell. A particle's path is computed by tracking the particle from one cell to the next until it reaches a boundary, an internal sink/source, or satisfies another termination criterion. Data input to MODPATH consists of a combination of MODFLOW input data files, MODFLOW head and flow output files, and other input files specific to MODPATH. Output from MODPATH consists of several output files, including a number of particle coordinate output files intended to serve as input data for other programs that process, analyze, and display the results in various ways. MODPATH is written in FORTRAN and can be compiled by any FORTRAN compiler that fully supports FORTRAN-2003 or by most commercially available FORTRAN-95 compilers that support the major FORTRAN-2003 language extensions.
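The semianalytical scheme can be illustrated in one dimension: within a cell, the velocity interpolated linearly between the two face velocities yields a closed-form travel time to the exit face. This is a sketch of the general idea, not MODPATH's actual three-dimensional implementation.

```python
import math

# 1-D sketch of semianalytical particle tracking (Pollock's scheme):
# face velocities define a linear velocity inside the cell, so the
# travel time to the downstream face has an analytical expression.

def exit_time_1d(x1, x2, v1, v2, xp):
    """Travel time from xp to the downstream face x2 of cell [x1, x2].

    Assumes v1 and v2 are positive (flow toward x2)."""
    A = (v2 - v1) / (x2 - x1)          # velocity gradient within the cell
    vp = v1 + A * (xp - x1)            # velocity at the particle position
    if A == 0.0:
        return (x2 - xp) / vp          # uniform velocity: linear travel time
    return math.log(v2 / vp) / A       # analytical solution of dx/dt = v(x)

# Particle starting at the upstream face of a 10 m cell:
t = exit_time_1d(0.0, 10.0, v1=1.0, v2=2.0, xp=0.0)
print(round(t, 3))  # ln(2)/0.1 = 6.931
```

Repeating this cell-by-cell, always handing the particle to the neighboring cell at the computed exit face, reproduces the "track until a boundary or sink" loop described in the abstract.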

  19. Geological and geochemical aspects of uranium deposits. A selected, annotated bibliography. Vol. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, M.B.; Garland, P.A.

    1977-10-01

    This bibliography was compiled by selecting 580 references from the Bibliographic Information Data Base of the Department of Energy's (DOE) National Uranium Resource Evaluation (NURE) Program. This data base and five others have been created by the Ecological Sciences Information Center to provide technical computer-retrievable data on various aspects of the nation's uranium resources. All fields of uranium geology are within the defined scope of the project, as are aerial surveying procedures, uranium reserves and resources, and universally applied uranium research. References used by DOE-NURE contractors in completing their aerial reconnaissance survey reports have been included at the request of the Grand Junction Office, DOE. The following indexes are provided to aid the user in locating references of interest: author, keyword, geographic location, quadrangle name, geoformational index, and taxonomic name.

  20. Landscape Metrics to Predict Soil Spatial Patterns

    NASA Astrophysics Data System (ADS)

    Gillin, C. P.; McGuire, K. J.; Bailey, S.; Prisley, S.

    2012-12-01

    Recent literature has advocated the application of hydropedology, or the integration of hydrology and pedology, to better understand hydrologic flowpaths and soil spatial heterogeneity in a landscape. Hydropedology can be used to describe soil units affected by distinct topography, geology, and hydrology. Such a method has not been applied to digital soil mapping in the context of spatial variations in hydrological and biogeochemical processes. The purpose of this study is to use field observations of soil morphology, geospatial information technology, and a multinomial logistic regression model to predict the distribution of five hydropedological units (HPUs) across a 41-hectare forested headwater catchment in New England. Each HPU reflects varying degrees of lateral flow influence on soil development. Ninety-six soil characterization pits were located throughout the watershed, and HPU type was identified at each pit based on the presence and thickness of genetic soil horizons. Digital terrain analysis was conducted using ArcGIS and SAGA software to compute topographic and landscape metrics. Results indicate that each HPU occurs under specific topographic settings that influence subsurface hydrologic conditions. Among the most important landscape metrics are distance from stream, distance from bedrock outcrop, upslope accumulated area, the topographic wetness index, the downslope index, and curvature. Our project is unique in that it delineates high resolution soil units using a process-based morphological approach rather than a traditional taxonomical method taken by conventional soil surveys. Hydropedological predictor models can be a valuable tool for informing forest and land management decisions, water quality planning, soil carbon accounting, and understanding subsurface hydrologic dynamics. They can also be readily calibrated for regions of differing geology, topography, and climate regimes.
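The multinomial logistic regression step can be sketched as a hand-rolled softmax regression over terrain predictors. The data below are synthetic stand-ins (two predictors for two classes, in place of the study's metrics such as distance from stream and topographic wetness index over 96 pits and five HPUs).

```python
import numpy as np

# Toy softmax (multinomial logistic) regression: terrain metrics predict
# categorical soil-unit labels. Predictors and labels are synthetic
# stand-ins for the study's landscape metrics and HPU classes.

def fit_softmax(X, y, n_classes, lr=0.1, steps=500):
    W = np.zeros((X.shape[1], n_classes))
    Y = np.eye(n_classes)[y]                      # one-hot labels
    for _ in range(steps):
        Z = X @ W
        P = np.exp(Z - Z.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)         # class probabilities
        W -= lr * X.T @ (P - Y) / len(y)          # gradient-descent step
    return W

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))                      # two synthetic predictors
y = (X[:, 0] + X[:, 1] > 0).astype(int)           # two separable "HPU" classes
W = fit_softmax(X, y, n_classes=2)
pred = (X @ W).argmax(axis=1)
print((pred == y).mean())  # training accuracy, close to 1.0
```

With fitted weights, the model can score every grid cell of a digital terrain model, which is how such a regression becomes a predictive soil map.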

  1. Progress in the development of shallow-water mapping systems

    USGS Publications Warehouse

    Bergeron, E.; Worley, C.R.; O'Brien, T.

    2007-01-01

    The USGS (U.S. Geological Survey) Coastal and Marine Geology Program has deployed an advanced autonomous shallow-draft robotic vehicle, Iris, for shallow-water mapping in Apalachicola Bay, Florida. The vehicle incorporates a side scan sonar system, seismic-reflection profiler, single-beam echosounder, and global positioning system (GPS) navigation. It is equipped with an onboard microprocessor-based motor controller, delivering signals for speed and steering to hull-mounted brushless direct-current thrusters. An onboard motion sensor in the Sea Robotics vehicle control system enclosure has been integrated in the vehicle to measure the vehicle's heave, pitch, roll, and heading. Three water-tight enclosures are mounted along the vehicle axis for the Edgetech computer and electronics system, including the Sea Robotics computer, a control and wireless communications system, and a Thales ZXW real-time kinematic (RTK) GPS receiver. The vehicle has produced high-quality seismic-reflection and side scan sonar data, which will help in developing baseline oyster habitat maps.

  2. Guidelines and standard procedures for continuous water-quality monitors: Site selection, field operation, calibration, record computation, and reporting

    USGS Publications Warehouse

    Wagner, Richard J.; Mattraw, Harold C.; Ritz, George F.; Smith, Brett A.

    2000-01-01

    The U.S. Geological Survey uses continuous water-quality monitors to assess variations in the quality of the Nation's surface water. A common system configuration for data collection is the four-parameter water-quality monitoring system, which collects temperature, specific conductance, dissolved oxygen, and pH data, although systems can be configured to measure other properties such as turbidity or chlorophyll. The sensors that are used to measure these water properties require careful field observation, cleaning, and calibration procedures, as well as thorough procedures for the computation and publication of final records. Data from sensors can be used in conjunction with collected samples and chemical analyses to estimate chemical loads. This report provides guidelines for site-selection considerations, sensor test methods, field procedures, error correction, data computation, and review and publication processes. These procedures have evolved over the past three decades, and the process continues to evolve with newer technologies.
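One common error-correction step for continuous monitor records is removing sensor drift by linear interpolation in time between two calibration visits. The sketch below uses invented specific-conductance values and is a simplified stand-in for the full record-computation procedures.

```python
# Sketch of drift correction for a continuous water-quality record:
# sensor error measured at two calibration visits is removed from
# intervening readings by linear interpolation in time. Values invented.

def drift_correct(times, values, t0, err0, t1, err1):
    """Subtract linearly interpolated sensor error between calibrations."""
    corrected = []
    for t, v in zip(times, values):
        frac = (t - t0) / (t1 - t0)            # position between calibrations
        corrected.append(v - (err0 + frac * (err1 - err0)))
    return corrected

# Specific-conductance readings with drift growing from 0 to +4 uS/cm:
times = [0, 1, 2, 3, 4]
raw = [100.0, 101.0, 102.0, 103.0, 104.0]
corrected = drift_correct(times, raw, t0=0, err0=0.0, t1=4, err1=4.0)
print(corrected)  # each reading corrected back to 100.0
```

The same interpolation pattern applies to any of the four standard parameters (temperature, specific conductance, dissolved oxygen, pH) once the calibration-check errors are known.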

  3. Interactive Online Modules and Videos for Learning Geological Concepts at the University of Toronto Department of Earth Sciences

    NASA Astrophysics Data System (ADS)

    Veglio, E.; Graves, L. W.; Bank, C. G.

    2014-12-01

    We designed various computer-based applications and videos as educational resources for undergraduate courses in the Department of Earth Sciences at the University of Toronto. These resources were developed in an effort to enhance students' self-directed learning of key concepts identified by educators in the department. The interactive learning modules and videos were created using MATLAB and Adobe Creative Suite 5 (Photoshop and Premiere) and cover topics ranging from optical mineralogy (extinction and the Becke line), petrology (equilibrium melting in two-phase systems), and crystallography (crystal systems) to geophysics (the gravity anomaly) and geologic history (the evolution of Canada). These resources will be made available to students on internal course websites as well as through the department's website (www.es.utoronto.ca) where appropriate; the video platform YouTube.com may be used to reach a wide audience and promote the material. Usage of the material will be monitored and feedback collected over the next academic year in order to gauge the use of these interactive learning tools and to assess whether these computer-based applications and videos foster student engagement and active learning, and thus offer an enriched learning experience.

  4. Recommendations concerning satellite-acquired earth resource data: 1982 report of the Data Management Subcommittee of the GEOSAT Committee, Incorporated

    NASA Technical Reports Server (NTRS)

    1982-01-01

    End user concerns about the content and accessibility of libraries of remote sensing data in general are addressed. Recommendations pertaining to the United States' satellite remote sensing programs urge: (1) the continuation of the NASA/EROS Data Center program to convert pre-1979 scenes to computer readable tapes and create a historical archive of this valuable data; (2) improving the EROS archive by adding geologically interesting scenes, data from other agencies (including previously classified data), and by adopting a policy to retire data from the archive; (3) establishing a computer data base inquiry system that includes remote sensing data from all publicly available sources; (4) capability for prepurchase review and evaluation; (5) a flexible price structure; and (6) adoption of standard digital data products format. Information about LANDSAT 4, the status of worldwide LANDSAT receiving stations, future non-U.S. remote sensing satellites, a list of sources for LANDSAT data, and the results of a survey of GEOSAT members' remote sensing data processing systems are also considered.

  5. Massively parallel electrical conductivity imaging of the subsurface: Applications to hydrocarbon exploration

    NASA Astrophysics Data System (ADS)

    Newman, Gregory A.; Commer, Michael

    2009-07-01

    Three-dimensional (3D) geophysical imaging is now receiving considerable attention for electrical conductivity mapping of potential offshore oil and gas reservoirs. The imaging technology employs controlled source electromagnetic (CSEM) and magnetotelluric (MT) fields and treats geological media exhibiting transverse anisotropy. Moreover, when combined with established seismic methods, direct imaging of reservoir fluids is possible. Because of the size of the 3D conductivity imaging problem, strategies exploiting computational parallelism and optimal meshing are required. The algorithm thus developed has been shown to scale to tens of thousands of processors. In one imaging experiment, 32,768 tasks/processors on the IBM Watson Research Blue Gene/L supercomputer were successfully utilized. Over a 24-hour period we were able to image a large-scale field data set that had previously required over four months of processing time on distributed clusters based on Intel or AMD processors utilizing 1024 tasks on an InfiniBand fabric. Electrical conductivity imaging using massively parallel computational resources produces results that cannot be obtained otherwise and is consistent with the timeframes required for practical exploration problems.

  6. Implementation of the Geological Hazard Monitoring and Early Warning System Based on Multi - source Data -A Case Study of Deqin Tibetan County, Yunnan Province

    NASA Astrophysics Data System (ADS)

    Zhao, Junsan; Chen, Guoping; Yuan, Lei

    2017-04-01

    New technologies such as 3D laser scanning, InSAR, GNSS, unmanned aerial vehicles and the Internet of Things provide rich data resources for surveying and monitoring, as well as for the development of Early Warning Systems (EWS). This paper presents the design and implementation of a geological disaster monitoring and early warning system (GDMEWS), covering landslide and debris-flow hazards, based on multi-source data acquired with the technologies mentioned above. The complex and changeable characteristics of the GDMEWS are described. The architecture of the system, the composition of the multi-source database, the development mode and service logic, and the methods and key technologies of system development are also analyzed. To elaborate the implementation process of the GDMEWS, Deqin Tibetan County, with its unique terrain and diverse types of typical landslides and debris flows, is selected as a case-study area. First, the system's functional requirements and its monitoring and forecasting models are discussed. Second, the logical relationships across the whole disaster process, including pre-disaster preparation, disaster rescue and post-disaster reconstruction, are studied, and support tools for disaster prevention, disaster reduction and geological disaster management are developed. Third, the methods for integrating multi-source monitoring data and for generating and simulating mechanism models of geological hazards are described. Finally, the construction of the GDMEWS is presented; the system will be applied to real-time, dynamic management, monitoring and forecasting of the whole disaster process in Deqin Tibetan County. Keywords: multi-source spatial data; geological disaster; monitoring and warning system; Deqin Tibetan County

  7. Computer finds ore

    NASA Astrophysics Data System (ADS)

    Bell, Peter M.

    Artificial intelligence techniques are being used for the first time to evaluate geophysical, geochemical, and geologic data and theory in order to locate ore deposits. After several years of development, an intelligent computer code has been formulated and applied to the Mount Tolman area in Washington state. In a project funded by the United States Geological Survey and the National Science Foundation a set of computer programs, under the general title Prospector, was used successfully to locate a previously unknown ore-grade porphyry molybdenum deposit in the vicinity of Mount Tolman (Science, Sept. 3, 1982).The general area of the deposit had been known to contain exposures of porphyry mineralization. Between 1964 and 1978, exploration surveys had been run by the Bear Creek Mining Company, and later exploration was done in the area by the Amax Corporation. Some of the geophysical data and geochemical and other prospecting surveys were incorporated into the programs, and mine exploration specialists contributed to a set of rules for Prospector. The rules were encoded as ‘inference networks’ to form the ‘expert system’ on which the artificial intelligence codes were based. The molybdenum ore deposit discovered by the test is large, located subsurface, and has an areal extent of more than 18 km2.
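A rule network like Prospector's can be caricatured as weighted evidence combining into a hypothesis score. This miniature is hypothetical: the evidence names and weights are invented, and Prospector's real inference networks used Bayesian odds updating over hundreds of rules rather than the simple noisy-OR combination shown here.

```python
# Hypothetical miniature of a Prospector-style inference network: observed
# field evidence propagates through weighted rules to a hypothesis score.
# Evidence names and weights are invented; the real system used Bayesian
# odds updating, not this noisy-OR combination.

RULES = {
    "porphyry_mo_deposit": [
        ("porphyry_texture_observed", 0.5),
        ("molybdenite_in_veins", 0.8),
        ("anomalous_mo_geochemistry", 0.6),
    ],
}

def score(hypothesis, evidence):
    """Combine weights of observed evidence: s grows as 1 - prod(1 - w_i)."""
    s = 0.0
    for ev, weight in RULES[hypothesis]:
        if evidence.get(ev):
            s = s + weight * (1.0 - s)   # noisy-OR style accumulation
    return s

obs = {"porphyry_texture_observed": True, "molybdenite_in_veins": True}
s = score("porphyry_mo_deposit", obs)
print(round(s, 2))  # 0.9
```

Encoding expert knowledge as such a network is what let the system rank prospects from partial geophysical and geochemical survey data.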

  8. Crossing disciplines and scales to understand the critical zone

    USGS Publications Warehouse

    Brantley, S.L.; Goldhaber, M.B.; Vala, Ragnarsdottir K.

    2007-01-01

    The Critical Zone (CZ) is the system of coupled chemical, biological, physical, and geological processes operating together to support life at the Earth's surface. While our understanding of this zone has increased over the last hundred years, further advance requires scientists to cross disciplines and scales to integrate understanding of processes in the CZ, ranging in scale from the mineral-water interface to the globe. Despite the extreme heterogeneities manifest in the CZ, patterns are observed at all scales. Explanations require the use of new computational and analytical tools, inventive interdisciplinary approaches, and growing networks of sites and people.

  9. Semantic Web-based digital, field and virtual geological

    NASA Astrophysics Data System (ADS)

    Babaie, H. A.

    2012-12-01

    Digital, field and virtual Semantic Web-based education (SWBE) in geological mapping requires the construction of a set of searchable, reusable, and interoperable digital learning objects (LO) for learners, teachers, and authors. These self-contained units of learning may be text, image, or audio, describing, for example, how to calculate the true dip of a layer from two structural contours or how to find the apparent dip along a line of section. A collection of multi-media LOs can be integrated, through domain and task ontologies, with mapping-related learning activities and Web services, for example, to search for the description of lithostratigraphic units in an area or to plot orientation data on a stereonet. Domain ontologies (e.g., GeologicStructure, Lithostratigraphy, Rock) represent knowledge in formal languages (RDF, OWL) by explicitly specifying concepts, relations, and theories involved in geological mapping. These ontologies are used by task ontologies that formalize the semantics of computational tasks (e.g., measuring the true thickness of a formation) and activities (e.g., construction of a cross section) for all actors to solve specific problems (map making, instruction, learning support, authoring). A SWBE system for geological mapping should also involve ontologies to formalize teaching strategy (pedagogical styles), the learner model (e.g., for student performance and personalization of learning), the interface (entry points for activities of all actors), communication (exchange of messages among different components and actors), and educational Web services (for interoperability). In this ontology-based environment, actors interact with the LOs through educational servers that manage (reuse, edit, delete, store) ontologies, and through tools that communicate with Web services to collect resources and links to other tools. Digital geological mapping involves a location-based, spatial organization of geological elements in a set of GIS thematic layers. 
Each layer in the stack assembles a set of polygonal (e.g., formation, member, intrusion), linear (e.g., fault, contact), and/or point (e.g., sample or measurement site) geological elements. These feature classes, represented in domain ontologies by classes, have their own sets of property (attribute, association), topological (e.g., overlap, adjacency, containment), and network (cross-cutting, connectivity) relationships. Since geological mapping involves describing and depicting different aspects of each feature class (e.g., contact, formation, structure), the same geographic region may be investigated by different communities, for example, for its stratigraphy, rock type, structure, soil type, and isotopic and paleontological age, using sets of ontologies. These data can become interconnected, by applying Semantic Web technologies, on the Linked Open Data Cloud, based on their underlying common geographic coordinates. Sets of geological data published on the Cloud will include multiple RDF links to the Cloud's geospatial nodes such as GeoNames and Linked GeoData. During mapping, a device such as a smartphone, laptop, or iPad, with GPS and GIS capability and a DBpedia Mobile client, can use the current position to discover and query all the linked geological data, add new data to the thematic layers, and publish them to the Cloud.
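
    A learning object like those described, for computing the true dip of a layer from two structural contours and the apparent dip along a line of section, could wrap the standard textbook relations (these are not code from the paper):

```python
import math

def true_dip(contour_interval, horizontal_spacing):
    """True dip (degrees) of a planar layer from two structural contours:
    tan(dip) = vertical interval / map distance between the contours."""
    return math.degrees(math.atan2(contour_interval, horizontal_spacing))

def apparent_dip(true_dip_deg, angle_from_dip_direction_deg):
    """Apparent dip (degrees) along a section line oriented at the given
    angle from the true-dip direction: tan(app) = tan(true) * cos(angle)."""
    t = math.tan(math.radians(true_dip_deg)) * \
        math.cos(math.radians(angle_from_dip_direction_deg))
    return math.degrees(math.atan(t))

# 100 m contour interval over 173.2 m of map distance -> ~30 deg true dip
d = true_dip(100.0, 173.2)
# A section 60 deg off the dip direction sees a gentler apparent dip
a = apparent_dip(d, 60.0)
```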

  10. New Age of 3D Geological Modelling or Complexity is not an Issue Anymore

    NASA Astrophysics Data System (ADS)

    Mitrofanov, Aleksandr

    2017-04-01

    A geological model has significant value in almost all types of research related to regional mapping, geodynamics, and especially to the structural and resource geology of mineral deposits. A well-developed geological model must capture all vital features of the modelled object without over-simplification and should adequately represent the geologist's interpretation. In recent years, with the gradual exhaustion of deposits of relatively simple morphology, geologists all over the world are faced with the necessity of building representative models for increasingly complex objects. Meanwhile, the set of tools used for this has not changed significantly in the last two to three decades. The most widespread method of wireframe geological modelling was developed in the 1990s and is fully based on engineering design tools (CAD). Strings and polygons representing a section-based interpretation are used as an intermediate step in wireframe generation. Despite the significant time this type of modelling requires, it can still provide sufficient results for simple and medium-complexity geological objects. However, with increasing complexity, more and more vital features of a deposit are sacrificed because of the fundamental inability (or the much greater modelling time required) of CAD-based explicit techniques to produce wireframes of the appropriate complexity. At the same time, an alternative technology, not based on a sectional approach and using fundamentally different mathematical algorithms, has been actively developed in a variety of other disciplines: medicine, advanced industrial design, and the game and cinema industries. In recent years this implicit technology has been adapted for geological modelling, and it is now represented by a very powerful set of tools integrated into almost all major commercial software packages. 
Implicit modelling makes it possible to develop geological models that genuinely correspond to complicated geological reality. Models can include fault blocking, complex structural trends, and folding; they can be based on an extensive input dataset (such as dense drilling at the mining stage) or, on the other hand, on quite few drillhole intersections with significant input from the geological interpretation of the deposit. In any case, implicit modelling, if used correctly, allows the whole body of geological data to be incorporated and relatively quickly yields easily adjustable, flexible, and robust geological wireframes that can serve as a reliable foundation for subsequent stages of geological investigation. In SRK's current practice, almost all wireframe models used for structural and resource geology are developed with implicit modelling tools, which has significantly increased the speed and quality of geological modelling.
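
    The core of implicit modelling is interpolating a scalar field from scattered drillhole data and extracting an isosurface as the geological boundary. A toy sketch of the radial-basis-function approach commonly used, in 2D with made-up data (not any package's actual implementation):

```python
import math

def solve(A, b):
    """Tiny Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(points, values, eps=1.0):
    """Fit a Gaussian radial-basis-function interpolant to signed data;
    the returned field is exact at the data points."""
    phi = lambda a, b: math.exp(-eps * ((a[0]-b[0])**2 + (a[1]-b[1])**2))
    A = [[phi(p, q) for q in points] for p in points]
    w = solve(A, values)
    return lambda x, y: sum(wi * phi((x, y), p) for wi, p in zip(w, points))

# Drillhole intersections: +1 inside the ore body, -1 outside (made-up)
pts = [(0, 0), (1, 0), (0, 1), (2, 2), (3, 1), (1, 3)]
vals = [1.0, 1.0, 1.0, -1.0, -1.0, -1.0]
f = rbf_fit(pts, vals)
# The modelled ore boundary is the zero isosurface of f
```

    Commercial implementations add anisotropy, structural trends, and fault blocking on top of this basic interpolation idea.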

  11. Geology and mineral and energy resources, Roswell Resource Area, New Mexico; an interactive computer presentation

    USGS Publications Warehouse

    Tidball, Ronald R.; Bartsch-Winkler, S. B.

    1995-01-01

    This Compact Disc-Read Only Memory (CD-ROM) contains a program illustrating the geology and mineral and energy resources of the Roswell Resource Area, an administrative unit of the U.S. Bureau of Land Management in east-central New Mexico. The program enables the user to access information on the geology, geochemistry, geophysics, mining history, metallic and industrial mineral commodities, hydrocarbons, and assessments of the area. The program was created with the display software, SuperCard, version 1.5, by Aldus. The program will run only on a Macintosh personal computer. This CD-ROM was produced in accordance with Macintosh HFS standards. The program was developed on a Macintosh II-series computer with system 7.0.1. The program is a compiled, executable form that is nonproprietary and does not require the presence of the SuperCard software.

  12. Automated extraction of natural drainage density patterns for the conterminous United States through high performance computing

    USGS Publications Warehouse

    Stanislawski, Larry V.; Falgout, Jeff T.; Buttenfield, Barbara P.

    2015-01-01

    Hydrographic networks form an important data foundation for cartographic base mapping and for hydrologic analysis. Drainage density patterns for these networks can be derived to characterize local landscape, bedrock and climate conditions, and further inform hydrologic and geomorphological analysis by indicating areas where too few headwater channels have been extracted. But natural drainage density patterns are not consistently available in existing hydrographic data for the United States because compilation and capture criteria historically varied, along with climate, during the period of data collection over the various terrain types throughout the country. This paper demonstrates an automated workflow that is being tested in a high-performance computing environment by the U.S. Geological Survey (USGS) to map natural drainage density patterns at the 1:24,000-scale (24K) for the conterminous United States. Hydrographic network drainage patterns may be extracted from elevation data to guide corrections for existing hydrographic network data. The paper describes three stages in this workflow including data pre-processing, natural channel extraction, and generation of drainage density patterns from extracted channels. The workflow is concurrently implemented by executing procedures on multiple subbasin watersheds within the U.S. National Hydrography Dataset (NHD). Pre-processing defines parameters that are needed for the extraction process. Extraction proceeds in standard fashion: filling sinks, developing flow direction and weighted flow accumulation rasters. Drainage channels with assigned Strahler stream order are extracted within a subbasin and simplified. Drainage density patterns are then estimated with 100-meter resolution and subsequently smoothed with a low-pass filter. The extraction process is found to be of better quality in higher slope terrains. 
Concurrent processing through the high performance computing environment is shown to facilitate and refine the choice of drainage density extraction parameters and more readily improve extraction procedures than conventional processing.
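
    The extraction stages described (flow direction, weighted flow accumulation, channel thresholding, density estimation) can be illustrated on a toy grid. A hedged sketch of D8 accumulation and a density estimate, not the USGS workflow itself (sink filling is omitted because the synthetic DEM has none):

```python
def d8_accumulation(dem):
    """D8 flow accumulation: each cell passes its accumulated area to its
    steepest-descent neighbor; cells are processed from high to low."""
    rows, cols = len(dem), len(dem[0])
    nbrs = [(-1,-1), (-1,0), (-1,1), (0,-1), (0,1), (1,-1), (1,0), (1,1)]
    acc = [[1.0] * cols for _ in range(rows)]  # each cell contributes itself
    order = sorted(((dem[r][c], r, c) for r in range(rows)
                    for c in range(cols)), reverse=True)
    for z, r, c in order:
        best, drop = None, 0.0
        for dr, dc in nbrs:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                dist = (dr * dr + dc * dc) ** 0.5
                g = (z - dem[rr][cc]) / dist
                if g > drop:
                    best, drop = (rr, cc), g
        if best:  # sinks/outlets keep their accumulation
            acc[best[0]][best[1]] += acc[r][c]
    return acc

def drainage_density(acc, cell_km, threshold):
    """Channel cells exceed an accumulation threshold;
    density = total channel length / basin area (km per km^2)."""
    chan = sum(1 for row in acc for a in row if a >= threshold)
    ncells = len(acc) * len(acc[0])
    return (chan * cell_km) / (ncells * cell_km ** 2)

# Tiny synthetic DEM sloping toward a central valley in column 1
dem = [[3, 1, 3],
       [2.5, 0.5, 2.5],
       [2, 0, 2]]
acc = d8_accumulation(dem)
dd = drainage_density(acc, cell_km=0.03, threshold=3.0)  # 30 m cells
```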

  13. Viking Lander Mosaics of Mars

    NASA Technical Reports Server (NTRS)

    Morris, E. C.

    1985-01-01

    The Viking Lander 1 and 2 cameras acquired many high-resolution pictures of the Chryse Planitia and Utopia Planitia landing sites. Based on computer-processed data of a selected number of these pictures, eight high-resolution mosaics were published by the U.S. Geological Survey as part of the Atlas of Mars, Miscellaneous Investigation Series. The mosaics are composites of the best picture elements (pixels) of all the Lander pictures used. Each complete mosaic extends 342.5 deg in azimuth, from approximately 5 deg above the horizon to 60 deg below, and incorporates approximately 15 million pixels. Each mosaic is shown in a set of five sheets. One sheet contains the full panorama from one camera taken in either morning or evening. The other four sheets show sectors of the panorama at an enlarged scale; when joined together they make a panorama approximately 2' X 9'.

  14. Mechanics of wind ripple stratigraphy.

    PubMed

    Forrest, S B; Haff, P K

    1992-03-06

    Stratigraphic patterns preserved under translating surface undulations or ripples in a depositional eolian environment are computed on a grain by grain basis using physically based cellular automata models. The spontaneous appearance, growth, and motion of the simulated ripples correspond in many respects to the behavior of natural ripples. The simulations show that climbing strata can be produced by impact alone; direct action of fluid shear is unnecessary. The model provides a means for evaluating the connection between mechanical processes occurring in the paleoenvironment during deposition and the resulting stratigraphy preserved in the geologic column: vertical compression of small laminae above a planar surface indicates nascent ripple growth; supercritical laminae are associated with unusually intense deposition episodes; and a plane erosion surface separating sets of well-developed laminae is consistent with continued migration of mature ripples during a hiatus in deposition.
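
    A minimal cellular automaton in the spirit of such grain-by-grain models (pickup, fixed-length hop, shadow-zone deposition) can be sketched as below. This toy omits the angle-of-repose relaxation and impact-driven reptation that the actual simulations include, so it illustrates the bookkeeping rather than reproducing ripple physics:

```python
import random

def shadowed(h, j):
    """Crude shadow test: a site is sheltered if the column immediately
    upwind stands 2 or more grains higher (a stand-in for the impact-angle
    geometry of real saltation)."""
    return h[(j - 1) % len(h)] - h[j] >= 2

def step(h, hop=5):
    """One saltation event: pick up a grain at a random site and deposit
    it `hop` sites downwind, walking further downwind until an exposed
    site is found (periodic boundaries)."""
    n = len(h)
    i = random.randrange(n)
    j = (i + hop) % n
    while shadowed(h, j):
        j = (j + 1) % n
    h[i] -= 1
    h[j] += 1

random.seed(0)
h = [50] * 100          # 1D bed of grain columns
for _ in range(5000):
    step(h)
# Grain count is conserved by construction; relief develops as impacts
# redistribute grains downwind.
```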

  15. An Improved Computing Method for 3D Mechanical Connectivity Rates Based on a Polyhedral Simulation Model of Discrete Fracture Network in Rock Masses

    NASA Astrophysics Data System (ADS)

    Li, Mingchao; Han, Shuai; Zhou, Sibao; Zhang, Ye

    2018-06-01

    Based on a 3D model of a discrete fracture network (DFN) in a rock mass, an improved projective method for computing the 3D mechanical connectivity rate was proposed. The Monte Carlo simulation method, 2D Poisson process and 3D geological modeling technique were integrated into a polyhedral DFN modeling approach, and the simulation results were verified by numerical tests and graphical inspection. Next, the traditional projective approach for calculating the rock mass connectivity rate was improved using the 3D DFN models by (1) using the polyhedral model to replace the Baecher disk model; (2) taking the real cross section of the rock mass, rather than a part of the cross section, as the test plane; and (3) dynamically searching the joint connectivity rates using different dip directions and dip angles at different elevations to calculate the maximum, minimum and average values of the joint connectivity at each elevation. In a case study, the improved method and the traditional method were used to compute the mechanical connectivity rate of the slope of a dam abutment. The results of the two methods were further used to compute the cohesive force of the rock masses. Finally, a comparison showed that the cohesive force derived from the traditional method had a larger error, whereas the cohesive force derived from the improved method was consistent with the suggested values. According to the comparison, the effectiveness and validity of the improved method were verified indirectly.
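
    One building block of the projective approach, the joint connectivity rate along a single test line on the cross section, can be sketched as the fraction of the line covered by the union of joint-intersection intervals (a simplified illustration, not the authors' implementation):

```python
def connectivity_rate(joint_intervals, plane_length):
    """Connectivity rate along one test line: the fraction of the line
    length intersected by joints, computed from the union of (start, end)
    intersection intervals so overlapping joints are counted once."""
    merged = []
    for s, e in sorted(joint_intervals):
        s, e = max(0.0, s), min(plane_length, e)  # clip to the line
        if e <= s:
            continue
        if merged and s <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], e)  # extend current run
        else:
            merged.append([s, e])
    return sum(e - s for s, e in merged) / plane_length

# Joint traces crossing a 10 m test line; (1,3) and (2,4) overlap
k = connectivity_rate([(1, 3), (2, 4), (6, 7)], 10.0)
```

    The paper's dynamic search would repeat such a computation over many dip directions and dip angles at each elevation to extract maximum, minimum and average values.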

  16. Summary on several key techniques in 3D geological modeling.

    PubMed

    Mei, Gang

    2014-01-01

    Several key techniques in 3D geological modeling including planar mesh generation, spatial interpolation, and surface intersection are summarized in this paper. Note that these techniques are generic and widely used in various applications but play a key role in 3D geological modeling. There are two essential procedures in 3D geological modeling: the first is the simulation of geological interfaces using geometric surfaces and the second is the building of geological objects by means of various geometric computations such as the intersection of surfaces. Discrete geometric surfaces that represent geological interfaces can be generated by creating planar meshes first and then spatially interpolating; those surfaces intersect and then form volumes that represent three-dimensional geological objects such as rock bodies. In this paper, the most commonly used algorithms of the key techniques in 3D geological modeling are summarized.
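
    Of the techniques summarized, spatial interpolation is the easiest to illustrate. A minimal inverse-distance-weighted (IDW) interpolator for scattered elevation picks on a geological interface might look like this (made-up data):

```python
def idw(samples, x, y, power=2.0):
    """Inverse-distance-weighted estimate of a surface elevation at (x, y)
    from scattered (xi, yi, zi) samples; exact at sample locations."""
    num = den = 0.0
    for xi, yi, zi in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return zi  # query point coincides with a sample
        w = d2 ** (-power / 2.0)
        num += w * zi
        den += w
    return num / den

# Elevation picks on a geological interface (illustrative values)
picks = [(0, 0, 100.0), (10, 0, 120.0), (0, 10, 110.0), (10, 10, 130.0)]
z = idw(picks, 5.0, 5.0)
```

    In a full modeling workflow, interpolated values at planar-mesh nodes turn the mesh into a discrete geological surface, which then intersects other surfaces to bound volumes.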

  17. Modelling DC responses of 3D complex fracture networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beskardes, Gungor Didem; Weiss, Chester Joseph

    Here, the determination of the geometrical properties of fractures plays a critical role in many engineering problems to assess the current hydrological and mechanical states of geological media and to predict their future states. However, numerical modeling of geoelectrical responses in realistic fractured media has been challenging due to the explosive computational cost imposed by the explicit discretizations of fractures at multiple length scales, which often brings about a tradeoff between computational efficiency and geologic realism. Here, we use the hierarchical finite element method to model electrostatic response of realistically complex 3D conductive fracture networks with minimal computational cost.

  18. Modelling DC responses of 3D complex fracture networks

    DOE PAGES

    Beskardes, Gungor Didem; Weiss, Chester Joseph

    2018-03-01

    Here, the determination of the geometrical properties of fractures plays a critical role in many engineering problems to assess the current hydrological and mechanical states of geological media and to predict their future states. However, numerical modeling of geoelectrical responses in realistic fractured media has been challenging due to the explosive computational cost imposed by the explicit discretizations of fractures at multiple length scales, which often brings about a tradeoff between computational efficiency and geologic realism. Here, we use the hierarchical finite element method to model electrostatic response of realistically complex 3D conductive fracture networks with minimal computational cost.

  19. Study on the Integrated Geophysic Methods and Application of Advanced Geological Detection for Complicated Tunnel

    NASA Astrophysics Data System (ADS)

    Zhou, L.; Xiao, G.

    2014-12-01

    The engineering geological and hydrological conditions of current tunnels are more and more complicated, as the tunnels are elongated with deeper depth. In constructing these complicated tunnels, geological hazards prone to occur as induced by unfavorable geological bodies, such as fault zones, karst or hydrous structures, etc. The working emphasis and difficulty of the advanced geological exploration for complicated tunnels are mainly focused on the structure and water content of these unfavorable geological bodies. The technical aspects of my paper systematically studied the advanced geological exploration theory and application aspects for complicated tunnels, with discussion on the key technical points and useful conclusions. For the all-aroundness and accuracy of advanced geological exploration results, the objective of my paper is targeted on the comprehensive examination on the structure and hydrous characteristic of the unfavorable geological bodies in complicated tunnels. By the multi-component seismic modeling on a more real model containing the air medium, the wave field response characteristics of unfavorable geological bodies can be analyzed, thus providing theoretical foundation for the observation system layout, signal processing and interpretation of seismic methods. Based on the tomographic imaging theory of seismic and electromagnetic method, 2D integrated seismic and electromagnetic tomographic imaging and visualization software was designed and applied in the advanced drilling hole in the tunnel face, after validation of the forward and inverse modeling results on theoretical models. The transmission wave imaging technology introduced in my paper can be served as a new criterion for detection of unfavorable geological bodies. 
After careful study on the basic theory, data processing and interpretation, practical applications of TSP and ground penetrating radar (GPR) method, as well as serious examination on their application examples, my paper formulated a suite of comprehensive application system of seismic and electromagnetic methods for the advanced geological exploration of complicated tunnels. This research is funded by National Natural Science Foundation of China (Grant No. 41202223) .

  20. Global stratigraphy. [of planet Mars

    NASA Technical Reports Server (NTRS)

    Tanaka, Kenneth L.; Scott, David H.; Greeley, Ronald

    1992-01-01

    Attention is given to recent major advances in the definition and documentation of Martian stratigraphy and geology. Mariner 9 provided the images for the first global geologic mapping program, resulting in the recognition of the major geologic processes that have operated on the planet, and in the definition of the three major chronostratigraphic divisions: the Noachian, Hesperian, and Amazonian Systems. Viking Orbiter images permitted the recognition of additional geologic units and the formal naming of many formations. Epochs are assigned absolute ages based on the densities of superposed craters and crater-flux models. Recommendations are made with regard to future areas of study, namely, crustal stratigraphy and structure, the highland-lowland boundary, the Tharsis Rise, Valles Marineris, channels and valley networks, and possible Martian oceans, lakes, and ponds.
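
    The crater-density dating mentioned above rests on a simple count: the cumulative number of superposed craters at or above a reference diameter, per unit area. A hedged sketch with synthetic counts (illustrative numbers only; converting densities to absolute ages requires an assumed crater-flux model):

```python
def cumulative_density(diameters_km, area_km2, d_min=1.0):
    """Cumulative crater density N(>= d_min): craters at or above the
    reference diameter per unit area. Higher values imply an older
    surface under an assumed crater-flux model."""
    return sum(1 for d in diameters_km if d >= d_min) / area_km2

# Synthetic counts on two mapped units of equal area
old_unit = cumulative_density([0.5, 1.2, 2.0, 3.5, 1.1, 0.9], area_km2=100.0)
young_unit = cumulative_density([0.6, 1.3, 0.4], area_km2=100.0)
# old_unit > young_unit, consistent with a greater relative age
```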

  1. Resident research associateships, postdoctoral research awards 1989: opportunities for research at the U.S. Geological Survey, U.S. Department of the Interior

    USGS Publications Warehouse

    ,; ,

    1989-01-01

    The scientists of the U.S. Geological Survey are engaged in a wide range of geologic, geophysical, geochemical, hydrologic, and cartographic programs, including the application of computer science to them. These programs offer exciting possibilities for scientific achievement and professional growth to young scientists through participation as Research Associates.

  2. Modeling and analysis of Soil Erosion processes by the River Basins model: The Case Study of the Krivacki Potok Watershed, Montenegro

    NASA Astrophysics Data System (ADS)

    Vujacic, Dusko; Barovic, Goran; Mijanovic, Dragica; Spalevic, Velibor; Curovic, Milic; Tanaskovic, Vjekoslav; Djurovic, Nevenka

    2016-04-01

    The objective of this research was to study soil erosion processes in one of the northern Montenegrin watersheds, the Krivacki Potok Watershed of the Polimlje River Basin, using modeling techniques: the River Basins computer-graphic model, based on the analytical Erosion Potential Method (EPM) of Gavrilovic, for calculation of runoff and soil loss. Our findings indicate a low potential soil erosion risk, with an annual sediment yield of 554 m³ yr⁻¹ and an area-specific sediment yield of 180 m³ km⁻² yr⁻¹. The calculation outcomes were validated for all 57 river basins of Polimlje through measurements of lake sediment deposition at the Potpec hydropower plant dam. According to our analysis, the Krivacki Potok drainage basin has a relatively low sediment discharge, and its erosion type is mixed. The Z coefficient was calculated at 0.297, which indicates that the river basin belongs to the 4th destruction category (of five). The calculated peak discharge from the river basin was 73 m³ s⁻¹ for a 100-year recurrence interval, so large flood waves may appear in the studied river basin. Using adequate computer-graphic and analytical modeling tools, we improved the knowledge of the soil erosion processes of the river basins of this part of Montenegro. The computer-graphic River Basins model of Spalevic, based on the analytical EPM method of Gavrilovic, is highly recommended for soil erosion modelling in other river basins of Southeastern Europe, because of its reliable detection and appropriate classification of the areas affected by erosion-driven soil loss, while taking into consideration interactions between environmental elements such as physical-geographical features, climate, and geological and pedological characteristics, including land use, all calculated at the catchment scale.
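
    The EPM calculation at the heart of the model follows Gavrilovic's formula W = T · H · π · √(Z³) · F. A sketch using the paper's Z = 0.297 but illustrative climate and area inputs (not the study's actual data):

```python
import math

def epm_sediment_yield(t_celsius, h_mm, z, area_km2):
    """Annual sediment production W (m^3/yr) by the Erosion Potential
    Method of Gavrilovic: W = T * H * pi * sqrt(Z^3) * F, with the
    temperature coefficient T = sqrt(t/10 + 0.1)."""
    t_coeff = math.sqrt(t_celsius / 10.0 + 0.1)
    return t_coeff * h_mm * math.pi * math.sqrt(z ** 3) * area_km2

# Illustrative inputs: 9 C mean annual temperature, 900 mm precipitation,
# 3 km^2 catchment, with the paper's erosion coefficient Z = 0.297
w = epm_sediment_yield(t_celsius=9.0, h_mm=900.0, z=0.297, area_km2=3.0)
```

    Dividing W by the catchment area (and applying a sediment delivery ratio, omitted here) gives the area-specific yield the paper reports.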

  3. Regional-scale brine migration along vertical pathways due to CO2 injection - Part 2: A simulated case study in the North German Basin

    NASA Astrophysics Data System (ADS)

    Kissinger, Alexander; Noack, Vera; Knopf, Stefan; Konrad, Wilfried; Scheer, Dirk; Class, Holger

    2017-06-01

    Saltwater intrusion into potential drinking water aquifers due to the injection of CO2 into deep saline aquifers is one of the hazards associated with the geological storage of CO2. Thus, in a site-specific risk assessment, models for predicting the fate of the displaced brine are required. Practical simulation of brine displacement involves decisions regarding the complexity of the model. The choice of an appropriate level of model complexity depends on multiple criteria: the target variable of interest, the relevant physical processes, the computational demand, the availability of data, and the data uncertainty. In this study, we set up a regional-scale geological model for a realistic (but not real) onshore site in the North German Basin with characteristic geological features for that region. A major aim of this work is to identify the relevant parameters controlling saltwater intrusion in a complex structural setting and to test the applicability of different model simplifications. The model that is used to identify relevant parameters fully couples flow in shallow freshwater aquifers and deep saline aquifers. This model also includes variable-density transport of salt and realistically incorporates surface boundary conditions with groundwater recharge. The complexity of this model is then reduced in several steps, by neglecting physical processes (two-phase flow near the injection well, variable-density flow) and by simplifying the complex geometry of the geological model. The results indicate that the initial salt distribution prior to the injection of CO2 is one of the key parameters controlling shallow aquifer salinization. However, determining the initial salt distribution involves large uncertainties in the regional-scale hydrogeological parameterization and requires complex and computationally demanding models (regional-scale variable-density salt transport). 
In order to evaluate strategies for minimizing leakage into shallow aquifers, other target variables can be considered, such as the volumetric leakage rate into shallow aquifers or the pressure buildup in the injection horizon. Our results show that simplified models, which neglect variable-density salt transport, can reach an acceptable agreement with more complex models.

  4. QRev—Software for computation and quality assurance of acoustic doppler current profiler moving-boat streamflow measurements—Technical manual for version 2.8

    USGS Publications Warehouse

    Mueller, David S.

    2016-06-21

    The software program QRev applies common and consistent computational algorithms, combined with automated filtering and quality assessment of the data, to improve the quality and efficiency of streamflow measurements and to help ensure that U.S. Geological Survey streamflow measurements are consistent, accurate, and independent of the manufacturer of the instrument used to make the measurement. Software from different manufacturers uses different algorithms for various aspects of data processing and discharge computation. The algorithms used by QRev to filter data, interpolate data, and compute discharge are documented and compared with those used in the manufacturers' software. QRev applies consistent algorithms and creates a data structure that is independent of the data source. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field-notes software. This report is the technical manual for version 2.8 of QRev.
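
    The moving-boat principle such software builds on, integrating the cross product of water and boat velocity over depth cells and ensembles, can be sketched as follows (a simplified illustration of the general method, not QRev's documented algorithms):

```python
def cross_product_discharge(ensembles):
    """Moving-boat discharge from the classic cross-product form: each
    ensemble contributes (u*by - v*bx) * cell_height * dt summed over
    depth cells, where (u, v) is water velocity (m/s) and (bx, by) the
    boat velocity (m/s). Unmeasured top/bottom/edge zones are ignored."""
    q = 0.0
    for cells, (bx, by), dt in ensembles:
        for (u, v), dz in cells:
            q += (u * by - v * bx) * dz * dt
    return q

# Boat moving +y at 1 m/s across water flowing +x at 0.5 m/s:
# two 1 m depth cells per ensemble, 2 s ensembles, three ensembles
ens = [([((0.5, 0.0), 1.0), ((0.5, 0.0), 1.0)], (0.0, 1.0), 2.0)] * 3
q = cross_product_discharge(ens)
```

    Real processing adds the filtering, interpolation of invalid ensembles, and top/bottom extrapolation that the manual documents.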

  5. GeoPad: Innovative Applications of Information Technology in Field Science Education

    NASA Astrophysics Data System (ADS)

    Knoop, P. A.; van der Pluijm, B.

    2003-12-01

    A core requirement for most undergraduate degrees in the Earth sciences is a course in field geology, which provides students with training in field science methodologies, including geologic mapping. The University of Michigan Geological Sciences' curriculum includes a seven-week, summer field course, GS-440, based out of the university's Camp Davis Geologic Field Station, near Jackson, WY. Such field-based courses stand to benefit tremendously from recent innovations in Information Technology (IT), especially in the form of increasing portability, new haptic interfaces for personal computers, and advancements in Geographic Information System (GIS) software. Such innovations are enabling in-the-field, real-time access to powerful data collection, analysis, visualization, and interpretation tools. The benefits of these innovations, however, can only be realized on a broad basis when the IT reaches a level of maturity at which users can easily employ it to enhance their learning experience and scientific activities, rather than the IT itself being a primary focus of the curriculum or a constraint on field activities. The GeoPad represents a combination of these novel technologies that achieves that goal. The GeoPad concept integrates a ruggedized Windows XP TabletPC equipped with wireless networking, a portable GPS receiver, digital camera, microphone-headset, voice-recognition software, GIS, and supporting, digital, geo-referenced data-sets. A key advantage of the GeoPad is enabling field-based usage of visualization software and data focusing on 3D geospatial relationships (developed as part of the complementary GeoWall initiative), which provides a powerful new tool for enhancing and facilitating undergraduate field geology education, as demonstrated during the summer 2003 session of GS-440. 
In addition to an education in field methodologies, students also gain practical experience using IT that they will encounter during their continued educational, research, or professional careers. This approach is immediately applicable to field geology courses elsewhere and indeed to other field-oriented programs (e.g., in biology, archeology, ecology), given similar needs.

  6. Site-conditions map for Portugal based on VS measurements: methodology and final model

    NASA Astrophysics Data System (ADS)

    Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, carlos

    2017-04-01

    In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (VS) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 Vs profiles acquired in Portugal in the context of research projects, technical reports, academic theses and academic papers. The methodologies used to characterize the Vs structure at the sites in the database include seismic refraction, multichannel analysis of surface waves and refraction microtremor. Invasive measurements were performed at selected locations in order to compare the Vs profiles obtained from invasive and non-invasive techniques; in general, there was good agreement in the subsurface Vs structure obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50,000 and 1:500,000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is a three-step process: defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether the differences in the distributions of Vs30 are statistically significant, and merging the geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling, and therefore a declustering algorithm was applied. The final model includes three geological units: 1) igneous, metamorphic and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations; and 3) Holocene formations. 
The evaluation of the proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units and consequently for some geographical regions.
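
    The Vs30 values underlying such a model are time-averaged velocities over the top 30 m, Vs30 = 30 / Σ(h_i / v_i). A minimal implementation with an illustrative two-layer profile:

```python
def vs30(layers):
    """Time-averaged shear-wave velocity over the top 30 m:
    Vs30 = 30 / sum(h_i / v_i), truncating the deepest layer at 30 m.
    `layers` is a list of (thickness_m, vs_m_per_s) pairs from the
    surface down; the profile is assumed to reach at least 30 m."""
    depth = 30.0
    travel_time = 0.0
    for thickness, vs in layers:
        h = min(thickness, depth)
        travel_time += h / vs
        depth -= h
        if depth <= 0.0:
            break
    return 30.0 / travel_time

# Illustrative profile: 5 m of soil over thick weathered rock
v = vs30([(5.0, 200.0), (40.0, 760.0)])
```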

  7. Verification of the GIS-based Newmark method through 2D dynamic modelling of slope stability

    NASA Astrophysics Data System (ADS)

    Torgoev, A.; Havenith, H.-B.

    2012-04-01

    The goal of this work is to verify the simplified GIS-based Newmark displacement approach through 2D dynamic modelling of slope stability. The research is applied to a landslide-prone area in Central Asia, the Mailuu-Suu Valley, situated in the south of Kyrgyzstan. The comparison is carried out on the basis of 30 different profiles located in the target area, presenting different geological, tectonic and morphological settings. One part of the profiles were selected within landslide zones, the other part was selected in stable areas. Many of the landslides are complex slope failures involving falls, rotational sliding and/or planar sliding and flows. These input data were extracted from a 3D structural geological model built with the GOCAD software. Geophysical and geomechanical parameters were defined on the basis of results obtained by multiple surveys performed in the area over the past 15 years. These include geophysical investigation, seismological experiments and ambient noise measurements. Dynamic modelling of slope stability is performed with the UDEC version 4.01 software that is able to compute deformation of discrete elements. Inside these elements both elasto-plastic and purely elastic materials (similar to rigid blocks) were tested. Various parameter variations were tested to assess their influence on the final outputs. And even though no groundwater flow was included, the numerous simulations are very time-consuming (20 mins per model for 10 secs simulated shaking) - about 500 computation hours have been completed so far (more than 100 models). Preliminary results allow us to compare Newmark displacements computed using different GIS approaches (Jibson et al., 1998; Miles and Ho, 1999, among others) with the displacements computed using the original Newmark method (Newmark, 1965, here simulated seismograms were used) and displacements produced along joints by the corresponding 2D dynamical models. 
The generation of seismic amplification and its impact on peak-ground-acceleration, Arias Intensity and permanent slope movements (total and slip on joints) is assessed for numerous morphological-lithological settings (curvature, slope angle, surficial geology, various layer dips and orientations) throughout the target area. The final results of our studies should allow us to define the limitations of the simplified GIS-based Newmark displacement modelling; thus, the verified method would make landslide susceptibility and hazard mapping in seismically active regions more reliable.
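The original rigid-block method cited above (Newmark, 1965) can be sketched in a few lines: whenever ground acceleration exceeds the slope's critical (yield) acceleration, the block accumulates velocity relative to the ground, and that velocity is integrated to a permanent displacement. A minimal illustration, not the authors' UDEC or GIS implementations; the input time series and critical acceleration are hypothetical:

```python
import numpy as np

def newmark_displacement(accel, dt, a_crit):
    """Rigid-block Newmark displacement: integrate the relative velocity
    accumulated whenever ground acceleration exceeds the critical (yield)
    acceleration of the slope."""
    vel = 0.0
    disp = 0.0
    for a in accel:
        # The block slips relative to the ground only above a_crit;
        # while already moving, it decelerates at a_crit when a < a_crit.
        if a > a_crit or vel > 0.0:
            vel = max(vel + (a - a_crit) * dt, 0.0)
            disp += vel * dt
    return disp
```

Fed with a synthetic acceleration pulse exceeding the yield level, the function returns a positive permanent displacement; for shaking entirely below the yield level it returns zero.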

  8. Using 3D Geologic Models to Synthesize Large and Disparate Datasets for Site Characterization and Verification Purposes

    NASA Astrophysics Data System (ADS)

    Hillesheim, M. B.; Rautman, C. A.; Johnson, P. B.; Powers, D. W.

    2008-12-01

Increases in computing power and efficiency have allowed for the development of many modeling codes capable of processing large and sometimes disparate datasets (e.g., geological, hydrological, geochemical, etc.). Because people sometimes have difficulty visualizing in three dimensions (3D) or understanding how multiple figures of various geologic features relate as a whole, 3D geologic models can be excellent tools to illustrate key concepts and findings, especially to lay persons such as stakeholders, customers, and other concerned parties. In this presentation, we show examples of 3D geologic modeling efforts using data collected during site characterization and verification work at the Waste Isolation Pilot Plant (WIPP). The WIPP is a U.S. Department of Energy (DOE) facility located in southeastern New Mexico, designed for the safe disposal of transuranic wastes resulting from U.S. defense programs. The 3D geologic modeling efforts focused on refining our understanding of the WIPP site by integrating a variety of geologic data. Examples include overlaying isopach surfaces of unit thickness and overburden thickness, a map of geologic facies changes, and a transmissivity field onto a 3D structural map of a geologic unit of interest. In addition, we present a 4D hydrogeologic model of the effects of a large-scale pumping test on water levels. All these efforts have provided additional insights into the controls on transmissivity and flow in the WIPP vicinity. Ultimately, by combining these various types of data, we have increased our understanding of the WIPP site's hydrogeologic system, which is a key aspect of continued certification. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000. 
This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy.

  9. Exploring the Relationship between Students' Understanding of Conventional Time and Deep (Geologic) Time

    NASA Astrophysics Data System (ADS)

    Cheek, Kim A.

    2013-07-01

Many geologic processes occur in the context of geologic or deep time. Students of all ages demonstrate difficulty grasping this fundamental concept, which impacts their ability to acquire other geoscience concepts. A concept of deep time requires the ability to sequence events on an immense temporal scale (succession) and to judge the durations of geologic processes based on the rates at which they occur. The twin concepts of succession and duration are the same ideas that underlie a concept of conventional time. If deep time is an extension of conventional time and not qualitatively different from it, students should display similar reasoning patterns when dealing with analogous tasks over disparate temporal periods. Thirty-five US students aged 13-24 years participated in individual task-based interviews to ascertain how they thought about succession and duration in conventional and deep time. This is the first attempt in over 30 years to explore this relationship in the same study. Most students successfully completed temporal succession tasks, but there was greater variability in responses on duration tasks. Conventional time concepts appear to impact how students reason about deep time. The application of spatial reasoning to temporal tasks sometimes leads to correct responses but in other instances does not. Implications for future research and teaching strategies are discussed.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manteufel, R.D.; Ahola, M.P.; Turner, D.R.

A literature review has been conducted to determine the state of knowledge available in the modeling of coupled thermal (T), hydrologic (H), mechanical (M), and chemical (C) processes relevant to the design and/or performance of the proposed high-level waste (HLW) repository at Yucca Mountain, Nevada. The review focuses on identifying coupling mechanisms between individual processes and assessing their importance (i.e., whether the coupling is important, potentially important, or negligible). The significance of considering THMC-coupled processes lies in whether or not the processes impact the design and/or performance objectives of the repository. A review such as the one reported here is useful in identifying which coupled effects will be important, hence which coupled effects will need to be investigated by the US Nuclear Regulatory Commission in order to assess the assumptions, data, analyses, and conclusions in the design and performance assessment of a geologic repository. Although this work stems from regulatory interest in the design of the geologic repository, it should be emphasized that the repository design implicitly considers all of the repository performance objectives, including those associated with the time after permanent closure. The scope of this review goes beyond previous assessments in that it attempts (with the current state of knowledge) to determine which couplings are important and to identify which computer codes are currently available to model coupled processes.

  11. Experimental design applications for modeling and assessing carbon dioxide sequestration in saline aquifers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, John

    2014-11-29

This project was a computer modeling effort to couple reservoir simulation and experimental design/response surface methodology (ED/RSM), using sensitivity analysis, uncertainty analysis, and optimization methods, to assess geologic, geochemical, geomechanical, and rock-fluid effects and factors on CO2 injectivity, capacity, and plume migration. The project objective was to develop proxy models to simplify the highly complex coupled geochemical and geomechanical models in the utilization and storage of CO2 in the subsurface. The goals were to investigate and prove the feasibility of the ED/RSM processes and engineering development, and to bridge the gaps regarding the uncertainty and unknowns of the many interacting geochemical and geomechanical parameters in the development and operation of anthropogenic CO2 sequestration and storage sites. The bottleneck in this workflow is the high computational effort of reactive transport simulation models and the large number of input variables to optimize with ED/RSM techniques. The project was not to develop the reactive transport, geomechanical, or ED/RSM software, but to use what was commercially and/or publicly available as a proof of concept to generate proxy or surrogate models. A detailed geologic and petrographic mineral assemblage and geologic structure of the doubly plunging anticline were defined using the USDOE RMOTC formations-of-interest data (e.g., Lower Sundance, Crow Mountain, Alcova Limestone, and Red Peak). The assemblage of 23 minerals was primarily developed from literature data and petrophysical (well log) analysis. The assemblage and structure were input into a commercial reactive transport simulator to predict the effects of CO2 injection and complex reactions with the reservoir rock. Significant impediments were encountered during the execution phase of the project. The only known commercial reactive transport simulator was incapable of simulating the complex geochemistry modeled in this project. 
Significant effort and project funding were expended to determine the limitations of both the commercial simulator and the Lawrence Berkeley National Laboratory (LBNL) R&D simulator, TOUGHREACT, available to the project. A simplified layer-cake model approximating the volume of the RMOTC targeted reservoirs was defined, with 1-3 minerals eventually modeled with limited success. Modeling reactive transport in porous media requires significant computational power; in this project, up to 24 processors were used to model a limited set of 1-3 minerals. In addition, geomechanical aspects of injecting CO2 into closed, semi-open, and open systems with various well completion methods were simulated. Enhanced Oil Recovery (EOR) as a storage method was not modeled. A robust and stable simulation dataset, or base case, was developed and used to create a master dataset with embedded instructions for input to the ED/RSM software. Little success was achieved toward the objective of the project using the commercial simulator or the LBNL simulator versions available during the time of this project. Several hundred realizations were run with the commercial simulator and ED/RSM software, most having convergence problems and terminating prematurely. A proxy model for full-field CO2 injection, sequestration, utilization, and storage could not be developed with the software available for this project. Though the chemistry is reasonably well known and understood, given the amount of effort and the huge computational time required, predicting CO2 sequestration storage capacity in geologic formations to within the program goal of ±30% proved unsuccessful.
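The ED/RSM workflow the project attempted can be illustrated independently of any reactive-transport code: sample the input space with a Latin hypercube design, evaluate a response function, and fit a second-order response surface as the proxy. In this sketch the `simulator` function is a deliberately cheap, hypothetical stand-in for an expensive run, and all parameter choices are illustrative:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Stratified samples in [0, 1)^d: one sample per stratum per dimension,
    with strata shuffled independently per column."""
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        rng.shuffle(u[:, j])
    return u

def quadratic_features(X):
    """Design matrix 1, x_i, x_i*x_j (i<=j): a full second-order RSM basis."""
    n, d = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

def fit_proxy(X, y):
    """Least-squares fit of the response surface; returns a callable proxy."""
    coef, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)
    return lambda Xq: quadratic_features(np.atleast_2d(Xq)) @ coef

# Hypothetical stand-in for an expensive reactive-transport response.
def simulator(x):
    return 3.0 + 2.0 * x[:, 0] - x[:, 1] + 0.5 * x[:, 0] * x[:, 1]

rng = np.random.default_rng(0)
X = latin_hypercube(40, 2, rng)
proxy = fit_proxy(X, simulator(X))
```

Once fitted, the proxy can be evaluated thousands of times for sensitivity analysis or optimization at negligible cost, which is precisely the role ED/RSM was meant to play in the project.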

  12. Hardware based redundant multi-threading inside a GPU for improved reliability

    DOEpatents

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-05-05

A system and method for verifying computation output using computer hardware are provided. Instances of a computation are generated and processed on hardware-based processors. As the instances are processed, each instance receives a load accessible to the other instances. Instances of output are generated by processing the instances of computation. The instances of output are verified against each other in a hardware-based processor to ensure the accuracy of the output.
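The verification idea behind the patent's hardware mechanism can be mimicked in software: run duplicate instances of the same computation and accept the output only if all instances agree. A hedged sketch, in which ordinary threads stand in for the GPU hardware threads and `redundant_run` is an invented name:

```python
from concurrent.futures import ThreadPoolExecutor

def redundant_run(fn, payload, copies=2):
    """Execute the same computation in several threads on the same input
    and verify that every instance produced identical output."""
    with ThreadPoolExecutor(max_workers=copies) as pool:
        results = list(pool.map(lambda _: fn(payload), range(copies)))
    # Any disagreement among the redundant instances signals a fault.
    if any(r != results[0] for r in results[1:]):
        raise RuntimeError("redundant instances disagree; output not trusted")
    return results[0]
```

A deterministic computation passes the check, while a function whose result differs across instances (here simulated with a shared counter) is rejected.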

  13. Uncertainty in training image-based inversion of hydraulic head data constrained to ERT data: Workflow and case study

    NASA Astrophysics Data System (ADS)

    Hermans, Thomas; Nguyen, Frédéric; Caers, Jef

    2015-07-01

In inverse problems, investigating uncertainty in the posterior distribution of model parameters is as important as matching data. In recent years, most efforts have focused on techniques to sample the posterior distribution at reasonable computational cost. Within a Bayesian context, this posterior depends on the prior distribution. However, most studies ignore modeling the prior with realistic geological uncertainty. In this paper, we propose a workflow inspired by a Popper-Bayes philosophy: data should first be used to falsify models, and only then be considered for matching. The workflow consists of three steps: (1) in defining the prior, we interpret multiple alternative geological scenarios from the literature (architecture of facies) and site-specific data (proportions of facies); prior spatial uncertainty is modeled using multiple-point geostatistics, where each scenario is defined by a training image. (2) We validate these prior geological scenarios by simulating electrical resistivity tomography (ERT) data on realizations of each scenario and comparing them to field ERT data in a lower-dimensional space. The idea of this second step is to probabilistically falsify scenarios with ERT: incompatible scenarios receive an updated probability of zero, while compatible scenarios receive a nonzero updated belief. (3) We constrain the hydrogeological model with hydraulic head and ERT data using a stochastic search method. The workflow is applied to a synthetic and a field case study in an alluvial aquifer. This study highlights the importance of considering and estimating prior uncertainty (without data) through a process of probabilistic falsification.
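Step (2), probabilistic falsification, can be sketched with a simple compatibility rule: a scenario whose simulated responses (in some already-reduced data space) cannot plausibly produce the field observation gets an updated probability of zero, and the surviving scenarios share the remaining belief. This is an illustrative stand-in, not the paper's actual distance measure or dimension-reduction method:

```python
import numpy as np

def falsify_scenarios(sim_data, field_datum, n_std=3.0):
    """Per-scenario compatibility check: a scenario is falsified when the
    field datum lies more than n_std scenario-internal standard deviations
    from the mean of its simulated responses (threshold is illustrative).
    Surviving scenarios share a renormalized belief."""
    beliefs = {}
    for name, sims in sim_data.items():
        mu = sims.mean(axis=0)
        sd = sims.std(axis=0) + 1e-12   # guard against zero spread
        z = np.abs((field_datum - mu) / sd).max()
        beliefs[name] = 0.0 if z > n_std else 1.0
    total = sum(beliefs.values())
    return {k: v / total for k, v in beliefs.items()} if total else beliefs
```

Applied to two synthetic scenarios, one clustered around the field datum and one far from it, only the compatible scenario retains nonzero belief.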

  14. Real-Time Joint Streaming Data Processing from Social and Physical Sensors

    NASA Astrophysics Data System (ADS)

    Kropivnitskaya, Y. Y.; Qin, J.; Tiampo, K. F.; Bauer, M.

    2014-12-01

Technological breakthroughs in computing over the last few decades make it possible to achieve emergency management objectives that focus on saving human lives and decreasing economic losses. In particular, the integration of a wide variety of information sources, including observations from spatially referenced physical sensors and new social media sources, enables better real-time seismic hazard analysis through distributed computing networks. The main goal of this work is to utilize innovative computational algorithms for better real-time seismic risk analysis by integrating different data sources and processing tools into streaming and cloud computing applications. The Geological Survey of Canada operates the Canadian National Seismograph Network (CNSN) with over 100 high-gain instruments and 60 low-gain or strong-motion seismographs. The processing of the continuous data streams from each station of the CNSN provides the opportunity to detect possible earthquakes in near real-time. The information from physical sources is combined to calculate a location and magnitude for an earthquake. The automatically calculated results are not always sufficiently precise and prompt, which can significantly increase the response time to a felt or damaging earthquake. Social sensors, here represented by Twitter users, can provide information earlier to the general public and more rapidly to emergency planning and disaster relief agencies. We introduce joint streaming data processing from social and physical sensors in real time, based on the idea that social media observations serve as proxies for physical sensors. By using the streams of data in the form of Twitter messages, each of which has an associated time and location, we can extract information related to a target event and perform enhanced analysis by combining it with physical sensor data. 
Results of this work suggest that data from social media, combined with physical sensor data and innovative computing algorithms, can provide a new paradigm for real-time earthquake detection, facilitating rapid and inexpensive natural risk reduction.
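The idea of Twitter messages as proxy sensors can be illustrated with a minimal streaming detector: keep a sliding time window of keyword-matching messages and flag a candidate event when the count exceeds a baseline threshold. The keyword, window length, and threshold below are illustrative assumptions; the actual system presumably uses far richer filtering and geolocation:

```python
from collections import deque

def burst_detector(window_s=60.0, threshold=20):
    """Streaming detector: returns a feed(t, text) closure that flags a
    possible felt earthquake when the number of keyword-matching messages
    inside a sliding time window reaches a fixed threshold."""
    times = deque()
    def feed(t, text):
        if "earthquake" not in text.lower():
            return False
        times.append(t)
        # Evict messages that fell out of the sliding window.
        while times and t - times[0] > window_s:
            times.popleft()
        return len(times) >= threshold
    return feed
```

Each incoming timestamped message is pushed through `feed`; the closure keeps only window state, so the detector runs in constant memory per keyword, which suits a streaming or cloud deployment.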

  15. The Continental Margins Program in Georgia

    USGS Publications Warehouse

    Cocker, M.D.; Shapiro, E.A.

    1999-01-01

From 1984 to 1993, the Georgia Geologic Survey (GGS) participated in the Minerals Management Service-funded Continental Margins Program. Geological and geophysical data acquisition focused on offshore stratigraphic framework studies, phosphate-bearing Miocene-age strata, distribution of heavy minerals, near-surface alternative sources of groundwater, and development of a PC-based Coastal Geographic Information System (GIS). Seven GGS publications document results of those investigations. In addition to those publications, direct benefits of the GGS's participation include an impetus to the GGS's investigations of economic minerals on the Georgia coast, establishment of a GIS that includes computer hardware and software, and seeds for additional investigations through the information and training acquired as a result of the Continental Margins Program. These additional investigations are quite varied in scope, and many were made possible because of GIS expertise gained as a result of the Continental Margins Program. Future investigations will also reap the benefits of the Continental Margins Program.

  16. Immersive, hands-on, team-based geophysical education at the University of Texas Marine Geology and Geophysics Field Course

    NASA Astrophysics Data System (ADS)

    Saustrup, S.; Gulick, S. P.; Goff, J. A.; Davis, M. B.; Duncan, D.; Reece, R.

    2013-12-01

The University of Texas Institute for Geophysics (UTIG), part of the Jackson School of Geosciences, annually offers a unique and intensive three-week marine geology and geophysics field course during the spring/summer semester intersession. Now entering its seventh year, the course transitions students from a classroom environment through real-world, hands-on field acquisition, on to team-oriented data interpretation, culminating in a professional presentation before academic and industry employer representatives. The course is available to graduate students and select upper-division undergraduates, preparing them for direct entry into the geoscience workforce or for further academic study. Geophysical techniques used include high-resolution multichannel seismic reflection, CHIRP sub-bottom profiling, multibeam bathymetry, sidescan sonar, sediment coring, grab sampling, data processing, and laboratory analysis of sediments. Industry-standard equipment, methods, software packages, and visualization techniques are used throughout the course, putting students ahead of many of their peers in this respect. The course begins with a 3-day classroom introduction to the field area geology, geophysical methods, and computing resources used. The class then travels to the Gulf Coast for a week of hands-on field and lab work aboard two research vessels: UTIG's 22-foot, aluminum-hulled Lake Itasca and NOAA's 82-foot high-speed catamaran R/V Manta. The smaller vessel handles primarily shallow, inshore targets using multibeam bathymetry, sidescan sonar, and grab sampling. The larger vessel is used both inshore and offshore for multichannel seismic, CHIRP profiling, multibeam bathymetry, gravity coring, and vibracoring. Field areas to date have included Galveston and Port Aransas, Texas, and Grand Isle, Louisiana, with further work in Grand Isle scheduled for 2014. 
In the field, students work in teams of three, participating in survey design, instrument set-up, field deployment, data acquisition optimization, quality control, data archival, log-keeping, real-time data processing, laboratory sediment analysis, and even boat-handling. Teams are rotated through the two vessels and the onshore field laboratory to ensure that each student has hands-on experience with each aspect of the process. Although all students work on all of the data in the field, after returning each team is assigned a particular region or geologic problem to interpret. Each team prepares and delivers a formal presentation to UTIG researchers and industry representatives, explaining and defending their interpretations. This unique approach to hands-on field training, real-world science, and project-based teamwork helps prepare students for direct entry into the workforce, giving them a leg up in competing for positions. The course has an impressive record of success, with many students receiving job offers as a direct result of their participation.

  17. The micrometeoroid complex and evolution of the lunar regolith

    NASA Technical Reports Server (NTRS)

    Horz, F.; Morrison, D. A.; Gault, D. E.; Oberbeck, V. R.; Quaide, W. L.; Vedder, J. F.; Brownlee, D. E.; Hartung, J. B.

    1977-01-01

Monte Carlo-based computer calculations, as well as analytical approaches utilizing probabilistic arguments, were applied to gain insight into the principal regolith impact processes and their resulting kinetics. Craters 10 to 1500 m in diameter are largely responsible for the overall growth of the regolith. As a consequence, the regolith has to be envisioned as a complex sequence of discrete ejecta blankets. Such blankets constitute first-order discontinuities in the evolving debris layer. The micrometeoroid complex then operates intensely on these fresh ejecta blankets, but its reworking is accomplished only in an uppermost layer approximately 1 mm thick. The absolute flux of micrometeoroids based on lunar rock analyses, averaged over the past few 10^6 years, is approximately an order of magnitude lower than present-day satellite fluxes; however, there is indication that the flux increased in the past 10^4 years to become compatible with the satellite data. Furthermore, there is detailed evidence that the micrometeoroid complex existed throughout geologic time.
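A toy version of the Monte Carlo calculations described above can convey the mechanism: crater diameters drawn from a power-law distribution thicken the debris layer only when they excavate below the existing regolith base. All numerical choices here (size-distribution exponent, depth/diameter ratio, ejecta retention factor, minimum diameter) are illustrative assumptions, not the paper's values:

```python
import numpy as np

def simulate_regolith(n_impacts, rng, thickness0=0.0):
    """Toy Monte Carlo of regolith growth: craters drawn from a power-law
    size distribution deepen the debris layer only when they excavate
    below the current regolith base."""
    thickness = thickness0
    history = []
    for _ in range(n_impacts):
        # Pareto-like diameters with a 10 m minimum (illustrative values).
        d = 10.0 * (1.0 - rng.random()) ** (-1.0 / 2.0)
        depth = 0.2 * d  # assumed excavation depth/diameter ratio
        if depth > thickness:
            # Ejecta from freshly excavated bedrock thickens the layer.
            thickness += 0.1 * (depth - thickness)
        history.append(thickness)
    return np.array(history)
```

The resulting thickness history is monotonically non-decreasing and grows in discrete jumps, echoing the paper's picture of a regolith built as a sequence of discrete ejecta blankets.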

  18. A concept for the modernization of underground mining master maps based on the enrichment of data definitions and spatial database technology

    NASA Astrophysics Data System (ADS)

    Krawczyk, Artur

    2018-01-01

In this article, the technical and legal aspects of creating digital underground mining maps are described. Currently used technologies and solutions for creating, storing and making digital maps accessible are described in the context of the Polish mining industry, and some problems with the use of these technologies are identified. One of the identified problems is the need to expand the range of mining map data provided by survey departments to other mining departments, such as ventilation or geological maintenance. Three solutions are proposed and analyzed, and one is chosen for further analysis. The analysis concerns data storage and making survey data accessible not only from paper documentation, but also directly from computer systems. Based on the enriched data definitions, new processing procedures are proposed that allow the preparation of new cartographic representations (symbols) tailored to users' needs.

  19. Bringing Geology to a Community: The Benefits of Using Interpretive Signs in a Self-Guided Tour

    NASA Astrophysics Data System (ADS)

    Crowley, B. E.

    2007-12-01

Geology is often missing in educational settings. However, this science is key to understanding natural history and ecology. Without some knowledge of geologic processes, it is extremely difficult to comprehend how ecosystems work or how fragile an environment can be. To fully grasp these concepts, an interested person needs more than abstract concepts and self-contained examples. He or she needs to be exposed to the use of a domain's conceptual tools in authentic activity. Likewise, to understand natural processes, it is absolutely imperative to observe them in action. It is difficult to understand coastal processes, such as waves interacting with a beach, by reading a book. It is much easier to understand these concepts by learning about them and interacting with them simultaneously. Through an NSF-funded fellowship on informal education, the author has developed a self-guided walking tour designed to introduce geologic processes to school groups, families, and individuals. The guide, which is based at Seabright Beach, Santa Cruz, CA, a popular destination for both locals and visitors, uses inquiry and directed questioning. The beach boasts excellent examples of coastal processes and has an exciting and dynamic history. Pilot observations indicate that participants have had rewarding experiences using the guide, that they are excited to share their new knowledge, and that they have successfully applied what they have learned about coastal processes at Seabright to other beaches.

  20. Preliminary surficial geologic map database of the Amboy 30 x 60 minute quadrangle, California

    USGS Publications Warehouse

    Bedford, David R.; Miller, David M.; Phelps, Geoffrey A.

    2006-01-01

The surficial geologic map database of the Amboy 30x60 minute quadrangle presents characteristics of surficial materials for an area of approximately 5,000 km2 in the eastern Mojave Desert of California. This map consists of new surficial mapping conducted between 2000 and 2005, as well as compilations of previous surficial mapping. Surficial geologic units are mapped and described based on depositional process and age categories that reflect the mode of deposition, pedogenic effects occurring post-deposition, and, where appropriate, the lithologic nature of the material. The physical properties recorded in the database focus on those that drive hydrologic, biologic, and physical processes, such as particle size distribution (PSD) and bulk density. This version of the database is distributed with point data representing locations of samples for both laboratory-determined physical properties and semi-quantitative field-based information. Future publications will include the field and laboratory data as well as maps of distributed physical properties across the landscape, tied to physical process models where appropriate. The database is distributed in three parts: documentation, spatial map-based data, and printable map graphics. The documentation includes this file, which discusses the surficial geology and describes the format and content of the map data; a database 'readme' file, which describes the database contents; and FGDC metadata for the spatial map information. Spatial data are distributed as Arc/Info coverages in ESRI interchange (e00) format or as tabular data in DBF3 (.dbf) format. Map graphics files are distributed as PostScript and Adobe Portable Document Format (PDF) files and are appropriate for representing a view of the spatial database at the mapped scale.

  1. Integrated system for well-to-well correlation with geological knowledge base

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saito, K.; Doi, E.; Uchiyama, T.

    1987-05-01

A task of well-to-well correlation is an essential part of a reservoir description study. Since the task involves diverse data such as logs, dipmeter, seismic, and reservoir engineering data, a system with simultaneous access to such data is desirable. A system has been developed to aid stratigraphic correlation on a Xerox 1108 workstation, written in INTERLISP-D. The system uses log, dipmeter, and seismic data and computer-processed results such as Litho-Analysis and LSA (Log Shape Analyzer). The system first defines zones, which are segmentations of log data into consistent layers, using Litho-Analysis and LSA results. Each zone is defined as a minimum unit for correlation with slot values of lithology, thickness, log values, and log shape such as bell, cylinder, and funnel. Using the user's input of local geological knowledge such as depositional environment, the system selects marker beds and performs correlation among the wells chosen from the base map. Correlation is performed first with markers and then with sandstones of lesser lateral extent. Structural dip and seismic horizons are guides for seeking a correlatable event. Knowledge of sand body geometry, such as the ratio of thickness to width, is also used to provide a guide on how far a correlation should be made. Correlation results produced by the system are displayed on the screen for the user to examine and modify. The system has been tested with data sets from several depositional settings and has been shown to be a useful tool for correlation work. The results are stored as a database for structural mapping and reservoir engineering studies.
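One ingredient of such a system, labeling a zone's log shape (bell, cylinder, funnel) for use as a correlation attribute, can be sketched with a crude linear-trend rule. The slope tolerance and the shape-to-slope convention are illustrative assumptions (the convention depends on the log type and sampling direction); real classifiers such as the LSA results cited above use far more elaborate pattern recognition:

```python
import numpy as np

def classify_log_shape(values, slope_tol=0.05):
    """Crude log-shape labeling: fit a linear trend over the zone and call
    an increasing trend 'funnel', a decreasing trend 'bell', and a flat
    response 'cylinder' (tolerance and convention are illustrative)."""
    depth = np.linspace(0.0, 1.0, len(values))  # normalized position in zone
    slope = np.polyfit(depth, np.asarray(values, float), 1)[0]
    if slope > slope_tol:
        return "funnel"
    if slope < -slope_tol:
        return "bell"
    return "cylinder"
```

A zone label computed this way could fill the "log shape" slot of a zone frame, alongside lithology, thickness, and average log values, before the rule-based correlation step runs.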

  2. Knowledge Engineering Approach to the Geotectonic Discourse

    NASA Astrophysics Data System (ADS)

    Pshenichny, Cyril

    2014-05-01

The intellectual challenge of geotectonics is, and always was, much harder than that of most sciences: geotectonics must say much when there is objectively not much to say. As the target of study (the genesis of regional and planetary geological structures) is vast and multidisciplinary and is more or less generic for many geological disciplines, its more or less complete description is practically unachievable. Hence, the normal pathway of natural-scientific research - first acquire data, then draw conclusions - is unlikely to apply here. Geotectonics does quite the opposite; its approach is purely abductive: first suggest a conceptualization (hypothesis) based on some external grounds (either general planetary/cosmic/philosophic/religious considerations, or experience gained from research on other structures/regions/planets) and then acquire data that either support or refute it. In fact, geotectonics defines the context for data acquisition and, hence, the paradigm for the entire body of geology. 
Being an obvious necessity for a descriptive science, this nevertheless creates a number of threats: • Like all people, scientists like simplicity and unity, and therefore a single geotectonic hypothesis may come to seem preferable because it fits the data available at the moment, and suppress other views that may acquire evidence in the future; • As impartial data acquisition is more myth than reality even in most of the natural sciences, in a field like geology this process becomes strongly biased by the reigning hypothesis and controlled to supply only supportive evidence; • It becomes collectively agreed that any, or a great many, domains of geological knowledge are determined by a geotectonic concept, which is, in turn, offered by a reigning hypothesis (sometimes reclassified as theory) - e.g., exploration geologists must invoke the global geotectonic terminology in their technical reports on assessment of mineral or hydrocarbon resources, and sessions and conferences are given titles like "Geochemical signatures of postcollisional magmas", thus assuming that the concept of collision (i) has been proven to reflect reality and (ii) surely has something to do with the geochemistry of rocks; tectonic terminology becomes a ubiquitous language with no warranty of its correctness and appropriateness to the case. These issues fall into the scope of the field defined as reasoning research in the geosciences (Pshenichny, 2002; 2003). One of its main tools is knowledge engineering (Feigenbaum, 1984). As suggested by Anokhin and Longhinos (2013), knowledge engineering, especially its rapidly evolving dynamic part, may offer remedies for the abovementioned problems. 
The following solutions will be reported: • Development of an integrated geotectonic context and language shared by communities that follow contrasting geotectonic views; making concepts more or less inter-hypothesis; studying the "anatomy and physiology" of geotectonic hypotheses and fixing the points of concordance, compatibility and disagreement; computation of logical probabilities of the views given a number of hypotheses (Pshenichny, 2004); • Constructing ontologies, conceptual graphs and event bushes for data acquisition to impartially define the semantics of data and data provenance in geology; • Building ensembles of event bushes for related domains of geological knowledge (petrology, volcanology, sedimentology and others) to track the actual influence of geotectonic concepts and views on geo-knowledge. Following these lines of research would create a better environment for the flourishing of scientific thought in geology and make it more efficient and responsive to its traditional tasks (impartial geological mapping, mineral and hydrocarbon exploration, geological education and knowledge transfer) and to present-day challenges such as natural hazard assessment, sustainable regional development, and so forth. Moreover, this would make a significant contribution to the creation of a knowledge-based society, which is seen as one of the key priorities of Europe and of civilization in general.

  3. A unified framework for modelling sediment fate from source to sink and its interactions with reef systems over geological times.

    PubMed

    Salles, Tristan; Ding, Xuesong; Webster, Jody M; Vila-Concejo, Ana; Brocard, Gilles; Pall, Jodie

    2018-03-27

    Understanding the effects of climatic variability on sediment dynamics is hindered by the limited ability of current models to simulate the long-term evolution of sediment transfer from source to sink and the associated morphological changes. We present a new approach based on a reduced-complexity model which computes, over geological time: sediment transport from landmasses to coasts, reworking of marine sediments by longshore currents, and development of coral reef systems. Our framework links together the main sedimentary processes driving mixed siliciclastic-carbonate system dynamics. It offers a methodology for objective and quantitative estimation of sediment fate over regional and millennial time-scales. A simulation of the Holocene evolution of the Great Barrier Reef shows: (1) how high sediment loads from catchment erosion prevented coral growth during the early transgression phase and favoured sediment gravity-flows in the deepest parts of the northern region basin floor (prior to 8 ka before present (BP)); (2) how the fine balance between climate, sea level, and margin physiography enabled coral reefs to thrive under limited shelf sedimentation rates after ~6 ka BP; and (3) how, since 3 ka BP, with the decrease of accommodation space, reduced vertical growth led to the lateral extension of reefs, consistent with available observational data.

  4. Application of the principal component analysis (PCA) to HVSR data aimed at the seismic characterization of earthquake prone areas

    NASA Astrophysics Data System (ADS)

    Paolucci, Enrico; Lunedei, Enrico; Albarello, Dario

    2017-10-01

    In this work, we propose a procedure based on principal component analysis applied to data sets consisting of many horizontal-to-vertical spectral ratio (HVSR or H/V) curves obtained from single-station ambient vibration acquisitions. This kind of analysis is aimed at the seismic characterization of the investigated area by identifying sites characterized by similar HVSR curves. It also makes it possible to extract the typical HVSR patterns of the explored area and to establish their relative importance, providing an estimate of the level of heterogeneity from a seismic point of view. In this way, an automatic explorative seismic characterization of the area becomes possible by considering ambient vibration data alone. This also implies that the relevant outcomes can be safely compared with other available information (geological data, borehole measurements, etc.) without any conceptual trade-off. The whole algorithm is remarkably fast: on a common personal computer, processing takes a few seconds for a data set including 100-200 HVSR measurements. The procedure has been tested in three study areas in central-northern Italy characterized by different geological settings. Outcomes demonstrate that this technique is effective and correlates well with the most significant seismostratigraphical heterogeneities present in each of the study areas.
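
The curve-classification step described above can be sketched in a few lines of NumPy. This is a generic PCA of stacked H/V curves via the SVD, not the authors' implementation, and the two synthetic curve families (peaks at 2 Hz and 6 Hz) are purely illustrative.

```python
import numpy as np

def hvsr_pca(curves, n_components=2):
    """PCA of a set of HVSR curves.

    curves: (n_sites, n_freqs) array, one H/V curve per row.
    Returns (scores, components, explained): site projections onto the
    leading components, the component curves, and the variance fraction
    captured by each component.
    """
    X = np.asarray(curves, dtype=float)
    X = X - X.mean(axis=0)                    # centre each frequency bin
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)       # variance fraction per component
    scores = U[:, :n_components] * s[:n_components]
    return scores, Vt[:n_components], explained[:n_components]

# Synthetic example: two families of noisy HVSR curves with different peaks.
freqs = np.linspace(0.5, 10.0, 100)
rng = np.random.default_rng(0)
family_a = [1 + 3 * np.exp(-(freqs - 2.0) ** 2) + 0.05 * rng.standard_normal(100) for _ in range(10)]
family_b = [1 + 3 * np.exp(-(freqs - 6.0) ** 2) + 0.05 * rng.standard_normal(100) for _ in range(10)]
scores, comps, explained = hvsr_pca(np.vstack(family_a + family_b))
# The first principal component separates the two families: sites 0-9 and
# sites 10-19 fall on opposite sides of zero along PC1.
```

In a real survey the sign and magnitude of the PC1 score would group single-station measurements into zones of similar subsurface response before any geological comparison.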

  5. The Wigner-Ville Transform, An Approach to Interpret GPR Data: Outlining a Risk Zone

    NASA Astrophysics Data System (ADS)

    Chavez, R. E.; Samano, M. A.; Camara, M. E.; Tejero, A.; Flores-Marquez, L. E.; Arango, C.; Velazco, V.

    2006-12-01

    In this investigation, a time-frequency analysis is performed, based on the decomposition of the GPR signal into high and low frequencies. This process is combined with a statistical approach to detect signal changes in time and position simultaneously. The spectral analysis is carried out through the Wigner-Ville distribution (WVD). A cross-correlation can be computed between the original signal and the time-frequency components to reveal structural anomalies in the GPR observations and to correlate them with the available geology. An example of this methodology is presented, in which a series of traces was analyzed from a GPR profile surveyed in an eastern area of Mexico City. This is a heavily urbanized region built on the bottom of an ancient lake. The sediments are poorly consolidated, and the water extraction rate has increased the areas of subsidence. Nowadays, most family homes and public buildings, mainly schools, have started to suffer heavy damage. The geophysical study carried out in the area made it possible to detect areas of high risk. The data analysis, combined with previous geological studies that included stratigraphic columns, allowed identification of the geophysical characteristics of the area, which will allow the authorities to plan its future development.
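
For readers unfamiliar with the transform, a minimal discrete Wigner-Ville distribution can be computed directly from its definition. This unsmoothed O(N²) sketch is generic and is not the processing chain used in the study; the chirp test signal is an assumption for illustration.

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of a (preferably analytic) signal.

    Returns an (N, N) real array: row n is the spectrum of the lag product
    x[n+m] * conj(x[n-m]), so frequency bin k corresponds to f = k*fs/(2N).
    """
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        mmax = min(n, N - 1 - n)              # keep n+m and n-m inside the trace
        kernel = np.zeros(N, dtype=complex)
        for m in range(-mmax, mmax + 1):
            kernel[m % N] = x[n + m] * np.conj(x[n - m])
        W[n] = np.fft.fft(kernel).real        # kernel is Hermitian, so FFT is real
    return W

# Linear chirp with instantaneous frequency f(t) = 10 + 80 t Hz at fs = 256 Hz.
fs = 256
t = np.arange(fs) / fs
chirp = np.exp(2j * np.pi * (10 * t + 40 * t ** 2))
W = wigner_ville(chirp)
# At mid-trace (t = 0.5 s, f = 50 Hz) the energy ridge sits at bin 2*N*f/fs = 100.
```

The WVD of a linear chirp is ideally concentrated along its instantaneous frequency, which is why ridge-tracking in the time-frequency plane is useful for flagging localized changes in GPR traces.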

  6. Sediment-Hosted Zinc-Lead Deposits of the World - Database and Grade and Tonnage Models

    USGS Publications Warehouse

    Singer, Donald A.; Berger, Vladimir I.; Moring, Barry C.

    2009-01-01

    This report provides information on sediment-hosted zinc-lead mineral deposits based on the geologic settings that are observed on regional geologic maps. The foundation of mineral-deposit models is information about known deposits. The purpose of this publication is to make this kind of information available in digital form for sediment-hosted zinc-lead deposits. Mineral-deposit models are important in exploration planning and quantitative resource assessments: Grades and tonnages among deposit types are significantly different, and many types occur in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral occurrences, geophysics, and geochemistry used in resource assessments and mineral exploration. Too few thoroughly explored mineral deposits are available in most local areas for reliable identification of the important geoscience variables, or for robust estimation of undiscovered deposits - thus, we need mineral-deposit models. Globally based deposit models allow recognition of important features because the global models demonstrate how common different features are. Well-designed and -constructed deposit models allow geologists to know from observed geologic environments the possible mineral-deposit types that might exist, and allow economists to determine the possible economic viability of these resources in the region. Thus, mineral-deposit models play the central role in transforming geoscience information to a form useful to policy makers. This publication contains a computer file of information on sediment-hosted zinc-lead deposits from around the world. It also presents new grade and tonnage models for nine types of these deposits and a file allowing locations of all deposits to be plotted in Google Earth. The data are presented in FileMaker Pro, Excel and text files to make the information available to as many as possible. 
The value of this information and any derived analyses depends critically on the consistent manner of data gathering. For this reason, we first discuss the rules applied in this compilation. Next, the fields of the data file are considered. Finally, we provide new grade and tonnage models that are, for the most part, based on a classification of deposits using observable geologic units from regional-scale maps.

  7. Merging of the USGS Atlas of Mercury 1:5,000,000 Geologic Series

    NASA Technical Reports Server (NTRS)

    Frigeri, A.; Federico, C.; Pauselli, C.; Coradini, A.

    2008-01-01

    After 30 years, the planet Mercury is about to give us new information. NASA's MESSENGER [1] made its first successful flyby in January 2008, while the European Space Agency and the Japanese space agency ISAS/JAXA are preparing the upcoming BepiColombo mission [2]. In order to contribute to current and future analyses of the geology of Mercury, we have started work on the production of a single digital geologic map of Mercury, derived by merging the geologic maps of the Atlas of Mercury produced by the United States Geological Survey from Mariner 10 data. The aim of this work is to merge the nine maps so that the final product reflects the original work as much as possible. Herein we describe the data we used, the working environment, and the steps taken to produce the final map.

  8. US GeoData: Digital cartographic and geographic data

    USGS Publications Warehouse

    ,

    1985-01-01

    The increasing use of computers for storing and analyzing earth science information has sparked a growth in the demand for various types of cartographic data in digital form. The production of map data in computerized form is called digital cartography, and it involves the collection, storage, processing, analysis, and display of map data with the aid of computers. The U.S. Geological Survey, the Nation's largest earth science research agency, has expanded its national mapping program to incorporate operations associated with digital cartography, including the collection of planimetric, elevation, and geographic names information in digital form. This digital information is available for use in meeting the multipurpose needs and applications of the map user community.

  9. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are monitored continuously and efficiently.
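
A classic building block for this kind of automatic detection is a short-term/long-term average (STA/LTA) amplitude-ratio trigger. The sketch below is a generic textbook version, not a description of the 1974 USGS system's actual algorithm, and the synthetic trace and window lengths are illustrative assumptions.

```python
import numpy as np

def sta_lta_trigger(trace, fs, sta_win=1.0, lta_win=10.0, threshold=4.0):
    """Return sample indices where the STA/LTA ratio first crosses threshold.

    STA and LTA are running means of |amplitude| over a short and a long
    window, both ending at the current sample; a sudden burst of energy
    raises the STA long before the LTA catches up.
    """
    x = np.abs(np.asarray(trace, dtype=float))
    ns, nl = int(sta_win * fs), int(lta_win * fs)
    csum = np.concatenate(([0.0], np.cumsum(x)))
    sta = (csum[ns:] - csum[:-ns]) / ns       # means over windows x[j:j+ns]
    lta = (csum[nl:] - csum[:-nl]) / nl       # means over windows x[j:j+nl]
    # align both series on the window *end* sample i = nl-1 ... N-1
    ratio = sta[nl - ns:] / np.maximum(lta, 1e-12)
    above = ratio > threshold
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1   # rising edges only
    return onsets + nl - 1                    # back to absolute sample index

# Synthetic trace: 60 s of background noise with a burst of energy at t = 30 s.
rng = np.random.default_rng(1)
fs = 100
trace = 0.1 * rng.standard_normal(60 * fs)
trace[30 * fs:35 * fs] += 2.0 * rng.standard_normal(5 * fs)
picks = sta_lta_trigger(trace, fs)
# The first pick lands shortly after sample 3000 (t = 30 s).
```

In network operation such a trigger runs per station, and an event is declared only when several stations trigger within a short coincidence window, which suppresses false alarms from local disturbances.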

  10. GEOS. User Tutorials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Pengchen; Settgast, Randolph R.; Johnson, Scott M.

    2014-12-17

    GEOS is a massively parallel, multi-physics simulation application utilizing high performance computing (HPC) to address subsurface reservoir stimulation activities, with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS enables coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. Developed at the Lawrence Livermore National Laboratory (LLNL) as part of a Laboratory-Directed Research and Development (LDRD) Strategic Initiative (SI) project, GEOS represents the culmination of a multi-year, ongoing code development and improvement effort that has leveraged existing code capabilities and staff expertise to design new computational geosciences software.

  11. Quality-assurance plan for water-resources activities of the U.S. Geological Survey in Idaho

    USGS Publications Warehouse

    Packard, F.A.

    1996-01-01

    To ensure continued confidence in its products, the Water Resources Division of the U.S. Geological Survey implemented a policy that all its scientific work be performed in accordance with a centrally managed quality-assurance program. This report establishes and documents a formal policy for current (1995) quality assurance within the Idaho District of the U.S. Geological Survey. Quality assurance is formalized by describing district organization and operational responsibilities, documenting the district quality-assurance policies, and describing district functions. The district conducts its work through offices in Boise, Idaho Falls, Twin Falls, Sandpoint, and at the Idaho National Engineering Laboratory. Data-collection programs and interpretive studies are conducted by two operating units, and operational and technical assistance is provided by three support units: (1) Administrative Services advisors provide guidance on various personnel issues and budget functions, (2) computer and reports advisors provide guidance in their fields, and (3) discipline specialists provide technical advice and assistance to the district and to chiefs of various projects. The district's quality-assurance plan is based on an overall policy that provides a framework for defining the precision and accuracy of collected data. The plan is supported by a series of quality-assurance policy statements that describe responsibilities for specific operations in the district's program. The operations are program planning; project planning; project implementation; review and remediation; data collection; equipment calibration and maintenance; data processing and storage; data analysis, synthesis, and interpretation; report preparation and processing; and training. Activities of the district are systematically conducted under a hierarchy of supervision and management that is designed to ensure conformance with Water Resources Division goals for quality assurance.
The district quality-assurance plan does not describe detailed technical activities that are commonly termed "quality-control procedures." Instead, it focuses on current policies, operations, and responsibilities that are implemented at the management level. Contents of the plan will be reviewed annually and updated as programs and operations change.

  12. Volcanic Hazard Education through Virtual Field studies of Vesuvius and Laki Volcanoes

    NASA Astrophysics Data System (ADS)

    Carey, S.; Sigurdsson, H.

    2011-12-01

    Volcanic eruptions pose significant hazards to human populations and have the potential to cause significant economic impacts as shown by the recent ash-producing eruptions in Iceland. Demonstrating both the local and global impact of eruptions is important for developing an appreciation of the scale of hazards associated with volcanic activity. In order to address this need, Web-based virtual field exercises at Vesuvius volcano in Italy and Laki volcano in Iceland have been developed as curriculum enhancements for undergraduate geology classes. The exercises are built upon previous research by the authors dealing with the 79 AD explosive eruption of Vesuvius and the 1783 lava flow eruption of Laki. Quicktime virtual reality images (QTVR), video clips, user-controlled Flash animations and interactive measurement tools are used to allow students to explore archeological and geological sites, collect field data in an electronic field notebook, and construct hypotheses about the impacts of the eruptions on the local and global environment. The QTVR images provide 360° views of key sites where students can observe volcanic deposits and formations in the context of a defined field area. Video sequences from recent explosive and effusive eruptions of Caribbean and Hawaiian volcanoes are used to illustrate specific styles of eruptive activity, such as ash fallout, pyroclastic flows and surges, lava flows and their effects on the surrounding environment. The exercises use an inquiry-based approach to build critical relationships between volcanic processes and the deposits that they produce in the geologic record. A primary objective of the exercises is to simulate the role of a field volcanologist who collects information from the field and reconstructs the sequence of eruptive processes based on specific features of the deposits. 
Testing of the Vesuvius and Laki exercises in undergraduate classes from a broad spectrum of educational institutions shows a preference for the web-based interactive tools compared with traditional paper-based laboratory exercises. The exercises are freely accessible for undergraduate classes such as introductory geology, geologic hazards, or volcanology. Accompanying materials, such as lecture-based PowerPoint presentations about Vesuvius and Laki, are also being developed to help instructors better integrate the web-based exercises into their existing curriculum.

  13. Reports of Planetary Geology Program, 1982

    NASA Technical Reports Server (NTRS)

    Holt, H. E. (Compiler)

    1982-01-01

    Work conducted in the Planetary Geology program is summarized. The following categories are presented: outer solar system satellites; asteroids and comets; Venus; cratering processes and landform development; volcanic processes and landforms; aeolian processes and landforms; fluvial processes and landform development; periglacial and permafrost processes; structure, tectonics and stratigraphy; remote sensing and regolith studies; geologic mapping, cartography and geodesy.

  14. Summary on Several Key Techniques in 3D Geological Modeling

    PubMed Central

    2014-01-01

    Several key techniques in 3D geological modeling including planar mesh generation, spatial interpolation, and surface intersection are summarized in this paper. Note that these techniques are generic and widely used in various applications but play a key role in 3D geological modeling. There are two essential procedures in 3D geological modeling: the first is the simulation of geological interfaces using geometric surfaces and the second is the building of geological objects by means of various geometric computations such as the intersection of surfaces. Discrete geometric surfaces that represent geological interfaces can be generated by creating planar meshes first and then spatially interpolating; those surfaces intersect and then form volumes that represent three-dimensional geological objects such as rock bodies. In this paper, the most commonly used algorithms of the key techniques in 3D geological modeling are summarized. PMID:24772029
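
The spatial-interpolation step mentioned above (turning scattered picks on a geological interface into a gridded surface) can be illustrated with a generic inverse-distance-weighted scheme. This is one common choice among the algorithms the paper surveys, chosen here for brevity; the borehole picks are hypothetical.

```python
import numpy as np

def idw(points, values, targets, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of scattered interface picks.

    points : (n, 2) known (x, y) locations on a geological interface
    values : (n,)   elevations at those locations
    targets: (m, 2) locations (e.g. planar-mesh nodes) to interpolate onto
    """
    points, values, targets = (np.asarray(a, dtype=float) for a in (points, values, targets))
    d = np.linalg.norm(targets[:, None, :] - points[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, eps) ** power   # clamp avoids division by zero at data points
    w /= w.sum(axis=1, keepdims=True)       # normalize weights per target
    return w @ values

# Four borehole picks of an interface elevation, evaluated at two mesh nodes.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
elev = np.array([0.0, 1.0, 1.0, 2.0])
nodes = np.array([[0.5, 0.5], [0.0, 0.0]])
z = idw(pts, elev, nodes)
# z[0] is exactly 1.0 (symmetric centre); z[1] reproduces the pick at (0, 0)
# to within rounding, since the clamped zero distance dominates the weights.
```

Interpolating every node of a planar mesh this way yields a discrete surface for one interface; intersecting several such surfaces then bounds the volumes that represent rock bodies.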

  15. Use of Cloud Computing to Calibrate a Highly Parameterized Model

    NASA Astrophysics Data System (ADS)

    Hayley, K. H.; Schumacher, J.; MacMillan, G.; Boutin, L.

    2012-12-01

    We present a case study using cloud computing to facilitate the calibration of a complex and highly parameterized model of regional groundwater flow. The calibration dataset consisted of many (~1500) measurements or estimates of static hydraulic head, a high resolution time series of groundwater extraction and disposal rates at 42 locations and pressure monitoring at 147 locations with a total of more than one million raw measurements collected over a ten year pumping history, and base flow estimates at 5 surface water monitoring locations. This modeling project was undertaken to assess the sustainability of groundwater withdrawal and disposal plans for in situ heavy oil extraction in Northeast Alberta, Canada. The geological interpretations used for model construction were based on more than 5,000 wireline logs collected throughout the 30,865 km2 regional study area (RSA), and resulted in a model with 28 slices and 28 hydrostratigraphic units (average model thickness of 700 m, with aquifers ranging from a depth of 50 to 500 m below ground surface). The finite element FEFLOW model constructed on this geological interpretation had 331,408 nodes and required 265 time steps to simulate the ten year transient calibration period. This numerical model of groundwater flow required 3 hours to run on a server with two 2.8 GHz processors and 16 GB of RAM. Calibration was completed using PEST. Horizontal and vertical hydraulic conductivity as well as specific storage for each unit were independent parameters. For the recharge and the horizontal hydraulic conductivity in the three aquifers with the most transient groundwater use, a pilot point parameterization was adopted. A 7 × 7 grid of pilot points defined over the RSA described a spatially variable horizontal hydraulic conductivity or recharge field. A 7 × 7 grid of multiplier pilot points that perturbed the more regional field was then superimposed over the 3,600 km2 local study area (LSA). 
The pilot point multipliers were implemented so that a higher resolution of spatial variability could be obtained where there was a higher density of observation data. Five geologic boundaries were modeled with a specified flux boundary condition, and the transfer rate was used as an adjustable parameter for each of these boundaries. This parameterization resulted in 448 parameters for calibration. In the project planning stage it was estimated that the calibration might require as much as 15,000 hours (1.7 years) of computing. In an effort to complete the calibration in a timely manner, the inversion was parallelized and implemented on as many as 250 computing nodes located on Amazon's EC2 servers. The results of the calibration provided a better fit to the data than previous efforts with homogeneous parameters, and the highly parameterized approach facilitated subspace Monte Carlo analysis for predictive uncertainty. This scale of cloud computing is relatively new for the hydrogeology community, and at the time of implementation it was believed to be the first implementation of a FEFLOW model at this scale. While the experience presented several challenges, the implementation was successful and provides valuable lessons for future efforts.
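
The pilot-point-with-multipliers idea can be illustrated with a toy field: interpolate a coarse regional grid of parameter values onto model cells, then multiply by a second interpolated grid of local adjustment factors. All numbers below are hypothetical, and the bilinear interpolation stands in for whatever scheme PEST/FEFLOW users would actually configure.

```python
import numpy as np

def bilinear(grid_vals, x, y):
    """Bilinear interpolation of a regular, unit-spaced pilot-point grid."""
    gx = np.clip(x, 0, grid_vals.shape[1] - 1 - 1e-9)
    gy = np.clip(y, 0, grid_vals.shape[0] - 1 - 1e-9)
    i0, j0 = np.floor(gy).astype(int), np.floor(gx).astype(int)
    fy, fx = gy - i0, gx - j0
    return (grid_vals[i0, j0] * (1 - fx) * (1 - fy)
            + grid_vals[i0, j0 + 1] * fx * (1 - fy)
            + grid_vals[i0 + 1, j0] * (1 - fx) * fy
            + grid_vals[i0 + 1, j0 + 1] * fx * fy)

# Regional log10-K surface from a 7 x 7 pilot-point grid (illustrative values),
# perturbed by a 7 x 7 grid of multipliers representing the local study area.
rng = np.random.default_rng(2)
regional_logk = -4.0 + 0.5 * rng.standard_normal((7, 7))   # log10 K, hypothetical
multipliers = np.ones((7, 7))
multipliers[2:5, 2:5] = 1.2   # e.g. calibration raised K by 20% in the LSA core

# model-cell coordinates expressed in pilot-point grid units
xs, ys = np.meshgrid(np.linspace(0, 6, 61), np.linspace(0, 6, 61))
K = 10.0 ** bilinear(regional_logk, xs, ys) * bilinear(multipliers, xs, ys)
```

The calibration then adjusts the pilot-point values (and multipliers) rather than every model cell, which is what keeps the parameter count at hundreds instead of hundreds of thousands.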

  16. Utah Flooding Hazard: Raising Public Awareness through the Creation of Multidisciplinary Web-Based Maps

    NASA Astrophysics Data System (ADS)

    Castleton, J.; Erickson, B.; Bowman, S. D.; Unger, C. D.

    2014-12-01

    The Utah Geological Survey's (UGS) Geologic Hazards Program has partnered with the U.S. Army Corps of Engineers to create geologically derived web-based flood hazard maps. Flooding in Utah communities has historically been one of the most damaging geologic hazards. The most serious floods in Utah have generally occurred in the Great Salt Lake basin, particularly in the Weber River drainage on the western slopes of the Wasatch Range, in areas of high population density. With a growing population of 2.9 million, the state of Utah is motivated to raise awareness about the potential for flooding. The process of increasing community resiliency to flooding begins with identification and characterization of flood hazards. Many small communities in areas experiencing rapid growth have not been mapped completely by the Federal Emergency Management Agency (FEMA) Flood Insurance Rate Maps (FIRM). Existing FIRM maps typically only consider drainage areas that are greater than one square mile in determining flood zones and do not incorporate geologic data, such as the presence of young, geologically active alluvial fans that indicate a high potential for debris flows and sheet flooding. Our new flood hazard mapping combines and expands on FEMA data by incorporating mapping derived from 1:24,000-scale UGS geologic maps, LiDAR data, digital elevation models, and historical aerial photography. Our flood hazard maps are intended to supplement the FIRM maps to provide local governments and the public with additional flood hazard information so they may make informed decisions, ultimately reducing the risk to life and property from flooding hazards. Flooding information must be widely available and easily accessed. One of the most effective ways to inform the public is through web-based maps. 
Web-based flood hazard maps will not only supply the public with the flood information they need, but will also provide a platform for adding additional geologic hazards in an easily accessible format.

  17. The application of automatic recognition techniques in the Apollo 9 SO-65 experiment

    NASA Technical Reports Server (NTRS)

    Macdonald, R. B.

    1970-01-01

    A synoptic feature analysis is reported on Apollo 9 remote earth surface photographs that uses the methods of statistical pattern recognition to classify density points and clusterings in digital conversion of optical data. A computer derived geological map of a geological test site indicates that geological features of the range are separable, but that specific rock types are not identifiable.

  18. Numerical simulations and implications of air inclusions on the microdynamics of ice and firn

    NASA Astrophysics Data System (ADS)

    Steinbach, Florian; Weikusat, Ilka; Bons, Paul; Griera, Albert; Kerch, Johanna; Kuiper, Ernst-Jan; Llorens-Verde, Maria-Gema

    2016-04-01

    Although ice sheets are valuable paleo-climate archives, they can lose their integrity through ice flow (Faria et al. 2010). Consequently, understanding the dynamic processes that control the flow of ice is essential when investigating the past and future climate. While recent research successfully modelled the microdynamics of pure ice (e.g. Montagnat et al., 2014; Llorens et al., 2015), work taking into account second phases is scarce. Only a few studies also show the microstructural influence of air inclusions (Azuma et al., 2012, Roessiger et al., 2014). Therefore, modelling was performed focussing on the implications of the presence of bubbles for the microdynamical mechanisms and microstructure evolution. The full-field theory crystal plasticity code (FFT) of Lebensohn (2001) was coupled to the 2D multi-process modelling platform Elle (Bons et al., 2008), following the approach of Griera et al. (2013). FFT calculates the viscoplastic response of polycrystalline materials deforming by dislocation glide, taking into account mechanical anisotropy. The models further incorporate surface- and stored-strain-energy-driven grain boundary migration (GBM) and intracrystalline recovery, simulating annihilation and rearrangement of dislocations by reduction of internal misorientations. GBM was refined for polyphase materials following Becker et al. (2008) and Roessiger et al. (2014). Additionally, the formation of new high angle grain boundaries by nucleation and polygonisation based on critical internal misorientations has been implemented. Successively running the codes for different processes in very short numerical timesteps effectively enables multi-process modelling of deformation and concurrent recrystallisation. Results show how air inclusions control and increase strain localisation, leading to locally enhanced dynamic recrystallisation. This is in compliance with Faria et al. 
(2014), who theoretically predicted these localizations based on firn data from EPICA Dronning Maud Land (EDML) ice core. We propose that strain localisation has a strong control on the dominating recrystallisation mechanisms and can account for microstructural observations from alpine and polar ice cores. Our results confirm dynamic recrystallisation occurring in the uppermost levels of ice sheets as observed by Kipfstuhl et al. (2009) and Weikusat et al. (2009) in EDML core. References Azuma, N., et al. (2012) Journal of Structural Geology, 42, 184-193 Becker, J.K., et al. (2008) Computers & Geosciences, 34, 201-212 Bons, P.D., et al. (2008) Lecture Notes in Earth Sciences, 106 Faria, S.H., et al. (2010) Quaternary Science Reviews, 29, 338-351 Faria, S.H., et al. (2014) Journal of Structural Geology, 61, 21-49 Griera, A., et al. (2013) Tectonophysics, 587, 4-29 Kipfstuhl, S., et al. (2009) Journal of Geophysical Research, 114, B05204 Lebensohn, R.A. (2001) Acta Materialia, 49, 2723-2737 Llorens, M.G., et al. (2015) Journal of Glaciology, in press, doi:10.1017/jog.2016.28 Montagnat, M., et al. (2014) Journal of Structural Geology, 61, 78-108 Roessiger, J., et al. (2014) Journal of Structural Geology, 61, 123-132 Weikusat, I., et al. (2009) Journal of Glaciology, 55, 461-472

  19. Chapter 6. Tabular data and graphical images in support of the U.S. Geological Survey National Oil and Gas Assessment-East Texas basin and Louisiana-Mississippi salt basins provinces, Jurassic Smackover interior salt basins total petroleum system (504902), Travis Peak and Hosston formations.

    USGS Publications Warehouse

    ,

    2006-01-01

    This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on the CD-ROM. Computers and software may import the data without transcription from the Portable Document Format files (.pdf files) of the text by the reader. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in a raw form as tab-delimited text files (.tab files).

  20. Chapter 3. Tabular data and graphical images in support of the U.S. Geological Survey National Oil and Gas Assessment--East Texas basin and Louisiana-Mississippi salt basins provinces, Jurassic Smackover Interior salt basins total petroleum system (504902), Cotton Valley group.

    USGS Publications Warehouse

    Klett, T.R.; Le, P.A.

    2006-01-01

    This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on the CD-ROM. Computers and software may import the data without transcription from the Portable Document Format files (.pdf files) of the text by the reader. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in a raw form as tab-delimited text files (.tab files).

  1. Comparing orbiter and rover image-based mapping of an ancient sedimentary environment, Aeolis Palus, Gale crater, Mars

    NASA Astrophysics Data System (ADS)

    Stack, K. M.; Edwards, C. S.; Grotzinger, J. P.; Gupta, S.; Sumner, D. Y.; Calef, F. J.; Edgar, L. A.; Edgett, K. S.; Fraeman, A. A.; Jacob, S. R.; Le Deit, L.; Lewis, K. W.; Rice, M. S.; Rubin, D.; Williams, R. M. E.; Williford, K. H.

    2016-12-01

    This study provides the first systematic comparison of orbital facies maps with detailed ground-based geology observations from the Mars Science Laboratory (MSL) Curiosity rover to examine the validity of geologic interpretations derived from orbital image data. Orbital facies maps were constructed for the Darwin, Cooperstown, and Kimberley waypoints visited by the Curiosity rover using High Resolution Imaging Science Experiment (HiRISE) images. These maps, which represent the most detailed orbital analysis of these areas to date, were compared with rover image-based geologic maps and stratigraphic columns derived from Curiosity's Mast Camera (Mastcam) and Mars Hand Lens Imager (MAHLI). Results show that bedrock outcrops can generally be distinguished from unconsolidated surficial deposits in high-resolution orbital images and that orbital facies mapping can be used to recognize geologic contacts between well-exposed bedrock units. However, process-based interpretations derived from orbital image mapping are difficult to infer without known regional context or observable paleogeomorphic indicators, and layer-cake models of stratigraphy derived from orbital maps oversimplify depositional relationships as revealed from a rover perspective. This study also shows that fine-scale orbital image-based mapping of current and future Mars landing sites is essential for optimizing the efficiency and science return of rover surface operations.

  2. Comparing orbiter and rover image-based mapping of an ancient sedimentary environment, Aeolis Palus, Gale crater, Mars

    USGS Publications Warehouse

    Stack, Kathryn M.; Edwards, Christopher; Grotzinger, J. P.; Gupta, S.; Sumner, D.; Edgar, Lauren; Fraeman, A.; Jacob, S.; LeDeit, L.; Lewis, K.W.; Rice, M.S.; Rubin, D.; Calef, F.; Edgett, K.; Williams, R.M.E.; Williford, K.H.

    2016-01-01

    This study provides the first systematic comparison of orbital facies maps with detailed ground-based geology observations from the Mars Science Laboratory (MSL) Curiosity rover to examine the validity of geologic interpretations derived from orbital image data. Orbital facies maps were constructed for the Darwin, Cooperstown, and Kimberley waypoints visited by the Curiosity rover using High Resolution Imaging Science Experiment (HiRISE) images. These maps, which represent the most detailed orbital analysis of these areas to date, were compared with rover image-based geologic maps and stratigraphic columns derived from Curiosity’s Mast Camera (Mastcam) and Mars Hand Lens Imager (MAHLI). Results show that bedrock outcrops can generally be distinguished from unconsolidated surficial deposits in high-resolution orbital images and that orbital facies mapping can be used to recognize geologic contacts between well-exposed bedrock units. However, process-based interpretations derived from orbital image mapping are difficult to infer without known regional context or observable paleogeomorphic indicators, and layer-cake models of stratigraphy derived from orbital maps oversimplify depositional relationships as revealed from a rover perspective. This study also shows that fine-scale orbital image-based mapping of current and future Mars landing sites is essential for optimizing the efficiency and science return of rover surface operations.

  3. Design and Implementation WebGIS for Improving the Quality of Exploration Decisions at Sin-Quyen Copper Mine, Northern Vietnam

    NASA Astrophysics Data System (ADS)

    Quang Truong, Xuan; Luan Truong, Xuan; Nguyen, Tuan Anh; Nguyen, Dinh Tuan; Cong Nguyen, Chi

    2017-12-01

    The objective of this study is to design and implement a WebGIS Decision Support System (WDSS) for reducing uncertainty and improving the quality of exploration decisions at the Sin-Quyen copper mine, northern Vietnam. The main distinctive feature of the Sin-Quyen deposit is the unusual composition of its ores. Computers and software applied to exploration problems have had a significant impact on the exploration process over the past 25 years, but until now no online system had been developed. The system was built entirely on open-source technology and the Open Geospatial Consortium Web Services (OWS). The input data include remote sensing (RS) imagery, Geographical Information System (GIS) layers and drillhole exploration data; the drillhole exploration data sets were designed as a geodatabase and stored in PostgreSQL. The WDSS processes exploration data and lets users access two-dimensional (2D) and three-dimensional (3D) cross-sections and maps of the exploration drillholes. The interface was designed to interact with base maps (e.g., Digital Elevation Model, Google Maps, OpenStreetMap) and thematic maps (e.g., land use and land cover, administrative maps, drillhole exploration maps), and to provide GIS functions such as creating a new map, updating an existing map, querying and producing statistical charts. In addition, the system generates geological cross-sections of ore bodies based on Inverse Distance Weighting (IDW), nearest-neighbour interpolation and kriging methods (e.g., Simple Kriging, Ordinary Kriging, Indicator Kriging and CoKriging). Results based on the available data (23 borehole exploration data sets) indicate that the best method for estimating geological cross-sections of ore bodies at the Sin-Quyen copper mine is Ordinary Kriging. The WDSS can provide useful information to improve drilling efficiency in mineral exploration and to support management decision making.
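
    As a rough illustration of the simplest interpolation method named in this record, the following Python sketch implements inverse distance weighting for hypothetical borehole points. The coordinates, grades and power parameter are invented for illustration; they are not values from the Sin-Quyen data sets.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Inverse Distance Weighting: estimate values at query points
    as distance-weighted averages of known borehole observations."""
    xy_known = np.asarray(xy_known, float)
    values = np.asarray(values, float)
    out = []
    for q in np.atleast_2d(xy_query):
        d = np.linalg.norm(xy_known - q, axis=1)
        if d.min() < eps:                 # query coincides with a sample
            out.append(values[d.argmin()])
            continue
        w = 1.0 / d**power
        out.append(np.dot(w, values) / w.sum())
    return np.array(out)

# Hypothetical borehole collars (x, y) and an assayed grade at each
pts = [(0, 0), (10, 0), (0, 10), (10, 10)]
grades = [1.0, 2.0, 3.0, 4.0]
print(idw(pts, grades, [(5, 5)]))  # centre of the square -> mean of corners
```

    Kriging differs from this sketch in that the weights come from a fitted variogram model rather than from distance alone, which is why the record can rank the kriging variants against IDW on held-out data.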

  4. Stochastic Seismic Inversion and Migration for Offshore Site Investigation in the Northern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Son, J.; Medina-Cetina, Z.

    2017-12-01

    We discuss the comparison between deterministic and stochastic optimization approaches to the nonlinear geophysical full-waveform inverse problem, based on seismic survey data from Mississippi Canyon in the Northern Gulf of Mexico. Because subsea engineering and offshore construction projects require reliable ground models from site investigations, the primary goal of this study is to reconstruct accurate subsurface information on the soil and rock material profiles beneath the seafloor. The shallow sediment layers are naturally heterogeneous formations that may cause marine landslides or foundation failures of underwater infrastructure. We chose the quasi-Newton method and simulated annealing as the deterministic and stochastic optimization algorithms, respectively. Seismic forward modeling, based on a finite-difference method with absorbing boundary conditions, implements the iterative simulations in the inverse modeling. We briefly report on numerical experiments using synthetic data as an offshore ground model containing shallow artificial target profiles of geomaterials beneath the seafloor. We apply seismic migration processing and generate a Voronoi tessellation in the two-dimensional space domain to improve the computational efficiency of the stratigraphic velocity model reconstruction. We then report on the details of a field data implementation, which shows the complex geologic structures of the Northern Gulf of Mexico. Lastly, we compare the new inverted image of subsurface site profiles in the space domain with the previously processed seismic image in the time domain at the same location. Overall, stochastic optimization for seismic inversion with migration and Voronoi tessellation shows significant promise for improving the subsurface imaging of ground models and the computational efficiency required for full-waveform inversion. We anticipate that improving the inversion of shallow layers from geophysical data will better support offshore site investigation.
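
    The stochastic optimizer named in this record can be sketched in a few lines. The toy Python example below applies simulated annealing to a deliberately simplified one-parameter "inversion" (fitting a single layer velocity to an observed travel time); the misfit function, starting value and cooling schedule are invented stand-ins for the study's full-waveform misfit, not its actual setup.

```python
import math
import random

def misfit(v, depth=100.0, t_obs=0.05):
    """Squared residual between observed and predicted two-way travel
    time for one layer of thickness `depth` -- a toy stand-in for a
    full-waveform misfit functional."""
    return (2.0 * depth / v - t_obs) ** 2

def simulated_annealing(f, x0, step=100.0, t0=1.0, cooling=0.99,
                        n_iter=5000, seed=1):
    """Minimise f by random perturbation, accepting uphill moves with a
    temperature-controlled (Metropolis) probability."""
    random.seed(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    temp = t0
    for _ in range(n_iter):
        x_new = x + random.uniform(-step, step)
        if x_new > 0:                      # keep the velocity physical
            f_new = f(x_new)
            # always accept downhill; accept uphill with Boltzmann probability
            if f_new < fx or random.random() < math.exp(-(f_new - fx) / temp):
                x, fx = x_new, f_new
                if fx < fbest:
                    best, fbest = x, fx
        temp *= cooling
    return best

v_est = simulated_annealing(misfit, x0=1500.0)
print(round(v_est))  # should land near the true 2 * 100 / 0.05 = 4000 m/s
```

    A quasi-Newton method would instead follow gradient and curvature information downhill, which is faster but can be trapped by the local minima that make full-waveform inversion hard.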

  5. Land cover mapping of the upper Kuskokwim Resource Management Area using LANDSAT and a digital data base approach

    USGS Publications Warehouse

    Markon, Carl J.

    1988-01-01

    Digital land cover and terrain data for the Upper Kuskokwim Resource Management Area (UKRMA) were produced by the U.S. Geological Survey, Earth Resources Observation Systems Field Office, Anchorage, Alaska, for the Bureau of Land Management. These and other environmental data were incorporated into a digital data base to assist in the management and planning of the UKRMA. The digital data base includes land cover classifications, elevation, slope, and aspect data centering on the UKRMA boundaries. The data are stored on computer-compatible tapes at a 50-m pixel size. Additional digital data in the data base include: (a) summer and winter Landsat multispectral scanner (MSS) data registered to a 50-m Universal Transverse Mercator grid; (b) elevation, slope, aspect, and solar illumination data; (c) soils and surficial geology; and (d) study area boundary. The classification of Landsat MSS data resulted in seven major classes and 24 subclasses. Major classes include: forest, shrubland, dwarf scrub, herbaceous, barren, water, and other. The final data base will be used by resource personnel for management and planning within the UKRMA.

  6. Experimental Simulation of Cryomagmatic Processes. Water Ice, Clathrates and Salts

    NASA Astrophysics Data System (ADS)

    Muñoz-Iglesias, V.; Prieto-Ballesteros, O.; López, I.

    2018-06-01

    This study addresses diverse cryomagmatic processes in the H2O-CO2-MgSO4 system, with application to Europa. The type of crystal formed is related to volume-temperature changes, while crystal morphology is associated with surface geological features.

  7. Using Digital Time-Lapse Videos to Teach Geomorphic Processes to Undergraduates

    NASA Astrophysics Data System (ADS)

    Clark, D. H.; Linneman, S. R.; Fuller, J.

    2004-12-01

    We demonstrate the use of relatively low-cost, computer-based digital imagery to create time-lapse videos of two distinct geomorphic processes in order to help students grasp the significance of the rates, styles, and temporal dependence of geologic phenomena. Student interviews indicate that such videos help them to understand the relationship between processes and landform development. Time-lapse videos have been used extensively in some sciences (e.g., biology - http://sbcf.iu.edu/goodpract/hangarter.html, meteorology - http://www.apple.com/education/hed/aua0101s/meteor/, chemistry - http://www.chem.yorku.ca/profs/hempsted/chemed/home.html) to demonstrate gradual processes that are difficult for many students to visualize. Most geologic processes are slower still, and are consequently even more difficult for students to grasp, yet time-lapse videos are rarely used in earth science classrooms. The advent of inexpensive web-cams and computers provides a new means to explore the temporal dimension of earth surface processes. To test the use of time-lapse videos in geoscience education, we are developing time-lapse movies that record the evolution of two landforms: a stream-table delta and a large, natural, active landslide. The former involves well-known processes in a controlled, repeatable laboratory experiment, whereas the latter tracks the developing dynamics of an otherwise poorly understood slope failure. The stream-table delta is small and grows in ca. 2 days; we capture a frame on an overhead web-cam every 3 minutes. Before seeing the video, students are asked to hypothesize how the delta will grow through time. The final time-lapse video, ca. 20-80 MB, elegantly shows channel migration, progradation rates, and formation of major geomorphic elements (topset, foreset, bottomset beds). The web-cam can also be "zoomed-in" to show smaller-scale processes, such as bedload transfer, and foreset slumping. 
Post-lab tests and interviews with students indicate that these time-lapse videos significantly improve student interest in the material and comprehension of the processes. In contrast, the natural landslide is relatively unconstrained, and its processes of movement, both gradual and catastrophic, are essentially impossible to observe directly without the aid of time-lapse imagery. We are constructing a remote digital camera, mounted in a tree, that will capture 1-2 photos/day of the toe. The toe is extremely active geomorphically, and the time-lapse movie should help us (and the students) to constrain the style, frequency, and rates of movement, surface slumping, and debris-flow generation. Because we have also installed a remote weather station on the landslide, we will be able to test the links between these processes and local climate conditions.
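
    The frame-budget arithmetic behind such a video is simple. This sketch uses the capture interval and run length quoted for the stream-table delta above, with an assumed 24 fps playback rate (the abstract does not state the playback rate).

```python
# Frame-budget arithmetic for the stream-table time-lapse described above:
# one frame every 3 minutes over a ~2-day delta run, played back at 24 fps.
capture_interval_min = 3
run_days = 2
playback_fps = 24

frames = run_days * 24 * 60 // capture_interval_min
playback_seconds = frames / playback_fps
speedup = (run_days * 24 * 3600) / playback_seconds

print(frames, playback_seconds, speedup)  # -> 960 40.0 4320.0
```

    The speedup factor is simply the capture interval in seconds times the playback frame rate, which is what lets two days of delta growth play out in under a minute.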

  8. An Overview of Computer-Based Natural Language Processing.

    ERIC Educational Resources Information Center

    Gevarter, William B.

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines using natural languages (English, Japanese, German, etc.) rather than formal computer languages. NLP is a major research area in the fields of artificial intelligence and computational linguistics. Commercial…

  9. MER Field Geologic Traverse in Gusev Crater, Mars: Initial Results From the Perspective of Spirit

    NASA Technical Reports Server (NTRS)

    Crumpler, L.; Cabrol, N.; Des Marais, D.; Farmer, J.; Golombek, M.; Grant, J.; Greeley, R.; Grotzinger, J.; Haskin, L.; Arvidson, R.

    2004-01-01

    This report casts the initial results of the traverse and science investigations by the Mars Exploration Rover (MER) Spirit at Gusev crater [1] in terms of data sets commonly used in field geologic investigations: local mapping of geologic features, analyses of selected samples and their location within the local map, and the regional context of the field traverse in terms of the larger geologic and physiographic region. These elements of the field method are represented in the MER characterization of the Gusev traverse by perspective-based geologic/morphologic maps, the placement of the results from Mossbauer, APXS, Microscopic Imager, Mini-TES and Pancam multispectral studies in context within this geologic/morphologic map, and the placement of the overall traverse in the context of narrow-angle MOC (Mars Orbiter Camera) and descent images. A major campaign over a significant fraction of the mission will be the first robotic traverse of the ejecta from a Martian impact crater along an approximate radial from the crater center. The Mars Exploration Rovers have been conceptually described as 'robotic field geologists', that is, a suite of instruments with mobility that enables far-field traverses to multiple sites located within a regional map/image base at which in situ analyses may be done. Initial results from MER, where the field geologic method has been used throughout the initial course of the investigation, confirm that this field geologic model is applicable for remote planetary surface exploration. The field geologic method makes use of near-field geologic characteristics ('outcrops') to develop an understanding of the larger geologic context through a continuous loop of rational steps focused on real-time hypothesis identification and testing. This poster equates 'outcrops' with the locations of in situ investigations and 'regional context' with the geology over distances of several kilometers. 
Using this fundamental field geologic method, we have identified the basic local geologic materials on the floor of Gusev at this site, their compositions and likely lithologies, origins, processes that have modified these materials, and their potential significance in the interpretation of the regional geology both spatially and temporally.

  10. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-94, with projections to 2020; (supplement one to U.S. Geological Survey Water-resources investigations report 94-4251)

    USGS Publications Warehouse

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.). Output files resulting from the computer simulations are included for reference.

  11. Surficial geology of Mars: A study in support of a penetrator mission to Mars

    NASA Technical Reports Server (NTRS)

    Spudis, P.; Greeley, R.

    1976-01-01

    Physiographic and surficial cover information were combined into unified surficial geology maps (30 quadrangles and 1 synoptic map). The surface of Mars is heterogeneous and is modified by wind, water, volcanism, tectonism, mass wasting and other processes. Surficial mapping identifies areas modified by these processes on a regional basis. Viking I mission results indicate that, at least in the landing site area, the surficial mapping based on Mariner data is fairly accurate. This area was mapped as a lightly cratered plain with thin or discontinuous eolian sediment. Analysis of lander images indicates that this interpretation is very close to actual surface conditions. These initial results do not imply that all surficial units are mapped correctly, but they do increase confidence in estimates based on photogeologic interpretations of orbital pictures.

  12. Study of the Effects of Photometric Geometry on Spectral Reflectance Measurements

    NASA Technical Reports Server (NTRS)

    Helfenstein, Paul

    1998-01-01

    The objective of this research is to investigate how the spectrophotometric properties of planetary surface materials depend on photometric geometry by refining and applying radiative transfer theory to data obtained from spacecraft and telescope observations of planetary surfaces, studies of laboratory analogs, and computer simulations. The goal is to perfect the physical interpretation of photometric parameters in the context of planetary surface geological properties and processes. The purpose of this report is to document the research achievements associated with this study.

  13. Surface features of central North America: a synoptic view from computer graphics

    USGS Publications Warehouse

    Pike, R.J.

    1991-01-01

    A digital shaded-relief image of the 48 contiguous United States shows the details of large- and small-scale landforms, including several linear trends. The features faithfully reflect tectonism, continental glaciation, fluvial activity, volcanism, and other surface-shaping events and processes. The new map not only depicts topography accurately and in its true complexity, but does so in one synoptic view that provides a regional context for geologic analysis unobscured by clouds, culture, vegetation, or artistic constraints. -Author
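
    A shaded-relief image like the one described can be sketched from a DEM with a Lambertian illumination model. The formula below is a common textbook variant with the conventional northwest sun at 45 degrees altitude; azimuth/aspect sign conventions differ between packages, and this is not necessarily the exact algorithm used for the published map.

```python
import numpy as np

def hillshade(dem, cell, azimuth_deg=315.0, altitude_deg=45.0):
    """Lambertian shaded relief from a DEM: brightness is the cosine of
    the angle between the surface normal and the illumination vector."""
    az = np.radians(azimuth_deg)
    alt = np.radians(altitude_deg)
    dzdy, dzdx = np.gradient(dem, cell)          # slope components
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(-dzdx, dzdy)
    shade = (np.sin(alt) * np.cos(slope)
             + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shade, 0.0, 1.0)

flat = np.zeros((3, 3))
print(hillshade(flat, cell=30.0).round(3))  # flat ground -> uniform sin(45 deg)
```

    Linear trends such as those noted in the abstract stand out in shaded relief because any ridge or scarp oblique to the sun azimuth casts a consistent bright/dark pair.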

  14. Simulation of geothermal water extraction in heterogeneous reservoirs using dynamic unstructured mesh optimisation

    NASA Astrophysics Data System (ADS)

    Salinas, P.; Pavlidis, D.; Jacquemyn, C.; Lei, Q.; Xie, Z.; Pain, C.; Jackson, M.

    2017-12-01

    It is well known that the pressure gradient into a production well increases with decreasing distance to the well. To properly capture the local pressure drawdown into the well, a high grid or mesh resolution is required; moreover, the location of the well must be captured accurately. In conventional simulation models, the user must interact with the model to modify grid resolution around wells of interest, and the well location is approximated on a grid defined early in the modelling process. We report a new approach for improved simulation of near-wellbore flow in reservoir-scale models through the use of dynamic mesh optimisation and the recently presented double control volume finite element method. Time is discretized using an adaptive, implicit approach. Heterogeneous geologic features are represented as volumes bounded by surfaces. Within these volumes, termed geologic domains, the material properties are constant. Up-, cross- or down-scaling of material properties during dynamic mesh optimisation is not required, as the properties are uniform within each geologic domain. A given model typically contains numerous such geologic domains. Wells are implicitly coupled with the domain, and fluid flow is modelled inside the wells. The method is novel for two reasons. First, a fully unstructured tetrahedral mesh is used to discretize space, and the spatial location of the well is specified via a line vector, preserving its location even if the mesh is modified during the simulation. The well location is therefore accurately captured, and the approach allows complex well trajectories and wells with many laterals to be modelled. 
Second, computational efficiency is increased by the use of dynamic mesh optimisation, in which an unstructured mesh adapts in space and time to key solution fields such as pressure, velocity or temperature, while preserving the geometry of the geologic domains. This also increases the quality of the solutions by placing higher resolution where required to reduce an error metric based on the Hessian of the field. This allows the local pressure drawdown to be captured without user-driven modification of the mesh. We demonstrate that the method has wide application in reservoir-scale models of geothermal fields and regional models of groundwater resources.
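
    A minimal sketch of a Hessian-type error indicator, assuming a 1-D field and simple central differences; the abstract's actual metric operates on 3-D tetrahedral meshes, so this only illustrates why refinement concentrates where the solution curves sharply, such as the logarithmic drawdown near a well.

```python
import numpy as np

def hessian_indicator(field, dx):
    """Second-derivative (1-D Hessian) magnitude by central differences;
    mesh adaptation places resolution where this indicator is large."""
    d2 = (field[:-2] - 2.0 * field[1:-1] + field[2:]) / dx**2
    return np.abs(d2)

# Toy pressure drawdown p(r) ~ log(r) toward a well at r = 0
r = np.linspace(0.1, 10.0, 200)
p = np.log(r)
ind = hessian_indicator(p, r[1] - r[0])
# curvature is largest near the well, so cells would be refined there
print(ind.argmax())  # -> 0 (the interior point closest to the well)
```

    In the full method the same idea drives coarsening too: far from wells the Hessian is small, so the mesh can be relaxed without degrading the drawdown solution.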

  15. Advanced Land Observing Satellite (ALOS) Phased Array Type L-Band Synthetic Aperture Radar (PALSAR) mosaic for the Kahiltna terrane, Alaska, 2007-2010

    USGS Publications Warehouse

    Cole, Christopher J.; Johnson, Michaela R.; Graham, Garth E.

    2015-01-01

    The USGS has compiled a continuous, cloud-free 12.5-meter resolution radar mosaic of SAR data of approximately 212,000 square kilometers to examine the suitability of this technology for geologic mapping. This mosaic was created from Advanced Land Observing Satellite (ALOS) Phased Array type L-band Synthetic Aperture Radar (PALSAR) data collected from 2007 to 2010 spanning the Kahiltna terrane and the surrounding area. Interpretation of these data may help geologists understand past geologic processes and identify areas with potential for near-surface mineral resources for further ground-based geological and geochemical investigations.

  16. Geological mapping in northwestern Saudi Arabia using LANDSAT multispectral techniques

    NASA Technical Reports Server (NTRS)

    Blodget, H. W.; Brown, G. F.; Moik, J. G.

    1975-01-01

    Various computer enhancement and data extraction systems using LANDSAT data were assessed and used to complement a continuing geologic mapping program. Interactive digital classification techniques using both the parallelepiped and maximum-likelihood statistical approaches achieved very limited success in areas of highly dissected terrain. Computer-enhanced imagery developed by color compositing stretched MSS ratio data was constructed for a test site in northwestern Saudi Arabia. Initial results indicate that several igneous and sedimentary rock types can be discriminated.
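
    The parallelepiped classifier mentioned above is easy to sketch: each class is a per-band min/max box in spectral space, and a pixel is assigned to the first box that contains it. The class names and band bounds below are invented for illustration, not values from the Saudi Arabia test site.

```python
import numpy as np

def parallelepiped_classify(pixel, class_bounds):
    """Parallelepiped classifier: return the first class whose per-band
    min/max box contains the pixel, or -1 (unclassified) otherwise."""
    for label, (lo, hi) in class_bounds.items():
        if np.all(pixel >= lo) and np.all(pixel <= hi):
            return label
    return -1

# Hypothetical 4-band MSS value boxes for two rock classes
bounds = {
    "basalt":    (np.array([10, 12, 8, 6]),   np.array([30, 28, 25, 20])),
    "sandstone": (np.array([40, 45, 50, 42]), np.array([80, 85, 90, 88])),
}
print(parallelepiped_classify(np.array([20, 20, 15, 10]), bounds))  # basalt box
print(parallelepiped_classify(np.array([100, 5, 60, 70]), bounds))  # fits neither
```

    Its weakness, consistent with the limited success reported in dissected terrain, is that illumination differences on slopes push pixels out of their boxes even when the rock type is unchanged, which is one reason band ratios were used instead.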

  17. Discussion on the 3D visualizing of 1:200 000 geological map

    NASA Astrophysics Data System (ADS)

    Wang, Xiaopeng

    2018-01-01

    Using terrain data from the United States National Aeronautics and Space Administration Shuttle Radar Topography Mission (SRTM) as a digital elevation model (DEM), draping the scanned 1:200 000-scale geological map over it, and programming with Microsoft Direct3D in the C# language, the author realized a three-dimensional visualization of the standard-sheet geological map. Users can inspect the regional geological content from arbitrary angles, with rotation and roaming, and can examine the comprehensive stratigraphic column, map sections and legend at any moment. This provides an intuitive analysis tool for geological practitioners to perform structural analysis with the assistance of landforms, to plan field exploration routes, and so on.
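
    Draping a scanned map sheet over a DEM, as described, amounts to building a vertex grid with texture coordinates. The numpy sketch below shows the geometry step only; the cell size and vertical exaggeration are illustrative, and a real Direct3D implementation would also need triangle indices and a rendering loop.

```python
import numpy as np

def dem_to_vertices(dem, cell_size, z_exaggeration=2.0):
    """Convert a DEM grid to (x, y, z) vertices plus (u, v) texture
    coordinates so a scanned map sheet can be draped over the terrain."""
    rows, cols = dem.shape
    y, x = np.mgrid[0:rows, 0:cols]
    verts = np.column_stack([
        (x * cell_size).ravel(),
        (y * cell_size).ravel(),
        (dem * z_exaggeration).ravel(),
    ])
    # normalised texture coordinates map each vertex onto the scanned sheet
    uv = np.column_stack([(x / (cols - 1)).ravel(), (y / (rows - 1)).ravel()])
    return verts, uv

dem = np.array([[100.0, 110.0], [105.0, 120.0]])     # toy 2x2 elevation grid
verts, uv = dem_to_vertices(dem, cell_size=90.0)     # ~90 m SRTM cells
print(verts.shape, uv.shape)  # -> (4, 3) (4, 2)
```

    With the vertices and UVs in hand, the GPU does the draping: each triangle samples the scanned geological map as a texture while its corners carry SRTM-derived heights.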

  18. Real-Time, Sensor-Based Computing in the Laboratory.

    ERIC Educational Resources Information Center

    Badmus, O. O.; And Others

    1996-01-01

    Demonstrates the importance of Real-Time, Sensor-Based (RTSB) computing and how it can be easily and effectively integrated into university student laboratories. Describes the experimental processes, the process instrumentation and process-computer interface, the computer and communications systems, and typical software. Provides much technical…

  19. Seismic expression of Red Fork channels in Major and Kay Counties, Oklahoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanoch, C.A.

    1987-08-01

    This paper investigates the application of regional seismic data to exploration and development of Red Fork sands of the Cherokee Group in Major and Kay Counties, Oklahoma. A computer-aided exploration (CAEX) system was used to reconcile the subtle seismic expressions with the geological interpretation. Modeling shows that the low-velocity shales are the anomalous rock in the Cherokee package, which is represented mostly by siltstone and thin sands. Because the Red Fork channel sands were incised into or deposited with laterally time-equivalent siltstones, no strong reflection coefficient is associated with the top of the sands. The objective sands become a seismic anomaly only when they cut into and replace a low-velocity shale. This knowledge allows mapping the channel thickness by interpreting the shale thickness from seismic data. A group shoot line in Major County, Oklahoma, has been tied to the geologic control, and the channel thicknesses have been interpreted assuming a detectable vertical resolution of 10 ft. A personal computer-based geophysical workstation is used to construct velocity logs representative of the geology, to produce forward-modeled synthetic seismic sections, and to display the seismic trace attributes in color. These synthetic sections are used as tools to compare with and interpret the seismic line and to evaluate the interpretative value of lowest-cost, lesser quality data versus reprocessing or new data acquisition.
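
    The forward-modeling step described (velocity logs to synthetic sections) rests on normal-incidence reflection coefficients convolved with a wavelet. The toy numpy sketch below uses invented velocities and densities for a low-velocity shale between siltstones, echoing why the shale, not the sand, produces the reflections.

```python
import numpy as np

def reflection_coefficients(velocity, density):
    """Normal-incidence reflection coefficients at layer interfaces:
    R = (Z2 - Z1) / (Z2 + Z1), with acoustic impedance Z = velocity * density."""
    z = np.asarray(velocity, float) * np.asarray(density, float)
    return (z[1:] - z[:-1]) / (z[1:] + z[:-1])

def ricker(freq, dt, length=0.1):
    """Zero-phase Ricker wavelet, a common choice in forward modeling."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * freq * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# Hypothetical velocity log: a low-velocity shale between siltstones
# (all values invented for illustration).
vel = [3500.0, 2800.0, 3500.0]   # m/s
rho = [2.50, 2.40, 2.50]         # g/cm^3
rc = reflection_coefficients(vel, rho)

rseries = np.zeros(100)          # reflectivity series in two-way-time samples
rseries[[30, 60]] = rc           # shale top and base
trace = np.convolve(rseries, ricker(30.0, 0.002), mode="same")
print(rc.round(3))               # equal-and-opposite spikes at top and base
```

    Replace the middle layer with a sand whose impedance matches the siltstone and both coefficients collapse toward zero, which is exactly the "no strong reflection at the top of the sands" situation the paper exploits.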

  20. Some relations between streamflow characteristics and the environment in the Delaware River region

    USGS Publications Warehouse

    Hely, A.G.; Olmsted, F.H.

    1963-01-01

    Streamflow characteristics are determined by a large number of factors of the meteorological and terrestrial environments. Because of lack of quantitative data to describe some of the factors and complex interrelations among them, complete analysis of the relations between streamflow and the various environmental factors is impossible. However, certain simplifying assumptions and generalizations made possible a partial analysis for the Delaware River region. For relations involving average runoff or low-flow parameters, average annual precipitation was assumed to be the principal meteorological factor, and geology (a complex of many factors) was assumed to be the principal terrestrial influence, except for that of basin size which was largely eliminated by expression of discharge in terms of unit area. As a first approximation, physiographic units were used as a basis for classifying the geology. Relations between flow parameters and precipitation are fairly well defined for some physiographic units, but not for those in which the geology varies markedly or the areal variation in average precipitation is very small. These relations provide a basis for adjusting the flow parameters to reduce or eliminate the effects of areal variations in precipitation and increase their significance in studies of the effects of terrestrial characteristics. An investigation of the residual effect of basin size (the effect remaining when discharge is expressed in terms of unit area) on relations between flow parameters and average precipitation indicates that such effect is negligible, except for very large differences in area. Parameters that are derived from base-flow recession curves and are related to a common discharge per unit area have inherent advantages as indicators of effects of terrestrial characteristics of basins, because they are independent of areal variations in average annual precipitation. 
Winter base-flow parameters are also practically independent of the effects of evapotranspiration from ground water. However, in many parts of the region these advantages are reduced or nullified by the difficulties of defining base-flow recession curves, particularly winter curves, with sufficient accuracy. In the absence of suitable base-flow recession data and a suitable basis for adjusting parameters, the ratio of the discharge equaled or exceeded 90 percent of the time to the average discharge (Q90/Qa), or a similar duration parameter, probably is the best indicator of the influence of terrestrial characteristics, although the ratio may vary somewhat with average precipitation. In a part of the region where geologic differences are large and areal variations in average precipitation are small, values of Q90/Qa for each major geologic unit were determined from streamflow records. From these values and the percentage of area represented by each unit, a ratio for each gaging station was computed. Comparison of these computed results with the observed results indicates that nearly all of the variation in the ratio is associated with variation in geology. The investigation indicates that the original assumptions are correct; average precipitation is the principal meteorological influence and geology is the principal terrestrial influence. Together these two factors account for a very large proportion of the variation in average runoff and low-flow characteristics.
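
    The area-weighting procedure described in the last paragraph can be illustrated with invented numbers; the unit names, per-unit ratios and areal fractions below are hypothetical, not values from the report.

```python
# Hypothetical per-unit low-flow ratios (Q90/Qa) estimated from gaged
# basins, and the areal fractions of those units in one ungaged basin.
unit_ratio = {"carbonate": 0.45, "crystalline": 0.15, "coastal_plain": 0.30}
area_fraction = {"carbonate": 0.2, "crystalline": 0.5, "coastal_plain": 0.3}

# Basin ratio as the area-weighted mean of the unit ratios, following the
# report's procedure of combining unit values by percentage of area.
basin_ratio = sum(unit_ratio[u] * area_fraction[u] for u in unit_ratio)
print(round(basin_ratio, 3))  # -> 0.255
```

    Comparing such computed ratios with ratios observed at gaging stations is how the report attributes nearly all of the variation to geology.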

  1. Aerial radiometric and magnetic survey: Aztec National Topographic Map, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-01-01

    The results of analyses of the airborne gamma radiation and total magnetic field survey flown for the region identified as the Aztec National Topographic Map NJ13-10 are presented. The airborne data gathered are reduced by ground computer facilities to yield profile plots of the basic uranium, thorium and potassium equivalent gamma radiation intensities, ratios of these intensities, aircraft altitude above the earth's surface, total gamma ray and earth's magnetic field intensity, correlated as a function of geologic units. The distribution of data within each geologic unit, for all surveyed map lines and tie lines, has been calculated and is included. Two sets of profiled data for each line are included, with one set displaying the above-cited data. The second set includes only flight line magnetic field, temperature, pressure, altitude data plus magnetic field data as measured at a base station. A general description of the area, including descriptions of the various geologic units and the corresponding airborne data, is included also.
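
    The per-unit statistics described ("the distribution of data within each geologic unit") amount to grouping the along-line samples by mapped unit and summarising each group. The unit labels and equivalent-uranium values below are invented for illustration.

```python
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical along-line samples: (geologic_unit, equivalent-uranium ppm)
samples = [("Kls", 2.1), ("Kls", 2.4), ("Kls", 2.0),
           ("Tv", 4.8), ("Tv", 5.2), ("Tv", 5.0)]

by_unit = defaultdict(list)
for unit, eu in samples:
    by_unit[unit].append(eu)

# Distribution of the radiometric data within each geologic unit
stats = {u: (round(mean(v), 2), round(stdev(v), 2)) for u, v in by_unit.items()}
print(stats)  # per-unit (mean, standard deviation)
```

    Anomalies are then flagged where a sample departs from its own unit's distribution rather than from the survey-wide average, which is why the correlation with geologic units matters.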

  2. Aerial radiometric and magnetic survey: Lander National Topographic Map, Wyoming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-01-01

    The results of analyses of the airborne gamma radiation and total magnetic field survey flown for the region identified as the Lander National Topographic Map NK12-6 are presented. The airborne data gathered are reduced by ground computer facilities to yield profile plots of the basic uranium, thorium and potassium equivalent gamma radiation intensities, ratios of these intensities, aircraft altitude above the earth's surface, total gamma ray and earth's magnetic field intensity, correlated as a function of geologic units. The distribution of data within each geologic unit, for all surveyed map lines and tie lines, has been calculated and is included. Two sets of profiled data for each line are included, with one set displaying the above-cited data. The second set includes only flight line magnetic field, temperature, pressure, altitude data plus magnetic field data as measured at a base station. A general description of the area, including descriptions of the various geologic units and the corresponding airborne data, is included also.

  3. Integrated modeling of natural and human systems - problems and initiatives

    NASA Astrophysics Data System (ADS)

    Kessler, H.; Giles, J.; Gunnink, J.; Hughes, A.; Moore, R. V.; Peach, D.

    2009-12-01

    Governments and their executive agencies across the world are facing increasing pressure to make decisions about the management of resources in light of population growth and environmental change. In the UK and the Netherlands, for example, groundwater is becoming a scarce resource for large parts of their most densely populated areas. At the same time, river and groundwater flooding resulting from high rainfall events is increasing in scale and frequency, and sea level rise is threatening the defences of coastal cities. There is also a need for affordable housing, improved transport infrastructure and waste disposal, as well as sources of renewable energy and sustainable food production. These challenges can only be resolved if solutions are based on sound scientific evidence. Although we have knowledge and understanding of many individual processes in the natural sciences, it is clear that a single science discipline is unable to answer these questions or address their inter-relationships. Modern science increasingly employs computer models to simulate the natural, economic and human systems. Management and planning require scenario modelling, forecasts and “predictions”. Although the outputs are often impressive in terms of apparent accuracy and visualisation, they are inherently not suited to simulate the response to feedbacks from other models of the earth system, such as the impact of human actions. Geological Survey Organisations (GSOs) are increasingly employing advances in Information Technology to visualise and improve their understanding of geological systems. Instead of two-dimensional paper maps and reports, many GSOs now produce three-dimensional geological framework models and groundwater flow models as their standard output. 
Additionally the British Geological Survey and the Geological Survey of the Netherlands have developed standard routines to link geological data to groundwater models, but these models are only aimed at solving one specific part of the earth's system, e.g. the flow of groundwater to an abstraction borehole or the availability of water for irrigation. Particular problems arise when model data from two or more disciplines are incompatible in terms of data formats, scientific concepts or language. Other barriers include the cultural segregation within and between science disciplines as well as impediments to data exchange due to ownership and copyright restrictions. OpenMI and GeoSciML are initiatives that are trying to overcome these barriers by building international communities that share vocabularies and data formats. This paper will give examples of the successful merging of geological and hydrological models from the UK and the Netherlands and will introduce the vision of an open Environmental Modelling Platform which aims to link data, knowledge and concepts seamlessly to numerical process models. Last but not least there is an urgent need to create a Subsurface Management System akin to a Geographic Information System in which all results of subsurface modelling can be visualised and analysed in an integrated manner.

  4. Standards for the Analysis and Processing of Surface-Water Data and Information Using Electronic Methods

    USGS Publications Warehouse

    Sauer, Vernon B.

    2002-01-01

    Surface-water computation methods and procedures are described in this report to provide standards from which a completely automated electronic processing system can be developed. To the greatest extent possible, the traditional U.S. Geological Survey (USGS) methodology and standards for streamflow data collection and analysis have been incorporated into these standards. Although USGS methodology and standards are the basis for this report, the report is applicable to other organizations doing similar work. The proposed electronic processing system allows field measurement data, including data stored on automatic field recording devices and data recorded by the field hydrographer (a person who collects streamflow and other surface-water data) in electronic field notebooks, to be input easily and automatically. A user of the electronic processing system easily can monitor the incoming data and verify and edit the data, if necessary. Input of the computational procedures, rating curves, shift requirements, and other special methods are interactive processes between the user and the electronic processing system, with much of this processing being automatic. Special computation procedures are provided for complex stations such as velocity-index, slope, control structures, and unsteady-flow models, such as the Branch-Network Dynamic Flow Model (BRANCH). Navigation paths are designed to lead the user through the computational steps for each type of gaging station (stage-only, stage-discharge, velocity-index, slope, rate-of-change in stage, reservoir, tide, structure, and hydraulic model stations). The proposed electronic processing system emphasizes the use of interactive graphics to provide good visual tools for unit values editing, rating curve and shift analysis, hydrograph comparisons, data-estimation procedures, data review, and other needs. 
Documentation, review, finalization, and publication of records are provided for with the electronic processing system, as well as archiving, quality assurance, and quality control.
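    The stage-discharge rating curves central to this processing system are commonly modeled as a power law, Q = C(h − e)^b, fitted by linear regression in log-log space. The sketch below is an illustrative Python example of that standard fitting step, not code from the USGS system; the function name, gage-zero offset handling, and synthetic data are assumptions for illustration.

```python
import numpy as np

def fit_rating_curve(stage, discharge, gage_zero=0.0):
    """Fit a power-law stage-discharge rating Q = C * (h - e)^b by
    linear regression in log-log space (e is the gage-zero offset)."""
    h = np.asarray(stage, float) - gage_zero
    b, log_c = np.polyfit(np.log(h), np.log(discharge), 1)
    return np.exp(log_c), b

# Noise-free synthetic data generated from Q = 2.5 * h^1.8.
h = np.array([0.5, 1.0, 2.0, 3.0, 4.0])
q = 2.5 * h ** 1.8
C, b = fit_rating_curve(h, q)
```

In practice, shift analysis amounts to adjusting the effective stage (or the offset e) over time so that measured discharges continue to plot on the rating; the same log-space fit is then reapplied to the shifted data.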

  5. Using Snow to Teach Geology.

    ERIC Educational Resources Information Center

    Roth, Charles

    1991-01-01

    A lesson plan, directed at middle school students and older, describes using snow to study the geological processes of solidification of molten material, sedimentation, and metamorphosis. Provides background information on these geological processes. (MCO)

  6. The Use of Information Technology To Enhance Learning in Geological Field Trips.

    ERIC Educational Resources Information Center

    Hesthammer, Jonny; Fossen, Haakon; Sautter, Michael; Saether, Bjorn; Johansen, Stale Emile

    2002-01-01

    Reports on the testing of two approaches to enhance learning in geological field trips through the use of technology. One approach used an advanced flight simulator and the other used digital cameras and computers. (Contains 18 references.) (DDR)

  7. Surface water records of Texas, 1964

    USGS Publications Warehouse

    ,

    1965-01-01

    The surface-water records for the 1964 water year for gaging stations, partial-record stations, miscellaneous sites, and base-flow studies within the State of Texas are given in this report. For convenience, records for a few pertinent gaging stations in bordering States are also included. The records were collected and computed by the Water Resources Division of the U.S. Geological Survey, under the direction of Trigg Twichell, district chief, Water Resources Division. Through September 30, 1960, the records of discharge and stage of streams and contents and stage of lakes or reservoirs were published in an annual series of U.S. Geological Survey water-supply papers, entitled "Surface Water Supply of the United States." Since 1951 there have been 20 volumes in the series; each volume covered an area whose boundaries coincided with those of certain natural drainage areas. The records in Texas were contained in Parts 7 and 8 of that series. Beginning with the 1961 water year, streamflow records and related data have been released by the Geological Survey in annual reports on a State-boundary basis. Distribution of these basic-data reports is limited and primarily for local needs. Records will be published in Geological Survey water-supply papers at 5-year intervals.

  8. Earth Science Education in Zimbabwe

    NASA Astrophysics Data System (ADS)

    Walsh, Kevin L.

    1999-05-01

    Zimbabwe is a mineral-rich country with a long history of Earth Science Education. The establishment of a University Geology Department in 1960 allowed the country to produce its own earth science graduates. These graduates are readily absorbed by the mining industry and few are without work. Demand for places at the University is high and entry standards reflect this. Students enter the University after GCE A levels in three science subjects and most go on to graduate. Degree programmes include B.Sc. General in Geology (plus another science), B.Sc. Honours in Geology and M.Sc. in Exploration Geology and in Geophysics. The undergraduate curriculum is broad-based and increasingly vocationally orientated. A well-equipped building caters for relatively large student numbers and also houses analytical facilities used for research and teaching. Computers are used in teaching from the first year onwards. Staff are, on average, less highly qualified than at comparable universities elsewhere, but there is an impressive research element. The Department has good links with many overseas universities, and external funding agencies play a strong supporting role. That said, financial constraints remain the greatest barrier to future development, although increasing links with the mining industry may cushion this.

  9. High-Dimensional Intrinsic Interpolation Using Gaussian Process Regression and Diffusion Maps

    DOE PAGES

    Thimmisetty, Charanraj A.; Ghanem, Roger G.; White, Joshua A.; ...

    2017-10-10

    This article considers the challenging task of estimating geologic properties of interest using a suite of proxy measurements. The current work recasts this task as a manifold learning problem. In this process, this article introduces a novel regression procedure for intrinsic variables constrained onto a manifold embedded in an ambient space. The procedure is meant to sharpen high-dimensional interpolation by inferring non-linear correlations from the data being interpolated. The proposed approach augments manifold learning procedures with a Gaussian process regression. It first identifies, using diffusion maps, a low-dimensional manifold embedded in an ambient high-dimensional space associated with the data. It relies on the diffusion distance associated with this construction to define a distance function with which the data model is equipped. This distance function is then used to compute the correlation structure of a Gaussian process that describes the statistical dependence of quantities of interest in the high-dimensional ambient space. The proposed method is applicable to arbitrarily high-dimensional data sets. Here, it is applied to subsurface characterization using a suite of well log measurements. The predictions obtained in original, principal component, and diffusion space are compared using both qualitative and quantitative metrics. Considerable improvement in the prediction of the geological structural properties is observed with the proposed method.
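    The general two-stage approach described here (a diffusion-map embedding followed by Gaussian-process regression in the embedded coordinates) can be sketched in plain numpy. This is an illustration of the technique, not the authors' implementation: the kernel bandwidth `eps`, the correlation length, the jitter term, and the toy data are all assumptions made for the example.

```python
import numpy as np

def diffusion_coords(X, eps, n_coords=2):
    """Diffusion-map embedding: Gaussian affinities, row-normalised
    Markov matrix, leading non-trivial eigenvectors as coordinates."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / eps)
    P = K / K.sum(axis=1, keepdims=True)
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Drop the trivial constant eigenvector (eigenvalue 1).
    return vecs[:, 1:n_coords + 1] * vals[1:n_coords + 1]

def gp_predict(Z_train, y_train, Z_test, length=0.2, noise=1e-6):
    """Gaussian-process regression with an RBF kernel defined on the
    embedded (diffusion) coordinates rather than the ambient space."""
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * length ** 2))
    K = rbf(Z_train, Z_train) + noise * np.eye(len(Z_train))
    return rbf(Z_test, Z_train) @ np.linalg.solve(K, y_train)

# Toy proxy data lying on a one-dimensional manifold in 3-D ambient space.
t = np.linspace(0.0, 1.0, 8)
X = np.c_[t, t ** 2, np.sin(t)]
Z = diffusion_coords(X, eps=0.5)
pred = gp_predict(Z, t, Z)
```

The point of the construction is that the kernel distances are measured in diffusion space, so the correlation structure respects the geometry of the data manifold rather than straight-line distance in the ambient space.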

  11. Seismic, side-scan survey, diving, and coring data analyzed by a Macintosh II™ computer and inexpensive software provide answers to a possible offshore extension of landslides at Palos Verdes Peninsula, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dill, R.F.; Slosson, J.E.; McEachen, D.B.

    1990-05-01

    A Macintosh II™ computer and commercially available software were used to analyze and depict the topography, construct an isopach sediment thickness map, plot core positions, and locate the geology of an offshore area facing an active landslide on the southern side of Palos Verdes Peninsula, California. Profile data from side scan sonar, 3.5 kHz, and Boomer subbottom, high-resolution seismic, diving, echo sounder traverses, and cores - all controlled with a mini Ranger II navigation system - were placed in MacGridzo™ and WingZ™ software programs. The computer-plotted data from seven sources were used to construct maps with overlays for evaluating the possibility of a shoreside landslide extending offshore. The poster session describes the offshore survey system and demonstrates the development of the computer data base, its placement into the MacGridzo™ gridding program, and transfer of gridded navigational locations to the WingZ™ data base and graphics program. Data will be manipulated to show how sea-floor features are enhanced and how isopach data were used to interpret the possibility of landslide displacement and Holocene sea level rise. The software permits rapid assessment of data using computerized overlays and a simple, inexpensive means of constructing and evaluating information in map form and the preparation of final written reports. This system could be useful in many other areas where seismic profiles, precision navigational locations, soundings, diver observations, and cores provide a great volume of information that must be compared on regional plots to develop field maps for geological evaluation and reports.

  12. Mapping three-dimensional geological features from remotely-sensed images and digital elevation models

    NASA Astrophysics Data System (ADS)

    Morris, Kevin Peter

    Accurate mapping of geological structures is important in numerous applications, ranging from mineral exploration through to hydrogeological modelling. Remotely sensed data can provide synoptic views of study areas enabling mapping of geological units within the area. Structural information may be derived from such data using standard manual photo-geologic interpretation techniques, although these are often inaccurate and incomplete. The aim of this thesis is, therefore, to compile a suite of automated and interactive computer-based analysis routines, designed to help the user map geological structure. These are examined and integrated in the context of an expert system. The data used in this study include Digital Elevation Model (DEM) and Airborne Thematic Mapper images, both with a spatial resolution of 5m, for a 5 x 5 km area surrounding Llyn Cowlyd, Snowdonia, North Wales. The geology of this area comprises folded and faulted Ordovician sediments intruded throughout by dolerite sills, providing a stringent test for the automated and semi-automated procedures. The DEM is used to highlight geomorphological features which may represent surface expressions of the sub-surface geology. The DEM is created from digitized contours, for which kriging is found to provide the best interpolation routine, based on a number of quantitative measures. Lambertian shading and the creation of slope and change of slope datasets are shown to provide the most successful enhancement of DEMs, in terms of highlighting a range of key geomorphological features. The digital image data are used to identify rock outcrops as well as lithologically controlled features in the land cover. To this end, a series of standard spectral enhancements of the images is examined. In this respect, the least correlated 3 band composite and a principal component composite are shown to give the best visual discrimination of geological and vegetation cover types. 
Automatic edge detection (followed by line thinning and extraction) and manual interpretation techniques are used to identify a set of 'geological primitives' (linear or arc features representing lithological boundaries) within these data. Inclusion of the DEM data provides the three-dimensional co-ordinates of these primitives, enabling a least-squares fit to be employed to calculate dip and strike values, based, initially, on the assumption of a simple, linearly dipping structural model. A very large number of scene 'primitives' is identified using these procedures, only some of which have geological significance. Knowledge-based rules are therefore used to identify the relevant ones. For example, rules are developed to identify lake edges, forest boundaries, forest tracks, rock-vegetation boundaries, and areas of geomorphological interest. Confidence in the geological significance of some of the geological primitives is increased where they are found independently in both the DEM and remotely sensed data. The dip and strike values derived in this way are compared to information taken from the published geological map for this area, as well as measurements taken in the field. Many results are shown to correspond closely to those taken from the map and in the field, with an error of < 1°. These data and rules are incorporated into an expert system which, initially, produces a simple model of the geological structure. The system also provides a graphical user interface for manual control and interpretation, where necessary. Although the system currently only allows a relatively simple structural model (linearly dipping with faulting), in the future it will be possible to extend the system to model more complex features, such as anticlines, synclines, thrusts, nappes, and igneous intrusions.
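    The least-squares dip-and-strike step described above can be illustrated compactly: fit a plane z = ax + by + c to the 3-D coordinates of a geological primitive, then convert the fitted gradient to dip and strike angles. This is a generic sketch of the linearly-dipping-plane model, not the thesis code; the azimuth conventions below (dip direction clockwise from north, strike 90° anticlockwise from it) are one common choice.

```python
import numpy as np

def dip_and_strike(points):
    """Least-squares plane z = a*x + b*y + c through 3-D boundary
    points; returns (dip, strike) in degrees. Dip direction is the
    down-gradient azimuth measured clockwise from north (+y); strike
    is 90 degrees anticlockwise from it (right-hand rule)."""
    pts = np.asarray(points, float)
    A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
    (a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    dip = np.degrees(np.arctan(np.hypot(a, b)))
    dip_direction = np.degrees(np.arctan2(-a, -b)) % 360.0
    strike = (dip_direction - 90.0) % 360.0
    return dip, strike

# Synthetic check: a surface dipping 30 degrees due east (strike north).
pts = [(x, y, -np.tan(np.radians(30.0)) * x)
       for x in range(3) for y in range(3)]
dip, strike = dip_and_strike(pts)
```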

  13. Airborne remote sensing for geology and the environment; present and future

    USGS Publications Warehouse

    Watson, Ken; Knepper, Daniel H.

    1994-01-01

    In 1988, a group of leading experts from government, academia, and industry attended a workshop on airborne remote sensing sponsored by the U.S. Geological Survey (USGS) and hosted by the Branch of Geophysics. The purpose of the workshop was to examine the scientific rationale for airborne remote sensing in support of government earth science in the next decade. This report has arranged the six resulting working-group reports under two main headings: (1) Geologic Remote Sensing, for the reports on geologic mapping, mineral resources, and fossil fuels and geothermal resources; and (2) Environmental Remote Sensing, for the reports on environmental geology, geologic hazards, and water resources. The intent of the workshop was to provide an evaluation of demonstrated capabilities, their direct extensions, and possible future applications, and this was the organizational format used for the geologic remote sensing reports. The working groups in environmental remote sensing chose to present their reports in a somewhat modified version of this format. A final section examines future advances and limitations in the field. There is a large, complex, and often bewildering array of remote sensing data available. Early remote sensing studies were based on data collected from airborne platforms. Much of that technology was later extended to satellites. The original 80-m-resolution Landsat Multispectral Scanner System (MSS) has now been largely superseded by the 30-m-resolution Thematic Mapper (TM) system that has additional spectral channels. The French satellite SPOT provides higher spatial resolution for channels equivalent to MSS. Low-resolution (1 km) data are available from the National Oceanic and Atmospheric Administration's AVHRR system, which acquires reflectance and day and night thermal data daily. 
Several experimental satellites have acquired limited data, and there are extensive plans for future satellites including those of Japan (JERS), Europe (ESA), Canada (Radarsat), and the United States (EOS). There are currently two national airborne remote sensing programs (photography, radar) with data archived at the USGS' EROS Data Center. Airborne broadband multispectral data (comparable to Landsat MSS and TM but involving several more channels) for limited geographic areas also are available for digital processing and analysis. Narrow-band imaging spectrometer data are available for some NASA experiment sites and can be acquired for other locations commercially. Remote sensing data and derivative images, because of the uniform spatial coverage, availability at different resolutions, and digital format, are becoming important data sets for geographic information system (GIS) analyses. Examples range from overlaying digitized geologic maps on remote sensing images and draping these over topography, to maps of mineral distribution and inferred abundance. A large variety of remote sensing data sets are available, with costs ranging from a few dollars per square mile for satellite digital data to a few hundred dollars per square mile for airborne imaging spectrometry. Computer processing and analysis costs routinely surpass these expenses because of the equipment and expertise necessary for information extraction and interpretation. Effective use requires both an understanding of the current methodology and an appreciation of the most cost-effective solution.

  14. U.S. Geological Survey groundwater toolbox, a graphical and mapping interface for analysis of hydrologic data (version 1.0): user guide for estimation of base flow, runoff, and groundwater recharge from streamflow data

    USGS Publications Warehouse

    Barlow, Paul M.; Cunningham, William L.; Zhai, Tong; Gray, Mark

    2015-01-01

    This report is a user guide for the streamflow-hydrograph analysis methods provided with version 1.0 of the U.S. Geological Survey (USGS) Groundwater Toolbox computer program. These include six hydrograph-separation methods to determine the groundwater-discharge (base-flow) and surface-runoff components of streamflow—the Base-Flow Index (BFI; Standard and Modified), HYSEP (Fixed Interval, Sliding Interval, and Local Minimum), and PART methods—and the RORA recession-curve displacement method and associated RECESS program to estimate groundwater recharge from streamflow data. The Groundwater Toolbox is a customized interface built on the nonproprietary, open source MapWindow geographic information system software. The program provides graphing, mapping, and analysis capabilities in a Microsoft Windows computing environment. In addition to these hydrograph-analysis methods, the Groundwater Toolbox allows for the retrieval of hydrologic time-series data (streamflow, groundwater levels, and precipitation) from the USGS National Water Information System, downloading of a suite of preprocessed geographic information system coverages and meteorological data from the National Oceanic and Atmospheric Administration National Climatic Data Center, and analysis of data with several preprocessing and postprocessing utilities. With its data retrieval and analysis tools, the Groundwater Toolbox provides methods to estimate many of the components of the water budget for a hydrologic basin, including precipitation; streamflow; base flow; runoff; groundwater recharge; and total, groundwater, and near-surface evapotranspiration.
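    To give a flavour of what a hydrograph-separation method does, here is a deliberately simplified sketch of the fixed-interval idea used by HYSEP: within each non-overlapping window of days, base flow is taken as the minimum observed discharge. The real HYSEP algorithm derives the window width from the drainage area (2N* days, N ≈ A^0.2) and handles partial windows and transitions more carefully; the function name and data below are illustrative.

```python
import numpy as np

def fixed_interval_baseflow(q, width):
    """Simplified HYSEP-style fixed-interval separation: within each
    non-overlapping window of `width` days, base flow is set to the
    minimum daily discharge, capped at the observed flow."""
    q = np.asarray(q, float)
    base = np.empty_like(q)
    for start in range(0, len(q), width):
        base[start:start + width] = q[start:start + width].min()
    return np.minimum(base, q)

# A small storm hydrograph: flow rises to 80 and recedes.
bf = fixed_interval_baseflow([10, 12, 30, 80, 40, 20, 15, 12, 11, 10], 5)
```

Surface runoff for each day is then simply total streamflow minus the separated base flow.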

  15. Developing Connectivist Schemas for Geological and Geomorphological Education

    NASA Astrophysics Data System (ADS)

    Whalley, B.

    2012-12-01

    Teaching geology is difficult; students need to grasp changes in time over three dimensions. Furthermore, the scales and rates of change in four dimensions may vary over several orders of magnitude. Geological explanations incorporate ideas from physics, chemistry, biology and engineering; lectures and textbooks provide a basic framework, but they need to be amplified by laboratories and fieldwork involving active student participation and engagement. Being shown named 'things' is only a start toward inculcating geological thinking, which requires both wide and focused viewpoints. Kastens and Ishikawa (2006) suggested five aspects of thinking geologically, summarised as: 1. observing, describing, recording and communicating geological entities (i.e. basic cognitive skills) 2. (mentally) manipulating these entities 3. interpreting them via causal relationships 4. predicting other aspects using the basic knowledge (to create new knowledge) 5. using cognitive strategies to develop new ways of interpreting gained knowledge. These steps can be used to follow the sequence from 'known' through 'need to know' to using knowledge to gain a better geological explanation, taken as enquiry-based or problem-solving modes of education. These follow ideas from Dewey through Sternberg's 'thinking styles' and Siemens' connectivist approaches. Implementation of this basic schema needs to be structured for students in a complex geological world in line with Edelson's (2006) 'learning for' framework. In a geomorphological setting, this has been done by showing students how to interpret a landscape (landform, section, etc.), practise their skills and thus gain confidence with a tutor at hand. A web-based device, 'Virtorial', provides scenarios for students to practise interpretation (or even to be assessed with). 
A cognitive tool is provided for landscape interpretation by division into the recognition of 'Materials' (rock, sediments, etc.), 'Processes' (slope, glacial processes, etc.) and 'Geometry' (what it looks like). These components provide basic metadata for any landform in a landscape. Thus, the recognition of a landform means much more than naming a feature; the metadata provide contexts that can be used for interpretation in the field or laboratory, individually or in discussion groups, and in distance or field learning environments.

  16. Application of thematic mapper-type data over a porphyry-molybdenum deposit in Colorado

    NASA Technical Reports Server (NTRS)

    Rickman, D. L.; Sadowski, R. M.

    1983-01-01

    The objective of the study was to evaluate the utility of thematic mapper data as a source of geologically useful information for mountainous areas of varying vegetation density. Much of the processing was done in an a priori manner, without ground-based information. This approach resulted in a successful mapping of the alteration associated with the Mt. Emmons molybdenum ore body as well as several other hydrothermal systems. Supervised classification produced a vegetation map at least as accurate as the mapping done for the environmental impact statement. Principal components were used to map zones of general, subtle alteration and to separate hematitically stained rock from staining associated with hydrothermal activity. Decorrelation color composites were found to be useful field mapping aids, easily delineating many lithologies and vegetation classes of interest. The factors restricting the interpretability and computer manipulation of the data are examined.
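    The decorrelation color composites mentioned here are produced by a decorrelation stretch: rotate the bands to principal components, equalize the component variances, and rotate back, which exaggerates the inter-band color differences. The numpy sketch below illustrates the core transform only; it whitens the bands rather than rescaling them back to their original variances for display, and the synthetic "image" is an assumption for the example.

```python
import numpy as np

def decorrelation_stretch(bands):
    """Simplified decorrelation stretch: rotate to principal
    components, equalise variances, rotate back. `bands` is an
    (n_pixels, n_bands) array; a display implementation would also
    rescale the result to the original band variances."""
    X = np.asarray(bands, float)
    mu = X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(X - mu, rowvar=False))
    T = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T  # whitening transform
    return (X - mu) @ T + mu

# Synthetic correlated 3-band "image" flattened to pixel rows.
rng = np.random.default_rng(0)
mix = np.array([[2.0, 0.5, 0.0], [0.0, 1.0, 0.3], [0.0, 0.0, 0.7]])
stretched = decorrelation_stretch(rng.normal(size=(500, 3)) @ mix)
```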

  17. PRMS-IV, the precipitation-runoff modeling system, version 4

    USGS Publications Warehouse

    Markstrom, Steven L.; Regan, R. Steve; Hay, Lauren E.; Viger, Roland J.; Webb, Richard M.; Payn, Robert A.; LaFontaine, Jacob H.

    2015-01-01

    Computer models that simulate the hydrologic cycle at a watershed scale facilitate assessment of the effects of variability in climate, biota, geology, and human activities on water availability and flow. This report describes an updated version of the Precipitation-Runoff Modeling System. The Precipitation-Runoff Modeling System is a deterministic, distributed-parameter, physical-process-based modeling system developed to evaluate the response of various combinations of climate and land use on streamflow and general watershed hydrology. Several new model components were developed, and all existing components were updated, to enhance performance and supportability. This report describes the history, application, concepts, organization, and mathematical formulation of the Precipitation-Runoff Modeling System and its model components. This updated version provides improvements in (1) system flexibility for integrated science, (2) verification of conservation of water during simulation, (3) methods for spatial distribution of climate boundary conditions, and (4) methods for simulation of soil-water flow and storage.
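    The "verification of conservation of water" improvement amounts to checking that inputs, outputs and storage changes balance at each step. PRMS's actual budget terms are far more detailed; the toy residual check below, with hypothetical annual totals, only illustrates the principle.

```python
def water_balance_error(precip, et, streamflow, storage_change):
    """Basin water-balance residual (inputs minus outputs minus change
    in storage); it should be near zero for a mass-conserving run."""
    return precip - (et + streamflow + storage_change)

# Hypothetical annual totals in millimetres over a basin.
err = water_balance_error(precip=900.0, et=500.0, streamflow=350.0,
                          storage_change=50.0)
```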

  18. Spatial database for the management of "urban geology" geothematic information: the case of Drama City, Greece

    NASA Astrophysics Data System (ADS)

    Pantelias, Eustathios; Zervakou, Alexandra D.; Tsombos, Panagiotis I.; Nikolakopoulos, Konstantinos G.

    2008-10-01

    The aggregation of population in big cities leads to the concentration of human activities, economic wealth, over-consumption of natural resources and urban growth without planning and sustainable management. As a result, urban societies are exposed to various dangers and threats with economic, social and ecological-environmental impacts on the urban surroundings. Problems associated with urban development are related to their geological conditions and those of their surroundings, e.g. flooding, land subsidence, groundwater pollution, soil contamination, earthquakes, landslides, etc. For these reasons, no sustainable urban planning can be done without geological information support. The first systematic recording, codification and documentation of "urban geology" geothematic information in Greece is implemented by the Institute of Geological and Mineral Exploration (I.G.M.E.) in the framework of the project "Collection, codification and documentation of geothematic information for urban and suburban areas in Greece - pilot applications". Through the implementation of this project, all geothematic information derived from geological mapping, geotechnical - geochemical - geophysical research and measurements in four pilot areas of Greece: Drama (northern Greece), Nafplio and Sparti (Peloponnese) and Thrakomakedones (Attica), is stored and processed in specially designed geodatabases in a GIS environment containing vector and raster data. For the specific GIS application, an ArcGIS Personal Geodatabase is used. Data is classified in geothematic layers, grouped in geothematic datasets (e.g. Topography, Geology - Tectonics, Submarine Geology, Technical Geology, Hydrogeology, Soils, Radioactive elements, etc.) and processed in order to produce multifunctional geothematic maps. All compiled data constitute the essential base for land use planning and environmental protection in specific urban areas. 
With the termination of the project, the produced geodatabase and other digital data (thematic maps, DEMs) will be available to all public- or private-sector bodies concerned with the geological environment in urban and suburban areas and charged with the protection and improvement of the natural and man-made environment.

  19. Microbial facies distribution and its geological and geochemical controls at the Hanford 300 area

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Nelson, W.; Stegen, J.; Murray, C. J.; Arntzen, E.

    2015-12-01

    Efforts have been made by various scientific disciplines to study hyporheic zones and characterize their associated processes. One way to approach the study of the hyporheic zone is to define facies, which are elements of a (hydrobio)geologic classification scheme that groups components of a complex system with high variability into a manageable set of discrete classes. In this study, we classify the hyporheic zone based on geology, geochemistry and microbiology, and seek to understand their interactive influences on the integrated biogeochemical distributions and processes. A number of measurements have been taken for 21 freeze-core samples along the Columbia River bank in the Hanford 300 Area, and unique datasets have been obtained on biomass, pH, number of microbial taxa, percentage of N/C/H/S, microbial activity parameters, as well as microbial community attributes/modules. In order to gain a complete understanding of the geological control on these variables and processes, the explanatory variables are set to include quantitative gravel/sand/mud/silt/clay percentages, statistical moments of grain size distributions, as well as geological (e.g., Folk-Wentworth) and statistical (e.g., hierarchical) clusters. The dominant factors for major microbial and geochemical variables are identified and summarized using exploratory data analysis approaches (e.g., principal component analysis, hierarchical clustering, factor analysis, multivariate analysis of variance). The feasibility of extending the facies definition and its control of microbial and geochemical properties to larger scales is discussed.
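    Of the exploratory techniques listed, principal component analysis is the workhorse for reducing a sample-by-variable table (grain-size moments, geochemical percentages, microbial counts) to a few dominant factors. The numpy sketch below shows the standard SVD route to PCA scores and explained-variance fractions; the function name and synthetic table are illustrative, not the study's data.

```python
import numpy as np

def principal_components(X, n=2):
    """PCA via SVD of the centred data matrix: returns the sample
    scores on the first n components and the fraction of total
    variance each of those components explains."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n] * s[:n]
    explained = (s ** 2) / (s ** 2).sum()
    return scores, explained[:n]

# Synthetic sample-by-variable table (e.g. grain-size metrics): the
# second column is nearly a multiple of the first, so one component
# should dominate the shared variance.
rng = np.random.default_rng(1)
t = rng.normal(size=(50, 1))
X = np.c_[t, 2.0 * t + 0.1 * rng.normal(size=(50, 1)),
          rng.normal(size=(50, 2))]
scores, explained = principal_components(X, n=2)
```

Hierarchical clustering of the score vectors (rather than the raw variables) is then a common way to define the discrete facies classes the abstract describes.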

  20. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.

    PubMed

    Menges, Achim

    2012-03-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.

  1. Simplified methods for computing total sediment discharge with the modified Einstein procedure

    USGS Publications Warehouse

    Colby, Bruce R.; Hubbell, David Wellington

    1961-01-01

    A procedure was presented in 1950 by H. A. Einstein for computing the total discharge of sediment particles of sizes that are in appreciable quantities in the stream bed. This procedure was modified by the U.S. Geological Survey and adapted to computing the total sediment discharge of a stream on the basis of samples of bed sediment, depth-integrated samples of suspended sediment, streamflow measurements, and water temperature. This paper gives simplified methods for computing total sediment discharge by the modified Einstein procedure. Each of four nomographs appreciably simplifies a major step in the computations. Within the stated limitations, use of the nomographs introduces much less error than is present in either the basic data or the theories on which the computations of total sediment discharge are based. The results are nearly as accurate mathematically as those that could be obtained from the longer and more complex arithmetic and algebraic computations of the Einstein procedure.

  2. Proceedings of GeoTech 85: Personal computers in geology conference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1985-01-01

    This book presents the papers given at a conference which considered the use of microprocessors in the exploration of petroleum and natural gas deposits. Topics covered at the conference included seismic surveys, geochemistry, expert systems, artificial intelligence, data base management systems, a portable exploration work station, open pit planning on a microcomputer, well logging, fracture analysis, production scheduling of open pit mines, resistivity logging, and coal washability.

  3. Final Report for the ZERT Project: Basic Science of Retention Issues, Risk Assessment & Measurement, Monitoring and Verification for Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spangler, Lee; Cunningham, Alfred; Lageson, David

    2011-03-31

    ZERT has made major contributions to five main areas of sequestration science: improvement of computational tools; measurement and monitoring techniques to verify storage and track migration of CO{sub 2}; development of a comprehensive performance and risk assessment framework; fundamental geophysical, geochemical and hydrological investigations of CO{sub 2} storage; and investigation of innovative, bio-based mitigation strategies.

  4. Metrics for comparing dynamic earthquake rupture simulations

    USGS Publications Warehouse

    Barall, Michael; Harris, Ruth A.

    2014-01-01

    Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools that is available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near‐fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation makes it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which these codes’ results can be directly compared. One approach for checking if dynamic rupture computer codes are working satisfactorily is to compare each code’s results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.

  5. Estimation of water table based on geomorphologic and geologic conditions using public database of geotechnical information over Japan

    NASA Astrophysics Data System (ADS)

    Koshigai, Masaru; Marui, Atsunao

    The water table provides important information for the evaluation of groundwater resources. Recently, estimation of the water table over wide areas has been required for effective evaluation of groundwater resources. However, this evaluation process meets with difficulties due to technical and economic constraints. Regression analysis predicting groundwater levels from geomorphologic and geologic conditions is considered a reliable tool for estimating the water table over a wide area. Data on groundwater levels were extracted from the public database of geotechnical information. It was observed that changes in groundwater level depend on climate conditions. It was also confirmed that groundwater levels vary according to geomorphologic and geologic conditions. The objective variable of the regression analysis was groundwater level, and the explanatory variables were elevation and a dummy variable consisting of the group number. The constructed regression formula was significant according to the determination coefficients and analysis of variance. Therefore, by combining the regression formula and a mesh map, a statistical method to estimate the water table based on geomorphologic and geologic conditions for the whole country could be established.
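    The regression described above, with groundwater level as the objective variable and elevation plus a dummy-coded group membership as explanatory variables, can be sketched with ordinary least squares. All data below are synthetic; the three-group coding and coefficient values are illustrative assumptions, not the paper's fitted model.

```python
# Illustrative OLS sketch: groundwater level ~ elevation + group dummies.
import numpy as np

rng = np.random.default_rng(1)
n = 200
elevation = rng.uniform(0, 500, n)        # m above sea level (synthetic)
group = rng.integers(0, 3, n)             # geomorphologic/geologic class

# Synthetic "truth": level tracks elevation with a per-group offset.
offsets = np.array([2.0, 5.0, 9.0])
gw_level = 0.95 * elevation - offsets[group] + rng.normal(0, 1.0, n)

# Design matrix: intercept, elevation, and dummies for groups 1 and 2
# (group 0 is the baseline, absorbed into the intercept).
D = np.column_stack([np.ones(n), elevation,
                     (group == 1).astype(float), (group == 2).astype(float)])
beta, *_ = np.linalg.lstsq(D, gw_level, rcond=None)

pred = D @ beta
r2 = 1 - np.sum((gw_level - pred) ** 2) / np.sum((gw_level - gw_level.mean()) ** 2)
print(beta, r2)
```

    The coefficient of determination (r2 here) plays the role of the determination coefficient used in the paper to judge significance of the constructed formula.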

  6. Impacts of preferential flow on coastal groundwater-surface water interactions: The heterogeneous volcanic aquifer of Hawaii

    NASA Astrophysics Data System (ADS)

    Geng, X.; Kreyns, P.; Koneshloo, M.; Michael, H. A.

    2017-12-01

    Groundwater flow and salt transport processes are important for the protection of coastal water resources and ecosystems. Geological heterogeneity has been recognized as a key factor affecting rates and patterns of groundwater flow and the evolution of subsurface salinity distributions in coastal aquifers. The hydrogeologic system of the volcanic Hawaiian Islands is characterized by lava flows that can form continuous, connected geologic structures in the subsurface. Understanding the role of geological heterogeneity in aquifer salinization and water exchange between aquifers and the ocean is essential for effective assessment and management of water resources in the Hawaiian Islands. In this study, surface-based geostatistical techniques were adopted to generate geologically realistic, statistically equivalent model realizations of the hydrogeologic system on the Big Island of Hawaii. The density-dependent groundwater flow and solute transport code SEAWAT was used to perform 3D simulations to investigate subsurface flow and salt transport through these random realizations. Flux across the aquifer-ocean interface, aquifer salinization, and groundwater flow pathways and associated transit times were quantified. Numerical simulations of groundwater pumping at various positions in the aquifers were also conducted, and the associated impacts on saltwater intrusion rates were evaluated. Results indicate the impacts of continuous geologic features on large-scale groundwater processes in coastal aquifers.

  7. The Virtual Geophysics Laboratory (VGL): Scientific Workflows Operating Across Organizations and Across Infrastructures

    NASA Astrophysics Data System (ADS)

    Cox, S. J.; Wyborn, L. A.; Fraser, R.; Rankine, T.; Woodcock, R.; Vote, J.; Evans, B.

    2012-12-01

    The Virtual Geophysics Laboratory (VGL) is a web portal that provides geoscientists with an integrated online environment that: seamlessly accesses geophysical and geoscience data services from the AuScope national geoscience information infrastructure; loosely couples these data to a variety of geoscience software tools; and provides large scale processing facilities via cloud computing. VGL is a collaboration between CSIRO, Geoscience Australia, National Computational Infrastructure, Monash University, Australian National University and the University of Queensland. The VGL provides a distributed system whereby a user can enter an online virtual laboratory to seamlessly connect to OGC web services for geoscience data. The data is supplied in open standards formats using international standards like GeoSciML. A VGL user uses a web mapping interface to discover and filter the data sources using spatial and attribute filters to define a subset. Once the data is selected, the user is not required to download it. VGL collates the service query information for later in the processing workflow, where it will be staged directly to the computing facilities. The combination of deferring data download and access to cloud computing enables VGL users to access their data at higher resolutions and to undertake larger scale inversions, more complex models and simulations than their own local computing facilities might allow. Inside the Virtual Geophysics Laboratory, the user has access to a library of existing models, complete with exemplar workflows for specific scientific problems based on those models. For example, the user can load a geological model published by Geoscience Australia, apply a basic deformation workflow provided by a CSIRO scientist, and have it run in a scientific code from Monash. Finally the user can publish these results to share with a colleague or cite in a paper.
This opens new opportunities for access and collaboration, as all the resources (models, code, data, processing) are shared in the one virtual laboratory. VGL provides end users with access to an intuitive, user-centered interface that leverages cloud storage and cloud and cluster processing from both the research communities and commercial suppliers (e.g. Amazon). As the underlying data and information services are agnostic of the scientific domain, they can support many other data types. This fundamental characteristic results in a highly reusable virtual laboratory infrastructure that could also be used for other domains, for example natural hazards, satellite processing, soil geochemistry, climate modeling, and agricultural crop modeling.

  8. Analysis of the Source Physics Experiment SPE4 Prime Using State-of-the-Art Parallel Numerical Tools.

    NASA Astrophysics Data System (ADS)

    Vorobiev, O.; Ezzedine, S. M.; Antoun, T.; Glenn, L.

    2015-12-01

    This work describes a methodology used for large-scale modeling of wave propagation from underground chemical explosions conducted in fractured granitic rock at the Nevada National Security Site (NNSS). We show that the discrete nature of rock masses as well as the spatial variability in the fabric of rock properties are very important to understanding ground motions induced by underground explosions. In order to build a credible conceptual model of the subsurface, we integrated the geological, geomechanical and geophysical characterizations conducted during recent tests at the NNSS as well as historical data from characterization during the underground nuclear tests conducted at the NNSS. Because detailed site characterization is limited, expensive and, in some instances, impossible, we have numerically investigated the effects of the characterization gaps on the overall response of the system. We performed several computational studies to identify the key geologic features specific to fractured media, mainly the joints characterized at the NNSS. We have also explored features common to both geological environments, such as saturation and topography, and assessed which characteristics most affect the ground motion in the near-field and in the far-field. Stochastic representations of these features based on the field characterizations have been implemented into LLNL's Geodyn-L hydrocode. Simulations were used to guide site characterization efforts in order to provide the essential data to the modeling community. We validate our computational results by comparing the measured and computed ground motion at various ranges for the recently executed SPE4 prime experiment. We have also conducted a comparative study between SPE4 prime and the previous experiments SPE1 and SPE3 to assess similarities and differences and draw conclusions for the design of SPE5.

  9. GO2OGS 1.0: a versatile workflow to integrate complex geological information with fault data into numerical simulation models

    NASA Astrophysics Data System (ADS)

    Fischer, T.; Naumov, D.; Sattler, S.; Kolditz, O.; Walther, M.

    2015-11-01

    We offer a versatile workflow to convert geological models built with the Paradigm™ GOCAD© (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for usage in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a way of communication between the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.
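    The target of the conversion, the VTU (Visualization Toolkit unstructured grid) format, is a plain XML container for points, cell connectivity, offsets, and cell types. The sketch below is not the GO2OGS workflow itself, only a bare-bones illustration of what a minimal ASCII .vtu file looks like for a single tetrahedron; a real GOCAD conversion would add parsing, material IDs, and fault handling.

```python
# Minimal ASCII VTU writer (illustrative only, not the paper's tool).
def write_vtu(path, points, tets):
    """points: list of (x, y, z); tets: list of 4-tuples of point indices."""
    lines = [
        '<?xml version="1.0"?>',
        '<VTKFile type="UnstructuredGrid" version="0.1" byte_order="LittleEndian">',
        '  <UnstructuredGrid>',
        f'    <Piece NumberOfPoints="{len(points)}" NumberOfCells="{len(tets)}">',
        '      <Points>',
        '        <DataArray type="Float64" NumberOfComponents="3" format="ascii">',
        *(f'          {x} {y} {z}' for x, y, z in points),
        '        </DataArray>',
        '      </Points>',
        '      <Cells>',
        '        <DataArray type="Int64" Name="connectivity" format="ascii">',
        *(f'          {a} {b} {c} {d}' for a, b, c, d in tets),
        '        </DataArray>',
        '        <DataArray type="Int64" Name="offsets" format="ascii">',
        f'          {" ".join(str(4 * (i + 1)) for i in range(len(tets)))}',
        '        </DataArray>',
        '        <DataArray type="UInt8" Name="types" format="ascii">',
        f'          {" ".join("10" for _ in tets)}',  # 10 = VTK_TETRA
        '        </DataArray>',
        '      </Cells>',
        '    </Piece>',
        '  </UnstructuredGrid>',
        '</VTKFile>',
    ]
    with open(path, "w") as f:
        f.write("\n".join(lines))

# One tetrahedron as a smoke test.
pts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
write_vtu("mesh.vtu", pts, [(0, 1, 2, 3)])
```

    A simulator such as OpenGeoSys (or any VTK-based viewer) can read a file of this shape directly, which is what makes VTU a convenient exchange format between disciplines.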

  10. Floods in Central Texas, September 7-14, 2010

    USGS Publications Warehouse

    Winters, Karl E.

    2012-01-01

    Severe flooding occurred near the Austin metropolitan area in central Texas September 7–14, 2010, because of heavy rainfall associated with Tropical Storm Hermine. The U.S. Geological Survey, in cooperation with the Upper Brushy Creek Water Control and Improvement District, determined rainfall amounts and annual exceedance probabilities for rainfall resulting in flooding in Bell, Williamson, and Travis counties in central Texas during September 2010. We documented peak streamflows and the annual exceedance probabilities for peak streamflows recorded at several streamflow-gaging stations in the study area. The 24-hour rainfall total exceeded 12 inches at some locations, with one report of 14.57 inches at Lake Georgetown. Rainfall probabilities were estimated using previously published depth-duration frequency maps for Texas. At 4 sites in Williamson County, the 24-hour rainfall had an annual exceedance probability of 0.002. Streamflow measurement data and flood-peak data from U.S. Geological Survey surface-water monitoring stations (streamflow and reservoir gaging stations) are presented, along with a comparison of September 2010 flood peaks to previous known maximums in the periods of record. Annual exceedance probabilities for peak streamflow were computed for 20 streamflow-gaging stations based on an analysis of streamflow-gaging station records. The annual exceedance probability was 0.03 for the September 2010 peak streamflow at the Geological Survey's streamflow-gaging stations 08104700 North Fork San Gabriel River near Georgetown, Texas, and 08154700 Bull Creek at Loop 360 near Austin, Texas. The annual exceedance probability was 0.02 for the peak streamflow for Geological Survey's streamflow-gaging station 08104500 Little River near Little River, Texas. 
The lack of similarity in the annual exceedance probabilities computed for precipitation and streamflow might be attributed to the small areal extent of the heaviest rainfall over these and the other gaged watersheds.
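    For intuition, an annual exceedance probability relates a peak flow to how often it is equaled or exceeded in a year. The sketch below uses the simple Weibull plotting position on a made-up record of annual peaks; the Geological Survey's published probabilities come from formal flood-frequency analysis of gaging-station records (Bulletin 17-style log-Pearson Type III fitting), so this is only an illustration of the concept.

```python
# Back-of-envelope AEP estimate via the Weibull plotting position.
import numpy as np

peaks_cfs = np.array([1200, 450, 3100, 800, 9800, 2200, 650,
                      15000, 1700, 540, 2900, 7200])   # synthetic annual peaks
n = len(peaks_cfs)

order = np.argsort(-peaks_cfs)            # rank 1 = largest peak on record
ranks = np.empty(n, dtype=int)
ranks[order] = np.arange(1, n + 1)

aep = ranks / (n + 1)                     # Weibull plotting position
return_period = 1 / aep                   # in years

for q, p in sorted(zip(peaks_cfs, aep), reverse=True)[:3]:
    print(f"{q:>6d} cfs  AEP {p:.3f}  (~{1/p:.0f}-yr flood)")
```

    An AEP of 0.02, as reported for the Little River gage, corresponds in this framing to a flood with roughly a 50-year return period.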

  11. How Geoscience Novices Reason about Temporal Duration: The Role of Spatial Thinking and Large Numbers

    ERIC Educational Resources Information Center

    Cheek, Kim A.

    2013-01-01

    Research about geologic time conceptions generally focuses on the placement of events on the geologic timescale, with few studies dealing with the duration of geologic processes or events. Those studies indicate that students often have very poor conceptions about temporal durations of geologic processes, but the reasons for that are relatively…

  12. 30 CFR 251.11 - Submission, inspection, and selection of geological data and information collected under a permit...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... geological data and information collected under a permit and processed by permittees or third parties. 251.11... GEOLOGICAL AND GEOPHYSICAL (G&G) EXPLORATIONS OF THE OUTER CONTINENTAL SHELF § 251.11 Submission, inspection, and selection of geological data and information collected under a permit and processed by permittees...

  13. Three-dimensional representations of salt-dome margins at four active strategic petroleum reserve sites.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rautman, Christopher Arthur; Stein, Joshua S.

    2003-01-01

    Existing paper-based site characterization models of salt domes at the four active U.S. Strategic Petroleum Reserve sites have been converted to digital format and visualized using modern computer software. The four sites are the Bayou Choctaw dome in Iberville Parish, Louisiana; the Big Hill dome in Jefferson County, Texas; the Bryan Mound dome in Brazoria County, Texas; and the West Hackberry dome in Cameron Parish, Louisiana. A new modeling algorithm has been developed to overcome limitations of many standard geological modeling software packages in order to deal with structurally overhanging salt margins that are typical of many salt domes. This algorithm, and the implementing computer program, make use of the existing interpretive modeling conducted manually using professional geological judgement and presented in two dimensions in the original site characterization reports as structure contour maps on the top of salt. The algorithm makes use of concepts of finite-element meshes of general engineering usage. Although the specific implementation of the algorithm described in this report and the resulting output files are tailored to the modeling and visualization software used to construct the figures contained herein, the algorithm itself is generic and other implementations and output formats are possible. The graphical visualizations of the salt domes at the four Strategic Petroleum Reserve sites are believed to be major improvements over the previously available two-dimensional representations of the domes via conventional geologic drawings (cross sections and contour maps). Additionally, the numerical mesh files produced by this modeling activity are available for import into and display by other software routines. The mesh data are not explicitly tabulated in this report; however, an electronic version in simple ASCII format is included on a PC-based compact disk.

  14. Big Data Processing for a Central Texas Groundwater Case Study

    NASA Astrophysics Data System (ADS)

    Cantu, A.; Rivera, O.; Martínez, A.; Lewis, D. H.; Gentle, J. N., Jr.; Fuentes, G.; Pierce, S. A.

    2016-12-01

    As computational methods improve, scientists are able to expand the level and scale of experimental simulation and testing that is completed for case studies. This study presents a comparative analysis of multiple models for the Barton Springs segment of the Edwards aquifer. Several numerical simulations using state-mandated MODFLOW models, run on Stampede, a high-performance computing system housed at the Texas Advanced Computing Center, were performed for multiple scenario testing. One goal of this multidisciplinary project is to visualize and compare the output data of the groundwater model using the statistical programming language R to find revealing data patterns produced by different pumping scenarios. Presenting data in a friendly post-processing format is covered in this paper. Visualization of the data and creating workflows applicable to the management of the data are tasks performed after data extraction. Resulting analyses provide an example of how supercomputing can be used to accelerate evaluation of scientific uncertainty and geological knowledge in relation to policy and management decisions. Understanding the aquifer behavior helps policy makers avoid negative impacts on endangered species and environmental services, and aids in maximizing the aquifer yield.
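    A post-processing step of the kind described, comparing simulated heads across pumping scenarios and summarizing the resulting drawdown, can be sketched as follows (in Python rather than R, with synthetic 2-D head grids standing in for MODFLOW output; the well location and the simple drawdown model are invented for illustration).

```python
# Illustrative scenario comparison on synthetic head grids.
import numpy as np

rng = np.random.default_rng(42)
nrow, ncol = 50, 60
baseline = 120.0 + rng.normal(0, 0.5, (nrow, ncol))  # heads, m: no pumping
pumped = baseline.copy()

# Superimpose an idealized cone of depression around a well at cell (25, 30).
r, c = np.indices((nrow, ncol))
dist = np.hypot(r - 25, c - 30) + 1.0
pumped -= 8.0 / dist                                  # crude drawdown model

drawdown = baseline - pumped
print(f"max drawdown {drawdown.max():.2f} m at cell "
      f"{np.unravel_index(drawdown.argmax(), drawdown.shape)}")
print(f"cells with >0.5 m drawdown: {(drawdown > 0.5).sum()}")
```

    Summaries like these (maximum drawdown, affected area per scenario) are the kind of pattern-finding output that makes large ensembles of runs digestible for management decisions.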

  15. Beyond data collection in digital mapping: interpretation, sketching and thought process elements in geological map making

    NASA Astrophysics Data System (ADS)

    Watkins, Hannah; Bond, Clare; Butler, Rob

    2016-04-01

    Geological mapping techniques have advanced significantly in recent years from paper fieldslips to Toughbook, smartphone and tablet mapping; but how do the methods used to create a geological map affect the thought processes that result in the final map interpretation? Geological maps have many key roles in the field of geosciences, including understanding geological processes and geometries in 3D, interpreting geological histories and understanding stratigraphic relationships in 2D and 3D. Here we consider the impact of the methods used to create a map on the thought processes that result in the final geological map interpretation. As mapping technology has advanced in recent years, the way in which we produce geological maps has also changed. Traditional geological mapping is undertaken using paper fieldslips, pencils and compass clinometers. The map interpretation evolves through time as data are collected. This interpretive process that results in the final geological map is often supported by recording observations, ideas and alternative geological models in a field notebook, explored with the use of sketches and evolutionary diagrams. In combination, the field map and notebook can be used to challenge the map interpretation and consider its uncertainties. These uncertainties and the balance of data to interpretation are often lost in the creation of published 'fair' copy geological maps. The advent of Toughbooks, smartphones and tablets in the production of geological maps has changed the process of map creation. Digital data collection, particularly through the use of inbuilt gyrometers in phones and tablets, has turned smartphones into geological mapping tools that can be used to collect large amounts of geological data quickly. With GPS functionality these data are also geospatially located, assuming good GPS connectivity, and can be linked to georeferenced in-field photography.
In contrast, line drawing, for example for lithological boundary interpretation and sketching, is yet to find the digital flow that is achieved with pencil on notebook page or map. Free-form integrated sketching and notebook functionality in geological mapping software packages is in its infancy. Hence, the result is a tendency for digital geological mapping to focus on the ease of data collection rather than on the thoughts and careful observations that come from notebook sketching and interpreting boundaries on a map in the field. The final digital geological map can be assessed for when and where data were recorded, but the thought processes of the mapper are less easily assessed, and the use of observations and sketching to generate ideas and interpretations may be inhibited by reliance on digital mapping methods. All mapping methods have their own distinct advantages and disadvantages, and with more recent technologies both hardware and software issues have arisen. We present field examples of conventional fieldslip mapping and compare these with more advanced technologies to highlight some of the main advantages and disadvantages of each method, and discuss where geological mapping may be going in the future.

  16. Mapping process and age of Quaternary deposits on Santa Rosa Island, Channel Islands National Park, California

    NASA Astrophysics Data System (ADS)

    Schmidt, K. M.; Minor, S. A.; Bedford, D.

    2016-12-01

    Employing a geomorphic process-age classification scheme, we mapped the Quaternary surficial geology of Santa Rosa Island (SRI) within the Channel Islands National Park. This detailed (1:12,000 scale) map represents upland erosional transport processes and alluvial, fluvial, eolian, beach, marine terrace, mass wasting, and mixed depositional processes. Mapping was motivated through an agreement with the National Park Service and is intended to aid natural resource assessments, including post-grazing disturbance recovery and identification of mass wasting and tectonic hazards. We obtained numerous detailed geologic field observations, fossils for faunal identification as age control, and materials for numeric dating. This GPS-located field information provides ground truth for delineating map units and faults using GIS-based datasets: high-resolution (sub-meter) aerial imagery, LiDAR-based DEMs and derivative raster products. Mapped geologic units denote surface processes, and Quaternary faults constrain deformation kinematics and rates, which inform models of landscape change. Significant findings include: 1) Flights of older Pleistocene (>120 ka) and possibly Pliocene marine terraces were identified beneath younger alluvial and eolian deposits at elevations as much as 275 m above modern sea level. Such elevated terraces suggest that SRI was a smaller, more submerged island in the late Neogene and (or) early Pleistocene prior to tectonic uplift. 2) Structural and geomorphic observations made along the potentially seismogenic SRI fault indicate a protracted slip history during the late Neogene and Quaternary involving early normal slip, later strike slip, and recent reverse slip. These changes in slip mode explain a marked contrast in island physiography across the fault. 3) Many of the steeper slopes are dramatically stripped of regolith, with exposed bedrock and deeply incised gullies, presumably due to effects related to past grazing practices.
4) Surface water presence is spatially discontinuous and correlated with major fault traces and geologic unit boundaries.

  17. The role of the U.S. Geological Survey in the lithium industry

    USGS Publications Warehouse

    Vine, J.D.

    1978-01-01

    The U.S. Geological Survey has responsibility in the U.S. Department of the Interior to assess the nation's energy and mineral resources. The evaluation of reserves and resources of a commodity such as lithium should be a continuing process in the light of advancing technology and ever-growing knowledge of its geologic occurrence and geochemical behavior. Although reserves of lithium vary with market demand because of the investment required to find, develop, and appraise an ore body, total resources are a function of the geologic occurrence and geochemical behavior of lithium. By studying known deposits and publishing data on their origin and occurrence, the U.S. Geological Survey can aid in the discovery of new deposits and improve the resource base. Resource data are used both by the government and the private sector. Government funding for research on energy-related technologies such as electric vehicle batteries and fusion power requires assurance that there will be enough lithium available in time for commercialization. Questions of availability for all mineral commodities must be answered by the U.S. Geological Survey so that intelligent decisions can be made. © 1978.

  18. Spatial association between dissection density and environmental factors over the entire conterminous United States

    NASA Astrophysics Data System (ADS)

    Luo, Wei; Jasiewicz, Jaroslaw; Stepinski, Tomasz; Wang, Jinfeng; Xu, Chengdong; Cang, Xuezhi

    2016-01-01

    Previous studies of land dissection density (D) often find contradictory results regarding factors controlling its spatial variation. We hypothesize that the dominant controlling factors (and the interactions between them) vary from region to region due to differences in each region's local characteristics and geologic history. We test this hypothesis by applying a geographical detector method to eight physiographic divisions of the conterminous United States and identify the dominant factor(s) in each. The geographical detector method computes the power of determinant (q) that quantitatively measures the affinity between the factor considered and D. Results show that the factor (or factor combination) with the largest q value is different for physiographic regions with different characteristics and geologic histories. For example, lithology dominates in mountainous regions, curvature dominates in plains, and glaciation dominates in previously glaciated areas. The geographical detector method offers an objective framework for revealing factors controlling Earth surface processes.
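    The power of determinant q measures how much of the variance of D is explained by stratifying the study area on a candidate factor: q = 1 − (Σ_h N_h σ_h²)/(N σ²), where h indexes the factor's strata. A minimal sketch, with synthetic dissection-density values and made-up strata:

```python
# Geographical-detector power of determinant q on synthetic data.
import numpy as np

def q_statistic(values, strata):
    values, strata = np.asarray(values, float), np.asarray(strata)
    n, var = len(values), values.var()          # population variance
    within = sum(len(v := values[strata == h]) * v.var()
                 for h in np.unique(strata))
    return 1.0 - within / (n * var)

# A factor whose classes separate D well scores near 1; a useless one near 0.
d = np.array([0.1, 0.2, 0.15, 0.9, 1.0, 0.95])   # synthetic dissection density
good_factor = np.array([0, 0, 0, 1, 1, 1])        # matches the two regimes
poor_factor = np.array([0, 1, 0, 1, 0, 1])        # cuts across them

print(q_statistic(d, good_factor), q_statistic(d, poor_factor))
```

    Comparing q across candidate factors (lithology, curvature, glaciation) per physiographic division is what lets the method rank controls objectively, as the abstract describes.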

  19. Interdisciplinary applications and interpretations of ERTS data within the Susquehanna River Basin (resource inventory, land use, and pollution)

    NASA Technical Reports Server (NTRS)

    Mcmurtry, G. J.; Petersen, G. W. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. An interdisciplinary group at Penn State University is analyzing ERTS-1 data. The geographical area of interest is that of the Susquehanna River Basin in Pennsylvania. The objectives of the work have been to ascertain the usefulness of ERTS-1 data in the areas of natural resources and land use inventory, geology and hydrology, and environmental quality. Specific results include a study of land use in the Harrisburg area, discrimination between types of forest resources and vegetation, detection of previously unknown geologic faults and correlation of these with known mineral deposits and ground water, mapping of mine spoils in the anthracite region of eastern Pennsylvania, and mapping of strip mines and acid mine drainage in central Pennsylvania. Both photointerpretive techniques and automatic computer processing methods have been developed and used, separately and in a combined approach.

  20. The Role of Geologic Mapping in NASA PDSI Planning

    NASA Astrophysics Data System (ADS)

    Williams, D. A.; Skinner, J. A.; Radebaugh, J.

    2017-12-01

    Geologic mapping is an investigative process designed to derive the geologic history of planetary objects at local, regional, hemispheric or global scales. Geologic maps are critical products that aid future exploration by robotic spacecraft or human missions, support resource exploration, and provide context for and help guide scientific discovery. Creation of these tools, however, can be challenging in that, relative to their terrestrial counterparts, non-terrestrial planetary geologic maps lack expansive field-based observations. They rely, instead, on integrating diverse data types with a range of spatial scales and areal coverage. These facilitate establishment of geomorphic and geologic context but are generally limited with respect to identifying outcrop-scale textural details and resolving temporal and spatial changes in depositional environments. As a result, planetary maps should be prepared with clearly defined contact and unit descriptions as well as a range of potential interpretations. Today geologic maps can be made from images obtained during the traverses of the Mars rovers, and for every new planetary object visited by NASA orbital or flyby spacecraft (e.g., Vesta, Ceres, Titan, Enceladus, Pluto). As Solar System Exploration develops and as NASA prepares to send astronauts back to the Moon and on to Mars, the importance of geologic mapping will increase. In this presentation, we will discuss the past role of geologic mapping in NASA's planetary science activities and our thoughts on the role geologic mapping will have in exploration in the coming decades. 
Challenges that planetary mapping must address include, among others: 1) determining the geologic framework of all Solar System bodies through the systematic development of geologic maps at appropriate scales; 2) developing digital Geographic Information Systems (GIS)-based mapping techniques and standards to assist with communicating map information to the scientific community and public; 3) developing public awareness of the role and application of geologic map information in resolving national issues relevant to planetary science and eventual off-planet resource assessments; and 4) using topical science to drive mapping in areas likely to be determined vital to the welfare of endeavors related to planetary science and exploration.

  1. A review of applications to constrain pumping test responses to improve on geological description and uncertainty

    NASA Astrophysics Data System (ADS)

    Raghavan, Rajagopal

    2004-12-01

    This review examines the single-phase flow of fluids to wells in heterogeneous porous media, explores procedures to evaluate pumping-test or pressure-response curves, and considers how these curves may be used to improve descriptions of reservoir properties obtained from geology, geophysics, core analysis, outcrop measurements, and rock physics. We begin our discussion with a summary of the classical attempts to handle the issue of heterogeneity in well test analysis. We then review more recent advances concerning the evaluation of conductivity or permeability in terms of statistical variables and touch on perturbation techniques. Our current view of addressing the issue of heterogeneity by pumping tests may be summarized as follows. We assume a three-dimensional array (ordered set) of values for the properties of the porous medium as a function of the coordinates, obtained as a result of measurements and interpretations. We presume that this array of values contains all relevant information available from prior geological and geophysical interpretations, core and outcrop measurements, and rock physics. These arrays consist of several million values of properties, and the information available is usually on a very fine scale (often <0.5 m in the vertical direction); for convenience, we refer to these as cell values. The properties are assumed to be constant over the volume of each of these cells; that is, the support volume is the cell volume, and the cell volumes define the geologic scale. In this view it is implicit that small-scale permeability affects the movement of fluids. Although more information than porosity is available, we refer to this system as a "porosity cube."
Because routine computations on a fine-scale model are not economically feasible even with modern resources, we discuss matters relating to the aggregation and scale-up of the fine-scale model from the perspective of testing and show that specific details need to be addressed. The focus is on single-phase flow. Addressing the issue of scale-up also permits us to comment on the application of the classical or analytical solutions to heterogeneous systems. The final part of the discussion outlines the inversion process and the adjustment of cell values to match observed performance. Because the computational scale and the scale of the porosity cube are different, we recommend that the inversion process incorporate adjustments at the fine scale. In this view the scale-up process becomes a part of the inversion algorithm.
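The aggregation step discussed above has classical single-phase bounds that are standard in upscaling theory (and not specific to this review): the harmonic mean of the cell permeabilities (flow in series) bounds the effective block value from below, the arithmetic mean (flow in parallel) from above, and the geometric mean is a common estimate for log-normally distributed, weakly correlated fields. A minimal sketch:

```python
import math

def upscale_bounds(perms):
    """Classical bounds for the upscaled single-phase permeability of a
    block of fine-scale cell values:
    harmonic mean <= effective k <= arithmetic mean."""
    n = len(perms)
    arith = sum(perms) / n
    harm = n / sum(1.0 / k for k in perms)
    geom = math.exp(sum(math.log(k) for k in perms) / n)
    return harm, geom, arith
```

Any candidate coarse-cell value falling outside these bounds signals an error in the aggregation.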

  2. Computational Modeling of the Geologic Sequestration of Carbon Dioxide

    EPA Science Inventory

    Geologic sequestration of CO2 is a component of C capture and storage (CCS), an emerging technology for reducing CO2 emissions to the atmosphere, and involves injection of captured CO2 into deep subsurface formations. Similar to the injection of hazardous wastes, before injection...

  3. Department-Generated Microcomputer Software.

    ERIC Educational Resources Information Center

    Mantei, Erwin J.

    1986-01-01

    Explains how self-produced software can be used to perform rapid number analysis or number-crunching duties in geology classes. Reviews programs in mineralogy and petrology and identifies areas in geology where computers can be used effectively. Discusses the advantages and benefits of integrating department-generated software into a geology…

  4. From spatially variable streamflow to distributed hydrological models: Analysis of key modeling decisions

    NASA Astrophysics Data System (ADS)

    Fenicia, Fabrizio; Kavetski, Dmitri; Savenije, Hubert H. G.; Pfister, Laurent

    2016-02-01

    This paper explores the development and application of distributed hydrological models, focusing on the key decisions of how to discretize the landscape, which model structures to use in each landscape element, and how to link model parameters across multiple landscape elements. The case study considers the Attert catchment in Luxembourg—a 300 km² mesoscale catchment with 10 nested subcatchments that exhibit clearly different streamflow dynamics. The research questions are investigated using conceptual models applied at hydrologic response unit (HRU) scales (1-4 HRUs) on 6-hourly time steps. Multiple model structures are hypothesized and implemented using the SUPERFLEX framework. Following calibration, space/time model transferability is tested using a split-sample approach, with evaluation criteria including streamflow prediction error metrics and hydrological signatures. Our results suggest that: (1) models using geology-based HRUs are more robust and capture the spatial variability of streamflow time series and signatures better than models using topography-based HRUs; this finding supports the hypothesis that, in the Attert, geology exerts a stronger control than topography on streamflow generation, (2) streamflow dynamics of different HRUs can be represented using distinct and remarkably simple model structures, which can be interpreted in terms of the perceived dominant hydrologic processes in each geology type, and (3) the same maximum root zone storage can be used across the three dominant geological units with no loss in model transferability; this finding suggests that the partitioning of water between streamflow and evaporation in the study area is largely independent of geology and can be used to improve model parsimony.
The modeling methodology introduced in this study is general and can be used to advance our broader understanding and prediction of hydrological behavior, including the landscape characteristics that control hydrologic response, the dominant processes associated with different landscape types, and the spatial relations of catchment processes. This article was corrected on 14 MAR 2016. See the end of the full text for details.
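The idea of assigning a simple conceptual structure to each HRU can be illustrated with a single linear-reservoir bucket. This is a generic sketch, not one of the actual SUPERFLEX model structures, and all parameter names and values are illustrative:

```python
def linear_reservoir(precip, pet, k=0.1, smax=50.0, s0=0.0):
    """Minimal conceptual HRU: one storage with linear outflow.
    Evaporation is limited by available storage; water above smax
    spills directly to runoff. Units: mm per time step."""
    s, q_out = s0, []
    for p, e in zip(precip, pet):
        s += p
        s -= min(e, s)            # actual evaporation
        spill = max(0.0, s - smax)
        s -= spill
        q = k * s                 # linear outflow
        s -= q
        q_out.append(q + spill)
    return q_out
```

Distinct HRU behaviors (e.g., flashy schist versus damped marl response) would be expressed by swapping this structure or its parameters per landscape element.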

  5. Catalog of Computer Programs Used in Undergraduate Geological Education.

    ERIC Educational Resources Information Center

    Burger, H. Robert

    1983-01-01

    Provides list of mineralogy, petrology, and geochemistry computer programs. Each entry includes a brief description, program name and language, availability of program listing, and source and/or reference. (JN)

  6. Titan's Xanadu region: Geomorphology and formation scenario

    NASA Astrophysics Data System (ADS)

    Langhans, Mirjam; Lunine, Jonathan I.; Mitri, Giuseppe

    2013-04-01

    Based on comprehensive mapping of the region, the recent theories of Xanadu's origin are examined and a chronology of geologic processes is proposed. The geologic history of Titan's Xanadu region is different from that of the other surface units on Saturn's moon. A previously proposed origin of western Xanadu from a giant impact in the early history of the moon is difficult to confirm given the scarcity of morphologic indications of an impact basin. The basic topographic structure of the landscape is controlled by tectonic processes that date back to the early history of Titan. More recently, the surface was intensely reworked and resurfaced by fluvial processes, which seem to have leveled out and compensated for height differences. Although the surface age seems young at first glance, the underlying processes that created this surface and the topographic structure appear to be ancient.

  7. Computer Program for Point Location And Calculation of ERror (PLACER)

    USGS Publications Warehouse

    Granato, Gregory E.

    1999-01-01

    A program designed for point location and calculation of error (PLACER) was developed as part of the Quality Assurance Program of the Federal Highway Administration/U.S. Geological Survey (USGS) National Data and Methodology Synthesis (NDAMS) review process. The program provides a standard method to derive study-site locations from site maps in highway-runoff, urban-runoff, and other research reports. This report provides a guide for using PLACER, documents methods used to estimate study-site locations, documents the NDAMS Study-Site Locator Form, and documents the FORTRAN code used to implement the method. PLACER is a simple program that calculates the latitude and longitude coordinates of one or more study sites plotted on a published map and estimates the uncertainty of these calculated coordinates. PLACER calculates the latitude and longitude of each study site by interpolating between the coordinates of known features and the locations of study sites using any consistent, linear, user-defined coordinate system. This program will read data entered from the computer keyboard and(or) from a formatted text file, and will write the results to the computer screen and to a text file. PLACER is readily transferable to different computers and operating systems with few (if any) modifications because it is written in standard FORTRAN. PLACER can be used to calculate study site locations in latitude and longitude, using known map coordinates or features that are identifiable in geographic information databases such as the USGS Geographic Names Information System, which is available on the World Wide Web.
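The general approach PLACER describes can be sketched as follows. This is a hypothetical Python illustration, not the USGS FORTRAN code: it linearly interpolates between two reference features in a user-defined map coordinate system, assumes the map axes align with longitude (x) and latitude (y), and propagates a plotting uncertainty into degrees:

```python
def interpolate_site(x, y, ref1, ref2, plot_err=0.0):
    """Estimate a site's (lat, lon) from its map coordinates (x, y),
    given two reference features as (x, y, lon, lat) tuples.
    plot_err is the plotting uncertainty in map units; the returned
    pair is its equivalent in degrees along each axis."""
    x1, y1, lon1, lat1 = ref1
    x2, y2, lon2, lat2 = ref2
    dlon_dx = (lon2 - lon1) / (x2 - x1)   # degrees per map unit
    dlat_dy = (lat2 - lat1) / (y2 - y1)
    lon = lon1 + (x - x1) * dlon_dx
    lat = lat1 + (y - y1) * dlat_dy
    return lat, lon, (abs(plot_err * dlat_dy), abs(plot_err * dlon_dx))
```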

  8. Groundwater in geologic processes, 2nd edition

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Sanford, Ward E.; Neuzil, Christopher E.

    2006-01-01

    Interest in the role of Groundwater in Geologic Processes has increased steadily over the past few decades. Hydrogeologists and geologists are now actively exploring the role of groundwater and other subsurface fluids in such fundamental geologic processes as crustal heat transfer, ore deposition, hydrocarbon migration, earthquakes, tectonic deformation, diagenesis, and metamorphism. Groundwater in Geologic Processes is the first comprehensive treatment of this body of inquiry. Chapters 1 to 4 develop the basic theories of groundwater motion, hydromechanics, solute transport, and heat transport. Chapter 5 applies these theories to regional groundwater flow systems in a generic sense, and Chapters 6 to 13 focus on particular geologic processes and environments. Relative to the first edition of Groundwater in Geologic Processes, this second edition includes a much more comprehensive treatment of hydromechanics (the coupling of groundwater flow and deformation). It also includes new chapters on "compaction and diagenesis," "metamorphism," and "subsea hydrogeology." Finally, it takes advantage of the substantial body of published research that has appeared since the first edition in 1998. The systematic presentation of theory and application, and the problem sets that conclude each chapter, make this book ideal for undergraduate- and graduate-level geology courses (assuming that the students have some background in calculus and introductory chemistry). It also serves as an invaluable reference for researchers and other professionals in the field.

  9. 3D numerical modelling of the thermal state of deep geological nuclear waste repositories

    NASA Astrophysics Data System (ADS)

    Butov, R. A.; Drobyshevsky, N. I.; Moiseenko, E. V.; Tokarev, Yu. N.

    2017-09-01

    One of the important aspects of high-level radioactive waste (HLW) disposal in deep geological repositories is ensuring the integrity of the engineered barriers, which is, among other phenomena, considerably influenced by thermal loads. As the HLW produce a significant amount of heat, the design of the repository should maintain the balance between the cost-effectiveness of the construction and the sufficiency of the safety margins, including those imposed on the thermal conditions of the barriers. The 3D finite-element computer code FENIA was developed as a tool for simulation of thermal processes in deep geological repositories. Models of mechanical phenomena and groundwater hydraulics will be added later, resulting in a fully coupled thermo-hydro-mechanical (THM) solution. Long-term simulations of the thermal state were performed for two possible layouts of the repository. One was based on the proposed design of the Russian repository, and the other features a larger HLW amount within the same space. The obtained results describe the spatial and temporal evolution of the temperature field inside the repository and in the surrounding rock for 3500 years. They show that practically all generated heat is ultimately absorbed by the host rock without any significant temperature increase. Over shorter time spans, however, the temperature maximum exceeds 100 °C even for the smaller HLW amount, and for the larger amount the local temperature remains above 100 °C for a considerable time. Thus, substantiating the long-term stability of the repository would require an extensive study of the materials' properties and behaviour in order to remove excessive conservatism from the simulations and to reduce the uncertainty of the input data.
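The kind of long-term thermal evolution FENIA computes can be caricatured in one dimension: explicit finite-difference conduction with an exponentially decaying heat flux at the canister wall and a fixed far-field temperature. This is a hypothetical 1-D stand-in for the coupled 3-D simulations, and every parameter value here is illustrative, not taken from the study:

```python
import math

def wall_temperature_history(nx=60, dx=1.0, dt=3600.0, years=1,
                             alpha=1.0e-6, q0=1.0e-3, half_life_yr=30.0):
    """Explicit FTCS conduction in 1-D. T[i] is the temperature rise
    above ambient at distance i*dx (m) from the canister wall; q0 is
    the initial wall heat flux divided by rock conductivity (K/m)."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable"
    T = [0.0] * nx
    steps = int(years * 365 * 24 * 3600 / dt)
    for n in range(steps):
        t_yr = n * dt / (365.0 * 24 * 3600)
        q = q0 * math.exp(-math.log(2.0) * t_yr / half_life_yr)
        Tn = T[:]
        for i in range(1, nx - 1):
            Tn[i] = T[i] + r * (T[i+1] - 2.0*T[i] + T[i-1])
        Tn[0] = Tn[1] + q * dx    # decaying heat-flux boundary
        Tn[-1] = 0.0              # far field held at ambient
        T = Tn
    return T
```

The design trade-off discussed in the abstract shows up directly: a larger q0 (more HLW per unit space) raises the near-wall peak, while the decaying source lets the host rock ultimately absorb the heat.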

  10. U.S. Geological Survey Groundwater Modeling Software: Making Sense of a Complex Natural Resource

    USGS Publications Warehouse

    Provost, Alden M.; Reilly, Thomas E.; Harbaugh, Arlen W.; Pollock, David W.

    2009-01-01

    Computer models of groundwater systems simulate the flow of groundwater, including water levels, and the transport of chemical constituents and thermal energy. Groundwater models afford hydrologists a framework on which to organize their knowledge and understanding of groundwater systems, and they provide insights water-resources managers need to plan effectively for future water demands. Building on decades of experience, the U.S. Geological Survey (USGS) continues to lead in the development and application of computer software that allows groundwater models to address scientific and management questions of increasing complexity.

  11. Information management and analysis system for groundwater data in Thailand

    NASA Astrophysics Data System (ADS)

    Gill, D.; Luckananurung, P.

    1992-01-01

    The Ground Water Division of the Thai Department of Mineral Resources maintains a large archive of groundwater data with information on some 50,000 water wells. Each well file contains information on well location, well completion, borehole geology, water levels, water quality, and pumping tests. In order to enable efficient use of this information, a computer-based system for information management and analysis was created. The project was sponsored by the United Nations Development Program and the Thai Department of Mineral Resources. The system was designed to serve users who lack prior training in automated data processing. Access is through a friendly user/system dialogue. Tasks are segmented into a number of logical steps, each of which is managed by a separate screen. Selective retrieval is possible by four different methods of area definition and by compliance with user-specified constraints on any combination of database variables. The main types of outputs are: (1) files of retrieved data, screened according to users' specifications; (2) an assortment of pre-formatted reports; (3) computed geochemical parameters and various diagrams of water chemistry derived therefrom; (4) bivariate scatter diagrams and linear regression analysis; (5) posting of data and computed results on maps; and (6) hydraulic aquifer characteristics as computed from pumping tests. Data are entered directly from formatted screens. Most records can be copied directly from hand-written documents. The database-management program performs data integrity checks in real time, enabling corrections at the time of input.
The system software can be grouped into: (1) database administration and maintenance—these functions are carried out by the SIR/DBMS software package; (2) user communication interface for task definition and execution control—the interface is written in the operating system command language (VMS/DCL) and in FORTRAN 77; and (3) scientific data-processing programs, written in FORTRAN 77. The system was implemented on a DEC MicroVAX II computer.

  12. Geologic interpretation and multibeam bathymetry of the sea floor in the vicinity of the Race, eastern Long Island Sound

    USGS Publications Warehouse

    Poppe, L.J.; DiGiacomo-Cohen, M. L.; Doran, E.F.; Smith, S.M.; Stewart, H.F.; Forfinski, N.A.

    2007-01-01

    Digital terrain models (DTMs) produced from multibeam bathymetric data provide valuable base maps for marine geological interpretations (Todd and others, 1999; Mosher and Thomson, 2002; ten Brink and others, 2004; Poppe and others, 2006a, b, c, d). These maps help define the geological variability of the sea floor (one of the primary controls of benthic habitat diversity), improve our understanding of the processes that control the distribution and transport of bottom sediments and the distribution of benthic habitats and associated infaunal community structures, and provide a detailed framework for future research, monitoring, and management activities. The bathymetric survey interpreted herein (National Oceanic and Atmospheric Administration (NOAA) survey H11250) covers roughly 94 km² of sea floor in an area where a depression along the Orient Point-Fishers Island segment of the Harbor Hill-Roanoke Point-Charlestown Moraine forms the Race, the eastern opening to Long Island Sound. The Race also divides easternmost Long Island Sound from northwestern Block Island Sound (fig. 1). This bathymetry has been examined in relation to seismic reflection data collected concurrently, as well as archived seismic profiles acquired as part of a long-standing geologic mapping partnership between the State of Connecticut and the U.S. Geological Survey (USGS). The objective of this work was to use these acoustic data sets to interpret geomorphological attributes of the sea floor, and to use these interpretations to better understand the Quaternary geologic history and modern sedimentary processes.

  13. Pore-scale modeling of capillary trapping in water-wet porous media: A new cooperative pore-body filling model

    NASA Astrophysics Data System (ADS)

    Ruspini, L. C.; Farokhpoor, R.; Øren, P. E.

    2017-10-01

    We present a pore-network model study of capillary trapping in water-wet porous media. The amount and distribution of trapped non-wetting phase is determined by the competition between two trapping mechanisms - snap-off and cooperative pore-body filling. We develop a new model to describe the pore-body filling mechanism in geologically realistic pore-networks. The model accounts for the geometrical characteristics of the pore, the spatial location of the connecting throats and the local fluid topology at the time of the displacement. We validate the model by comparing computed capillary trapping curves with published data for four different water-wet rocks. Computations are performed on pore-networks extracted from micro-CT images and process-based reconstructions of the actual rocks used in the experiments. Compared with commonly used stochastic models, the new model describes more accurately the experimental measurements, especially for well connected porous systems where trapping is controlled by subtleties of the pore structure. The new model successfully predicts relative permeabilities and residual saturation for Bentheimer sandstone using in-situ measured contact angles as input to the simulations. The simulated trapped cluster size distributions are compared with predictions from percolation theory.

  14. TopoLens: Building a cyberGIS community data service for enhancing the usability of high-resolution National Topographic datasets

    USGS Publications Warehouse

    Hu, Hao; Hong, Xingchen; Terstriep, Jeff; Liu, Yan; Finn, Michael P.; Rush, Johnathan; Wendel, Jeffrey; Wang, Shaowen

    2016-01-01

    Geospatial data, often embedded with geographic references, are important to many application and science domains, and represent a major type of big data. The increased volume and diversity of geospatial data have caused serious usability issues for researchers in various scientific domains, which call for innovative cyberGIS solutions. To address these issues, this paper describes a cyberGIS community data service framework to facilitate geospatial big data access, processing, and sharing based on a hybrid supercomputer architecture. Through the collaboration between the CyberGIS Center at the University of Illinois at Urbana-Champaign (UIUC) and the U.S. Geological Survey (USGS), a community data service for accessing, customizing, and sharing digital elevation model (DEM) and its derived datasets from the 10-meter national elevation dataset, namely TopoLens, is created to demonstrate the workflow integration of geospatial big data sources, computation, analysis needed for customizing the original dataset for end user needs, and a friendly online user environment. TopoLens provides online access to precomputed and on-demand computed high-resolution elevation data by exploiting the ROGER supercomputer. The usability of this prototype service has been acknowledged in community evaluation.

  15. Mars: the evolutionary history of the northern lowlands based on crater counting and geologic mapping

    USGS Publications Warehouse

    Werner, S.C.; Tanaka, K.L.; Skinner, J.A.

    2011-01-01

    The geologic history of planetary surfaces is most effectively determined by joining geologic mapping and crater counting, which provides an iterative, qualitative and quantitative method for defining relative ages and absolute model ages. Based on this approach, we present spatial and temporal details regarding the evolution of the Martian northern plains and surrounding regions. The highland–lowland boundary (HLB) formed during the pre-Noachian and was subsequently modified through various processes. The Nepenthes Mensae unit, along the northern margins of the cratered highlands, was formed by HLB scarp-erosion, deposition of sedimentary and volcanic materials, and dissection by surface runoff between 3.81 and 3.65 Ga. Ages for giant polygons in Utopia and Acidalia Planitiae are ~ 3.75 Ga and likely reflect the age of buried basement rocks. These buried lowland surfaces are comparable in age to those located closer to the HLB, where a much thinner, post-HLB deposit is mapped. The emplacement of the most extensive lowland surfaces ended between 3.75 and 3.4 Ga, based on densities of craters generally >3 km in diameter. Results from the polygonal terrain support the existence of a major lowland depocenter shortly after the pre-Noachian formation of the northern lowlands. In general, northern plains surfaces show gradually younger ages at lower elevations, consistent with local- to regional-scale unit emplacement and resurfacing between 3.6 and 2.6 Ga. Elevation levels and morphology are not necessarily related, and variations in ages within the mapped units are found, especially in units formed and modified by multiple geological processes. Regardless, most of the youngest units in the northern lowlands are considered to be lavas, polar ice, or thick mantle deposits, arguing against the ocean theory during the Amazonian Period (younger than about 3.15 Ga).
All ages measured in the closest vicinity of the steep dichotomy escarpment are also 3.7 Ga or older. The formation ages of volcanic flanks at the HLB (e.g., Alba Mons, 3.6–3.4 Ga, and the last fan at Apollinaris Mons, 3.71 Ga) may provide additional temporal constraints on the possible existence of any kind of Martian ocean before about 3.7 Ga. This appears to reflect the termination of a large-scale, precipitation-based hydrological cycle and of major geologic processes related to such cycling.
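The step that converts a measured crater density into an absolute model age can be sketched with a chronology function of the commonly used Neukum-Ivanov form for Mars. The coefficients below are approximate literature values quoted for illustration only, and the inversion is a simple bisection:

```python
import math

def n1_density(t_ga, a=2.68e-14, b=6.93, c=4.13e-4):
    """Cumulative density of craters >= 1 km (per km^2) for a surface
    of age t_ga (Ga), in the Neukum/Ivanov chronology form:
    an exponential early-bombardment term plus a linear modern term."""
    return a * (math.exp(b * t_ga) - 1.0) + c * t_ga

def model_age(n1, lo=0.0, hi=4.6):
    """Invert the (monotonic) chronology by bisection: density -> age."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if n1_density(mid) < n1:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Because the exponential term flattens the curve before ~3.5 Ga, small counting errors on ancient surfaces translate into small age errors, while young surfaces on the linear branch are more sensitive, which is why the mapped young lowland units carry wider model-age uncertainty.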

  16. Impact of Geological Changes on Regional and Global Economies

    NASA Astrophysics Data System (ADS)

    Tatiana, Skufina; Peter, Skuf'in; Vera, Samarina; Taisiya, Shatalova; Baranov, Sergey

    2017-04-01

    Periods of geological change such as the supercontinent cycle (300-500 million years), Wilson cycles (300-900 million years), the magmatic-tectonic cycle (150-200 million years), and cycles with shorter periods (22, 100, 1000 years) lead to a basic contradiction that hinders the development of a methodology for studying the impact of geological changes on the global and regional economies. This contradiction stems from differences between the theoretical and methodological frameworks of Earth science and economics, such as different time scales and the limited accuracy with which geological changes can be predicted. At present, geological models cannot accurately estimate when and where geological changes (strong earthquakes, volcanic eruptions) will occur; only the locations of future (though not necessarily the next) catastrophic events are known. Thus, it is impossible to use periodicity alone to estimate either geological changes or their consequences. Taking these factors into account, we propose a collection of concepts for estimating the impact of possible geological changes on regional and global economies. We illustrate our approach with the example of the impact of the Tohoku earthquake and tsunami of March 2011 on regional and global economies. Based on this example, we conclude that globalization processes amplify the impact of geological changes at both regional and global levels. The research is supported by the Russian Foundation for Basic Research (Projects No. 16-06-00056, 16-32-00019, 16-05-00263A).

  17. SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series

    USGS Publications Warehouse

    Kiang, Julie E.; Flynn, Kate; Zhai, Tong; Hummel, Paul; Granato, Gregory

    2018-03-07

    This report is a user guide for the low-flow analysis methods provided with version 1.0 of the Surface Water Toolbox (SWToolbox) computer program. The software combines functionality from two software programs—U.S. Geological Survey (USGS) SWSTAT and U.S. Environmental Protection Agency (EPA) DFLOW. Both of these programs have been used primarily for computation of critical low-flow statistics. The main analysis methods are the computation of hydrologic frequency statistics such as the 7-day minimum flow that occurs on average only once every 10 years (7Q10), computation of design flows including biologically based flows, and computation of flow-duration curves and duration hydrographs. Other annual, monthly, and seasonal statistics can also be computed. The interface facilitates retrieval of streamflow discharge data from the USGS National Water Information System and outputs text reports for a record of the analysis. Tools for graphing data and screening tests are available to assist the analyst in conducting the analysis.
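The 7Q10 statistic mentioned above can be approximated empirically: compute each year's minimum 7-day average flow, then take the value with a 10-year recurrence, i.e. the 0.1 non-exceedance quantile of the annual series. SWToolbox itself typically fits frequency distributions (such as log-Pearson Type III) rather than using raw quantiles; the sketch below is a simpler plotting-position approximation:

```python
def seven_day_min(flows):
    """Annual minimum of the 7-day moving-average flow."""
    return min(sum(flows[i:i+7]) / 7.0 for i in range(len(flows) - 6))

def empirical_7q10(annual_minima):
    """Rough 7Q10: the annual 7-day minimum not exceeded, on average,
    in 1 year out of 10 (0.1 non-exceedance probability), using
    Weibull plotting positions p_i = i/(n+1) with interpolation."""
    s = sorted(annual_minima)
    n = len(s)
    p = [i / (n + 1.0) for i in range(1, n + 1)]
    target = 0.1
    if target <= p[0]:
        return s[0]
    for i in range(1, n):
        if p[i] >= target:
            w = (target - p[i-1]) / (p[i] - p[i-1])
            return s[i-1] + w * (s[i] - s[i-1])
    return s[-1]
```

With fewer than roughly ten years of record the 0.1 quantile is extrapolated from the lowest observations, which is exactly why fitted distributions are preferred in practice.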

  18. WATSTORE: National Water Data Storage and Retrieval System of the U. S. Geological Survey; user's guide

    USGS Publications Warehouse

    Hutchison, Norman E.

    1975-01-01

    with an IBM 370/155 computer. WATSTORE is now (1975) available to other Federal agencies and selected cooperators of the Geological Survey who acquire and(or) use water data. The WATSTORE User's Guide describes the system and how it operates.

  19. Ground-water recharge in the arid and semiarid southwestern United States

    USGS Publications Warehouse

    Stonestrom, David A.; Constantz, Jim; Ferré, Ty P.A.; Leake, Stanley A.

    2007-01-01

    Ground-water recharge in the arid and semiarid southwestern United States results from the complex interplay of climate, geology, and vegetation across widely ranging spatial and temporal scales. Present-day recharge tends to be narrowly focused in time and space. Widespread water-table declines accompanied agricultural development during the twentieth century, demonstrating that sustainable ground-water supplies are not guaranteed when part of the extracted resource represents paleorecharge. Climatic controls on ground-water recharge range from seasonal cycles of summer monsoonal and winter frontal storms to multimillennial cycles of glacial and interglacial periods. Precipitation patterns reflect global-scale interactions among the oceans, atmosphere, and continents. Large-scale climatic influences associated with El Niño and Pacific Decadal Oscillations strongly, but irregularly, control weather in the study area, so that year-to-year variations in precipitation and ground-water recharge are large and difficult to predict. Proxy data indicate geologically recent periods of naturally occurring multidecadal droughts unlike any in the modern instrumental record. Any anthropogenically induced climate change will likely reduce ground-water recharge through diminished snowpack at higher elevations. Future changes in El Niño and monsoonal patterns, both crucial to precipitation in the study area, are highly uncertain in current models. Current land-use modifications influence ground-water recharge through vegetation, irrigation, and impermeable area. High mountain ranges bounding the study area—the San Bernardino Mountains and Sierra Nevada to the west, and the Wasatch and southern Colorado Rocky Mountains to the east—provide external geologic controls on ground-water recharge.
Internal geologic controls stem from tectonic processes that led to numerous, variably connected alluvial-filled basins, exposure of extensive Paleozoic aquifers in mountainous recharge areas, and distinct modes of recharge in the Colorado Plateau and Basin and Range subregions. The chapters in this professional paper present (first) an overview of climatic and hydrogeologic framework (chapter A), followed by a regional analysis of ground-water recharge across the entire study area (chapter B). These are followed by an overview of site-specific case studies representing different subareas of the geographically diverse arid and semiarid southwestern United States (chapter C); the case studies themselves follow in chapters D–K. The regional analysis includes detailed hydrologic modeling within the framework of a high-resolution geographic-information system (GIS). Results from the regional analysis are used to explore both the distribution of ground-water recharge for mean climatic conditions as well as the influence of two climatic patterns—the El Niño-Southern Oscillation and Pacific Decadal Oscillation—that impart a high degree of variability to the hydrologic cycle. Individual case studies employ a variety of geophysical and geochemical techniques to investigate recharge processes and relate the processes to local geologic and climatic conditions. All of the case studies made use of naturally occurring tracers to quantify recharge. Thermal and geophysical techniques that were developed in the course of the studies are presented in appendices. The quantification of ground-water recharge in arid settings is inherently difficult due to the generally low amount of recharge, its spatially and temporally spotty nature, and the absence of techniques for directly measuring fluxes entering the saturated zone from the unsaturated zone.
Deep water tables in arid alluvial basins correspond to thick unsaturated zones that produce up to millennial time lags between changes in hydrologic conditions at the land surface and subsequent changes in recharge to underlying ground water. Recent advances in physical, chemical, isotopic, and modeling techniques have fostered new types of recharge assessments. Chemical and isotopic techniques include an increasing variety of environmental tracers that are useful and robust. Physically based techniques include the use of heat as a tracer and computationally intensive geophysical imaging tools for characterizing hydrologic conditions in the unsaturated zone. Modeling-based techniques include spatially distributed water-budget computations using high-resolution remotely sensed and ground-based geographic data. Application of these techniques to arid and semiarid settings in the southwestern United States reveals distinct patterns of recharge corresponding to geologic setting, climatic and vegetative history, and land use. Analysis of recharge patterns shows that large expanses of alluvial basin floors are drying out under current climatic conditions, with little to no recharge to underlying ground water. Ground-water recharge occurs mainly beneath upland catchments in which thin soils overlie permeable bedrock, ephemeral channels in which flow may average only several hours per year, and active agricultural areas. The chapters in this professional paper represent a coordinated attempt to develop a better understanding of one of the Nation's most critical yet difficult-to-quantify renewable resources.
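The spatially distributed water-budget computations mentioned above reduce, per grid cell, to a residual calculation: recharge is what remains of precipitation after evapotranspiration, runoff, and storage change. A hypothetical minimal sketch (all terms in mm/yr; function and variable names are illustrative):

```python
def cell_recharge(precip, et, runoff, dstorage=0.0):
    """Water-budget residual for one grid cell. Recharge is floored
    at zero: a deficit means no deep drainage in this simple sketch,
    not an upward flux."""
    return max(0.0, precip - et - runoff - dstorage)

def basin_recharge(cells):
    """Mean recharge over a list of (precip, et, runoff) cell tuples."""
    return sum(cell_recharge(*c) for c in cells) / len(cells)
```

The residual form also explains the difficulty the text describes: when recharge is a small difference of large, uncertain terms, modest errors in precipitation or evapotranspiration dominate the estimate.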

  20. Research on the evolution model and deformation mechanisms of Baishuihe landslide based on analyzing geologic process of slope

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Tang, H.; Cai, Y.; Tan, Q.

    2016-12-01

A landslide results from both internal and external geologic agents, and the internal ones strongly influence the susceptibility of geologic bodies to the external ones. However, current research focuses more on the impacts of external factors, such as precipitation and reservoir water, than on those of geologic processes. The Baishuihe landslide, located on the south bank of the Yangtze River 56 km upstream of the Three Gorges Project, was taken as the study subject, with in-situ investigation and exploration carried out as a first step. After spatial analysis using a 3D terrain model built in ArcGIS (Fig.1), the geologic characteristics of the slope near the Baishuihe landslide on the same bank were investigated, with the help of the geological map and structural outline map, for further insight into the geologic evolution of the slope. The Baishuihe landslide developed on the north limb of the Baifuping anticline, a dip slope on the southwest margin of the Zigui basin. The eastern and western boundaries are both ridges, and in the middle a distinct slide depression is actively deforming. The evolution of the Baishuihe landslide comprises three steps: 1) Emergence of the Baifuping anticline led to interbedded dislocation, tension cracks and joint fractures in the bedrock. 2) Weathering continuously weakened the strength of soft interlayers in the Shazhenxi Formation (T3s). 3) Large-scale rock sliding triggered by neotectonics occurred along the weak layers and joint planes, forming the initial Baishuihe landslide. Although the landslide has undergone long-term reconstruction, it can still be clearly divided into two parts: a) the rock landslide in the rear (southern) half and b) the debris landslide in the front (northern) half. a) The deformation mechanism of the rock landslide is believed to be strength deterioration of weak bedding planes due to precipitation, combined with free faces created by human activity or river incision. b) Influenced by the impoundment of the Three Gorges Reservoir, roughly the front third of the debris landslide lies permanently below the lowest water level (el. 145 m) and the middle third lies within the water-level fluctuation belt (el. 145-175 m), suggesting that deformation of the debris landslide is governed primarily by reservoir water rather than precipitation.
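The role of water level in the deformation mechanisms above can be illustrated with the classical infinite-slope factor of safety; this is a generic stability sketch with illustrative parameter values, not the authors' model or measured Baishuihe properties:

```python
import math

# Infinite-slope factor of safety, showing how a rising water level
# (reservoir impoundment or precipitation) destabilizes a slope.
# All parameter values below are illustrative only.

def factor_of_safety(c, phi_deg, gamma, z, beta_deg, m, gamma_w=9.81):
    """c: cohesion (kPa); phi: friction angle (deg); gamma: unit weight
    (kN/m^3); z: slide depth (m); beta: slope angle (deg); m: water-table
    height fraction within the slide mass (0 = dry, 1 = fully saturated)."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = c + (gamma * z - gamma_w * m * z) * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

fs_dry = factor_of_safety(c=15.0, phi_deg=22.0, gamma=20.0, z=10.0, beta_deg=25.0, m=0.0)
fs_wet = factor_of_safety(c=15.0, phi_deg=22.0, gamma=20.0, z=10.0, beta_deg=25.0, m=0.8)
# Raising the water level lowers the factor of safety toward failure (FS < 1).
```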

  1. Selene: A Videogame for Learning about the Moon

    NASA Astrophysics Data System (ADS)

    Wood, C. A.; Reese, D. D.

    2008-06-01

    The Selene game-based, metaphor-enhanced (GaME) learning object prepares players with concrete knowledge of basic lunar geology processes. Selene is embedded within an online research environment studying learning and assessment within videogames.

  2. Acausal measurement-based quantum computing

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2014-07-01

    In measurement-based quantum computing, there is a natural "causal cone" among qubits of the resource state, since the measurement angle on a qubit has to depend on previous measurement results in order to correct the effect of by-product operators. If we respect the no-signaling principle, by-product operators cannot be avoided. Here we study the possibility of acausal measurement-based quantum computing by using the process matrix framework [Oreshkov, Costa, and Brukner, Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076]. We construct a resource process matrix for acausal measurement-based quantum computing restricting local operations to projective measurements. The resource process matrix is an analog of the resource state of the standard causal measurement-based quantum computing. We find that if we restrict local operations to projective measurements the resource process matrix is (up to a normalization factor and trivial ancilla qubits) equivalent to the decorated graph state created from the graph state of the corresponding causal measurement-based quantum computing. We also show that it is possible to consider a causal game whose causal inequality is violated by acausal measurement-based quantum computing.
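The byproduct operator that forces the causal cone can be seen in a numerical sketch of one teleportation step of standard (causal) MBQC; the basis and phase conventions are one common choice, and this does not reproduce the paper's process-matrix construction:

```python
import numpy as np

# One step of measurement-based QC on a two-qubit cluster state: measuring
# qubit 1 at angle theta implements H * Rz(theta) on the input |+>, up to a
# byproduct X**s determined by the random outcome s. Correcting X**s is what
# makes later measurement angles depend on earlier outcomes.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])

def mbqc_step(theta, s):
    """Normalized state of qubit 2 after outcome s on qubit 1 (input |+>)."""
    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    psi = np.diag([1, 1, 1, -1]) @ np.kron(plus, plus)  # CZ|+>|+>: cluster state
    # outcome-s basis vector: (|0> + (-1)**s e^{-i theta} |1>) / sqrt(2)
    m = np.array([1.0, (-1) ** s * np.exp(-1j * theta)]) / np.sqrt(2)
    out = m.conj() @ psi.reshape(2, 2)                  # project qubit 1
    return out / np.linalg.norm(out)

theta = 0.7
ideal = H @ np.diag([1, np.exp(1j * theta)]) @ np.array([1.0, 1.0]) / np.sqrt(2)

# Outcome s = 1 leaves the byproduct X, which the correction removes:
corrected = [mbqc_step(theta, 0), X @ mbqc_step(theta, 1)]
```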

  3. Digging into Inquiry-Based Earth Science Research

    ERIC Educational Resources Information Center

    Schultz, Bryan; Yates, Crystal; Schultz, Jayne M.

    2008-01-01

    To help eighth-grade students experience the excitement of Earth science research, the authors developed an inquiry-based project in which students evaluated and cataloged their campus geology and soils. Following class discussions of rock-weathering and soil-forming processes, students worked in groups to excavate multiple soil pits in the school…

  4. Subsurface data visualization in Virtual Reality

    NASA Astrophysics Data System (ADS)

    Krijnen, Robbert; Smelik, Ruben; Appleton, Rick; van Maanen, Peter-Paul

    2017-04-01

Because of their increasing complexity and size, visualization of geological data is becoming more and more important. It enables detailed examination and review of large volumes of geological data, and it is often used as a communication tool in reporting and education to demonstrate the importance of the geology to policy makers. In the Netherlands two types of nation-wide geological models are available: 1) layer-based models, in which the subsurface is represented by a series of tops and bases of geological or hydrogeological units, and 2) voxel models, in which the subsurface is subdivided into a regular grid of voxels that can each contain different properties. The Geological Survey of the Netherlands (GSN) provides an interactive web portal that delivers maps and vertical cross-sections of such layer-based and voxel models. From this portal you can download a 3D subsurface viewer that can visualize the voxel model data of an area of 20 × 25 km at 100 × 100 × 5 meter voxel resolution on a desktop computer. Virtual Reality (VR) technology enables us to enhance the visualization of this volumetric data in a more natural way than a standard desktop, keyboard-and-mouse setup. The use of VR for data visualization is not new, but recent developments have made expensive hardware and complex setups unnecessary. The availability of consumer off-the-shelf VR hardware enabled us to create a new, intuitive and low-cost visualization tool. A VR viewer has been implemented using the HTC Vive headset and allows visualization and analysis of the GSN voxel model data with geological or hydrogeological units. The user can navigate freely around the voxel data (20 × 25 km), which is presented in a virtual room at a scale of 2 × 2 or 3 × 3 meters. To enable analysis, e.g. of hydraulic conductivity, the user can select filters to remove specific hydrogeological units. The user can also use slicing to cut off specific sections of the voxel data to get a closer look. This slicing can be done in any direction using a 'virtual knife'. Future plans are to further improve performance from 30 up to a 90 Hz update rate to reduce possible motion sickness, and to add more advanced filtering capabilities as well as a multi-user setup, annotation capabilities and visualization of historical data.
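In data terms, the unit filters and the 'virtual knife' reduce to mask operations on the voxel grid. The sketch below uses hypothetical unit codes, grid size and cutting plane, not the GSN model's actual coding:

```python
import numpy as np

# Toy voxel model: integer unit codes on a 3-D (x, y, z) grid.
rng = np.random.default_rng(0)
voxels = rng.integers(1, 6, size=(20, 25, 10))   # 5 fake hydrogeological units

# Filter: hide one unit (e.g. code 3) without touching the rest.
visible = np.ma.masked_where(voxels == 3, voxels)

# Virtual knife: keep only voxels on one side of the plane n . x >= d.
ii, jj, kk = np.indices(voxels.shape)
normal, d = np.array([1.0, 1.0, 0.0]), 20.0
keep = normal[0] * ii + normal[1] * jj + normal[2] * kk >= d
sliced = np.where(keep, voxels, 0)               # 0 marks cut-away space
```

A VR renderer would then draw only unmasked, kept voxels each frame.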

  5. Digital database of the geologic map of the island of Hawai'i [Hawaii]

    USGS Publications Warehouse

    Trusdell, Frank A.; Wolfe, Edward W.; Morris, Jean

    2006-01-01

This online publication (DS 144) provides the digital database for the printed map by Edward W. Wolfe and Jean Morris (I-2524-A; 1996). This digital database contains all the information used to publish U.S. Geological Survey Geologic Investigations Series I-2524-A (available only in paper form; see http://pubs.er.usgs.gov/pubs/i/i2524A). The database contains the distribution and relationships of volcanic and surficial-sedimentary deposits on the island of Hawai‘i. This dataset represents the geologic history of the five volcanoes that comprise the Island of Hawai‘i: Kohala, Mauna Kea, Hualalai, Mauna Loa and Kīlauea. This database of the geologic map contributes to understanding the geologic history of the Island of Hawai‘i and provides the basis for understanding long-term volcanic processes in an intra-plate ocean-island volcanic system. In addition, the database serves as a basis for producing volcanic hazard assessments for the island of Hawai‘i. Furthermore, it serves as a base layer for interdisciplinary research. This online publication consists of a digital database of the geologic map, an explanatory pamphlet, a description of map units, a correlation of map units diagram, and images for plotting. Geologic mapping was compiled at a scale of 1:100,000 for the entire mapping area and was assembled as a digital geologic database in ArcInfo GIS format.

  6. Distributed GPU Computing in GIScience

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.

    2013-12-01

Geoscientists strive to discover principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges posed by the increasing amount of datasets from different domains, such as social media, earth observation, and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU-based technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU, as a compelling alternative microprocessor, has outstanding parallel processing capability with cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) for each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build up a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; 3) GPUs, as specific graphics-targeted devices, are used to greatly improve the rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. References: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. Visualization and Computer Graphics, IEEE Transactions on, 9(3), 378-394. 2. Li, J., Jiang, Y., Yang, C., Huang, Q., & Rice, M. (2013). Visualizing 3D/4D Environmental Data Using Many-core Graphics Processing Units (GPUs) and Multi-core Central Processing Units (CPUs). Computers & Geosciences, 59(9), 78-89. 3. Owens, J. D., Houston, M., Luebke, D., Green, S., Stone, J. E., & Phillips, J. C. (2008). GPU computing. Proceedings of the IEEE, 96(5), 879-899.
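The data-parallel pattern behind such a framework can be sketched as tile-wise processing; here CPU threads stand in for GPU devices or remote nodes, the per-tile kernel is a hypothetical example, and a real implementation would also exchange halo rows so tile edges match the global result:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Split a large grid into tiles, dispatch each tile to a worker, and
# stitch the results back together.

def process_tile(tile):
    """Hypothetical per-tile kernel: gradient magnitude (hillshade-like)."""
    gy, gx = np.gradient(tile)
    return np.hypot(gx, gy)

def distributed_process(grid, n_tiles=4, workers=4):
    tiles = np.array_split(grid, n_tiles, axis=0)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(process_tile, tiles))
    return np.vstack(results)

grid = np.random.default_rng(1).random((400, 400))
out = distributed_process(grid)
```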

  7. Surficial geologic map of the Amboy 30' x 60' quadrangle, San Bernardino County, California

    USGS Publications Warehouse

    Bedford, David R.; Miller, David M.; Phelps, Geoffrey A.

    2010-01-01

The surficial geologic map of the Amboy 30' x 60' quadrangle presents characteristics of surficial materials for an area of approximately 5,000 km2 in the eastern Mojave Desert of southern California. This map consists of new surficial mapping conducted between 2000 and 2007, as well as compilations from previous surficial mapping. Surficial geologic units are mapped and described based on depositional process and age categories that reflect the mode of deposition, pedogenic effects following deposition, and, where appropriate, the lithologic nature of the material. Many physical properties were noted and measured during the geologic mapping. This information was used to classify surficial deposits and to understand their ecological importance. We focus on physical properties that drive hydrologic, biologic, and physical processes, such as particle-size distribution (PSD) and bulk density. The database contains point data representing locations of samples for both laboratory-determined physical properties and semiquantitative field-based information. We include the locations of all field observations and note the type of information collected in the field to help assist in assessing the quality of the mapping. The publication is separated into three parts: documentation, spatial data, and printable map graphics of the database. Documentation includes this pamphlet, which provides a discussion of the surficial geology, the map units, and the map. Spatial data are distributed as an ArcGIS geodatabase in Microsoft Access format and are accompanied by a readme file, which describes the database contents, and FGDC metadata for the spatial map information. Map graphics files are distributed as PostScript and Adobe Portable Document Format (PDF) files that provide a view of the spatial database at the mapped scale.
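Classifying deposits from measured properties such as particle-size distribution can be sketched as a simple rule-based labeler; the thresholds below are illustrative, not the authors' classification scheme or the full USDA texture triangle:

```python
# Toy classifier mapping percent sand/silt/clay to a texture label.

def texture_class(sand, silt, clay):
    """Label a sample from its particle-size fractions (must sum to 100)."""
    assert abs(sand + silt + clay - 100.0) < 1e-6
    if clay >= 40.0:
        return "clay"
    if sand >= 70.0:
        return "sand"
    if silt >= 80.0:
        return "silt"
    return "loam"

samples = [(85.0, 10.0, 5.0), (20.0, 35.0, 45.0), (40.0, 40.0, 20.0)]
labels = [texture_class(*s) for s in samples]
```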

  8. Multiresolution pattern recognition of small volcanos in Magellan data

    NASA Technical Reports Server (NTRS)

    Smyth, P.; Anderson, C. H.; Aubele, J. C.; Crumpler, L. S.

    1992-01-01

The Magellan data is a treasure-trove for scientific analysis of venusian geology, providing far more detail than was previously available from Pioneer Venus, Venera 15/16, or ground-based radar observations. However, at this point, planetary scientists are being overwhelmed by the sheer quantities of data collected--data analysis technology has not kept pace with our ability to collect and store it. In particular, 'small-shield' volcanos (less than 20 km in diameter) are the most abundant visible geologic feature on the planet. It is estimated, based on extrapolating from previous studies and knowledge of the underlying geologic processes, that there should be on the order of 10^5 to 10^6 of these volcanos visible in the Magellan data. Identifying and studying these volcanos is fundamental to a proper understanding of the geologic evolution of Venus. However, locating and parameterizing them in a manual manner is very time-consuming. Hence, we have undertaken the development of techniques to partially automate this task. The goal is not the unrealistic one of total automation, but rather the development of a useful tool to aid the project scientists. The primary constraints for this particular problem are as follows: (1) the method must be reasonably robust; and (2) the method must be reasonably fast. Unlike most geological features, the small volcanos of Venus can be ascribed to a basic process that produces features with a short list of readily defined characteristics differing significantly from other surface features on Venus. For pattern recognition purposes the relevant criteria include the following: (1) a circular planimetric outline; (2) known diameter frequency distribution from preliminary studies; (3) a limited number of basic morphological shapes; and (4) the common occurrence of a single, circular summit pit at the center of the edifice.
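Criterion (1), a circular planimetric outline, can be exploited with a matched filter. The sketch below correlates a synthetic scene with a ring template via FFT; it is a toy illustration of the idea, not the authors' multiresolution method:

```python
import numpy as np

def ring_template(size, radius, width=1.5):
    """Zero-mean ring template centered in a size x size window."""
    yy, xx = np.indices((size, size)) - size // 2
    r = np.hypot(xx, yy)
    t = (np.abs(r - radius) <= width).astype(float)
    return t - t.mean()

def detect_peak(image, template):
    """Circular cross-correlation via FFT; returns (row, col) of the best match."""
    f = np.fft.rfft2(image) * np.conj(np.fft.rfft2(template, image.shape))
    score = np.fft.irfft2(f, image.shape)
    return np.unravel_index(np.argmax(score), score.shape)

# Synthetic scene: one ring-shaped edifice outline centered at row 40, col 60.
img = np.zeros((100, 100))
yy, xx = np.indices(img.shape)
img[np.abs(np.hypot(xx - 60, yy - 40) - 8) <= 1.5] = 1.0

peak = detect_peak(img, ring_template(31, 8))
center = (peak[0] + 15, peak[1] + 15)   # offset by the template center (15, 15)
```

Sweeping the template radius over criterion (2)'s diameter distribution would turn this into a crude multiscale detector.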

  9. A New Numerical Simulation technology of Multistage Fracturing in Horizontal Well

    NASA Astrophysics Data System (ADS)

    Cheng, Ning; Kang, Kaifeng; Li, Jianming; Liu, Tao; Ding, Kun

    2017-11-01

Horizontal multi-stage fracturing is recognized as an effective development technology for unconventional oil resources. Geomechanics occupies a very important position in the numerical simulation of hydraulic fracturing: compared with conventional numerical simulation technology, the new approach accounts for the influence of geological mechanics, and can therefore more effectively optimize the fracturing design and evaluate post-fracturing production. This study is based on a three-dimensional stress and rock-physics parameter model; it uses the latest fluid-solid-coupling numerical simulation technology to trace the fracture extension process, describes the change of the stress field during fracturing, and finally predicts the resulting production.

  10. Lattice Boltzmann-Based Approaches for Pore-Scale Reactive Transport

    DOE PAGES

    Yoon, Hongkyu; Kang, Qinjun; Valocchi, Albert J.

    2015-07-29

Important geoscience and environmental applications such as geologic carbon storage, environmental remediation, and unconventional oil and gas recovery are best understood in the context of reactive flow and multicomponent transport in the subsurface environment. The coupling of chemical and microbiological reactions with hydrological and mechanical processes can lead to complex behaviors across an enormous range of spatial and temporal scales. These coupled responses are also strongly influenced by the heterogeneity and anisotropy of the geologic formations. Reactive transport processes can change the pore morphology at the pore scale, thereby leading to nonlinear interactions with advective and diffusive transport, which can strongly influence larger-scale properties such as permeability and dispersion.
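The collide-and-stream structure shared by all lattice Boltzmann codes can be shown with a minimal 1-D diffusion solver; pore geometry, reactions and multicomponent coupling are deliberately stripped away, so this is purely illustrative:

```python
import numpy as np

# Minimal D1Q2 lattice Boltzmann scheme for pure diffusion on a periodic
# 1-D lattice: two populations (right- and left-moving), BGK collision
# toward the zero-velocity equilibrium, then streaming by one site.

n, tau, steps = 100, 0.9, 500
rho = np.zeros(n)
rho[n // 2] = 1.0                 # point pulse of solute
f_plus = 0.5 * rho                # right-moving distribution
f_minus = 0.5 * rho               # left-moving distribution

for _ in range(steps):
    rho = f_plus + f_minus
    # collision: relax toward the equilibrium split (rho/2, rho/2)
    f_plus += (0.5 * rho - f_plus) / tau
    f_minus += (0.5 * rho - f_minus) / tau
    # streaming: shift each population one lattice site (periodic)
    f_plus = np.roll(f_plus, 1)
    f_minus = np.roll(f_minus, -1)

rho = f_plus + f_minus            # mass is conserved; the pulse spreads
```

Pore-scale codes extend this pattern to 2-D/3-D stencils, solid-boundary (bounce-back) nodes, and coupled reaction terms.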

  11. 10 CFR 2.1003 - Availability of material.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... its license application for a geologic repository, the NRC shall make available no later than thirty... privilege in § 2.1006, graphic-oriented documentary material that includes raw data, computer runs, computer... discrepancies; (ii) Gauge, meter and computer settings; (iii) Probe locations; (iv) Logging intervals and rates...

  12. 10 CFR 2.1003 - Availability of material.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... its license application for a geologic repository, the NRC shall make available no later than thirty... privilege in § 2.1006, graphic-oriented documentary material that includes raw data, computer runs, computer... discrepancies; (ii) Gauge, meter and computer settings; (iii) Probe locations; (iv) Logging intervals and rates...

  13. Methodology for earthquake rupture rate estimates of fault networks: example for the western Corinth rift, Greece

    NASA Astrophysics Data System (ADS)

    Chartier, Thomas; Scotti, Oona; Lyon-Caen, Hélène; Boiselet, Aurélien

    2017-10-01

Modeling the seismic potential of active faults is a fundamental step of probabilistic seismic hazard assessment (PSHA). An accurate estimation of the rate of earthquakes on the faults is necessary in order to obtain the probability of exceedance of a given ground motion. Most PSHA studies consider faults as independent structures and neglect the possibility of multiple faults or fault segments rupturing simultaneously (fault-to-fault, FtF, ruptures). The Uniform California Earthquake Rupture Forecast version 3 (UCERF-3) model takes this possibility into account by considering a system-level approach rather than an individual-fault-level approach, using geological, seismological and geodetic information to invert the earthquake rates. In many places of the world, seismological and geodetic information along fault networks is not well constrained. There is therefore a need for a methodology relying on geological information alone to compute earthquake rates of the faults in the network. In the proposed methodology, a simple distance criterion is used to define FtF ruptures, and single-fault or FtF ruptures are considered as an aleatory uncertainty, similarly to UCERF-3. Rates of earthquakes on faults are then computed following two constraints: the magnitude frequency distribution (MFD) of earthquakes in the fault system as a whole must follow an a priori chosen shape, and the rate of earthquakes on each fault is determined by the specific slip rate of each segment depending on the possible FtF ruptures. The modeled earthquake rates are then compared to the available independent data (geodetic, seismological and paleoseismological data) in order to weight the different hypotheses explored in a logic tree. The methodology is tested on the western Corinth rift (WCR), Greece, where recent advancements have been made in the understanding of the geological slip rates of the complex network of normal faults which accommodate the ~ 15 mm yr-1 north-south extension. Modeling results show that geological, seismological and paleoseismological rates of earthquakes cannot be reconciled with only single-fault-rupture scenarios and require hypothesizing a large spectrum of possible FtF rupture sets. In order to fit the imposed regional Gutenberg-Richter (GR) MFD target, some of the slip along certain faults needs to be accommodated either by interseismic creep or by post-seismic processes. Furthermore, computed individual faults' MFDs differ depending on the position of each fault in the system and the possible FtF ruptures associated with the fault. Finally, a comparison of modeled earthquake rupture rates with those deduced from the regional and local earthquake catalog statistics and local paleoseismological data indicates a better fit with the FtF rupture set constructed with a distance criterion of 5 km rather than 3 km, suggesting a high connectivity of faults in the WCR fault system.
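The slip-rate constraint behind such inversions rests on moment balance, which can be illustrated in back-of-envelope form; all values below are illustrative, not parameters of the western Corinth rift model:

```python
# The long-term moment rate mu * A * slip_rate on a fault (or FtF rupture)
# is spent by earthquakes of moment M0(Mw), yielding a mean event rate.

MU = 3.0e10  # crustal shear modulus, Pa

def moment_from_magnitude(mw):
    """Seismic moment M0 in N*m (Hanks & Kanamori scaling)."""
    return 10 ** (1.5 * mw + 9.1)

def event_rate(slip_rate_mm_yr, area_km2, mw):
    """Mean annual rate of Mw events consuming the whole moment budget."""
    moment_rate = MU * (area_km2 * 1e6) * (slip_rate_mm_yr * 1e-3)  # N*m/yr
    return moment_rate / moment_from_magnitude(mw)

# A 30 km x 15 km fault slipping at 4 mm/yr, released entirely in Mw 6.5 events:
rate = event_rate(4.0, 30 * 15, 6.5)
recurrence = 1.0 / rate               # mean recurrence interval, years
```

The full methodology distributes this budget over an MFD and over many candidate FtF rupture sets rather than a single characteristic magnitude.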

  14. Cloud GIS Based Watershed Management

    NASA Astrophysics Data System (ADS)

    Bediroğlu, G.; Colak, H. E.

    2017-11-01

In this study, we generated a cloud-GIS-based watershed management system using a cloud computing architecture. Cloud GIS is used as SaaS (Software as a Service) and DaaS (Data as a Service): we applied GIS analysis on the cloud to test SaaS and deployed GIS datasets on the cloud to test DaaS. We used a hybrid cloud computing model, making use of ready web-based mapping services hosted on the cloud (world topology, satellite imagery). We uploaded data to the system after creating geodatabases covering hydrology (rivers, lakes), soil maps, climate maps, rain maps, geology and land use. The watershed of the study area was delineated on the cloud using the ready-hosted topology maps. After uploading all the datasets to the system, we applied various GIS analyses and queries. Results show that Cloud GIS technology brings speed and efficiency to watershed management studies. Besides this, the system can be easily implemented for similar land analysis and management studies.

  15. Integration of the stratigraphic aspects of very large sea-floor databases using information processing

    USGS Publications Warehouse

    Jenkins, Clinton N.; Flocks, J.; Kulp, M.

    2006-01-01

Information-processing methods are described that integrate the stratigraphic aspects of large and diverse collections of sea-floor sample data. They efficiently convert common types of sea-floor data into database and GIS (geographical information system) tables, visual core logs, stratigraphic fence diagrams and sophisticated stratigraphic statistics. The input data are held in structured documents, essentially written core logs that are particularly efficient to create from raw input datasets. Techniques are described that permit efficient construction of regional databases consisting of hundreds of cores. The sedimentological observations in each core are located by their downhole depths (metres below sea floor - mbsf) and also by a verbal term that describes the sample 'situation' - a special fraction of the sediment or position in the core. The main processing creates a separate output event for each instance of top, bottom and situation, assigning top-base mbsf values from numeric or, where possible, from word-based relative locational information such as 'core catcher' in reference to sampler device, and recovery or penetration length. The processing outputs represent the sub-bottom as a sparse matrix of over 20 sediment properties of interest, such as grain size, porosity and colour. They can be plotted in a range of core-log programs including an in-built facility that better suits the requirements of sea-floor data. Finally, a suite of stratigraphic statistics is computed, including volumetric grades, overburdens, thicknesses and degrees of layering. © The Geological Society of London 2006.
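The structured-document idea — written log lines carrying a depth interval plus observations, expanded into one output event per property — can be sketched with a small parser; the line format and property names below are hypothetical, not the publication's actual schema:

```python
import re

# Each log line: "TOP-BASE  prop value; prop value; ..." with depths in mbsf.

LINE = re.compile(r"(?P<top>[\d.]+)-(?P<base>[\d.]+)\s+(?P<obs>.+)")

def parse_core_log(text):
    events = []
    for raw in text.strip().splitlines():
        m = LINE.match(raw.strip())
        if not m:
            continue  # skip comments or malformed lines
        top, base = float(m["top"]), float(m["base"])
        for obs in m["obs"].split(";"):
            prop, _, value = obs.strip().partition(" ")
            events.append({"top_mbsf": top, "base_mbsf": base,
                           "property": prop, "value": value})
    return events

log = """\
0.00-0.35  color olive-gray; grainsize silt
0.35-1.20  grainsize fine-sand; porosity 0.42
"""
events = parse_core_log(log)
```

Each event row maps directly onto a database or GIS table record, which is what makes the downstream fence diagrams and statistics straightforward.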

  16. Aerospace technology can be applied to exploration 'back on earth'. [offshore petroleum resources

    NASA Technical Reports Server (NTRS)

    Jaffe, L. D.

    1977-01-01

Applications of aerospace technology to petroleum exploration are described. Attention is given to seismic reflection techniques, sea-floor mapping, remote geochemical sensing, improved drilling methods and down-hole acoustic concepts, such as down-hole seismic tomography. The seismic reflection techniques include monitoring of swept-frequency explosive or solid-propellant seismic sources, as well as aerial seismic surveys. Telemetry and processing of seismic data may also be performed through use of aerospace technology. Sea-floor sonar imaging and a computer-aided system of geologic analogies for petroleum exploration are also considered.

  17. Deformation band clusters on Mars and implications for subsurface fluid flow

    USGS Publications Warehouse

    Okubo, C.H.; Schultz, R.A.; Chan, M.A.; Komatsu, G.

    2009-01-01

High-resolution imagery reveals unprecedented lines of evidence for the presence of deformation band clusters in layered sedimentary deposits in the equatorial region of Mars. Deformation bands are a class of geologic structural discontinuity that is a precursor to faults in clastic rocks and soils. Clusters of deformation bands, consisting of many hundreds of individual subparallel bands, can act as important structural controls on subsurface fluid flow in terrestrial reservoirs, and evidence of diagenetic processes is often preserved along them. Deformation band clusters are identified on Mars based on characteristic meter-scale architectures and geologic context as observed in data from the High-Resolution Imaging Science Experiment (HiRISE) camera. The identification of deformation band clusters on Mars is a key to investigating the migration of fluids between surface and subsurface reservoirs in the planet's vast sedimentary deposits. Similar to terrestrial examples, evidence of diagenesis in the form of light- and dark-toned discoloration and wall-rock induration is recorded along many of the deformation band clusters on Mars. Therefore, these structures are important sites for future exploration and investigations into the geologic history of water and water-related processes on Mars. © 2008 Geological Society of America.

  18. Geodetic measurement of deformation in California. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Sauber, Jeanne Marie

    1988-01-01

The very long baseline interferometry (VLBI) measurements made in the western U.S. since 1979 as part of the NASA Crustal Dynamics Project provide discrete samples of the temporal and spatial deformation field. The interpretation of the VLBI-derived rates of deformation requires an examination of geologic information and more densely sampled ground-based geodetic data. In the first two of the three related studies embodying this thesis, triangulation and trilateration data measured on two regional networks are processed: one network in the central Mojave Desert and one in the Coast Ranges east of the San Andreas fault. At the spatial scales spanned by these local geodetic networks, auxiliary geologic and geophysical data have been utilized to examine the relation between measured incremental strain and the accommodation of strain seen in local geological structures, strain release in earthquakes, and principal stress directions inferred from in situ measurements. In the third study, VLBI data from stations distributed across the Pacific - North American plate boundary zone in the western United States are processed. The VLBI data have been used to constrain the integrated rate of deformation across portions of the continental plate boundary in California and to provide a tectonic framework to interpret regional geodetic and geologic studies.

  19. The USGS role in mapping the nation's submerged lands

    USGS Publications Warehouse

    Schwab, Bill; Haines, John

    2004-01-01

    The seabed provides habitat for a diverse marine life having commercial, recreational, and intrinsic value. The habitat value of the seabed is largely a function of the geological structure and related geological, biological, oceanologic, and geochemical processes. Of equal importance, the nation's submerged lands contain energy and mineral resources and are utilized for the siting of offshore infrastructure and waste disposal. Seabed character and processes influence the safety and viability of offshore operations. Seabed and subseabed characterization is a prerequisite for the assessment, protection, and utilization of both living and non-living marine resources. A comprehensive program to characterize and understand the nation's submerged lands requires scientific expertise in the fields of geology, biology, hydrography, and oceanography. The U.S. Geological Survey (USGS) has long experience as the Federal agency charged with conducting geologic research and mapping in both coastal and offshore regions. The USGS Coastal and Marine Geology Program (CMGP) leads the nation in expertise related to characterization of seabed and subseabed geology, geological processes, seabed dynamics, and (in collaboration with the National Oceanic and Atmospheric Administration (NOAA) and international partners) habitat geoscience. Numerous USGS studies show that sea-floor geology and processes determine the character and distribution of biological habitats, control coastal evolution, influence the coastal response to storm events and human alterations, and determine the occurrence and concentration of natural resources.

  20. Reports of planetary geology program, 1983

    NASA Technical Reports Server (NTRS)

    Holt, H. E. (Compiler)

    1984-01-01

    Several areas of the Planetary Geology Program were addressed including outer solar system satellites, asteroids, comets, Venus, cratering processes and landform development, volcanic processes, aeolian processes, fluvial processes, periglacial and permafrost processes, geomorphology, remote sensing, tectonics and stratigraphy, and mapping.
