Sample records for raster data

  1. Vector and Raster Data Storage Based on Morton Code

    NASA Astrophysics Data System (ADS)

    Zhou, G.; Pan, Q.; Yue, T.; Wang, Q.; Sha, H.; Huang, S.; Liu, X.

    2018-05-01

    Even though geomatics is well developed nowadays, the integration of spatial data in vector and raster formats is still a very tricky problem in geographic information system environments, and there is still no proper way to solve it. This article proposes a method to integrate vector data and raster data. In this paper, we saved the image data and building vector data of Guilin University of Technology to an Oracle database. We then used an ADO interface to connect the database to Visual C++ and, in the Visual C++ environment, converted the row and column numbers of the raster data and the X-Y coordinates of the vector data to Morton codes. This method stores vector and raster data in an Oracle database and uses Morton codes, instead of row/column numbers and X-Y coordinates, to mark the position information of the vector and raster data. Using Morton codes to mark geographic information lets data storage make full use of storage space, makes simultaneous analysis of vector and raster data more efficient, and makes visualization of vector and raster data more intuitive. This method is very helpful in situations that require analysing or displaying vector data and raster data at the same time.
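
    The row/column-to-Morton-code conversion described above interleaves the bits of the two coordinates. The original implementation is in Visual C++, but the idea can be sketched in Python (function names here are illustrative, not from the paper):

```python
def morton_encode(row, col):
    """Interleave the bits of row and col into a Morton (Z-order) code."""
    code = 0
    for i in range(32):
        code |= (col >> i & 1) << (2 * i)      # even bit positions: column
        code |= (row >> i & 1) << (2 * i + 1)  # odd bit positions: row
    return code

def morton_decode(code):
    """Recover (row, col) from a Morton code."""
    row = col = 0
    for i in range(32):
        col |= (code >> (2 * i) & 1) << i
        row |= (code >> (2 * i + 1) & 1) << i
    return row, col
```

    Because nearby cells share Morton-code prefixes, storing both raster cells and vector coordinates under this single integer key keeps spatially close data close on disk, which is what the paper exploits for joint storage in one database.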

  2. Raster Metafile and Raster Metafile Translator

    NASA Technical Reports Server (NTRS)

    Taylor, Nancy L.; Everton, Eric L.; Randall, Donald P.; Gates, Raymond L.; Skeens, Kristi M.

    1989-01-01

    The intent is to present an effort undertaken at NASA Langley Research Center to design a generic raster image format and to develop tools for processing images prepared in this format. Both the Raster Metafile (RM) format and the Raster Metafile Translator (RMT) are addressed. This document is intended to serve a varied audience including: users wishing to display and manipulate raster image data, programmers responsible for either interfacing the RM format with other raster formats or for developing new RMT device drivers, and programmers charged with installing the software on a host platform.

  3. Rasdaman for Big Spatial Raster Data

    NASA Astrophysics Data System (ADS)

    Hu, F.; Huang, Q.; Scheele, C. J.; Yang, C. P.; Yu, M.; Liu, K.

    2015-12-01

    Spatial raster data have grown exponentially over the past decade. Recent advancements in data acquisition technology, such as remote sensing, have allowed us to collect massive observation data of various spatial resolutions and domain coverages. The volume, velocity, and variety of such spatial data, along with the computationally intensive nature of spatial queries, pose a grand challenge to storage technologies for effective big data management. While high performance computing platforms (e.g., cloud computing) can be used to solve the computing-intensive issues in big data analysis, data have to be managed in a way that is suitable for distributed parallel processing. Recently, rasdaman (raster data manager) has emerged as a scalable and cost-effective database solution to store and retrieve massive multi-dimensional arrays, such as sensor, image, and statistics data. Within this paper, the pros and cons of using rasdaman to manage and query spatial raster data will be examined and compared with other common approaches, including file-based systems, relational databases (e.g., PostgreSQL/PostGIS), and NoSQL databases (e.g., MongoDB and Hive). Earth Observing System (EOS) data collected from NASA's Atmospheric Science Data Center (ASDC) will be used and stored in these selected database systems, and a set of spatial and non-spatial queries will be designed to benchmark their performance on retrieving large-scale, multi-dimensional arrays of EOS data. Lessons learnt from using rasdaman will be discussed as well.

  4. Data decomposition method for parallel polygon rasterization considering load balancing

    NASA Astrophysics Data System (ADS)

    Zhou, Chen; Chen, Zhenjie; Liu, Yongxue; Li, Feixue; Cheng, Liang; Zhu, A.-xing; Li, Manchun

    2015-12-01

    It is essential to adopt parallel computing technology to rapidly rasterize massive polygon data. In parallel rasterization, it is difficult to design an effective data decomposition method. Conventional methods ignore load balancing of polygon complexity in parallel rasterization and thus fail to achieve high parallel efficiency. In this paper, a novel data decomposition method based on polygon complexity (DMPC) is proposed. First, four factors that possibly affect the rasterization efficiency were investigated. Then, a metric represented by the boundary number and raster pixel number in the minimum bounding rectangle was developed to calculate the complexity of each polygon. Using this metric, polygons were rationally allocated according to the polygon complexity, and each process could achieve balanced loads of polygon complexity. To validate the efficiency of DMPC, it was used to parallelize different polygon rasterization algorithms and tested on different datasets. Experimental results showed that DMPC could effectively parallelize polygon rasterization algorithms. Furthermore, the implemented parallel algorithms with DMPC could achieve good speedup ratios of at least 15.69 and generally outperformed conventional decomposition methods in terms of parallel efficiency and load balancing. In addition, the results showed that DMPC exhibited consistently better performance for different spatial distributions of polygons.
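
    A minimal sketch of the decomposition idea, assuming a complexity metric that combines the boundary point count with the raster pixel count of the minimum bounding rectangle, followed by greedy assignment to the least-loaded process. The weights, data layout, and greedy strategy below are illustrative assumptions, not the paper's exact formulation:

```python
def complexity(boundary_points, mbr_pixels, w_b=1.0, w_p=1.0):
    """Polygon complexity: weighted sum of boundary points and MBR pixels."""
    return w_b * boundary_points + w_p * mbr_pixels

def decompose(polygons, n_procs):
    """polygons: list of (boundary_points, mbr_pixels) tuples.
    Returns one list of polygon indices per process, balanced by complexity."""
    loads = [0.0] * n_procs
    buckets = [[] for _ in range(n_procs)]
    # longest-processing-time-first greedy: place the most complex polygons first
    for idx, (b, p) in sorted(enumerate(polygons),
                              key=lambda t: -complexity(*t[1])):
        target = loads.index(min(loads))   # least-loaded process so far
        buckets[target].append(idx)
        loads[target] += complexity(b, p)
    return buckets
```

    This greedy scheme is a standard load-balancing heuristic; the paper's contribution is the complexity metric itself, which makes the per-polygon cost estimate realistic enough for the balancing to pay off.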

  5. Raster profile development for the spatial data transfer standard

    USGS Publications Warehouse

    Szemraj, John A.

    1993-01-01

    The Spatial Data Transfer Standard (SDTS), recently approved as Federal Information Processing Standard (FIPS) Publication 173, is designed to transfer various types of spatial data. Implementing all of the standard's options at one time is impractical. Profiles, or limited subsets of the SDTS, are the mechanisms by which the standard will be implemented. The development of a raster profile is being coordinated by the U.S. Geological Survey's (USGS) SDTS Task Force. This raster profile is intended to accommodate digital georeferenced image data and regularly spaced, georeferenced gridded data. The USGS's digital elevation models (DEMs) and digital orthophoto quadrangles (DOQs), National Oceanic and Atmospheric Administration's (NOAA) advanced very high resolution radiometer (AVHRR) and Landsat data, and National Aeronautics and Space Administration's (NASA) Earth Observing System (EOS) data are among the candidate data sets for this profile. Other raster profiles, designed to support nongeoreferenced and other types of "raw" sensor data, will be considered in the future. As with the Topological Vector Profile (TVP) for the SDTS, development of the raster profile includes designing a prototype profile, testing the prototype profile using sample data sets, and finally, requesting and receiving FIPS approval.

  6. Spatial Data Transfer Standard (SDTS), part 5 : SDTS raster profile and extensions

    DOT National Transportation Integrated Search

    1998-01-01

    The SRPE contains specifications for a profile for use with georeferenced two-dimensional raster data. Both raster image and raster grid data are included within the scope of this profile. The transfer of indirectly referenced images is permitted, i....

  7. The Design of a High Performance Earth Imagery and Raster Data Management and Processing Platform

    NASA Astrophysics Data System (ADS)

    Xie, Qingyun

    2016-06-01

    This paper summarizes the general requirements and specific characteristics of both geospatial raster database management system and raster data processing platform from a domain-specific perspective as well as from a computing point of view. It also discusses the need of tight integration between the database system and the processing system. These requirements resulted in Oracle Spatial GeoRaster, a global scale and high performance earth imagery and raster data management and processing platform. The rationale, design, implementation, and benefits of Oracle Spatial GeoRaster are described. Basically, as a database management system, GeoRaster defines an integrated raster data model, supports image compression, data manipulation, general and spatial indices, content and context based queries and updates, versioning, concurrency, security, replication, standby, backup and recovery, multitenancy, and ETL. It provides high scalability using computer and storage clustering. As a raster data processing platform, GeoRaster provides basic operations, image processing, raster analytics, and data distribution featuring high performance computing (HPC). Specifically, HPC features include locality computing, concurrent processing, parallel processing, and in-memory computing. In addition, the APIs and the plug-in architecture are discussed.

  8. Raster Vs. Point Cloud LiDAR Data Classification

    NASA Astrophysics Data System (ADS)

    El-Ashmawy, N.; Shaker, A.

    2014-09-01

    Airborne Laser Scanning systems with light detection and ranging (LiDAR) technology are among the fast and accurate 3D point data acquisition techniques. Generating accurate digital terrain and/or surface models (DTM/DSM) is the main application of collecting LiDAR range data. Recently, LiDAR range and intensity data have been used for land cover classification applications. Range and intensity data (the strength of the backscattered signals measured by the LiDAR system) are affected by the flying height, the ground elevation, the scanning angle, and the physical characteristics of the object surface. These effects may lead to an uneven distribution of the point cloud or to gaps that may affect the classification process. Researchers have investigated the conversion of LiDAR range point data to raster images for terrain modelling. Interpolation techniques have been used to achieve the best representation of surfaces and to fill the gaps between the LiDAR footprints. Interpolation methods have also been investigated to generate LiDAR range and intensity image data for land cover classification applications. In this paper, a different approach has been followed to classify the LiDAR data (range and intensity) for land cover mapping. The methodology relies on classifying the point cloud data based on their range and intensity and then converting the classified points into a raster image. The gaps in the data are filled based on the classes of the nearest neighbour. Land cover maps are produced using two approaches: (a) the conventional raster image data based on point interpolation; and (b) the proposed point data classification. A study area covering an urban district in Burnaby, British Columbia, Canada, is selected to compare the results of the two approaches. Five different land cover classes can be distinguished in that area: buildings, roads and parking areas, trees, low vegetation (grass), and bare soil. The results show that an improvement of around 10 % in the

  9. Formats and Network Protocols for Browser Access to 2D Raster Data

    NASA Astrophysics Data System (ADS)

    Plesea, L.

    2015-12-01

    Tiled web maps in browsers are a major success story, forming the foundation of many current web applications. Enabling tiled data access is the next logical step, and is likely to meet with similar success. Many ad-hoc approaches have already started to appear, and something similar is explored within the Open Geospatial Consortium. One of the main obstacles in making browser data access a reality is the lack of a well-known data format. This obstacle also represents an opportunity to analyze the requirements and possible candidates, applying lessons learned from web tiled image services and protocols. Similar to the image counterpart, a web tile raster data format needs to have good intrinsic compression and be able to handle high byte count data types including floating point. An overview of a possible solution to the format problem, a 2D data raster compression algorithm called Limited Error Raster Compression (LERC) will be presented. In addition to the format, best practices for high request rate HTTP services also need to be followed. In particular, content delivery network (CDN) caching suitability needs to be part of any design, not an after-thought. Last but not least, HTML 5 browsers will certainly be part of any solution since they provide improved access to binary data, as well as more powerful ways to view and interact with the data in the browser. In a simple but relevant application, digital elevation model (DEM) raster data is served as LERC compressed data tiles which are used to generate terrain by a HTML5 scene viewer.
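
    The core idea of a limited-error codec such as LERC can be illustrated with a toy quantizer: values are snapped to a grid whose spacing is twice the user's maximum tolerated error, so every reconstructed value stays within that bound. Real LERC adds per-block statistics and bit-stuffing; this hedged sketch shows only the lossy quantization step:

```python
def quantize(values, max_error):
    """Quantize floats so reconstruction error never exceeds max_error."""
    step = 2.0 * max_error              # rounding to this grid errs by <= max_error
    base = min(values)                  # offset so codes are small non-negative ints
    codes = [round((v - base) / step) for v in values]
    return base, step, codes

def dequantize(base, step, codes):
    """Reconstruct approximate values from the integer codes."""
    return [base + c * step for c in codes]
```

    The integer codes compress far better than raw floats (they are small and spatially correlated), which is why this style of format suits tiled elevation data served to browsers.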

  10. Raster Metafile And Raster Metafile Translator Programs

    NASA Technical Reports Server (NTRS)

    Randall, Donald P.; Gates, Raymond L.; Skeens, Kristi M.

    1994-01-01

    The Raster Metafile (RM) computer program is a generic raster-image-format program, and the Raster Metafile Translator (RMT) program is an assortment of software tools for processing images prepared in this format. Processing includes reading, writing, and displaying RM images. Other image-manipulation features, such as a minimal compositing operator and a resizing option, are available under the RMT command structure. RMT is written in FORTRAN 77 and the C language.

  11. An examination of techniques for reformatting digital cartographic data/part 1: the raster-to-vector process.

    USGS Publications Warehouse

    Peuquet, D.J.

    1981-01-01

    Current graphic devices suitable for high-speed computer input and output of cartographic data are tending more and more to be raster-oriented, such as the rotating drum scanner and the color raster display. However, the majority of commonly used manipulative techniques in computer-assisted cartography and automated spatial data handling continue to require that the data be in vector format. This situation has recently precipitated the requirement for very fast techniques for converting digital cartographic data from raster to vector format for processing, and then back into raster format for plotting. The current article is part 1 of a two-part paper concerned with examining the state of the art in these conversion techniques. -from Author

  12. Spatial inventory integrating raster databases and point sample data. [Geographic Information System for timber inventory

    NASA Technical Reports Server (NTRS)

    Strahler, A. H.; Woodcock, C. E.; Logan, T. L.

    1983-01-01

    A timber inventory of the Eldorado National Forest, located in east-central California, provides an example of the use of a Geographic Information System (GIS) to stratify large areas of land for sampling and the collection of statistical data. The raster-based GIS format of the VICAR/IBIS software system allows simple and rapid tabulation of areas, and facilitates the selection of random locations for ground sampling. Algorithms that simplify the complex spatial pattern of raster-based information, and convert raster format data to strings of coordinate vectors, provide a link to conventional vector-based geographic information systems.

  13. BOREAS TGB-12 Soil Carbon and Flux Data of NSA-MSA in Raster Format

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Knapp, David E. (Editor); Rapalee, Gloria; Davidson, Eric; Harden, Jennifer W.; Trumbore, Susan E.; Veldhuis, Hugo

    2000-01-01

    The BOREAS TGB-12 team made measurements of soil carbon inventories, carbon concentration in soil gases, and rates of soil respiration at several sites. This data set provides: (1) estimates of soil carbon stocks by horizon based on soil survey data and analyses of data from individual soil profiles; (2) estimates of soil carbon fluxes based on stocks, fire history, drainage, and soil carbon inputs and decomposition constants based on field work using radiocarbon analyses; (3) fire history data estimating age ranges of time since last fire; and (4) a raster image and an associated soils table file from which area-weighted maps of soil carbon and fluxes and fire history may be generated. This data set was created from raster files, soil polygon data files, and detailed lab analysis of soils data that were received from Dr. Hugo Veldhuis, who did the original mapping in the field during 1994. Also used were soils data from Susan Trumbore and Jennifer Harden (BOREAS TGB-12). The binary raster file covers a 733-km² area within the NSA-MSA.

  14. A program for handling map projections of small-scale geospatial raster data

    USGS Publications Warehouse

    Finn, Michael P.; Steinwand, Daniel R.; Trent, Jason R.; Buehler, Robert A.; Mattli, David M.; Yamamoto, Kristina H.

    2012-01-01

    Scientists routinely accomplish small-scale geospatial modeling using raster datasets of global extent. Such use often requires the projection of global raster datasets onto a map or the reprojection from a given map projection associated with a dataset. The distortion characteristics of these projection transformations can have significant effects on modeling results. Distortions associated with the reprojection of global data are generally greater than distortions associated with reprojections of larger-scale, localized areas. The accuracy of areas in projected raster datasets of global extent is dependent on spatial resolution. To address these problems of projection and the associated resampling that accompanies it, methods for framing the transformation space, direct point-to-point transformations rather than gridded transformation spaces, a solution to the wrap-around problem, and an approach to alternative resampling methods are presented. The implementations of these methods are provided in an open-source software package called MapImage (or mapIMG, for short), which is designed to function on a variety of computer architectures.

  15. Harvesting geographic features from heterogeneous raster maps

    NASA Astrophysics Data System (ADS)

    Chiang, Yao-Yi

    2010-11-01

    Raster maps offer a great deal of geospatial information and are easily accessible compared to other geospatial data. However, harvesting geographic features locked in heterogeneous raster maps to obtain the geospatial information is challenging. This is because of the varying image quality of raster maps (e.g., scanned maps with poor image quality and computer-generated maps with good image quality), the overlapping geographic features in maps, and the typical lack of metadata (e.g., map geocoordinates, map source, and original vector data). Previous work on map processing is typically limited to a specific type of map and often relies on intensive manual work. In contrast, this thesis investigates a general approach that does not rely on any prior knowledge and requires minimal user effort to process heterogeneous raster maps. This approach includes automatic and supervised techniques to process raster maps for separating individual layers of geographic features from the maps and recognizing geographic features in the separated layers (i.e., detecting road intersections, generating and vectorizing road geometry, and recognizing text labels). The automatic technique eliminates user intervention by exploiting common map properties of how road lines and text labels are drawn in raster maps. For example, the road lines are elongated linear objects and the characters are small connected-objects. The supervised technique utilizes labels of road and text areas to handle complex raster maps, or maps with poor image quality, and can process a variety of raster maps with minimal user input. The results show that the general approach can handle raster maps with varying map complexity, color usage, and image quality. By matching extracted road intersections to another geospatial dataset, we can identify the geocoordinates of a raster map and further align the raster map, separated feature layers from the map, and recognized features from the layers with the geospatial

  16. The Use of Interactive Raster Graphics in the Display and Manipulation of Multidimensional Data

    NASA Technical Reports Server (NTRS)

    Anderson, D. C.

    1981-01-01

    Techniques for the review, display, and manipulation of multidimensional data are developed and described. Multidimensional data is meant in this context to describe scalar data associated with a three dimensional geometry or otherwise too complex to be well represented by traditional graphs. Raster graphics techniques are used to display a shaded image of a three dimensional geometry. The use of color to represent scalar data associated with the geometries in shaded images is explored. Distinct hues are associated with discrete data ranges, thus emulating the traditional representation of data with isarithms, or lines of constant numerical value. Data ranges are alternatively associated with a continuous spectrum of hues to show subtler data trends. The application of raster graphics techniques to the display of bivariate functions is explored.

  17. An examination of techniques for reformatting digital cartographic data. Part 2: the vector-to-raster process.

    USGS Publications Warehouse

    Peuquet, D.J.

    1981-01-01

    Current graphic devices suitable for high-speed computer input and output of cartographic data are tending more and more to be raster-oriented, such as the rotating drum scanner and the color raster display. However, the majority of commonly used manipulative techniques in computer-assisted cartography and automated spatial data handling continue to require that the data be in vector format. The current article is the second part of a two-part paper that examines the state of the art in these conversion techniques. - from Author

  18. Raster Data Partitioning for Supporting Distributed GIS Processing

    NASA Astrophysics Data System (ADS)

    Nguyen Thai, B.; Olasz, A.

    2015-08-01

    In the geospatial sector, the big data concept has already had an impact. Several studies apply techniques originating in computer science to GIS processing of huge amounts of geospatial data; in other research, geospatial data is considered to have always been big data (Lee and Kang, 2015). Nevertheless, data acquisition methods have improved substantially, not only in the amount of raw data but also in its spectral, spatial, and temporal resolution. A significant portion of big data is geospatial data, and the size of such data is growing rapidly, by at least 20% every year (Dasgupta, 2013). Of the increasing volume of raw data produced in different formats and representations and for different purposes, only the information derived from these data sets represents valuable results, yet computing capability and processing speed still face limitations, even when semi-automatic or automatic procedures are applied to complex geospatial data (Kristóf et al., 2014). Lately, distributed computing has reached many interdisciplinary areas of computer science, including remote sensing and geographic information processing. Cloud computing further requires appropriate processing algorithms to be distributed to handle geospatial big data. The Map-Reduce programming model and distributed file systems have proven their capability to process non-GIS big data, but it is sometimes inconvenient or inefficient to rewrite existing algorithms for the Map-Reduce programming model, and GIS data cannot be partitioned like text-based data, by lines or by bytes. Hence, we look for an alternative solution for data partitioning, data distribution, and execution of existing algorithms without rewriting them, or with only minor modifications. This paper focuses on a technical overview of currently available distributed computing environments, as well as GIS (raster) data partitioning, distribution, and distributed processing of GIS algorithms
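
    The observation that raster data cannot be split by lines or bytes suggests block-wise (tile) partitioning instead: each worker receives a spatially contiguous block of the grid. A minimal sketch of computing tile extents for distribution (the tile size is an illustrative choice, not from the paper):

```python
def tile_extents(rows, cols, tile_rows, tile_cols):
    """Yield (row_off, col_off, height, width) for each tile of a raster,
    clipping edge tiles so the tiling exactly covers the grid."""
    for r in range(0, rows, tile_rows):
        for c in range(0, cols, tile_cols):
            yield (r, c, min(tile_rows, rows - r), min(tile_cols, cols - c))
```

    Each extent can then be shipped to a worker, which reads only its window of the raster; algorithms needing neighboring pixels would additionally pad each tile with a halo of overlap.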

  19. Survey of currently available high-resolution raster graphics systems

    NASA Technical Reports Server (NTRS)

    Jones, Denise R.

    1987-01-01

    Presented are data obtained on high-resolution raster graphics engines currently available on the market. The data were obtained through survey responses received from various vendors and also from product literature. The questionnaire developed for this survey was basically a list of characteristics desired in a high performance color raster graphics system which could perform real-time aircraft simulations. Several vendors responded to the survey, with most reporting on their most advanced high-performance, high-resolution raster graphics engine.

  20. MapEdit: solution to continuous raster map creation

    NASA Astrophysics Data System (ADS)

    Rančić, Dejan; Djordjević-Kajan, Slobodanka

    2003-03-01

    The paper describes MapEdit, MS Windows™ software for georeferencing and rectification of scanned paper maps. The software produces continuous raster maps that can be used as background in geographical information systems. The process of continuous raster map creation using the MapEdit "mosaicking" function is also described, as are the georeferencing and rectification algorithms used in MapEdit. Our approach to georeferencing and rectification, using four control points and two linear transformations for each scanned map part together with the nearest neighbor resampling method, represents a low-cost, high-speed solution that produces continuous raster maps with satisfactory quality for many purposes (±1 pixel). A quality assessment of several continuous raster maps at different scales that were created using our software and methodology has been undertaken, and the results are presented in the paper. For the quality control of the produced raster maps we referred to three widely adopted standards: the US Standard for Digital Cartographic Data, the National Standard for Spatial Data Accuracy, and the US National Map Accuracy Standard. The results obtained during the quality assessment process are given in the paper and show that our maps meet all three standards.
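
    The rectification step described here amounts to mapping each output pixel back through a linear transform and taking the nearest source pixel. A simplified sketch, assuming the six affine coefficients have already been derived from the control points (the function and parameter names are illustrative, not MapEdit's API):

```python
def rectify(src, t, out_shape, fill=0):
    """Nearest-neighbour resampling under an inverse affine transform.
    src: 2D list of pixels; t = (a, b, c, d, e, f) maps output (row, col)
    back to source coordinates; out_shape: (rows, cols) of the output."""
    a, b, c, d, e, f = t
    rows, cols = out_shape
    out = [[fill] * cols for _ in range(rows)]
    for r in range(rows):
        for col in range(cols):
            sr = round(a * r + b * col + c)   # nearest neighbour: round
            sc = round(d * r + e * col + f)
            if 0 <= sr < len(src) and 0 <= sc < len(src[0]):
                out[r][col] = src[sr][sc]
    return out
```

    Nearest-neighbour resampling is what gives the ±1 pixel accuracy figure quoted above: it never interpolates values, so positional error is bounded by one cell.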

  1. Estimating Prediction Uncertainty from Geographical Information System Raster Processing: A User's Manual for the Raster Error Propagation Tool (REPTool)

    USGS Publications Warehouse

    Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.

    2009-01-01

    The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental Systems Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and to provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model: errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
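
    The Latin Hypercube Sampling at the heart of REPTool's error propagation draws exactly one sample per equal-probability stratum in each dimension, which covers the input distribution far more evenly than plain Monte Carlo for the same sample count. A minimal stand-alone sketch (REPTool itself is a Python/ArcGIS toolset; this code is illustrative, not its API):

```python
import random

def latin_hypercube(n_samples, n_dims, rng=None):
    """Return n_samples points in [0, 1)^n_dims with one point per
    equal-probability stratum in every dimension."""
    rng = rng or random.Random(0)
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        # one point in each stratum [i/n, (i+1)/n), shuffled across samples
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)
        for i in range(n_samples):
            samples[i][d] = strata[i]
    return samples
```

    In an error-propagation run, each unit-interval sample would be mapped through the inverse CDF of the error distribution specified for an input raster or coefficient, and the geospatial model evaluated once per sample to build the output uncertainty distribution.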

  2. Laser direct marking applied to rasterizing miniature Data Matrix Code on aluminum alloy

    NASA Astrophysics Data System (ADS)

    Li, Xia-Shuang; He, Wei-Ping; Lei, Lei; Wang, Jian; Guo, Gai-Fang; Zhang, Teng-Yun; Yue, Ting

    2016-03-01

    Precise miniaturization of 2D Data Matrix (DM) codes on aluminum alloy formed by raster-mode laser direct part marking is demonstrated. The characteristic edge over-burn effects, which render vector-mode laser direct part marking inadequate for producing precise and readable miniature codes, are minimized with raster-mode laser marking. To obtain the control mechanism for the contrast and print growth of miniature DM codes in the raster laser marking process, a temperature field model of long-pulse laser interaction with the material is established. From the experimental results, laser average power and Q frequency have an important effect on the contrast and print growth of miniature DM codes, and the thresholds of laser average power and Q frequency for an identifiable miniature DM code are 3.6 W and 110 kHz, respectively, which matches the model well within normal operating conditions. In addition, an empirical model of the correlation between laser marking parameters and module size is also obtained, and the optimal processing parameter values for an identifiable miniature DM code of a given data size are given. It is also found that increasing the number of repeated scans effectively improves the surface finish of the bore and the appearance consistency of the modules, which benefits reading. The reading quality of the miniature DM code is greatly improved by ultrasonic cleaning in water, which avoids the interference of color speckles surrounding the modules.

  3. Combining satellite photographs and raster lidar data for channel connectivity in tidal marshes.

    NASA Astrophysics Data System (ADS)

    Li, Zhi; Hodges, Ben

    2017-04-01

    High resolution airborne lidar is capable of providing topographic detail down to the 1 x 1 m scale or finer over large tidal marshes of a river delta. Such data sets can be challenging to develop and ground-truth due to the inherent complexities of the environment, the relatively small changes in elevation throughout a marsh, and practical difficulties in accessing the variety of flooded, dry, and muddy regions. Standard lidar point-cloud processing techniques (as typically applied in large lidar data collection programs) have a tendency to mis-identify narrow channels and water connectivity in a marsh, which makes it difficult to directly use such data for modeling marsh flows. Unfortunately, it is not always practical, or even possible, to access the point cloud and re-analyze the raw lidar data when discrepancies have been found in a raster work product. Faced with this problem in preparing a model of the Trinity River delta (Texas, USA), we developed an approach that integrates analysis of a lidar-based raster with satellite images. Our primary goal was to identify the clear land/water boundaries needed to identify channelization in the available rasterized lidar data. The channel extraction method uses pixelized satellite photographs that are stretched/distorted with image-processing techniques to match identifiable control features in both the lidar and photographic data sets. A k-means clustering algorithm was applied to cluster pixels based on their colors, which is effective in separating land and water in a satellite photograph. The clustered image was matched to the lidar data such that the combination shows the channel network. In effect, we are able to use the fact that the satellite photograph is of higher resolution than the lidar data, and thus provides connectivity in the clustering at a finer scale. The principal limitation of the method arises where the satellite image and lidar suffer from similar problems. For example, vegetation overhanging a narrow
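
    The land/water separation step can be illustrated with a tiny k-means (k = 2) on scalar pixel intensities; a real implementation would cluster RGB triples, so this is only a sketch of the idea under that simplifying assumption:

```python
def kmeans2(values, iters=20):
    """Two-cluster k-means on scalar intensities; returns the two centres."""
    lo, hi = min(values), max(values)
    c = [lo, hi]                           # initialize centroids at the extremes
    for _ in range(iters):
        groups = ([], [])
        for v in values:
            # index 1 when v is closer to c[1], else 0
            groups[abs(v - c[0]) > abs(v - c[1])].append(v)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return c                               # e.g. water vs land intensity centres
```

    Once the two centres stabilize, every pixel is labeled by its nearest centre, yielding the binary land/water mask that is then matched against the lidar raster.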

  4. BOREAS Forest Cover Data Layers over the SSA-MSA in Raster Format

    NASA Technical Reports Server (NTRS)

    Nickeson, Jaime; Gruszka, F; Hall, F.

    2000-01-01

    This data set, originally provided as vector polygons with attributes, has been processed by BORIS staff to provide raster files that can be used for modeling or for comparison purposes. The original data were received as ARC/INFO coverages or as export files from SERM. The data include information on forest parameters for the BOREAS SSA-MSA. Most of the data used for this product were acquired by BORIS in 1993; the maps were produced from aerial photography taken as recently as 1988. The data are stored in binary, image format files.

  5. Function modeling: improved raster analysis through delayed reading and function raster datasets

    Treesearch

    John S. Hogland; Nathaniel M. Anderson; J. Greg Jones

    2013-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...

  6. Extracting Spatiotemporal Objects from Raster Data to Represent Physical Features and Analyze Related Processes

    NASA Astrophysics Data System (ADS)

    Zollweg, J. A.

    2017-10-01

    Numerous ground-based, airborne, and orbiting platforms provide remotely-sensed data of remarkable spatial resolution at short time intervals. However, this spatiotemporal data is most valuable if it can be processed into information, thereby creating meaning. We live in a world of objects: cars, buildings, farms, etc. On a stormy day, we don't see millions of cubes of atmosphere; we see a thunderstorm `object'. Temporally, we don't see the properties of those individual cubes changing, we see the thunderstorm as a whole evolving and moving. There is a need to represent the bulky, raw spatiotemporal data from remote sensors as a small number of relevant spatiotemporal objects, thereby matching the human brain's perception of the world. This presentation reveals an efficient algorithm and system to extract the objects/features from raster-formatted remotely-sensed data. The system makes use of the Python object-oriented programming language, SciPy/NumPy for matrix manipulation and scientific computation, and export/import to the GeoJSON standard geographic object data format. The example presented will show how thunderstorms can be identified and characterized in a spatiotemporal continuum using a Python program to process raster data from NOAA's High-Resolution Rapid Refresh v2 (HRRRv2) data stream.
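    The object-extraction step can be illustrated with a small, self-contained sketch: threshold a raster field and label its 4-connected regions, reporting per-object size and centroid. A production system would more likely use scipy.ndimage.label; the toy "reflectivity" field and threshold here are assumptions for illustration:

```python
import numpy as np
from collections import deque

def extract_objects(field, threshold):
    """Label 4-connected regions where field > threshold and return
    per-object statistics (a pure-Python stand-in for scipy.ndimage.label)."""
    mask = field > threshold
    labels = np.zeros(field.shape, dtype=int)
    objects = []
    next_label = 0
    for i in range(field.shape[0]):
        for j in range(field.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                next_label += 1
                cells = []
                q = deque([(i, j)])
                labels[i, j] = next_label
                while q:  # breadth-first flood fill of one object
                    r, c = q.popleft()
                    cells.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < field.shape[0] and 0 <= cc < field.shape[1]
                                and mask[rr, cc] and labels[rr, cc] == 0):
                            labels[rr, cc] = next_label
                            q.append((rr, cc))
                rows = [p[0] for p in cells]
                cols = [p[1] for p in cells]
                objects.append({"label": next_label, "size": len(cells),
                                "centroid": (sum(rows) / len(rows),
                                             sum(cols) / len(cols))})
    return labels, objects

# toy reflectivity field with two separate "storm" blobs
field = np.zeros((8, 8))
field[1:3, 1:3] = 50.0
field[5:7, 4:7] = 60.0
labels, objs = extract_objects(field, threshold=40.0)
```

    Tracking the same labels across successive rasters is what turns these static objects into the evolving spatiotemporal objects the abstract describes.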

  7. BOREAS Forest Cover Data Layers of the NSA in Raster Format

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Knapp, David; Tuinhoff, Manning

    2000-01-01

    This data set was processed by BORIS staff from the original vector data of species, crown closure, cutting class, and site classification/subtype into raster files. The original polygon data were received from Linnet Graphics, the distributor of data for MNR. In the case of the species layer, the percentages of species composition were removed. This reduced the amount of information contained in the species layer of the gridded product, but it was necessary in order to make the gridded product easier to use. The original maps were produced from 1:15,840-scale aerial photography collected in 1988 over an area of the BOREAS NSA MSA. The data are stored in binary, image format files and they are available from Oak Ridge National Laboratory. The data files are available on a CD-ROM (see document number 20010000884).

  8. Snake River Plain Geothermal Play Fairway Analysis - Phase 1 Raster Files

    DOE Data Explorer

    John Shervais

    2015-10-09

    Snake River Plain Play Fairway Analysis - Phase 1 CRS Raster Files. This dataset contains raster files created in ArcGIS. These raster images depict Common Risk Segment (CRS) maps for HEAT, PERMEABILITY, and SEAL, as well as selected maps of evidence layers. These evidence layers consist of either Bayesian krige functions or kernel density functions, and include: (1) HEAT: heat flow (Bayesian krige map), heat flow standard error on the krige function (data confidence), volcanic vent distribution as a function of age and size, groundwater temperature (equal-interval and natural-breaks bins), and groundwater temperature standard error. (2) PERMEABILITY: fault and lineament maps, both as mapped and as kernel density functions, processed for both dilational tendency (TD) and slip tendency (ST), along with data confidence maps for each data type. Data types include mapped surface faults from USGS and Idaho Geological Survey databases, as well as unpublished mapping; lineations derived from maximum gradients in magnetic, deep gravity, and intermediate-depth gravity anomalies. (3) SEAL: seal maps based on presence and thickness of lacustrine sediments and base of the SRP aquifer. Raster cell size is 2 km. All files were generated in ArcGIS.

  9. A COMPARISON OF VECTOR AND RASTER GIS METHODS FOR CALCULATING LANDSCAPE METRICS USED IN ENVIRONMENTAL ASSESSMENTS

    EPA Science Inventory

    GIS-based measurements that combine native raster and native vector data are commonly used to assess environmental quality. Most of these measurements can be calculated using either raster or vector data formats and processing methods. Raster processes are more commonly used beca...

  10. Measuring geographic access to health care: raster and network-based methods

    PubMed Central

    2012-01-01

    Background Inequalities in geographic access to health care result from the configuration of facilities, population distribution, and the transportation infrastructure. In recent accessibility studies, the traditional distance measure (Euclidean) has been replaced with more plausible measures such as travel distance or time. Both network and raster-based methods are often utilized for estimating travel time in a Geographic Information System. Therefore, exploring the differences in the underlying data models and associated methods and their impact on geographic accessibility estimates is warranted. Methods We examine the assumptions present in population-based travel time models. Conceptual and practical differences between raster and network data models are reviewed, along with methodological implications for service area estimates. Our case study investigates Limited Access Areas defined by Michigan’s Certificate of Need (CON) Program. Geographic accessibility is calculated by identifying the number of people residing more than 30 minutes from an acute care hospital. Both network and raster-based methods are implemented and their results are compared. We also examine sensitivity to changes in travel speed settings and population assignment. Results In both methods, the areas identified as having limited accessibility were similar in their location, configuration, and shape. However, the number of people identified as having limited accessibility varied substantially between methods. Over all permutations, the raster-based method identified more area and people with limited accessibility. The raster-based method was more sensitive to travel speed settings, while the network-based method was more sensitive to the specific population assignment method employed in Michigan. Conclusions Differences between the underlying data models help to explain the variation in results between raster and network-based methods. 
Considering that the choice of data model/method may

  11. BOREAS TE-1 Soils Data Over The SSA Tower Sites in Raster Format

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Anderson, Darwin; Knapp, David E.

    2000-01-01

    The BOREAS TE-1 team collected various data to characterize the soil-plant systems in the BOREAS SSA. This data set was gridded from vector layers of soil maps that were received from Dr. Darwin Anderson (TE-1), who did the original soil mapping in the field during 1994. The vector layers were gridded into raster files that cover approximately 1 square kilometer over each of the tower sites in the SSA. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  12. A Comparison of Vector and Raster GIS Methods for Calculating Landscape Metrics Used in Environmental Assessments

    Treesearch

    Timothy G. Wade; James D. Wickham; Maliha S. Nash; Anne C. Neale; Kurt H. Riitters; K. Bruce Jones

    2003-01-01

    GIS-based measurements that combine native raster and native vector data are commonly used in environmental assessments. Most of these measurements can be calculated using either raster or vector data formats and processing methods. Raster processes are more commonly used because they can be significantly faster computationally...

  13. The pyramid system for multiscale raster analysis

    USGS Publications Warehouse

    De Cola, L.; Montagne, N.

    1993-01-01

    Geographical research requires the management and analysis of spatial data at multiple scales. As part of the U.S. Geological Survey's global change research program a software system has been developed that reads raster data (such as an image or digital elevation model) and produces a pyramid of aggregated lattices as well as various measurements of spatial complexity. For a given raster dataset the system uses the pyramid to report: (1) mean, (2) variance, (3) a spatial autocorrelation parameter based on multiscale analysis of variance, and (4) a monofractal scaling parameter based on the analysis of isoline lengths. The system is applied to 1-km digital elevation model (DEM) data for a 256-km² region of central California, as well as to 64 partitions of the region. PYRAMID, which offers robust descriptions of data complexity, also is used to describe the behavior of topographic aspect with scale. © 1993.
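    The pyramid construction can be sketched as repeated 2x2 block aggregation, reporting the mean and variance at each level. This is a simplified illustration of the idea, not the PYRAMID system itself (a square grid with power-of-two side length is assumed):

```python
import numpy as np

def pyramid(grid):
    """Build a pyramid by 2x2 block averaging; report side length,
    mean, and variance at every level (coarsest level is 1x1)."""
    levels = [grid.astype(float)]
    while levels[-1].shape[0] > 1:
        g = levels[-1]
        n = g.shape[0] // 2
        # aggregate each 2x2 block into one cell of the coarser lattice
        coarser = g.reshape(n, 2, n, 2).mean(axis=(1, 3))
        levels.append(coarser)
    return [(lv.shape[0], lv.mean(), lv.var()) for lv in levels]

dem = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "DEM"
stats = pyramid(dem)
```

    Because each level is a block mean of the one below, the global mean is preserved while the variance shrinks with aggregation; how fast it shrinks is exactly the kind of multiscale signal the abstract's autocorrelation measure exploits.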

  14. Raster graphics display library

    NASA Technical Reports Server (NTRS)

    Grimsrud, Anders; Stephenson, Michael B.

    1987-01-01

    The Raster Graphics Display Library (RGDL) is a high-level subroutine package that provides the advanced raster graphics display capabilities needed. The RGDL uses FORTRAN source code routines to build subroutines modular enough to use as stand-alone routines in a black-box type of environment. Six examples are presented which will teach the use of RGDL in the fastest, most complete way possible. Routines within the display library that are used to produce raster graphics are presented in alphabetical order, each on a separate page. Each user-callable routine is described by function and calling parameters. All common blocks that are used in the display library are listed, and the use of each variable within each common block is discussed. A reference lists the include files that are necessary to compile the display library, along with the purpose of each. The link map for MOVIE.BYU version 6, a general-purpose computer graphics display system that uses RGDL software, is also included.

  15. Geospatial data sharing, online spatial analysis and processing of Indian Biodiversity data in Internet GIS domain - A case study for raster based online geo-processing

    NASA Astrophysics Data System (ADS)

    Karnatak, H.; Pandey, K.; Oberai, K.; Roy, A.; Joshi, D.; Singh, H.; Raju, P. L. N.; Krishna Murthy, Y. V. N.

    2014-11-01

    National Biodiversity Characterization at Landscape Level, a project jointly sponsored by the Department of Biotechnology and the Department of Space, was implemented to identify and map potential biodiversity-rich areas in India. The project has generated spatial information at three levels: satellite-based primary information (vegetation type map, spatial locations of roads and villages, fire occurrence); geospatially derived or modelled information (disturbance index, fragmentation, biological richness); and geospatially referenced field sample plots. The study identifies areas of high disturbance and high biological richness, suggesting future management strategies and informing action plans. The study has generated, for the first time, a baseline database for India that will be a valuable input to climate change studies in the Indian subcontinent. The spatial data generated during the study are organized as a central data repository in a geo-RDBMS environment using PostgreSQL and PostGIS. The raster and vector data are published to the OGC WMS and WFS standards for development of a web-based geoinformation system using a Service Oriented Architecture (SOA). The WMS- and WFS-based system allows geo-visualization, online query, and map output generation based on user request and response. This is a typical mashup-architecture geo-information system, which allows access to remote web services such as ISRO Bhuvan, OpenStreetMap, and Google Maps, with overlay on the biodiversity data for effective study of bio-resources. Spatial queries and analysis with vector data are achieved through SQL queries on PostGIS and WFS-T operations. The most important challenge, however, is to develop a system for online raster-based geospatial analysis and processing over a user-defined Area of Interest (AOI) for large raster data sets. The map data of this study comprise approximately 20 GB for each of the five data layers. An attempt has been made to develop a system using

  16. Preparation and Presentation of Digital Maps in Raster Format

    USGS Publications Warehouse

    Edwards, K.; Batson, R.M.

    1980-01-01

    A set of algorithms has been developed at USGS Flagstaff for displaying digital map data in raster format. The set includes: FILLIN, which assigns a specified attribute code to units of a map which have been outlined on a digitizer and converted to raster format; FILBND, which removes the outlines; ZIP, which adds patterns to the map units; and COLOR, which provides a simplified process for creating color separation plates for either photographic or lithographic reproduction. - Authors

  17. Using raster and vector data to identify objects for classify in flood risk. A case study: Raciborz

    NASA Astrophysics Data System (ADS)

    Porczek, Mariusz; Rucińska, Dorota; Lewiński, Stanisław

    2018-01-01

    The severe flood of 1997, which seriously affected Polish, Czech and German territories, gave impetus to research into the management of flood-prone areas. The material losses caused by the "Flood of the Millennium" totalled billions of Polish zloty. The extent of the disaster and of infrastructure repair costs changed the attitude of many branches of the economy, and of science. This is the direct result of consideration of the introduction of changes into spatial management and crisis management. At the same time, it focused the interest of many who were trained in analysing the vulnerability of land-use features to natural disasters such as floods. Research into the spatial distribution of geographic environmental features susceptible to flood in the Odra valley was conducted at the Faculty of Geography and Regional Studies of the University of Warsaw using Geographic Information Systems (GIS). This study seeks to examine the possibility of adapting vector and raster data and using them for land-use classification in the context of risk of flood and inundation damage. The analysed area of the city and surrounding area of Raciborz, on the upper Odra River, is a case study for identifying objects and lands susceptible to natural hazards based on publicly available satellite databases of the highest resolution, which is a very important factor in the quality of further risk analyses for applied use. The objective of the research was to create a 10×10-m-pixel raster network using raster data made available by ESA (Copernicus Land Monitoring Service) and vector data from Open Street Map.

  18. Teaching Raster GIS Operations with Spreadsheets.

    ERIC Educational Resources Information Center

    Raubal, Martin; Gaupmann, Bernhard; Kuhn, Werner

    1997-01-01

    Defines raster technology in its relationship to geographic information systems and notes that it is typically used with the application of remote sensing techniques and scanning devices. Discusses the role of spreadsheets in a raster model, and describes a general approach based on spreadsheets. Includes six computer-generated illustrations. (MJP)

  19. Methods to achieve accurate projection of regional and global raster databases

    USGS Publications Warehouse

    Usery, E.L.; Seong, J.C.; Steinwand, D.R.; Finn, M.P.

    2002-01-01

    This research aims at building a decision support system (DSS) for selecting an optimum projection considering various factors, such as pixel size, areal extent, number of categories, spatial pattern of categories, resampling methods, and error correction methods. Specifically, this research will investigate three goals theoretically and empirically and, using the already developed empirical base of knowledge with these results, develop an expert system for map projection of raster data for regional and global database modeling. The three theoretical goals are as follows: (1) The development of a dynamic projection that adjusts projection formulas for latitude on the basis of raster cell size to maintain equal-sized cells. (2) The investigation of the relationships between the raster representation and the distortion of features, number of categories, and spatial pattern. (3) The development of an error correction and resampling procedure that is based on error analysis of raster projection.
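    The first theoretical goal, adjusting the grid with latitude to keep cells of roughly equal ground area, can be illustrated by scaling the number of columns in each latitude band by the cosine of latitude. This is a hypothetical sketch of the idea, not the authors' dynamic projection formulas:

```python
import math

def cells_per_row(n_rows, equator_cols):
    """For each latitude band, scale the column count by cos(latitude)
    so cells keep roughly equal ground area (meridians converge poleward)."""
    rows = []
    for i in range(n_rows):
        # latitude at the centre of band i, spanning -90 to +90 degrees
        lat = -90.0 + (i + 0.5) * 180.0 / n_rows
        cols = max(1, round(equator_cols * math.cos(math.radians(lat))))
        rows.append(cols)
    return rows

rows = cells_per_row(n_rows=6, equator_cols=360)
```

    A fixed lon/lat grid would give every band 360 columns, so polar cells would cover far less area than equatorial ones; the cosine scaling is the simplest correction for that distortion.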

  20. BOREAS TE-20 Soils Data Over the NSA-MSA and Tower Sites in Raster Format

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Veldhuis, Hugo; Knapp, David

    2000-01-01

    The BOREAS TE-20 team collected several data sets for use in developing and testing models of forest ecosystem dynamics. This data set was gridded from vector layers of soil maps that were received from Dr. Hugo Veldhuis, who did the original mapping in the field during 1994. The vector layers were gridded into raster files that cover the NSA-MSA and tower sites. The data are stored in binary, image format files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  1. Multidirectional Scanning Model, MUSCLE, to Vectorize Raster Images with Straight Lines

    PubMed Central

    Karas, Ismail Rakip; Bayram, Bulent; Batuk, Fatmagul; Akay, Abdullah Emin; Baz, Ibrahim

    2008-01-01

    This paper presents a new model, MUSCLE (Multidirectional Scanning for Line Extraction), for automatic vectorization of raster images with straight lines. The algorithm of the model implements the line thinning and the simple neighborhood methods to perform vectorization. The model allows users to define specified criteria which are crucial for the vectorization process. In this model, various raster images can be vectorized, such as township plans, maps, architectural drawings, and machine plans. The algorithm of the model was implemented in a computer program and tested on a basic application. Results, verified by using two well-known vectorization programs (WinTopo and Scan2CAD), indicated that the model can successfully vectorize the specified raster data quickly and accurately. PMID:27879843

  2. Multidirectional Scanning Model, MUSCLE, to Vectorize Raster Images with Straight Lines.

    PubMed

    Karas, Ismail Rakip; Bayram, Bulent; Batuk, Fatmagul; Akay, Abdullah Emin; Baz, Ibrahim

    2008-04-15

    This paper presents a new model, MUSCLE (Multidirectional Scanning for Line Extraction), for automatic vectorization of raster images with straight lines. The algorithm of the model implements the line thinning and the simple neighborhood methods to perform vectorization. The model allows users to define specified criteria which are crucial for the vectorization process. In this model, various raster images can be vectorized, such as township plans, maps, architectural drawings, and machine plans. The algorithm of the model was implemented in a computer program and tested on a basic application. Results, verified by using two well-known vectorization programs (WinTopo and Scan2CAD), indicated that the model can successfully vectorize the specified raster data quickly and accurately.

  3. The Confluence of GIS, Cloud and Open Source, Enabling Big Raster Data Applications

    NASA Astrophysics Data System (ADS)

    Plesea, L.; Emmart, C. B.; Boller, R. A.; Becker, P.; Baynes, K.

    2016-12-01

    The rapid evolution of available cloud services is profoundly changing the way applications are being developed and used. Massive object stores, service scalability, and continuous integration are some of the most important cloud technology advances that directly influence science applications and GIS. At the same time, more and more scientists are using GIS platforms in their day-to-day research. Yet with new opportunities there are always some challenges. Given the large amount of data commonly required in science applications, usually large raster datasets, connectivity is one of the biggest problems. Connectivity has two aspects: one is the limited bandwidth and latency of the communication link due to the geographical location of the resources; the other is the interoperability and intrinsic efficiency of the interface protocol used to connect. NASA and Esri are actively helping each other and collaborating on a few open source projects, aiming to provide some of the core technology components to directly address the GIS-enabled data connectivity problems. Last year Esri contributed LERC, a very fast and efficient compression algorithm, to the GDAL/MRF format, which itself is a NASA/Esri collaboration project. The MRF raster format has some cloud-aware features that make it possible to build high performance web services on cloud platforms, as some of the Esri projects demonstrate. Currently, another NASA open source project, the high performance OnEarth WMTS server, is being refactored and enhanced to better integrate with MRF, GDAL, and Esri software. Taken together, GDAL, MRF, and OnEarth form the core of an open source CloudGIS toolkit that is already showing results. Since it is well integrated with GDAL, which is the most common interoperability component of GIS applications, this approach should improve the connectivity and performance of many science and GIS applications in the cloud.

  4. Linear beam raster magnet driver based on H-bridge technique

    DOEpatents

    Sinkine, Nikolai I.; Yan, Chen; Apeldoorn, Cornelis; Dail, Jeffrey Glenn; Wojcik, Randolph Frank; Gunning, William

    2006-06-06

    An improved raster magnet driver for a linear particle beam is based on an H-bridge technique. Four branches of power HEXFETs form a two-by-two switch. Switching the HEXFETs in a predetermined order and at the right frequency produces a triangular current waveform. An H-bridge controller controls switching sequence and timing. The magnetic field of the coil follows the shape of the waveform and thus steers the beam using a triangular rather than a sinusoidal waveform. The system produces a raster pattern having a highly uniform raster density distribution, eliminates target heating from non-uniform raster density distributions, and produces higher levels of beam current.

  5. A nearest-neighbor imputation approach to mapping tree species over large areas using forest inventory plots and moderate resolution raster data

    Treesearch

    B. Tyler Wilson; Andrew J. Lister; Rachel I. Riemann

    2012-01-01

    The paper describes an efficient approach for mapping multiple individual tree species over large spatial domains. The method integrates vegetation phenology derived from MODIS imagery and raster data describing relevant environmental parameters with extensive field plot data of tree species basal area to create maps of tree species abundance and distribution at a 250-...

  6. Raster-scanning serial protein crystallography using micro- and nano-focused synchrotron beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coquelle, Nicolas (CNRS, IBS, 38044 Grenoble; CEA, IBS, 38044 Grenoble)

    A raster-scanning serial protein crystallography approach is presented that consumes as little as ∼200–700 nl of sedimented crystals. New serial data pre-analysis software, NanoPeakCell, is introduced. High-resolution structural information was obtained from lysozyme microcrystals (20 µm in the largest dimension) using raster-scanning serial protein crystallography on micro- and nano-focused beamlines at the ESRF. Data were collected at room temperature (RT) from crystals sandwiched between two silicon nitride wafers, thereby preventing their drying, while limiting background scattering and sample consumption. In order to identify crystal hits, new multi-processing and GUI-driven Python-based pre-analysis software was developed, named NanoPeakCell, that was able to read data from a variety of crystallographic image formats. Further data processing was carried out using CrystFEL, and the resultant structures were refined to 1.7 Å resolution. The data demonstrate the feasibility of RT raster-scanning serial micro- and nano-protein crystallography at synchrotrons and validate it as an alternative approach for the collection of high-resolution structural data from micro-sized crystals. Advantages of the proposed approach are its thriftiness, its handling-free nature, the reduced amount of sample required, the adjustable hit rate, the high indexing rate, and the minimization of background scattering.

  7. Methods to achieve accurate projection of regional and global raster databases

    USGS Publications Warehouse

    Usery, E. Lynn; Seong, Jeong Chang; Steinwand, Dan

    2002-01-01

    Modeling regional and global activities of climatic and human-induced change requires accurate geographic data from which we can develop mathematical and statistical tabulations of attributes and properties of the environment. Many of these models depend on data formatted as raster cells or matrices of pixel values. Recently, it has been demonstrated that regional and global raster datasets are subject to significant error from mathematical projection and that these errors are of such magnitude that model results may be jeopardized (Steinwand, et al., 1995; Yang, et al., 1996; Usery and Seong, 2001; Seong and Usery, 2001). There is a need to develop methods of projection that maintain the accuracy of these datasets to support regional and global analyses and modeling

  8. DISPLAY OF PIXEL LOSS AND REPLICATION IN REPROJECTING RASTER DATA FROM THE SINUSOIDAL PROJECTION

    EPA Science Inventory

    Recent studies show the sinusoidal projection to be a superior planar projection for representing global raster datasets. This study uses the sinusoidal projection as a basis for evaluating pixel loss and replication in eight other planar map projections. The percent of pixels ...

  9. Raster and vector processing for scanned linework

    USGS Publications Warehouse

    Greenlee, David D.

    1987-01-01

    An investigation of raster editing techniques, including thinning, filling, and node detecting, was performed by using specialized software. The techniques were based on encoding the state of the 3-by-3 neighborhood surrounding each pixel into a single byte. A prototypical method for converting the edited raster linework into vectors was also developed. Once vector representations of the lines were formed, they were formatted as a Digital Line Graph, and further refined by deletion of nonessential vertices and by smoothing with a curve-fitting technique.
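    The neighborhood-encoding idea, packing the state of the eight neighbours of each pixel into a single byte, can be sketched as follows. The clockwise bit ordering and the toy image are illustrative assumptions, not the USGS software's convention:

```python
import numpy as np

# Bit position for each of the 8 neighbours, clockwise from the NW corner.
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]

def neighborhood_byte(img, r, c):
    """Pack the 3x3 neighbourhood of pixel (r, c) into one byte:
    bit k is set when the k-th neighbour is a foreground (1) pixel.
    Off-image neighbours count as background."""
    byte = 0
    for k, (dr, dc) in enumerate(OFFSETS):
        rr, cc = r + dr, c + dc
        if 0 <= rr < img.shape[0] and 0 <= cc < img.shape[1] and img[rr, cc]:
            byte |= 1 << k
    return byte

# a one-pixel-wide horizontal line segment
img = np.zeros((3, 3), dtype=np.uint8)
img[1, :] = 1
b = neighborhood_byte(img, 1, 1)
```

    With the whole neighbourhood in one byte, thinning, filling, and node detection reduce to 256-entry lookup tables: the bit count of the byte, for instance, gives the line degree at the pixel (2 here, an interior line pixel; 1 would be an end point, 3 or more a node).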

  10. Digital Screening and Halftone Techniques for Raster Processing,

    DTIC Science & Technology

    1980-01-14

    Digital Screening and Halftone Techniques for Raster Processing, by Richard L. Rosenthal. Army Engineer Topographic Laboratories, Fort Belvoir, VA. Approved for public release; distribution unlimited.

  11. Raster Scan Computer Image Generation (CIG) System Based On Refresh Memory

    NASA Astrophysics Data System (ADS)

    Dichter, W.; Doris, K.; Conkling, C.

    1982-06-01

    A full color, Computer Image Generation (CIG) raster visual system has been developed which provides a high level of training sophistication by utilizing advanced semiconductor technology and innovative hardware and firmware techniques. Double buffered refresh memory and efficient algorithms eliminate the problem of conventional raster line ordering by allowing the generated image to be stored in a random fashion. Modular design techniques and simplified architecture provide significant advantages in reduced system cost, standardization of parts, and high reliability. The major system components are a general purpose computer to perform interfacing and data base functions; a geometric processor to define the instantaneous scene image; a display generator to convert the image to a video signal; an illumination control unit which provides final image processing; and a CRT monitor for display of the completed image. Additional optional enhancements include texture generators, increased edge and occultation capability, curved surface shading, and data base extensions.

  12. Parallel processor-based raster graphics system architecture

    DOEpatents

    Littlefield, Richard J.

    1990-01-01

    An apparatus for generating raster graphics images from the graphics command stream includes a plurality of graphics processors connected in parallel, each adapted to receive any part of the graphics command stream for processing the command stream part into pixel data. The apparatus also includes a frame buffer for mapping the pixel data to pixel locations and an interconnection network for interconnecting the graphics processors to the frame buffer. Through the interconnection network, each graphics processor may access any part of the frame buffer concurrently with another graphics processor accessing any other part of the frame buffer. The plurality of graphics processors can thereby transmit concurrently pixel data to pixel locations in the frame buffer.

  13. What's the Point of a Raster? Advantages of 3D Point Cloud Processing over Raster Based Methods for Accurate Geomorphic Analysis of High Resolution Topography.

    NASA Astrophysics Data System (ADS)

    Lague, D.

    2014-12-01

    High Resolution Topographic (HRT) datasets are predominantly stored and analyzed as 2D raster grids of elevations (i.e., Digital Elevation Models). Raster grid processing is common in GIS software and benefits from a large library of fast algorithms dedicated to geometrical analysis, drainage network computation and topographic change measurement. Yet, all instruments or methods currently generating HRT datasets (e.g., ALS, TLS, SFM, stereo satellite imagery) output natively 3D unstructured point clouds that are (i) non-regularly sampled, (ii) incomplete (e.g., submerged parts of river channels are rarely measured), and (iii) include 3D elements (e.g., vegetation, vertical features such as river banks or cliffs) that cannot be accurately described in a DEM. Interpolating the raw point cloud onto a 2D grid generally results in a loss of position accuracy, spatial resolution and in more or less controlled interpolation. Here I demonstrate how studying earth surface topography and processes directly on native 3D point cloud datasets offers several advantages over raster based methods: point cloud methods preserve the accuracy of the original data, can better handle the evaluation of uncertainty associated to topographic change measurements and are more suitable to study vegetation characteristics and steep features of the landscape. In this presentation, I will illustrate and compare Point Cloud based and Raster based workflows with various examples involving ALS, TLS and SFM for the analysis of bank erosion processes in bedrock and alluvial rivers, rockfall statistics (including rockfall volume estimate directly from point clouds) and the interaction of vegetation/hydraulics and sedimentation in salt marshes. These workflows use 2 recently published algorithms for point cloud classification (CANUPO) and point cloud comparison (M3C2) now implemented in the open source software CloudCompare.

  14. The effect of monitor raster latency on VEPs, ERPs and Brain-Computer Interface performance.

    PubMed

    Nagel, Sebastian; Dreher, Werner; Rosenstiel, Wolfgang; Spüler, Martin

    2018-02-01

    Visual neuroscience experiments and Brain-Computer Interface (BCI) control often require strict timings on a millisecond scale. As most experiments are performed using a personal computer (PC), the latencies that are introduced by the setup should be taken into account and corrected. As a standard computer monitor uses rastering to update each line of the image sequentially, this causes a monitor raster latency which depends on the position on the monitor and on the refresh rate. We technically measured the raster latencies of different monitors and present the effects on visual evoked potentials (VEPs) and event-related potentials (ERPs). Additionally, we present a method for correcting the monitor raster latency and analyzed the performance difference of a code-modulated VEP BCI speller when correcting the latency. There are currently no other methods validating the effects of monitor raster latency on VEPs and ERPs. The timings of VEPs and ERPs are directly affected by the raster latency. Furthermore, correcting the raster latency resulted in a significant reduction of the target prediction error from 7.98% to 4.61% and also in a more reliable classification of targets by significantly increasing the distance between the most probable and the second most probable target by 18.23%. The monitor raster latency affects the timings of VEPs and ERPs, and correcting it resulted in a significant error reduction of 42.23%. It is recommended to correct the raster latency for increased BCI performance and methodical correctness. Copyright © 2017 Elsevier B.V. All rights reserved.
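
    The position-dependent latency the authors measure follows directly from the raster model: a row near the bottom of the screen is drawn almost a full refresh period after a row at the top. A small illustrative calculation (idealized, ignoring blanking intervals):

```python
def raster_latency_ms(row, total_rows=1080, refresh_hz=60.0):
    """Approximate time after the frame start until `row` is drawn.

    Idealized model: the beam sweeps rows at a constant rate, so latency
    grows linearly with vertical position. Blanking intervals are ignored.
    """
    frame_ms = 1000.0 / refresh_hz
    return frame_ms * (row / total_rows)

# A stimulus at the bottom of a 60 Hz, 1080-line monitor appears ~16.7 ms
# later than one at the top -- on the order of the effects reported above.
top = raster_latency_ms(0)
bottom = raster_latency_ms(1080)
```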

  15. Raster-scanning serial protein crystallography using micro- and nano-focused synchrotron beams.

    PubMed

    Coquelle, Nicolas; Brewster, Aaron S; Kapp, Ulrike; Shilova, Anastasya; Weinhausen, Britta; Burghammer, Manfred; Colletier, Jacques Philippe

    2015-05-01

    High-resolution structural information was obtained from lysozyme microcrystals (20 µm in the largest dimension) using raster-scanning serial protein crystallography on micro- and nano-focused beamlines at the ESRF. Data were collected at room temperature (RT) from crystals sandwiched between two silicon nitride wafers, thereby preventing their drying, while limiting background scattering and sample consumption. In order to identify crystal hits, new multi-processing and GUI-driven Python-based pre-analysis software was developed, named NanoPeakCell, that was able to read data from a variety of crystallographic image formats. Further data processing was carried out using CrystFEL, and the resultant structures were refined to 1.7 Å resolution. The data demonstrate the feasibility of RT raster-scanning serial micro- and nano-protein crystallography at synchrotrons and validate it as an alternative approach for the collection of high-resolution structural data from micro-sized crystals. Advantages of the proposed approach are its thriftiness, its handling-free nature, the reduced amount of sample required, the adjustable hit rate, the high indexing rate and the minimization of background scattering.
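
    Hit identification of the kind NanoPeakCell performs can be caricatured as thresholding a detector frame; real hit finders use proper connected-peak detection, so the following is only a hypothetical stand-in:

```python
import numpy as np

def is_hit(image, photon_threshold=50.0, min_peaks=10):
    """Flag a detector frame as a crystal hit if enough pixels exceed a
    photon threshold. A deliberately minimal stand-in for real hit
    finders, which detect connected Bragg peaks rather than raw pixels."""
    return int((image > photon_threshold).sum()) >= min_peaks

rng = np.random.default_rng(0)
background = rng.poisson(3.0, size=(64, 64)).astype(float)  # blank frame
hit = background.copy()
hit[10:14, 20:24] = 500.0      # a 16-pixel Bragg-like peak on the raster scan
```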

  16. Tree-based approach for exploring marine spatial patterns with raster datasets.

    PubMed

    Liao, Xiaohan; Xue, Cunjin; Su, Fenzhen

    2017-01-01

    From multiple raster datasets to spatial association patterns, the data-mining task divides into three subtasks, i.e., raster dataset pretreatment, mining algorithm design, and spatial pattern exploration from the mining results. In contrast to the first two subtasks, the third remains unresolved. To deal with interrelated marine environmental parameters, we propose a Tree-based Approach for eXploring Marine Spatial Patterns with multiple raster datasets, called TAXMarSP, which includes two models. One is the Tree-based Cascading Organization Model (TCOM), and the other is the Spatial Neighborhood-based CAlculation Model (SNCAM). TCOM designs the "Spatial node→Pattern node" structure from top to bottom layers to store the table-formatted frequent patterns. Together with TCOM, SNCAM considers spatial neighborhood contributions to calculate the pattern-matching degree between the specified marine parameters and the table-formatted frequent patterns, and then explores the marine spatial patterns. Using the prevalent quantification Apriori algorithm and a real remote sensing dataset from January 1998 to December 2014, a successful application of TAXMarSP to marine spatial patterns in the Pacific Ocean is described, and the obtained marine spatial patterns include not only well-known patterns but also ones new to Earth scientists.
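
    The abstract does not give the SNCAM formula, so the following is only a guessed toy version of a neighborhood-weighted pattern-matching degree, meant to make the idea concrete rather than reproduce the paper's method:

```python
import numpy as np

def match_degree(grid, pattern_value, center, radius=1):
    """Toy SNCAM-style score: fraction of cells in a square neighborhood
    matching `pattern_value`, weighted by inverse Chebyshev distance from
    the center cell. The actual TAXMarSP formulation is not reproduced."""
    r0, c0 = center
    num = den = 0.0
    for r in range(r0 - radius, r0 + radius + 1):
        for c in range(c0 - radius, c0 + radius + 1):
            if 0 <= r < grid.shape[0] and 0 <= c < grid.shape[1]:
                d = max(abs(r - r0), abs(c - c0))   # Chebyshev distance
                w = 1.0 / (1.0 + d)
                num += w * (grid[r, c] == pattern_value)
                den += w
    return num / den

# A discretized parameter raster with a homogeneous patch of class 1
grid = np.array([[1, 1, 2],
                 [1, 1, 2],
                 [3, 3, 2]])
score = match_degree(grid, 1, (0, 0))   # neighborhood is pure class 1
```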

  17. Wildfire spread, hazard and exposure metric raster grids for central Catalonia.

    PubMed

    Alcasena, Fermín J; Ager, Alan A; Salis, Michele; Day, Michelle A; Vega-Garcia, Cristina

    2018-04-01

    We provide 40 m resolution wildfire spread, hazard and exposure metric raster grids for the 0.13 million ha fire-prone Bages County in central Catalonia (northeastern Spain) corresponding to node influence grid (NIG), crown fraction burned (CFB) and fire transmission to residential houses (TR). Fire spread and behavior data (NIG, CFB and fire perimeters) were generated with fire simulation modeling considering wildfire season extreme fire weather conditions (97th percentile). Moreover, CFB was also generated for prescribed fire (Rx) mild weather conditions. The TR smoothed grid was obtained with a geospatial analysis considering large fire perimeters and individual residential structures located within the study area. We made these raster grids available to assist in the optimization of wildfire risk management plans within the study area and to help mitigate potential losses from catastrophic events.

  18. Raster-scanning serial protein crystallography using micro- and nano-focused synchrotron beams

    PubMed Central

    Coquelle, Nicolas; Brewster, Aaron S.; Kapp, Ulrike; Shilova, Anastasya; Weinhausen, Britta; Burghammer, Manfred; Colletier, Jacques-Philippe

    2015-01-01

    High-resolution structural information was obtained from lysozyme microcrystals (20 µm in the largest dimension) using raster-scanning serial protein crystallography on micro- and nano-focused beamlines at the ESRF. Data were collected at room temperature (RT) from crystals sandwiched between two silicon nitride wafers, thereby preventing their drying, while limiting background scattering and sample consumption. In order to identify crystal hits, new multi-processing and GUI-driven Python-based pre-analysis software was developed, named NanoPeakCell, that was able to read data from a variety of crystallographic image formats. Further data processing was carried out using CrystFEL, and the resultant structures were refined to 1.7 Å resolution. The data demonstrate the feasibility of RT raster-scanning serial micro- and nano-protein crystallography at synchrotrons and validate it as an alternative approach for the collection of high-resolution structural data from micro-sized crystals. Advantages of the proposed approach are its thriftiness, its handling-free nature, the reduced amount of sample required, the adjustable hit rate, the high indexing rate and the minimization of background scattering. PMID:25945583

  19. Raster-scanning serial protein crystallography using micro- and nano-focused synchrotron beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coquelle, Nicolas; Brewster, Aaron S.; Kapp, Ulrike

    High-resolution structural information was obtained from lysozyme microcrystals (20 µm in the largest dimension) using raster-scanning serial protein crystallography on micro- and nano-focused beamlines at the ESRF. Data were collected at room temperature (RT) from crystals sandwiched between two silicon nitride wafers, thereby preventing their drying, while limiting background scattering and sample consumption. In order to identify crystal hits, new multi-processing and GUI-driven Python-based pre-analysis software was developed, named NanoPeakCell, that was able to read data from a variety of crystallographic image formats. Further data processing was carried out using CrystFEL, and the resultant structures were refined to 1.7 Å resolution. The data demonstrate the feasibility of RT raster-scanning serial micro- and nano-protein crystallography at synchrotrons and validate it as an alternative approach for the collection of high-resolution structural data from micro-sized crystals. Advantages of the proposed approach are its thriftiness, its handling-free nature, the reduced amount of sample required, the adjustable hit rate, the high indexing rate and the minimization of background scattering.

  20. Raster-scanning serial protein crystallography using micro- and nano-focused synchrotron beams

    DOE PAGES

    Coquelle, Nicolas; Brewster, Aaron S.; Kapp, Ulrike; ...

    2015-04-25

    High-resolution structural information was obtained from lysozyme microcrystals (20 µm in the largest dimension) using raster-scanning serial protein crystallography on micro- and nano-focused beamlines at the ESRF. Data were collected at room temperature (RT) from crystals sandwiched between two silicon nitride wafers, thereby preventing their drying, while limiting background scattering and sample consumption. In order to identify crystal hits, new multi-processing and GUI-driven Python-based pre-analysis software was developed, named NanoPeakCell, that was able to read data from a variety of crystallographic image formats. Further data processing was carried out using CrystFEL, and the resultant structures were refined to 1.7 Å resolution. The data demonstrate the feasibility of RT raster-scanning serial micro- and nano-protein crystallography at synchrotrons and validate it as an alternative approach for the collection of high-resolution structural data from micro-sized crystals. Advantages of the proposed approach are its thriftiness, its handling-free nature, the reduced amount of sample required, the adjustable hit rate, the high indexing rate and the minimization of background scattering.

  1. SERAPHIM: studying environmental rasters and phylogenetically informed movements.

    PubMed

    Dellicour, Simon; Rose, Rebecca; Faria, Nuno R; Lemey, Philippe; Pybus, Oliver G

    2016-10-15

    SERAPHIM ("Studying Environmental Rasters and PHylogenetically Informed Movements") is a suite of computational methods developed to study phylogenetic reconstructions of spatial movement in an environmental context. SERAPHIM extracts the spatio-temporal information contained in estimated phylogenetic trees and uses this information to calculate summary statistics of spatial spread and to visualize dispersal history. Most importantly, SERAPHIM enables users to study the impact of customized environmental variables on the spread of the study organism. Specifically, given an environmental raster, SERAPHIM computes environmental "weights" for each phylogeny branch, which represent the degree to which the environmental variable impedes (or facilitates) lineage movement. Correlations between movement duration and these environmental weights are then assessed, and the statistical significances of these correlations are evaluated using null distributions generated by a randomization procedure. SERAPHIM can be applied to any phylogeny whose nodes are annotated with spatial and temporal information. At present, such phylogenies are most often found in the field of emerging infectious diseases, but will become increasingly common in other biological disciplines as population genomic data grows. SERAPHIM 1.0 is freely available from http://evolve.zoo.ox.ac.uk/. The R package, source code, example files, tutorials and a manual are also available from this website. Contact: simon.dellicour@kuleuven.be or oliver.pybus@zoo.ox.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
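
    To make the notion of a per-branch environmental "weight" concrete, here is a hypothetical simplification (SERAPHIM itself offers several path models; this shows only the straight-line idea of averaging raster values between a branch's endpoints):

```python
import numpy as np

def branch_weight(raster, start, end, n_samples=100):
    """Toy environmental weight: mean raster value sampled along the
    straight line between a branch's start and end node (row, col).
    A simplification -- not SERAPHIM's actual implementation."""
    rows = np.linspace(start[0], end[0], n_samples)
    cols = np.linspace(start[1], end[1], n_samples)
    samples = raster[rows.round().astype(int), cols.round().astype(int)]
    return float(samples.mean())

# An environment that is permissive (1) on the left, resistant (5) on the right
env = np.ones((10, 10))
env[:, 5:] = 5.0
w_left = branch_weight(env, (0, 0), (9, 4))    # branch stays in the left half
w_cross = branch_weight(env, (0, 0), (9, 9))   # branch crosses the barrier
```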

  2. The wildland-urban interface raster dataset of Catalonia.

    PubMed

    Alcasena, Fermín J; Evers, Cody R; Vega-Garcia, Cristina

    2018-04-01

    We provide the wildland urban interface (WUI) map of the autonomous community of Catalonia (Northeastern Spain). The map encompasses an area of some 3.21 million ha and is presented as a 150-m resolution raster dataset. Individual housing location, structure density and vegetation cover data were used to spatially assess in detail the interface, intermix and dispersed rural WUI communities with a geographical information system. Most WUI areas are concentrated in the coastal belt, where suburban sprawl has occurred near or within unmanaged forests. This geospatial dataset provides an approximation of the potential for residential housing loss in a wildfire, and represents a valuable contribution to assist landscape and urban planning in the region.

  3. Raster Files for Utah Play Fairway Analysis

    DOE Data Explorer

    Wannamaker, Phil

    2017-06-16

    This submission contains raster files associated with several datasets that include earthquake density, Na/K geothermometers, fault density, heat flow, and gravity. Integrated together using spatial modeler tools in ArcGIS, these files can be used for play fairway analysis in regard to geothermal exploration.
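
    The "integrated together using spatial modeler tools" step amounts, at its simplest, to a weighted-sum overlay of co-registered evidence rasters. A generic sketch (the weights and layer values below are invented for illustration):

```python
import numpy as np

def favorability(layers, weights):
    """Weighted-sum overlay of co-registered evidence rasters -- the basic
    operation behind a play fairway composite. Each layer is assumed to be
    pre-normalized to [0, 1]; weights are normalized to sum to 1."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * layer for wi, layer in zip(w, layers))

# Hypothetical 2x2 evidence layers (heat flow weighted twice as heavily)
heat_flow  = np.array([[0.9, 0.2], [0.4, 0.1]])
fault_dens = np.array([[0.8, 0.3], [0.2, 0.0]])
quake_dens = np.array([[1.0, 0.1], [0.3, 0.2]])
fairway = favorability([heat_flow, fault_dens, quake_dens], [2, 1, 1])
```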

  4. Custom-made raster method for fistula and graft.

    PubMed

    Blokker, C

    2005-01-01

    Unfamiliarity with fistula and graft characteristics can lead to failed punctures, haematomas and sometimes access occlusion. The Custom-made Raster Method provides detailed shunt visualisation and angiographic images by using photo-editing software. Access veins of an individual shunt and an adapted raster are projected on a digital picture of the arm. During angiography the shunt arm is immobilised and a digital picture is taken from a fixed vertical angle and distance. Reference points are marked on the shunt arm, which serve as anchors for drawing a raster with coordination points. In this way a picture is created similar to a roadmap with veins. Complete integration of digital and radiological images is achieved using the software programmes Adobe Photoshop + Illustrator or Agfa Web 1000 under Windows XP. All illustrations fit 1:1 by scaling up or down without distortion. Editing with Photoshop gives a precise projection of shunt veins on the real coloured background of the digital photograph. In this projection the grey angiography background is made completely transparent. The system can contain more detailed information in combination with echo (duplex) images of depth and diameter. This visualisation method is a useful tool for multidisciplinary access meetings with interventional radiologists, access surgeons and nephrologists. Access malfunction, aneurysms and stenosis can be projected at the exact location. The system leads to clear and concrete puncture advice. Transfer of access information and communication to other dialysis centres is facilitated.

  5. Software-based data path for raster-scanned multi-beam mask lithography

    NASA Astrophysics Data System (ADS)

    Rajagopalan, Archana; Agarwal, Ankita; Buck, Peter; Geller, Paul; Hamaker, H. Christopher; Rao, Nagswara

    2016-10-01

    well to hundreds or even thousands of CPU-cores, offering the potential for virtually unlimited capacity. Features available in EDA software such as sizing, scaling, tone reversal, OPC, MPC, rasterization, and others are easily adapted to the requirements of a data path system. This paper presents the motivation, requirements, design and performance of an advanced, scalable software data path system suitable to support multi-beam laser mask lithography.

  6. Parallel line raster eliminates ambiguities in reading timing of pulses less than 500 microseconds apart

    NASA Technical Reports Server (NTRS)

    Horne, A. P.

    1966-01-01

    Parallel horizontal line raster is used for precision timing of events occurring less than 500 microseconds apart for observation of hypervelocity phenomena. The raster uses a staircase vertical deflection and eliminates ambiguities in reading timing of pulses close to the end of each line.
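
    The timing readout implied above is simple arithmetic once the line index is unambiguous: elapsed time is the number of completed lines plus the fraction of the current line traversed. A hypothetical illustration assuming a 100 µs line period:

```python
def pulse_time_us(line_index, fraction_along_line, line_period_us=100.0):
    """Time of an event on a parallel-line raster record: completed lines
    plus the fraction of the current line. The staircase vertical sweep
    described above exists precisely so that `line_index` stays unambiguous
    even for pulses recorded near the end of a line."""
    return (line_index + fraction_along_line) * line_period_us

# Two pulses: one 3/4 along line 0, one 1/4 along line 3 -> 250 us apart
dt = pulse_time_us(3, 0.25) - pulse_time_us(0, 0.75)
```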

  7. DEFINITION OF MULTIVARIATE GEOCHEMICAL ASSOCIATIONS WITH POLYMETALLIC MINERAL OCCURRENCES USING A SPATIALLY DEPENDENT CLUSTERING TECHNIQUE AND RASTERIZED STREAM SEDIMENT DATA - AN ALASKAN EXAMPLE.

    USGS Publications Warehouse

    Jenson, Susan K.; Trautwein, C.M.

    1984-01-01

    The application of an unsupervised, spatially dependent clustering technique (AMOEBA) to interpolated raster arrays of stream sediment data has been found to provide useful multivariate geochemical associations for modeling regional polymetallic resource potential. The technique is based on three assumptions regarding the compositional and spatial relationships of stream sediment data and their regional significance. These assumptions are: (1) compositionally separable classes exist and can be statistically distinguished; (2) the classification of multivariate data should minimize the pair probability of misclustering to establish useful compositional associations; and (3) a compositionally defined class represented by three or more contiguous cells within an array is a more important descriptor of a terrane than a class represented by spatial outliers.

  8. Estimating FIA plot characteristics using NAIP imagery, function modeling, and the RMRS Raster Utility coding library

    Treesearch

    John S. Hogland; Nathaniel M. Anderson

    2015-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that...

  9. Estimating FIA plot characteristics using NAIP imagery, function modeling, and the RMRS raster utility coding library

    Treesearch

    John S. Hogland; Nathaniel M. Anderson

    2015-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...

  10. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

    Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples of such applications are risk mitigation, disaster management, post-disaster recovery planning and catastrophe loss estimation and risk management. Due to the lack of proper knowledge of the factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical means of capturing spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used to represent geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling. Examples of such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology improves several aspects of the standard analytical tools currently used for assessment and mapping of regional seismic hazard, makes good use of recent advancements in both software and hardware, and is well structured for implementation using conventional GIS tools.

  11. Oregon Cascades Play Fairway Analysis: Raster Datasets and Models

    DOE Data Explorer

    Adam Brandt

    2015-11-15

    This submission includes maps of the spatial distribution of basaltic and felsic rocks in the Oregon Cascades. It also includes a final Play Fairway Analysis (PFA) model, with the heat and permeability composite risk segments (CRS) supplied separately. Metadata for each raster dataset can be found within the zip files, in the TIF images.

  12. New implementation of OGC Web Processing Service in Python programming language. PyWPS-4 and issues we are facing with processing of large raster data using OGC WPS

    NASA Astrophysics Data System (ADS)

    Čepický, Jáchym; Moreira de Sousa, Luís

    2016-06-01

    The OGC® Web Processing Service (WPS) Interface Standard provides rules for standardizing inputs and outputs (requests and responses) for geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process, and how the output from the process is handled. It defines an interface that facilitates the publishing of geospatial processes, client discovery of those processes, and their binding into workflows. Data required by a WPS can be delivered across a network, or they can be available at the server. PyWPS was one of the first implementations of OGC WPS on the server side. It is written in the Python programming language and tries to connect to all existing tools for geospatial data analysis available on the Python platform. During the last two years, the PyWPS development team has written a new version (called PyWPS-4) completely from scratch. The analysis of large raster datasets poses several technical issues in implementing the WPS standard. The data format has to be defined and validated on the server side, and binary data have to be encoded using some numeric representation. Pulling raster data from remote servers introduces security risks; in addition, running several processes in parallel has to be possible, so that system resources are used efficiently while preserving security. Here we discuss these topics and illustrate some of the solutions adopted within the PyWPS implementation.
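
    One of the encoding issues mentioned, shipping binary raster payloads inside an XML Execute request, is conventionally handled with base64-encoded ComplexData. A minimal stdlib round trip (the TIFF header stub below is invented for illustration; a real server would also validate the declared mimeType against the bytes):

```python
import base64

def encode_complex_data(raw_bytes):
    """Encode a binary raster payload as base64 text, as commonly done for
    ComplexData embedded in a WPS Execute request body."""
    return base64.b64encode(raw_bytes).decode("ascii")

def decode_complex_data(text):
    """Recover the original binary payload on the server side."""
    return base64.b64decode(text)

tiff_magic = b"II*\x00" + bytes(16)   # little-endian TIFF header stub (invented)
payload = encode_complex_data(tiff_magic)
restored = decode_complex_data(payload)
```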

  13. Population weighted raster maps can communicate findings of social audits: examples from three continents.

    PubMed

    Mitchell, Steven; Cockcroft, Anne; Andersson, Neil

    2011-12-21

    Maps can portray trends, patterns, and spatial differences that might be overlooked in tabular data and are now widely used in health research. Little has been reported about the process of using maps to communicate epidemiological findings. Population weighted raster maps show colour changes over the study area. Similar to the rasters of barometric pressure in a weather map, the data are the health occurrence: a peak on the map represents a higher value of the indicator in question. The population relevance of each sentinel site, as determined in the stratified last-stage random sample, combines with geography (inverse-distance weighting) to provide a population-weighted extension of each colour. This transforms the map to show population space rather than simply geographic space. Maps allowed discussion of strategies to reduce violence against women in a context of political sensitivity about quoting summary indicator figures. Time-series maps showed planners how experiences of health services had deteriorated despite a reform programme; where in a country HIV risk behaviours were improving; and how knowledge of an economic development programme quickly fell off across a region. Change maps highlighted where indicators were improving and where they were deteriorating. Maps of potential impact of interventions, based on multivariate modelling, displayed how partial and full implementation of programmes could improve outcomes across a country. Scale depends on context. To support local planning, district maps or local government authority maps of health indicators were more useful than national maps; but multinational maps of outcomes were more useful for regional institutions. Mapping was useful to illustrate in which districts enrolment in religious schools--a rare occurrence--was more prevalent. Population weighted raster maps can present social audit findings in an accessible and compelling way, increasing the use of evidence by planners with limited numeracy.
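
    The inverse-distance weighting mentioned above can be sketched as follows; this toy version omits the population-relevance weights that the full method additionally applies per sentinel site:

```python
import numpy as np

def idw_surface(sites, values, shape, power=2.0):
    """Inverse-distance-weighted raster from sentinel-site values: each
    cell is a weighted average of all sites, with weights falling off as
    distance**(-power). Population weights would scale each site further."""
    rows, cols = np.indices(shape)
    out = np.zeros(shape)
    wsum = np.zeros(shape)
    for (r, c), v in zip(sites, values):
        d2 = (rows - r) ** 2 + (cols - c) ** 2
        w = 1.0 / np.maximum(d2, 1e-12) ** (power / 2.0)  # clamp at the site itself
        out += w * v
        wsum += w
    return out / wsum

# Two sentinel sites with indicator values 10 and 2 on a 5x5 grid
surface = idw_surface([(0, 0), (4, 4)], [10.0, 2.0], (5, 5))
```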

  14. Micromachined mirrors for raster-scanning displays and optical fiber switches

    NASA Astrophysics Data System (ADS)

    Hagelin, Paul Merritt

    Micromachines and micro-optics have the potential to shrink the size and cost of free-space optical systems, enabling a new generation of high-performance, compact projection displays and telecommunications equipment. In raster-scanning displays and optical fiber switches, a free-space optical beam can interact with multiple tilt-up micromirrors fabricated on a single substrate. The size, rotation angle, and flatness of the mirror surfaces determine the number of pixels in a raster-display or ports in an optical switch. Single-chip and two-chip optical raster display systems demonstrate static mirror curvature correction, an integrated electronic driver board, and dynamic micromirror performance. Correction for curvature caused by a stress gradient in the micromirror leads to resolution of 102 by 119 pixels in the single-chip display. The optical design of the two-chip display features in-situ mirror curvature measurement and adjustable image magnification with a single output lens. An electronic driver board synchronizes modulation of the optical source with micromirror actuation for the display of images. Dynamic off-axis mirror motion is shown to have minimal influence on resolution. The confocal switch, a free-space optical fiber cross-connect, incorporates micromirrors having a design similar to the image-refresh scanner. Two micromirror arrays redirect optical beams from an input fiber array to the output fibers. The switch architecture supports simultaneous switching of multiple wavelength channels. A 2x2 switch configuration, using single-mode optical fiber at 1550 nm, is demonstrated with insertion loss of -4.2 dB and cross-talk of -50.5 dB. The micromirrors have sufficient size and angular range for scaling to a 32x32 cross-connect switch that has low insertion-loss and low cross-talk.

  15. Interactive Raster Data Structure Study.

    DTIC Science & Technology

    1983-01-01

    update. In a cartographic editing environment the data to be used is very temporal and dynamic in nature. Changes in the data being addressed should...draft them. This allowed for the natural conceptualization of digital data as features. This trait has been observed as a result of functions being...performing cartographic manipulations. In the compilation and revision environment where data is temporal in nature, additional processing leads to

  16. BOREAS TGB-5 Fire History of Manitoba 1980 to 1991 in Raster Format

    NASA Technical Reports Server (NTRS)

    Stocks, Brian J.; Zepp, Richard; Knapp, David; Hall, Forrest G. (Editor); Conrad, Sara K. (Editor)

    2000-01-01

    The BOReal Ecosystem-Atmosphere Study Trace Gas Biogeochemistry (BOREAS TGB-5) team collected several data sets related to the effects of fire on the exchange of trace gases between the surface and the atmosphere. This raster format data set covers the province of Manitoba between 1980 and 1991. The data were gridded into the Albers Equal-Area Conic (AEAC) projection from the original vector data. The original vector data were produced by Forestry Canada from hand-drawn boundaries of fires on photocopies of 1:250,000-scale maps. The locational accuracy of the data is considered fair to poor. When the locations of some fire boundaries were compared to Landsat TM images, they were found to be off by as much as a few kilometers. This problem should be kept in mind when using these data. The data are stored in binary, image format files.

  17. Recommended GIS Analysis Methods for Global Gridded Population Data

    NASA Astrophysics Data System (ADS)

    Frye, C. E.; Sorichetta, A.; Rose, A.

    2017-12-01

    When using geographic information systems (GIS) to analyze gridded, i.e., raster, population data, analysts need a detailed understanding of several factors that affect raster data processing, and thus, the accuracy of the results. Global raster data is most often provided in an unprojected state, usually in the WGS 1984 geographic coordinate system. Most GIS functions and tools evaluate data based on overlay relationships (area) or proximity (distance). Area and distance for global raster data can be calculated either directly on the various earth ellipsoids or after transforming the data to equal-area/equidistant projected coordinate systems so that all locations are analyzed equally. However, unlike when projecting vector data, not all projected coordinate systems can support such analyses equally, and the process of transforming raster data from one coordinate space to another often results in unmanaged loss of data through a process called resampling. Resampling determines which values to use in the result dataset given an imperfect locational match in the input dataset(s). Cell size or resolution, registration, resampling method, statistical type, and whether the raster represents continuous or discrete information all potentially influence the quality of the result. Gridded population data represent estimates of population in each raster cell, and this presentation will provide guidelines for accurately transforming population rasters for analysis in GIS. Resampling impacts the display of high-resolution global gridded population data, and we will discuss how to properly handle pyramid creation using the Aggregate tool with the sum option to create overviews for mosaic datasets.
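
    The "Aggregate tool with the sum option" recommendation exists because summing is the only resampling statistic that conserves total population when coarsening a count raster (mean or nearest-neighbour resampling would not). A numpy sketch of sum aggregation:

```python
import numpy as np

def aggregate_sum(pop, factor):
    """Coarsen a population-count raster by an integer factor using SUM.

    Reshape each (factor x factor) block into its own axes and sum them,
    so the total population is exactly preserved in the coarser grid."""
    r, c = pop.shape
    assert r % factor == 0 and c % factor == 0, "grid must divide evenly"
    return pop.reshape(r // factor, factor, c // factor, factor).sum(axis=(1, 3))

pop = np.arange(16, dtype=float).reshape(4, 4)   # toy 4x4 population counts
coarse = aggregate_sum(pop, 2)                   # 2x2 grid, totals conserved
```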

  18. Special raster scanning for reduction of charging effects in scanning electron microscopy.

    PubMed

    Suzuki, Kazuhiko; Oho, Eisaku

    2014-01-01

    A special raster scanning (SRS) method for reduction of charging effects is developed for the field of SEM. Both a conventional fast scan (horizontal direction) and an unusual scan (vertical direction) are adopted for acquiring raw data consisting of many sub-images. These data are converted to a proper SEM image using digital image processing techniques. In terms of image sharpness and reduction of charging effects, SRS is compared with the conventional fast scan (with frame-averaging) and the conventional slow scan. Experimental results show the effectiveness of SRS images. By a successful combination of the proposed scanning method and low accelerating voltage (LV)-SEMs, it is expected that higher-quality SEM images can be more easily acquired through the considerable reduction of charging effects, while maintaining resolution. © 2013 Wiley Periodicals, Inc.

  19. Module greenhouse with high efficiency of transformation of solar energy, utilizing active and passive glass optical rasters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korecko, J.; Jirka, V.; Sourek, B.

    2010-10-15

    Since the eighties of the 20th century, various types of linear glass rasters for architectural use, made by continuous melting technology, have been developed in the Czech Republic. The development was focused on two main groups of rasters: active rasters with linear Fresnel lenses in fixed installation and with movable photo-thermal and/or photo-thermal/photo-voltaic absorbers, and passive rasters based on total reflection of rays on an optical prism. In recent years we have been working on their standardization, exact measurement of their optical and thermal-technical characteristics, and the creation of a final product that could be applied in solar architecture. With a project supported by the Ministry of Environment of the Czech Republic we were able to build an experimental greenhouse using these active and passive optical glass rasters. The project followed a growing number of technical objectives. The concept of the greenhouse consisted of an interdependent structural design together with the technological equipment securing the required temperature and humidity conditions in the interior of the greenhouse. This article aims to show the merits of the proposed scheme and presents the results of a mathematical model in the TRNSYS environment, through which the future energy balance of similar installations can be predicted, thus optimizing investment and operating costs. A description of various technology applications for passive and active utilization of solar radiation is presented, as well as results of short-term and long-term experiments, including an evaluation of one year of greenhouse operation from the energy and interior-temperature viewpoints. A comparison of the calculated energy flows in the greenhouse with measured values, for verification of the installed model, is also included.

  20. Raster microdiffraction with synchrotron radiation of hydrated biopolymers with nanometre step-resolution: case study of starch granules

    PubMed Central

    Riekel, C.; Burghammer, M.; Davies, R. J.; Di Cola, E.; König, C.; Lemke, H.T.; Putaux, J.-L.; Schöder, S.

    2010-01-01

    X-ray radiation damage propagation is explored for hydrated starch granules in order to reduce the step resolution in raster-microdiffraction experiments to the nanometre range. Radiation damage was induced by synchrotron radiation microbeams of 5, 1 and 0.3 µm size with ∼0.1 nm wavelength in B-type potato, Canna edulis and Phajus grandifolius starch granules. A total loss of crystallinity of granules immersed in water was found at a dose of ∼1.3 photons nm−3. The temperature dependence of radiation damage suggests that primary radiation damage prevails up to about 120 K while secondary radiation damage becomes effective at higher temperatures. Primary radiation damage remains confined to the beam track at 100 K. Propagation of radiation damage beyond the beam track at room temperature is assumed to be due to reactive species generated principally by water radiolysis induced by photoelectrons. By careful dose selection during data collection, raster scans with 500 nm step-resolution could be performed for granules immersed in water. PMID:20975219

  1. Geometrical E-beam proximity correction for raster scan systems

    NASA Astrophysics Data System (ADS)

    Belic, Nikola; Eisenmann, Hans; Hartmann, Hans; Waas, Thomas

    1999-04-01

    High pattern fidelity is a basic requirement for the generation of masks containing sub-micron structures and for direct writing. Increasing needs, mainly emerging from OPC at mask level and x-ray lithography, require a correction of the e-beam proximity effect. Most e-beam writers are raster scan systems. This paper describes a new method for geometrical pattern correction in order to provide a correction solution for e-beam systems that are not able to apply variable doses.

  2. A hybrid structure for the storage and manipulation of very large spatial data sets

    USGS Publications Warehouse

    Peuquet, Donna J.

    1982-01-01

    The map data input and output problem for geographic information systems is rapidly diminishing with the increasing availability of mass digitizing, direct spatial data capture and graphics hardware based on raster technology. Although a large number of efficient raster-based algorithms exist for performing a wide variety of common tasks on these data, there are a number of procedures which are more efficiently performed in vector mode or for which raster mode equivalents of current vector-based techniques have not yet been developed. This paper presents a hybrid spatial data structure, named the 'vaster' structure, which can utilize the advantages of both raster and vector structures while potentially eliminating, or greatly reducing, the need for raster-to-vector and vector-to-raster conversion. Other advantages of the vaster structure are also discussed.

  3. elevatr: Access Elevation Data from Various APIs

    EPA Science Inventory

    Several web services are available that provide access to elevation data. This package provides access to several of those services and returns elevation data either as a SpatialPointsDataFrame from point elevation services or as a raster object from raster elevation services. ...

  4. The TV Turtle: A LOGO Graphics System for Raster Displays. AI Memo 361.

    ERIC Educational Resources Information Center

    Lieberman, Henry

    This discussion of the advantages and limitations of raster graphics systems points out that until recently, most computer graphics systems have been oriented toward the display of line drawings, continually refreshing the screen from a display list of vectors. Developments such as plasma panel displays and rapidly declining memory prices have now…

  5. Dynamic analysis, transformation, dissemination and applications of scientific multidimensional data in ArcGIS Platform

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Collow, T. W.; Rose, B.

    2016-12-01

    Scientific datasets are generated from various sources and platforms but are typically produced either by earth observation systems or by modelling systems. These are widely used for monitoring, simulating, or analyzing measurements that are associated with physical, chemical, and biological phenomena over the ocean, atmosphere, or land. A significant subset of scientific datasets stores values directly as rasters or in a form that can be rasterized, where a value exists at every cell in a regular grid spanning the spatial extent of the dataset. Government agencies like NOAA, NASA, EPA, and USGS produce large volumes of near real-time, forecast, and historical data that drive climatological and meteorological studies and underpin operations ranging from weather prediction to sea ice monitoring. Modern science is computationally intensive because of the availability of an enormous amount of scientific data, the adoption of data-driven analysis, and the need to share these datasets and research results with the public. ArcGIS as a platform is sophisticated and capable of handling such complex domains. We'll discuss constructs and capabilities applicable to multidimensional gridded data that can be conceptualized as a multivariate space-time cube. Building on the concept of a two-dimensional raster, a typical multidimensional raster dataset could contain several "slices" within the same spatial extent. We will share a case from the NOAA Climate Forecast System Reanalysis (CFSR) multidimensional data as an example of how large collections of rasters can be efficiently organized and managed through a data model within a geodatabase called a "Mosaic dataset" and dynamically transformed and analyzed using raster functions. A raster function is a lightweight, raster-valued transformation defined over a mixed set of raster and scalar inputs. That means, just like any tool, you can provide a raster function with input parameters. 
It enables dynamic processing of only the
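As a rough illustration of the raster-function idea described above (the names here are invented for the sketch, not the ArcGIS API): a raster function can be modeled as a lazy transformation that computes values only for the cells actually requested, and such transformations compose.

```python
# Toy model of an on-demand raster function: composition is lazy, and
# evaluation happens per requested cell rather than over the whole grid.
# (Illustrative names only; this is not the ArcGIS raster function API.)

def raster_function(fn, source):
    """Wrap a cell-accessor with a per-value transformation."""
    return lambda r, c: fn(source(r, c))

data = [[280.0, 281.5], [279.0, 283.2]]      # e.g. a temperature slice, kelvin
base = lambda r, c: data[r][c]

to_celsius = raster_function(lambda v: v - 273.15, base)
heat_flag  = raster_function(lambda v: v > 8.0, to_celsius)   # chained functions

value = to_celsius(1, 1)    # only this one cell is computed on demand
```

Chaining `heat_flag` on top of `to_celsius` mirrors how raster function templates stack processing steps without materializing intermediate rasters.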

  6. ArcGIS Framework for Scientific Data Analysis and Serving

    NASA Astrophysics Data System (ADS)

    Xu, H.; Ju, W.; Zhang, J.

    2015-12-01

    ArcGIS is a platform for managing, visualizing, analyzing, and serving geospatial data. Scientific data, as part of geospatial data, feature multiple dimensions (X, Y, time, and depth) and large volume. The multidimensional mosaic dataset (MDMD), a newly enhanced data model in ArcGIS, models multidimensional gridded data (e.g. raster or image) as a hypercube and enables ArcGIS to handle large-volume and near-real-time scientific data. Built on top of the geodatabase, the MDMD stores the dimension values and the variables (2D arrays) in a geodatabase table, which allows accessing a slice or slices of the hypercube through a simple query and supports animating changes along the time or vertical dimension using ArcGIS desktop or web clients. Through raster types, the MDMD can manage not only netCDF, GRIB, and HDF formats but also many other formats and satellite data. It is scalable and can handle large data volumes. The parallel geo-processing engine makes data ingestion fast and easy. A raster function, the definition of a raster processing algorithm, is a very important component in the ArcGIS platform for on-demand raster processing and analysis. Scientific data analytics is achieved through the MDMD and raster function templates, which perform on-demand scientific computation with variables ingested in the MDMD: for example, aggregating a monthly average from daily data, computing total rainfall for a year, calculating a heat index for forecast data, and identifying fishing habitat zones. Additionally, the MDMD with its associated raster function templates can be served through ArcGIS Server as image services, which provide a framework for on-demand server-side computation and analysis; the published services can be accessed by multiple clients such as ArcMap, ArcGIS Online, JavaScript, REST, WCS, and WMS. This presentation will focus on the MDMD model and raster processing templates. 
In addition, MODIS land cover, NDFD weather service, and HYCOM ocean model

  7. Investigating MALDI MSI parameters (Part 1) - A systematic survey of the effects of repetition rates up to 20kHz in continuous raster mode.

    PubMed

    Steven, Rory T; Dexter, Alex; Bunch, Josephine

    2016-07-15

    Recent developments in laser performance, combined with the desire for increases in detected ion intensity and throughput, have led to the adoption of high repetition-rate diode-pumped solid-state (DPSS) lasers in matrix-assisted laser desorption/ionization (MALDI) mass spectrometry imaging (MSI). Previous studies have demonstrated a more complex relationship between detected ion intensity, stage raster speed and laser pulse repetition rate than the simple linear relationship between number of pulses and detected ion intensity that might be expected. Here we report, for the first time, the interrelated influence of varying laser energy, repetition rate and stage raster speed on detected ion intensity. Thin films of PC 34:1 lipid standard and murine brain tissue with CHCA are analysed by continuous stage raster MALDI MSI. Contrary to previous reports, the optimum laser repetition rate is found to depend on both laser energy and stage raster speed, and can be as high as 20kHz under some conditions. The effects of different repetition rates and raster speeds are also found to vary for different ion species within MALDI MSI of tissue and so may be significant when either targeting specific molecules or seeking to minimize bias. A clear dependence on the time between laser pulses is also observed, indicating that the underlying mechanisms may be related to on-plate hysteresis-exhibiting processes such as matrix chemical modification. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Design study of a raster scanning system for moving target irradiation in heavy-ion radiotherapy.

    PubMed

    Furukawa, Takuji; Inaniwa, Taku; Sato, Shinji; Tomitani, Takehiro; Minohara, Shinichi; Noda, Koji; Kanai, Tatsuaki

    2007-03-01

    A project to construct a new treatment facility as an extension of the existing Heavy Ion Medical Accelerator in Chiba (HIMAC) facility has been initiated for further development of carbon-ion therapy. The greatest challenge of this project is to realize treatment of a moving target by scanning irradiation. For this purpose, we decided to combine the rescanning technique and the gated irradiation method. To determine how to avoid hot and/or cold spots with a relatively large number of rescannings within an acceptable irradiation time, we have studied the scanning strategy, the scanning magnets and their control, and dynamic control of beam intensity. We have designed a raster scanning system and carried out a simulation of irradiating moving targets. The result shows the possibility of practical realization of moving target irradiation with pencil beam scanning. We describe the present status of our design study of the raster scanning system for the HIMAC new treatment facility.

  9. Volumetric three-dimensional display system with rasterization hardware

    NASA Astrophysics Data System (ADS)

    Favalora, Gregg E.; Dorval, Rick K.; Hall, Deirdre M.; Giovinco, Michael; Napoli, Joshua

    2001-06-01

    An 8-color multiplanar volumetric display is being developed by Actuality Systems, Inc. It will be capable of utilizing an image volume greater than 90 million voxels, which we believe is the greatest utilizable voxel set of any volumetric display constructed to date. The display is designed to be used for molecular visualization, mechanical CAD, e-commerce, entertainment, and medical imaging. As such, it contains a new graphics processing architecture, novel high-performance line-drawing algorithms, and an API similar to a current standard. Three-dimensional imagery is created by projecting a series of 2-D bitmaps ('image slices') onto a diffuse screen that rotates at 600 rpm. Persistence of vision fuses the slices into a volume-filling 3-D image. A modified three-panel Texas Instruments projector provides slices at approximately 4 kHz, resulting in 8-color 3-D imagery comprising roughly 200 radially-disposed slices which are updated at 20 Hz. Each slice has a resolution of 768 by 768 pixels, subtending 10 inches. An unusual off-axis projection scheme incorporating tilted rotating optics is used to maintain good focus across the projection screen. The display electronics includes a custom rasterization architecture which converts the user's 3-D geometry data into image slices, as well as 6 Gbits of DDR SDRAM graphics memory.

  10. Raster-scan optoacoustic angiography reveals 3D microcirculatory changes during cuffed occlusion

    NASA Astrophysics Data System (ADS)

    Subochev, Pavel; Orlova, Anna; Smolina, Ekaterina; Kirillov, Aleksey; Shakhova, Natalia; Turchin, Ilya

    2018-04-01

    Acoustic resolution photoacoustic microscopy at the optical wavelength of 532 nm was used to investigate the functional reaction of blood vessels of healthy human skin during cuffed venous occlusion. The high bandwidth of the polyvinylidene difluoride detector provided the opportunity for raster-scan optoacoustic angiography of both the superficial and deep plexuses at the high resolution of 35/50 microns (axial/lateral). A reversible increase of blood supply in the microcirculatory bed during occlusion was revealed.

  11. The effect of choosing three different C factor formulae derived from NDVI on a fully raster-based erosion modelling

    NASA Astrophysics Data System (ADS)

    Sulistyo, Bambang

    2016-11-01

    The research was aimed at studying the effect of choosing three different C factor formulae derived from NDVI on a fully raster-based erosion modelling of the USLE using remote sensing data and GIS techniques. The method was to analyse all factors affecting erosion such that all data were in raster form. Those data were the R, K, LS, C and P factors. The monthly R factor was evaluated based on a formula developed by Abdurachman. The K factor was determined using a modified formula used by the Ministry of Forestry, based on soil samples taken in the field. The LS factor was derived from a Digital Elevation Model. The three C factors used were all derived from NDVI and developed by Suriyaprasit (non-linear) and by Sulistyo (linear and non-linear). The P factor was derived from the combination of slope data and landcover classification interpreted from Landsat 7 ETM+. A further analysis was the creation of a map of bulk density, used to convert the erosion unit. To assess model accuracy, model validation was done by applying statistical analysis and by comparing Emodel with Eactual; a threshold value of ≥ 0.80 (≥ 80%) was chosen as the criterion. The results showed that the Emodel variants using all three C factor formulae have correlation coefficients > 0.8. Analysis of variance showed a significant difference between Emodel and Eactual when using the C factor formulae developed by Suriyaprasit and by Sulistyo (non-linear). Among the three formulae, only the Emodel using the C factor formula developed by Sulistyo (linear) reached an accuracy of 81.13%, while the others reached only 56.02% (Sulistyo, non-linear) and 4.70% (Suriyaprasit).
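A fully raster-based USLE computation like the one described above amounts to a cell-by-cell product of the factor grids, E = R · K · LS · C · P. A minimal sketch with illustrative values (not data from the study):

```python
# Sketch of a fully raster-based USLE computation: every factor is a grid
# of the same shape, and erosion is the cell-by-cell product
# E = R * K * LS * C * P. All values below are illustrative only.

def usle(R, K, LS, C, P):
    rows, cols = len(R), len(R[0])
    return [[R[r][c] * K[r][c] * LS[r][c] * C[r][c] * P[r][c]
             for c in range(cols)] for r in range(rows)]

R  = [[1200.0, 1300.0], [1250.0, 1280.0]]   # rainfall erosivity
K  = [[0.25, 0.30], [0.28, 0.26]]           # soil erodibility
LS = [[1.1, 2.4], [0.9, 1.7]]               # slope length / steepness
C  = [[0.05, 0.20], [0.10, 0.08]]           # cover factor (e.g. from NDVI)
P  = [[1.0, 0.5], [1.0, 0.8]]               # support practice

E = usle(R, K, LS, C, P)                    # erosion estimate per cell
```

In a GIS this is the same overlay operation the abstract relies on, executed by map algebra over co-registered rasters.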

  12. Methods for Data-based Delineation of Spatial Regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, John E.

    In data analysis, it is often useful to delineate or segregate areas of interest from the general population of data in order to concentrate further analysis efforts on smaller areas. Three methods are presented here for automatically generating polygons around spatial data of interest. Each method addresses a distinct data type. These methods were developed for and implemented in the sample planning tool called Visual Sample Plan (VSP). Method A is used to delineate areas of elevated values in a rectangular grid of data (raster). The data used for this method are spatially related. Although VSP uses data from a kriging process for this method, it will work for any type of data that is spatially coherent and appears on a regular grid. Method B is used to surround areas of interest characterized by individual data points that are congregated within a certain distance of each other. Areas where data are "clumped" together spatially will be delineated. Method C is used to recreate the original boundary in a raster of data that separated data values from non-values. This is useful when a rectangular raster of data contains non-values (missing data) that indicate they were outside of some original boundary. If the original boundary is not delivered with the raster, this method will approximate the original boundary.
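Method B, as described, groups points that congregate within a certain distance of each other. A toy sketch of that clustering step, using union-find over pairwise distances (VSP's actual polygon-generation algorithm is not reproduced here; polygons would then be drawn around each group):

```python
# Illustrative clustering step for "Method B": merge points that lie
# within max_dist of one another into clumps via union-find.
import math

def cluster_points(points, max_dist):
    parent = list(range(len(points)))

    def find(i):                      # find root with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= max_dist:
                union(i, j)

    clusters = {}
    for i in range(len(points)):
        clusters.setdefault(find(i), []).append(points[i])
    return list(clusters.values())

pts = [(0, 0), (1, 0), (0.5, 0.5), (10, 10), (10.5, 10)]
groups = cluster_points(pts, 2.0)     # two clumps: near origin, near (10, 10)
```

Each returned group is a candidate "clump" around which a delineating polygon (e.g. a buffered hull) could be generated.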

  13. Automated extraction of chemical structure information from digital raster images

    PubMed Central

    Park, Jungkap; Rosania, Gus R; Shedden, Kerby A; Nguyen, Mandee; Lyu, Naesung; Saitou, Kazuhiro

    2009-01-01

    Background To search for chemical structures in research articles, diagrams or text representing molecules need to be translated to a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed, but their algorithmic performance and utility in cheminformatic research have not been investigated. Results This paper aims to provide critical reviews of these systems and also reports our recent development of ChemReader, a fully automated tool for extracting chemical structure diagrams in research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be independently run in sequence from a graphical user interface, and the algorithm parameters can be readily changed, to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms the other systems on several sets of sample images from diverse sources, in terms of the rate of correct outputs and the accuracy of extracting molecular substructure patterns. Conclusion The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links to scientific research

  14. Mosaic of Digital Raster Soviet Topographic Maps of Afghanistan

    USGS Publications Warehouse

    Chirico, Peter G.; Warner, Michael B.

    2005-01-01

    EXPLANATION The data contained in this publication include scanned, geographically referenced digital raster graphics (DRGs) of Soviet 1:200,000-scale topographic map quadrangles. The original Afghanistan topographic map series at 1:200,000 scale, covering the entire country, was published by the Soviet military between 1985 and 1991 (MTDGS, 85-91). Hard copies of these original paper maps were scanned using a large format scanner, reprojected into Geographic Coordinate System (GCS) coordinates, and then clipped to remove the map collars to create a seamless topographic map base for the entire country. An index of all available topographic map sheets is displayed here: Index_Geo_DD.pdf. This publication also includes the original topographic map quadrangles projected in the Universal Transverse Mercator (UTM) projection. The country of Afghanistan spans three UTM zones: Zone 41, Zone 42, and Zone 43. Maps are stored as GeoTIFFs in their respective UTM zone projection. Indexes of all available topographic map sheets in their respective UTM zones are displayed here: Index_UTM_Z41.pdf, Index_UTM_Z42.pdf, Index_UTM_Z43.pdf. An Adobe Acrobat PDF file of the U.S. Department of the Army's Technical Manual 30-548 is available (U.S. Army, 1958). This document has been translated into English for assistance in reading Soviet topographic map symbols.

  15. Tularosa Basin Play Fairway Analysis Data and Models

    DOE Data Explorer

    Nash, Greg

    2017-07-11

    This submission includes raster datasets for each layer of evidence used for weights of evidence analysis as well as the deterministic play fairway analysis (PFA). Data representative of heat, permeability and groundwater comprises some of the raster datasets. Additionally, the final deterministic PFA model is provided along with a certainty model. All of these datasets are best used with an ArcGIS software package, specifically Spatial Data Modeler.

  16. Imaging melanin cancer growth in-vivo using raster-scan optoacoustic mesoscopy (RSOM) at 50 MHz and 100 MHz

    NASA Astrophysics Data System (ADS)

    Omar, Murad; Schwarz, Mathias; Soliman, Dominik; Symvoulidis, Panagiotis; Ntziachristos, Vasilis

    2016-03-01

    We used raster-scan optoacoustic mesoscopy (RSOM) at 50 MHz and at 100 MHz to monitor tumor growth and tumor angiogenesis, a central hallmark of cancer, in-vivo. In this study we compared the performance, and the effect of the 50 MHz and 100 MHz frequencies on the quality of the final image. The system is based on a reflection-mode implementation of RSOM. The detectors used are custom made, ultrawideband, and spherically focused. The use of such detectors enables light coupling from the same side as the detector, hence the reflection-mode operation. Light is in turn coupled using a fiber bundle, and the detector is raster scanned in the xy-plane. Subsequently, to retrieve small features, the raw data are reconstructed using a multi-bandwidth beamforming reconstruction algorithm. Comparison of the system performance at the two frequencies shows, as expected, a higher resolution for the 100 MHz detector than for the 50 MHz one. On the other hand, the 50 MHz detector has better SNR, can detect features from deeper layers, and has higher angular acceptance. Based on these characteristics the 50 MHz detector was mostly used. After comparing the performance we monitored the growth of B16F10 cells, a melanin tumor, over the course of 9 days. We see correspondence between the optoacoustic measurements and the cryoslice validations. Additionally, in areas close to the tumor we see sprouting of new vessels, starting at day 4-5, which corresponds to tumor angiogenesis.

  17. GIS data models for coal geology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McColloch, G.H. Jr.; Timberlake, K.J.; Oldham, A.V.

    A variety of spatial data models can be applied to different aspects of coal geology. The simple vector data models found in various Computer Aided Drafting (CAD) programs are sometimes used for routine mapping and some simple analyses. However, more sophisticated applications that maintain the topological relationships between cartographic elements enhance analytical potential. Also, vector data models are best for producing various types of high quality, conventional maps. The raster data model is generally considered best for representing data that vary continuously over a geographic area, such as the thickness of a coal bed. Information is lost when contour lines are threaded through raster grids for display, so volumes and tonnages are more accurately determined by working directly with raster data. Raster models are especially well suited to computationally simple surface-to-surface analysis, or overlay functions. Another data model, the triangulated irregular network (TIN), is superior at portraying visible surfaces because many TIN programs support break lines. Break lines locate sharp breaks in slope such as those generated by bodies of water or ridge crests. TINs also 'honor' data points, so that a surface generated from a set of points will be forced to pass through those points. TINs, or grids generated from TINs, are particularly good at determining the intersections of surfaces such as coal seam outcrops and geologic unit boundaries. No single technique works best for all coal-related applications. The ability to use a variety of data models, and to transform from one model to another, is essential for obtaining optimum results in a timely manner.

  18. A MODELING APPROACH FOR ESTIMATING WATERSHED IMPERVIOUS SURFACE AREA FROM NATIONAL LAND COVER DATA 92

    EPA Science Inventory

    We used National Land Cover Data 92 (NLCD92), vector impervious surface data, and raster GIS overlay methods to derive impervious surface coefficients per NLCD92 class in portions of the Nfid-Atlantic physiographic region. The methods involve a vector to raster conversion of the ...

  19. Semantic 3d City Model to Raster Generalisation for Water Run-Off Modelling

    NASA Astrophysics Data System (ADS)

    Verbree, E.; de Vries, M.; Gorte, B.; Oude Elberink, S.; Karimlou, G.

    2013-09-01

    Water run-off modelling applied within urban areas requires an appropriately detailed surface model represented by a raster height grid. Accurate simulations at this scale level have to take into account small but important water barriers and flow channels given by the large-scale map definitions of buildings, street infrastructure, and other terrain objects. Thus, these 3D features have to be rasterised such that each cell represents the height of the object class as well as possible given the cell size limitations. Small grid cells will result in realistic run-off modelling but with unacceptable computation times; larger grid cells with averaged height values will result in less realistic run-off modelling but fast computation times. This paper introduces a height grid generalisation approach in which the surface characteristics that most influence the water run-off flow are preserved. The first step is to create a detailed surface model (1:1.000), combining high-density laser data with a detailed topographic base map. The topographic map objects are triangulated to a set of TIN objects by taking into account the semantics of the different map object classes. These TIN objects are then rasterised to two grids with a 0.5 m cell spacing: one grid for the object class labels and the other for the TIN-interpolated height values. The next step is to generalise both raster grids to a lower resolution using a procedure that considers the class label of each cell and that of its neighbours. The results of this approach are tested and validated by water run-off model runs for height grids of different cell spacings at a pilot area in Amersfoort (the Netherlands). Two national datasets were used in this study: the large-scale Topographic Base map (BGT, map scale 1:1.000) and the National height model of the Netherlands AHN2 (10 points per square meter on average). 
Comparison between the original AHN2 height grid and the semantically enriched and then generalised height grids shows
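One way to picture the class-aware generalisation step (an illustrative sketch, not the authors' exact procedure): when coarsening the paired label/height grids, a coarse cell containing any barrier-class fine cell keeps the barrier height rather than the block mean, so flow-blocking features survive the resolution change.

```python
# Illustrative sketch: generalise a fine height grid to a coarser one
# while preserving water barriers. Within each coarse cell, if any fine
# cell carries a barrier class label (e.g. a building), keep the barrier
# height instead of averaging it away.

BARRIER = "building"    # assumed barrier class for this toy example

def generalise(labels, heights, factor):
    rows, cols = len(labels), len(labels[0])
    out = []
    for r in range(0, rows, factor):
        row = []
        for c in range(0, cols, factor):
            cells = [(labels[r + i][c + j], heights[r + i][c + j])
                     for i in range(factor) for j in range(factor)]
            barriers = [h for lab, h in cells if lab == BARRIER]
            if barriers:
                row.append(max(barriers))   # keep the flow-blocking height
            else:
                row.append(sum(h for _, h in cells) / len(cells))
        out.append(row)
    return out

labels  = [["road", "road"], ["building", "road"]]
heights = [[1.0, 1.2], [4.0, 1.1]]
coarse  = generalise(labels, heights, 2)    # the wall height is preserved
```

A plain mean over the same block would report about 1.8 m and let simulated water flow straight through the wall.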

  20. A Raster Based Approach To Solar Pressure Modeling

    NASA Technical Reports Server (NTRS)

    Wright, Theodore

    2014-01-01

    The impact of photons upon a spacecraft introduces small forces and moments. The magnitude and direction of the forces depend on the material properties of the spacecraft components being illuminated. Which components are lit depends on the orientation of the craft with respect to the Sun as well as the gimbal angles for any significant moving external parts (solar arrays, typically). Some components may shield others from the Sun. To determine solar pressure in the presence of overlapping components, a 3D model can be used to determine which components are illuminated. A view (image) of the model as seen from the Sun shows the only contributors to solar pressure. This image can be decomposed into pixels, each of which can be treated as a non-overlapping flat plate as far as solar pressure calculations are concerned. The sums of the pressures and moments on these plates approximate the solar pressure and moments on the entire vehicle. The image rasterization technique can also be used to compute other spacecraft attributes that are dependent on attitude and geometry, including solar array power generation capability and free molecular flow drag.
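The per-pixel flat-plate summation can be sketched as follows, assuming for simplicity a perfectly absorbing surface (force per unit projected area = flux / c); real spacecraft materials add reflection and emission terms, and moments would be summed the same way with per-pixel lever arms.

```python
# Back-of-the-envelope sketch of the rasterisation idea: each lit pixel
# is a small flat plate, and the total force is the sum of per-pixel
# forces. A perfectly absorbing surface is assumed (a simplification).

SOLAR_FLUX = 1361.0        # W/m^2, solar constant at 1 AU
C = 299_792_458.0          # speed of light, m/s

def solar_force(lit_mask, pixel_area):
    """Sum radiation-pressure force over pixels marked as illuminated."""
    lit_pixels = sum(cell for row in lit_mask for cell in row)
    return lit_pixels * pixel_area * SOLAR_FLUX / C

# 3 of 4 pixels see the Sun; each pixel covers 0.01 m^2 of projected area.
mask = [[1, 1], [1, 0]]
force = solar_force(mask, 0.01)   # newtons, along the Sun-to-craft axis
```

The occluded pixel contributes nothing, which is exactly how the Sun-view render handles components shielding one another.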

  1. CanSIS Regional Soils Data in Vector Format

    NASA Technical Reports Server (NTRS)

    Monette, Bryan; Knapp, David; Hall, Forrest G. (Editor)

    2000-01-01

    This data set is the original vector data set received from Canada Soil Information System (CanSIS). The data include the provinces of Saskatchewan and Manitoba. Attribute tables provide the various soil data for the polygons; there is one attribute table for Saskatchewan and one for Manitoba. The data are stored in ARC/INFO export format files. Based on agreements made with Agriculture Canada, these data are available only to individuals and groups that have an official relationship with the BOREAS project. These data are not included on the BOReal Ecosystem-Atmosphere Study (BOREAS) CD-ROM set. A raster version of this data set titled 'BOREAS Regional Soils Data in Raster Format and AEAC Projection' is publicly available and is included on the BOREAS CD-ROM set.

  2. Connecting Swath Satellite Data With Imagery in Mapping Applications

    NASA Astrophysics Data System (ADS)

    Thompson, C. K.; Hall, J. R.; Penteado, P. F.; Roberts, J. T.; Zhou, A. Y.

    2016-12-01

    Visualizations of gridded science data products (referred to as Level 3 or Level 4) typically provide a straightforward correlation between image pixels and the source science data. This direct relationship allows users to make initial inferences based on imagery values, facilitating additional operations on the underlying data values, such as data subsetting and analysis. However, that same pixel-to-data relationship for ungridded science data products (referred to as Level 2) is significantly more challenging. These products, also referred to as "swath products", are in orbital "instrument space" and raster visualization pixels do not directly correlate to science data values. Interpolation algorithms are often employed during the gridding or projection of a science dataset prior to image generation, introducing intermediary values that separate the image from the source data values. NASA's Global Imagery Browse Services (GIBS) is researching techniques for efficiently serving "image-ready" data allowing client-side dynamic visualization and analysis capabilities. This presentation will cover some GIBS prototyping work designed to maintain connectivity between Level 2 swath data and its corresponding raster visualizations. Specifically, we discuss the DAta-to-Image-SYstem (DAISY), an indexing approach for Level 2 swath data, and the mechanisms whereby a client may dynamically visualize the data in raster form.

  3. Video raster stereography back shape reconstruction: a reliability study for sagittal, frontal, and transversal plane parameters.

    PubMed

    Schroeder, J; Reer, R; Braumann, K M

    2015-02-01

    Because the reliability of raster stereography had been demonstrated only for sagittal plane parameters with repeated measures on the same day, the present study investigated the variability and reliability of back shape reconstruction in all dimensions (sagittal, frontal, transversal) and over different intervals. For a sample of 20 healthy volunteers, intra-individual variability (SEM and CV%) and reliability (ICC ± 95% CI) were assessed for sagittal (thoracic kyphosis, lumbar lordosis, pelvis tilt angle, and trunk inclination), frontal (pelvis torsion, pelvis and trunk imbalance, vertebral side deviation, and scoliosis angle), transversal (vertebral rotation), and functional (hyperextension) spine shape reconstruction parameters across different test-retest intervals (same day, between-day, between-week) by means of video raster stereography. Reliability was high for the sagittal plane (pelvis tilt, kyphosis and lordosis angle, and trunk inclination: ICC > 0.90), and good to high for lumbar mobility (0.86 < ICC < 0.97). Apart from sagittal plane spinal alignment, wider ICC confidence intervals left the claim of high reproducibility less certain. Reliability was fair to high for vertebral side deviation and the scoliosis angle (0.71 < ICC < 0.95), and poor to good for vertebral rotation values as well as for frontal plane upper body and pelvis position parameters (0.65 < ICC < 0.92). Coefficients for the between-day and between-week intervals were slightly lower than for repeated measures on the same day. Variability (SEM) was less than 1.5° or 1.5 mm, except for trunk inclination. Relative variability (CV) was greater for global trunk position and pelvis parameters (35-98%) than for scoliosis (14-20%) or sagittal sway parameters (4-8%). Although we found lower reproducibility for the frontal plane, raster stereography is considered a reliable method for the non-invasive, three-dimensional assessment of spinal alignment in normal non

  4. Washington Geothermal Play Fairway Analysis Heat, Permeability, and Fracture Model Data

    DOE Data Explorer

    Steely, Alex; Forson, Corina; Cladouhos, Trenton; Swyer, Mike; Davatzes, Nicholas; Anderson, Megan; Ritzinger, Brent; Glen, Jonathan; Peacock, Jared; Schermerhorn, William

    2017-12-07

    This submission contains raster and vector data for the entire state of Washington, with specific emphasis on the three geothermal play fairway sites: Mount St. Helens seismic zone (MSHSZ), Wind River valley (WRV), and Mount Baker (MB). Data are provided for three major geothermal models (heat, permeability, and fluid-filled fractures) and an additional infrastructure model. Both the permeability and fluid-filled-fracture models are produced at 200 m and 2 km depths; the heat model is produced only at the 200 m depth. Values are provided for both model favorability and model confidence. A combined model at 200 m and 2 km depths is provided for favorability, confidence, and exploration risk. Raster data are provided in GeoTIFF format with statewide coverage. Cell size is 104.355 ft; the file type is unsigned 8-bit integer (0-255), where 0 represents no favorability or confidence and 255 represents maximum favorability or confidence. The NAD83(HARN)/Washington South (ftUS) projection is used (EPSG:2927). Vector data are provided in shapefile or comma-delimited text file formats. Geographic coordinates, where provided, are in WGS84. A readme file accompanies each folder and provides an overview and description of the enclosed data. The heat model combines 5 intermediate raster layers (which are included in the download package): temperature gradient wells, young volcanic vents, hot springs, young intrusive volcanic rocks, and geothermometry. The permeability model combines 8 intermediate raster layers: density of mapped faults, 2D dilation tendency of mapped faults, 2D slip tendency of mapped faults, seismicity, 3D dilation tendency, 3D slip tendency, 3D maximum coulomb shear stress, and 3D slip gradients. The fluid-filled fracture model combines up to 4 intermediate rasters: resistivity from magneto-telluric 3D inversions, seismicity, Vp/Vs anomalies from passive seismic tomography, and Vs anomalies from ambient-noise tomography. A statewide infrastructure model is also
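
    As an illustration of how intermediate evidence layers can be reduced to an unsigned 8-bit favorability raster of the kind described above, here is a hedged numpy sketch. The weighted-mean combination and the function name `combine` are assumptions for the example; the dataset's actual weighting scheme is documented in the project report, not reproduced here.

    ```python
    import numpy as np

    def combine(layers, weights):
        """Combine evidence layers (each scaled to [0, 1]) into a 0-255 raster.

        layers  : list of same-shape 2D arrays with values in [0, 1]
        weights : one weight per layer; normalized internally
        """
        w = np.asarray(weights, float)
        stack = np.stack(layers).astype(float)
        # Weighted mean across the layer axis, still in [0, 1]
        fav = np.tensordot(w / w.sum(), stack, axes=1)
        # Scale to the dataset's unsigned 8-bit convention (0 = none, 255 = max)
        return np.clip(fav * 255.0, 0, 255).astype(np.uint8)
    ```

    A nodata mask would be applied before scaling in practice; it is omitted here for brevity.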

  5. Utah FORGE Groundwater Data

    DOE Data Explorer

    Joe Moore

    2016-07-20

    This submission includes two modelled drawdown scenarios with new supply well locations, a total dissolved solids (TDS) concentration grid (a raster dataset representing the spatial distribution of TDS), and an Excel spreadsheet containing well data.

  6. GPFA-AB_Phase1RiskAnalysisTask5DataUpload

    DOE Data Explorer

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the centers of the raster cells, and locations of industries. The supporting information contains details of the calculations or processing used in generating the files. An image of a raster has the same name as the raster, except with *.png as the file extension instead of *.tif. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile overlaid. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.
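
    The three combination methods named above (product, sum, and minimum of the four scaled risk factors) translate directly into numpy. The function name and input conventions below are illustrative, not taken from the project's code:

    ```python
    import numpy as np

    def combine_risk(factors):
        """Combine scaled risk-factor rasters cell-by-cell.

        factors : list of same-shape 2D arrays (one per risk factor)
        Returns the product, sum, and minimum rasters, matching the three
        combination methods described for the GPFA-AB risk analysis.
        """
        f = np.stack(factors).astype(float)
        return {
            "product": f.prod(axis=0),
            "sum": f.sum(axis=0),
            "minimum": f.min(axis=0),
        }
    ```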

  7. Mixed raster content (MRC) model for compound image compression

    NASA Astrophysics Data System (ADS)

    de Queiroz, Ricardo L.; Buckley, Robert R.; Xu, Ming

    1998-12-01

    This paper will describe the Mixed Raster Content (MRC) method for compressing compound images, containing both binary text and continuous-tone images. A single compression algorithm that simultaneously meets the requirements for both text and image compression has been elusive. MRC takes a different approach. Rather than using a single algorithm, MRC uses a multi-layered imaging model for representing the results of multiple compression algorithms, including ones developed specifically for text and for images. As a result, MRC can combine the best of existing or new compression algorithms and offer different quality-compression ratio tradeoffs. The algorithms used by MRC set the lower bound on its compression performance. Compared to existing algorithms, MRC has some image-processing overhead to manage multiple algorithms and the imaging model. This paper will develop the rationale for the MRC approach by describing the multi-layered imaging model in light of a rate-distortion trade-off. Results will be presented comparing images compressed using MRC, JPEG and state-of-the-art wavelet algorithms such as SPIHT. MRC has been approved or proposed as an architectural model for several standards, including ITU Color Fax, IETF Internet Fax, and JPEG 2000.
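
    A minimal sketch of the MRC three-layer imaging model (binary mask, foreground, background) follows. The global threshold used to build the mask is a placeholder, not the segmentation of any particular MRC codec; in a real pipeline each layer would then go to a codec suited to it (a binary codec for the mask, a continuous-tone codec for the other two).

    ```python
    import numpy as np

    def mrc_decompose(img, thresh=128):
        """Split a grayscale compound page into mask, foreground, background."""
        mask = img < thresh                # dark pixels treated as text (placeholder rule)
        fg = np.where(mask, img, 0)        # foreground: text pixels only
        bg = np.where(mask, 255, img)      # background: text holes filled white
        return mask, fg, bg

    def mrc_compose(mask, fg, bg):
        """The MRC imaging model: the mask selects foreground over background."""
        return np.where(mask, fg, bg)
    ```

    Because the mask routes every pixel to exactly one layer, composing the (uncompressed) layers reconstructs the page exactly; lossy compression of the individual layers is where the rate-distortion trade-off enters.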

  8. Raster-based outranking method: a new approach for municipal solid waste landfill (MSW) siting.

    PubMed

    Hamzeh, Mohamad; Abbaspour, Rahim Ali; Davalou, Romina

    2015-08-01

    MSW landfill siting is a complicated process because it requires integration of several factors. In this paper, geographic information system (GIS) and multiple criteria decision analysis (MCDA) techniques were combined to handle municipal solid waste (MSW) landfill siting. For this purpose, first, 16 input data layers were prepared in a GIS environment. Then, the exclusionary lands were eliminated and potentially suitable areas for MSW disposal were identified. These potentially suitable areas were, in an innovative approach, further examined by deploying the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE) II and the analytic network process (ANP), two of the most recent MCDA methods, in order to determine land suitability for landfilling. PROMETHEE II was used to determine a complete ranking of the alternatives, while ANP was employed to quantify the subjective judgments of evaluators as criteria weights. The resulting land suitability was reported on a grading scale from 1 (least suitable) to 5 (most suitable). Finally, three optimal sites were selected by taking into consideration the local conditions of 15 sites that were candidates for MSW landfilling. Research findings show that the raster-based method yields effective results.
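
    Since the method leans on PROMETHEE II for the complete ranking, a compact sketch of its net-flow computation may help. This uses the "usual" (strict) preference function and treats every criterion as to-be-maximized, both simplifying assumptions relative to a full PROMETHEE setup:

    ```python
    import numpy as np

    def promethee_ii(X, weights):
        """Rank alternatives by PROMETHEE II net outranking flow.

        X       : (n, c) matrix of n alternatives evaluated on c criteria
                  (all criteria assumed maximized here)
        weights : c criteria weights (e.g. from ANP); normalized internally
        """
        n = X.shape[0]
        w = np.asarray(weights, float)
        w = w / w.sum()
        # "Usual" preference function: i preferred to j when strictly larger
        pref = (X[:, None, :] > X[None, :, :]).astype(float)
        pi = pref @ w                          # aggregated preference pi[i, j]
        phi_plus = pi.sum(axis=1) / (n - 1)    # positive outranking flow
        phi_minus = pi.sum(axis=0) / (n - 1)   # negative outranking flow
        return phi_plus - phi_minus            # net flow: higher = better
    ```

    In the raster-based setting, each candidate cell (or site) is a row of `X`, and the net flows provide the complete ranking PROMETHEE II is used for above.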

  9. REAP (raster e-beam advanced process) using 50-kV raster e-beam system for sub-100-nm node mask technology

    NASA Astrophysics Data System (ADS)

    Baik, Ki-Ho; Dean, Robert L.; Mueller, Mark; Lu, Maiying; Lem, Homer Y.; Osborne, Stephen; Abboud, Frank E.

    2002-07-01

    A chemically amplified resist (CAR) process has been recognized as an approach to meet the demanding critical dimension (CD) specifications of 100nm node technology and beyond. Recently, significant effort has been devoted to optimizing CAR materials, which offer the characteristics required for next-generation photomask fabrication. In this paper, a process established with a positive-tone CAR from TOK and the 50kV MEBES eXara system is discussed. This resist is developed for raster-scan 50 kV e-beam systems. It has high contrast, good coating characteristics, good dry etch selectivity, and high environmental stability. The coating process is conducted in an environment with an amine concentration of less than 2 ppb. A nitrogen environment is provided during plate transfer steps. Using a 60nm writing grid, 90nm line and space patterns are resolved. CD linearity is maintained down to 240nm for isolated lines or spaces by applying embedded proximity effect correction (emPEC). Optimization of post-apply bake (PAB) and post-expose bake (PEB) time, temperature, and uniformity was completed to improve adhesion, coating uniformity, and resolution. A puddle develop process was optimized to improve line edge roughness, edge slope, and resolution. The dry etch process was optimized on a Tetra™ system to transfer the resist image into the chrome layer with minimum etch bias.

  10. Customized altitude-azimuth mount for a raster-scanning Fourier transform spectrometer

    NASA Astrophysics Data System (ADS)

    Durrenberger, Jed E.; Gutman, William M.; Gammill, Troy D.; Grover, Dennis H.

    1996-10-01

    Applications of the Army Research Laboratory Mobile Atmospheric Spectrometer Remote Sensing Rover required development of a customized computer-controlled mount to satisfy a variety of requirements within a limited budget. The payload was designed to operate atop a military electronics shelter mounted on a 4-wheel drive truck to be above most atmospheric ground turbulence. Pointing orientation in altitude is limited by constraints imposed by use of a liquid nitrogen detector Dewar in the spectrometer. Stepper motor drives and control system are compatible with existing custom software used with other instrumentation for controlled incremental raster stepping. The altitude axis passes close to the center of gravity of the complete payload to minimize load eccentricity and drive torque requirements. Dovetail fixture mounting enables quick service and fine adjustment of balance to minimize stepper/gearbox drive backlash through the limited orientation range in altitude. Initial applications to characterization of remote gas plumes have been successful.

  11. Mimicking human expert interpretation of remotely sensed raster imagery by using a novel segmentation analysis within ArcGIS

    NASA Astrophysics Data System (ADS)

    Le Bas, Tim; Scarth, Anthony; Bunting, Peter

    2015-04-01

    Traditional computer-based methods for the interpretation of remotely sensed imagery use each pixel individually, or the average of a small window of pixels, to calculate a class or thematic value, which provides an interpretation. However, when a human expert interprets imagery, the human eye is excellent at finding coherent and homogeneous areas and edge features. It may therefore be advantageous for computer analysis to mimic human interpretation. A new toolbox for ArcGIS 10.x will be presented that segments the data layers into a set of polygons. Each polygon is defined by a K-means clustering and region-growing algorithm, thus finding areas, their edges, and any lineations in the imagery. Attached to each polygon are the characteristics of the imagery, such as the mean and standard deviation of the pixel values within the polygon. The segmentation of imagery into a jigsaw of polygons also has the advantage that the human interpreter does not need to spend hours digitising the boundaries. The segmentation process has been taken from the RSGIS library of analysis and classification routines (Bunting et al., 2014). These routines are freeware and have been modified to be available in the ArcToolbox under the Windows (v7) operating system. Input to the segmentation process is a multi-layered raster image, for example a Landsat image, or a set of raster datasets made up from derivatives of topography. The size and number of polygons are set by the user and are dependent on the imagery used. Examples will be presented of data from the marine environment utilising bathymetric depth, slope, rugosity and backscatter from a multibeam system. Meaningful classification of the polygons using their numerical characteristics is the next goal. Object based image analysis (OBIA) should help this workflow. Fully calibrated imagery systems will allow numerical classification to be translated into more readily understandable terms. Peter Bunting, Daniel Clewley, Richard M. Lucas and Sam
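
    The K-means-plus-region-growing idea can be illustrated on a single-band raster. This sketch is not the RSGIS implementation: `kmeans_1d`, `label_regions`, and `segment` are names invented for the example, and 4-connected component labeling stands in for the region-growing step.

    ```python
    from collections import deque
    import numpy as np

    def kmeans_1d(values, k, iters=10):
        """K-means on a flat array of pixel values (quantile initialization)."""
        centers = np.quantile(values, np.linspace(0.0, 1.0, k))
        for _ in range(iters):
            labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = values[labels == j].mean()
        return labels

    def label_regions(binary):
        """4-connected component labeling (BFS); returns label image and count."""
        h, w = binary.shape
        out = np.zeros((h, w), dtype=int)
        nid = 0
        for i in range(h):
            for j in range(w):
                if binary[i, j] and out[i, j] == 0:
                    nid += 1
                    out[i, j] = nid
                    q = deque([(i, j)])
                    while q:
                        y, x = q.popleft()
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and out[ny, nx] == 0:
                                out[ny, nx] = nid
                                q.append((ny, nx))
        return out, nid

    def segment(raster, k=3):
        """Cluster pixel values, then split each cluster into spatial segments."""
        labels = kmeans_1d(raster.astype(float).ravel(), k).reshape(raster.shape)
        segments = np.zeros(raster.shape, dtype=int)
        next_id = 0
        for j in range(k):
            comp, n = label_regions(labels == j)
            segments[comp > 0] = comp[comp > 0] + next_id
            next_id += n
        return segments
    ```

    Per-segment attributes like the mean and standard deviation mentioned above then follow from simple masked statistics, e.g. `raster[segments == sid].mean()`.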

  12. Data Representations for Geographic Information Systems.

    ERIC Educational Resources Information Center

    Shaffer, Clifford A.

    1992-01-01

    Surveys the field and literature of geographic information systems (GIS) and spatial data representation as it relates to GIS. Highlights include GIS terms, data types, and operations; vector representations and raster, or grid, representations; spatial indexing; elevation data representations; large spatial databases; and problem areas and future…

  13. Comparative study between REAP 200 and FEP171 CAR with 50-kV raster e-beam system for sub-100-nm technology

    NASA Astrophysics Data System (ADS)

    Baik, Ki-Ho; Lem, Homer Y.; Dean, Robert L.; Osborne, Stephen; Mueller, Mark; Abboud, Frank E.

    2003-08-01

    In this paper, a process established with two positive-tone chemically amplified resists (CARs), TOK REAP200 and Fujifilm Arch FEP171, and a 50kV MEBES system is discussed. The TOK resist is developed for raster-scan 50 kV e-beam systems. It has high contrast, good coating characteristics, good dry etch selectivity, and high environmental stability. In the mask industry, the most popular positive-tone CAR is FEP171, which is a high-activation-energy CAR. REAP (Raster E-beam Advanced Process) 200 is a low-activation-energy type based on a new acetal protecting polymer. In this study, we compared these two types of resists in terms of contrast, PAB and PEB latitude, resist profile, footing, T-topping, PED stability, LER, global CD uniformity (CDU), and resolution. The REAP200 resist achieved 75nm isolated lines and spaces, 90nm dense patterns with vertical profiles, and good delay-time stability.

  14. BOREAS Regional Soils Data in Raster Format and AEAC Projection

    NASA Technical Reports Server (NTRS)

    Monette, Bryan; Knapp, David; Hall, Forrest G. (Editor); Nickeson, Jaime (Editor)

    2000-01-01

    This data set was gridded by BOREAS Information System (BORIS) Staff from a vector data set received from the Canadian Soil Information System (CanSIS). The original data came in two parts that covered Saskatchewan and Manitoba. The data were gridded and merged into one data set of 84 files covering the BOREAS region. The data were gridded into the AEAC projection. Because the mapping of the two provinces was done separately in the original vector data, there may be discontinuities in some of the soil layers because of different interpretations of certain soil properties. The data are stored in binary, image format files.

  15. Washington Play Fairway Analysis Geothermal GIS Data

    DOE Data Explorer

    Corina Forson

    2015-12-15

    This file contains file geodatabases of the Mount St. Helens seismic zone (MSHSZ), Wind River valley (WRV) and Mount Baker (MB) geothermal play-fairway sites in the Washington Cascades. The geodatabases include input data (feature classes) and output rasters (generated from modeling and interpolation) from the geothermal play-fairway in Washington State, USA. These data were gathered and modeled to provide an estimate of the heat and permeability potential within the play-fairways based on: mapped volcanic vents, hot springs and fumaroles, geothermometry, intrusive rocks, temperature-gradient wells, slip tendency, dilation tendency, displacement, displacement gradient, max coulomb shear stress, sigma 3, maximum shear strain rate, and dilational strain rate at 200 m and 3 km depths. In addition, this submission contains layer files for each of the output rasters. For details on the areas of interest please see the 'WA_State_Play_Fairway_Phase_1_Technical_Report' in the download package. This submission also includes a file with the geothermal favorability of the Washington Cascade Range based on an earlier statewide assessment, along with the maximum shear and dilational strain rate rasters for all of Washington State.

  16. Issues of Authenticity of Spatial Data.

    ERIC Educational Resources Information Center

    McGlamery, Patrick

    This paper discusses the authenticity of digital spatial data. The first section describes three formats for digital spatial data: vector, raster, and thematic. The second section addresses the integrity of spatial data, including six possible formats for the same information: (1) aerial photographic prints, time stamped, primary, remotely sensed…

  17. Surface modification of ceramic and metallic alloy substrates by laser raster-scanning

    NASA Astrophysics Data System (ADS)

    Ramos Grez, Jorge Andres

    This work describes the feasibility of continuous wave laser-raster scan-processing under controlled atmospheric conditions as employed in three distinct surface modification processes: (a) surface roughness reduction of indirect-Selective Laser Sintered 420 martensitic stainless steel-40 wt. % bronze infiltrated surfaces; (b) Si-Cr-Hf-C coating consolidation over 3D carbon-carbon composites cylinders; (c) dendritic solidification structures of Mar-M 247 confined powder precursor grown from polycrystalline Alloy 718 substrates. A heat transfer model was developed to illustrate that the aspect ratio of the laser scanned pattern and the density of scanning lines play a significant role in determining peak surface temperature, heating and cooling rates and melt resident times. Comprehensive characterization of the surface of the processed specimens was performed using scanning electron microscopy (SEM), energy dispersive spectroscopy (EDS), optical metallography, X-ray diffraction (XRD), and, in certain cases, tactile profilometry. In Process (a), it was observed that a 24% to 37% roughness Ra reduction could be accomplished from the as-received value of 2.50+/-0.10 microns for laser energy densities ranging from 350 to 500 J/cm2. In Process (b), complete reactive wetting of carbon-carbon composite cylinders surface was achieved by laser melting a Si-Cr-Hf-C slurry. Coatings showed good thermal stability at 1000°C in argon, and, when tested in air, a percent weight reduction rate of -6.5 wt.%/hr was achieved. A soda-glass overcoat applied over the coated specimens by conventional means revealed a percent weight reduction rate between -1.4 to -2.2 wt.%/hr. Finally, in Process (c), microstructure of the Mar-M 247 single layer deposits, 1 mm in height, grown on Alloy 718 polycrystalline sheets, resulted in a sound metallurgical bond, low porosity, and uniform thickness. 
Polycrystalline dendrites grew preferentially along the [001] direction from the substrate up to 400

  18. Efficient sintering of nanocrystalline titanium dioxide films for dye solar cells via raster scanning laser

    NASA Astrophysics Data System (ADS)

    Mincuzzi, Girolamo; Vesce, Luigi; Reale, Andrea; Di Carlo, Aldo; Brown, Thomas M.

    2009-09-01

    By identifying the right combination of laser parameters, in particular the integrated laser fluence Φ, we fabricated dye solar cells (DSCs) with UV laser-sintered TiO2 films exhibiting a power conversion efficiency η =5.2%, the highest reported for laser-sintered devices. η is dramatically affected by Φ and a clear trend is reported. Significantly, DSCs fabricated by raster scanning the laser beam to sinter the TiO2 films are made as efficient as those with oven-sintered ones. These results, confirmed on three batches of cells, demonstrate the remarkable potential (noncontact, local, low cost, rapid, selective, and scalable) of scanning laser processing applied to DSC technology.

  19. Obtaining and processing Daymet data using Python and ArcGIS

    USGS Publications Warehouse

    Bohms, Stefanie

    2013-01-01

    This set of scripts was developed to automate the process of downloading and mosaicking daily Daymet data to a user-defined extent using ArcGIS and the Python programming language. The three steps are: downloading the needed Daymet tiles for the study area extent, converting the NetCDF files to GeoTIFF raster format, and mosaicking those rasters into one file. The set of scripts is intended for all levels of experience with the Python programming language and requires no scripting by the user.
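
    Of the three steps, the mosaicking step is the easiest to sketch with in-memory arrays standing in for the downloaded tiles (the real scripts use arcpy for the GeoTIFF I/O; the block-offset keying convention here is an assumption made for the example):

    ```python
    import numpy as np

    def mosaic(tiles, tile_shape, grid_shape, nodata=np.nan):
        """Place equally sized tiles into one output grid.

        tiles      : dict mapping (row_block, col_block) -> 2D array
        tile_shape : (height, width) of each tile in cells
        grid_shape : (height, width) of the full mosaic in cells
        """
        out = np.full(grid_shape, nodata)     # cells with no tile stay nodata
        th, tw = tile_shape
        for (r, c), tile in tiles.items():
            out[r * th:(r + 1) * th, c * tw:(c + 1) * tw] = tile
        return out
    ```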

  20. Spatial Data Transfer Standard (SDTS)

    USGS Publications Warehouse

    ,

    1995-01-01

    The Spatial Data Transfer Standard (SDTS) is a mechanism for the transfer of spatial data between dissimilar computer systems. The SDTS specifies exchange constructs, addressing formats, structure, and content for spatially referenced vector and raster (including gridded) data. SDTS components are a conceptual model, specifications for a quality report, transfer module specifications, data dictionary specifications, and definitions of spatial features and attributes.

  1. Influence of Layer Thickness and Raster Angle on the Mechanical Properties of 3D-Printed PEEK and a Comparative Mechanical Study between PEEK and ABS.

    PubMed

    Wu, Wenzheng; Geng, Peng; Li, Guiwei; Zhao, Di; Zhang, Haibo; Zhao, Ji

    2015-09-01

    Fused deposition modeling (FDM) is a rapidly growing 3D printing technology. However, printing materials are restricted to acrylonitrile butadiene styrene (ABS) or poly(lactic acid) (PLA) in most FDM equipment. Here, we report on a new high-performance printing material, polyether-ether-ketone (PEEK), which could surmount these shortcomings. This paper studies the influence of layer thickness and raster angle on the mechanical properties of 3D-printed PEEK. Samples with three different layer thicknesses (200, 300 and 400 μm) and raster angles (0°, 30° and 45°) were built using a PEEK 3D printing system, and their tensile, compressive and bending strengths were tested. The optimal mechanical properties of PEEK samples were found at a layer thickness of 300 μm and a raster angle of 0°. To evaluate the printing performance of PEEK, a comparison was made between the mechanical properties of 3D-printed PEEK and ABS parts. The results suggest that the average tensile strengths of PEEK parts were 108% higher than those of ABS, while compressive strengths were 114% higher and bending strengths 115% higher. However, the modulus of elasticity for both materials was similar. These results indicate that the mechanical properties of 3D-printed PEEK are superior to those of 3D-printed ABS.

  2. Maximizing Accessibility to Spatially Referenced Digital Data.

    ERIC Educational Resources Information Center

    Hunt, Li; Joselyn, Mark

    1995-01-01

    Discusses some widely available spatially referenced datasets, including raster and vector datasets. Strategies for improving accessibility include: acquisition of data in a software-dependent format; reorganization of data into logical geographic units; acquisition of intelligent retrieval software; improving computer hardware; and intelligent…

  3. Parallel Geospatial Data Management for Multi-Scale Environmental Data Analysis on GPUs

    NASA Astrophysics Data System (ADS)

    Wang, D.; Zhang, J.; Wei, Y.

    2013-12-01

    As the spatial and temporal resolutions of Earth observatory data and Earth system simulation outputs get higher, in-situ and/or post-processing of such large amounts of geospatial data increasingly becomes a bottleneck in scientific inquiries into Earth systems and their human impacts. Existing geospatial techniques based on outdated computing models (e.g., serial algorithms and disk-resident systems), as implemented in many commercial and open source packages, are incapable of processing large-scale geospatial data at the desired level of performance. In this study, we have developed a set of parallel data structures and algorithms capable of utilizing the massively data-parallel computing power available on commodity Graphics Processing Units (GPUs) for a popular geospatial technique called Zonal Statistics. Given two input datasets, one representing measurements (e.g., temperature or precipitation) and the other representing polygonal zones (e.g., ecological or administrative zones), Zonal Statistics computes major statistics (or complete distribution histograms) of the measurements in all zones. Our technique has four steps, each of which can be mapped to GPU hardware by identifying its inherent data parallelism. First, the raster is divided into blocks and per-block histograms are derived. Second, the Minimum Bounding Rectangles (MBRs) of polygons are computed and spatially matched with raster blocks; matched polygon-block pairs are tested, and blocks that are either inside or intersect with polygons are identified. Third, per-block histograms are aggregated to polygons for blocks that are completely within polygons. Finally, for blocks that intersect polygon boundaries, all raster cells within the blocks are examined using point-in-polygon tests, and cells that fall within polygons are used to update the corresponding histograms. As the task becomes I/O bound after applying spatial indexing and GPU hardware acceleration
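
    A serial reference version of Zonal Statistics for the fully rasterized case may clarify what the four GPU steps ultimately compute. This sketch assumes zones have already been rasterized to integer IDs; the block/MBR machinery described above exists to reach the same answer faster when the zones are polygons.

    ```python
    import numpy as np

    def zonal_stats(values, zones):
        """Per-zone count/mean/min/max for a measurement raster.

        values : 2D float array of measurements
        zones  : same-shape 2D array of non-negative integer zone IDs
        """
        v, z = values.ravel(), zones.ravel()
        nz = int(z.max()) + 1
        count = np.bincount(z, minlength=nz)
        total = np.bincount(z, weights=v, minlength=nz)
        stats = {}
        for zid in range(nz):
            sel = v[z == zid]
            if sel.size:
                stats[zid] = dict(count=int(count[zid]),
                                  mean=float(total[zid] / count[zid]),
                                  min=float(sel.min()),
                                  max=float(sel.max()))
        return stats
    ```

    The `bincount` calls correspond to the histogram aggregation in steps one and three; the per-cell boundary handling of step four replaces the simple `z == zid` mask used here.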

  4. Wild boar mapping using population-density statistics: From polygons to high resolution raster maps.

    PubMed

    Pittiglio, Claudia; Khomenko, Sergei; Beltran-Alcrudo, Daniel

    2018-01-01

    The wild boar is an important crop raider as well as a reservoir and agent of spread of swine diseases. Due to increasing densities and expanding ranges worldwide, the related economic losses in livestock and agricultural sectors are significant and on the rise. Its management and control would strongly benefit from accurate and detailed spatial information on species distribution and abundance, which are often available only for small areas. Data are commonly available at aggregated administrative units with little or no information about the distribution of the species within the unit. In this paper, a four-step geostatistical downscaling approach is presented and used to disaggregate wild boar population density statistics from administrative units of different shape and size (polygons) to 5 km resolution raster maps by incorporating auxiliary fine scale environmental variables. 1) First a stratification method was used to define homogeneous bioclimatic regions for the analysis; 2) Under a geostatistical framework, the wild boar densities at administrative units, i.e. subnational areas, were decomposed into trend and residual components for each bioclimatic region. Quantitative relationships between wild boar data and environmental variables were estimated through multiple regression and used to derive trend components at 5 km spatial resolution. Next, the residual components (i.e., the differences between the trend components and the original wild boar data at administrative units) were downscaled at 5 km resolution using area-to-point kriging. The trend and residual components obtained at 5 km resolution were finally added to generate fine scale wild boar estimates for each bioclimatic region. 3) These maps were then mosaicked to produce a final output map of predicted wild boar densities across most of Eurasia. 4) Model accuracy was assessed at each different step using input as well as independent data. 
We discuss advantages and limits of the method and its
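
    The trend-plus-residual decomposition at the heart of the downscaling can be illustrated in a toy form. Note the simplifications: a single covariate, ordinary least squares for the trend, and each unit's residual spread uniformly over its cells in place of area-to-point kriging; all names are invented for the example.

    ```python
    import numpy as np

    def downscale(unit_density, unit_cov_mean, cov_grid, unit_grid):
        """Disaggregate unit-level densities to a fine grid (toy sketch).

        unit_density  : (u,) density reported per administrative unit
        unit_cov_mean : (u,) mean of the covariate over each unit
        cov_grid      : 2D covariate values at the fine resolution
        unit_grid     : same-shape 2D array of unit indices per cell
        """
        # Trend: regress unit densities on the unit-mean covariate
        a, b = np.polyfit(unit_cov_mean, unit_density, 1)
        trend = a * cov_grid + b                 # trend at fine resolution
        out = trend.copy()
        for u, d in enumerate(unit_density):
            cells = unit_grid == u
            resid = d - trend[cells].mean()      # unit-level residual
            out[cells] += resid                  # mass-preserving correction
        return out
    ```

    The uniform residual correction keeps each unit's mean equal to its reported density, which is the coherence property the area-to-point kriging step also guarantees, while kriging additionally smooths the residual surface across unit boundaries.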

  5. Wild boar mapping using population-density statistics: From polygons to high resolution raster maps

    PubMed Central

    Pittiglio, Claudia; Khomenko, Sergei

    2018-01-01

    The wild boar is an important crop raider as well as a reservoir and agent of spread of swine diseases. Due to increasing densities and expanding ranges worldwide, the related economic losses in the livestock and agricultural sectors are significant and on the rise. Its management and control would strongly benefit from accurate and detailed spatial information on species distribution and abundance, which are often available only for small areas. Data are commonly available at aggregated administrative units with little or no information about the distribution of the species within the unit. In this paper, a four-step geostatistical downscaling approach is presented and used to disaggregate wild boar population density statistics from administrative units of different shape and size (polygons) to 5 km resolution raster maps by incorporating auxiliary fine-scale environmental variables. 1) First, a stratification method was used to define homogeneous bioclimatic regions for the analysis; 2) Under a geostatistical framework, the wild boar densities at administrative units, i.e., subnational areas, were decomposed into trend and residual components for each bioclimatic region. Quantitative relationships between wild boar data and environmental variables were estimated through multiple regression and used to derive trend components at 5 km spatial resolution. Next, the residual components (i.e., the differences between the trend components and the original wild boar data at administrative units) were downscaled to 5 km resolution using area-to-point kriging. The trend and residual components obtained at 5 km resolution were finally added to generate fine-scale wild boar estimates for each bioclimatic region. 3) These maps were then mosaicked to produce a final output map of predicted wild boar densities across most of Eurasia. 4) Model accuracy was assessed at each step using input as well as independent data. We discuss advantages and limits of the method and its
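    The trend-plus-residual decomposition in step 2 can be sketched in a few lines. The sketch below uses a single hypothetical covariate, ordinary least squares for the trend, and, for brevity, naive assignment of each unit's residual to its cells in place of area-to-point kriging; all numbers are illustrative, not from the study.

```python
# Hedged sketch of the trend/residual decomposition (step 2), with a
# single hypothetical environmental covariate and one-predictor OLS.
# Area-to-point kriging of the residuals is replaced here by a naive
# "assign the unit residual to each of its cells" step for brevity.

def ols_fit(x, y):
    """Slope and intercept of a one-predictor least-squares fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

# Hypothetical data: boar density per admin unit, and the mean of an
# environmental covariate (e.g. forest cover fraction) over each unit.
unit_density   = [2.0, 5.5, 9.0]
unit_covariate = [0.1, 0.4, 0.8]
# Covariate values of the fine (5 km) cells inside each unit.
cell_covariate = {0: [0.05, 0.15], 1: [0.3, 0.5], 2: [0.7, 0.9]}

b, a = ols_fit(unit_covariate, unit_density)

fine_estimates = {}
for u, cells in cell_covariate.items():
    trend_unit = a + b * unit_covariate[u]   # trend at the unit level
    residual = unit_density[u] - trend_unit  # unit-level residual
    # Trend at each fine cell plus the (naively downscaled) residual.
    fine_estimates[u] = [a + b * c + residual for c in cells]
```

Because the residual absorbs the trend's error at each unit, the mean of the fine-cell estimates within a unit reproduces the unit's observed density whenever the cell covariates average to the unit covariate, which is the mass-preservation property the downscaling relies on.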

  6. High-speed real-time animated displays on the ADAGE (trademark) RDS 3000 raster graphics system

    NASA Technical Reports Server (NTRS)

    Kahlbaum, William M., Jr.; Ownbey, Katrina L.

    1989-01-01

    Techniques that may be used to increase the animation update rate of real-time computer raster graphic displays are discussed. They were developed on the ADAGE RDS 3000 graphics system in support of the Advanced Concepts Simulator at the NASA Langley Research Center. These techniques involve the use of a special-purpose parallel processor for high-speed character generation. The description of the parallel processor covers the Barrel Shifter, the hardware component that is key to high-speed character rendition. The final result of this effort was a fourfold increase in the update rate of an existing primary flight display, from 4 to 16 frames per second.

  7. Influence of Layer Thickness and Raster Angle on the Mechanical Properties of 3D-Printed PEEK and a Comparative Mechanical Study between PEEK and ABS

    PubMed Central

    Wu, Wenzheng; Geng, Peng; Li, Guiwei; Zhao, Di; Zhang, Haibo; Zhao, Ji

    2015-01-01

    Fused deposition modeling (FDM) is a rapidly growing 3D printing technology. However, printing materials are restricted to acrylonitrile butadiene styrene (ABS) or poly(lactic acid) (PLA) in most FDM equipment. Here, we report on a new high-performance printing material, polyether-ether-ketone (PEEK), which could surmount these shortcomings. This paper is devoted to studying the influence of layer thickness and raster angle on the mechanical properties of 3D-printed PEEK. Samples with three different layer thicknesses (200, 300 and 400 μm) and raster angles (0°, 30° and 45°) were built using a PEEK 3D printing system, and their tensile, compressive and bending strengths were tested. The optimal mechanical properties of PEEK samples were found at a layer thickness of 300 μm and a raster angle of 0°. To evaluate the printing performance of PEEK, a comparison was made between the mechanical properties of 3D-printed PEEK and ABS parts. The results suggest that the average tensile strengths of PEEK parts were 108% higher than those of ABS, while compressive strengths were 114% higher and bending strengths 115% higher. However, the modulus of elasticity of the two materials was similar. These results indicate that the mechanical properties of 3D-printed PEEK are superior to those of 3D-printed ABS. PMID:28793537

  8. elevatr: Access Elevation Data from Various APIs

    EPA Pesticide Factsheets

    Several web services provide access to elevation data. This package provides access to several of those services and returns elevation data either as a SpatialPointsDataFrame from point elevation services or as a raster object from raster elevation services. Currently, the package supports access to the Mapzen Elevation Service, the Mapzen Terrain Service, and the USGS Elevation Point Query Service. The R language for statistical computing is increasingly used for spatial data analysis. This R package, elevatr, responds to this trend by providing access to elevation data from various sources directly in R. The impact of `elevatr` is that it will 1) facilitate spatial analysis in R by providing access to a foundational dataset for many types of analyses (e.g. hydrology, limnology), 2) open up a new set of users and uses for APIs widely used outside of R, and 3) provide an excellent example of federal open source development as promoted by the Federal Source Code Policy (https://sourcecode.cio.gov/).

  9. Definition and design of an experiment to test raster scanning with rotating unbalanced-mass devices on gimbaled payloads

    NASA Technical Reports Server (NTRS)

    Lightsey, W. D.; Alhorn, D. C.; Polites, M. E.

    1992-01-01

    An experiment designed to test the feasibility of using rotating unbalanced-mass (RUM) devices for line and raster scanning of gimbaled payloads, while expending very little power, is described. The experiment is configured for ground-based testing, but the scan concept is applicable to ground-based, balloon-borne, and space-based payloads, as well as free-flying spacecraft. The servos used in scanning are defined; the electronic hardware is specified; and a computer simulation model of the system is described. Simulation results are presented that predict system performance and verify the servo designs.

  10. Spatial Data Transfer Standard (SDTS)

    USGS Publications Warehouse

    ,

    1999-01-01

    The American National Standards Institute's (ANSI) Spatial Data Transfer Standard (SDTS) is a mechanism for the archiving and transfer of spatial data (including metadata) between dissimilar computer systems. The SDTS specifies exchange constructs, such as format, structure, and content, for spatially referenced vector and raster (including gridded) data. The SDTS includes a flexible conceptual model, specifications for a quality report, transfer module specifications, data dictionary specifications, and definitions of spatial features and attributes.

  11. Analyzing rasters, vectors and time series using new Python interfaces in GRASS GIS 7

    NASA Astrophysics Data System (ADS)

    Petras, Vaclav; Petrasova, Anna; Chemin, Yann; Zambelli, Pietro; Landa, Martin; Gebbert, Sören; Neteler, Markus; Löwe, Peter

    2015-04-01

    GRASS GIS 7 is a free and open source GIS software package developed and used by many scientists (Neteler et al., 2012). While some users of GRASS GIS prefer its graphical user interface, a significant part of the scientific community takes advantage of the various scripting and programming interfaces offered by GRASS GIS to develop new models and algorithms. Here we will present the different interfaces added to GRASS GIS 7 and available in Python, a popular programming language and environment in the geosciences. These Python interfaces are designed to satisfy the needs of scientists and programmers under various circumstances. PyGRASS (Zambelli et al., 2013) is a new object-oriented interface to GRASS GIS modules and libraries. The GRASS GIS libraries are implemented in C to ensure maximum performance, and the PyGRASS interface provides intuitive, Pythonic access to their functionality. The GRASS GIS Python scripting library is another way of accessing GRASS GIS modules; it combines the simplicity of Bash with the efficiency of Python syntax. When full access to all low-level and advanced functions and structures of the GRASS GIS library is required, Python programmers can use an interface based on the Python ctypes package. The ctypes interface provides complete, direct access to all functionality as it would be available to C programmers. GRASS GIS also provides a specialized Python library for managing and analyzing spatio-temporal data (Gebbert and Pebesma, 2014). The temporal library introduces space time datasets representing time series of raster, 3D raster or vector maps and allows users to combine various spatio-temporal operations including queries, aggregation, sampling or the analysis of spatio-temporal topology. We will also discuss the advantages of implementing a scientific algorithm as a GRASS GIS module and we will show how to write such a module in Python. To facilitate the development of the module, GRASS GIS provides a Python library for testing (Petras and Gebbert, 2014) which

  12. Non-contact optoacoustic imaging by raster scanning a piezoelectric air-coupled transducer

    NASA Astrophysics Data System (ADS)

    Deán-Ben, X. Luís.; Pang, Genny A.; Montero de Espinosa, Francisco; Razansky, Daniel

    2016-03-01

    Optoacoustic techniques rely on ultrasound transmission between optical absorbers within tissues and the measurement location. Much as in echography, the commonly used piezoelectric transducers require either direct contact with the tissue or coupling through a liquid medium. The contact nature of this detection approach represents a disadvantage of standard optoacoustic systems with respect to other imaging modalities (including optical techniques) in applications where non-contact imaging is needed, e.g. in open surgeries or when burns or other lesions are present on the skin. Herein, non-contact optoacoustic imaging using raster scanning of a spherically focused piezoelectric air-coupled ultrasound transducer is demonstrated. When employing laser fluence levels not exceeding the maximum permissible human exposure, it is shown to be possible to attain detectable signals from objects as small as 1 mm having absorption properties representative of blood at near-infrared wavelengths with a relatively low number of averages. Optoacoustic images of vessel-mimicking tubes embedded in an agar phantom are further showcased. The initial results indicate that the air-coupled ultrasound detection approach can potentially be made suitable for non-contact biomedical imaging with optoacoustics.

  13. A decision support system for map projections of small scale data

    USGS Publications Warehouse

    Finn, Michael P.; Usery, E. Lynn; Posch, Stephan T.; Seong, Jeong Chang

    2004-01-01

    The use of commercial geographic information system software to process large raster datasets of terrain elevation, population, land cover, vegetation, soils, temperature, and rainfall requires both projection from spherical coordinates to plane coordinate systems and transformation from one plane system to another. Decision support systems deliver information resulting in knowledge that assists in policies, priorities, or processes. This paper presents an approach to handling the problems of raster dataset projection and transformation through the development of a Web-enabled decision support system to aid users of transformation processes with the selection of appropriate map projections based on data type, areal extent, location, and preservation properties.

  14. USGS GeoData Digital Raster Graphics

    USGS Publications Warehouse

    ,

    2001-01-01

    Passive diffusion samplers have been tested at a number of sites where volatile organic compounds (VOCs) are the principal contaminants in ground water. Test results generally show good agreement between concentrations of VOCs in samples collected with diffusion samplers and concentrations in samples collected by purging the water from a well. Diffusion samplers offer several advantages over conventional and low-flow ground-water sampling procedures:
    - Elimination of the need to purge a well before collecting a sample and to dispose of contaminated water.
    - Elimination of cross-contamination of samples associated with sampling with non-dedicated pumps or sample delivery tubes.
    - Reduction in sampling time by as much as 80 percent of that required for "purge type" sampling methods.
    - An increase in the frequency and spatial coverage of monitoring at a site because of the associated savings in time and money.
    The successful use of diffusion samplers depends on the following three primary factors: (1) understanding site conditions and contaminants of interest (defining sample objectives), (2) validating the results of diffusion samplers against more widely acknowledged sampling methods, and (3) applying diffusion samplers in the field.

  15. Rastering strategy for screening and centring of microcrystal samples of human membrane proteins with a sub-10 µm size X-ray synchrotron beam

    PubMed Central

    Cherezov, Vadim; Hanson, Michael A.; Griffith, Mark T.; Hilgart, Mark C.; Sanishvili, Ruslan; Nagarajan, Venugopalan; Stepanov, Sergey; Fischetti, Robert F.; Kuhn, Peter; Stevens, Raymond C.

    2009-01-01

    Crystallization of human membrane proteins in lipidic cubic phase often results in very small but highly ordered crystals. The advent of the sub-10 µm minibeam at the APS GM/CA CAT has enabled the collection of high-quality diffraction data from such microcrystals. Herein we describe the challenges and solutions related to growing, manipulating and collecting data from optically invisible microcrystals embedded in an opaque frozen in meso material. Of critical importance is the use of the intense and small synchrotron beam to raster through and locate the crystal sample in an efficient and reliable manner. The resulting diffraction patterns show a significant reduction in background, with strong intensity and improved diffraction resolution compared with larger beam sizes. Three high-resolution structures of human G protein-coupled receptors serve as evidence of the utility of these techniques, which will likely be useful for future structure determination efforts. We anticipate that further innovations in the technologies applied to microcrystallography will enable the solving of structures of ever more challenging targets. PMID:19535414

  16. Vector-Based Data Services for NASA Earth Science

    NASA Astrophysics Data System (ADS)

    Rodriguez, J.; Roberts, J. T.; Ruvane, K.; Cechini, M. F.; Thompson, C. K.; Boller, R. A.; Baynes, K.

    2016-12-01

    Vector data sources offer opportunities for mapping and visualizing science data in a way that allows more customizable rendering and deeper data analysis than traditional raster images, and popular formats like GeoJSON and Mapbox Vector Tiles allow diverse types of geospatial data to be served in a high-performance, easily consumed package. Vector data is especially suited to highly dynamic mapping applications and visualization of complex datasets, while growing levels of support for vector formats and features in open-source mapping clients have made utilizing them easier and more powerful than ever. NASA's Global Imagery Browse Services (GIBS) is working to make NASA data more easily and conveniently accessible than ever by serving vector datasets via GeoJSON, Mapbox Vector Tiles, and raster images. This presentation will review these output formats; the services, including WFS, WMS, and WMTS, that can be used to access the data; and some ways in which vector sources can be utilized in popular open-source mapping clients like OpenLayers. Lessons learned from GIBS' recent move towards serving vector data will be discussed, as well as how to use GIBS open source software to create, configure, and serve vector data sources using MapServer and the GIBS OnEarth Apache module.
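    As a minimal illustration of the kind of payload involved, a GeoJSON FeatureCollection (the layer name and properties below are hypothetical, not from GIBS) can be built and round-tripped with nothing but a standard JSON library:

```python
import json

# A minimal GeoJSON FeatureCollection of the kind a vector endpoint
# might return; the "layer" and "confidence" properties are illustrative.
feature_collection = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [-77.0, 38.9]},
            "properties": {"layer": "fires", "confidence": 0.87},
        }
    ],
}

# Serialize for transport and parse it back, as a client would.
encoded = json.dumps(feature_collection)
decoded = json.loads(encoded)
```

Because GeoJSON is plain JSON, any mapping client that can parse JSON can consume it directly, which is much of the appeal noted above.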

  17. Geographic information system datasets of regolith-thickness data, regolith-thickness contours, raster-based regolith thickness, and aquifer-test and specific-capacity data for the Lost Creek Designated Ground Water Basin, Weld, Adams, and Arapahoe Counties, Colorado

    USGS Publications Warehouse

    Arnold, L. Rick

    2010-01-01

    These datasets were compiled in support of U.S. Geological Survey Scientific-Investigations Report 2010-5082-Hydrogeology and Steady-State Numerical Simulation of Groundwater Flow in the Lost Creek Designated Ground Water Basin, Weld, Adams, and Arapahoe Counties, Colorado. The datasets were developed by the U.S. Geological Survey in cooperation with the Lost Creek Ground Water Management District and the Colorado Geological Survey. The four datasets are described as follows and methods used to develop the datasets are further described in Scientific-Investigations Report 2010-5082: (1) ds507_regolith_data: This point dataset contains geologic information concerning regolith (unconsolidated sediment) thickness and top-of-bedrock altitude at selected well and test-hole locations in and near the Lost Creek Designated Ground Water Basin, Weld, Adams, and Arapahoe Counties, Colorado. Data were compiled from published reports, consultant reports, and from lithologic logs of wells and test holes on file with the U.S. Geological Survey Colorado Water Science Center and the Colorado Division of Water Resources. (2) ds507_regthick_contours: This dataset consists of contours showing generalized lines of equal regolith thickness overlying bedrock in the Lost Creek Designated Ground Water Basin, Weld, Adams, and Arapahoe Counties, Colorado. Regolith thickness was contoured manually on the basis of information provided in the dataset ds507_regolith_data. (3) ds507_regthick_grid: This dataset consists of raster-based generalized thickness of regolith overlying bedrock in the Lost Creek Designated Ground Water Basin, Weld, Adams, and Arapahoe Counties, Colorado. Regolith thickness in this dataset was derived from contours presented in the dataset ds507_regthick_contours. (4) ds507_welltest_data: This point dataset contains estimates of aquifer transmissivity and hydraulic conductivity at selected well locations in the Lost Creek Designated Ground Water Basin, Weld, Adams, and

  18. Spatial Data Transfer Standard (SDTS), part 5 : SDTS raster profile and extensions

    DOT National Transportation Integrated Search

    1999-02-01

    The Spatial Data Transfer Standard (SDTS) defines a general mechanism for the transfer of : geographically referenced spatial data and its supporting metadata, i.e., attributes, data quality reports, : coordinate reference systems, security informati...

  19. BOREAS Soils Data over the SSA in Raster Format and AEAC Projection

    NASA Technical Reports Server (NTRS)

    Knapp, David; Rostad, Harold; Hall, Forrest G. (Editor)

    2000-01-01

    This data set consists of GIS layers that describe the soils of the BOREAS SSA. The original data were submitted as vector layers that were gridded by BOREAS staff to a 30-meter pixel size in the AEAC projection. These data layers include the soil code (which relates to the soil name), modifier (which also relates to the soil name), and extent (indicating the extent that this soil exists within the polygon). There are three sets of these layers representing the primary, secondary, and tertiary soil characteristics. Thus, there is a total of nine layers in this data set along with supporting files. The data are stored in binary, image format files.

  20. Influence of Layer Thickness, Raster Angle, Deformation Temperature and Recovery Temperature on the Shape-Memory Effect of 3D-Printed Polylactic Acid Samples

    PubMed Central

    Wu, Wenzheng; Ye, Wenli; Wu, Zichao; Geng, Peng; Wang, Yulei; Zhao, Ji

    2017-01-01

    The success of the 3D-printing process depends upon the proper selection of process parameters. However, the majority of current related studies focus on the influence of process parameters on the mechanical properties of the parts. The influence of process parameters on the shape-memory effect has been little studied. This study used the orthogonal experimental design method to evaluate the influence of the layer thickness H, raster angle θ, deformation temperature Td and recovery temperature Tr on the shape-recovery ratio Rr and maximum shape-recovery rate Vm of 3D-printed polylactic acid (PLA). The order and contribution of every experimental factor on the target index were determined by range analysis and ANOVA, respectively. The experimental results indicated that the recovery temperature exerted the greatest effect with a variance ratio of 416.10, whereas the layer thickness exerted the smallest effect on the shape-recovery ratio with a variance ratio of 4.902. The recovery temperature exerted the most significant effect on the maximum shape-recovery rate with the highest variance ratio of 1049.50, whereas the raster angle exerted the minimum effect with a variance ratio of 27.163. The results showed that the shape-memory effect of 3D-printed PLA parts depended strongly on recovery temperature, and depended more weakly on the deformation temperature and 3D-printing parameters. PMID:28825617
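    The range analysis used above to rank factor influence can be sketched in a few lines. The run table below is hypothetical, not the paper's data; it only shows how a dominant factor (here the recovery-temperature stand-in) produces a large range of level means while a weak factor produces a small one:

```python
# Hedged sketch of range analysis for an orthogonal experiment: for each
# factor, average the response over the runs at each level, then take the
# range (max level mean - min level mean). A larger range indicates a
# stronger influence on the response. Runs are hypothetical.
runs = [  # (thickness_level, recovery_temp_level, shape_recovery_ratio)
    (0, 0, 0.62), (0, 1, 0.81), (0, 2, 0.95),
    (1, 0, 0.60), (1, 1, 0.83), (1, 2, 0.94),
    (2, 0, 0.61), (2, 1, 0.80), (2, 2, 0.96),
]

def factor_range(runs, idx):
    """Range of the per-level means of the response for factor `idx`."""
    levels = {}
    for run in runs:
        levels.setdefault(run[idx], []).append(run[2])
    means = [sum(v) / len(v) for v in levels.values()]
    return max(means) - min(means)

r_thickness = factor_range(runs, 0)  # small range -> weak effect
r_recovery  = factor_range(runs, 1)  # large range -> dominant effect
```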

  1. Integrating fine-scale soil data into species distribution models: preparing Soil Survey Geographic (SSURGO) data from multiple counties

    Treesearch

    Matthew P. Peters; Louis R. Iverson; Anantha M. Prasad; Steve N. Matthews

    2013-01-01

    Fine-scale soil (SSURGO) data were processed at the county level for 37 states within the eastern United States, initially for use as predictor variables in a species distribution model called DISTRIB II. Values from county polygon files converted into a continuous 30-m raster grid were aggregated to 4-km cells and integrated with other environmental and site condition...
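    The roll-up of a fine raster to coarser cells can be sketched as a block mean; the grid and block size below are illustrative (a 2x2 block standing in for the 30-m to 4-km aggregation):

```python
# Hedged sketch of aggregating a fine raster to coarser cells by block
# averaging. A 4x4 grid and block size 2 stand in for the much larger
# 30 m -> 4 km case; values are illustrative.
fine = [
    [1, 1, 2, 2],
    [1, 1, 2, 2],
    [3, 3, 4, 4],
    [3, 3, 4, 4],
]

def block_mean(grid, block):
    """Aggregate a grid by averaging non-overlapping block x block cells."""
    rows, cols = len(grid), len(grid[0])
    out = []
    for r in range(0, rows, block):
        out_row = []
        for c in range(0, cols, block):
            vals = [grid[r + i][c + j]
                    for i in range(block) for j in range(block)]
            out_row.append(sum(vals) / len(vals))
        out.append(out_row)
    return out

coarse = block_mean(fine, 2)  # -> [[1.0, 2.0], [3.0, 4.0]]
```

For categorical soil attributes, a majority vote over each block would replace the mean, but the traversal is the same.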

  2. Data layer integration for the national map of the united states

    USGS Publications Warehouse

    Usery, E.L.; Finn, M.P.; Starbuck, M.

    2009-01-01

    The integration of geographic data layers in multiple raster and vector formats, from many different organizations and at a variety of resolutions and scales, is a significant problem for The National Map of the United States being developed by the U.S. Geological Survey. Our research has examined data integration from a layer-based approach for five of The National Map data layers: digital orthoimages, elevation, land cover, hydrography, and transportation. An empirical approach has included visual assessment by a set of respondents with statistical analysis to establish the meaning of various types of integration. A separate theoretical approach with established hypotheses tested against actual data sets has resulted in an automated procedure for integration of specific layers and is being tested. The empirical analysis has established resolution bounds on meanings of integration with raster datasets and distance bounds for vector data. The theoretical approach has used a combination of theories on cartographic transformation and generalization, such as Töpfer's radical law, and additional research concerning optimum viewing scales for digital images to establish a set of guiding principles for integrating data of different resolutions.

  3. The national elevation data set

    USGS Publications Warehouse

    Gesch, Dean B.; Oimoen, Michael J.; Greenlee, Susan K.; Nelson, Charles A.; Steuck, Michael J.; Tyler, Dean J.

    2002-01-01

    The NED is a seamless raster dataset from the USGS that fulfills many of the concepts of framework geospatial data as envisioned for the NSDI, allowing users to focus on analysis rather than data preparation. It is regularly maintained and updated, and it provides basic elevation data for many GIS applications. The NED is one of several seamless datasets that the USGS is making available through the Web. The techniques and approaches developed for producing, maintaining, and distributing the NED are the type that will be used for implementing the USGS National Map (http://nationalmap.usgs.gov/).

  4. Anisotropic diffusion of fluorescently labeled ATP in rat cardiomyocytes determined by raster image correlation spectroscopy

    PubMed Central

    Vendelin, Marko; Birkedal, Rikke

    2008-01-01

    A series of experimental data points to the existence of profound diffusion restrictions of ADP/ATP in rat cardiomyocytes. This assumption is required to explain the measurements of kinetics of respiration, sarcoplasmic reticulum loading with calcium, and kinetics of ATP-sensitive potassium channels. To be able to analyze and estimate the role of intracellular diffusion restrictions on bioenergetics, the intracellular diffusion coefficients of metabolites have to be determined. The aim of this work was to develop a practical method for determining diffusion coefficients in anisotropic medium and to estimate the overall diffusion coefficients of fluorescently labeled ATP in rat cardiomyocytes. For that, we have extended raster image correlation spectroscopy (RICS) protocols to be able to discriminate the anisotropy in the diffusion coefficient tensor. Using this extended protocol, we estimated diffusion coefficients of ATP labeled with the fluorescent conjugate Alexa Fluor 647 (Alexa-ATP). In the analysis, we assumed that the diffusion tensor can be described by two values: diffusion coefficient along the myofibril and that across it. The average diffusion coefficients found for Alexa-ATP were as follows: 83 ± 14 μm2/s in the longitudinal and 52 ± 16 μm2/s in the transverse directions (n = 8, mean ± SD). Those values are ∼2 (longitudinal) and ∼3.5 (transverse) times smaller than the diffusion coefficient value estimated for the surrounding solution. Such uneven reduction of average diffusion coefficient leads to anisotropic diffusion in rat cardiomyocytes. Although the source for such anisotropy is uncertain, we speculate that it may be induced by the ordered pattern of intracellular structures in rat cardiomyocytes. PMID:18815224

  5. An outlet breaching algorithm for the treatment of closed depressions in a raster DEM

    NASA Astrophysics Data System (ADS)

    Martz, Lawrence W.; Garbrecht, Jurgen

    1999-08-01

    Automated drainage analysis of raster DEMs typically begins with the simulated filling of all closed depressions and the imposition of a drainage pattern on the resulting flat areas. The elimination of closed depressions by filling implicitly assumes that all depressions are caused by elevation underestimation. This assumption is difficult to support, as depressions can be produced by overestimation as well as by underestimation of DEM values. This paper presents a new algorithm that is applied in conjunction with conventional depression filling to provide a more realistic treatment of those depressions that are likely due to overestimation errors. The algorithm lowers the elevation of selected cells on the edge of closed depressions to simulate breaching of the depression outlets. Application of this breaching algorithm prior to depression filling can substantially reduce the number and size of depressions that need to be filled, especially in low-relief terrain. Removing or reducing the size of a depression by breaching implicitly assumes that the depression is due to a spurious flow blockage caused by elevation overestimation. Removing a depression by filling, on the other hand, implicitly assumes that the depression is a direct artifact of elevation underestimation. Although the breaching algorithm cannot distinguish between overestimation and underestimation errors in a DEM, a constraining parameter for breaching length can be used to restrict breaching to closed depressions caused by narrow blockages along well-defined drainage courses. These are considered the depressions most likely to have arisen from overestimation errors. Applying the constrained breaching algorithm prior to a conventional depression-filling algorithm allows both positive and negative elevation adjustments to be used to remove depressions. The breaching algorithm was incorporated into the DEM pre-processing operations of the TOPAZ software system. The effect of the algorithm is illustrated
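    The conventional depression-filling step that the breaching algorithm precedes can be sketched as a priority-flood fill, which raises every closed depression to the elevation of its lowest outlet. This is a standard technique, not the TOPAZ implementation, and the DEM below is illustrative:

```python
import heapq

# Hedged sketch of priority-flood depression filling: flood inward from
# the DEM edges, always expanding from the lowest frontier cell, and
# raise any lower neighbour to the current pour elevation.
def fill_depressions(dem):
    rows, cols = len(dem), len(dem[0])
    filled = [row[:] for row in dem]
    visited = [[False] * cols for _ in range(rows)]
    heap = []
    # Seed the priority queue with all edge cells (water can exit there).
    for r in range(rows):
        for c in range(cols):
            if r in (0, rows - 1) or c in (0, cols - 1):
                heapq.heappush(heap, (filled[r][c], r, c))
                visited[r][c] = True
    while heap:
        elev, r, c = heapq.heappop(heap)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not visited[nr][nc]:
                visited[nr][nc] = True
                # A neighbour below the pour level sits in a depression.
                filled[nr][nc] = max(filled[nr][nc], elev)
                heapq.heappush(heap, (filled[nr][nc], nr, nc))
    return filled

# Small DEM with a closed depression (the 1s) whose lowest outlet (4)
# drains toward the low edge cell (2).
dem = [
    [5, 5, 5, 5, 5],
    [2, 4, 1, 1, 5],
    [5, 5, 1, 1, 5],
    [5, 5, 5, 5, 5],
]
filled = fill_depressions(dem)  # depression cells rise to elevation 4
```

Breaching, by contrast, would lower the blocking cells along the outlet path before this fill runs, so fewer cells need to be raised.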

  6. Map projections for global and continental data sets and an analysis of pixel distortion caused by reprojection

    USGS Publications Warehouse

    Steinwand, Daniel R.; Hutchinson, John A.; Snyder, J.P.

    1995-01-01

    In global change studies the effects of map projection properties on data quality are apparent, and the choice of projection is significant. To aid compilers of global and continental data sets, six equal-area projections were chosen: the interrupted Goode Homolosine, the interrupted Mollweide, the Wagner IV, and the Wagner VII for global maps; the Lambert Azimuthal Equal-Area for hemisphere maps; and the Oblated Equal-Area and the Lambert Azimuthal Equal-Area for continental maps. Distortions in small-scale maps caused by reprojection, and the additional distortions incurred when reprojecting raster images, were quantified and graphically depicted. For raster images, the errors caused by the usual resampling methods (pixel brightness level interpolation) were responsible for much of the additional error where the local resolution and scale change were the greatest.

  7. Improving data management and dissemination in web based information systems by semantic enrichment of descriptive data aspects

    NASA Astrophysics Data System (ADS)

    Gebhardt, Steffen; Wehrmann, Thilo; Klinger, Verena; Schettler, Ingo; Huth, Juliane; Künzer, Claudia; Dech, Stefan

    2010-10-01

    The German-Vietnamese water-related information system for the Mekong Delta (WISDOM) project supports business processes in Integrated Water Resources Management in Vietnam. Multiple disciplines bring together earth- and ground-based observation themes, such as environmental monitoring, water management, demographics, economy, information technology, and infrastructural systems. This paper introduces the components of the web-based WISDOM system, including the data, logic and presentation tiers. It focuses on the data models upon which the database management system is built, including techniques for tagging or linking metadata with the stored information. The model also uses ordered groupings of spatial, thematic and temporal reference objects to semantically tag datasets and enable fast data retrieval, such as finding all data in a specific administrative unit belonging to a specific theme. A spatial database extension is employed by the PostgreSQL database. This object-oriented database was chosen over a relational database to tag spatial objects to tabular data, improving the retrieval of census and observational data at the regional, provincial, and local levels. Because the spatial database hinders the processing of raster data, a "work-around" was built into WISDOM to permit efficient management of both raster and vector data. The data model also incorporates styling aspects of the spatial datasets through styled layer descriptions (SLD) and web mapping service (WMS) layer specifications, allowing retrieval of rendered maps. Metadata elements of the spatial data are based on the ISO 19115 standard. XML-structured information for the SLD and metadata is stored in an XML database. The data models and the data management system are robust for managing the large quantity of spatial objects, sensor observations, and census and document data. The operational WISDOM information system prototype contains modules for data management, automatic data integration, and web services for data

  8. Towards Direct Manipulation and Remixing of Massive Data: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2012-04-01

    Complex analytics on "big data" is one of the core challenges of current Earth science, generating strong requirements for on-demand processing and fil tering of massive data sets. Issues under discussion include flexibility, performance, scalability, and the heterogeneity of the information types invo lved. In other domains, high-level query languages (such as those offered by database systems) have proven successful in the quest for flexible, scalable data access interfaces to massive amounts of data. However, due to the lack of support for many of the Earth science data structures, database systems are only used for registries and catalogs, but not for the bulk of spatio-temporal data. One core information category in this field is given by coverage data. ISO 19123 defines coverages, simplifying, as a representation of a "space-time varying phenomenon". This model can express a large class of Earth science data structures, including rectified and non-rectified rasters, curvilinear grids, point clouds, TINs, general meshes, trajectories, surfaces, and solids. This abstract definition, which is too high-level to establish interoperability, is concretized by the OGC GML 3.2.1 Application Schema for Coverages Standard into an interoperable representation. The OGC Web Coverage Processing Service (WCPS) Standard defines a declarative query language on multi-dimensional raster-type coverages, such as 1D in-situ sensor timeseries, 2D EO imagery, 3D x/y/t image time series and x/y/z geophysical data, 4D x/y/z/t climate and ocean data. Hence, important ingredients for versatile coverage retrieval are given - however, this potential has not been fully unleashed by service architectures up to now. The EU FP7-INFRA project EarthServer, launched in September 2011, aims at enabling standards-based on-demand analytics over the Web for Earth science data based on an integration of W3C XQuery for alphanumeric data and OGC-WCPS for raster data. 
Ultimately, EarthServer will support
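
    To make the declarative flavor of WCPS concrete, a minimal query in the style of the standard's abstract syntax might look like the following; the coverage name `TempCube` and the time subset are hypothetical, chosen only for illustration:

    ```
    for c in ( TempCube )
    return
      encode(
        avg( c[ ansi("2011-01" : "2011-12") ] ),
        "csv" )
    ```

    The server, not the client, evaluates the subsetting and aggregation, which is what allows filtering of massive coverages on demand.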

  9. Bring NASA Scientific Data into GIS

    NASA Astrophysics Data System (ADS)

    Xu, H.

    2016-12-01

    NASA's Earth Observing System (EOS) and many other missions produce data of huge volume and in near real time, which drives the research and understanding of climate change. Geographic Information System (GIS) is a technology used for the management, visualization, and analysis of spatial data. Since its inception in the 1960s, GIS has been applied to many fields at the city, state, national, and world scales. People continue to use it today to analyze and visualize trends, patterns, and relationships in massive scientific datasets. There is great interest in both the scientific and GIS communities in improving technologies that can bring scientific data into a GIS environment, where scientific research and analysis can be shared through the GIS platform with the public. Most NASA scientific data are delivered in the Hierarchical Data Format (HDF), a format that is both flexible and powerful. However, this flexibility results in challenges when trying to develop GIS software support - data stored in HDF formats lack a unified standard and convention among these products. The presentation introduces an information model that enables ArcGIS software to ingest NASA scientific data and create a multidimensional raster - univariate and multivariate hypercubes - for scientific visualization and analysis. We will present the framework by which ArcGIS leverages the open source GDAL (Geospatial Data Abstraction Library) to support its raster data access, discuss how we overcame the GDAL drivers' limitations in handling scientific products stored in HDF4 and HDF5 formats, and how we improved the modeling of multidimensionality with GDAL. In addition, we will talk about the direction of ArcGIS handling of NASA products and demonstrate how the multidimensional information model can help scientists work with various data products such as MODIS, MOPITT, and SMAP, as well as many other data products, in a GIS environment.
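
    The "hypercube" idea in the abstract - a variable indexed by time plus two spatial dimensions, from which 2-D rasters can be sliced for display - can be sketched in a few lines. This is an illustrative toy model, not ArcGIS's or GDAL's actual data structures; all names (`Hypercube`, `soil_moisture`) are hypothetical:

    ```python
    # Minimal sketch of a multidimensional raster "hypercube":
    # one variable indexed by (time, row, col), supporting 2-D slices.
    class Hypercube:
        def __init__(self, name, times, rows, cols, fill=0.0):
            self.name = name
            self.times = list(times)          # e.g. timestamps or band labels
            self.rows, self.cols = rows, cols
            self.data = [[[fill] * cols for _ in range(rows)]
                         for _ in self.times]

        def set(self, t, r, c, value):
            self.data[self.times.index(t)][r][c] = value

        def slice_at(self, t):
            """Return the 2-D raster (list of rows) for one time step."""
            return self.data[self.times.index(t)]

    cube = Hypercube("soil_moisture", ["2015-04", "2015-05"], rows=2, cols=3)
    cube.set("2015-05", 1, 2, 0.31)
    grid = cube.slice_at("2015-05")   # a plain 2-D raster, ready to render
    ```

    A GIS client would render each time slice as an ordinary raster layer while keeping the cube intact for along-time analysis.
    
    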

  10. A physical data model for fields and agents

    NASA Astrophysics Data System (ADS)

    de Jong, Kor; de Bakker, Merijn; Karssenberg, Derek

    2016-04-01

    Two approaches exist in simulation modeling: agent-based and field-based modeling. In agent-based (or individual-based) simulation modeling, the entities representing the system's state are represented by objects, which are bounded in space and time. Individual objects, like an animal, a house, or a more abstract entity like a country's economy, have properties representing their state. In an agent-based model this state is manipulated. In field-based modeling, the entities representing the system's state are represented by fields. Fields capture the state of a continuous property within a spatial extent, examples of which are elevation, atmospheric pressure, and water flow velocity. With respect to the technology used to create these models, the domains of agent-based and field-based modeling have often been separate worlds. In environmental modeling, widely used logical data models include feature data models for point, line and polygon objects, and the raster data model for fields. Simulation models are often either agent-based or field-based, even though the modeled system might contain both entities that are better represented by individuals and entities that are better represented by fields. We think that the reason for this dichotomy in kinds of models might be that the traditional object and field data models underlying those models are relatively low level. We have developed a higher level conceptual data model for representing both non-spatial and spatial objects, and spatial fields (De Bakker et al. 2016). Based on this conceptual data model we designed a logical and physical data model for representing many kinds of data, including the kinds used in earth system modeling (e.g. hydrological and ecological models). The goal of this work is to be able to create high level code and tools for the creation of models in which entities are representable by both objects and fields. 
Our conceptual data model is capable of representing the traditional feature data

  11. Coherent visualization of spatial data adapted to roles, tasks, and hardware

    NASA Astrophysics Data System (ADS)

    Wagner, Boris; Peinsipp-Byma, Elisabeth

    2012-06-01

    Modern crisis management requires that users with different roles and computer environments deal with a high volume of varied data from different sources. For this purpose, Fraunhofer IOSB has developed a geographic information system (GIS) which supports the user depending on the available data and the task to be solved. The system provides merging and visualization of spatial data from various civilian and military sources. It supports the most common spatial data standards (OGC, STANAG) as well as some proprietary interfaces, regardless of whether these are file-based or database-based. To set the visualization rules, generic Styled Layer Descriptors (SLDs) are used, which are an Open Geospatial Consortium (OGC) standard. SLDs allow specifying which data are shown, when, and how. The defined SLDs consider the users' roles and task requirements. In addition, it is possible to use different displays, and the visualization also adapts to the individual resolution of the display; excessively high or low information density is avoided. Our system also enables users with different roles to work together simultaneously using the same database. Every user is provided with appropriate and coherent spatial data depending on his current task. The refined spatial data are then served via the OGC services Web Map Service (WMS: server-side rendered raster maps) or Web Map Tile Service (WMTS: pre-rendered and cached raster maps).
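
    A minimal SLD of the kind described above might look like the following. This is a generic sketch against the SLD 1.0 schema, not Fraunhofer IOSB's actual rule set; the layer name `incidents` and the scale threshold are hypothetical:

    ```xml
    <StyledLayerDescriptor version="1.0.0"
        xmlns="http://www.opengis.net/sld"
        xmlns:ogc="http://www.opengis.net/ogc">
      <NamedLayer>
        <Name>incidents</Name>
        <UserStyle>
          <FeatureTypeStyle>
            <Rule>
              <!-- Show detailed point symbols only when zoomed in,
                   avoiding excessive information density at small scales -->
              <MaxScaleDenominator>50000</MaxScaleDenominator>
              <PointSymbolizer>
                <Graphic>
                  <Mark>
                    <WellKnownName>triangle</WellKnownName>
                    <Fill><CssParameter name="fill">#cc0000</CssParameter></Fill>
                  </Mark>
                  <Size>12</Size>
                </Graphic>
              </PointSymbolizer>
            </Rule>
          </FeatureTypeStyle>
        </UserStyle>
      </NamedLayer>
    </StyledLayerDescriptor>
    ```

    Role- and task-specific behavior comes from serving different SLD documents (or rules) to different user groups over the same underlying data.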

  12. Comparison of LiDAR-derived data and high resolution true color imagery for extracting urban forest cover

    Treesearch

    Aaron E. Maxwell; Adam C. Riley; Paul Kinder

    2013-01-01

    Remote sensing has many applications in forestry. Light detection and ranging (LiDAR) and high resolution aerial photography have been investigated as means to extract forest data, such as biomass, timber volume, stand dynamics, and gap characteristics. LiDAR return intensity data are often overlooked as a source of input raster data for thematic map creation. We...

  13. NAVAIR Portable Source Initiative (NPSI) Standard for Reusable Source Dataset Metadata (RSDM) V2.4

    DTIC Science & Technology

    2012-09-26

    defining a raster file format: <RasterFileFormat> <FormatName>TIFF</FormatName> <Order>BIP</Order> <DataType>8-BIT_UNSIGNED</DataType> ...interleaved by line (BIL); band interleaved by pixel (BIP). element RasterFileFormatType/DataType diagram type restriction of xsd:string facets

  14. Improving the Raster Scanning Methods used with X-ray Fluorescence to See the Ancient Greek Text of Archimedes (SULI Paper)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffin, Isabella B.; /Norfolk State U. /SLAC, SSRL

    2006-01-04

    X-ray fluorescence is being used to detect the ancient Greek copy of Archimedes' work. The copy of Archimedes' text was erased with a weak acid and written over to make a prayer book in the Middle Ages. The ancient parchment, made of goat skin, bears some of Archimedes' most valuable writings. The ink in the text contains iron, which fluoresces under x-ray radiation. My research project deals with the scanning and imaging process. The palimpsest is mounted on a stage that moves in a raster pattern. As the beam hits the parchment, a germanium detector detects the iron atoms and discriminates against other elements. Since the computer scans in both the forward and backward directions, it is imperative that each row of data lines up exactly on top of the next row. There are several parameters to consider when scanning the parchment: speed, count time, shutter time, x-number of points, and acceleration. Formulas were derived to relate these parameters. During the actual beam time of this project, the scanning was very slow; it took 30 hours to scan half of a page. Using the formulas, the scientists doubled the distance and speed to scan the parchment faster; however, the grey-scale data were not lined up properly, causing the images to look blurred. My project was to find out why doubling the parameters caused blurred images, and to fix the problem if it is fixable.

  15. Interpolating of climate data using R

    NASA Astrophysics Data System (ADS)

    Reinhardt, Katja

    2017-04-01

    Interpolation methods are used in many different geoscientific areas, such as soil physics, climatology, and meteorology. Unknown values are calculated by applying statistical approaches to known values. So far, the majority of climatologists have been using computer languages such as FORTRAN or C++, but there is also an increasing number of climate scientists using R for data processing and visualization. Most of them, however, are still working with array- and vector-based data, which is often associated with complex R code structures. For the presented study, I decided to convert the climate data into geodata and to perform the whole data processing using the raster package, gstat, and similar packages, providing a much more comfortable way of data handling. A central goal of my approach is to create an easy-to-use, powerful, and fast R script, implementing the entire geodata processing and visualization in a single, fully automated R-based procedure, which avoids the necessity of using other software packages such as ArcGIS or QGIS. Thus, large amounts of data with recurrent process sequences can be processed. The aim of the presented study, whose study area is located in western Central Asia, is to interpolate wind data based on the Era-Interim European reanalysis data, which are available as raster data with a resolution of 0.75° × 0.75°, to a finer grid. Various interpolation methods are used: inverse distance weighting, the geostatistical methods ordinary kriging and regression kriging, a generalized additive model, and the machine learning algorithms support vector machine and neural networks. Apart from the first two methods mentioned, the methods are used with influencing factors, e.g. geopotential and topography.
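
    Of the methods listed, inverse distance weighting is the simplest to state exactly: the estimate at an unknown location is a weighted mean of the known values, with weights decaying as a power of distance. A minimal, self-contained sketch (plain Python rather than the author's R/gstat workflow; the sample points are invented):

    ```python
    import math

    def idw(samples, x, y, power=2.0):
        """Inverse distance weighted estimate at (x, y).

        samples: list of (sx, sy, value) tuples of known points.
        A query that coincides with a sample returns that sample's value.
        """
        num = den = 0.0
        for sx, sy, v in samples:
            d = math.hypot(x - sx, y - sy)
            if d == 0.0:
                return v                  # exact hit on a known point
            w = d ** -power               # weight decays with distance
            num += w * v
            den += w
        return num / den

    obs = [(0.0, 0.0, 10.0), (1.0, 0.0, 20.0), (0.0, 1.0, 30.0)]
    mid = idw(obs, 0.5, 0.5)   # equidistant from all three samples
    ```

    Downscaling a coarse reanalysis grid amounts to evaluating such an estimator (or a kriging/ML counterpart) at every cell center of the finer target grid.
    
    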

  16. Deserts in the Deluge: TerraPopulus and Big Human-Environment Data.

    PubMed

    Manson, S M; Kugler, T A; Haynes, D

    2016-01-01

    Terra Populus, or TerraPop, is a cyberinfrastructure project that integrates, preserves, and disseminates massive data collections describing characteristics of the human population and environment over the last six decades. TerraPop has made a number of GIScience advances in the handling of big spatial data to make information interoperable between formats and across scientific communities. In this paper, we describe challenges of these data, or 'deserts in the deluge' of data, that are common to spatial big data more broadly, and explore computational solutions specific to microdata, raster, and vector data models.

  17. A framework for evaluating forest landscape model predictions using empirical data and knowledge

    Treesearch

    Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson; William D. Dijak; Qia Wang

    2014-01-01

    Evaluation of forest landscape model (FLM) predictions is indispensable to establish the credibility of predictions. We present a framework that evaluates short- and long-term FLM predictions at site and landscape scales. Site-scale evaluation is conducted through comparing raster cell-level predictions with inventory plot data whereas landscape-scale evaluation is...

  18. Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patchett, John M; Ahrens, James P; Lo, Li - Ta

    2010-10-15

    Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on scalability of rendering algorithms and architecture envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software-based ray tracing, software-based rasterization, and hardware-accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU- and CPU-based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software-based ray tracing offers a viable approach for scalable rendering of the projected future massive data sizes.

  19. mapview - Interactive viewing of spatial data in R

    NASA Astrophysics Data System (ADS)

    Appelhans, Tim; Detsch, Florian; Reudenbach, Cristoph; Woellauer, Stefan

    2016-04-01

    In this talk we would like to introduce mapview, an R package designed to aid researchers during their workflow of spatial data analysis. The package was initially developed within the framework of the DFG funded research group "KiLi - Kilimanjaro ecosystems under global change: Linking biodiversity, biotic interactions and biogeochemical ecosystem processes" but has quickly developed into a general purpose spatial data viewer. mapview provides some powerful tools for interactive visualization of standard spatial data in R. It has support for all Spatial*(DataFrame) objects as well as all Raster* objects. It is designed so that one function call - mapview(x) - is all you need to view the data interactively. Adding layers to existing views is very easy, and we have taken great care in providing suitable defaults for features such as background maps or coloring, but things can be customized flexibly (and permanently) to suit different needs. Even though mapview is for the most part based on the leaflet package, it is far more than just a convenience wrapper around leaflet functionality. mapview provides additional features for handling big data sets (up to several million points) as well as some specialized functionality to view and compare rasters of any size with arbitrary coordinate reference systems. Given that mapview is merely a bridge between R and the underlying leaflet.js JavaScript library, mapview can be used to produce web maps by simply providing the path to a designated folder. This talk will be a live demonstration of some of the key features of mapview.

  20. User Guide and Documentation for Five MODFLOW Ground-Water Modeling Utility Programs

    USGS Publications Warehouse

    Banta, Edward R.; Paschke, Suzanne S.; Litke, David W.

    2008-01-01

    This report documents five utility programs designed for use in conjunction with ground-water flow models developed with the U.S. Geological Survey's MODFLOW ground-water modeling program. One program extracts calculated flow values from one model for use as input to another model. The other four programs extract model input or output arrays from one model and make them available in a form that can be used to generate an ArcGIS raster data set. The resulting raster data sets may be useful for visual display of the data or for further geographic data processing. The utility program GRID2GRIDFLOW reads a MODFLOW binary output file of cell-by-cell flow terms for one (source) model grid and converts the flow values to input flow values for a different (target) model grid. The spatial and temporal discretization of the two models may differ. The four other utilities extract selected 2-dimensional data arrays in MODFLOW input and output files and write them to text files that can be imported into an ArcGIS geographic information system raster format. These four utilities require that the model cells be square and aligned with the projected coordinate system in which the model grid is defined. The four raster-conversion utilities are * CBC2RASTER, which extracts selected stress-package flow data from a MODFLOW binary output file of cell-by-cell flows; * DIS2RASTER, which extracts cell-elevation data from a MODFLOW Discretization file; * MFBIN2RASTER, which extracts array data from a MODFLOW binary output file of head or drawdown; and * MULT2RASTER, which extracts array data from a MODFLOW Multiplier file.
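
    The four raster-conversion utilities all target the same idea: flattening a model's 2-D cell array into a text form that ArcGIS can import as a raster. A common carrier for this is the ESRI ASCII grid format; the sketch below (not the USGS utilities' own code; array values and coordinates are invented) shows its shape:

    ```python
    def to_ascii_grid(array, xllcorner, yllcorner, cellsize, nodata=-9999):
        """Serialize a 2-D list (row 0 = northernmost row) in the
        ESRI ASCII grid format: a six-line header, then cell rows."""
        nrows, ncols = len(array), len(array[0])
        lines = [
            f"ncols        {ncols}",
            f"nrows        {nrows}",
            f"xllcorner    {xllcorner}",
            f"yllcorner    {yllcorner}",
            f"cellsize     {cellsize}",
            f"NODATA_value {nodata}",
        ]
        for row in array:
            lines.append(" ".join(str(v) for v in row))
        return "\n".join(lines) + "\n"

    # Hypothetical 2x2 block of simulated heads on a 100 m square grid.
    heads = [[101.5, 101.2], [100.9, 100.4]]
    grid_text = to_ascii_grid(heads, xllcorner=500000.0,
                              yllcorner=4420000.0, cellsize=100.0)
    ```

    The format's requirement of a single `cellsize` mirrors the report's constraint that model cells be square and aligned with the projected coordinate system.
    
    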

  1. Globes from global data: Charting international research networks with the GRASS GIS r.out.polycones add-on module.

    NASA Astrophysics Data System (ADS)

    Löwe, Peter

    2015-04-01

    Many Free and Open Source Software (FOSS) tools have been created for the various application fields within geoscience. While FOSS allows re-implementation of functionalities in new environments by access to the original codebase, the easiest approach to build new software solutions for new problems is the combination or merging of existing software tools. Such mash-ups are implemented by embedding and encapsulating FOSS tools within one another, effectively focusing the use of the embedded software on the specific role it needs to perform in the given scenario, while ignoring all its other capabilities. GRASS GIS is a powerful and established FOSS GIS for raster, vector and volume data processing, while the Generic Mapping Tools (GMT) are a suite of powerful Open Source mapping tools, which exceed the mapping capabilities of GRASS GIS. This poster reports on the new GRASS GIS add-on module r.out.polycones. It enables users to utilize non-continuous projections for map production within the GRASS production environment. This is implemented on the software level by encapsulating a subset of GMT mapping capabilities into a GRASS GIS (Version 6.x) add-on module. The module was developed at the German National Library of Science and Technology (TIB) to provide custom global maps of scientific collaboration networks, such as the DataCite consortium, the registration agency for Digital Object Identifiers (DOI) for research data. The GRASS GIS add-on module can be used for global mapping of raster data into a variety of non-continuous sinusoidal projections, allowing the creation of printable biangles (gores) to be used for globe making. Due to the well structured modular nature of GRASS modules, technical follow-up work will focus on API-level Python-based integration in GRASS 7 [1]. Based on this, GMT-based mapping capabilities in GRASS will be extended beyond non-continuous sinusoidal maps and advanced from raster layers to the content of GRASS display monitors. References

  2. Techniques and strategies for data integration in mineral resource assessment

    USGS Publications Warehouse

    Trautwein, Charles M.; Dwyer, John L.

    1991-01-01

    The Geologic and the National Mapping divisions of the U.S. Geological Survey have been involved formally in cooperative research and development of computer-based geographic information systems (GISs) applied to mineral-resource assessment objectives since 1982. Experience in the Conterminous United States Mineral Assessment Program (CUSMAP) projects, including the Rolla, Missouri; Dillon, Montana; Butte, Montana; and Tonopah, Nevada 1° × 2° quadrangles, has resulted in the definition of processing requirements for geographic and mineral-resource data that are common to these studies. The diverse formats of data sets collected and compiled for regional mineral-resource assessments necessitate capabilities for digitally encoding and entering data into appropriate tabular, vector, and raster subsystems of the GIS. Although many of the required data sets are either available or can be provided in a digital format suitable for direct entry, their utility is largely dependent on the original intent and consequent preprocessing of the data. In this respect, special care must be taken to ensure the digital data type, encoding, and format will meet assessment objectives. Data processing within the GIS is directed primarily toward the development and application of models that can be used to spatially describe geological, geophysical, and geochemical environments either known or inferred to be associated with specific types of mineral deposits. Consequently, capabilities to spatially analyze, aggregate, and display relations between data sets are principal processing requirements. To facilitate the development of these models within the GIS, interfaces must be developed among vector-, raster-, and tabular-based processing subsystems to reformat resident data sets for comparative analyses and multivariate display of relations.

  3. Evaluation of LiDAR-Acquired Bathymetric and Topographic Data Accuracy in Various Hydrogeomorphic Settings in the Lower Boise River, Southwestern Idaho, 2007

    USGS Publications Warehouse

    Skinner, Kenneth D.

    2009-01-01

    Elevation data in riverine environments can be used in various applications for which different levels of accuracy are required. The Experimental Advanced Airborne Research LiDAR (Light Detection and Ranging) - or EAARL - system was used to obtain topographic and bathymetric data along the lower Boise River, southwestern Idaho, for use in hydraulic and habitat modeling. The EAARL data were post-processed into bare earth and bathymetric raster and point datasets. Concurrently with the EAARL data collection, real-time kinematic global positioning system and total station ground-survey data were collected in three areas within the lower Boise River basin to assess the accuracy of the EAARL elevation data in different hydrogeomorphic settings. The accuracies of the EAARL-derived elevation data, determined in open, flat terrain to provide an optimal vertical comparison surface, had root mean square errors ranging from 0.082 to 0.138 m. Accuracies for bank, floodplain, and in-stream bathymetric data had root mean square errors ranging from 0.090 to 0.583 m. The greater root mean square errors for the latter data are the result of high levels of turbidity in the downstream ground-survey area, dense tree canopy, and horizontal location discrepancies between the EAARL and ground-survey data in steeply sloping areas such as riverbanks. The EAARL point to ground-survey comparisons produced results similar to those for the EAARL raster to ground-survey comparisons, indicating that the interpolation of the EAARL points to rasters did not introduce significant additional error. The mean percent error for the wetted cross-sectional areas of the two upstream ground-survey areas was 1 percent. The mean percent error increases to -18 percent if the downstream ground-survey area is included, reflecting the influence of turbidity in that area.

  4. Ground sample data for the Conterminous U.S. Land Cover Characteristics Database

    Treesearch

    Robert Burgan; Colin Hardy; Donald Ohlen; Gene Fosnight; Robert Treder

    1999-01-01

    Ground sample data were collected for a land cover database and raster map that portray 159 vegetation classes at 1-km² resolution for the conterminous United States. Locations for 3,500 1-km² ground sample plots were selected randomly across the United States. The number of plots representing each vegetation class was weighted by the proportionate coverage of each...

  5. Influence of Elevation Data Resolution on Spatial Prediction of Colluvial Soils in a Luvisol Region

    PubMed Central

    Penížek, Vít; Zádorová, Tereza; Kodešová, Radka; Vaněk, Aleš

    2016-01-01

    The development of a soil cover is a dynamic process. Soil cover can be altered within a few decades, which requires updating of the legacy soil maps. Soil erosion is one of the most important processes quickly altering soil cover on agricultural land. Colluvial soils develop in concave parts of the landscape as a consequence of sedimentation of eroded material. Colluvial soils are recognised as important soil units because they are a vast sink of soil organic carbon. Terrain derivatives have become an important tool in digital soil mapping and are among the most popular auxiliary data used for quantitative spatial prediction. Prediction success rates are often directly dependent on raster resolution. In our study, we tested how raster resolution (1, 2, 3, 5, 10, 20 and 30 meters) influences spatial prediction of colluvial soils. Terrain derivatives (altitude, slope, plan curvature, topographic position index, LS factor and convergence index) were calculated for the given raster resolutions. Four models were applied (boosted tree, neural network, random forest and Classification/Regression Tree) to spatially predict the soil cover over a 77 ha study plot. Model training and validation were based on 111 soil profiles surveyed on a regular sampling grid. Moreover, the predicted real extent and shape of the colluvial soil area were examined. In general, no clear trend in prediction accuracy was found within the given raster resolution range. Higher maximum prediction accuracy for colluvial soil, compared to prediction accuracy of the total soil cover of the study plot, can be explained by the choice of terrain derivatives that were best for differentiating Colluvial soils from other soil units. Regarding the character of the predicted Colluvial soils area, maps of 2 to 10 m resolution provided reasonable delineation of the colluvial soil as part of the cover over the study area. PMID:27846230
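
    Terrain derivatives such as slope are themselves computed from the elevation raster, which is exactly where resolution enters: the finite differences below change as `cellsize` changes. A minimal sketch (central differences on a plain 2-D list; not the study's actual toolchain, and the tilted-plane DEM is invented):

    ```python
    import math

    def slope_deg(dem, r, c, cellsize):
        """Slope in degrees at interior cell (r, c) of a DEM stored as
        a 2-D list, using simple central differences."""
        dzdx = (dem[r][c + 1] - dem[r][c - 1]) / (2.0 * cellsize)
        dzdy = (dem[r + 1][c] - dem[r - 1][c]) / (2.0 * cellsize)
        return math.degrees(math.atan(math.hypot(dzdx, dzdy)))

    # A tilted plane z = 2x sampled at 1 m spacing: constant gradient 2.
    dem = [[2.0 * c for c in range(5)] for _ in range(5)]
    s = slope_deg(dem, 2, 2, cellsize=1.0)
    ```

    On real terrain, coarser cells average away the short-wavelength relief (e.g. the concave hollows where colluvium accumulates), which is one mechanism behind resolution-dependent prediction accuracy.
    
    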

  6. Leveraging Open Standards and Technologies to Enhance Community Access to Earth Science Lidar Data

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Krishnan, S.; Cowart, C.; Baru, C.; Arrowsmith, R.

    2011-12-01

    Lidar (Light Detection and Ranging) data, collected from space, airborne and terrestrial platforms, have emerged as an invaluable tool for a variety of Earth science applications ranging from ice sheet monitoring to modeling of earth surface processes. However, lidar data present a unique suite of challenges from the perspective of building cyberinfrastructure systems that enable the scientific community to access these valuable research datasets. Lidar data are typically characterized by millions to billions of individual measurements of x,y,z position plus attributes; these "raw" data are also often accompanied by derived raster products and are frequently terabytes in size. As a relatively new and rapidly evolving data collection technology, relevant open data standards and software projects are immature compared to those for other remote sensing platforms. The NSF-funded OpenTopography Facility project has developed an online lidar data access and processing system that co-locates data with on-demand processing tools to enable users to access both raw point cloud data as well as custom derived products and visualizations. OpenTopography is built on a Service Oriented Architecture (SOA) in which applications and data resources are deployed as standards compliant (XML and SOAP) Web services with the open source Opal Toolkit. To develop the underlying applications for data access, filtering and conversion, and various processing tasks, OpenTopography has heavily leveraged existing open source software efforts for both lidar and raster data. Operating on the de facto LAS binary point cloud format (maintained by ASPRS), the open source libLAS and LASlib libraries provide OpenTopography data ingestion, query and translation capabilities. Similarly, raster data manipulation is performed through a suite of services built on the Geospatial Data Abstraction Library (GDAL).
OpenTopography has also developed our own algorithm for high-performance gridding of lidar point cloud data
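
    The core of gridding a point cloud is binning each (x, y, z) return into a raster cell and reducing the z values per cell. A minimal local-binning sketch (mean-z reduction; this is an illustrative toy, not OpenTopography's high-performance algorithm, and the points are invented):

    ```python
    from collections import defaultdict

    def grid_points(points, cellsize, x0, y0):
        """Bin (x, y, z) lidar returns into raster cells keyed by
        (col, row) and return the mean z per occupied cell."""
        sums = defaultdict(lambda: [0.0, 0])
        for x, y, z in points:
            key = (int((x - x0) // cellsize), int((y - y0) // cellsize))
            acc = sums[key]
            acc[0] += z       # running sum of elevations in this cell
            acc[1] += 1       # return count in this cell
        return {key: s / n for key, (s, n) in sums.items()}

    pts = [(0.2, 0.3, 10.0), (0.8, 0.6, 12.0), (1.5, 0.4, 20.0)]
    cells = grid_points(pts, cellsize=1.0, x0=0.0, y0=0.0)
    ```

    Production gridders replace the mean with min/max/IDW reductions and tile the domain so billions of returns can be processed in parallel.
    
    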

  7. Geometrical correction of the e-beam proximity effect for raster scan systems

    NASA Astrophysics Data System (ADS)

    Belic, Nikola; Eisenmann, Hans; Hartmann, Hans; Waas, Thomas

    1999-06-01

    Increasing demands on pattern fidelity and CD accuracy in e-beam lithography require a correction of the e-beam proximity effect. The new needs come mainly from OPC at mask level and x-ray lithography. The e-beam proximity effect limits the achievable resolution and affects neighboring structures, causing under- or over-exposure depending on the local pattern densities and process settings. Methods to compensate for this unbalanced dose distribution usually use a dose modulation or multiple passes. In general, raster scan systems are not able to apply variable doses in order to compensate for the proximity effect. For systems of this kind, a geometrical modulation of the original pattern offers a solution for compensating line edge deviations due to the proximity effect. In this paper a new method for the fast correction of the e-beam proximity effect via geometrical pattern optimization is described. The method consists of two steps. In the first step, the pattern-dependent dose distribution caused by backscattering is calculated by convolving the pattern with the long-range part of the proximity function. The restriction to the long-range part results in a quadratic speed gain in computing time for the transformation. The influence of the short-range part coming from forward scattering is not pattern dependent and can therefore be determined separately in a second step. The second calculation yields the dose curve at the border of a written structure. The finite gradient of this curve leads to an edge displacement depending on the amount of background dose at the observed position, which was previously determined in the pattern-dependent step. This unintended edge displacement is corrected by splitting the line into segments and shifting them by multiples of the writer's address grid in the opposite direction.
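
    The first step above - convolving the exposure pattern with the long-range part of the proximity function - can be sketched in one dimension. A Gaussian is assumed for the backscatter kernel (the usual model; the range `beta` and the pattern are invented, and this is not the paper's production code):

    ```python
    import math

    def background_dose(pattern, beta, cellsize):
        """Discrete 1-D convolution of an exposure pattern with a
        normalized Gaussian long-range (backscatter) kernel."""
        n = len(pattern)
        half = int(3 * beta / cellsize)           # truncate at 3*beta
        kernel = [math.exp(-((i * cellsize) ** 2) / beta ** 2)
                  for i in range(-half, half + 1)]
        norm = sum(kernel)
        kernel = [k / norm for k in kernel]
        out = []
        for i in range(n):
            acc = 0.0
            for j, k in enumerate(kernel):
                idx = i + j - half
                if 0 <= idx < n:
                    acc += k * pattern[idx]
            out.append(acc)
        return out

    # A wide uniform pattern: interior cells receive the full backscatter
    # dose, while cells near the pattern edge receive less.
    dose = background_dose([1.0] * 200, beta=10.0, cellsize=1.0)
    ```

    The resulting per-position background dose is what drives the segment-wise edge shifts in the correction's second step.
    
    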

  8. Cartographic potential of SPOT image data

    NASA Technical Reports Server (NTRS)

    Welch, R.

    1985-01-01

    In late 1985, the SPOT (Systeme Probatoire d'Observation de la Terre) satellite is to be launched by the Ariane rocket from French Guiana. This satellite will have two High Resolution Visible (HRV) line array sensor systems which are capable of providing monoscopic and stereoscopic coverage of the earth. Cartographic applications are related to the recording of stereo image data and the acquisition of 20-m data in a multispectral mode. One of the objectives of this study involves a comparison of the suitability of SPOT and TM image data for mapping urban land use/cover. Another objective is concerned with a preliminary assessment of the potential of SPOT image data for map revision when merged with conventional map sheets converted to raster formats.

  9. A data base approach for prediction of deforestation-induced mass wasting events

    NASA Technical Reports Server (NTRS)

    Logan, T. L.

    1981-01-01

    A major topic of concern in timber management is determining the impact of clear-cutting on slope stability. Deforestation treatments on steep mountain slopes have often resulted in a high frequency of major mass wasting events. The Geographic Information System (GIS) is a potentially useful tool for predicting the location of mass wasting sites. With a raster-based GIS, digitally encoded maps of slide hazard parameters can be overlayed and modeled to produce new maps depicting high probability slide areas. The present investigation has the objective to examine the raster-based information system as a tool for predicting the location of the clear-cut mountain slopes which are most likely to experience shallow soil debris avalanches. A literature overview is conducted, taking into account vegetation, roads, precipitation, soil type, slope-angle and aspect, and models predicting mass soil movements. Attention is given to a data base approach and aspects of slide prediction.

  10. Rapid Data Delivery System (RDDS)

    USGS Publications Warehouse

    Cress, Jill J.; Goplen, Susan E.

    2007-01-01

    Since the start of the active 2000 summer fire season, the U.S. Geological Survey (USGS) Rocky Mountain Geographic Science Center (RMGSC) has been actively engaged in providing crucial and timely support to Federal, State, and local natural hazards monitoring, analysis, response, and recovery activities. As part of this support, RMGSC has developed the Rapid Data Delivery System (RDDS) to provide emergency and incident response teams with timely access to geospatial data. The RDDS meets these needs by combining a simple web-enabled data viewer for the selection and preview of vector and raster geospatial data with an easy-to-use data ordering form. The RDDS viewer also incorporates geospatial locations for current natural hazard incidents, including wildfires, earthquakes, hurricanes, and volcanoes, allowing incident responders to quickly focus on their area of interest for data selection.

  11. Quantifying forest fragmentation using Geographic Information Systems and Forest Inventory and Analysis plot data

    Treesearch

    Dacia M. Meneguzzo; Mark H. Hansen

    2009-01-01

    Fragmentation metrics provide a means of quantifying and describing forest fragmentation. The most common method of calculating these metrics is through the use of Geographic Information System software to analyze raster data, such as a satellite or aerial image of the study area; however, the spatial resolution of the imagery has a significant impact on the results....
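
    Two common fragmentation metrics, patch count and the fraction of forest cells lying on a patch edge, can be sketched on a toy binary raster. With real imagery both numbers shift as the cell size changes, which is the resolution effect the abstract notes. The raster below is invented:

```python
import numpy as np
from scipy import ndimage

# Toy binary forest raster (1 = forest, 0 = non-forest); a real analysis
# would use a classified satellite or aerial image at a given resolution
forest = np.array([[1, 1, 0, 0, 1],
                   [1, 1, 0, 0, 1],
                   [0, 0, 0, 0, 0],
                   [1, 0, 0, 1, 1],
                   [1, 0, 0, 1, 1]])

# Patch count: 4-connected components of forest cells
labels, n_patches = ndimage.label(forest)

# Edge cells: forest cells with at least one non-forest 4-neighbour;
# padding treats the raster boundary as non-forest
padded = np.pad(forest, 1)
neigh_sum = (padded[:-2, 1:-1] + padded[2:, 1:-1]
             + padded[1:-1, :-2] + padded[1:-1, 2:])
edge_cells = (forest == 1) & (neigh_sum < 4)
edge_fraction = edge_cells.sum() / forest.sum()
```

    On this tiny grid every forest cell is an edge cell; coarser cells would merge patches and lower both metrics.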

  12. Determination of Dynamics of Plant Plasma Membrane Proteins with Fluorescence Recovery and Raster Image Correlation Spectroscopy.

    PubMed

    Laňková, Martina; Humpolíčková, Jana; Vosolsobě, Stanislav; Cit, Zdeněk; Lacek, Jozef; Čovan, Martin; Čovanová, Milada; Hof, Martin; Petrášek, Jan

    2016-04-01

    A number of fluorescence microscopy techniques have been described to study the dynamics of fluorescently labeled proteins, lipids, nucleic acids, and whole organelles. However, for studies of plant plasma membrane (PM) proteins, the number of these techniques is still limited because of the high complexity of the processes that determine the dynamics of PM proteins and the existence of the cell wall. Here, we report on the use of raster image correlation spectroscopy (RICS) for studies of integral PM proteins in suspension-cultured tobacco cells and show its potential in comparison with the more widely used fluorescence recovery after photobleaching method. For RICS, a set of microscopy images is obtained by single-photon confocal laser scanning microscopy (CLSM). Fluorescence fluctuations are subsequently correlated between individual pixels, and the information on protein mobility is extracted using a model that considers the processes generating the fluctuations, such as diffusion and chemical binding reactions. As we show here using the example of two integral PM transporters of the plant hormone auxin, RICS uncovered their distinct short-distance lateral mobility within the PM, which is dependent on the cytoskeleton and the sterol composition of the PM. RICS, which is routinely accessible on modern CLSM instruments, thus represents a valuable approach for studies of the dynamics of PM proteins in plants.

  13. Maps, Models and Data from Southeastern Great Basin PFA, Phase II Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nash, Greg

    This submission includes composite risk segment models in raster format for permeability, heat of the earth, and MT, as well as the final PFA model of geothermal exploration risk in southwestern Utah, USA. Additionally, this submission has data regarding hydrothermally altered areas and opal sinter deposits in the study area. All of this information aids in the understanding of, and exploration for, hidden geothermal systems in the area.

  14. Data, Metadata - Who Cares?

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2013-04-01

    There is a traditional saying that metadata are understandable, semantic-rich, and searchable. Data, on the other hand, are big, with no accessible semantics, and just downloadable. Not only has this led to an imbalance of search support from a user perspective, but also, underneath, to a deep technology divide, often using relational databases for metadata and bespoke archive solutions for data. Our vision is that this barrier will be overcome, and data and metadata will become searchable alike, leveraging the potential of semantic technologies in combination with scalability technologies. Ultimately, in this vision, ad-hoc processing and filtering will no longer be distinguished, forming a uniformly accessible data universe. In the European EarthServer initiative, we work towards this vision by federating database-style raster query languages with metadata search and geo broker technology. We present the approach taken, how it can leverage OGC standards, the benefits envisaged, and first results.

  15. Mapping disease at an approximated individual level using aggregate data: a case study of mapping New Hampshire birth defects.

    PubMed

    Shi, Xun; Miller, Stephanie; Mwenda, Kevin; Onda, Akikazu; Reese, Judy; Onega, Tracy; Gui, Jiang; Karagas, Margret; Demidenko, Eugene; Moeschler, John

    2013-09-06

    Limited by data availability, most disease maps in the literature are for relatively large and subjectively-defined areal units, which are subject to problems associated with polygon maps. High resolution maps based on objective spatial units are needed to more precisely detect associations between disease and environmental factors. We propose to use a Restricted and Controlled Monte Carlo (RCMC) process to disaggregate polygon-level location data to achieve mapping aggregate data at an approximated individual level. RCMC assigns a random point location to a polygon-level location, in which the randomization is restricted by the polygon and controlled by the background (e.g., population at risk). RCMC allows analytical processes designed for individual data to be applied, and generates high-resolution raster maps. We applied RCMC to the town-level birth defect data for New Hampshire and generated raster maps at the resolution of 100 m. Besides the map of significance of birth defect risk represented by p-value, the output also includes a map of spatial uncertainty and a map of hot spots. RCMC is an effective method to disaggregate aggregate data. An RCMC-based disease mapping maximizes the use of available spatial information, and explicitly estimates the spatial uncertainty resulting from aggregation.
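
    A minimal sketch of the restricted and controlled sampling idea, assuming a rasterized town polygon and a population raster (both invented): the polygon restricts the candidate cells, and the population at risk controls the sampling probability.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy grid: a "town" polygon rasterized to a mask, and a background
# population raster controlling where cases are likely to be located
town_mask = np.array([[0, 1, 1],
                      [1, 1, 1],
                      [0, 1, 0]], dtype=bool)
population = np.array([[0, 10, 5],
                       [20, 40, 5],
                       [0, 20, 0]], dtype=float)

# Restriction: candidate cells are those inside the town polygon.
# Control: sampling probability proportional to population at risk.
rows, cols = np.nonzero(town_mask)
cell_weights = population[rows, cols]
probs = cell_weights / cell_weights.sum()

def rcmc_point(rng):
    """Draw one approximated individual location for a town-level case."""
    i = rng.choice(len(rows), p=probs)
    # uniform jitter inside the chosen cell
    return rows[i] + rng.random(), cols[i] + rng.random()

points = [rcmc_point(rng) for _ in range(1000)]
# All simulated locations fall inside the town polygon
inside = all(town_mask[int(r), int(c)] for r, c in points)
```

    Repeating the whole simulation many times is what lets the method attach a spatial-uncertainty surface to the resulting raster maps.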

  16. Fluorescence correlation spectroscopy, Raster image correlation spectroscopy and Number & Brightness on a commercial confocal laser scanning microscope with analog detectors (Nikon C1)

    PubMed Central

    Moens, Pierre D.J.; Gratton, Enrico; Salvemini, Iyrri L.

    2010-01-01

    Fluorescence correlation spectroscopy (FCS) was developed in 1972 by Magde, Elson and Webb (Magde et al., 1972). Photon counting detectors and avalanche photodiodes have become standards in FCS to the point that there is a widespread belief that these detectors are essential to perform FCS experiments, despite the fact that FCS was developed using analog detectors. Spatial and temporal intensity fluctuation correlations using analog detection on a commercial Olympus Fluoview 300 microscope have been reported by Brown et al. (2008). However, each analog instrument has its own idiosyncrasies that need to be understood before using the instrument for FCS. In this work we explore the capabilities of the Nikon C1, a low-cost confocal microscope, to obtain single point FCS, Raster-scan Image Correlation Spectroscopy (RICS) and Number & Brightness data both in solution and incorporated into the membrane of Giant Unilamellar Vesicles (GUVs). We show that it is possible to obtain dynamic information about fluorescent molecules from single point FCS, RICS and Number & Brightness using the Nikon C1. We highlight the fact that care should be taken in selecting the acquisition parameters in order to avoid possible artifacts due to detector noise. However, due to relatively large errors in determining the distribution of digital levels for a given microscope setting, the system is probably only adequate for determining relative brightness within the same image. PMID:20734406

  17. Raster Images of Geologic Maps of Middle Proterozoic Belt strata in parts of Benewah, Bonner, Kootenai and Shoshone Counties, Idaho and Lincoln, Mineral and Sanders Counties, Montana

    USGS Publications Warehouse

    Boleneus, David E.; Appelgate, Larry M.; Joseph, Nancy L.; Brandt, Theodore R.

    2001-01-01

    Geologic maps of the western part of the Belt Basin of western Montana and northern Idaho were converted into digital raster (TIFF image) format to facilitate their manipulation in geographic information systems. The 85-mile x 100-mile map area mostly contains rocks belonging to the lower and middle Belt Supergroup. The area is of interest as these Middle Proterozoic strata contain vein-type lead-zinc-silver deposits in the Coeur d'Alene Mining District in the St. Regis and Revett formations and strata-bound copper-silver deposits, such as the Troy mine, within the Revett Formation. The Prichard Formation is also prospective for strata-bound lead-zinc deposits because equivalent Belt strata in southern British Columbia, Canada host the Sullivan lead-zinc deposit. Map data converted to digital images include 13 geological maps at scales ranging from 1:48,000 to 1:12,000. Geologic map images produced from these maps by color scanning were registered to grid tick coverages in a Universal Transverse Mercator (North American Datum of 1927, zone 11) projection using ArcView Image Analysis. Geo-registering errors vary from 10 ft to 114 ft.

  18. Land-Cover and Imperviousness Data for Regional Areas near Denver, Colorado; Dallas-Fort Worth, Texas; and Milwaukee-Green Bay, Wisconsin - 2001

    USGS Publications Warehouse

    Falcone, James A.; Pearson, Daniel K.

    2006-01-01

    This report describes the processing and results of land-cover and impervious surface derivation for parts of three metropolitan areas being studied as part of the U.S. Geological Survey's (USGS) National Water-Quality Assessment (NAWQA) Program Effects of Urbanization on Stream Ecosystems (EUSE). The data were derived primarily from Landsat-7 Enhanced Thematic Mapper Plus (ETM+) satellite imagery from the period 1999-2002, and are provided as 30-meter resolution raster datasets. Data were produced to a standard consistent with data being produced as part of the USGS National Land Cover Database 2001 (NLCD01) Program, and were derived in cooperation with, and assistance from, NLCD01 personnel. The data were intended as surrogates for NLCD01 data because of the EUSE Program's time-critical need for updated land-cover for parts of the United States that would not be available in time from the NLCD01 Program. Six datasets are described in this report: separate land-cover (15-class categorical data) and imperviousness (0-100 percent continuous data) raster datasets for parts of the general Denver, Colorado area (South Platte River Basin), Dallas-Fort Worth, Texas area (Trinity River Basin), and Milwaukee-Green Bay, Wisconsin area (Western Lake Michigan Drainages).

  19. Satellite Data Processing System (SDPS) users manual V1.0

    NASA Technical Reports Server (NTRS)

    Caruso, Michael; Dunn, Chris

    1989-01-01

    SDPS is a menu-driven interactive program designed to facilitate the display and output of image and line-based data sets common to telemetry, modeling and remote sensing. This program can be used to display up to four separate raster images and overlay line-based data such as coastlines, ship tracks and velocity vectors. The program uses multiple windows to communicate information with the user. At any given time, the program may have up to four image display windows as well as auxiliary windows containing information about each image displayed. SDPS is not a commercial program. It does not contain complete type checking or error diagnostics, which may allow the program to crash. Known anomalies are mentioned in the appropriate section as notes or cautions. SDPS was designed to be used on Sun Microsystems workstations running SunView1 (Sun Visual/Integrated Environment for Workstations). It was primarily designed to be used on workstations equipped with color monitors, but most of the line-based functions and several of the raster-based functions can be used with monochrome monitors. The program currently runs on Sun 3 series workstations running Sun OS 4.0 and should port easily to Sun 4 and Sun 386 series workstations with SunView1. Users should also be familiar with UNIX, Sun workstations and the SunView window system.

  20. A geostatistical approach to the change-of-support problem and variable-support data fusion in spatial analysis

    NASA Astrophysics Data System (ADS)

    Wang, Jun; Wang, Yang; Zeng, Hui

    2016-01-01

    A key issue to address in synthesizing spatial data with variable support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems. This approach is based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both the best predictions on a target support and the associated prediction uncertainties, based on one or more measurements, while honoring those measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity and can lead to computational challenges due to the large data size. We developed a local-window geostatistical inverse modeling approach to accommodate spatial nonstationarity and alleviate the computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated, aggregated to multiple supports, and downscaled back to the original support to analyze the accuracy of spatial predictions and the correctness of prediction uncertainties. Similar experiments were conducted for real-world raster data. Real-world data with variable support were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties. It is shown that the suggested local-window geostatistical inverse modeling approach offers a practical way to solve the well-known change-of-support and variable-support data fusion problems in spatial analysis and modeling.
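
    The change-of-support operation itself can be illustrated as block aggregation, the linear operator that a geostatistical inverse model inverts (together with a prior covariance) when downscaling. The raster and aggregation factor below are invented:

```python
import numpy as np

# Fine-support raster (4x4) and its aggregation to a coarser 2x2 support
fine = np.arange(16, dtype=float).reshape(4, 4)

def block_mean(z, f):
    """Aggregate raster z by factor f using block averaging (change of support)."""
    n, m = z.shape
    return z.reshape(n // f, f, m // f, f).mean(axis=(1, 3))

coarse = block_mean(fine, 2)

# The overall mean is preserved across supports: a consistency check any
# change-of-support operator of this kind should satisfy
means_match = np.isclose(fine.mean(), coarse.mean())
```

    Downscaling is the ill-posed inverse of `block_mean`; the geostatistical model resolves the non-uniqueness with a spatial covariance prior and reports the remaining spread as prediction uncertainty.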

  1. Making data matter: Voxel printing for the digital fabrication of data across scales and domains.

    PubMed

    Bader, Christoph; Kolb, Dominik; Weaver, James C; Sharma, Sunanda; Hosny, Ahmed; Costa, João; Oxman, Neri

    2018-05-01

    We present a multimaterial voxel-printing method that enables the physical visualization of data sets commonly associated with scientific imaging. Leveraging voxel-based control of multimaterial three-dimensional (3D) printing, our method enables additive manufacturing of discontinuous data types such as point cloud data, curve and graph data, image-based data, and volumetric data. By converting data sets into dithered material deposition descriptions, through modifications to rasterization processes, we demonstrate that data sets frequently visualized on screen can be converted into physical, materially heterogeneous objects. Our approach alleviates the need to postprocess data sets to boundary representations, preventing alteration of data and loss of information in the produced physicalizations. Therefore, it bridges the gap between digital information representation and physical material composition. We evaluate the visual characteristics and features of our method, assess its relevance and applicability in the production of physical visualizations, and detail the conversion of data sets for multimaterial 3D printing. We conclude with exemplary 3D-printed data sets produced by our method pointing toward potential applications across scales, disciplines, and problem domains.

  2. Depth-to-basement, sediment-thickness, and bathymetry data for the deep-sea basins offshore of Washington, Oregon, and California

    USGS Publications Warehouse

    Wong, Florence L.; Grim, Muriel S.

    2015-01-01

    Contours and derivative raster files of depth-to-basement, sediment-thickness, and bathymetry data for the area offshore of Washington, Oregon, and California are provided here as GIS-ready shapefiles and GeoTIFF files. The data were used to generate paper maps in 1992 and 1993 from 1984 surveys of the U.S. Exclusive Economic Zone by the U.S. Geological Survey for depth to basement and sediment thickness, and from older data for the bathymetry.

  3. a New Initiative for Tiling, Stitching and Processing Geospatial Big Data in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Olasz, A.; Nguyen Thai, B.; Kristóf, D.

    2016-06-01

    Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world still faces a lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These methodologies are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and extensibility for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on the programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations of algorithmic and implementation details are in focus for the near future.
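
    The tile-process-stitch pattern can be sketched as follows. The tile size and the per-tile job (a simple threshold standing in for a dispatched script) are invented, and the real framework adds transparent data distribution across nodes:

```python
import numpy as np

# Split a raster into tiles, process each tile independently, then
# stitch the results back together
raster = np.arange(64, dtype=float).reshape(8, 8)
tile = 4

def process(t):
    # stand-in for a per-tile job dispatched to a processing node
    return (t > 30).astype(np.uint8)

tiles = {(r, c): raster[r:r + tile, c:c + tile]
         for r in range(0, 8, tile) for c in range(0, 8, tile)}
results = {k: process(t) for k, t in tiles.items()}

stitched = np.zeros_like(raster, dtype=np.uint8)
for (r, c), out in results.items():
    stitched[r:r + tile, c:c + tile] = out

# Stitching reproduces whole-raster processing because the operation is
# pixel-local; operations with neighbourhood context need overlapping
# tiles ("halo" cells)
consistent = bool((stitched == process(raster)).all())
```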

  4. Speeding up the Raster Scanning Methods used in the X-Ray Fluorescence Imaging of the Ancient Greek Text of Archimedes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, Manisha (Norfolk State U.)

    2006-08-24

    Progress has been made at the Stanford Linear Accelerator Center (SLAC) toward deciphering the remaining 10-20% of ancient Greek text contained in the Archimedes palimpsest. The text is known to contain valuable works by the mathematician, including the "Method of Mechanical Theorems", the "Equilibrium of Planes", and "On Floating Bodies", and several diagrams as well. The only surviving copy of the text was recycled into a prayer book in the Middle Ages. The ink used to write on the goat-skin parchment is partly composed of iron, which is visible under x-ray radiation. To image the palimpsest pages, the parchment is framed and placed in a stage that moves according to the raster method. When an x-ray beam strikes the parchment, the iron in the ink is detected by a germanium detector. The resulting signal is converted to a gray-scale image in the imaging program, Rasplot. It is extremely important that each line of data is perfectly aligned with the line that came before it because the image is scanned in two directions. The objectives of this experiment were to determine the best parameters for producing well-aligned images and to reduce the scanning time. Imaging half a page of parchment during previous beam time for this project took thirty hours. Equations were produced to evaluate count time, shutter time, and the number of pixels in this experiment. On Beamline 6-2 at the Stanford Synchrotron Radiation Laboratory (SSRL), actual scanning time was reduced by one fourth. The remaining pages were successfully imaged and sent to ancient Greek experts for translation.
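
    The scan-time trade-off mentioned in the abstract can be illustrated with back-of-the-envelope arithmetic; all parameter values below are invented, not the actual SSRL beamline settings:

```python
# Simple scan-time model: total time scales with the number of pixels
# times the per-pixel dwell, plus a fixed per-line overhead.
width_px, height_px = 600, 400     # hypothetical image size
dwell_s = 0.005                    # counting time per pixel (assumed)
line_overhead_s = 0.5              # stage turnaround / shutter per line (assumed)

total_s = height_px * (width_px * dwell_s + line_overhead_s)

# Halving the dwell time cuts the pixel term but not the line overhead,
# so the speedup is less than 2x
faster_s = height_px * (width_px * dwell_s / 2 + line_overhead_s)
speedup = total_s / faster_s
```

    This is why reducing per-line overhead (alignment, shutter, turnaround) matters as much as shortening the count time itself.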

  5. A case study of a precision fertilizer application task generation for wheat based on classified hyperspectral data from UAV combined with farm history data

    NASA Astrophysics Data System (ADS)

    Kaivosoja, Jere; Pesonen, Liisa; Kleemola, Jouko; Pölönen, Ilkka; Salo, Heikki; Honkavaara, Eija; Saari, Heikki; Mäkynen, Jussi; Rajala, Ari

    2013-10-01

    Different remote sensing methods for detecting variations in agricultural fields have been studied in the last two decades. Systems already exist for planning and applying, e.g., nitrogen fertilizers to cereal crop fields. However, there are disadvantages such as high costs, limited adaptability and reliability, resolution constraints, and final-product dissemination. With unmanned aerial vehicle (UAV) based airborne methods, data collection can be performed cost-efficiently with the desired spatial and temporal resolutions, below clouds, and under diverse weather conditions. A new Fabry-Perot interferometer based hyperspectral imaging technology implemented in a UAV has been introduced. In this research, we studied the possibilities of exploiting classified raster maps from hyperspectral data to produce a work task for precision fertilizer application. The UAV flight campaign was performed in a wheat test field in Finland in the summer of 2012. Based on the campaign, we classified raster maps estimating the biomass and nitrogen contents at approximately stage 34 on the Zadoks scale. We combined the classified maps with farm history data such as previous yield maps. We then generalized the combined results and transformed them into a vectorized zonal task map suitable for farm machinery. We present the selected weights for each dataset in the processing chain and the resultant variable rate application (VRA) task. The additional fertilization according to the generated task was shown to be beneficial for the amount of yield. However, our study indicates that there are still many uncertainties within the process chain.

  6. A Color Raster Scanning System for Digitizing Cartographic Data.

    DTIC Science & Technology

    1979-11-01

    ... measurements caused by nonuniformities in the chart. In the case of screen recognition, the color identification concept employed is the same as that ...

  7. Detecting and connecting agricultural ditches using LiDAR data

    NASA Astrophysics Data System (ADS)

    Roelens, Jennifer; Dondeyne, Stefaan; Van Orshoven, Jos; Diels, Jan

    2017-04-01

    High-resolution hydrological data are essential for spatially-targeted water resource management decisions and future modelling efforts. For Flanders, small watercourses like agricultural ditches and their connection to the river network are incomplete in the official digital atlas. High-resolution LiDAR data offer the prospect of automated detection of ditches, but there is no established method or software to do so, nor to predict how these are connected to each other and to the wider hydrographic network. An aerial LiDAR database, encompassing at least 16 points per square meter and linked with simultaneously collected digital RGB aerial images, is available for Flanders. The potential of detecting agricultural ditches and their connectivity based on point LiDAR data was investigated in a 1.9 km2 study area located in the alluvial valley of the river Demer. The area consists of agricultural parcels and woodland with a ditch network of approximately 17 km. The entire network of open ditches, and the location of culverts, were mapped during a field survey to test the effectiveness of the proposed method. In the first step of the proposed method, the LiDAR point data were transformed into a raster DEM with a 1-m resolution to reduce the amount of data to be analyzed. This was done by interpolating the bare-earth points using the nearest-neighbor method. In a next step, a morphological approach was used for detecting a preliminary network, as traditional flow algorithms are not suitable for detecting small watercourses in low-lying areas. This resulted in a preliminary classified raster image with ditch and non-ditch cells. After eliminating small details that are the result of background noise, the resulting classified raster image was vectorized to match the format of the digital watercourse network. As the vectorization does not always adequately represent the shape of linear features, the results did not meet high-quality cartographic needs.
The spatial accuracy ...
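
    One plausible morphological operator for the ditch-detection step is a grey-scale black-hat (closing minus input), which highlights narrow depressions that flow-routing algorithms miss in flat terrain. The operator choice and the synthetic DEM below are assumptions, not the authors' exact method:

```python
import numpy as np
from scipy import ndimage

# Synthetic 1-m DEM of a flat field crossed by a narrow ditch (one cell wide)
dem = np.ones((7, 7))
dem[:, 3] -= 0.5          # the ditch: 0.5 m deep

# Black-hat: grey-scale closing minus the DEM; narrow depressions stand
# out because the closing fills them in
closed = ndimage.grey_closing(dem, size=(3, 3))
black_hat = closed - dem

# Preliminary classification into ditch / non-ditch cells
ditch = black_hat > 0.25
```

    The binary `ditch` raster would then be cleaned of noise and vectorized to match the digital watercourse network, as the abstract describes.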

  8. A building extraction approach for Airborne Laser Scanner data utilizing the Object Based Image Analysis paradigm

    NASA Astrophysics Data System (ADS)

    Tomljenovic, Ivan; Tiede, Dirk; Blaschke, Thomas

    2016-10-01

    In the past two decades, Object-Based Image Analysis (OBIA) has established itself as an efficient approach for the classification and extraction of information from remote sensing imagery and, increasingly, from non-image-based sources such as Airborne Laser Scanner (ALS) point clouds. ALS data are represented in the form of a point cloud with recorded multiple returns and intensities. In our work, we combined OBIA with ALS point cloud data in order to identify and extract buildings as 2D polygons representing roof outlines in a top-down mapping approach. We rasterized the ALS data into a height raster for the purpose of generating a Digital Surface Model (DSM) and a derived Digital Elevation Model (DEM). Further objects were generated in conjunction with point statistics from the linked point cloud. Using class modelling methods, we generated the final target class of objects representing buildings. The approach was developed for a test area in Biberach an der Riß (Germany). In order to point out the possibilities of adaptation-free transferability to another data set, the algorithm was applied "as is" to the ISPRS Benchmarking data set of Toronto (Canada). The obtained results show high accuracies for the initial study area (thematic accuracies of around 98%, geometric accuracy above 80%). The very high performance within the ISPRS Benchmark without any modification of the algorithm and without any adaptation of parameters is particularly noteworthy.
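
    The DSM/DEM step commonly feeds a normalized DSM (nDSM = DSM - DEM), which isolates above-ground objects before rule-based classification; the heights and the 2.5-m threshold below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

# Toy rasterized ALS heights: DSM (top surfaces) and DEM (bare earth)
dem = np.full((5, 5), 100.0)     # flat terrain at 100 m
dsm = dem.copy()
dsm[1:3, 1:4] += 6.0             # a building block, 6 m high
dsm[4, 4] += 1.0                 # low vegetation / noise

# Normalized DSM: height above ground
ndsm = dsm - dem

# Simple class-modelling rule: objects higher than 2.5 m are building
# candidates (the paper uses richer rules plus point-cloud statistics)
building_candidates = ndsm > 2.5
n_candidate_cells = int(building_candidates.sum())
```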

  9. Detailed Hydrographic Feature Extraction from High-Resolution LiDAR Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danny L. Anderson

    Detailed hydrographic feature extraction from high-resolution light detection and ranging (LiDAR) data is investigated. Methods for quantitatively evaluating and comparing such extractions are presented, including the use of sinuosity and longitudinal root-mean-square-error (LRMSE). These metrics are then used to quantitatively compare stream networks in two studies. The first study examines the effect of raster cell size on watershed boundaries and stream networks delineated from LiDAR-derived digital elevation models (DEMs). The study confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes generally yielded better stream network delineations, based on sinuosity and LRMSE. The second study demonstrates a new method of delineating a stream directly from LiDAR point clouds, without the intermediate step of deriving a DEM. Direct use of LiDAR point clouds could improve the efficiency and accuracy of hydrographic feature extractions. The direct delineation method developed herein, termed “mDn”, is an extension of the D8 method that has been used for several decades with gridded raster data. The method divides the region around a starting point into sectors, using the LiDAR data points within each sector to determine an average slope, and selecting the sector with the greatest downward slope to determine the direction of flow. An mDn delineation was compared with a traditional grid-based delineation, using TauDEM, and other readily available, common stream data sets. Although the TauDEM delineation yielded a sinuosity that more closely matches the reference, the mDn delineation yielded a sinuosity that was higher than either the TauDEM method or the existing published stream delineations. Furthermore, stream delineation using the mDn method yielded the smallest LRMSE.
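
    The sector logic of the mDn method described above can be sketched as follows; the point coordinates and sector count are invented for illustration:

```python
import math

# Around the current point, LiDAR returns are binned into sectors by
# azimuth, each sector gets an average slope, and flow follows the
# sector with the steepest downward slope.
start = (0.0, 0.0, 10.0)                       # x, y, z of current position
lidar = [(1.0, 0.1, 9.0), (0.9, -0.2, 9.2),    # east: clearly downhill
         (-1.0, 0.1, 10.4), (0.1, 1.0, 10.1),  # west/north: uphill
         (0.1, -1.1, 9.9)]                     # south: slightly downhill
n_sectors = 8

slopes = [[] for _ in range(n_sectors)]
for x, y, z in lidar:
    dx, dy = x - start[0], y - start[1]
    dist = math.hypot(dx, dy)
    azimuth = math.atan2(dy, dx) % (2 * math.pi)
    sector = int(azimuth / (2 * math.pi / n_sectors)) % n_sectors
    slopes[sector].append((z - start[2]) / dist)   # negative = downhill

# Steepest average downward slope wins
avg = {s: sum(v) / len(v) for s, v in enumerate(slopes) if v}
flow_sector = min(avg, key=avg.get)
```

    Stepping repeatedly in the winning sector's direction traces the stream path directly through the point cloud, with no intermediate DEM.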

  10. Automatic Near-Real-Time Image Processing Chain for Very High Resolution Optical Satellite Data

    NASA Astrophysics Data System (ADS)

    Ostir, K.; Cotar, K.; Marsetic, A.; Pehani, P.; Perse, M.; Zaksek, K.; Zaletelj, J.; Rodic, T.

    2015-04-01

    In response to the increasing need for automatic and fast satellite image processing, SPACE-SI has developed and implemented a fully automatic image processing chain, STORM, that performs all processing steps from sensor-corrected optical images (level 1) to web-delivered map-ready images and products without operator intervention. Initial development was tailored to high resolution RapidEye images, and all crucial and most challenging parts of the planned full processing chain were developed: a module for automatic image orthorectification based on a physical sensor model and supported by an algorithm for automatic detection of ground control points (GCPs); an atmospheric correction module; a topographic corrections module that combines a physical approach with the Minnaert method and utilizes an anisotropic illumination model; and modules for high-level product generation. Various parts of the chain were also implemented for WorldView-2, THEOS, Pleiades, SPOT 6, Landsat 5-8, and PROBA-V. Support for a full-frame sensor currently under development by SPACE-SI is planned. The proposed paper focuses on the adaptation of the STORM processing chain to very high resolution multispectral images. The development concentrated on the sub-module for automatic detection of GCPs. The initially implemented two-step algorithm, which worked only with rasterized vector roads and delivered GCPs with sub-pixel accuracy for the RapidEye images, was improved with the introduction of a third step: super-fine positioning of each GCP based on a reference raster chip. The added step exploits the high spatial resolution of the reference raster to improve the final matching results and to achieve pixel accuracy also on very high resolution optical satellite data.
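
    The chip-based super-fine positioning step is not detailed in the abstract; one standard way to locate a reference raster chip in a target image is normalized cross-correlation, sketched here on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)

# A small reference chip is located in the target image by exhaustive
# normalized cross-correlation; the best match refines a GCP position.
# Image content here is random noise with the chip cut from a known spot.
image = rng.random((20, 20))
chip = image[8:13, 11:16].copy()          # true location: row 8, col 11

def ncc(a, b):
    """Normalized cross-correlation of two equally sized windows."""
    a = a - a.mean()
    b = b - b.mean()
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())

h, w = chip.shape
scores = np.full((image.shape[0] - h + 1, image.shape[1] - w + 1), -1.0)
for r in range(scores.shape[0]):
    for c in range(scores.shape[1]):
        scores[r, c] = ncc(image[r:r + h, c:c + w], chip)

best = np.unravel_index(np.argmax(scores), scores.shape)
```

    Fitting a surface to the correlation peak would push the match below one pixel, which is how chip matching reaches the accuracies the abstract reports.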

  11. Data resources for the Wyoming Landscape Conservation Initiative (WLCI) Integrated Assessment (IA)

    USGS Publications Warehouse

    Assal, Timothy J.; Garman, Steven L.; Bowen, Zachary H.; Anderson, Patrick J.; Manier, Daniel J.; McDougal, Robert R.

    2012-01-01

    The data contained in this report were compiled, modified, and analyzed for the Wyoming Landscape Conservation Initiative (WLCI) Integrated Assessment (IA). The WLCI is a long-term, science-based effort to assess and enhance aquatic and terrestrial habitats at a landscape scale in southwest Wyoming while facilitating responsible energy development through local collaboration and partnerships. The IA is an integrated synthesis and analysis of WLCI resource values based on the best available data and information collected from multiple agencies and organizations. It is a support tool for landscape-scale conservation planning and evaluation, and a data and analysis resource that can be used for addressing specific management questions. The IA analysis was conducted using a Geographic Information System in a raster (that is, a grid) environment using a cell size of 30 meters. To facilitate the interpretation of the data in a regional context, mean values were summarized and displayed at the subwatershed unit (WLCI subwatersheds were subset from the National Hydrography Dataset, Hydrologic Unit Code 12/Level 6). A dynamic mapping platform, accessed via the WLCI webpage at http://www.wlci.gov, is used to display the mapped information and to access the underlying resource values that were combined to produce the final mapped results. The raster data used in the IA are provided here for use by interested parties to conduct additional analyses and can be accessed via the WLCI webpage. This series contains 74 spatial data sets: WLCI subwatersheds (vector) and 73 geotiffs (raster) that are segregated into the major categories of Multicriteria Index (including Resource Index and Condition), Change Agents, and Future Change. The Total Multicriteria Index is composed of the Aquatic Multicriteria Index and the Terrestrial Multicriteria Index. The Aquatic Multicriteria Index is composed of the Aquatic Resource Index and the Aquatic Condition.
The Aquatic Resource Index is composed of the
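    The subwatershed summaries described above amount to a zonal mean: averaging the 30-meter cell values that fall inside each reporting unit. A minimal NumPy sketch with toy rasters (the function name and array values are illustrative, not from the WLCI data):

```python
import numpy as np

def zonal_means(values, zones):
    """Mean raster value per zone ID (a toy stand-in for the
    subwatershed summaries described above)."""
    out = {}
    for z in np.unique(zones):
        out[int(z)] = float(values[zones == z].mean())
    return out

# Toy 4x4 resource-index raster and a zone raster with two subwatersheds.
vals = np.array([[1., 2., 3., 4.],
                 [1., 2., 3., 4.],
                 [5., 6., 7., 8.],
                 [5., 6., 7., 8.]])
zones = np.array([[1, 1, 2, 2],
                  [1, 1, 2, 2],
                  [1, 1, 2, 2],
                  [1, 1, 2, 2]])
print(zonal_means(vals, zones))  # {1: 3.5, 2: 5.5}
```

    In a real workflow the zone raster would be the rasterized subwatershed polygons and the value raster one of the 30-meter index layers.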

  12. Automatic extraction of road features in urban environments using dense ALS data

    NASA Astrophysics Data System (ADS)

    Soilán, Mario; Truong-Hong, Linh; Riveiro, Belén; Laefer, Debra

    2018-02-01

    This paper describes a methodology that automatically extracts semantic information from urban ALS data for urban parameterization and road network definition. First, building façades are segmented from the ground surface by combining knowledge-based information with both voxel and raster data. Next, heuristic rules and unsupervised learning are applied to the ground surface data to distinguish sidewalk and pavement points as a means for curb detection. Then, radiometric information is employed for road marking extraction. Using high-density ALS data from Dublin, Ireland, this fully automatic workflow achieved an F-score close to 95% for pavement and sidewalk identification at a resolution of 20 cm, and better than 80% for road marking detection.

  13. Environmental Data Store: A Web-Based System Providing Management and Exploitation for Multi-Data-Type Environmental Data

    NASA Astrophysics Data System (ADS)

    Ji, P.; Piasecki, M.

    2012-12-01

    With the rapid growth in data volumes, data diversity, and data demands from multi-disciplinary research efforts, data management and exploitation increasingly present significant challenges for the environmental science community. We describe the Environmental Data Store (EDS), an open-source, web-based system we are developing to manage and exploit multi-data-type environmental data. EDS provides repository services for six fundamental data types that meet the demands of multi-disciplinary environmental research: a) Time Series Data, b) GeoSpatial Data, c) Digital Data, d) Ex-Situ Sampling Data, e) Modeling Data, and f) Raster Data. Through its data portal, EDS allows efficient consumption of these six data types, which are placed in a data pool made up of different data nodes corresponding to the different data types, including iRODS, ODM, THREDDS, ESSDB, and GeoServer. The EDS data portal offers a unified submission interface for these data types; provides fully integrated, scalable search across content from the underlying data systems; and features mapping, analysis, exporting, and visualization through integration with other software. EDS builds on a number of established systems, follows widely used data standards, and emphasizes thematic, semantic, and syntactic support for submission and search in order to advance multi-disciplinary environmental research. The system will be installed and developed at the CrossRoads initiative at the City College of New York.

  14. Evaluation of LiDAR-acquired bathymetric and topographic data accuracy in various hydrogeomorphic settings in the Deadwood and South Fork Boise Rivers, West-Central Idaho, 2007

    USGS Publications Warehouse

    Skinner, Kenneth D.

    2011-01-01

    High-quality elevation data in riverine environments are important for fisheries management applications, and the accuracy of such data needs to be determined for their proper application. The Experimental Advanced Airborne Research LiDAR (Light Detection and Ranging), or EAARL, system was used to obtain topographic and bathymetric data along the Deadwood and South Fork Boise Rivers in west-central Idaho. The EAARL data were post-processed into bare-earth and bathymetric raster and point datasets. Concurrently with the EAARL surveys, real-time kinematic global positioning system surveys were made in three areas along each of the rivers to assess the accuracy of the EAARL elevation data in different hydrogeomorphic settings. The accuracies of the EAARL-derived raster elevation values, determined in open, flat terrain to provide an optimal vertical comparison surface, had root mean square errors ranging from 0.134 to 0.347 m. Accuracies in the elevation values for the stream hydrogeomorphic settings had root mean square errors ranging from 0.251 to 0.782 m. The greater root mean square errors for the latter data are the result of complex hydrogeomorphic environments within the streams, such as submerged aquatic macrophytes and air bubble entrainment, and those along the banks, such as boulders, woody debris, and steep slopes. These complex environments reduce the accuracy of EAARL bathymetric and topographic measurements. Steep banks emphasize the horizontal location discrepancies between the EAARL and ground-survey data and may not be good representations of vertical accuracy. The EAARL point to ground-survey comparisons produced slightly higher but similar root mean square errors compared with the EAARL raster to ground-survey comparisons, reflecting the reduced horizontal offset obtained by using interpolated values from the raster dataset at the exact location of the ground-survey point, as opposed to an actual EAARL point within a 1-meter distance. The
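    The raster-to-survey comparison above hinges on sampling the raster surface at the exact ground-survey coordinate rather than taking the nearest discrete point. A minimal NumPy sketch of that idea with a toy DEM and hypothetical survey elevations (bilinear sampling is an assumption here; the report does not state which interpolation method was used):

```python
import numpy as np

def bilinear(raster, x, y):
    """Sample a raster (row = y, col = x, unit cell size) at a fractional
    coordinate by bilinear interpolation; coordinates are assumed
    non-negative and inside the grid."""
    x0 = min(int(x), raster.shape[1] - 2)
    y0 = min(int(y), raster.shape[0] - 2)
    dx, dy = x - x0, y - y0
    z00, z01 = raster[y0, x0], raster[y0, x0 + 1]
    z10, z11 = raster[y0 + 1, x0], raster[y0 + 1, x0 + 1]
    return (z00 * (1 - dx) * (1 - dy) + z01 * dx * (1 - dy)
            + z10 * (1 - dx) * dy + z11 * dx * dy)

def rmse(pred, obs):
    """Root mean square error between predicted and observed values."""
    pred, obs = np.asarray(pred), np.asarray(obs)
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

# Toy 1-m DEM and two hypothetical ground-survey elevations.
dem = np.array([[10.0, 11.0],
                [12.0, 13.0]])
pred = [bilinear(dem, 0.5, 0.5), bilinear(dem, 0.0, 1.0)]
obs = [11.4, 12.1]
print(round(rmse(pred, obs), 3))  # 0.1
```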

  15. A Coastal Hazards Data Base for the U.S. Gulf Coast (1993) (NDP-04bB)

    DOE Data Explorer

    Gornitz, Vivien M. [National Aeronautics and Space Administration, Goddard Institute for Space Studies, New York, NY (USA); White, Tammy W. [CDIAC, Oak Ridge National Laboratory, Oak Ridge, TN (USA)

    2008-01-01

    This document describes the contents of a digital data base that may be used to identify coastlines along the U.S. Gulf Coast at risk to sea-level rise. The data base integrates point, line, and polygon data for the U.S. Gulf Coast into 0.25° latitude by 0.25° longitude grid cells and into 1:2,000,000 digitized line segments that can be used by raster or vector geographic information systems (GIS) as well as by non-GIS data base systems. Each coastal grid cell and line segment contains data on elevations, geology, geomorphology, sea-level trends, shoreline displacement (erosion/accretion), tidal ranges, and wave heights.

  16. Application of the 1:2,000,000-scale data base: A National Atlas sectional prototype

    USGS Publications Warehouse

    Dixon, Donna M.

    1985-01-01

    A study of the potential to produce a National Atlas sectional prototype from the 1:2,000,000-scale data base was concluded recently by the National Mapping Division, U. S. Geological Survey. This paper discusses the specific digital cartographic production procedures involved in the preparation of the prototype map, as well as the theoretical and practical cartographic framework for the study. Such items as data organization, data classification, digital techniques, data conversions, and modification of traditional design specifications for an automated environment are discussed. The bulk of the cartographic work for the production of the prototype was carried out in raster format on the Scitex Response-250 mapping system.

  17. Moderate-resolution sea surface temperature data and seasonal pattern analysis for the Arctic Ocean ecoregions

    USGS Publications Warehouse

    Payne, Meredith C.; Reusser, Deborah A.; Lee, Henry

    2012-01-01

    Sea surface temperature (SST) is an important environmental characteristic in determining the suitability and sustainability of habitats for marine organisms. In particular, the fate of the Arctic Ocean, which provides critical habitat to commercially important fish, is in question. This poses an intriguing problem for future research of Arctic environments - one that will require examination of long-term SST records. This publication describes and provides access to an easy-to-use Arctic SST dataset for ecologists, biogeographers, oceanographers, and other scientists conducting research on habitats and/or processes in the Arctic Ocean. The data cover the Arctic ecoregions as defined by the "Marine Ecoregions of the World" (MEOW) biogeographic schema developed by The Nature Conservancy as well as the region to the north from approximately 46°N to about 88°N (constrained by the season and data coverage). The data span a 29-year period from September 1981 to December 2009. These SST data were derived from Advanced Very High Resolution Radiometer (AVHRR) instrument measurements that had been compiled into monthly means at 4-kilometer grid cell spatial resolution. The processed data files are available in ArcGIS geospatial datasets (raster and point shapefiles) and also are provided in text (.csv) format. All data except the raster files include attributes identifying latitude/longitude coordinates, and realm, province, and ecoregion as defined by the MEOW classification schema. A seasonal analysis of these Arctic ecoregions reveals a wide range of SSTs experienced throughout the Arctic, both over the course of an annual cycle and within each month of that cycle. Sea ice distribution plays a major role in SST regulation in all Arctic ecoregions.

  18. Spatial data software integration - Merging CAD/CAM/mapping with GIS and image processing

    NASA Technical Reports Server (NTRS)

    Logan, Thomas L.; Bryant, Nevin A.

    1987-01-01

    The integration of CAD/CAM/mapping with image processing using geographic information systems (GISs) as the interface is examined. Particular emphasis is given to the development of software interfaces between JPL's Video Image Communication and Retrieval (VICAR)/Imaged Based Information System (IBIS) raster-based GIS and the CAD/CAM/mapping system. The design and functions of the VICAR and IBIS are described. Vector data capture and editing are studied. Various software programs for interfacing between the VICAR/IBIS and CAD/CAM/mapping are presented and analyzed.

  19. Near Real-time Scientific Data Analysis and Visualization with the ArcGIS Platform

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Viswambharan, V.; Doshi, A.

    2017-12-01

    Scientific multidimensional data are generated from a variety of sources and platforms. These datasets are mostly produced by earth observation and/or modeling systems. Agencies like NASA, NOAA, USGS, and ESA produce large volumes of near real-time observation, forecast, and historical data that drive fundamental research and its applications in larger aspects of humanity, from basic decision making to disaster response. A common big data challenge for organizations working with multidimensional scientific data and imagery collections is the time and resources required to manage and process such large volumes and varieties of data. The challenge of adopting data-driven real-time visualization and analysis, as well as the need to share these large datasets, workflows, and information products with wider and more diverse communities, brings an opportunity to use the ArcGIS platform to handle such demand. In recent years, a significant effort has been put into expanding the capabilities of ArcGIS to support multidimensional scientific data across the platform. New capabilities in ArcGIS to support scientific data management, processing, and analysis, as well as creating information products from large volumes of data using the image server technology, are becoming widely used in earth science and across other domains. We will discuss and share the challenges associated with big data in the geospatial science community and how we have addressed them in the ArcGIS platform. We will share a few use cases, such as NOAA High-Resolution Rapid Refresh (HRRR) data, that demonstrate how we access large collections of near real-time data (stored on-premise or in the cloud), disseminate them dynamically, process and analyze them on the fly, and serve them to a variety of geospatial applications. We will also share how on-the-fly processing with raster function capabilities can be extended to create persisted data and information products using raster analytics.
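    The "on-the-fly" raster-function processing mentioned above can be pictured as a chain of lazy transformations that are only evaluated when a client requests pixels. A toy Python sketch of the idea (the class and method names are illustrative, not the ArcGIS raster-function API):

```python
import numpy as np

class RasterFunction:
    """Minimal sketch of a lazily evaluated raster-function chain:
    nothing is computed until read() is called."""
    def __init__(self, source, func=None):
        self.source = source  # ndarray or another RasterFunction
        self.func = func      # transformation applied on read

    def read(self):
        data = (self.source.read() if isinstance(self.source, RasterFunction)
                else self.source)
        return self.func(data) if self.func else data

# Chain two on-the-fly operations over a toy "reflectance" raster:
raw = RasterFunction(np.array([[0.0, 0.5], [1.0, 1.5]]))
scaled = RasterFunction(raw, lambda a: a * 100.0)               # stretch
clipped = RasterFunction(scaled, lambda a: np.clip(a, 0, 100))  # clamp
print(clipped.read().tolist())  # [[0.0, 50.0], [100.0, 100.0]]
```

    Persisting the result of such a chain, rather than re-evaluating it per request, corresponds to the "raster analytics" products the abstract describes.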

  20. [Study of influence of endoopalescence on the solid tissue by means of raster microscope].

    PubMed

    Kobakhidze, G D; Vadachkoriia, N R

    2006-05-01

    During endo-whitening, penetration of peroxide into the dentinal tubules sharply decreases the adhesion of tooth tissues, which requires delaying restoration by filling for several days. This is clearly uncomfortable for the patient. The best solution is the application of antioxidants after whitening, under whose influence the sedimentary layer on the hard tissues of the teeth is neutralized much more quickly. The urgency of this issue stems from the fact that the adhesiveness restored under the influence of an antioxidant enables immediate restoration of the tooth, as our previous experiment demonstrated. In that experiment we studied the level of microleakage and the origination of microfissures after treating the hard tissues of the tooth with an antioxidant, specifically 10% sodium ascorbate, following use of the whitening agent. We did not observe any microfissures in the teeth covered with the antioxidant, unlike the teeth where the antioxidant had not been used. According to the reference data, acid etching removes the smear layer from the enamel and dentin of the tooth. As a result, the enamel prisms and the dentinal tubules are widened, which creates conditions for deeper penetration of the primer of the adhesive system. This process is followed by the formation of a transitional, i.e., hybrid, layer, which is the best link between the adhesive resin and the tooth tissues. Modern investigations in esthetic dentistry show that whitening agents produce peroxide molecules during the whitening process, and these molecules cause widening of the enamel prisms. We also studied the post-endo-whitening influence of peroxidation processes on the enamel and dentin of the tooth by means of the raster (scanning electron) microscope. Electron microscope studies showed that the antioxidant, 10% sodium ascorbate, was characterized by high penetration and was

  1. Creating a water depth map from Earth Observation-derived flood extent and topography data

    NASA Astrophysics Data System (ADS)

    Matgen, Patrick; Giustarini, Laura; Chini, Marco; Hostache, Renaud; Pelich, Ramona; Schlaffer, Stefan

    2017-04-01

    Enhanced methods for monitoring temporal and spatial variations of water depth in rivers and floodplains are very important in operational water management. Currently, variations of water elevation can be estimated indirectly at the land-water interface using sequences of satellite EO imagery in combination with topographic data. In recent years high-resolution digital elevation models (DEMs) and satellite EO data have become more readily available at global scale. This study introduces an approach for efficiently converting remote sensing-derived flood extent maps into water depth maps using a floodplain's topography. For this we assume uniform flow; that is, the depth of flow with respect to the drainage network is considered the same at every section of the floodplain. In other words, the depth of water above the nearest drainage is expected to be constant for a given river reach. To determine this value we first need the Height Above Nearest Drainage (HAND) raster, obtained by using the area of interest's DEM as source topography and a shapefile of the river network. The HAND model normalizes the topography with respect to the drainage network. Next, the HAND raster is thresholded in order to generate a binary mask that optimally fits, over the entire region of study, the flood extent map obtained from SAR or any other remote sensing product, including aerial photographs. The optimal threshold value corresponds to the height of the water line above the nearest drainage, termed HANDWATER, and is considered constant for a given subreach. Once HANDWATER has been optimized, a water depth map can be generated by subtracting the value of the HAND raster at each location from this parameter value. These developments enable large-scale and near real-time applications and only require readily available EO data, a DEM and the river network as input data.
The approach is based on a hierarchical split-based approach that subdivides a
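    The thresholding and depth steps described above can be sketched in a few lines: search for the HAND threshold whose inundation mask best matches the observed flood extent, then subtract HAND from that value to get depths. A toy NumPy illustration (the exhaustive search over a small candidate grid and the simple cell-agreement score are simplifying assumptions, not the authors' optimization procedure):

```python
import numpy as np

def fit_handwater(hand, flood_mask, candidates):
    """Pick the HAND threshold whose mask (hand <= t) best agrees,
    cell by cell, with the EO-derived flood extent."""
    best, best_score = None, -1
    for t in candidates:
        score = int(np.sum((hand <= t) == flood_mask))
        if score > best_score:
            best, best_score = t, score
    return best

def depth_from_hand(hand, handwater):
    """Water depth = HANDWATER - HAND where positive, zero elsewhere."""
    depth = handwater - hand
    return np.where(depth > 0, depth, 0.0)

hand = np.array([[0.2, 0.8, 1.5],
                 [0.1, 0.9, 2.0]])        # toy HAND raster (m)
flood = np.array([[True, True, False],
                  [True, True, False]])   # toy EO flood extent
hw = fit_handwater(hand, flood, candidates=[0.5, 1.0, 1.8])
print(hw)                                 # 1.0 fits the extent best
print(depth_from_hand(hand, hw).round(1).tolist())
```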

  2. Identification of the condition of crops based on geospatial data embedded in graph databases

    NASA Astrophysics Data System (ADS)

    Idziaszek, P.; Mueller, W.; Górna, K.; Okoń, P.; Boniecki, P.; Koszela, K.; Fojud, A.

    2017-07-01

    The Web application presented here supports plant production and works with the graph database Neo4j shell to support the assessment of the condition of crops on the basis of geospatial data, including raster and vector data. The adoption of a graph database as a tool to store and manage the data, including geospatial data, is completely justified in the case of those agricultural holdings that have a wide range of types and sizes of crops. In addition, the authors tested the option of using the technology of Microsoft Cognitive Services at the level of produced application that enables an image analysis using the services provided. The presented application was designed using ASP.NET MVC technology and a wide range of leading IT tools.

  3. Optimizing placements of ground-based snow sensors for areal snow cover estimation using a machine-learning algorithm and melt-season snow-LiDAR data

    NASA Astrophysics Data System (ADS)

    Oroza, C.; Zheng, Z.; Glaser, S. D.; Bales, R. C.; Conklin, M. H.

    2016-12-01

    We present a structured, analytical approach to optimize ground-sensor placements based on time-series remotely sensed (LiDAR) data and machine-learning algorithms. We focused on catchments within the Merced and Tuolumne river basins, covered by the JPL Airborne Snow Observatory LiDAR program. First, we used a Gaussian mixture model to identify representative sensor locations in the space of independent variables for each catchment. Multiple independent variables that govern the distribution of snow depth were used, including elevation, slope, and aspect. Second, we used a Gaussian process to estimate the areal distribution of snow depth from the initial set of measurements. This is a covariance-based model that also estimates the areal distribution of model uncertainty based on the independent variable weights and autocorrelation. The uncertainty raster was used to strategically add sensors to minimize model uncertainty. We assessed the temporal accuracy of the method using LiDAR-derived snow-depth rasters collected in water-year 2014. In each area, optimal sensor placements were determined using the first available snow raster for the year. The accuracy in the remaining LiDAR surveys was compared to 100 configurations of sensors selected at random. We found the accuracy of the model from the proposed placements to be higher and more consistent in each remaining survey than the average random configuration. We found that a relatively small number of sensors can be used to accurately reproduce the spatial patterns of snow depth across the basins, when placed using spatial snow data. Our approach also simplifies sensor placement. At present, field surveys are required to identify representative locations for such networks, a process that is labor intensive and provides limited guarantees on the networks' representation of catchment independent variables.
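    The first step above, picking representative locations in the space of independent variables, can be illustrated with plain k-means as a simplified stand-in for the study's Gaussian mixture model (all data, names, and the deterministic initialization below are toy assumptions):

```python
import numpy as np

def kmeans(X, init, iters=50):
    """Plain k-means; a simplified stand-in for the Gaussian mixture
    model the study uses to find representative sensor locations."""
    centers = np.asarray(init, dtype=float).copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Feature space per candidate cell: (elevation, slope, aspect); toy values
# drawn around two distinct terrain types.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([2000.0, 10.0, 90.0], 5.0, size=(20, 3)),
               rng.normal([2500.0, 30.0, 270.0], 5.0, size=(20, 3))])
# Initialize from two well-separated candidates for determinism.
centers, labels = kmeans(X, init=X[[0, -1]])
# Place a sensor at the candidate cell nearest each cluster center.
sites = sorted(int(np.argmin(((X - c) ** 2).sum(-1))) for c in centers)
print(sites)  # one representative cell from each terrain type
```

    The study's second step, a Gaussian process over these features, would then supply the uncertainty surface used to add further sensors.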

  4. Towards Big Earth Data Analytics: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2013-04-01

    Big Data in the Earth sciences, the Tera- to Exabyte archives, are mostly made up of coverage data, whereby the term "coverage", according to ISO and OGC, is defined as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor timeseries, 2-D remote sensing imagery, 3-D x/y/t image timeseries and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as getting the Fourier transform of satellite images. As network bandwidth limits prohibit transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The EarthServer initiative, funded by EU FP7 eInfrastructures, unites 11 partners from computer and earth sciences to establish Big Earth Data Analytics. One key ingredient is flexibility for users to ask what they want, not impeded and complicated by system internals. The EarthServer answer to this is to use high-level query languages; these have proven tremendously successful on tabular and XML data, and we extend them with a central geo data structure, multi-dimensional arrays. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantics-based dynamic distribution of query fragments based on network optimization and further criteria. The EarthServer platform is built on rasdaman, an Array DBMS enabling efficient storage and retrieval of any-size, any-type multi-dimensional raster data.
In the project, rasdaman is being extended with several functionality and scalability features, including: support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data

  5. Poly-Pattern Compressive Segmentation of ASTER Data for GIS

    NASA Technical Reports Server (NTRS)

    Myers, Wayne; Warner, Eric; Tutwiler, Richard

    2007-01-01

    Pattern-based segmentation of multi-band image data, such as ASTER, produces one-byte and two-byte approximate compressions. This is a dual segmentation consisting of nested coarser- and finer-level pattern mappings called poly-patterns. The coarser A-level version is structured for direct incorporation into geographic information systems in the manner of a raster map. GIS renderings of this A-level approximation are called pattern pictures, which have the appearance of color-enhanced images. The two-byte version, consisting of thousands of B-level segments, provides a capability for approximate restoration of the multi-band data in selected areas or entire scenes. Poly-patterns are especially useful for purposes of change detection and landscape analysis at multiple scales. The primary author has implemented the segmentation methodology in a public domain software suite.

  6. The use of computer-generated color graphic images for transient thermal analysis. [for hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Edwards, C. L. W.; Meissner, F. T.; Hall, J. B.

    1979-01-01

    Color computer graphics techniques were investigated as a means of rapidly scanning and interpreting large sets of transient heating data. The data presented were generated to support the conceptual design of a heat-sink thermal protection system (TPS) for a hypersonic research airplane. Color-coded vector and raster displays of the numerical geometry used in the heating calculations were employed to analyze skin thicknesses and surface temperatures of the heat-sink TPS under a variety of trajectory flight profiles. Both vector and raster displays proved to be effective means for rapidly identifying heat-sink mass concentrations, regions of high heating, and potentially adverse thermal gradients. The color-coded (raster) surface displays are a very efficient means for displaying surface-temperature and heating histories, and thereby the more stringent design requirements can quickly be identified. The related hardware and software developments required to implement both the vector and the raster displays for this application are also discussed.

  7. Design and development of linked data from the National Map

    USGS Publications Warehouse

    Usery, E. Lynn; Varanka, Dalia E.

    2012-01-01

    The development of linked data on the World-Wide Web provides the opportunity for the U.S. Geological Survey (USGS) to supply its extensive volumes of geospatial data, information, and knowledge in a machine interpretable form and reach users and applications that heretofore have been unavailable. To pilot a process to take advantage of this opportunity, the USGS is developing an ontology for The National Map and converting selected data from nine research test areas to a Semantic Web format to support machine processing and linked data access. In a case study, the USGS has developed initial methods for legacy vector and raster formatted geometry, attributes, and spatial relationships to be accessed in a linked data environment maintaining the capability to generate graphic or image output from semantic queries. The description of an initial USGS approach to developing ontology, linked data, and initial query capability from The National Map databases is presented.

  8. Environmental Data Store (EDS): A multi-node Data Storage Facility for diverse sets of Geoscience Data

    NASA Astrophysics Data System (ADS)

    Piasecki, M.; Ji, P.

    2014-12-01

    Geoscience data come in many flavors determined by data type: continuous data on a grid or mesh, or discrete data collected at points either as one-time samples or as streams coming off sensors; they can also encompass digital files of any type, such as text files, Word or Excel documents, or audio and video files. We present a storage facility comprised of six nodes, each specialized to host a certain data type: grid-based data (netCDF on a THREDDS server), GIS data (shapefiles using GeoServer), point time-series data (CUAHSI ODM), sample data (EDBS), and any digital data (RAMADDA), plus a server for remote sensing data and its products. While there is overlap in data-type storage capabilities (rasters can go into several of these nodes), we prefer to use dedicated storage facilities that are a) freeware, b) of a good degree of maturity, and c) proven in their utility for storing a certain type. In addition, this arrangement allows us to place these commonly used software stacks and storage solutions side by side to develop interoperability strategies. We have used a Drupal-based system to handle user registration and authentication, and also use it for data submission and data search. In support of this system we developed an extensive controlled vocabulary that amalgamates various CVs used in the geoscience community in order to achieve as high a degree of recognition as possible, such as the CF conventions, CUAHSI CVs, NASA (GCMD), EPA and USGS taxonomies, and GEMET, in addition to ontological representations such as SWEET.

  9. Neighborhood size of training data influences soil map disaggregation

    USDA-ARS?s Scientific Manuscript database

    Soil class mapping relies on the ability of sample locations to represent portions of the landscape with similar soil types; however, most digital soil mapping (DSM) approaches intersect sample locations with one raster pixel per covariate layer regardless of pixel size. This approach does not take ...

  10. NoRMCorre: An online algorithm for piecewise rigid motion correction of calcium imaging data.

    PubMed

    Pnevmatikakis, Eftychios A; Giovannucci, Andrea

    2017-11-01

    Motion correction is a challenging pre-processing problem that arises early in the analysis pipeline of calcium imaging data sequences. The motion artifacts in two-photon microscopy recordings can be non-rigid, arising from the finite time of raster scanning and non-uniform deformations of the brain medium. We introduce an algorithm for fast Non-Rigid Motion Correction (NoRMCorre) based on template matching. NoRMCorre operates by splitting the field of view (FOV) into overlapping spatial patches along all directions. The patches are registered at a sub-pixel resolution for rigid translation against a regularly updated template. The estimated alignments are subsequently up-sampled to create a smooth motion field for each frame that can efficiently approximate non-rigid artifacts in a piecewise-rigid manner. Existing approaches either do not scale well in terms of computational performance or are targeted at non-rigid artifacts arising just from the finite speed of raster scanning, and thus cannot correct for the non-rigid motion observable in datasets from a large FOV. NoRMCorre can be run in an online mode, registering streaming data at speeds comparable to, or even faster than, real time. We evaluate its performance with simple yet intuitive metrics and compare against other non-rigid registration methods on simulated data and in vivo two-photon calcium imaging datasets. Open source Matlab and Python code is also made available. The proposed method and accompanying code can be useful for solving large-scale image registration problems in calcium imaging, especially in the presence of non-rigid deformations. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  11. GRASS GIS: The first Open Source Temporal GIS

    NASA Astrophysics Data System (ADS)

    Gebbert, Sören; Leppelt, Thomas

    2015-04-01

    GRASS GIS is a full-featured, general-purpose Open Source geographic information system (GIS) with raster, 3D raster and vector processing support[1]. Recently, time was introduced as a new dimension that transformed GRASS GIS into the first Open Source temporal GIS with comprehensive spatio-temporal analysis, processing and visualization capabilities[2]. New spatio-temporal data types were introduced in GRASS GIS version 7 to manage raster, 3D raster and vector time series. These new data types are called space time datasets. They are designed to efficiently handle hundreds of thousands of time-stamped raster, 3D raster and vector map layers of any size. Time stamps can be defined as time intervals or time instances in Gregorian calendar time or relative time. Space time datasets simplify the processing and analysis of large time series in GRASS GIS, since these new data types are used as input and output parameters in temporal modules. The handling of space time datasets is therefore equivalent to the handling of raster, 3D raster and vector map layers in GRASS GIS. A new dedicated Python library, the GRASS GIS Temporal Framework, was designed to implement the spatio-temporal data types and their management. The framework provides the functionality to efficiently handle hundreds of thousands of time-stamped map layers and their spatio-temporal topological relations. The framework supports reasoning based on the temporal granularity of space time datasets as well as their temporal topology. It was designed in conjunction with the PyGRASS [3] library to support parallel processing of large datasets, which has a long tradition in GRASS GIS [4,5]. We will present a subset of more than 40 temporal modules that were implemented based on the GRASS GIS Temporal Framework, PyGRASS and the GRASS GIS Python scripting library. These modules provide a comprehensive temporal GIS tool set. The functionality ranges from space time dataset and time stamped map layer management
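    The temporal-topology reasoning mentioned above can be illustrated with a toy classifier of relations between time intervals, in the spirit of Allen's interval algebra (a simplified sketch, not the GRASS GIS Temporal Framework API):

```python
def temporal_relation(a, b):
    """Coarse relation between two half-open time intervals
    (start, end); a toy subset of Allen's interval relations."""
    (a0, a1), (b0, b1) = a, b
    if a1 <= b0:
        return "precedes"
    if b1 <= a0:
        return "follows"
    if (a0, a1) == (b0, b1):
        return "equals"
    if b0 <= a0 and a1 <= b1:
        return "during"
    if a0 <= b0 and b1 <= a1:
        return "contains"
    return "overlaps"

# Time-stamped monthly layers vs. a seasonal aggregation window (days):
jan = (0, 31); feb = (31, 59); q1 = (0, 90)
print(temporal_relation(jan, q1))   # during
print(temporal_relation(jan, feb))  # precedes
print(temporal_relation(q1, feb))   # contains
```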

  12. An introduction to real-time graphical techniques for analyzing multivariate data

    NASA Astrophysics Data System (ADS)

    Friedman, Jerome H.; McDonald, John Alan; Stuetzle, Werner

    1987-08-01

    Orion I is a graphics system used to study applications of computer graphics - especially interactive motion graphics - in statistics. Orion I is the newest of a family of "Prim" systems, whose most striking common feature is the use of real-time motion graphics to display three dimensional scatterplots. Orion I differs from earlier Prim systems through the use of modern and relatively inexpensive raster graphics and microprocessor technology. It also delivers more computing power to its user; Orion I can perform more sophisticated real-time computations than were possible on previous such systems. We demonstrate some of Orion I's capabilities in our film: "Exploring data with Orion I".

  13. Cartographic Production for the FLaSH Map Study: Generation of Rugosity Grids, 2008

    USGS Publications Warehouse

    Robbins, Lisa L.; Knorr, Paul O.; Hansen, Mark

    2010-01-01

    Project Summary This series of raster data is a U.S. Geological Survey (USGS) Data Series release from the Florida Shelf Habitat Project (FLaSH). This disc contains two raster images in Environmental Systems Research Institute, Inc. (ESRI) raster grid format, jpeg image format, and Geo-referenced Tagged Image File Format (GeoTIFF). Data is also provided in non-image ASCII format. Rugosity grids at two resolutions (250 m and 1000 m) were generated for West Florida shelf waters to 250 m using a custom algorithm that follows the methods of Valentine and others (2004). The Methods portion of this document describes the specific steps used to generate the raster images. Rugosity, also referred to as roughness, ruggedness, or the surface-area ratio (Riley and others, 1999; Wilson and others, 2007), is a visual and quantitative measurement of terrain complexity, a common variable in ecological habitat studies. The rugosity of an area can affect biota by influencing habitat, providing shelter from elements, determining the quantity and type of living space, influencing the type and quantity of flora, affecting predator-prey relationships by providing cover and concealment, and, as an expression of vertical relief, can influence local environmental conditions such as temperature and moisture. In the marine environment rugosity can furthermore influence current flow rate and direction, increase the residence time of water in an area through eddying and current deflection, influence local water conditions such as chemistry, turbidity, and temperature, and influence the rate and nature of sedimentary deposition. State-of-the-art computer-mapping techniques and data-processing tools were used to develop shelf-wide raster and vector data layers. 
The Florida Shelf Habitat (FLaSH) Mapping Project (http://coastal.er.usgs.gov/flash) endeavors to locate available data, identify data gaps, synthesize existing information, and expand our understanding of geologic processes in our dynamic
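
    Rugosity as a surface-area ratio can be approximated from a DEM by a slope correction: each cell's planar area is divided by the cosine of its local slope, and the corrected areas are summed. This is a simplified sketch, not the exact algorithm of Valentine and others (2004); the cell size and toy DEM are illustrative.

    ```python
    import math

    def rugosity(dem, cellsize):
        """Surface-area / planar-area ratio over the interior cells of a DEM,
        using a slope-based area correction (one common approximation)."""
        rows, cols = len(dem), len(dem[0])
        surface = planar = 0.0
        for i in range(1, rows - 1):
            for j in range(1, cols - 1):
                # Central-difference slope components.
                dzdx = (dem[i][j + 1] - dem[i][j - 1]) / (2 * cellsize)
                dzdy = (dem[i + 1][j] - dem[i - 1][j]) / (2 * cellsize)
                slope = math.atan(math.hypot(dzdx, dzdy))
                planar += cellsize ** 2
                surface += cellsize ** 2 / math.cos(slope)
        return surface / planar

    flat = [[10.0] * 5 for _ in range(5)]
    print(rugosity(flat, 250.0))  # 1.0 for a perfectly flat surface
    ```

    A uniformly inclined plane with gradient 1 yields a ratio of sqrt(2), which is a quick sanity check on the slope correction.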

  14. In-situ microscale through-silicon via strain measurements by synchrotron x-ray microdiffraction exploring the physics behind data interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Xi; School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332; Thadesar, Paragkumar A.

    2014-09-15

    In-situ microscale thermomechanical strain measurements have been performed in combination with synchrotron x-ray microdiffraction to understand the fundamental cause of failures in microelectronics devices with through-silicon vias. The physics behind the raster scan and data analysis of the measured strain distribution maps is explored utilizing the energies of indexed reflections from the measured data and applying them for beam intensity analysis and effective penetration depth determination. Moreover, a statistical analysis is performed for the beam intensity and strain distributions along the beam penetration path to account for the factors affecting peak search and strain refinement procedure.

  15. Symposium on Machine Processing of Remotely Sensed Data, Purdue University, West Lafayette, Ind., June 29-July 1, 1976, Proceedings

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Papers are presented on the applicability of Landsat data to water management and control needs, IBIS, a geographic information system based on digital image processing and image raster datatype, and the Image Data Access Method (IDAM) for the Earth Resources Interactive Processing System. Attention is also given to the Prototype Classification and Mensuration System (PROCAMS) applied to agricultural data, the use of Landsat for water quality monitoring in North Carolina, and the analysis of geophysical remote sensing data using multivariate pattern recognition. The Illinois crop-acreage estimation experiment, the Pacific Northwest Resources Inventory Demonstration, and the effects of spatial misregistration on multispectral recognition are also considered. Individual items are announced in this issue.

  16. Improving the Accessibility and Use of NASA Earth Science Data

    NASA Technical Reports Server (NTRS)

    Tisdale, Matthew; Tisdale, Brian

    2015-01-01

    Many of the NASA Langley Atmospheric Science Data Center (ASDC) Distributed Active Archive Center (DAAC) multidimensional tropospheric and atmospheric chemistry data products are stored in HDF4, HDF5 or NetCDF format, which traditionally have been difficult to analyze and visualize with geospatial tools. With the rising demand from the diverse end-user communities for geospatial tools to handle multidimensional products, several applications, such as ArcGIS, have refined their software. Many geospatial applications now have new functionalities that enable the end user to: store, serve, and perform analysis on each individual variable, its time dimension, and its vertical dimension; use NetCDF, GRIB, and HDF raster data formats across applications directly; and publish output within REST image services or WMS for time- and space-enabled web application development. During this webinar, participants will learn how to leverage geospatial applications such as ArcGIS, OPeNDAP and ncWMS in the production of Earth science information, and in increasing data accessibility and usability.

  17. Profile development for the spatial data transfer standard

    USGS Publications Warehouse

    Szemraj, John A.; Fegeas, Robin G.; Tolar, Billy R.

    1994-01-01

    The Spatial Data Transfer Standard (SDTS), or Federal Information Processing Standard (FIPS) 173, is designed to support all types of spatial data. Implementing all of the standard's options at one time is impractical. Therefore, implementation of the SDTS is being accomplished through the use of profiles. Profiles are clearly defined, limited subsets of the SDTS created for use with a specific type or model of data and designed with as few options as possible. When a profile is proposed, specific choices are made for encoding possibilities that were not addressed, left optional, or left with numerous choices within the SDTS. Profile development is coordinated by the U.S. Geological Survey's SDTS Task Force. When completed, profiles are submitted to the National Institute of Standards and Technology (NIST) for approval as official amendments to the SDTS. The first profile, the Topological Vector Profile (TVP), has been completed. A Raster Profile has been tested and is being finalized for submission to the NIST. Other vector profiles, such as those for network and nontopological data, are also being considered as future implementation options for the SDTS.

  18. A Big Spatial Data Processing Framework Applying to National Geographic Conditions Monitoring

    NASA Astrophysics Data System (ADS)

    Xiao, F.

    2018-04-01

    In this paper, a novel framework for spatial data processing is proposed, which applies to the National Geographic Conditions Monitoring project of China. It includes 4 layers: spatial data storage, spatial RDDs, spatial operations, and spatial query language. The spatial data storage layer uses HDFS to store large volumes of spatial vector/raster data in the distributed cluster. The spatial RDDs are abstract logical datasets of spatial data types, and can be transferred to the Spark cluster to conduct Spark transformations and actions. The spatial operations layer is a series of processing operations on spatial RDDs, such as range query, k nearest neighbor and spatial join. The spatial query language is a user-friendly interface which offers people not familiar with Spark a comfortable way to invoke the spatial operations. Compared with other spatial frameworks, this one is distinguished by the comprehensive set of technologies it draws on for big spatial data processing. Extensive experiments on real datasets show that the framework achieves better performance than traditional processing methods.
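
    The range query and k nearest neighbor operations named above can be sketched in their serial form; distributing them over Spark RDDs is the framework's contribution and is omitted here. The point set and query values are illustrative.

    ```python
    import math

    def range_query(points, xmin, ymin, xmax, ymax):
        """Return the points falling inside an axis-aligned query window."""
        return [p for p in points if xmin <= p[0] <= xmax and ymin <= p[1] <= ymax]

    def knn(points, q, k):
        """Brute-force k nearest neighbours of query point q."""
        return sorted(points, key=lambda p: math.dist(p, q))[:k]

    pts = [(0, 0), (1, 1), (2, 2), (5, 5)]
    print(range_query(pts, 0, 0, 1.5, 1.5))  # [(0, 0), (1, 1)]
    print(knn(pts, (1.2, 1.2), 2))           # [(1, 1), (2, 2)]
    ```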

  19. EarthServer: Use of Rasdaman as a data store for use in visualisation of complex EO data

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter; Grant, Mike

    2013-04-01

    The European Commission FP7 project EarthServer is establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending cutting-edge Array Database technology. EarthServer is built around the Rasdaman Raster Data Manager which extends standard relational database systems with the ability to store and retrieve multi-dimensional raster data of unlimited size through an SQL style query language. Rasdaman facilitates visualisation of data by providing several Open Geospatial Consortium (OGC) standard interfaces through its web services wrapper, Petascope. These include the well-established standards, Web Coverage Service (WCS) and Web Map Service (WMS), as well as the emerging standard, Web Coverage Processing Service (WCPS). The WCPS standard allows the running of ad-hoc queries on the data stored within Rasdaman, creating an infrastructure where users are not restricted by bandwidth when manipulating or querying huge datasets. Here we will show that the use of EarthServer technologies and infrastructure allows access and visualisation of massive scale data through a web client with only marginal bandwidth use, as opposed to the current mechanism of copying huge amounts of data to create visualisations locally. For example, if a user wanted to generate a plot of global average chlorophyll for a complete decade time series, they would only have to download the result instead of terabytes of data. Firstly we will present a brief overview of the capabilities of Rasdaman and the WCPS query language to introduce the ways in which it is used in a visualisation tool chain. We will show that there are several ways in which WCPS can be utilised to create both standard and novel web based visualisations. An example of a standard visualisation is the production of traditional 2D plots, allowing users to plot data products easily. However, the query language allows the creation of novel/custom products, which can then immediately be
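
    A WCPS request of the kind described — asking the server for a derived value rather than the underlying arrays — can be sketched as below. The endpoint URL and coverage name are placeholders, not real EarthServer resources; the ProcessCoverages request type comes from the WCS 2.0 processing extension.

    ```python
    from urllib.parse import urlencode

    # Hypothetical rasdaman/Petascope endpoint and coverage name, for illustration.
    endpoint = "https://example.org/rasdaman/ows"

    # WCPS query: ask the server for the coverage-wide average, encoded as CSV,
    # so only a single number travels over the network.
    wcps = 'for c in (chlorophyll_a) return encode(avg(c), "csv")'

    url = endpoint + "?" + urlencode({
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",
        "query": wcps,
    })
    print(url)
    ```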

  20. Method and apparatus for measuring areas of photoelectric cells and photoelectric cell performance parameters

    DOEpatents

    Osterwald, C.R.; Emery, K.A.

    1984-05-29

    A laser scanning system for scanning the surface of a photovoltaic cell in a precise, stepped raster pattern includes electric current detecting and measuring equipment for sensing the current response of the scanned cell to the laser beam at each stepped irradiated spot or pixel on the cell surface. A computer is used to control and monitor the raster position of the laser scan as well as monitoring the corresponding current responses, storing this data, operating on it, and for feeding the data to a graphical plotter for producing a visual, color-coded image of the current response of the cell to the laser scan. A translation platform driven by stepper motors in precise X and Y distances holds and rasters the cell being scanned under a stationary spot-focused laser beam.
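
    The stepped raster pattern itself is simple to model: a generator that visits the cell surface row by row at a fixed step, with each position paired to a current reading in the real apparatus. A minimal sketch (dimensions and step size are illustrative, not taken from the patent):

    ```python
    def raster_scan(width, height, step):
        """Yield (x, y) positions of a stepped raster pattern over a cell surface,
        row by row, as a stepper-motor translation platform would visit them."""
        y = 0.0
        while y <= height:
            x = 0.0
            while x <= width:
                yield (x, y)
                x += step
            y += step

    positions = list(raster_scan(1.0, 0.5, 0.5))
    print(positions)  # 6 positions: 3 steps in x by 2 rows in y
    ```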

  1. Method and apparatus for measuring areas of photoelectric cells and photoelectric cell performance parameters

    DOEpatents

    Osterwald, Carl R.; Emery, Keith A.

    1987-01-01

    A laser scanning system for scanning the surface of a photovoltaic cell in a precise, stepped raster pattern includes electric current detecting and measuring equipment for sensing the current response of the scanned cell to the laser beam at each stepped irradiated spot or pixel on the cell surface. A computer is used to control and monitor the raster position of the laser scan as well as monitoring the corresponding current responses, storing this data, operating on it, and for feeding the data to a graphic plotter for producing a visual, color-coded image of the current response of the cell to the laser scan. A translation platform driven by stepper motors in precise X and Y distances holds and rasters the cell being scanned under a stationary spot-focused laser beam.

  2. Development of large scale riverine terrain-bathymetry dataset by integrating NHDPlus HR with NED, CoNED and HAND data

    NASA Astrophysics Data System (ADS)

    Li, Z.; Clark, E. P.

    2017-12-01

    Large scale and fine resolution riverine bathymetry data is critical for flood inundation modeling but not available over the continental United States (CONUS). Previously we implemented bankfull hydraulic geometry based approaches to simulate bathymetry for individual rivers using NHDPlus v2.1 data and the 10 m National Elevation Dataset (NED). USGS has recently developed High Resolution NHD data (NHDPlus HR Beta) (USGS, 2017), and this enhanced dataset significantly improves spatial correspondence with the 10 m DEM. In this study, we used this high resolution data, specifically NHDFlowline and NHDArea, to create bathymetry/terrain for CONUS river channels and floodplains. A software package, NHDPlus Inundation Modeler v5.0 Beta, was developed for this project as an Esri ArcGIS hydrological analysis extension. With the updated tools, the raw 10 m DEM was first hydrologically treated to remove artificial blockages (e.g., overpasses, bridges, and even roadways) using low pass moving window filters. Cross sections were then automatically constructed along each flowline to extract elevation from the hydrologically treated DEM. In this study, river channel shapes were approximated using quadratic curves to reduce uncertainties from commonly used trapezoids. We calculated underwater channel elevation at each cross section sampling point using bankfull channel dimensions that were estimated from physiographic province/division based regression equations (Bieger et al. 2015). These elevation points were then interpolated to generate a bathymetry raster. The simulated bathymetry raster was integrated with USGS NED and the Coastal National Elevation Database (CoNED) (wherever available) to make a seamless terrain-bathymetry dataset. Channel bathymetry was also integrated into the HAND (Height above Nearest Drainage) dataset to improve large scale inundation modeling. The generated terrain-bathymetry was processed at the Watershed Boundary Dataset Hydrologic Unit 4 (WBDHU4) level.
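
    The quadratic channel approximation mentioned above can be written as a parabola pinned to the banks and the thalweg. A sketch under the stated bankfull geometry (the function name and example dimensions are ours, not the study's):

    ```python
    def quadratic_channel_depth(x, bankfull_width, bankfull_depth):
        """Depth below bank elevation at offset x from the channel centreline,
        assuming a parabolic cross-section: z(0) = -D and z(+/- W/2) = 0."""
        if abs(x) > bankfull_width / 2:
            return 0.0  # outside the channel, on the floodplain
        return -bankfull_depth * (1 - (2 * x / bankfull_width) ** 2)

    W, D = 20.0, 2.0  # illustrative bankfull width and depth in metres
    print(quadratic_channel_depth(0.0, W, D))   # -2.0 at the thalweg
    print(quadratic_channel_depth(10.0, W, D))  # 0.0 at the bank
    ```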

  3. Assessment of Early Toxicity and Response in Patients Treated With Proton and Carbon Ion Therapy at the Heidelberg Ion Therapy Center Using the Raster Scanning Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rieken, Stefan; Habermehl, Daniel; Nikoghosyan, Anna

    2011-12-01

    Purpose: To assess early toxicity and response in 118 patients treated with scanned ion beams to validate the safety of intensity-controlled raster scanning at the Heidelberg Ion Therapy Center. Patients and Methods: Between November 2009 and June 2010, we treated 118 patients with proton and carbon ion radiotherapy (RT) using active beam delivery. The main indications included skull base chordomas and chondrosarcomas, salivary gland tumors, and gliomas. We evaluated early toxicity within 6 weeks after RT and the initial clinical and radiologic response for quality assurance in our new facility. Results: In all 118 patients, few side effects were observed; in particular, no high numbers of severe acute toxicity were found. In general, the patients treated with particle therapy alone showed only a few single side effects, mainly Radiation Therapy Oncology Group/Common Terminology Criteria grade 1. The most frequent side effects and cumulative incidence of single side effects were observed in the head-and-neck patients treated with particle therapy as a boost and photon intensity-modulated RT. The toxicities included common radiation-attributed reactions known from photon RT, including mucositis, dysphagia, and skin erythema. The most predominant imaging responses were observed in patients with high-grade gliomas and those with salivary gland tumors. For skull base tumors, imaging showed a stable tumor outline in most patients. Thirteen patients showed improvement of pre-existing clinical symptoms. Conclusions: Side effects related to particle treatment were rare, and the overall tolerability of the treatment was shown. The initial response was promising. The data have confirmed the safe delivery of carbon ions and protons at the newly opened Heidelberg facility.

  4. Assessment of early toxicity and response in patients treated with proton and carbon ion therapy at the Heidelberg ion therapy center using the raster scanning technique.

    PubMed

    Rieken, Stefan; Habermehl, Daniel; Nikoghosyan, Anna; Jensen, Alexandra; Haberer, Thomas; Jäkel, Oliver; Münter, Marc W; Welzel, Thomas; Debus, Jürgen; Combs, Stephanie E

    2011-12-01

    PURPOSE: To assess early toxicity and response in 118 patients treated with scanned ion beams to validate the safety of intensity-controlled raster scanning at the Heidelberg Ion Therapy Center. Between November 2009 and June 2010, we treated 118 patients with proton and carbon ion radiotherapy (RT) using active beam delivery. The main indications included skull base chordomas and chondrosarcomas, salivary gland tumors, and gliomas. We evaluated early toxicity within 6 weeks after RT and the initial clinical and radiologic response for quality assurance in our new facility. In all 118 patients, few side effects were observed; in particular, no high numbers of severe acute toxicity were found. In general, the patients treated with particle therapy alone showed only a few single side effects, mainly Radiation Therapy Oncology Group/Common Terminology Criteria grade 1. The most frequent side effects and cumulative incidence of single side effects were observed in the head-and-neck patients treated with particle therapy as a boost and photon intensity-modulated RT. The toxicities included common radiation-attributed reactions known from photon RT, including mucositis, dysphagia, and skin erythema. The most predominant imaging responses were observed in patients with high-grade gliomas and those with salivary gland tumors. For skull base tumors, imaging showed a stable tumor outline in most patients. Thirteen patients showed improvement of pre-existing clinical symptoms. Side effects related to particle treatment were rare, and the overall tolerability of the treatment was shown. The initial response was promising. The data have confirmed the safe delivery of carbon ions and protons at the newly opened Heidelberg facility. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Qualitative analysis of precipitation distribution in Poland with use of different data sources

    NASA Astrophysics Data System (ADS)

    Walawender, J.; Dyras, I.; Łapeta, B.; Serafin-Rek, D.; Twardowski, A.

    2008-04-01

    Geographical Information Systems (GIS) can be used to integrate data from different sources and in different formats to perform innovative spatial and temporal analysis. GIS can also be applied in climatic research to manage, investigate and display all kinds of weather data. The main objective of this study is to demonstrate that GIS is a useful tool to examine and visualise precipitation distribution obtained from different data sources: ground measurements, satellite and radar data. Three selected days (30 cases) with convective rainfall situations were analysed. Firstly, a scalable GRID-based approach was applied to store data from the three different sources in a comparable layout. Then, a geoprocessing algorithm was created within the ArcGIS 9.2 environment. The algorithm included: GRID definition, reclassification and raster algebra. All of the calculations and procedures were performed automatically. Finally, contingency tables and pie charts were created to show the relationship between ground measurements and both satellite- and radar-derived data. The results were visualised on maps.
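
    The reclassification and contingency-table step can be sketched with plain Python (the threshold and toy grids are illustrative; the study itself used ArcGIS raster algebra):

    ```python
    def reclassify(grid, threshold):
        """Binary rain / no-rain reclassification of a precipitation grid."""
        return [[1 if v >= threshold else 0 for v in row] for row in grid]

    def contingency(a, b):
        """2x2 contingency table comparing two binary grids cell by cell."""
        table = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 0}
        for row_a, row_b in zip(a, b):
            for va, vb in zip(row_a, row_b):
                table[(va, vb)] += 1
        return table

    # Toy precipitation grids [mm] on the same GRID layout.
    gauge = reclassify([[0.0, 2.1], [5.0, 0.2]], threshold=1.0)
    radar = reclassify([[0.0, 1.8], [0.4, 0.0]], threshold=1.0)
    print(contingency(gauge, radar))  # {(0, 0): 2, (0, 1): 0, (1, 0): 1, (1, 1): 1}
    ```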

  6. Data documenting the potential distribution of Aedes aegypti in the center of Veracruz, Mexico.

    PubMed

    Estrada-Contreras, Israel; Sandoval-Ruiz, César A; Mendoza-Palmero, Fredy S; Ibáñez-Bernal, Sergio; Equihua, Miguel; Benítez, Griselda

    2017-02-01

    The data presented in this article are related to the research article entitled "Establishment of Aedes aegypti (L.) in mountainous regions in Mexico: Increasing number of population at risk of mosquito-borne disease and future climate conditions" (M. Equihua, S. Ibáñez-Bernal, G. Benítez, I. Estrada-Contreras, C.A. Sandoval-Ruiz, F.S. Mendoza-Palmero, 2016) [1]. This article provides presence records in shapefile format used to generate maps of the potential distribution of Aedes aegypti under different climate change scenarios, as well as each of the maps obtained in raster format. In addition, tables are provided with values of the potential distribution of the vector, as well as average probability-of-presence values, including data on mosquito incidence along the altitudinal range.

  7. A coastal hazards data base for the U.S. West Coast

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gornitz, V.M.; Beaty, T.W.; Daniels, R.C.

    1997-12-01

    This document describes the contents of a digital data base that may be used to identify coastlines along the US West Coast that are at risk to sea-level rise. This data base integrates point, line, and polygon data for the US West Coast into 0.25 degree latitude by 0.25 degree longitude grid cells and into 1:2,000,000 digitized line segments that can be used by raster or vector geographic information systems (GIS) as well as by non-GIS data bases. Each coastal grid cell and line segment contains data variables from the following seven data sets: elevation, geology, geomorphology, sea-level trends, shoreline displacement (erosion/accretion), tidal ranges, and wave heights. One variable from each data set was classified according to its susceptibility to sea-level rise and/or erosion to form 7 relative risk variables. These risk variables range in value from 1 to 5 and may be used to calculate a Coastal Vulnerability Index (CVI). Algorithms used to calculate several CVIs are listed within this text.

  8. A Coastal Hazards Data Base for the U.S. West Coast (1997) (NDP-043C)

    DOE Data Explorer

    Gornitz, Vivien M. [Columbia Univ., New York, NY (United States); Beaty, Tammy W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Daniels, Richard C. [The University of Tennessee, Knoxville, TN (United States)

    1997-01-01

    This data base integrates point, line, and polygon data for the U.S. West Coast into 0.25 degree latitude by 0.25 degree longitude grid cells and into 1:2,000,000 digitized line segments that can be used by raster or vector geographic information systems (GIS) as well as by non-GIS data bases. Each coastal grid cell and line segment contains data variables from the following seven data sets: elevation, geology, geomorphology, sea-level trends, shoreline displacement (erosion/accretion), tidal ranges, and wave heights. One variable from each data set was classified according to its susceptibility to sea-level rise and/or erosion to form 7 relative risk variables. These risk variables range in value from 1 to 5 and may be used to calculate a Coastal Vulnerability Index (CVI). Algorithms used to calculate several CVIs are listed within this text.
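
    One commonly cited CVI formulation, attributed to Gornitz and co-workers, takes the square root of the product of the risk variables divided by their number; the data base documentation lists several variants, so this sketch shows only one possibility, with an invented set of rankings.

    ```python
    import math

    def coastal_vulnerability_index(risks):
        """CVI as sqrt(product of the n relative-risk variables / n).
        One formulation among several; each risk is ranked 1 (low) to 5 (high)."""
        n = len(risks)
        product = 1
        for r in risks:
            product *= r
        return math.sqrt(product / n)

    # Seven relative-risk variables for one hypothetical coastal grid cell:
    cvi = coastal_vulnerability_index([3, 2, 4, 5, 3, 2, 4])
    print(cvi)
    ```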

  9. Dynamic, physical-based landslide susceptibility modelling based on real-time weather data

    NASA Astrophysics Data System (ADS)

    Canli, Ekrem; Glade, Thomas

    2016-04-01

    By now there seems to be broad consensus that, due to human-induced global change, the frequency and magnitude of precipitation intensities within extensive rainstorm events are expected to increase in certain parts of the world. Given that rainfall serves as one of the most common triggers for landslide initiation, increased landslide activity might also be expected. Landslide occurrence is a globally spread phenomenon that clearly needs to be handled by a variety of concepts, methods, and models. However, most of the research done with respect to landslides deals with retrospective cases; classical back-analysis approaches do not incorporate real-time data. This is remarkable, as most destructive landslides are related to immediate events due to external triggering factors. Only a few works so far have addressed real-time dynamic components for spatial landslide susceptibility and hazard assessment. Here we present an approach for integrating real-time web-based rainfall data from different sources into an automated workflow. Rain gauge measurements are interpolated into a continuous raster, which in turn is directly utilized in a dynamic, physical-based model. We use the Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability Analysis (TRIGRS) model, modified so that it is automatically updated with the most recent rainfall raster to produce hourly landslide susceptibility maps on a regional scale. To account for the uncertainties involved in spatial modelling, the model was further adjusted to accept not single values for the geotechnical parameters but ranges instead. The values are determined randomly between user-defined thresholds defining the parameter ranges. Consequently, a slope failure probability is computed from a larger number of model runs, rather than just the distributed factor of safety. This will ultimately allow a near-real-time spatial landslide alert for a given region.
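
    The randomized-parameter idea — replacing single geotechnical values with user-defined ranges and counting failing runs — can be sketched with a simple infinite-slope stability model. This is not TRIGRS itself; the stability formula is the textbook dry infinite-slope case, and all parameter ranges are invented for illustration.

    ```python
    import math
    import random

    def factor_of_safety(cohesion, friction_angle, unit_weight, depth, slope):
        """Infinite-slope factor of safety (dry case, illustrative only).
        cohesion [Pa], angles [deg], unit_weight [N/m^3], depth [m]."""
        beta, phi = math.radians(slope), math.radians(friction_angle)
        resisting = cohesion + unit_weight * depth * math.cos(beta) ** 2 * math.tan(phi)
        driving = unit_weight * depth * math.sin(beta) * math.cos(beta)
        return resisting / driving

    def failure_probability(runs=1000, seed=42):
        """Sample parameters uniformly from user-defined ranges and report the
        fraction of runs with FoS < 1, mimicking the randomized workflow."""
        rng = random.Random(seed)
        failures = 0
        for _ in range(runs):
            c = rng.uniform(2_000.0, 10_000.0)   # cohesion range [Pa]
            phi = rng.uniform(25.0, 35.0)        # friction angle range [deg]
            if factor_of_safety(c, phi, 18_000.0, 2.0, 40.0) < 1.0:
                failures += 1
        return failures / runs

    p = failure_probability()
    print(p)  # slope failure probability for this cell, between 0 and 1
    ```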

  10. Deciphering groundwater potential zones in hard rock terrain using geospatial technology.

    PubMed

    Dar, Imran A; Sankar, K; Dar, Mithas A

    2011-02-01

    Remote sensing and geographical information system (GIS) has become one of the leading tools in the field of groundwater research, which helps in assessing, monitoring, and conserving groundwater resources. This paper mainly deals with the integrated approach of remote sensing and GIS to delineate groundwater potential zones in hard rock terrain. Digitized vector maps pertaining to chosen parameters, viz. geomorphology, geology, land use/land cover, lineament, relief, and drainage, were converted to raster data using a 23 m × 23 m grid cell size. Moreover, the curvature of the study area was also considered while manipulating the spatial data. The raster maps of these parameters were assigned their respective theme weights and class weights. The individual theme weight was multiplied by its respective class weight, and then all the raster thematic layers were aggregated in a linear combination equation in the ArcMap GIS Raster Calculator module. Moreover, the weighted layers were statistically modeled to get the areal extent of groundwater prospects with respect to each thematic layer. The final result depicts the favorable prospective zones in the study area and can be helpful in better planning and management of groundwater resources, especially in hard rock terrains.
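
    The weighted linear combination described above (theme weight times class weight, summed over all thematic layers) is straightforward to sketch; the weights and toy rasters here are illustrative, not taken from the study.

    ```python
    def weighted_overlay(layers, theme_weights):
        """Linear combination of reclassified thematic rasters: each output cell
        is the sum of theme_weight * class_weight over all layers."""
        rows, cols = len(layers[0]), len(layers[0][0])
        out = [[0.0] * cols for _ in range(rows)]
        for layer, w in zip(layers, theme_weights):
            for i in range(rows):
                for j in range(cols):
                    out[i][j] += w * layer[i][j]
        return out

    # Class weights per cell for two hypothetical themes.
    geomorphology = [[4, 2], [1, 3]]
    lineament     = [[3, 3], [2, 1]]
    result = weighted_overlay([geomorphology, lineament], [0.6, 0.4])
    print(result)  # approximately [[3.6, 2.4], [1.4, 2.2]]
    ```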

  11. Prioritizing landscapes for longleaf pine conservation

    USGS Publications Warehouse

    Grand, James B.; Kleiner, Kevin J.

    2016-01-01

    We developed a spatially explicit model and map, as a decision support tool (DST), to aid conservation agencies creating or maintaining open pine ecosystems. The tool identified areas that are likely to provide the greatest benefit to focal bird populations based on a comprehensive landscape analysis. We used NLCD 2011, SSURGO, and SEGAP data to map the density of desired resources for open pine ecosystems and six focal bird species and two reptile species within the historic range of longleaf pine east of the Mississippi River. Binary rasters were created of sites with desired characteristics such as land form, hydrology, land use and land cover, soils, potential habitat for focal species, and putative source populations of focal species. Each raster was smoothed using a kernel density estimator. Rasters were combined and scaled to map priority locations for the management of each focal species. Species’ rasters were combined and scaled to provide maps of overall priority for birds and for birds and reptiles. The spatial data can be used to identify high priority areas for conservation or to compare areas under consideration for maintenance or creation of open pine ecosystems.

  12. A system for verifying models and classification maps by extraction of information from a variety of data sources

    NASA Technical Reports Server (NTRS)

    Norikane, L.; Freeman, A.; Way, J.; Okonek, S.; Casey, R.

    1992-01-01

    Recent updates to a geographical information system (GIS) called VICAR (Video Image Communication and Retrieval)/IBIS are described. The system is designed to handle data from many different formats (vector, raster, tabular) and many different sources (models, radar images, ground truth surveys, optical images). All the data are referenced to a single georeference plane, and average or typical values for parameters defined within a polygonal region are stored in a tabular file, called an info file. The info file format allows tracking of data in time, maintenance of links between component data sets and the georeference image, conversion of pixel values to `actual' values (e.g., radar cross-section, luminance, temperature), graph plotting, data manipulation, generation of training vectors for classification algorithms, and comparison between actual measurements and model predictions (with ground truth data as input).

  13. Development and implementation of a low cost micro computer system for LANDSAT analysis and geographic data base applications

    NASA Technical Reports Server (NTRS)

    Faust, N.; Jordon, L.

    1981-01-01

    Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970's, geographic data analysis has moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Programs designed to process LANDSAT data for use as one element in a geographic data base came into use once NIMGRID (new IMGRID), a raster oriented geographic information system, was implemented on the microcomputer. Programs for training field selection, supervised and unsupervised classification, and image enhancement were added. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color infrared format. The basic microcomputer hardware needed to run NIMGRID and most LANDSAT analyses is listed, as well as the software available for LANDSAT processing.

  14. High resolution global gridded data for use in population studies

    NASA Astrophysics Data System (ADS)

    Lloyd, Christopher T.; Sorichetta, Alessandro; Tatem, Andrew J.

    2017-01-01

    Recent years have seen substantial growth in openly available satellite and other geospatial data layers, which represent a range of metrics relevant to global human population mapping at fine spatial scales. The specifications of such data differ widely and therefore the harmonisation of data layers is a prerequisite to constructing detailed and contemporary spatial datasets which accurately describe population distributions. Such datasets are vital to measure impacts of population growth, monitor change, and plan interventions. To this end the WorldPop Project has produced an open access archive of 3 and 30 arc-second resolution gridded data. Four tiled raster datasets form the basis of the archive: (i) Viewfinder Panoramas topography clipped to Global ADMinistrative area (GADM) coastlines; (ii) a matching ISO 3166 country identification grid; (iii) country area; (iv) and slope layer. Further layers include transport networks, landcover, nightlights, precipitation, travel time to major cities, and waterways. Datasets and production methodology are here described. The archive can be downloaded both from the WorldPop Dataverse Repository and the WorldPop Project website.

  17. Object oriented classification of high resolution data for inventory of horticultural crops

    NASA Astrophysics Data System (ADS)

    Hebbar, R.; Ravishankar, H. M.; Trivedi, S.; Subramoniam, S. R.; Uday, R.; Dadhwal, V. K.

    2014-11-01

    High resolution satellite images are associated with large variance and thus per-pixel classifiers often result in poor accuracy, especially in delineation of horticultural crops. In this context, object oriented techniques are powerful and promising methods for classification. In the present study, a semi-automatic object oriented feature extraction model has been used for delineation of horticultural fruit and plantation crops using Erdas Objective Imagine. Multi-resolution data from Resourcesat LISS-IV and Cartosat-1 have been used as source data in the feature extraction model. Spectral and textural information along with NDVI were used as inputs for generation of Spectral Feature Probability (SFP) layers using sample training pixels. The SFP layers were then converted into raster objects using threshold and clump functions, resulting in a pixel probability layer. A set of raster and vector operators was employed in the subsequent steps for generating a thematic layer in vector format. This semi-automatic feature extraction model was employed for classification of major fruit and plantation crops, viz. mango, banana, citrus, coffee and coconut, grown under different agro-climatic conditions. In general, a classification accuracy of about 75-80 per cent was achieved for these crops using object based classification alone, and the same was further improved using minimal visual editing of misclassified areas. A comparison of on-screen visual interpretation with the object oriented approach showed good agreement. It was observed that old and mature plantations were classified more accurately, while young and recently planted ones (3 years or less) showed poor classification accuracy due to mixed spectral signatures, wider spacing and poor stands of plantations. The results indicated the potential use of the object oriented approach for classification of high resolution data for delineation of horticultural fruit and plantation crops. The present methodology is applicable at
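The threshold-and-clump step can be sketched generically: threshold a probability layer, label contiguous groups of pixels ("clumps"), and discard clumps below a minimum size. This uses `scipy.ndimage.label` as a stand-in for the ERDAS clump function; the threshold and size values are illustrative, not the study's.

```python
import numpy as np
from scipy import ndimage

def probability_to_objects(sfp, threshold=0.6, min_pixels=3):
    """Threshold a spectral-feature-probability layer and 'clump' the
    result into contiguous raster objects, discarding tiny clumps."""
    mask = sfp >= threshold
    labels, n = ndimage.label(mask)   # 4-connected clumping
    sizes = np.bincount(labels.ravel())
    keep = sizes >= min_pixels
    keep[0] = False                   # label 0 is background
    return np.where(keep[labels], labels, 0)

sfp = np.array([[0.9, 0.9, 0.1, 0.7],
                [0.9, 0.8, 0.1, 0.7],
                [0.1, 0.1, 0.1, 0.7]])
objects = probability_to_objects(sfp)   # two raster objects, background = 0
```

The labelled raster objects would then be vectorised into the thematic polygon layer.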

  18. Remodeling census population with spatial information from Landsat TM imagery

    USGS Publications Warehouse

    Yuan, Y.; Smith, R.M.; Limp, W.F.

    1997-01-01

    In geographic information systems (GIS) studies there has been some difficulty integrating socioeconomic and physiogeographic data. One important type of socioeconomic data, census data, offers a wide range of socioeconomic information, but is aggregated within arbitrary enumeration districts (EDs). Values reflect either raw counts or, when standardized, the mean densities in the EDs. On the other hand, remote sensing imagery, an important type of physiogeographic data, provides large quantities of information with more spatial detail than census data. Based on the dasymetric mapping principle, this study applies multivariable regression to examine the correlation between census population counts and land cover types. The land cover map is classified from Landsat TM imagery. The correlation is high. Census population counts are remodeled to a GIS raster layer based on the discovered correlations, coupled with scaling techniques that offset influences from factors other than land cover types. The GIS raster layer depicts the population distribution with much more spatial detail than census data offer. The resulting GIS raster layer is ready to be analyzed or integrated with other GIS data. © 1998 Elsevier Science Ltd. All rights reserved.
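The regression at the core of this dasymetric approach can be sketched with a least-squares fit of per-class population densities to district totals. The land-cover classes and numbers below are invented for illustration.

```python
import numpy as np

# Rows: enumeration districts; columns: area (km^2) of each land-cover
# class inside the district (urban, agriculture, forest -- illustrative).
landcover_area = np.array([[10.0, 2.0, 0.0],
                           [ 1.0, 8.0, 3.0],
                           [ 0.0, 1.0, 9.0],
                           [ 6.0, 4.0, 2.0]])
population = np.array([5060.0, 770.0, 120.0, 3140.0])  # census counts per ED

# Least-squares fit of per-class population densities (people per km^2),
# the correlation the study exploits.
density, *_ = np.linalg.lstsq(landcover_area, population, rcond=None)

# A district's count is then redistributed onto its raster cells in
# proportion to the fitted density of each cell's land-cover class,
# with scaling so the district total is preserved.
```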

  19. User's Guide for the MapImage Reprojection Software Package, Version 1.01

    USGS Publications Warehouse

    Finn, Michael P.; Trent, Jason R.

    2004-01-01

    Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets (such as 30-m data) for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Recently, Usery and others (2003a) expanded on the previously limited empirical work with real geographic data by compiling and tabulating the accuracy of categorical areas in projected raster datasets of global extent. Geographers and applications programmers at the U.S. Geological Survey's (USGS) Mid-Continent Mapping Center (MCMC) undertook an effort to expand and evolve an internal USGS software package, MapImage, or mapimg, for raster map projection transformation (Usery and others, 2003a). Daniel R. Steinwand of Science Applications International Corporation, Earth Resources Observation Systems Data Center in Sioux Falls, S. Dak., originally developed mapimg for the USGS, basing it on the USGS's General Cartographic Transformation Package (GCTP). It operated as a command line program on the Unix operating system. Through efforts at MCMC, and in coordination with Mr. Steinwand, this program has been transformed from a command-line application into a software package based on a graphical user interface for Windows, Linux, and Unix machines. Usery and others (2003b) pointed out that many commercial software packages do not use exact projection equations and that, even when exact projection equations are used, the software often produces errors and sometimes does not complete the transformation for specific projections, at specific resampling resolutions, and for specific singularities. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in these software packages, but implementation with data other than points requires specific adaptation of the equations or prior preparation of the data to allow the transformation to succeed. Additional
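The distinction the abstract draws between point transformation and raster transformation can be sketched: projecting a raster robustly means inverse-mapping each output cell back to geographic coordinates and sampling the source, rather than forward-mapping source pixels (which leaves holes and overlaps). The sinusoidal projection is used here only because its inverse is simple; this is not mapimg's or GCTP's actual code.

```python
import numpy as np

R = 6371.0  # spherical Earth radius, km

def sinusoidal_inverse(x, y):
    """Inverse sinusoidal projection: projected km back to lon/lat (radians)."""
    lat = y / R
    lon = x / (R * np.cos(lat))
    return lon, lat

def project_raster(src, lon0, lat0, dlon, dlat, out_shape, cell_km):
    """Fill each OUTPUT cell by inverse-mapping its centre to geographic
    coordinates and sampling the source grid (nearest neighbour)."""
    out = np.full(out_shape, np.nan)
    for i in range(out_shape[0]):
        for j in range(out_shape[1]):
            x, y = (j + 0.5) * cell_km, (i + 0.5) * cell_km
            lon, lat = sinusoidal_inverse(x, y)
            r = int(np.floor((lat - lat0) / dlat))
            c = int(np.floor((lon - lon0) / dlon))
            if 0 <= r < src.shape[0] and 0 <= c < src.shape[1]:
                out[i, j] = src[r, c]
    return out

src = np.arange(2500.0).reshape(50, 50)   # source on a 0.01-radian lattice
out = project_raster(src, 0.0, 0.0, 0.01, 0.01, (10, 10), 100.0)
```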

  20. Risk Factor Analysis in Low-Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (GPFA-AB)

    DOE Data Explorer

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the centers of the raster cells, and locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of a raster has the same name as the raster file, but with *.png instead of *.tif as the file ending. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile overlaid. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special-use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.
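The three combination rules (product, sum, minimum) are easy to sketch on raster stacks. The min-max scaling to [0, 1] and the toy values below are illustrative stand-ins; the project's actual scaling scheme may differ, and all four factors are assumed to be oriented so that larger means more favorable.

```python
import numpy as np

def scale01(raster):
    """Min-max scale a risk-factor raster to [0, 1] (illustrative stand-in
    for the project's scaling step)."""
    lo, hi = np.nanmin(raster), np.nanmax(raster)
    return (raster - lo) / (hi - lo)

reservoir = scale01(np.array([[ 1.0,  3.0], [ 2.0,  5.0]]))
thermal   = scale01(np.array([[ 2.0,  2.0], [ 4.0,  8.0]]))
seismic   = scale01(np.array([[ 0.5,  0.9], [ 0.4,  0.1]]))
utilize   = scale01(np.array([[10.0, 30.0], [20.0, 40.0]]))

stack = np.stack([reservoir, thermal, seismic, utilize])
combined_product = stack.prod(axis=0)   # penalises any weak factor strongly
combined_sum     = stack.sum(axis=0)    # rewards overall strength
combined_minimum = stack.min(axis=0)    # limited by the weakest factor
```

The three maps rank cells differently: the product and minimum emphasise that a fairway is only as good as its weakest risk factor, while the sum allows strong factors to compensate for weak ones.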

  1. BOREAS Regional DEM in Raster Format and AEAC Projection

    NASA Technical Reports Server (NTRS)

    Knapp, David; Verdin, Kristine; Hall, Forrest G. (Editor)

    2000-01-01

    This data set is based on the GTOPO30 Digital Elevation Model (DEM) produced by the United States Geological Survey EROS Data Center (USGS EDC). The BOReal Ecosystem-Atmosphere Study (BOREAS) region (1,000 km x 1,000 km) was extracted from the GTOPO30 data and reprojected by BOREAS staff into the Albers Equal-Area Conic (AEAC) projection. The pixel size of these data is 1 km. The data are stored in binary, image format files.

  2. Cross-Service Investigation of Geographical Information Systems

    DTIC Science & Technology

    2004-03-01

    Figure 8 illustrates the combined layers. Information for the layers is stored in a database format. The two types of storage are vector and...raster models. In a vector model, the image and information are stored as geometric objects such as points, lines, or polygons. In a raster model...DNCs are a vector-based digital database with selected maritime significant physical features from hydrographic charts. Layers within the DNC are data

  3. A data model for environmental scientists

    NASA Astrophysics Data System (ADS)

    Kapeljushnik, O.; Beran, B.; Valentine, D.; van Ingen, C.; Zaslavsky, I.; Whitenack, T.

    2008-12-01

    Environmental science encompasses a wide range of disciplines, from water chemistry to microbiology, ecology and atmospheric sciences. Studies often require working across disciplines which differ in their ways of describing and storing data, such that it is not possible to devise a monolithic one-size-fits-all data solution. Based on our experiences with the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Observations Data Model, Berkeley Water Center FLUXNET carbon-climate work, and by examining standards like EPA's Water Quality Exchange (WQX), we have developed a flexible data model that allows extensions without the need to alter the schema, so that scientists can define custom metadata elements to describe their data, including observations, analysis methods as well as sensors and geographical features. The data model supports various types of observations, including fixed-point and moving sensors, bottled samples, rasters from remote sensors and models, and categorical descriptions (e.g. taxonomy), by employing user-defined types when necessary. It leverages the ADO.NET Entity Framework to provide semantic data models for differing disciplines, while maintaining a common schema below the entity layer. This abstraction layer simplifies data retrieval and manipulation by hiding the logic and complexity of the relational schema from users, thus allowing programmers and scientists to deal directly with objects such as observations, sensors, watersheds, river reaches, channel cross-sections, laboratory analysis methods and samples, as opposed to table joins, columns and rows.

  4. Geostatistical Investigations of Displacements on the Basis of Data from the Geodetic Monitoring of a Hydrotechnical Object

    NASA Astrophysics Data System (ADS)

    Namysłowska-Wilczyńska, Barbara; Wynalek, Janusz

    2017-12-01

    Geostatistical methods make the analysis of measurement data possible. This article presents problems concerning the use of geostatistics in the spatial analysis of displacements based on geodetic monitoring. Using methods of applied (spatial) statistics, the research deals with interesting and current issues connected to space-time analysis, modeling displacements and deformations, as applied to any large-area objects on which geodetic monitoring is conducted (e.g., water dams, urban areas in the vicinity of deep excavations, areas at a macro-regional scale subject to anthropogenic influences caused by mining, etc.). These problems are crucial, especially for safety assessment of important hydrotechnical constructions, as well as for modeling and estimating mining damage. Based on the geodetic monitoring data, a substantial basic empirical material was created, comprising many years of research results concerning displacements of controlled points situated on the crown and foreland of an exemplary earth dam, and used to assess the behaviour and safety of the object during its whole operating period. A research method at a macro-regional scale was applied to investigate some phenomena connected with the operation of the analysed big hydrotechnical construction. Applying a semivariogram function enabled the spatial variability analysis of displacements. Isotropic empirical semivariograms were calculated and then theoretical parameters of analytical functions were determined, which approximated the courses of the mentioned empirical variability measure. Using ordinary (block) kriging at the grid nodes of an elementary spatial grid covering the analysed object, the values of the estimated means of displacements Z* were calculated, together with the accompanying assessment of estimation uncertainty, a standard deviation of estimation σk. Raster maps of the distribution of estimated averages Z* and raster maps of deviations of estimation σk (in perspective
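The empirical semivariogram underlying such an analysis can be sketched directly: for each lag distance h, average half the squared differences over point pairs separated by roughly h. This is a generic isotropic estimator with invented displacement values, not the study's data or kriging code.

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Isotropic empirical semivariogram: for each lag h, average
    0.5*(z_i - z_j)^2 over pairs whose separation lies within h +/- tol."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    gamma = []
    for h in lags:
        mask = np.triu(np.abs(d - h) <= tol, k=1)   # count each pair once
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Toy displacements of four monitored points along a 1D baseline
coords = np.array([[0.0], [1.0], [2.0], [3.0]])
disp = np.array([0.0, 1.0, 2.0, 3.0])
gamma = empirical_semivariogram(coords, disp, lags=[1.0, 2.0], tol=0.1)
```

A theoretical model (spherical, exponential, etc.) fitted to these γ(h) values then supplies the weights for ordinary (block) kriging at the grid nodes.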

  5. Solid object visualization of 3D ultrasound data

    NASA Astrophysics Data System (ADS)

    Nelson, Thomas R.; Bailey, Michael J.

    2000-04-01

    Visualization of volumetric medical data is challenging. Rapid-prototyping (RP) equipment producing solid object prototype models of computer-generated structures is directly applicable to visualization of medical anatomic data. The purpose of this study was to develop methods for transferring 3D Ultrasound (3DUS) data to RP equipment for visualization of patient anatomy. 3DUS data were acquired using research and clinical scanning systems. Scaling information was preserved, and the data were segmented using threshold and local operators to extract features of interest, converted from voxel raster coordinate format to a set of polygons representing an iso-surface, and transferred to the RP machine to create a solid 3D object. Fabrication required 30 to 60 minutes, depending on object size and complexity. After creation, the model could be touched and viewed. A '3D visualization hardcopy device' has advantages for conveying spatial relations compared to visualization using computer display systems. The hardcopy model may be used for teaching or therapy planning. Objects may be produced at the exact dimensions of the original object or scaled up (or down) to match the viewer's reference frame more optimally. RP models represent a useful means of communicating important information in a tangible fashion to patients and physicians.
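The voxel-to-polygon step can be sketched in its simplest form: every face where a filled voxel meets an empty one becomes an exposed quad of the surface. A real pipeline would use marching cubes for a smoother iso-surface; this minimal version only counts the boundary quads such an extractor would emit.

```python
import numpy as np

def exposed_faces(voxels):
    """Count boundary faces of a binary voxel volume: each 0/1 transition
    along any axis is one exposed unit quad of the surface mesh."""
    v = np.pad(voxels.astype(int), 1)   # pad with empty space on all sides
    faces = 0
    for axis in range(3):
        faces += int(np.abs(np.diff(v, axis=axis)).sum())
    return faces

cube = np.ones((2, 2, 2), dtype=bool)   # a 2x2x2 solid block
print(exposed_faces(cube))              # 24 quads: 6 faces x 4 quads each
```

The resulting polygons, scaled by the preserved voxel dimensions, are what gets written (e.g. as STL) to the RP machine.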

  6. An Automated Road Roughness Detection from Mobile Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Kumar, P.; Angelats, E.

    2017-05-01

    Rough roads influence the safety of road users, as accident rates increase with increasing unevenness of the road surface. Road roughness regions need to be efficiently detected and located in order to ensure their maintenance. Mobile Laser Scanning (MLS) systems provide a rapid and cost-effective alternative by providing accurate and dense point cloud data along the route corridor. In this paper, an automated algorithm is presented for detecting road roughness from MLS data. The presented algorithm is based on interpolating a smooth intensity raster surface from the LiDAR point cloud data using a point thinning process. The interpolated surface is further processed using morphological and multi-level Otsu thresholding operations to identify candidate road roughness regions. The candidate regions are finally filtered based on spatial density and standard deviation of elevation criteria to detect the roughness along the road surface. Test results of the road roughness detection algorithm on two road sections are presented. The developed approach can be used to provide comprehensive information to road authorities in order to schedule maintenance and ensure maximum safety conditions for road users.
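The Otsu thresholding step can be sketched on a synthetic intensity raster. The paper uses a multi-level variant plus morphological operations; a single-level Otsu implementation keeps the sketch short, and the synthetic data are invented.

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Single-level Otsu threshold: pick the intensity that maximises
    the between-class variance of the two resulting classes."""
    hist, edges = np.histogram(img, bins=nbins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)              # probability of the lower class
    m = np.cumsum(p * centers)     # cumulative mean
    mt = m[-1]                     # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mt * w0 - m) ** 2 / (w0 * (1.0 - w0))
    return centers[np.nanargmax(between)]

# Synthetic intensity surface: smooth pavement (~0.2) with one rough patch (~0.8)
rng = np.random.default_rng(0)
img = rng.normal(0.2, 0.02, (50, 50))
img[20:30, 20:30] = rng.normal(0.8, 0.02, (10, 10))
t = otsu_threshold(img)
rough = img > t   # candidate roughness regions, to be filtered further
```

In the full algorithm the candidate mask would then be filtered by the spatial-density and elevation standard-deviation criteria.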

  7. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
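The per-cell comparison statistics named in the abstract (minimum, maximum and mean difference, plus RMSE) are straightforward raster operations; the toy elevation values below are illustrative.

```python
import numpy as np

def dem_error_stats(reference, candidate):
    """Difference statistics between a reference DEM and a DEM generated
    by a ground-filtering algorithm, computed cell by cell."""
    diff = candidate - reference
    return {"min": float(diff.min()),
            "max": float(diff.max()),
            "mean": float(diff.mean()),
            "rmse": float(np.sqrt(np.mean(diff ** 2)))}

ref = np.array([[100.0, 101.0], [102.0, 103.0]])
gen = np.array([[100.5, 100.5], [102.0, 104.0]])
stats = dem_error_stats(ref, gen)
```

A positive mean difference indicates the filtering algorithm systematically leaves the surface above the reference (typically from unremoved vegetation points).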

  8. Implementing a Data Quality Strategy to Simplify Access to Data

    NASA Astrophysics Data System (ADS)

    Druken, K. A.; Trenham, C. E.; Evans, B. J. K.; Richards, C. J.; Wang, J.; Wyborn, L. A.

    2016-12-01

    To ensure seamless programmatic access for data analysis (including machine learning), standardization of both data and services is vital. At the Australian National Computational Infrastructure (NCI) we have developed a Data Quality Strategy (DQS) that currently provides processes for: (1) the consistency of data structures in the underlying High Performance Data (HPD) platform; (2) quality control through compliance with recognized community standards; and (3) data quality assurance through demonstrated functionality across common platforms, tools and services. NCI hosts one of Australia's largest repositories (10+ PBytes) of research data collections, spanning datasets from climate, coasts, oceans and geophysics through to astronomy, bioinformatics and the social sciences. A key challenge is the application of community-agreed data standards to the broad set of Earth systems and environmental data that are being used. Within these disciplines, data span a wide range of gridded, ungridded (i.e., line surveys, point clouds), and raster image types, as well as diverse coordinate reference projections and resolutions. By implementing our DQS we have seen progressive improvement in the quality of the datasets across the different subject domains, and through this, the ease by which users can programmatically access the data, either in situ or via web services. As part of its quality control procedures, NCI has developed a compliance checker based upon existing domain standards. The DQS also includes extensive Functionality Testing, which includes readability by commonly used libraries (e.g., netCDF, HDF, GDAL, etc.); accessibility by data servers (e.g., THREDDS, Hyrax, GeoServer); validation against scientific analysis and programming platforms (e.g., Python, Matlab, QGIS); and visualization tools (e.g., ParaView, NASA Web World Wind). These tests ensure smooth interoperability between products and services as well as exposing unforeseen requirements and

  9. Mapping the montane cloud forest of Taiwan using 12 year MODIS-derived ground fog frequency data.

    PubMed

    Schulz, Hans Martin; Li, Ching-Feng; Thies, Boris; Chang, Shih-Chieh; Bendix, Jörg

    2017-01-01

    Until now, montane cloud forest (MCF) in Taiwan has been mapped only for selected areas of vegetation plots. This paper presents the first comprehensive map of MCF distribution for the entire island. For its creation, a Random Forest model was trained with vegetation plots from the National Vegetation Database of Taiwan that were classified as "MCF" or "non-MCF". This model predicted the distribution of MCF from a raster data set of parameters derived from a digital elevation model (DEM), Landsat channels and texture measures derived from them, as well as ground fog frequency data derived from the Moderate Resolution Imaging Spectroradiometer. While the DEM parameters and Landsat data predicted much of the cloud forest's location, local deviations in the altitudinal distribution of MCF linked to the monsoonal influence as well as the Massenerhebung effect (causing MCF at atypically low altitudes) were only captured once fog frequency data were included. Therefore, our study suggests that ground fog data are most useful for accurately mapping MCF.
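The workflow, training a Random Forest on labelled plots and applying it to a raster predictor stack, can be sketched with scikit-learn. The predictors, the toy MCF rule and all values below are invented for illustration and are not taken from the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-ins for the predictor stack: DEM elevation, one Landsat
# channel, and MODIS-derived ground fog frequency.
rng = np.random.default_rng(42)
n = 200
elevation = rng.uniform(0.0, 3000.0, n)   # m
landsat_b4 = rng.uniform(0.0, 1.0, n)     # reflectance
fog_freq = rng.uniform(0.0, 1.0, n)       # fraction of foggy observations
# Toy rule: cloud forest where mid elevations coincide with frequent fog
is_mcf = ((elevation > 1200) & (elevation < 2600) & (fog_freq > 0.5)).astype(int)

X = np.column_stack([elevation, landsat_b4, fog_freq])
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, is_mcf)

# Applying the model to every cell of the raster stack yields the map
prediction = model.predict(X)
```

Including the fog-frequency column is what lets the model capture fog-driven deviations that elevation alone would miss, mirroring the role of the MODIS layer in the study.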

  10. Galileo PPR at Io: High Resolution Scans Taken in Conjunction with SSI and NIMS Data

    NASA Technical Reports Server (NTRS)

    Rathbun, J. A.; Spencer, J. R.; Tamppari, L. K.; Martin, T. Z.; Barnard, L.; Travis, L. D.

    2003-01-01

    The Galileo Photopolarimeter-Radiometer (PPR), when used in the radiometry mode most often employed at Io, is a long-wavelength infrared single-aperture photometer. It is sensitive to temperatures from about 60 to several hundred K, and is thus useful for studying the volcanoes and background temperatures on Io. PPR can take raster scan images when it is the primary instrument being used (these data were discussed last year; see Rathbun et al., 2002). It can also take data in ride-along mode in conjunction with another remote sensing instrument (either SSI or NIMS), producing one-dimensional temperature scans. The best data of this type were taken during the close approach flybys during orbits I24, I25, I27, I31, I32, and I33 and include measurements of the volcanoes Pele, Prometheus, Pillan, Zamama, Tvashtar, Daedalus, Amarani, Gish Bar, Isum, Emakong, Tupan, and Tohil.

  11. A Global Geographic Information System Data Base of Storm Occurrences and Other Climatic Phenomena Affecting Coastal Zones (1991) (NDP-035)

    DOE Data Explorer

    Birdwell, Kevin R. [Murray State University, Kentucky]; Daniels, Richard C.

    2012-01-01

    This NDP is unique in that it represents CDIAC's first offering of ARC/INFO™ export data files and equivalent flat ASCII data files that may be used by raster or vector geographic information systems (GISs). The data set contains 61 variables, including information on tropical storms, hurricanes, super typhoons, extratropical cyclogeneses, polar lows, cyclonicity, influence of winds in monsoon regions, and sea-ice concentrations. Increased availability of source data has made it possible to extend the area of these data variables to regional or global coverages. All data variables except five are referenced to 1° × 1° or 5° × 5° grid cells of latitude and longitude. These data help meet the demand for new and improved climatologies of storm events and may be used in climate research studies, including the verification of general circulation models and the calculation of storm-recurrence intervals.

  12. Distribution of Potential Hydrothermally Altered Rocks in Central Colorado Derived From Landsat Thematic Mapper Data: A Geographic Information System Data Set

    USGS Publications Warehouse

    Knepper, Daniel H.

    2010-01-01

    As part of the Central Colorado Mineral Resource Assessment Project, the digital image data for four Landsat Thematic Mapper scenes covering central Colorado between Wyoming and New Mexico were acquired and band ratios were calculated after masking pixels dominated by vegetation, snow, and terrain shadows. Ratio values were visually enhanced by contrast stretching, revealing only those areas with strong responses (high ratio values). A color-ratio composite mosaic was prepared for the four scenes so that the distribution of potentially hydrothermally altered rocks could be visually evaluated. To provide a more useful input to a Geographic Information System-based mineral resource assessment, the information contained in the color-ratio composite raster image mosaic was converted to vector-based polygons after thresholding to isolate the strongest ratio responses and spatial filtering to reduce vector complexity and isolate the largest occurrences of potentially hydrothermally altered rocks.
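The core raster operation, a band ratio after a vegetation mask, can be sketched generically. The TM5/TM7 ratio responds to the hydroxyl absorption of clay alteration minerals; the cutoff values and toy digital numbers below are illustrative, not the report's.

```python
import numpy as np

def alteration_ratio_mask(band5, band7, band4, band3,
                          ndvi_cutoff=0.3, ratio_cutoff=1.6):
    """Flag potentially hydrothermally altered pixels: high TM5/TM7 ratio
    (clay/hydroxyl absorption) after masking vegetated pixels by NDVI.
    Cutoffs are illustrative placeholders."""
    ndvi = (band4 - band3) / (band4 + band3)   # vegetation index from TM4/TM3
    ratio = band5 / band7
    return (ndvi < ndvi_cutoff) & (ratio > ratio_cutoff)

b5 = np.array([[120.0, 80.0], [150.0, 60.0]])
b7 = np.array([[ 60.0, 70.0], [ 70.0, 55.0]])
b4 = np.array([[ 50.0, 90.0], [ 40.0, 80.0]])
b3 = np.array([[ 45.0, 30.0], [ 38.0, 35.0]])
mask = alteration_ratio_mask(b5, b7, b4, b3)
```

In the report's workflow, a binary mask like this would then be spatially filtered and converted to vector polygons for the GIS-based assessment.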

  13. User's Guide for MapIMG 2: Map Image Re-projection Software Package

    USGS Publications Warehouse

    Finn, Michael P.; Trent, Jason R.; Buehler, Robert A.

    2006-01-01

    BACKGROUND Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in commercial software packages, but implementation with data other than points requires specific adaptation of the transformation equations or prior preparation of the data to allow the transformation to succeed. It seems that some of these packages use the U.S. Geological Survey's (USGS) General Cartographic Transformation Package (GCTP) or similar point transformations without adaptation to the specific characteristics of raster data (Usery and others, 2003a). Usery and others (2003b) compiled and tabulated the accuracy of categorical areas in projected raster datasets of global extent. Based on the shortcomings identified in these studies, geographers and applications programmers at the USGS expanded and evolved a USGS software package, MapIMG, for raster map projection transformation (Finn and Trent, 2004). Daniel R. Steinwand of Science Applications International Corporation, National Center for Earth Resources Observation and Science, originally developed MapIMG for the USGS, basing it on GCTP. Through previous and continuing efforts at the USGS' National Geospatial Technical Operations Center, this program has been transformed from an application based on command line input into a software package based on a graphical user interface for Windows, Linux, and other UNIX machines.

  14. Earthscape, a Multi-Purpose Interactive 3d Globe Viewer for Hybrid Data Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Sarthou, A.; Mas, S.; Jacquin, M.; Moreno, N.; Salamon, A.

    2015-08-01

    The hybrid visualization and interaction tool EarthScape is presented here. The software can simultaneously display LiDAR point clouds, draped videos with a moving footprint, volumetric scientific data (using volume rendering, isosurfaces and slice planes), raster data such as still satellite images, vector data, and 3D models such as buildings or vehicles. The application runs on touch screen devices such as tablets. The software is based on open source libraries, such as OpenSceneGraph, osgEarth and OpenCV, and shader programming is used to implement volume rendering of scientific data. The next goal of EarthScape is to perform data analysis using ENVI Services Engine, a cloud data analysis solution. EarthScape is also designed to be a client of Jagwire, which provides multisource geo-referenced video fluxes. When all these components are included, EarthScape will be a multi-purpose platform that provides data analysis, hybrid visualization and complex interactions at the same time. The software is available on demand for free at france@exelisvis.com.

  15. Integrated Multibeam and LIDAR Bathymetry Data Offshore of New London and Niantic, Connecticut

    USGS Publications Warehouse

    Poppe, L.J.; Danforth, W.W.; McMullen, K.Y.; Parker, Castle E.; Lewit, P.G.; Doran, E.F.

    2010-01-01

    Nearshore areas within Long Island Sound are of great interest to the Connecticut and New York research and resource management communities because of their ecological, recreational, and commercial importance. Although advances in multibeam echosounder technology permit the construction of high-resolution representations of sea-floor topography in deeper waters, limitations inherent in collecting fixed-angle multibeam data make using this technology in shallower waters (less than 10 meters deep) difficult and expensive. These limitations have often resulted in data gaps between areas for which multibeam bathymetric datasets are available and the adjacent shoreline. To address this problem, the geospatial data sets released in this report seamlessly integrate complete-coverage multibeam bathymetric data acquired off New London and Niantic Bay, Connecticut, with hydrographic Light Detection and Ranging (LIDAR) data acquired along the nearshore. The result is a more continuous sea floor representation and a much smaller gap between the digital bathymetric data and the shoreline than previously available. These data sets are provided online and on CD-ROM in Environmental Systems Research Institute (ESRI) raster-grid and GeoTIFF formats in order to facilitate access, compatibility, and utility.

  16. Vector Data Model: A New Model of HDF-EOS to Support GIS Applications in EOS

    NASA Astrophysics Data System (ADS)

    Chi, E.; Edmonds, R. D.

    2001-05-01

    NASA's Earth Science Data Information System (ESDIS) project has an active program of research and development of systems for the storage and management of Earth science data for the Earth Observation System (EOS) mission, a key program of the NASA Earth Science Enterprise. EOS has adopted an extension of the Hierarchical Data Format (HDF) as the format of choice for standard product distribution. Three new EOS-specific datatypes - point, swath and grid - have been defined within the HDF framework. The enhanced data format is named HDF-EOS. Geographic Information Systems (GIS) are used by Earth scientists in EOS data product generation, visualization, and analysis. There are two major data types in GIS applications, raster and vector. The current HDF-EOS handles only the raster type, in the swath data model. The vector data model is identified and developed as a new HDF-EOS format to meet the requirements of scientists working with EOS data products in vector format. The vector model is designed using a topological data structure, which defines the spatial relationships among points, lines, and polygons. The three major topological concepts that the vector model adopts are: a) lines connect to each other at nodes (connectivity), b) lines that connect to surround an area define a polygon (area definition), and c) lines have direction and left and right sides (contiguity). The vector model is implemented in HDF by mapping the conceptual model to HDF internal data models and structures, viz. Vdata, Vgroup, and their associated attribute structures. The point, line, and polygon geometry and attribute data are stored in similar tables. Further, the vector model utilizes the structure and product metadata, which characterize the HDF-EOS. Both types of metadata are stored as attributes in HDF-EOS files, and are encoded in text format by using Object Description Language (ODL) and stored as global attributes in HDF-EOS files. EOS has developed a series of routines for storing
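The three topological concepts (connectivity, area definition, contiguity) can be sketched with a minimal arc-node structure. The `Arc` record and helper below are illustrative, not the HDF-EOS Vdata layout.

```python
from dataclasses import dataclass

@dataclass
class Arc:
    """A directed line in a topological vector model: connectivity via its
    end nodes, contiguity via the polygon IDs on its left and right sides."""
    arc_id: int
    from_node: int
    to_node: int
    left_poly: int    # 0 denotes the outside ("universe") polygon
    right_poly: int

def polygon_boundary(arcs, poly_id):
    """Area definition: a polygon is the set of arcs carrying its ID on
    either side."""
    return [a.arc_id for a in arcs if poly_id in (a.left_poly, a.right_poly)]

# A square polygon 1 bounded by four arcs, surrounded by universe polygon 0
arcs = [Arc(1, 1, 2, 0, 1), Arc(2, 2, 3, 0, 1),
        Arc(3, 3, 4, 0, 1), Arc(4, 4, 1, 0, 1)]
print(polygon_boundary(arcs, 1))   # [1, 2, 3, 4]
```

Storing geometry this way means shared boundaries are stored once, and adjacency queries (which polygon borders which) come directly from the left/right IDs rather than from geometric tests.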

  17. The Challenge of Handling Big Data Sets in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Autermann, Christian; Stasch, Christoph; Jirka, Simon

    2016-04-01

    More and more Sensor Web components are deployed in different domains such as hydrology, oceanography or air quality in order to make observation data accessible via the Web. However, besides variability of data formats and protocols in environmental applications, the fast growing volume of data with high temporal and spatial resolution is imposing new challenges for Sensor Web technologies when sharing observation data and metadata about sensors. Variability, volume and velocity are the core issues that are addressed by Big Data concepts and technologies. Most solutions in the geospatial sector focus on remote sensing and raster data, whereas big in-situ observation data sets relying on vector features require novel approaches. Hence, in order to deal with big data sets in infrastructures for observational data, the following questions need to be answered: 1. How can big heterogeneous spatio-temporal datasets be organized, managed, and provided to Sensor Web applications? 2. How can views on big data sets and derived information products be made accessible in the Sensor Web? 3. How can big observation data sets be processed efficiently? We illustrate these challenges with examples from the marine domain and outline how we address these challenges. We therefore show how big data approaches from mainstream IT can be re-used and applied to Sensor Web application scenarios.

  18. Discovering new methods of data fusion, visualization, and analysis in 3D immersive environments for hyperspectral and laser altimetry data

    NASA Astrophysics Data System (ADS)

    Moore, C. A.; Gertman, V.; Olsoy, P.; Mitchell, J.; Glenn, N. F.; Joshi, A.; Norpchen, D.; Shrestha, R.; Pernice, M.; Spaete, L.; Grover, S.; Whiting, E.; Lee, R.

    2011-12-01

    Immersive virtual reality environments such as the IQ-Station or CAVE (Cave Automatic Virtual Environment) offer new and exciting ways to visualize and explore scientific data and are powerful research and educational tools. Combining remote sensing data from a range of sensor platforms in immersive 3D environments can enhance the spectral, textural, spatial, and temporal attributes of the data, which enables scientists to interact with and analyze the data in ways never before possible. Visualization and analysis of large remote sensing datasets in immersive environments require software customization for integrating LiDAR point cloud data with hyperspectral raster imagery, the generation of quantitative tools for multidimensional analysis, and the development of methods to capture 3D visualizations for stereographic playback. This study uses hyperspectral and LiDAR data acquired over the China Hat geologic study area near Soda Springs, Idaho, USA. The data are fused into a 3D image cube for interactive data exploration, and several methods of recording and playback are investigated, including: 1) creating and implementing a Virtual Reality User Interface (VRUI) patch configuration file to enable recording and playback of VRUI interactive sessions within the CAVE and 2) using the LiDAR and hyperspectral remote sensing data and GIS data to create an ArcScene 3D animated flyover, where left- and right-eye visuals are captured from two independent monitors for playback in a stereoscopic player. These visualizations can be used as outreach tools to demonstrate how integrated data and geotechnology techniques can help scientists see, explore, and more adequately comprehend scientific phenomena, both real and abstract.
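    The core fusion step named above, integrating a LiDAR point cloud with a hyperspectral raster, amounts to attaching to each point the spectrum of the pixel it falls in. A minimal numpy sketch on synthetic data (the raster origin, cell size, cube, and point coordinates below are all made up for illustration):

```python
import numpy as np

# Illustrative fusion step (not the study's code): attach to each LiDAR
# point the spectrum of the hyperspectral pixel containing it.
x0, y0, cell = 500000.0, 4700000.0, 1.0    # hypothetical raster origin/size
cube = np.random.rand(100, 100, 5)          # rows, cols, bands (synthetic)

points = np.array([[500010.3, 4699950.7, 1812.2],   # x, y, elevation
                   [500042.9, 4699901.1, 1815.6]])

cols = ((points[:, 0] - x0) / cell).astype(int)
rows = ((y0 - points[:, 1]) / cell).astype(int)      # y decreases downward
spectra = cube[rows, cols]                           # one spectrum per point

fused = np.hstack([points, spectra])                 # x, y, z, band1..band5
print(fused.shape)  # (2, 8)
```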

  19. Algorithm for Compressing Time-Series Data

    NASA Technical Reports Server (NTRS)

    Hawkins, S. Edward, III; Darlington, Edward Hugo

    2012-01-01

    An algorithm based on Chebyshev polynomials effects lossy compression of time-series data or other one-dimensional data streams (e.g., spectral data) that are arranged in blocks for sequential transmission. The algorithm was developed for use in transmitting data from spacecraft scientific instruments to Earth stations. In spite of its lossy nature, the algorithm preserves the information needed for scientific analysis. The algorithm is computationally simple, yet compresses data streams by factors much greater than two. The algorithm is not restricted to spacecraft or scientific uses: it is applicable to time-series data in general. The algorithm can also be applied to general multidimensional data that have been converted to time-series data, a typical example being image data acquired by raster scanning. However, unlike most prior image-data-compression algorithms, this algorithm neither depends on nor exploits the two-dimensional spatial correlations that are generally present in images. In order to understand the essence of this compression algorithm, it is necessary to understand that the net effect of this algorithm and the associated decompression algorithm is to approximate the original stream of data as a sequence of finite series of Chebyshev polynomials. For the purpose of this algorithm, a block of data or interval of time for which a Chebyshev polynomial series is fitted to the original data is denoted a fitting interval. Chebyshev approximation has two properties that make it particularly effective for compressing serial data streams with minimal loss of scientific information: The errors associated with a Chebyshev approximation are nearly uniformly distributed over the fitting interval (this is known in the art as the "equal error property"); and the maximum deviations of the fitted Chebyshev polynomial from the original data have the smallest possible values (this is known in the art as the "min-max property").
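    The block-wise fitting described above can be sketched with numpy's Chebyshev routines. This is an illustration of the idea (one synthetic fitting interval, an assumed degree), not the flight implementation:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Sketch of the compression idea: fit a low-order Chebyshev series to one
# block of time-series data and keep only the coefficients.
t = np.linspace(-1.0, 1.0, 256)            # one fitting interval, rescaled
block = np.sin(3 * t) + 0.1 * t**2         # synthetic time-series block

degree = 7                                  # 256 samples -> 8 coefficients
coeffs = C.chebfit(t, block, degree)        # "compressed" representation

reconstructed = C.chebval(t, coeffs)        # decompression
max_err = np.max(np.abs(block - reconstructed))
ratio = block.size / coeffs.size            # 32x for this block
print(ratio, max_err)
```

The near-uniform distribution of `block - reconstructed` over the interval reflects the equal-error property mentioned in the abstract.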

  20. GenomeDiagram: a python package for the visualization of large-scale genomic data.

    PubMed

    Pritchard, Leighton; White, Jennifer A; Birch, Paul R J; Toth, Ian K

    2006-03-01

    We present GenomeDiagram, a flexible, open-source Python module for the visualization of large-scale genomic, comparative genomic and other data with reference to a single chromosome or other biological sequence. GenomeDiagram may be used to generate publication-quality vector graphics, rastered images and in-line streamed graphics for webpages. The package integrates with datatypes from the BioPython project, and is available for Windows, Linux and Mac OS X systems. GenomeDiagram is freely available as source code (under the GNU General Public License) at http://bioinf.scri.ac.uk/lp/programs.html, and requires Python 2.3 or higher, and recent versions of the ReportLab and BioPython packages. A user manual, example code and images are available at http://bioinf.scri.ac.uk/lp/programs.html.

  1. Physics Mining of Multi-Source Data Sets

    NASA Technical Reports Server (NTRS)

    Helly, John; Karimabadi, Homa; Sipes, Tamara

    2012-01-01

    Powerful new parallel data mining algorithms can produce diagnostic and prognostic numerical models and analyses from observational data. These techniques yield higher-resolution measures than ever before of environmental parameters by fusing synoptic imagery and time-series measurements. These techniques are general, relevant to all observational data (including raster, vector, and scalar), and can be applied in all Earth- and environmental science domains. Because they can be highly automated and are parallel, they scale to large spatial domains and are well suited to change and gap detection. This makes it possible to analyze spatial and temporal gaps in information, and facilitates within-mission replanning to optimize the allocation of observational resources. The basis of the innovation is the extension of a recently developed set of algorithms packaged into MineTool to multi-variate time-series data. MineTool is unique in that it automates the various steps of the data mining process, thus making it amenable to autonomous analysis of large data sets. Unlike techniques such as Artificial Neural Nets, which yield a black-box solution, MineTool's outcome is always an analytical model in parametric form that expresses the output in terms of the input variables. This has the advantage that the derived equation can then be used to gain insight into the physical relevance and relative importance of the parameters and coefficients in the model. This is referred to as physics-mining of data. The capabilities of MineTool are being extended to include both supervised and unsupervised algorithms, to handle multi-type data sets, and to run in parallel.
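    The notion of an "analytical model in parametric form" can be illustrated with ordinary least squares over candidate terms. This toy sketch is not MineTool itself, and the model form and data are assumed, but the outcome has the same character: an explicit equation relating output to inputs.

```python
import numpy as np

# Toy illustration of physics-mining: recover an analytical, parametric
# model y = a*x1 + b*x2^2 + c from noisy synthetic observations.
rng = np.random.default_rng(0)
x1, x2 = rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200)
y = 2.0 * x1 + 0.5 * x2**2 + 1.0 + rng.normal(0, 0.01, 200)

# Design matrix of candidate terms; the fitted coefficients ARE the model,
# so each one can be inspected for physical relevance.
A = np.column_stack([x1, x2**2, np.ones_like(x1)])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
a, b, c = coeffs
print(a, b, c)  # close to 2.0, 0.5, 1.0
```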

  2. Joint denoising and distortion correction of atomic scale scanning transmission electron microscopy images

    NASA Astrophysics Data System (ADS)

    Berkels, Benjamin; Wirth, Benedikt

    2017-09-01

    Nowadays, modern electron microscopes deliver images at atomic scale. The precise atomic structure encodes information about material properties. Thus, an important ingredient in the image analysis is to locate the centers of the atoms shown in micrographs as precisely as possible. Here, we consider scanning transmission electron microscopy (STEM), which acquires data in a rastering pattern, pixel by pixel. Due to this rastering combined with the magnification to atomic scale, movements of the specimen even at the nanometer scale lead to random image distortions that make precise atom localization difficult. Given a series of STEM images, we derive a Bayesian method that jointly estimates the distortion in each image and reconstructs the underlying atomic grid of the material by fitting the atom bumps with suitable bump functions. The resulting highly non-convex minimization problems are solved numerically with a trust region approach. Existence of minimizers and the model behavior for faster and faster rastering are investigated using variational techniques. The performance of the method is finally evaluated on both synthetic and real experimental data.

  3. Urban area thermal monitoring: Liepaja case study using satellite and aerial thermal data

    NASA Astrophysics Data System (ADS)

    Gulbe, Linda; Caune, Vairis; Korats, Gundars

    2017-12-01

    The aim of this study is to explore large-scale (60 m/pixel) and small-scale (individual building level) temperature distribution patterns from thermal remote sensing data and to conclude what kind of information could be extracted from thermal remote sensing on a regular basis. The Landsat program provides frequent large-scale thermal images useful for analysis of city temperature patterns. During the study, the correlation between temperature patterns and vegetation content (based on NDVI) and building coverage (based on OpenStreetMap data) was studied. Landsat-based temperature patterns were independent of the season, negatively correlated with vegetation content and positively correlated with building coverage. Small-scale analysis included spatial and raster descriptor analysis for polygons corresponding to the roofs of individual buildings, to evaluate roof insulation. Remote sensing and spatial descriptors are poorly related to heat consumption data; however, the median and entropy of thermal aerial data can help to identify poorly insulated roofs. Automated quantitative roof analysis has high potential for acquiring city-wide information about roof insulation, but its quality is limited by reference data quality; information on building types and roof materials would be crucial for further studies.
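    The reported negative temperature-vegetation relationship can be checked with a simple pixel-wise correlation. The rasters below are synthetic stand-ins (shapes and values made up), not the Liepaja data:

```python
import numpy as np

# Illustrative check of a negative temperature-vegetation correlation
# on synthetic co-registered rasters.
rng = np.random.default_rng(1)
ndvi = rng.uniform(0.0, 0.8, (60, 60))                    # vegetation content
temp = 30.0 - 10.0 * ndvi + rng.normal(0, 1.0, (60, 60))  # degrees C

# Pixel-wise Pearson correlation between the two rasters.
r = np.corrcoef(ndvi.ravel(), temp.ravel())[0, 1]
print(round(r, 2))  # strongly negative
```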

  4. Mapping the montane cloud forest of Taiwan using 12 year MODIS-derived ground fog frequency data

    PubMed Central

    Li, Ching-Feng; Thies, Boris; Chang, Shih-Chieh; Bendix, Jörg

    2017-01-01

    Up until now montane cloud forest (MCF) in Taiwan has only been mapped for selected areas of vegetation plots. This paper presents the first comprehensive map of MCF distribution for the entire island. For its creation, a Random Forest model was trained with vegetation plots from the National Vegetation Database of Taiwan that were classified as “MCF” or “non-MCF”. This model predicted the distribution of MCF from a raster data set of parameters derived from a digital elevation model (DEM), Landsat channels and texture measures derived from them as well as ground fog frequency data derived from the Moderate Resolution Imaging Spectroradiometer. While the DEM parameters and Landsat data predicted much of the cloud forest’s location, local deviations in the altitudinal distribution of MCF linked to the monsoonal influence as well as the Massenerhebung effect (causing MCF in atypically low altitudes) were only captured once fog frequency data was included. Therefore, our study suggests that ground fog data are most useful for accurately mapping MCF. PMID:28245279

  5. Surface water data and geographic relation to Tertiary age intrusions and hydrothermal alteration in the Grand Mesa, Uncompahgre, and Gunnison National Forests (GMUG) and intervening Bureau of Land Management (BLM) lands

    USGS Publications Warehouse

    Bove, Dana J.; Knepper, Daniel H.

    2000-01-01

    This data set covering the western part of Colorado includes water quality data from eight different sources (points), nine U.S. Geological Survey Digital Raster Graph (DRG) files for topographic bases, a compilation of Tertiary age intrusions (polygons and lines), and two geotiff files showing areas of hydrothermally altered rock. These data were compiled for use with an ongoing mineral resource assessment of the Grand Mesa, Uncompahgre, and Gunnison National Forests (GMUG) and intervening Bureau of Land Management (BLM) lands. This compilation was assembled to give federal land managers a preliminary view of water within sub-basinal areas, and to show possible relationships to Tertiary age intrusions and areas of hydrothermal alteration.

  6. Defense Mapping Agency (DMA) Raster-to-Vector Analysis

    DTIC Science & Technology

    1984-11-30

    model) to pinpoint critical deficiencies and understand trade-offs between alternative solutions. This may be exemplified by the allocation of human ...process, prone to errors (i.e., human operator eye/motor control limitations), and its time-consuming nature (as a function of data density). It should...achieved through the facilities of computer interactive graphics. Each error or anomaly is individually identified by a human operator and corrected

  7. The Relative Biological Effectiveness for Carbon and Oxygen Ion Beams Using the Raster-Scanning Technique in Hepatocellular Carcinoma Cell Lines

    PubMed Central

    Habermehl, Daniel; Ilicic, Katarina; Dehne, Sarah; Rieken, Stefan; Orschiedt, Lena; Brons, Stephan; Haberer, Thomas; Weber, Klaus-Josef; Debus, Jürgen; Combs, Stephanie E.

    2014-01-01

    Background Aim of this study was to evaluate the relative biological effectiveness (RBE) of carbon (12C) and oxygen ion (16O)-irradiation applied in the raster-scanning technique at the Heidelberg Ion beam Therapy center (HIT) based on clonogenic survival in hepatocellular carcinoma cell lines compared to photon irradiation. Methods Four human HCC lines Hep3B, PLC, HepG2 and HUH7 were irradiated with photons, 12C and 16O using a customized experimental setting at HIT for in-vitro trials. Cells were irradiated with increasing physical photon single doses of 0, 2, 4 and 6 Gy and heavy ionsingle doses of 0, 0.125, 0.5, 1, 2, 3 Gy (12C and 16O). SOBP-penetration depth and extension was 35 mm +/−4 mm and 36 mm +/−5 mm for carbon ions and oxygen ions respectively. Mean energy level and mean linear energy transfer (LET) were 130 MeV/u and 112 keV/um for 12C, and 154 MeV/u and 146 keV/um for 16O. Clonogenic survival was computated and realtive biological effectiveness (RBE) values were defined. Results For all cell lines and both particle modalities α- and β-values were determined. As expected, α-values were significantly higher for 12C and 16O than for photons, reflecting a steeper decline of the initial slope of the survival curves for high-LET beams. RBE-values were in the range of 2.1–3.3 and 1.9–3.1 for 12C and 16O, respectively. Conclusion Both irradiation with 12C and 16O using the rasterscanning technique leads to an enhanced RBE in HCC cell lines. No relevant differences between achieved RBE-values for 12C and 16O were found. Results of this work will further influence biological-adapted treatment planning for HCC patients that will undergo particle therapy with 12C or 16O. PMID:25460352
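    RBE at a given survival level follows from comparing photon and ion doses that yield equal survival in the linear-quadratic model S = exp(-αD - βD²). The sketch below uses assumed α/β values (not the paper's fitted parameters) purely to show the calculation; note the higher ion α giving the steeper initial slope:

```python
import math

# Hedged sketch: RBE at 10% clonogenic survival from the linear-quadratic
# model. The alpha/beta values are illustrative assumptions only.
def dose_at_survival(alpha, beta, survival):
    """Solve alpha*D + beta*D^2 = -ln(S) for the dose D (positive root)."""
    L = -math.log(survival)
    return (-alpha + math.sqrt(alpha * alpha + 4.0 * beta * L)) / (2.0 * beta)

a_photon, b_photon = 0.3, 0.03   # assumed photon LQ parameters
a_ion, b_ion = 0.9, 0.03         # assumed high-LET (steeper initial slope)

d_photon = dose_at_survival(a_photon, b_photon, 0.10)
d_ion = dose_at_survival(a_ion, b_ion, 0.10)
rbe = d_photon / d_ion           # iso-effective photon dose / ion dose
print(round(rbe, 2))
```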

  8. Land use change detection based on multi-date imagery from different satellite sensor systems

    NASA Technical Reports Server (NTRS)

    Stow, Douglas A.; Collins, Doretta; Mckinsey, David

    1990-01-01

    An empirical study is conducted to assess the accuracy of land use change detection using satellite image data acquired ten years apart by sensors with differing spatial resolutions. The primary goals of the investigation were to (1) compare standard change detection methods applied to image data of varying spatial resolution, (2) assess whether to transform the raster grid of the higher resolution image data to that of the lower resolution raster grid or vice versa in the registration process, and (3) determine whether Landsat/Thematic Mapper or SPOT/High Resolution Visible multispectral data provide more accurate detection of land use changes when registered to historical Landsat/MSS data. It is concluded that image ratioing of multisensor, multidate satellite data produced higher change detection accuracies than did principal components analysis, and that it is useful as a land use change enhancement method.
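    Image ratioing, the method the study found most accurate, divides co-registered rasters pixel by pixel; ratios near 1 indicate no change. A minimal numpy sketch on synthetic bands (not the study's data):

```python
import numpy as np

# Minimal image-ratioing change detection on synthetic co-registered bands.
rng = np.random.default_rng(2)
date1 = rng.uniform(50, 200, (80, 80))   # radiance at time 1
date2 = date1.copy()
date2[20:30, 20:30] *= 1.8               # simulated land use change patch

ratio = date2 / date1                    # per-pixel band ratio
changed = np.abs(ratio - 1.0) > 0.5      # simple, assumed threshold
print(changed.sum())  # 100 changed pixels
```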

  9. A VLSI-Based High-Performance Raster Image System.

    DTIC Science & Technology

    1986-05-08

    and data in broadcast form to the array of memory chips in the frame buffer, shown in the bottom block. This is simply a physical structure to hold up...Principal Investigator: John Poulton Collaboration on algorithm development: Prof. Jack Goldfeather (Dept. of Mathematics, Carleton College ...1983) Cheng-Hong Hsieh (MS, Computer Science, May, 1985) Jeff P. Hultquist Susan Spach Undergraduate Research Assistant: Sonya Holder (BS, Physics, May

  10. Sharing Planetary-Scale Data in the Cloud

    NASA Astrophysics Data System (ADS)

    Sundwall, J.; Flasher, J.

    2016-12-01

    On 19 March 2015, Amazon Web Services (AWS) announced Landsat on AWS, an initiative to make data from the U.S. Geological Survey's Landsat satellite program freely available in the cloud. Because of Landsat's global coverage and long history, it has become a reference point for all Earth observation work and is considered the gold standard of natural resource satellite imagery. Within the first year of Landsat on AWS, the service served over a billion requests for Landsat imagery and metadata, globally. Availability of the data in the cloud has led to new product development by companies and startups including Mapbox, Esri, CartoDB, MathWorks, Development Seed, Trimble, Astro Digital, Blue Raster and Timbr.io. The model of staging data for analysis in the cloud established by Landsat on AWS has since been applied to high resolution radar data, European Space Agency satellite imagery, global elevation data and EPA air quality models. This session will provide an overview of lessons learned throughout these projects. It will demonstrate how cloud-based object storage is democratizing access to massive publicly-funded data sets that have previously only been available to people with access to large amounts of storage, bandwidth, and computing power. Technical discussion points will include: the differences between staging data for analysis using object storage versus file storage; using object stores to design simple RESTful APIs through thoughtful file naming conventions, header fields, and HTTP Range Requests; managing costs through data architecture and Amazon S3's "requester pays" feature; building tools that allow users to take their algorithm to the data in the cloud; and using serverless technologies to display dynamic frontends for massive data sets.
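    The HTTP Range Request technique mentioned above lets a client read a byte window of a large scene from object storage without downloading the whole file. A stdlib sketch (the bucket key below is hypothetical; the request is only constructed, not sent):

```python
from urllib import request

# Sketch of an HTTP Range request for a window of a large object stored
# in S3-style object storage. The key/path here is hypothetical.
url = "https://landsat-pds.s3.amazonaws.com/example/scene_B4.TIF"
req = request.Request(url, headers={"Range": "bytes=0-65535"})

# A real client would now call request.urlopen(req) and receive an
# HTTP 206 Partial Content response holding only the requested bytes.
print(req.get_header("Range"))  # bytes=0-65535
```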

  11. An evaluation of automated GIS tools for delineating karst sinkholes and closed depressions from 1-meter LIDAR-derived digital elevation data

    USGS Publications Warehouse

    Doctor, Daniel H.; Young, John A.

    2013-01-01

    LiDAR (Light Detection and Ranging) surveys of karst terrains provide high-resolution digital elevation models (DEMs) that are particularly useful for mapping sinkholes. In this study, we used automated processing tools within ArcGIS (v. 10.0) operating on a 1.0 m resolution LiDAR DEM in order to delineate sinkholes and closed depressions in the Boyce 7.5 minute quadrangle located in the northern Shenandoah Valley of Virginia. The results derived from the use of the automated tools were then compared with depressions manually delineated by a geologist. Manual delineation of closed depressions was conducted using a combination of 1.0 m DEM hillshade, slopeshade, aerial imagery, and Topographic Position Index (TPI) rasters. The most effective means of visualizing depressions in the GIS was using an overlay of the partially transparent TPI raster atop the slopeshade raster at 1.0 m resolution. Manually identified depressions were subsequently checked using aerial imagery to screen for false positives, and targeted ground-truthing was undertaken in the field. The automated tools that were utilized include the routines in ArcHydro Tools (v. 2.0) for prescreening, evaluating, and selecting sinks and depressions as well as thresholding, grouping, and assessing depressions from the TPI raster. Results showed that the automated delineation of sinks and depressions within the ArcHydro tools was highly dependent upon pre-conditioning of the DEM to produce "hydrologically correct" surface flow routes. Using stream vectors obtained from the National Hydrologic Dataset alone to condition the flow routing was not sufficient to produce a suitable drainage network, and numerous artificial depressions were generated where roads, railways, or other manmade structures acted as flow barriers in the elevation model. Additional conditioning of the DEM with drainage paths across these barriers was required prior to automated delineation of sinks and depressions. 
In regions where the DEM
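    The Topographic Position Index used for depression visualization is just the elevation minus the mean of a surrounding neighborhood, so closed depressions show up as negative anomalies. A numpy-only sketch on a tiny synthetic DEM (3x3 neighborhood assumed):

```python
import numpy as np

# Illustrative TPI on a synthetic DEM: elevation minus the 3x3 neighborhood
# mean; sinks/depressions come out negative.
dem = np.full((7, 7), 100.0)
dem[3, 3] = 95.0                     # a small closed depression

padded = np.pad(dem, 1, mode="edge")
# Sum the nine shifted windows to get each cell's 3x3 neighborhood sum.
neigh_sum = sum(padded[r:r+7, c:c+7] for r in range(3) for c in range(3))
tpi = dem - neigh_sum / 9.0

print(tpi[3, 3] < 0)  # True: the sink is a negative anomaly
```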

  12. Using R for analysing spatio-temporal datasets: a satellite-based precipitation case study

    NASA Astrophysics Data System (ADS)

    Zambrano-Bigiarini, Mauricio

    2017-04-01

    Increasing computer power and the availability of remote-sensing data measuring different environmental variables have led to unprecedented opportunities for Earth sciences in recent decades. However, dealing with hundreds or thousands of files, usually in different vector and raster formats and measured with different temporal frequencies, imposes high computational challenges when trying to take full advantage of all the available data. R is a language and environment for statistical computing and graphics which includes several functions for data manipulation, calculation and graphical display, which are particularly well suited for Earth sciences. In this work I describe how R was used to exhaustively evaluate seven state-of-the-art satellite-based rainfall estimate (SRE) products (TMPA 3B42v7, CHIRPSv2, CMORPH, PERSIANN-CDR, PERSIANN-CCS-adj, MSWEPv1.1 and PGFv3) over the complex topography and diverse climatic gradients of Chile. First, built-in functions were used to automatically download the satellite images in different raster formats and spatial resolutions and to clip them to the Chilean spatial extent when necessary. Second, the raster package was used to read, plot, and conduct an exploratory data analysis on selected files of each SRE product, in order to detect unexpected problems (rotated spatial domains, order of variables in NetCDF files, etc.). Third, raster was used along with the hydroTSM package to aggregate SRE files into different temporal scales (daily, monthly, seasonal, annual). Finally, the hydroTSM and hydroGOF packages were used to carry out a point-to-pixel comparison between precipitation time series measured at 366 stations and the corresponding grid cell of each SRE. 
The modified Kling-Gupta index of model performance was used to identify possible sources of systematic errors in each SRE, while five categorical indices (PC, POD, FAR, ETS, fBIAS) were used to assess the ability of each SRE to correctly identify different precipitation intensities
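    The point-to-pixel step pairs each station with its containing grid cell. The study did this in R with hydroTSM/hydroGOF; the Python sketch below is only a language-neutral illustration of the cell lookup, with a made-up grid and station location:

```python
import numpy as np

# Sketch of the point-to-pixel pairing: find the grid cell containing a
# gauge station. Grid geometry and coordinates are illustrative only.
lon0, lat0, res = -76.0, -17.0, 0.25          # grid upper-left corner, cell size
precip = np.random.default_rng(3).gamma(2.0, 5.0, (80, 80))  # mm/day grid

station_lon, station_lat = -71.62, -33.04     # hypothetical gauge location
col = int((station_lon - lon0) / res)
row = int((lat0 - station_lat) / res)         # latitude decreases downward

sre_at_station = precip[row, col]             # SRE value paired with the gauge
print(row, col)
```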

  13. Publishing Platform for Aerial Orthophoto Maps, the Complete Stack

    NASA Astrophysics Data System (ADS)

    Čepický, J.; Čapek, L.

    2016-06-01

    When creating a set of orthophoto maps from mosaic compositions using airborne systems, such as popular drones, we need to publish the results of the work to users. Several steps need to be performed in order to get large scale raster data published. As a first step, the data have to be shared as services (OGC WMS as a view service, OGC WCS as a download service). For some applications, OGC WMTS is handy as well, for faster viewing of the data. Finally, the data have to become part of a web mapping application, so that they can be used and evaluated by non-technical users. In this talk, we would like to present an automated pipeline of those steps, where the user puts in an orthophoto image and, as a result, OGC Open Web Services are published together with a web mapping application containing the data. The web mapping application can be used as a standard presentation platform for such big raster data to a generic user. The publishing platform - the Geosense online map information system - can also be used to combine data from various resources, to create unique map compositions, and as input for better interpretations of the photographed phenomena. The whole process has been successfully tested with an eBee drone, with raster data resolution of 1.5-4 cm/px, in many areas, and the result is also used for creating derived datasets, usually suited for property management - records of roads, pavements, traffic signs, public lighting, sewage systems, grave locations, and others.
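    The "shared as a view service" step means the published layer answers standard OGC WMS requests. A stdlib sketch composing a WMS 1.3.0 GetMap URL (the endpoint, layer name, and bounding box are hypothetical):

```python
from urllib.parse import urlencode

# Hedged sketch: composing an OGC WMS 1.3.0 GetMap request for a published
# orthophoto layer. Endpoint, layer, and BBOX are illustrative.
endpoint = "https://example.org/geoserver/wms"
params = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "LAYERS": "orthophoto", "CRS": "EPSG:3857",
    "BBOX": "1820000,6460000,1821000,6461000",
    "WIDTH": "1024", "HEIGHT": "1024", "FORMAT": "image/png",
}
url = endpoint + "?" + urlencode(params)
print(url)
```

A browser-based map client issues exactly this kind of request tile by tile; WMTS pre-renders the tiles instead.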

  14. Central Satellite Data Repository Supporting Research and Development

    NASA Astrophysics Data System (ADS)

    Han, W.; Brust, J.

    2015-12-01

    Near real-time satellite data is critical to many research and development activities of atmosphere, land, and ocean processes. Acquiring and managing huge volumes of satellite data without (or with less) latency in an organization is always a challenge in the big data age. An organization level data repository is a practical solution to meeting this challenge. The STAR (Center for Satellite Applications and Research of NOAA) Central Data Repository (SCDR) is a scalable, stable, and reliable repository to acquire, manipulate, and disseminate various types of satellite data in an effective and efficient manner. SCDR collects more than 200 data products, which are commonly used by multiple groups in STAR, from NOAA, GOES, Metop, Suomi NPP, Sentinel, Himawari, and other satellites. The processes of acquisition, recording, retrieval, organization, and dissemination are performed in parallel. Multiple data access interfaces, like FTP, FTPS, HTTP, HTTPS, and RESTful, are supported in the SCDR to obtain satellite data from their providers through high speed internet. The original satellite data in various raster formats can be parsed in the respective adapter to retrieve data information. The data information is ingested to the corresponding partitioned tables in the central database. All files are distributed equally on the Network File System (NFS) disks to balance the disk load. SCDR provides consistent interfaces (including Perl utility, portal, and RESTful Web service) to locate files of interest easily and quickly and access them directly by over 200 compute servers via NFS. SCDR greatly improves collection and integration of near real-time satellite data, addresses satellite data requirements of scientists and researchers, and facilitates their primary research and development activities.

  15. Cloud Computing and Its Applications in GIS

    NASA Astrophysics Data System (ADS)

    Kang, Cao

    2011-12-01

    Cloud computing is a novel computing paradigm that offers highly scalable and highly available distributed computing services. The objectives of this research are to: 1. analyze and understand cloud computing and its potential for GIS; 2. discover the feasibility of migrating truly spatial GIS algorithms to distributed computing infrastructures; 3. explore a solution to host and serve large volumes of raster GIS data efficiently and speedily. These objectives thus form the basis for three professional articles. The first article is entitled "Cloud Computing and Its Applications in GIS". This paper introduces the concept, structure, and features of cloud computing. Features of cloud computing such as scalability, parallelization, and high availability make it a very capable computing paradigm. Unlike High Performance Computing (HPC), cloud computing uses inexpensive commodity computers. The uniform administration systems in cloud computing make it easier to use than GRID computing. Potential advantages of cloud-based GIS systems such as a lower barrier to entry are consequently presented. Three cloud-based GIS system architectures are proposed: public cloud-based GIS systems, private cloud-based GIS systems and hybrid cloud-based GIS systems. Public cloud-based GIS systems provide the lowest entry barriers for users among these three architectures, but their advantages are offset by data security and privacy related issues. Private cloud-based GIS systems provide the best data protection, though they have the highest entry barriers. Hybrid cloud-based GIS systems provide a compromise between these extremes. The second article is entitled "A cloud computing algorithm for the calculation of Euclidean distance for raster GIS". Euclidean distance is a truly spatial GIS algorithm. 
Classical algorithms such as the pushbroom and growth ring techniques require computational propagation through the entire raster image, which makes it incompatible with the distributed nature
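    The contrast drawn here can be illustrated with the per-cell formulation of the Euclidean distance raster: unlike pushbroom/growth-ring propagation, each cell's distance can be computed independently, which is what makes it trivially parallel. A brute-force numpy sketch (illustrative, not the article's algorithm):

```python
import numpy as np

# Brute-force Euclidean distance raster: distance from every cell to its
# nearest source cell, computed independently per cell (hence parallel).
rows, cols = 5, 5
sources = np.array([[0, 0], [4, 4]])          # row, col of source cells

rr, cc = np.mgrid[0:rows, 0:cols]
cells = np.stack([rr.ravel(), cc.ravel()], axis=1)

# Distance from every cell to every source; keep the minimum per cell.
diffs = cells[:, None, :] - sources[None, :, :]
dist = np.sqrt((diffs ** 2).sum(axis=2)).min(axis=1).reshape(rows, cols)
print(dist[0, 0], dist[2, 2])  # 0.0 at a source; sqrt(8) at the center
```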

  16. Method and system for progressive mesh storage and reconstruction using wavelet-encoded height fields

    NASA Technical Reports Server (NTRS)

    Baxes, Gregory A. (Inventor); Linger, Timothy C. (Inventor)

    2011-01-01

    Systems and methods are provided for progressive mesh storage and reconstruction using wavelet-encoded height fields. A method for progressive mesh storage includes reading raster height field data, and processing the raster height field data with a discrete wavelet transform to generate wavelet-encoded height fields. In another embodiment, a method for progressive mesh storage includes reading texture map data, and processing the texture map data with a discrete wavelet transform to generate wavelet-encoded texture map fields. A method for reconstructing a progressive mesh from wavelet-encoded height field data includes determining terrain blocks, and a level of detail required for each terrain block, based upon a viewpoint. Triangle strip constructs are generated from vertices of the terrain blocks, and an image is rendered utilizing the triangle strip constructs. Software products that implement these methods are provided.

  17. Method and system for progressive mesh storage and reconstruction using wavelet-encoded height fields

    NASA Technical Reports Server (NTRS)

    Baxes, Gregory A. (Inventor)

    2010-01-01

    Systems and methods are provided for progressive mesh storage and reconstruction using wavelet-encoded height fields. A method for progressive mesh storage includes reading raster height field data, and processing the raster height field data with a discrete wavelet transform to generate wavelet-encoded height fields. In another embodiment, a method for progressive mesh storage includes reading texture map data, and processing the texture map data with a discrete wavelet transform to generate wavelet-encoded texture map fields. A method for reconstructing a progressive mesh from wavelet-encoded height field data includes determining terrain blocks, and a level of detail required for each terrain block, based upon a viewpoint. Triangle strip constructs are generated from vertices of the terrain blocks, and an image is rendered utilizing the triangle strip constructs. Software products that implement these methods are provided.
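    The discrete wavelet transform at the heart of the two patent records above splits a height field into a coarse average band (the lower level of detail) plus detail bands. A minimal one-level 2-D Haar transform in numpy serves as an illustrative stand-in (not the patents' actual codec):

```python
import numpy as np

# Minimal one-level 2-D Haar wavelet transform of a height field:
# LL is the half-resolution terrain; LH/HL/HH hold the detail needed
# to reconstruct the finer level of detail.
def haar2d(h):
    """Split an even-sized height field into average + detail subbands."""
    a = (h[0::2, :] + h[1::2, :]) / 2.0    # row averages
    d = (h[0::2, :] - h[1::2, :]) / 2.0    # row details
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0   # column pass on averages
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

height = np.arange(16.0).reshape(4, 4)      # synthetic 4x4 height field
ll, lh, hl, hh = haar2d(height)
print(ll)   # coarse terrain at half resolution per axis
```

Applying `haar2d` recursively to `ll` yields the progressive hierarchy used for view-dependent level of detail.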

  18. Developing an Application to Increase the Accessibility of Planetary Geologic Maps

    NASA Astrophysics Data System (ADS)

    Jacobsen, R. E.; Fay, C.

    2018-06-01

    USGS planetary geologic maps are widely used digital products with text, raster, vector, and temporal data, within a highly standardized design. This tool will augment the user experience by improving accessibility among the various forms of data.

  19. Integrating Geo-Spatial Data for Regional Landslide Susceptibility Modeling in Consideration of Run-Out Signature

    NASA Astrophysics Data System (ADS)

    Lai, J.-S.; Tsai, F.; Chiang, S.-H.

    2016-06-01

    This study implements a data mining-based algorithm, the random forests classifier, with geo-spatial data to construct a regional, rainfall-induced landslide susceptibility model. The developed model also takes into account landslide regions (source, non-occurrence and run-out signatures) from the original landslide inventory in order to increase the reliability of the susceptibility modelling. A total of ten causative factors were collected and used in this study, including aspect, curvature, elevation, slope, faults, geology, NDVI (Normalized Difference Vegetation Index), rivers, roads and soil data. Consequently, this study transforms the landslide inventory and vector-based causative factors into a pixel-based format in order to overlay them with other raster data for constructing the random forests based model. This study also uses original and edited topographic data in the analysis to understand their impact on the susceptibility modelling. Experimental results demonstrate that after identifying the run-out signatures, the overall accuracy and Kappa coefficient reached more than 85 % and 0.8, respectively. In addition, correcting unreasonable topographic features of the digital terrain model also produced more reliable modelling results.
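    As a hedged sketch of the pixel-based workflow described above (stacking causative-factor rasters and training a random forests classifier on labelled pixels), with synthetic stand-ins for the study's ten factors and landslide inventory:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for causative-factor rasters (e.g. slope,
# elevation, NDVI), all on the same 100x100 pixel grid.
slope = rng.uniform(0, 60, (100, 100))
elevation = rng.uniform(0, 3000, (100, 100))
ndvi = rng.uniform(-1, 1, (100, 100))

# Synthetic inventory labels: 1 = landslide pixel, 0 = non-occurrence.
labels = (slope + rng.normal(0, 5, slope.shape) > 40).astype(int)

# Flatten the rasters so each pixel becomes one training sample.
X = np.column_stack([slope.ravel(), elevation.ravel(), ndvi.ravel()])
y = labels.ravel()

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# Susceptibility map: per-pixel probability of the landslide class,
# reshaped back into raster form.
susceptibility = clf.predict_proba(X)[:, 1].reshape(slope.shape)
```

    In practice the labelled regions would come from the edited inventory, and the model would be validated on held-out pixels rather than the training raster itself.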

  20. Experimental Advanced Airborne Research Lidar (EAARL) Data Processing Manual

    USGS Publications Warehouse

    Bonisteel, Jamie M.; Nayegandhi, Amar; Wright, C. Wayne; Brock, John C.; Nagle, David

    2009-01-01

    The Experimental Advanced Airborne Research Lidar (EAARL) is an example of a Light Detection and Ranging (Lidar) system that utilizes a blue-green wavelength (532 nanometers) to determine the distance to an object. The distance is determined by recording the travel time of a transmitted pulse at the speed of light (fig. 1). This system uses raster laser scanning with full-waveform (multi-peak) resolving capabilities to measure submerged topography and adjacent coastal land elevations simultaneously (Nayegandhi and others, 2009). This document reviews procedures for the post-processing of EAARL data using the custom-built Airborne Lidar Processing System (ALPS). ALPS software was developed in an open-source programming environment operated on a Linux platform. It has the ability to combine the laser return backscatter digitized at 1-nanosecond intervals with aircraft positioning information. This solution enables the exploration and processing of the EAARL data in an interactive or batch mode. ALPS also includes modules for the creation of bare earth, canopy-top, and submerged topography Digital Elevation Models (DEMs). The EAARL system uses an Earth-centered coordinate and reference system that removes the necessity to reference submerged topography data relative to water level or tide gages (Nayegandhi and others, 2006). The EAARL system can be mounted in an array of small twin-engine aircraft that operate at 300 meters above ground level (AGL) at a speed of 60 meters per second (117 knots). While other systems strive to maximize operational depth limits, EAARL has a narrow transmit beam and receiver field of view (1.5 to 2 milliradians), which improves the depth-measurement accuracy in shallow, clear water but limits the maximum depth to about 1.5 Secchi disk depth (~20 meters) in clear water. 
The laser transmitter [Continuum EPO-5000 yttrium aluminum garnet (YAG)] produces up to 5,000 short-duration (1.2 nanosecond), low-power (70 microjoules) pulses each second.

  1. A framework for standardized calculation of weather indices in Germany

    NASA Astrophysics Data System (ADS)

    Möller, Markus; Doms, Juliane; Gerstmann, Henning; Feike, Til

    2018-05-01

    Climate change has been recognized as a main driver of the increasing occurrence of extreme weather. Weather indices (WIs) are used to assess extreme weather conditions regarding their impact on crop yields. Designing WIs is challenging, since complex and dynamic crop-climate relationships have to be considered. As a consequence, geodata for WI calculations have to represent both the spatio-temporal dynamics of crop development and the corresponding weather conditions. In this study, we introduce a WI design framework for Germany, which is based on public and open raster data of long-term spatio-temporal availability. The operational process chain enables the dynamic and automatic definition of relevant phenological phases for the main cultivated crops in Germany. Within the temporal bounds, WIs can be calculated for any year and test site in Germany in a reproducible and transparent manner. The workflow is demonstrated using the example of a simple cumulative rainfall index for the phenological phase shooting of winter wheat, using 16 test sites and the period between 1994 and 2014. Compared to station-based approaches, the major advantage of our approach is the possibility to design spatial WIs based on raster data characterized by accuracy metrics. Raster data and WIs which fulfill data quality standards can contribute to increased acceptance and farmers' trust in WI products for crop yield modeling or weather index-based insurances (WIIs).
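    The demonstration index mentioned above, a cumulative rainfall index over a phenological phase, reduces to a windowed sum; the rainfall values and phase bounds below are hypothetical:

```python
import numpy as np

# Hypothetical daily rainfall (mm) for one test site and one year.
rng = np.random.default_rng(1)
daily_rain = rng.gamma(shape=0.5, scale=4.0, size=365)

# Phenological bounds for winter-wheat shooting, as day-of-year
# indices (assumed values; the framework derives them dynamically
# from phenological observations).
start_doy, end_doy = 110, 160

# Cumulative rainfall index over the phase window.
cri = daily_rain[start_doy - 1:end_doy].sum()
```

    The framework's contribution is not the sum itself but deriving `start_doy`/`end_doy` per crop, year and site from raster data in a reproducible way.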

  2. Utilizing soil polypedons to improve model performance for digital soil mapping

    USDA-ARS?s Scientific Manuscript database

    Most digital soil mapping approaches that use point data to develop relationships with covariate data intersect sample locations with one raster pixel regardless of pixel size. Resulting models are subject to spurious values in covariate data which may limit model performance. An alternative approac...

  3. VizieR Online Data Catalog: NGC 1893 optical and NIR photometry (Prisinzano+, 2011)

    NASA Astrophysics Data System (ADS)

    Prisinzano, L.; Sanz-Forcada, J.; Micela, G.; Caramazza, M.; Guarcello, M. G.; Sciortino, S.; Testi, L.

    2010-10-01

    We present new optical and NIR photometric data in the VRIJHK and H-α bands for the cluster NGC 1893. The optical photometry was obtained using images acquired in service mode with two different telescopes: the Device Optimized for the LOw RESolution (DOLORES), mounted on the Telescopio Nazionale Galileo (TNG), during three nights in 2007, and the Calar Alto Faint Object Spectrograph (CAFOS), mounted on the 2.2m telescope at the Calar Alto German-Spanish Observatory (Spain), during three nights in 2007 and 2008. NIR observations were acquired in service mode at the TNG, using the large field Near Infrared Camera Spectrometer (NICS) with the Js(1.25um), H(1.63um) and K'(2.12um) filters during eight nights in 2007 and 2008. We observed a field around NGC 1893 with a raster of 4x4 pointings; at each pointing we obtained a series of NINT dithered exposures. Each exposure is a repetition of a DIT (Detector Integration Time) times NDIT (number of DITs), to avoid saturation of the background. (4 data files).

  4. Comparison of MSS and TM Data for Landcover Classification in the Chesapeake Bay Area: a Preliminary Report. [Taylor's Island, Maryland

    NASA Technical Reports Server (NTRS)

    Mulligan, P. J.; Gervin, J. C.; Lu, Y. C.

    1985-01-01

    An area bordering the Eastern Shore of the Chesapeake Bay was selected for study and classified using unsupervised techniques applied to LANDSAT-2 MSS data and several band combinations of LANDSAT-4 TM data. The accuracies of these Level I land cover classifications were verified using the Taylor's Island USGS 7.5 minute topographic map, which was photointerpreted, digitized and rasterized. For the Taylor's Island map, comparing the MSS and TM three-band (2 3 4) classifications, the increased resolution of TM produced a small improvement in overall accuracy of 1%, due primarily to small improvements, of 1% and 3%, in areas such as water and woodland. This was expected as the MSS data typically produce high accuracies for categories which cover large contiguous areas. However, in the categories covering smaller areas within the map there was generally an improvement of at least 10%. Classification of the important residential category improved 12%, and wetlands were mapped with 11% greater accuracy.

  5. Energize New Mexico - Integration of Diverse Energy-Related Research Data into an Interoperable Geospatial Infrastructure and National Data Repositories

    NASA Astrophysics Data System (ADS)

    Hudspeth, W. B.; Barrett, H.; Diller, S.; Valentin, G.

    2016-12-01

    Energize is New Mexico's Experimental Program to Stimulate Competitive Research (NM EPSCoR), funded by the NSF with a focus on building capacity to conduct scientific research. Energize New Mexico leverages the work of faculty and students from NM universities and colleges to provide the tools necessary for a quantitative, science-driven discussion of the state's water policy options and to realize New Mexico's potential for sustainable energy development. This presentation discusses the architectural details of NM EPSCoR's collaborative data management system, GSToRE, and how New Mexico researchers use it to share and analyze diverse research data, with the goal of attaining sustainable energy development in the state. The Earth Data Analysis Center (EDAC) at The University of New Mexico leads the development of computational interoperability capacity that allows the wide use and sharing of energy-related data among NM EPSCoR researchers. Data from a variety of research disciplines are stored and maintained in EDAC's Geographic Storage, Transformation and Retrieval Engine (GSToRE), a distributed platform for large-scale vector and raster data discovery, subsetting, and delivery via Web services that are based on Open Geospatial Consortium (OGC) and REST Web-service standards. Researchers upload and register scientific datasets using a front-end client that collects the critical metadata. In addition, researchers have the option to register their datasets with DataONE, a national, community-driven project that provides access to data across multiple member repositories. The GSToRE platform maintains a searchable, core collection of metadata elements that can be used to deliver metadata in multiple formats, including ISO 19115-2/19139 and FGDC CSDGM. Stored metadata elements also permit the platform to automate the registration of Energize datasets into DataONE, once the datasets are approved for release to the public.

  6. 3D Building Façade Reconstruction Using Handheld Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Sadeghi, F.; Arefi, H.; Fallah, A.; Hahn, M.

    2015-12-01

    Three-dimensional (3D) building modelling has been an interesting topic of research for decades, and photogrammetric methods seem to provide the only economic means to acquire truly 3D city data. With the enormous development of 3D building reconstruction and its applications, such as navigation systems, location-based services and urban planning, the need to consider semantic features (such as windows and doors) has become more essential than ever, and a 3D model of buildings as simple blocks is no longer sufficient. To reconstruct the façade elements completely, we employed high-density point cloud data obtained from a handheld laser scanner. The advantage of a handheld laser scanner, with its capability of directly acquiring very dense 3D point clouds, is that there is no need to derive three-dimensional data from multiple images using structure-from-motion techniques. This paper presents a grammar-based algorithm for façade reconstruction using handheld laser scanner data. The proposed method is a combination of bottom-up (data-driven) and top-down (model-driven) methods, in which the basic façade elements are first extracted in a bottom-up way and then serve as prior knowledge for further processing to complete the models, especially in occluded and incomplete areas. The first step of data-driven modelling is using a conditional RANSAC (RANdom SAmple Consensus) algorithm to detect the façade plane in the point cloud data and remove noisy objects like trees, pedestrians, traffic signs and poles. Then, the façade planes are divided into three depth layers to detect protrusion, indentation and wall points using a density histogram. Due to the inappropriate reflection of laser beams from glass, the windows appear as holes in the point cloud data and can therefore be distinguished and extracted more easily than the other façade elements. The next step is rasterizing the indentation layer that holds the windows

  7. A comparison of the accuracy of pixel based and object based classifications of integrated optical and LiDAR data

    NASA Astrophysics Data System (ADS)

    Gajda, Agnieszka; Wójtowicz-Nowakowska, Anna

    2013-04-01

    Land cover maps are generally produced on the basis of high resolution imagery. Recently, LiDAR (Light Detection and Ranging) data have been brought into use in diverse applications including land cover mapping. In this study we attempted to assess the accuracy of land cover classification using both high resolution aerial imagery and LiDAR data (airborne laser scanning, ALS), testing two classification approaches: a pixel-based classification and object-oriented image analysis (OBIA). The study was conducted on three test areas (3 km2 each) in the administrative area of Kraków, Poland, along the course of the Vistula River. They represent three different dominating land cover types of the Vistula River valley. Test site 1 had semi-natural vegetation, with riparian forests and shrubs, test site 2 represented a densely built-up area, and test site 3 was an industrial site. Point clouds from ALS and orthophotomaps were both captured in November 2007. Point cloud density was on average 16 pt/m2 and contained additional information about intensity and encoded RGB values. Orthophotomaps had a spatial resolution of 10 cm. From the point clouds two raster maps were generated: (1) intensity and (2) normalised Digital Surface Model (nDSM), both with a spatial resolution of 50 cm. To classify the aerial data, a supervised classification approach was selected. Pixel-based classification was carried out in ERDAS Imagine software. Orthophotomaps and the intensity and nDSM rasters were used in the classification. 15 homogeneous training areas representing each cover class were chosen. Classified pixels were clumped to avoid the salt-and-pepper effect. Object-oriented image classification was carried out in eCognition software, which implements both the optical and ALS data. Elevation layers (intensity, first/last reflection, etc.) were used at the segmentation stage due to

  8. IDATEN and G-SITENNO: GUI-assisted software for coherent X-ray diffraction imaging experiments and data analyses at SACLA.

    PubMed

    Sekiguchi, Yuki; Yamamoto, Masaki; Oroguchi, Tomotaka; Takayama, Yuki; Suzuki, Shigeyuki; Nakasako, Masayoshi

    2014-11-01

    Using our custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors, cryogenic coherent X-ray diffraction imaging experiments have been undertaken at the SPring-8 Angstrom Compact free electron LAser (SACLA) facility. To efficiently perform experiments and data processing, two software suites with user-friendly graphical user interfaces have been developed. The first is a program suite named IDATEN, which was developed to easily conduct four procedures during experiments: aligning KOTOBUKI-1, loading a flash-cooled sample into the cryogenic goniometer stage inside the vacuum chamber of KOTOBUKI-1, adjusting the sample position with respect to the X-ray beam using a pair of telescopes, and collecting diffraction data by raster scanning the sample with X-ray pulses. Named G-SITENNO, the other suite is an automated version of the original SITENNO suite, which was designed for processing diffraction data. These user-friendly software suites are now indispensable for collecting a large number of diffraction patterns and for processing the diffraction patterns immediately after collecting data within a limited beam time.

  9. Fractal Characterization of Multitemporal Scaled Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Lam, Nina Siu-Ngan; Qiu, Hong-lie

    1998-01-01

    Scale is an "innate" concept in geographic information systems. It is recognized as something that is intrinsic to the ingestion, storage, manipulation, analysis, modeling, and output of space and time data within a GIS purview, yet the relative meaning and ramifications of scaling spatial and temporal data from this perspective remain enigmatic. As GISs become more sophisticated as a product of more robust software and more powerful computer systems, there is an urgent need to examine the issue of scale, and its relationship to the whole body of spatiotemporal data, as imparted in GISs. Scale is fundamental to the characterization of geo-spatial data as represented in GISs, but we have relatively little insight on the effects of, or how to measure the effects of, scale in representing multiscaled data; i.e., data that are acquired in different formats (e.g., map, digital) and exist in varying spatial, temporal, and, in the case of remote sensing data, radiometric configurations. This is particularly true in the emerging era of Integrated GISs (IGISs), wherein spatial data in a variety of formats (e.g., raster, vector) are combined with multiscaled remote sensing data to perform highly sophisticated space-time data analyses and modeling. Moreover, the complexities associated with the integration of multiscaled data sets in a multitude of formats are exacerbated by the confusion over what the term "scale" means from a multidisciplinary perspective; i.e., "scale" takes on significantly different meanings depending upon one's disciplinary background and spatial perspective, which can lead to substantive confusion in the input, manipulation, analysis, and output of IGISs (Quattrochi, 1993). Hence, we must begin to look at the universality of scale and to develop the theory, methods, and techniques necessary to advance knowledge on the "Science of Scale" across the wide number of spatial disciplines that use GISs.

  10. Google Earth elevation data extraction and accuracy assessment for transportation applications.

    PubMed

    Wang, Yinsong; Zou, Yajie; Henrickson, Kristian; Wang, Yinhai; Tang, Jinjun; Park, Byung-Jung

    2017-01-01

    Roadway elevation data are critical for a variety of transportation analyses. However, such data have been challenging to obtain, and most roadway GIS databases do not include them. This paper intends to address this need by proposing a method to extract roadway elevation data from Google Earth (GE) for transportation applications. A comprehensive accuracy assessment of the GE-extracted elevation data is conducted for the area of the conterminous USA. The GE elevation data were compared with ground truth data from nationwide GPS benchmarks and roadway monuments from six states in the conterminous USA. This study also compares the GE elevation data with the elevation raster data from the U.S. Geological Survey National Elevation Dataset (USGS NED), which is a widely used data source for extracting roadway elevation. Mean absolute error (MAE) and root mean squared error (RMSE) are used to assess the accuracy, and the test results show that the MAE, RMSE, and standard deviation of the GE roadway elevation error are 1.32 meters, 2.27 meters and 2.27 meters, respectively. Finally, the proposed extraction method was implemented and validated for the following three scenarios: (1) extracting roadway elevation differentiated by direction, (2) multi-layered roadway recognition in freeway segments and (3) slope segmentation and grade calculation in freeway segments. The methodology validation results indicate that the proposed extraction method can locate the extracted route accurately, recognize multi-layered roadway sections, and segment the extracted route by grade automatically. Overall, it is found that the high accuracy elevation data available from GE provide a reliable data source for various transportation applications.
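    The reported MAE and RMSE are the standard error metrics; a minimal computation against hypothetical benchmark elevations:

```python
import numpy as np

# Hypothetical elevations (meters): GE-extracted vs. GPS benchmark.
ge = np.array([102.1, 250.4, 33.7, 540.0])
benchmark = np.array([101.0, 252.0, 35.0, 538.5])

error = ge - benchmark                 # signed errors
mae = np.abs(error).mean()             # mean absolute error
rmse = np.sqrt((error ** 2).mean())    # root mean squared error
```

    With these four points the MAE works out to 1.375 m; RMSE is always at least as large as MAE, and the gap grows with the spread of the errors.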

  11. A Coastal Hazards Data Base for the U.S. East Coast (1992) (NDP-043a)

    DOE Data Explorer

    Gornitz, Vivien M. [NASA Goddard Inst. for Space Studies (GISS), New York, NY (United States); White, Tammy W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    1992-01-01

    This NDP presents data on coastal geology, geomorphology, elevation, erosion, wave heights, tide ranges, and sea levels for the U.S. east coast. These data may be used either by nongeographic database management systems or by raster or vector geographic information systems (GISs). The database integrates several data sets (originally obtained as point, line, and polygon data) for the east coast into 0.25°-latitude by 0.25°-longitude grid cells. Each coastal grid cell contains 28 data variables. This NDP may be used to predict the response of coastal zones on the U.S. east coast to changes in local or global sea levels. Information on the geologic, geomorphic, and erosional states of the coast provides the basic data needed to predict the behavior of the coastal zone into the far future. Thus, these data may be seen as providing a baseline for the calculation of the relative vulnerability of the east coast to projected sea-level rises. These data will also be useful to research, educational, governmental, and private organizations interested in the present and future vulnerability of coastal areas to erosion and inundation. The data are in 13 files, the largest of which is 1.42 MB; the entire database takes up 3.29 MB, excluding the ARC/INFO™ files.
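    Integrating point data into 0.25°-latitude by 0.25°-longitude grid cells, as this database does, amounts to binning coordinates; a sketch with hypothetical coastal observations (the variable names are illustrative):

```python
import numpy as np

# Hypothetical coastal point observations: (lat, lon, wave_height_m).
points = np.array([
    [35.10, -75.40, 1.2],
    [35.15, -75.45, 1.6],
    [36.80, -75.90, 0.9],
])

# Assign each point to a 0.25-degree grid cell by flooring its
# coordinates to the cell's southwest corner.
cell_lat = np.floor(points[:, 0] / 0.25) * 0.25
cell_lon = np.floor(points[:, 1] / 0.25) * 0.25

# Average the variable within each cell (the first two points
# happen to fall into the same cell here).
cells = {}
for la, lo, v in zip(cell_lat, cell_lon, points[:, 2]):
    cells.setdefault((la, lo), []).append(v)
gridded = {c: sum(v) / len(v) for c, v in cells.items()}
```

    The actual database carries 28 variables per cell and mixes point, line, and polygon sources, but the cell-assignment idea is the same.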

  12. Lossless compression techniques for maskless lithography data

    NASA Astrophysics Data System (ADS)

    Dai, Vito; Zakhor, Avideh

    2002-07-01

    Future lithography systems must produce more dense chips with smaller feature sizes, while maintaining the throughput of one wafer per sixty seconds per layer achieved by today's optical lithography systems. To achieve this throughput with a direct-write maskless lithography system, using 25 nm pixels for 50 nm feature sizes, requires data rates of about 10 Tb/s. In a previous paper, we presented an architecture which achieves this data rate contingent on consistent 25 to 1 compression of lithography data, and on implementation of a decoder-writer chip with a real-time decompressor fabricated on the same chip as the massively parallel array of lithography writers. In this paper, we examine the compression efficiency of a spectrum of techniques suitable for lithography data, including two industry standards JBIG and JPEG-LS, a wavelet based technique SPIHT, general file compression techniques ZIP and BZIP2, our own 2D-LZ technique, and a simple list-of-rectangles representation RECT. Layouts rasterized both to black-and-white pixels, and to 32 level gray pixels are considered. Based on compression efficiency, JBIG, ZIP, 2D-LZ, and BZIP2 are found to be strong candidates for application to maskless lithography data, in many cases far exceeding the required compression ratio of 25. To demonstrate the feasibility of implementing the decoder-writer chip, we consider the design of a hardware decoder based on ZIP, the simplest of the four candidate techniques. The basic algorithm behind ZIP compression is Lempel-Ziv 1977 (LZ77), and the design parameters of LZ77 decompression are optimized to minimize circuit usage while maintaining compression efficiency.
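    Two of the candidate techniques, ZIP's DEFLATE (LZ77 plus Huffman coding) and BZIP2, are available as standard Python codecs, so the compression-ratio comparison can be sketched on a toy black-and-white rasterized layout (not the paper's test data):

```python
import bz2
import zlib

import numpy as np

# Toy black-and-white "layout": a periodic line-and-space pattern,
# the kind of regular geometry that compresses far beyond 25:1.
layout = np.zeros((512, 512), dtype=np.uint8)
layout[:, ::4] = 1  # vertical lines every 4 pixels
raw = np.packbits(layout).tobytes()  # 1 bit per pixel

for name, codec in [("DEFLATE (zip)", zlib.compress),
                    ("BZIP2", bz2.compress)]:
    ratio = len(raw) / len(codec(raw))
    print(f"{name}: {ratio:.0f}:1")
```

    Real layouts are far less regular than this pattern, which is why the paper benchmarks a spectrum of techniques on rasterized production data rather than relying on best-case ratios.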

  13. Moderate-resolution sea surface temperature data for the nearshore North Pacific

    USGS Publications Warehouse

    Payne, Meredith C.; Reusser, Deborah A.; Lee, Henry; Brown, Cheryl A.

    2011-01-01

    Coastal sea surface temperature (SST) is an important environmental characteristic in determining the suitability of habitat for nearshore marine and estuarine organisms. This publication describes and provides access to an easy-to-use coastal SST dataset for ecologists, biogeographers, oceanographers, and other scientists conducting research on nearshore marine habitats or processes. The data cover the Temperate Northern Pacific Ocean as defined by the 'Marine Ecosystems of the World' (MEOW) biogeographic schema developed by The Nature Conservancy. The spatial resolution of the SST data is 4-km grid cells within 20 km of the shore. The data span a 29-year period - from September 1981 to December 2009. These SST data were derived from Advanced Very High Resolution Radiometer (AVHRR) instrument measurements compiled into monthly means as part of the Pathfinder versions 5.0 and 5.1 (PFSST V50 and V51) Project. The processing methods used to transform the data from their native Hierarchical Data Format Scientific Data Set (HDF SDS) to georeferenced, spatial datasets capable of being read into geographic information systems (GIS) software are explained. In addition, links are provided to examples of scripts involved in the data processing steps. The scripts were written in the Python programming language, which is supported by ESRI's ArcGIS version 9 or later. The processed data files are also provided in text (.csv) and Access 2003 Database (.mdb) formats. All data except the raster files include attributes identifying realm, province, and ecoregion as defined by the MEOW classification schema.

  14. Communicating data quality through Web Map Services

    NASA Astrophysics Data System (ADS)

    Blower, Jon; Roberts, Charles; Griffiths, Guy; Lewis, Jane; Yang, Kevin

    2013-04-01

    The sharing and visualization of environmental data through spatial data infrastructures is becoming increasingly common. However, information about the quality of data is frequently unavailable or presented in an inconsistent fashion. ("Data quality" is a phrase with many possible meanings but here we define it as "fitness for purpose" - therefore different users have different notions of what constitutes a "high quality" dataset.) The GeoViQua project (www.geoviqua.org) is developing means for eliciting, formatting, discovering and visualizing quality information using ISO and Open Geospatial Consortium (OGC) standards. Here we describe one aspect of the innovations of the GeoViQua project. In this presentation, we shall demonstrate new developments in using Web Map Services to communicate data quality at the level of datasets, variables and individual samples. We shall outline a new draft set of conventions (known as "WMS-Q"), which describe a set of rules for using WMS to convey quality information (OGC draft Engineering Report 12-160). We shall demonstrate these conventions through new prototype software, based upon the widely-used ncWMS software, that applies these rules to enable the visualization of uncertainties in raster data such as satellite products and the results of numerical simulations. Many conceptual and practical issues have arisen from these experiments. How can source data be formatted so that a WMS implementation can detect the semantic links between variables (e.g. the links between a mean field and its variance)? The visualization of uncertainty can be a complex task - how can we provide users with the power and flexibility to choose an optimal strategy? How can we maintain compatibility (as far as possible) with existing WMS clients? We explore these questions with reference to existing standards and approaches, including UncertML, NetCDF-U and Styled Layer Descriptors.

  15. A new map of global ecological land units—An ecophysiographic stratification approach

    USGS Publications Warehouse

    Sayre, Roger; Dangermond, Jack; Frye, Charlie; Vaughan, Randy; Aniello, Peter; Breyer, Sean P.; Cribbs, Douglas; Hopkins, Dabney; Nauman, Richard; Derrenbacher, William; Wright, Dawn J.; Brown, Clint; Convis, Charles; Smith, Jonathan H.; Benson, Laurence; Van Sistine, Darren; Warner, Harumi; Cress, Jill Janene; Danielson, Jeffrey J.; Hamann, Sharon L.; Cecere, Thomas; Reddy, Ashwan D.; Burton, Devon; Grosse, Andrea; True, Diane; Metzger, Marc; Hartmann, Jens; Moosdorf, Nils; Durr, Hans; Paganini, Marc; Defourny, Pierre; Arino, Olivier; Maynard, Simone; Anderson, Mark; Comer, Patrick

    2014-01-01

    In response to the need, and to an intergovernmental commission, for a high-resolution, data-derived global ecosystem map, land surface elements of global ecological pattern were characterized in an ecophysiographic stratification of the planet. The stratification produced 3,923 terrestrial ecological land units (ELUs) at a base resolution of 250 meters. The ELUs were derived from data on land surface features in a three step approach. The first step involved acquiring or developing four global raster datalayers representing the primary components of ecosystem structure: bioclimate, landform, lithology, and land cover. These datasets generally represent the most accurate, current, globally comprehensive, and finest spatial and thematic resolution data available for each of the four inputs. The second step involved a spatial combination of the four inputs into a single, new integrated raster dataset where every cell represents a combination of values from the bioclimate, landforms, lithology, and land cover datalayers. This foundational global raster datalayer, called ecological facets (EFs), contains 47,650 unique combinations of the four inputs. The third step involved an aggregation of the EFs into the 3,923 ELUs. This subdivision of the Earth’s surface into relatively fine, ecological land areas is designed to be useful for various types of ecosystem research and management applications, including assessments of climate change impacts to ecosystems, economic and non-economic valuation of ecosystem services, and conservation planning.
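    The second step, combining four categorical rasters so that every cell represents a unique combination of inputs, is a per-cell mixed-radix encoding; a sketch with small synthetic layers (the category counts are illustrative, not those of the actual datasets):

```python
import numpy as np

rng = np.random.default_rng(2)
shape = (200, 200)

# Small categorical stand-ins for the four global input layers.
bioclimate = rng.integers(0, 5, shape)   # 5 bioclimate classes
landform = rng.integers(0, 4, shape)     # 4 landform classes
lithology = rng.integers(0, 6, shape)    # 6 lithology classes
landcover = rng.integers(0, 8, shape)    # 8 land cover classes

# Encode each cell's combination of the four inputs as one integer
# (mixed-radix packing), giving an "ecological facet" raster.
facets = ((bioclimate * 4 + landform) * 6 + lithology) * 8 + landcover

# Each distinct value is one unique combination of the four inputs.
unique_facets = np.unique(facets)  # at most 5*4*6*8 = 960 combinations
```

    The third step, aggregating facets into ELUs, would then map these combination codes onto a smaller set of ecologically meaningful classes.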

  16. DigiFract: A software and data model implementation for flexible acquisition and processing of fracture data from outcrops

    NASA Astrophysics Data System (ADS)

    Hardebol, N. J.; Bertotti, G.

    2013-04-01

    This paper presents the development and use of our new DigiFract software designed for acquiring fracture data from outcrops more efficiently and more completely than done with other methods. Fracture surveys often aim at measuring spatial information (such as spacing) directly in the field. Instead, DigiFract focuses on collecting geometries and attributes and derives spatial information through subsequent analyses. Our primary development goal was to support field acquisition in a systematic digital format optimized for a varied range of (spatial) analyses. DigiFract is developed using the programming interface of the Quantum Geographic Information System (GIS) with versatile functionality for spatial raster and vector data handling. Among other features, this includes spatial referencing of outcrop photos, and tools for digitizing geometries and assigning attribute information through a graphical user interface. While a GIS typically operates in map-view, DigiFract collects features on a surface of arbitrary orientation in 3D space. This surface is overlain with an outcrop photo and serves as the reference frame for digitizing geologic features. Data are managed through a data model and stored in shapefiles or in a spatial database system. Fracture attributes, such as spacing or length, are intrinsic information of the digitized geometry and become explicit through follow-up data processing. Orientation statistics, scan-line or scan-window analyses can be performed from the graphical user interface or can be obtained through flexible Python scripts that directly access the fractdatamodel and analysisLib core modules of DigiFract. This workflow has been applied in various studies and enabled a faster collection of larger and more accurate fracture datasets. The studies delivered a better characterization of fractured reservoir analogues in terms of fracture orientation and intensity distributions. Furthermore, the data organisation and analyses provided more
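    A scan-line analysis of the kind mentioned above reduces, at its simplest, to measuring spacings between fracture intersections along a sampling line; the positions below are hypothetical:

```python
import numpy as np

# Hypothetical positions (meters) where digitized fractures intersect
# a scan-line laid across the outcrop surface.
intersections = np.array([0.4, 1.1, 1.5, 2.8, 3.0, 4.6])

# Spacing between consecutive fractures along the line.
spacing = np.diff(intersections)
mean_spacing = spacing.mean()

# Linear fracture intensity: fractures per meter of scan-line.
intensity = len(intersections) / (intersections[-1] - intersections[0])
```

    In DigiFract the intersection positions would be derived from the digitized fracture geometries rather than measured in the field, which is exactly the "spatial information through subsequent analyses" idea.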

  17. Communicating and visualizing data quality through Web Map Services

    NASA Astrophysics Data System (ADS)

    Roberts, Charles; Blower, Jon; Maso, Joan; Diaz, Daniel; Griffiths, Guy; Lewis, Jane

    2014-05-01

    The sharing and visualization of environmental data through OGC Web Map Services is becoming increasingly common. However, information about the quality of data is rarely presented. (In this presentation we consider mostly data uncertainty as a measure of quality, although we acknowledge that many other quality measures are relevant to the geoscience community.) In the context of the GeoViQua project (http://www.geoviqua.org) we have developed conventions and tools for using WMS to deliver data quality information. The "WMS-Q" convention describes how the WMS specification can be used to publish quality information at the level of datasets, variables and individual pixels (samples). WMS-Q requires no extensions to the WMS 1.3.0 specification, being entirely backward-compatible. (An earlier version of WMS-Q was published as OGC Engineering Report 12-160.) To complement the WMS-Q convention, we have also developed extensions to the OGC Symbology Encoding (SE) specification, enabling uncertain geoscience data to be portrayed using a variety of visualization techniques. These include contours, stippling, blackening, whitening, opacity, bivariate colour maps, confidence interval triangles and glyphs. There may also be more extensive applications of these methods beyond the visual representation of uncertainty. In this presentation we will briefly describe the scope of the WMS-Q and "extended SE" specifications and then demonstrate the innovations using open-source software based upon ncWMS (http://ncwms.sf.net). We apply the tools to a variety of datasets including Earth Observation data from the European Space Agency's Climate Change Initiative. The software allows uncertain raster data to be shared through Web Map Services, giving the user fine control over data visualization.
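Because WMS-Q is backward-compatible with WMS 1.3.0, a quality-aware layer or style is retrieved with an ordinary GetMap request. A rough sketch (the endpoint, layer and style names are hypothetical; the request parameters are the standard WMS 1.3.0 ones):

```python
from urllib.parse import urlencode

def getmap_url(base, layer, style, bbox, size=(512, 512), fmt="image/png"):
    """Build a WMS 1.3.0 GetMap request URL.

    On a WMS-Q-aware server, the style name could select one of the
    uncertainty visualizations (stippling, bivariate colour maps, ...).
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": style,
        "CRS": "CRS:84",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return base + "?" + urlencode(params)

url = getmap_url("http://example.org/wms", "sst_uncertainty",
                 "default", (-180, -90, 180, 90))
```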

  18. Architecture of a spatial data service system for statistical analysis and visualization of regional climate changes

    NASA Astrophysics Data System (ADS)

    Titov, A. G.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The use of large geospatial datasets in climate change studies requires the development of a set of Spatial Data Infrastructure (SDI) elements, including geoprocessing and cartographical visualization web services. This paper presents the architecture of a geospatial OGC web service system as an integral part of a virtual research environment (VRE) general architecture for statistical processing and visualization of meteorological and climatic data. The architecture is a set of interconnected standalone SDI nodes with corresponding data storage systems. Each node runs a specialized software, such as a geoportal, cartographical web services (WMS/WFS), a metadata catalog, and a MySQL database of technical metadata describing geospatial datasets available for the node. It also contains geospatial data processing services (WPS) based on a modular computing backend realizing statistical processing functionality and, thus, providing analysis of large datasets with the results of visualization and export into files of standard formats (XML, binary, etc.). Some cartographical web services have been developed in a system’s prototype to provide capabilities to work with raster and vector geospatial data based on OGC web services. The distributed architecture presented allows easy addition of new nodes, computing and data storage systems, and provides a solid computational infrastructure for regional climate change studies based on modern Web and GIS technologies.

  19. The StreamCat Dataset: Accumulated Attributes for NHDPlusV2 Catchments (Version 2.1) for the Conterminous United States: State Soil Geographic Database (STATSGO)

    EPA Pesticide Factsheets

    This dataset (STATSGO_Set1 and STATSGO_Set2) represents the soil characteristics within individual, local NHDPlusV2 catchments and upstream, contributing watersheds based on the STATSGO dataset (see Data Sources for links to NHDPlusV2 data and STATSGO data). Attributes were calculated for every local NHDPlusV2 catchment and accumulated to provide watershed-level metrics. This data set is derived from the STATSGO landscape rasters for the conterminous USA. Individual rasters (Landscape Layers) of organic material (om), permeability (perm), water table depth (wtdep), depth to bedrock (rckdep), percent clay (clay), and percent sand (sand) were used to calculate soil characteristics for each NHDPlusV2 catchment. The soil characteristics were summarized to produce local catchment-level and watershed-level metrics as a continuous data type (see Data Structure and Attribute Information for a description). The STATSGO data are distributed in two sets, STATSGO_Set1 and STATSGO_Set2, based on common NoData locations in each set of soil GIS layers (see ***link to ReadMe html with NoData map here***).
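The accumulation step described above (local catchment metrics rolled up into watershed-level metrics over all upstream catchments) can be sketched as an area-weighted traversal of the drainage network. The data model below is a simplification invented for illustration, not the actual StreamCat schema:

```python
# Hypothetical sketch: accumulate a local soil metric (e.g. percent
# clay) from NHDPlus-like catchments into watershed-level
# area-weighted means.
def accumulate(catchments, upstream):
    """catchments: id -> (area_km2, local_mean)
    upstream: id -> list of catchment ids draining directly into it.
    Returns id -> watershed-level area-weighted mean."""
    cache = {}

    def visit(cid):
        if cid in cache:
            return cache[cid]
        area, mean = catchments[cid]
        tot_area, tot_sum = area, area * mean
        for u in upstream.get(cid, []):
            ua, usum = visit(u)       # recurse up the network
            tot_area += ua
            tot_sum += usum
        cache[cid] = (tot_area, tot_sum)
        return cache[cid]

    for cid in catchments:
        visit(cid)
    return {cid: s / a for cid, (a, s) in cache.items()}
```

For a headwater catchment the watershed mean equals the local mean; downstream it converges toward the area-weighted mean of everything above.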

  20. Laser scanning endoscope for diagnostic medicine

    NASA Astrophysics Data System (ADS)

    Ouimette, Donald R.; Nudelman, Sol; Spackman, Thomas; Zaccheo, Scott

    1990-07-01

    A new type of endoscope is being developed which utilizes an optical raster scanning system for imaging through an endoscope. The optical raster scanner utilizes a high speed, multifaceted, rotating polygon mirror system for horizontal deflection, and a slower speed galvanometer driven mirror as the vertical deflection system. When used in combination, the optical raster scanner traces out a raster similar to an electron beam raster used in television systems. This flying spot of light can then be detected by various types of photosensitive detectors to generate a video image of the surface or scene being illuminated by the scanning beam. The optical raster scanner has been coupled to an endoscope. The raster is projected down the endoscope, thereby illuminating the object to be imaged at the distal end of the endoscope. Elemental photodetectors are placed at the distal or proximal end of the endoscope to detect the reflected illumination from the flying spot of light. This time sequenced signal is captured by an image processor for display and processing. This technique offers the possibility of very small diameter endoscopes, since illumination channel requirements are eliminated. Using various lasers, very specific spectral selectivity can be achieved to optimize contrast of specific lesions of interest. Using several laser lines, or a white light source, with detectors of specific spectral response, multiple spectrally selected images can be acquired simultaneously. Co-linear therapy delivery while imaging is also possible.

  1. Psyplot: Visualizing rectangular and triangular Climate Model Data with Python

    NASA Astrophysics Data System (ADS)

    Sommer, Philipp

    2016-04-01

    The development and use of climate models often requires the visualization of geo-referenced data. Creating visualizations should be fast, attractive, flexible, easily applicable and easily reproducible. There is a wide range of software tools available for visualizing raster data, but they are often inaccessible to many users (e.g. because they are difficult to use in a script or have low flexibility). In order to facilitate easy visualization of geo-referenced data, we developed a new framework called "psyplot," which can aid earth system scientists with their daily work. It is written purely in the programming language Python and primarily built upon the python packages matplotlib, cartopy and xray. The package can visualize data stored on the hard disk (e.g. NetCDF, GeoTIFF, or any other file format supported by the xray package), or directly from memory or Climate Data Operators (CDOs). Furthermore, data can be visualized on a rectangular grid (whether or not following the CF Conventions) and on a triangular grid (following the CF or UGRID Conventions). Psyplot visualizes 2D scalar and vector fields, enabling the user to easily manage and format multiple plots at the same time, and to export the plots into all common picture formats and movies covered by the matplotlib package. The package can currently be used in an interactive python session or in python scripts, and will soon be extended for use with a graphical user interface (GUI). Finally, the psyplot framework enables flexible configuration, allows easy integration into other scripts that use matplotlib, and provides a flexible foundation for further development.

  2. Google Earth elevation data extraction and accuracy assessment for transportation applications

    PubMed Central

    Wang, Yinsong; Zou, Yajie; Henrickson, Kristian; Wang, Yinhai; Tang, Jinjun; Park, Byung-Jung

    2017-01-01

    Roadway elevation data is critical for a variety of transportation analyses. However, it has been challenging to obtain such data, and most roadway GIS databases do not include it. This paper intends to address this need by proposing a method to extract roadway elevation data from Google Earth (GE) for transportation applications. A comprehensive accuracy assessment of the GE-extracted elevation data is conducted for the conterminous USA. The GE elevation data was compared with the ground truth data from nationwide GPS benchmarks and roadway monuments from six states in the conterminous USA. This study also compares the GE elevation data with the elevation raster data from the U.S. Geological Survey National Elevation Dataset (USGS NED), which is a widely used data source for extracting roadway elevation. Mean absolute error (MAE) and root mean squared error (RMSE) are used to assess the accuracy, and the test results show that the MAE, RMSE and standard deviation of GE roadway elevation error are 1.32 meters, 2.27 meters and 2.27 meters, respectively. Finally, the proposed extraction method was implemented and validated for the following three scenarios: (1) extracting roadway elevation differentiated by direction, (2) multi-layered roadway recognition in freeway segments and (3) slope segmentation and grade calculation in freeway segments. The methodology validation results indicate that the proposed extraction method can locate the extraction route accurately, recognize multi-layered roadway sections, and segment the extracted route by grade automatically. Overall, it is found that the high accuracy elevation data available from GE provide a reliable data source for various transportation applications. PMID:28445480
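The MAE and RMSE statistics used in the assessment are straightforward to compute from paired extracted/benchmark elevations; a minimal sketch with made-up sample values:

```python
import math

def mae_rmse(extracted, truth):
    """Mean absolute error and root mean squared error between
    extracted elevations and ground-truth benchmark elevations (m)."""
    errs = [e - t for e, t in zip(extracted, truth)]
    mae = sum(abs(x) for x in errs) / len(errs)
    rmse = math.sqrt(sum(x * x for x in errs) / len(errs))
    return mae, rmse

# Illustrative values only, not the paper's data:
mae, rmse = mae_rmse([10.0, 12.0, 9.0], [9.0, 12.0, 9.0])
```

RMSE penalizes large errors more heavily than MAE, which is why both are reported: similar values indicate that the error distribution has few extreme outliers.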

  3. A spatial analysis of wilderness campsites in Lyell Canyon, Yosemite National Park

    Treesearch

    Steven R. Lawson; Peter Newman

    2001-01-01

    During the summer of 1999, Yosemite National Park staff collected GPS data to inventory the number and distribution of wilderness campsites in Lyell Canyon, Yosemite National Park. The data were collected after one month of campsite restoration work had been conducted by a student work crew. This study integrated the GPS data with digital raster graphics data for Lyell...

  4. Investigating Effects of Fused-Deposition Modeling (FDM) Processing Parameters on Flexural Properties of ULTEM 9085 using Designed Experiment.

    PubMed

    Gebisa, Aboma Wagari; Lemu, Hirpa G

    2018-03-27

    Fused-deposition modeling (FDM), one of the additive manufacturing (AM) technologies, is an advanced digital manufacturing technique that produces parts by heating, extruding and depositing filaments of thermoplastic polymers. The properties of FDM-produced parts apparently depend on the processing parameters. These processing parameters have conflicting advantages that need to be investigated. This article focuses on an investigation into the effect of these parameters on the flexural properties of FDM-produced parts. The investigation is carried out on high-performance ULTEM 9085 material, as this material is relatively new and has potential application in the aerospace, military and automotive industries. Five parameters (air gap, raster width, raster angle, contour number, and contour width) are considered for the investigation, with a full factorial design of experiments. From the investigation, it is revealed that raster angle and raster width have the greatest effect on the flexural properties of the material. The optimal levels of the process parameters achieved are: air gap of 0.000 mm, raster width of 0.7814 mm, raster angle of 0°, contour number of 5, and contour width of 0.7814 mm, leading to a flexural strength of 127 MPa, a flexural modulus of 2400 MPa, and a flexural strain of 0.081.
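A full factorial design over five factors enumerates every combination of factor levels. A sketch of generating such a run table, using two hypothetical levels per factor (the abstract only reports the optimal levels, so these level values are illustrative):

```python
from itertools import product

# Hypothetical two-level settings per FDM factor; the study's actual
# level values are not all listed in the abstract.
factors = {
    "air_gap_mm": [0.000, 0.020],
    "raster_width_mm": [0.4572, 0.7814],
    "raster_angle_deg": [0, 45],
    "contour_number": [1, 5],
    "contour_width_mm": [0.4572, 0.7814],
}

# One experimental run per combination: 2^5 = 32 runs.
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
```

Each run dictionary then specifies one build's process settings, and the measured flexural responses are fitted against the factors to rank their effects.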

  5. Investigating Effects of Fused-Deposition Modeling (FDM) Processing Parameters on Flexural Properties of ULTEM 9085 using Designed Experiment

    PubMed Central

    Gebisa, Aboma Wagari

    2018-01-01

    Fused-deposition modeling (FDM), one of the additive manufacturing (AM) technologies, is an advanced digital manufacturing technique that produces parts by heating, extruding and depositing filaments of thermoplastic polymers. The properties of FDM-produced parts apparently depend on the processing parameters. These processing parameters have conflicting advantages that need to be investigated. This article focuses on an investigation into the effect of these parameters on the flexural properties of FDM-produced parts. The investigation is carried out on high-performance ULTEM 9085 material, as this material is relatively new and has potential application in the aerospace, military and automotive industries. Five parameters (air gap, raster width, raster angle, contour number, and contour width) are considered for the investigation, with a full factorial design of experiments. From the investigation, it is revealed that raster angle and raster width have the greatest effect on the flexural properties of the material. The optimal levels of the process parameters achieved are: air gap of 0.000 mm, raster width of 0.7814 mm, raster angle of 0°, contour number of 5, and contour width of 0.7814 mm, leading to a flexural strength of 127 MPa, a flexural modulus of 2400 MPa, and a flexural strain of 0.081. PMID:29584674

  6. A comparison of interpolation methods on the basis of data obtained from a bathymetric survey of Lake Vrana, Croatia

    NASA Astrophysics Data System (ADS)

    Šiljeg, A.; Lozić, S.; Šiljeg, S.

    2014-12-01

    The bathymetric survey of Lake Vrana included a wide range of activities that were performed in several different stages, in accordance with the standards set by the International Hydrographic Organization. The survey was conducted using an integrated measuring system which consisted of three main parts: a single-beam sonar Hydrostar 4300, an Ashtech ProMark 500 GPS base, and a Thales Z-Max rover. A total of 12 851 points were gathered. In order to find continuous surfaces necessary for analysing the morphology of the bed of Lake Vrana, it was necessary to approximate values in certain areas that were not directly measured, by using an appropriate interpolation method. The main aims of this research were as follows: to compare the efficiency of 16 different interpolation methods, to discover the most appropriate interpolators for the development of a raster model, to calculate the surface area and volume of Lake Vrana, and to compare the differences in calculations between separate raster models. The best deterministic method of interpolation was multiquadric RBF, and the best geostatistical method was ordinary cokriging. The root mean square error in both methods measured less than 0.3 m. The quality of the interpolation methods was analysed in two phases. The first phase used only points gathered by bathymetric measurement, while the second phase also included points gathered by photogrammetric restitution. The first bathymetric map of Lake Vrana in Croatia was produced, as well as scenarios of minimum and maximum water levels. The calculation also included the percentage of flooded areas and cadastre plots in the case of a 2 m increase in the water level. The research presented new scientific and methodological data related to the bathymetric features, surface area and volume of Lake Vrana.

  7. A comparison of interpolation methods on the basis of data obtained from a bathymetric survey of Lake Vrana, Croatia

    NASA Astrophysics Data System (ADS)

    Šiljeg, A.; Lozić, S.; Šiljeg, S.

    2015-08-01

    The bathymetric survey of Lake Vrana included a wide range of activities that were performed in several different stages, in accordance with the standards set by the International Hydrographic Organization. The survey was conducted using an integrated measuring system which consisted of three main parts: a single-beam sonar HydroStar 4300 and two GPS devices, an Ashtech ProMark 500 base and a Thales Z-Max® rover. A total of 12 851 points were gathered. In order to find continuous surfaces necessary for analysing the morphology of the bed of Lake Vrana, it was necessary to approximate values in certain areas that were not directly measured, by using an appropriate interpolation method. The main aims of this research were as follows: (a) to compare the efficiency of 14 different interpolation methods and discover the most appropriate interpolators for the development of a raster model; (b) to calculate the surface area and volume of Lake Vrana, and (c) to compare the differences in calculations between separate raster models. The best deterministic method of interpolation was multiquadric RBF (radial basis function), and the best geostatistical method was ordinary cokriging. The root mean square error in both methods measured less than 0.3 m. The quality of the interpolation methods was analysed in two phases. The first phase used only points gathered by bathymetric measurement, while the second phase also included points gathered by photogrammetric restitution. The first bathymetric map of Lake Vrana in Croatia was produced, as well as scenarios of minimum and maximum water levels. The calculation also included the percentage of flooded areas and cadastre plots in the case of a 2 m increase in the water level. The research presented new scientific and methodological data related to the bathymetric features, surface area and volume of Lake Vrana.
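Multiquadric RBF interpolation, the best-performing deterministic method in both studies, fits a weighted sum of basis functions phi(r) = sqrt(r² + eps²) exactly through the measured soundings. A minimal pure-Python sketch (the shape parameter eps and the sample points are illustrative, not the survey data):

```python
import math

def solve(A, b):
    """Tiny Gaussian elimination with partial pivoting for the small
    dense system produced by the RBF fit."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][k] * x[k] for k in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

def multiquadric_interp(pts, depths, query, eps=1.0):
    """Fit phi(r) = sqrt(r^2 + eps^2) through the soundings, then
    evaluate the fitted surface at a query location."""
    phi = lambda a, b: math.sqrt((a[0] - b[0]) ** 2
                                 + (a[1] - b[1]) ** 2 + eps ** 2)
    A = [[phi(p, q) for q in pts] for p in pts]   # interpolation matrix
    w = solve(A, depths)                          # basis weights
    return sum(wi * phi(query, p) for wi, p in zip(w, pts))
```

By construction the interpolant reproduces every sounding exactly, which is why its quality is judged by cross-validation error (the reported RMSE) rather than by fit at the data points.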

  8. Parallel Vision Algorithm Design and Implementation 1988 End of Year Report

    DTIC Science & Technology

    1989-08-01

    …as a local operation, the provided C code used raster-order processing to speed up execution time. This made it impossible to implement the code using… Apply, which does not allow the programmer to take advantage of raster-order processing. Therefore, the 5x5 median filter algorithm was a straight… possible to exploit raster-order processing in W2, giving greater efficiency. The first advantage is the reason that connected components and the Hough…

  9. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter Notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools for other domains involving spatial properties.

  10. Tiled vector data model for the geographical features of symbolized maps.

    PubMed

    Li, Lin; Hu, Wei; Zhu, Haihong; Li, You; Zhang, Hang

    2017-01-01

    Electronic maps (E-maps) provide people with convenience in real-world space. Although web map services can display maps on screens, a more important function is their ability to access geographical features. An E-map that is based on raster tiles is inferior to vector tiles in terms of interactive ability because vector maps provide a convenient and effective method to access and manipulate web map features. However, the critical issue regarding rendering tiled vector maps is that geographical features that are rendered in the form of map symbols via vector tiles may cause visual discontinuities, such as graphic conflicts and losses of data around the borders of tiles, which likely represent the main obstacles to exploring vector map tiles on the web. This paper proposes a tiled vector data model for geographical features in symbolized maps that considers the relationships among geographical features, symbol representations and map renderings. This model presents a method to tailor geographical features in terms of map symbols and 'addition' (join) operations on the following two levels: geographical features and map features. Thus, these maps can resolve the visual discontinuity problem based on the proposed model without weakening the interactivity of vector maps. The proposed model is validated by two map data sets, and the results demonstrate that the rendered (symbolized) web maps present smooth visual continuity.
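The "addition" (join) operation at tile borders can be illustrated by rejoining the pieces of a line feature that tiling cut apart, matching shared endpoints on the border. This is a simplified sketch invented for illustration, not the paper's actual data model:

```python
def join_tiled_segments(segments):
    """segments: feature_id -> list of coordinate lists, one piece per
    tile. Rejoin pieces whose endpoints coincide on a tile border so
    the feature can be rendered without visual discontinuities."""
    out = {}
    for fid, parts in segments.items():
        parts = [list(p) for p in parts]
        merged = parts.pop(0)
        while parts:
            for i, p in enumerate(parts):
                if p[0] == merged[-1]:          # piece continues the end
                    merged += p[1:]
                    break
                if p[-1] == merged[0]:          # piece precedes the start
                    merged = p[:-1] + merged
                    break
            else:
                break  # no adjoining piece found; stop merging
            parts.pop(i)
        out[fid] = merged
    return out
```

Keying the pieces by a stable feature id is what makes the join possible: without it, a renderer cannot tell which fragments on either side of a border belong to the same symbolized feature.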

  11. Quantifying Uncertainties from Presence Data Sampling Methods for Species Distribution Modeling: Focused on Vegetation.

    NASA Astrophysics Data System (ADS)

    Sung, S.; Kim, H. G.; Lee, D. K.; Park, J. H.; Mo, Y.; Kil, S.; Park, C.

    2016-12-01

    The impact of climate change has been observed throughout the globe. Ecosystems experience rapid changes such as vegetation shifts and species extinctions. In this context, the Species Distribution Model (SDM) is one of the popular methods to project the impact of climate change on ecosystems. An SDM is fundamentally based on the niche of a given species, which means that presence point data are essential for finding the biological niche of that species. To run an SDM for plants, certain characteristics of vegetation data must be considered. Normally, remote sensing techniques are used to produce vegetation data over large areas. As a consequence, the exact location of presence data carries high uncertainty, because presence points are selected from polygon and raster datasets. Thus, sampling methods for modeling vegetation presence data should be carefully selected. In this study, we used three different sampling methods for the selection of vegetation presence data: random sampling, stratified sampling and site-index-based sampling. We used the R package BIOMOD2 to assess uncertainty from modeling. At the same time, we included BioCLIM variables and other environmental variables as input data. As a result, despite differences among the 10 SDMs, the sampling methods showed differences in ROC values: random sampling showed the lowest ROC value, while site-index-based sampling showed the highest. This study thus quantifies the uncertainties arising from presence data sampling methods and SDMs.
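The contrast between the sampling strategies can be sketched in a few lines; the cell records and class key below are invented for illustration (the study itself works with vegetation polygons/rasters and BIOMOD2):

```python
import random

def stratified_sample(cells, strata_key, n_per_stratum, seed=0):
    """cells: candidate presence records. Draw a fixed number of
    points per stratum (e.g. per vegetation class) instead of
    sampling uniformly at random over all cells."""
    rng = random.Random(seed)
    strata = {}
    for c in cells:
        strata.setdefault(strata_key(c), []).append(c)
    sample = []
    for members in strata.values():
        k = min(n_per_stratum, len(members))
        sample += rng.sample(members, k)
    return sample
```

Unlike simple random sampling, this guarantees that rare vegetation classes are represented in the presence set, which is one plausible reason the non-random schemes scored higher ROC values.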

  12. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Treesearch

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial...

  13. OpenClimateGIS - A Web Service Providing Climate Model Data in Commonly Used Geospatial Formats

    NASA Astrophysics Data System (ADS)

    Erickson, T. A.; Koziol, B. W.; Rood, R. B.

    2011-12-01

    The goal of the OpenClimateGIS project is to make climate model datasets readily available in commonly used, modern geospatial formats supported by GIS software, browser-based mapping tools, and virtual globes. The climate modeling community typically stores climate data in multidimensional gridded formats capable of efficiently storing large volumes of data (such as netCDF and GRIB), while the geospatial community typically uses flexible vector and raster formats that are capable of storing small volumes of data (relative to the multidimensional gridded formats). OpenClimateGIS seeks to address this difference in data formats by clipping climate data to user-specified vector geometries (i.e. areas of interest) and translating the gridded data on the fly into multiple vector formats. The OpenClimateGIS system does not store climate data archives locally, but rather works in conjunction with external climate archives that expose climate data via the OPeNDAP protocol. OpenClimateGIS provides a RESTful API web service for accessing climate data resources via HTTP, allowing a wide range of applications to access the climate data. The OpenClimateGIS system has been developed using open source development practices and the source code is publicly available. The project integrates libraries from several other open source projects (including Django, PostGIS, numpy, Shapely, and netcdf4-python). OpenClimateGIS development is supported by a grant from NOAA's Climate Program Office.
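In the simplest axis-aligned case, clipping gridded data to an area of interest reduces to selecting the rows and columns whose coordinates fall inside a bounding box; real vector geometries require point-in-polygon tests (which OpenClimateGIS delegates to libraries like Shapely). A bounding-box-only sketch:

```python
def clip_grid(values, lons, lats, bbox):
    """Clip a 2-D grid (rows indexed by lats, columns by lons) to a
    lon/lat bounding box (min_lon, min_lat, max_lon, max_lat)."""
    min_lon, min_lat, max_lon, max_lat = bbox
    cols = [j for j, x in enumerate(lons) if min_lon <= x <= max_lon]
    rows = [i for i, y in enumerate(lats) if min_lat <= y <= max_lat]
    return [[values[i][j] for j in cols] for i in rows]
```

The clipped sub-grid is then what gets translated on the fly into the requested vector format, so only the area of interest ever leaves the server.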

  14. Landscape level reforestation priorities for forest breeding landbirds in the Mississippi Alluvial Valley

    USGS Publications Warehouse

    Twedt, D.J.; Uihlein, W.B.; Fredrickson, L.H.; King, S.L.; Kaminski, R.M.

    2005-01-01

    Thousands of ha of cleared wetlands are being reforested annually in the Mississippi Alluvial Valley (MAV). Despite the expansive and long-term impacts of reforestation on the biological communities of the MAV, there is generally a lack of landscape level planning in its implementation. To address this deficiency we used raster-based digital data to assess the value of forest restoration to migratory landbirds for each ha within the MAV. Raster themes were developed that reflected distance from 3 existing forest cover parameters: (1) extant forest, (2) contiguous forest patches between 1,012 and 40,000 ha, and (3) forest cores with contiguous area 1 km from an agricultural, urban, or pastoral edge. Two additional raster themes were developed that combined information on the proportion of forest cover and average size of forest patches, respectively, within landscapes of 50,000, 100,000, 150,000, and 200,000 ha. Data from these 5 themes were amalgamated into a single raster using a weighting system that gave increased emphasis to existing forest cores, larger forest patches, and moderately forested landscapes while deemphasizing reforestation near small or isolated forest fragments and within largely agricultural landscapes. This amalgamated raster was then modified by the geographic location of historical forest cover and the current extent of public land ownership to assign a reforestation priority score to each ha in the MAV. However, because reforestation is not required on areas with extant forest cover and because restoration is unlikely on areas of open water and urban communities, these lands were not assigned a reforestation priority score. These spatially explicit reforestation priority scores were used to simulate reforestation of 368,000 ha (5%) of the highest priority lands in the MAV. Targeting restoration to these high priority areas resulted in a 54% increase in forest core - an area of forest core that exceeded the area of simulated reforestation
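The amalgamation of the five raster themes into a single priority raster is essentially a per-cell weighted sum. A minimal sketch (the weights are placeholders; the paper's actual weighting system is only described qualitatively in the abstract):

```python
# Hypothetical weights for illustration; the study's weighting scheme
# emphasized forest cores, larger patches and moderately forested
# landscapes but its numeric values are not given in the abstract.
def amalgamate(themes, weights):
    """themes: list of equally sized 2-D rasters (lists of lists);
    weights: one multiplier per theme. Returns the weighted-sum
    raster used to rank each ha for reforestation priority."""
    rows, cols = len(themes[0]), len(themes[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for theme, w in zip(themes, weights):
        for i in range(rows):
            for j in range(cols):
                out[i][j] += w * theme[i][j]
    return out
```

Cells with extant forest, open water or urban cover would simply be masked out of the result before priority scores are assigned, as the paper describes.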

  15. State of the Oceans: A Satellite Data Processing System for Visualizing Near Real-Time Imagery on Google Earth

    NASA Astrophysics Data System (ADS)

    Thompson, C. K.; Bingham, A. W.; Hall, J. R.; Alarcon, C.; Plesea, L.; Henderson, M. L.; Levoe, S.

    2011-12-01

    The State of the Oceans (SOTO) web tool was developed at NASA's Physical Oceanography Distributed Active Archive Center (PO.DAAC) at the Jet Propulsion Laboratory (JPL) as an interactive means for users to visually explore and assess ocean-based geophysical parameters extracted from the latest archived data products. The SOTO system consists of four extensible modules: a data polling tool, a preparation and imaging package, image server software, and the graphical user interface. Together, these components support multi-resolution visualization of swath (Level 2) and gridded (Level 3/4) data products as either raster- or vector-based KML layers on Google Earth. These layers are automatically updated periodically throughout the day. Current parameters available include sea surface temperature, chlorophyll concentration, ocean winds, sea surface height anomaly, and sea surface temperature anomaly. SOTO also supports mash-ups, allowing KML feeds from other sources, such as hurricane tracks and buoy data, to be overlaid directly onto Google Earth. A version of the SOTO software has also been installed at Goddard Space Flight Center (GSFC) to support the Land Atmosphere Near real-time Capability for EOS (LANCE). The State of the Earth (SOTE) has similar functionality to SOTO but supports different data sets, among them the MODIS 250m data product.

  16. Anisoft - Advanced Treatment of Magnetic Anisotropy Data

    NASA Astrophysics Data System (ADS)

    Chadima, M.

    2017-12-01

    Since its first release, Anisoft (Anisotropy Data Browser) has gained wide popularity in the magnetic fabric community, mainly due to its simple and user-friendly interface enabling very fast visualization of magnetic anisotropy tensors. Here, a major Anisoft update is presented, transforming a rather simple data viewer into a platform offering advanced treatment of magnetic anisotropy data. The updated software introduces a new enlarged binary data format which stores both in-phase and out-of-phase (if measured) susceptibility tensors (AMS) or tensors of anisotropy of magnetic remanence (AMR), together with their respective confidence ellipses and values of F-tests for anisotropy. In addition to the tensor data, a whole array of specimen orientation angles and orientations of mesoscopic foliation(s) and lineation(s) is stored for each record, enabling later editing or corrections. The input data may be directly acquired by AGICO Kappabridges (AMS) or Spinner Magnetometers (AMR); imported from various data formats, including the long-time standard binary ran-format; or manually created. Multiple anisotropy files can be combined together or split into several files by manual data selection or data filtering according to their values. Anisotropy tensors are conventionally visualized as principal directions (eigenvectors) in equal-area projection (stereoplot), together with a wide array of quantitative anisotropy parameters presented in histograms or in color-coded scatter plots showing the mutual relationship of up to three quantitative parameters. When dealing with AMS in variable low fields, field-independent and field-dependent components of anisotropy can be determined (Hrouda 2009). For a group of specimens, individual principal directions can be contoured, or a mean tensor and the respective confidence ellipses of its principal directions can be calculated using either the Hext-Jelinek (Jelinek 1978) statistics or the Bootstrap method (Constable & Tauxe 1990). Each graphical

  17. Video Information Communication and Retrieval/Image Based Information System (VICAR/IBIS)

    NASA Technical Reports Server (NTRS)

    Wherry, D. B.

    1981-01-01

    The acquisition, operation, and planning stages of installing a VICAR/IBIS system are described. The system operates in an IBM mainframe environment, and provides image processing of raster data. System support problems with software and documentation are discussed.

  18. TEODOOR, a blueprint for distributed terrestrial observation data infrastructures

    NASA Astrophysics Data System (ADS)

    Kunkel, Ralf; Sorg, Jürgen; Abbrent, Martin; Borg, Erik; Gasche, Rainer; Kolditz, Olaf; Neidl, Frank; Priesack, Eckart; Stender, Vivien

    2017-04-01

    TERENO (TERrestrial ENvironmental Observatories) is an initiative funded by the large research infrastructure program of the Helmholtz Association of Germany. Four observation platforms to facilitate the investigation of the consequences of global change for terrestrial ecosystems, and their socioeconomic implications, were implemented and equipped from 2007 until 2013. Data collection, however, is planned to continue for at least 30 years. TERENO provides series of system variables (e.g. precipitation, runoff, groundwater level, soil moisture, water vapor and trace gas fluxes) for the analysis and prognosis of global change consequences using integrated model systems, which will be used to derive efficient prevention, mitigation and adaptation strategies. Each platform is operated by a different Helmholtz institution, which maintains its local data infrastructure. Within the individual observatories, areas with intensive measurement programs have been implemented. Different sensors provide information on various physical parameters such as soil moisture, temperature, groundwater levels or gas fluxes. Sensor data from more than 900 stations are collected automatically, at rates ranging from one reading per 20 s up to one per 2 h, summing up to about 2,500,000 data values per day. In addition, three weather radar devices create raster data at rates of 12 to 60 rasters per hour. The data are automatically imported into local relational database systems using a common data quality assessment framework, which handles the processing and assessment of heterogeneous environmental observation data. Starting with the way data are imported into the data infrastructure, custom workflows are developed. Data levels implying the underlying data processing, stages of quality assessment and data accessibility are defined. In order to facilitate the acquisition, provision, integration, management and exchange of heterogeneous geospatial resources within a scientific and non-scientific environment

  19. Real time imaging of infrared scene data generated by the Naval Postgraduate School Infrared Search and Target Designation (NPS-IRSTD) system

    NASA Astrophysics Data System (ADS)

    Baca, Michael J.

    1990-09-01

    A system to display images generated by the Naval Postgraduate School Infrared Search and Target Designation (a modified AN/SAR-8 Advanced Development Model) in near real time was developed using a 33 MHz NIC computer as the central controller. This computer was enhanced with a Data Translation DT2861 Frame Grabber for image processing and an interface board designed and constructed at NPS to provide synchronization between the IRSTD and Frame Grabber. Images are displayed in false color in a video raster format on a 512 by 480 pixel resolution monitor. Using FORTRAN, programs have been written to acquire, unscramble, expand and display a 3 deg sector of data. The time line for acquisition, processing and display has been analyzed and repetition periods of less than four seconds for successive screen displays have been achieved. This represents a marked improvement over previous methods necessitating slower Direct Memory Access transfers of data into the Frame Grabber. Recommendations are made for further improvements to enhance the speed and utility of images produced.

  20. Video-to-film color-image recorder.

    NASA Technical Reports Server (NTRS)

    Montuori, J. S.; Carnes, W. R.; Shim, I. H.

    1973-01-01

    A precision video-to-film recorder for use in image data processing systems, being developed for NASA, will convert three video input signals (red, blue, green) into a single full-color light beam for image recording on color film. Argon ion and krypton lasers are used to produce three spectral lines which are independently modulated by the appropriate video signals, combined into a single full-color light beam, and swept over the recording film in a raster format for image recording. A rotating multi-faceted spinner mounted on a translating carriage generates the raster, and an annotation head is used to record up to 512 alphanumeric characters in a designated area outside the image area.

  1. Automated microdensitometer for digitizing astronomical plates

    NASA Technical Reports Server (NTRS)

    Angilello, J.; Chiang, W. H.; Elmegreen, D. M.; Segmueller, A.

    1984-01-01

    A precision microdensitometer was built under control of an IBM S/1 time-sharing computer system. The instrument's spatial resolution is better than 20 microns. A raster scan of an area of 10x10 sq mm (500x500 raster points) takes 255 minutes. The reproducibility is excellent and the stability is good over a period of 30 hours, which is significantly longer than the time required for most scans. The intrinsic accuracy of the instrument was tested using Kodak standard filters, and it was found to be better than 3%. A comparative accuracy was tested measuring astronomical plates of galaxies for which absolute photoelectric photometry data were available. The results showed an accuracy excellent for astronomical applications.
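    The quoted figures imply a fixed dwell time per raster point, which is easy to check with plain arithmetic on the numbers from the abstract:

```python
# Back-of-envelope check of the quoted scan rate: a 500 x 500 raster
# (250,000 points) digitized in 255 minutes.
points = 500 * 500
scan_seconds = 255 * 60

dwell = scan_seconds / points  # seconds per raster point
print(round(dwell, 4))  # 0.0612
```

    About 61 ms per point, i.e. roughly 16 points per second, is consistent with an early-1980s computer-controlled stage.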

  2. The potential impacts of development on wildlands in El Dorado County, California

    Treesearch

    Shawn C. Saving; Gregory B. Greenwood

    2002-01-01

    We modeled future development in rapidly urbanizing El Dorado County, California, to assess ecological impacts of expanding urbanization and effectiveness of standard policy mitigation efforts. Using raster land cover data and county parcel data, we constructed a footprint of current development and simulated future development using a modified stochastic flood-fill...

  3. Monitoring and mapping leaf area index of rubber and oil palm in small watershed area

    NASA Astrophysics Data System (ADS)

    Rusli, N.; Majid, M. R.

    2014-02-01

    Existing conventional methods to determine LAI are tedious and time-consuming to implement over small or large areas. Thus, freely available raster LAI data were downloaded for the 4697.60 km2 Sungai Muar watershed area in Johor. The aim of this study is to monitor and map LAI changes of rubber and oil palm from 2002 to 2008. Raster datasets of LAI values were obtained from the National Aeronautics and Space Administration (NASA) website for the available years from 2002 to 2008. These data were mosaicked and subset using ERDAS Imagine 9.2. Next, the LAI raster dataset was multiplied by a scale factor of 0.1 to derive the final LAI values. To determine the LAI values of rubber and oil palm, the boundaries of each crop from land cover data for the years 2002, 2006 and 2008 were overlaid on the LAI raster dataset. A total of 5000 sample points were generated within these boundaries using the Hawth's Tools extension in ArcGIS 9.2 and used to extract LAI values for oil palm and rubber. In addition, a wide range of literature was reviewed as a guideline for deriving the LAI values of oil palm and rubber, which range from 0 to 6. The results show an overall mean LAI value decreasing from 4.12 to 2.5 between 2002 and 2008 due to land cover transitions within these years. In 2002, the mean LAI values of rubber and oil palm were 2.65 and 2.53, respectively. In 2006, the mean LAI values for rubber and oil palm were 2.54 and 2.82, respectively. In 2008, the mean LAI values were 0.85 for rubber and 1.04 for oil palm. In conclusion, apart from the original function of LAI, which relates to the growth and metabolism of vegetation, the changes in LAI values from 2002 to 2008 are also capable of explaining the process of land cover change in a watershed area.
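    The scale-factor step described above (multiplying the raw LAI raster by 0.1 and keeping values in the 0-6 range) can be sketched with NumPy. The digital numbers and fill code below are illustrative assumptions, not the study's actual data:

```python
import numpy as np

# Illustrative MODIS-style LAI tile: raw integer digital numbers (DN).
raw = np.array([
    [24, 31, 255],
    [18, 27,  40],
], dtype=np.uint8)

FILL = 255   # assumed fill/no-data code (an assumption for this sketch)
SCALE = 0.1  # scale factor quoted in the study

# Apply the scale factor, masking fill cells as NaN.
lai = np.where(raw == FILL, np.nan, raw * SCALE)

# Keep only physically plausible LAI values (the study uses 0 to 6).
valid = lai[(lai >= 0) & (lai <= 6)]
mean_lai = float(np.mean(valid))
```

    NaN comparisons are False in NumPy, so the 0-6 range test also drops the masked fill cells.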

  4. gulf_of_mexico_90mwindspeed_off

    Science.gov Websites

    using their MesoMap system and historical weather data. This shapefile was generated from raster "). The user is granted the right, without any fee or cost, to use, copy, modify, alter, enhance copies of the data. Further, the user of this data agrees to credit NREL in any publications or software

  5. Sensitivity of landscape metrics to pixel size

    Treesearch

    J. D. Wickham; K. H. Riitters

    1995-01-01

    Analysis of diversity and evenness metrics using land cover data are becoming formalized in landscape ecology. Diversity and evenness metrics are dependent on the pixel size (scale) over which the data are collected. Aerial photography was interpreted for land cover and converted into four raster data sets with 4, 12, 28, and 80 m pixel sizes, representing pixel sizes...

  6. Method and apparatus for differential spectroscopic atomic-imaging using scanning tunneling microscopy

    DOEpatents

    Kazmerski, Lawrence L.

    1990-01-01

    A method and apparatus for differential spectroscopic atomic-imaging is disclosed that spatially resolves and displays not only individual atoms on a sample surface, but also bonding and the specific atomic species in such bonds. The apparatus includes a scanning tunneling microscope (STM) that is modified to include photon biasing, preferably by a tunable laser; modulated electronic surface biasing for the sample; and temperature biasing, preferably by a vibration-free refrigerated sample mounting stage. Computer control, data processing, and visual display components are also included. The method includes modulating the electronic bias voltage with and without selected photon wavelengths and frequency biasing, under a stabilizing (usually cold) bias temperature, to detect bonding and the specific atomic species in the bonds as the STM rasters the sample. These data are processed along with atomic spatial topography data obtained from the STM raster scan to create a real-time visual image of the atoms on the sample surface.

  7. Enhancements to TauDEM to support Rapid Watershed Delineation Services

    NASA Astrophysics Data System (ADS)

    Sazib, N. S.; Tarboton, D. G.

    2015-12-01

    Watersheds are widely recognized as the basic functional unit for water resources management studies and are important for a variety of problems in hydrology, ecology, and geomorphology. Nevertheless, delineating a watershed spread across a large region is still cumbersome due to the processing burden of working with large Digital Elevation Models (DEMs). Terrain Analysis Using Digital Elevation Models (TauDEM) software supports the delineation of watersheds and stream networks from within desktop Geographic Information Systems, and computes a rich set of watershed and stream network attributes. However, the TauDEM desktop tools have limitations: (1) they support only one raster data type (TIFF format), (2) they require installation of software for parallel processing, and (3) data must be in a projected coordinate system. This paper presents enhancements to TauDEM that have been developed to extend its generality and support web-based watershed delineation services. The enhancements include (1) reading and writing raster data with the open-source Geospatial Data Abstraction Library (GDAL), no longer limited to the TIFF format, and (2) support for both geographic and projected coordinates. To support web services for rapid watershed delineation, a procedure has been developed for subsetting the domain based on sub-catchments, with preprocessed data prepared and stored for each catchment. This allows the watershed delineation to function locally, while extending to the full extent of watersheds using the preprocessed information. Additional capabilities of the program include computation of average watershed properties and geomorphic and channel network variables such as drainage density, shape factor, relief ratio and stream ordering. The updated version of TauDEM increases its practical applicability in terms of raster data type, size and coordinate system. The watershed delineation web service functionality is useful for web-based software-as-a-service deployments
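    The terrain analysis TauDEM performs starts from a flow-direction grid. A minimal D8 flow-direction sketch on a toy DEM follows; this is an illustrative reimplementation, not TauDEM code, and the D8 codes follow the common power-of-two convention:

```python
import numpy as np

# D8 neighbor offsets (row, col) -> direction code:
# 1=E, 2=SE, 4=S, 8=SW, 16=W, 32=NW, 64=N, 128=NE.
D8 = {(0, 1): 1, (1, 1): 2, (1, 0): 4, (1, -1): 8,
      (0, -1): 16, (-1, -1): 32, (-1, 0): 64, (-1, 1): 128}

def d8_direction(dem, r, c):
    """Return the D8 code of the steepest-descent neighbor of cell (r, c),
    or 0 if the cell is a pit (no lower neighbor)."""
    best_code, best_slope = 0, 0.0
    for (dr, dc), code in D8.items():
        rr, cc = r + dr, c + dc
        if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
            dist = 2 ** 0.5 if dr and dc else 1.0  # diagonal neighbors are farther
            slope = (dem[r, c] - dem[rr, cc]) / dist
            if slope > best_slope:
                best_code, best_slope = code, slope
    return best_code

dem = np.array([[10.0, 9.0, 8.0],
                [9.0,  7.0, 5.0],
                [8.0,  6.0, 4.0]])
```

    For the center cell the steepest descent is toward the 4.0 corner, i.e. the SE neighbor (code 2).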

  8. Ecohydrologic coevolution in drylands: relative roles of vegetation, soil depth and runoff connectivity on ecosystem shifts.

    NASA Astrophysics Data System (ADS)

    Saco, P. M.; Moreno de las Heras, M.; Willgoose, G. R.

    2014-12-01


  9. The Voronoi spatio-temporal data structure

    NASA Astrophysics Data System (ADS)

    Mioc, Darka

    2002-04-01

    Current GIS models cannot easily integrate the temporal dimension of spatial data. Indeed, current GISs do not support incremental (local) addition and deletion of spatial objects, and they cannot support the temporal evolution of spatial data. Spatio-temporal facilities would be very useful in many GIS applications: harvesting and forest planning, cadastre, urban and regional planning, and emergency planning. The spatio-temporal model that can overcome these problems is based on a topological model, the Voronoi data structure. Voronoi diagrams are irregular tessellations of space that adapt to spatial objects, and they are therefore a synthesis of raster and vector spatial data models. The main advantage of the Voronoi data structure is its local and sequential map updates, which allow each event and performed map update to be automatically recorded within the system. These map updates are executed through map construction commands that are composed of atomic actions (geometric algorithms for the addition, deletion, and motion of spatial objects) on the dynamic Voronoi data structure. The formalization of map commands led to the development of a spatial language comprising a set of atomic operations or constructs on spatial primitives (points and lines), powerful enough to define complex operations. This resulted in a new formal model for spatio-temporal change representation, where each update is uniquely characterized by the numbers of newly created and inactivated Voronoi regions. This is used to extend the model towards a hierarchical Voronoi data structure. In this model, spatio-temporal changes induced by map updates are preserved in a hierarchical data structure that combines events and the corresponding changes in topology. This hierarchical Voronoi data structure has an implicit time ordering of events, visible through changes in topology, and it is equivalent to an event structure that can support temporal data without precise temporal
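    The claim that a Voronoi tessellation is a synthesis of raster and vector models can be sketched by labeling every cell of a raster with the index of its nearest vector seed point. The seeds and grid extent below are invented for illustration:

```python
import numpy as np

# Vector input: three seed points (x, y), invented for the sketch.
seeds = np.array([[1.0, 1.0], [4.0, 1.0], [2.5, 4.0]])

# Raster side: a 6 x 6 grid of cell centers.
ys, xs = np.mgrid[0:6, 0:6]
cells = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)

# Squared distance from every cell to every seed; the nearest seed wins,
# which rasterizes the Voronoi tessellation of the seeds onto the grid.
d2 = ((cells[:, None, :] - seeds[None, :, :]) ** 2).sum(axis=2)
labels = d2.argmin(axis=1).reshape(6, 6)  # labels[y, x] = index of nearest seed
```

    Each labeled region is the raster footprint of one Voronoi cell; a true incremental Voronoi structure additionally supports local point insertion and deletion without recomputing the whole tessellation.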

  10. High-density grids for efficient data collection from multiple crystals

    PubMed Central

    Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto; Barnes, Christopher O.; Bonagura, Christopher A.; Brehmer, Winnie; Brunger, Axel T.; Calero, Guillermo; Caradoc-Davies, Tom T.; Chatterjee, Ruchira; Degrado, William F.; Fraser, James S.; Ibrahim, Mohamed; Kern, Jan; Kobilka, Brian K.; Kruse, Andrew C.; Larsson, Karl M.; Lemke, Heinrik T.; Lyubimov, Artem Y.; Manglik, Aashish; McPhillips, Scott E.; Norgren, Erik; Pang, Siew S.; Soltis, S. M.; Song, Jinhu; Thomaston, Jessica; Tsai, Yingssu; Weis, William I.; Woldeyes, Rahel A.; Yachandra, Vittal; Yano, Junko; Zouni, Athina; Cohen, Aina E.

    2016-01-01

    Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. Crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures. PMID:26894529

  11. Study of Raster Metafile Formats.

    DTIC Science & Technology

    1985-01-01

    ...use of 8-bit characters and the SI, SO and ESC control codes within the text string, in accordance with ANSI X3.41 and ISO 2022. The ALTERNATE

  12. Spatial interpolation techniques using R

    EPA Science Inventory

    Interpolation techniques are used to predict the cell values of a raster based on sample data points. For example, interpolation can be used to predict the distribution of sediment particle size throughout an estuary based on discrete sediment samples. We demonstrate some inter...
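    A minimal sketch of one such technique, inverse distance weighting (IDW), in plain Python (the sample points are invented; the EPA demonstration itself uses R):

```python
import math

def idw(x, y, samples, power=2):
    """Inverse-distance-weighted estimate at (x, y) from (sx, sy, value) samples."""
    num = den = 0.0
    for sx, sy, v in samples:
        d = math.hypot(x - sx, y - sy)
        if d == 0:
            return v  # query point coincides with a sample point
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

# Illustrative sediment grain-size samples: (x, y, value).
samples = [(0.0, 0.0, 10.0), (2.0, 0.0, 20.0)]
```

    Evaluating `idw` at every cell center of a grid fills the raster; the `power` parameter controls how quickly a sample's influence decays with distance.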

  13. A hierarchical network-based algorithm for multi-scale watershed delineation

    NASA Astrophysics Data System (ADS)

    Castronova, Anthony M.; Goodall, Jonathan L.

    2014-11-01

    Watershed delineation is a process for defining the land area that contributes surface water flow to a single outlet point. It is commonly used in water resources analysis to define the domain in which hydrologic process calculations are applied. There has been a growing effort over the past decade to improve surface elevation measurements in the U.S., which has had a significant impact on the accuracy of hydrologic calculations. Traditional watershed processing on these elevation rasters, however, becomes more burdensome as data resolution increases. As a result, processing of these datasets can be troublesome on standard desktop computers. This challenge has motivated numerous works that aim to provide high-performance computing solutions for large data, high-resolution data, or both. This work proposes an efficient watershed delineation algorithm for use in desktop computing environments that leverages existing data, the U.S. Geological Survey (USGS) National Hydrography Dataset Plus (NHD+), and open-source software tools to construct watershed boundaries. This approach makes use of U.S. national-level hydrography data that has been precomputed using raster processing algorithms coupled with quality control routines. Our approach uses carefully arranged data and mathematical graph theory to traverse river networks and identify catchment boundaries. We demonstrate this new watershed delineation technique, compare its accuracy with traditional algorithms that derive watersheds solely from digital elevation models, and then extend our approach to address subwatershed delineation. Our findings suggest that the open-source hierarchical network-based delineation procedure presented in this work is a promising approach to watershed delineation that can be used to summarize publicly available datasets for hydrologic model input pre-processing. Through our analysis, we explore the benefits of reusing the NHD+ datasets for watershed delineation, and find that our technique
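    The graph-traversal idea can be sketched with an invented NHD+-style flow table: invert the downstream links into an upstream adjacency list, then collect every reach that drains to a chosen outlet (the reach identifiers are hypothetical):

```python
from collections import defaultdict

# Toy flow table: each reach drains to exactly one downstream reach
# (None marks the terminal outlet). Identifiers are invented.
downstream = {101: 104, 102: 104, 103: 105, 104: 105, 105: None}

# Invert to an upstream adjacency list.
upstream = defaultdict(list)
for reach, down in downstream.items():
    if down is not None:
        upstream[down].append(reach)

def watershed_reaches(outlet):
    """All reaches draining to the outlet, found by depth-first traversal."""
    found, stack = set(), [outlet]
    while stack:
        reach = stack.pop()
        if reach not in found:
            found.add(reach)
            stack.extend(upstream[reach])
    return found
```

    The watershed boundary is then the union of the precomputed catchment polygons associated with the returned reach identifiers, with no raster processing at query time.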

  14. Processing techniques for global land 1-km AVHRR data

    USGS Publications Warehouse

    Eidenshink, Jeffery C.; Steinwand, Daniel R.; Wivell, Charles E.; Hollaren, Douglas M.; Meyer, David

    1993-01-01

    The U.S. Geological Survey's (USGS) Earth Resources Observation Systems (EROS) Data Center (EDC), in cooperation with several international science organizations, has developed techniques for processing daily Advanced Very High Resolution Radiometer (AVHRR) 1-km data of the entire global land surface. These techniques include orbital stitching, geometric rectification, radiometric calibration, and atmospheric correction. An orbital stitching algorithm was developed to combine consecutive observations acquired along an orbit by ground receiving stations into contiguous half-orbital segments. The geometric rectification process uses an AVHRR satellite model that contains modules for forward mapping, forward terrain correction, and inverse mapping with terrain correction. The correction is accomplished by using hydrologic features (coastlines and lakes) from the Digital Chart of the World. These features are rasterized into the satellite projection and matched to the AVHRR imagery using binary edge correlation techniques. The resulting coefficients are related to six attitude correction parameters: roll, roll rate, pitch, pitch rate, yaw, and altitude. The image can then be precision corrected to a variety of map projections and user-selected image frames. Because the AVHRR lacks onboard calibration for the optical wavelengths, a series of time-variant calibration coefficients derived from vicarious calibration methods is used to model the degradation profile of the instruments. Reducing atmospheric effects on AVHRR data is also important. A method has been developed that removes the effects of molecular scattering and absorption from clear-sky observations using climatological measurements of ozone. Other methods to remove the effects of water vapor and aerosols are being investigated.
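    The time-variant calibration idea, a gain that degrades with time since launch applied to raw digital numbers, can be sketched as follows. All coefficients here are invented placeholders for illustration, not the actual AVHRR calibration values:

```python
def calibrated_value(dn, days_since_launch,
                     gain0=0.1, degradation_per_day=1e-4, offset=40.0):
    """Illustrative time-variant linear calibration: the gain drifts
    linearly with instrument age (all coefficients are invented)."""
    gain = gain0 * (1.0 + degradation_per_day * days_since_launch)
    return gain * (dn - offset)

# The same raw digital number maps to a different physical value
# depending on how long the instrument has been in orbit.
at_launch = calibrated_value(140, days_since_launch=0)
after_3yr = calibrated_value(140, days_since_launch=1000)
```

    Real AVHRR calibration uses per-channel, per-instrument coefficient time series derived from vicarious targets; only the structure of the computation is shown here.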

  15. great_lakes_90mwindspeed_off

    Science.gov Websites

    MesoMap system and historical weather data. This shapefile was generated from raster datasets with a 200 m , LLC for the U.S. Department of Energy ("DOE"). The user is granted the right, without any , provided that this entire notice appears in all copies of the data. Further, the user of this data agrees

  16. National Elevation Dataset

    USGS Publications Warehouse

    ,

    1999-01-01

    The National Elevation Dataset (NED) is a new raster product assembled by the U.S. Geological Survey (USGS). The NED is designed to provide national elevation data in a seamless form with a consistent datum, elevation unit, and projection. Data corrections were made in the NED assembly process to minimize artifacts, permit edge matching, and fill sliver areas of missing data.

  17. A Bayesian Analysis of Scale-Invariant Processes

    DTIC Science & Technology

    2012-01-01

    Earth Grid (EASE-Grid). The NED raster elevation data of one arc-second resolution (30 m) over the continental US are derived from multiple satellites ... empirical and ME distributions, yet ensuring computational efficiency. Instead of computing empirical histograms from large amounts of data, only some

  18. OLYMPUS DISS - A Readily Implemented Geographic Data and Information Sharing System

    NASA Astrophysics Data System (ADS)

    Necsoiu, D. M.; Winfrey, B.; Murphy, K.; McKague, H. L.

    2002-12-01

    Electronic information technology has become a crucial component of business, government, and scientific organizations. In this technology era, many enterprises are moving away from the perception that information repositories are only a tool for decision-making. Instead, many organizations are learning that information systems, which are capable of organizing and following the interrelations between information and both the short-term and strategic organizational goals, are assets themselves, with inherent value. Olympus Data and Information Sharing System (DISS) is a system developed at the Center for Nuclear Waste Regulatory Analyses (CNWRA) to solve several difficult tasks associated with the management of geographical, geological and geophysical data. Three of the tasks were to (1) gather the large amount of heterogeneous information that has accumulated over the operational lifespan of CNWRA, (2) store the data in a central, knowledge-based, searchable database and (3) create quick, easy, convenient, and reliable access to that information. Faced with these difficult tasks CNWRA identified the requirements for designing such a system. Key design criteria were: (a) ability to ingest different data formats (i.e., raster, vector, and tabular data); (b) minimal expense using open-source and commercial off-the-shelf software; (c) seamless management of geospatial data, freeing up time for researchers to focus on analyses or algorithm development, rather than on time consuming format conversions; (d) controlled access; and (e) scalable architecture to meet new and continuing demands. Olympus DISS is a solution that can be easily adapted to small and mid-size enterprises dealing with heterogeneous geographic data. It uses established data standards, provides a flexible mechanism to build applications upon and output geographic data in multiple and clear ways. This abstract is an independent product of the CNWRA and does not necessarily reflect the views or

  19. Overlapping MALDI-Mass Spectrometry Imaging for In-Parallel MS and MS/MS Data Acquisition without Sacrificing Spatial Resolution

    NASA Astrophysics Data System (ADS)

    Hansen, Rebecca L.; Lee, Young Jin

    2017-09-01

    Metabolomics experiments require chemical identifications, often through MS/MS analysis. In mass spectrometry imaging (MSI), this necessitates running several serial tissue sections or using a multiplex data acquisition method. We have previously developed a multiplex MSI method to obtain MS and MS/MS data in a single experiment, acquiring more chemical information in less data acquisition time. In this method, each raster step is composed of several spiral steps, and each spiral step is used for a separate scan event (e.g., MS or MS/MS). One main limitation of this method is the loss of spatial resolution as the number of spiral steps increases, limiting its applicability for high-spatial-resolution MSI. In this work, we demonstrate that multiplex MS imaging is possible without sacrificing spatial resolution by using overlapping spiral steps instead of the spatially separated spiral steps used in the previous work. Significant amounts of matrix and analytes are still left after multiple spectral acquisitions, especially with nanoparticle matrices, so that high-quality MS and MS/MS data can be obtained on virtually the same tissue spot. This method was then applied to visualize metabolites and acquire their MS/MS spectra in maize leaf cross-sections at 10 μm spatial resolution.

  20. High-efficient Extraction of Drainage Networks from Digital Elevation Model Data Constrained by Enhanced Flow Enforcement from Known River Map

    NASA Astrophysics Data System (ADS)

    Wu, T.; Li, T.; Li, J.; Wang, G.

    2017-12-01

    Improved drainage network extraction can be achieved by flow enforcement, whereby information from known river maps is imposed on the flow-path modeling process. However, the common elevation-based stream burning method can sometimes cause unintended topological errors and misinterpret the overall drainage pattern. We present an enhanced flow enforcement method to facilitate accurate and efficient drainage network extraction. Both the topology of the mapped hydrography and the initial landscape of the DEM are well preserved and fully utilized in the proposed method. An improved stream rasterization is achieved here, yielding a continuous, unambiguous and stream-collision-free raster equivalent of the stream vectors for flow enforcement. By imposing priority-based enforcement with a complementary flow-direction enhancement procedure, the drainage patterns of the mapped hydrography are fully represented in the derived results. The proposed method was tested over the Rogue River Basin using DEMs of various resolutions. As indicated by visual and statistical analyses, the proposed method has three major advantages: (1) it significantly reduces the occurrence of topological errors, yielding very accurate watershed partition and channel delineation; (2) it ensures scale-consistent performance on DEMs of various resolutions; and (3) the entire extraction process is well designed to achieve high computational efficiency.
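    For contrast with the enhanced method, the baseline elevation-based stream burning that the abstract critiques amounts to lowering DEM cells along the rasterized stream line. A minimal sketch with an invented burn depth:

```python
import numpy as np

# Toy flat DEM and a rasterized stream mask (both invented for the sketch).
dem = np.full((3, 3), 5.0)
streams = np.array([[0, 1, 0],
                    [0, 1, 0],
                    [0, 1, 0]], dtype=bool)

# Classic stream burning: lower the DEM along mapped stream cells so that
# derived flow paths follow the known hydrography.
BURN_DEPTH = 2.0  # illustrative value
burned = np.where(streams, dem - BURN_DEPTH, dem)
```

    Because the burn carves an artificial trench, flow directions computed from `burned` can cut across real divides near the stream, which is the class of topological error the enhanced enforcement method is designed to avoid.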

  1. Facilitating hydrological data analysis workflows in R: the RHydro package

    NASA Astrophysics Data System (ADS)

    Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik

    2015-04-01

    The advent of new technologies such as web-services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows, that integrate more and potentially more heterogeneous data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with a minimal risk for errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited for the handling of complex and potentially large datasets. In this paper, we explore how handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, a HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. 
Lastly, we discuss some of the design challenges
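
    The abstract's core idea, a class that couples a stack of raster layers with timestamps so both per-layer and per-cell views are cheap, can be sketched in Python (the package itself is written in R; the name HydroST is borrowed from the abstract, but this dataclass is only an illustrative stand-in, not the package's actual API):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class HydroST:
    """Minimal space-time container: a stack of raster snapshots plus timestamps."""
    times: list          # one timestamp per raster layer
    data: np.ndarray     # shape (time, rows, cols)

    def at_time(self, i):
        """The full raster for one time step."""
        return self.data[i]

    def cell_series(self, row, col):
        """The time series observed at a single raster cell."""
        return self.data[:, row, col]

stack = HydroST(times=[0, 1, 2], data=np.arange(24.0).reshape(3, 2, 4))
series = stack.cell_series(0, 1)   # values 1.0, 9.0, 17.0
```

    Slicing along the leading time axis keeps both access patterns copy-free, which matters once the stacks grow large.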

  2. --No Title--

    Science.gov Websites

    simple math (addition, subtraction, etc.) on the sand thickness and formation depth rasters. Use "). The user is granted the right, without any fee or cost, to use, copy, modify, alter, enhance that incorporate or use the data. Access to and use of the GIS data shall further impose the following

  3. hi_90mwindspeed_off

    Science.gov Websites

    WGS 84. The shapefile was generated from the raster dataset and then projected to Geographic Decimal "). The user is granted the right, without any fee or cost, to use, copy, modify, alter, enhance copies of the data. Further, the user of this data agrees to credit NREL in any publications or software

  4. PR_VI_50mwind

    Science.gov Websites

    -siting potential development projects. This shapefile was generated from a raster dataset with a 200 m of Energy ("DOE"). The user is granted the right, without any fee or cost, to use, copy notice appears in all copies of the data. Further, the user of this data agrees to credit NREL in any

  5. Digital data base application to porphyry copper mineralization in Alaska; case study summary

    USGS Publications Warehouse

    Trautwein, Charles M.; Greenlee, David D.; Orr, Donald G.

    1982-01-01

    The purpose of this report is to summarize the progress in use of digital image analysis techniques in developing a conceptual model for assessing porphyry copper mineral potential. The study area consists of approximately the southern one-half of the 1° by 3° Nabesna quadrangle in east-central Alaska. The digital geologic data base consists of data compiled under the Alaskan Mineral Resource Assessment Program (AMRAP) as well as digital elevation data and Landsat spectral reflectance data from the Multispectral Scanner System. The digital data base used to develop and implement a conceptual model for porphyry-type copper mineralization consisted of 16 original data types and 18 derived data sets formatted in a grid-cell (raster) structure and registered to a map base in the Universal Transverse Mercator (UTM) projection. Minimum curvature and inverse distance squared interpolation techniques were used to generate continuous surfaces from sets of irregularly spaced data points. Processing requirements included: (1) merging or overlaying of data sets, (2) display and color coding of maps and images, (3) univariate and multivariate statistical analyses, and (4) compound overlaying operations. Data sets were merged and processed to create stereoscopic displays of continuous surfaces. Ratios of several data sets were calculated to evaluate relative variations and to enhance the display of surface alteration (gossans). Factor analysis and principal components analysis techniques were used to determine complex relationships and correlations between data sets. The resultant model consists of 10 parameters that identify three areas most likely to contain porphyry copper mineralization; two of these areas are known occurrences of mineralization and the third is not well known. Field studies confirmed that the three areas identified by the model have significant copper potential.
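
    The inverse distance squared interpolation mentioned in the abstract is simple to sketch; this is a generic, hedged illustration of the technique (not the USGS implementation), where each known point contributes with weight 1/d²:

```python
import numpy as np

def idw_squared(xy_known, values, xy_query, eps=1e-12):
    """Inverse-distance-squared interpolation from scattered points to query points.
    Weights are 1/d^2; a query coinciding with a data point returns that value."""
    xy_known = np.asarray(xy_known, float)
    values = np.asarray(values, float)
    out = np.empty(len(xy_query))
    for i, q in enumerate(np.asarray(xy_query, float)):
        d2 = np.sum((xy_known - q) ** 2, axis=1)
        if d2.min() < eps:                 # exact hit on a known point
            out[i] = values[d2.argmin()]
        else:
            w = 1.0 / d2
            out[i] = np.sum(w * values) / np.sum(w)
    return out

pts = [(0, 0), (1, 0), (0, 1)]
vals = [10.0, 20.0, 30.0]
idw_squared(pts, vals, [(0, 0)])   # exact hit -> 10.0
```

    Evaluating this on every node of a regular grid produces exactly the kind of continuous raster surface the report describes.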

  6. Digital photogrammetry at the U.S. Geological Survey

    USGS Publications Warehouse

    Greve, Clifford W.

    1995-01-01

    The U.S. Geological Survey is converting its primary map production and revision operations to use digital photogrammetric techniques. The primary source of data for these operations is the digital orthophoto quadrangle derived from National Aerial Photography Program images. These digital orthophotos are used on workstations that permit comparison of existing vector and raster data with the orthophoto and interactive collection and revision of the vector data.

  7. Computing Risk to West Coast Intertidal Rocky Habitat due to ...

    EPA Pesticide Factsheets

    Compared to marshes, little information is available on the potential for rocky intertidal habitats to migrate upward in response to sea level rise (SLR). To address this gap, we utilized topobathy LiDAR digital elevation models (DEMs) downloaded from NOAA’s Digital Coast GIS data repository to estimate percent change in the area of rocky intertidal habitat in 10 cm increments with eustatic sea level rise. The analysis was conducted at the scale of the four Marine Ecoregions of the World (MEOW) ecoregions located along the continental west coast of the United States (CONUS). Environmental Sensitivity Index (ESI) map data were used to identify rocky shoreline. Such stretches of shoreline were extracted for each of the four ecoregions and buffered by 100 m to include the intertidal and evaluate the potential area for upland habitat migration. All available LiDAR topobathy DEMs from Digital Coast were extracted using the resulting polygons and two rasters were synthesized from the results, a 10 cm increment zone raster and a non-planimetric surface area raster for zonal summation. Current rocky intertidal non-planimetric surface areas for each ecoregion were computed between Mean Higher High Water (MHHW) and Mean Lower Low Water (MLLW) levels established from published datum sheets for tidal stations central to each MEOW ecoregion. Percent change in non-planimetric surface area for the same relative ranges were calculated in 10 cm incremental steps of eustatic S
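
    The "10 cm increment zone raster" described above can be sketched with a few lines of NumPy; this is a hedged illustration of the binning step (function name and the toy DEM values are invented for the example, not taken from the study):

```python
import numpy as np

def increment_zones(dem, mllw, mhhw, step=0.10):
    """Classify DEM cells between MLLW and MHHW into 10 cm elevation zones.
    Cells outside the intertidal range are flagged with zone -1."""
    dem = np.asarray(dem, float)
    zones = np.floor((dem - mllw) / step).astype(int)
    zones[(dem < mllw) | (dem > mhhw)] = -1
    return zones

dem = np.array([[-0.5, 0.05],
                [0.12, 1.4]])        # elevations in metres
increment_zones(dem, mllw=0.0, mhhw=1.0)   # -> [[-1, 0], [1, -1]]
```

    Summing surface area per zone label then gives the per-increment totals needed for the percent-change calculation.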

  8. CD-ROM publication of the Mars digital cartographic data base

    NASA Technical Reports Server (NTRS)

    Batson, R. M.; Eliason, E. M.; Soderblom, L. A.; Edwards, Kathleen; Wu, Sherman S. C.

    1991-01-01

    The recently completed Mars mosaicked digital image model (MDIM) and the soon-to-be-completed Mars digital terrain model (DTM) are being transcribed to optical disks to simplify distribution to planetary investigators. These models, completed in FY 1991, provide a cartographic base to which all existing Mars data can be registered. The digital image map of Mars is a cartographic extension of a set of compact disk read-only memory (CD-ROM) volumes containing individual Viking Orbiter images now being released. The data in these volumes are pristine in the sense that they were processed only to the extent required to view them as images. They contain the artifacts and the radiometric, geometric, and photometric characteristics of the raw data transmitted by the spacecraft. This new set of volumes, on the other hand, contains cartographic compilations made by processing the raw images to reduce radiometric and geometric distortions and to form geodetically controlled MDIM's. It also contains digitized versions of an airbrushed map of Mars as well as a listing of all feature names approved by the International Astronomical Union. In addition, special geodetic and photogrammetric processing has been performed to derive rasters of topographic data, or DTM's. The latter have a format similar to that of MDIM, except that elevation values are used in the array instead of image brightness values. The set consists of seven volumes: (1) Vastitas Borealis Region of Mars; (2) Xanthe Terra of Mars; (3) Amazonis Planitia Region of Mars; (4) Elysium Planitia Region of Mars; (5) Arabia Terra of Mars; (6) Planum Australe Region of Mars; and (7) a digital topographic map of Mars.

  9. High-density grids for efficient data collection from multiple crystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto

    Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. As a result, crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures.

  10. High-density grids for efficient data collection from multiple crystals

    DOE PAGES

    Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto; ...

    2015-11-03

    Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. As a result, crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures.

  11. Rapid, semi-automatic fracture and contact mapping for point clouds, images and geophysical data

    NASA Astrophysics Data System (ADS)

    Thiele, Samuel T.; Grose, Lachlan; Samsu, Anindita; Micklethwaite, Steven; Vollgger, Stefan A.; Cruden, Alexander R.

    2017-12-01

    The advent of large digital datasets from unmanned aerial vehicle (UAV) and satellite platforms now challenges our ability to extract information across multiple scales in a timely manner, often meaning that the full value of the data is not realised. Here we adapt a least-cost-path solver and specially tailored cost functions to rapidly interpolate structural features between manually defined control points in point cloud and raster datasets. We implement the method in the geographic information system QGIS and the point cloud and mesh processing software CloudCompare. Using these implementations, the method can be applied to a variety of three-dimensional (3-D) and two-dimensional (2-D) datasets, including high-resolution aerial imagery, digital outcrop models, digital elevation models (DEMs) and geophysical grids. We demonstrate the algorithm with four diverse applications in which we extract (1) joint and contact patterns in high-resolution orthophotographs, (2) fracture patterns in a dense 3-D point cloud, (3) earthquake surface ruptures of the Greendale Fault associated with the Mw7.1 Darfield earthquake (New Zealand) from high-resolution light detection and ranging (lidar) data, and (4) oceanic fracture zones from bathymetric data of the North Atlantic. The approach improves the consistency of the interpretation process while retaining expert guidance and achieves significant improvements (35-65 %) in digitisation time compared to traditional methods. Furthermore, it opens up new possibilities for data synthesis and can quantify the agreement between datasets and an interpretation.
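
    The least-cost-path solver at the heart of the method can be sketched with a plain Dijkstra search over a cost raster; this is a minimal 4-connected illustration (the paper's tailored cost functions and 3-D point-cloud handling are not reproduced here):

```python
import heapq
import numpy as np

def least_cost_path(cost, start, end):
    """Dijkstra least-cost path between two cells of a cost raster (4-connected).
    Stepping into a cell adds that cell's cost; returns the list of (row, col) cells."""
    rows, cols = cost.shape
    dist = np.full((rows, cols), np.inf)
    prev = {}
    dist[start] = cost[start]
    heap = [(cost[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == end:
            break
        if d > dist[r, c]:
            continue                      # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], end
    while node != start:                  # walk predecessors back to the start
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

cost = np.array([[1, 9, 1],
                 [1, 9, 1],
                 [1, 1, 1]])
least_cost_path(cost, (0, 0), (0, 2))   # routes around the high-cost column
```

    With a cost function that is low along image edges or curvature ridges, the same search snaps an interpreted trace to the structural feature between two user-picked control points.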

  12. Application and enhancements of MOVIE.BYU

    NASA Technical Reports Server (NTRS)

    Gates, R. L.; Vonofenheim, W. H.

    1984-01-01

    MOVIE.BYU (MOVIE.BRIGHAM YOUNG UNIVERSITY) is a system of programs for the display and manipulation of data representing mathematical, architectural, and topological models in which the geometry may be described in terms of panel (n-sided polygon) and solid elements or contour lines. The MOVIE.BYU system has been used in a series of applications at LaRC. One application has been the display, creation, and manipulation of finite element models in aeronautic/aerospace research. These models have been displayed on both vector and color raster devices, and the user has the option to modify color and shading parameters on these color raster devices. Another application involves the display of scalar functions (temperature, pressure, etc.) over the surface of a given model. This capability gives the researcher added flexibility in the analysis of the model and its accompanying data. Limited animation (frame-by-frame creation) has been another application of MOVIE.BYU in the modeling of kinematic processes in antenna structures.

  13. Beam position reconstruction for the g2p experiment in Hall A at Jefferson lab

    NASA Astrophysics Data System (ADS)

    Zhu, Pengjia; Allada, Kalyan; Allison, Trent; Badman, Toby; Camsonne, Alexandre; Chen, Jian-ping; Cummings, Melissa; Gu, Chao; Huang, Min; Liu, Jie; Musson, John; Slifer, Karl; Sulkosky, Vincent; Ye, Yunxiu; Zhang, Jixie; Zielinski, Ryan

    2016-02-01

    Beam-line equipment was upgraded for experiment E08-027 (g2p) in Hall A at Jefferson Lab. Two beam position monitors (BPMs) were necessary to measure the beam position and angle at the target. A new BPM receiver was designed and built to handle the low beam currents (50-100 nA) used for this experiment. Two new super-harps were installed for calibrating the BPMs. In addition to the existing fast raster system, a slow raster system was installed. Before and during the experiment, these new devices were tested and debugged, and their performance was also evaluated. In order to achieve the required accuracy (1-2 mm in position and 1-2 mrad in angle at the target location), the data from the BPMs and harps were carefully analyzed, and the beam position and angle were reconstructed event by event at the target location. The calculated beam position will be used in the data analysis to accurately determine the kinematics for each event.

  14. Cartographic services contract...for everything geographic

    USGS Publications Warehouse

    ,

    2003-01-01

    The U.S. Geological Survey's (USGS) Cartographic Services Contract (CSC) is used to award work for photogrammetric and mapping services under the umbrella of Architect-Engineer (A&E) contracting. The A&E contract is broad in scope and can accommodate any activity related to standard, nonstandard, graphic, and digital cartographic products. Services provided may include, but are not limited to, photogrammetric mapping and aerotriangulation; orthophotography; thematic mapping (for example, land characterization); analog and digital imagery applications; geographic information systems development; surveying and control acquisition, including ground-based and airborne Global Positioning System; analog and digital image manipulation, analysis, and interpretation; raster and vector map digitizing; data manipulations (for example, transformations, conversions, generalization, integration, and conflation); primary and ancillary data acquisition (for example, aerial photography, satellite imagery, multispectral, multitemporal, and hyperspectral data); image scanning and processing; metadata production, revision, and creation; and production or revision of standard USGS products defined by formal and informal specification and standards, such as those for digital line graphs, digital elevation models, digital orthophoto quadrangles, and digital raster graphics.

  15. neweng_wpc50_poly

    Science.gov Websites

    development projects. This shapefile was generated from a raster dataset with a 200 m resolution, in a UTM Energy ("DOE"). The user is granted the right, without any fee or cost, to use, copy, modify appears in all copies of the data. Further, the user of this data agrees to credit NREL in any

  16. CT_50m_Wind

    Science.gov Websites

    development projects. This shapefile was generated from a raster dataset with a 200 m resolution, in a UTM Energy ("DOE"). The user is granted the right, without any fee or cost, to use, copy, modify appears in all copies of the data. Further, the user of this data agrees to credit NREL in any

  17. RI_50m_Wind

    Science.gov Websites

    development projects. This shapefile was generated from a raster dataset with a 200 m resolution, in a UTM Energy ("DOE"). The user is granted the right, without any fee or cost, to use, copy, modify appears in all copies of the data. Further, the user of this data agrees to credit NREL in any

  18. pacific_coast_90mwindspeed_off

    Science.gov Websites

    UTM zone 11, datum WGS 84. The shapefile was generated from these raster datasets and then projected of Energy ("DOE"). The user is granted the right, without any fee or cost, to use, copy notice appears in all copies of the data. Further, the user of this data agrees to credit NREL in any

  19. cenam_50mwind

    Science.gov Websites

    . This shapefile was generated from a raster dataset with a 200 m resolution, in a UTM zone 12, datum WGS , LLC for the U.S. Department of Energy ("DOE"). The user is granted the right, without any , provided that this entire notice appears in all copies of the data. Further, the user of this data agrees

  20. NH_50m_Wind

    Science.gov Websites

    development projects. This shapefile was generated from a raster dataset with a 200 m resolution, in a UTM Energy ("DOE"). The user is granted the right, without any fee or cost, to use, copy, modify appears in all copies of the data. Further, the user of this data agrees to credit NREL in any

  1. ga_50m_wind

    Science.gov Websites

    development projects. This shapefile was generated from a raster dataset with a 200 m resolution, in a UTM Energy ("DOE"). The user is granted the right, without any fee or cost, to use, copy, modify appears in all copies of the data. Further, the user of this data agrees to credit NREL in any

  2. MA_50m_Wind

    Science.gov Websites

    development projects. This shapefile was generated from a raster dataset with a 200 m resolution, in a UTM Energy ("DOE"). The user is granted the right, without any fee or cost, to use, copy, modify appears in all copies of the data. Further, the user of this data agrees to credit NREL in any

  3. IA_50m_Wind

    Science.gov Websites

    development projects. This shapefile was generated from a raster dataset with a 200 m resolution, in a UTM Energy ("DOE"). The user is granted the right, without any fee or cost, to use, copy, modify appears in all copies of the data. Further, the user of this data agrees to credit NREL in any

  4. Clearing your Desk! Software and Data Services for Collaborative Web Based GIS Analysis

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Gichamo, T.; Yildirim, A. A.; Liu, Y.

    2015-12-01

    Can your desktop computer crunch the large GIS datasets that are becoming increasingly common across the geosciences? Do you have access to or the know-how to take advantage of advanced high performance computing (HPC) capability? Web based cyberinfrastructure takes work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This talk will describe the HydroShare collaborative environment and web based services being developed to support the sharing and processing of hydrologic data and models. HydroShare supports the upload, storage, and sharing of a broad class of hydrologic data including time series, geographic features and raster datasets, multidimensional space-time data, and other structured collections of data. Web service tools and a Python client library provide researchers with access to HPC resources without requiring them to become HPC experts. This reduces the time and effort spent in finding and organizing the data required to prepare the inputs for hydrologic models and facilitates the management of online data and execution of models on HPC systems. This presentation will illustrate the use of web based data and computation services from both the browser and desktop client software. These web-based services implement the Terrain Analysis Using Digital Elevation Model (TauDEM) tools for watershed delineation, generation of hydrology-based terrain information, and preparation of hydrologic model inputs. They allow users to develop scripts on their desktop computer that call analytical functions that are executed completely in the cloud, on HPC resources using input datasets stored in the cloud, without installing specialized software, learning how to use HPC, or transferring large datasets back to the user's desktop. These cases serve as examples for how this approach can be extended to other models to enhance the use of web and data services in the geosciences.

  5. Geolokit: An interactive tool for visualising and exploring geoscientific data in Google Earth

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Antoine; Watlet, Arnaud; Bastin, Christophe

    2017-10-01

    Virtual globes have been developed to showcase different types of data combining a digital elevation model and basemaps of high resolution satellite imagery. Hence, they became a standard to share spatial data and information, although they suffer from a lack of toolboxes dedicated to the formatting of large geoscientific datasets. From this perspective, we developed Geolokit: a free and lightweight software that allows geoscientists - and every scientist working with spatial data - to import their data (e.g., sample collections, structural geology, cross-sections, field pictures, georeferenced maps), to handle and to transcribe them to Keyhole Markup Language (KML) files. KML files are then automatically opened in the Google Earth virtual globe and the spatial data accessed and shared. Geolokit comes with a large number of dedicated tools that can process and display: (i) multi-points data, (ii) scattered data interpolations, (iii) structural geology features in 2D and 3D, (iv) rose diagrams, stereonets and dip-plunge polar histograms, (v) cross-sections and oriented rasters, (vi) georeferenced field pictures, (vii) georeferenced maps and projected gridding. Therefore, together with Geolokit, Google Earth becomes not only a powerful georeferenced data viewer but also a stand-alone work platform. The toolbox (available online at http://www.geolokit.org) is written in Python, a high-level, cross-platform programming language and is accessible through a graphical user interface, designed to run in parallel with Google Earth, through a workflow that requires no additional third party software. Geolokit features are demonstrated in this paper using typical datasets gathered from two case studies illustrating its applicability at multiple scales of investigation: a petro-structural investigation of the Ile d'Yeu orthogneissic unit (Western France) and data collection of the Mariana oceanic subduction zone (Western Pacific).
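
    The KML target format mentioned above is ordinary XML, so the transcription step can be illustrated with the Python standard library alone (this is a generic sketch of writing placemarks, not Geolokit's actual code; the sample name and coordinates are invented):

```python
import xml.etree.ElementTree as ET

def points_to_kml(points):
    """Serialize (name, lon, lat) samples to a minimal KML document,
    the interchange format virtual globes such as Google Earth open directly."""
    ns = "http://www.opengis.net/kml/2.2"
    kml = ET.Element("{%s}kml" % ns)
    doc = ET.SubElement(kml, "{%s}Document" % ns)
    for name, lon, lat in points:
        pm = ET.SubElement(doc, "{%s}Placemark" % ns)
        ET.SubElement(pm, "{%s}name" % ns).text = name
        pt = ET.SubElement(pm, "{%s}Point" % ns)
        # KML coordinates are lon,lat,alt
        ET.SubElement(pt, "{%s}coordinates" % ns).text = f"{lon},{lat},0"
    return ET.tostring(kml, encoding="unicode")

kml_text = points_to_kml([("sample-01", -2.35, 46.70)])
```

    Saving the returned string with a .kml extension is enough for Google Earth to open and display the placemarks.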

  6. Imaging MS Methodology for More Chemical Information in Less Data Acquisition Time Utilizing a Hybrid Linear Ion Trap-Orbitrap Mass Spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perdian, D. C.; Lee, Young Jin

    2010-11-15

    A novel mass spectrometric imaging method is developed to reduce the data acquisition time and provide rich chemical information using a hybrid linear ion trap-orbitrap mass spectrometer. In this method, the linear ion trap and orbitrap are used in tandem to reduce the acquisition time by incorporating multiple linear ion trap scans during an orbitrap scan utilizing a spiral raster step plate movement. The data acquisition time was decreased by 43-49% in the current experiment compared to that of orbitrap-only scans; however, 75% or more time could be saved for higher mass resolution and with a higher repetition rate laser. Using this approach, a high spatial resolution of 10 µm was maintained at ion trap imaging, while orbitrap spectra were acquired at a lower spatial resolution, 20-40 µm, all with far less data acquisition time. Furthermore, various MS imaging methods were developed by interspersing MS/MS and MSn ion trap scans during orbitrap scans to provide more analytical information on the sample. This method was applied to differentiate and localize structural isomers of several flavonol glycosides from an Arabidopsis flower petal in which MS/MS, MSn, ion trap, and orbitrap images were all acquired in a single data acquisition.

  7. Automatic detection of zebra crossings from mobile LiDAR data

    NASA Astrophysics Data System (ADS)

    Riveiro, B.; González-Jorge, H.; Martínez-Sánchez, J.; Díaz-Vilariño, L.; Arias, P.

    2015-07-01

    An algorithm for the automatic detection of zebra crossings from mobile LiDAR data is developed and tested to be applied for road management purposes. The algorithm consists of several subsequent processes starting with road segmentation by performing a curvature analysis for each laser cycle. Then, intensity images are created from the point cloud using rasterization techniques, in order to detect zebra crossing using the Standard Hough Transform and logical constrains. To optimize the results, image processing algorithms are applied to the intensity images from the point cloud. These algorithms include binarization to separate the painting area from the rest of the pavement, median filtering to avoid noisy points, and mathematical morphology to fill the gaps between the pixels in the border of white marks. Once the road marking is detected, its position is calculated. This information is valuable for inventorying purposes of road managers that use Geographic Information Systems. The performance of the algorithm has been evaluated over several mobile LiDAR strips accounting for a total of 30 zebra crossings. That test showed a completeness of 83%. Non-detected marks mainly come from painting deterioration of the zebra crossing or by occlusions in the point cloud produced by other vehicles on the road.
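
    Two of the image-processing steps named above, binarization to isolate bright road paint and median filtering to suppress noisy pixels, can be sketched with NumPy alone (a hedged, simplified illustration; the paper's pipeline also includes rasterization, the Standard Hough Transform and morphological gap filling, which are omitted here):

```python
import numpy as np

def binarize(intensity, threshold):
    """Separate bright road paint from the darker pavement in an intensity image."""
    return (np.asarray(intensity) >= threshold).astype(np.uint8)

def median3x3(img):
    """3x3 median filter to suppress isolated noisy pixels (border kept as-is)."""
    out = img.copy()
    for r in range(1, img.shape[0] - 1):
        for c in range(1, img.shape[1] - 1):
            out[r, c] = np.median(img[r - 1:r + 2, c - 1:c + 2])
    return out

img = np.array([[0, 0, 0, 0],
                [0, 9, 0, 0],    # a single bright speckle
                [0, 0, 0, 0],
                [0, 0, 0, 0]])
cleaned = median3x3(binarize(img, 5))   # the isolated speckle is removed
```

    On the cleaned binary image, a line detector such as the Hough transform can then pick out the parallel stripes of a zebra crossing.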

  8. Mapping Diffusion in a Living Cell via the Phasor Approach

    PubMed Central

    Ranjit, Suman; Lanzano, Luca; Gratton, Enrico

    2014-01-01

    Diffusion of a fluorescent protein within a cell has been measured using either fluctuation-based techniques (fluorescence correlation spectroscopy (FCS) or raster-scan image correlation spectroscopy) or particle tracking. However, none of these methods enables us to measure the diffusion of the fluorescent particle at each pixel of the image. Measurement using conventional single-point FCS at every individual pixel results in continuous long exposure of the cell to the laser and eventual bleaching of the sample. To overcome this limitation, we have developed what we believe to be a new method of scanning with simultaneous construction of a fluorescent image of the cell. In this modified raster scanning method, the laser scans each individual line multiple times before moving to the next line as it acquires the image. This continues until the entire area is scanned. This is different from the original raster-scan image correlation spectroscopy approach, where data are acquired by scanning each frame once and then scanning the image multiple times. The total time of data acquisition needed for this method is much shorter than the time required for traditional FCS analysis at each pixel. However, at a single pixel, the acquired intensity time sequence is short, requiring nonconventional analysis of the correlation function to extract information about the diffusion. These correlation data have been analyzed using the phasor approach, a fit-free method that was originally developed for analysis of FLIM images. Analysis using this method results in an estimation of the average diffusion coefficient of the fluorescent species at each pixel of an image, and thus, a detailed diffusion map of the cell can be created. PMID:25517145
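
    The per-pixel correlation idea underlying the method can be illustrated in NumPy: compute a temporal autocorrelation at every pixel of a (time, rows, cols) intensity stack, giving one value per pixel as a crude diffusion map (a generic lag-1 sketch, not the paper's phasor analysis; slowly diffusing species decorrelate slowly, so they show high lag-1 correlation):

```python
import numpy as np

def acf_lag1(stack):
    """Lag-1 temporal autocorrelation at every pixel of a (time, rows, cols) stack.
    High values indicate slowly varying (slowly decorrelating) intensity."""
    x = stack - stack.mean(axis=0)          # remove the per-pixel mean
    num = (x[:-1] * x[1:]).sum(axis=0)      # covariance between consecutive frames
    den = (x * x).sum(axis=0)               # per-pixel variance (times n)
    return num / den

rng = np.random.default_rng(0)
noise = rng.normal(size=(200, 4, 4))        # uncorrelated in time
smooth = np.cumsum(noise, axis=0)           # strongly correlated in time
acf_lag1(smooth).mean() > acf_lag1(noise).mean()
```

    The paper's phasor approach goes further by mapping the full correlation curve of each pixel to a point in phasor space, avoiding any model fit, but the per-pixel map structure is the same.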

  9. Fast Risk Assessment Software For Natural Hazard Phenomena Using Georeference Population And Infrastructure Data Bases

    NASA Astrophysics Data System (ADS)

    Marrero, J. M.; Pastor Paz, J. E.; Erazo, C.; Marrero, M.; Aguilar, J.; Yepes, H. A.; Estrella, C. M.; Mothes, P. A.

    2015-12-01

    Disaster Risk Reduction (DRR) requires an integrated multi-hazard assessment approach towards natural hazard mitigation. In the case of volcanic risk, long term hazard maps are generally developed on a basis of the most probable scenarios (likelihood of occurrence) or worst cases. However, in the short-term, expected scenarios may vary substantially depending on the monitoring data or new knowledge. In this context, the time required to obtain and process data is critical for optimum decision making. Availability of up-to-date volcanic scenarios is as crucial as it is to have this data accompanied by efficient estimations of their impact among populations and infrastructure. To address this impact estimation during volcanic crises, or other natural hazards, a web interface has been developed to execute an ANSI C application. This application allows one to compute - in a matter of seconds - the demographic and infrastructure impact that any natural hazard may cause employing an overlay-layer approach. The web interface is tailored to users involved in the volcanic crises management of Cotopaxi volcano (Ecuador). The population data base and the cartographic basis used are of public domain, published by the National Office of Statistics of Ecuador (INEC, by its Spanish acronym). To run the application and obtain results the user is expected to upload a raster file containing information related to the volcanic hazard or any other natural hazard, and determine categories to group population or infrastructure potentially affected. The results are displayed in a user-friendly report.
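
    The overlay-layer computation described above reduces to masking one raster with another; this is a hedged NumPy illustration of the idea (the function name, threshold semantics, and toy values are invented, and the actual application is written in ANSI C against INEC census data):

```python
import numpy as np

def exposed_population(hazard, population, threshold):
    """Overlay approach: total population in cells where the hazard raster
    meets or exceeds a chosen intensity threshold."""
    mask = np.asarray(hazard) >= threshold
    return float(np.asarray(population)[mask].sum())

hazard = np.array([[0.2, 0.9],
                   [0.7, 0.1]])    # e.g. ashfall load per cell
pop    = np.array([[100,  50],
                   [ 30,  20]])    # inhabitants per cell
exposed_population(hazard, pop, 0.5)   # -> 80.0
```

    Because the whole operation is a vectorized mask-and-sum over co-registered grids, it runs in seconds even for large rasters, which is the responsiveness the web interface aims for during a crisis.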

  10. System Integration Issues in Digital Photogrammetric Mapping

    DTIC Science & Technology

    1992-01-01

    elevation models, and/or rectified imagery/ orthophotos . Imagery exported from the DSPW can be either in a tiled image format or standard raster format...data. In the near future, correlation using "window shaping" operations along with an iterative orthophoto refinements methodology (Norvelle, 1992) is...components of TIES. The IDS passes tiled image data and ASCII header data to the DSPW. The tiled image file contains only image data. The ASCII header

  11. NASA's Global Imagery Browse Services - Technologies for Visualizing Earth Science Data

    NASA Astrophysics Data System (ADS)

    Cechini, M. F.; Boller, R. A.; Baynes, K.; Schmaltz, J. E.; Thompson, C. K.; Roberts, J. T.; Rodriguez, J.; Wong, M. M.; King, B. A.; King, J.; De Luca, A. P.; Pressley, N. N.

    2017-12-01

    For more than 20 years, the NASA Earth Observing System (EOS) has collected earth science data for thousands of scientific parameters now totaling nearly 15 Petabytes of data. In 2013, NASA's Global Imagery Browse Services (GIBS) formed its vision to "transform how end users interact and discover [EOS] data through visualizations." This vision included leveraging scientific and community best practices and standards to provide a scalable, compliant, and authoritative source for EOS earth science data visualizations. Since that time, GIBS has grown quickly and now services millions of daily requests for over 500 imagery layers representing hundreds of earth science parameters to a broad community of users. For many of these parameters, visualizations are available within hours of acquisition from the satellite. For others, visualizations are available for the entire mission of the satellite. The GIBS system is built upon the OnEarth and MRF open source software projects, which are provided by the GIBS team. This software facilitates standards-based access for compliance with existing GIS tools. The GIBS imagery layers are predominantly rasterized images represented in two-dimensional coordinate systems, though multiple projections are supported. The OnEarth software also supports the GIBS ingest pipeline to facilitate low-latency updates to new or updated visualizations. This presentation will focus on the following topics: an overview of GIBS visualizations and the user community; current benefits and limitations of the OnEarth and MRF software projects and related standards; GIBS access methods and their (in)compatibilities with existing GIS libraries and applications; considerations for visualization accuracy and understandability; future plans for more advanced visualization concepts, including vertical profiles and vector-based representations; and future plans for Amazon Web Services support and deployments.
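    The standards-based access mentioned above includes a WMTS RESTful tile interface. As a hedged sketch, the snippet below composes such a tile URL; the endpoint pattern follows GIBS's public documentation at the time of writing, while the layer name, date, and tile indices are examples that should be checked against the current GIBS capabilities document.

```python
# Template for GIBS's WMTS RESTful tile endpoint (epsg4326 "best" set).
GIBS_WMTS = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
             "{layer}/default/{date}/{tms}/{z}/{row}/{col}.{ext}")

def gibs_tile_url(layer, date, tms, z, row, col, ext="jpg"):
    """Build a tile URL for a GIBS layer on a given date."""
    return GIBS_WMTS.format(layer=layer, date=date, tms=tms,
                            z=z, row=row, col=col, ext=ext)

# Example layer/tile-matrix-set names; verify against GIBS capabilities.
url = gibs_tile_url("MODIS_Terra_CorrectedReflectance_TrueColor",
                    "2017-12-01", "250m", 2, 1, 3)
print(url)
```

    Because the interface is plain WMTS, the same URL pattern can be consumed directly by standard GIS clients without GIBS-specific code.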

  12. NeuroMatic: An Integrated Open-Source Software Toolkit for Acquisition, Analysis and Simulation of Electrophysiological Data

    PubMed Central

    Rothman, Jason S.; Silver, R. Angus

    2018-01-01

    Acquisition, analysis and simulation of electrophysiological properties of the nervous system require multiple software packages. This makes it difficult to conserve experimental metadata and track the analysis performed. It also complicates certain experimental approaches such as online analysis. To address this, we developed NeuroMatic, an open-source software toolkit that performs data acquisition (episodic, continuous and triggered recordings), data analysis (spike rasters, spontaneous event detection, curve fitting, stationarity) and simulations (stochastic synaptic transmission, synaptic short-term plasticity, integrate-and-fire and Hodgkin-Huxley-like single-compartment models). The merging of a wide range of tools into a single package facilitates a more integrated style of research, from the development of online analysis functions during data acquisition, to the simulation of synaptic conductance trains during dynamic-clamp experiments. Moreover, NeuroMatic has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions, a history window for reviewing the user's workflow and the ability to produce publication-quality graphics. Since its original release, NeuroMatic has been used in a wide range of scientific studies and its user base has grown considerably. NeuroMatic version 3.0 can be found at http://www.neuromatic.thinkrandom.com and https://github.com/SilverLabUCL/NeuroMatic. PMID:29670519
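    One of the analyses listed above is the spike raster. As an illustrative sketch (not NeuroMatic's Igor Pro implementation), the code below bins spike times from repeated trials into a trials-by-time-bins matrix; the bin width and toy spike times are arbitrary choices.

```python
import numpy as np

def spike_raster(trials, n_bins, bin_width):
    """Bin per-trial spike times (seconds) into a (trials x bins) matrix."""
    raster = np.zeros((len(trials), n_bins), dtype=int)
    for i, spike_times in enumerate(trials):
        idx = (np.asarray(spike_times) // bin_width).astype(int)
        np.add.at(raster, (i, idx), 1)  # handles repeats within one bin
    return raster

# three toy trials of spike times within a 100 ms window
trials = [[0.012, 0.044, 0.047], [0.031], [0.013, 0.092]]
raster = spike_raster(trials, n_bins=10, bin_width=0.01)
print(raster.shape)  # (3, 10)
```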

  13. Correction of Line Interleaving Displacement in Frame Captured Aerial Video Imagery

    Treesearch

    B. Cooke; A. Saucier

    1995-01-01

    Scientists with the USDA Forest Service are currently assessing the usefulness of aerial video imagery for various purposes including midcycle inventory updates. The potential of video image data for these purposes may be compromised by scan line interleaving displacement problems. Interleaving displacement problems cause features in video raster datasets to have...

  14. Multiphoton minimal inertia scanning for fast acquisition of neural activity signals

    NASA Astrophysics Data System (ADS)

    Schuck, Renaud; Go, Mary Ann; Garasto, Stefania; Reynolds, Stephanie; Dragotti, Pier Luigi; Schultz, Simon R.

    2018-04-01

    Objective. Multi-photon laser scanning microscopy provides a powerful tool for monitoring the spatiotemporal dynamics of neural circuit activity. It is, however, intrinsically a point scanning technique. Standard raster scanning enables imaging at subcellular resolution; however, acquisition rates are limited by the size of the field of view to be scanned. Scanning strategies such as travelling salesman scanning (TSS) have recently been developed to maximize the cellular sampling rate by scanning only select regions in the field of view corresponding to locations of interest such as somata. However, such strategies are not optimized for the mechanical properties of galvanometric scanners. We thus aimed to develop a new scanning algorithm that produces minimal-inertia trajectories, and to compare its performance with existing scanning algorithms. Approach. We describe here the adaptive spiral scanning (SSA) algorithm, which fits a set of near-circular trajectories to the cellular distribution to avoid inertial drifts of galvanometer position. We compare its performance to raster scanning and TSS in terms of cellular sampling frequency and signal-to-noise ratio (SNR). Main Results. Using surrogate neuron spatial position data, we show that SSA acquisition rates are an order of magnitude higher than those for raster scanning and generally exceed those achieved by TSS for neural densities comparable with those found in the cortex. We show that this result also holds true for in vitro hippocampal mouse brain slices bath-loaded with the synthetic calcium dye Cal-520 AM. The ability of TSS to 'park' the laser on each neuron along the scanning trajectory, however, enables higher SNR than SSA when all targets are precisely scanned. Raster scanning has the highest SNR but at a substantial cost in the number of cells scanned. To understand the impact of sampling rate and SNR on functional calcium imaging, we used the Cramér-Rao Bound on evoked calcium traces recorded
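    The core geometric idea of spiral scanning can be illustrated without the authors' actual SSA implementation: build a smooth, near-circular spiral whose radius sweeps the radial positions of the targets, so that consecutive scan samples stay close together and the galvanometers avoid the sharp turnarounds of a raster scan. Everything below (sampling density, target positions) is a toy assumption.

```python
import numpy as np

def spiral_trajectory(cell_xy, samples_per_turn=100):
    """Return x, y samples of a spiral covering the cells' radial range."""
    radii = np.sort(np.hypot(cell_xy[:, 0], cell_xy[:, 1]))
    n_turns = len(radii)
    theta = np.linspace(0, 2 * np.pi * n_turns, samples_per_turn * n_turns)
    # radius grows smoothly from innermost to outermost target
    r = np.interp(theta, np.linspace(0, theta[-1], n_turns), radii)
    return r * np.cos(theta), r * np.sin(theta)

rng = np.random.default_rng(0)
cells = rng.uniform(-100, 100, size=(20, 2))  # surrogate neuron positions
x, y = spiral_trajectory(cells)
# consecutive samples stay close together -> low mechanical inertia
step = np.hypot(np.diff(x), np.diff(y)).max()
print(step)
```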

  15. Compressing interpreted satellite imagery for geographic information systems applications over extensive regions

    USGS Publications Warehouse

    Miller, Stephan W.

    1981-01-01

    A second set of related problems deals with how this format and other representations of spatial entities, such as vector formats for point and line features, can be interrelated for manipulation, retrieval, and analysis by a spatial database management subsystem. Methods have been developed for interrelating areal data sets in the raster format with point and line data in a vector format and these are described.

  16. Combined use of SAR and optical data for environmental assessments around refugee camps in semiarid landscapes

    NASA Astrophysics Data System (ADS)

    Braun, A.; Hochschild, V.

    2015-04-01

    Over 15 million people were officially considered refugees in 2012, and another 28 million were internally displaced people (IDPs). Natural disasters, climatic and environmental changes, violent regional conflicts and population growth force people to migrate in all parts of the world. This trend is likely to continue in the near future, as political instabilities increase and land degradation progresses. EO4HumEn aims at developing operational services to support humanitarian operations during crisis situations by means of dedicated geo-spatial information products derived from Earth observation and GIS data. The goal is to develop robust, automated image analysis routines for population estimation, identification of potential groundwater extraction sites and monitoring of the environmental impact of refugee/IDP camps. This study investigates the combination of satellite SAR data with optical sensors and elevation information for the assessment of the environmental conditions around refugee camps. In order to estimate their impact on land degradation, land cover classifications are required which target dynamic landscapes. We performed a land use / land cover classification based on a random forest algorithm and 39 input prediction rasters derived from Landsat 8 data and additional layers generated from radar texture and elevation information. The overall accuracy was 92.9%, with optical data having the highest impact on the final classification. By analysing all combinations of the three input datasets we additionally estimated their impact on individual classification outcomes and land cover classes.
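    The stacked-predictor workflow behind such classifications can be sketched generically: the predictor rasters are flattened to an (n_pixels, n_features) matrix, a classifier is fit on labelled pixels, and predictions are reshaped back to the map grid. A nearest-centroid rule stands in here for the random forest used in the study, and the tiny "rasters" and labels are invented for illustration.

```python
import numpy as np

def stack_rasters(layers):
    """(n_layers, H, W) -> (H*W, n_layers) feature matrix."""
    n, h, w = layers.shape
    return layers.reshape(n, h * w).T

def fit_centroids(X, y):
    """Per-class mean feature vectors (stand-in for a real classifier)."""
    classes = np.unique(y)
    return classes, np.array([X[y == c].mean(axis=0) for c in classes])

def predict(X, classes, centroids):
    """Assign each pixel to the nearest class centroid."""
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[d.argmin(axis=1)]

# two tiny predictor "rasters" (e.g. an optical band and a radar texture)
optical = np.array([[0.1, 0.1, 0.9], [0.1, 0.9, 0.9]])
texture = np.array([[0.2, 0.1, 0.8], [0.2, 0.9, 0.7]])
X = stack_rasters(np.stack([optical, texture]))
y_train = np.array([0, 0, 1, 0, 1, 1])  # a label for every pixel
classes, centroids = fit_centroids(X, y_train)
land_cover = predict(X, classes, centroids).reshape(optical.shape)
print(land_cover)
```

    Swapping the centroid rule for `sklearn.ensemble.RandomForestClassifier` reproduces the study's approach without changing the stacking/reshaping scaffolding.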

  17. Geospatial Data Integration for Assessing Landslide Hazard on Engineered Slopes

    NASA Astrophysics Data System (ADS)

    Miller, P. E.; Mills, J. P.; Barr, S. L.; Birkinshaw, S. J.

    2012-07-01

    Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur, through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator's hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator's existing field-based approaches.
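    The regression step described above can be sketched with synthetic numbers, using ordinary least squares as a stand-in for whatever functional form the study fitted: relate the factor of safety to slope variables such as slope angle and vegetation cover. All coefficients and noise levels below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
slope_angle = rng.uniform(10, 45, 50)   # degrees
veg_cover = rng.uniform(0, 1, 50)       # fraction of cover
# synthetic "true" relation plus measurement noise
fos = 2.5 - 0.03 * slope_angle + 0.5 * veg_cover + rng.normal(0, 0.05, 50)

# ordinary least squares: intercept, angle and cover coefficients
X = np.column_stack([np.ones(50), slope_angle, veg_cover])
coef, *_ = np.linalg.lstsq(X, fos, rcond=None)
print(coef)  # approximately [2.5, -0.03, 0.5]
```

    The fitted functional model can then be evaluated along re-sampled cross-sections to map hazard, as the abstract describes.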

  18. Recent Structural Change in Remote Sensing Data Time Series Linked to Farm Management in Horn of Africa (1999-2009)

    NASA Astrophysics Data System (ADS)

    Crisci, A.; Vignaroli, P.; Genesio, L.; Grasso, V.; Bacci, M.; Tarchiani, V.; Capecchi, V.

    2011-01-01

    Food security in the East Africa region essentially depends on the stability of rain-fed crop farming, which renders its society vulnerable to climatic fluctuations. In Africa, these fluctuations are most widely and directly related to rainfall. In this study, the relation between recent spatial rainfall variability and vegetation dynamics has been investigated for East African territories. Satellite raster products, SPOT-4 Vegetation at 1 km resolution (Saint, 1995) and RFE (rainfall estimates) from the Famine Early Warning Systems Network (FEWS NET), are used. The survey is carried out at the administrative-unit scale using 10-day summaries extracted from the raster data for each spatial unit by means of specific polygonal layers. The time series cover two different periods: 1996-2009 for rainfall estimates and 1999-2009 for NDVI. The first step of the analysis was to build, for each administrative unit, a coherent set of data along the time series suitable for processing with state-of-the-art statistical tools. The analysis is based on the assumption that every structural break in vegetation dynamics could be caused by two alternative/complementary causes, namely: (i) modifications in crop farming systems (adaptation strategy) related to a break-shift in the rainfall regime, and/or (ii) other socio-economic factors. The BFAST R package (Verbesselt et al., 2010) is employed to conduct a comprehensive breakpoint analysis on the 10-day RFE series (spatial mean and standard deviation) and the 10-day NDVI series (spatial mean, mode and standard deviation). The cross-viewing of the years in which significant breaks occurred, through appropriate GIS layering, provides an explorative interpretation of spatial climate/vegetation dynamics in the whole area. Moreover, the spatial and temporal pattern of ecosystem dynamics in response to climatic variability has been investigated using wavelet coherency from the SOWAS R package (Maraun, 2007).
The wavelet coherency (WCOH) is a normalized time and scale resolved measure for
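    BFAST decomposes a series into trend, seasonal and remainder components before detecting breaks; as a vastly simplified illustration of the underlying idea, the sketch below finds the single break point that best splits a series into two constant-mean segments (minimum total squared error). The synthetic NDVI-like values are invented.

```python
import numpy as np

def best_break(series):
    """Index at which a new constant-mean regime starts (1..n-1)."""
    series = np.asarray(series, dtype=float)
    costs = []
    for k in range(1, len(series)):
        left, right = series[:k], series[k:]
        costs.append(((left - left.mean()) ** 2).sum()
                     + ((right - right.mean()) ** 2).sum())
    return int(np.argmin(costs)) + 1

# synthetic 10-day NDVI-like series with a regime shift at index 6
ndvi = [0.31, 0.30, 0.32, 0.29, 0.31, 0.30, 0.55, 0.54, 0.56, 0.53]
print(best_break(ndvi))  # -> 6
```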

  19. Improved analyses using function datasets and statistical modeling

    Treesearch

    John S. Hogland; Nathaniel M. Anderson

    2014-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...

  20. Data-Driven Synthesis for Investigating Food Systems Resilience to Climate Change

    NASA Astrophysics Data System (ADS)

    Magliocca, N. R.; Hart, D.; Hondula, K. L.; Munoz, I.; Shelley, M.; Smorul, M.

    2014-12-01

    The production, supply, and distribution of our food involves a complex set of interactions between farmers, rural communities, governments, and global commodity markets that link important issues such as environmental quality, agricultural science and technology, health and nutrition, rural livelihoods, and social institutions and equality - all of which will be affected by climate change. The production of actionable science is thus urgently needed to inform and prepare the public for the consequences of climate change for local and global food systems. Access to data that spans multiple sectors/domains and spatial and temporal scales is key to beginning to tackle such complex issues. As part of the White House's Climate Data Initiative, the USDA and the National Socio-Environmental Synthesis Center (SESYNC) are launching a new collaboration to catalyze data-driven research to enhance food systems resilience to climate change. To support this collaboration, SESYNC is developing a new "Data to Motivate Synthesis" program designed to engage early career scholars in a highly interactive and dynamic process of real-time data discovery, analysis, and visualization to catalyze new research questions and analyses that would not have otherwise been possible and/or apparent. This program will be supported by an integrated, spatially-enabled cyberinfrastructure that enables the management, intersection, and analysis of large heterogeneous datasets relevant to food systems resilience to climate change. Our approach is to create a series of geospatial abstraction data structures and visualization services that can be used to accelerate analysis and visualization across various socio-economic and environmental datasets (e.g., reconcile census data with remote sensing raster datasets). We describe the application of this approach with a pilot workshop of socio-environmental scholars that will lay the groundwork for the larger SESYNC-USDA collaboration. We discuss the

  1. Near-census Delineation of Laterally Organized Geomorphic Zones and Associated Sub-width Fluvial Landforms

    NASA Astrophysics Data System (ADS)

    Pasternack, G. B.; Hopkins, C.

    2017-12-01

    A river channel and its associated riparian corridor exhibit a pattern of nested, geomorphically imprinted, lateral inundation zones (IZs). Each zone plays a key role in fluvial geomorphic processes and ecological functions. Within each zone, distinct landforms (aka geomorphic or morphological units, MUs) reside at the 0.1-10 channel width scale. These features are basic units linking river corridor morphology with local ecosystem services. Objective, automated delineation of nested inundation zones and morphological units remains a significant scientific challenge. This study describes and demonstrates new, objective methods for solving this problem, using the 35-km alluvial lower Yuba River as a testbed. A detrended, high-resolution digital elevation model constructed from near-census topographic and bathymetric data was produced and used in a hypsograph analysis, a commonly used method in oceanographic studies capable of identifying slope breaks at IZ transitions. Geomorphic interpretation mindful of the river's setting was required to properly describe each IZ identified by the hypsograph analysis. Then, a 2D hydrodynamic model was used to determine what flow yields the wetted area that most closely matches each IZ domain. The model also provided meter-scale rasters of depth and velocity useful for MU mapping. Even though MUs are discharge-independent landforms, they can be revealed by analyzing their overlying hydraulics at low flows. Baseflow depth and velocity rasters are used along with a hydraulic landform classification system to quantitatively delineate in-channel bed MU types. In-channel bar and off-channel flood and valley MUs are delineated using a combination of hydraulic and geomorphic indicators, such as depth and velocity rasters for different discharges, topographic contours, NAIP imagery, and a raster of vegetation. 
The ability to objectively delineate inundation zones and morphological units in tandem allows for better informed river management
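    The hypsograph analysis mentioned above can be sketched as follows: sort the detrended elevations, relate cumulative area fraction to elevation, and look for slope breaks, here via the largest change in the local gradient. The synthetic DEM and the break-detection rule are illustrative assumptions, not the study's exact procedure.

```python
import numpy as np

def hypsograph(dem):
    """Sorted elevations and the cumulative area fraction below each."""
    z = np.sort(np.asarray(dem, dtype=float).ravel())
    area_fraction = np.arange(1, z.size + 1) / z.size
    return z, area_fraction

def largest_slope_break(z, area_fraction):
    """Index of the sharpest change in d(elevation)/d(area)."""
    dz = np.gradient(z, area_fraction)
    return int(np.abs(np.diff(dz)).argmax())

# synthetic DEM: a flat in-channel zone, then a steep bank
dem = np.concatenate([np.full(50, 1.0) + np.linspace(0, 0.5, 50),
                      np.linspace(2.0, 10.0, 50)])
z, a = hypsograph(dem)
idx = largest_slope_break(z, a)
print(z[idx])  # elevation near the zone transition
```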

  2. Interactive Digital Image Manipulation System (IDIMS)

    NASA Technical Reports Server (NTRS)

    Fleming, M. D.

    1981-01-01

    The implementation of an interactive digital image manipulation system (IDIMS) is described. The system is run on an HP-3000 Series 3 minicomputer. The IDIMS system provides a complete image geoprocessing capability for raster formatted data in a self-contained system. It is easily installed, documentation is provided, and vendor support is available.

  3. Using the global positioning system to map disturbance patterns of forest harvesting machinery

    Treesearch

    T.P. McDonald; E.A. Carter; S.E. Taylor

    2002-01-01

    Abstract: A method was presented to transform sampled machine positional data obtained from a global positioning system (GPS) receiver into a two-dimensional raster map of number of passes as a function of location. The effect of three sources of error in the transformation process were investigated: path sampling rate (receiver sampling frequency);...
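    The transformation summarised above, GPS fixes into a raster of pass counts, can be sketched by binning point coordinates into a grid. Coordinate values, cell size and extent below are illustrative.

```python
import numpy as np

def passes_raster(xy, cell_size, extent):
    """Bin GPS fixes into a pass-count grid; extent = (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = extent
    nx = int(np.ceil((xmax - xmin) / cell_size))
    ny = int(np.ceil((ymax - ymin) / cell_size))
    counts, _, _ = np.histogram2d(
        xy[:, 0], xy[:, 1],
        bins=[nx, ny], range=[[xmin, xmax], [ymin, ymax]])
    return counts.T  # rows = y, columns = x

# toy fixes: three passes near (1, 1), one near (5, 5)
fixes = np.array([[1.0, 1.0], [1.2, 1.1], [5.0, 5.0], [1.1, 0.9]])
grid = passes_raster(fixes, cell_size=2.0, extent=(0, 0, 6, 6))
print(grid)
```

    In the study, cell size interacts with receiver sampling frequency and positional error; the cited error sources would show up here as fixes landing in neighbouring cells.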

  4. Raster graphic helmet-mounted display study

    NASA Technical Reports Server (NTRS)

    Beamon, William S.; Moran, Susanna I.

    1990-01-01

    A design of a helmet mounted display system is presented, including a design specification and development plan for the selected design approach. The requirements for the helmet mounted display system and a survey of applicable technologies are presented. Three helmet display concepts are then described which utilize lasers, liquid crystal display's (LCD's), and subminiature cathode ray tubes (CRT's), respectively. The laser approach is further developed in a design specification and a development plan.

  5. "Science SQL" as a Building Block for Flexible, Standards-based Data Infrastructures

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2016-04-01

    We have learnt to live with the pain of separating data and metadata into non-interoperable silos. For metadata, we enjoy the flexibility of databases, be they relational, graph, or some other NoSQL. Contrasting this, users still "drown in files" as an unstructured, low-level archiving paradigm. It is time to bridge this chasm, which once was technologically induced but today can be overcome. One building block towards a common re-integrated information space is support for massive multi-dimensional spatio-temporal arrays. These "datacubes" appear as sensor, image, simulation, and statistics data in all science and engineering domains, and beyond. For example, 2-D satellite imagery, 3-D x/y/t image timeseries and x/y/z geophysical voxel data, and 4-D x/y/z/t climate data contribute to today's data deluge in the Earth sciences. Virtual observatories in the Space sciences routinely generate Petabytes of such data. Life sciences deal with microarray data, confocal microscopy, and human brain data, which all fall into the same category. The ISO SQL/MDA (Multi-Dimensional Arrays) candidate standard extends SQL with modelling and query support for n-D arrays ("datacubes") in a flexible, domain-neutral way. This heralds a new generation of services with new quality parameters, such as flexibility, ease of access, embedding into well-known user tools, and scalability mechanisms that remain completely transparent to users. Technology like the EU rasdaman ("raster data manager") Array Database system can support all of the above examples simultaneously, with one technology. This is practically proven: as of today, rasdaman is in operational use on hundreds of Terabytes of satellite image timeseries datacubes, with transparent query distribution across more than 1,000 nodes. Therefore, Array Databases offering SQL/MDA constitute a natural common building block for next-generation data infrastructures. Being initiator and editor of the standard, we present principles
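    The kind of operation an SQL/MDA datacube query expresses declaratively, a spatial subset plus a temporal aggregate, can be mimicked as a rough analogy with in-memory arrays (numpy standing in for an Array Database; the x/y/z/t axis order and cube shape are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
cube = rng.random((10, 10, 5, 12))    # toy 4-D datacube: x, y, z, t

subset = cube[2:6, 2:6, 0, :]         # spatial subset at the surface level
temporal_mean = subset.mean(axis=-1)  # average over the 12 time steps
print(temporal_mean.shape)            # -> (4, 4)
```

    The difference in an Array Database is that the subsetting and aggregation are pushed to the server and distributed across nodes, so the Petabyte-scale cubes described above never have to be materialised client-side.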

  6. Search for evidence of low energy protons in solar flares

    NASA Technical Reports Server (NTRS)

    Metcalf, Thomas R.; Wuelser, Jean-Pierre; Canfield, Richard C.; Hudson, Hugh S.

    1992-01-01

    We searched for linear polarization in the H alpha line using the Stokes Polarimeter at Mees Solar Observatory and present observations of a flare from NOAA active region 6659 which began at 01:30 UT on 14 Jun. 1991. Our dataset also includes H alpha spectra from the Mees charge coupled device (MCCD) imaging spectrograph as well as hard X-ray observations from the Burst and Transient Source Experiment (BATSE) instrument on board the Gamma Ray Observatory (GRO). The polarimeter scanned a 40 x 40 arcsec field of view using 16 raster points in a 4 x 4 grid. Each scan took about 30 seconds, with 2 seconds at each raster point. The polarimeter stepped 8.5 arcsec between raster points, and each point covered a 6 arcsec region. This sparse sampling increased the total field of view without reducing the temporal cadence. At each raster point, an H alpha spectrum with 20 mA spectral sampling is obtained covering 2.6 A centered on H alpha line center. The preliminary conclusions from the research are presented.

  7. VizieR Online Data Catalog: SOFI and ISOCAM observations of Cha II (Persi+, 2003)

    NASA Astrophysics Data System (ADS)

    Persi, P.; Marenzi, A. R.; Gomez, M.; Olofsson, G.

    2003-01-01

    A region of approximately 28'x26' of Cha II, centered at RA = 13h 00min 47s, DE = -77° 06' 09" (2000), was surveyed with ISOCAM in raster mode at LW2 (5-8.5μm) (TDT N.11500619) and LW3 (12-18μm) (TDT N.11500620). All the frames were observed with a pixel field of view (PFOV) of 6", an intrinsic integration time Tint=2.1s and ~15s integration time per sky position. The total integration time was 4472 s and 4474 s for LW2 and LW3, respectively. We obtained J, H, and Ks images of the central part of Cha II covering an area of 4.9'x4.9' with the SOFI near-IR camera at the ESO 3.58m New Technology Telescope (NTT) on the night of April 28, 2000 under very good seeing conditions (~0.3"). SOFI uses a 1024x1024 pixel HgCdTe array and provides a field of view of 299"x299" with a scale of 0.292"/pix. (2 data files).

  8. Object-based habitat mapping using very high spatial resolution multispectral and hyperspectral imagery with LiDAR data

    NASA Astrophysics Data System (ADS)

    Onojeghuo, Alex Okiemute; Onojeghuo, Ajoke Ruth

    2017-07-01

    This study investigated the combined use of multispectral/hyperspectral imagery and LiDAR data for habitat mapping across parts of south Cumbria, North West England. The methodology adopted in this study integrated spectral information contained in pansharp QuickBird multispectral/AISA Eagle hyperspectral imagery and LiDAR-derived measures with object-based machine learning classifiers and ensemble analysis techniques. Using the LiDAR point cloud data, elevation models (such as the Digital Surface Model and Digital Terrain Model raster) and intensity features were extracted directly. The LiDAR-derived measures exploited in this study included Canopy Height Model, intensity and topographic information (i.e. mean, maximum and standard deviation). These three LiDAR measures were combined with spectral information contained in the pansharp QuickBird and Eagle MNF transformed imagery for image classification experiments. A fusion of pansharp QuickBird multispectral and Eagle MNF hyperspectral imagery with all LiDAR-derived measures generated the best classification accuracies, 89.8 and 92.6% respectively. These results were generated with the Support Vector Machine and Random Forest machine learning algorithms respectively. The ensemble analysis of all three learning machine classifiers for the pansharp QuickBird and Eagle MNF fused data outputs did not significantly increase the overall classification accuracy. Results of the study demonstrate the potential of combining either very high spatial resolution multispectral or hyperspectral imagery with LiDAR data for habitat mapping.

  9. A Mobile App for Geochemical Field Data Acquisition

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Reid, N.; Ballsun-Stanton, B.; White, A.; Sobotkova, A.

    2015-12-01

    We have developed a geochemical sampling application for use on Android tablets. This app was developed together with the Federated Archaeological Information Management Systems (FAIMS) project at Macquarie University and is based on the open source FAIMS mobile platform, which was originally designed for archaeological field data collection. The FAIMS mobile platform has proved valuable for hydrogeochemical, biogeochemical, soil and rock sample collection, thanks to the ability to customise data collection methodologies for any field research. The module we commissioned allows inbuilt or external GPS to be used to locate sample points, and it incorporates standard, incremental sample names which can easily be fed into the International Geo-Sample Number (IGSN) system. Sampling can be documented not only in metadata, but also accompanied by photographic documentation and sketches. The module is augmented by dropdown menus for fields specific to each sample type, plus user-defined tags, and provides users with an overview of all records from a field campaign in a records viewer. It also offers basic mapping functionality, showing the current location and sampled points overlaid on preloaded rasters, and allows points and simple polygons to be drawn and later exported as shapefiles. A particular challenge is the remoteness of the sampling locations, hundreds of kilometres away from network access. The first trial raised the issue of backup without access to the internet, so in collaboration with the FAIMS team and Solutions First we commissioned a vehicle-mounted portable server. This server box constantly syncs with the tablets in the field via Wi-Fi; it has an uninterruptible power supply that can run for up to 45 minutes when the vehicle is turned off, and a 1TB hard drive for storage of all data and photographs. The server can be logged into via any of the field tablets or a laptop to download all the data collected to date, or to just view it on the server.
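    The incremental sample-naming idea mentioned above amounts to generating sequential identifiers for a campaign. The prefix format below is purely illustrative, not the module's actual scheme (which feeds into IGSN):

```python
def sample_names(prefix, start, count, width=4):
    """Generate sequential, zero-padded sample identifiers."""
    return [f"{prefix}{n:0{width}d}" for n in range(start, start + count)]

# hypothetical campaign prefix, continuing from sample 17
names = sample_names("CHA-", 17, 3)
print(names)  # -> ['CHA-0017', 'CHA-0018', 'CHA-0019']
```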

  10. mn_50mwind

    Science.gov Websites

    ...for micro-siting potential development projects. This shapefile was generated from a raster dataset...for Sustainable Energy, LLC for the U.S. Department of Energy ("DOE"). The user is granted...the user of this data agrees to credit NREL in any publications or software that incorporate or use...

  11. bt_50mwind

    Science.gov Websites

    ...for micro-siting potential development projects. This shapefile was generated from a raster dataset...Sustainable Energy, LLC for the U.S. Department of Energy ("DOE"). The user is granted the right...whatsoever, provided that this entire notice appears in all copies of the data. Further, the user of this...

  12. am_50mwind

    Science.gov Websites

    ...for micro-siting potential development projects. This shapefile was generated from a raster dataset...for Sustainable Energy, LLC for the U.S. Department of Energy ("DOE"). The user is granted...the user of this data agrees to credit NREL in any publications or software that incorporate or use...

  13. WC WAVE - Integrating Diverse Hydrological-Modeling Data and Services Into an Interoperable Geospatial Infrastructure

    NASA Astrophysics Data System (ADS)

    Hudspeth, W. B.; Baros, S.; Barrett, H.; Savickas, J.; Erickson, J.

    2015-12-01

    WC WAVE (Western Consortium for Watershed Analysis, Visualization and Exploration) is a collaborative research project between the states of Idaho, Nevada, and New Mexico that is funded under the National Science Foundation's Experimental Program to Stimulate Competitive Research (EPSCoR). The goal of the project is to understand and document the effects of climate change on interactions between precipitation, vegetation growth, soil moisture and other landscape properties. These interactions are modeled within a framework we refer to as a virtual watershed (VW), a computer infrastructure that simulates watershed dynamics by linking scientific modeling, visualization, and data management components into a coherent whole. Developed and hosted at the Earth Data Analysis Center, University of New Mexico, the virtual watershed has a number of core functions, which include: a) streamlined access to data required for model initialization and boundary conditions; b) the development of analytic scenarios through interactive visualization of available data and the storage of model configuration options; c) coupling of hydrological models through the rapid assimilation of model outputs into the data management system for access and use by subsequent models. The WC WAVE virtual watershed accomplishes these functions by providing large-scale vector and raster data discovery, subsetting, and delivery via Open Geospatial Consortium (OGC) and REST web service standards. Central to the virtual watershed is the design and use of an innovative array of metadata elements that permits the stepwise coupling of diverse hydrological models (e.g. ISNOBAL, PRMS, CASiMiR) and input data to rapidly assess variation in outcomes under different climatic conditions. We present details on the architecture and functionality of the virtual watershed and results from three western U.S. watersheds, and discuss the realized benefits to watershed science of employing this integrated solution.
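    The OGC-standard raster subsetting and delivery mentioned above can be sketched by composing a WCS 2.0 GetCoverage request that trims a coverage by latitude and longitude. The KVP parameter names follow the WCS 2.0 standard; the endpoint and coverage identifier are placeholders, not the project's actual service.

```python
from urllib.parse import urlencode

def wcs_getcoverage_url(endpoint, coverage_id, lat, lon, fmt="image/tiff"):
    """Build a WCS 2.0 GetCoverage KVP request with lat/lon subsetting."""
    params = [
        ("service", "WCS"),
        ("version", "2.0.1"),
        ("request", "GetCoverage"),
        ("coverageId", coverage_id),
        ("subset", "Lat({},{})".format(*lat)),   # repeated "subset" keys
        ("subset", "Long({},{})".format(*lon)),
        ("format", fmt),
    ]
    return endpoint + "?" + urlencode(params)

url = wcs_getcoverage_url("https://example.org/wcs", "snow_depth",
                          lat=(35.0, 37.0), lon=(-107.0, -105.0))
print(url)
```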

  14. The GIS Weasel: An interface for the development of geographic information used in environmental simulation modeling

    USGS Publications Warehouse

    Viger, R.J.

    2008-01-01

    The GIS Weasel is a freely available, open-source software package built on top of ArcInfo Workstation [ESRI, Inc., 2001, ArcInfo Workstation (8.1 ed.), Redlands, CA] for creating maps and parameters of geographic features used in environmental simulation models. The software has been designed to minimize the need for GIS expertise and automate the preparation of the geographic information as much as possible. Although many kinds of data can be exploited with the GIS Weasel, the only information required is a raster dataset of elevation for the user's area of interest (AOI). The user-defined AOI serves as a starting point from which to create maps of many different types of geographic features, including sub-watersheds, streams, elevation bands, land cover patches, land parcels, or anything else that can be discerned from the available data. The GIS Weasel has a library of over 200 routines that can be applied to any raster map of geographic features to generate information about shape, area, or topological association with other features of the same or different maps. In addition, a wide variety of parameters can be derived using ancillary data layers such as soil and vegetation maps.

  15. Open Source GIS Connectors to the NASA GES DISC Satellite Data

    NASA Astrophysics Data System (ADS)

    Pham, L.; Kempler, S. J.; Yang, W.

    2014-12-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) houses a suite of satellite-derived GIS data including high spatiotemporal resolution precipitation, air quality, and modeled land surface parameter data. The data are extremely useful to various GIS research and applications at regional, continental, and global scales, as evidenced by the growing number of GIS user requests for the data. On the other hand, we also found that some GIS users, especially those from the ArcGIS community, have difficulty obtaining, importing, and using our data, primarily due to their unfamiliarity with our products and to GIS software's limited capability for handling the predominantly raster-form data in various, sometimes very complicated, formats. In this presentation, we introduce a set of open source ArcGIS data connectors that significantly simplify the access and use of our data in ArcGIS. With the connectors, users do not need to know the data access URLs, the access protocols or syntaxes, or the data formats. Nor do they need to browse through a long list of variables that are often embedded in one single science data file and whose names may sometimes be confusing to those not familiar with the file (such as variable CH4_VMR_D for "CH4 Volume mixing ratio from the descending orbit" and variable EVPsfc for "Total Evapotranspiration"). The connectors will expose most GIS-related variables to users with easy-to-understand names. Users can simply define the spatiotemporal range of their study, select the parameter(s) of interest, and have the needed data downloaded, imported, and displayed in ArcGIS. The connectors are Python text files and there is no installation process; they can be placed in any user directory and started by simply clicking on them. 
In the presentation, we'll also demonstrate how to use the tools to load GES DISC time series air quality data with a few clicks and how such data depict the spatial and temporal patterns of

  16. Beam position reconstruction for the g2p experiment in Hall A at Jefferson Lab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Pengjia; Allada, Kalyan; Allison, Trent

    2015-11-03

    Beam-line equipment was upgraded for experiment E08-027 (g2p) in Hall A at Jefferson Lab. Two beam position monitors (BPMs) were necessary to measure the beam position and angle at the target. A new BPM receiver was designed and built to handle the low beam currents (50-100 nA) used for this experiment. Two new super-harps were installed for calibrating the BPMs. In addition to the existing fast raster system, a slow raster system was installed. Before and during the experiment, these new devices were tested and debugged, and their performance was evaluated. In order to achieve the required accuracy (1-2 mm in position and 1-2 mrad in angle at the target location), the data from the BPMs and harps were carefully analyzed, and the beam position and angle were reconstructed event by event at the target location. Finally, the calculated beam position will be used in the data analysis to accurately determine the kinematics for each event.

  17. Flood inundation extent mapping based on block compressed tracing

    NASA Astrophysics Data System (ADS)

    Shen, Dingtao; Rui, Yikang; Wang, Jiechen; Zhang, Yu; Cheng, Liang

    2015-07-01

    Flood inundation extent, depth, and duration are important factors affecting flood hazard evaluation. At present, flood inundation analysis is based mainly on a seeded region-growing algorithm, which is inefficient because it requires excessive recursive computation and is incapable of processing massive datasets. To address this problem, we propose a block compressed tracing algorithm for mapping the flood inundation extent, which reads the DEM data in blocks before transferring them to raster compression storage. This allows a smaller computer memory to process a larger amount of data, which solves the main limitation of the regular seeded region-growing algorithm. In addition, the use of a raster boundary tracing technique allows the algorithm to avoid the time-consuming computations required by seeded region growing. Finally, we conduct a comparative evaluation in the Chin-sha River basin; the results show that the proposed method solves the problem of flood inundation extent mapping from massive DEM datasets with higher computational efficiency than the original method, making it suitable for practical applications.
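
The baseline the authors improve upon, seeded region growing, can be sketched with an explicit queue in place of recursion, which sidesteps the recursion-depth problem even before blocking and compression are applied. This is an illustrative toy, not the paper's algorithm; all names and the sample DEM are invented.

```python
from collections import deque

def inundation_extent(dem, seed, water_level):
    """Seeded region growing with an explicit queue (no recursion):
    mark every cell 4-connected to `seed` whose elevation is below
    `water_level`. `dem` is a list of rows of elevations."""
    rows, cols = len(dem), len(dem[0])
    flooded = [[False] * cols for _ in range(rows)]
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        if not (0 <= r < rows and 0 <= c < cols):
            continue                      # outside the raster
        if flooded[r][c] or dem[r][c] >= water_level:
            continue                      # already visited, or dry land
        flooded[r][c] = True
        queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return flooded

dem = [[5, 5, 5, 5],
       [5, 1, 1, 5],
       [5, 1, 5, 2]]   # cell (2, 3) is low but not connected to the seed
mask = inundation_extent(dem, seed=(1, 1), water_level=3)
```

The disconnected low cell stays dry, which is the behavior that distinguishes connectivity-based inundation mapping from a simple elevation threshold.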

  18. Mapping soil texture classes and optimization of the result by accuracy assessment

    NASA Astrophysics Data System (ADS)

    Laborczi, Annamária; Takács, Katalin; Bakacsi, Zsófia; Szabó, József; Pásztor, László

    2014-05-01

    There are increasing demands nowadays on spatial soil information to support environment-related and land-use management decisions. The GlobalSoilMap.net (GSM) project aims to make a new digital soil map of the world using state-of-the-art and emerging technologies for soil mapping and predicting soil properties at fine resolution. Sand, silt and clay are among the mandatory GSM soil properties. Furthermore, soil texture class information is input data for significant agro-meteorological and hydrological models. Our present work aims to compare and evaluate different digital soil mapping methods and variables for producing the most accurate spatial prediction of texture classes in Hungary. In addition to the Hungarian Soil Information and Monitoring System as our basic data, a digital elevation model and its derived components, a geological database, and physical property maps of the Digital Kreybig Soil Information System have been applied as auxiliary elements. Two approaches have been applied for the mapping process. First, the sand, silt and clay rasters were computed independently using regression kriging (RK); from these rasters, according to the USDA categories, we compiled the texture class map. Different combinations of reference and training soil data and auxiliary covariables have resulted in several different maps. However, these results necessarily include the uncertainty of the three kriged rasters. Therefore, we applied data mining methods as the other approach to digital soil mapping. By building classification trees and random forests we obtained the texture class maps directly, so the various results can be compared to the RK maps. The performance of the different methods and data has been examined by testing the accuracy of the geostatistically computed and the directly classified results. We have used the GSM methodology to assess the most predictive and accurate way for getting the best among the
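
The final step of the RK approach, deriving a class map from the three kriged rasters, can be illustrated with a toy cell-by-cell classifier. The rules below are a heavily simplified subset of the 12-class USDA texture triangle, kept only to show the mechanics; all names and the tiny sample rasters are invented.

```python
def texture_class(sand, silt, clay):
    """Very simplified subset of the USDA texture triangle, for
    illustration only: the full USDA scheme has 12 classes with
    more intricate boundaries. Inputs are percentages summing
    to approximately 100."""
    assert abs(sand + silt + clay - 100.0) < 1.0
    if clay >= 40:
        return "clay"
    if sand >= 85 and clay < 10:
        return "sand"
    if silt >= 80 and clay < 12:
        return "silt"
    return "loam"   # catch-all for everything the toy rules miss

# apply cell by cell to the three (here 1x2) kriged rasters
sand_r = [[90, 30]]
silt_r = [[5, 30]]
clay_r = [[5, 40]]
classes = [[texture_class(s, si, c)
            for s, si, c in zip(rs, rsi, rc)]
           for rs, rsi, rc in zip(sand_r, silt_r, clay_r)]
```

Because each output cell depends on three kriged inputs, the classified map inherits the uncertainty of all three rasters, which is the motivation the abstract gives for classifying texture directly with trees and random forests.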

  19. Evaluation of Potential JHSV Port and Alternative Offload Sites in Coastal North Carolina

    DTIC Science & Technology

    2006-08-01

    the underlying data for use in his own Geographic Information System (GIS) application. The quality of this data is variable. This author found... Information Systems (GIS). Unlike the raster charts previously described, these ENC files are vector elements, meaning they can be individually selected in...Single Mobility System . “The Single Mobility System (SMS) embodies the Mobility Access Portal concept, a Web- based interface or “doorway” to other

  20. Efficient analysis of complex natural materials using LA-ICP-MS

    NASA Astrophysics Data System (ADS)

    Kent, A. J.; Loewen, M. W.; Koleszar, A. M.; Miller, J.; Ungerer, C.

    2011-12-01

    Many natural materials exhibit complex variations in chemical or isotopic composition over relatively short length scales, and these compositional variations often record important information about the environment or nature of the processes that led to formation. Examples include complexly zoned crystals within volcanic rocks that record magmatic and volcanic signals, otoliths and other biominerals that record life history and environmental information, and speleothems that record climatic variables. Laser ablation ICP-MS analyses offer several advantages for quantifying compositional variation in chemically complex natural materials. These include the speed of analysis, the ability to sample at atmospheric pressure, the wide diversity of possible analytes, and the ability to make measurements in both spot and raster modes. The latter in particular offers advantages for analyses that require efficient acquisition of information over significant length scales, as in raster mode compositional data can be obtained rapidly by translating the laser laterally over a compositionally variable material during a single analysis. In this fashion the elemental or isotopic composition at a given analysis time corresponds to the lateral spatial dimension. This contrasts with a record obtained from a row of individual spots, which requires a large number of discrete analyses and significantly more analysis time. However, there are also disadvantages to this style of analysis. Translation of the circular spots typically used for analysis results in significant signal attenuation and the production of artifacts that may mimic natural diffusion profiles or other gradual changes. The ability to ablate using non-circular spots significantly reduces this effect, although the degree of attenuation is also increased by slower ablation cell response times. 
For single-volume cells this may result in 50-100% more attenuation than that produced by the translation of the spot alone, although two

  1. cu_50mwind

    Science.gov Websites

    -siting potential development projects. This shapefile was generated from a raster dataset with a 200 m Sustainable Energy, LLC for the U.S. Department of Energy ("DOE"). The user is granted the right whatsoever, provided that this entire notice appears in all copies of the data. Further, the user of this

  2. af_pk_50mwind

    Science.gov Websites

    development projects. This shapefile was generated from a raster dataset with a 200 m resolution, in a UTM Sustainable Energy, LLC for the U.S. Department of Energy ("DOE"). The user is granted the right whatsoever, provided that this entire notice appears in all copies of the data. Further, the user of this

  3. gh_50mwind

    Science.gov Websites

    -siting potential development projects. This shapefile was generated from a raster dataset with a 200 m Sustainable Energy, LLC for the U.S. Department of Energy ("DOE"). The user is granted the right whatsoever, provided that this entire notice appears in all copies of the data. Further, the user of this

  4. National Scale Marine Geophysical Data Portal for the Israel EEZ with Public Access Web-GIS Platform

    NASA Astrophysics Data System (ADS)

    Ketter, T.; Kanari, M.; Tibor, G.

    2017-12-01

    Recent offshore discoveries and regulation in the Israel Exclusive Economic Zone (EEZ) are the driving forces behind increasing marine research and development initiatives such as infrastructure development, environmental protection and decision making, among many others. All marine operations rely on existing seabed information, while some also generate new data. We aim to create a single-platform knowledge base that enables access to existing information through a comprehensive, publicly accessible web-based interface. The Israel EEZ covers approximately 26,000 sq km and has been surveyed continuously with various geophysical instruments over the past decades, including 10,000 km of multibeam survey lines, 8,000 km of sub-bottom seismic lines, and hundreds of sediment sampling stations. Our database consists of vector and raster datasets from multiple sources compiled into a repository of geophysical data and metadata, acquired nation-wide by several research institutes and universities. The repository will enable public access via a web portal based on a GIS platform, including datasets from multibeam, sub-bottom profiling, single- and multi-channel seismic surveys and sediment sampling analysis. Derived data products will also be available, e.g. bathymetry, substrate type, granulometry, geological structure, etc. Operating a web-GIS-based repository allows retrieval of pre-existing data, helping potential users plan future activities, e.g. conducting marine surveys, construction of marine infrastructure and other private or public projects. The user interface is based on map-oriented spatial selection, which will reveal any relevant data for designated areas of interest. Querying the database will allow users to obtain information about the data owner and to contact them for data retrieval as required. Wide and free public access to existing data and metadata can save time and funds for the academic, government and commercial sectors, while aiding in cooperation

  5. Optical scanning tests of complex CMOS microcircuits

    NASA Technical Reports Server (NTRS)

    Levy, M. E.; Erickson, J. J.

    1977-01-01

    The new test method was based on the use of a raster-scanned optical stimulus in combination with special electrical test procedures. The raster-scanned optical stimulus was provided by an optical spot scanner, an instrument that combines a scanning optical microscope with electronic instrumentation to process and display the electric photoresponse signal induced in a device that is being tested.

  6. Field Emission Auger Electron Spectroscopy with Scanning Auger Microscopy |

    Science.gov Websites

    0.5 at.% for elements from lithium to uranium. Depth Profiling Removes successive layers by using size (> ~25 nm). Imaging Obtains SEM micrographs with up to 20,000x magnification by using raster scanning with a highly focused electron beam ≥25 nm in diameter. Using the same raster scan, SAM can

  7. SENTINEL-2 Services Library - efficient way for exploration and exploitation of EO data

    NASA Astrophysics Data System (ADS)

    Milcinski, Grega; Batic, Matej; Kadunc, Miha; Kolaric, Primoz; Mocnik, Rok; Repse, marko

    2017-04-01

    With more than 1.5 million scenes available, covering over 11 billion square kilometers and containing half a quadrillion pixels, Sentinel-2 is becoming one of the most important MSI datasets in the world. However, the vast amount of data makes it difficult to work with. This is certainly an important reason why the number of Sentinel-based applications is not as high as it could be at this point. We will present a Copernicus Award [1] winning service for archiving, processing and distribution of Sentinel data, Sentinel Hub [2]. It makes it easy for anyone to tap into the global Sentinel archive and exploit its rich multi-sensor data to observe changes in the land. We will demonstrate how one can not just observe imagery all over the world but also create one's own statistical analysis in a matter of seconds, comparing different sensors through various time segments. The result can be immediately observed in any GIS tool or exported as a raster file for post-processing. All of these actions can be performed on the full, worldwide S-2 archive (multi-temporal and multi-spectral). To demonstrate the technology, we created a publicly accessible web application called "Sentinel Playground" [3], which makes it possible to query Sentinel-2 data anywhere in the world, and an expert-oriented tool, "EO Browser" [4], where it is also possible to observe land changes over longer periods by using historical Landsat data as well. [1] http://www.copernicus-masters.com/index.php?anzeige=press-2016-03.html [2] http://www.sentinel-hub.com [3] http://apps.sentinel-hub.com/sentinel-playground/ [4] http://apps.eocloud.sentinel-hub.com/eo-browser/
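
Services such as this typically expose imagery through standard OGC interfaces, so a GIS tool can fetch a rendered raster with an ordinary WMS GetMap request. The sketch below builds such a request URL; the endpoint and layer name are hypothetical, while the query parameters are the standard WMS 1.3.0 ones (this is not Sentinel Hub's documented API).

```python
from urllib.parse import urlencode

# Hypothetical WMS endpoint; only the parameter names below are
# standard (OGC WMS 1.3.0 GetMap).
base = "https://services.example.com/ogc/wms"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "TRUE_COLOR",            # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "45.9,13.3,46.1,13.6",     # lat/lon order for EPSG:4326 in WMS 1.3.0
    "WIDTH": "512",
    "HEIGHT": "512",
    "FORMAT": "image/tiff",            # raster output for GIS post-processing
    "TIME": "2017-01-01/2017-01-31",   # time range for multi-temporal queries
}
url = base + "?" + urlencode(params)
```

Requesting `image/tiff` rather than a PNG is what makes the "export as a raster file for post-processing" workflow possible, since the response can be loaded directly into a GIS tool.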

  8. Automated oil spill detection with multispectral imagery

    NASA Astrophysics Data System (ADS)

    Bradford, Brian N.; Sanchez-Reyes, Pedro J.

    2011-06-01

    In this publication we present an automated detection method for ocean surface oil, like that which existed in the Gulf of Mexico as a result of the April 20, 2010 Deepwater Horizon drilling rig explosion. Regions of surface oil in airborne imagery are isolated using red, green, and blue bands from multispectral data sets. The oil shape isolation procedure involves a series of image processing functions to draw out the visual phenomenological features of the surface oil. These functions include selective color band combinations, contrast enhancement and histogram warping. An image segmentation process then separates out contiguous regions of oil to provide a raster mask to an analyst. We automate the detection algorithm to allow large volumes of data to be processed in a short time period, which can provide timely oil coverage statistics to response crews. Geo-referenced and mosaicked data sets enable the largest identified oil regions to be mapped to exact geographic coordinates. In our simulation, multispectral imagery came from multiple sources including first-hand data collected from the Gulf. Results of the simulation show the oil spill coverage area as a raster mask, along with histogram statistics of the oil pixels. A rough square footage estimate of the coverage is reported if the image ground sample distance is available.
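
The band-combination and masking step can be illustrated with a toy detector that flags pixels by a normalized red-blue difference and reports simple coverage statistics. The real chain (selective band combinations, contrast enhancement, histogram warping, segmentation) is more elaborate; the threshold, names, and sample bands here are invented.

```python
def oil_mask(red, blue, threshold=0.15):
    """Toy stand-in for the band-combination step: flag a pixel as oil
    when the normalized red-blue difference exceeds a threshold (the
    real method also uses the green band plus contrast enhancement and
    histogram warping). Returns a boolean raster mask and a pixel count."""
    rows, cols = len(red), len(red[0])
    mask = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            total = (red[r][c] + blue[r][c]) or 1   # guard against 0/0
            if (red[r][c] - blue[r][c]) / total > threshold:
                mask[r][c] = True
    n_oil = sum(v for row in mask for v in row)
    return mask, n_oil

red  = [[200, 40], [210, 35]]
blue = [[80, 90], [75, 88]]
mask, n_oil = oil_mask(red, blue)
# n_oil * gsd**2 would give a rough coverage estimate if the ground
# sample distance (gsd) of the imagery is known
```

The final comment mirrors the abstract's point that a square-footage estimate only becomes possible once the image's ground sample distance is available.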

  9. Validation of a spatial model used to locate fish spawning reef construction sites in the St. Clair–Detroit River system

    USGS Publications Warehouse

    Fischer, Jason L.; Bennion, David; Roseman, Edward F.; Manny, Bruce A.

    2015-01-01

    Lake sturgeon (Acipenser fulvescens) populations have suffered precipitous declines in the St. Clair–Detroit River system, following the removal of gravel spawning substrates and overfishing in the late 1800s to mid-1900s. To assist the remediation of lake sturgeon spawning habitat, three hydrodynamic models were integrated into a spatial model to identify areas in two large rivers, where water velocities were appropriate for the restoration of lake sturgeon spawning habitat. Here we use water velocity data collected with an acoustic Doppler current profiler (ADCP) to assess the ability of the spatial model and its sub-models to correctly identify areas where water velocities were deemed suitable for restoration of fish spawning habitat. ArcMap 10.1 was used to create raster grids of water velocity data from model estimates and ADCP measurements which were compared to determine the percentage of cells similarly classified as unsuitable, suitable, or ideal for fish spawning habitat remediation. The spatial model categorized 65% of the raster cells the same as depth-averaged water velocity measurements from the ADCP and 72% of the raster cells the same as surface water velocity measurements from the ADCP. Sub-models focused on depth-averaged velocities categorized the greatest percentage of cells similar to ADCP measurements where 74% and 76% of cells were the same as depth-averaged water velocity measurements. Our results indicate that integrating depth-averaged and surface water velocity hydrodynamic models may have biased the spatial model and overestimated suitable spawning habitat. A model solely integrating depth-averaged velocity models could improve identification of areas suitable for restoration of fish spawning habitat.
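
The core comparison, the percentage of cells classified identically in two rasters, reduces to a cell-by-cell tally. A minimal sketch (function name and sample grids invented):

```python
def agreement(grid_a, grid_b):
    """Cell-by-cell percentage agreement between two classified rasters
    of identical shape, with classes such as 'unsuitable', 'suitable',
    and 'ideal'."""
    cells = [(a, b) for ra, rb in zip(grid_a, grid_b)
                    for a, b in zip(ra, rb)]
    same = sum(a == b for a, b in cells)
    return 100.0 * same / len(cells)

model = [["suitable", "ideal"],      ["unsuitable", "suitable"]]
adcp  = [["suitable", "suitable"],   ["unsuitable", "suitable"]]
pct = agreement(model, adcp)   # 3 of 4 cells match
```

This is the statistic behind the reported 65% and 72% figures: the model raster is compared against rasters interpolated from the ADCP measurements.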

  10. A scalable and multi-purpose point cloud server (PCS) for easier and faster point cloud data management and processing

    NASA Astrophysics Data System (ADS)

    Cura, Rémi; Perret, Julien; Paparoditis, Nicolas

    2017-05-01

    In addition to more traditional geographical data such as images (rasters) and vectors, point cloud data are becoming increasingly available. Such data are appreciated for their precision and true three-dimensional (3D) nature. However, managing point clouds can be difficult due to scaling problems and the specificities of this data type. Several methods exist but are usually fairly specialised and solve only one aspect of the management problem. In this work, we propose a comprehensive and efficient point cloud management system based on a database server that works on groups of points (patches) rather than individual points. This system is specifically designed to cover the basic needs of point cloud users: fast loading, compressed storage, powerful patch and point filtering, easy data access and exporting, and integrated processing. Moreover, the proposed system fully integrates metadata (like sensor position) and can conjointly use point clouds with other geospatial data, such as images, vectors, topology and other point clouds. Point cloud (parallel) processing can be done in-base with fast prototyping capabilities. Lastly, the system is built on open source technologies and can therefore be easily extended and customised. We test the proposed system with several billion points obtained from lidar (aerial and terrestrial) and stereo-vision. We demonstrate loading speeds in the ˜50 million pts/h per process range, transparent-to-the-user compression at ratios of 2:1 to 4:1 or better, patch filtering in the 0.1 to 1 s range, and output in the 0.1 million pts/s per process range, along with classical processing methods such as object detection.
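
The central idea, grouping points into patches keyed by location rather than storing one row per point, can be sketched as a simple spatial binning step. The real system stores patches compressed inside a database server, so the in-memory schema below is purely illustrative and all names are invented.

```python
def make_patches(points, cell_size):
    """Group raw (x, y, z) points into square 'patches' keyed by grid
    cell. Operating on patches instead of individual points is what
    keeps loading, filtering, and compression tractable at billions
    of points."""
    patches = {}
    for x, y, z in points:
        key = (int(x // cell_size), int(y // cell_size))
        patches.setdefault(key, []).append((x, y, z))
    return patches

pts = [(0.5, 0.5, 10.0), (0.9, 0.1, 11.0), (5.2, 0.3, 9.5)]
patches = make_patches(pts, cell_size=1.0)
```

A patch-level filter (e.g. by bounding cell) can then discard whole groups without touching individual points, which is where the fast filtering times come from.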

  11. Web-Based Tools for Data Visualization and Decision Support for South Asia

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Pulla, S. T.; Ames, D. P.; Souffront, M.; David, C. H.; Zaitchik, B. F.; Gatlin, P. N.; Matin, M. A.

    2017-12-01

    The objective of the NASA SERVIR project is to assist developing countries in using information provided by Earth observing satellites to assess and manage climate risks, land use, and water resources. We present a collection of web apps that integrate Earth observations and in situ data to facilitate deployment of data and water resources models as decision-making tools in support of this effort. The interactive nature of web apps makes this an excellent medium for creating decision support tools that harness cutting-edge modeling techniques. Thin-client apps hosted in a cloud portal eliminate the need for decision makers to procure and maintain the high-performance hardware required by the models, deal with issues related to software installation and platform incompatibilities, or monitor and install software updates, a problem that is exacerbated at many of the regional SERVIR hubs where both financial and technical capacity may be limited. All that is needed to use the system is an Internet connection and a web browser. We take advantage of these technologies to develop tools which can be centrally maintained but openly accessible. Advanced mapping and visualization make results intuitive and the derived information actionable. We also take advantage of emerging standards for sharing water information across the web, using the OGC- and WMO-approved WaterML standards. This makes our tools interoperable and extensible via application programming interfaces (APIs), so that tools and data from other projects can both consume and share the tools developed in our project. Our approach enables the integration of multiple types of data and models, thus facilitating collaboration between science teams in SERVIR. 
The apps developed thus far by our team process time-varying netCDF files from Earth observations and large-scale computer simulations and allow visualization and exploration via raster animation and extraction of time series at selected points and/or regions.
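
The point-extraction operation the apps perform on time-varying gridded data reduces to indexing a time-ordered stack of 2-D grids at one cell. A minimal sketch, using plain lists in place of a real netCDF reader such as netCDF4 or xarray (all names and sample values invented):

```python
def extract_time_series(cube, times, row, col):
    """Extract the time series at one raster cell from a time-ordered
    stack of 2-D grids -- the operation behind 'extraction of time
    series at selected points' for time-varying gridded data."""
    return list(zip(times, (grid[row][col] for grid in cube)))

times = ["2017-01", "2017-02", "2017-03"]
cube = [[[1.0, 2.0], [3.0, 4.0]],    # one 2x2 grid per time step
        [[1.5, 2.5], [3.5, 4.5]],
        [[2.0, 3.0], [4.0, 5.0]]]
series = extract_time_series(cube, times, row=1, col=0)
```

In a real deployment the (row, col) index would first be derived from a user-clicked latitude/longitude via the grid's coordinate arrays; extraction over a region would aggregate many such cells per time step.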

  12. Ontology for cell-based geographic information

    NASA Astrophysics Data System (ADS)

    Zheng, Bin; Huang, Lina; Lu, Xinhai

    2009-10-01

    Interoperability is a key notion in geographic information science (GIS) for the sharing of geographic information (GI); it requires seamless translation among different information sources. Ontology is enlisted in GI discovery to settle semantic conflicts because its natural-language appearance and logical hierarchy are considered able to provide better context for both human understanding and machine cognition in describing locations and relationships in the geographic world. At present, however, most studies on field ontology are deduced from philosophical themes and are not applicable to the raster representation in GIS, which is a field-like phenomenon but does not physically coincide with the general concept of the philosophical field (which mostly comes from concepts in physics). That is why we specifically discuss cell-based GI ontology in this paper. The discussion starts with an investigation of the physical characteristics of cell-based raster GI. Then, a unified cell-based GI ontology framework for the recognition of raster objects is introduced, from which a conceptual interface for connecting human epistemology and the computer world, the so-called "endurant-occurrant window," is developed for better raster GI discovery and sharing.

  13. Geospatial data infrastructure: The development of metadata for geo-information in China

    NASA Astrophysics Data System (ADS)

    Xu, Baiquan; Yan, Shiqiang; Wang, Qianju; Lian, Jian; Wu, Xiaoping; Ding, Keyong

    2014-03-01

    Stores of geoscience records are in constant flux: they are continually added to by new information, ideas and data, which are frequently revised. The geoscience record is constrained by human thought and by the technology available for handling information. Conventional methods strive, with limited success, to maintain geoscience records that are readily accessible and renewable. The information system must adapt to the diversity of ideas and data in geoscience and their changes through time. In China, more than 400,000 types of important geological data were collected and produced in geological work during the last two decades, including oil, natural gas and marine data, mine exploration, geophysical, geochemical, remote sensing and important local geological survey and research reports. Numerous geospatial databases have been formed and stored in the National Geological Archives (NGA) in formats including MapGIS, ArcGIS, ArcInfo, Metafile, Raster, SQL Server, Access and JPEG. But there is no effective way to warrant that the quality of the information is adequate, in theory and practice, for decision making. The need of the Geographic Information System (GIS) communities for fast, reliable, accurate and up-to-date information is becoming insistent for all geoinformation producers and users in China. Since 2010, a series of geoinformation projects has been carried out under the leadership of the Ministry of Land and Resources (MLR), including (1) integration, update and maintenance of geoinformation databases; (2) standards research on clusterization and industrialization of information services; (3) platform construction of geological data sharing; (4) construction of key borehole databases; and (5) product development of information services. A "Nine-System" basic framework has been proposed for the development and improvement of the geospatial data infrastructure, focused on the construction of the cluster organization, cluster service, convergence

  14. The Geoinformatica free and open source software stack

    NASA Astrophysics Data System (ADS)

    Jolma, A.

    2012-04-01

    The Geoinformatica free and open source software (FOSS) stack is based mainly on three established FOSS components, namely GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language for which there is a very large set of FOSS modules for a wide range of purposes and which can be used as an integrative tool for building applications. In the Geoinformatica stack, data storages such as the FOSS RDBMS PostgreSQL with its geospatial extension PostGIS can be used below the three above-mentioned components. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general-purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster and vector data and graphs. The hydrological terrain functions are already rather old and suffer, for example, from the requirement of in-memory rasters. Newer research conducted using the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research targeted towards custom analytical tools, prototyping and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools available especially in GDAL and Perl modules. Prototyping is supported by the GTK+ library, the GUI tools, and the support for object

  15. Merging a Terrain-Based Parameter and Snow Particle Counter Data for the Assessment of Snow Redistribution in the Col du Lac Blanc Area

    NASA Astrophysics Data System (ADS)

    Schön, Peter; Prokop, Alexander; Naaim-Bouvet, Florence; Vionnet, Vincent; Guyomarc'h, Gilbert; Heiser, Micha; Nishimura, Kouichi

    2015-04-01

    of the snow surface as a function of Sx, SPC flux and time, we apply a simple cellular automaton. The system consists of raster cells that evolve through discrete time steps according to a set of rules based on the states of neighboring cells. Our model assumes snow transport as a function of Sx gradients between neighboring cells; the cells evolve based on difference quotients between neighboring cells. Our analyses and results are steps towards using the terrain-based parameter Sx, coupled with SPC data, to quantitatively estimate changes in snow depth at a high raster resolution of 1 m.
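
A minimal one-dimensional version of such a cellular automaton might look as follows: snow moves between adjacent cells in proportion to the Sx difference quotient, scaled by the SPC flux. The update rule and the transfer coefficient `k` are invented simplifications of the scheme sketched in the abstract.

```python
def step(sx, depth, flux, k=0.1):
    """One discrete update of a toy 1-D cellular automaton: snow moves
    between adjacent cells in proportion to the Sx difference quotient
    between them, scaled by the SPC flux. `k` is an invented transfer
    coefficient; the rule conserves total snow mass."""
    new = depth[:]
    for c in range(len(sx) - 1):
        transfer = k * flux * (sx[c + 1] - sx[c])  # positive: snow moves to c+1
        new[c] -= transfer
        new[c + 1] += transfer
    return new

sx    = [0.0, 0.5, 1.0]   # terrain-based shelter parameter per cell
depth = [1.0, 1.0, 1.0]   # initial snow depth (m)
after = step(sx, depth, flux=2.0)
```

Because each transfer is subtracted from one cell and added to its neighbor, total snow depth is conserved while snow accumulates in the more sheltered (higher-Sx) cells, which is the qualitative behavior the full 2-D model targets.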

  16. EAARL coastal topography--Alligator Point, Louisiana, 2010

    USGS Publications Warehouse

    Nayegandhi, Amar; Bonisteel-Cormier, J.M.; Wright, C.W.; Brock, J.C.; Nagle, D.B.; Vivekanandan, Saisudha; Fredericks, Xan; Barras, J.A.

    2012-01-01

    This project provides highly detailed and accurate datasets of a portion of Alligator Point, Louisiana, acquired on March 5 and 6, 2010. The datasets are made available for use as a management tool to research scientists and natural-resource managers. An innovative airborne lidar instrument originally developed at the National Aeronautics and Space Administration (NASA) Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL), was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multispectral color-infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which provide for sub-meter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine aircraft, but the instrument was deployed on a Pilatus PC-6. A single pilot, a lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. Elevation measurements were collected over the survey area using the EAARL system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of lidar data in an interactive or batch mode. Modules for presurvey flight-line definition, flight-path plotting, lidar raster and waveform investigation, and digital camera image playback have been developed. 
Processing algorithms have

  17. U.S. Geological Survey spatial data access

    USGS Publications Warehouse

    Faundeen, John L.; Kanengieter, Ronald L.; Buswell, Michael D.

    2002-01-01

    The U.S. Geological Survey (USGS) has reviewed its progress in improving Web-based access to its spatial data holdings. The USGS EROS Data Center has created three major Web-based interfaces to deliver spatial data to the general public: Earth Explorer, the Seamless Data Distribution System (SDDS), and the USGS Web Mapping Portal. Lessons were learned in developing these systems, and various resources were needed for their implementation. The USGS serves as a fact-finding agency in the U.S. Government that collects, monitors, analyzes, and provides scientific information about natural resource conditions and issues. To carry out its mission, the USGS has created and managed spatial data since its inception. Originally relying on paper maps, the USGS now uses advanced technology to produce digital representations of the Earth’s features. The spatial products of the USGS include both source and derivative data. Derivative datasets include Digital Orthophoto Quadrangles (DOQ), Digital Elevation Models, Digital Line Graphs, land-cover Digital Raster Graphics, and the seamless National Elevation Dataset. These products, created with automated processes, use aerial photographs, satellite images, or other cartographic information such as scanned paper maps as source data. With Earth Explorer, users can search multiple inventories through metadata queries and can browse satellite and DOQ imagery. They can place orders and make payment through secure credit card transactions. Some USGS spatial data can be accessed with SDDS. The SDDS uses an ArcIMS map service interface to identify the user’s areas of interest and determine the output format; it allows the user to either download the actual spatial data directly for small areas or place orders for larger areas to be delivered on media. The USGS Web Mapping Portal provides views of national and international datasets through an ArcIMS map service interface. In addition, the map portal posts news about new

  18. Production data from a Leica ZBA31H+ shaped e-beam mask writer located at the Photronics facility, Manchester, England

    NASA Astrophysics Data System (ADS)

    Johnson, Stephen; Loughran, Dominic; Osborne, Peter; Sixt, Pierre; Doering, Hans-Joachim

    1999-06-01

    The ZBA31H+ is a variable-shaped-spot, vector-scan e-beam lithography system operating at 20 keV. The specified performance is designed to produce reticles to 250-nanometer design rules, and beyond. In November 1998, the acceptance results of a newly installed Leica ZBA31H+ at Photronics Manchester were presented in a paper at the VDE/VDI 15th European Conference on Mask Technology. This paper is a continuation of that work and presents data from a capability study carried out on 4000-angstrom EBR9 HS31 resist. Analysis of mean-to-target, uniformity, X/Y bias, isolated versus dense linewidths, linearity, and registration performance of the tool is presented, and the effects of re-iterative develop on process capability are compared. Theoretically, a shaped-beam system has advantages over raster scan in terms of write time and edge definition capabilities. In this paper, comparative write times against an Etec MEBES 4500 system are included. The ZBA31H+ has to write very small polygons in order to image non-axial or non-45-degree features. The resulting effect on image quality and write time is investigated. In order to improve the fidelity of small OPC structures, Leica have investigated alternative writing strategies, and their results to date are presented here.

  19. Enabling Web-Based Analysis of CUAHSI HIS Hydrologic Data Using R and Web Processing Services

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Kadlec, J.; Bayles, M.; Seul, M.; Hooper, R. P.; Cummings, B.

    2015-12-01

    The CUAHSI Hydrologic Information System (CUAHSI HIS) provides open access to a large number of observed and modeled hydrological time series from many parts of the world. Several software tools have been designed to simplify searching and access to the CUAHSI HIS datasets. These software tools include desktop client software (HydroDesktop, HydroExcel), developer libraries (WaterML R Package, OWSLib, ulmo), and the new interactive search website, http://data.cuahsi.org. An issue with using the time series data from CUAHSI HIS for further analysis by hydrologists (for example, for verification of hydrological and snowpack models) is the large heterogeneity of the time series data. The time series may be regular or irregular, contain missing data, have different time support, and be recorded in different units. R is a widely used computational environment for statistical analysis of time series and spatio-temporal data that can be used to assess fitness and perform scientific analyses on observation data. R includes the ability to record a data analysis in the form of a reusable script. The R script together with the input time series dataset can be shared with other users, making the analysis more reproducible. The major goal of this study is to examine the use of R as a Web Processing Service for transforming time series data from the CUAHSI HIS and sharing the results on the Internet within HydroShare. HydroShare is an online data repository and social network for sharing large hydrological data sets such as time series, raster datasets, and multi-dimensional data. It can be used as a permanent cloud storage space for saving the time series analysis results. We examine the issues associated with running R scripts online, including code validation, saving of outputs, reporting progress, and provenance management. An explicit goal is that the script which is run locally should produce exactly the same results as the script run on the Internet. 
Our design can
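
    The heterogeneity problem described above (irregular timestamps, gaps, mixed units, differing time support) can be illustrated with a small harmonization sketch. The series values and conversion factor are hypothetical, and the study's actual workflow uses R rather than Python; the sketch only shows the kind of preprocessing an analysis script must perform.

```python
import pandas as pd

# Hypothetical irregular streamflow observations in cubic feet per second,
# with a gap between 01:05 and 03:48 (values are invented for illustration).
obs = pd.Series(
    [353.0, 339.0, 346.0, 360.0],
    index=pd.to_datetime(["2015-06-01 00:07", "2015-06-01 00:22",
                          "2015-06-01 01:05", "2015-06-01 03:48"]),
)

CFS_TO_CMS = 0.0283168  # ft^3/s -> m^3/s

regular = obs * CFS_TO_CMS               # put the series in common units
regular = regular.resample("1h").mean()  # give it a common time support
regular = regular.interpolate()          # fill the gap left by resampling
print(regular)
```

    Running the same deterministic script locally and on the server yields identical output, which is the reproducibility property the abstract calls for.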

  20. Hydrologic Derivatives for Modeling and Analysis—A new global high-resolution database

    USGS Publications Warehouse

    Verdin, Kristine L.

    2017-07-17

    The U.S. Geological Survey has developed a new global high-resolution hydrologic derivative database. Loosely modeled on the HYDRO1k database, this new database, entitled Hydrologic Derivatives for Modeling and Analysis, provides comprehensive and consistent global coverage of topographically derived raster layers (digital elevation model data, flow direction, flow accumulation, slope, and compound topographic index) and vector layers (streams and catchment boundaries). The coverage of the data is global, and the underlying digital elevation model is a hybrid of three datasets: HydroSHEDS (Hydrological data and maps based on SHuttle Elevation Derivatives at multiple Scales), GMTED2010 (Global Multi-resolution Terrain Elevation Data 2010), and the SRTM (Shuttle Radar Topography Mission). For most of the globe south of 60°N., the raster resolution of the data is 3 arc-seconds, corresponding to the resolution of the SRTM. For the areas north of 60°N., the resolution is 7.5 arc-seconds (the highest resolution of the GMTED2010 dataset) except for Greenland, where the resolution is 30 arc-seconds. The streams and catchments are attributed with Pfafstetter codes, based on a hierarchical numbering system, that carry important topological information. This database is appropriate for use in continental-scale modeling efforts. The work described in this report was conducted by the U.S. Geological Survey in cooperation with the National Aeronautics and Space Administration Goddard Space Flight Center.
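
    As an illustration of the topological information carried by Pfafstetter codes, the downstream test commonly stated for this numbering scheme can be sketched as follows. This is a generic property of Pfafstetter numbering, not code or data from the database itself, and the example codes are invented.

```python
def is_downstream(x, y):
    """True if basin x lies downstream of basin y, using only their
    Pfafstetter codes (equal-length digit strings).  Commonly stated rule:
    at the first digit where the codes differ, x's digit must be odd and
    smaller than y's (x is a lower interbasin on the main stem), and every
    later digit of x must be odd (x stays on the main stem)."""
    for i, (a, b) in enumerate(zip(x, y)):
        if a != b:
            if int(a) % 2 == 1 and int(a) < int(b):
                return all(int(d) % 2 == 1 for d in x[i + 1:])
            return False
    return False  # identical codes: same basin

print(is_downstream("853", "879"))  # True: interbasin on the main stem
print(is_downstream("856", "879"))  # False: 856 is a tributary, off stem
```

    Because the test needs only string comparison, upstream/downstream queries never require tracing the stream network geometry.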

  1. id_wtimor_50mwind

    Science.gov Websites

    . This shapefile was generated from a raster dataset with a 200 m resolution, in a UTM zone 12, datum WGS Sustainable Energy, LLC for the U.S. Department of Energy ("DOE"). The user is granted the right whatsoever, provided that this entire notice appears in all copies of the data. Further, the user of this

  2. Data Provenance as a Tool for Debugging Hydrological Models based on Python

    NASA Astrophysics Data System (ADS)

    Wombacher, A.; Huq, M.; Wada, Y.; Van Beek, R.

    2012-12-01

    There is an increase in data volume used in hydrological modeling. The increasing data volume requires additional efforts in debugging models, since a single output value is influenced by a multitude of input values. Thus, it is difficult to keep an overview of the data dependencies, and even knowing these dependencies, it is a tedious job to infer all the relevant data values. The aforementioned data dependencies are also known as data provenance, i.e. the determination of how a particular value has been created and processed. The proposed tool infers the data provenance automatically from a python script and visualizes the dependencies as a graph without executing the script. To debug the model, the user specifies the value of interest in space and time; the tool infers all related data values and displays them in the graph. The tool has been evaluated by hydrologists developing a model for estimating the global water demand [1]. The model uses multiple different data sources. The script we analysed has 120 lines of code and used more than 3000 individual files, each of them representing a raster map of 360 × 720 cells. After importing the data of the files into a SQLite database, the data consumes around 40 GB of memory. Using the proposed tool, a modeler is able to select individual values and infer which values have been used to calculate them. Especially in cases of outliers or missing values, the tool efficiently provides the modeler with the information needed to investigate the unexpected behavior of the model. The proposed tool can be applied to many python scripts and has been tested with other scripts in different contexts. In case a python script contains an unknown function or class, the tool requests additional information about the used function or class to enable the inference. This information has to be entered only once and can be shared with colleagues or in the community. Reference [1] Y. Wada, L. P. H. van Beek, D. Viviroli, H. H. 
Dürr, R
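
    A minimal sketch in the spirit of the tool described above: static provenance extraction with Python's ast module, mapping each assigned variable to the names its expression reads, without executing the script. The example script, variable names, and restriction to simple assignments are illustrative, not the tool's actual implementation.

```python
import ast
from collections import defaultdict

def dependencies(source):
    """Map each assigned variable to the set of names its right-hand side
    reads, collected statically from the AST (the script never runs)."""
    deps = defaultdict(set)
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Assign):
            reads = {n.id for n in ast.walk(node.value)
                     if isinstance(n, ast.Name)}
            for target in node.targets:
                if isinstance(target, ast.Name):
                    deps[target.id] |= reads
    return dict(deps)

# Hypothetical three-line model script in the style of a demand calculation.
script = """
demand = population * per_capita
supply = recharge - abstraction
gap = demand - supply
"""
print(dependencies(script)["gap"])  # {'demand', 'supply'}
```

    Transitively expanding these sets yields the dependency graph the tool visualizes; a debugger then follows the edges from a suspicious output cell back to its inputs.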

  3. An Automated Algorithm for Producing Land Cover Information from Landsat Surface Reflectance Data Acquired Between 1984 and Present

    NASA Astrophysics Data System (ADS)

    Rover, J.; Goldhaber, M. B.; Holen, C.; Dittmeier, R.; Wika, S.; Steinwand, D.; Dahal, D.; Tolk, B.; Quenzer, R.; Nelson, K.; Wylie, B. K.; Coan, M.

    2015-12-01

    Multi-year land cover mapping from remotely sensed data poses challenges. Producing land cover products at the spatial and temporal scales required for assessing longer-term trends in land cover change is typically a resource-limited process. A recently developed approach utilizes open source software libraries to automatically generate datasets, decision tree classifications, and data products while requiring minimal user interaction. Users are only required to supply coordinates for an area of interest, land cover from an existing source such as the National Land Cover Database and percent slope from a digital terrain model for the same area of interest, two target acquisition year-day windows, and the years of interest between 1984 and present. The algorithm queries the Landsat archive for Landsat data intersecting the area and dates of interest. Cloud-free pixels meeting the user's criteria are mosaicked to create composite images for training and applying the classifiers. Stratification of training data is determined by the user and redefined during an iterative process of reviewing classifiers and resulting predictions. The algorithm outputs include yearly land cover data in raster format, graphics, and supporting databases for further analysis. Additional analytical tools are also incorporated into the automated land cover system and enable statistical analysis after data are generated. Applications tested include assessing the impacts of land cover change and water permanence. For example, land cover conversions in areas where shrubland and grassland were replaced by shale oil pads during hydrofracking of the Bakken Formation were quantified. Analysis of spatial and temporal changes in surface water included identifying wetlands in the Prairie Pothole Region of North Dakota with potential connectivity to ground water, indicating subsurface permeability and geochemistry.
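
    The classification step can be sketched with a generic decision-tree classifier trained on composite reflectance plus percent slope. The band count, synthetic pixel values, labeling rule, and use of scikit-learn are assumptions for illustration, not the project's actual implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Toy stand-ins for the real inputs: a 6-band cloud-free composite plus
# percent slope per pixel, with training labels sampled from an existing
# land-cover product (e.g. NLCD).  Shapes and values are illustrative.
n_pixels = 500
bands = rng.random((n_pixels, 6))          # composite surface reflectance
slope = rng.random((n_pixels, 1)) * 30.0   # percent slope
features = np.hstack([bands, slope])
labels = (bands[:, 3] > 0.5).astype(int)   # pretend one band separates classes

clf = DecisionTreeClassifier(max_depth=5, random_state=0)
clf.fit(features, labels)                  # train on the stratified samples
predicted = clf.predict(features)          # apply to each year's mosaic
print((predicted == labels).mean())        # training agreement
```

    In the automated system this train/apply cycle repeats per year of interest, with the user reviewing the tree and re-stratifying the training data between iterations.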

  4. Swept Line Electron Beam Annealing of Ion Implanted Semiconductors.

    DTIC Science & Technology

    1982-07-01

    of my research to the mainstream of technology. The techniques used for beam processing are distinguished by their beam source and method by...raster scanned CW lasers (CWL), pulsed ion beams (PI), area pulsed electron beams (PEE), raster scanned (RSEB) or multi-scanned electron beams (MSEB)...where high quality or tailored profiles are required. Continuous wave lasers and multi-scanned or swept-line electron beams are the most likely candidates

  5. Precipitation collector bias and its effects on temporal trends and spatial variability in National Atmospheric Deposition Program/National Trends Network data

    USGS Publications Warehouse

    Wetherbee, Gregory A.

    2017-01-01

    Precipitation samples have been collected by the National Atmospheric Deposition Program's (NADP) National Trends Network (NTN) using the Aerochem Metrics Model 301 (ACM) collector since 1978. Approximately one-third of the NTN ACM collectors have been replaced with N-CON Systems, Inc. Model ADS 00-120 (NCON) collectors. Concurrent data were collected over 6 years at 12 NTN sites using colocated ACM and NCON collectors in various precipitation regimes. Linear regression models of the colocated data were used to adjust for relative bias between the collectors. Replacement of ACM collectors with NCON collectors resulted in shifts in 10-year seasonal precipitation-weighted mean concentration (PWMC) trend slopes for cations (−0.001 to −0.007 mg L−1 yr−1), anions (−0.009 to −0.028 mg L−1 yr−1), and hydrogen ion (+0.689 meq L−1 yr−1). Larger shifts in NO3− and SO42− seasonal PWMC trend slopes were observed in the Midwest and Northeast US, where concentrations are generally higher than in other regions. Geospatial analysis of interpolated concentration rasters indicated regions of accentuated variability introduced by incorporation of NCON collectors into the NTN.
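
    The bias-adjustment idea (fit a linear model to colocated data, then express one collector's measurements on the other collector's scale) can be sketched as follows. The concentration values are invented for illustration and are not NADP/NTN data.

```python
import numpy as np

# Hypothetical colocated concentrations (mg/L) from the two collector types.
acm  = np.array([0.10, 0.25, 0.40, 0.55, 0.80, 1.10])
ncon = np.array([0.12, 0.29, 0.45, 0.62, 0.88, 1.21])

# Fit NCON = a * ACM + b, then invert the fit to express NCON measurements
# on the ACM scale, removing the relative bias before trend analysis.
a, b = np.polyfit(acm, ncon, 1)

def to_acm_scale(x):
    return (x - b) / a

adjusted = to_acm_scale(ncon)
print(np.round(adjusted - acm, 3))  # residual bias after adjustment
```

    With the relative bias removed, trend slopes computed across the collector changeover no longer absorb the instrument step as an apparent concentration change.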

  6. Digital surfaces and hydrogeologic data for the Mesozoic through early Tertiary rocks in the Southeastern Coastal Plain in parts of Mississippi, Alabama, Georgia, South Carolina, and Florida

    USGS Publications Warehouse

    Cannon, Debra M.; Bellino, Jason C.; Williams, Lester J.

    2012-01-01

    A digital dataset of hydrogeologic data for Mesozoic through early Tertiary rocks in the Southeastern Coastal Plain was developed using data from five U.S. Geological Survey (USGS) reports published between 1951 and 1996. These reports contain maps and data depicting the extent and elevation of the Southeast Coastal Plain stratigraphic and hydrogeologic units in Florida and parts of Mississippi, Alabama, Georgia, and South Carolina. The reports are: Professional Paper 1410-B (Renken, 1996), Professional Paper 1088 (Brown and others, 1979), Professional Paper 524-G (Applin and Applin, 1967), Professional Paper 447 (Applin and Applin, 1965), and Circular 91 (Applin, 1951). The digital dataset provides hydrogeologic data for the USGS Energy Resources Program assessment of potential reservoirs for carbon sequestration and for the USGS Groundwater Resource Program assessment of saline aquifers in the southeastern United States. A Geographic Information System (ArcGIS 9.3.1) was used to construct 33 digital (raster) surfaces representing the top or base of key stratigraphic and hydrogeologic units. In addition, the Geographic Information System was used to generate 102 geo-referenced scanned maps from the five reports and a geo-database containing structural and thickness contours, faults, extent polygons, and common features. The dataset also includes point data of well construction and stratigraphic elevations and scanned images of two geologic cross sections and a nomenclature chart.

  7. Practical applications of remote sensing technology

    NASA Technical Reports Server (NTRS)

    Whitmore, Roy A., Jr.

    1990-01-01

    Land managers increasingly are becoming dependent upon remote sensing and automated analysis techniques for information gathering and synthesis. Remote sensing and geographic information system (GIS) techniques provide quick and economical information gathering for large areas. The outputs of remote sensing classification and analysis are most effective when combined with a total natural resources data base within the capabilities of a computerized GIS. Some examples are presented of the successes, as well as the problems, in integrating remote sensing and geographic information systems. The need to exploit remotely sensed data and the potential that geographic information systems offer for managing and analyzing such data continues to grow. New microcomputers are a reality, with vastly enlarged memory, multi-fold increases in operating speed, and storage capacity previously available only on mainframe computers. Improved raster GIS software systems have been developed for these high performance microcomputers. Vector GIS systems previously reserved for mini and mainframe systems are available to operate on these enhanced microcomputers. One of the more exciting areas that is beginning to emerge is the integration of both raster and vector formats on a single computer screen. This technology will allow satellite imagery or digital aerial photography to be presented as a background to a vector display.

  8. Time-series animation techniques for visualizing urban growth

    USGS Publications Warehouse

    Acevedo, W.; Masuoka, P.

    1997-01-01

    Time-series animation is a visually intuitive way to display urban growth. Animations of land-use change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e. urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file. © 1997 Elsevier Science Ltd.
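
    The in-between-frame step can be sketched as a simple linear cross-dissolve between two known data frames. A real thematic animation would instead model how each theme (urban, water, roads) changes between the known years; the frame contents below are placeholders.

```python
import numpy as np

def tween(frame_a, frame_b, n):
    """Create n intermediate frames between two known data frames by
    linear interpolation (cross-dissolve) of per-pixel values."""
    frames = []
    for k in range(1, n + 1):
        t = k / (n + 1)                       # fraction of the way from a to b
        frames.append((1 - t) * frame_a + t * frame_b)
    return frames

# Placeholder rasters standing in for urban-extent frames of two known years.
urban_1973 = np.zeros((4, 4))
urban_1992 = np.ones((4, 4))
mid = tween(urban_1973, urban_1992, 3)
print([f.mean() for f in mid])  # [0.25, 0.5, 0.75]
```

    The number of in-between frames, together with the display speed, controls how smooth the perceived growth appears on the chosen output medium.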

  9. Digital map databases in support of avionic display systems

    NASA Astrophysics Data System (ADS)

    Trenchard, Michael E.; Lohrenz, Maura C.; Rosche, Henry, III; Wischow, Perry B.

    1991-08-01

    The emergence of computerized mission planning systems (MPS) and airborne digital moving map systems (DMS) has necessitated the development of a global database of raster aeronautical chart data specifically designed for input to these systems. The Naval Oceanographic and Atmospheric Research Laboratory's (NOARL) Map Data Formatting Facility (MDFF) is presently dedicated to supporting these avionic display systems with the development of the Compressed Aeronautical Chart (CAC) database on Compact Disk Read Only Memory (CDROM) optical discs. The MDFF is also developing a series of aircraft-specific Write-Once Read Many (WORM) optical discs. NOARL has initiated a comprehensive research program aimed at improving the pilots' moving map displays; current research efforts include the development of an alternate image compression technique and generation of a standard set of color palettes. The CAC database will provide digital aeronautical chart data in six different scales. CAC is derived from the Defense Mapping Agency's (DMA) Equal Arc-second (ARC) Digitized Raster Graphics (ADRG), a series of scanned aeronautical charts. NOARL processes ADRG to tailor the chart image resolution to that of the DMS display while reducing storage requirements through image compression techniques. CAC is being distributed by DMA as a library of CDROMs.

  10. Spatial analysis of agro-ecological data: Detection of spatial patterns combining three different methodical approaches

    NASA Astrophysics Data System (ADS)

    Heuer, A.; Casper, M. C.; Vohland, M.

    2009-04-01

    Processes in natural systems and the resulting patterns occur in ecological space and time. To study natural structures and to understand the functional processes, it is necessary to identify the relevant spatial and temporal scales at which they occur; or, in other words, to isolate spatial and temporal patterns. In this contribution we concentrate on the spatial aspects of agro-ecological data analysis. Data were derived from two agricultural plots, each of about 5 hectares, in the area of Newel, located in Western Palatinate, Germany. The plots had been conventionally cultivated with a crop rotation of winter rape, winter wheat, and spring barley. Data about physical and chemical soil properties, vegetation, and topography were i) collected by measurements in the field during three vegetation periods (2005-2008) and/or ii) derived from hyperspectral image data acquired by a HyMap airborne imaging sensor (2005). To detect spatial variability within the plots, we applied three different approaches that examine and describe relationships among data. First, we used variography to get an overview of the data. A comparison of the experimental variograms made it possible to distinguish variables that appeared to vary at related or dissimilar spatial scales. Second, based on the data available in raster format, basic cell statistics were computed using a geographic information system. Here we could take advantage of the GIS's powerful classification and visualization tools, which supported the detection of spatially distributed patterns. Third, we used an approach that is being used for the visualization of complex, high-dimensional environmental data, the Kohonen self-organizing map. The self-organizing map (SOM) reduces multidimensional data to two dimensions to detect similarities in data sets and correlations between single variables. One of SOM's advantages is its powerful visualization capability. The combination of the three approaches leads to
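
    The variography step can be sketched as a plain empirical semivariogram computation, binning point pairs by separation distance. The synthetic field, lag bins, and point count below are illustrative, not the Newel plot data.

```python
import numpy as np

def experimental_variogram(coords, values, bins):
    """Empirical semivariance gamma(h) = mean of 0.5*(z_i - z_j)^2 over
    all point pairs whose separation distance falls in each lag bin."""
    n = len(values)
    i, j = np.triu_indices(n, k=1)                 # all unordered pairs
    h = np.linalg.norm(coords[i] - coords[j], axis=1)
    sq = 0.5 * (values[i] - values[j]) ** 2
    which = np.digitize(h, bins)                   # assign pairs to lag bins
    return np.array([sq[which == k].mean() if np.any(which == k) else np.nan
                     for k in range(1, len(bins))])

rng = np.random.default_rng(1)
coords = rng.random((200, 2)) * 100.0              # sample positions in metres
values = np.sin(coords[:, 0] / 20.0) + 0.1 * rng.standard_normal(200)
bins = np.array([0.0, 10.0, 20.0, 40.0, 80.0])
gamma = experimental_variogram(coords, values, bins)
print(np.round(gamma, 3))  # semivariance rises with lag for a structured field
```

    Comparing such curves between variables is what lets related spatial scales (similar ranges and sills) be distinguished from dissimilar ones.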

  11. Earth Explorer

    USGS Publications Warehouse


    2000-01-01

    The U.S. Geological Survey's (USGS) Earth Explorer Web site provides access to millions of land-related products, including the following: satellite images from Landsat, advanced very high resolution radiometer (AVHRR), and Corona data sets; aerial photographs from the National Aerial Photography Program, NASA, and USGS data sets; digital cartographic data from digital elevation models, digital line graphs, digital raster graphics, and digital orthophoto quadrangles; and USGS paper maps. Digital, film, and paper products are available, and many products can be previewed before ordering.

  12. Comparative mineral mapping in the Colorado Mineral Belt using AVIRIS and ASTER remote sensing data

    USGS Publications Warehouse

    Rockwell, Barnaby W.

    2013-01-01

    This report presents results of interpretation of spectral remote sensing data covering the eastern Colorado Mineral Belt in central Colorado, USA, acquired by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) sensors. This study was part of a multidisciplinary mapping and data integration project at the U.S. Geological Survey that focused on long-term resource planning by land-managing entities in Colorado. The map products were designed primarily for the regional mapping and characterization of exposed surface mineralogy, including that related to hydrothermal alteration and supergene weathering of pyritic rocks. Alteration type was modeled from identified minerals based on standard definitions of alteration mineral assemblages. Vegetation was identified using the ASTER data and subdivided based on per-pixel chlorophyll content (depth of 0.68 micrometer absorption band) and dryness (fit and depth of leaf biochemical absorptions in the shortwave infrared spectral region). The vegetation results can be used to estimate the abundance of fire fuels at the time of data acquisition (2002 and 2003). The AVIRIS- and ASTER-derived mineral mapping results can be readily compared using the toggleable layers in the GeoPDF file, and by using the provided GIS-ready raster datasets. The results relating to mineral occurrence and distribution were an important source of data for studies documenting the effects of mining and un-mined, altered rocks on aquatic ecosystems at the watershed level. These studies demonstrated a high correlation between metal concentrations in streams and the presence of hydrothermal alteration and (or) pyritic mine waste as determined by analysis of the map products presented herein. The mineral mapping results were also used to delineate permissive areas for various mineral deposit types.

  13. Mineral and Vegetation Maps of the Bodie Hills, Sweetwater Mountains, and Wassuk Range, California/Nevada, Generated from ASTER Satellite Data

    USGS Publications Warehouse

    Rockwell, Barnaby W.

    2010-01-01

    Multispectral remote sensing data acquired by the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) were analyzed to identify and map minerals, vegetation groups, and volatiles (water and snow) in support of geologic studies of the Bodie Hills, Sweetwater Mountains, and Wassuk Range, California/Nevada. Digital mineral and vegetation mapping results are presented in both portable document format (PDF) and ERDAS Imagine format (.img). The ERDAS-format files are suitable for integration with other geospatial data in Geographic Information Systems (GIS) such as ArcGIS. The ERDAS files showing occurrence of 1) iron-bearing minerals, vegetation, and water, and 2) clay, sulfate, mica, carbonate, Mg-OH, and hydrous quartz minerals have been attributed according to identified material, so that the material detected in a pixel can be queried with the interactive attribute identification tools of GIS and image processing software packages (for example, the Identify Tool of ArcMap and the Inquire Cursor Tool of ERDAS Imagine). All raster data have been orthorectified to the Universal Transverse Mercator (UTM) projection using a projective transform with ground-control points selected from orthorectified Landsat Thematic Mapper data and a digital elevation model from the U.S. Geological Survey (USGS) National Elevation Dataset (1/3 arc second, 10 m resolution). Metadata compliant with Federal Geographic Data Committee (FGDC) standards for all ERDAS-format files have been included, and contain important information regarding geographic coordinate systems, attributes, and cross-references. Documentation regarding spectral analysis methodologies employed to make the maps is included in these cross-references.

  14. Soil erosion assessment and its correlation with landslide events using remote sensing data and GIS: a case study at Penang Island, Malaysia.

    PubMed

    Pradhan, Biswajeet; Chaudhari, Amruta; Adinarayana, J; Buchroithner, Manfred F

    2012-01-01

    In this paper, an attempt has been made to assess, forecast, and observe the dynamics of soil erosion by the universal soil loss equation (USLE) method at Penang Island, Malaysia. Multi-source (map-, space- and ground-based) datasets were used to obtain both the static and dynamic factors of USLE, and an integrated analysis was carried out in the raster format of GIS. A landslide location map was generated on the basis of image-element interpretation from aerial photos, satellite data, and field observations, and was used to validate soil erosion intensity in the study area. Further, a statistical frequency ratio analysis was carried out in the study area for correlation purposes. The results of the statistical correlation showed a satisfactory agreement between the prepared USLE-based soil erosion map and landslide events/locations, which are directly proportional to each other. Prognosis analysis of soil erosion helps the user agencies/decision makers to design a proper conservation planning program to reduce soil erosion. Temporal statistics on soil erosion amid the dynamic and rapid development of Penang Island indicate the co-existence and balance of the ecosystem.
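
    The USLE overlay itself is a cell-by-cell product of factor rasters, A = R · K · LS · C · P (mean annual soil loss per cell). The factor values below are placeholders in arbitrary but plausible magnitudes, not data from the Penang study.

```python
import numpy as np

# Placeholder 2x2 factor rasters for a raster-GIS USLE overlay.
R  = np.array([[900., 950.], [980., 1010.]])  # rainfall erosivity
K  = np.array([[0.20, 0.25], [0.30, 0.22]])   # soil erodibility
LS = np.array([[1.2, 2.5], [4.0, 0.8]])       # slope length-steepness
C  = np.array([[0.10, 0.35], [0.05, 0.20]])   # cover management
P  = np.array([[1.0, 1.0], [0.7, 1.0]])       # support practice

A = R * K * LS * C * P   # cell-by-cell product, as in a raster GIS overlay
print(np.round(A, 1))    # soil loss per cell
```

    Static factors (K, LS) are derived once from maps and terrain; dynamic ones (R, C, P) change with season and land management, which is what makes repeated prognosis runs possible.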

  15. Video flowmeter

    DOEpatents

    Lord, D.E.; Carter, G.W.; Petrini, R.R.

    1983-08-02

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid containing entrained particles is formed and positioned by a rod optic lens assembly on the raster area of a low-light level television camera. The particles are illuminated by light transmitted through a bundle of glass fibers surrounding the rod optic lens assembly. Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen. The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid. 4 figs.

  16. Flat panel ferroelectric electron emission display system

    DOEpatents

    Sampayan, Stephen E.; Orvis, William J.; Caporaso, George J.; Wieskamp, Ted F.

    1996-01-01

    A device which can produce a bright, raster scanned or non-raster scanned image from a flat panel. Unlike many flat panel technologies, this device does not require ambient light or auxiliary illumination for viewing the image. Rather, this device relies on electrons emitted from a ferroelectric emitter impinging on a phosphor. This device takes advantage of a new electron emitter technology which emits electrons with significant kinetic energy and beam current density.

  17. Electron beam throughput from raster to imaging

    NASA Astrophysics Data System (ADS)

    Zywno, Marek

    2016-12-01

    Two architectures of electron beam tools are presented: the single-beam MEBES Exara, designed and built by Etec Systems for mask writing, and the Reflected E-Beam Lithography tool (REBL), designed and built by KLA-Tencor under DARPA Agreement No. HR0011-07-9-0007. Both tools implemented technologies not used before to achieve their goals. The MEBES X, renamed Exara for marketing purposes, used an air-bearing stage running in vacuum to achieve smooth continuous scanning. The REBL used two-dimensional imaging to distribute charge to a 4k-pixel swath to achieve writing times on the order of one wafer per hour, scalable to throughput approaching that of optical projection tools. Three stage architectures were designed for continuous scanning of wafers: linear maglev, rotary maglev, and dual linear maglev.

  18. Raster-Based Approach to Solar Pressure Modeling

    NASA Technical Reports Server (NTRS)

    Wright, Theodore W. II

    2013-01-01

    An algorithm has been developed to take advantage of the graphics processing hardware in modern computers to efficiently compute high-fidelity solar pressure forces and torques on spacecraft, taking into account the possibility of self-shading due to the articulation of spacecraft components such as solar arrays. The process is easily extended to compute other results that depend on three-dimensional attitude analysis, such as solar array power generation or free molecular flow drag. The impact of photons upon a spacecraft introduces small forces and moments. The magnitude and direction of the forces depend on the material properties of the spacecraft components being illuminated. The parts of the components being lit depend on the orientation of the craft with respect to the Sun, as well as the gimbal angles for any significant moving external parts (solar arrays, typically). Some components may shield others from the Sun. The purpose of this innovation is to enable high-fidelity computation of solar pressure and power generation effects of illuminated portions of spacecraft, taking self-shading from spacecraft attitude and movable components into account. The key idea in this innovation is to compute results dependent upon complicated geometry by using an image to break the problem into thousands or millions of sub-problems with simple geometry, and then the results from the simpler problems are combined to give high-fidelity results for the full geometry. This process is performed by constructing a 3D model of a spacecraft using an appropriate computer language (OpenGL), and running that model on a modern computer's 3D accelerated video processor. This quickly and accurately generates a view of the model (as shown on a computer screen) that takes rotation and articulation of spacecraft components into account. When this view is interpreted as the spacecraft as seen by the Sun, then only the portions of the craft visible in the view are illuminated.
The view as shown on the computer screen is composed of up to millions of pixels. Each of those pixels is associated with a small illuminated area of the spacecraft. For each pixel, it is possible to compute its position, angle (surface normal) from the view direction, and the spacecraft material (and therefore, optical coefficients) associated with that area. With this information, the area associated with each pixel can be modeled as a simple flat plate for calculating solar pressure. The vector sum of these individual flat plate models is a high-fidelity approximation of the solar pressure forces and torques on the whole vehicle. In addition to using optical coefficients associated with each spacecraft material to calculate solar pressure, a power generation coefficient is added for computing solar array power generation from the sum of the illuminated areas. Similarly, other area-based calculations, such as free molecular flow drag, are also enabled. Because the model rendering is separated from other calculations, it is relatively easy to add a new model to explore a new vehicle or mission configuration. Adding a new model is performed by adding OpenGL code, but a future version might read a mesh file exported from a computer-aided design (CAD) system to enable very rapid turnaround for new designs.
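    The per-pixel flat-plate summation described above can be sketched in a few lines. This is a minimal illustration, not the NASA implementation: the function names, the standard flat-plate optics model, and the plain-tuple pixel records are assumptions, and a real system would read the per-pixel areas, normals and materials back from the OpenGL render.

```python
# Solar radiation pressure constant at 1 AU, in N/m^2.
P_SUN = 4.56e-6

def pixel_force(area, normal, sun_dir, rho_s, rho_d):
    """Flat-plate solar pressure force for one illuminated pixel.

    `normal` and `sun_dir` are unit 3-vectors (sun_dir points from the
    surface toward the Sun); `rho_s`/`rho_d` are the specular/diffuse
    reflectivity coefficients of the material seen through this pixel.
    """
    cos_t = sum(n * s for n, s in zip(normal, sun_dir))
    if cos_t <= 0.0:
        return (0.0, 0.0, 0.0)  # back-facing area is not illuminated
    absorbed = 1.0 - rho_s                          # pushes along -sun_dir
    along_n = 2.0 * (rho_s * cos_t + rho_d / 3.0)   # pushes along -normal
    return tuple(P_SUN * area * cos_t * (absorbed * -s + along_n * -n)
                 for s, n in zip(sun_dir, normal))

def total_srp(pixels, sun_dir):
    """Vector-sum the per-pixel flat-plate forces over the rendered view."""
    fx = fy = fz = 0.0
    for area, normal, rho_s, rho_d in pixels:
        dfx, dfy, dfz = pixel_force(area, normal, sun_dir, rho_s, rho_d)
        fx, fy, fz = fx + dfx, fy + dfy, fz + dfz
    return (fx, fy, fz)
```

    A perfectly absorbing plate facing the Sun, for example, feels a push of `P_SUN` times its area directly away from the Sun; torques would be accumulated the same way from per-pixel position vectors.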

  19. Creating Digital Environments for Multi-Agent Simulation

    DTIC Science & Technology

    2003-12-01

    foliage on a polygon to represent a tree). Tile: A spatial partition of a coverage that shares the same set of feature classes with the same... orthophoto datasets can be made from rectified grayscale aerial images. These datasets can support various weapon systems, Command, Control...Raster Product Format (RPF) Standard. This data consists of unclassified seamless orthophotos, made from rectified grayscale aerial images.

  20. Scanning instrumentation for measuring magnetic field trapping in high Tc superconductors

    NASA Technical Reports Server (NTRS)

    Sisk, R. C.; Helton, A. J.

    1993-01-01

    Computerized scanning instrumentation measures and displays trapped magnetic fields across the surface of high Tc superconductors at 77 K. Data are acquired in the form of a raster scan image utilizing stepping motor stages for positioning and a cryogenic Hall probe for magnetic field readout. Flat areas up to 45 mm in diameter are scanned with 0.5-mm resolution and displayed as false color images.

  1. Area- and depth- weighted averages of selected SSURGO variables for the conterminous United States and District of Columbia

    USGS Publications Warehouse

    Wieczorek, Michael

    2014-01-01

    This digital data release consists of seven data files of soil attributes for the United States and the District of Columbia. The files are derived from the Natural Resources Conservation Service's (NRCS) Soil Survey Geographic database (SSURGO). The data files can be linked to the raster datasets of soil mapping unit identifiers (MUKEY) available through the NRCS's Gridded Soil Survey Geographic (gSSURGO) database (http://www.nrcs.usda.gov/wps/portal/nrcs/detail/soils/survey/geo/?cid=nrcs142p2_053628). The associated files, named DRAINAGECLASS, HYDRATING, HYDGRP, HYDRICCONDITION, LAYER, TEXT, and WTDEP, are area- and depth-weighted average values for selected soil characteristics from the SSURGO database for the conterminous United States and the District of Columbia. The SSURGO tables were acquired from the NRCS on March 5, 2014. The soil characteristics in the DRAINAGE table are drainage class (DRNCLASS), which identifies the natural drainage conditions of the soil and refers to the frequency and duration of wet periods. The soil characteristics in the HYDRATING table are hydric rating (HYDRATE), a yes/no field that indicates whether or not a map unit component is classified as a "hydric soil". The soil characteristics in the HYDGRP table are the percentages for each hydrologic group per MUKEY. The soil characteristics in the HYDRICCONDITION table are hydric condition (HYDCON), which describes the natural condition of the soil component. The soil characteristics in the LAYER table are available water capacity (AVG_AWC), bulk density (AVG_BD), saturated hydraulic conductivity (AVG_KSAT), vertical saturated hydraulic conductivity (AVG_KV), soil erodibility factor (AVG_KFACT), porosity (AVG_POR), field capacity (AVG_FC), the soil fraction passing a number 4 sieve (AVG_NO4), the soil fraction passing a number 10 sieve (AVG_NO10), the soil fraction passing a number 200 sieve (AVG_NO200), and organic matter (AVG_OM). The soil characteristics in the TEXT table are
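    The area-weighted averaging this release describes can be illustrated with a minimal sketch. The row layout `(mukey, comppct, value)` and the function name are assumptions for illustration only; the real SSURGO component tables carry many more fields, and the release also applies depth weighting within soil layers.

```python
from collections import defaultdict

def area_weighted_average(rows):
    """Collapse soil-component rows into one area-weighted value per MUKEY.

    `rows` is an iterable of (mukey, comppct, value), where comppct is the
    component's percent of the map-unit area.  Components with missing
    values are skipped and the remaining weights renormalised.
    """
    acc = defaultdict(lambda: [0.0, 0.0])  # mukey -> [weighted sum, total weight]
    for mukey, comppct, value in rows:
        if value is None:
            continue
        acc[mukey][0] += comppct * value
        acc[mukey][1] += comppct
    return {mukey: ws / w for mukey, (ws, w) in acc.items() if w > 0}
```

    The resulting per-MUKEY values can then be joined to the gSSURGO MUKEY raster to produce a continuous attribute grid.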

  2. An alternative methodology for the analysis of electrical resistivity data from a soil gas study

    NASA Astrophysics Data System (ADS)

    Johansson, Sara; Rosqvist, Håkan; Svensson, Mats; Dahlin, Torleif; Leroux, Virginie

    2011-08-01

    The aim of this paper is to present an alternative method for the analysis of resistivity data. The methodology was developed during a study to evaluate whether electrical resistivity can be used as a tool for analysing subsurface gas dynamics and gas emissions from landfills. The main assumption of this study was that temporal variations in resistivity data correspond to variations in the relative amount of gas and water in the soil pores. Field measurements of electrical resistivity, static chamber gas flux and weather data were collected at a landfill in Helsingborg, Sweden. The resistivity survey arrangement consisted of nine lines, each with 21 electrodes, in an investigation area of 16 × 20 m. The ABEM Lund Imaging System provided vertical and horizontal resistivity profiles every second hour. The data were inverted in Res3Dinv using an L1-norm-based optimization method with a standard least-squares formulation. Each horizontal soil layer was then represented as a linearly interpolated raster model. Different areas underneath the gas flux measurement points were defined in the resistivity model of the uppermost soil layer, and the vertical extension of the zones could be followed at greater depths in deeper layer models. The average resistivity values of the defined areas were calculated and plotted on a time axis, to provide graphs of the variation in resistivity with time in a specific section of the ground. Residual variation of resistivity was calculated by subtracting the resistivity variations caused by the diurnal temperature variations from the measured resistivity data. The resulting residual resistivity graphs were compared with field data of soil moisture, precipitation, soil temperature and methane flux. The results of the study were qualitative, but promising indications of relationships between electrical resistivity and variations in the relative amount of gas and water in the soil pores were found. Even though more research and better data quality is
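    A common way to remove the diurnal-temperature component before computing such residuals is a linear temperature normalisation. The paper does not state its exact correction model, so the functional form and the coefficient below (about 2 %/°C) are assumptions, shown only to make the residual idea concrete.

```python
def correct_for_temperature(resistivity, soil_temp, alpha=0.02, t_ref=25.0):
    """Normalise a resistivity time series to a reference temperature.

    Applies the common linear correction
        rho_ref = rho_T * (1 + alpha * (T - t_ref)),
    so that the remaining (residual) variation reflects changes in pore
    gas/water content rather than diurnal soil-temperature cycles.
    alpha and t_ref are assumed values, not the paper's stated model.
    """
    return [r * (1.0 + alpha * (t - t_ref))
            for r, t in zip(resistivity, soil_temp)]
```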

  3. GNU Data Language (GDL) - a free and open-source implementation of IDL

    NASA Astrophysics Data System (ADS)

    Arabas, Sylwester; Schellens, Marc; Coulais, Alain; Gales, Joel; Messmer, Peter

    2010-05-01

    GNU Data Language (GDL) is developed with the aim of providing an open-source drop-in replacement for ITTVIS's Interactive Data Language (IDL). It is free software developed by an international team of volunteers led by Marc Schellens - the project's founder (a list of contributors is available on the project's website). The development is hosted on SourceForge where GDL continuously ranks in the 99th percentile of most active projects. GDL with its library routines is designed as a tool for numerical data analysis and visualisation. Like its proprietary counterparts (IDL and PV-WAVE), GDL is used particularly in geosciences and astronomy. GDL is dynamically-typed, vectorized and has object-oriented programming capabilities. The library routines handle numerical calculations, data visualisation, signal/image processing, interaction with the host OS and data input/output. GDL supports several data formats such as netCDF, HDF4, HDF5, GRIB, PNG, TIFF, DICOM, etc. Graphical output is handled by X11, PostScript, SVG or z-buffer terminals, the last one allowing output to be saved in a variety of raster graphics formats. GDL is an incremental compiler with integrated debugging facilities. It is written in C++ using the ANTLR language-recognition framework. Most of the library routines are implemented as interfaces to open-source packages such as GNU Scientific Library, PLplot, FFTW, ImageMagick, and others. GDL features a Python bridge (Python code can be called from GDL; GDL can be compiled as a Python module). Extensions to GDL can be written in C++, GDL, and Python. A number of open software libraries written in IDL, such as the NASA Astronomy Library, MPFIT, CMSVLIB and TeXtoIDL are fully or partially functional under GDL. Packaged versions of GDL are available for several Linux distributions and Mac OS X. The source code compiles on some other UNIX systems, including BSD and OpenSolaris. The presentation will cover the current status of the project, the key

  4. WebEQ: a web-GIS System to collect, display and query data for the management of the earthquake emergency in Central Italy

    NASA Astrophysics Data System (ADS)

    Carbone, Gianluca; Cosentino, Giuseppe; Pennica, Francesco; Moscatelli, Massimiliano; Stigliano, Francesco

    2017-04-01

    After the strong earthquakes that hit central Italy in recent months, the Center for Seismic Microzonation and its applications (CentroMS) was commissioned by the Italian Department of Civil Protection to conduct the study of seismic microzonation of the territories affected by the earthquake of August 24, 2016. As part of the activities of microzonation, IGAG CNR has created WebEQ, a management tool for the data that have been acquired by all participants (i.e., more than twenty research institutes and university departments). The data collection was organized and divided into sub-areas, assigned to working groups with multidisciplinary expertise in geology, geophysics and engineering. WebEQ is a web-GIS system that helps all the subjects involved in the data collection activities, through tools aimed at data uploading and validation, and with a simple GIS interface to display, query and download geographic data. WebEQ is contributing to the creation of a large database containing geographical data, both vector and raster, from various sources and of various types: regional technical maps; geological and geomorphological maps; data location maps; maps of microzones homogeneous in seismic perspective and seismic microzonation maps; and national strong motion network locations. Data loading is done through simple input masks that ensure consistency with the database structure, avoiding possible errors and helping users to interact with the map through user-friendly tools. All the data are thematized through standardized symbologies and colors (Gruppo di lavoro MS 2008), in order to allow easy interpretation by all users. The data download tools allow data exchange between working groups and let the scientific community benefit from the activities. The seismic microzonation activities are still ongoing. WebEQ is enabling easy management of large amounts of data and will form a basis for the development of tools for the management of upcoming seismic emergencies.

  5. Pictorial Formats. Volume 1. Format Development

    DTIC Science & Technology

    1982-02-01

    inside a threat envelope when the map scale prevents showing the normal cues. 3.1.4 Special Topographic Formats The primary tactical interest in...coverage is in white to prevent confusing it with the threat's envelopes. The border between PMAXI and RMAX2 missile ranges is lined with yellow and... prevent confusion with red-coded emergency action items. 4.3 STORES DISPLAYS: COLOR RASTER Figures 55, 56, 57 and 58 illustrate the color raster

  6. Research and Simulation in Support of Near Real Time/Real Time Reconnaissance RPV Systems

    DTIC Science & Technology

    1977-06-01

    Image 4.5.2 Raster Lines Across Image 4.5.3 Angle Projected by Displayed Image 4.6 Optical Defocusing SIMULATION CONSIDERATIONS...television and infrared, there are a finite number of resolution elements across the format. As a consequence, selection of a shorter optical focal...light that is scanned across and down the CRT to form a raster similar to that seen in a standard television tube. The light is optically projected

  7. Flat panel ferroelectric electron emission display system

    DOEpatents

    Sampayan, S.E.; Orvis, W.J.; Caporaso, G.J.; Wieskamp, T.F.

    1996-04-16

    A device is disclosed which can produce a bright, raster scanned or non-raster scanned image from a flat panel. Unlike many flat panel technologies, this device does not require ambient light or auxiliary illumination for viewing the image. Rather, this device relies on electrons emitted from a ferroelectric emitter impinging on a phosphor. This device takes advantage of a new electron emitter technology which emits electrons with significant kinetic energy and beam current density. 6 figs.

  8. A Study on the Influence of Process Parameters on the Viscoelastic Properties of ABS Components Manufactured by FDM Process

    NASA Astrophysics Data System (ADS)

    Dakshinamurthy, Devika; Gupta, Srinivasa

    2018-04-01

    Fused Deposition Modelling (FDM) is a fast-growing Rapid Prototyping (RP) technology due to its ability to build parts having complex geometrical shapes in a reasonable time period. The quality of built parts depends on many process variables. In this study, the influence of three FDM process parameters, namely slice height, raster angle and raster width, on the viscoelastic properties of Acrylonitrile Butadiene Styrene (ABS) RP specimens is studied. Statistically designed experiments have been conducted for finding the optimum process parameter setting for enhancing the storage modulus. Dynamic Mechanical Analysis has been used to understand the viscoelastic properties at various parameter settings. At the optimal parameter setting, the storage modulus and loss modulus of the ABS-RP specimen were 1008 MPa and 259.9 MPa, respectively. The relative percentage contributions of slice height and raster width to the viscoelastic properties of the FDM-RP components were found to be 55% and 31%, respectively.

  9. Video flowmeter

    DOEpatents

    Lord, David E.; Carter, Gary W.; Petrini, Richard R.

    1983-01-01

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid (10) containing entrained particles (12) is formed and positioned by a rod optic lens assembly (31) on the raster area of a low-light level television camera (20). The particles (12) are illuminated by light transmitted through a bundle of glass fibers (32) surrounding the rod optic lens assembly (31). Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen (40). The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid (10).

  10. Realising the Benefits of Adopting and Adapting Existing CF Metadata Conventions to a Broader Range of Geoscience Data

    NASA Astrophysics Data System (ADS)

    Druken, K. A.; Trenham, C. E.; Wang, J.; Bastrakova, I.; Evans, B. J. K.; Wyborn, L. A.; Ip, A. I.; Poudjom Djomani, Y.

    2016-12-01

    The National Computational Infrastructure (NCI) hosts one of Australia's largest repositories (10+ PBytes) of research data, colocated with a petascale High Performance Computer and a highly integrated research cloud. Key to maximizing benefit of NCI's collections and computational capabilities is ensuring seamless interoperable access to these datasets. This presents considerable data management challenges across the diverse range of geoscience data; spanning disciplines where netCDF-CF is commonly utilized (e.g., climate, weather, remote-sensing), through to the geophysics and seismology fields that employ more traditional domain- and study-specific data formats. These data are stored in a variety of gridded, irregularly spaced (i.e., trajectories, point clouds, profiles), and raster image structures. They often have diverse coordinate projections and resolutions, thus complicating the task of comparison and inter-discipline analysis. Nevertheless, much can be learned from the netCDF-CF model that has long served the climate community, providing a common data structure for the atmospheric, ocean and cryospheric sciences. We are extending the application of the existing Climate and Forecast (CF) metadata conventions to NCI's broader geoscience data collections. We present simple implementations that can significantly improve interoperability of the research collections, particularly in the case of line survey data. NCI has developed a compliance checker to assist with the data quality across all hosted netCDF-CF collections. The tool is an extension to one of the main existing CF Convention checkers, that we have modified to incorporate the Attribute Convention for Data Discovery (ACDD) and ISO19115 standards, and to perform parallelised checks over collections of files, ensuring compliance and consistency across the NCI data collections as a whole. 
It is complemented by a checker that also verifies functionality against a range of scientific analysis, programming

  11. Method of composing two-dimensional scanned spectra observed by the New Vacuum Solar Telescope

    NASA Astrophysics Data System (ADS)

    Cai, Yun-Fang; Xu, Zhi; Chen, Yu-Chao; Xu, Jun; Li, Zheng-Gang; Fu, Yu; Ji, Kai-Fan

    2018-04-01

    In this paper we illustrate the technique used by the New Vacuum Solar Telescope (NVST) to increase the spatial resolution of two-dimensional (2D) solar spectroscopy observations involving two dimensions of space and one of wavelength. Without an image stabilizer at the NVST, large scale wobble motion is present during the spatial scanning, whose instantaneous amplitude can reach 1.3″ due to the Earth’s atmosphere and the precision of the telescope guiding system, and seriously decreases the spatial resolution of 2D spatial maps composed with scanned spectra. We make the following effort to resolve this problem: the imaging system (e.g., the TiO-band) is used to record and detect the displacement vectors of solar image motion during the raster scan, in both the slit and scanning directions. The spectral data (e.g., the Hα line) which are originally obtained in time sequence are corrected and re-arranged in space according to those displacement vectors. Raster scans are carried out in several active regions with different seeing conditions (two rasters are illustrated in this paper). Given a certain spatial sampling and temporal resolution, the spatial resolution of the composed 2D map could be close to that of the slit-jaw image. The resulting quality after correction is quantitatively evaluated with two methods. A physical quantity, such as the line-of-sight velocities in multiple layers of the solar atmosphere, is also inferred from the re-arranged spectrum, demonstrating the advantage of this technique.

  12. The Need of Nested Grids for Aerial and Satellite Images and Digital Elevation Models

    NASA Astrophysics Data System (ADS)

    Villa, G.; Mas, S.; Fernández-Villarino, X.; Martínez-Luceño, J.; Ojeda, J. C.; Pérez-Martín, B.; Tejeiro, J. A.; García-González, C.; López-Romero, E.; Soteres, C.

    2016-06-01

    Usual workflows for production, archiving, dissemination and use of Earth observation images (both aerial and from remote sensing satellites) pose big interoperability problems, for example: non-alignment of pixels at the different levels of the pyramids, which makes it impossible to overlay, compare and mosaic different orthoimages without resampling them, and the need to apply multiple resampling and compression-decompression cycles. These problems cause great inefficiencies in production, dissemination through web services and processing in "Big Data" environments. Most of them can be avoided, or at least greatly reduced, with the use of a common "nested grid" for multiresolution production, archiving, dissemination and exploitation of orthoimagery, digital elevation models and other raster data. "Nested grids" are space allocation schemas that organize image footprints, pixel sizes and pixel positions at all pyramid levels, in order to achieve coherent and consistent multiresolution coverage of a whole working area. A "nested grid" must be complemented by an appropriate "tiling schema", ideally based on the "quad-tree" concept. In recent years a "de facto standard" grid and tiling schema has emerged and has been adopted by virtually all major geospatial data providers. It has also been adopted by OGC in its "WMTS Simple Profile" standard. In this paper we explain how the adequate use of this tiling schema as a common nested grid for orthoimagery, DEMs and other types of raster data constitutes the most practical solution to most of the interoperability problems of these types of data.
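    The de facto tiling schema mentioned above places 2^z × 2^z Web-Mercator tiles at zoom level z, with each tile nesting exactly inside one tile of the next-coarser level. A minimal sketch of the tile arithmetic (function names are illustrative):

```python
import math

def lonlat_to_tile(lon_deg, lat_deg, zoom):
    """Tile indices for a point in the de-facto Web-Mercator quad-tree
    schema: 2**zoom x 2**zoom tiles at each pyramid level."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi) / 2.0 * n)
    return x, y

def parent_tile(x, y, zoom):
    """Nesting property: each tile sits exactly inside one tile of the
    next-coarser level, so pyramid levels stay pixel-aligned."""
    return x // 2, y // 2, zoom - 1
```

    Because every tile at level z maps to exactly one parent at level z-1, orthoimages produced on this grid can be overlaid and mosaicked across resolutions without resampling.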

  13. Analyzing the Implications of Climate Data on Plant Hardiness Zones for Green Infrastructure Planning: Case Study of Knoxville, Tennessee and Surrounding Region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sylvester, Linda M.; Omitaomu, Olufemi A.; Parish, Esther S.

    Downscaled climate data for Knoxville, Tennessee and the surrounding region were used to investigate future changes to Plant Hardiness Zones due to climate change. The methodology used is the same as that of the US Department of Agriculture (USDA), well known for its creation of the standard Plant Hardiness Zone map used by gardeners and planners. USDA data were calculated from observed daily data for 1976–2005. The modeled climate data for the past are daily data for 1980–2005, and the future data are projected for 2025–2050. The average of all the modeled annual extreme minima for each time period of interest was calculated. Each 1 km raster cell was placed into zone categories based on temperature, using the same criteria and categories as the USDA. The individual models vary from suggesting little change to the Plant Hardiness Zones to suggesting Knoxville moves into the next two Hardiness Zones. But overall, the models suggest moving into the next warmer Zone. USDA currently has the Knoxville area categorized as Zone 7a. None of the Zones calculated from the climate data models placed Knoxville in Zone 7a for a similar time period. The models placed Knoxville in a cooler Hardiness Zone and projected the area to increase to Zone 7. The modeled temperature data appear to be slightly cooler than the actual temperature data, and this may explain the zone discrepancy. However, overall Knoxville is projected to increase to the next warmer Zone. As the modeled data have Knoxville, overall, moving from Zone 6 to Zone 7, it can be inferred that Knoxville, Tennessee may increase from its current Zone 7 to Zone 8.
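    The USDA zone assignment used here maps the average annual extreme minimum temperature into 10 °F zones split into 5 °F 'a'/'b' half zones. A minimal sketch of that binning (function name illustrative; assumes temperatures within the defined range of -60 °F and above):

```python
def usda_zone(t_min_f):
    """USDA Plant Hardiness Zone label from the average annual extreme
    minimum temperature in degrees F.  Zone 1a starts at -60 F; each
    full zone spans 10 F and is split into 5 F 'a'/'b' half zones."""
    half = int((t_min_f + 60.0) // 5.0)   # index of the 5-F half zone
    return f"{half // 2 + 1}{'ab'[half % 2]}"
```

    For example, an extreme minimum of 0–5 °F falls in Zone 7a, matching the USDA's current classification of Knoxville.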

  14. Improving global data infrastructures for more effective and scalable analysis of Earth and environmental data: the Australian NCI NERDIP Approach

    NASA Astrophysics Data System (ADS)

    Evans, Ben; Wyborn, Lesley; Druken, Kelsey; Richards, Clare; Trenham, Claire; Wang, Jingbo; Rozas Larraondo, Pablo; Steer, Adam; Smillie, Jon

    2017-04-01

    The National Computational Infrastructure (NCI) facility hosts one of Australia's largest repositories (10+ PBytes) of research data collections spanning datasets from climate, coasts, oceans, and geophysics through to astronomy, bioinformatics, and the social sciences domains. The data are obtained from national and international sources, spanning a wide range of gridded and ungridded (i.e., line surveys, point clouds) data, and raster imagery, as well as diverse coordinate reference projections and resolutions. Rather than managing these data assets as a digital library, whereby users can discover and download files to personal servers (similar to borrowing 'books' from a 'library'), NCI has built an extensive and well-integrated research data platform, the National Environmental Research Data Interoperability Platform (NERDIP, http://nci.org.au/data-collections/nerdip/). The NERDIP architecture enables programmatic access to data via standards-compliant services for high performance data analysis, and provides a flexible cloud-based environment to facilitate the next generation of transdisciplinary scientific research across all data domains. To improve use of modern scalable data infrastructures that are focused on efficient data analysis, the data organisation needs to be carefully managed including performance evaluations of projections and coordinate systems, data encoding standards and formats. A complication is that we have often found multiple domain vocabularies and ontologies are associated with equivalent datasets. It is not practical for individual dataset managers to determine which standards are best to apply to their dataset as this could impact accessibility and interoperability. Instead, they need to work with data custodians across interrelated communities and, in partnership with the data repository, the international scientific community to determine the most useful approach. For the data repository, this approach is essential to enable

  15. Facilitating Scientific Collaboration and Education with Easy Access Web Maps Using the AGAP Antarctic Geophysical Data

    NASA Astrophysics Data System (ADS)

    Abdi, A.

    2012-12-01

    Science and science education benefit from easy access to data, yet geophysical data sets are often large, complex and difficult to share. The difficulty in sharing data and imagery easily inhibits both collaboration and the use of real data in educational applications. The dissemination of data products through web maps serves as a very efficient and user-friendly method for students, the public and the science community to gain insights and understanding from data. Few research groups provide direct access to their data, let alone map-based visualizations. By building upon current GIS infrastructure with web mapping technologies, like ArcGIS Server, scientific groups, institutions and agencies can enhance the value of their GIS investments. The advantages of web maps for serving data products are many; existing web-mapping technology allows complex GIS analysis to be shared across the Internet, and can be easily scaled from a few users to millions. This poster highlights the features of an interactive web map developed at the Polar Geophysics Group at the Lamont-Doherty Earth Observatory of Columbia University that provides a visual representation of, and access to, data products that resulted from the group's recently concluded AGAP project (http://pgg.ldeo.columbia.edu). The AGAP project collected more than 120,000 line km of new aerogeophysical data using two Twin Otter aircraft. Data included ice penetrating radar, magnetometer, gravimeter and laser altimeter measurements. The web map is based upon ArcGIS Viewer for Flex, which is a configurable client application built on the ArcGIS API for Flex that works seamlessly with ArcGIS Server 10. The application can serve a variety of raster and vector file formats through the Data Interoperability extension for Server, which eliminates data sharing barriers across numerous file formats. The ability of the application to serve large datasets is only hindered by the availability of appropriate hardware. ArcGIS is a proprietary

  16. Global multi-resolution terrain elevation data 2010 (GMTED2010)

    USGS Publications Warehouse

    Danielson, Jeffrey J.; Gesch, Dean B.

    2011-01-01

    In 1996, the U.S. Geological Survey (USGS) developed a global topographic elevation model designated as GTOPO30 at a horizontal resolution of 30 arc-seconds for the entire Earth. Because no single source of topographic information covered the entire land surface, GTOPO30 was derived from eight raster and vector sources that included a substantial amount of U.S. Defense Mapping Agency data. The quality of the elevation data in GTOPO30 varies widely; there are no spatially-referenced metadata, and the major topographic features such as ridgelines and valleys are not well represented. Despite its coarse resolution and limited attributes, GTOPO30 has been widely used for a variety of hydrological, climatological, and geomorphological applications as well as military applications, where a regional, continental, or global scale topographic model is required. These applications have ranged from delineating drainage networks and watersheds to using digital elevation data for the extraction of topographic structure and three-dimensional (3D) visualization exercises (Jenson and Domingue, 1988; Verdin and Greenlee, 1996; Lehner and others, 2008). Many of the fundamental geophysical processes active at the Earth's surface are controlled or strongly influenced by topography, thus the critical need for high-quality terrain data (Gesch, 1994). U.S. Department of Defense requirements for mission planning, geographic registration of remotely sensed imagery, terrain visualization, and map production are similarly dependent on global topographic data. Since the time GTOPO30 was completed, the availability of higher-quality elevation data over large geographic areas has improved markedly. New data sources include global Digital Terrain Elevation Data (DTED®) from the Shuttle Radar Topography Mission (SRTM), Canadian elevation data, and data from the Ice, Cloud, and land Elevation Satellite (ICESat). Given the widespread use of GTOPO30 and the equivalent 30-arc

  17. Vector assembly of colloids on monolayer substrates

    NASA Astrophysics Data System (ADS)

    Jiang, Lingxiang; Yang, Shenyu; Tsang, Boyce; Tu, Mei; Granick, Steve

    2017-06-01

    The key to spontaneous and directed assembly is to encode the desired assembly information into building blocks in a programmable and efficient way. In computer graphics, raster graphics encodes images on a single-pixel level, conferring fine details at the expense of large file sizes, whereas vector graphics encodes shape information into vectors that allow small file sizes and operational transformations. Here, we adapt this raster/vector concept to a 2D colloidal system and realize 'vector assembly' by manipulating particles on a colloidal monolayer substrate with optical tweezers. In contrast to raster assembly, which assigns optical tweezers to each particle, vector assembly requires a minimal number of optical tweezers that allow operations like chain elongation and shortening. This vector approach enables simple uniform particles to form a vast collection of colloidal arenes and colloidenes, the spontaneous dissociation of which is achieved with precision and stage-by-stage complexity by simply removing the optical tweezers.

  18. A multiresolution hierarchical classification algorithm for filtering airborne LiDAR data

    NASA Astrophysics Data System (ADS)

    Chen, Chuanfa; Li, Yanyan; Li, Wei; Dai, Honglei

    2013-08-01

    We present a multiresolution hierarchical classification (MHC) algorithm for separating ground from non-ground LiDAR points based on point residuals from an interpolated raster surface. MHC includes three levels of hierarchy, with the simultaneous increase of cell resolution and residual threshold from the low to the high level of the hierarchy. At each level, the surface is iteratively interpolated towards the ground using thin plate spline (TPS) interpolation until no new ground points are classified, and the classified ground points are used to update the surface in the next iteration. Fifteen groups of benchmark data, provided by the International Society for Photogrammetry and Remote Sensing (ISPRS) commission, were used to compare the performance of MHC with that of 17 other published filtering methods. Results indicate that MHC, with an average total error of 4.11% and an average Cohen's kappa coefficient of 86.27%, outperforms all of the other filtering methods.
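
    The iterative surface-refinement loop described in this abstract can be sketched as follows. This is a simplified illustration, not the authors' implementation: a per-cell minimum surface stands in for the thin plate spline, and the level parameters are invented for the example.

```python
import numpy as np

def mhc_filter(points, levels=((4.0, 0.5), (2.0, 0.3), (1.0, 0.2)), n_iter=5):
    """Sketch of multiresolution hierarchical classification (MHC).

    points : (N, 3) array of x, y, z LiDAR returns.
    levels : (cell_size, residual_threshold) pairs, coarse to fine.
    """
    z = points[:, 2]
    # seed the ground set with the lowest returns
    ground = z <= np.percentile(z, 10)
    for cell, thresh in levels:
        for _ in range(n_iter):
            ix = (points[:, 0] // cell).astype(int)
            iy = (points[:, 1] // cell).astype(int)
            # provisional surface: minimum ground elevation per cell
            surf = {}
            for i in np.flatnonzero(ground):
                key = (ix[i], iy[i])
                surf[key] = min(surf.get(key, np.inf), z[i])
            # a point joins the ground set if its residual from the
            # surface is within this level's threshold
            new = ground.copy()
            for i in range(len(points)):
                key = (ix[i], iy[i])
                if key in surf and z[i] - surf[key] <= thresh:
                    new[i] = True
            if new.sum() == ground.sum():
                break  # no new ground points classified at this level
            ground = new
    return ground
```

    On a flat synthetic tile with a few elevated vegetation returns, the loop keeps the low points and rejects the high residuals, mirroring the qualitative behaviour the abstract describes.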

  19. Exploring U.S. Cropland - A Web Service based Cropland Data Layer Visualization, Dissemination and Querying System (Invited)

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Han, W.; di, L.

    2010-12-01

    The National Agricultural Statistics Service (NASS) of the USDA produces the Cropland Data Layer (CDL) product, a raster-formatted, geo-referenced, U.S. crop-specific land cover classification. These digital data layers are widely used by universities, research institutions, government agencies, and private industry in climate change studies, environmental ecosystem studies, bioenergy production and transportation planning, environmental health research, and agricultural production decision making. The CDL is also used internally by NASS for crop acreage and yield estimation. Like most geospatial data products, the CDL is only available by CD/DVD delivery or online bulk file download via the Natural Resources Conservation Service (NRCS) Geospatial Data Gateway (for external users), or in printed paper map format. There is no online geospatial information access and dissemination, no crop visualization and browsing, no geospatial query capability, and no online analytics. To facilitate the application of this data layer and to help disseminate the data, a web-service-based CDL interactive map visualization, dissemination, and querying system is proposed. It uses a web-service-based service-oriented architecture, adopts open-standard geospatial information technology and OGC specifications and standards, and re-uses functions/algorithms from GeoBrain technology developed at George Mason University. The system provides on-line geospatial crop information access, query, and on-line analytics via interactive maps. It disseminates data to decision makers and users via real-time retrieval, processing, and publishing over the web through standards-based geospatial web services. A CDL region of interest can also be exported directly to Google Earth for mashup or downloaded for use with other desktop applications. This web-service-based system greatly improves equal-accessibility, interoperability, usability
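
    A client of such a system would typically fetch a CDL region of interest through a standard OGC WMS GetMap request. The sketch below builds one with the standard library; the endpoint URL and layer name are placeholders, since a real deployment advertises its layers in its WMS GetCapabilities document.

```python
from urllib.parse import urlencode, urlparse, parse_qs

def cdl_getmap_url(base_url, bbox, layer="cdl_2009", size=(512, 512)):
    """Build an OGC WMS 1.1.1 GetMap request for a region of interest.

    bbox is (min_lon, min_lat, max_lon, max_lat) in EPSG:4326.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": "EPSG:4326",                       # lon/lat bounding box
        "BBOX": ",".join(f"{v:.4f}" for v in bbox),
        "WIDTH": str(size[0]),
        "HEIGHT": str(size[1]),
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

url = cdl_getmap_url("https://example.org/wms", (-98.0, 40.0, -97.0, 41.0))
```

    The same request structure, with FORMAT switched to a georeferenced encoding, is the kind of interface that makes a raster layer consumable by desktop GIS packages and virtual globes alike.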

  20. Digital exchange of graphic arts material: the ultimate challenge

    NASA Astrophysics Data System (ADS)

    McDowell, David Q.

    1996-02-01

    The digital exchange of graphic arts material - particularly advertising material for publications - in an open standardized environment represents the ultimate challenge for electronic data exchange. To meet the needs of publication advertising, the graphic arts industry must be able to transmit advertisements in an open environment where there are many senders and many receivers of the material. The material being transmitted consists of combinations of pictorial material, text, and line art with these elements superimposed on top of each other and/or interrelated in complex ways. The business relationships established by the traditional workflow environment, the combination of aesthetic and technical requirements, and the large base of existing hardware and software play a major role in limiting the options available. Existing first- and second-generation standards are focused on the CEPS environment, which operates on and stores data as raster files. The revolution in personal computer hardware and software, and the acceptance of these tools by the graphic arts community, dictates that standards must also be created and implemented for this world of vector/raster-based systems. The requirements for digital distribution of advertising material for publications, the existing graphic arts standards base, and the anticipation of future standards developments in response to these needs are explored.

  2. U.S. conterminous wall-to-wall anthropogenic land use trends (NWALT), 1974–2012

    USGS Publications Warehouse

    Falcone, James A.

    2015-09-14

    This dataset provides a U.S. national 60-meter, 19-class mapping of anthropogenic land uses for five time periods: 1974, 1982, 1992, 2002, and 2012. The 2012 dataset is based on a slightly modified version of the National Land Cover Database 2011 (NLCD 2011) that was recoded to a schema of land uses, and mapped back in time to develop datasets for the four earlier eras. The time periods coincide with U.S. Department of Agriculture (USDA) Census of Agriculture data collection years. Changes are derived from (a) known changes in water bodies from reservoir construction or removal; (b) housing unit density changes; (c) regional mining/extraction trends; (d) for 1999–2012, timber and forestry activity based on U.S. Geological Survey (USGS) Landscape Fire and Resource Management Planning Tools (Landfire) data; (e) county-level USDA Census of Agriculture change in cultivated land; and (f) establishment dates of major conservation areas. The data are compared to several other published studies and datasets as validation. Caveats are provided about limitations of the data for some classes. The work was completed as part of the USGS National Water-Quality Assessment (NAWQA) Program and termed the NAWQA Wall-to-Wall Anthropogenic Land Use Trends (NWALT) dataset. The associated datasets include five 60-meter geospatial rasters showing anthropogenic land use for the years 1974, 1982, 1992, 2002, and 2012, and 14 rasters showing the annual extent of timber clearcutting and harvest from 1999 to 2012.

  3. Land Boundary Conditions for the Goddard Earth Observing System Model Version 5 (GEOS-5) Climate Modeling System: Recent Updates and Data File Descriptions

    NASA Technical Reports Server (NTRS)

    Mahanama, Sarith P.; Koster, Randal D.; Walker, Gregory K.; Takacs, Lawrence L.; Reichle, Rolf H.; De Lannoy, Gabrielle; Liu, Qing; Zhao, Bin; Suarez, Max J.

    2015-01-01

    The Earth's land surface boundary conditions in the Goddard Earth Observing System version 5 (GEOS-5) modeling system were updated using recent high spatial and temporal resolution global data products. The updates include: (i) construction of a global 10-arcsec land-ocean-lakes-ice mask; (ii) incorporation of a 10-arcsec Globcover 2009 land cover dataset; (iii) implementation of Level 12 Pfafstetter hydrologic catchments; (iv) use of hybridized SRTM global topography data; (v) construction of the HWSDv1.21-STATSGO2 merged global 30-arc-second soil mineral and carbon data in conjunction with a highly-refined soil classification system; (vi) production of diffuse visible and near-infrared 8-day MODIS albedo climatologies at 30-arcsec for the period 2001-2011; and (vii) production of the GEOLAND2 and MODIS merged 8-day LAI climatology at 30-arcsec for GEOS-5. The global data sets were preprocessed and used to construct global raster data files for the software (mkCatchParam) that computes parameters on catchment-tiles for various atmospheric grids. The updates also include a few bug fixes in mkCatchParam, as well as changes (improvements in algorithms, etc.) to mkCatchParam that allow it to produce tile-space parameters efficiently for high resolution AGCM grids. The update process also includes the construction of data files describing the vegetation type fractions, soil background albedo, nitrogen deposition and mean annual 2-m air temperature to be used with the future Catchment CN model and the global stream channel network to be used with the future global runoff routing model. This report provides detailed descriptions of the data production process and data file format of each updated data set.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soares, Alexei S.; Engel, Matthew A.; Stearns, Richard

    We demonstrate a general strategy for determining structures from showers of microcrystals. It uses acoustic droplet ejection to transfer 2.5 nL droplets from the surface of microcrystal slurries, through the air, onto mounting micromesh pins. Individual microcrystals are located by raster-scanning a several-micrometer X-ray beam across the cryocooled micromeshes. X-ray diffraction data sets merged from several micrometer-sized crystals are used to determine 1.8 Å resolution crystal structures.

  5. Planetary SUrface Portal (PSUP): a tool for easy visualization and analysis of Martian surface

    NASA Astrophysics Data System (ADS)

    Poulet, Francois; Quantin-Nataf, Cathy; Ballans, Hervé; Lozac'h, Loic; Audouard, Joachim; Carter, John; Dassas, Karin; Malapert, Jean-Christophe; Marmo, Chiara; Poulleau, Gilles; Riu, Lucie; Séjourné, Antoine

    2016-10-01

    PSUP comprises two software application platforms for working with raster, vector, DTM, and hyper-spectral data acquired by various space instruments analyzing the surface of Mars from orbit. The first platform of PSUP is MarsSI (Martian surface data processing Information System, http://emars.univ-lyon1.fr). It provides data analysis functionalities to select and download ready-to-use products or to process data through specific and validated pipelines. To date, MarsSI handles CTX, HiRISE and CRISM data of the NASA/MRO mission, HRSC and OMEGA data of the ESA/MEx mission, and THEMIS data of the NASA/ODY mission (Lozac'h et al., EPSC 2015). The second part of PSUP is also open to the scientific community and can be visited at http://psup.ias.u-psud.fr/. This web-based user interface provides access to many data products for Mars: image footprints and rasters from the MarsSI tool; compositional maps from OMEGA and TES; albedo and thermal inertia from OMEGA and TES; mosaics from THEMIS, Viking, and CTX; and high-level specific products (defined as catalogues) such as hydrated mineral sites derived from CRISM and OMEGA data, central peaks mineralogy,… In addition, OMEGA C channel data cubes corrected for atmospheric and aerosol contributions can be downloaded. The architecture of PSUP data management and visualization is based on SITools2 and MIZAR, two generic tools developed by a joint effort between CNES and scientific laboratories. SITools2 provides a self-manageable data access layer deployed on the PSUP data, while MIZAR is a 3D in-browser application for discovering and visualizing geospatial data. Further developments, including the addition of high-level products of Mars (regional geological maps, new global compositional maps,…), are foreseen. Ultimately, PSUP will be adapted to other planetary surfaces and space missions in which the French research institutes are involved.

  6. dada - a web-based 2D detector analysis tool

    NASA Astrophysics Data System (ADS)

    Osterhoff, Markus

    2017-06-01

    The data daemon, dada, is a server backend for unified access to 2D pixel detector image data stored with different detectors, file formats, and varying naming conventions and folder structures across instruments. Furthermore, dada implements basic pre-processing and analysis routines, ranging from pixel binning and azimuthal integration to raster scan processing. Common user interactions with dada are through a web frontend, but all parameters for an analysis are encoded into a Uniform Resource Identifier (URI), which can also be written by hand or by scripts for batch processing.
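
    The URI-as-analysis-request idea lends itself to scripted batch processing, as the abstract notes. A minimal sketch of generating such URIs follows; the parameter names (detector, scan, binning, mode) are illustrative stand-ins, not dada's actual API.

```python
from urllib.parse import urlencode

def dada_uri(base, **params):
    """Encode one analysis request as a URI, in the spirit of dada's
    scriptable frontend. Sorting keys makes the URI deterministic,
    which helps when URIs double as cache keys."""
    return base + "?" + urlencode(sorted(params.items()))

# batch processing: one URI per scan number, writable by a plain script
uris = [dada_uri("http://example.org/dada", detector="pilatus",
                 scan=n, binning=4, mode="azimuthal")
        for n in range(3)]
```

    Because the full analysis state lives in the URI, a batch of requests can be generated, logged, and replayed without touching the web frontend.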

  7. rasdaman Array Database: current status

    NASA Astrophysics Data System (ADS)

    Merticariu, George; Toader, Alexandru

    2015-04-01

    rasdaman (Raster Data Manager) is a Free Open Source Array Database Management System which provides functionality for storing and processing massive amounts of raster data in the form of multidimensional arrays. The user can access, process and delete the data using SQL. The key features of rasdaman are: flexibility (datasets of any dimensionality can be processed with the help of SQL queries), scalability (rasdaman's distributed architecture enables it to seamlessly run on cloud infrastructures while offering an increase in performance as computation resources increase), performance (real-time access, processing, mixing and filtering of arrays of any dimensionality), and reliability (the legacy communication protocol has been replaced with a new one based on cutting-edge technology: Google Protocol Buffers and ZeroMQ). The data handled by the system include 1D time series, 2D remote sensing imagery, 3D image time series, 3D geophysical data, and 4D atmospheric and climate data. Most of these representations cannot be stored only in the form of raw arrays, as the location information of the contents is also needed for correct geopositioning on Earth. This is defined by ISO 19123 as coverage data. rasdaman provides coverage data support through the Petascope service. Extensions were added on top of rasdaman in order to provide support for the Geoscience community. The following OGC standards are currently supported: Web Map Service (WMS), Web Coverage Service (WCS), and Web Coverage Processing Service (WCPS). The Web Map Service is an extension which provides zoom and pan navigation over images provided by a map server. Starting with version 9.1, rasdaman supports WMS version 1.3. The Web Coverage Service provides capabilities for downloading multi-dimensional coverage data. Support is also provided for several extensions of this service: Subsetting Extension, Scaling Extension, and, starting with version 9.1, Transaction Extension, which
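
    A WCPS request of the kind Petascope serves can be composed as a short functional expression over a coverage. The sketch below builds one such query string; the coverage and axis names (MeanTemperature, Lat, Long) are illustrative, since real names come from the server's DescribeCoverage response.

```python
from urllib.parse import urlencode

def wcps_subset_query(coverage, lat, lon, fmt="tiff"):
    """Compose a WCPS query that subsets a coverage along two axes
    and encodes the result in the requested format."""
    return (
        f'for $c in ({coverage}) '
        f'return encode($c[Lat({lat[0]}:{lat[1]}), Long({lon[0]}:{lon[1]})], '
        f'"{fmt}")'
    )

query = wcps_subset_query("MeanTemperature", (40, 45), (10, 15))
# shipped to the server as a WCS ProcessCoverages KVP request
request = "https://example.org/rasdaman/ows?" + urlencode(
    {"service": "WCS", "version": "2.0.1",
     "request": "ProcessCoverages", "query": query})
```

    The same `for … return encode(…)` pattern extends to band math and aggregation, which is what makes WCPS a query language rather than a plain download interface.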

  8. User's Manual for Flight Simulator Display System (FSDS)

    NASA Technical Reports Server (NTRS)

    Egerdahl, C. C.

    1979-01-01

    The capabilities of the flight simulator display system (FSDS) are described. FSDS is a color raster-scan display generator designed to meet the special needs of flight simulation laboratories. The FSDS can update (revise) the images it generates every 16.6 ms, with limited support from a host processor. This corresponds to the standard TV vertical rate of 60 Hz, and allows the system to carry out display functions in a time-critical environment. Rotation of a complex image in the television raster with minimal hardware is possible with the system.

  9. Multicriteria analysis for sources of renewable energy using data from remote sensing

    NASA Astrophysics Data System (ADS)

    Matejicek, L.

    2015-04-01

    Renewable energy sources are major components of the strategy to reduce harmful emissions and to replace depleting fossil energy resources. Data from remote sensing can provide information for multicriteria analysis of sources of renewable energy. Advanced land cover quantification makes it possible to search for suitable sites. Multicriteria analysis, together with other data, is used to determine the energy potential and social acceptability of suggested locations. The described case study is focused on an area of surface coal mines in the northwestern region of the Czech Republic, where the impacts of surface mining and reclamation constitute a dominant force in land cover changes. High-resolution satellite images represent the main input datasets for identification of suitable sites. Solar mapping, wind predictions, the location of weirs in watersheds, road maps and demographic information complement the data from remote sensing for multicriteria analysis, which is implemented in a geographic information system (GIS). The input spatial datasets for multicriteria analysis in GIS are reclassified to a common scale and processed with raster algebra tools to identify suitable sites for sources of renewable energy. The selection of suitable sites is limited by the CORINE land cover database to mining and agricultural areas. The case study is focused on long-term land cover changes in the 1985-2015 period. Multicriteria analysis based on CORINE data shows moderate changes in mapping of suitable sites for utilization of selected sources of renewable energy in 1990, 2000, 2006 and 2012. The results represent map layers showing the energy potential on a scale of a few preference classes (1-7), where the first class is linked to minimum preference and the last class to maximum preference. The attached histograms show the moderate variability of preference classes due to land cover changes caused by mining activities.
The results also show a slight increase in the more
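
    The reclassify-to-a-common-scale and raster-algebra step described above can be sketched with numpy. The criteria, weights, and mask below are invented for illustration; a real analysis would draw them from the satellite, wind, and CORINE layers the abstract lists.

```python
import numpy as np

def to_classes(raster, n_classes=7):
    """Reclassify a criterion raster to a common 1..n preference scale
    by equal-interval slicing (one simple choice; quantiles also work)."""
    lo, hi = np.nanmin(raster), np.nanmax(raster)
    scaled = (raster - lo) / (hi - lo + 1e-12)
    return np.clip((scaled * n_classes).astype(int) + 1, 1, n_classes)

# toy criteria: solar potential, wind speed, distance to roads (inverted
# so that nearer roads score higher)
rng = np.random.default_rng(0)
solar, wind, road_dist = (rng.random((50, 50)) for _ in range(3))
suitable_mask = rng.random((50, 50)) > 0.3   # e.g. mining/agricultural areas

weights = {"solar": 0.5, "wind": 0.3, "roads": 0.2}
score = (weights["solar"] * to_classes(solar)
         + weights["wind"] * to_classes(wind)
         + weights["roads"] * to_classes(1.0 - road_dist))
# final map: preference classes 1-7 inside the eligible area, 0 outside
preference = np.where(suitable_mask, np.rint(score).astype(int), 0)
```

    Repeating the same computation per CORINE epoch (1990, 2000, 2006, 2012) yields the sequence of preference-class maps whose histograms the study compares.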

  10. EAARL Coastal Topography - Northern Gulf of Mexico, 2007: First Surface

    USGS Publications Warehouse

    Smith, Kathryn E.L.; Nayegandhi, Amar; Wright, C. Wayne; Bonisteel, Jamie M.; Brock, John C.

    2009-01-01

    These remotely sensed, geographically referenced elevation measurements of Lidar-derived first surface (FS) elevation data were produced as a collaborative effort between the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL; the National Park Service (NPS), Gulf Coast Network, Lafayette, LA; and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. The project provides highly detailed and accurate datasets of select barrier islands and peninsular regions of Louisiana, Mississippi, Alabama, and Florida, acquired June 27-30, 2007. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative airborne Lidar instrument originally developed at the NASA Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL), was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) Lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive Lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multi-spectral color infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which together provide submeter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a Lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys.
Elevation measurements were collected over the survey area using the EAARL system

  11. Appalachian Basin Play Fairway Analysis: Thermal Quality Analysis in Low-Temperature Geothermal Play Fairway Analysis (GPFA-AB)

    DOE Data Explorer

    Teresa E. Jordan

    2015-11-15

    This collection of files is part of a larger dataset uploaded in support of Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (GPFA-AB, DOE Project DE-EE0006726). Phase 1 of the GPFA-AB project identified potential Geothermal Play Fairways within the Appalachian basin of Pennsylvania, West Virginia and New York. This was accomplished through analysis of 4 key criteria or ‘risks’: thermal quality, natural reservoir productivity, risk of seismicity, and heat utilization. Each of these analyses represents a distinct project task, with the fifth task encompassing combination of the 4 risk factors. Supporting data for all five tasks have been uploaded into the Geothermal Data Repository node of the National Geothermal Data System (NGDS). This submission comprises the data for Thermal Quality Analysis (project task 1) and includes all of the necessary shapefiles, rasters, datasets, code, and references to code repositories that were used to create the thermal resource and risk factor maps as part of the GPFA-AB project. The identified Geothermal Play Fairways are also provided with the larger dataset. Figures (.png) are provided as examples of the shapefiles and rasters. The regional standardized 1 square km grid used in the project is also provided as points (cell centers), polygons, and as a raster. Two ArcGIS toolboxes are available: 1) RegionalGridModels.tbx for creating resource and risk factor maps on the standardized grid, and 2) ThermalRiskFactorModels.tbx for use in making the thermal resource maps and cross sections. These toolboxes contain “item description” documentation for each model within the toolbox, and for the toolbox itself. This submission also contains three R scripts: 1) AddNewSeisFields.R to add seismic risk data to attribute tables of seismic risk, 2) StratifiedKrigingInterpolation.R for the interpolations used in the thermal resource analysis, and 3) LeaveOneOutCrossValidation.R for the cross validations used in

  12. Comparative analysis of remotely-sensed data products via ecological niche modeling of avian influenza case occurrences in Middle Eastern poultry.

    PubMed

    Bodbyl-Roels, Sarah; Peterson, A Townsend; Xiao, Xiangming

    2011-03-28

    Ecological niche modeling integrates known sites of occurrence of species or phenomena with data on environmental variation across landscapes to infer environmental spaces potentially inhabited (i.e., the ecological niche) to generate predictive maps of potential distributions in geographic space. Key inputs to this process include raster data layers characterizing spatial variation in environmental parameters, such as vegetation indices from remotely sensed satellite imagery. The extent to which ecological niche models reflect real-world distributions depends on a number of factors, but an obvious concern is the quality and content of the environmental data layers. We assessed ecological niche model predictions of H5N1 avian flu presence quantitatively within and among four geographic regions, based on models incorporating two means of summarizing three vegetation indices derived from the MODIS satellite. We evaluated our models for predictive ability using partial ROC analysis and GLM ANOVA to compare performance among indices and regions. We found correlations between vegetation indices to be high, such that they contain information that overlaps broadly. Neither the type of vegetation index used nor method of summary affected model performance significantly. However, the degree to which model predictions had to be transferred (i.e., projected onto landscapes and conditions not represented on the landscape of training) impacted predictive strength greatly (within-region model predictions far out-performed models projected among regions). Our results provide the first quantitative tests of most appropriate uses of different remotely sensed data sets in ecological niche modeling applications. While our testing did not result in a decisive "best" index product or means of summarizing indices, it emphasizes the need for careful evaluation of products used in modeling (e.g. matching temporal dimensions and spatial resolution) for optimum performance, instead of
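
    The abstract's key finding, that the vegetation indices carry broadly overlapping information, amounts to checking pairwise correlations between the raster layers. A toy sketch with synthetic stand-ins for the MODIS index layers:

```python
import numpy as np

# toy stand-ins for flattened vegetation index layers over one region;
# real values would come from remotely sensed imagery, and the high
# correlation between the first two layers is built in for illustration
rng = np.random.default_rng(1)
base = rng.random(10_000)
ndvi = base + 0.05 * rng.standard_normal(10_000)
evi = base + 0.05 * rng.standard_normal(10_000)
other = rng.random(10_000)                      # an unrelated layer

r = np.corrcoef([ndvi, evi, other])
# a high |r| between two layers means they contribute largely
# redundant information to a niche model
```

    In practice one would flatten each index raster over the same pixel grid before calling `np.corrcoef`, and drop or combine layers whose correlation is near 1.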

  13. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    NASA Astrophysics Data System (ADS)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

    The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for infrastructure for remotely browsing, visualizing, and analyzing the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.

  14. Improving the Terrain-Based Parameter for the Assessment of Snow Redistribution in the Col du Lac Blanc Area and Comparisons with TLS Snow Depth Data

    NASA Astrophysics Data System (ADS)

    Schön, Peter; Prokop, Alexander; Naaim-Bouvet, Florence; Nishimura, Kouichi; Vionnet, Vincent; Guyomarc'h, Gilbert

    2014-05-01

    Wind and the associated snow drift are dominating factors determining the snow distribution and accumulation in alpine areas, resulting in a high spatial variability of snow depth that is difficult to evaluate and quantify. The terrain-based parameter Sx characterizes the degree of shelter or exposure of a grid point provided by the upwind terrain, without the computational complexity of numerical wind field models. The parameter has been shown to qualitatively predict snow redistribution with good reproduction of spatial patterns, but has failed to quantitatively describe the snow redistribution, and correlations with measured snow heights were poor. The objective of our research was to a) identify the sources of poor correlations between predicted and measured snow redistribution and b) improve the parameter's ability to qualitatively and quantitatively describe snow redistribution in our research area, the Col du Lac Blanc in the French Alps. The area is at an elevation of 2700 m and particularly suited for our study due to its constant wind direction and the availability of data from a meteorological station. Our work focused on areas with terrain edges of approximately 10 m height, and we worked with 1-2 m resolution digital terrain and snow surface data. We first compared the results of the terrain-based parameter calculations to measured snow depths, obtained by high-accuracy terrestrial laser scan measurements. The results were similar to previous studies: the parameter was able to reproduce observed patterns in snow distribution, but regression analyses showed poor correlations between the terrain-based parameter and measured snow depths. We demonstrate how the correlations between measured and calculated snow heights improve if the parameter is calculated based on a snow surface model instead of a digital terrain model. We show how changing the parameter's search distance and how raster re-sampling and raster smoothing improve the results.
To improve the parameter
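
    The core of the Sx idea, the maximum upward-looking slope angle to the terrain within a search distance upwind of each cell, can be sketched on a raster DEM as follows. This is our simplification for a single fixed wind direction; published formulations average over a directional window, and all parameter values here are illustrative.

```python
import numpy as np

def sx(dem, cell, wind_dir_deg, dmax):
    """Terrain-based shelter/exposure parameter for one wind direction.

    For each cell, returns the maximum slope angle (degrees) to the
    terrain within dmax metres upwind. Positive values mean the cell
    is sheltered by upwind terrain; negative values mean it is exposed.
    """
    ny, nx = dem.shape
    dx = np.sin(np.radians(wind_dir_deg))   # unit step toward the wind
    dy = -np.cos(np.radians(wind_dir_deg))  # row 0 is taken as north
    jj, ii = np.meshgrid(np.arange(nx), np.arange(ny))
    out = np.full(dem.shape, -np.inf)
    for s in range(1, int(dmax / cell) + 1):
        i = np.rint(ii + s * dy).astype(int)
        j = np.rint(jj + s * dx).astype(int)
        valid = (i >= 0) & (i < ny) & (j >= 0) & (j < nx)
        rise = np.where(valid,
                        dem[i.clip(0, ny - 1), j.clip(0, nx - 1)],
                        -np.inf) - dem
        angle = np.degrees(np.arctan2(rise, s * cell))
        out = np.maximum(out, angle)
    return out
```

    Running the study's key variation, feeding in a snow surface model instead of the bare DEM, only changes the `dem` argument, which is what makes the comparison in the abstract straightforward to set up.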

  15. Numerical modeling of marine Gravity data for tsunami hazard zone mapping

    NASA Astrophysics Data System (ADS)

    Porwal, Nipun

    2012-07-01

    A tsunami is a series of ocean waves with very long wavelengths, ranging from 10 to 500 km. Tsunamis therefore behave as shallow-water waves and are hard to predict by most methods. Bottom Pressure Recorders of the Poseidon class are considered a preeminent means of detecting tsunami waves, but the acoustic modems in Ocean Bottom Pressure (OBP) sensors placed in the vicinity of trenches deeper than 6000 m fail to propel OBP data to surface buoys. This paper therefore develops a numerical model of gravity field coefficients from the Bureau Gravimetric International (BGI), which by themselves do not play a central role in the study of geodesy, satellite orbit computation, and geophysics; by mathematical transformation of the gravity field coefficients using normalized Legendre polynomials, high-resolution ocean bottom pressure (OBP) data are generated. Real-time sea-level-monitored OBP data at 0.3° by 1° spatial resolution, produced with a Kalman filter (kf080) over the past 10 years by Estimating the Circulation and Climate of the Ocean (ECCO), have been correlated with the OBP data from gravity field coefficients, supporting a feasibility study of future tsunami detection systems from space and of identifying the most suitable sites to place OBP sensors near deep trenches. The Levitus climatological temperature and salinity are assimilated into a version of the MITGCM using the adjoint method to obtain the sea height segment. Then TOPEX/Poseidon satellite altimetry, surface momentum, heat, and freshwater fluxes from the NCEP reanalysis product, and the dynamic ocean topography DOT_DNSCMSS08_EGM08 are used to interpret sea-bottom elevation. All datasets are then combined in the raster calculator of ArcGIS 9.3 using Boolean intersection algebra and proximity analysis tools together with a high-resolution sea floor topographic map. Afterward, the tsunami-prone areas and suitable sites for the setup of BPRs analyzed in this research are validated using a passive microwave radiometry system for Tsunami Hazard Zone

  16. GIS Services, Visualization Products, and Interoperability at the National Oceanic and Atmospheric Administration (NOAA) National Climatic Data Center (NCDC)

    NASA Astrophysics Data System (ADS)

    Baldwin, R.; Ansari, S.; Reid, G.; Lott, N.; Del Greco, S.

    2007-12-01

    The main goal in developing and deploying Geographic Information System (GIS) services at NOAA's National Climatic Data Center (NCDC) is to provide users with simple access to data archives while integrating new and informative climate products. Several systems at NCDC provide a variety of climatic data in GIS formats and/or map viewers. The Online GIS Map Services provide users with data discovery options which flow into detailed product selection maps, which may be queried using standard "region finder" tools or gazetteer (geographical dictionary search) functions. Each tabbed selection offers steps to help users progress through the systems. A series of additional base map layers or data types have been added to provide companion information. New map services include: Severe Weather Data Inventory, Local Climatological Data, Divisional Data, Global Summary of the Day, and Normals/Extremes products. THREDDS Data Server technology is utilized to provide access to gridded multidimensional datasets such as Model, Satellite and Radar. This access allows users to download data as a gridded NetCDF file, which is readable by ArcGIS. In addition, users may subset the data for a specific geographic region, time period, height range or variable prior to download. The NCDC Weather Radar Toolkit (WRT) is a client tool which accesses Weather Surveillance Radar 1988 Doppler (WSR-88D) data locally or remotely from the NCDC archive, NOAA FTP server or any URL or THREDDS Data Server. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. As more users become accustomed to GIS, questions of better, cheaper, faster access soon follow. Expanding use and availability can best be accomplished through
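The subsetting step described above can be sketched as a NetCDF Subset Service (NCSS) request against a THREDDS server; the endpoint below is hypothetical, and the parameter names follow the common NCSS convention (variable, bounding box, time window):

```python
from urllib.parse import urlencode

def ncss_subset_url(base, var, north, south, east, west, t0, t1):
    """Build a THREDDS NetCDF Subset Service query for a geographic/time
    subset. `base` is a hypothetical dataset endpoint."""
    params = {
        "var": var,
        "north": north, "south": south, "east": east, "west": west,
        "time_start": t0, "time_end": t1,
        "accept": "netcdf",  # ask the server to return a gridded NetCDF file
    }
    return base + "?" + urlencode(params)

url = ncss_subset_url(
    "https://example.gov/thredds/ncss/model/grid",  # hypothetical server
    "temperature", 40.0, 30.0, -80.0, -90.0,
    "2007-01-01T00:00:00Z", "2007-01-02T00:00:00Z")
print(url)
```

The returned file can then be opened directly in ArcGIS or any NetCDF-aware client.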

  17. Assessing landslide susceptibility by statistical data analysis and GIS: the case of Daunia (Apulian Apennines, Italy)

    NASA Astrophysics Data System (ADS)

    Ceppi, C.; Mancini, F.; Ritrovato, G.

    2009-04-01

    This study aims at landslide susceptibility mapping in an area of the Daunia (Apulian Apennines, Italy) by means of a multivariate statistical method and data manipulation in a Geographical Information System (GIS) environment. Among the variety of existing statistical data analysis techniques, logistic regression was chosen to produce a susceptibility map over an area where small settlements are historically threatened by landslide phenomena. In logistic regression, a best fit between the presence or absence of landslides (dependent variable) and the set of independent variables is obtained on the basis of a maximum likelihood criterion, leading to the estimation of regression coefficients. The reliability of the analysis therefore lies in its ability to quantify proneness to landslide occurrence through the probability level it produces. The inventory of dependent and independent variables was managed in a GIS, where geometric properties and attributes were translated into raster cells in order to carry out the logistic regression by means of the SPSS (Statistical Package for the Social Sciences) package. A landslide inventory was used to produce the binary dependent variable, whereas the independent variables comprised slope, aspect, elevation, curvature, drained area, lithology, and land use after their reduction to dummy variables. The effect of the independent parameters on landslide occurrence was assessed from the corresponding coefficients in the logistic regression function, highlighting the major role played by the land use variable in determining the occurrence and distribution of phenomena. Once the outcomes of the logistic regression were determined, the data were re-introduced into the GIS to produce a map reporting proneness to landslide as a predicted probability level. 
As validation of the results and the regression model, a cell-by-cell comparison between the susceptibility map and the initial inventory of landslide events was
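The fitting step above can be sketched with a minimal maximum-likelihood fit by gradient descent on synthetic dummy variables; the data and true coefficients below are hypothetical stand-ins for the rasterized slope and land-use predictors:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
steep = rng.integers(0, 2, n)      # dummy variable: steep-slope class
landuse = rng.integers(0, 2, n)    # dummy variable: susceptible land-use class
X = np.column_stack([np.ones(n), steep, landuse])

# Synthetic "landslide present" labels drawn from assumed true coefficients
p_true = 1.0 / (1.0 + np.exp(-(-2.0 + 2.0 * steep + 1.5 * landuse)))
y = (rng.random(n) < p_true).astype(float)

# Maximum-likelihood fit: gradient descent on the negative log-likelihood
w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p - y) / n

susceptibility = 1.0 / (1.0 + np.exp(-X @ w))  # per-cell probability map
print(w)
```

Re-mapping `susceptibility` onto the raster grid gives the probability surface that the study re-introduces into the GIS.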

  18. JADDS - towards a tailored global atmospheric composition data service for CAMS forecasts and reanalysis

    NASA Astrophysics Data System (ADS)

    Stein, Olaf; Schultz, Martin G.; Rambadt, Michael; Saini, Rajveer; Hoffmann, Lars; Mallmann, Daniel

    2017-04-01

    server and address the major issues identified when relocating large four-dimensional datasets into a RASDAMAN raster array database. So far, RASDAMAN's support for data in netCDF format is limited with respect to metadata related to variables and axes. For community-wide accepted solutions, selected data coverages shall result in downloadable netCDF files including metadata complying with the netCDF CF Metadata Conventions standard (http://cfconventions.org/). This can be achieved by adding custom metadata elements for RASDAMAN bands (model levels) on data ingestion. Furthermore, an optimization strategy for ingesting several TB of 4D model output data will be outlined.
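The CF-compliance goal above can be sketched as a per-band attribute set plus a minimal check; the attribute values here are illustrative assumptions, not JADDS's actual metadata, and the real CF conventions require far more:

```python
# CF-style attributes for a model-level band (illustrative values only)
level_attrs = {
    "standard_name": "model_level_number",  # a CF standard name for model levels
    "units": "1",
    "positive": "down",
    "axis": "Z",
}

REQUIRED = ("standard_name", "units")

def cf_ok(attrs):
    """Minimal check that a variable's attributes carry the CF essentials.
    The full CF conventions (http://cfconventions.org/) go far beyond this."""
    return all(k in attrs and attrs[k] for k in REQUIRED)

print(cf_ok(level_attrs))        # band metadata passes the minimal check
print(cf_ok({"units": "hPa"}))   # missing standard_name fails
```

Attaching such attributes at ingestion time is what lets the exported coverages round-trip as CF-compliant netCDF.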

  19. MExLab Planetary Geoportal: 3D-access to planetary images and results of spatial data analysis

    NASA Astrophysics Data System (ADS)

    Karachevtseva, I.; Garov, A.

    2015-10-01

    The MExLab Planetary Geoportal was developed as a Geodesy and Cartography Node that provides access to results of studies of celestial bodies, such as DEMs and orthoimages, as well as basemaps, crater catalogues, and derivative products: slope, roughness, crater density (http://cartsrv.mexlab.ru/geoportal). The main feature of the designed Geoportal is the ability to run spatial queries and access the contents by selecting from the list of available datasets (Phobos, Mercury, the Moon, including Lunokhod archive data). A prior version of the Geoportal was developed using Flash technology. We are now developing a new version that will use a 3D API (OpenGL, WebGL) based on shaders not only for standard 3D functionality, but for 2D mapping as well. Users can obtain quantitative and qualitative characteristics of the objects in graphical, tabular, and 3D form. It will bring the advantages of unified code and processing speed and provide a number of functional advantages based on GIS tools, such as: - the possibility of dynamic raster transformation to a needed map projection; - effective co-registration of planetary images by combining spatial data geometries; - presentation in 3D form of different types of data, including planetary atmospheric measurements, subsurface radar data, etc. The system will be created with a new software architecture that has potential for development and flexibility in reconfiguration, based on a cross-platform solution: - an application for three types of platforms: desktop (Windows, Linux, OSX), web (any HTML5 browser), and mobile (Android, iOS); - a single codebase shared between platforms (using cross-compilation for the Web); - a new telecommunication solution to connect modules and external systems like the PROVIDE WebGIS (http://www.provide-space.eu/progis/). The research leading to these results was partly supported by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement n

  20. Estimation of terracing characteristics from airborne laser scanning data

    NASA Astrophysics Data System (ADS)

    Kokalj, Žiga

    2015-04-01

    explore and manipulate the raw data, i.e. the lidar point cloud, where relevant features that could be removed in the filtering process can still be traced and their exact extents remain discernible. However, this is only possible for specific and very detailed analyses, while much more of the work has to be done with already processed raster elevation data. Processing has to be tailored specifically with terracing in mind, otherwise typical characteristics, such as riser slope gradient and tread edges, can be distorted. We also investigated the role different elevation model visualizations play in the manual interpretation of terraced landscapes, and which visualizations can benefit semiautomatic processing.

  1. Tillage practices in the conterminous United States, 1989-2004-Datasets Aggregated by Watershed

    USGS Publications Warehouse

    Baker, Nancy T.

    2011-01-01

    This report documents the methods used to aggregate county-level tillage practices to the 8-digit hydrologic unit (HU) watershed. The original county-level data were collected by the Conservation Technology Information Center (CTIC). The CTIC collects tillage data by conducting surveys about tillage systems for all counties in the United States. Tillage systems include three types of conservation tillage (no-till, ridge-till, and mulch-till), reduced tillage, and intensive tillage. Total planted acreage for each tillage practice for each crop grown is reported to the CTIC. The dataset includes total planted acreage by tillage type for selected crops (corn, cotton, grain sorghum, soybeans, fallow, forage, newly established permanent pasture, spring and fall seeded small grains, and 'other' crops) for 1989-2004. Two tabular datasets, based on the 1992 enhanced and 2001 National Land Cover Data (NLCD), are provided as part of this report and include the land-cover area-weighted interpolation and aggregation of acreage for each tillage practice in each 8-digit HU watershed in the conterminous United States for each crop. Watershed aggregations were done by overlaying the 8-digit HU polygons with a raster of county boundaries and a raster of either the enhanced 1992 or the 2001 NLCD for cultivated land to derive a county/land-cover area weighting factor. The weighting factor then was applied to the county-level tillage data for the counties within each 8-digit HU and summed to yield the total acreage of each tillage type within each 8-digit HU watershed.
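The aggregation above can be sketched on toy rasters (the county IDs, HU IDs, and acreage figures below are hypothetical); the weighting factor is each county's share of cultivated cells falling inside a given HU:

```python
import numpy as np

county = np.array([[1, 1, 2, 2],
                   [1, 1, 2, 2]])            # county-boundary raster
hu = np.array([[10, 10, 10, 20],
               [10, 20, 20, 20]])            # 8-digit HU raster
cultivated = np.ones_like(county, bool)      # NLCD cultivated-land mask
tillage = {1: 1000.0, 2: 500.0}              # county no-till acreage (hypothetical)

hu_acres = {}
for c in np.unique(county):
    in_county = cultivated & (county == c)
    total = in_county.sum()                  # all cultivated cells of county c
    for h in np.unique(hu):
        weight = (in_county & (hu == h)).sum() / total
        hu_acres[h] = hu_acres.get(h, 0.0) + tillage[c] * weight

print(hu_acres)  # county acreage redistributed to HUs; totals are conserved
```

Because the weights for each county sum to one, the HU totals conserve the original county acreage.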

  2. Towards more accurate isoscapes: encouraging results from wine, water and marijuana data/model and model/model comparisons.

    NASA Astrophysics Data System (ADS)

    West, J. B.; Ehleringer, J. R.; Cerling, T.

    2006-12-01

    Understanding how the biosphere responds to change is at the heart of biogeochemistry, ecology, and other Earth sciences. The dramatic increase in human population and technological capacity over the past 200 years or so has resulted in numerous, simultaneous changes to biosphere structure and function. This, in turn, has led to increased urgency in the scientific community to understand how systems have already responded to these changes, and how they might do so in the future. Since all biospheric processes exhibit some patchiness or patterning over space as well as time, we believe that understanding the dynamic interactions between natural systems and human technological manipulations can be improved if these systems are studied in an explicitly spatial context. We present here results of some of our efforts to model the spatial variation in the stable isotope ratios (δ2H and δ18O) of plants over large spatial extents, and how these spatial model predictions compare to spatially explicit data. Stable isotopes trace and record ecological processes and, as such, if modeled correctly over Earth's surface, allow us insights into changes in biosphere states and processes across spatial scales. The data-model comparisons show good agreement, in spite of the remaining uncertainties (e.g., plant source water isotopic composition). For example, inter-annual changes in climate are recorded in wine stable isotope ratios. Also, a much simpler model of leaf water enrichment driven with spatially continuous global rasters of precipitation and climate normals largely agrees with complex GCM modeling that includes leaf water δ18O. Our results suggest that modeling plant stable isotope ratios across large spatial extents may be done with reasonable accuracy, including over time. These spatial maps, or isoscapes, can now be utilized to help understand spatially distributed data, as well as to help guide future studies designed to understand ecological change across

  3. Coordinate metrology using scanning probe microscopes

    NASA Astrophysics Data System (ADS)

    Marinello, F.; Savio, E.; Bariani, P.; Carmignato, S.

    2009-08-01

    New positioning, probing and measuring strategies in coordinate metrology are needed for the accomplishment of true three-dimensional characterization of microstructures, with uncertainties in the nanometre range. In the present work, the implementation of scanning probe microscopes (SPMs) as systems for coordinate metrology is discussed. A new non-raster measurement approach is proposed, in which the probe is moved to sense points along free paths on the sample surface, with no loss of accuracy with respect to traditional raster scanning and with a reduction in scan time. Furthermore, new probes featuring long tips with innovative geometries suitable for coordinate metrology through SPMs are examined and reported.

  4. (Full field) optical coherence tomography and applications

    NASA Astrophysics Data System (ADS)

    Buchroithner, Boris; Hannesschläger, Günther; Leiss-Holzinger, Elisabeth; Prylepa, Andrii; Heise, Bettina

    2018-03-01

    This paper illustrates specific features and use of optical coherence tomography (OCT) in the raster-scanning and in comparison in the full field version of this imaging technique. Cases for nondestructive testing are discussed alongside other application schemes. In particular monitoring time-dependent processes and probing of birefringent specimens are considered here. In the context of polymer testing birefringence mapping may often provide information about internal strain and stress states. Recent results obtained with conventional raster-scanning OCT systems, with (dual and single-shot) full field OCT configurations, and with polarization-sensitive versions of (full field) OCT are presented here.

  5. Use of collateral information to improve LANDSAT classification accuracies

    NASA Technical Reports Server (NTRS)

    Strahler, A. H. (Principal Investigator)

    1981-01-01

    Methods to improve LANDSAT classification accuracies were investigated including: (1) the use of prior probabilities in maximum likelihood classification as a methodology to integrate discrete collateral data with continuously measured image density variables; (2) the use of the logit classifier as an alternative to multivariate normal classification that permits mixing both continuous and categorical variables in a single model and fits empirical distributions of observations more closely than the multivariate normal density function; and (3) the use of collateral data in a geographic information system as exercised to model a desired output information layer as a function of input layers of raster format collateral and image data base layers.
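Method (1) amounts to adding a log-prior term to each class's Gaussian log-likelihood before taking the argmax; a one-band sketch with hypothetical class statistics:

```python
import numpy as np

def classify(x, means, variances, priors):
    """Maximum a posteriori class for pixel value x: the prior probability
    shifts the usual maximum-likelihood Gaussian decision boundary."""
    scores = [np.log(p) - 0.5 * (np.log(v) + (x - m) ** 2 / v)
              for m, v, p in zip(means, variances, priors)]
    return int(np.argmax(scores))

# Two classes with equal likelihood at x = 0.5: the prior breaks the tie
print(classify(0.5, [0.0, 1.0], [1.0, 1.0], [0.9, 0.1]))  # favors class 0
print(classify(0.5, [0.0, 1.0], [1.0, 1.0], [0.1, 0.9]))  # favors class 1
```

With equal priors the rule reduces to ordinary maximum-likelihood classification, which is how collateral data (e.g. a terrain stratum) can tilt ambiguous pixels toward the locally more probable class.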

  6. Anisotropic modeling and joint-MAP stitching for improved ultrasound model-based iterative reconstruction of large and thick specimens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almansouri, Hani; Venkatakrishnan, Singanallur V.; Clayton, Dwight A.

    One-sided non-destructive evaluation (NDE) is widely used to inspect materials, such as concrete structures in nuclear power plants (NPP). A widely used method for one-sided NDE is the synthetic aperture focusing technique (SAFT). The SAFT algorithm produces reasonable results when inspecting simple structures. However, for complex structures, such as heavily reinforced thick concrete structures, SAFT results in artifacts and hence there is a need for a more sophisticated inversion technique. Model-based iterative reconstruction (MBIR) algorithms, which are typically equivalent to regularized inversion techniques, offer a powerful framework to incorporate complex models for the physics, detector miscalibrations and the materials being imaged to obtain high quality reconstructions. Previously, we have proposed an ultrasonic MBIR method that significantly improves reconstruction quality compared to SAFT. However, the method made some simplifying assumptions on the propagation model and did not discuss ways to handle data that is obtained by raster scanning a system over a surface to inspect large regions. In this paper, we propose a novel MBIR algorithm that incorporates an anisotropic forward model and allows for the joint processing of data obtained from a system that raster scans a large surface. We demonstrate that the new MBIR method can produce dramatic improvements in reconstruction quality compared to SAFT and suppresses artifacts compared to the previously presented MBIR approach.
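The regularized-inversion core of MBIR can be sketched in its simplest (Tikhonov, isotropic) form; the forward matrix, true profile, and noise level below are hypothetical stand-ins for the ultrasonic forward model:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 20))                 # hypothetical forward model
x_true = rng.standard_normal(20)                  # hypothetical material profile
y = A @ x_true + 0.01 * rng.standard_normal(40)   # noisy measurements

lam = 0.1  # regularization weight (the strength of the prior model)
# Minimize ||Ax - y||^2 + lam * ||x||^2  =>  (A^T A + lam I) x = A^T y
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(20), A.T @ y)

print(np.linalg.norm(x_hat - x_true))  # small reconstruction error
```

Actual MBIR replaces the quadratic penalty with an edge-preserving (and here, anisotropic) prior and solves the resulting problem iteratively rather than in closed form.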

  7. An improved method for precise automatic co-registration of moderate and high-resolution spacecraft imagery

    NASA Technical Reports Server (NTRS)

    Bryant, Nevin A.; Logan, Thomas L.; Zobrist, Albert L.

    2006-01-01

    Improvements to the automated co-registration and change-detection software package AFIDS (Automatic Fusion of Image Data System) have recently completed development for, and validation by, NGA/GIAT. The improvements involve the integration of the AFIDS ultra-fine gridding technique for horizontal displacement compensation with the recently evolved use of Rational Polynomial Functions/Coefficients (RPFs/RPCs) for indexing image raster pixel position to latitude/longitude. Mapping and orthorectification (correction for elevation effects) of satellite imagery defies exact projective solutions because the data are not obtained from a single point (like a camera), but as a continuous process along the orbital path. Standard image processing techniques can apply approximate solutions, but advances in the state of the art had to be made for precision change-detection and time-series applications where relief offsets become a controlling factor. The earlier AFIDS procedure required the availability of a camera model and knowledge of the satellite platform ephemerides. The recent design advances connect the spacecraft sensor Rational Polynomial Function, a deductively developed model, with the AFIDS ultra-fine grid, an inductively developed representation of the relationship of raster pixel position to latitude/longitude. As a result, RPCs can be updated by AFIDS, a situation often necessary due to the accuracy limits of spacecraft navigation systems. An example of precision change detection from Quickbird will be presented.
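An RPC maps normalized ground coordinates to an image coordinate as a ratio of two polynomials; a minimal sketch with hypothetical first-order coefficients (real RPCs use 20-term cubic polynomials in a fixed term order, with separate ratios for row and column):

```python
def rpc_image_coord(num, den, lat, lon, h):
    """Evaluate one image coordinate as P_num/P_den over normalized
    (lat, lon, h). Terms are truncated to first order for illustration."""
    terms = [1.0, lon, lat, h]  # real RPCs continue up to cubic cross-terms
    p = sum(c * t for c, t in zip(num, terms))
    q = sum(c * t for c, t in zip(den, terms))
    return p / q

# Hypothetical coefficients: the row coordinate tracks latitude directly
row = rpc_image_coord([0.0, 0.0, 1.0, 0.0], [1.0, 0.0, 0.0, 0.0],
                      lat=0.25, lon=-0.5, h=0.1)
print(row)  # 0.25
```

Updating an RPC, as AFIDS does, amounts to re-estimating these coefficients so the rational mapping agrees with the inductively derived ultra-fine grid.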

  8. Anisotropic modeling and joint-MAP stitching for improved ultrasound model-based iterative reconstruction of large and thick specimens

    NASA Astrophysics Data System (ADS)

    Almansouri, Hani; Venkatakrishnan, Singanallur; Clayton, Dwight; Polsky, Yarom; Bouman, Charles; Santos-Villalobos, Hector

    2018-04-01

    One-sided non-destructive evaluation (NDE) is widely used to inspect materials, such as concrete structures in nuclear power plants (NPP). A widely used method for one-sided NDE is the synthetic aperture focusing technique (SAFT). The SAFT algorithm produces reasonable results when inspecting simple structures. However, for complex structures, such as heavily reinforced thick concrete structures, SAFT results in artifacts and hence there is a need for a more sophisticated inversion technique. Model-based iterative reconstruction (MBIR) algorithms, which are typically equivalent to regularized inversion techniques, offer a powerful framework to incorporate complex models for the physics, detector miscalibrations and the materials being imaged to obtain high quality reconstructions. Previously, we have proposed an ultrasonic MBIR method that significantly improves reconstruction quality compared to SAFT. However, the method made some simplifying assumptions on the propagation model and did not discuss ways to handle data that is obtained by raster scanning a system over a surface to inspect large regions. In this paper, we propose a novel MBIR algorithm that incorporates an anisotropic forward model and allows for the joint processing of data obtained from a system that raster scans a large surface. We demonstrate that the new MBIR method can produce dramatic improvements in reconstruction quality compared to SAFT and suppresses artifacts compared to the previously presented MBIR approach.

  9. EAARL Topography - George Washington Birthplace National Monument 2008

    USGS Publications Warehouse

    Brock, John C.; Nayegandhi, Amar; Wright, C. Wayne; Stevens, Sara; Yates, Xan

    2009-01-01

    These remotely sensed, geographically referenced elevation measurements of Lidar-derived bare earth (BE) and first surface (FS) topography were produced as a collaborative effort between the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL; the National Park Service (NPS), Northeast Coastal and Barrier Network, Kingston, RI; and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the George Washington Birthplace National Monument in Virginia, acquired on March 26, 2008. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative airborne Lidar instrument originally developed at the NASA Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL) was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) Lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive Lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multi-spectral color infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which provide for submeter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a Lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. Elevation measurements were collected over the survey area using the EAARL

  10. EAARL Coastal Topography - Northeast Barrier Islands 2007: Bare Earth

    USGS Publications Warehouse

    Nayegandhi, Amar; Brock, John C.; Sallenger, A.H.; Wright, C. Wayne; Yates, Xan; Bonisteel, Jamie M.

    2008-01-01

    These remotely sensed, geographically referenced elevation measurements of Lidar-derived bare earth (BE) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the northeast coastal barrier islands in New York and New Jersey, acquired April 29-30 and May 15-16, 2007. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative airborne Lidar instrument originally developed at the NASA Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL), was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) Lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive Lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multi-spectral color infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers and an integrated miniature digital inertial measurement unit, which provide for submeter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a Lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. 
Elevation measurements were collected over the survey area using the EAARL system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom

  11. EAARL Coastal Topography-Pearl River Delta 2008: Bare Earth

    USGS Publications Warehouse

    Nayegandhi, Amar; Brock, John C.; Wright, C. Wayne; Miner, Michael D.; Yates, Xan; Bonisteel, Jamie M.

    2009-01-01

    These remotely sensed, geographically referenced elevation measurements of Lidar-derived bare earth (BE) topography were produced as a collaborative effort between the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL; the University of New Orleans (UNO), Pontchartrain Institute for Environmental Sciences (PIES), New Orleans, LA; and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of a portion of the Pearl River Delta in Louisiana and Mississippi, acquired March 9-11, 2008. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative airborne Lidar instrument originally developed at the NASA Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL), was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) Lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive Lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multi-spectral color infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which provide for submeter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a Lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. Elevation measurements were collected over the survey area using the

  12. EAARL Coastal Topography-Pearl River Delta 2008: First Surface

    USGS Publications Warehouse

    Nayegandhi, Amar; Brock, John C.; Wright, C. Wayne; Miner, Michael D.; Michael, D.; Yates, Xan; Bonisteel, Jamie M.

    2009-01-01

    These remotely sensed, geographically referenced elevation measurements of Lidar-derived first surface (FS) topography were produced as a collaborative effort between the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL; the University of New Orleans (UNO), Pontchartrain Institute for Environmental Sciences (PIES), New Orleans, LA; and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of a portion of the Pearl River Delta in Louisiana and Mississippi, acquired March 9-11, 2008. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative airborne Lidar instrument originally developed at the NASA Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL), was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) Lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive Lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multi-spectral color infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which provide for submeter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a Lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. 
Elevation measurements were collected over the survey area using the

  13. EAARL Topography - Natchez Trace Parkway 2007: First Surface

    USGS Publications Warehouse

    Nayegandhi, Amar; Brock, John C.; Wright, C. Wayne; Segura, Martha; Yates, Xan

    2008-01-01

    These remotely sensed, geographically referenced elevation measurements of Lidar-derived first surface (FS) topography were produced as a collaborative effort between the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL; the National Park Service (NPS), Gulf Coast Network, Lafayette, LA; and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of a portion of the Natchez Trace Parkway in Mississippi, acquired on September 14, 2007. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative airborne Lidar instrument originally developed at the NASA Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL), was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) Lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive Lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multi-spectral color infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which provide for submeter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a Lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. Elevation measurements were collected over the survey area using the EAARL system, and the resulting data were then

  14. EAARL Topography - Jean Lafitte National Historical Park and Preserve 2006

    USGS Publications Warehouse

    Nayegandhi, Amar; Brock, John C.; Wright, C. Wayne; Segura, Martha; Yates, Xan

    2008-01-01

    These remotely sensed, geographically referenced elevation measurements of Lidar-derived first surface (FS) and bare earth (BE) topography were produced as a collaborative effort between the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL; the National Park Service (NPS), Gulf Coast Network, Lafayette, LA; and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the Jean Lafitte National Historical Park and Preserve in Louisiana, acquired on September 22, 2006. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative airborne Lidar instrument originally developed at the NASA Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL), was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) Lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive Lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multi-spectral color infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which provide for submeter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a Lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. Elevation measurements were collected over the survey area using the EAARL system

  15. EAARL Topography - Vicksburg National Military Park 2008: Bare Earth

    USGS Publications Warehouse

    Nayegandhi, Amar; Brock, John C.; Wright, C. Wayne; Segura, Martha; Yates, Xan

    2008-01-01

    These remotely sensed, geographically referenced elevation measurements of Lidar-derived bare earth (BE) topography were produced as a collaborative effort between the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL; the National Park Service (NPS), Gulf Coast Network, Lafayette, LA; and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the Vicksburg National Military Park in Mississippi, acquired on March 6, 2008. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative airborne Lidar instrument originally developed at the NASA Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL), was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) Lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive Lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multi-spectral color infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which provide for submeter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a Lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. Elevation measurements were collected over the survey area using the EAARL system, and the resulting data were then processed

  16. EAARL Coastal Topography - Northeast Barrier Islands 2007: First Surface

    USGS Publications Warehouse

    Nayegandhi, Amar; Brock, John C.; Sallenger, A.H.; Wright, C. Wayne; Yates, Xan; Bonisteel, Jamie M.

    2009-01-01

    These remotely sensed, geographically referenced elevation measurements of Lidar-derived first surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the northeast coastal barrier islands in New York and New Jersey, acquired April 29-30 and May 15-16, 2007. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative airborne Lidar instrument originally developed at the NASA Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL), was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) Lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive Lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multi-spectral color infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which provide for submeter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a Lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. Elevation measurements were collected over the survey area using the EAARL system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a

  17. EAARL Coastal Topography - Northern Gulf of Mexico, 2007: Bare Earth

    USGS Publications Warehouse

    Smith, Kathryn E.L.; Nayegandhi, Amar; Wright, C. Wayne; Bonisteel, Jamie M.; Brock, John C.

    2009-01-01

    These remotely sensed, geographically referenced elevation measurements of Lidar-derived bare earth (BE) topography were produced as a collaborative effort between the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL; the National Park Service (NPS), Gulf Coast Network, Lafayette, LA; and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. The purpose of this project is to provide highly detailed and accurate datasets of select barrier islands and peninsular regions of Louisiana, Mississippi, Alabama, and Florida, acquired on June 27-30, 2007. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative airborne Lidar instrument originally developed at the NASA Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL), was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) Lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive Lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multi-spectral color infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which provide for submeter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a Lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. Elevation measurements were collected over the survey area using

  18. EAARL Topography - Vicksburg National Military Park 2007: First Surface

    USGS Publications Warehouse

    Nayegandhi, Amar; Brock, John C.; Wright, C. Wayne; Segura, Martha; Yates, Xan

    2009-01-01

    These remotely sensed, geographically referenced elevation measurements of Lidar-derived first-surface (FS) topography were produced as a collaborative effort between the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL; the National Park Service (NPS), Gulf Coast Network, Lafayette, LA; and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the Vicksburg National Military Park in Mississippi, acquired on September 12, 2007. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative airborne Lidar instrument originally developed at the NASA Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL), was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) Lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive Lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multi-spectral color infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which provide for submeter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a Lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. Elevation measurements were collected over the survey area using the EAARL system, and the resulting data were then

  19. EAARL Submerged Topography - U.S. Virgin Islands 2003

    USGS Publications Warehouse

    Nayegandhi, Amar; Brock, John C.; Wright, C. Wayne; Stevens, Sara; Yates, Xan; Bonisteel, Jamie M.

    2008-01-01

    These remotely sensed, geographically referenced elevation measurements of Lidar-derived submerged topography were produced as a collaborative effort between the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL; the National Park Service (NPS), South Florida-Caribbean Network, Miami, FL; and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate bathymetric datasets of a portion of the U.S. Virgin Islands, acquired on April 21, 23, and 30, May 2, and June 14 and 17, 2003. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative airborne Lidar instrument originally developed at the NASA Wallops Flight Facility, and known as the Experimental Advanced Airborne Research Lidar (EAARL), was used during data acquisition. The EAARL system is a raster-scanning, waveform-resolving, green-wavelength (532-nanometer) Lidar designed to map near-shore bathymetry, topography, and vegetation structure simultaneously. The EAARL sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive Lidar, a down-looking red-green-blue (RGB) digital camera, a high-resolution multi-spectral color infrared (CIR) camera, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which provide for submeter georeferencing of each laser sample. The nominal EAARL platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a Lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. Elevation measurements were collected over the survey area using the EAARL system, and

  20. EAARL-B submerged topography: Barnegat Bay, New Jersey, post-Hurricane Sandy, 2012-2013

    USGS Publications Warehouse

    Wright, C. Wayne; Troche, Rodolfo J.; Kranenburg, Christine J.; Klipp, Emily S.; Fredericks, Xan; Nagle, David B.

    2014-01-01

    These remotely sensed, geographically referenced elevation measurements of lidar-derived submerged topography datasets were produced by the U.S. Geological Survey (USGS), St. Petersburg Coastal and Marine Science Center, St. Petersburg, Florida. This project provides highly detailed and accurate datasets for part of Barnegat Bay, New Jersey, acquired post-Hurricane Sandy on November 1, 5, 16, 20, and 30, 2012; December 5, 6, and 21, 2012; and January 10, 2013. The datasets are made available for use as a management tool to research scientists and natural-resource managers. An innovative airborne lidar system, known as the second-generation Experimental Advanced Airborne Research Lidar (EAARL-B), was used during data acquisition. The EAARL-B system is a raster-scanning, waveform-resolving, green-wavelength (532-nm) lidar designed to map nearshore bathymetry, topography, and vegetation structure simultaneously. The EAARL-B sensor suite includes the raster-scanning, water-penetrating full-waveform adaptive lidar, down-looking red-green-blue (RGB) and infrared (IR) digital cameras, two precision dual-frequency kinematic carrier-phase GPS receivers, and an integrated miniature digital inertial measurement unit, which provide for sub-meter georeferencing of each laser sample. The nominal EAARL-B platform is a twin-engine Cessna 310 aircraft, but the instrument may be deployed on a range of light aircraft. A single pilot, a lidar operator, and a data analyst constitute the crew for most survey operations. This sensor has the potential to make significant contributions in measuring sub-aerial and submarine coastal topography within cross-environmental surveys. Elevation measurements were collected over the survey area using the EAARL-B system. The resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed originally in a NASA-USGS collaboration. The exploration and processing of lidar data in an