Sample records for resolution version tiff

  1. Leveraging GeoTIFF Compatibility for Visualizing a New EASE-Grid 2.0 Global Satellite Passive Microwave Climate Record

    NASA Astrophysics Data System (ADS)

    Paget, A. C.; Brodzik, M. J.; Long, D. G.; Hardman, M.

    2016-02-01

    The historical record of satellite-derived passive microwave brightness temperatures comprises data from multiple imaging radiometers (SMMR, SSM/I-SSMIS, AMSR-E), spanning nearly 40 years of Earth observations from 1978 to the present. Passive microwave data are used to monitor time series of many climatological variables, including ocean wind speeds, cloud liquid water, sea ice concentrations, and ice velocity. Gridded versions of passive microwave data have been produced using various map projections (polar stereographic, Lambert azimuthal equal-area, cylindrical equal-area, quarter-degree Plate Carrée) and data formats (flat binary, HDF). However, none of the currently available versions can be rendered in the common visualization standard, GeoTIFF, without cartographic reprojection. Furthermore, the reprojection details are complicated and often require expert knowledge of obscure software package options. We are producing a consistently calibrated, completely reprocessed data set of this valuable multi-sensor satellite record, using EASE-Grid 2.0, an improved equal-area projection definition that will require no reprojection for translation into GeoTIFF. Our approach has been twofold: 1) define the projection ellipsoid to match the reference datum of the satellite data, and 2) include the file-level metadata required for standard projection software to correctly render the data in the GeoTIFF standard. The Calibrated, Enhanced Resolution Brightness Temperature (CETB) Earth System Data Record (ESDR) leverages image reconstruction techniques to enhance gridded spatial resolution to 3 km and uses newly available intersensor calibrations to improve the quality of derived geophysical products. We expect that our attention to easy GeoTIFF compatibility will foster higher-quality analysis with the CETB product by enabling easy and correct intercomparison with other gridded and in situ data.
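
    The EASE-Grid 2.0 global grid mentioned above is a cylindrical equal-area projection, so each grid cell maps to projected coordinates by closed-form formulas, which is what makes GeoTIFF rendering possible without a reprojection step. The sketch below is a spherical approximation only; the real EASE-Grid 2.0 definition uses the WGS84 ellipsoid, and the radius value and function names here are simplifying assumptions, not the authoritative definition.

```python
import math

# Spherical approximation of a global cylindrical equal-area projection
# like EASE-Grid 2.0. R and LAT_TS are illustrative; the real grid is
# defined on the WGS84 ellipsoid with authalic latitude.
R = 6371228.0        # sphere radius in meters (assumed value)
LAT_TS = 30.0        # standard parallel of the global EASE-Grid 2.0

def ease2_forward(lat_deg, lon_deg):
    """Map latitude/longitude (degrees) to projected x, y in meters."""
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    cos_ts = math.cos(math.radians(LAT_TS))
    x = R * lam * cos_ts
    y = R * math.sin(phi) / cos_ts
    return x, y

def ease2_inverse(x, y):
    """Invert the spherical cylindrical equal-area mapping."""
    cos_ts = math.cos(math.radians(LAT_TS))
    lon = math.degrees(x / (R * cos_ts))
    lat = math.degrees(math.asin(y * cos_ts / R))
    return lat, lon
```

    A GeoTIFF writer then needs only these projected coordinates plus file-level CRS metadata, which corresponds to the second half of the authors' twofold approach.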

  2. User's guide for mapIMG 3--Map image re-projection software package

    USGS Publications Warehouse

    Finn, Michael P.; Mattli, David M.

    2012-01-01

    Version 0.0 (1995), Dan Steinwand, U.S. Geological Survey (USGS)/Earth Resources Observation Systems (EROS) Data Center (EDC)--Version 0.0 was a command-line version for UNIX that required four arguments: the input metadata, the output metadata, the input data file, and the output destination path. Version 1.0 (2003), Stephen Posch and Michael P. Finn, USGS/Mid-Continent Mapping Center (MCMC)--Version 1.0 added a GUI that was built using the Qt library for cross-platform development. Version 1.01 (2004), Jason Trent and Michael P. Finn, USGS/MCMC--Version 1.01 suggested bounds for the parameters of each projection. Support was added for larger input files, storage of the last-used input and output folders, and TIFF/GeoTIFF input images. Version 2.0 (2005), Robert Buehler, Jason Trent, and Michael P. Finn, USGS/National Geospatial Technical Operations Center (NGTOC)--Version 2.0 added resampling methods (Mean, Mode, Min, Max, and Sum), updated the GUI design, and added the viewer/pre-viewer. The metadata style was changed to XML, and a new naming convention was adopted. Version 3.0 (2009), David Mattli and Michael P. Finn, USGS/Center of Excellence for Geospatial Information Science (CEGIS)--Version 3.0 brings optimized resampling methods, an updated GUI, support for less-than-global datasets, UTM support, and a codebase ported to Qt4.
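
    Of the resampling methods added in Version 2.0, Mode is the natural choice for categorical rasters such as land-cover grids, where averaging would invent meaningless classes. The following is a minimal, hypothetical sketch of mode-based block downsampling; it is not mapIMG code, and the function name and simple factor-based blocking are illustrative assumptions.

```python
from collections import Counter

def mode_resample(grid, factor):
    """Downsample a 2-D list of categorical cell values by taking the
    most common value (the 'Mode' method) in each factor x factor block.
    Trailing rows/columns that do not fill a block are dropped."""
    rows, cols = len(grid), len(grid[0])
    out = []
    for r in range(0, rows - rows % factor, factor):
        out_row = []
        for c in range(0, cols - cols % factor, factor):
            block = [grid[r + i][c + j]
                     for i in range(factor) for j in range(factor)]
            out_row.append(Counter(block).most_common(1)[0][0])
        out.append(out_row)
    return out
```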

  3. GLCF: Welcome

    Science.gov Websites

    Website of the Global Land Cover Facility (GLCF), which uses remotely sensed satellite data and products to assess land cover change at local to global scales. Resources include IGOL reports, Landsat GeoCover, SRTM DEM GeoTIFFs, Rapid Response imagery, and Tree Canopy Cover Version 4.

  4. 10 CFR 2.1011 - Management of electronic information.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... participants shall make textual (or, where non-text, image) versions of their documents available on a web... of the following acceptable formats: ASCII, native word processing (Word, WordPerfect), PDF Normal, or HTML. (iv) Image files must be formatted as TIFF CCITT G4 for bi-tonal images or PNG (Portable...

  5. 10 CFR 2.1011 - Management of electronic information.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... participants shall make textual (or, where non-text, image) versions of their documents available on a web... of the following acceptable formats: ASCII, native word processing (Word, WordPerfect), PDF Normal, or HTML. (iv) Image files must be formatted as TIFF CCITT G4 for bi-tonal images or PNG (Portable...

  6. 10 CFR 2.1011 - Management of electronic information.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... production and service: (i) The participants shall make textual (or, where non-text, image) versions of their... set and be in one of the following acceptable formats: ASCII, native word processing (Word, WordPerfect), PDF Normal, or HTML. (iv) Image files must be formatted as TIFF CCITT G4 for bi-tonal images or...

  7. Using a geographic information system and scanning technology to create high-resolution land-use data sets

    USGS Publications Warehouse

    Harvey, Craig A.; Kolpin, Dana W.; Battaglin, William A.

    1996-01-01

    A geographic information system (GIS) procedure was developed to compile low-altitude aerial photography, digitized data, and land-use data from U.S. Department of Agriculture Consolidated Farm Service Agency (CFSA) offices into a high-resolution (approximately 5 meters) land-use GIS data set. The aerial photography consisted of 35-mm slides which were scanned into Tagged Image File Format (TIFF) images. These TIFF images were then imported into the GIS, where they were registered into a geographically referenced coordinate system. Boundaries between land uses were delineated from these GIS data sets using on-screen digitizing techniques. Crop types were determined using information obtained from the U.S. Department of Agriculture CFSA offices. Crop information not supplied by the CFSA was attributed by manual classification procedures. Automated methods to provide delineation of the field boundaries and land-use classification were investigated. It was determined that, using these data sources, automated methods were less efficient and accurate than manual methods of delineating field boundaries and classifying land use.
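
    Registering a scanned TIFF into a geographically referenced coordinate system amounts to fitting a transform from pixel coordinates to map coordinates; the simplest case is the six-parameter affine used by ESRI world files. The study's GIS handles this internally, so the sketch below is only an illustration of that convention (parameter order follows the world-file line order, not anything from the study):

```python
def pixel_to_map(col, row, world):
    """Apply a six-parameter affine transform mapping pixel (col, row)
    to map coordinates. `world` follows the ESRI world-file line order
    (A, D, B, E, C, F), where:
        x = A*col + B*row + C
        y = D*col + E*row + F
    """
    A, D, B, E, C, F = world
    x = A * col + B * row + C
    y = D * col + E * row + F
    return x, y
```

    With no rotation, A and E are simply the pixel size (E negative because row numbers increase downward while northing increases upward).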

  8. A seamless, high-resolution digital elevation model (DEM) of the north-central California coast

    USGS Publications Warehouse

    Foxgrover, Amy C.; Barnard, Patrick L.

    2012-01-01

    A seamless, 2-meter resolution digital elevation model (DEM) of the north-central California coast has been created from the most recent high-resolution bathymetric and topographic datasets available. The DEM extends approximately 150 kilometers along the California coastline, from Half Moon Bay north to Bodega Head. Coverage extends inland to an elevation of +20 meters and offshore to at least the 3 nautical mile limit of state waters. This report describes the procedures of DEM construction, details the input data sources, and provides the DEM for download in both ESRI Arc ASCII and GeoTIFF file formats with accompanying metadata.
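
    The ESRI Arc ASCII format in which this DEM is distributed stores its georeferencing in six plain-text header lines, which makes it easy to inspect without GIS software. A minimal reader for just that header, under the assumption that the six fields appear first, one per line (real files can vary in key capitalization; integer fields are returned as floats here for simplicity):

```python
def parse_arc_ascii_header(lines):
    """Parse the six header fields of an ESRI Arc ASCII grid:
    ncols, nrows, xllcorner, yllcorner, cellsize, NODATA_value.
    Keys are lower-cased; all values returned as floats."""
    header = {}
    for line in lines[:6]:
        key, value = line.split()
        header[key.lower()] = float(value)
    return header
```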

  9. Cross-calibration of Fuji TR image plate and RAR 2492 x-ray film to determine the response of a DITABIS Super Micron image plate scanner

    NASA Astrophysics Data System (ADS)

    Dunham, G.; Harding, E. C.; Loisel, G. P.; Lake, P. W.; Nielsen-Weber, L. B.

    2016-11-01

    Fuji TR image plate is frequently used as a replacement detector medium for x-ray imaging and spectroscopy diagnostics at NIF, Omega, and Z facilities. However, the familiar Fuji BAS line of image plate scanners is no longer supported by the industry, and so a replacement scanning system is needed. While the General Electric Typhoon line of scanners could replace the Fuji systems, the shift away from photo stimulated luminescence units to 16-bit grayscale Tag Image File Format (TIFF) leaves a discontinuity when comparing data collected from both systems. For the purposes of quantitative spectroscopy, a known unit of intensity applied to the grayscale values of the TIFF is needed. The DITABIS Super Micron image plate scanning system was tested and shown to potentially rival the resolution and dynamic range of Kodak RAR 2492 x-ray film. However, the absolute sensitivity of the scanner is unknown. In this work, a methodology to cross calibrate Fuji TR image plate and the absolutely calibrated Kodak RAR 2492 x-ray film is presented. Details of the experimental configurations used are included. An energy dependent scale factor to convert Fuji TR IP scanned on a DITABIS Super Micron scanner from 16-bit grayscale TIFF to intensity units (i.e., photons per square micron) is discussed.
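
    The conversion described in the last sentence can be pictured as a lookup of an energy-dependent scale factor followed by a linear scaling of the grayscale counts. The sketch below assumes a purely linear response and uses invented calibration numbers; the actual cross-calibration in the paper may involve offsets, image-plate fade corrections, and different tabulated values.

```python
def scale_at_energy(energy_kev, table):
    """Linearly interpolate an energy-dependent scale factor from a
    calibration table of (energy_keV, factor) pairs (values here are
    hypothetical, not from the paper). Clamps outside the table range."""
    pts = sorted(table)
    if energy_kev <= pts[0][0]:
        return pts[0][1]
    if energy_kev >= pts[-1][0]:
        return pts[-1][1]
    for (e0, f0), (e1, f1) in zip(pts, pts[1:]):
        if e0 <= energy_kev <= e1:
            t = (energy_kev - e0) / (e1 - e0)
            return f0 + t * (f1 - f0)

def counts_to_photons(gray, energy_kev, table):
    """photons per square micron ~= scale(E) * 16-bit grayscale counts,
    assuming a linear scanner response."""
    return gray * scale_at_energy(energy_kev, table)
```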

  10. Cross-calibration of Fuji TR image plate and RAR 2492 x-ray film to determine the response of a DITABIS Super Micron image plate scanner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunham, G., E-mail: gsdunha@sandia.gov; Harding, E. C.; Loisel, G. P.

    Fuji TR image plate is frequently used as a replacement detector medium for x-ray imaging and spectroscopy diagnostics at NIF, Omega, and Z facilities. However, the familiar Fuji BAS line of image plate scanners is no longer supported by the industry, and so a replacement scanning system is needed. While the General Electric Typhoon line of scanners could replace the Fuji systems, the shift away from photo stimulated luminescence units to 16-bit grayscale Tag Image File Format (TIFF) leaves a discontinuity when comparing data collected from both systems. For the purposes of quantitative spectroscopy, a known unit of intensity applied to the grayscale values of the TIFF is needed. The DITABIS Super Micron image plate scanning system was tested and shown to potentially rival the resolution and dynamic range of Kodak RAR 2492 x-ray film. However, the absolute sensitivity of the scanner is unknown. In this work, a methodology to cross calibrate Fuji TR image plate and the absolutely calibrated Kodak RAR 2492 x-ray film is presented. Details of the experimental configurations used are included. An energy dependent scale factor to convert Fuji TR IP scanned on a DITABIS Super Micron scanner from 16-bit grayscale TIFF to intensity units (i.e., photons per square micron) is discussed.

  11. Classifications for Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA) site-specific projects: 2010

    USGS Publications Warehouse

    Jones, William R.; Garber, Adrienne

    2013-01-01

    The Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA) funds over 100 wetland restoration projects across Louisiana. Integral to the success of CWPPRA is its long-term monitoring program, which enables State and Federal agencies to determine the effectiveness of each restoration effort. One component of this monitoring program is the classification of high-resolution, color-infrared aerial photography at the U.S. Geological Survey’s National Wetlands Research Center in Lafayette, Louisiana. Color-infrared aerial photography (9- by 9-inch) is obtained before project construction and several times after construction. Each frame is scanned on a photogrammetric scanner that produces a high-resolution image in Tagged Image File Format (TIFF). By using image-processing software, these TIFF files are then orthorectified and mosaicked to produce a seamless image of a project area and its associated reference area (a control site near the project that has common environmental features, such as marsh type, soil types, and water salinities). The project and reference areas are then classified according to pixel value into two distinct classes, land and water. After initial land and water ratios have been established by using photography obtained before and after project construction, subsequent comparisons can be made over time to determine land-water change.
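
    The two-class, pixel-value classification described above can be pictured as a simple threshold rule followed by a ratio computation. The sketch below is an illustrative stand-in only: the threshold, the classification rule, and the function names are assumptions; the actual workflow uses image-processing software on orthorectified color-infrared mosaics.

```python
def classify_land_water(pixels, water_threshold):
    """Classify a 2-D list of pixel values into 'water' (at or below the
    threshold, since water is dark in CIR imagery) and 'land'."""
    return [['water' if v <= water_threshold else 'land' for v in row]
            for row in pixels]

def land_water_ratio(classified):
    """Fraction of land pixels; comparing this across acquisition dates
    is the basis for the land-water change comparison."""
    flat = [c for row in classified for c in row]
    return flat.count('land') / len(flat)
```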

  12. Using GDAL to Convert NetCDF 4 CF 1.6 to GeoTIFF: Interoperability Problems and Solutions for Data Providers and Distributors

    NASA Astrophysics Data System (ADS)

    Haran, T. M.; Brodzik, M. J.; Nordgren, B.; Estilow, T.; Scott, D. J.

    2015-12-01

    An increasing number of new Earth science datasets are being produced by data providers in self-describing, machine-independent file formats, including Hierarchical Data Format version 5 (HDF5) and Network Common Data Form version 4 (netCDF-4). Furthermore, data providers may be producing netCDF-4 files that follow the Conventions for Climate and Forecast Metadata version 1.6 (CF 1.6), which, for datasets mapped to a projected raster grid covering all or a portion of the Earth, include the Coordinate Reference System (CRS) used to define how latitude and longitude are mapped to grid coordinates, i.e., columns and rows, and vice versa. One problem users may encounter is that their preferred visualization and analysis tool may not yet support one of these newer formats. Moreover, data distributors such as NASA's NSIDC DAAC may not yet support on-the-fly conversion of files from all data sets produced in a new format to a preferred older distributed format. Open source solutions to this dilemma exist in the form of software packages that can translate files from one of the new formats to one of the preferred formats. However, these software packages require that the file to be translated conform to the specifications of its respective format. Although an online CF-Convention compliance checker is available from cfconventions.org, a recent NSIDC user services incident described here in detail involved an NSIDC-supported data set that passed the (then current) CF Checker Version 2.0.6 but was in fact lacking two variables necessary for conformance. The problem was not detected until GDAL, a software package that relies on the missing variables, was employed by a user in an attempt to translate the data into a different file format, namely GeoTIFF. This incident suggests that testing a candidate data product with one or more software products written to accept the advertised conventions is a practice that improves interoperability: differences between data file contents and software package expectations are exposed, affording an opportunity to improve conformance of the software, the data, or both. The incident also demonstrates that data providers, distributors, and users can work together to improve data product quality and interoperability.
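
    The incident suggests a complementary check to schema-style validation: verify that every variable a downstream tool will look for actually exists in the file. Below is a toy version of such a completeness check, with the netCDF file reduced to a plain dictionary; the structure and rules are illustrative assumptions, not the CF Checker's or GDAL's logic.

```python
def check_cf_grid_mapping(variables):
    """For each variable that references a grid_mapping, verify that the
    referenced grid-mapping variable exists and that a coordinate
    variable exists for each of its dimensions -- the kind of omission
    that schema checks can miss. `variables` maps name -> dict with an
    optional 'grid_mapping' attribute and a 'dims' tuple."""
    problems = []
    for name, var in variables.items():
        gm = var.get('grid_mapping')
        if gm is None:
            continue
        if gm not in variables:
            problems.append(f"{name}: missing grid_mapping variable '{gm}'")
        for dim in var.get('dims', ()):
            if dim not in variables:
                problems.append(f"{name}: missing coordinate variable '{dim}'")
    return problems
```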

  13. Digitized hand-wrist radiographs: comparison of subjective and software-derived image quality at various compression ratios.

    PubMed

    McCord, Layne K; Scarfe, William C; Naylor, Rachel H; Scheetz, James P; Silveira, Anibal; Gillespie, Kevin R

    2007-05-01

    The objectives of this study were to compare the effect of JPEG 2000 compression of hand-wrist radiographs on observers' qualitative assessments of image quality and to compare these with a software-derived quantitative image quality index. Fifteen hand-wrist radiographs were digitized and saved as TIFF and JPEG 2000 images at 4 levels of compression (20:1, 40:1, 60:1, and 80:1). The images, including rereads, were viewed by 13 orthodontic residents, who rated image quality on a scale of 1 to 5. A quantitative analysis was also performed by using readily available software based on the human visual system (Image Quality Measure Computer Program, version 6.2, Mitre, Bedford, Mass). ANOVA was used to determine the optimal compression level (P ≤ .05). When we compared subjective indexes, JPEG compression greater than 60:1 significantly reduced image quality. When we used quantitative indexes, the JPEG 2000 images had lower quality at all compression ratios compared with the original TIFF images. There was excellent correlation (R² > 0.92) between qualitative and quantitative indexes. Image Quality Measure indexes are more sensitive than subjective image quality assessments in quantifying image degradation with compression. There is potential for this software-based quantitative method in determining the optimal compression ratio for any image without the use of subjective raters.
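
    Objective image-quality indexes of this kind reduce an original/compressed image pair to a single number. As a generic example of the idea (deliberately not the HVS-based Image Quality Measure used in the study, whose internals are not described here), peak signal-to-noise ratio can be computed as follows:

```python
import math

def psnr(original, compressed, max_value=255):
    """Peak signal-to-noise ratio in dB between two equal-size images
    given as flat lists of pixel values. Higher is better; identical
    images give infinity."""
    mse = sum((a - b) ** 2 for a, b in zip(original, compressed)) / len(original)
    if mse == 0:
        return float('inf')
    return 10 * math.log10(max_value ** 2 / mse)
```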

  14. Polar2Grid 2.0: Reprojecting Satellite Data Made Easy

    NASA Astrophysics Data System (ADS)

    Hoese, D.; Strabala, K.

    2015-12-01

    Polar-orbiting multi-band meteorological sensors such as those on the Suomi National Polar-orbiting Partnership (SNPP) satellite pose substantial challenges for taking imagery the last mile to forecast offices, scientific analysis environments, and the general public. To do this quickly and easily, the Cooperative Institute for Meteorological Satellite Studies (CIMSS) at the University of Wisconsin has created an open-source, modular application system, Polar2Grid. This bundled solution automates tools for converting various satellite products like those from VIIRS and MODIS into a variety of output formats, including GeoTIFFs, AWIPS-compatible NetCDF files, and NinJo forecasting workstation-compatible TIFF images. In addition to traditional visible and infrared imagery, Polar2Grid includes three perceptual enhancements for the VIIRS Day-Night Band (DNB), as well as the capability to create sharpened true color, sharpened false color, and user-defined RGB images. Polar2Grid performs conversions and projections in seconds on large swaths of data. Polar2Grid is currently providing VIIRS imagery over the Continental United States, as well as Alaska and Hawaii, from various Direct-Broadcast antennas to operational forecasters at the NOAA National Weather Service (NWS) offices in their AWIPS terminals, within minutes of an overpass of the Suomi NPP satellite. Three years after Polar2Grid development started, the Polar2Grid team is now releasing version 2.0 of the software, which supports more sensors, generates more products, and provides all of its features in an easy-to-use command-line interface.

  15. Classifications for Coastal Wetlands Planning, Protection and Restoration Act site-specific projects: 2008 and 2009

    USGS Publications Warehouse

    Jones, William R.; Garber, Adrienne

    2012-01-01

    The Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA) funds over 100 wetland restoration projects across Louisiana. Integral to the success of CWPPRA is its long-term monitoring program, which enables State and Federal agencies to determine the effectiveness of each restoration effort. One component of this monitoring program is the analysis of high-resolution, color-infrared aerial photography at the U.S. Geological Survey's National Wetlands Research Center in Lafayette, Louisiana. Color-infrared aerial photography (9- by 9-inch) is obtained before project construction and several times after construction. Each frame is scanned on a photogrammetric scanner that produces a high-resolution image in Tagged Image File Format (TIFF). By using image-processing software, these TIFF files are then orthorectified and mosaicked to produce a seamless image of a project area and its associated reference area (a control site near the project that has common environmental features, such as marsh type, soil types, and water salinities). The project and reference areas are then classified according to pixel value into two distinct classes, land and water. After initial land and water ratios have been established by using photography obtained before and after project construction, subsequent comparisons can be made over time to determine land-water change. Several challenges are associated with the land-water interpretation process. Primarily, land-water classifications are often complicated by the presence of floating aquatic vegetation that occurs throughout the freshwater systems of coastal Louisiana and that is sometimes difficult to differentiate from emergent marsh. Other challenges include tidal fluctuations and water movement from strong winds, which may result in flooding and inundation of emergent marsh under certain conditions. Compensating for these events is difficult but possible by using other sources of imagery to verify marsh conditions on other dates.

  16. Processed Thematic Mapper Satellite Imagery for Selected Areas within the U.S.-Mexico Borderlands

    USGS Publications Warehouse

    Dohrenwend, John C.; Gray, Floyd; Miller, Robert J.

    2000-01-01

    The study is summarized in the Adobe Acrobat Portable Document Format (PDF) file OF00-309.PDF. This publication also contains satellite full-scene images of selected areas along the U.S.-Mexico border. These images are presented as high-resolution images in JPEG format (IMAGES). The folder LOCATIONS contains TIFF images showing exact positions of easily identified reference locations for each of the Landsat TM scenes located at least partly within the U.S. A reference location table (BDRLOCS.DOC in MS Word format) lists the latitude and longitude of each reference location with a nominal precision of 0.001 minute of arc.

  17. Volcanic Eruptions in Kamchatka

    NASA Technical Reports Server (NTRS)

    2007-01-01

    [Figures removed for brevity; see original site for full-resolution TIFFs of the Sheveluch and Klyuchevskoy stratovolcanoes.]

    One of the most volcanically active regions of the world is the Kamchatka Peninsula in eastern Siberia, Russia. It is not uncommon for several volcanoes to be erupting at the same time. On April 26, 2007, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) on NASA's Terra spacecraft captured these images of the Klyuchevskoy and Sheveluch stratovolcanoes, erupting simultaneously, and 80 kilometers (50 miles) apart. Over Klyuchevskoy, the thermal infrared data (overlaid in red) indicate that two open-channel lava flows are descending the northwest flank of the volcano. Also visible is an ash-and-water plume extending to the east. Sheveluch volcano is partially cloud-covered. The hot flows highlighted in red come from a lava dome at the summit. They are avalanches of material from the dome, and pyroclastic flows.

    With its 14 spectral bands from the visible to the thermal infrared wavelength region, and its high spatial resolution of 15 to 90 meters (about 50 to 300 feet), ASTER images Earth to map and monitor the changing surface of our planet.

    ASTER is one of five Earth-observing instruments launched December 18, 1999, on NASA's Terra spacecraft. The instrument was built by Japan's Ministry of Economy, Trade and Industry. A joint U.S./Japan science team is responsible for validation and calibration of the instrument and the data products.

    The broad spectral coverage and high spectral resolution of ASTER provide scientists in numerous disciplines with critical information for surface mapping and monitoring of dynamic conditions and temporal change. Example applications are: monitoring glacial advances and retreats; monitoring potentially active volcanoes; identifying crop stress; determining cloud morphology and physical properties; wetlands evaluation; thermal pollution monitoring; coral reef degradation; surface temperature mapping of soils and geology; and measuring surface heat balance.

    The U.S. science team is located at NASA's Jet Propulsion Laboratory, Pasadena, Calif. The Terra mission is part of NASA's Science Mission Directorate.

    Size: 19.2 by 21 kilometers (11.9 by 13.0 miles)
    Location: 57 degrees North latitude, 161 degrees East longitude
    Orientation: North at top
    Image Data: ASTER Bands 3, 2, and 1, and 12 in red
    Original Data Resolution: ASTER 15 meters (49.2 feet) visible; 90 meters (295.2 feet) thermal infrared
    Date Acquired: April 26, 2007

  18. Effects of Image Compression on Automatic Count of Immunohistochemically Stained Nuclei in Digital Images

    PubMed Central

    López, Carlos; Lejeune, Marylène; Escrivà, Patricia; Bosch, Ramón; Salvadó, Maria Teresa; Pons, Lluis E.; Baucells, Jordi; Cugat, Xavier; Álvaro, Tomás; Jaén, Joaquín

    2008-01-01

    This study investigates the effects of digital image compression on automatic quantification of immunohistochemical nuclear markers. We examined 188 images with a previously validated computer-assisted analysis system. A first group was composed of 47 images captured in TIFF format, and three other groups contained the same images converted from TIFF to JPEG format with 3×, 23× and 46× compression. Counts from the TIFF images were compared with those from the other three groups. Overall, differences in the counts increased with the degree of compression. Low-complexity images (≤100 cells/field, without clusters or with small-area clusters) had small differences (<5 cells/field in 95–100% of cases), and high-complexity images showed substantial differences (<35–50 cells/field in 95–100% of cases). Compression does not compromise the accuracy of immunohistochemical nuclear marker counts obtained by computer-assisted analysis systems for digital images with low complexity and could be an efficient method for storing these images. PMID:18755997

  19. The National Map - Orthoimagery

    USGS Publications Warehouse

    Mauck, James; Brown, Kim; Carswell, William J.

    2009-01-01

    Orthorectified digital aerial photographs and satellite images of 1-meter (m) pixel resolution or finer make up the orthoimagery component of The National Map. The process of orthorectification removes feature displacements and scale variations caused by terrain relief and sensor geometry. The result is a combination of the image characteristics of an aerial photograph or satellite image and the geometric qualities of a map. These attributes allow users to measure distance, calculate areas, determine shapes of features, calculate directions, determine accurate coordinates, determine land cover and use, perform change detection, and update maps. The standard digital orthoimage is a 1-m or finer resolution, natural-color or color-infrared product. Most are now produced as GeoTIFFs and accompanied by a Federal Geographic Data Committee (FGDC)-compliant metadata file. The primary source for 1-m data is the National Agriculture Imagery Program (NAIP) leaf-on imagery. The U.S. Geological Survey (USGS) utilizes NAIP imagery as the image layer on its 'Digital-Map' - a new generation of USGS topographic maps (http://nationalmap.gov/digital_map). However, many Federal, State, and local governments and organizations require finer resolutions to meet a myriad of needs. Most of these images are leaf-off, natural-color products at resolutions of 1-foot (ft) or finer.
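
    The first capability listed above, measuring distance, follows directly from the fixed ground resolution of an orthoimage: pixel offsets scale linearly to ground units. A minimal sketch (the function name and the flat, straight-line assumption are mine):

```python
def pixel_distance_m(p1, p2, resolution_m=1.0):
    """Straight-line ground distance between two (col, row) pixel
    coordinates in an orthoimage with the given resolution in meters
    per pixel. Assumes a flat surface and square pixels."""
    (c1, r1), (c2, r2) = p1, p2
    return resolution_m * ((c1 - c2) ** 2 + (r1 - r2) ** 2) ** 0.5
```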

  20. Influence of image compression on the interpretation of spectral-domain optical coherence tomography in exudative age-related macular degeneration

    PubMed Central

    Kim, J H; Kang, S W; Kim, J-r; Chang, Y S

    2014-01-01

    Purpose To evaluate the effect of image compression of spectral-domain optical coherence tomography (OCT) images in the examination of eyes with exudative age-related macular degeneration (AMD). Methods Thirty eyes from 30 patients who were diagnosed with exudative AMD were included in this retrospective observational case series. The horizontal OCT scans centered at the center of the fovea were conducted using spectral-domain OCT. The images were exported to Tag Image File Format (TIFF) and 100, 75, 50, 25 and 10% quality of Joint Photographic Experts Group (JPEG) format. OCT images were taken before and after intravitreal ranibizumab injections, and after relapse. The prevalence of subretinal and intraretinal fluids was determined. Differences in choroidal thickness between the TIFF and JPEG images were compared with the intra-observer variability. Results The prevalence of subretinal and intraretinal fluids was comparable regardless of the degree of compression. However, the chorio–scleral interface was not clearly identified in many images with a high degree of compression. In images with 25 and 10% quality of JPEG, the difference in choroidal thickness between the TIFF images and the respective JPEG images was significantly greater than the intra-observer variability of the TIFF images (P=0.029 and P=0.024, respectively). Conclusions In OCT images of eyes with AMD, 50% of the quality of the JPEG format would be an optimal degree of compression for efficient data storage and transfer without sacrificing image quality. PMID:24788012

  1. Natural-Color Image Mosaics of Afghanistan: Digital Databases and Maps

    USGS Publications Warehouse

    Davis, Philip A.; Hare, Trent M.

    2007-01-01

    Explanation: The 50 tiled images in this dataset are natural-color renditions of the calibrated six-band Landsat mosaics created from Landsat Enhanced Thematic Mapper Plus (ETM+) data. Natural-color images depict the surface as seen by the human eye. The calibrated Landsat ETM+ maps produced by Davis (2006) are relative reflectance and need to be grounded with ground-reflectance data, but the difficulties in performing fieldwork in Afghanistan precluded ground-reflectance surveys. For natural-color calibration, which involves only the blue, green, and red color bands of Landsat, we could use ground photographs, Munsell color readings of ground surfaces, or another image base that accurately depicts the surface color. Each map quadrangle is 1° of latitude by 1° of longitude. The numbers assigned to each map quadrangle refer to the latitude and longitude coordinates of the lower left corner of the quadrangle. For example, quadrangle Q2960 has its lower left corner at lat 29° N., long 60° E. Each quadrangle overlaps adjacent quadrangles by 100 pixels (2.85 km). Only the 14.25-m-spatial-resolution UTM and 28.5-m-spatial-resolution WGS84 geographic GeoTIFF datasets are available in this report, to decrease the amount of space needed. The images are (three-band, eight-bit) GeoTIFFs with embedded georeferencing. As such, most software will not require the associated world files. An index of all available images in geographic coordinates is displayed here: Index_Geo_DD.pdf. The country of Afghanistan spans three UTM zones (41-43). Maps are stored as GeoTIFFs in their respective UTM zone projection. Indexes of all available topographic map sheets in their respective UTM zone are displayed here: Index_UTM_Z41.pdf, Index_UTM_Z42.pdf, Index_UTM_Z43.pdf. You will need Adobe Reader to view the PDF files. Download a copy of the latest version of Adobe Reader for free.
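
    The quadrangle naming convention described above (e.g., Q2960 for a lower-left corner at lat 29° N., long 60° E.) can be decoded mechanically. A small sketch, under the assumption that names are always 'Q' followed by two latitude digits and the remaining longitude digits:

```python
def quad_lower_left(name):
    """Decode a quadrangle name like 'Q2960' into the (latitude,
    longitude) of its lower-left corner, in whole degrees."""
    digits = name.lstrip('Qq')
    return int(digits[:2]), int(digits[2:])
```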

  2. Proposed color workflow solution from mobile and website to printing

    NASA Astrophysics Data System (ADS)

    Qiao, Mu; Wyse, Terry

    2015-03-01

    With the recent introduction of mobile devices and developments in client-side application technologies, there has been an explosion of the parameter matrix for color management: hardware platform (computer vs. mobile), operating system (Windows, Mac OS, Android, iOS), client application (Flash, IE, Firefox, Safari, Chrome), and file format (JPEG, TIFF, PDF of various versions). In a modern digital print shop, multiple print solutions (digital presses, wide-format inkjet, dye-sublimation inkjet) are used to produce a wide variety of customizable products, from photo books, personalized greeting cards, and canvas prints to mobile phone cases and more. In this paper, we outline a strategy that spans from client-side application and print-file construction to color setup on the printer to maintain consistency and achieve what-you-see-is-what-you-get for customers who are using a wide variety of technologies in viewing and ordering products.

  3. 'tomo_display' and 'vol_tools': IDL VM Packages for Tomography Data Reconstruction, Processing, and Visualization

    NASA Astrophysics Data System (ADS)

    Rivers, M. L.; Gualda, G. A.

    2009-05-01

    One of the challenges in tomography is the availability of suitable software for image processing and analysis in 3D. We present here 'tomo_display' and 'vol_tools', two packages created in IDL that enable reconstruction, processing, and visualization of tomographic data. They complement in many ways the capabilities offered by Blob3D (Ketcham 2005 - Geosphere, 1: 32-41, DOI: 10.1130/GES00001.1) and, in combination, allow users without programming knowledge to perform all steps necessary to obtain qualitative and quantitative information from tomographic data. The package 'tomo_display' was created and is maintained by Mark Rivers. It allows the user to: (1) preprocess and reconstruct parallel-beam tomographic data, including removal of anomalous pixels, ring-artifact reduction, and automated determination of the rotation center, and (2) visualize both raw and reconstructed data, either as individual frames or as a series of sequential frames. The package 'vol_tools' consists of a series of small programs created and maintained by Guilherme Gualda to perform specific tasks not included in other packages. Existing modules include simple tools for cropping volumes, generating histograms of intensity, measuring sample volume (useful for porous samples like pumice), and computing volume differences (for differential absorption tomography). The module 'vol_animate' can be used to generate 3D animations using rendered isosurfaces around objects. Both packages use the same NetCDF-format '.volume' files created using code written by Mark Rivers. Currently, only 16-bit integer volumes are created and read by the packages, but floating-point and 8-bit data can easily be stored in the NetCDF format as well. A simple GUI to convert sequences of tiffs into '.volume' files is available within 'vol_tools'. 
Both 'tomo_display' and 'vol_tools' include options to (1) generate onscreen output that allows for dynamic visualization in 3D, (2) save sequences of tiffs to disk, and (3) generate MPEG movies for inclusion in presentations, publications, websites, etc. Both are freely available as run-time ('.sav') versions that can be run using the free IDL Virtual Machine, available from ITT Visual Information Solutions: http://www.ittvis.com/ProductServices/IDL/VirtualMachine.aspx. The run-time versions of 'tomo_display' and 'vol_tools' can be downloaded from http://cars.uchicago.edu/software/idl/tomography.html and http://sites.google.com/site/voltools/
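As a rough illustration of what an intensity-histogram module like the one in 'vol_tools' computes, here is a pure-Python sketch (our own, not the IDL code) that bins the voxels of a small 16-bit volume stored as nested lists:

```python
def intensity_histogram(volume, bins=8, vmax=65535):
    """Histogram of voxel intensities for a 3D volume given as
    nested lists of 16-bit integers in 0..vmax. Returns bin counts."""
    counts = [0] * bins
    width = (vmax + 1) / bins  # intensity range covered by each bin
    for plane in volume:
        for row in plane:
            for voxel in row:
                counts[min(int(voxel / width), bins - 1)] += 1
    return counts

# A 2 x 2 x 2 toy volume: four dark voxels, four bright ones.
vol = [[[0, 100], [200, 300]], [[65000, 65100], [65200, 65300]]]
print(intensity_histogram(vol))  # [4, 0, 0, 0, 0, 0, 0, 4]
```

In practice the real tool reads the volume from the NetCDF '.volume' file and operates on arrays rather than nested lists, but the binning logic is the same idea.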

  4. Providing Internet Access to High-Resolution Lunar Images

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2008-01-01

    The OnMoon server is a computer program that provides Internet access to high-resolution Lunar images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of the Moon. The OnMoon server implements the Open Geospatial Consortium (OGC) Web Map Service (WMS) server protocol and supports Moon-specific extensions. Unlike other Internet map servers that provide Lunar data using an Earth coordinate system, the OnMoon server supports encoding of data in Moon-specific coordinate systems. The OnMoon server offers access to most of the available high-resolution Lunar image and elevation data. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. Full-precision spectral arithmetic processing is also available, by use of a custom SLD extension. This server can dynamically add shaded relief based on the Lunar elevation to any image layer. This server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.
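A WMS client retrieves imagery from a server of this kind with a GetMap request. The sketch below builds such a request URL with Python's standard library; the endpoint and layer name are placeholders for illustration, not the actual OnMoon addresses:

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width, height, fmt="image/tiff"):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    bbox is (min_x, min_y, max_x, max_y) in the layer's
    coordinate reference system.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = wms_getmap_url("http://example.org/wms", "moon_elevation",
                     (-180, -90, 180, 90), 1024, 512)
print(url)
```

Requesting `image/tiff` as the output format is what makes the result directly loadable in GIS software, as the abstract describes.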

  5. Providing Internet Access to High-Resolution Mars Images

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2008-01-01

    The OnMars server is a computer program that provides Internet access to high-resolution Mars images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of Mars. The OnMars server is an implementation of the Open Geospatial Consortium (OGC) Web Map Service (WMS) server. Unlike other Mars Internet map servers that provide Martian data using an Earth coordinate system, the OnMars WMS server supports encoding of data in Mars-specific coordinate systems. The OnMars server offers access to most of the available high-resolution Martian image and elevation data, including an 8-meter-per-pixel uncontrolled mosaic of most of the Mars Global Surveyor (MGS) Mars Observer Camera Narrow Angle (MOCNA) image collection, which is not available elsewhere. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. The OnMars server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.

  6. IIPImage: Large-image visualization

    NASA Astrophysics Data System (ADS)

    Pillay, Ruven

    2014-08-01

    IIPImage is an advanced, high-performance, feature-rich image server system that enables online access to full-resolution floating point (as well as other bit depth) images at terabyte scales. Paired with the VisiOmatic (ascl:1408.010) celestial image viewer, the system can comfortably handle gigapixel-size images as well as advanced image features such as 8-, 16- and 32-bit depths, CIELAB colorimetric images, and scientific imagery such as multispectral images. Streaming is tile-based, which enables viewing, navigating and zooming in real time around gigapixel-size images. Source images can be in either TIFF or JPEG2000 format. Whole images or regions within images can also be rapidly and dynamically resized and exported by the server from a single source image without the need to store multiple files in various sizes.
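Tile-based streaming of the kind IIPImage performs rests on a resolution pyramid: each level halves the image dimensions until the whole image fits in one tile, and the client only fetches the tiles visible at the current zoom. A small sketch (our own illustration, not IIPImage's actual code) of counting tiles per pyramid level:

```python
import math

def pyramid_levels(width, height, tile=256):
    """Tile counts per level of a halving resolution pyramid,
    from full resolution down to a single tile."""
    levels = []
    while True:
        cols = math.ceil(width / tile)
        rows = math.ceil(height / tile)
        levels.append(cols * rows)
        if cols == 1 and rows == 1:
            break
        width = max(1, width // 2)
        height = max(1, height // 2)
    return levels

# A 1-gigapixel image: 40000 x 25000 pixels.
print(pyramid_levels(40000, 25000))
```

The point of the pyramid is that a viewer zoomed out to fit the screen touches only the coarsest level (a single tile here), while full-resolution panning touches a handful of the many thousands of level-0 tiles.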

  7. Archive of digital and digitized analog boomer seismic reflection data collected during USGS cruise 96CCT02 in Copano, Corpus Christi, and Nueces Bays and Corpus Christi Bayou, Texas, July 1996

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Kindinger, Jack G.; Morton, Robert A.; Blum, Mike D.; Wiese, Dana S.; Subiño, Janice A.

    2007-01-01

    In June of 1996, the U.S. Geological Survey conducted geophysical surveys from Nueces to Copano Bays, Texas. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, cruise log, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles and high resolution scanned TIFF images of the original paper printouts are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
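SEG-Y trace files like those archived here begin with a 3200-byte textual (card-image) header followed by a 400-byte binary header; two commonly read binary-header fields are the sample interval and the number of samples per trace. A minimal reader sketch, with field offsets per the classic SEG-Y standard (this is our own illustration, not the USGS viewing software):

```python
import struct

TEXT_HEADER = 3200   # EBCDIC/ASCII card-image header, bytes 1-3200
BINARY_HEADER = 400  # binary file header, bytes 3201-3600

def read_binary_header_fields(data):
    """Extract sample interval (microseconds) and samples per trace
    from a SEG-Y byte stream. Both are big-endian 16-bit integers,
    at bytes 3217-3218 and 3221-3222 (1-indexed) of the file."""
    bh = data[TEXT_HEADER:TEXT_HEADER + BINARY_HEADER]
    sample_interval, = struct.unpack(">h", bh[16:18])
    samples_per_trace, = struct.unpack(">h", bh[20:22])
    return sample_interval, samples_per_trace

# Synthetic file: blank text header, binary header with the two fields set.
bh = bytearray(BINARY_HEADER)
bh[16:18] = struct.pack(">h", 500)   # 500-microsecond sampling
bh[20:22] = struct.pack(">h", 3000)  # 3000 samples per trace
data = bytes(TEXT_HEADER) + bytes(bh)
print(read_binary_header_fields(data))  # (500, 3000)
```

Seismic Unix and commercial packages read these same headers; this sketch just shows where the values live in the byte stream.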

  8. Taking a Position

    NASA Technical Reports Server (NTRS)

    1999-01-01

    "TerrAvoid" and "Position Integrity" combine Global Positioning Satellite (GPS) data with high-resolution maps of the Earth's topography. Dubbs & Severino, Inc., based in Irvine, California, has developed software that allows the system to be run on a battery-powered laptop in the cockpit. The packages, designed primarily for military sponsors and now positioned to hit the consumer market in coming months, came about as the result of the Jet Propulsion Laboratory's Technology Affiliates Program. Intended to give American industry assistance from NASA experts and to facilitate business use of intellectual property developed for the space program, the Technology Affiliates Program introduced the start-up company of Dubbs & Severino to JPL's Dr. Nevin Bryant four years ago. GeoTIFF is now in the public domain, and its use for commercial product development has evolved into an industry standard over the last year.

  9. The Hazards Data Distribution System update

    USGS Publications Warehouse

    Jones, Brenda K.; Lamb, Rynn M.

    2010-01-01

    After a major disaster, a satellite image or a collection of aerial photographs of the event is frequently the fastest, most effective way to determine its scope and severity. The U.S. Geological Survey (USGS) Emergency Operations Portal provides emergency first responders and support personnel with easy access to imagery and geospatial data, geospatial Web services, and a digital library focused on emergency operations. Imagery and geospatial data are accessed through the Hazards Data Distribution System (HDDS). HDDS historically provided data access and delivery services through nongraphical interfaces that allow emergency response personnel to select and obtain pre-event baseline data and (or) event/disaster response data. First responders are able to access full-resolution GeoTIFF images or JPEG images at medium- and low-quality compressions through ftp downloads. USGS HDDS home page: http://hdds.usgs.gov/hdds2/

  10. Enhancements to TauDEM to support Rapid Watershed Delineation Services

    NASA Astrophysics Data System (ADS)

    Sazib, N. S.; Tarboton, D. G.

    2015-12-01

    Watersheds are widely recognized as the basic functional unit for water resources management studies and are important for a variety of problems in hydrology, ecology, and geomorphology. Nevertheless, delineating a watershed spread across a large region is still cumbersome because of the processing burden of working with large Digital Elevation Models. Terrain Analysis Using Digital Elevation Models (TauDEM) software supports the delineation of watersheds and stream networks from within desktop Geographic Information Systems and computes a rich set of watershed and stream network attributes. However, the TauDEM desktop tools have limitations: (1) they support only one raster type (TIFF format), (2) they require installation of software for parallel processing, and (3) data must be in a projected coordinate system. This paper presents enhancements to TauDEM that have been developed to extend its generality and support web-based watershed delineation services. The enhancements include (1) reading and writing raster data with the open-source Geospatial Data Abstraction Library (GDAL), no longer limited to the TIFF format, and (2) support for both geographic and projected coordinates. To support web services for rapid watershed delineation, a procedure has been developed for subsetting the domain based on sub-catchments, with preprocessed data prepared and stored for each catchment. This allows watershed delineation to function locally while extending to the full extent of watersheds using the preprocessed information. Additional capabilities of the program include computation of average watershed properties and of geomorphic and channel network variables such as drainage density, shape factor, relief ratio, and stream order. The updated version of TauDEM increases its practical applicability in terms of raster data type, size, and coordinate system. 
The watershed delineation web service functionality is useful for web-based software-as-a-service deployments that alleviate the need for users to install and work with desktop GIS software.
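The geomorphic variables named above have simple textbook definitions. A sketch of three of them, using the standard hydrology formulas rather than TauDEM's actual source code:

```python
def drainage_density(total_stream_length_km, basin_area_km2):
    """Total channel length per unit basin area (km per km^2)."""
    return total_stream_length_km / basin_area_km2

def relief_ratio(basin_relief_m, basin_length_m):
    """Basin relief (max minus min elevation) divided by basin
    length, both in metres; dimensionless."""
    return basin_relief_m / basin_length_m

def shape_factor(basin_area_km2, basin_length_km):
    """Horton form factor: basin area over the square of basin length."""
    return basin_area_km2 / basin_length_km ** 2

print(drainage_density(120.0, 60.0))  # 2.0 km/km^2
print(relief_ratio(500.0, 10000.0))   # 0.05
print(shape_factor(60.0, 12.0))       # about 0.417
```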

  11. Ecohydrologic coevolution in drylands: relative roles of vegetation, soil depth and runoff connectivity on ecosystem shifts.

    NASA Astrophysics Data System (ADS)

    Saco, P. M.; Moreno de las Heras, M.; Willgoose, G. R.

    2014-12-01


  12. Tularosa Basin Play Fairway Analysis: Weights of Evidence; Mineralogy, and Temperature Anomaly Maps

    DOE Data Explorer

    Adam Brandt

    2015-11-15

    This submission has two shapefiles and a tiff image. The weights-of-evidence analysis was applied to data representing heat of the earth and fracture permeability, using training sites around the Southwest; this is shown in the tiff image. A shapefile of surface temperature anomalies was derived from statistical analysis of Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) thermal infrared data that had been converted to surface temperatures; these anomalies have not been field checked. The second shapefile shows outcrop mineralogy, which was originally mapped by the New Mexico Bureau of Geology and Mineral Resources and supplemented with mineralogic information related to rock fracability risk for EGS. Further metadata can be found within each file.

  13. Dying star creates sculpture of gas and dust

    NASA Astrophysics Data System (ADS)

    2004-09-01

    Sculpture of gas and dust (hi-res size: 125 Kb) Credits: ESA, NASA, HEIC and The Hubble Heritage Team (STScI/AURA). The so-called Cat's Eye Nebula, formally catalogued NGC 6543 and seen here in this detailed view from the NASA/ESA Hubble Space Telescope, is one of the most complex planetary nebulae ever seen in space. A planetary nebula forms when Sun-like stars gently eject their outer gaseous layers to form bright nebulae with amazing twisted shapes. Hubble first revealed NGC 6543's surprisingly intricate structures, including concentric gas shells, jets of high-speed gas and unusual shock-induced knots of gas, in 1994. This new image, taken with Hubble's Advanced Camera for Surveys (ACS), reveals the full beauty of a bull's-eye pattern of eleven or more concentric rings, or shells, around the Cat's Eye. Each 'ring' is actually the edge of a spherical bubble seen projected onto the sky, which is why it appears bright along its outer edge. High resolution version (JPG format) 125 Kb; high resolution version (TIFF format) 2569 Kb. Acknowledgment: R. Corradi (Isaac Newton Group of Telescopes, Spain) and Z. Tsvetanov (NASA). Sculpture of gas and dust (hi-res size: 287 Kb) Credits: Nordic Optical Telescope and Romano Corradi (Isaac Newton Group of Telescopes, Spain). An enormous but extremely faint halo of gaseous material surrounds the Cat's Eye Nebula and is over three light-years across. Some planetary nebulae have been found to have halos like this one, likely formed of material ejected during earlier active episodes in the star's evolution, most likely some 50 000 to 90 000 years ago. This image was taken by Romano Corradi with the Nordic Optical Telescope on La Palma in the Canary Islands. The image is constructed from two narrow-band exposures showing oxygen atoms (1800 seconds, in blue) and nitrogen atoms (1800 seconds, in red). 
High resolution version (JPG format) 287 Kb; high resolution version (TIFF format) 4674 Kb. Although the rings may be the key to explaining the final 'gasp' of the dying central star, the mystery behind the Cat's Eye Nebula's nested 'Russian doll' structure remains largely unsolved. Observations suggest that the star ejected its mass in a series of pulses at 1500-year intervals. These convulsions created dust shells that each contain as much mass as all of the planets in our Solar System combined (but still only one percent of the Sun's mass). These concentric shells make a layered, onion-skin structure around the dying star. The view from Hubble is like seeing an onion cut in half, where each layer of skin is discernible. Until recently, it was thought that shells around planetary nebulae were a rare phenomenon. However, Romano Corradi (Isaac Newton Group of Telescopes, Spain) and collaborators, in a paper published in the European journal Astronomy & Astrophysics in April 2004, have instead shown that the formation of these rings is likely to be the rule rather than the exception. 
The bull's-eye patterns seen around planetary nebulae came as a surprise to astronomers, because they had no expectation of episodes of mass loss at the end of stellar lives that repeat every 1500 years or so. Several explanations have been proposed, including cycles of magnetic activity somewhat similar to our own Sun's sunspot cycle, the action of companion stars orbiting the dying star, and stellar pulsations. Another school of thought is that the material is ejected smoothly from the star, and the rings are created later by the formation of waves in the outflowing material. It will take further observations and more theoretical studies to decide between these and other possible explanations. Approximately 1000 years ago the pattern of mass loss suddenly changed, and the Cat's Eye Nebula itself started forming inside the dusty shells. It has been expanding ever since, as can be seen by comparing Hubble images taken in 1994, 1997, 2000 and 2002. But what has caused this dramatic change? Many aspects of the process that leads a star to lose its gaseous envelope are poorly known, and the study of planetary nebulae is one of the few ways to recover information about the last few thousand years in the life of a Sun-like star. Notes for editors: The group of astronomers involved in the April 2004 Astronomy & Astrophysics paper are: R.L.M. Corradi (Isaac Newton Group of Telescopes, Spain), P. Sanchez-Blazquez (Universidad Complutense, Spain), G. Mellema (Foundation for Research in Astronomy, The Netherlands), C. Giammanco (Instituto de Astrofisica de Canarias, Spain) and H.E. Schwarz (Cerro Tololo Inter-American Observatory, Chile). The Hubble Space Telescope is a project of international co-operation between ESA and NASA.

  14. Water and Streambed-Sediment Quality in the Upper Elk River Basin, Missouri and Arkansas, 2004-06

    USGS Publications Warehouse

    Smith, Brenda J.; Richards, Joseph M.; Schumacher, John G.

    2007-01-01

    The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, collected water and streambed-sediment samples in the Upper Elk River Basin in southwestern Missouri and northwestern Arkansas from October 2004 through December 2006. The samples were collected to determine stream-water quality and streambed-sediment quality. In 1998, the Missouri Department of Natural Resources included a 21.5-mile river reach of the Elk River on the 303(d) list of impaired waters in Missouri, as required by Section 303(d) of the Federal Clean Water Act. The Elk River is on the 303(d) list for excess nutrient loading. The distribution of total phosphorus by decade indicates that concentrations since 2000 have increased significantly from those in the 1960s, 1980s, and 1990s. Nitrate as nitrogen (nitrate) concentrations also have increased significantly in post-1985 samples compared with pre-1985 samples collected at the Elk River near Tiff City. Concentrations in the 1970s and 1980s, though similar to each other, increased from those in the 1960s, and concentrations in the 1990s and 2000s increased still more. Nitrate concentrations increased significantly in samples collected during large discharges (greater than 355 cubic feet per second) from the Elk River near Tiff City. Nitrate concentrations were largest in Indian Creek. Several sources of nitrate are present in the basin, including poultry facilities in the upper part of the basin, effluent inflow from the communities of Anderson and Lanagan, land-applied animal waste, chemical fertilizer, and possibly leaking septic systems. Total phosphorus concentrations were largest in Little Sugar Creek. The median concentration of total phosphorus in samples from Little Sugar Creek near Pineville was almost four times the median concentration in samples from the Elk River near Tiff City. 
Median concentrations of nutrient species were greater in the stormwater samples than the median concentrations in the ambient samples. Nitrate concentrations in stormwater samples ranged from 133 to 179 percent of the concentration in the ambient samples. The total phosphorus concentrations in the stormwater samples ranged from about 200 to more than 600 percent of the concentration in the ambient samples. Base-flow conditions as reflected by the seepage run of the summer of 2006 indicate that 52 percent of the discharge at the Elk River near Tiff City is contributed by Indian Creek. Little Sugar Creek contributes 32 percent and Big Sugar Creek 9 percent of the discharge in the Elk River near Tiff City. Only about 7 percent of the discharge at Tiff City comes from the mainstem of the Elk River. Concentrations of dissolved ammonia plus organic nitrogen as nitrogen, dissolved ammonia as nitrogen, dissolved phosphorus, and dissolved orthophosphorus were detected in all streambed-sediment leachate samples. Concentrations of leachable nutrients in streambed-sediment samples generally tended to be slightly larger along the major forks of the Elk River as compared to tributary sites, with sites in the upper reaches of the major forks having among the largest concentrations. Concentrations of leachable nutrients in the major forks generally decreased with increasing distance downstream.

  15. High-resolution seismic-reflection data offshore of Dana Point, southern California borderland

    USGS Publications Warehouse

    Sliter, Ray W.; Ryan, Holly F.; Triezenberg, Peter J.

    2010-01-01

    The U.S. Geological Survey collected high-resolution shallow seismic-reflection profiles in September 2006 in the offshore area between Dana Point and San Mateo Point in southern Orange and northern San Diego Counties, California. Reflection profiles were located to image folds and reverse faults associated with the San Mateo fault zone and high-angle strike-slip faults near the shelf break (the Newport-Inglewood fault zone) and at the base of the slope. Interpretations of these data were used to update the USGS Quaternary fault database and in shaking hazard models for the State of California developed by the Working Group for California Earthquake Probabilities. This cruise was funded by the U.S. Geological Survey Coastal and Marine Catastrophic Hazards project. Seismic-reflection data were acquired aboard the R/V Sea Explorer, which is operated by the Ocean Institute at Dana Point. A SIG ELC820 minisparker seismic source and a SIG single-channel streamer were used. More than 420 km of seismic-reflection data were collected. This report includes maps of the seismic-survey sections, linked to Google Earth software, and digital data files showing images of each transect in SEG-Y, JPEG, and TIFF formats.

  16. The Chinese Visible Human (CVH) datasets incorporate technical and imaging advances on earlier digital humans

    PubMed Central

    Zhang, Shao-Xiang; Heng, Pheng-Ann; Liu, Zheng-Jin; Tan, Li-Wen; Qiu, Ming-Guo; Li, Qi-Yu; Liao, Rong-Xia; Li, Kai; Cui, Gao-Yu; Guo, Yan-Li; Yang, Xiao-Ping; Liu, Guang-Jiu; Shan, Jing-Lu; Liu, Ji-Jun; Zhang, Wei-Guo; Chen, Xian-Hong; Chen, Jin-Hua; Wang, Jian; Chen, Wei; Lu, Ming; You, Jian; Pang, Xue-Li; Xiao, Hong; Xie, Yong-Ming; Cheng, Jack Chun-Yiu

    2004-01-01

    We report the availability of a digitized Chinese male and a digitized Chinese female typical of the population and with no obvious abnormalities. The embalming and milling procedures incorporate three technical improvements over earlier digitized cadavers. Vascular perfusion with coloured gelatin was performed to facilitate blood vessel identification. Embalmed cadavers were embedded in gelatin and cryosectioned whole so as to avoid section loss resulting from cutting the body into smaller pieces. Milling performed at −25 °C prevented small structures (e.g. teeth, concha nasalis and articular cartilage) from falling off from the milling surface. The male image set (.tiff images, each of 36 Mb) has a section resolution of 3072 × 2048 pixels (∼170 μm; the accompanying magnetic resonance imaging and computed tomography data have a resolution of 512 × 512, i.e. ∼440 μm). The Chinese Visible Human male and female datasets are available at http://www.chinesevisiblehuman.com (the male is 90.65 Gb and the female 131.04 Gb). MPEG videos of direct records of real-time volume rendering are at: http://www.cse.cuhk.edu.hk/~crc PMID:15032906

  17. 10 CFR 9.35 - Duplication fees.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) copying of ADAMS documents to paper (applies to images, OCR TIFF, and PDF text) is $0.30 per page. (B) EFT... is $0.30 per page. (vi) Priority rates (rush processing) are as follows: (A) The priority rate...

  18. 10 CFR 9.35 - Duplication fees.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) copying of ADAMS documents to paper (applies to images, OCR TIFF, and PDF text) is $0.30 per page. (B) EFT... is $0.30 per page. (vi) Priority rates (rush processing) are as follows: (A) The priority rate...

  19. 10 CFR 9.35 - Duplication fees.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) copying of ADAMS documents to paper (applies to images, OCR TIFF, and PDF text) is $0.30 per page. (B) EFT... is $0.30 per page. (vi) Priority rates (rush processing) are as follows: (A) The priority rate...

  20. http://www.esa.int/esaSC/Pr_21_2004_s_en.html

    NASA Astrophysics Data System (ADS)

    2004-09-01

    X-ray brightness map (hi-res size: 38 Kb) Credits: ESA/ XMM-Newton/ Patrick Henry et al. This map shows "surface brightness", or how luminous the region is. The larger of the two galaxy clusters is brighter, shown here as a white and red spot. A second cluster resides at about "2 o'clock" from this, shown by a patch of yellow surrounded by green. Luminosity is related to density, so the densest regions (the cluster cores) are the brightest. The white colour corresponds to regions of the highest surface brightness, followed by red, orange, yellow, green, blue and purple. High resolution version (JPG format) 38 Kb; high resolution version (TIFF format) 525 Kb. Artist's impression of a cosmic head-on collision, Credits: NASA. The event details what the scientists are calling the 'perfect cosmic storm': galaxy clusters that collided like two high-pressure weather fronts and created hurricane-like conditions, tossing galaxies far from their paths and churning shock waves of 100-million-degree gas through intergalactic space. The tiny dots in this artist's concept are galaxies containing thousands of millions of stars. Temperature map (hi-res size: 57 Kb) Credits: ESA/ XMM-Newton/ Patrick Henry et al. This image shows the temperature of the gas in and around the two merging galaxy clusters, based directly on X-ray data. The galaxies themselves are difficult to identify; the image highlights the hot 'invisible' gas between the clusters, heated by shock waves. The white colour corresponds to regions of the highest temperature - millions of degrees, hotter than the surface of the Sun - followed by red, orange, yellow and blue. 
High resolution version (JPG format) 57 Kb; high resolution version (TIFF format) 819 Kb. The event details what the scientists are calling the 'perfect cosmic storm': galaxy clusters that collided like two high-pressure weather fronts and created hurricane-like conditions, tossing galaxies far from their paths and churning shock waves of 100-million-degree gas through intergalactic space. This unprecedented view of a merger in action crystallises the theory that the Universe built its magnificent hierarchical structure from the 'bottom up' - essentially through mergers of smaller galaxies and galaxy clusters into bigger ones. "Here before our eyes we see the making of one of the biggest objects in the Universe," said Dr Patrick Henry of the University of Hawaii, who led the study. "What was once two distinct but smaller galaxy clusters 300 million years ago is now one massive cluster in turmoil." Henry and his colleagues, Alexis Finoguenov and Ulrich Briel of the Max Planck Institute for Extraterrestrial Physics in Germany, present these results in an upcoming issue of the Astrophysical Journal. The forecast for the new super-cluster, they said, is 'clear and calm' now that the worst of the storm has passed. Galaxy clusters are the largest gravitationally bound structures in the Universe, containing hundreds to thousands of galaxies. Our Milky Way galaxy is part of a small group of galaxies and is not gravitationally bound to the closest cluster, the Virgo Cluster. We are destined for a collision in a few thousand million years, though. The cluster, named Abell 754 and located in the constellation Hydra, has been known for decades. However, to the scientists' surprise, the new observation reveals that the merger may have occurred from the opposite direction to what was thought. They found evidence for this by tracing the wreckage left today in the merger's wake, spanning a distance of millions of light-years. 
While other large mergers are known, none has been measured in such detail as Abell 754. For the first time, the scientists could create a complete 'weather map' of Abell 754 and thus determine a forecast. This map contains information about the temperature, pressure and density of the new cluster. As in all clusters, most of the ordinary matter is in the form of gas between the galaxies and is not locked up in the galaxies or stars themselves. The massive forces of the merging clusters accelerated the intergalactic gas to great speeds. This resulted in shock waves that heat the gas to very high temperatures, at which it radiates X-ray light, far more energetic than the visible light our eyes can detect. XMM-Newton, in orbit, detects this type of high-energy light. The dynamics of the merger revealed by XMM-Newton point to a cluster in transition. "One cluster has apparently smashed into the other from the 'north-west' and has since made one pass through," said Finoguenov. "Now, gravity will pull the remnants of this first cluster back towards the core of the second. Over the next few thousand million years, the remnants of the clusters will settle and the merger will be complete." The observation implies that the largest structures in the Universe are essentially still forming in the modern era. Abell 754 is relatively close, about 800 million light years away. The construction boom may be over in a few more thousand million years, though. A mysterious substance dubbed 'dark energy' appears to be accelerating the Universe's expansion rate. This means that objects are flying apart from each other at ever-increasing speed and that clusters may eventually never have the opportunity to collide with each other. X-ray observations of galaxy clusters such as Abell 754 will help to better define dark energy, and also dark matter, an 'invisible' and mysterious substance that appears to comprise over 80 percent of a galaxy cluster's mass. 
Notes for editors: This observation was announced at a NASA Internet press conference today. A paper describing these results, by Patrick Henry and his collaborators, will be published in the Astrophysical Journal. Images and other visual material are available at: http://www.gsfc.nasa.gov/topstory/2004/0831galaxymerger_media.html More about XMM-Newton ESA's XMM-Newton can detect more X-ray sources than any previous satellite and is helping to solve many cosmic mysteries of the violent Universe, from black holes to the formation of galaxies. It was launched on 10 December 1999, using an Ariane-5 rocket, from French Guiana. It is expected to return data for a decade. XMM-Newton's high-tech design uses over 170 wafer-thin cylindrical mirrors spread over three telescopes. Its orbit takes it almost a third of the way to the Moon, so that astronomers can enjoy long, uninterrupted views of celestial objects.

  1. Visualization of GPM Standard Products at the Precipitation Processing System (PPS)

    NASA Astrophysics Data System (ADS)

    Kelley, O.

    2010-12-01

    Many of the standard data products for the Global Precipitation Measurement (GPM) constellation of satellites will be generated at and distributed by the Precipitation Processing System (PPS) at NASA Goddard. PPS will provide several means to visualize these data products. These visualization tools will be used internally by PPS analysts to investigate potential anomalies in the data files, and they will also be made available to researchers. Currently, a free data viewer called THOR, the Tool for High-resolution Observation Review, can be downloaded and installed on Linux, Windows, and Mac OS X systems. THOR can display swath and grid products and, to a limited degree, the low-level data packets that the satellite itself transmits to the ground system. Observations collected since the 1997 launch of the Tropical Rainfall Measuring Mission (TRMM) satellite can be downloaded from the PPS FTP archive, and in the future, many of the GPM standard products will also be available from this FTP site. To provide easy access to this 80-terabyte and growing archive, PPS currently operates an online ordering tool called STORM that provides geographic and time searches, browse-image display, and the ability to order user-specified subsets of standard data files. Prior to the anticipated 2013 launch of the GPM core satellite, PPS will expand its visualization tools by integrating an online version of THOR within STORM to provide on-the-fly image creation of any portion of an archived data file at a user-specified degree of magnification. PPS will also provide OPeNDAP access to the data archive and OGC WMS image creation of both swath and gridded data products. During the GPM era, PPS will continue to provide real-time, globally gridded 3-hour rainfall estimates to the public in a compact binary format (3B42RT) and in a GIS format (2-byte TIFF images + ESRI WorldFiles).
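
    The GIS output mentioned above pairs each TIFF with an ESRI world file, whose six lines define an affine pixel-to-geographic transform. A minimal sketch (illustrative only, not PPS code; the sample grid values below are hypothetical, not the actual 3B42RT parameters):

    ```python
    # Applying an ESRI world file: six lines give the affine coefficients
    # A (x pixel size), D (y skew), B (x skew), E (y pixel size, negative),
    # C and F (x, y of the center of the upper-left pixel).

    def parse_world_file(lines):
        """Return the six affine coefficients in file order (A, D, B, E, C, F)."""
        a, d, b, e, c, f = (float(v) for v in lines)
        return a, d, b, e, c, f

    def pixel_to_geo(col, row, coeffs):
        """Affine transform: pixel (col, row) -> geographic (x, y)."""
        a, d, b, e, c, f = coeffs
        x = a * col + b * row + c
        y = d * col + e * row + f
        return x, y

    # Hypothetical world file for a 0.25-degree grid, upper-left corner at (-180, 60):
    sample = ["0.25", "0.0", "0.0", "-0.25", "-179.875", "59.875"]
    coeffs = parse_world_file(sample)
    print(pixel_to_geo(0, 0, coeffs))  # center of the upper-left pixel
    ```

    The same transform, inverted, maps a longitude/latitude query back to a pixel index, which is how GIS software overlays such rainfall grids on other layers.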

  2. Scanning technology selection impacts acceptability and usefulness of image-rich content.

    PubMed

    Alpi, Kristine M; Brown, James C; Neel, Jennifer A; Grindem, Carol B; Linder, Keith E; Harper, James B

    2016-01-01

    Clinical and research usefulness of articles can depend on image quality. This study addressed whether scans of figures in black and white (B&W), grayscale, or color, or portable document format (PDF) to tagged image file format (TIFF) conversions as provided by interlibrary loan or document delivery were viewed as acceptable or useful by radiologists or pathologists. Residency coordinators selected eighteen figures from studies from radiology, clinical pathology, and anatomic pathology journals. With original PDF controls, each figure was prepared in three or four experimental conditions: PDF conversion to TIFF, and scans from print in B&W, grayscale, and color. Twelve independent observers indicated whether they could identify the features and whether the image quality was acceptable. They also ranked all the experimental conditions of each figure in terms of usefulness. Of 982 assessments of 87 anatomic pathology, 83 clinical pathology, and 77 radiology images, 471 (48%) were unidentifiable. Unidentifiability of originals (4%) and conversions (10%) was low. For scans, unidentifiability ranged from 53% for color, to 74% for grayscale, to 97% for B&W. Of 987 responses about acceptability (n=405), 41% were said to be unacceptable, 97% of B&W, 66% of grayscale, 41% of color, and 1% of conversions. Hypothesized order (original, conversion, color, grayscale, B&W) matched 67% of rankings (n=215). PDF to TIFF conversion provided acceptable content. Color images are rarely useful in grayscale (12%) or B&W (less than 1%). Acceptability of grayscale scans of noncolor originals was 52%. Digital originals are needed for most images. Print images in color or grayscale should be scanned using those modalities.

  3. Predicted seafloor facies of Central Santa Monica Bay, California

    USGS Publications Warehouse

    Dartnell, Peter; Gardner, James V.

    2004-01-01

    Summary -- Mapping surficial seafloor facies (sand, silt, muddy sand, rock, etc.) should be the first step in marine geological studies and is crucial when modeling sediment processes, tracking pollution transport, deciphering tectonics, and defining benthic habitats. This report outlines an empirical technique that predicts the distribution of seafloor facies for a large area offshore of Los Angeles, CA, using high-resolution bathymetry and co-registered, calibrated backscatter from multibeam echosounders (MBES) correlated to ground-truth sediment samples. The technique uses a series of procedures involving supervised classification and hierarchical decision-tree classification that are now available in advanced image-analysis software packages. Derivative variance images of both the bathymetry and the acoustic backscatter are calculated from the MBES data and then used in a hierarchical decision-tree framework to classify the MBES data into areas of rock, gravelly muddy sand, muddy sand, and mud. A quantitative accuracy assessment of the classification results is performed using ground-truth sediment samples. The predicted facies map is also ground-truthed using seafloor photographs and high-resolution sub-bottom seismic-reflection profiles. This Open-File Report contains the predicted seafloor facies map as a georeferenced TIFF image, along with the multibeam bathymetry and acoustic backscatter data used in the study, as well as an explanation of the empirical classification process.
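
    The hierarchical decision-tree idea described above can be sketched per pixel: rugosity (bathymetric variance) separates rock from sediment first, then backscatter strength grades the sediment. This is an illustrative sketch only; the threshold values are hypothetical placeholders, not the ground-truth-calibrated values of the report:

    ```python
    # Two-level decision tree in the spirit of the facies classification:
    # roughness first (rock vs. sediment), then backscatter strength
    # (coarse vs. fine sediment). All thresholds are hypothetical.

    def classify_facies(backscatter_db, bathy_variance):
        """Return a facies label for one pixel of co-registered MBES data."""
        if bathy_variance > 5.0:       # rough seafloor -> rock outcrop
            return "rock"
        if backscatter_db > -20.0:     # strong return over smooth seafloor
            return "gravelly muddy sand"
        if backscatter_db > -35.0:     # intermediate return
            return "muddy sand"
        return "mud"                   # weak return -> fine-grained sediment

    print(classify_facies(-15.0, 8.2))  # rough seafloor
    print(classify_facies(-40.0, 0.5))  # smooth, weakly reflective seafloor
    ```

    In practice the thresholds would be fitted against the ground-truth sediment samples, and the accuracy assessment would compare the per-pixel labels with held-out samples.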

  4. High-Resolution Seismic-Reflection and Marine Magnetic Data Along the Hosgri Fault Zone, Central California

    USGS Publications Warehouse

    Sliter, Ray W.; Triezenberg, Peter J.; Hart, Patrick E.; Watt, Janet T.; Johnson, Samuel Y.; Scheirer, Daniel S.

    2009-01-01

    The U.S. Geological Survey (USGS) collected high-resolution shallow seismic-reflection and marine magnetic data in June 2008 in the offshore areas between the towns of Cayucos and Pismo Beach, Calif., from the nearshore (~6-m depth) to just west of the Hosgri Fault Zone (~200-m depth). These data are in support of the California State Waters Mapping Program and the Cooperative Research and Development Agreement (CRADA) between the Pacific Gas & Electric Co. and the U.S. Geological Survey. Seismic-reflection and marine magnetic data were acquired aboard the R/V Parke Snavely, using a SIG 2Mille minisparker seismic source and a Geometrics G882 cesium-vapor marine magnetometer. More than 550 km of seismic and marine magnetic data were collected simultaneously along shore-perpendicular transects spaced 800 m apart, with an additional 220 km of marine magnetometer data collected across the Hosgri Fault Zone, resulting in line spacing locally as small as 400 m. This report includes maps of the seismic-survey sections, linked to Google Earth software, and digital data files showing images of each transect in SEG-Y, JPEG, and TIFF formats, as well as preliminary gridded marine-magnetic-anomaly and residual-magnetic-anomaly (shallow magnetic source) maps.

  5. Digitized Database of Old Seismograms Recorded in Romania

    NASA Astrophysics Data System (ADS)

    Paulescu, Daniel; Rogozea, Maria; Popa, Mihaela; Radulian, Mircea

    2016-08-01

    The aim of this paper is to describe a managing system for a unique Romanian database of historical seismograms and complementary documentation (metadata), together with its dissemination and analysis procedures. For this study, 5188 historical seismograms recorded between 1903 and 1957 by the Romanian seismological observatories (Bucharest-Filaret, Focşani, Bacău, Vrincioaia, Câmpulung-Muscel, Iaşi) were used. In order to recover the historical instrumental data, the analog seismograms are converted to digital images and then to digital waveforms (digitization/vectorisation). First, we applied a careful scanning procedure to the seismograms and related material (seismic bulletins, station books, etc.). In the next step, the high-resolution scanned seismograms will be processed to obtain the digital/numeric waveforms. We used a Colortrac SmartLF Cx40 scanner, which provides images in TIFF or JPG format. For digitization, the Teseo2 algorithm, developed by the National Institute of Geophysics and Volcanology in Rome (Italy) within the framework of the SISMOS Project, will be used.
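
    The core idea behind vectorising a scanned seismogram can be sketched very simply: for each image column (a time step), find the darkest pixel, and take its row offset from the trace baseline as the amplitude. This is an illustrative toy, not Teseo2 itself (which handles curvature, crossing traces, and time marks); the 4x5 "scan" is a hypothetical raster where 0 is ink and 255 is paper:

    ```python
    # Toy trace extraction: darkest pixel per column -> amplitude sample.

    def extract_trace(image, baseline_row):
        """Return per-column amplitudes (rows above the baseline) of the darkest pixel."""
        trace = []
        for col in range(len(image[0])):
            column = [image[row][col] for row in range(len(image))]
            ink_row = column.index(min(column))    # darkest pixel in this column
            trace.append(baseline_row - ink_row)   # positive = upward deflection
        return trace

    W = 255  # white paper
    toy_scan = [
        [W, W, W, 0, W],
        [W, W, 0, W, 0],
        [W, 0, W, W, W],
        [0, W, W, W, W],  # baseline is row 3
    ]
    print(extract_trace(toy_scan, baseline_row=3))  # -> [0, 1, 2, 3, 2]
    ```

    A real pipeline would also resample the pixel-index time axis against the drum speed and minute marks recorded in the station books.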

  6. Archive of Digitized Analog Boomer Seismic Reflection Data Collected from the Mississippi-Alabama-Florida Shelf During Cruises Onboard the R/V Kit Jones, June 1990 and July 1991

    USGS Publications Warehouse

    Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.

    2009-01-01

    In June of 1990 and July of 1991, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Mississippi-Alabama-Florida shelf in the northern Gulf of Mexico, from Mississippi Sound to the Florida Panhandle. Work was done onboard the Mississippi Mineral Resources Institute R/V Kit Jones as part of a project to study coastal erosion and offshore sand resources. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.

  7. Tularosa Basin Play Fairway Analysis: Strain Analysis

    DOE Data Explorer

    Adam Brandt

    2015-11-15

    A DEM of the Tularosa Basin was divided into twelve zones, and a ZR ratio was calculated for each. This submission contains a TIFF image of the zoning designations, along with a table of the respective ZR ratio calculations in the metadata.

  8. 77 FR 23382 - Airworthiness Directives; Sikorsky Aircraft Corporation Helicopters

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-19

    ... prior to that time. (e) Required Actions Within 90 days: (1) By making pen and ink changes, insert into... depicted in the circled area of Figure 1 of this AD. [GRAPHIC] [TIFF OMITTED] TR19AP12.000 (f) Alternative...

  9. 77 FR 42971 - Airworthiness Directives; Various Restricted Category Helicopters

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-23

    ... specifies replacing any mast with a crack, pitting, or corrosion beyond surface rust that is removed with a... rust with a wire brush or steel wool. [GRAPHIC] [TIFF OMITTED] TR23JY12.003 (2) If there is a crack...

  10. Scanning technology selection impacts acceptability and usefulness of image-rich content*†

    PubMed Central

    Alpi, Kristine M.; Brown, James C.; Neel, Jennifer A.; Grindem, Carol B.; Linder, Keith E.; Harper, James B.

    2016-01-01

    Objective Clinical and research usefulness of articles can depend on image quality. This study addressed whether scans of figures in black and white (B&W), grayscale, or color, or portable document format (PDF) to tagged image file format (TIFF) conversions as provided by interlibrary loan or document delivery were viewed as acceptable or useful by radiologists or pathologists. Methods Residency coordinators selected eighteen figures from studies from radiology, clinical pathology, and anatomic pathology journals. With original PDF controls, each figure was prepared in three or four experimental conditions: PDF conversion to TIFF, and scans from print in B&W, grayscale, and color. Twelve independent observers indicated whether they could identify the features and whether the image quality was acceptable. They also ranked all the experimental conditions of each figure in terms of usefulness. Results Of 982 assessments of 87 anatomic pathology, 83 clinical pathology, and 77 radiology images, 471 (48%) were unidentifiable. Unidentifiability of originals (4%) and conversions (10%) was low. For scans, unidentifiability ranged from 53% for color, to 74% for grayscale, to 97% for B&W. Of 987 responses about acceptability (n=405), 41% were said to be unacceptable, 97% of B&W, 66% of grayscale, 41% of color, and 1% of conversions. Hypothesized order (original, conversion, color, grayscale, B&W) matched 67% of rankings (n=215). Conclusions PDF to TIFF conversion provided acceptable content. Color images are rarely useful in grayscale (12%) or B&W (less than 1%). Acceptability of grayscale scans of noncolor originals was 52%. Digital originals are needed for most images. Print images in color or grayscale should be scanned using those modalities. PMID:26807048

  11. Effects of Digitization and JPEG Compression on Land Cover Classification Using Astronaut-Acquired Orbital Photographs

    NASA Technical Reports Server (NTRS)

    Robinson, Julie A.; Webb, Edward L.; Evangelista, Arlene

    2000-01-01

    Studies that utilize astronaut-acquired orbital photographs for visual or digital classification require high-quality data to ensure accuracy. The majority of images available must be digitized from film and electronically transferred to scientific users. This study examined the effect of scanning spatial resolution (1200, 2400 pixels per inch [21.2 and 10.6 microns/pixel]), scanning density range option (Auto, Full) and compression ratio (non-lossy [TIFF], and lossy JPEG 10:1, 46:1, 83:1) on digital classification results of an orbital photograph from the NASA - Johnson Space Center archive. Qualitative results suggested that 1200 ppi was acceptable for visual interpretive uses for major land cover types. Moreover, Auto scanning density range was superior to Full density range. Quantitative assessment of the processing steps indicated that, while 2400 ppi scanning spatial resolution resulted in more classified polygons as well as a substantially greater proportion of polygons < 0.2 ha, overall agreement between 1200 ppi and 2400 ppi was quite high. JPEG compression up to approximately 46:1 also did not appear to have a major impact on quantitative classification characteristics. We conclude that both 1200 and 2400 ppi scanning resolutions are acceptable options for this level of land cover classification, as well as a compression ratio at or below approximately 46:1. Auto range density should always be used during scanning because it acquires more of the information from the film. The particular combination of scanning spatial resolution and compression level will require a case-by-case decision and will depend upon memory capabilities, analytical objectives and the spatial properties of the objects in the image.
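
    The bracketed micron figures in the abstract follow directly from the definition of pixels per inch: microns per pixel = 25 400 (microns per inch) divided by the scan resolution in ppi. A one-line check (plain Python, illustrative only):

    ```python
    # Convert a scanning resolution in pixels per inch to microns per pixel.
    # One inch = 25 400 microns.

    def microns_per_pixel(ppi):
        return 25400.0 / ppi

    print(round(microns_per_pixel(1200), 1))  # -> 21.2
    print(round(microns_per_pixel(2400), 1))  # -> 10.6
    ```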

  12. 75 FR 51684 - Magnuson-Stevens Act Provisions; Fisheries Off West Coast States; Pacific Coast Groundfish...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-23

    ... result of management measures designed to meet the Pacific Coast Groundfish FMP objective of achieving..., subpart G, are revised to read as follows: BILLING CODE 3510-22-P [GRAPHIC] [TIFF OMITTED] TR23AU10.046 [[Page 51687

  13. 75 FR 22213 - Energy Conservation Program: Test Procedures for General Service Fluorescent Lamps, Incandescent...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-28

    .... Since the publication of that rule, it has come to DOE's attention that, due to a technical oversight, a...-percent confidence limit of the true mean (X L ) divided by 0.97, i.e., [GRAPHIC] [TIFF OMITTED] TR28AP10...

  14. TADIR-production version: El-Op's high-resolution 480x4 TDI thermal imaging system

    NASA Astrophysics Data System (ADS)

    Sarusi, Gabby; Ziv, Natan; Zioni, O.; Gaber, J.; Shechterman, Mark S.; Lerner, M.

    1999-07-01

    Efforts invested at El-Op during the last four years have led to the development of TADIR, an engineering-model thermal imager demonstrated in 1998, and eventually to the final production version of TADIR, to be demonstrated in full operation during 1999. Both versions take advantage of the high resolution and high sensitivity obtained by the 480 x 4 TDI MCT detector, as well as many more features implemented in the system, to obtain a state-of-the-art high-end thermal imager. The production version of TADIR uses a 480 x 6 TDI HgCdTe detector made by the Israeli company SCD. In this paper, we will present the main features of the production version of TADIR.

  15. Documentation for the Machine-readable Version of the 0.2-A Resolution Far-ultraviolet Stellar Spectra Measured with COPERNICUS

    NASA Technical Reports Server (NTRS)

    Sheridan, W. T.; Warren, W. H., Jr.

    1981-01-01

    The spectra described represent a subset comprising data for 60 O- and B-type stars. The tape contains data in the spectral region λλ 1000-1450 Å with a resolution of 0.2 Å. The magnetic tape version of the data is described.

  16. apART: system for the acquisition, processing, archiving, and retrieval of digital images in an open, distributed imaging environment

    NASA Astrophysics Data System (ADS)

    Schneider, Uwe; Strack, Ruediger

    1992-04-01

    apART reflects the structure of an open, distributed environment. In line with the general trend in imaging, network-capable, general-purpose workstations with open-system image communication and image input capabilities are used. Several heterogeneous components such as CCD cameras, slide scanners, and image archives can be accessed. The system is driven by an object-oriented user interface in which devices (image sources and destinations), operators (derived from a commercial image-processing library), and images (of different data types) are managed and presented uniformly to the user. Browsing mechanisms are used to traverse devices, operators, and images. An audit-trail mechanism is offered to record interactive operations on low-resolution image derivatives. These operations are then processed off-line on the original image. Thus, the processing of extremely high-resolution raster images is possible, and the performance of resolution-dependent operations is enhanced significantly during interaction. An object-oriented database system (APRIL), which can be browsed, is integrated into the system. Attribute retrieval is supported by the user interface. Other essential features of the system include: implementation on top of the X Window System (X11R4) and the OSF/Motif widget set; a Sun-4 general-purpose workstation, including Ethernet and magneto-optical disc, as the hardware platform for the user interface; complete graphical-interactive parametrization of all operators; support of different image-interchange formats (GIF, TIFF, IIF, etc.); and consideration of current IPI standardization activities within ISO/IEC for further refinement and extensions.
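
    The audit-trail mechanism described above amounts to record-and-replay: interactive edits are logged against a low-resolution derivative, then reapplied off-line to the full-resolution original. A minimal sketch, assuming a hypothetical operator registry (the operation names and list-of-rows "image" here are stand-ins, not apART's actual API):

    ```python
    # Record-and-replay audit trail: log (operation, parameters) pairs made
    # on a small preview, then replay them in order on the original image.

    class AuditTrail:
        def __init__(self):
            self.ops = []                    # recorded (name, kwargs) pairs

        def record(self, name, **kwargs):
            self.ops.append((name, kwargs))

        def replay(self, image, registry):
            """Apply the recorded operations, in order, to the full-res image."""
            for name, kwargs in self.ops:
                image = registry[name](image, **kwargs)
            return image

    # Toy operator registry working on a list-of-rows grayscale "image":
    registry = {
        "crop": lambda img, top, bottom: img[top:bottom],
        "invert": lambda img: [[255 - px for px in row] for row in img],
    }

    trail = AuditTrail()
    trail.record("crop", top=1, bottom=3)    # user edits the preview...
    trail.record("invert")
    full_res = [[0, 50], [100, 150], [200, 250], [10, 20]]
    print(trail.replay(full_res, registry))  # ...replayed on the original
    ```

    Because only operation names and parameters are stored, the trail is tiny compared with the image, which is what makes interactive work on huge rasters feasible.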

  17. High-Resolution Chirp and Mini-Sparker Seismic-Reflection Data From the Southern California Continental Shelf - Gaviota to Mugu Canyon

    USGS Publications Warehouse

    Sliter, Ray W.; Triezenberg, Peter J.; Hart, Patrick E.; Draut, Amy E.; Normark, William R.; Conrad, James E.

    2008-01-01

    The U.S. Geological Survey (USGS) collected high-resolution shallow seismic-reflection data in September 2007 and June-July 2008 from the continental shelf offshore of southern California between Gaviota and Mugu Canyon, in support of the California State Waters Mapping Program. Data were acquired using SIG 2mille mini-sparker and Edgetech chirp 512 instruments aboard the R/V Zephyr (Sept. 2007) and R/V Parke Snavely (June-July 2008). The survey area spanned approximately 120 km of coastline and included shore-perpendicular transects spaced 1.0-1.5 km apart that extended offshore to at least the 3-mile limit of State waters, in water depths ranging from 10 m near shore to 300 m near the offshore extent of Mugu and Hueneme submarine canyons. Subbottom acoustic penetration spanned tens to several hundred meters, varying by location. This report includes maps of the surveyed transects, linked to Google Earth software, as well as digital data files showing images of each transect in SEG-Y, JPEG, and TIFF formats. The images of sediment deposits, tectonic structure, and natural-gas seeps collected during this study provide geologic information that is essential to coastal-zone and resource management at Federal, State, and local levels, as well as to future research on the sedimentary, tectonic, and climatic record of southern California.

  18. Simulation of modern climate with the new version of the INM RAS climate model

    NASA Astrophysics Data System (ADS)

    Volodin, E. M.; Mortikov, E. V.; Kostrykin, S. V.; Galin, V. Ya.; Lykosov, V. N.; Gritsun, A. S.; Diansky, N. A.; Gusev, A. V.; Yakovlev, N. G.

    2017-03-01

    The INMCM5.0 numerical model of the Earth's climate system is presented; it is an evolution of the previous version, INMCM4.0. A higher vertical resolution for the stratosphere is applied in the atmospheric block. We also raised the upper boundary of the model domain, added an aerosol block, modified the parameterization of clouds and condensation, and increased the horizontal resolution in the ocean block. The program implementation of the model was also updated. We consider the simulation of the current climate using the new version of the model. Attention is focused on the reduction of systematic errors compared to the previous version, on the reproduction of phenomena that could not be simulated correctly before, and on the problems that remain unresolved.

  19. Cluster finds giant gas vortices at the edge of Earth's magnetic bubble

    NASA Astrophysics Data System (ADS)

    2004-08-01

    12 August 2004. ESA's quartet of space-weather watchers, Cluster, has discovered vortices of ejected solar material high above the Earth. The superheated gases trapped in these structures are probably tunnelling their way into the Earth's magnetic 'bubble', the magnetosphere. This discovery possibly solves a 17-year mystery of how the magnetosphere is constantly topped up with electrified gases when it should be acting as a barrier. Three-dimensional cut-away view of Earth's magnetosphere. Credits: H. Hasegawa (Dartmouth College). The curly features sketched on the boundary layer are the Kelvin-Helmholtz vortices discovered by Cluster. They originate where two adjacent flows travel at different speeds; in this case, one of the flows is the heated gas inside the boundary layer of the magnetosphere, the other the solar wind just outside it. The arrows show the direction of the magnetic field: in red, that associated with the solar wind, and in green, the one inside Earth's magnetosphere. The white dashed arrow shows the trajectory followed by Cluster. Electrified gas varies across the vortices along Cluster's trajectory. Credits: H. Hasegawa (Dartmouth College). This computer simulation shows how the density of the electrified gas is expected to vary across the vortices along Cluster's trajectory (white dashed line). The density is lower inside the boundary layer (blue region) and higher outside, in the region dominated by the solar wind (shown in red). The density variations measured by the instruments on board Cluster match those predicted by this model. The Earth's magnetic field is our planet's first line of defence against the bombardment of the solar wind. 
The solar wind itself is launched from the Sun and carries the Sun's magnetic field throughout the Solar System. Sometimes this magnetic field is aligned with Earth's and sometimes it points in the opposite direction. When the two fields point in opposite directions, scientists understand how 'doors' in Earth's field can open. This phenomenon, called 'magnetic reconnection', allows the solar wind to flow in and collect in the reservoir known as the boundary layer. By contrast, when the fields are aligned they should present an impenetrable barrier to the flow. However, spacecraft measurements of the boundary layer, dating back to 1987, present a puzzle because they clearly show that the boundary layer is fuller when the fields are aligned than when they are not. So how is the solar wind getting in? Thanks to the data from the four formation-flying spacecraft of ESA's Cluster mission, scientists have made a breakthrough. On 20 November 2001, the Cluster flotilla was coming around from behind Earth and had just arrived at the dusk side of the planet, where the solar wind slides past Earth's magnetosphere. There it began to encounter gigantic vortices of gas at the magnetopause, the outer 'edge' of the magnetosphere. "These vortices were really huge structures, about six Earth radii across," says Hiroshi Hasegawa of Dartmouth College, New Hampshire, who has been analysing the data with help from an international team of colleagues. Their results place the size of the vortices at almost 40 000 kilometres each, and this is the first time such structures have been detected. These vortices are known as products of Kelvin-Helmholtz instabilities (KHI). They can occur when two adjacent flows are travelling at different speeds, so that one is slipping past the other. Good examples of such instabilities are the waves whipped up by the wind slipping across the surface of the ocean. 
Although KHI waves had been observed before, this is the first time that vortices have actually been detected. When a KHI wave rolls up into a vortex, it becomes known as a 'Kelvin cat's eye'. The data collected by Cluster have shown density variations of the electrified gas, right at the magnetopause, precisely like those expected when travelling through a 'Kelvin cat's eye'. Scientists had postulated that, if these structures were to form at the magnetopause, they might be able to pull large quantities of the solar wind inside the boundary layer as they collapse. Once the solar wind particles are carried into the inner part of the magnetosphere, they can be strongly excited, allowing them to smash into Earth's atmosphere and give rise to the aurorae. Cluster's discovery strengthens this scenario but does not show the precise mechanism by which the gas is transported into Earth's magnetic bubble. Thus, scientists still do not know whether this is the only process that fills up the boundary layer when the magnetic fields are aligned. For those measurements, Hasegawa says, scientists will have to wait for a future generation of magnetospheric satellites. Notes for editors: The results of this investigation have appeared in today's issue of the scientific journal Nature, in a paper entitled 'Transport of solar wind into Earth's magnetosphere through rolled-up Kelvin-Helmholtz vortices', by H. Hasegawa, M. Fujimoto, T.D. Phan, H. Reme, A. Balogh, M.W. Dunlop, C. Hashimoto and R. TanDokoro. More about magnetic reconnection: Solar wind particles follow 'magnetic field lines', rather like beads on a wire. The 'doors' that open in Earth's magnetosphere during oppositely aligned magnetic configurations are caused by a phenomenon called 'magnetic reconnection'. During this process, Earth's field lines spontaneously break and join themselves to the Sun's, allowing the solar wind to pass freely into Earth's magnetosphere. 
Magnetic reconnection is not possible in the aligned case, however; hence the need for a different mechanism to inject the particles into Earth's magnetosphere. More about Cluster: Cluster is a mission of international co-operation between ESA and NASA. It involves four spacecraft, launched on two Russian rockets during the summer of 2000. They are now flying in formation around Earth, relaying the most detailed information ever obtained about how the solar wind affects our planet in 3D. The solar wind is the perpetual stream of subatomic particles given out by the Sun, and it can damage communications satellites and power stations on Earth. The Cluster mission is expected to continue until at least 2005. The ongoing archiving of the Cluster data (the Cluster Active Archive) is part of the International Living with a Star programme (ILWS), in which space agencies worldwide work together to investigate how variations in the Sun affect the environment of Earth and the other planets. In particular, ILWS concentrates on those aspects of the Sun-Earth system that may affect mankind and society. ILWS is a collaborative initiative between Europe, the United States, Russia, Japan and Canada.

  20. Characteristics of the ocean simulations in the Max Planck Institute Ocean Model (MPIOM), the ocean component of the MPI-Earth system model

    NASA Astrophysics Data System (ADS)

    Jungclaus, J. H.; Fischer, N.; Haak, H.; Lohmann, K.; Marotzke, J.; Matei, D.; Mikolajewicz, U.; Notz, D.; von Storch, J. S.

    2013-06-01

    MPI-ESM is a new version of the global Earth system model developed at the Max Planck Institute for Meteorology. This paper describes the ocean state and circulation as well as basic aspects of variability in simulations contributing to the fifth phase of the Coupled Model Intercomparison Project (CMIP5). The performance of the ocean/sea-ice model MPIOM, coupled to a new version of the atmosphere model ECHAM6 and modules for land surface and ocean biogeochemistry, is assessed for two model versions with different grid resolution in the ocean. The low-resolution configuration has a nominal resolution of 1.5°, whereas the higher-resolution version features a quasi-uniform, eddy-permitting global resolution of 0.4°. The paper focuses on important oceanic features, such as surface temperature and salinity, water mass distribution, large-scale circulation, and heat and freshwater transports. In general, these integral quantities are simulated well in comparison with observational estimates, and improvements in comparison with the predecessor system are documented, for example for tropical variability and sea ice representation. Introducing an eddy-permitting grid configuration in the ocean leads to improvements, in particular, in the representation of interior water mass properties in the Atlantic and in the representation of important ocean currents, such as the Agulhas and Equatorial current systems. In general, however, there are more similarities than differences between the two grid configurations, and several shortcomings, known from earlier versions of the coupled model, prevail.

  1. 77 FR 12843 - Fees for Sanitation Inspections of Cruise Ships

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... used to determine the fees: [GRAPHIC] [TIFF OMITTED] TN02MR12.006 The average cost per inspection is multiplied by size and cost factors to determine the fee for vessels in each size category. The size and cost... exists rodent, insect, or other vermin infestations, contaminated food or water, or other sanitary...

  2. Regional Community Climate Simulations with variable resolution meshes in the Community Earth System Model

    NASA Astrophysics Data System (ADS)

    Zarzycki, C. M.; Gettelman, A.; Callaghan, P.

    2017-12-01

    Accurately predicting weather extremes such as precipitation (floods and droughts) and temperature (heat waves) requires high resolution to resolve mesoscale dynamics and topography at horizontal scales of 10-30 km. Simulating such resolutions globally for climate scales (years to decades) remains computationally impractical. Simulating only a small region of the planet at these scales is more tractable for climate applications. This work describes global simulations using variable-resolution static meshes with multiple dynamical cores that target the continental United States using developmental versions of the Community Earth System Model version 2 (CESM2). CESM2 is tested in idealized, aquaplanet, and full-physics configurations to evaluate variable-mesh simulations against uniform high-resolution and uniform low-resolution simulations at resolutions down to 15 km. Different physical parameterization suites are also evaluated to gauge their sensitivity to resolution. Idealized variable-resolution mesh cases compare well to high-resolution tests. More recent versions of the atmospheric physics, including cloud schemes for CESM2, are more stable with respect to changes in horizontal resolution. Most of the sensitivity arises from the timestep and from interactions between deep convection and large-scale condensation, as expected from the closure methods. The resulting full-physics model produces a climate comparable to the global low-resolution mesh and similar high-frequency statistics in the high-resolution region. Some biases are reduced (orographic precipitation in the western United States), but biases do not necessarily disappear at high resolution (e.g., summertime (JJA) surface temperature). The simulations are able to reproduce uniform high-resolution results, making them an effective tool for regional climate studies, and are available in CESM2.

  3. Visualization of small scale structures on high resolution DEMs

    NASA Astrophysics Data System (ADS)

    Kokalj, Žiga; Zakšek, Klemen; Pehani, Peter; Čotar, Klemen; Oštir, Krištof

    2015-04-01

    Knowledge of terrain morphology is very important for the observation of numerous processes and events, and digital elevation models are therefore one of the most important datasets in geographic analyses. Furthermore, recognition of natural and anthropogenic microrelief structures, which can be observed on detailed terrain models derived from aerial laser scanning (lidar) or structure-from-motion photogrammetry, is of paramount importance in many applications. In this paper we therefore examine and evaluate methods of raster lidar data visualization for the determination (recognition) of microrelief features, and present a series of strategies to assist in selecting the preferred visualization for structures of various shapes and sizes, set in varied landscapes. Often the answer is not definite, and more frequently a combination of techniques has to be used to map a very diverse landscape. Only very recently have researchers been able to benefit from free software for the calculation of advanced visualization techniques. These tools are often difficult to understand, have numerous options that confuse the user, or require and produce non-standard data formats, because they were written for specific purposes. We therefore designed the Relief Visualization Toolbox (RVT) as a free, easy-to-use, standalone application to create visualizations from high-resolution digital elevation data. It is tailored for beginners in relief interpretation, but it can also be used by more advanced users in data processing and geographic information systems. It offers a range of techniques, such as simple hillshading and its derivatives, slope gradient, trend removal, positive and negative openness, sky-view factor, and anisotropic sky-view factor. All included methods have been proven to be effective for the detection of small-scale features, and the default settings are optimised to accomplish this task.
However, the usability of the tool goes beyond computation for visualization purposes, as sky-view factor, for example, is an essential variable in many fields, e.g. in meteorology. RVT produces two types of results: 1) the original files retain the full range of values and are intended for further analyses in geographic information systems; 2) the simplified versions are histogram-stretched for visualization purposes and saved as 8-bit GeoTIFF files. This means that they can be explored in non-GIS software, e.g. with simple picture viewers, which is essential when a larger community of non-specialists needs to be considered, e.g. in public collaborative projects. The tool recognizes all frequently used single-band raster formats and supports the conversion of elevation raster data.
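    The histogram stretch used for the 8-bit visualization products can be sketched in a few lines. The percentile cut-offs (2% and 98%) below are an illustrative assumption, not RVT's actual defaults:

```python
import numpy as np

def stretch_to_8bit(band, lo_pct=2.0, hi_pct=98.0):
    """Histogram-stretch a floating-point raster band to 0-255 for display.

    Values below the low percentile map to 0, values above the high
    percentile map to 255, and everything in between scales linearly.
    """
    lo, hi = np.percentile(band, [lo_pct, hi_pct])
    scaled = (band - lo) / max(hi - lo, 1e-12)          # linear rescale
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)

# Example: a synthetic sky-view-factor-like band with values in [0, 1]
band = np.linspace(0.0, 1.0, 101).reshape(1, 101)
img = stretch_to_8bit(band)
```

    Writing `img` out as an 8-bit GeoTIFF would then be a short additional step with a geospatial library such as GDAL or rasterio.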

  4. Analyzing huge pathology images with open source software.

    PubMed

    Deroulers, Christophe; Ameisen, David; Badoual, Mathilde; Gerin, Chloé; Granier, Alexandre; Lartaud, Marc

    2013-06-06

    Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights into systems biology. However, such virtual slides pose a technical challenge, since the images often occupy several gigabytes and cannot be fully opened in a computer's memory. Moreover, there is no standard format. Therefore, most common open-source tools such as ImageJ fail to handle them, and the others require expensive hardware while still being prohibitively slow. We have developed several cross-platform open-source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Our open-source software enables dealing with huge images with standard software on average computers. The tools are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open-source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre that does image analysis of many slides on a computer cluster.
The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272.
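    The mosaicking step described above (dividing a huge image into small tiles, with or without overlap) reduces to computing tile bounding boxes. The sketch below shows only that bookkeeping, independent of any particular image library and of NDPITools' actual implementation:

```python
def tile_boxes(width, height, tile_w, tile_h, overlap=0):
    """Return (x0, y0, x1, y1) pixel boxes covering a width x height image.

    Consecutive tiles advance by (tile size - overlap), so each tile
    shares `overlap` pixels with its neighbour; edge tiles are clipped
    to the image bounds rather than padded.
    """
    step_x = max(tile_w - overlap, 1)
    step_y = max(tile_h - overlap, 1)
    boxes = []
    for y0 in range(0, height, step_y):
        for x0 in range(0, width, step_x):
            boxes.append((x0, y0, min(x0 + tile_w, width), min(y0 + tile_h, height)))
    return boxes

# A 100 x 60 image cut into 50 x 50 tiles (no overlap) yields a 2 x 2 grid,
# with the bottom row clipped to 10 pixels of height.
boxes = tile_boxes(100, 60, 50, 50)
```

    Each box can then be cropped and written out as an individual TIFF or JPEG tile by whatever image library handles the pixel data.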

  5. Analyzing huge pathology images with open source software

    PubMed Central

    2013-01-01

    Background Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights into systems biology. However, such virtual slides pose a technical challenge, since the images often occupy several gigabytes and cannot be fully opened in a computer’s memory. Moreover, there is no standard format. Therefore, most common open-source tools such as ImageJ fail to handle them, and the others require expensive hardware while still being prohibitively slow. Results We have developed several cross-platform open-source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Conclusions Our open-source software enables dealing with huge images with standard software on average computers. The tools are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open-source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre that does image analysis of many slides on a computer cluster.
Virtual slides: The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272 PMID:23829479

  6. Navigator GPS Receiver for Fast Acquisition and Weak Signal Space Applications

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke; Moreau, Michael; Boegner, Gregory J.; Sirotzky, Steve

    2004-01-01

    NASA Goddard Space Flight Center (GSFC) is developing a new space-borne GPS receiver that can operate effectively in the full range of Earth orbiting missions from Low Earth Orbit (LEO) to geostationary and beyond. Navigator is designed to be a fully space-flight-qualified GPS receiver optimized for fast signal acquisition and weak signal tracking. The fast acquisition capabilities provide exceptional time-to-first-fix (TTFF) performance with no a priori receiver state or GPS almanac information, even in the presence of the high Doppler shifts present in LEO (or near perigee in highly eccentric orbits). The fast acquisition capability also makes it feasible to implement extended correlation intervals and therefore significantly reduce Navigator's acquisition threshold. This greatly improves GPS observability when the receiver is above the GPS constellation (and satellites must be tracked from the opposite side of the Earth) by providing at least 10 dB of increased acquisition sensitivity. Fast acquisition and weak signal tracking algorithms have been implemented and validated on a hardware development board. A fully functional version of the receiver, employing most of the flight parts, with integrated navigation software is expected by mid 2005. An ultimate goal of this project is to license the Navigator design to an industry partner who will then market the receiver as a commercial product.
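    The quoted ~10 dB sensitivity gain is consistent with the textbook rule that ideal coherent integration gain scales as 10·log10 of the integration-time ratio. The interval lengths below are illustrative assumptions, not Navigator's actual design values:

```python
import math

def coherent_gain_db(t_extended_ms, t_nominal_ms):
    """Ideal SNR gain (dB) from lengthening the coherent correlation interval."""
    return 10.0 * math.log10(t_extended_ms / t_nominal_ms)

# Extending a 1 ms correlation to 10 ms buys 10 dB in the ideal,
# zero-Doppler-error case; a real receiver recovers somewhat less.
gain = coherent_gain_db(10.0, 1.0)
```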

  7. Development of the GEOS-5 Atmospheric General Circulation Model: Evolution from MERRA to MERRA2.

    NASA Technical Reports Server (NTRS)

    Molod, Andrea; Takacs, Lawrence; Suarez, Max; Bacmeister, Julio

    2014-01-01

    The Modern-Era Retrospective Analysis for Research and Applications-2 (MERRA2) version of the GEOS-5 (Goddard Earth Observing System Model - 5) Atmospheric General Circulation Model (AGCM) is currently in use in the NASA Global Modeling and Assimilation Office (GMAO) at a wide range of resolutions for a variety of applications. Details of the changes in parameterizations subsequent to the version in the original MERRA reanalysis are presented here. Results of a series of atmosphere-only sensitivity studies are shown to demonstrate changes in simulated climate associated with specific changes in physical parameterizations, and the impact of the newly implemented resolution-aware behavior on simulations at different resolutions is demonstrated. The GEOS-5 AGCM presented here is the model used as part of the GMAO's MERRA2 reanalysis, the global mesoscale "nature run", the real-time numerical weather prediction system, and for atmosphere-only, coupled ocean-atmosphere and coupled atmosphere-chemistry simulations. The seasonal mean climate of the MERRA2 version of the GEOS-5 AGCM represents a substantial improvement over the simulated climate of the MERRA version at all resolutions and for all applications. Fundamental improvements in simulated climate are associated with the increased re-evaporation of frozen precipitation and cloud condensate, resulting in a wetter atmosphere. Improvements in simulated climate are also shown to be attributable to changes in the background gravity wave drag, and to upgrades in the relationship between the ocean surface stress and the ocean roughness. The series of "resolution aware" parameters related to the moist physics were shown to result in improvements at higher resolutions, and result in AGCM simulations that exhibit seamless behavior across different resolutions and applications.

  8. High-resolution chirp seismic reflection data acquired from the Cap de Creus shelf and canyon area, Gulf of Lions, Spain in 2004

    USGS Publications Warehouse

    Grossman, Eric E.; Hart, Patrick E.; Field, Michael E.; Triezenberg, Peter

    2006-01-01

    Seismic reflection data were collected from the Cap de Creus shelf and canyon in the southwest portion of the Gulf of Lions in October 2004. The data were acquired using the U.S. Geological Survey's (USGS) high-resolution Edgetech CHIRP 512i seismic reflection system aboard the R/V Oceanus. Data from the shipboard 3.5 kHz echosounder were also collected but are not presented here. The seismic reflection data were collected as part of EuroSTRATAFORM, funded by the Office of Naval Research. In October 2004, more than 200 km of high-resolution seismic reflection data were collected in water depths ranging from 30 m to 600 m. All data were recorded with a Delph Seismic PC-based digital recording system and processed with Delph Seismic software. Processed sections were georeferenced into TIFF images for digital archive, processing and display. Penetration ranged from 20 to 80 m. The data feature high-quality vertical cross-section imagery of numerous sequences of Quaternary seismic stratigraphy. The report includes trackline maps showing the location of the data, as well as both digital data files (SEG-Y) and images of all of the profiles. The data are of high quality and provide new information on the location and thickness of sediment deposits overlying a major erosion surface on the Cap de Creus shelf; they also provide new insight into sediment processes on the walls and in the channel of Cap de Creus Canyon. These data are under study by researchers at the US Geological Survey, the University of Barcelona, and Texas A&M University. Copies of the data are available to all researchers.

  9. Concept and integration of an on-line quasi-operational airborne hyperspectral remote sensing system

    NASA Astrophysics Data System (ADS)

    Schilling, Hendrik; Lenz, Andreas; Gross, Wolfgang; Perpeet, Dominik; Wuttke, Sebastian; Middelmann, Wolfgang

    2013-10-01

    Modern mission characteristics require the use of advanced imaging sensors in reconnaissance. In particular, high spatial and high spectral resolution imaging provides promising data for many tasks such as classification and detecting objects of military relevance, such as camouflaged units or improvised explosive devices (IEDs). Especially in asymmetric warfare with highly mobile forces, intelligence, surveillance and reconnaissance (ISR) needs to be available close to real-time. This demands the use of unmanned aerial vehicles (UAVs) in combination with downlink capability. The system described in this contribution is integrated in a wing pod for ease of installation and calibration. It is designed for the real-time acquisition and analysis of hyperspectral data. The main component is a Specim AISA Eagle II hyperspectral sensor, covering the visible and near-infrared (VNIR) spectral range with a spectral resolution up to 1.2 nm and 1024 pixel across track, leading to a ground sampling distance below 1 m at typical altitudes. The push broom characteristic of the hyperspectral sensor demands an inertial navigation system (INS) for rectification and georeferencing of the image data. Additional sensors are a high resolution RGB (HR-RGB) frame camera and a thermal imaging camera. For on-line application, the data is preselected, compressed and transmitted to the ground control station (GCS) by an existing system in a second wing pod. The final result after data processing in the GCS is a hyperspectral orthorectified GeoTIFF, which is filed in the ERDAS APOLLO geographical information system. APOLLO allows remote access to the data and offers web-based analysis tools. The system is quasi-operational and was successfully tested in May 2013 in Bremerhaven, Germany.

  10. Sea-Floor Images and Data from Multibeam Surveys in San Francisco Bay, Southern California, Hawaii, the Gulf of Mexico, and Lake Tahoe, California-Nevada

    USGS Publications Warehouse

    Dartnell, Peter; Gardiner, James V.

    1999-01-01

    Accurate base maps are a prerequisite for any geologic study, regardless of the objectives. Land-based studies commonly utilize aerial photographs, USGS 7.5-minute quadrangle maps, and satellite images as base maps. Until now, studies that involve the ocean floor have been at a disadvantage due to an almost complete lack of accurate marine base maps. Many base maps of the sea floor have been constructed over the past century, but with a wide range in navigational and depth accuracies. Only in the past few years has marine surveying technology advanced far enough to produce navigational accuracy of 1 meter and depth resolutions of 50 centimeters. The Pacific Seafloor Mapping Project of the U.S. Geological Survey's Western Coastal and Marine Geology Program, Menlo Park, California, U.S.A., in cooperation with the Ocean Mapping Group, University of New Brunswick, Fredericton, Canada, is using this new technology to systematically map the ocean floor and lakes. This type of marine surveying, called multibeam surveying, collects high-resolution bathymetric and backscatter data that can be used for various base maps, GIS coverages, and scientific visualization methods. This is an interactive CD-ROM that contains images, movies, and data from all the surveys the Pacific Seafloor Mapping Project has completed up to January 1999. The images and movies on this CD-ROM, such as shaded relief of the bathymetry, backscatter, oblique views, 3-D views, and QuickTime movies, help the viewer to visualize the multibeam data. This CD-ROM also contains ARC/INFO export (.e00) files and full-resolution TIFF images of all the survey sites that can be downloaded and used in many GIS packages.

  11. Kodak's New Photo CD Portfolio: Multimedia for the Rest of Us.

    ERIC Educational Resources Information Center

    Bonime, Andrew

    1994-01-01

    Describes Photo CD Portfolio, an Eastman Kodak product that provides interactive multimedia CD-ROM production capability. The article focuses on the capabilities of the tool's simplest authoring system, Create It, which allows users to work with Photo CD, PICT, or TIFF images, add graphics, text and audio, and create menus with branching. (KRN)

  12. Global Multi-Resolution Topography (GMRT) Synthesis - Recent Updates and Developments

    NASA Astrophysics Data System (ADS)

    Ferrini, V. L.; Morton, J. J.; Celnick, M.; McLain, K.; Nitsche, F. O.; Carbotte, S. M.; O'hara, S. H.

    2017-12-01

    The Global Multi-Resolution Topography (GMRT, http://gmrt.marine-geo.org) synthesis is a multi-resolution compilation of elevation data that is maintained in Mercator, South Polar, and North Polar projections. GMRT consists of four independently curated elevation components: (1) quality-controlled multibeam data (~100 m res.), (2) contributed high-resolution gridded bathymetric data (0.5-200 m res.), (3) ocean basemap data (~500 m res.), and (4) variable-resolution land elevation data (to 10-30 m res. in places). Each component is managed and updated as new content becomes available, with two scheduled releases each year. The ocean basemap content for GMRT includes the International Bathymetric Chart of the Arctic Ocean (IBCAO), the International Bathymetric Chart of the Southern Ocean (IBCSO), and GEBCO 2014. Most curatorial effort for GMRT is focused on the swath bathymetry component, with an emphasis on data from the US Academic Research Fleet. As of July 2017, GMRT includes data processed and curated by the GMRT Team from 974 research cruises, covering over 29 million square kilometers (~8%) of the seafloor at 100 m resolution. The curated swath bathymetry data from GMRT is routinely contributed to international data synthesis efforts including GEBCO and IBCSO. Additional curatorial effort is associated with gridded data contributions from the international community and ensures that these data are well blended in the synthesis. Significant new additions to the gridded data component this year include the recently released data from the search for MH370 (Geoscience Australia) as well as a large high-resolution grid from the Gulf of Mexico derived from 3D seismic data (US Bureau of Ocean Energy Management). Recent developments in functionality include the deployment of a new Polar GMRT MapTool, which enables users to export custom grids and map images in polar projection for their selected area of interest at the resolution of their choosing.
Available for both the south and north polar regions, grids can be exported from GMRT in a variety of formats including ASCII, GeoTIFF and NetCDF to support use in common mapping software applications such as ArcGIS, GMT, Matlab, and Python. New web services have also been developed to enable programmatic access to grids and images in north and south polar projections.

  13. The effect of horizontal resolution on simulation quality in the Community Atmospheric Model, CAM5.1

    DOE PAGES

    Wehner, Michael F.; Reed, Kevin A.; Li, Fuyu; ...

    2014-10-13

    We present an analysis of version 5.1 of the Community Atmospheric Model (CAM5.1) at a high horizontal resolution. Intercomparison of this global model at approximately 0.25°, 1°, and 2° is presented for extreme daily precipitation as well as for a suite of seasonal mean fields. In general, extreme precipitation amounts are larger in high resolution than in lower-resolution configurations. In many but not all locations and/or seasons, extreme daily precipitation rates in the high-resolution configuration are higher and more realistic. The high-resolution configuration produces tropical cyclones up to category 5 on the Saffir-Simpson scale, and a comparison to observations reveals both realistic and unrealistic model behavior. In the absence of extensive model tuning at high resolution, simulation of many of the mean fields analyzed in this study is degraded compared to the tuned lower-resolution public released version of the model.

  14. Variability along the Atlantic water pathway in the forced Norwegian Earth System Model

    NASA Astrophysics Data System (ADS)

    Langehaug, H. R.; Sandø, A. B.; Årthun, M.; Ilıcak, M.

    2018-03-01

    The growing attention on mechanisms that can provide predictability on interannual-to-decadal time scales makes it necessary to identify how well climate models represent such mechanisms. In this study we use a high-resolution (0.25° horizontal grid) and a medium-resolution (1°) version of a forced global ocean-sea ice model, utilising the Norwegian Earth System Model, to assess the impact of increased ocean resolution. Our target is the simulation of temperature and salinity anomalies along the pathway of warm Atlantic water in the subpolar North Atlantic and the Nordic Seas. Although the high-resolution version has larger biases in general at the ocean surface, the poleward propagation of thermohaline anomalies is better resolved in this version, i.e., the time for an anomaly to travel northward is more similar to observation-based estimates. The extent of these anomalies can be rather large in both model versions, as also seen in observations, e.g., stretching from Scotland to northern Norway. The easternmost branch into the Nordic and Barents Seas, carrying warm Atlantic water, is also improved by higher resolution, both in terms of mean heat transport and variability in thermohaline properties. A more detailed assessment of the link between the North Atlantic Ocean circulation and the thermohaline anomalies at the entrance of the Nordic Seas reveals that the high resolution is more consistent with previously published mechanisms. This suggests better dynamics and variability in the subpolar region and the Nordic Seas in the high resolution compared to the medium resolution, most likely due to a better representation of the mean circulation in the studied region. As the poleward propagation of ocean heat anomalies is considered to be a key source of climate predictability, we recommend that the methodology presented herein be applied to coupled climate models that are used for climate prediction.

  15. Representations of the Stratospheric Polar Vortices in Versions 1 and 2 of the Goddard Earth Observing System Chemistry-Climate Model (GEOS CCM)

    NASA Technical Reports Server (NTRS)

    Pawson, S.; Stolarski, R.S.; Nielsen, J.E.; Perlwitz, J.; Oman, L.; Waugh, D.

    2009-01-01

    This study documents the behavior of the polar vortices in two versions of the GEOS CCM. Both versions of the model include the same stratospheric chemistry; they differ in the underlying circulation model. Version 1 of the GEOS CCM is based on the Goddard Earth Observing System, Version 4, general circulation model, which includes the finite-volume (Lin-Rood) dynamical core and physical parameterizations from the Community Climate Model, Version 3. GEOS CCM Version 2 is based on the GEOS-5 GCM, which includes a different tropospheric physics package. Baseline simulations of both models, performed at two-degree spatial resolution, show some improvements in Version 2, but also some degradation. In the Antarctic, both models show an over-persistent stratospheric polar vortex with late breakdown, but the year-to-year variations that are overestimated in Version 1 are more realistic in Version 2. The implications of this for the interactions with tropospheric climate, the Southern Annular Mode, will be discussed. In the Arctic, both model versions show a dominant dynamically forced variability, but Version 2 has a persistent warm bias in the low stratosphere, and there are seasonal differences in the simulations. These differences will be quantified in terms of climate change and ozone loss. Impacts of model resolution, using simulations at one-degree and half-degree, and changes in physical parameterizations (especially the gravity wave drag) will be discussed.

  16. NAVAIR Portable Source Initiative (NPSI) Standard for Reusable Source Dataset Metadata (RSDM) V2.4

    DTIC Science & Technology

    2012-09-26

    defining a raster file format: <RasterFileFormat> <FormatName>TIFF</FormatName> <Order>BIP</Order> <DataType>8-BIT_UNSIGNED</DataType> ...interleaved by line (BIL); Band interleaved by pixel (BIP). element RasterFileFormatType/DataType diagram type restriction of xsd:string facets
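    The interleaving orders named in the record (BSQ, BIL, BIP) simply describe how the band, row, and pixel loops nest when a raster is flattened to a byte stream. A minimal sketch for a tiny image, with made-up sample values:

```python
def interleave(bands, order):
    """Flatten a list of equally sized 2-D bands (lists of rows) into one sequence.

    BSQ: band-sequential; BIL: band-interleaved-by-line;
    BIP: band-interleaved-by-pixel (the order used in the record above).
    """
    nbands, rows, cols = len(bands), len(bands[0]), len(bands[0][0])
    if order == "BSQ":
        return [bands[b][r][c] for b in range(nbands) for r in range(rows) for c in range(cols)]
    if order == "BIL":
        return [bands[b][r][c] for r in range(rows) for b in range(nbands) for c in range(cols)]
    if order == "BIP":
        return [bands[b][r][c] for r in range(rows) for c in range(cols) for b in range(nbands)]
    raise ValueError(order)

# Two 2x2 bands: band 0 holds values 0..3, band 1 holds 10..13
b0 = [[0, 1], [2, 3]]
b1 = [[10, 11], [12, 13]]
```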

  17. The GEOS-5 Atmospheric General Circulation Model: Mean Climate and Development from MERRA to Fortuna

    NASA Technical Reports Server (NTRS)

    Molod, Andrea; Takacs, Lawrence; Suarez, Max; Bacmeister, Julio; Song, In-Sun; Eichmann, Andrew

    2012-01-01

    This report is a documentation of the Fortuna version of the GEOS-5 Atmospheric General Circulation Model (AGCM). The GEOS-5 AGCM is currently in use in the NASA Goddard Modeling and Assimilation Office (GMAO) for simulations at a wide range of resolutions, in atmosphere-only, coupled ocean-atmosphere, and data assimilation modes. The focus here is on the development subsequent to the version that was used as part of NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA). We present here the results of a series of 30-year atmosphere-only simulations at different resolutions, with focus on the behavior of the 1-degree resolution simulation. The details of the changes in parameterizations subsequent to the MERRA model version are outlined, and results of a series of 30-year, atmosphere-only climate simulations at 2-degree resolution are shown to demonstrate changes in simulated climate associated with specific changes in parameterizations. The GEOS-5 AGCM presented here is the model used for the GMAO's atmosphere-only and coupled CMIP-5 simulations.

  18. Development and assessment of a higher-spatial-resolution (4.4 km) MISR aerosol optical depth product using AERONET-DRAGON data

    NASA Astrophysics Data System (ADS)

    Garay, Michael J.; Kalashnikova, Olga V.; Bull, Michael A.

    2017-04-01

    Since early 2000, the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite has been acquiring data that have been used to produce aerosol optical depth (AOD) and particle property retrievals at 17.6 km spatial resolution. Capitalizing on the capabilities provided by multi-angle viewing, the current operational (Version 22) MISR algorithm performs well, with about 75 % of MISR AOD retrievals globally falling within 0.05 or 20 % × AOD of paired validation data from the ground-based Aerosol Robotic Network (AERONET). This paper describes the development and assessment of a prototype version of a higher-spatial-resolution 4.4 km MISR aerosol optical depth product compared against multiple AERONET Distributed Regional Aerosol Gridded Observations Network (DRAGON) deployments around the globe. In comparisons with AERONET-DRAGON AODs, the 4.4 km resolution retrievals show improved correlation (r = 0.9595), smaller RMSE (0.0768), reduced bias (-0.0208), and a larger fraction within the expected error envelope (80.92 %) relative to the Version 22 MISR retrievals.
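    The validation statistic quoted here (fraction of retrievals within 0.05 or 20 % × AOD of AERONET) is straightforward to compute; one common reading takes the larger of the two tolerances. The sample AOD pairs below are made up for illustration:

```python
def fraction_in_envelope(misr_aod, aeronet_aod, abs_tol=0.05, rel_tol=0.20):
    """Fraction of pairs with |MISR - AERONET| <= max(abs_tol, rel_tol * AERONET)."""
    hits = sum(
        1 for m, a in zip(misr_aod, aeronet_aod)
        if abs(m - a) <= max(abs_tol, rel_tol * a)
    )
    return hits / len(misr_aod)

# Illustrative retrieval pairs (not real MISR/AERONET data)
misr    = [0.10, 0.30, 0.55, 0.90]
aeronet = [0.12, 0.24, 0.50, 0.60]
frac = fraction_in_envelope(misr, aeronet)
```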

  19. Introducing MISR Version 23: Resolution and Content Improvements to MISR Aerosol and Land Surface Product

    NASA Astrophysics Data System (ADS)

    Garay, M. J.; Bull, M. A.; Witek, M. L.; Diner, D. J.; Seidel, F.

    2017-12-01

    Since early 2000, the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite has been providing operational Level 2 (swath-based) aerosol optical depth (AOD) and particle property retrievals at 17.6 km spatial resolution and atmospherically corrected land surface products at 1.1 km resolution. A major, multi-year development effort has led to the release of updated operational MISR Level 2 aerosol and land surface retrieval products. The spatial resolution of the aerosol product has been increased to 4.4 km, allowing more detailed characterization of aerosol spatial variability, especially near local sources and in urban areas. The product content has been simplified and updated to include more robust measures of retrieval uncertainty and other fields to benefit users. The land surface product has also been updated to incorporate the Version 23 aerosol product as input and to improve spatial coverage, particularly over mountainous terrain and snow/ice-covered surfaces. We will describe the major upgrades incorporated in Version 23, present validation of the aerosol product, and describe some of the applications enabled by these product updates.

  20. New version of 1 km global river flood hazard maps for the next generation of Aqueduct Global Flood Analyzer

    NASA Astrophysics Data System (ADS)

    Sutanudjaja, Edwin; van Beek, Rens; Winsemius, Hessel; Ward, Philip; Bierkens, Marc

    2017-04-01

    The Aqueduct Global Flood Analyzer, launched in 2015, is an open-access, free-of-charge, web-based interactive platform that assesses and visualises current and future projections of river flood impacts across the globe. One of the key components of the Analyzer is a set of river flood inundation hazard maps derived from global hydrological model simulations with PCR-GLOBWB. For the current version of the Analyzer, accessible at http://floods.wri.org/#/, the early generation of PCR-GLOBWB 1.0 was used and simulated at 30 arc-minute (~50 km at the equator) resolution. In this presentation, we will show the new version of these hazard maps. This new version is based on the latest version of PCR-GLOBWB 2.0 (https://github.com/UU-Hydro/PCR-GLOBWB_model, Sutanudjaja et al., 2016, doi:10.5281/zenodo.60764) simulated at 5 arc-minute (~10 km at the equator) resolution. The model simulates daily hydrological and water resource fluxes and storages, including the overbank volume that ends up on the floodplain if flooding occurs. The simulation was performed for the present-day situation (from 1960) and future climate projections (until 2099) using the climate forcing created in the ISI-MIP project. From the simulated flood inundation volume time series, we then extract annual maxima for each cell and fit these maxima to a Gumbel extreme value distribution. This allows us to derive flood volume maps for any hazard magnitude (ranging from 2-year to 1000-year flood events) and for any time period (e.g. 1960-1999, 2010-2049, 2030-2069, and 2060-2099). The derived flood volumes (at 5 arc-minute resolution) are then spread over the high-resolution terrain model using an updated GLOFRIS downscaling module (Winsemius et al., 2013, doi:10.5194/hess-17-1871-2013). The updated module spreads volumes sequentially from upstream to downstream basins, enabling better inclusion of smaller streams, and accounts for the spreading of water over diverging deltaic regions. This results in a set of high-resolution hazard maps of flood inundation depth at 30 arc-second (~1 km at the equator) resolution. Together with many other updates and new features, the resulting flood hazard maps will be used in the next generation of the Aqueduct Global Flood Analyzer.
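
    The extreme-value step described above (fit a Gumbel distribution to per-cell annual maxima, then invert it for a chosen return period) can be sketched with SciPy; the function and variable names below are illustrative, not the GLOFRIS code:

```python
import numpy as np
from scipy.stats import gumbel_r

def flood_return_levels(annual_maxima, return_periods=(2, 10, 100, 1000)):
    """Fit a Gumbel distribution to annual flood-volume maxima and return
    the flood volume associated with each return period (in years)."""
    loc, scale = gumbel_r.fit(annual_maxima)
    # A T-year event is exceeded with probability 1/T in any year, so its
    # magnitude is the (1 - 1/T) quantile of the fitted distribution.
    return {T: gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
            for T in return_periods}
```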

  1. Integrated Multibeam and LIDAR Bathymetry Data Offshore of New London and Niantic, Connecticut

    USGS Publications Warehouse

    Poppe, L.J.; Danforth, W.W.; McMullen, K.Y.; Parker, Castle E.; Lewit, P.G.; Doran, E.F.

    2010-01-01

    Nearshore areas within Long Island Sound are of great interest to the Connecticut and New York research and resource management communities because of their ecological, recreational, and commercial importance. Although advances in multibeam echosounder technology permit the construction of high-resolution representations of sea-floor topography in deeper waters, limitations inherent in collecting fixed-angle multibeam data make using this technology in shallower waters (less than 10 meters deep) difficult and expensive. These limitations have often resulted in data gaps between areas for which multibeam bathymetric datasets are available and the adjacent shoreline. To address this problem, the geospatial data sets released in this report seamlessly integrate complete-coverage multibeam bathymetric data acquired off New London and Niantic Bay, Connecticut, with hydrographic Light Detection and Ranging (LIDAR) data acquired along the nearshore. The result is a more continuous sea floor representation and a much smaller gap between the digital bathymetric data and the shoreline than previously available. These data sets are provided online and on CD-ROM in Environmental Systems Research Institute (ESRI) raster-grid and GeoTIFF formats in order to facilitate access, compatibility, and utility.
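
    Part of what makes GeoTIFF releases like this convenient is that the georeferencing travels with the file as a simple affine transform. As an illustration (not code from this report), a GDAL-style six-parameter geotransform maps pixel indices to map coordinates like this:

```python
def pixel_to_map(gt, col, row):
    """Convert a pixel position (col, row) to map coordinates using a
    GDAL-style geotransform tuple:
    (origin_x, pixel_width, row_rotation, origin_y, col_rotation, pixel_height).
    pixel_height is negative for north-up rasters."""
    x = gt[0] + col * gt[1] + row * gt[2]
    y = gt[3] + col * gt[4] + row * gt[5]
    return x, y

# A north-up global quarter-degree grid (illustrative values):
gt = (-180.0, 0.25, 0.0, 90.0, 0.0, -0.25)
```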

  2. NASA Earth Observations (NEO): Data Imagery for Education and Visualization

    NASA Astrophysics Data System (ADS)

    Ward, K.

    2008-12-01

    NASA Earth Observations (NEO) has dramatically simplified public access to georeferenced imagery of NASA remote sensing data. NEO targets the non-traditional data users who are currently underserved by functionality and formats available from the existing data ordering systems. These users include formal and informal educators, museum and science center personnel, professional communicators, and citizen scientists. NEO currently serves imagery from 45 different datasets with daily, weekly, and/or monthly temporal resolutions, with more datasets currently under development. The imagery from these datasets is produced in coordination with several data partners who are affiliated either with the instrument science teams or with the respective data processing center. NEO is a system of three components -- website, WMS (Web Mapping Service), and ftp archive -- which together are able to meet the wide-ranging needs of our users. Some of these needs include the ability to: view and manipulate imagery using the NEO website -- e.g., applying color palettes, resizing, exporting to a variety of formats including PNG, JPEG, KMZ (Google Earth), GeoTIFF; access the NEO collection via a standards-based API (WMS); and create customized exports for select users (ftp archive) such as Science on a Sphere, NASA's Earth Observatory, and others.
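
    The WMS component can be exercised with a standard GetMap request; the sketch below assembles one (the endpoint and layer name are placeholders, not NEO's actual values):

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height, fmt="image/png"):
    """Build a WMS 1.1.1 GetMap request URL such as a client might send
    to a WMS endpoint. Base URL and layer name here are placeholders."""
    params = {
        "service": "WMS",
        "version": "1.1.1",
        "request": "GetMap",
        "layers": layer,
        "srs": "EPSG:4326",                      # geographic lat/lon
        "bbox": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "width": width,
        "height": height,
        "format": fmt,
    }
    return base_url + "?" + urlencode(params)
```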

  3. Erratum: The Effects of Thermal Energetics on Three-dimensional Hydrodynamic Instabilities in Massive Protostellar Disks. II. High-Resolution and Adiabatic Evolutions

    NASA Astrophysics Data System (ADS)

    Pickett, Brian K.; Cassen, Patrick; Durisen, Richard H.; Link, Robert

    2000-02-01

    In the paper ``The Effects of Thermal Energetics on Three-dimensional Hydrodynamic Instabilities in Massive Protostellar Disks. II. High-Resolution and Adiabatic Evolutions'' by Brian K. Pickett, Patrick Cassen, Richard H. Durisen, and Robert Link (ApJ, 529, 1034 [2000]), the wrong version of Figure 10 was published as a result of an error at the Press. The correct version of Figure 10 appears below. The Press sincerely regrets this error.

  4. Adapting generalization tools to physiographic diversity for the United States National Hydrography Dataset

    USGS Publications Warehouse

    Buttenfield, B.P.; Stanislawski, L.V.; Brewer, C.A.

    2011-01-01

    This paper reports on generalization and data modeling to create reduced-scale versions of the National Hydrography Dataset (NHD) for dissemination through The National Map, the primary data delivery portal for USGS. Our approach distinguishes local differences in physiographic factors, to demonstrate that knowledge about varying terrain (mountainous, hilly, or flat) and varying climate (dry or humid) can support decisions about algorithms, parameters, and processing sequences to create generalized, smaller-scale data versions that preserve distinct hydrographic patterns in these regions. We work with multiple subbasins of the NHD that provide a range of terrain and climate characteristics. Specifically tailored generalization sequences are used to create simplified versions of the high resolution data, which were compiled for 1:24,000 scale mapping. Results are evaluated cartographically and metrically against a medium resolution benchmark version compiled for 1:100,000, developing coefficients of linear and areal correspondence.

  5. SU-E-T-497: Semi-Automated in Vivo Radiochromic Film Dosimetry Using a Novel Image Processing Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reyhan, M; Yue, N

    Purpose: To validate an automated image processing algorithm designed to detect the center of radiochromic film used for in vivo film dosimetry against the current gold standard of manual selection. Methods: An image processing algorithm was developed to automatically select the region of interest (ROI) in *.tiff images that contain multiple pieces of radiochromic film (0.5 × 1.3 cm²). After a user has linked a calibration file to the processing algorithm and selected a *.tiff file for processing, an ROI is automatically detected for all films by a combination of thresholding and erosion, which removes edges and any additional markings for orientation. Calibration is applied to the mean pixel values from the ROIs, and a *.tiff image is output displaying the original image with an overlay of the ROIs and the measured doses. Validation of the algorithm was determined by comparing in vivo dose determined using the current gold standard (manually drawn ROIs) versus automated ROIs for n=420 scanned films. Bland-Altman analysis, paired t-test, and linear regression were performed to demonstrate agreement between the processes. Results: The measured doses ranged from 0.2-886.6 cGy. Bland-Altman analysis of the two techniques (automatic minus manual) revealed a bias of -0.28 cGy and a 95% confidence interval of (-6.1 cGy, 5.5 cGy). These values demonstrate excellent agreement between the two techniques. Paired t-test results showed no statistical differences between the two techniques, p=0.98. Linear regression with a forced zero intercept demonstrated that Automatic = 0.997 × Manual, with a Pearson correlation coefficient of 0.999. The minimal differences between the two techniques may be explained by the fact that the hand-drawn ROIs were not identical to the automatically selected ones. The average processing time was 6.7 seconds in Matlab on an Intel Core 2 Duo processor. 
    Conclusion: An automated image processing algorithm has been developed and validated, which will help minimize user interaction and processing time of radiochromic film used for in vivo dosimetry.
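
    The threshold-and-erode ROI detection described in Methods can be sketched with scipy.ndimage; this is a minimal illustrative reconstruction (the threshold value and the linear calibration function are assumptions, not the published algorithm's values):

```python
import numpy as np
from scipy import ndimage

def film_roi_doses(image, threshold, calibration):
    """Detect film pieces in a scanned image and estimate a dose per piece.

    `image` is a 2-D array of pixel values (film darker than background),
    `threshold` separates film from background, and `calibration` maps a
    mean pixel value to dose (cGy)."""
    mask = image < threshold                            # film pixels are darker
    mask = ndimage.binary_erosion(mask, iterations=2)   # trim edges/markings
    labels, n = ndimage.label(mask)                     # one label per piece
    return [calibration(image[labels == i].mean()) for i in range(1, n + 1)]
```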

  6. Left Limb of North Pole of the Sun, March 20, 2007

    NASA Technical Reports Server (NTRS)

    2007-01-01

    [figures removed for brevity, see original site] Figure 1: Left-eye view of a stereo pair. Click on the image for the full-resolution TIFF. Figure 2: Right-eye view of a stereo pair. Click on the image for the full-resolution TIFF. Figure 1: This image was taken by the SECCHI Extreme UltraViolet Imager (EUVI) mounted on the STEREO-B spacecraft. STEREO-B is located behind the Earth and follows the Earth in its orbit around the Sun. This location enables us to view the Sun from the position of a virtual left eye in space. Figure 2: This image was taken by the SECCHI Extreme UltraViolet Imager (EUVI) mounted on the STEREO-A spacecraft. STEREO-A is located ahead of the Earth and leads the Earth in its orbit around the Sun. This location enables us to view the Sun from the position of a virtual right eye in space.

    NASA's Solar TErrestrial RElations Observatory (STEREO) satellites have provided the first three-dimensional images of the Sun. For the first time, scientists will be able to see structures in the Sun's atmosphere in three dimensions. The new view will greatly aid scientists' ability to understand solar physics and thereby improve space weather forecasting.

    The EUVI imager is sensitive to wavelengths of light in the extreme ultraviolet portion of the spectrum. EUVI bands at wavelengths of 304, 171, and 195 Angstroms have been mapped to the red, blue, and green visible portions of the spectrum and processed to emphasize the temperature differences of the solar material.

    STEREO, a two-year mission launched in October 2006, will provide a unique and revolutionary view of the Sun-Earth system. The two nearly identical observatories -- one ahead of Earth in its orbit, the other trailing behind -- will trace the flow of energy and matter from the Sun to Earth. They will reveal the 3D structure of coronal mass ejections -- violent eruptions of matter from the Sun that can disrupt satellites and power grids -- and help us understand why they happen. STEREO will become a key addition to the fleet of space weather detection satellites by providing more accurate alerts for the arrival time of Earth-directed solar ejections with its unique side-viewing perspective.

    STEREO is the third mission in NASA's Solar Terrestrial Probes program within NASA's Science Mission Directorate, Washington. The Goddard Science and Exploration Directorate manages the mission, instruments, and science center. The Johns Hopkins University Applied Physics Laboratory, Laurel, Md., designed and built the spacecraft and is responsible for mission operations. The imaging and particle detecting instruments were designed and built by scientific institutions in the U.S., UK, France, Germany, Belgium, Netherlands, and Switzerland. JPL is a division of the California Institute of Technology in Pasadena.

  7. Enhanced Historical Land-Use and Land-Cover Data Sets of the U.S. Geological Survey

    USGS Publications Warehouse

    Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.; Clawges, Rick M.

    2007-01-01

    Historical land-use and land-cover data, available from the U.S. Geological Survey (USGS) for the conterminous United States and Hawaii, have been enhanced for use in geographic information systems (GIS) applications. The original digital data sets were created by the USGS in the late 1970s and early 1980s and were later converted by USGS and the U.S. Environmental Protection Agency (USEPA) to a geographic information system (GIS) format in the early 1990s. These data have been available on USEPA's Web site since the early 1990s and have been used for many national applications, despite minor coding and topological errors. During the 1990s, a group of USGS researchers made modifications to the data set for use in the National Water-Quality Assessment Program. These edited files have been further modified to create a more accurate, topologically clean, and seamless national data set. Several different methods, including custom editing software and several batch processes, were applied to create this enhanced version of the national data set. The data sets are included in this report in the commonly used shapefile and Tagged Image File Format (TIFF) formats. In addition, this report includes two polygon data sets (in shapefile format) representing (1) land-use and land-cover source documentation extracted from the previously published USGS data files, and (2) the extent of each polygon data file.

  8. Middle atmosphere simulated with high vertical and horizontal resolution versions of a GCM: Improvements in the cold pole bias and generation of a QBO-like oscillation in the tropics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamilton, K.; Wilson, R.J.; Hemler, R.S.

    1999-11-15

    The large-scale circulation in the Geophysical Fluid Dynamics Laboratory SKYHI troposphere-stratosphere-mesosphere finite-difference general circulation model is examined as a function of vertical and horizontal resolution. The experiments examined include one with horizontal grid spacing of ~35 km and another with ~100 km horizontal grid spacing but very high vertical resolution (160 levels between the ground and about 85 km). The simulation of the middle-atmospheric zonal-mean winds and temperatures in the extratropics is found to be very sensitive to horizontal resolution. For example, in the early Southern Hemisphere winter the South Pole near 1 mb in the model is colder than observed, but the bias is reduced with improved horizontal resolution (from ~70°C in a version with ~300 km grid spacing to less than 10°C in the ~35 km version). The extratropical simulation is found to be only slightly affected by enhancements of the vertical resolution. By contrast, the tropical middle-atmospheric simulation is extremely dependent on the vertical resolution employed. With level spacing in the lower stratosphere of ~1.5 km, the lower stratospheric zonal-mean zonal winds in the equatorial region are nearly constant in time. When the vertical resolution is doubled, the simulated stratospheric zonal winds exhibit a strong equatorially centered oscillation with downward propagation of the wind reversals and with formation of strong vertical shear layers. This appears to be a spontaneous internally generated oscillation and closely resembles the observed QBO in many respects, although the simulated oscillation has a period less than half that of the real QBO.

  9. Exploring the impacts of physics and resolution on aqua-planet simulations from a nonhydrostatic global variable-resolution modeling framework: IMPACTS OF PHYSICS AND RESOLUTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Chun; Leung, L. Ruby; Park, Sang-Hun

    Advances in computing resources are gradually moving regional and global numerical forecasting simulations towards sub-10 km resolution, but global high-resolution climate simulations remain a challenge. The non-hydrostatic Model for Prediction Across Scales (MPAS) provides a global framework to achieve very high resolution using regional mesh refinement. Previous studies using the hydrostatic version of MPAS (H-MPAS) with the physics parameterizations of Community Atmosphere Model version 4 (CAM4) found notable resolution-dependent behaviors. This study revisits the resolution sensitivity using the non-hydrostatic version of MPAS (NH-MPAS) with both CAM4 and CAM5 physics. A series of aqua-planet simulations at global quasi-uniform resolutions ranging from 240 km to 30 km and global variable-resolution simulations with a regional mesh refinement of 30 km resolution over the tropics are analyzed, with a primary focus on the distinct characteristics of NH-MPAS in simulating precipitation, clouds, and large-scale circulation features compared to H-MPAS-CAM4. The resolution sensitivity of total precipitation and column-integrated moisture in NH-MPAS is smaller than that in H-MPAS-CAM4. This contributes importantly to the reduced resolution sensitivity of large-scale circulation features such as the inter-tropical convergence zone and Hadley circulation in NH-MPAS compared to H-MPAS. In addition, NH-MPAS shows almost no resolution sensitivity in the simulated westerly jet, in contrast to the obvious poleward shift in H-MPAS with increasing resolution, which is partly explained by differences in the hyperdiffusion coefficients used in the two models that influence wave activity. 
    With the reduced resolution sensitivity, simulations in the refined region of the NH-MPAS global variable-resolution configuration exhibit zonally symmetric features that are more comparable to the quasi-uniform high-resolution simulations than those from H-MPAS, which displays zonal asymmetry in simulations inside the refined region. Overall, NH-MPAS with CAM5 physics shows less resolution sensitivity compared to CAM4. These results provide a reference for future studies to further explore the use of NH-MPAS for high-resolution climate simulations in idealized and realistic configurations.

  10. Paleozoic and Mesozoic GIS data from the Geologic Atlas of the Rocky Mountain Region: Volume 1

    USGS Publications Warehouse

    Graeber, Aimee; Gunther, Gregory

    2017-01-01

    The Rocky Mountain Association of Geologists (RMAG) is, once again, publishing portions of the 1972 Geologic Atlas of the Rocky Mountain Region (Mallory, ed., 1972) as a geospatial map and data package. Georeferenced TIFF (GeoTIFF) images of map figures from this atlas have served as the basis for these data products. Shapefiles and file geodatabase features have been generated and cartographically represented for select pages from the following chapters:
    • Phanerozoic Rocks (page 56)
    • Cambrian System (page 63)
    • Ordovician System (pages 78 and 79)
    • Silurian System (pages 87-89)
    • Devonian System (pages 93, 94, and 96-98)
    • Mississippian System (pages 102 and 103)
    • Pennsylvanian System (pages 114 and 115)
    • Permian System (pages 146 and 149-154)
    • Triassic System (pages 168 and 169)
    • Jurassic System (pages 179 and 180)
    • Cretaceous System (pages 197-201, 207-210, 215-218, 221, 222, 224, 225, and 227)
    The primary purpose of this publication is to provide regional-scale, as well as local-scale, geospatial data of the Rocky Mountain Region for use in geoscience studies. An important aspect of this interactive map product is that it does not require extensive GIS experience or highly specialized software.

  11. Publisher Correction: Quantum engineering of transistors based on 2D materials heterostructures

    NASA Astrophysics Data System (ADS)

    Iannaccone, Giuseppe; Bonaccorso, Francesco; Colombo, Luigi; Fiori, Gianluca

    2018-06-01

    In the version of this Perspective originally published, the surname of author Giuseppe Iannaccone was incorrectly given as "innaconne" in his email address; this has now been corrected in all versions of the Perspective. Also, an error in the production process led to Figs. 1, 2 and 3 being of low resolution; these have now been replaced with higher-quality versions.

  12. Analysis of Ultra High Resolution Sea Surface Temperature Level 4 Datasets

    NASA Technical Reports Server (NTRS)

    Wagner, Grant

    2011-01-01

    Sea surface temperature (SST) studies are often focused on improving accuracy, or understanding and quantifying uncertainties in the measurement, as SST is a leading indicator of climate change and represents the longest time series of any ocean variable observed from space. Over the past several decades SST has been studied with the use of satellite data. This allows a larger area to be studied with much more frequent measurements being taken than direct measurements collected aboard ships or buoys. The Group for High Resolution Sea Surface Temperature (GHRSST) is an international project that distributes satellite-derived sea surface temperature (SST) data from multiple platforms and sensors. The goal of the project is to distribute these SSTs for operational uses such as ocean model assimilation and decision support applications, as well as support fundamental SST research and climate studies. Examples of near-real-time applications include hurricane and fisheries studies and numerical weather forecasting. The JPL group has produced a new 1 km daily global Level 4 SST product, the Multiscale Ultrahigh Resolution (MUR), that blends SST data from 3 distinct NASA radiometers: the Moderate Resolution Imaging Spectroradiometer (MODIS), the Advanced Very High Resolution Radiometer (AVHRR), and the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E). This new product requires further validation and accuracy assessment, especially in coastal regions. We examined the accuracy of the new MUR SST product by comparing the high resolution version and a lower resolution version that has been smoothed to 19 km (but still gridded to 1 km). Both versions were compared to the same data set of in situ buoy temperature measurements with a focus on study regions of the oceans surrounding North and Central America as well as two smaller regions around the Gulf Stream and California coast. 
Ocean fronts exhibit high temperature gradients (Roden, 1976), and thus satellite data of SST can be used in the detection of these fronts. In this case, accuracy is less of a concern because the primary focus is on the spatial derivative of SST. We calculated the gradients for both versions of the MUR data set and did statistical comparisons focusing on the same regions.
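
    Since front detection hinges on the spatial derivative of SST rather than its absolute value, the core computation is just a gradient magnitude on the gridded field; a minimal NumPy sketch (grid spacings and names are illustrative):

```python
import numpy as np

def sst_gradient_magnitude(sst, dy_km, dx_km):
    """Magnitude of the horizontal SST gradient (degrees C per km) on a
    regular grid; large values flag candidate ocean fronts."""
    dsst_dy, dsst_dx = np.gradient(sst, dy_km, dx_km)
    return np.hypot(dsst_dx, dsst_dy)
```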

  13. Distant Supernova Remnant Imaged by Chandra's High Resolution Camera

    NASA Astrophysics Data System (ADS)

    1999-09-01

    The High Resolution Camera (HRC), one of the two X-ray cameras on NASA's Chandra X-ray Observatory, was placed into the focus for the first time on Monday, August 30. The first target was LMC X-1, a point-like source of X rays in the Large Magellanic Cloud. The Large Magellanic Cloud, a companion galaxy to the Milky Way, is 160,000 light years from Earth. After checking the focus with LMC X-1, Chandra observed N132D, a remnant of an exploded star in the Large Magellanic Cloud. "These were preliminary test observations," emphasized Dr. Stephen Murray, of the Harvard-Smithsonian Center for Astrophysics, principal investigator for the High Resolution Camera. "But we are very pleased with the results. All indications are that the HRC will produce X-ray images of unprecedented clarity." The N132D image shows a highly structured remnant, or shell, of 10-million-degree gas that is 80 light years across. Such a shell in the vicinity of the Sun would encompass more than fifty nearby stars. The amount of material in the N132D hot gas remnant is equal to that of 600 suns. The N132D supernova remnant appears to be colliding with a giant molecular cloud, which produces the brightening on the southern rim of the remnant. The molecular cloud, visible with a radio telescope, has the mass of 300,000 suns. The relatively weak x-radiation on the upper left shows that the shock wave is expanding into a less dense region on the edge of the molecular cloud. A number of small circular structures are visible in the central regions and a hint of a large circular loop can be seen in the upper part of the remnant. Whether the peculiar shape of the supernova remnant can be fully explained in terms of these effects, or whether they point to a peculiar cylindrically shaped explosion remains to be seen. "The image is so rich in structure that it will take a while to sort out what is really going on," Murray said. 
"It could be multiple supernovas, or absorbing clouds in the vicinity of the supernova." The unique capabilities of the HRC stem from the close match of its imaging capability to the focusing power of the mirrors. When used with the Chandra mirrors, the HRC will make images that reveal detail as small as one-half an arc second. This is equivalent to the ability to read a stop sign at a distance of twelve miles. The checkout period for the HRC will continue for the next few weeks, during which time the team expects to acquire images of other supernova remnants, star clusters, and starburst galaxies. To follow Chandra's progress, visit the Chandra News Web site at: http://chandra.harvard.edu AND http://chandra.nasa.gov NASA's Marshall Space Flight Center in Huntsville, Alabama, manages the Chandra X-ray Observatory for NASA's Office of Space Science, NASA Headquarters, Washington, D.C. The Smithsonian Astrophysical Observatory's Chandra X-ray Center in Cambridge, Mass., manages the Chandra science program and controls the observatory for NASA. TRW Space and Electronics Group of Redondo Beach, Calif., leads the contractor team that built Chandra. High resolution digital versions of the X-ray image (300 dpi JPG, TIFF) and other information associated with this release are available on the Internet at: http://chandra.harvard.edu/photo/0050/ or via links in: http://chandra.harvard.edu

  14. Demonstration of Airborne Wide Area Assessment Technologies at Pueblo Precision Bombing Ranges, Colorado. Hyperspectral Imaging, Version 2.0

    DTIC Science & Technology

    2007-09-27

    ...the spatial and spectral resolution... variety of geological and vegetation mapping efforts, the HyMap sensor offered the best available combination of spectral and spatial resolution, signal... The limitations of the technology currently relate to spatial and spectral resolution and geo-correction accuracy. Secondly, HSI datasets...

  15. The Mars Climate Database (MCD version 5.2)

    NASA Astrophysics Data System (ADS)

    Millour, E.; Forget, F.; Spiga, A.; Navarro, T.; Madeleine, J.-B.; Montabone, L.; Pottier, A.; Lefevre, F.; Montmessin, F.; Chaufray, J.-Y.; Lopez-Valverde, M. A.; Gonzalez-Galindo, F.; Lewis, S. R.; Read, P. L.; Huot, J.-P.; Desjean, M.-C.; MCD/GCM development Team

    2015-10-01

    The Mars Climate Database (MCD) is a database of meteorological fields derived from General Circulation Model (GCM) numerical simulations of the Martian atmosphere and validated using available observational data. The MCD includes complementary post-processing schemes such as high spatial resolution interpolation of environmental data and means of reconstructing the variability thereof. We have just completed (March 2015) the generation of a new version of the MCD, MCD version 5.2.

  16. Advances in Small Pixel TES-Based X-Ray Microcalorimeter Arrays for Solar Physics and Astrophysics

    NASA Technical Reports Server (NTRS)

    Bandler, S. R.; Adams, J. S.; Bailey, C. N.; Busch, S. E.; Chervenak, J. A.; Eckart, M. E.; Ewin, A. E.; Finkbeiner, F. M.; Kelley, R. L.; Kelly, D. P.

    2012-01-01

    We are developing small-pixel transition-edge-sensor (TES) arrays for solar physics and astrophysics applications. These large-format, close-packed arrays are fabricated on solid silicon substrates and are designed to accommodate count rates of up to a few hundred counts/pixel/second at a FWHM energy resolution of approximately 2 eV at 6 keV. We have fabricated versions that utilize narrow-line planar and stripline wiring. We present measurements of the performance and uniformity of kilo-pixel arrays, incorporating TESs with single 65-micron absorbers on a 75-micron pitch, as well as versions with more than one absorber attached to the TES, 4-absorber and 9-absorber "Hydras". We have also fabricated a version of this detector optimized for lower energies and lower count-rate applications. These devices have a lower superconducting transition temperature and are operated just above the 40 mK heat sink temperature. This results in a lower heat capacity and low thermal conductance to the heat sink. With individual single pixels of this type we have achieved a FWHM energy resolution of 0.9 eV with 1.5 keV Al K x-rays, to our knowledge the first x-ray microcalorimeter with sub-eV energy resolution. The 4-absorber and 9-absorber versions of this type achieved FWHM energy resolutions of 1.4 eV and 2.1 eV at 1.5 keV, respectively. We will discuss the application of these devices for new astrophysics mission concepts.

  17. Naval Research Lab Review 1999

    DTIC Science & Technology

    1999-01-01

    ...Center offers high-quality output from computer-generated files in EPS, PostScript, PICT, TIFF, Photoshop, and PowerPoint. Photographic-quality color... research described in this NRL Review can be obtained from the Public Affairs Office, Code 1230, (202) 767-2541. Information concerning Technology...

  18. Transactions of the Army Conference on Applied Mathematics and Computing (1st) Held at Washington, DC on 9-11 May 1983

    DTIC Science & Technology

    1984-02-01

    An Introduction to Geometric Programming, Patrick D. Allen and David W. Baker... Space and Time... Zarwyn, US Army Electronics R&D Command. GEOMETRIC PROGRAMMING. SPACE AND TIME ANALYSIS IN DYNAMIC PROGRAMMING ALGORITHMS... physical and parameter space can be connected by asymptotic matching. The purpose of the asymptotic analysis is to define the simplest problems

  19. NASCAP simulation of PIX 2 experiments

    NASA Technical Reports Server (NTRS)

    Roche, J. C.; Mandell, M. J.

    1985-01-01

    The latest version of the NASCAP/LEO digital computer code used to simulate the PIX 2 experiment is discussed. NASCAP is a finite-element code and previous versions were restricted to a single fixed mesh size. As a consequence the resolution was dictated by the largest physical dimension to be modeled. The latest version of NASCAP/LEO can subdivide selected regions. This permitted the modeling of the overall Delta launch vehicle in the primary computational grid at a coarse resolution, with subdivided regions at finer resolution being used to pick up the details of the experiment module configuration. Langmuir probe data from the flight were used to estimate the space plasma density and temperature and the Delta ground potential relative to the space plasma. This information is needed for input to NASCAP. Because of the uncertainty or variability in the values of these parameters, it was necessary to explore a range around the nominal value in order to determine the variation in current collection. The flight data from PIX 2 were also compared with the results of the NASCAP simulation.

  20. Managing multiple image stacks from confocal laser scanning microscopy

    NASA Astrophysics Data System (ADS)

    Zerbe, Joerg; Goetze, Christian H.; Zuschratter, Werner

    1999-05-01

    A major goal in neuroanatomy is to obtain precise information about the functional organization of neuronal assemblies and their interconnections. Therefore, the analysis of histological sections frequently requires high resolution images in combination with an overview of the structure. To overcome this conflict we have previously introduced software for the automatic acquisition of multiple image stacks (3D-MISA) in confocal laser scanning microscopy. Here, we describe a Windows NT based software for fast and easy navigation through the multiple image stacks (MIS-browser), the visualization of individual channels and layers, and the selection of user-defined subregions. In addition, the MIS-browser provides useful tools for the visualization and evaluation of the data volume, such as brightness and contrast corrections of individual layers and channels. Moreover, it includes a maximum intensity projection, panning and zoom in/out functions within selected channels or focal planes (x/y), and tracking along the z-axis. The import module accepts any tiff format and reconstructs the original image arrangement after the user has defined the sequence of images in x/y and z and the number of channels. The implemented export module allows storage of user-defined subregions (new single image stacks) for further 3D reconstruction and evaluation.

  1. Demonstrating the Value of Near Real-time Satellite-based Earth Observations in a Research and Education Framework

    NASA Astrophysics Data System (ADS)

    Chiu, L.; Hao, X.; Kinter, J. L.; Stearn, G.; Aliani, M.

    2017-12-01

    The launch of the GOES-16 series provides an opportunity to advance near real-time applications in natural hazard detection, monitoring and warning. This study demonstrates the capability and value of receiving real-time satellite-based Earth observations over fast terrestrial networks and processing high-resolution remote sensing data in a university environment. The demonstration system includes four components: 1) near real-time data receiving and processing; 2) data analysis and visualization; 3) event detection and monitoring; and 4) information dissemination. Various tools are developed and integrated to receive and process GRB data in near real-time, produce images and value-added data products, and detect and monitor extreme weather events such as hurricanes, fires, flooding, fog, and lightning. A web-based application system is developed to disseminate near real-time satellite images and data products. The images are generated in a GIS-compatible format (GeoTIFF) to enable convenient use and integration in various GIS platforms. This study enhances the capacity for undergraduate and graduate education in Earth system and climate sciences, and related applications, to understand the basic principles and technology in real-time applications with remote sensing measurements. It also provides an integrated platform for near real-time monitoring of extreme weather events, which is helpful for various user communities.
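
    Making a raster GIS-compatible, as the GeoTIFF products above are, hinges on embedding an affine geotransform that ties pixel indices to map coordinates. A minimal sketch in Python, assuming a north-up regular lon/lat grid (the function names and the GDAL-style 6-element layout convention are the only details drawn on; they are not taken from this abstract):

```python
def geotransform(west, north, lon_res, lat_res):
    """GDAL-style 6-element affine transform for a north-up grid:
    (origin_x, pixel_width, row_rotation, origin_y, col_rotation, pixel_height).
    pixel_height is negative because rows advance southward."""
    return (west, lon_res, 0.0, north, 0.0, -lat_res)

def pixel_to_map(gt, col, row):
    """Map a (col, row) pixel index to (x, y) map coordinates."""
    x = gt[0] + col * gt[1] + row * gt[2]
    y = gt[3] + col * gt[4] + row * gt[5]
    return x, y
```

With `gt = geotransform(-130.0, 55.0, 0.02, 0.02)`, pixel (0, 0) maps to the upper-left corner (-130.0, 55.0) and pixel (10, 5) to roughly (-129.8, 54.9).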

  2. SedCT: MATLAB™ tools for standardized and quantitative processing of sediment core computed tomography (CT) data collected using a medical CT scanner

    NASA Astrophysics Data System (ADS)

    Reilly, B. T.; Stoner, J. S.; Wiest, J.

    2017-08-01

    Computed tomography (CT) of sediment cores allows for high-resolution images, three-dimensional volumes, and down core profiles. These quantitative data are generated through the attenuation of X-rays, which are sensitive to sediment density and atomic number, and are stored in pixels as relative gray scale values or Hounsfield units (HU). We present a suite of MATLAB™ tools specifically designed for routine sediment core analysis as a means to standardize and better quantify the products of CT data collected on medical CT scanners. SedCT uses a graphical interface to process Digital Imaging and Communications in Medicine (DICOM) files, stitch overlapping scanned intervals, and create down core HU profiles in a manner robust to normal coring imperfections. Utilizing a random sampling technique, SedCT reduces data size and allows for quick processing on typical laptop computers. SedCTimage uses a graphical interface to create quality tiff files of CT slices that are scaled to a user-defined HU range, preserving the quantitative nature of CT images and easily allowing for comparison between sediment cores with different HU means and variance. These tools are presented along with examples from lacustrine and marine sediment cores to highlight the robustness and quantitative nature of this method.
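
    The HU-range scaling that SedCTimage performs when writing tiff slices can be pictured as a clip-and-stretch of Hounsfield units onto 8-bit gray. A hedged sketch (the function name and the 0-255 output range are illustrative assumptions; the actual implementation is a MATLAB tool):

```python
def scale_hu(slice_hu, hu_min, hu_max):
    """Linearly map HU values in [hu_min, hu_max] onto 0-255 gray levels,
    clipping values outside the user-defined range."""
    span = hu_max - hu_min
    out = []
    for v in slice_hu:
        v = min(max(v, hu_min), hu_max)  # clip to the chosen HU window
        out.append(round(255 * (v - hu_min) / span))
    return out
```

Because the same fixed window is applied to every core, gray levels stay comparable between cores with different HU means and variance, which is the point the abstract makes.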

  3. Oil spill model coupled to an ultra-high-resolution circulation model: implementation for the Adriatic Sea

    NASA Astrophysics Data System (ADS)

    Korotenko, K.

    2003-04-01

    An ultra-high-resolution version of DieCAST was adjusted for the Adriatic Sea and coupled with an oil spill model. The hydrodynamic module was developed on the basis of the low-dissipative, fourth-order-accurate version of DieCAST with a resolution of ~2 km. The oil spill model was developed on the basis of a particle-tracking technique. The effect of evaporation is modeled with an original method developed on the basis of the pseudo-component approach. A special dialog interface of this hybrid system allows direct coupling to meteorological data collection systems and/or meteorological models. Experiments with a hypothetical oil spill are analyzed for the Northern Adriatic Sea. Results (animations) of mesoscale circulation and oil slick modeling are presented at the website http://thayer.dartmouth.edu/~cushman/adriatic/movies/

  4. Completion of the 2006 National Land Cover Database Update for the Conterminous United States

    EPA Science Inventory

    Under the organization of the Multi-Resolution Land Characteristics (MRLC) Consortium, the National Land Cover Database (NLCD) has been updated to characterize both land cover and land cover change from 2001 to 2006. An updated version of NLCD 2001 (Version 2.0) is also provided....

  5. Hyper-Resolution Groundwater Modeling using MODFLOW 6

    NASA Astrophysics Data System (ADS)

    Hughes, J. D.; Langevin, C.

    2017-12-01

    MODFLOW 6 is the latest version of the U.S. Geological Survey's modular hydrologic model. MODFLOW 6 was developed to synthesize many of the recent versions of MODFLOW into a single program, improve the way different process models are coupled, and provide an object-oriented framework for adding new types of models and packages. The object-oriented framework and underlying numerical solver make it possible to tightly couple any number of hyper-resolution models within coarser regional models. The hyper-resolution models can be used to evaluate local-scale groundwater issues that may be affected by regional-scale forcings. In MODFLOW 6, hyper-resolution meshes can be maintained as separate model datasets, similar to MODFLOW-LGR, which simplifies the development of embedded hyper-resolution models from a coarse regional model. For example, the South Atlantic Coastal Plain regional water availability model was converted from a MODFLOW-2000 model to a MODFLOW 6 model. The horizontal discretization of the original model is approximately 3,218 m x 3,218 m. Hyper-resolution models of the Aiken and Sumter County water budget areas in South Carolina with a horizontal discretization of approximately 322 m x 322 m were developed and tightly coupled to a modified version of the original coarse regional model that excluded these areas. Hydraulic property and aquifer geometry data from the coarse model were mapped to the hyper-resolution models. The discretization of the hyper-resolution models is fine enough to permit detailed analyses of the effect that changes in groundwater withdrawals in the production aquifers have on the water table and surface-water/groundwater interactions. The approach used in this analysis could be applied to other regional water availability models developed by the U.S. Geological Survey to evaluate local-scale groundwater issues.
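
    Mapping coarse-model properties onto an embedded hyper-resolution mesh amounts, in the simplest case, to replicating each coarse cell's value over the corresponding block of fine cells (here, a 3,218 m cell covering a 10 x 10 block of 322 m cells). A toy sketch, assuming a uniform refinement factor and plain nested lists; MODFLOW 6 performs this on its own grid structures:

```python
def refine(coarse, factor):
    """Replicate each coarse-grid cell value over a factor x factor
    block of fine-grid cells (piecewise-constant property mapping)."""
    fine = []
    for row in coarse:
        fine_row = [v for v in row for _ in range(factor)]  # widen columns
        fine.extend([fine_row[:] for _ in range(factor)])   # repeat rows
    return fine
```

For example, `refine([[1, 2]], 2)` yields `[[1, 1, 2, 2], [1, 1, 2, 2]]`. Real workflows would follow this with interpolation or recalibration where piecewise-constant mapping is too crude.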

  6. Teaching Resources

    Science.gov Websites

    Download links are provided for the "Why Take Physics" poster in normal-resolution JPEG, high-resolution JPEG, and a Spanish version, along with the FED newsletter article "Recruiting Physics Students in High School".

  7. Demonstration of Inexact Computing Implemented in the JPEG Compression Algorithm using Probabilistic Boolean Logic applied to CMOS Components

    DTIC Science & Technology

    2015-12-24

    Signal to Noise Ratio; SPICE, Simulation Program with Integrated Circuit Emphasis; TIFF, Tagged Image File Format; USC, University of Southern California... sources can create errors in digital circuits. These effects can be simulated using Simulation Program with Integrated Circuit Emphasis (SPICE) or... compute summary statistics. 4.1 Circuit Simulations: Noisy analog circuits can be simulated in SPICE or Cadence Spectre™ software via noisy voltage

  8. ImageJ: Image processing and analysis in Java

    NASA Astrophysics Data System (ADS)

    Rasband, W. S.

    2012-06-01

    ImageJ is a public domain Java image processing program inspired by NIH Image. It can display, edit, analyze, process, save and print 8-bit, 16-bit and 32-bit images. It can read many image formats including TIFF, GIF, JPEG, BMP, DICOM, FITS and "raw". It supports "stacks", a series of images that share a single window. It is multithreaded, so time-consuming operations such as image file reading can be performed in parallel with other operations.

  9. The International Bathymetric Chart of the Southern Ocean (IBCSO) Version 1.0 - A new bathymetric compilation covering circum-Antarctic waters

    NASA Astrophysics Data System (ADS)

    Arndt, Jan Erik; Schenke, Hans Werner; Jakobsson, Martin; Nitsche, Frank O.; Buys, Gwen; Goleby, Bruce; Rebesco, Michele; Bohoyo, Fernando; Hong, Jongkuk; Black, Jenny; Greku, Rudolf; Udintsev, Gleb; Barrios, Felipe; Reynoso-Peralta, Walter; Taisei, Morishita; Wigley, Rochelle

    2013-06-01

    The International Bathymetric Chart of the Southern Ocean (IBCSO) Version 1.0 is a new digital bathymetric model (DBM) portraying the seafloor of the circum-Antarctic waters south of 60°S. IBCSO is a regional mapping project of the General Bathymetric Chart of the Oceans (GEBCO). The IBCSO Version 1.0 DBM has been compiled from all available bathymetric data collectively gathered by more than 30 institutions from 15 countries. These data include multibeam and single-beam echo soundings, digitized depths from nautical charts, regional bathymetric gridded compilations, and predicted bathymetry. Specific gridding techniques were applied to compile the DBM from bathymetric data of different origin, spatial distribution, resolution, and quality. The IBCSO Version 1.0 DBM has a resolution of 500 × 500 m, based on a polar stereographic projection, and is publicly available together with a digital chart for printing from the project website (www.ibcso.org) and at http://dx.doi.org/10.1594/PANGAEA.805736.

  10. Image transfer protocol in progressively increasing resolution

    NASA Technical Reports Server (NTRS)

    Percival, Jeffrey W. (Inventor); White, Richard L. (Inventor)

    1999-01-01

    A method of transferring digital image data over a communication link transforms and orders the data so that, as data is received by a receiving station, a low detail version of the image is immediately generated with later transmissions of data providing progressively greater detail in this image. User instructions are accepted, limiting the ultimate resolution of the image or suspending enhancement of the image except in certain user defined regions. When a low detail image is requested followed by a request for a high detailed version of the same image, the originally transmitted data of the low resolution image is not discarded or retransmitted but used with later data to improve the originally transmitted image. Only a single copy of the transformed image need be retained by the transmitting device in order to satisfy requests for different amounts of image detail.
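
    The key property of this scheme, that early coarse data is never discarded but refined by later transmissions, can be sketched with a one-dimensional residual pyramid: the sender transmits the coarsest level first, then per-level residuals that upgrade the receiver's copy exactly. This is an illustrative sketch only; the transform and ordering in the patented method are not specified in this record:

```python
def downsample(x):
    """Halve resolution by pairwise integer averaging."""
    return [(x[i] + x[i + 1]) // 2 for i in range(0, len(x), 2)]

def upsample(x):
    """Double resolution by sample replication (the receiver's prediction)."""
    out = []
    for v in x:
        out += [v, v]
    return out

def encode(signal, levels):
    """Order data as [coarsest, residual_coarse, ..., residual_finest]."""
    residuals, cur = [], signal
    for _ in range(levels):
        coarse = downsample(cur)
        pred = upsample(coarse)
        residuals.append([a - b for a, b in zip(cur, pred)])
        cur = coarse
    return [cur] + residuals[::-1]

def progressive_decode(stream):
    """Yield successively sharper versions; each step reuses all prior data."""
    img = stream[0]
    yield img
    for residual in stream[1:]:
        img = [p + r for p, r in zip(upsample(img), residual)]
        yield img
```

Because each residual is computed against the receiver's own prediction, the final version reproduces the original exactly, and a user who stops early simply keeps the latest coarse version.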

  11. New products from the shuttle radar topography mission

    USGS Publications Warehouse

    Gesch, Dean B.; Farr, Tom; Slater, James; Muller, Jan-Peter; Cook, Sally

    2006-01-01

    Final products include elevation data resulting from a substantial editing effort by the NGA in which water bodies and coastlines were well defined and data artifacts known as spikes and wells (single pixel errors) were removed. This second version of the SRTM data set, also referred to as ‘finished’ data, represents a significant improvement over earlier versions that had nonflat water bodies, poorly defined coastlines, and numerous noise artifacts. The edited data are available at a one-arc-second resolution (approximately 30 meters) for the United States and its territories, and at a three-arc-second resolution (approximately 90 meters) for non-U.S. areas.
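
    The "spikes and wells" removed from the finished SRTM data are single-pixel outliers, and a standard way to detect such artifacts is to compare each cell with the median of its 3x3 neighborhood. The sketch below shows that generic approach; the threshold test and median replacement are common practice, not NGA's documented editing procedure:

```python
def despike(grid, threshold):
    """Replace interior cells that differ from their 8-neighbor median
    by more than `threshold` with that median (spike/well removal)."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            neigh = sorted(grid[r + dr][c + dc]
                           for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                           if (dr, dc) != (0, 0))
            med = (neigh[3] + neigh[4]) / 2  # median of 8 neighbors
            if abs(grid[r][c] - med) > threshold:
                out[r][c] = med
    return out
```

A lone 999 m spike in a flat 100 m neighborhood is pulled back to 100, while genuine terrain, supported by its neighbors, is left untouched.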

  12. Crystal Structures of Apo and Metal-Bound Forms of the UreE Protein from Helicobacter pylori: Role of Multiple Metal Binding Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Rong; Munger, Christine; Asinas, Abdalin

    2010-10-22

    The crystal structure of the urease maturation protein UreE from Helicobacter pylori has been determined in its apo form at 2.1 Å resolution, bound to Cu2+ at 2.7 Å resolution, and bound to Ni2+ at 3.1 Å resolution. Apo UreE forms dimers, while the metal-bound enzymes are arranged as tetramers that consist of a dimer of dimers associated around the metal ion through coordination by His102 residues from each subunit of the tetramer. Comparison of independent subunits from different crystal forms indicates changes in the relative arrangement of the N- and C-terminal domains in response to metal binding. The improved ability of engineered versions of UreE containing hexahistidine sequences at either the N-terminal or C-terminal end to provide Ni2+ for the final metal sink (urease) is eliminated in the H102A version. Therefore, the ability of the improved Ni2+-binding versions to deliver more nickel is likely an effect of an increased local concentration of metal ions that can rapidly replenish transferred ions bound to His102.

  13. A world ocean model for greenhouse sensitivity studies: resolution intercomparison and the role of diagnostic forcing

    NASA Astrophysics Data System (ADS)

    Washington, Warren M.; Meehl, Gerald A.; Verplank, Lynda; Bettge, Thomas W.

    1994-05-01

    We have developed an improved version of a world ocean model with the intention of coupling it to an atmospheric model. This article documents the simulation capability of this 1° global ocean model, shows improvements over our earlier 5° version, and compares it to features simulated with a 0.5° model. These experiments use a model spin-up methodology whereby the ocean model can subsequently be coupled to an atmospheric model and used for order 100-year coupled model integrations. With present-day computers, 1° is a reasonable compromise in resolution that allows for century-long coupled experiments. The 1° ocean model is derived from a 0.5°-resolution model developed by A. Semtner (Naval Postgraduate School) and R. Chervin (National Center for Atmospheric Research) for studies of the global eddy-resolving world ocean circulation. The 0.5° bottom topography and continental outlines have been altered to be compatible with the 1° resolution, and the Arctic Ocean has been added. We describe the ocean simulation characteristics of the 1° version and compare the result of weakly constraining (three-year time scale) the three-dimensional temperature and salinity fields to the observations below the thermocline (710 m) with the model forced only at the top of the ocean by observed annual mean wind stress, temperature, and salinity. The 1° simulations indicate that major ocean circulation patterns are greatly improved compared to the 5° version and are qualitatively reproduced in comparison to the 0.5° version. Using the annual mean top forcing alone in a 100-year simulation with the 1° version preserves the general features of the major observed temperature and salinity structure, with most climate drift occurring beneath the thermocline in the first 50-75 years. Because the thermohaline circulation in the 1° version is relatively weak with annual mean forcing, we demonstrate the importance of the seasonal cycle by performing two sensitivity experiments. Results show a dramatic intensification of the meridional overturning circulation (order of magnitude) with perpetual winter surface temperature forcing in the North Atlantic and strong intensification (factor of three) with perpetual early winter temperatures in that region. These effects are felt throughout the Atlantic (particularly an intensified and northward-shifted Gulf Stream outflow). In the Pacific, the temperature gradient strengthens in the thermocline, thus helping counter the systematic error of a thermocline that is too diffuse.

  14. Aerosol Optical Depth Changes in Version 4 CALIPSO Level 2 Product

    NASA Technical Reports Server (NTRS)

    Kim, Man-Hae; Omar, Ali H.; Tackett, Jason L.; Vaughan, Mark A.; Winker, David M.; Trepte, Charles R.; Hu, Yongxiang; Liu, Zhaoyan

    2017-01-01

    The Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) version 4.10 (V4) products were released in November 2016 with substantial enhancements. There have been improvements in the V4 CALIOP level 2 aerosol optical depth (AOD) compared to V3 (version 3) due to various factors. The AOD change from V3 to V4 is investigated by isolating the contribution of each factor. CALIOP AOD was compared with the Moderate Resolution Imaging Spectroradiometer (MODIS) and Aerosol Robotic Network (AERONET) for both V3 and V4.

  15. MR-CDF: Managing multi-resolution scientific data

    NASA Technical Reports Server (NTRS)

    Salem, Kenneth

    1993-01-01

    MR-CDF is a system for managing multi-resolution scientific data sets. It is an extension of the popular CDF (Common Data Format) system. MR-CDF provides a simple functional interface to client programs for storage and retrieval of data. Data is stored so that low resolution versions of the data can be provided quickly. Higher resolutions are also available, but not as quickly. By managing data with MR-CDF, an application can be relieved of the low-level details of data management, and can easily trade data resolution for improved access time.
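
    The access pattern MR-CDF offers, cheap low-resolution reads and costlier full-resolution ones, can be mimicked with a precomputed pyramid keyed by level. The class and method names below are invented for illustration; the real MR-CDF extends CDF's functional interface rather than exposing a Python class:

```python
class MultiResStore:
    """Toy multi-resolution store: level 0 is full resolution;
    each higher level is a 2x block-mean reduction of the one below."""

    def __init__(self, data, levels):
        self.levels = {0: list(data)}
        for lvl in range(1, levels + 1):
            prev = self.levels[lvl - 1]
            self.levels[lvl] = [sum(prev[i:i + 2]) / 2
                                for i in range(0, len(prev), 2)]

    def get(self, level):
        """Return the precomputed version at the requested resolution level."""
        return self.levels[level]
```

A client that only needs a quick-look asks for a high level (small, fast to deliver); trading resolution for access time is then just a choice of the `level` argument.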

  16. Depth-to-basement, sediment-thickness, and bathymetry data for the deep-sea basins offshore of Washington, Oregon, and California

    USGS Publications Warehouse

    Wong, Florence L.; Grim, Muriel S.

    2015-01-01

    Contours and derivative raster files of depth-to-basement, sediment-thickness, and bathymetry data for the area offshore of Washington, Oregon, and California are provided here as GIS-ready shapefiles and GeoTIFF files. The data were used to generate paper maps in 1992 and 1993 from 1984 surveys of the U.S. Exclusive Economic Zone by the U.S. Geological Survey for depth to basement and sediment thickness, and from older data for the bathymetry.

  17. Author Correction: Atomic-resolution three-dimensional hydration structures on a heterogeneously charged surface.

    PubMed

    Umeda, Kenichi; Zivanovic, Lidija; Kobayashi, Kei; Ritala, Juha; Kominami, Hiroaki; Spijker, Peter; Foster, Adam S; Yamada, Hirofumi

    2018-05-23

    The original version of the Supplementary Information associated with this Article contained an error in Supplementary Figure 9e,f in which the y-axes were incorrectly labelled from '-40' to '40', rather than the correct '-400' to '400'. The HTML has been updated to include a corrected version of the Supplementary Information.

  18. Pinhole X-ray/coronagraph optical systems concept definition study

    NASA Technical Reports Server (NTRS)

    Zehnpfenning, T. F.; Rappaport, S.; Wattson, R. B.

    1980-01-01

    The Pinhole X-ray/Coronagraph Concept utilizes the long baselines possible in Earth orbit with the space transportation system (shuttle) to produce observations of solar X-ray emission features at extremely high spatial resolution (up to 0.1 arc second) and high energy (up to 100 keV), and also white light and UV observations of the inner and outer corona at high spatial and/or spectral resolution. An examination of various aspects of a preliminary version of the X-ray Pinhole/Coronagraph Concept is presented. For this preliminary version, the instrument package will be carried in the shuttle bay on a mounting platform, and will be connected to the occulter with a deployable boom such as an Astromast. Generally, the spatial resolution, stray light levels, and minimum limb observing angles improve as the boom length increases. However, the associated engineering problems also become more serious with greater boom lengths.

  19. Development of high resolution simulations of the atmospheric environment using the MASS model

    NASA Technical Reports Server (NTRS)

    Kaplan, Michael L.; Zack, John W.; Karyampudi, V. Mohan

    1989-01-01

    Numerical simulations were performed with a very high resolution (7.25 km) version of the MASS model (Version 4.0) in an effort to diagnose the vertical wind shear and static stability structure during the Shuttle Challenger disaster which occurred on 28 January 1986. These meso-beta scale simulations reveal that the strongest vertical wind shears were concentrated in the 200 to 150 mb layer at 1630 GMT, i.e., at about the time of the disaster. These simulated vertical shears were the result of two primary dynamical processes. The juxtaposition of both of these processes produced a shallow (30 mb deep) region of strong vertical wind shear, and hence, low Richardson number values during the launch time period. Comparisons with the Cape Canaveral (XMR) rawinsonde indicates that the high resolution MASS 4.0 simulation more closely emulated nature than did previous simulations of the same event with the GMASS model.

  20. Assessment of Reference Height Models on Quality of Tandem-X dem

    NASA Astrophysics Data System (ADS)

    Mirzaee, S.; Motagh, M.; Arefi, H.

    2015-12-01

    The aim of this study is to investigate the effect of various Global Digital Elevation Models (GDEMs) in producing a high-resolution topography model using TanDEM-X (TDX) Coregistered Single Look Slant Range Complex (CoSSC) images. We selected an image acquired on June 12, 2012 over the Doroud region in Lorestan, west of Iran, and used four external digital elevation models in our processing: DLR/ASI X-SAR DEM (SRTM-X, 30 m resolution), ASTER GDEM Version 2 (ASTER-GDEMV2, 30 m resolution), NASA SRTM Version 4 (SRTM-V4, 90 m resolution), and a local photogrammetry-based DEM prepared by the National Cartographic Center of Iran (NCC DEM, 10 m resolution). The InSAR procedure for DEM generation was repeated four times, once with each of the four external height references. The quality of each external DEM was initially assessed using filtered ICESat points. Then, the quality of each TDX-based DEM was assessed using the more precise external DEM selected in the previous step. Results showed that both the local (NCC) DEM and SRTM X-band performed best (RMSE < 9 m) for TDX-DEM generation. In contrast, ASTER GDEM v2 and SRTM C-band v4 showed poorer quality.
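
    The RMSE figures quoted above come from differencing each candidate DEM against reference heights (filtered ICESat points, or the better external DEM). A minimal sketch of the metric itself; the point pairing, reprojection, and outlier filtering the study also performed are omitted:

```python
import math

def rmse(heights, reference):
    """Root-mean-square error between DEM heights and co-located
    reference heights (both in metres)."""
    n = len(heights)
    return math.sqrt(sum((h - r) ** 2 for h, r in zip(heights, reference)) / n)
```

A DEM passing the study's threshold would satisfy `rmse(dem_samples, icesat_samples) < 9.0`.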

  1. Further investigation on "A multiplicative regularization for force reconstruction"

    NASA Astrophysics Data System (ADS)

    Aucejo, M.; De Smet, O.

    2018-05-01

    We have recently proposed a multiplicative regularization to reconstruct mechanical forces acting on a structure from vibration measurements. This method does not require any selection procedure for choosing the regularization parameter, since the amount of regularization is automatically adjusted throughout an iterative resolution process. The proposed iterative algorithm has been developed with performance and efficiency in mind, but it is actually a simplified version of a full iterative procedure not described in the original paper. The present paper aims at introducing the full resolution algorithm and comparing it with its simplified version in terms of computational efficiency and solution accuracy. In particular, it is shown that both algorithms lead to very similar identified solutions.

  2. Using High and Low Resolution Profiles of CO2 and CH4 Measured with AirCores to Evaluate Transport Models and Atmospheric Columns Retrieved from Space

    NASA Astrophysics Data System (ADS)

    Membrive, O.; Crevoisier, C. D.; Sweeney, C.; Hertzog, A.; Danis, F.; Picon, L.; Engel, A.; Boenisch, H.; Durry, G.; Amarouche, N.

    2015-12-01

    Over the past decades many methods have been developed to monitor the evolution of greenhouse gases (GHG): ground networks (NOAA, ICOS, TCCON), aircraft campaigns (HIPPO, CARIBIC, Contrail…), and satellite observations (GOSAT, IASI, AIRS…). Nevertheless, precise and regular vertical profile measurements are currently still missing from the observing system. To address this need, an original and innovative atmospheric sampling system called AirCore has been developed at NOAA (Karion et al. 2010). This new system allows balloon measurements of GHG vertical profiles from the surface up to 30 km. New versions of this instrument have been developed at LMD: a high-resolution version, "AirCore-HR", which differs from other AirCores by its high vertical resolution, and two "light" (lower resolution) versions designed to be flown under meteorological balloons. LMD AirCores were flown on multi-instrument gondolas along with other independent instruments measuring CO2 and CH4 in situ during the Strato Science balloon campaigns operated by the French space agency CNES in collaboration with the Canadian Space Agency in Timmins (Ontario, Canada) in August 2014 and 2015. First, we will present comparisons of the vertical profiles retrieved with various AirCores (LMD and Frankfurt University) to illustrate repeatability and the impact of vertical resolution, as well as comparisons with independent in-situ measurements from other instruments (laser-diode-based Pico-SDLA). Second, we will illustrate the usefulness of AirCore measurements in the upper troposphere and stratosphere for validating and interpreting vertical profiles from atmospheric transport models as well as observations of total and partial columns of methane and carbon dioxide from several current and future spaceborne missions such as ACE-FTS, IASI and GOSAT.

  3. Prototype of Partial Cutting Tool of Geological Map Images Distributed by Geological Web Map Service

    NASA Astrophysics Data System (ADS)

    Nonogaki, S.; Nemoto, T.

    2014-12-01

    Geological maps and topographical maps play an important role in disaster assessment, resource management, and environmental preservation. This map information has recently been distributed in accordance with Web service standards such as Web Map Service (WMS) and Web Map Tile Service (WMTS). In this study, a partial cutting tool for geological map images distributed by a geological WMTS was implemented with Free and Open Source Software. The tool mainly consists of two functions: a display function and a cutting function. The former was implemented using OpenLayers; the latter was implemented using the Geospatial Data Abstraction Library (GDAL). All other small functions were implemented in PHP and Python. As a result, this tool allows not only displaying a WMTS layer in a web browser but also generating a geological map image of the intended area and zoom level. At this moment, available WMTS layers are limited to the ones distributed by the WMTS for the Seamless Digital Geological Map of Japan. The geological map image can be saved in GeoTIFF format and WebGL format. GeoTIFF is a georeferenced raster format available in many kinds of Geographical Information Systems; WebGL is useful for confirming the relationship between geology and geography in 3D. In conclusion, the partial cutting tool developed in this study would contribute to better conditions for promoting the utilization of geological information. Future work is to increase the number of available WMTS layers and the types of output file format.
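
    Cutting an arbitrary area from a tiled map service starts with working out which tiles at a given zoom level cover the requested bounding box. The sketch below uses the standard Web Mercator (XYZ) tiling math common to such services; the helper names are assumptions, and the actual tool delegates this work to OpenLayers and GDAL:

```python
import math

def deg_to_tile(lon, lat, zoom):
    """Standard Web Mercator tile index for a lon/lat point at a zoom level."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

def tiles_for_bbox(west, south, east, north, zoom):
    """All (x, y) tile indices covering the bounding box at `zoom`."""
    x0, y0 = deg_to_tile(west, north, zoom)   # top-left tile
    x1, y1 = deg_to_tile(east, south, zoom)   # bottom-right tile
    return [(x, y) for y in range(y0, y1 + 1) for x in range(x0, x1 + 1)]
```

Each returned tile would then be fetched from the service and mosaicked; cropping the mosaic to the exact bounding box and writing it out as GeoTIFF is what GDAL handles in the tool described above.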

  4. MATtrack: A MATLAB-Based Quantitative Image Analysis Platform for Investigating Real-Time Photo-Converted Fluorescent Signals in Live Cells.

    PubMed

    Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W; Gautier, Virginie W

    2015-01-01

    We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip.
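
    The region-growing step MATtrack uses to delineate ROI contours can be pictured as a breadth-first flood fill from a user-selected seed, admitting neighbors whose intensity stays within a tolerance of the seed value. An illustrative sketch only: 4-connectivity and an absolute tolerance are assumptions, and MATtrack's MATLAB implementation may differ:

```python
from collections import deque

def region_grow(image, seed, tol):
    """Grow a region from `seed` (row, col), adding 4-connected neighbors
    whose intensity is within `tol` of the seed intensity."""
    rows, cols = len(image), len(image[0])
    sr, sc = seed
    base = image[sr][sc]
    region, frontier = {seed}, deque([seed])
    while frontier:
        r, c = frontier.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and abs(image[nr][nc] - base) <= tol):
                region.add((nr, nc))
                frontier.append((nr, nc))
    return region
```

The set of member pixels directly yields both the ROI mask for average-fluorescence tracking and its contour (member pixels with at least one non-member neighbor).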

  5. MATtrack: A MATLAB-Based Quantitative Image Analysis Platform for Investigating Real-Time Photo-Converted Fluorescent Signals in Live Cells

    PubMed Central

    Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W.; Gautier, Virginie W.

    2015-01-01

    We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip. PMID:26485569

  6. Statistical Projections for Multi-resolution, Multi-dimensional Visual Data Exploration and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoa T. Nguyen; Stone, Daithi; E. Wes Bethel

    2016-01-01

    An ongoing challenge in visual exploration and analysis of large, multi-dimensional datasets is how to present useful, concise information to a user for some specific visualization tasks. Typical approaches to this problem have proposed either reduced-resolution versions of data, or projections of data, or both. These approaches still have limitations, such as high computational cost or susceptibility to error. In this work, we explore the use of a statistical metric as the basis for both projections and reduced-resolution versions of data, with a particular focus on preserving one key trait in data, namely variation. We use two different case studies to explore this idea, one that uses a synthetic dataset, and another that uses a large ensemble collection produced by an atmospheric modeling code to study long-term changes in global precipitation. The primary finding of our work is that, in terms of preserving the variation signal inherent in data, a statistical measure more faithfully preserves this key characteristic across both multi-dimensional projections and multi-resolution representations than a methodology based upon averaging.
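    The core idea, preserving variation rather than averaging it away, can be shown in a few lines. The following is our own toy sketch, not the paper's code: reduce an 8x8 field to 2x2 blocks once with the block mean and once with the block standard deviation.

```python
import numpy as np

# Synthetic field whose right half is three times more variable than the left.
rng = np.random.default_rng(1)
field = rng.normal(0.0, 1.0, size=(8, 8))
field[:, 4:] *= 3.0

def block_reduce(a, size, stat):
    """Aggregate non-overlapping size x size blocks with the statistic `stat`."""
    h, w = a.shape
    blocks = a.reshape(h // size, size, w // size, size)
    return stat(blocks, axis=(1, 3))

means = block_reduce(field, 4, np.mean)
stds = block_reduce(field, 4, np.std)

# The std-based reduction still distinguishes the two halves of the field;
# the mean-based one does not, since both halves have mean near zero.
print(means.round(2))
print(stds.round(2))
```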

  7. Archive of Digitized Analog Boomer and Minisparker Seismic Reflection Data Collected from the Alabama-Mississippi-Louisiana Shelf During Cruises Onboard the R/V Carancahua and R/V Gyre, April and July, 1981

    USGS Publications Warehouse

    Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.

    2009-01-01

    In April and July of 1981, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Alabama-Mississippi-Louisiana Shelf in the northern Gulf of Mexico. Work was conducted onboard the Texas A&M University R/V Carancahua and the R/V Gyre to develop a geologic understanding of the study area and to locate potential hazards related to offshore oil and gas production. While the R/V Carancahua only collected boomer data, the R/V Gyre used a 400-Joule minisparker, 3.5-kilohertz (kHz) subbottom profiler, 12-kHz precision depth recorder, and two air guns. The authors selected the minisparker data set because, unlike with the boomer data, it provided the most complete record. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer and minisparker paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.

  8. Provisional maps of thermal areas in Yellowstone National Park, based on satellite thermal infrared imaging and field observations

    USGS Publications Warehouse

    Vaughan, R. Greg; Heasler, Henry; Jaworowski, Cheryl; Lowenstern, Jacob B.; Keszthelyi, Laszlo P.

    2014-01-01

    Maps that define the current distribution of geothermally heated ground are useful for setting a baseline for thermal activity to better detect and understand future anomalous hydrothermal and (or) volcanic activity. Monitoring changes in these dynamic thermal areas also supports decisions regarding the development of Yellowstone National Park infrastructure, the preservation and protection of park resources, and visitor safety. Because of the challenges associated with field-based monitoring of a large, complex geothermal system that is spread out over a large and remote area, satellite-based thermal infrared images from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) were used to map the location and spatial extent of active thermal areas, to generate thermal anomaly maps, and to quantify the radiative component of the total geothermal heat flux. ASTER thermal infrared data acquired during winter nights were used to minimize the contribution of solar heating of the surface. The ASTER thermal infrared mapping results were compared to maps of thermal areas based on field investigations and high-resolution aerial photos. Field validation of the ASTER thermal mapping is an ongoing task. The purpose of this report is to make available ASTER-based maps of Yellowstone’s thermal areas. We include an appendix containing the names and characteristics of Yellowstone’s thermal areas, georeferenced TIFF files containing ASTER thermal imagery, and several spatial data sets in Esri shapefile format.

  9. View From Within 'Perseverance Valley' on Mars

    NASA Image and Video Library

    2017-12-06

    This view from within "Perseverance Valley," on the inner slope of the western rim of Endeavour Crater on Mars, includes wheel tracks from the Opportunity rover's descent of the valley. The Panoramic Camera (Pancam) on Opportunity's mast took the component images of the scene during the period Sept. 4 through Oct. 6, 2017, corresponding to sols (Martian days) 4840 through 4871 of the rover's work on Mars. Perseverance Valley is a system of shallow troughs descending eastward about the length of two football fields from the crest of the crater rim to the floor of the crater. This panorama spans from northeast on the left to northwest on the right, including portions of the crater floor (eastward) in the left half and of the rim (westward) in the right half. Opportunity began descending Perseverance Valley in mid-2017 (see map) as part of an investigation into how the valley formed. Rover wheel tracks are darker brown, between two patches of bright bedrock, receding toward the horizon in the right half of the scene. This view combines multiple images taken through three different Pancam filters. The selected filters admit light centered on wavelengths of 753 nanometers (near-infrared), 535 nanometers (green) and 432 nanometers (violet). The three color bands are combined here to show approximately true color. A map and a high-resolution TIFF file are available at https://photojournal.jpl.nasa.gov/catalog/PIA22074

  10. Realism of Indian Summer Monsoon Simulation in a Quarter Degree Global Climate Model

    NASA Astrophysics Data System (ADS)

    Salunke, P.; Mishra, S. K.; Sahany, S.; Gupta, K.

    2017-12-01

    This study assesses the fidelity of Indian Summer Monsoon (ISM) simulations using a global model at an ultra-high horizontal resolution (UHR) of 0.25°. The model used was the atmospheric component of the Community Earth System Model version 1.2.0 (CESM 1.2.0) developed at the National Center for Atmospheric Research (NCAR). Precipitation and temperature over the Indian region were analyzed for a wide range of space and time scales to evaluate the fidelity of the model under UHR, with special emphasis on the ISM simulations during the period of June-through-September (JJAS). Comparing the UHR simulations with observed data from the India Meteorological Department (IMD) over the Indian land, it was found that 0.25° resolution significantly improved spatial rainfall patterns over many regions, including the Western Ghats and the South-Eastern peninsula as compared to the standard model resolution. Convective and large-scale rainfall components were analyzed using the European Centre for Medium Range Weather Forecast (ECMWF) Re-Analysis (ERA)-Interim (ERA-I) data and it was found that at 0.25° resolution, there was an overall increase in the large-scale component and an associated decrease in the convective component of rainfall as compared to the standard model resolution. Analysis of the diurnal cycle of rainfall suggests a significant improvement in the phase characteristics simulated by the UHR model as compared to the standard model resolution. Analysis of the annual cycle of rainfall, however, failed to show any significant improvement in the UHR model as compared to the standard version. Surface temperature analysis showed small improvements in the UHR model simulations as compared to the standard version. Thus, one may conclude that there are some significant improvements in the ISM simulations using a 0.25° global model, although there is still plenty of scope for further improvement in certain aspects of the annual cycle of rainfall.

  11. Fuzzy Arden Syntax: A fuzzy programming language for medicine.

    PubMed

    Vetterlein, Thomas; Mandl, Harald; Adlassnig, Klaus-Peter

    2010-05-01

    The programming language Arden Syntax has been optimised for use in clinical decision support systems. We describe an extension of this language named Fuzzy Arden Syntax, whose original version was introduced in S. Tiffe's dissertation on "Fuzzy Arden Syntax: Representation and Interpretation of Vague Medical Knowledge by Fuzzified Arden Syntax" (Vienna University of Technology, 2003). The primary aim is to provide an easy means of processing vague or uncertain data, which frequently appears in medicine. For both propositional and number data types, fuzzy equivalents have been added to Arden Syntax. The Boolean data type was generalised to represent any truth degree between the two extremes 0 (falsity) and 1 (truth); fuzzy data types were introduced to represent fuzzy sets. The operations on truth values and real numbers were generalised accordingly. As the conditions that decide whether a certain programme unit is executed or not may be indeterminate, a Fuzzy Arden Syntax programme may split; the data in the different branches may optionally be aggregated afterwards. Fuzzy Arden Syntax offers the possibility of conveniently formulating Medical Logic Modules (MLMs) based on the principle of a continuously graded applicability of statements. Furthermore, ad hoc decisions about sharp value boundaries can be avoided. As an illustrative example shows, an MLM making use of the features of Fuzzy Arden Syntax is not significantly more complex than its Arden Syntax equivalent; in the ideal case, a programme handling crisp data remains practically unchanged when compared to its fuzzified version. In the latter case, the output data, which can be a set of weighted alternatives, typically depends continuously on the input data.
In typical applications an Arden Syntax MLM can produce a different output after only slight changes of the input; discontinuities are in fact unavoidable when the input varies continuously but the output is taken from a discrete set of possibilities. This inconvenience can, however, be attenuated by means of certain mechanisms on which the programme flow under Fuzzy Arden Syntax is based. Writing a programme that makes use of these possibilities is not significantly more difficult than writing one according to the usual practice. Copyright 2010 Elsevier B.V. All rights reserved.

  12. Recent Global Warming as Observed by AIRS and Depicted in GISTEMP and MERRA-2

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Lee, Jae; Iredell, Lena

    2017-01-01

    AIRS Version-6 monthly mean level-3 surface temperature products confirm the result, depicted in the GISTEMP dataset, that the Earth's surface temperature has been warming since early 2015, though not before that. AIRS is at a higher spatial resolution than GISTEMP, and produces sharper spatial features which are otherwise in excellent agreement with those of GISTEMP. Version-6 AO Ts anomalies are consistent with those of Version-6 AIRS/AMSU. Version-7 AO anomalies should be even more accurate, especially at high latitudes. ARCs of MERRA-2 Ts anomalies are spurious as a result of a discontinuity which occurred somewhere between 2007 and 2008. This decreases global mean trends.

  13. Documentation of the Douglas-fir tussock moth outbreak-population model.

    Treesearch

    J.J. Colbert; W. Scott Overton; Curtis. White

    1979-01-01

    Documentation of three model versions: the Douglas-fir tussock moth population-branch model at (1) daily temporal resolution and (2) instar temporal resolution, and (3) the Douglas-fir tussock moth stand-outbreak model; the hierarchical framework and the conceptual paradigm used are described. The coupling of the model with a normal-stand model is discussed. The modeling...

  14. A High Resolution Graphic Input System for Interactive Graphic Display Terminals. Appendix B.

    ERIC Educational Resources Information Center

    Van Arsdall, Paul Jon

    The search for a satisfactory computer graphics input system led to this version of an analog sheet encoder which is transparent and requires no special probes. The goal of the research was to provide high resolution touch input capabilities for an experimental minicomputer based intelligent terminal system. The technique explored is compatible…

  15. Study on generation and sharing of on-demand global seamless data—Taking MODIS NDVI as an example

    NASA Astrophysics Data System (ADS)

    Shen, Dayong; Deng, Meixia; Di, Liping; Han, Weiguo; Peng, Chunming; Yagci, Ali Levent; Yu, Genong; Chen, Zeqiang

    2013-04-01

    By applying advanced Geospatial Data Abstraction Library (GDAL) and BigTIFF technology in a Geographical Information System (GIS) with Service Oriented Architecture (SOA), this study has derived global datasets using tile-based input data and implemented Virtual Web Map Service (VWMS) and Virtual Web Coverage Service (VWCS) to provide software tools for visualization and acquisition of global data. Taking MODIS Normalized Difference Vegetation Index (NDVI) as an example, this study proves the feasibility, efficiency and features of the proposed approach.
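    Serving global mosaics from tile-based input requires, at minimum, mapping a requested bounding box to the tiles that cover it. Below is a hedged Python sketch of that bookkeeping for a MODIS-like 10-degree tiling; the function name and grid layout are illustrative assumptions, not the paper's API.

```python
# A MODIS-like grid of 36 columns (h) x 18 rows (v), 10 degrees per tile;
# v counts down from the north, h counts east from 180W. Boxes crossing the
# antimeridian are not handled in this sketch.
TILE_DEG = 10.0

def covering_tiles(lon_min, lat_min, lon_max, lat_max):
    """List (h, v) tile indices whose extent intersects the bounding box."""
    h0 = int((lon_min + 180.0) // TILE_DEG)
    h1 = int((lon_max + 180.0) // TILE_DEG)
    v0 = int((90.0 - lat_max) // TILE_DEG)
    v1 = int((90.0 - lat_min) // TILE_DEG)
    return [(h, v) for v in range(v0, v1 + 1) for h in range(h0, h1 + 1)]

# Tiles needed for a box spanning 5W-15E, 40N-55N:
print(covering_tiles(-5, 40, 15, 55))
```

    A virtual service like VWMS would then mosaic only these tiles (e.g. with GDAL) rather than assembling the full global dataset per request.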

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, Michael J.

    SchemaOnRead provides tools for implementing schema-on-read including a single function call (e.g., schemaOnRead("filename")) that reads text (TXT), comma separated value (CSV), raster image (BMP, PNG, GIF, TIFF, and JPG), R data (RDS), HDF5, NetCDF, spreadsheet (XLS, XLSX, ODS, and DIF), Weka Attribute-Relation File Format (ARFF), Epi Info (REC), Pajek network (PAJ), R network (NET), Hypertext Markup Language (HTML), SPSS (SAV), Systat (SYS), and Stata (DTA) files. It also recursively reads folders (e.g., schemaOnRead("folder")), returning a nested list of the contained elements.
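    SchemaOnRead itself is an R package, but the schema-on-read dispatch it implements can be mimicked in a few lines of Python. This is a minimal analogue with only two formats, purely for illustration.

```python
import csv
import json
import os

def _read_csv(path):
    with open(path, newline="") as f:
        return list(csv.reader(f))

def _read_json(path):
    with open(path) as f:
        return json.load(f)

# The reader is chosen from the file extension at read time.
READERS = {".csv": _read_csv, ".json": _read_json}

def schema_on_read(path):
    """Dispatch on extension, like schemaOnRead("filename") does in R."""
    ext = os.path.splitext(path)[1].lower()
    if ext not in READERS:
        raise ValueError(f"no reader registered for {ext!r}")
    return READERS[ext](path)

with open("demo.csv", "w") as f:
    f.write("a,b\n1,2\n")
print(schema_on_read("demo.csv"))  # [['a', 'b'], ['1', '2']]
```

    Recursive folder reading would just walk the directory tree and apply the same dispatch to every file found.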

  17. Plan for DoD Wide Demonstrations of a DoD Improved Interactive Electronic Technical Manual (IETM) Architecture

    DTIC Science & Technology

    1998-07-01

all the MS Word files into FrameMaker + SGML format and use the FrameMaker application to SGML tag all of the data in accordance with the Army TM...Document Type Definitions (DTDs) in MIL-STD-2361. The edited SGML tagged files are saved as PDF files for delivery to the field. The FrameMaker ...as TIFF files and being imported into FrameMaker prior to saving the TMs as PDF files. Since the hardware to be used by the AN/PPS-5 technician is

  18. Left Limb of North Pole of the Sun, March 20, 2007 (Anaglyph)

    NASA Technical Reports Server (NTRS)

    2007-01-01

    [figure removed for brevity, see original site] Figure 1: Left eye view of a stereo pair (click on the image for the full resolution TIFF). Figure 2: Right eye view of a stereo pair (click on the image for the full resolution TIFF). Figure 1: This image was taken by the SECCHI Extreme UltraViolet Imager (EUVI) mounted on the STEREO-B spacecraft. STEREO-B is located behind the Earth and follows the Earth in orbit around the Sun. This location enables us to view the Sun from the position of a virtual left eye in space. Figure 2: This image was taken by the SECCHI Extreme UltraViolet Imager (EUVI) mounted on the STEREO-A spacecraft. STEREO-A is located ahead of the Earth and leads the Earth in orbit around the Sun. This location enables us to view the Sun from the position of a virtual right eye in space.

    NASA's Solar TErrestrial RElations Observatory (STEREO) satellites have provided the first three-dimensional images of the Sun. For the first time, scientists will be able to see structures in the Sun's atmosphere in three dimensions. The new view will greatly aid scientists' ability to understand solar physics and thereby improve space weather forecasting.

    This image is a composite of left and right eye color image pairs taken by the SECCHI Extreme UltraViolet Imager (EUVI) mounted on the STEREO-B and STEREO-A spacecraft. STEREO-B is located behind the Earth and follows the Earth in orbit around the Sun. This location enables us to view the Sun from the position of a virtual left eye in space. STEREO-A is located ahead of the Earth and leads the Earth in orbit around the Sun. This location enables us to view the Sun from the position of a virtual right eye in space.

    The EUVI imager is sensitive to wavelengths of light in the extreme ultraviolet portion of the spectrum. EUVI bands at wavelengths of 304, 171 and 195 Angstroms have been mapped to the red, blue, and green visible portions of the spectrum and processed to emphasize the three-dimensional structure of the solar material.

    STEREO, a two-year mission launched in October 2006, will provide a unique and revolutionary view of the Sun-Earth system. The two nearly identical observatories -- one ahead of Earth in its orbit, the other trailing behind -- will trace the flow of energy and matter from the Sun to Earth. They will reveal the 3D structure of coronal mass ejections (violent eruptions of matter from the Sun that can disrupt satellites and power grids) and help us understand why they happen. STEREO will become a key addition to the fleet of space weather detection satellites by providing more accurate alerts for the arrival time of Earth-directed solar ejections with its unique side-viewing perspective.

    STEREO is the third mission in NASA's Solar Terrestrial Probes program within NASA's Science Mission Directorate, Washington. The Goddard Science and Exploration Directorate manages the mission, instruments, and science center. The Johns Hopkins University Applied Physics Laboratory, Laurel, Md., designed and built the spacecraft and is responsible for mission operations. The imaging and particle detecting instruments were designed and built by scientific institutions in the U.S., UK, France, Germany, Belgium, Netherlands, and Switzerland. JPL is a division of the California Institute of Technology in Pasadena.

  19. New learning based super-resolution: use of DWT and IGMRF prior.

    PubMed

    Gajjar, Prakash P; Joshi, Manjunath V

    2010-05-01

    In this paper, we propose a new learning-based approach for super-resolving an image captured at low spatial resolution. Given the low spatial resolution test image and a database consisting of low and high spatial resolution images, we obtain super-resolution for the test image. We first obtain an initial high-resolution (HR) estimate by learning the high-frequency details from the available database. A new discrete wavelet transform (DWT) based approach is proposed for learning that uses a set of low-resolution (LR) images and their corresponding HR versions. Since super-resolution is an ill-posed problem, we obtain the final solution using a regularization framework. The LR image is modeled as the aliased and noisy version of the corresponding HR image, and the aliasing matrix entries are estimated using the test image and the initial HR estimate. The prior model for the super-resolved image is chosen as an inhomogeneous Gaussian Markov random field (IGMRF), and the model parameters are estimated using the same initial HR estimate. A maximum a posteriori (MAP) estimation is used to arrive at the cost function, which is minimized using a simple gradient descent approach. We demonstrate the effectiveness of the proposed approach by conducting experiments on grayscale as well as color images. The method is compared with the standard interpolation technique and also with existing learning-based approaches. The proposed approach can be used in applications such as wildlife sensor networks and remote surveillance, where the memory, the transmission bandwidth, and the camera cost are the main constraints.
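    A stripped-down 1D sketch of this MAP formulation follows; it is our own toy, substituting a homogeneous smoothness prior for the IGMRF. The LR observation is modeled as decimated, blurred HR data, and the regularized cost ||y - DHx||^2 + lam*||diff(x)||^2 is minimized by gradient descent.

```python
import numpy as np

n_hi, factor = 32, 2
rng = np.random.default_rng(2)
x_true = np.sin(np.linspace(0.0, 2.0 * np.pi, n_hi))  # "HR image" (1D toy)

H = np.zeros((n_hi, n_hi))                 # circulant moving-average blur
for i in range(n_hi):
    for j in (i - 1, i, i + 1):
        H[i, j % n_hi] = 1.0 / 3.0
D = np.eye(n_hi)[::factor]                 # decimation: keep every 2nd sample
A = D @ H
y = A @ x_true + rng.normal(0.0, 0.01, n_hi // factor)  # noisy LR observation

lam, step = 0.1, 0.1
G = np.diff(np.eye(n_hi), axis=0)          # first-difference (smoothness) prior
x = np.zeros(n_hi)
for _ in range(500):                       # gradient descent on the MAP cost
    grad = 2.0 * A.T @ (A @ x - y) + 2.0 * lam * G.T @ (G @ x)
    x -= step * grad

print(np.mean((x - x_true) ** 2))          # small reconstruction error
```

    The paper's method additionally learns the initial HR estimate from a training database and estimates spatially varying (inhomogeneous) prior parameters, which this sketch omits.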

  20. The CLIMB Geoportal - A web-based dissemination and documentation platform for hydrological modelling data

    NASA Astrophysics Data System (ADS)

    Blaschek, Michael; Gerken, Daniel; Ludwig, Ralf; Duttmann, Rainer

    2015-04-01

    Geoportals are important elements of spatial data infrastructures (SDIs) that are strongly based on GIS-related web services. These services are basically meant for distributing, documenting and visualizing (spatial) data in a standardized manner; an important but challenging task, especially in large scientific projects with a high number of data suppliers and producers from various countries. This presentation focuses on introducing the free and open-source based geoportal solution developed within the research project CLIMB (Climate Induced Changes on the Hydrology of Mediterranean Basins, www.climb-fp7.eu) that serves as the central platform for interchanging project-related spatial data and information. In this collaboration, financed by the EU FP7 framework and coordinated at the LMU Munich, 21 partner institutions from nine European and non-European countries were involved. The CLIMB Geoportal (lgi-climbsrv.geographie.uni-kiel.de) stores and provides spatially distributed data about the current state and future changes of the hydrological conditions within the seven CLIMB test sites around the Mediterranean. Hydrological modelling outcome - validated by the CLIMB partners - is offered to the public in the form of Web Map Services (WMS), whereas downloading the underlying data itself through Web Coverage Services (WCS) is possible for registered users only. A selection of common indicators, such as discharge and a drought index, as well as uncertainty measures, including their changes over time, is provided at different spatial resolutions. Besides map information, the portal enables the graphical display of time series of selected variables calculated by the individual models applied within the CLIMB project. The implementation of the CLIMB Geoportal is based on version 2.0c5 of the open-source geospatial content management system GeoNode.
It includes a GeoServer instance for providing the OGC-compliant web services and comes with a metadata catalog (pycsw) as well as a built-in WebGIS client based on GeoExt (GeoExplorer). PostgreSQL, enhanced by PostGIS (versions 9.2.1/2.0.1), serves as the database backend for all base data of the study sites and for the time series of relevant hydrological indicators. Spatial model results in raster format are stored file-based as GeoTIFFs. Due to the high number of model outputs, the generation of metadata (XML) and graphical rendering instructions (SLD) associated with each single layer of the WMS has been automated using the statistical software R. Additional applications programmed during the project period include a Java-based interface for convenient download of the climate data initially needed as input for hydrological modelling, as well as a tool for displaying time series of selected risk indicators, directly integrated into the portal structure and implemented using Python (Django) and JavaScript. The presented CLIMB Geoportal shows that relevant results of even large international research projects involving many partners and varying national standards in data handling can be effectively disseminated to stakeholders, policy makers and other interested parties. Thus, it is a successful example of using free and open-source software to provide long-term visibility of and access to data produced within a particular (environmental) research project.

  1. Online dynamical downscaling of temperature and precipitation within the iLOVECLIM model (version 1.1)

    NASA Astrophysics Data System (ADS)

    Quiquet, Aurélien; Roche, Didier M.; Dumas, Christophe; Paillard, Didier

    2018-02-01

    This paper presents the inclusion of an online dynamical downscaling of temperature and precipitation within the model of intermediate complexity iLOVECLIM v1.1. We describe the methodology used to generate temperature and precipitation fields on a 40 km × 40 km Cartesian grid of the Northern Hemisphere from the T21 native atmospheric model grid. Our scheme is not grid specific and conserves energy and moisture in the same way as the original climate model. We show that we are able to generate a high-resolution field whose spatial variability is in better agreement with observations than that of the standard model. Although the large-scale model biases are not corrected, for selected model parameters the downscaling can yield better overall performance than the standard version on both the high-resolution grid and the native grid. Foreseen applications of this new model feature include improved ice sheet model coupling and high-resolution land surface models.
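    The conservation requirement mentioned above can be made concrete with a small sketch. This is our own schematic, not the iLOVECLIM code: each coarse-cell value is spread over its fine sub-cells with weights that sum to one, so averaging the fine cells back exactly recovers the coarse field and the coarse-grid moisture budget is unchanged.

```python
import numpy as np

def downscale_conservative(coarse, factor, weights):
    """coarse: (H, W) field; weights: (factor, factor) sub-cell pattern."""
    w = weights / weights.sum()            # normalize so the block mean is kept
    return np.kron(coarse, w * factor * factor)

coarse = np.array([[4.0, 8.0], [0.0, 2.0]])
weights = np.array([[1.0, 2.0], [3.0, 2.0]])   # e.g. topography-based pattern
fine = downscale_conservative(coarse, 2, weights)

# Conservation check: block-averaging the fine grid recovers the coarse field.
back = fine.reshape(2, 2, 2, 2).mean(axis=(1, 3))
print(np.allclose(back, coarse))  # True
```

    In the actual scheme the weights would come from the downscaling physics (e.g. orography) rather than being fixed, but the normalization step is what guarantees conservation.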

  2. NAOMI instrument: a product line of compact and versatile cameras designed for high resolution missions in Earth observation

    NASA Astrophysics Data System (ADS)

    Luquet, Ph.; Chikouche, A.; Benbouzid, A. B.; Arnoux, J. J.; Chinal, E.; Massol, C.; Rouchit, P.; De Zotti, S.

    2017-11-01

    EADS Astrium is currently developing a new product line of compact and versatile instruments for high resolution missions in Earth Observation. The first version has been developed in the framework of the ALSAT-2 contract awarded by the Algerian Space Agency (ASAL) to EADS Astrium. The Silicon Carbide Korsch-type telescope, coupled with a multi-line detector array, offers a 2.5 m GSD in the PAN band at nadir from a 680 km altitude (10 m GSD in the four multispectral bands) with a 17.5 km swath width. This compact camera - 340 (W) x 460 (L) x 510 (H) mm3, 13 kg - is mounted on a Myriade-type small platform. The electronics unit accommodates the video, housekeeping and thermal control functions as well as a 64 Gbit mass memory. Two satellites are being developed; the first is planned for launch in mid-2009. Several other versions of the instrument have already been defined with enhanced resolution and/or a larger field of view.

  3. Global 30m Height Above the Nearest Drainage

    NASA Astrophysics Data System (ADS)

    Donchyts, Gennadii; Winsemius, Hessel; Schellekens, Jaap; Erickson, Tyler; Gao, Hongkai; Savenije, Hubert; van de Giesen, Nick

    2016-04-01

    Variability of the Earth's surface is the primary characteristic affecting the flow of surface and subsurface water. Digital elevation models, usually represented as height maps above some well-defined vertical datum, are widely used to compute hydrologic parameters such as local flow directions, drainage area, drainage network pattern, and many others. Deriving these parameters at a global scale usually requires significant effort. One hydrological characteristic introduced in the last decade is Height Above the Nearest Drainage (HAND): a digital elevation model normalized using the nearest drainage. This parameter has been shown to be useful for many hydrological and more general-purpose applications, such as landscape hazard mapping, landform classification, remote sensing and rainfall-runoff modeling. One of the essential characteristics of HAND is its ability to capture heterogeneities in local environments that are difficult to measure or model otherwise. While many applications of HAND have been published in the academic literature, no studies analyze its variability on a global scale, especially using higher-resolution DEMs such as the new one-arc-second (approximately 30 m) resolution version of SRTM. In this work, we present the first global version of HAND computed using a mosaic of two DEMs: 30 m SRTM and the Viewfinderpanorama DEM (90 m). The lower-resolution DEM was used to cover latitudes above 60 degrees north and below 56 degrees south, where SRTM is not available. We compute HAND using the unmodified version of the input DEMs to ensure consistency with the original elevation model. We have parallelized processing by generating a homogenized, equal-area version of the HydroBASINS catchments. The resulting catchment boundaries were used to perform processing using the 30 m resolution DEM. To compute HAND, a new version of the D8 local drainage directions as well as flow accumulation were calculated.
The latter was used to estimate river heads by incorporating fixed and variable thresholding methods. The resulting HAND dataset was analyzed with regard to its spatial variability and used to assess the global distribution of the main landform types: valley, ecotone, slope, and plateau. The method used to compute HAND was implemented with PCRaster software running on the Google Compute Engine platform under Ubuntu Linux. Google Earth Engine was used to perform mosaicing and clipping of the original DEMs as well as to provide access to the final product. The effort took about three months of computing time on an eight-core virtual machine.
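    The HAND definition itself is simple enough to show on a toy grid. The following is our own illustration, not the PCRaster workflow: follow the D8 steepest-descent path from every cell to a drainage cell, then subtract the drainage elevation from the cell elevation (for brevity, diagonal steps are not distance-weighted here, unlike true D8).

```python
import numpy as np

dem = np.array([
    [5.0, 4.0, 3.0],
    [4.0, 2.0, 1.0],
    [3.0, 1.0, 0.0],
])
drainage = {(2, 2)}  # cells flagged as the drainage network

def d8_next(dem, r, c):
    """Neighbor with the steepest descent, or None at a pit."""
    h, w = dem.shape
    best, best_drop = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if (dr, dc) != (0, 0) and 0 <= rr < h and 0 <= cc < w:
                drop = dem[r, c] - dem[rr, cc]
                if drop > best_drop:
                    best, best_drop = (rr, cc), drop
    return best

def hand(dem, drainage):
    """Height of each cell above the drainage cell its flow path reaches."""
    out = np.zeros_like(dem)
    for r in range(dem.shape[0]):
        for c in range(dem.shape[1]):
            cell = (r, c)
            while cell not in drainage and d8_next(dem, *cell) is not None:
                cell = d8_next(dem, *cell)
            out[r, c] = dem[r, c] - dem[cell]
    return out

print(hand(dem, drainage))
```

    The global product additionally needs depression filling, a real drainage-network threshold on flow accumulation, and per-catchment parallelization, none of which this sketch attempts.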

  4. Prognosis of Electrical Faults in Permanent Magnet AC Machines using the Hidden Markov Model

    DTIC Science & Technology

    2010-11-10

time resolution and high frequency resolution; tiling is variable. Wigner-Ville distribution, defined as W(t, ω) = ∫ s(t + τ/2) s*(t − τ/2) e^(−jωτ) dτ ... smoothed version of the Wigner distribution; the amount of smoothing is controlled by σ. Smoothing comes with a tradeoff of reduced resolution ... the Wigner or Choi-Williams distributions. Although for Wigner and Choi-Williams distributions the probabilities are close for the early fault
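    The Wigner-Ville definition quoted in this excerpt can be discretized directly. Below is a hedged numpy sketch (our own, not the report's code): for each time sample, build the lag kernel s[t+m]·conj(s[t−m]) and evaluate the integral over τ as an FFT over the lag m.

```python
import numpy as np

# Discrete Wigner-Ville sketch: the lag kernel is zero outside the signal,
# and it is conjugate-symmetric in m, so the resulting distribution is real.
def wigner_ville(s):
    n = len(s)
    w = np.empty((n, n))
    for t in range(n):
        kern = np.zeros(n, dtype=complex)
        for m in range(-(n // 2), n // 2):
            if 0 <= t + m < n and 0 <= t - m < n:
                kern[m % n] = s[t + m] * np.conj(s[t - m])
        w[t] = np.fft.fft(kern).real
    return w

# Linear chirp: the energy ridge should move to higher frequency bins as time
# advances (note the lag kernel doubles the apparent frequency).
t = np.arange(64)
sig = np.exp(2j * np.pi * (0.1 * t + 0.002 * t ** 2))
W = wigner_ville(sig)
print(W.shape)
```

    The smoothed variants mentioned in the excerpt (e.g. Choi-Williams) apply an additional kernel in the lag/Doppler domain to suppress the cross-terms this raw form produces.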

  5. NASA Releases 'NASA App HD' for iPad

    NASA Image and Video Library

    2012-07-06

    The NASA App HD invites you to discover a wealth of NASA information right on your iPad. The application collects, customizes and delivers an extensive selection of dynamically updated mission information, images, videos and Twitter feeds from various online NASA sources in a convenient mobile package. Come explore with NASA, now on your iPad. 2012 Updated Version - HD Resolution and new features. Original version published on Sept. 1, 2010.

  6. Impacts of cloud superparameterization on projected daily rainfall intensity climate changes in multiple versions of the Community Earth System Model

    DOE PAGES

    Kooperman, Gabriel J.; Pritchard, Michael S.; Burt, Melissa A.; ...

    2016-09-26

    Changes in the character of rainfall are assessed using a holistic set of statistics based on rainfall frequency and amount distributions in climate change experiments with three conventional and superparameterized versions of the Community Atmosphere Model (CAM and SPCAM). Previous work has shown that high-order statistics of present-day rainfall intensity are significantly improved with superparameterization, especially in regions of tropical convection. Globally, the two modeling approaches project a similar future increase in mean rainfall, especially across the Inter-Tropical Convergence Zone (ITCZ) and at high latitudes, but over land, SPCAM predicts a smaller mean change than CAM. Changes in high-order statistics are similar at high latitudes in the two models but diverge at lower latitudes. In the tropics, SPCAM projects a large intensification of moderate and extreme rain rates in regions of organized convection associated with the Madden-Julian Oscillation, ITCZ, monsoons, and tropical waves. In contrast, this signal is missing in all versions of CAM, which are found to be prone to predicting increases in the amount but not intensity of moderate rates. Predictions from SPCAM exhibit a scale-insensitive behavior with little dependence on horizontal resolution for extreme rates, while lower resolution (~2°) versions of CAM are not able to capture the response simulated with higher resolution (~1°). Furthermore, moderate rain rates analyzed by the "amount mode" and "amount median" are found to be especially telling as a diagnostic for evaluating climate model performance and tracing future changes in rainfall statistics to tropical wave modes in SPCAM.

  7. Application of the MacCormack scheme to overland flow routing for high-spatial resolution distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Zhang, Ling; Nan, Zhuotong; Liang, Xu; Xu, Yi; Hernández, Felipe; Li, Lianxia

    2018-03-01

    Although process-based distributed hydrological models (PDHMs) have evolved rapidly over the last few decades, their extensive application is still challenged by computational expense. This study attempted, for the first time, to apply the numerically efficient MacCormack algorithm to overland flow routing in a representative high-spatial-resolution PDHM, i.e., the distributed hydrology-soil-vegetation model (DHSVM), in order to improve its computational efficiency. The analytical verification indicates that both the semi and full versions of the MacCormack scheme exhibit robust numerical stability and are more computationally efficient than the conventional explicit linear scheme. The full version outperforms the semi version in terms of simulation accuracy when the same time step is adopted. The semi-MacCormack scheme was implemented into DHSVM (version 3.1.2) to solve the kinematic wave equations for overland flow routing. The performance and practicality of the enhanced DHSVM-MacCormack model were assessed by performing two groups of modeling experiments in the Mercer Creek watershed, a small urban catchment near Bellevue, Washington. The experiments show that DHSVM-MacCormack can considerably improve computational efficiency without compromising the simulation accuracy of the original DHSVM model. More specifically, with the same computational environment and model settings, the computational time required by DHSVM-MacCormack was reduced to several dozen minutes for a simulation period of three months (in contrast with a day and a half for the original DHSVM model) without noticeable sacrifice of accuracy. The MacCormack scheme proves applicable to overland flow routing in DHSVM, which implies that it can be coupled into other PDHMs for watershed routing, either to significantly improve their computational efficiency or to make kinematic wave routing computationally feasible for high-resolution modeling.

  8. GEOS S2S-2_1: GMAO's New High Resolution Seasonal Prediction System

    NASA Technical Reports Server (NTRS)

    Molod, Andrea; Akella, Santha; Andrews, Lauren; Barahona, Donifan; Borovikov, Anna; Chang, Yehui; Cullather, Richard; Hackert, Eric; Kovach, Robin; Koster, Randal; hide

    2017-01-01

    A new version of the modeling and analysis system used to produce sub-seasonal to seasonal forecasts has just been released by the NASA Goddard Global Modeling and Assimilation Office. The new version runs at higher atmospheric resolution (approximately 1/2 degree globally), contains a substantially improved model description of the cryosphere, and includes additional interactive earth system model components (aerosol model). In addition, the ocean data assimilation system has been replaced with a Local Ensemble Transform Kalman Filter. Here we describe the new system, along with the plans for the future version (GEOS S2S-3_0), which will include a higher resolution ocean model and more interactive earth system model components (interactive vegetation, biomass burning from fires). We will also present results from a free-running coupled simulation with the new system and results from a series of retrospective seasonal forecasts. Results from retrospective forecasts show significant improvements in surface temperatures over much of the northern hemisphere and a much improved prediction of sea ice extent in both hemispheres. The precipitation forecast skill is comparable to previous S2S systems, and the only tradeoff is an increased double ITCZ, which is expected as we go to higher atmospheric resolution.

  9. Single image super-resolution based on compressive sensing and improved TV minimization sparse recovery

    NASA Astrophysics Data System (ADS)

    Vishnukumar, S.; Wilscy, M.

    2017-12-01

    In this paper, we propose a single-image Super-Resolution (SR) method based on Compressive Sensing (CS) and Improved Total Variation (TV) Minimization Sparse Recovery. In the CS framework, the low-resolution (LR) image is treated as a compressed version of the high-resolution (HR) image. Dictionary training and sparse recovery are the two phases of the method. The K-Singular Value Decomposition (K-SVD) method is used for dictionary training, and the dictionary represents HR image patches in a sparse manner. Here, only the interpolated version of the LR image is used for training, thereby exploiting the structural self-similarity inherent in the LR image. In the sparse recovery phase, the sparse representation coefficients of LR image patches with respect to the trained dictionary are derived using the Improved TV Minimization method. The HR image can then be reconstructed as the linear combination of the dictionary and the sparse coefficients. The experimental results show that the proposed method gives better results quantitatively as well as qualitatively on both natural and remote sensing images. The reconstructed images have better visual quality since edges and other sharp details are preserved.
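The sparse recovery phase above uses an Improved TV Minimization method. As a generic stand-in for recovering sparse coefficients with respect to a trained dictionary, here is a minimal orthogonal matching pursuit (OMP) sketch; OMP and the random dictionary are assumptions for illustration, not the authors' method or their K-SVD dictionary.

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal Matching Pursuit: greedily select k dictionary atoms,
    least-squares fitting y on the selected atoms at each step."""
    residual = y.astype(float).copy()
    support = []
    coef = np.zeros(D.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))  # best-matching atom
        support.append(j)
        sol, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ sol
    coef[support] = sol
    return coef

rng = np.random.default_rng(0)
D = rng.standard_normal((32, 64))
D /= np.linalg.norm(D, axis=0)            # unit-norm atoms
true = np.zeros(64)
true[[3, 17]] = [2.0, -1.5]               # a 2-sparse ground truth
y = D @ true
rec = omp(D, y, k=2)
print(np.nonzero(rec)[0])  # typically recovers atoms 3 and 17
```

The SR reconstruction step then corresponds to multiplying the HR dictionary by the recovered coefficients, patch by patch.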

  10. GEOS S2S-2_1: The GMAO new high resolution Seasonal Prediction System

    NASA Astrophysics Data System (ADS)

    Molod, A.; Vikhliaev, Y. V.; Hackert, E. C.; Kovach, R. M.; Zhao, B.; Cullather, R. I.; Marshak, J.; Borovikov, A.; Li, Z.; Barahona, D.; Andrews, L. C.; Chang, Y.; Schubert, S. D.; Koster, R. D.; Suarez, M.; Akella, S.

    2017-12-01

    A new version of the modeling and analysis system used to produce subseasonal to seasonal forecasts has just been released by the NASA/Goddard Global Modeling and Assimilation Office. The new version runs at higher atmospheric resolution (approximately 1/2 degree globally), contains a substantially improved model description of the cryosphere, and includes additional interactive earth system model components (aerosol model). In addition, the ocean data assimilation system has been replaced with a Local Ensemble Transform Kalman Filter. Here we describe the new system, along with the plans for the future version (GEOS S2S-3_0), which will include a higher resolution ocean model and more interactive earth system model components (interactive vegetation, biomass burning from fires). We will also present results from a free-running coupled simulation with the new system and results from a series of retrospective seasonal forecasts. Results from retrospective forecasts show significant improvements in surface temperatures over much of the northern hemisphere and a much improved prediction of sea ice extent in both hemispheres. The precipitation forecast skill is comparable to previous S2S systems, and the only tradeoff is an increased "double ITCZ", which is expected as we go to higher atmospheric resolution.

  11. The dependence on atmospheric resolution of ENSO and related East Asian-western North Pacific summer climate variability in a coupled model

    NASA Astrophysics Data System (ADS)

    Liu, Bo; Zhao, Guijie; Huang, Gang; Wang, Pengfei; Yan, Bangliang

    2017-08-01

    The authors present results for El Niño-Southern Oscillation (ENSO) and East Asian-western North Pacific climate variability simulated in a new, higher-resolution version of a coupled model (ICM.V2) developed at the Center for Monsoon System Research of the Institute of Atmospheric Physics (CMSR, IAP), Chinese Academy of Sciences. The analyses are based on the last 100 years of output from a 1000-year simulation. Results are compared to an earlier version of the same coupled model (ICM.V1), reanalysis, and observations. The two versions of ICM have similar physics but different atmospheric resolution. The simulated climatological mean states show marked improvement over many regions in ICM.V2 compared to ICM.V1, especially in the tropics. The common cold-tongue bias has been reduced, and the warm biases along the ocean boundaries have improved as well. With improved simulation of ENSO, including its period and strength, the ENSO-related western North Pacific summer climate variability becomes more realistic compared to the observations. The simulated East Asian summer monsoon anomalies in the El Niño decaying summer are substantially more realistic in ICM.V2, which might be related to a better simulation of the Indo-Pacific Ocean capacitor (IPOC) effect and the Pacific decadal oscillation (PDO).

  12. Image quality (IQ) guided multispectral image compression

    NASA Astrophysics Data System (ADS)

    Zheng, Yufeng; Chen, Genshe; Wang, Zhonghai; Blasch, Erik

    2016-05-01

    Image compression is necessary for data transport, saving both transfer time and storage space. In this paper, we focus our discussion on lossy compression. There are many standard image formats and corresponding compression algorithms, for example JPEG (DCT, discrete cosine transform), JPEG 2000 (DWT, discrete wavelet transform), BPG (better portable graphics), and TIFF (LZW, Lempel-Ziv-Welch). The image quality (IQ) of the decompressed image is measured by numerical metrics such as root mean square error (RMSE), peak signal-to-noise ratio (PSNR), and the Structural Similarity (SSIM) index. Given an image and a specified IQ, we investigate how to select a compression method and its parameters to achieve the expected compression. Our scenario consists of three steps. The first step is to compress a set of images of interest with varying parameters and compute their IQs for each compression method. The second step is to create several regression models per compression method by analyzing the IQ measurements versus compression parameters over a number of compressed images. The third step is to compress the given image at the specified IQ using the compression method (JPEG, JPEG 2000, BPG, or TIFF) selected according to the regression models. The IQ may be specified as a compression ratio (e.g., 100), in which case we select the compression method with the highest IQ (SSIM or PSNR); or it may be specified as an IQ metric (e.g., SSIM = 0.8, or PSNR = 50), in which case we select the compression method with the highest compression ratio. Our experiments on thermal (long-wave infrared) grayscale images showed very promising results.
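The three-step scenario can be sketched end to end. In this minimal numpy sketch, uniform quantization is an assumed stand-in for a real codec (JPEG, JPEG 2000, BPG, or TIFF/LZW), and PSNR stands in for the IQ metric; the specific numbers are illustrative only.

```python
import numpy as np

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio (dB) between two images."""
    rmse = np.sqrt(np.mean((a - b) ** 2))
    return 20 * np.log10(peak / rmse)

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64)).astype(float)

# Step 1: "compress" at several settings and record the achieved IQ
# (uniform quantization stands in for a real codec here)
steps = np.array([2.0, 4.0, 8.0, 16.0, 32.0])
iqs = np.array([psnr(img, np.round(img / s) * s) for s in steps])

# Step 2: regress IQ against the (log) compression setting
slope, intercept = np.polyfit(np.log2(steps), iqs, 1)

# Step 3: invert the regression to pick the setting for a target IQ
target = 40.0
step_for_target = 2.0 ** ((target - intercept) / slope)
print(round(slope, 1))  # close to the theoretical -6 dB per doubling
```

For quantization noise, PSNR drops roughly 6 dB each time the step doubles, which is why a linear regression against the log of the setting fits well; a real codec would need its own regression per method, as the abstract describes.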

  13. Climate Sensitivity of the Community Climate System Model, Version 4

    DOE PAGES

    Bitz, Cecilia M.; Shell, K. M.; Gent, P. R.; ...

    2012-05-01

    Equilibrium climate sensitivity of the Community Climate System Model Version 4 (CCSM4) is 3.20°C for 1° horizontal resolution in each component. This is about a half degree Celsius higher than in the previous version (CCSM3). The transient climate sensitivity of CCSM4 at 1° resolution is 1.72°C, which is about 0.2°C higher than in CCSM3. These higher climate sensitivities in CCSM4 cannot be explained by the change to a preindustrial baseline climate. We use the radiative kernel technique to show that from CCSM3 to CCSM4, the global mean lapse-rate feedback declines in magnitude, and the shortwave cloud feedback increases. These two warming effects are partially canceled by cooling due to slight decreases in the global mean water-vapor feedback and longwave cloud feedback from CCSM3 to CCSM4. A new formulation of the mixed-layer, slab ocean model in CCSM4 attempts to reproduce the SST and sea ice climatology from an integration with a full-depth ocean, and it is integrated with a dynamic sea ice model. These new features allow an isolation of the influence of ocean dynamical changes on the climate response when comparing integrations with the slab ocean and full-depth ocean. The transient climate response of the full-depth ocean version is 0.54 of the equilibrium climate sensitivity when estimated with the new slab ocean model version for both CCSM3 and CCSM4. We argue the ratio is the same in both versions because they have about the same zonal mean pattern of change in ocean surface heat flux, which broadly resembles the zonal mean pattern of net feedback strength.

  14. Comparison of motor diagnoses by Chicago Classification versions 2.0 and 3.0 on esophageal high-resolution manometry.

    PubMed

    Patel, A; Cassell, B; Sainani, N; Wang, D; Shahid, B; Bennett, M; Mirza, F A; Munigala, S; Gyawali, C P

    2017-07-01

    The Chicago Classification (CC) uses high-resolution manometry (HRM) software tools to designate esophageal motor diagnoses. We evaluated changes in diagnostic designations between two CC versions, and determined motor patterns not identified by either version. In this observational cohort study of consecutive patients undergoing esophageal HRM over a 6-year period, proportions meeting CC 2.0 and 3.0 criteria were segregated into esophageal outflow obstruction, hypermotility, and hypomotility disorders. Contraction wave abnormalities (CWA), and 'normal' cohorts were recorded. Symptom burden was characterized using dominant symptom intensity and global symptom severity. Motor diagnoses, presenting symptoms, and symptom burden were compared between CC 2.0 and 3.0, and in cohorts not meeting CC diagnoses. Of 2569 eligible studies, 49.9% met CC 2.0 criteria, but only 40.3% met CC 3.0 criteria (P<.0001). Between CC 2.0 and 3.0, 82.8% of diagnoses were concordant. Discordance resulted from decreasing proportions of hypermotility (4.4%) and hypomotility (9.0%) disorders, and increase in 'normal' designations (13.0%); esophageal outflow obstruction showed the least variation between CC versions. Symptom burden was higher with CC 3.0 diagnoses (P≤.005) but not with CC 2.0 diagnoses (P≥.1). Within 'normal' cohorts for both CC versions, CWA were associated with higher likelihood of esophageal symptoms, especially dysphagia, regurgitation, and heartburn, compared to truly normal studies (P≤.02 for each comparison). Despite lower sensitivity, CC 3.0 identifies esophageal motor disorders with higher symptom burden compared to CC 2.0. CWA, which are associated with both transit and perceptive symptoms, are not well identified by either version. © 2017 John Wiley & Sons Ltd.

  15. SeaWiFS technical report series. Volume 15: The simulated SeaWiFS data set, version 2

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Gregg, Watson W.; Patt, Frederick S.; Woodward, Robert H.

    1994-01-01

    This document describes the second version of the simulated SeaWiFS data set. A realistic simulated data set is essential for mission readiness preparations and can potentially assist in all phases of ground support for a future mission. The second version improves on the first primarily through additional realism and complexity. This version incorporates a representation of virtually every aspect of the flight mission. Thus, it provides a high-fidelity data set for testing several aspects of the ground system, including data acquisition, data processing, data transfers, calibration and validation, quality control, and mission operations. The data set is constructed for a seven-day period, 25-31 March 1994. Specific features of the data set include Global Area Coverage (GAC), recorded Local Area Coverage (LAC), and real-time High Resolution Picture Transmission (HRPT) data for the seven-day period. A realistic orbit, which is propagated using a Brouwer-Lyddane model with drag, is used to simulate orbit positions. The simulated data correspond to the command schedule based on the orbit for this seven-day period. The data set includes total (at-satellite) radiances not only for ocean but also for land, clouds, and ice. The simulation also utilizes a high-resolution land-sea mask. It includes the April 1993 SeaWiFS spectral responses and sensor saturation responses. The simulation is formatted according to July 1993 onboard data structures, which include corresponding telemetry (instrument and spacecraft) data. The methods are described and some examples of the output are given. The instrument response functions made available in April 1993 have been used to produce the Version 2 simulated data. These response functions will change as part of the sensor improvements initiated in July-August 1993.

  16. Update of global TC simulations using a variable resolution non-hydrostatic model

    NASA Astrophysics Data System (ADS)

    Park, S. H.

    2017-12-01

    Tropical cyclone (TC) forecasts were simulated using variable-resolution meshes in MPAS during the summer of 2017. Two physics suites were tested to explore the performance and bias of each for TC forecasting. A WRF physics suite was selected based on experience with weather forecasting, and CAM (Community Atmosphere Model) physics was taken from an AMIP-type climate simulation. Based on last year's results from the CAM5 physical parameterization package and a comparison with WRF physics, we investigated an intensity bias using an updated version of the CAM physics (CAM6). We also compared these results with a coupled version of the TC simulations. In this talk, TC structure will be compared, especially around the boundary layer, to investigate the relationship between TC intensity and the different physics packages.

  17. Version 2 Goddard Satellite-Based Surface Turbulent Fluxes (GSSTF2)

    NASA Technical Reports Server (NTRS)

    Chou, Shu-Hsien; Nelkin, Eric; Ardizzone, Joe; Atlas, Robert M.; Shie, Chung-Lin; Starr, David O'C. (Technical Monitor)

    2002-01-01

    Information on the turbulent fluxes of momentum, moisture, and heat at the air-sea interface is essential in improving model simulations of climate variations and in climate studies. We have derived a 13.5-year (July 1987-December 2000) dataset of daily surface turbulent fluxes over the global oceans from Special Sensor Microwave/Imager (SSM/I) radiance measurements. This dataset, version 2 Goddard Satellite-based Surface Turbulent Fluxes (GSSTF2), has a spatial resolution of 1 degree x 1 degree latitude-longitude and a temporal resolution of 1 day. Turbulent fluxes are derived from the SSM/I surface winds and surface air humidity, as well as the 2-m air and sea surface temperatures (SST) of the NCEP/NCAR reanalysis, using a bulk aerodynamic algorithm based on surface layer similarity theory.
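The bulk aerodynamic formulas underlying such flux datasets take a standard form. The sketch below assumes constant neutral exchange coefficients for illustration; GSSTF2 itself derives stability-dependent coefficients from surface-layer similarity theory, and all numeric values here are illustrative.

```python
# Minimal bulk aerodynamic flux sketch. Constant exchange coefficients
# are an assumption for illustration only.
RHO = 1.2       # air density, kg m^-3
CP = 1004.0     # specific heat of air, J kg^-1 K^-1
LV = 2.5e6      # latent heat of vaporization, J kg^-1
CD, CH, CE = 1.2e-3, 1.1e-3, 1.2e-3   # assumed neutral exchange coefficients

def bulk_fluxes(u10, t_sea, t_air, q_sea, q_air):
    """Momentum (N m^-2), sensible and latent heat fluxes (W m^-2) from
    10-m wind speed (m/s), temperatures (K) and specific humidities (kg/kg)."""
    tau = RHO * CD * u10 ** 2                       # wind stress
    sensible = RHO * CP * CH * u10 * (t_sea - t_air)
    latent = RHO * LV * CE * u10 * (q_sea - q_air)
    return tau, sensible, latent

tau, sh, lh = bulk_fluxes(u10=8.0, t_sea=300.0, t_air=298.5,
                          q_sea=0.022, q_air=0.016)
print(round(tau, 3), round(sh, 1), round(lh, 1))  # -> 0.092 15.9 172.8
```

In the actual algorithm, the exchange coefficients vary with wind speed and near-surface stability rather than being constants.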

  18. High-resolution dynamic downscaling of CMIP5 output over the Tropical Andes

    NASA Astrophysics Data System (ADS)

    Reichler, Thomas; Andrade, Marcos; Ohara, Noriaki

    2015-04-01

    Our project is targeted towards making robust predictions of future changes in climate over the tropical part of the South American Andes. This goal is challenging, since tropical lowlands, steep mountains, and snow covered subarctic surfaces meet over relatively short distances, leading to distinct climate regimes within the same domain and pronounced spatial gradients in virtually every climate quantity. We use an innovative approach to solve this problem, including several quadruple nested versions of WRF, a systematic validation strategy to find the version of WRF that best fits our study region, spatial resolutions at the kilometer scale, 20-year-long simulation periods, and bias-corrected output from various CMIP5 simulations that also include the multi-model mean of all CMIP5 models. We show that the simulated changes in climate are consistent with the results from the global climate models and also consistent with two different versions of WRF. We also discuss the expected changes in snow and ice, derived from off-line coupling the regional simulations to a carefully calibrated snow and ice model.

  19. The Impact of Horizontal and Temporal Resolution on Convection and Precipitation with High-Resolution GEOS-5

    NASA Technical Reports Server (NTRS)

    Putman, William P.

    2012-01-01

    Using a high-resolution non-hydrostatic version of GEOS-5 with the cubed-sphere finite-volume dynamical core, the impact of spatial and temporal resolution on cloud properties will be evaluated. There are indications from examining convective cluster development in high-resolution GEOS-5 forecasts that the temporal resolution within the model may play as significant a role as horizontal resolution. Comparing modeled convective cloud clusters against satellite observations of brightness temperature, we have found that improved temporal resolution in GEOS-5 accounts for a significant portion of the improvements in the statistical distribution of convective cloud clusters. Using satellite simulators in GEOS-5, we will compare the cloud optical properties of GEOS-5 at various spatial and temporal resolutions with those observed from MODIS. The potential impact of these results on tropical cyclone formation and intensity will be examined as well.

  20. Supernova Cosmology Project

    Science.gov Websites

    (note that the arXiv.org version lacks the full-resolution figures) The SCP "Union" SN Ia Matrix Description Covariance Matrix with Systematics Description Full Table of All SNe Description

  1. Mars-solar wind interaction: LatHyS, an improved parallel 3-D multispecies hybrid model

    NASA Astrophysics Data System (ADS)

    Modolo, Ronan; Hess, Sebastien; Mancini, Marco; Leblanc, Francois; Chaufray, Jean-Yves; Brain, David; Leclercq, Ludivine; Esteban-Hernández, Rosa; Chanteur, Gerard; Weill, Philippe; González-Galindo, Francisco; Forget, Francois; Yagi, Manabu; Mazelle, Christian

    2016-07-01

    In order to better represent the Mars-solar wind interaction, we present an unprecedented model achieving spatial resolution down to 50 km, a so far unexplored resolution for global kinetic models of the Martian ionized environment. Such resolution approaches the ionospheric plasma scale height. In practice, the model is derived from a first version described in Modolo et al. (2005). An important parallelization effort has been conducted and is presented here. A better description of the ionosphere was also implemented, including ionospheric chemistry, electrical conductivities, and a drag force modeling ion-neutral collisions in the ionosphere. This new version of the code, named LatHyS (Latmos Hybrid Simulation), is used here to characterize the impact of various spatial resolutions on simulation results. In addition, and following a global model challenge effort, we present the results of simulation runs for three cases, which allow us to address the effect of the suprathermal corona and of solar EUV activity on the magnetospheric plasma boundaries and on the global escape. Simulation results showed that global patterns are relatively similar across the different spatial-resolution runs, but the finest-grid runs provide a better representation of the ionosphere and display more details of the planetary plasma dynamics. Simulation results suggest that a significant fraction of escaping O+ ions originates from below 1200 km altitude.

  2. Impact of numerical choices on water conservation in the E3SM Atmosphere Model version 1 (EAMv1)

    NASA Astrophysics Data System (ADS)

    Zhang, Kai; Rasch, Philip J.; Taylor, Mark A.; Wan, Hui; Leung, Ruby; Ma, Po-Lun; Golaz, Jean-Christophe; Wolfe, Jon; Lin, Wuyin; Singh, Balwinder; Burrows, Susannah; Yoon, Jin-Ho; Wang, Hailong; Qian, Yun; Tang, Qi; Caldwell, Peter; Xie, Shaocheng

    2018-06-01

    The conservation of total water is an important numerical feature for global Earth system models. Even small conservation problems in the water budget can lead to systematic errors in century-long simulations. This study quantifies and reduces various sources of water conservation error in the atmosphere component of the Energy Exascale Earth System Model. Several sources of water conservation error have been identified during the development of the version 1 (V1) model. The largest errors result from the numerical coupling between the resolved dynamics and the parameterized sub-grid physics. A hybrid coupling using different methods for fluid dynamics and tracer transport provides a reduction of water conservation error by a factor of 50 at 1° horizontal resolution as well as consistent improvements at other resolutions. The second largest error source is the use of an overly simplified relationship between the surface moisture flux and latent heat flux at the interface between the host model and the turbulence parameterization. This error can be prevented by applying the same (correct) relationship throughout the entire model. Two additional types of conservation error that result from correcting the surface moisture flux and clipping negative water concentrations can be avoided by using mass-conserving fixers. With all four error sources addressed, the water conservation error in the V1 model becomes negligible and insensitive to the horizontal resolution. The associated changes in the long-term statistics of the main atmospheric features are small. A sensitivity analysis is carried out to show that the magnitudes of the conservation errors in early V1 versions decrease strongly with temporal resolution but increase with horizontal resolution. The increased vertical resolution in V1 results in a very thin model layer at the Earth's surface, which amplifies the conservation error associated with the surface moisture flux correction. 
We note that for some of the identified error sources, the proposed fixers are remedies rather than solutions to the problems at their roots. Future improvements in time integration would be beneficial for V1.
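A mass-conserving fixer of the kind described for negative water concentrations can be sketched as follows. The rescaling strategy shown (clip, then renormalize the column total) is an assumption for illustration, not necessarily the exact EAMv1 formulation.

```python
import numpy as np

def clip_and_conserve(q, dp):
    """Remove negative water concentrations in a column while conserving
    the pressure-weighted column total (a simple 'mass fixer' sketch).
    q  -- layer mixing ratios (kg/kg), may contain spurious negatives
    dp -- layer pressure thicknesses (Pa), the mass weights"""
    q = np.asarray(q, dtype=float)
    dp = np.asarray(dp, dtype=float)
    total = np.sum(q * dp)              # column water before fixing
    q_clipped = np.maximum(q, 0.0)
    new_total = np.sum(q_clipped * dp)
    # rescale the remaining positive values so the column total is unchanged
    if new_total > 0:
        q_clipped *= total / new_total
    return q_clipped

q = np.array([3e-3, -1e-4, 2e-3])       # middle layer went slightly negative
dp = np.array([5000.0, 5000.0, 5000.0])
fixed = clip_and_conserve(q, dp)
print(np.all(fixed >= 0), np.isclose(np.sum(fixed * dp), np.sum(q * dp)))
# -> True True
```

Naive clipping alone (the `np.maximum` step without the rescaling) is exactly the kind of conservation error source the abstract describes: it silently creates water mass.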

  3. MATISSE a web-based tool to access, visualize and analyze high resolution minor bodies observation

    NASA Astrophysics Data System (ADS)

    Zinzi, Angelo; Capria, Maria Teresa; Palomba, Ernesto; Antonelli, Lucio Angelo; Giommi, Paolo

    2016-07-01

    In recent years, planetary exploration missions have acquired data from minor bodies (i.e., dwarf planets, asteroids, and comets) at a level of detail never reached before. Since these objects often have very irregular shapes (as in the case of comet 67P/Churyumov-Gerasimenko, target of the ESA Rosetta mission), "classical" two-dimensional projections of observations are difficult to interpret. With the aim of providing the scientific community a tool to access, visualize, and analyze data in a new way, the ASI Science Data Center started to develop MATISSE (Multi-purposed Advanced Tool for the Instruments for the Solar System Exploration - http://tools.asdc.asi.it/matisse.jsp) in late 2012. This tool allows 3D web-based visualization of data acquired by planetary exploration missions: the output can either be the straightforward projection of the selected observation over the shape model of the target body or the visualization of a higher-order product (average/mosaic, difference, ratio, RGB) computed directly online with MATISSE. Standard outputs of the tool also comprise downloadable files to be used with GIS software (GeoTIFF and ENVI format) and very high-resolution 3D files to be viewed with the free software ParaView. During this period, the first and most frequent use of the tool has been the visualization of data acquired by the VIRTIS-M instrument onboard Rosetta observing comet 67P. The success of this task, well represented by the good number of published works that used images made with MATISSE, confirmed the need for a different approach to correctly visualize data coming from irregularly shaped bodies. In the near future, the datasets available to MATISSE are planned to be extended, starting with the addition of VIR-Dawn observations of both Vesta and Ceres, and also using standard protocols to access data stored in external repositories, such as NASA ODE and the Planetary VO.

  4. CT Scans of Cores Metadata, Barrow, Alaska 2015

    DOE Data Explorer

    Katie McKnight; Tim Kneafsey; Craig Ulrich

    2015-03-11

    Individual ice cores were collected from the Barrow Environmental Observatory in Barrow, Alaska, throughout 2013 and 2014. Cores were drilled along different transects to sample polygonal features (i.e., the trough, center, and rim of high-, transitional-, and low-center polygons). Most cores were drilled to around 1 meter in depth, and a few deep cores were drilled to around 3 meters. Three-dimensional images of the frozen cores were constructed using a medical X-ray computed tomography (CT) scanner. The TIFF files can be loaded into ImageJ (an open-source imaging application) to examine soil structure and densities within each core.

  5. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia system (10,240 Intel Itanium 2 processors). The simulation assessed the performance of the cooling system, identified deficiencies, and recommended modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools, to generate a geometric model of the computer room, and the OVERFLOW-2 code, for fluid and thermal simulation. This state-of-the-art technology can easily be extended to provide a general capability for air flow analyses of any modern computer room. Columbia_CFD_black.tiff

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Temple, Brian Allen; Armstrong, Jerawan Chudoung

    This document is a mid-year report on a deliverable for the PYTHON Radiography Analysis Tool (PyRAT) for project LANL12-RS-107J in FY15. The deliverable, number 2 in the work package, is titled "Add the ability to read in more types of image file formats in PyRAT". Currently, PyRAT can only read uncompressed TIFF (.tif) files. It is planned to expand the file formats that PyRAT can read, making it easier to use in more situations. The file formats added include JPEG (.jpg/.jpeg), PNG, and formatted ASCII files.
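A first step in supporting multiple image formats is usually dispatching on the file's leading "magic" bytes. The helper below is hypothetical (not part of PyRAT) and covers the binary formats named in the report; formatted ASCII files have no magic number and fall through to the default.

```python
# Hypothetical format sniffer (not part of PyRAT): identify an image
# file's format from its leading magic bytes before dispatching to a
# format-specific reader.
MAGIC = [
    (b"II*\x00", "tiff"),            # little-endian TIFF
    (b"MM\x00*", "tiff"),            # big-endian TIFF
    (b"\x89PNG\r\n\x1a\n", "png"),
    (b"\xff\xd8\xff", "jpeg"),
]

def sniff_format(data: bytes) -> str:
    """Return the image format implied by the file's first bytes."""
    for magic, name in MAGIC:
        if data.startswith(magic):
            return name
    return "unknown"                 # e.g. a formatted ASCII file

print(sniff_format(b"II*\x00" + b"\x00" * 8))   # -> tiff
print(sniff_format(b"\x89PNG\r\n\x1a\n...."))   # -> png
```

Sniffing by content rather than by extension avoids misreading, say, a `.tif` file that is actually JPEG-compressed data.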

  7. KB3D Reference Manual. Version 1.a

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Siminiceanu, Radu; Carreno, Victor A.; Dowek, Gilles

    2005-01-01

    This paper is a reference manual describing the implementation of the KB3D conflict detection and resolution algorithm. The algorithm has been implemented in the Java and C++ programming languages. The reference manual gives a short overview of the detection and resolution functions, the structural implementation of the program, inputs and outputs to the program, and describes how the program is used. Inputs to the program can be rectangular coordinates or geodesic coordinates. The reference manual also gives examples of conflict scenarios and the resolution outputs the program produces.

  8. A Look Inside Hurricane Alma

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Hurricane season in the eastern Pacific started off with a whimper late last month as Alma, a Category 2 hurricane, slowly made its way up the coast of Baja California, packing sustained winds of 110 miles per hour and gusts of 135 miles per hour. The above image of the hurricane was acquired on May 29, 2002, and displays the rainfall rates occurring within the storm. Click the image above to see an animated data visualization (3.8 MB) of the interior of Hurricane Alma. The images of the clouds seen at the beginning of the movie were retrieved from the National Oceanic and Atmospheric Administration's (NOAA's) Geostationary Operational Environmental Satellite (GOES) network. As the movie continues, the clouds are peeled away to reveal an image of rainfall levels in the hurricane. The rainfall data were obtained by the Precipitation Radar aboard NASA's Tropical Rainfall Measuring Mission (TRMM) satellite. The Precipitation Radar bounces radio waves off clouds to retrieve a reading of the number of large, rain-sized droplets within the clouds. Using these data, scientists can tell how much precipitation is occurring within and beneath a hurricane. In the movie, yellow denotes areas where 0.5 inches of rain is falling per hour, green denotes 1 inch per hour, and red denotes over 2 inches per hour. (Please note that high resolution still images of Hurricane Alma are available from NASA Visible Earth in TIFF format.) Image and animation courtesy Lori Perkins, NASA Goddard Space Flight Center Scientific Visualization Studio

  9. A web-based subsetting service for regional scale MODIS land products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SanthanaVannan, Suresh K; Cook, Robert B; Holladay, Susan K

    2009-12-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) sensor has provided valuable information on various aspects of the Earth System since March 2000. The spectral, spatial, and temporal characteristics of MODIS products have made them an important data source for analyzing key science questions relating to Earth System processes at regional, continental, and global scales. The size of the MODIS products and their native HDF-EOS format are not optimal for use in field investigations at individual sites (100 km x 100 km or smaller). In order to make MODIS data readily accessible for field investigations, the NASA-funded Distributed Active Archive Center (DAAC) for Biogeochemical Dynamics at Oak Ridge National Laboratory (ORNL) has developed an online system that provides MODIS land products in an easy-to-use format and in file sizes more appropriate to field research. This system provides MODIS land products in a nonproprietary comma-delimited ASCII format and in GIS-compatible formats (GeoTIFF and ASCII grid). Web-based visualization tools are also available as part of this system and provide a quick snapshot of the data. Quality control tools and a multitude of data delivery options are available to meet the demands of various user communities. This paper describes the important features and design goals of the system, particularly in the context of data archive and distribution for regional-scale analysis. The paper also discusses the ways in which data from this system can be used for validation, data intercomparison, and modeling efforts.
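    The GIS-compatible ASCII grid format mentioned above (header lines followed by rows of cell values, in the common ESRI convention) is simple enough to parse without GIS libraries. A hedged stdlib sketch; the header keywords accepted here follow the usual ESRI layout, which may differ from the ORNL DAAC's output in detail:

```python
def parse_esri_ascii_grid(text):
    """Split an ESRI-style ASCII grid into a header dict and rows of floats.

    Header lines are 'keyword value' pairs (ncols, nrows, xllcorner,
    yllcorner, cellsize, NODATA_value); everything after them is cell data.
    """
    keywords = {"ncols", "nrows", "xllcorner", "yllcorner",
                "cellsize", "nodata_value"}
    lines = text.strip().splitlines()
    header, i = {}, 0
    while i < len(lines):
        parts = lines[i].split()
        if len(parts) == 2 and parts[0].lower() in keywords:
            header[parts[0].lower()] = float(parts[1])
            i += 1
        else:
            break
    rows = [[float(v) for v in line.split()] for line in lines[i:]]
    return header, rows
```

The nonproprietary plain-text delivery formats are what make this kind of lightweight, library-free consumption possible for field researchers.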

  10. Comparison of Open Source Compression Algorithms on Vhr Remote Sensing Images for Efficient Storage Hierarchy

    NASA Astrophysics Data System (ADS)

    Akoguz, A.; Bozkurt, S.; Gozutok, A. A.; Alp, G.; Turan, E. G.; Bogaz, M.; Kent, S.

    2016-06-01

    The high resolution of modern satellite imagery brings with it a fundamental problem: the large volume of telemetry data that must be stored after the downlink operation. Post-processing and image enhancement steps applied after acquisition increase the file sizes further, making the data harder to store and slower to transmit from one site to another. Compressing both the raw data and the various levels of processed data is therefore a necessity for archiving stations. The lossless compression algorithms examined in this study aim to reduce file size without any loss of the data holding spectral information. To this end, well-known open-source programs supporting the relevant algorithms were applied to processed GeoTIFF images from Airbus Defence & Space's SPOT 6 and SPOT 7 satellites (1.5 m GSD), acquired and stored by the ITU Center for Satellite Communications and Remote Sensing (ITU CSCRS). The algorithms tested were Lempel-Ziv-Welch (LZW), the Lempel-Ziv-Markov chain algorithm (LZMA and LZMA2), Lempel-Ziv-Oberhumer (LZO), Deflate and Deflate64, Prediction by Partial Matching (PPMd or PPM2), and the Burrows-Wheeler Transform (BWT), compared in terms of how much of the image data each could compress over the sample datasets while ensuring lossless reconstruction.
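    Of the algorithms listed, Deflate, LZMA, and a BWT-based codec (bzip2) happen to be available in the Python standard library, so the shape of such a comparison can be sketched without the original tooling (LZW, LZO, Deflate64, and PPMd are not in the stdlib and are omitted here). Ratios on real SPOT imagery would of course differ from any toy input:

```python
import bz2
import lzma
import zlib

def compare_lossless(data: bytes):
    """Compress with three stdlib codecs; return (name, ratio, lossless) tuples.

    zlib implements Deflate, lzma the LZMA algorithm, and bz2 a
    Burrows-Wheeler-based codec; each round trip is decompressed and
    compared so every result is verified to be truly lossless.
    """
    codecs = {
        "deflate": (zlib.compress, zlib.decompress),
        "lzma": (lzma.compress, lzma.decompress),
        "bwt/bzip2": (bz2.compress, bz2.decompress),
    }
    results = []
    for name, (comp, decomp) in codecs.items():
        packed = comp(data)
        results.append((name, len(packed) / len(data), decomp(packed) == data))
    return results
```

The round-trip equality check is the operational definition of "lossless" used in the study: compressed size may vary by codec, but the reconstructed spectral data must be bit-identical.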

  11. View From Within 'Perseverance Valley' on Mars (Enhanced Color)

    NASA Image and Video Library

    2017-12-06

    This enhanced-color view from within "Perseverance Valley," on the inner slope of the western rim of Endeavour Crater on Mars, includes wheel tracks from the Opportunity rover's descent of the valley. The Panoramic Camera (Pancam) on Opportunity's mast took the component images of the scene during the period Sept. 4 through Oct. 6, 2017, corresponding to sols (Martian days) 4840 through 4871 of the rover's work on Mars. Perseverance Valley is a system of shallow troughs descending eastward about the length of two football fields from the crest of the crater rim to the floor of the crater. This panorama spans from northeast on the left to northwest on the right, including portions of the crater floor (eastward) in the left half and of the rim (westward) in the right half. Opportunity began descending Perseverance Valley in mid-2017 (see map) as part of an investigation into how the valley formed. Rover wheel tracks are darker brown, between two patches of bright bedrock, receding toward the horizon in the right half of the scene. This view combines multiple images taken through three different Pancam filters. The selected filters admit light centered on wavelengths of 753 nanometers (near-infrared), 535 nanometers (green) and 432 nanometers (violet). The three color bands are combined here with enhancement to make differences in surface materials easier to see. A map and full-resolution TIFF file are available at https://photojournal.jpl.nasa.gov/catalog/PIA22073

  12. Dark and Bright Terrains of Pluto

    NASA Image and Video Library

    2015-07-10

    These circular maps show the distribution of Pluto's dark and bright terrains as revealed by NASA's New Horizons mission prior to July 4, 2015. Each map is an azimuthal equidistant projection centered on the north pole, with latitude and longitude indicated. Both a gray-scale and a color version are shown. The gray-scale version is based on 7 days of panchromatic imaging from the Long Range Reconnaissance Imager (LORRI), whereas the color version uses the gray-scale base and incorporates lower-resolution color information from the Multi-spectral Visible Imaging Camera (MVIC), part of the Ralph instrument. The color version is also shown in a simple cylindrical projection in PIA19700. In these maps, the polar bright terrain is surrounded by a somewhat darker polar fringe, one whose latitudinal position varies strongly with longitude. Especially striking are the much darker regions along the equator. A broad dark swath ("the whale") stretches along the equator from approximately 20 to 160 degrees of longitude. Several dark patches appear in a regular sequence centered near 345 degrees of longitude. A spectacular bright region occupies Pluto's mid-latitudes near 180 degrees of longitude, and stretches southward over the equator. New Horizons' closest approach to Pluto will occur near this longitude, which will permit high-resolution visible imaging and compositional mapping of these various regions. http://photojournal.jpl.nasa.gov/catalog/PIA19706
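    The azimuthal equidistant projection used for these maps is straightforward to compute: radial distance from the map center is proportional to angular distance from the pole. A small sketch on a spherical body (the axis convention here, with longitude 0 pointing up the page, is an arbitrary illustration choice):

```python
import math

def polar_azimuthal_equidistant(lat_deg, lon_deg, radius=1.0):
    """North-polar azimuthal equidistant projection of (lat, lon) to (x, y).

    Distance from the map center is proportional to colatitude, so the
    north pole maps to the origin and meridians are straight rays.
    """
    colat = math.radians(90.0 - lat_deg)
    lon = math.radians(lon_deg)
    r = radius * colat
    return r * math.sin(lon), r * math.cos(lon)
```

Because distances along meridians are preserved exactly, this projection is a natural choice for polar maps like the ones described here, at the cost of increasing distortion toward the equatorial edge.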

  13. Image monitoring of pharmaceutical blending processes and the determination of an end point by using a portable near-infrared imaging device based on a polychromator-type near-infrared spectrometer with a high-speed and high-resolution photo diode array detector.

    PubMed

    Murayama, Kodai; Ishikawa, Daitaro; Genkawa, Takuma; Sugino, Hiroyuki; Komiyama, Makoto; Ozaki, Yukihiro

    2015-03-03

    In the present study we developed a new version (ND-NIRs) of a polychromator-type near-infrared (NIR) spectrometer with a high-resolution photodiode array detector, building on our earlier instrument (D-NIRs). The new version has four 5 W halogen lamps instead of the three in the older version, and a condenser lens with a shorter focal length. Together, the additional lamp and the shorter focal length yield a higher signal-to-noise ratio and enable high-speed NIR imaging measurements. Using the ND-NIRs, we carried out in-line monitoring of pharmaceutical blending and determined an end point of the blending process. To determine a more accurate end point, an NIR image of the blending sample was acquired with a portable NIR imaging device based on the ND-NIRs. The imaging results demonstrated that a mixing time of 8 min is sufficient for homogeneous mixing. The present study thus demonstrates that the ND-NIRs and the imaging system based on it hold considerable promise for process analysis.
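    The abstract does not give the end-point criterion, but a common way to call an end point in blend monitoring is to watch for the moving-block standard deviation of a monitored quantity (e.g., an NIR absorbance band) to fall below a threshold. A hypothetical stdlib sketch; the function name, window size, and threshold are made-up illustration values, not from the paper:

```python
import statistics

def blend_end_point(signal, window=5, threshold=0.01):
    """Return the index at which blending can be declared complete.

    Scans a moving block of `window` samples and reports the last index
    of the first block whose population standard deviation falls below
    `threshold`; returns None if the signal never settles.
    """
    for i in range(len(signal) - window + 1):
        if statistics.pstdev(signal[i:i + window]) < threshold:
            return i + window - 1
    return None
```

On a real instrument the signal would be a per-spectrum summary statistic sampled in-line; once it stops fluctuating, continued mixing adds no further homogeneity.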

  14. Theoretical hot methane line lists up to T = 2000 K for astrophysical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rey, M.; Tyuterev, Vl. G.; Nikitin, A. V., E-mail: michael.rey@univ-reims.fr

    2014-07-01

    The paper describes the construction of complete sets of hot methane lines based on accurate ab initio potential and dipole moment surfaces and extensive first-principles calculations. Four line lists spanning the 0-5000 cm⁻¹ infrared region were built at T = 500, 1000, 1500, and 2000 K. For each of these four temperatures, we constructed two versions of the line lists: a version for high-resolution applications containing strong and medium lines, and a full version appropriate for low-resolution opacity calculations. A comparison with available empirical databases is discussed in detail for both cold and hot bands, giving very good agreement for line positions, typically <0.1-0.5 cm⁻¹, and ∼5% for intensities of strong lines. Together with numerical tests using various basis sets, this confirms the computational convergence of our results for the most important lines, which is the major issue for theoretical spectra predictions. We show that transitions with lower-state energies up to 14,000 cm⁻¹ can give significant contributions to the methane opacity and have to be systematically taken into account. Our list at 2000 K, calculated up to J = 50, contains 11.5 billion transitions with I > 10⁻²⁹ cm mol⁻¹. These new lists are expected to be quantitatively accurate with respect to the precision of available and currently planned observations of astrophysical objects with improved spectral resolution.
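    The claim that lower-state energies up to 14,000 cm⁻¹ matter at 2000 K can be made concrete with the Boltzmann population factor exp(-c₂E″/T), where c₂ = hc/k ≈ 1.4388 cm·K is the second radiation constant. A small sketch (partition-function normalization is omitted, so these are relative factors only):

```python
import math

C2 = 1.4388  # second radiation constant hc/k, in cm*K

def boltzmann_factor(lower_state_energy_cm1, temperature_k):
    """Relative Boltzmann population factor exp(-c2 * E'' / T).

    E'' is the lower-state energy in cm^-1; the partition function is
    omitted, so values are comparable only at a fixed temperature.
    """
    return math.exp(-C2 * lower_state_energy_cm1 / temperature_k)

# A state 14,000 cm^-1 above ground retains a tiny but non-negligible
# population at 2000 K, while being utterly negligible at 296 K.
hot = boltzmann_factor(14000.0, 2000.0)
cold = boltzmann_factor(14000.0, 296.0)
```

The many orders of magnitude between the two factors illustrate why room-temperature line lists omit such hot bands entirely, yet a 2000 K list must include them systematically.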

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kooperman, Gabriel J.; Pritchard, Michael S.; Burt, Melissa A.

    Changes in the character of rainfall are assessed using a holistic set of statistics based on rainfall frequency and amount distributions in climate change experiments with three conventional and superparameterized versions of the Community Atmosphere Model (CAM and SPCAM). Previous work has shown that high-order statistics of present-day rainfall intensity are significantly improved with superparameterization, especially in regions of tropical convection. Globally, the two modeling approaches project a similar future increase in mean rainfall, especially across the Inter-Tropical Convergence Zone (ITCZ) and at high latitudes, but over land, SPCAM predicts a smaller mean change than CAM. Changes in high-order statistics are similar at high latitudes in the two models but diverge at lower latitudes. In the tropics, SPCAM projects a large intensification of moderate and extreme rain rates in regions of organized convection associated with the Madden-Julian Oscillation, ITCZ, monsoons, and tropical waves. In contrast, this signal is missing in all versions of CAM, which are found to be prone to predicting increases in the amount but not intensity of moderate rates. Predictions from SPCAM exhibit a scale-insensitive behavior with little dependence on horizontal resolution for extreme rates, while lower resolution (~2°) versions of CAM are not able to capture the response simulated with higher resolution (~1°). Furthermore, moderate rain rates analyzed by the “amount mode” and “amount median” are found to be especially telling as a diagnostic for evaluating climate model performance and tracing future changes in rainfall statistics to tropical wave modes in SPCAM.

  16. The Fire INventory from NCAR (FINN): a high resolution global model to estimate the emissions from open burning

    Treesearch

    C. Wiedinmyer; S. K. Akagi; R. J. Yokelson; L. K. Emmons; J. A. Al-Saadi; J. J. Orlando; A. J. Soja

    2010-01-01

    The Fire INventory from NCAR version 1.0 (FINNv1) provides daily, 1 km resolution, global estimates of the trace gas and particle emissions from open burning of biomass, which includes wildfire, agricultural fires, and prescribed burning, and does not include biofuel use and trash burning. Emission factors used in the calculations have been updated with recent data,...

  17. Super-resolution Time-Lapse Seismic Waveform Inversion

    NASA Astrophysics Data System (ADS)

    Ovcharenko, O.; Kazei, V.; Peter, D. B.; Alkhalifah, T.

    2017-12-01

    Time-lapse seismic waveform inversion is a technique for tracking changes in reservoirs over time. Such monitoring is computationally expensive, so it is barely feasible to perform on the fly. Most of the expense comes from the numerous FWI iterations at high temporal frequencies, which are unavoidable because the low-frequency components cannot resolve fine-scale features of a velocity model. Inverted velocity changes are also blurred when the data are noisy, so the problem of low-resolution images is well known. One problem intensively tackled by the computer vision research community is recovering high-resolution images from their low-resolution versions, and using artificial neural networks to achieve super-resolution from a single downsampled image is one of the leading solutions. Each pixel of the upscaled image is affected by all the pixels of its low-resolution version, which enables the workflow to recover features that are likely to occur in the corresponding environment. In the present work, we adopt this machine learning image-enhancement technique to improve the resolution of time-lapse full-waveform inversion. We first invert the baseline model with conventional FWI. Then we run a few iterations of FWI on a set of the monitoring data to find the desired model changes. These changes are blurred, and we enhance their resolution using a deep neural network trained to map low-resolution model updates predicted by FWI into the real perturbations of the baseline model. For supervised training of the network, we generate a set of random perturbations of the baseline model and perform FWI on noisy data from the perturbed models. We test the approach on a realistic perturbation of the Marmousi II model and demonstrate that it outperforms conventional convolution-based deblurring techniques.

  18. Roundness variation in JPEG images affects the automated process of nuclear immunohistochemical quantification: correction with a linear regression model.

    PubMed

    López, Carlos; Jaén Martinez, Joaquín; Lejeune, Marylène; Escrivà, Patricia; Salvadó, Maria T; Pons, Lluis E; Alvaro, Tomás; Baucells, Jordi; García-Rojo, Marcial; Cugat, Xavier; Bosch, Ramón

    2009-10-01

    The volume of digital image (DI) storage continues to be an important problem in computer-assisted pathology. DI compression enables the size of files to be reduced, but with the disadvantage of loss of quality. Previous results indicated that the efficiency of computer-assisted quantification of immunohistochemically stained cell nuclei may be significantly reduced when compressed DIs are used. This study attempts to show, with respect to immunohistochemically stained nuclei, which morphometric parameters may be altered by different levels of JPEG compression, and the implications of these alterations for automated nuclear counts; further, it develops a method for correcting the resulting discrepancy in the nuclear count. For this purpose, 47 DIs from different tissues were captured in uncompressed TIFF format and converted to 1:3, 1:23 and 1:46 compression JPEG images. Sixty-five positive objects were selected from these images, and six morphological parameters were measured and compared for each object in the TIFF images and those of the different compression levels using a set of previously developed and tested macros. Roundness proved to be the only morphological parameter that was significantly affected by image compression. Factors to correct the discrepancy in the roundness estimate were derived from linear regression models for each compression level, thereby eliminating the statistically significant differences between measurements in the equivalent images. These correction factors were incorporated in the automated macros, where they reduced the nuclear quantification differences arising from image compression. Our results demonstrate that it is possible to carry out unbiased automated immunohistochemical nuclear quantification in compressed DIs with a methodology that could be easily incorporated in different systems of digital image analysis.
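    The correction described can be sketched as an ordinary least-squares fit of compressed-image roundness against TIFF roundness, inverted to recover a TIFF-equivalent estimate. The coefficients below are invented for illustration; the study derived one pair per compression level:

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def correct_roundness(measured, a, b):
    """Invert y = a + b*x to map a roundness value measured on a
    JPEG-compressed image back to its TIFF-equivalent estimate."""
    return (measured - a) / b
```

Because the fit is per compression level, an automated macro only needs to know which JPEG quality setting produced the image to apply the matching (a, b) pair.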

  19. 78 FR 46330 - Public ICWG Announcement-2013

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-31

    ... Resolution Matrix (CRM) form. These forms along with the Was/Is Matrix, current versions of the documents... Compatibility (ABC) Study Group Kickoff Please provide them in the CRM form and submit to the SMC/GPER mailbox...

  20. The SeaFlux Turbulent Flux Dataset Version 1.0 Documentation

    NASA Technical Reports Server (NTRS)

    Clayson, Carol Anne; Roberts, J. Brent; Bogdanoff, Alec S.

    2012-01-01

    Under the auspices of the World Climate Research Programme (WCRP) Global Energy and Water cycle EXperiment (GEWEX) Data and Assessment Panel (GDAP), the SeaFlux Project was created to investigate producing a high-resolution satellite-based dataset of surface turbulent fluxes over the global oceans. The most current release of the SeaFlux product is Version 1.0; this represents the initial release of turbulent surface heat fluxes and associated near-surface variables, including a diurnally varying sea surface temperature.

  1. Miniaturized Ion and Neutral Mass Spectrometer for CubeSat Atmospheric Measurements

    NASA Technical Reports Server (NTRS)

    Rodriguez, M.; Paschalidis, N.; Jones, S.; Sittler, E.; Chornay, D.; Uribe, P.; Cameron, T.

    2016-01-01

    To increase the number of single-point in-situ measurements of thermosphere and exosphere ion and neutral composition and density, miniaturized instrumentation is in high demand to take advantage of the increasing platform opportunities available in the smallsat/cubesat industry. The INMS (Ion-Neutral Mass Spectrometer) addresses this need by providing simultaneous measurements of both the neutral and ion environment, essentially providing two instruments in one compact package. The 1.3U volume, 570 gram, 1.8 W nominal power INMS instrument makes implementation into cubesat designs (3U and above) practical and feasible. With an energy range of 0.1-500 eV, a mass range of 1-40 amu, a time resolution of 0.1 s, and a mass resolution of M/dM = 16, the INMS instrument addresses atmospheric science needs that would otherwise have required larger, more expensive instrumentation. INMS-v1 (version 1) launched on Exocube (CalPoly 3U cubesat) in 2015 and INMS-v2 (version 2) is scheduled to launch on Dellingr (GSFC 6U cubesat) in 2017. New versions of INMS are currently being developed to increase and add measurement capabilities, while maintaining the smallsat/cubesat form factor.

  2. Air Quality Forecasts Using the NASA GEOS Model

    NASA Technical Reports Server (NTRS)

    Keller, Christoph A.; Knowland, K. Emma; Nielsen, Jon E.; Orbe, Clara; Ott, Lesley; Pawson, Steven; Saunders, Emily; Duncan, Bryan; Follette-Cook, Melanie; Liu, Junhua

    2018-01-01

    We provide an introduction to a new high-resolution (0.25 degree) global composition forecast produced by NASA's Global Modeling and Assimilation Office. The NASA Goddard Earth Observing System version 5 (GEOS-5) model has been expanded to provide global near-real-time forecasts of atmospheric composition at a horizontal resolution of 0.25 degrees (25 km). Previously, this combination of detailed chemistry and resolution was only provided by regional models. This system combines the operational GEOS-5 weather forecasting model with the state-of-the-science GEOS-Chem chemistry module (version 11) to provide detailed chemical analysis of a wide range of air pollutants such as ozone, carbon monoxide, nitrogen oxides, and fine particulate matter (PM2.5). The resolution of these forecasts is the highest among current, publicly available global composition forecasts. Evaluation and validation of modeled trace gases and aerosols against surface and satellite observations will be presented for constituents relevant to health-based air quality standards. Comparisons of modeled trace gases and aerosols against satellite observations show that the model produces realistic concentrations of atmospheric constituents in the free troposphere. Model comparisons against surface observations highlight the model's capability to capture the diurnal variability of air pollutants under a variety of meteorological conditions. The GEOS-5 composition forecasting system offers a new tool for scientists and the public health community, and is being developed jointly with several government and non-profit partners. Potential applications include air quality warnings, flight campaign planning and exposure studies using the archived analysis fields.

  3. New Mars Camera's First Image of Mars from Mapping Orbit (Full Frame)

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The high resolution camera on NASA's Mars Reconnaissance Orbiter captured its first image of Mars in the mapping orbit, demonstrating the full resolution capability, on Sept. 29, 2006. The High Resolution Imaging Science Experiment (HiRISE) acquired this first image at 8:16 AM (Pacific Time). With the spacecraft at an altitude of 280 kilometers (174 miles), the image scale is 25 centimeters per pixel (10 inches per pixel). If a person were located on this part of Mars, he or she would just barely be visible in this image.

    The image covers a small portion of the floor of Ius Chasma, one branch of the giant Valles Marineris system of canyons. The image illustrates a variety of processes that have shaped the Martian surface. There are bedrock exposures of layered materials, which could be sedimentary rocks deposited in water or from the air. Some of the bedrock has been faulted and folded, perhaps the result of large-scale forces in the crust or from a giant landslide. The image resolves rocks as small as 90 centimeters (3 feet) in diameter. It includes many dunes or ridges of windblown sand.

    This image (TRA_000823_1720) was taken by the High Resolution Imaging Science Experiment camera onboard the Mars Reconnaissance Orbiter spacecraft on Sept. 29, 2006. Shown here is the full image, centered at minus 7.8 degrees latitude, 279.5 degrees east longitude. The image is oriented such that north is to the top. The range to the target site was 297 kilometers (185.6 miles). At this distance the image scale is 25 centimeters (10 inches) per pixel (with one-by-one binning) so objects about 75 centimeters (30 inches) across are resolved. The image was taken at a local Mars time of 3:30 PM and the scene is illuminated from the west with a solar incidence angle of 59.7 degrees, thus the sun was about 30.3 degrees above the horizon. The season on Mars is northern winter, southern summer.

    [Photojournal note: Due to the large sizes of the high-resolution TIFF and JPEG files, some systems may experience extremely slow downlink time while viewing or downloading these images; some systems may be incapable of handling the download entirely.]

    NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter for NASA's Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, is the prime contractor for the project and built the spacecraft. The HiRISE camera was built by Ball Aerospace & Technologies Corporation, Boulder, Colo., and is operated by the University of Arizona, Tucson.

  4. Recovery of Sparse Positive Signals on the Sphere from Low Resolution Measurements

    NASA Astrophysics Data System (ADS)

    Bendory, Tamir; Eldar, Yonina C.

    2015-12-01

    This letter considers the problem of recovering a positive stream of Diracs on a sphere from its projection onto the space of low-degree spherical harmonics, namely, from its low-resolution version. We suggest recovering the Diracs via a tractable convex optimization problem. The resulting recovery error is proportional to the noise level and depends on the density of the Diracs. We validate the theory by numerical experiments.

  5. Bevalac studies of magnet Cerenkov spectroscopy

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The attempt was made to identify the various contributions to the velocity resolution of Cerenkov detectors such as might be used in Astromag, to measure the magnitudes of these contributions and assess their effect on the mass resolution of an isotope spectrometer for Astromag, and to perform Bevalac tests of magnet/Cerenkov spectroscopy. A first version of a new 5 in. photomultiplier tube, designed for use in large magnetic fields, was also tested.

  6. Continuous flow electrophoretic separation of proteins and cells from mammalian tissues

    NASA Technical Reports Server (NTRS)

    Hymer, W. C.; Barlow, Grant H.; Blaisdell, Steven J.; Cleveland, Carolyn; Farrington, Mary Ann; Feldmeier, Mary; Hatfield, J. Michael; Lanham, J. Wayne; Grindeland, Richard; Snyder, Robert S.

    1987-01-01

    This paper describes an apparatus for continuous flow electrophoresis (CFE), designed to separate macromolecules and cells under microgravity conditions. In this CFE, buffer flows upward in a 120-cm-long flow chamber, which is 16 cm wide x 3.0 mm thick in the microgravity version (and 6 cm wide x 1.5 mm thick in the unit-gravity laboratory version). Ovalbumin and rat serum albumin were separated in space (flight STS-4) at 25 percent total w/v concentration with the same resolution of the two proteins as was obtained in the laboratory at 0.2 percent w/v concentration. Rat anterior pituitary cells, cultured human embryonic kidney cells, and canine Langerhans cells were separated into subpopulations (flight STS-8) more effectively than at unit gravity, with comparable resolution achieved at 100 times the concentration possible on Earth.

  7. Present-day Antarctic climatology of the NCAR Community Climate Model Version 1

    NASA Technical Reports Server (NTRS)

    Tzeng, Ren-Yow; Bromwich, David H.; Parish, Thomas R.

    1993-01-01

    The ability of the NCAR Community Climate Model Version 1 (CCM1) with R15 resolution to simulate the present-day climate of Antarctica was evaluated using the five-year seasonal cycle output produced by the CCM1 and comparing the model results with observed horizontal syntheses and point data. The results showed that the CCM1 with R15 resolution can simulate to some extent the dynamics of Antarctic climate on the synoptic scale as well as some mesoscale features. The model can also simulate the phase and the amplitude of the annual and semiannual variation of the temperature, sea level pressure, and zonally averaged zonal (E-W) wind. The main shortcomings of the CCM1 model are associated with the model's anomalously large precipitation amounts at high latitudes, due to the tendency of the scheme to suppress negative moisture values.

  8. Retrieved Products from Simulated Hyperspectral Observations of a Hurricane

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Kouvaris, Louis C.; Iredell, Lena; Blaisdell, John; Pagano, Thomas; Mathews, William

    2015-01-01

    This research uses GCM derived products, with 1 km spatial resolution and sampled every 10 minutes, over a moving area following the track of a simulated severe Atlantic storm. Model products were aggregated over sounder footprints corresponding to 13 km in LEO, 2 km in LEO, and 5 km in GEO sampled every 72 minutes. We simulated radiances for instruments with AIRS-like spectral coverage, spectral resolution, and channel noise, using these aggregated products as the truth, and analyzed them using a slightly modified version of the operational AIRS Version-6 retrieval algorithm. Accuracy of retrievals obtained using simulated AIRS radiances with a 13 km footprint was similar to that obtained using real AIRS data. Spatial coverage and accuracy of retrievals are shown for all three sounding scenarios. The research demonstrates the potential significance of flying Advanced AIRS-like instruments on future LEO and GEO missions.

  9. Language Measurement Equivalence of the Ethnic Identity Scale With Mexican American Early Adolescents

    PubMed Central

    White, Rebecca M. B.; Umaña-Taylor, Adriana J.; Knight, George P.; Zeiders, Katharine H.

    2011-01-01

    The current study considers methodological challenges in developmental research with linguistically diverse samples of young adolescents. By empirically examining the cross-language measurement equivalence of a measure assessing three components of ethnic identity development (i.e., exploration, resolution, and affirmation) among Mexican American adolescents, the study both assesses the cross-language measurement equivalence of a common measure of ethnic identity and provides an appropriate conceptual and analytical model for researchers needing to evaluate measurement scales translated into multiple languages. Participants are 678 Mexican-origin early adolescents and their mothers. Measures of exploration and resolution achieve the highest levels of equivalence across language versions. The measure of affirmation achieves high levels of equivalence. Results highlight potential ways to correct for any problems of nonequivalence across language versions of the affirmation measure. Suggestions are made for how researchers working with linguistically diverse samples can use the highlighted techniques to evaluate their own translated measures. PMID:22116736

  10. Access to Land Data Products Through the Land Processes DAAC

    NASA Astrophysics Data System (ADS)

    Klaassen, A. L.; Gacke, C. K.

    2004-12-01

    The Land Processes Distributed Active Archive Center (LP DAAC) was established as part of NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) initiative to process, archive, and distribute land-related data collected by EOS sensors, thereby promoting the inter-disciplinary study and understanding of the integrated Earth system. The LP DAAC is responsible for archiving, product development, distribution, and user support of Moderate Resolution Imaging Spectroradiometer (MODIS) land products derived from data acquired by the Terra and Aqua satellites and processing and distribution of Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data products. These data are applied in scientific research, management of natural resources, emergency response to natural disasters, and Earth science education. There are several web interfaces by which the inventory may be searched and the products ordered. The LP DAAC web site (http://lpdaac.usgs.gov/) provides product-specific information and links to data access tools. The primary search and order tool is the EOS Data Gateway (EDG) (http://edcimswww.cr.usgs.gov/pub/imswelcome/), which allows users to search data holdings, retrieve descriptions of data sets, view browse images, and place orders. The EDG is the only tool that searches the entire inventory of ASTER and MODIS products available from the LP DAAC. The Data Pool (http://lpdaac.usgs.gov/datapool/datapool.asp) is an online archive that provides immediate FTP access to selected LP DAAC data products. The data can be downloaded by going directly to the FTP site, where users can navigate to the desired granule, metadata file, or browse image. For selected data types, the HDF-EOS to GeoTIFF Converter (HEG) can convert files from the standard HDF-EOS data format into GeoTIFF, change the data projection, or perform spatial subsetting.
The Browse Tool, also known as the USGS Global Visualization Viewer (http://lpdaac.usgs.gov/aster/glovis.asp), provides an easy online method to search, browse, and order LP DAAC ASTER and MODIS land data by viewing browse images to define spatial and temporal queries. The LP DAAC User Services Office is the point of contact for support of the ASTER and MODIS data products and services. The user services representatives are available to answer questions, assist with ordering data, provide technical support and referrals, and supply information on a variety of tools available to assist in data preparation. The LP DAAC User Services contact information is: LP DAAC User Services U.S. Geological Survey EROS Data Center 47914 252nd Street Sioux Falls, SD 57198-0001 Voice: (605) 594-6116 Toll Free: 866-573-3222 Fax: 605-594-6963 E-mail: edc@eos.nasa.gov "This abstract was prepared under Contract number 03CRCN0001 between SAIC and U.S. Geological Survey. Abstract has not been reviewed for conformity with USGS editorial standards and has been submitted for approval by the USGS Director."

  11. eMODIS: A User-Friendly Data Source

    USGS Publications Warehouse

    Jenkerson, Calli B.; Maiersperger, Thomas; Schmidt, Gail

    2010-01-01

    The U.S. Geological Survey's (USGS) Earth Resources Observation and Science (EROS) Center is generating a suite of products called 'eMODIS' based on Moderate Resolution Imaging Spectroradiometer (MODIS) data acquired by the National Aeronautics and Space Administration's (NASA) Earth Observing System (EOS). With a more frequent repeat cycle than Landsat and higher spatial resolutions than the Advanced Very High Resolution Radiometer (AVHRR), MODIS is well suited for vegetation studies. For operational monitoring, however, the benefits of MODIS are counteracted by usability issues with the standard map projection, file format, composite interval, high-latitude 'bow-tie' effects, and production latency. eMODIS responds to a community-specific need for alternatively packaged MODIS data, addressing each of these factors for real-time monitoring and historical trend analysis. eMODIS processes calibrated radiance data (level-1B) acquired by the MODIS sensors on the EOS Terra and Aqua satellites by combining MODIS Land Science Collection 5 Atmospherically Corrected Surface Reflectance production code and USGS EROS MODIS Direct Broadcast System (DBS) software to create surface reflectance and Normalized Difference Vegetation Index (NDVI) products. eMODIS is produced over the continental United States and over Alaska extending into Canada to cover the Yukon River Basin. The 250-meter (m), 500-m, and 1,000-m products are delivered in georeferenced Tagged Image File Format (GeoTIFF) and composited in 7-day intervals. eMODIS composites are projected to non-Sinusoidal mapping grids that best suit the geography in their areas of application (see eMODIS Product Description below). For eMODIS products generated over the continental United States (eMODIS CONUS), the Terra (from 2000) and Aqua (from 2002) records are available and continue through present time.
eMODIS CONUS also is generated in an expedited process that delivers a 7-day rolling composite, created daily with the most recent 7 days of acquisition, to users monitoring real-time vegetation conditions. eMODIS Alaska is not part of expedited processing, but does cover the Terra mission life (2000-present). A file transfer protocol (FTP) distribution site is currently available for direct download of eMODIS products (ftp://emodisftp.cr.usgs.gov/eMODIS), with plans to expand into an interactive portal environment.

  12. An Intercomparison of Large-Extent Tree Canopy Cover Geospatial Datasets

    NASA Astrophysics Data System (ADS)

    Bender, S.; Liknes, G.; Ruefenacht, B.; Reynolds, J.; Miller, W. P.

    2017-12-01

    As a member of the Multi-Resolution Land Characteristics Consortium (MRLC), the U.S. Forest Service (USFS) is responsible for producing and maintaining the tree canopy cover (TCC) component of the National Land Cover Database (NLCD). The NLCD-TCC data are available for the conterminous United States (CONUS), coastal Alaska, Hawai'i, Puerto Rico, and the U.S. Virgin Islands. The most recent official version of the NLCD-TCC data is based primarily on reference data from 2010-2011 and is part of the multi-component 2011 version of the NLCD. NLCD data are updated on a five-year cycle. The USFS is currently producing the next official version (2016) of the NLCD-TCC data for the United States, and it will be made publicly available in early 2018. In this presentation, we describe the model inputs, modeling methods, and tools used to produce the 30-m NLCD-TCC data. Several tree cover datasets at 30-m, as well as datasets at finer resolution, have become available in recent years due to advancements in Earth observation sensors, data availability, and computing. We compare multiple tree cover datasets that have similar resolution to the NLCD-TCC data. We also aggregate the tree class from fine-resolution land cover datasets to a percent canopy value on a 30-m pixel, in order to compare the fine-resolution datasets to the datasets created directly from 30-m Landsat data. The extent of the tree canopy cover datasets included in the study ranges from global and national to the state level. Preliminary investigation of multiple tree cover datasets over the CONUS indicates a high amount of spatial variability. For example, in a comparison of the NLCD-TCC and the Global Land Cover Facility's Landsat Tree Cover Continuous Fields (2010) data by MRLC mapping zones, the zone-level root mean-square deviation ranges from 2% to 39% (mean=17%, median=15%).
The analysis outcomes are expected to inform USFS decisions with regard to the next cycle (2021) of NLCD-TCC production.
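The zone-level comparison described above reduces to a root mean-square deviation between two tree-canopy-cover estimates over the pixels of each mapping zone. A minimal sketch with hypothetical toy values (`zone_rmsd` is an illustrative helper, not an MRLC tool):

```python
import math

def zone_rmsd(tcc_a, tcc_b):
    """Root mean-square deviation between two tree-canopy-cover
    estimates (in percent) over the pixels of one mapping zone."""
    assert len(tcc_a) == len(tcc_b)
    sq = [(a - b) ** 2 for a, b in zip(tcc_a, tcc_b)]
    return math.sqrt(sum(sq) / len(sq))

# Toy example: NLCD-TCC vs. another 30-m product over a tiny zone.
nlcd = [10, 25, 40, 55, 70]
other = [12, 20, 45, 50, 73]
print(round(zone_rmsd(nlcd, other), 2))
```

Repeating this per mapping zone yields the distribution of zone-level deviations (2% to 39% in the comparison above).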

  13. A New Pansharpening Method Based on Spatial and Spectral Sparsity Priors.

    PubMed

    He, Xiyan; Condat, Laurent; Bioucas-Dias, Jose; Chanussot, Jocelyn; Xia, Junshi

    2014-06-27

    The development of multisensor systems in recent years has led to a great increase in the amount of available remote sensing data. Image fusion techniques aim to infer high-quality images of a given area from degraded versions of the same area obtained by multiple sensors. This paper focuses on pansharpening, which is the inference of a high spatial resolution multispectral image from two degraded versions with complementary spectral and spatial resolution characteristics: a) a low spatial resolution multispectral image; and b) a high spatial resolution panchromatic image. We introduce a new variational model based on spatial and spectral sparsity priors for the fusion. In the spectral domain we encourage low-rank structure, whereas in the spatial domain we promote sparsity on the local differences. Because both panchromatic and multispectral images are integrations of the underlying continuous spectra using different channel responses, we propose to exploit appropriate regularizations based on both spatial and spectral links between panchromatic and the fused multispectral images. A weighted version of the vector Total Variation (TV) norm of the data matrix is employed to align the spatial information of the fused image with that of the panchromatic image. With regard to spectral information, two different types of regularization are proposed to promote a soft constraint on the linear dependence between the panchromatic and the fused multispectral images. The first one estimates directly the linear coefficients from the observed panchromatic and low resolution multispectral images by Linear Regression (LR) while the second one employs the Principal Component Pursuit (PCP) to obtain a robust recovery of the underlying low-rank structure. We also show that the two regularizers are strongly related. The basic idea of both regularizers is that the fused image should have low-rank and preserve edge locations.
We use a variation of the recently proposed Split Augmented Lagrangian Shrinkage (SALSA) algorithm to effectively solve the proposed variational formulations. Experimental results on simulated and real remote sensing images show the effectiveness of the proposed pansharpening method compared to the state-of-the-art.
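The weighted TV idea described above can be illustrated with a simplified scalar, anisotropic variant: per-edge weights decay where the panchromatic image has strong gradients, so edges aligned with the panchromatic image are penalized less. This is only a sketch of the principle, not the paper's weighted vector TV formulation (`alpha` and the exponential weighting are assumptions):

```python
import math

def weighted_tv(image, pan, alpha=1.0):
    """Weighted anisotropic total variation of a single band `image`,
    with weights exp(-alpha * |pan gradient|) so that differences
    coinciding with panchromatic edges cost less."""
    h, w = len(image), len(image[0])
    tv = 0.0
    for i in range(h):
        for j in range(w):
            if j + 1 < w:  # horizontal difference
                wgt = math.exp(-alpha * abs(pan[i][j + 1] - pan[i][j]))
                tv += wgt * abs(image[i][j + 1] - image[i][j])
            if i + 1 < h:  # vertical difference
                wgt = math.exp(-alpha * abs(pan[i + 1][j] - pan[i][j]))
                tv += wgt * abs(image[i + 1][j] - image[i][j])
    return tv
```

Minimizing such a term favors fused images whose discontinuities sit where the panchromatic image also has discontinuities.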

  14. Sparse super-resolution reconstructions of video from mobile devices in digital TV broadcast applications

    NASA Astrophysics Data System (ADS)

    Boon, Choong S.; Guleryuz, Onur G.; Kawahara, Toshiro; Suzuki, Yoshinori

    2006-08-01

    We consider the mobile service scenario where video programming is broadcast to low-resolution wireless terminals. In such a scenario, broadcasters utilize simultaneous data services and bi-directional communications capabilities of the terminals in order to offer substantially enriched viewing experiences to users by allowing user participation and user tuned content. While users immediately benefit from this service when using their phones in mobile environments, the service is less appealing in stationary environments where a regular television provides competing programming at much higher display resolutions. We propose a fast super-resolution technique that allows the mobile terminals to show a much enhanced version of the broadcast video on nearby high-resolution devices, extending the appeal and usefulness of the broadcast service. The proposed single frame super-resolution algorithm uses recent sparse recovery results to provide high quality and high-resolution video reconstructions based solely on individual decoded frames provided by the low-resolution broadcast.

  15. HAM2D: 2D Shearing Box Model

    NASA Astrophysics Data System (ADS)

    Gammie, Charles F.; Guan, Xiaoyue

    2012-10-01

    HAM solves non-relativistic hyperbolic partial differential equations in conservative form using high-resolution shock-capturing techniques. This version of HAM has been configured to solve the magnetohydrodynamic equations of motion in axisymmetry to evolve a shearing box model.

  16. Science Comes Alive at NASA Goddard

    NASA Image and Video Library

    2017-05-17

    Science Comes Alive at NASA Goddard: Welcome to the NASA Goddard Space Flight Center. Where innovation and science never sleep and new discoveries never get old... At NASA Goddard. For Higher Resolutions and Other Versions: https://svs.gsfc.nasa.gov/12533

  17. Constraints on the Profiles of Total Water PDF in AGCMs from AIRS and a High-Resolution Model

    NASA Technical Reports Server (NTRS)

    Molod, Andrea

    2012-01-01

    Atmospheric general circulation model (AGCM) cloud parameterizations generally include an assumption about the subgrid-scale probability distribution function (PDF) of total water and its vertical profile. In the present study, the Atmospheric Infrared Sounder (AIRS) monthly-mean cloud amount and relative humidity fields are used to compute a proxy for the second moment of an AGCM total water PDF called the RH01 diagnostic, which is the AIRS mean relative humidity for cloud fractions of 0.1 or less. The dependence of the second moment on horizontal grid resolution is analyzed using results from a high-resolution global model simulation. The AIRS-derived RH01 diagnostic is generally larger near the surface than aloft, indicating a narrower PDF near the surface, and varies with the type of underlying surface. High-resolution model results show that the vertical structure of profiles of the AGCM PDF second moment is unchanged as the grid resolution changes from 200 to 100 to 50 km, and that the second-moment profiles shift toward higher values with decreasing grid spacing. Several Goddard Earth Observing System, version 5 (GEOS-5), AGCM simulations were performed with several choices for the profile of the PDF second moment. The resulting cloud and relative humidity fields were shown to be quite sensitive to the prescribed profile, and the use of a profile based on the AIRS-derived proxy results in improvements relative to observational estimates. The AIRS-guided total water PDF profiles, including their dependence on underlying surface type and on horizontal resolution, have been implemented in the version of the GEOS-5 AGCM used for publicly released simulations.
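The RH01 diagnostic defined above, the mean relative humidity over grid cells whose cloud fraction is 0.1 or less, can be sketched directly (toy values on a few grid cells; this is an illustration, not the GEOS-5 implementation):

```python
def rh01(rel_humidity, cloud_fraction, threshold=0.1):
    """Proxy for the second moment of the total-water PDF: mean
    relative humidity over cells with cloud fraction <= threshold."""
    clear = [rh for rh, cf in zip(rel_humidity, cloud_fraction)
             if cf <= threshold]
    if not clear:
        return None  # no nearly clear cells at this level
    return sum(clear) / len(clear)

# Toy monthly-mean fields: only cells with cf <= 0.1 contribute.
rh = [0.55, 0.80, 0.30, 0.65]
cf = [0.05, 0.40, 0.10, 0.02]
print(rh01(rh, cf))  # mean of 0.55, 0.30, 0.65
```

Computed level by level, this gives the vertical profile whose near-surface values are larger (narrower PDF) than those aloft.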

  18. The Latest Mars Climate Database (MCD v5.1)

    NASA Astrophysics Data System (ADS)

    Millour, Ehouarn; Forget, Francois; Spiga, Aymeric; Navarro, Thomas; Madeleine, Jean-Baptiste; Pottier, Alizée; Montabone, Luca; Kerber, Laura; Lefèvre, Franck; Montmessin, Franck; Chaufray, Jean-Yves; López-Valverde, Miguel; González-Galindo, Francisco; Lewis, Stephen; Read, Peter; Huot, Jean-Paul; Desjean, Marie-Christine; the MCD/GCM development Team

    2014-05-01

    For many years, several teams around the world have developed GCMs (General Circulation Model or Global Climate Model) to simulate the environment on Mars. The GCM developed at the Laboratoire de Météorologie Dynamique in collaboration with several teams in Europe (LATMOS, France, University of Oxford, The Open University, the Instituto de Astrofisica de Andalucia), and with the support of ESA and CNES, is currently used for many applications. Its outputs have also regularly been compiled to build a Mars Climate Database, a freely available tool useful for the scientific and engineering communities. The Mars Climate Database (MCD) has over the years been distributed to more than 150 teams around the world. Following the recent improvements in the GCM, a new series of reference simulations have been run and compiled into a new version (version 5.1) of the Mars Climate Database, released in the first half of 2014. To summarize, MCD v5.1 provides: - Climatologies over a series of dust scenarios: standard year, cold (i.e., low dust), warm (i.e., dusty atmosphere) and dust storm, all topped by various cases of Extreme UV solar inputs (low, mean or maximum). These scenarios differ from those of previous versions of the MCD (version 4.x) as they have been derived from a home-made, instrument-derived (TES, THEMIS, MCS, MERs) dust climatology of the last 8 Martian years. - Mean values and statistics of main meteorological variables (atmospheric temperature, density, pressure and winds), as well as surface pressure and temperature, CO2 ice cover, thermal and solar radiative fluxes, dust column opacity and mixing ratio, [H2O] vapor and ice columns, concentrations of many species: [CO], [O2], [O], [N2], [H2], [O3], ...
- A high resolution mode which combines high resolution (32 pixels/degree) MOLA topography records and Viking Lander 1 pressure records with raw lower resolution GCM results to yield, within the restriction of the procedure, high resolution values of atmospheric variables. - The possibility to reconstruct realistic conditions by combining the provided climatology with additional large scale and small scale perturbation schemes. At EGU, we will report on the latest improvements in the Mars Climate Database, with comparisons with available measurements from orbit (e.g., TES, MCS) or landers (Viking, Phoenix, MSL).
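A common way to combine coarse GCM surface pressure with high-resolution topography, as in the high-resolution mode described above, is hydrostatic scaling with an atmospheric scale height. The sketch below is a simplified illustration under a constant scale height (the function name and the 10 km value are assumptions; the actual MCD procedure also accounts for temperature structure and the seasonal CO2 pressure cycle):

```python
import math

def highres_surface_pressure(p_coarse, z_coarse, z_fine, scale_height=10000.0):
    """Adjust a coarse-grid surface pressure (Pa) at elevation z_coarse (m)
    to a finer-resolution surface elevation z_fine (m) by hydrostatic
    scaling, p = p0 * exp(-dz / H), with constant scale height H (m)."""
    return p_coarse * math.exp(-(z_fine - z_coarse) / scale_height)

# A 600 Pa coarse cell whose fine-scale point sits 1 km lower:
print(round(highres_surface_pressure(600.0, 0.0, -1000.0), 1))
```

Lower terrain thus receives a proportionally higher surface pressure, which in turn rescales the other column variables.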

  19. Photoionization Rate of Atomic Oxygen

    NASA Astrophysics Data System (ADS)

    Meier, R. R.; McLaughlin, B. M.; Warren, H. P.; Bishop, J.

    2006-05-01

    Accurate knowledge of the photoionization rate of atomic oxygen is important for the study and understanding of the ionospheres and emission processes of terrestrial, planetary, and cometary atmospheres. Past calculations of the photoionization rate have been carried out at various spectral resolutions, but none were at sufficiently high resolution to accommodate accidental resonances between solar emission lines and highly structured auto-ionization features in the photoionization cross section. A new version of the NRLEUV solar spectral irradiance model (at solar minimum) and a new model of the O photoionization cross section enable calculations at very high spectral resolution. We find unattenuated photoionization rates computed at 0.001 nm resolution are larger than those at moderate resolution (0.1 nm) by amounts approaching 20%. Allowing for attenuation in the terrestrial atmosphere, we find differences in photoionization rates computed at high and moderate resolution to vary with altitude, especially below 200 km where deviations of plus or minus 20% occur between the two cases.
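The unattenuated rate discussed above is an integral of the photoionization cross section against the solar spectral photon flux; computing it at 0.001 nm rather than 0.1 nm matters because narrow solar emission lines can coincide with narrow auto-ionization features. A minimal trapezoidal sketch (toy grids in arbitrary units; not the NRLEUV model itself):

```python
def photoionization_rate(wavelengths, irradiance, cross_section):
    """Unattenuated photoionization rate J = integral of
    sigma(lambda) * F(lambda) d(lambda), approximated with the
    trapezoidal rule on a shared wavelength grid."""
    j = 0.0
    for k in range(len(wavelengths) - 1):
        dlam = wavelengths[k + 1] - wavelengths[k]
        f = (irradiance[k] * cross_section[k]
             + irradiance[k + 1] * cross_section[k + 1])
        j += 0.5 * f * dlam
    return j
```

Evaluating the same integral on a finer grid that resolves the resonances is what produces the up-to-20% differences reported above.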

  20. Extraction and labeling high-resolution images from PDF documents

    NASA Astrophysics Data System (ADS)

    Chachra, Suchet K.; Xue, Zhiyun; Antani, Sameer; Demner-Fushman, Dina; Thoma, George R.

    2013-12-01

    Accuracy of content-based image retrieval is affected by image resolution among other factors. Higher resolution images enable extraction of image features that more accurately represent the image content. In order to improve the relevance of search results for our biomedical image search engine, Open-I, we have developed techniques to extract and label high-resolution versions of figures from biomedical articles supplied in the PDF format. Open-I uses the open-access subset of biomedical articles from the PubMed Central repository hosted by the National Library of Medicine. Articles are available in XML and in publisher supplied PDF formats. As these PDF documents contain little or no meta-data to identify the embedded images, the task includes labeling images according to their figure number in the article after they have been successfully extracted. For this purpose we use the labeled small size images provided with the XML web version of the article. This paper describes the image extraction process and two alternative image-labeling approaches: one measures the similarity between two images using their intensity projections onto the coordinate axes, and the other uses the normalized cross-correlation between the intensities of the two images. Using image identification based on image intensity projection, we were able to achieve a precision of 92.84% and a recall of 82.18% in labeling of the extracted images.
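The two similarity measures described above can be sketched in a few lines (pure-Python toy version; here `normalized_cross_correlation` takes flattened intensity lists of equal length, and `column_projection` shows the projection onto one axis only):

```python
import math

def normalized_cross_correlation(a, b):
    """Normalized cross-correlation between two equal-size grayscale
    images given as flat intensity lists; 1.0 is a perfect linear match."""
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    num = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    den = math.sqrt(sum((x - mean_a) ** 2 for x in a)
                    * sum((y - mean_b) ** 2 for y in b))
    return num / den if den else 0.0

def column_projection(image):
    """Intensity projection onto the x-axis: sum over each pixel column."""
    return [sum(col) for col in zip(*image)]
```

Matching an extracted high-resolution figure to the small labeled web image then amounts to picking the candidate that maximizes the chosen similarity score.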

  1. A Fast and Efficient Version of the TwO-Moment Aerosol Sectional (TOMAS) Global Aerosol Microphysics Model

    NASA Technical Reports Server (NTRS)

    Lee, Yunha; Adams, P. J.

    2012-01-01

    This study develops more computationally efficient versions of the TwO-Moment Aerosol Sectional (TOMAS) microphysics algorithms, collectively called Fast TOMAS. Several methods for speeding up the algorithm were attempted, but only reducing the number of size sections was adopted. Fast TOMAS models, coupled to the GISS GCM II-prime, require a new coagulation algorithm with less restrictive size resolution assumptions but only minor changes in other processes. Fast TOMAS models have been evaluated in a box model against analytical solutions of coagulation and condensation and in a 3-D model against the original TOMAS (TOMAS-30) model. Condensation and coagulation in the Fast TOMAS models agree well with the analytical solution but show slightly more bias than the TOMAS-30 box model. In the 3-D model, errors resulting from decreased size resolution in each process (i.e., emissions, cloud processing/wet deposition, microphysics) are quantified in a series of model sensitivity simulations. Errors resulting from lower size resolution in condensation and coagulation, defined as the microphysics error, affect number and mass concentrations by only a few percent. The microphysics error in CN70/CN100 (number concentrations of particles larger than 70/100 nm diameter), proxies for cloud condensation nuclei, ranges from -5 to 5% in most regions. The largest errors are associated with decreasing the size resolution in the cloud processing/wet deposition calculations, defined as cloud-processing error, and range from -20 to 15% in most regions for CN70/CN100 concentrations. Overall, the Fast TOMAS models increase the computational speed by 2 to 3 times with only small numerical errors stemming from condensation and coagulation calculations when compared to TOMAS-30. The faster versions of the TOMAS model allow for the longer, multi-year simulations required to assess aerosol effects on cloud lifetime and precipitation.

  2. Pathfinder Sea Surface Temperature Climate Data Record

    NASA Astrophysics Data System (ADS)

    Baker-Yeboah, S.; Saha, K.; Zhang, D.; Casey, K. S.

    2016-02-01

    Global sea surface temperature (SST) fields are important in understanding ocean and climate variability. The NOAA National Centers for Environmental Information (NCEI) develops and maintains a high resolution, long-term, climate data record (CDR) of global satellite SST. These SST values are generated at approximately 4 km resolution using Advanced Very High Resolution Radiometer (AVHRR) instruments aboard NOAA polar-orbiting satellites going back to 1981. The Pathfinder SST algorithm is based on the Non-Linear SST algorithm using the modernized NASA SeaWiFS Data Analysis System (SeaDAS). Coefficients for this SST product were generated using regression analyses with co-located in situ and satellite measurements. Previous versions of Pathfinder included level 3 collated (L3C) products. Pathfinder Version 5.3 includes level 2 pre-processed (L2P), level 3 uncollated (L3U), and L3C products. Notably, the data were processed in the cloud using Amazon Web Services and are made available through all of the modern web visualization and subset services provided by the THREDDS Data Server, the Live Access Server, and the OPeNDAP Hyrax Server. In this version of Pathfinder SST, anomalous hot-spots at land-water boundaries are better identified and the dataset includes updated land masks and sea ice data over the Antarctic ice shelves. All quality levels of SST values are generated, giving the user greater flexibility and the option to apply their own cloud-masking procedures. Additional improvements include consistent cloud tree tests for NOAA-07 and NOAA-19 with respect to the other sensors, improved SSTs in sun glint areas, and netCDF file format improvements to ensure consistency with the latest Group for High Resolution SST (GHRSST) requirements. This quality controlled satellite SST field is a reference environmental data record utilized as a primary resource of SST for numerous regional and global marine efforts.

  3. Validation of a clinical assessment of spectral-ripple resolution for cochlear implant users.

    PubMed

    Drennan, Ward R; Anderson, Elizabeth S; Won, Jong Ho; Rubinstein, Jay T

    2014-01-01

    Nonspeech psychophysical tests of spectral resolution, such as the spectral-ripple discrimination task, have been shown to correlate with speech-recognition performance in cochlear implant (CI) users. However, these tests are best suited for use in the research laboratory setting and are impractical for clinical use. A test of spectral resolution that is quicker and could more easily be implemented in the clinical setting has been developed. The objectives of this study were (1) To determine whether this new clinical ripple test would yield individual results equivalent to the longer, adaptive version of the ripple-discrimination test; (2) To evaluate test-retest reliability for the clinical ripple measure; and (3) To examine the relationship between clinical ripple performance and monosyllabic word recognition in quiet for a group of CI listeners. Twenty-eight CI recipients participated in the study. Each subject was tested on both the adaptive and the clinical versions of spectral ripple discrimination, as well as consonant-nucleus-consonant word recognition in quiet. The adaptive version of spectral ripple used a two-up, one-down procedure for determining spectral ripple discrimination threshold. The clinical ripple test used a method of constant stimuli, with trials for each of 12 fixed ripple densities occurring six times in random order. Results from the clinical ripple test (proportion correct) were then compared with ripple-discrimination thresholds (in ripples per octave) from the adaptive test. The clinical ripple test showed strong concurrent validity, evidenced by a good correlation between clinical ripple and adaptive ripple results (r = 0.79), as well as a correlation with word recognition (r = 0.7). Excellent test-retest reliability was also demonstrated with a high test-retest correlation (r = 0.9). The clinical ripple test is a reliable nonlinguistic measure of spectral resolution, optimized for use with CI users in a clinical setting. 
The test might be useful as a diagnostic tool or as a possible surrogate outcome measure for evaluating treatment effects in hearing.
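The adaptive two-up, one-down procedure mentioned above can be sketched as a simple staircase over ripple density: the density is raised after two consecutive correct responses (making discrimination harder) and lowered after any incorrect response, converging near 70.7% correct. Parameter names and the step rule below are illustrative assumptions, not the study's exact implementation:

```python
def two_up_one_down(start, step, responses):
    """Track ripple density (ripples per octave) through a two-up,
    one-down adaptive procedure, given a sequence of correct/incorrect
    responses. Returns the density after each trial, starting value first."""
    level = start
    correct_streak = 0
    track = [level]
    for correct in responses:
        if correct:
            correct_streak += 1
            if correct_streak == 2:   # two in a row: make it harder
                level += step
                correct_streak = 0
        else:                         # any miss: make it easier
            level = max(step, level - step)
            correct_streak = 0
        track.append(level)
    return track
```

The clinical version replaces this staircase with a method of constant stimuli: 12 fixed ripple densities, six trials each in random order, scored as proportion correct.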

  4. Stochastic Analysis and Probabilistic Downscaling of Soil Moisture

    NASA Astrophysics Data System (ADS)

    Deshon, J. P.; Niemann, J. D.; Green, T. R.; Jones, A. S.

    2017-12-01

    Soil moisture is a key variable for rainfall-runoff response estimation, ecological and biogeochemical flux estimation, and biodiversity characterization, each of which is useful for watershed condition assessment. These applications require not only accurate, fine-resolution soil-moisture estimates but also confidence limits on those estimates and soil-moisture patterns that exhibit realistic statistical properties (e.g., variance and spatial correlation structure). The Equilibrium Moisture from Topography, Vegetation, and Soil (EMT+VS) model downscales coarse-resolution (9-40 km) soil moisture from satellite remote sensing or land-surface models to produce fine-resolution (10-30 m) estimates. The model was designed to produce accurate deterministic soil-moisture estimates at multiple points, but the resulting patterns do not reproduce the variance or spatial correlation of observed soil-moisture patterns. The primary objective of this research is to generalize the EMT+VS model to produce a probability density function (pdf) for soil moisture at each fine-resolution location and time. Each pdf has a mean that is equal to the deterministic soil-moisture estimate, and the pdf can be used to quantify the uncertainty in the soil-moisture estimates and to simulate soil-moisture patterns. Different versions of the generalized model are hypothesized based on how uncertainty enters the model, whether the uncertainty is additive or multiplicative, and which distributions describe the uncertainty. These versions are then tested by application to four catchments with detailed soil-moisture observations (Tarrawarra, Satellite Station, Cache la Poudre, and Nerrigundah). The performance of the generalized models is evaluated by comparing the statistical properties of the simulated soil-moisture patterns to those of the observations and the deterministic EMT+VS model. 
The versions of the generalized EMT+VS model with normally distributed stochastic components produce soil-moisture patterns with more realistic statistical properties than the deterministic model. Additionally, the results suggest that the variance and spatial correlation of the stochastic soil-moisture variations do not vary consistently with the spatial-average soil moisture.
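A minimal sketch of an additive-Gaussian version of the idea described above, in which each cell's pdf mean equals the deterministic EMT+VS estimate (the function is hypothetical and clips to the physical range; the generalized model's multiplicative and spatially correlated variants are not reproduced here):

```python
import random

def probabilistic_downscale(deterministic, sd, seed=0):
    """Draw one stochastic fine-resolution soil-moisture pattern by
    adding zero-mean Gaussian noise (std. dev. `sd`) to the deterministic
    estimates, then clipping to the physical range [0, 1]."""
    rng = random.Random(seed)
    return [max(0.0, min(1.0, m + rng.gauss(0.0, sd)))
            for m in deterministic]
```

Repeated draws with different seeds yield an ensemble whose spread quantifies uncertainty and whose patterns can be checked against the observed variance and spatial correlation.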

  5. Digital Camera Control for Faster Inspection

    NASA Technical Reports Server (NTRS)

    Brown, Katharine; Siekierski, James D.; Mangieri, Mark L.; Dekome, Kent; Cobarruvias, John; Piplani, Perry J.; Busa, Joel

    2009-01-01

    Digital Camera Control Software (DCCS) is a computer program for controlling a boom and a boom-mounted camera used to inspect the external surface of a space shuttle in orbit around the Earth. Running in a laptop computer in the space-shuttle crew cabin, DCCS commands integrated displays and controls. By means of a simple one-button command, a crewmember can view low-resolution images to quickly spot problem areas and can then cause a rapid transition to high-resolution images. The crewmember can command that camera settings apply to a specific small area of interest within the field of view of the camera so as to maximize image quality within that area. DCCS also provides critical high-resolution images to a ground screening team, which analyzes the images to assess damage (if any); in so doing, DCCS enables the team to clear initially suspect areas more quickly than would otherwise be possible and further saves time by minimizing the probability of re-imaging of areas already inspected. On the basis of experience with a previous version (2.0) of the software, the present version (3.0) incorporates a number of advanced imaging features that optimize crewmember capability and efficiency.

  6. Air Quality Forecasts Using the NASA GEOS Model: A Unified Tool from Local to Global Scales

    NASA Technical Reports Server (NTRS)

    Knowland, E. Emma; Keller, Christoph; Nielsen, J. Eric; Orbe, Clara; Ott, Lesley; Pawson, Steven; Saunders, Emily; Duncan, Bryan; Cook, Melanie; Liu, Junhua

    2017-01-01

    We provide an introduction to a new high-resolution (0.25 degree) global composition forecast produced by NASA's Global Modeling and Assimilation Office. The NASA Goddard Earth Observing System version 5 (GEOS-5) model has been expanded to provide global near-real-time forecasts of atmospheric composition at a horizontal resolution of 0.25 degrees (approximately 25 km). Previously, this combination of detailed chemistry and resolution was only provided by regional models. This system combines the operational GEOS-5 weather forecasting model with the state-of-the-science GEOS-Chem chemistry module (version 11) to provide detailed chemical analysis of a wide range of air pollutants such as ozone, carbon monoxide, nitrogen oxides, and fine particulate matter (PM2.5). The forecasts have the highest resolution of any current, publicly available global composition forecast. Evaluation and validation of modeled trace gases and aerosols against surface and satellite observations will be presented for constituents relevant to health-based air quality standards. Comparisons of modeled trace gases and aerosols against satellite observations show that the model produces realistic concentrations of atmospheric constituents in the free troposphere. Model comparisons against surface observations highlight the model's capability to capture the diurnal variability of air pollutants under a variety of meteorological conditions. The GEOS-5 composition forecasting system offers a new tool for scientists and the public health community, and is being developed jointly with several government and non-profit partners. Potential applications include air quality warnings, flight campaign planning and exposure studies using the archived analysis fields.

  7. Familial ethnic socialization, gender role attitudes, and ethnic identity development in Mexican-origin early adolescents.

    PubMed

    Sanchez, Delida; Whittaker, Tiffany A; Hamilton, Emma; Arango, Sarah

    2017-07-01

    This study examined the relations between familial ethnic socialization and ethnic identity development in 438 Mexican-origin (n = 242 boys and n = 196 girls) preadolescents. In addition, machismo and marianismo gender role attitudes were examined as potential mediators in this link. Confirmatory factor analyses (CFA) of the Familial Ethnic Socialization Scale (FES), Machismo Measure (MM), Marianismo Beliefs Scale (MBS), and the Ethnic Identity Brief Scale (EISB) were conducted to test the factor structure with a preadolescent Mexican-origin sample. Separate path analyses of analytic models were then performed on boys and girls. Results of the CFAs for survey measures revealed that for the FES, a 1-factor version indicated acceptable fit; for the MM, the original 2-factor structure indicated acceptable model fit; for the MBS, a revised 3-factor version indicated acceptable model fit; and, for the EISB, the affirmation and resolution dimensions showed acceptable fit. Among boys, FES was significantly and positively linked to caballerismo, and EISB affirmation and resolution; furthermore, the links between FES and EISB affirmation and resolution were indirectly connected by caballerismo. In addition, traditional machismo was negatively linked to EISB affirmation, and caballerismo was positively linked to EISB affirmation and resolution. Among girls, FES was significantly and positively related to the MBS-virtuous/chaste pillar, and EISB affirmation and resolution. The MBS-subordinate to others pillar was negatively linked to EISB affirmation. This study underscores the importance of FES and positive gender role attitudes in the link to ethnic identity development among Mexican-origin preadolescents. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Evaluation of improved land use and canopy representation in ...

    EPA Pesticide Factsheets

    Biogenic volatile organic compounds (BVOC) participate in reactions that can lead to secondarily formed ozone and particulate matter (PM), impacting air quality and climate. BVOC emissions are important inputs to chemical transport models applied on local to global scales, but considerable uncertainty remains in the representation of canopy parameterizations and emission algorithms for different vegetation species. The Biogenic Emission Inventory System (BEIS) has been used to support both scientific and regulatory model assessments for ozone and PM. Here we describe a new version of BEIS with updated input vegetation data and a revised canopy model formulation for estimating leaf temperature, and the impact of these updates on estimated BVOC. The Biogenic Emission Landuse Database (BELD) was revised to incorporate land use data from the Moderate Resolution Imaging Spectroradiometer (MODIS) land product and 2006 National Land Cover Database (NLCD) land cover. Vegetation species data are based on the US Forest Service (USFS) Forest Inventory and Analysis (FIA) version 5.1 for 2002–2013 and US Department of Agriculture (USDA) 2007 census of agriculture data. This update results in generally higher BVOC emissions throughout California compared with the previous version of BEIS. Baseline and updated BVOC emission estimates are used in Community Multiscale Air Quality (CMAQ) Model simulations with 4 km grid resolution and evaluated with measurements of isoprene and monoterpenes.

  9. View Toward 'Vera Rubin Ridge' on Mount Sharp, Mars

    NASA Image and Video Library

    2017-07-11

    This look ahead from NASA's Curiosity Mars rover includes four geological layers to be examined by the mission, and higher reaches of Mount Sharp beyond the planned study area. The redder rocks of the foreground are part of the Murray formation. Pale gray rocks in the middle distance of the right half of the image are in the Clay Unit. A band between those terrains is "Vera Rubin Ridge." Rounded brown knobs beyond the Clay Unit are in the Sulfate Unit, beyond which lie higher portions of the mountain. The view combines six images taken with the rover's Mast Camera (Mastcam) on Jan. 24, 2017, during the 1,589th Martian day, or sol, of Curiosity's work on Mars, when the rover was still more than half a mile (about a kilometer) north of Vera Rubin Ridge. The panorama has been white-balanced so that the colors of the rock and sand materials resemble how they would appear under daytime lighting conditions on Earth. It spans from east-southeast on the left to south on the right. The Sol 1589 location was just north of the waypoint labeled "Ogunquit Beach" on a map of the area that also shows locations of the Murray formation, Vera Rubin Ridge, Clay Unit and Sulfate Unit. The ridge was informally named in early 2017 in memory of Vera Cooper Rubin (1928-2016), whose astronomical observations provided evidence for the existence of the universe's dark matter. Annotated and full resolution TIFF files are available at https://photojournal.jpl.nasa.gov/catalog/PIA21716

  10. NASA IMAGESEER: NASA IMAGEs for Science, Education, Experimentation and Research

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline; Grubb, Thomas G.; Milner, Barbara C.

    2012-01-01

    A number of web-accessible databases, including medical, military or other image data, offer universities and other users the ability to teach or research new Image Processing techniques on relevant and well-documented data. However, NASA images have traditionally been difficult for researchers to find, are often only available in hard-to-use formats, and do not always provide sufficient context and background for a non-NASA Scientist user to understand their content. The new IMAGESEER (IMAGEs for Science, Education, Experimentation and Research) database seeks to address these issues. Through a graphically-rich web site for browsing and downloading all of the selected datasets, benchmarks, and tutorials, IMAGESEER provides a widely accessible database of NASA-centric, easy to read, image data for teaching or validating new Image Processing algorithms. As such, IMAGESEER fosters collaboration between NASA and research organizations while simultaneously encouraging development of new and enhanced Image Processing algorithms. The first prototype includes a representative sampling of NASA multispectral and hyperspectral images from several Earth Science instruments, along with a few small tutorials. Image processing techniques are currently represented with cloud detection, image registration, and map cover/classification. For each technique, corresponding data are selected from four different geographic regions, i.e., mountains, urban, water coastal, and agriculture areas. Satellite images have been collected from several instruments - Landsat-5 and -7 Thematic Mappers, Earth Observing-1 (EO-1) Advanced Land Imager (ALI) and Hyperion, and the Moderate Resolution Imaging Spectroradiometer (MODIS). After geo-registration, these images are available in simple common formats such as GeoTIFF and raw formats, along with associated benchmark data.

  11. HiPS - Hierarchical Progressive Survey Version 1.0

    NASA Astrophysics Data System (ADS)

    Fernique, Pierre; Allen, Mark; Boch, Thomas; Donaldson, Tom; Durand, Daniel; Ebisawa, Ken; Michel, Laurent; Salgado, Jesus; Stoehr, Felix

    2017-05-01

    This document presents HiPS, a hierarchical scheme for the description, storage and access of sky survey data. The system is based on hierarchical tiling of sky regions at finer and finer spatial resolution which facilitates a progressive view of a survey, and supports multi-resolution zooming and panning. HiPS uses the HEALPix tessellation of the sky as the basis for the scheme and is implemented as a simple file structure with a direct indexing scheme that leads to practical implementations.
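The direct indexing scheme described above can be illustrated with a short sketch of the HiPS tile-naming convention, in which a tile is addressed purely by its HEALPix order and pixel index; the extension and example values here are illustrative:

```python
# Sketch of the HiPS file layout: tiles at HEALPix order k are bucketed into
# directories of 10000 so that no directory grows too large, and a tile's
# relative path is computed directly from its index (no lookup table needed).

def hips_tile_path(order: int, npix: int, ext: str = "png") -> str:
    """Return the relative path of a HiPS tile, Norder/Dir/Npix convention."""
    dir_index = (npix // 10000) * 10000   # 10000 tiles per directory bucket
    return f"Norder{order}/Dir{dir_index}/Npix{npix}.{ext}"

def n_tiles(order: int) -> int:
    """Number of HEALPix pixels (tiles) covering the sphere at a given order."""
    return 12 * 4 ** order

print(hips_tile_path(6, 24083))  # -> Norder6/Dir20000/Npix24083.png
print(n_tiles(3))                # -> 768
```

Because the path is a pure function of (order, index), a client can zoom and pan progressively by fetching only the tiles it needs.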

  12. Tools for Implementing the Recent IAU Resolutions: USNO Circular 179 and the NOVAS Software Package

    NASA Astrophysics Data System (ADS)

    Kaplan, G. H.; Bangert, J. A.

    2006-08-01

    The resolutions on positional astronomy adopted at the 1997 and 2000 IAU General Assemblies are far-reaching in scope, affecting both the details of various computations and the basic concepts upon which they are built. For many scientists and engineers, applying these recommendations to practical problems is thus doubly challenging. Because the U.S. Naval Observatory (USNO) serves a broad base of users, we have provided two different tools to aid in implementing the resolutions, both of which are intended for the person who is knowledgeable but not necessarily expert in positional astronomy. These tools complement the new material that has been added to The Astronomical Almanac (see paper by Hohenkerk). USNO Circular 179 is a 118-page book that introduces the resolutions to non-specialists. It includes extensive narratives describing the basic concepts as well as compilations of the equations necessary to apply the recommendations. The resolutions have been logically grouped into six main chapters. The Circular is available as a hard-cover book or as a PDF file that can be downloaded from either the USNO/AA web site (http://aa.usno.navy.mil/) or arXiv.org. NOVAS (Naval Observatory Vector Astrometry Subroutines) is a source-code library available in both Fortran and C. It is a long-established package with a wide user base that has recently been extensively revised (in version 3.0) to implement the recent IAU resolutions. However, use of NOVAS does not require detailed knowledge of the resolutions, since commonly requested high-level data (for example, topocentric positions of stars or planets) are provided in a single call. NOVAS can be downloaded from the USNO/AA web site. Both Circular 179 and NOVAS version 3.0 anticipate IAU adoption of the recommendations of the 2003-2006 working groups on precession and nomenclature.

  13. Highlights of satellite-based forest change recognition and tracking using the ForWarn System

    Treesearch

    Steven P. Norman; William W. Hargrove; Joseph P. Spruce; William M. Christie; Sean W. Schroeder

    2013-01-01

    For a higher-resolution version of this file, please use the following link: www.geobabble.org. Satellite-based remote sensing can assist forest managers with their need to recognize disturbances and track recovery. Despite the long...

  14. Resolution of ranking hierarchies in directed networks.

    PubMed

    Letizia, Elisa; Barucca, Paolo; Lillo, Fabrizio

    2018-01-01

    Identifying hierarchies and rankings of nodes in directed graphs is fundamental in many applications such as social network analysis, biology, economics, and finance. A recently proposed method identifies the hierarchy by finding the ordered partition of nodes which minimises a score function, termed agony. This function penalises the links violating the hierarchy in a way depending on the strength of the violation. To investigate the resolution of ranking hierarchies we introduce an ensemble of random graphs, the Ranked Stochastic Block Model. We find that agony may fail to identify hierarchies when the structure is not strong enough and the size of the classes is small with respect to the whole network. We analytically characterise the resolution threshold and we show that an iterated version of agony can partly overcome this resolution limit.
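As a concrete illustration, the agony of a given ranking can be evaluated with the commonly used penalty max(rank(u) - rank(v) + 1, 0) for each directed edge u -> v; the method described above searches for the ranking minimising this score, while the toy sketch below (with an illustrative graph) only evaluates it:

```python
# Agony of a fixed node ranking: edges pointing "forward" (from a strictly
# lower rank to a higher one) cost nothing; edges violating the hierarchy
# are penalised in proportion to the strength of the violation.

def agony(edges, rank):
    return sum(max(rank[u] - rank[v] + 1, 0) for u, v in edges)

# Perfect hierarchy: every edge goes from a lower rank to a strictly higher one.
edges = [("a", "b"), ("a", "c"), ("b", "c")]
rank = {"a": 0, "b": 1, "c": 2}
print(agony(edges, rank))                  # -> 0, no violations

# A single backward edge c -> a incurs penalty rank(c) - rank(a) + 1 = 3.
print(agony(edges + [("c", "a")], rank))   # -> 3
```

The resolution question studied in the paper is whether minimising this score can still recover the planted ranks when the backward-edge signal is weak or the rank classes are small.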

  15. Resolution of ranking hierarchies in directed networks

    PubMed Central

    Barucca, Paolo; Lillo, Fabrizio

    2018-01-01

    Identifying hierarchies and rankings of nodes in directed graphs is fundamental in many applications such as social network analysis, biology, economics, and finance. A recently proposed method identifies the hierarchy by finding the ordered partition of nodes which minimises a score function, termed agony. This function penalises the links violating the hierarchy in a way depending on the strength of the violation. To investigate the resolution of ranking hierarchies we introduce an ensemble of random graphs, the Ranked Stochastic Block Model. We find that agony may fail to identify hierarchies when the structure is not strong enough and the size of the classes is small with respect to the whole network. We analytically characterise the resolution threshold and we show that an iterated version of agony can partly overcome this resolution limit. PMID:29394278

  16. Cris-atms Retrievals Using an AIRS Science Team Version 6-like Retrieval Algorithm

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Kouvaris, Louis C.; Iredell, Lena

    2014-01-01

    CrIS is the high-spectral-resolution infrared atmospheric sounder launched on Suomi NPP in 2011; together, CrIS and ATMS comprise the IR/MW sounding suite on Suomi NPP. CrIS is functionally equivalent to AIRS, the high-spectral-resolution IR sounder launched on EOS Aqua in 2002, and ATMS is functionally equivalent to AMSU on EOS Aqua. CrIS is an interferometer, whereas AIRS is a grating spectrometer. The spectral coverage, spectral resolution, and channel noise of CrIS are similar to those of AIRS, but CrIS spectral sampling is roughly twice as coarse: AIRS has 2378 channels between 650 cm-1 and 2665 cm-1, while CrIS has 1305 channels between 650 cm-1 and 2550 cm-1. The spatial resolution of CrIS is comparable to that of AIRS.

  17. Chandra Reads the Cosmic Bar Code of Gas Around a Black Hole

    NASA Astrophysics Data System (ADS)

    2000-02-01

    An international team of astronomers has used NASA's Chandra X-ray Observatory to make an energy bar code of hot gas in the vicinity of a giant black hole. These measurements, the most precise of their kind ever made with an X-ray telescope, demonstrate the existence of a blanket of warm gas that is expanding rapidly away from the black hole. The team consists of Jelle Kaastra, Rolf Mewe and Albert Brinkman of Space Research Organization Netherlands (SRON) in Utrecht, Duane Liedahl of Lawrence Livermore National Laboratory in Livermore, Calif., and Stefanie Komossa of Max Planck Institute in Garching, Germany. A report of their findings will be published in the March issue of the European journal Astronomy & Astrophysics. Kaastra and colleagues used the Low Energy Transmission Grating in conjunction with the High Resolution Camera to measure the number of X rays present at each energy. With this information they constructed an X-ray spectrum of the source. Their target was the central region, or nucleus of the galaxy NGC 5548, which they observed for 24 hours. This galaxy is one of a class of galaxies known to have unusually bright nuclei that are associated with gas flowing around and into giant black holes. This inflow produces an enormous outpouring of energy that blows some of the matter away from the black hole. Astronomers have used optical, ultraviolet, and X-ray telescopes in an effort to disentangle the complex nature of inflowing and outflowing gas at different distances from the black hole in NGC 5548. X-ray observations provide a ringside seat to the action around the black hole. By using the Low Energy Transmission Grating, the Dutch-US-German team concentrated on gas that forms a warm blanket that partially covers the innermost region where the highest energy X-rays are produced. 
As the high-energy X rays stream away from the vicinity of the black hole, they heat the blanketing gas to temperatures of a few million degrees, and the blanket absorbs some of the X rays from the central source. This produces dark stripes, or absorption lines, in the X-ray spectrum. Bright stripes, or emission lines, due to emission from the blanketing gas are also present. Since each element has its own unique structure, these lines can be read like a cosmic bar code to take inventory of the gas. The team was able to determine what atoms the gas contains and how many, the number of electrons each atom has retained in the hostile environment of the black hole, and how the gas is moving there. They found lines from eight different elements including carbon, nitrogen, oxygen, and iron. The amount of this gas was found to be about 100 times greater than that found with optical and ultraviolet observations. The Low Energy Transmission Grating was built by SRON and the Max Planck Institute under the direction of Albert Brinkman. The High Resolution Camera was built by the Smithsonian Astrophysical Observatory in Cambridge, Mass., under the direction of Stephen Murray. To follow Chandra's progress or download images visit the Chandra sites at: http://chandra.harvard.edu/photo/2000/0170/index.html and http://chandra.nasa.gov. NASA's Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program. TRW, Inc., Redondo Beach, Calif., is the prime contractor for the spacecraft. The Smithsonian's Chandra X-ray Center controls science and flight operations from Cambridge, Mass. High-resolution digital versions of the X-ray spectrum (JPG, 300 dpi TIFF) and other information associated with this release are available on the Internet at: http://chandra.harvard.edu

  18. A New Approach in Downscaling Microwave Soil Moisture Product using Machine Learning

    NASA Astrophysics Data System (ADS)

    Abbaszadeh, Peyman; Yan, Hongxiang; Moradkhani, Hamid

    2016-04-01

    Understanding the soil moisture pattern has a significant impact on flood modeling, drought monitoring, and irrigation management. Although satellite retrievals can provide an unprecedented spatial and temporal resolution of soil moisture at global scale, their soil moisture products (with a spatial resolution of 25-50 km) are inadequate for regional studies, where a resolution of 1-10 km is needed. In this study, a downscaling approach using Genetic Programming (GP), a specialized version of the Genetic Algorithm (GA), is proposed to improve the spatial resolution of satellite soil moisture products. The GP approach was applied over a test watershed in the United States using the coarse-resolution satellite data (25 km) from Advanced Microwave Scanning Radiometer - EOS (AMSR-E) soil moisture products, the fine-resolution data (1 km) from the Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation index, and ground-based data including land surface temperature, vegetation, and other potential physical variables. The results indicated the great potential of this approach to derive fine-resolution soil moisture information applicable for data assimilation and other regional studies.
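The train-at-coarse-scale, apply-at-fine-scale pattern behind such downscaling can be sketched as follows, with ordinary least squares standing in for the genetic-programming regressor and with purely synthetic data (all names and values are illustrative, not taken from the study):

```python
import numpy as np

# Downscaling sketch: fit a relation between fine-scale predictors (NDVI and
# land surface temperature stand-ins) aggregated to the coarse grid and the
# coarse soil moisture, then apply the same relation at fine resolution.

rng = np.random.default_rng(0)

# Fine-grid predictors on a 50 x 50 grid (playing the role of 1 km data).
ndvi = rng.uniform(0.1, 0.8, (50, 50))
lst = rng.uniform(280.0, 310.0, (50, 50))

def aggregate(x, f=25):
    """Block-average a fine grid down to the coarse grid (f x f blocks)."""
    h, w = x.shape
    return x.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

# Synthetic "truth": soil moisture rises with NDVI and falls with LST.
sm_fine = 0.5 * ndvi - 0.004 * (lst - 280.0) + 0.1
sm_coarse = aggregate(sm_fine)        # plays the role of the 25 km product

# Fit at the coarse scale: coarse soil moisture ~ aggregated predictors.
X = np.column_stack([aggregate(ndvi).ravel(), aggregate(lst).ravel(),
                     np.ones(sm_coarse.size)])
coef, *_ = np.linalg.lstsq(X, sm_coarse.ravel(), rcond=None)

# Apply the fitted relation at the fine scale to obtain downscaled values.
Xf = np.column_stack([ndvi.ravel(), lst.ravel(), np.ones(ndvi.size)])
sm_downscaled = (Xf @ coef).reshape(ndvi.shape)
print(float(np.abs(sm_downscaled - sm_fine).max()))  # tiny for this linear toy
```

In the study a GP-evolved expression replaces the linear fit, which lets the mapping be nonlinear and select its own predictors; the coarse-train/fine-apply structure is the same.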

  19. Hybrid parallelization of the XTOR-2F code for the simulation of two-fluid MHD instabilities in tokamaks

    NASA Astrophysics Data System (ADS)

    Marx, Alain; Lütjens, Hinrich

    2017-03-01

    A hybrid MPI/OpenMP parallel version of the XTOR-2F code [Lütjens and Luciani, J. Comput. Phys. 229 (2010) 8130], solving the two-fluid MHD equations in full tokamak geometry by means of an iterative Newton-Krylov matrix-free method, has been developed. The present work shows that the code has been parallelized significantly despite the numerical profile of the problem solved by XTOR-2F, i.e. a discretization with pseudo-spectral representations in all angular directions, the stiffness of the two-fluid stability problem in tokamaks, and the use of a direct LU decomposition to invert the physical pre-conditioner at every Krylov iteration of the solver. The execution time of the parallelized version is an order of magnitude smaller than that of the sequential one for low-resolution cases, with an increasing speedup as the discretization mesh is refined. Moreover, it makes it possible to perform simulations at higher resolutions, previously out of reach because of memory limitations.

  20. Dynamic Computation of Change Operations in Version Management of Business Process Models

    NASA Astrophysics Data System (ADS)

    Küster, Jochen Malte; Gerth, Christian; Engels, Gregor

    Version management of business process models requires that changes can be resolved by applying change operations. In order to give a user maximal freedom concerning the application order of change operations, position parameters of change operations must be computed dynamically during change resolution. In such an approach, change operations with computed position parameters must be applicable on the model and dependencies and conflicts of change operations must be taken into account because otherwise invalid models can be constructed. In this paper, we study the concept of partially specified change operations where parameters are computed dynamically. We provide a formalization for partially specified change operations using graph transformation and provide a concept for their applicability. Based on this, we study potential dependencies and conflicts of change operations and show how these can be taken into account within change resolution. Using our approach, a user can resolve changes of business process models without being unnecessarily restricted to a certain order.

  1. The Version 2 Global Precipitation Climatology Project (GPCP) Monthly Precipitation Analysis (1979-Present)

    NASA Technical Reports Server (NTRS)

    Adler, Robert F.; Huffman, George J.; Chang, Alfred; Ferraro, Ralph; Xie, Ping-Ping; Janowiak, John; Rudolf, Bruno; Schneider, Udo; Curtis, Scott; Bolvin, David

    2003-01-01

    The Global Precipitation Climatology Project (GPCP) Version 2 Monthly Precipitation Analysis is described. This globally complete, monthly analysis of surface precipitation at 2.5 degrees x 2.5 degrees latitude-longitude resolution is available from January 1979 to the present. It is a merged analysis that incorporates precipitation estimates from low-orbit-satellite microwave data, geosynchronous-orbit-satellite infrared data, and rain gauge observations. The merging approach utilizes the higher accuracy of the low-orbit microwave observations to calibrate, or adjust, the more frequent geosynchronous infrared observations. The data set is extended back into the pre-microwave era (before 1987) by using infrared-only observations calibrated to the microwave-based analysis of the later years. The combined satellite-based product is adjusted by the rain gauge analysis. This monthly analysis is the foundation for the GPCP suite of products, including those at finer temporal resolution, the individual satellite estimates, and error estimates for each field. The 23-year GPCP climatology is characterized, along with time and space variations of precipitation.
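The calibration idea, using the sparser but more accurate sensor to adjust the more frequent one, can be sketched in a toy form (synthetic data and illustrative variable names; the actual GPCP merging procedure is considerably more elaborate):

```python
import numpy as np

# Toy calibration: where microwave (MW) and geosynchronous IR estimates
# coincide, derive a multiplicative adjustment matching the IR mean to the
# MW mean, then apply it to the full, more frequent IR record.

rng = np.random.default_rng(1)
truth = rng.gamma(2.0, 2.0, 1000)                 # "true" rain rates, mm/day

mw = truth + rng.normal(0.0, 0.3, truth.size)     # accurate but sparse sensor
ir = 1.4 * truth + rng.normal(0.0, 0.5, truth.size)  # biased but frequent

coincident = slice(0, 200)                 # samples where both sensors exist
ratio = mw[coincident].mean() / ir[coincident].mean()
ir_calibrated = ratio * ir                 # adjusted IR estimate, full record

bias_before = abs(ir.mean() - truth.mean())
bias_after = abs(ir_calibrated.mean() - truth.mean())
print(bias_before > bias_after)            # calibration reduces the mean bias
```

The same transfer-of-calibration logic underlies the pre-1987 extension: IR-only years are adjusted against the microwave-based analysis of the later period.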

  2. JWST testbed telescope: a functionally accurate scaled version of the flight optical telescope element used to develop the flight wavefront sensing and control algorithm

    NASA Astrophysics Data System (ADS)

    Kingsbury, Lana K.; Atcheson, Paul D.

    2004-10-01

    The Northrop-Grumman/Ball/Kodak team is building the JWST observatory that will be launched in 2011. To develop the flight wavefront sensing and control (WFS&C) algorithms and software, Ball is designing and building a 1 meter diameter, functionally accurate version of the JWST optical telescope element (OTE). This testbed telescope (TBT) will incorporate the same optical element control capability as the flight OTE. The secondary mirror will be controlled by a 6 degree of freedom (dof) hexapod and each of the 18 segmented primary mirror assemblies will have 6 dof hexapod control as well as radius of curvature adjustment capability. In addition to the highly adjustable primary and secondary mirrors, the TBT will include a rigid tertiary mirror, 2 fold mirrors (to direct light into the TBT) and a very stable supporting structure. The total telescope system configured residual wavefront error will be better than 175 nm RMS double pass. The primary and secondary mirror hexapod assemblies enable 5 nm piston resolution, 0.0014 arcsec tilt resolution, 100 nm translation resolution, and 0.04497 arcsec clocking resolution. The supporting structure (specifically the secondary mirror support structure) is designed to ensure that the primary mirror segments will not change their despace position relative to the secondary mirror (spaced > 1 meter apart) by greater than 500 nm within a one hour period of ambient clean room operation.

  3. Evaluation of NorESM-OC (versions 1 and 1.2), the ocean carbon-cycle stand-alone configuration of the Norwegian Earth System Model (NorESM1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwinger, Jorg; Goris, Nadine; Tjiputra, Jerry F.

    Idealised and hindcast simulations performed with the stand-alone ocean carbon-cycle configuration of the Norwegian Earth System Model (NorESM-OC) are described and evaluated. We present simulation results of three different model configurations (two different model versions at different grid resolutions) using two different atmospheric forcing data sets. Model version NorESM-OC1 corresponds to the version that is included in the NorESM-ME1 fully coupled model, which participated in CMIP5. The main update between NorESM-OC1 and NorESM-OC1.2 is the addition of two new options for the treatment of sinking particles. We find that using a constant sinking speed, which has been the standard in NorESM's ocean carbon cycle module HAMOCC (HAMburg Ocean Carbon Cycle model), does not transport enough particulate organic carbon (POC) into the deep ocean below approximately 2000 m depth. The two newly implemented parameterisations, a particle aggregation scheme with prognostic sinking speed, and a simpler scheme that uses a linear increase in the sinking speed with depth, provide better agreement with observed POC fluxes. Additionally, reduced deep ocean biases of oxygen and remineralised phosphate indicate a better performance of the new parameterisations. For model version 1.2, a re-tuning of the ecosystem parameterisation has been performed, which (i) reduces previously too high primary production at high latitudes, (ii) consequently improves model results for surface nutrients, and (iii) reduces alkalinity and dissolved inorganic carbon biases at low latitudes. We use hindcast simulations with prescribed observed and constant (pre-industrial) atmospheric CO2 concentrations to derive the past and contemporary ocean carbon sink. As a result, for the period 1990–1999 we find an average ocean carbon uptake ranging from 2.01 to 2.58 Pg C yr-1 depending on model version, grid resolution, and atmospheric forcing data set.

  4. Evaluation of NorESM-OC (versions 1 and 1.2), the ocean carbon-cycle stand-alone configuration of the Norwegian Earth System Model (NorESM1)

    DOE PAGES

    Schwinger, Jorg; Goris, Nadine; Tjiputra, Jerry F.; ...

    2016-08-02

    Idealised and hindcast simulations performed with the stand-alone ocean carbon-cycle configuration of the Norwegian Earth System Model (NorESM-OC) are described and evaluated. We present simulation results of three different model configurations (two different model versions at different grid resolutions) using two different atmospheric forcing data sets. Model version NorESM-OC1 corresponds to the version that is included in the NorESM-ME1 fully coupled model, which participated in CMIP5. The main update between NorESM-OC1 and NorESM-OC1.2 is the addition of two new options for the treatment of sinking particles. We find that using a constant sinking speed, which has been the standard in NorESM's ocean carbon cycle module HAMOCC (HAMburg Ocean Carbon Cycle model), does not transport enough particulate organic carbon (POC) into the deep ocean below approximately 2000 m depth. The two newly implemented parameterisations, a particle aggregation scheme with prognostic sinking speed, and a simpler scheme that uses a linear increase in the sinking speed with depth, provide better agreement with observed POC fluxes. Additionally, reduced deep ocean biases of oxygen and remineralised phosphate indicate a better performance of the new parameterisations. For model version 1.2, a re-tuning of the ecosystem parameterisation has been performed, which (i) reduces previously too high primary production at high latitudes, (ii) consequently improves model results for surface nutrients, and (iii) reduces alkalinity and dissolved inorganic carbon biases at low latitudes. We use hindcast simulations with prescribed observed and constant (pre-industrial) atmospheric CO2 concentrations to derive the past and contemporary ocean carbon sink. As a result, for the period 1990–1999 we find an average ocean carbon uptake ranging from 2.01 to 2.58 Pg C yr-1 depending on model version, grid resolution, and atmospheric forcing data set.
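Why a depth-increasing sinking speed carries more POC below ~2000 m can be seen from a simple flux balance: at steady state with remineralisation rate r, the downward flux obeys dF/dz = -(r / w(z)) F, so slowly sinking particles are consumed over a shorter distance. A small numerical sketch (parameter values are illustrative, not NorESM's):

```python
import numpy as np

# Integrate the steady-state POC flux equation dF/dz = -(r / w(z)) F for a
# constant sinking speed and for a speed increasing linearly with depth.

r = 0.05                              # remineralisation rate, 1/day
z = np.linspace(0.0, 4000.0, 4001)   # depth, m (1 m steps)

def flux_profile(w):
    """Integrate the flux downward from a normalised surface flux F(0) = 1."""
    F = np.ones_like(z)
    for i in range(1, z.size):
        dz = z[i] - z[i - 1]
        F[i] = F[i - 1] * np.exp(-r / w(z[i]) * dz)
    return F

F_const = flux_profile(lambda depth: 5.0)                   # constant 5 m/day
F_linear = flux_profile(lambda depth: 5.0 + 0.05 * depth)   # grows with depth

# At 2000 m, the depth-increasing speed delivers a far larger fraction of the
# surface flux than the constant speed does.
print(F_const[2000], F_linear[2000])
```

With a constant speed the flux decays exponentially, while the linearly increasing speed yields a power-law-like profile, consistent with the deeper POC penetration the abstract reports for the new schemes.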

  5. Land Surface Temperature Measurements from EOS MODIS Data

    NASA Technical Reports Server (NTRS)

    Wan, Zheng-Ming

    2004-01-01

    This report summarizes the accomplishments made by the MODIS LST (Land-Surface Temperature) group at the University of California, Santa Barbara, under a NASA contract. Version 1 of the MODIS Land-Surface Temperature Algorithm Theoretical Basis Document (ATBD) was reviewed in June 1994, version 2 in November 1994, version 3.1 in August 1996, and version 3.3 was updated in April 1999. Based on the ATBD, two LST algorithms were developed: one is the generalized split-window algorithm and the other is the physics-based day/night LST algorithm. These two LST algorithms were implemented into the production generation executive code (PGE 16) for the daily standard MODIS LST products at level-2 (MOD11_L2) and level-3 (MOD11A1 at 1 km resolution and MOD11B1 at 5 km resolution). PGE codes for the 8-day 1 km LST product (MOD11A2) and the daily, 8-day, and monthly LST products at 0.05 degree latitude/longitude climate model grids (CMG) were also delivered. Four to six field campaigns were conducted each year since 2000 to validate the daily LST products generated by PGE16 and the calibration accuracies of the MODIS TIR bands used for the LST/emissivity retrieval, from versions 2-4 of Terra MODIS data and versions 3-4 of Aqua MODIS data. Validation results from temperature-based and radiance-based methods indicate that the MODIS LST accuracy is better than 1 °C in most clear-sky cases in the range from -10 to 58 °C. One of the major lessons learned from multi-year temporal analysis of the consistent V4 daily Terra MODIS LST products in 2000-2003 over selected target areas, including lakes, snow/ice fields, and semi-arid sites, is that there are variable numbers of cloud-contaminated LSTs in the MODIS LST products depending on surface elevation, land cover types, and atmospheric conditions. A cloud-screen scheme with constraints on spatial and temporal variations in LSTs was developed to remove cloud-contaminated LSTs. The 5 km LST product was indirectly validated through comparisons to the 1 km LST product. Twenty-three papers related to the LST research work were published in journals over the last decade.

  6. Photogrammetric Processing of IceBridge DMS Imagery into High-Resolution Digital Surface Models (DEM and Visible Overlay)

    NASA Astrophysics Data System (ADS)

    Arvesen, J. C.; Dotson, R. C.

    2014-12-01

The DMS (Digital Mapping System) has been a sensor component of all DC-8 and P-3 IceBridge flights since 2009 and has acquired over 3 million JPEG images over Arctic and Antarctic land and sea ice. The DMS imagery is primarily used for identifying and locating open leads for LiDAR sea-ice freeboard measurements and for documenting snow and ice surface conditions. The DMS is a COTS Canon SLR camera with a 28 mm focal-length lens, yielding a 10 cm GSD and a swath of ~400 meters from a nominal flight altitude of 500 meters. Exterior orientation is provided by an Applanix IMU/GPS, which records a TTL pulse coincident with image acquisition. Notably, parallel grids are not flown on virtually any IceBridge flights, so there is no way to photogrammetrically tie imagery to adjacent flight lines. Approximately 800,000 Level-3 DMS Surface Model data products have been delivered to NSIDC, each consisting of a Digital Elevation Model (GeoTIFF DEM) and a co-registered Visible Overlay (GeoJPEG). Absolute elevation accuracy for each individual Elevation Model is adjusted to concurrent Airborne Topographic Mapper (ATM) LiDAR data, resulting in higher elevation accuracy than can be achieved by photogrammetry alone. The adjustment methodology forces a zero mean difference to the corresponding ATM point cloud integrated over each DMS frame. Statistics are calculated for each DMS Elevation Model frame and show RMS differences within +/- 10 cm with respect to the ATM point cloud. The DMS Surface Model possesses similar elevation accuracy to the ATM point cloud, but with the following advantages: higher and uniform spatial resolution (40 cm GSD); a 45% wider swath (435 meters vs. 300 meters at 500 meter flight altitude); a visible RGB co-registered overlay at 10 cm GSD; and enhanced visualization through 3-dimensional virtual reality (i.e., video fly-through). Examples of the utility of these advantages will be presented, along with a novel use of a cell phone camera for aerial photogrammetry.
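
    The zero-mean adjustment described above can be sketched in a few lines of numpy (a minimal illustration with hypothetical array names, not the operational IceBridge processing code):

```python
import numpy as np

def adjust_dem_to_atm(dem, atm_elevations, atm_rows, atm_cols):
    """Shift a photogrammetric DEM so that its mean difference to the
    ATM lidar point cloud over the frame is zero (hypothetical helper)."""
    # Sample the DEM at the grid cells containing ATM returns
    dem_at_atm = dem[atm_rows, atm_cols]
    # Mean photogrammetry-minus-lidar offset over this frame
    offset = np.nanmean(dem_at_atm - atm_elevations)
    # Remove the offset so the frame-mean difference is zero
    return dem - offset

# Toy example: a flat 5.0 m DEM, while ATM reports the surface at 4.9 m
dem = np.full((100, 100), 5.0)
rows = np.array([10, 50, 90]); cols = np.array([20, 40, 60])
adjusted = adjust_dem_to_atm(dem, np.full(3, 4.9), rows, cols)
```

    A constant shift preserves the relative (photogrammetric) surface detail while anchoring the absolute datum to the lidar.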

  7. Enhancing the Accessibility and Utility of UAVSAR L-band SAR Data

    NASA Astrophysics Data System (ADS)

    Atwood, D.; Arko, S. A.; Gens, R.; Sanches, R. R.

    2011-12-01

The UAVSAR instrument, developed at the NASA Jet Propulsion Laboratory, is a reconfigurable L-band, quad-polarimetric Synthetic Aperture Radar (SAR) developed specifically for repeat-track differential interferometry (InSAR). It offers resolution of approximately 5 m and swaths greater than 16 km. Although designed to be flown aboard a UAV (Uninhabited Aerial Vehicle), it is currently being flown aboard a Gulfstream III in an ambitious set of campaigns around the world. The current archive, dating from 2009, contains data from more than 100 missions over North America, Central America, the Caribbean, and Greenland. Compared with most SAR data from satellites, UAVSAR offers higher resolution, full polarimetry, and an impressive noise floor. For scientists, these datasets present wonderful opportunities for understanding Earth processes and developing new algorithms for information extraction. Yet despite the diverse range of coverage, UAVSAR is still relatively under-utilized. In its capacity as the NASA SAR DAAC, the Alaska Satellite Facility (ASF) is interested in expanding recognition of these data and serving data products that can be readily downloaded into a Geographic Information System (GIS) environment. Two hurdles exist: one is the large size of the data products, and the second is the format of the data. The data volumes are in excess of several GB, making for slow downloads and overwhelming many software programs. Secondly, while the data are appropriately formatted for expert users, they may prove challenging for scientists who have not previously worked with SAR. This paper will address ways that ASF is working to reduce data volume while maintaining the integrity of the data. At the same time, the creation of value-added products that permit immediate visualization in a GIS environment will be described.
Conversion of the UAVSAR polarimetric data to radiometrically terrain-corrected Pauli images in a GeoTIFF format will permit researchers to understand the scattering mechanisms that characterize various land cover classes in their study areas. Specific examples of UAVSAR polarimetric classifications will be used to demonstrate the benefit of the UAVSAR products for Earth science projects.
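
    The Pauli images mentioned above come from a standard basis change on the quad-pol scattering matrix. A minimal numpy sketch (array names are hypothetical, not UAVSAR's product layout; writing the result to GeoTIFF would typically use GDAL or rasterio):

```python
import numpy as np

def pauli_rgb(hh, hv, vv):
    """Map quad-pol scattering amplitudes to Pauli-basis intensities.
    Conventionally R ~ double bounce, G ~ volume, B ~ surface."""
    r = np.abs(hh - vv) / np.sqrt(2.0)   # |S_hh - S_vv| : double-bounce
    g = np.sqrt(2.0) * np.abs(hv)        # sqrt(2)|S_hv| : volume scattering
    b = np.abs(hh + vv) / np.sqrt(2.0)   # |S_hh + S_vv| : surface (odd bounce)
    return np.dstack([r, g, b])

# Toy 2x2 scene: a pure surface scatterer (hh == vv, hv == 0)
hh = np.ones((2, 2), complex); vv = np.ones((2, 2), complex)
hv = np.zeros((2, 2), complex)
rgb = pauli_rgb(hh, hv, vv)
```

    For the pure surface scatterer only the blue (|S_hh + S_vv|) channel is non-zero, which is what makes Pauli composites useful for reading scattering mechanisms off a land-cover image.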

  8. VO-compliant libraries of high resolution spectra of cool stars

    NASA Astrophysics Data System (ADS)

    Montes, D.

    2008-10-01

In this contribution we describe a Virtual Observatory (VO) compliant version of the libraries of high resolution spectra of cool stars described by Montes et al. (1997, 1998, 1999). Since their publication, the fully reduced spectra in FITS format have been available via ftp and on the World Wide Web. In the VO, however, all the spectra will be accessible through a common web interface following the standards of the International Virtual Observatory Alliance (IVOA). These libraries include F, G, K and M field stars, from dwarfs to giants. The spectral coverage is from 3800 to 10000 Å, with spectral resolution ranging from 0.09 to 3.0 Å.

  9. MOEMS Fabry-Pérot interferometer with point-anchored Si-air mirrors for middle infrared

    NASA Astrophysics Data System (ADS)

    Tuohiniemi, Mikko; Näsilä, Antti; Akujärvi, Altti; Blomberg, Martti

    2014-09-01

We studied how a micromachined Fabry-Pérot interferometer, realized with wide point-anchored Si/air-gap reflectors, performs in the middle infrared. A computational analysis of the anchors' mechanical behavior is also presented. Compared with solid-film reflectors, this technology offers better index contrast, which enables a wider stop band and potentially higher resolution. In this work, we investigate whether the performance improves in line with the index-contrast benefit, or whether mechanical differences play a role. For comparison, we manufactured and characterized another design that applies solid-film reflectors of Si/SiO2 structure. These data are exploited as a reference for a middle-infrared interferometer and as a template for mapping the performance from the simulation results to the measured data. The novel Si/air-gap device was realized as a non-tunable proof-of-concept version. The measured data are mapped into an estimate of the achievable performance of a tunable version. We present the measured transmission and resolution data and compare the simulation models that reproduce them. The prediction for the tunable middle-infrared Si/air-gap device is then presented. The results indicate that the interferometer's resolution is expected to improve twofold, with a much wider stop band, compared with the prior art.
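
    The link between index contrast and resolution follows from textbook Fabry-Pérot relations: better index contrast allows higher mirror reflectance, which raises the finesse and narrows the passband. A sketch with illustrative numbers (not the device's actual specifications):

```python
import math

def fp_resolution(center_wl_nm, gap_nm, reflectance):
    """Ideal Fabry-Perot figures of merit (standard textbook formulas)."""
    fsr = center_wl_nm ** 2 / (2.0 * gap_nm)              # free spectral range
    finesse = math.pi * math.sqrt(reflectance) / (1.0 - reflectance)
    fwhm = fsr / finesse                                   # spectral resolution
    return fsr, finesse, fwhm

# Higher mirror reflectance (from better index contrast) narrows the passband
_, f_lo, fwhm_lo = fp_resolution(3500.0, 1750.0, 0.70)
_, f_hi, fwhm_hi = fp_resolution(3500.0, 1750.0, 0.90)
```

    Going from R = 0.70 to R = 0.90 more than triples the finesse, so the same cavity delivers a proportionally finer spectral resolution.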

  10. Broadband Heating Rate Profile Project (BBHRP) - SGP ripbe370mcfarlane

    DOE Data Explorer

    Riihimaki, Laura; Shippert, Timothy

    2014-11-05

    The objective of the ARM Broadband Heating Rate Profile (BBHRP) Project is to provide a structure for the comprehensive assessment of our ability to model atmospheric radiative transfer for all conditions. Required inputs to BBHRP include surface albedo and profiles of atmospheric state (temperature, humidity), gas concentrations, aerosol properties, and cloud properties. In the past year, the Radiatively Important Parameters Best Estimate (RIPBE) VAP was developed to combine all of the input properties needed for BBHRP into a single gridded input file. Additionally, an interface between the RIPBE input file and the RRTM was developed using the new ARM integrated software development environment (ISDE) and effort was put into developing quality control (qc) flags and provenance information on the BBHRP output files so that analysis of the output would be more straightforward. This new version of BBHRP, sgp1bbhrpripbeC1.c1, uses the RIPBE files as input to RRTM, and calculates broadband SW and LW fluxes and heating rates at 1-min resolution using the independent column approximation. The vertical resolution is 45 m in the lower and middle troposphere to match the input cloud properties, but is at coarser resolution in the upper atmosphere. Unlike previous versions, the vertical grid is the same for both clear-sky and cloudy-sky calculations.
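
    The broadband heating rates described above come from the vertical divergence of the net radiative flux in each independent column. A minimal numpy sketch on a 45 m grid (variable names are ours, not the BBHRP file conventions):

```python
import numpy as np

CP = 1004.0  # J kg^-1 K^-1, specific heat of dry air

def heating_rate_k_per_day(net_down_flux_wm2, z_m, rho_kgm3):
    """Radiative heating rate from the divergence of the net downward
    flux: HR = (1 / (rho * c_p)) * dF_net/dz, converted to K/day."""
    dF_dz = np.gradient(net_down_flux_wm2, z_m)   # W m^-2 per m
    hr_k_per_s = dF_dz / (rho_kgm3 * CP)          # K s^-1
    return hr_k_per_s * 86400.0                   # K day^-1

# Toy column on a 45 m grid: a height-constant net flux implies
# zero flux convergence, hence zero radiative heating
z = np.arange(0.0, 4500.0, 45.0)
hr = heating_rate_k_per_day(np.full_like(z, 100.0), z, np.full_like(z, 1.0))
```

    In the real product the SW and LW fluxes come from RRTM and the calculation is repeated per minute and per column.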

  11. Broadband Heating Rate Profile Project (BBHRP) - SGP 1bbhrpripbe1mcfarlane

    DOE Data Explorer

    Riihimaki, Laura; Shippert, Timothy

    2014-11-05

    The objective of the ARM Broadband Heating Rate Profile (BBHRP) Project is to provide a structure for the comprehensive assessment of our ability to model atmospheric radiative transfer for all conditions. Required inputs to BBHRP include surface albedo and profiles of atmospheric state (temperature, humidity), gas concentrations, aerosol properties, and cloud properties. In the past year, the Radiatively Important Parameters Best Estimate (RIPBE) VAP was developed to combine all of the input properties needed for BBHRP into a single gridded input file. Additionally, an interface between the RIPBE input file and the RRTM was developed using the new ARM integrated software development environment (ISDE) and effort was put into developing quality control (qc) flags and provenance information on the BBHRP output files so that analysis of the output would be more straightforward. This new version of BBHRP, sgp1bbhrpripbeC1.c1, uses the RIPBE files as input to RRTM, and calculates broadband SW and LW fluxes and heating rates at 1-min resolution using the independent column approximation. The vertical resolution is 45 m in the lower and middle troposphere to match the input cloud properties, but is at coarser resolution in the upper atmosphere. Unlike previous versions, the vertical grid is the same for both clear-sky and cloudy-sky calculations.

  12. Broadband Heating Rate Profile Project (BBHRP) - SGP ripbe1mcfarlane

    DOE Data Explorer

    Riihimaki, Laura; Shippert, Timothy

    2014-11-05

    The objective of the ARM Broadband Heating Rate Profile (BBHRP) Project is to provide a structure for the comprehensive assessment of our ability to model atmospheric radiative transfer for all conditions. Required inputs to BBHRP include surface albedo and profiles of atmospheric state (temperature, humidity), gas concentrations, aerosol properties, and cloud properties. In the past year, the Radiatively Important Parameters Best Estimate (RIPBE) VAP was developed to combine all of the input properties needed for BBHRP into a single gridded input file. Additionally, an interface between the RIPBE input file and the RRTM was developed using the new ARM integrated software development environment (ISDE) and effort was put into developing quality control (qc) flags and provenance information on the BBHRP output files so that analysis of the output would be more straightforward. This new version of BBHRP, sgp1bbhrpripbeC1.c1, uses the RIPBE files as input to RRTM, and calculates broadband SW and LW fluxes and heating rates at 1-min resolution using the independent column approximation. The vertical resolution is 45 m in the lower and middle troposphere to match the input cloud properties, but is at coarser resolution in the upper atmosphere. Unlike previous versions, the vertical grid is the same for both clear-sky and cloudy-sky calculations.

  13. A Coupled Surface Nudging Scheme for use in Retrospective ...

    EPA Pesticide Factsheets

A surface analysis nudging scheme coupling atmospheric and land surface thermodynamic parameters has been implemented into WRF v3.8 (latest version) for use with retrospective weather and climate simulations, as well as for applications in air quality, hydrology, and ecosystem modeling. This scheme is known as the flux-adjusting surface data assimilation system (FASDAS), developed by Alapaty et al. (2008). The scheme provides continuous adjustments for soil moisture and temperature (via indirect nudging) and for surface air temperature and water vapor mixing ratio (via direct nudging). The simultaneous application of indirect and direct nudging maintains greater consistency between the soil temperature-moisture and the atmospheric surface layer mass-field variables. The new method, FASDAS, consistently improved the accuracy of the model simulations at weather prediction scales for different horizontal grid resolutions, as well as for high-resolution regional climate predictions. This new capability has been released in WRF Version 3.8 as option grid_sfdda = 2. It increases the accuracy of atmospheric inputs for use in air quality, hydrology, and ecosystem modeling research, improving the accuracy of the respective end-point research outcomes. IMPACT: A new method, FASDAS, was implemented into the WRF model to consistently improve the accuracy of the model simulations at weather prediction scales for different horizontal grid resolutions, as well as for high-resolution regional climate predictions.
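
    The "direct nudging" above is Newtonian relaxation: each time step, the model state is pulled toward the analysis at a rate set by a nudging coefficient G. A generic sketch of one relaxation step (not the actual FASDAS code, which additionally applies flux-based indirect nudging to the soil state; the coefficient below is a typical illustrative value):

```python
def nudge(model_value, analysis_value, g_inv_s, dt_s):
    """One Newtonian-relaxation (direct nudging) step:
    X_new = X + G * (X_analysis - X) * dt."""
    return model_value + g_inv_s * (analysis_value - model_value) * dt_s

# Relax a 290 K model surface air temperature toward a 292 K analysis
t = 290.0
for _ in range(200):
    t = nudge(t, 292.0, g_inv_s=3.0e-4, dt_s=60.0)
```

    The model value decays exponentially toward the analysis, so the adjustment is continuous rather than an abrupt reinitialization.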

  14. Phylogenetic classification of bony fishes.

    PubMed

    Betancur-R, Ricardo; Wiley, Edward O; Arratia, Gloria; Acero, Arturo; Bailly, Nicolas; Miya, Masaki; Lecointre, Guillaume; Ortí, Guillermo

    2017-07-06

Fish classifications, like those of most other taxonomic groups, are being transformed drastically as new molecular phylogenies provide support for natural groups that were unanticipated by previous studies. A brief review of the main criteria used by ichthyologists to define their classifications during the last 50 years, however, reveals slow progress towards using an explicit phylogenetic framework. Instead, the trend has been to rely, in varying degrees, on deep-rooted anatomical concepts and authority, often mixing taxa with explicit phylogenetic support with arbitrary groupings. Two leading sources in ichthyology frequently used for fish classifications (JS Nelson's volumes of Fishes of the World and W. Eschmeyer's Catalog of Fishes) fail to adopt a global phylogenetic framework despite much recent progress made towards the resolution of the fish Tree of Life. The first explicit phylogenetic classification of bony fishes was published in 2013, based on a comprehensive molecular phylogeny ( www.deepfin.org ). We here update the first version of that classification by incorporating the most recent phylogenetic results. The updated classification presented here is based on phylogenies inferred using molecular and genomic data for nearly 2000 fishes. A total of 72 orders (and 79 suborders) are recognized in this version, compared with 66 orders in version 1. The phylogeny resolves placement of 410 families, or ~80% of the total of 514 families of bony fishes currently recognized. The ordinal status of 30 percomorph families included in this study, however, remains uncertain (incertae sedis in the series Carangaria, Ovalentaria, or Eupercaria). Comments to support taxonomic decisions and comparisons with conflicting taxonomic groups proposed by others are presented. We also highlight cases where morphological support exists for the groups being classified.
This version of the phylogenetic classification of bony fishes is substantially improved, providing resolution for more taxa than previous versions, based on more densely sampled phylogenetic trees. The classification presented in this study represents, unlike any other, the most up-to-date hypothesis of the Tree of Life of fishes.

  15. A computational atlas of the hippocampal formation using ex vivo, ultra-high resolution MRI: Application to adaptive segmentation of in vivo MRI.

    PubMed

    Iglesias, Juan Eugenio; Augustinack, Jean C; Nguyen, Khoa; Player, Christopher M; Player, Allison; Wright, Michelle; Roy, Nicole; Frosch, Matthew P; McKee, Ann C; Wald, Lawrence L; Fischl, Bruce; Van Leemput, Koen

    2015-07-15

    Automated analysis of MRI data of the subregions of the hippocampus requires computational atlases built at a higher resolution than those that are typically used in current neuroimaging studies. Here we describe the construction of a statistical atlas of the hippocampal formation at the subregion level using ultra-high resolution, ex vivo MRI. Fifteen autopsy samples were scanned at 0.13 mm isotropic resolution (on average) using customized hardware. The images were manually segmented into 13 different hippocampal substructures using a protocol specifically designed for this study; precise delineations were made possible by the extraordinary resolution of the scans. In addition to the subregions, manual annotations for neighboring structures (e.g., amygdala, cortex) were obtained from a separate dataset of in vivo, T1-weighted MRI scans of the whole brain (1mm resolution). The manual labels from the in vivo and ex vivo data were combined into a single computational atlas of the hippocampal formation with a novel atlas building algorithm based on Bayesian inference. The resulting atlas can be used to automatically segment the hippocampal subregions in structural MRI images, using an algorithm that can analyze multimodal data and adapt to variations in MRI contrast due to differences in acquisition hardware or pulse sequences. The applicability of the atlas, which we are releasing as part of FreeSurfer (version 6.0), is demonstrated with experiments on three different publicly available datasets with different types of MRI contrast. 
The results show that the atlas and companion segmentation method: 1) can segment T1 and T2 images, as well as their combination, 2) replicate findings on mild cognitive impairment based on high-resolution T2 data, and 3) can discriminate between Alzheimer's disease subjects and elderly controls with 88% accuracy in standard resolution (1mm) T1 data, significantly outperforming the atlas in FreeSurfer version 5.3 (86% accuracy) and classification based on whole hippocampal volume (82% accuracy). Copyright © 2015. Published by Elsevier Inc.

  16. Assessment of Version 4 of the SMAP Passive Soil Moisture Standard Product

    NASA Technical Reports Server (NTRS)

    O'neill, P. O.; Chan, S.; Bindlish, R.; Jackson, T.; Colliander, A.; Dunbar, R.; Chen, F.; Piepmeier, Jeffrey R.; Yueh, S.; Entekhabi, D.; hide

    2017-01-01

NASA's Soil Moisture Active Passive (SMAP) mission launched on January 31, 2015 into a sun-synchronous 6 am/6 pm orbit with an objective to produce global mapping of high-resolution soil moisture and freeze-thaw state every 2-3 days. The SMAP radiometer began acquiring routine science data on March 31, 2015 and continues to operate nominally. SMAP's radiometer-derived standard soil moisture product (L2SMP) provides soil moisture estimates posted on a 36-km fixed Earth grid using brightness temperature observations and ancillary data. A beta-quality version of L2SMP was released to the public in October 2015, Version 3 validated L2SMP soil moisture data were released in May 2016, and Version 4 L2SMP data were released in December 2016. Version 4 data are processed using the same soil moisture retrieval algorithms as previous versions, but now include retrieved soil moisture from both the 6 am descending orbits and the 6 pm ascending orbits. Validation of 19 months of the standard L2SMP product was done for both AM and PM retrievals using in situ measurements from global core cal/val sites. Accuracy of the soil moisture retrievals averaged over the core sites showed that SMAP accuracy requirements are being met.
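
    Core-site validations of this kind typically report bias, RMSE, and unbiased RMSE (ubRMSE, the RMSE after removing the mean bias, which is the metric SMAP's requirement is stated against). A generic sketch of the three statistics (illustrative numbers, not SMAP results):

```python
import numpy as np

def validation_metrics(retrieved, in_situ):
    """Bias, RMSE, and unbiased RMSE (ubRMSE) of a retrieval versus
    in situ reference data."""
    diff = np.asarray(retrieved) - np.asarray(in_situ)
    bias = diff.mean()
    rmse = np.sqrt((diff ** 2).mean())
    # ubRMSE: standard deviation of the difference (mean bias removed)
    ubrmse = np.sqrt(((diff - bias) ** 2).mean())
    return bias, rmse, ubrmse

# Toy series with a constant +0.02 m^3/m^3 wet bias
obs = np.array([0.10, 0.20, 0.30, 0.25])
ret = obs + 0.02
bias, rmse, ubrmse = validation_metrics(ret, obs)
```

    A purely constant bias leaves ubRMSE at zero, which is why ubRMSE isolates random retrieval error from systematic offsets.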

  17. 77 FR 34123 - Information Collection Available for Public Comments and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-08

    ... from that office. SUPPLEMENTARY INFORMATION: Title of Collection: Monthly Report of Ocean Shipments... responsibilities under Public Resolution 17, to ensure compliance of ocean shipping requirements operating under.... An electronic version of this document is available on the World Wide Web at http://regulations.gov...

  18. Performance of a reconfigured atmospheric general circulation model at low resolution

    NASA Astrophysics Data System (ADS)

    Wen, Xinyu; Zhou, Tianjun; Wang, Shaowu; Wang, Bin; Wan, Hui; Li, Jian

    2007-07-01

Paleoclimate simulations usually require model runs over a very long time. For this purpose, a fast-integration version of a state-of-the-art general circulation model (GCM), sharing the same physical and dynamical processes but with reduced horizontal resolution and an increased time step, is usually developed. In this study, we configure a fast version of an atmospheric GCM (AGCM), the Grid Atmospheric Model of IAP/LASG (Institute of Atmospheric Physics/State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics), at low resolution (GAMIL-L, hereafter), and compare the simulation results with the NCEP/NCAR reanalysis and other data to examine its performance. GAMIL-L, which is derived from the original GAMIL, is a finite-difference AGCM with 72×40 grids in longitude and latitude and 26 vertical levels. To validate the simulated climatology and variability, two runs were performed. One was a 60-year control run with fixed climatological monthly sea surface temperature (SST) forcing, and the other was a 50-year (1950-2000) integration with observational time-varying monthly SST forcing. Comparisons between these two cases and the reanalysis, including intra-seasonal and inter-annual variability, are also presented. In addition, the differences between GAMIL-L and the original version of GAMIL are also investigated. The results show that GAMIL-L can capture most of the large-scale dynamical features of the atmosphere, especially in the tropics and midlatitudes, although a few deficiencies exist, such as the underestimated Hadley cell and thereby the weak strength of the Asian summer monsoon. However, the simulated mean states over high latitudes, especially over the polar regions, are not acceptable. Apart from dynamics, the thermodynamic features mainly depend upon the physical parameterization schemes.
Since the physical package of GAMIL-L is exactly the same as that of the original high-resolution version of GAMIL, in which the NCAR Community Atmosphere Model (CAM2) physical package was used, there are only small differences between them in the precipitation and temperature fields. Because our goal is to develop a fast-running AGCM and employ it in the coupled climate system model of IAP/LASG for paleoclimate studies such as ENSO and the Australia-Asia monsoon, particular attention has been paid to the model's performance in the tropics. Further model validations, such as those run for the Southern Oscillation and the South Asian monsoon, indicate that GAMIL-L is reasonably competent and valuable in this regard.

  19. Hubble peers inside a celestial geode

    NASA Astrophysics Data System (ADS)

    2004-08-01

Credits: ESA/NASA, Yäel Nazé (University of Liège, Belgium) and You-Hua Chu (University of Illinois, Urbana, USA). In this unusual image, the NASA/ESA Hubble Space Telescope captures a rare view of the celestial equivalent of a geode - a gas cavity carved by the stellar wind and intense ultraviolet radiation from a young hot star. Real geodes are handball-sized, hollow rocks that start out as bubbles in volcanic or sedimentary rock. Only when these inconspicuous round rocks are split in half by a geologist do we get a chance to appreciate the inside of the rock cavity that is lined with crystals. In the case of Hubble's 35 light-year diameter ‘celestial geode’, the transparency of its bubble-like cavity of interstellar gas and dust reveals the treasures of its interior. Acknowledgment: This image was created with the help of the ESA/ESO/NASA Photoshop FITS Liberator. The object, called N44F, is being inflated by a torrent of fast-moving particles (what astronomers call a 'stellar wind') from an exceptionally hot star (the bright star just below the centre of the bubble) once buried inside a cold dense cloud.
Compared with our Sun (which is losing mass through the so-called 'solar wind'), the central star in N44F is ejecting more than 100 million times more mass per second, and its hurricane of particles moves much faster, at 7 million km per hour (as opposed to less than 1.5 million km per hour for our Sun). Because the bright central star does not exist in empty space but is surrounded by an envelope of gas, the stellar wind collides with this gas, pushing it out, like a snow plough. This forms a bubble, whose striking structure is clearly visible in the crisp Hubble image. The nebula N44F is one of a handful of known interstellar bubbles. Bubbles like these have been seen around evolved massive stars (called 'Wolf-Rayet stars'), and also around clusters of stars (where they are called 'super-bubbles'). But they have rarely been viewed around isolated stars, as is the case here. On closer inspection N44F harbours additional surprises. The interior wall of its gaseous cavity is lined with several finger-like columns of cool dust and gas, four to eight light-years high. (The structure of these 'columns' is similar to the Eagle Nebula’s iconic 'Pillars of Creation' photographed by Hubble a decade ago, and is seen in a few other nebulae as well.) The fingers are created by the blistering ultraviolet radiation from the central star. Like wind socks caught in a gale, they point in the direction of the energy flow. These pillars look small in this image only because they are much farther away from us than the Eagle Nebula’s pillars. N44F is located about 160 000 light-years away, in the neighbouring dwarf galaxy the Large Magellanic Cloud, in the direction of the southern constellation Dorado. N44F is part of the larger N44 complex, which contains a large super-bubble blown out by the combined action of stellar winds and multiple supernova explosions. N44 itself is roughly 1000 light-years across.
Several compact star-forming regions, including N44F, are found along the rim of the central super-bubble. This image was taken with Hubble's Wide Field Planetary Camera 2, using filters that isolate light emitted by sulphur (shown in blue, a 1200-second exposure) and hydrogen gas (shown in red, a 1000-second exposure).

  20. Mesoscale eddies in a high-resolution OGCM and coupled ocean-atmosphere GCM

    NASA Astrophysics Data System (ADS)

    Yu, Y.; Liu, H.; Lin, P.

    2017-12-01

The present study describes high-resolution climate modeling efforts with oceanic, atmospheric and coupled general circulation models (GCMs) at the State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics (LASG), Institute of Atmospheric Physics (IAP). The high-resolution OGCM is based on the latest version of the LASG/IAP Climate system Ocean Model (LICOM2.1), but its horizontal and vertical resolutions are increased to 1/10° and 55 layers, respectively. Forced by surface fluxes from reanalysis and observed data, the model has been integrated for more than 80 model years. Compared with the simulation of the coarse-resolution OGCM, the eddy-resolving OGCM not only better simulates the spatial-temporal features of mesoscale eddies and the paths and positions of western boundary currents but also reproduces the large meander of the Kuroshio Current and its interannual variability. The complex structures of the equatorial Pacific currents and of currents in the coastal ocean of China are likewise better captured thanks to the increased horizontal and vertical resolution. We then coupled the high-resolution OGCM to NCAR CAM4 at 25 km resolution, in which mesoscale air-sea interaction processes are better captured.

  1. The WCRP/GEWEX Surface Radiation Budget Project Release 2: An Assessment of Surface Fluxes at 1 Degree Resolution

    NASA Technical Reports Server (NTRS)

    Stackhouse, P. W., Jr.; Gupta, S. K.; Cox, S. J.; Chiacchio, M.; Mikovitz, J. C.

    2004-01-01

    The U.S. National Aeronautics and Space Administration (NASA) based Surface Radiation Budget (SRB) Project in association with the World Climate Research Programme Global Energy and Water Cycle Experiment (WCRP/GEWEX) is preparing a new 1 deg x 1 deg horizontal resolution product for distribution scheduled for release in early 2001. The new release contains several significant upgrades from the previous version. This paper summarizes the most significant upgrades and presents validation results as an assessment of the new data set.

  2. Ultrasonic Ranging System With Increased Resolution

    NASA Technical Reports Server (NTRS)

    Meyer, William E.; Johnson, William G.

    1987-01-01

Master-oscillator frequency increased. Ultrasonic range-measuring system with 0.1-in. resolution provides continuous digital display of four distance readings, each updated four times per second. Four rangefinder modules in system are modified versions of rangefinder used for automatic focusing in commercial series of cameras. Ultrasonic pulses emitted by system innocuous to both people and equipment. Provides economical solutions to such distance-measurement problems as those posed by boats approaching docks, trucks backing toward loading platforms, runway-clearance readout for tail of airplane at high angle of attack, or burglar alarms.
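
    Pulse-echo ranging reduces to timing the round trip of the pulse, and range resolution is set by how finely that time is counted - which is why raising the master-oscillator (clock) frequency sharpens the readout. A sketch with illustrative values (the sound speed and clock rate below are assumptions, not the article's design figures):

```python
def echo_distance_m(round_trip_s, sound_speed_ms=343.0):
    """Pulse-echo range: half the round-trip time times the speed of sound."""
    return sound_speed_ms * round_trip_s / 2.0

def range_resolution_m(clock_hz, sound_speed_ms=343.0):
    """One clock tick of round-trip time corresponds to half a
    sound-travel step; doubling the clock halves the distance step."""
    return sound_speed_ms / (2.0 * clock_hz)

d = echo_distance_m(0.01)  # a 10 ms echo in air
```

    At roughly 67.5 kHz the distance step works out to about 2.54 mm, i.e. the 0.1-inch resolution quoted above.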

  3. Modelling the climate and surface mass balance of polar ice sheets using RACMO2 - Part 2: Antarctica (1979-2016)

    NASA Astrophysics Data System (ADS)

    Melchior van Wessem, Jan; van de Berg, Willem Jan; Noël, Brice P. Y.; van Meijgaard, Erik; Amory, Charles; Birnbaum, Gerit; Jakobs, Constantijn L.; Krüger, Konstantin; Lenaerts, Jan T. M.; Lhermitte, Stef; Ligtenberg, Stefan R. M.; Medley, Brooke; Reijmer, Carleen H.; van Tricht, Kristof; Trusel, Luke D.; van Ulft, Lambertus H.; Wouters, Bert; Wuite, Jan; van den Broeke, Michiel R.

    2018-04-01

    We evaluate modelled Antarctic ice sheet (AIS) near-surface climate, surface mass balance (SMB) and surface energy balance (SEB) from the updated polar version of the regional atmospheric climate model, RACMO2 (1979-2016). The updated model, referred to as RACMO2.3p2, incorporates upper-air relaxation, a revised topography, tuned parameters in the cloud scheme to generate more precipitation towards the AIS interior and modified snow properties reducing drifting snow sublimation and increasing surface snowmelt. Comparisons of RACMO2 model output with several independent observational data show that the existing biases in AIS temperature, radiative fluxes and SMB components are further reduced with respect to the previous model version. The model-integrated annual average SMB for the ice sheet including ice shelves (minus the Antarctic Peninsula, AP) now amounts to 2229 Gt y-1, with an interannual variability of 109 Gt y-1. The largest improvement is found in modelled surface snowmelt, which now compares well with satellite and weather station observations. For the high-resolution ( ˜ 5.5 km) AP simulation, results remain comparable to earlier studies. The updated model provides a new, high-resolution data set of the contemporary near-surface climate and SMB of the AIS; this model version will be used for future climate scenario projections in a forthcoming study.
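
    The model-integrated SMB quoted above is simple bookkeeping over the mass fluxes RACMO2-style models output. A sketch of that budget (the component values below are illustrative placeholders, not the paper's numbers):

```python
def surface_mass_balance(precip, sublimation, snow_erosion, runoff):
    """SMB = precipitation - sublimation (incl. drifting snow)
    - drifting-snow erosion - meltwater runoff, all in Gt per year here."""
    return precip - sublimation - snow_erosion - runoff

# Illustrative component magnitudes for an ice-sheet-wide budget
smb = surface_mass_balance(2000.0, 180.0, 40.0, 30.0)
```

    The model changes described above act mainly on individual terms (more interior precipitation, less drifting-snow sublimation, more surface melt), and the integrated SMB shifts accordingly.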

  4. The version 3 OMI NO2 standard product

    NASA Astrophysics Data System (ADS)

    Krotkov, Nickolay A.; Lamsal, Lok N.; Celarier, Edward A.; Swartz, William H.; Marchenko, Sergey V.; Bucsela, Eric J.; Chan, Ka Lok; Wenig, Mark; Zara, Marina

    2017-09-01

    We describe the new version 3.0 NASA Ozone Monitoring Instrument (OMI) standard nitrogen dioxide (NO2) products (SPv3). The products and documentation are publicly available from the NASA Goddard Earth Sciences Data and Information Services Center (https://disc.gsfc.nasa.gov/datasets/OMNO2_V003/summary/). The major improvements include (1) a new spectral fitting algorithm for NO2 slant column density (SCD) retrieval and (2) higher-resolution (1° latitude and 1.25° longitude) a priori NO2 and temperature profiles from the Global Modeling Initiative (GMI) chemistry-transport model with yearly varying emissions to calculate air mass factors (AMFs) required to convert SCDs into vertical column densities (VCDs). The new SCDs are systematically lower (by ˜ 10-40 %) than previous, version 2, estimates. Most of this reduction in SCDs is propagated into stratospheric VCDs. Tropospheric NO2 VCDs are also reduced over polluted areas, especially over western Europe, the eastern US, and eastern China. Initial evaluation over unpolluted areas shows that the new SPv3 products agree better with independent satellite- and ground-based Fourier transform infrared (FTIR) measurements. However, further evaluation of tropospheric VCDs is needed over polluted areas, where the increased spatial resolution and more refined AMF estimates may lead to better characterization of pollution hot spots.
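
    The retrieval chain summarized above - stratosphere-troposphere separation followed by AMF conversion - can be written in one line. A schematic sketch (the operational SPv3 AMFs come from GMI profiles and scattering-weight lookup tables, not a single scalar; the numbers are illustrative):

```python
import numpy as np

def tropospheric_vcd(scd_total, scd_strat, amf_trop):
    """VCD_trop = (SCD_total - SCD_strat) / AMF_trop."""
    return (np.asarray(scd_total) - np.asarray(scd_strat)) / np.asarray(amf_trop)

# Toy retrieval in units of 10^15 molec cm^-2
vcd = tropospheric_vcd(scd_total=6.0, scd_strat=3.0, amf_trop=1.5)
```

    This makes the paper's propagation argument concrete: a systematic reduction in SCDs shifts the stratospheric term directly, and over polluted areas the remaining tropospheric column is further rescaled by the revised AMFs.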

  5. Analyzing and leveraging self-similarity for variable resolution atmospheric models

    NASA Astrophysics Data System (ADS)

    O'Brien, Travis; Collins, William

    2015-04-01

Variable resolution modeling techniques are rapidly becoming a popular strategy for achieving high resolution in a global atmospheric model without the computational cost of global high resolution. However, recent studies have demonstrated a variety of resolution-dependent, and seemingly artificial, features. We argue that the scaling properties of the atmosphere are key to understanding how the statistics of an atmospheric model should change with resolution. We provide two such examples. In the first example we show that the scaling properties of the cloud number distribution define how the ratio of resolved to unresolved clouds should increase with resolution. We show that the loss of resolved clouds in the high-resolution region of variable resolution simulations with the Community Atmosphere Model version 4 (CAM4) is an artifact of the model's treatment of condensed water (this artifact is significantly reduced in CAM5). In the second example we show that the scaling properties of the horizontal velocity field, combined with the incompressibility assumption, necessarily result in an intensification of vertical mass flux as resolution increases. We show that such an increase is present in a wide variety of models, including CAM and the regional climate models of the ENSEMBLES intercomparison. We present theoretical arguments linking this increase to the intensification of precipitation with increasing resolution.

  6. Ultrasound Imaging Using Diffraction Tomography in a Cylindrical Geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chambers, D H; Littrup, P

    2002-01-24

    Tomographic images of tissue phantoms and a sample of breast tissue have been produced from an acoustic synthetic array system for frequencies near 500 kHz. The images for sound speed and attenuation show millimeter resolution and demonstrate the feasibility of obtaining high-resolution tomographic images with frequencies that can deeply penetrate tissue. The image reconstruction method is based on the Born approximation to acoustic scattering and is a simplified version of a method previously used by Andre (Andre, et al., Int. J. Imaging Systems and Technology, Vol. 8, No. 1, 1997) for a circular acoustic array system. The images have comparable resolution to conventional ultrasound images at much higher frequencies (3-5 MHz) but with lower speckle noise. This shows the potential of low-frequency, deeply penetrating ultrasound for high-resolution quantitative imaging.

  7. Convection-Resolving Climate Change Simulations: Intensification of Heavy Hourly Precipitation Events

    NASA Astrophysics Data System (ADS)

    Ban, N.; Schmidli, J.; Schar, C.

    2014-12-01

    Reliable climate-change projections of extreme precipitation events are of great interest to decision makers, due to potentially important hydrological impacts such as floods, landslides and debris flows. Low-resolution climate models generally project increases of heavy precipitation events with climate change, but there are large uncertainties related to the limited spatial resolution and the parameterized representation of atmospheric convection. Here we employ a convection-resolving version of the COSMO model across an extended region (1100 km x 1100 km) covering the European Alps to investigate the differences between parameterized and explicit convection in climate-change scenarios. We conduct 10-year-long integrations at resolutions of 12 and 2 km. Validation using ERA-Interim-driven simulations reveals major improvements with the 2 km resolution, in particular regarding the diurnal cycle of mean precipitation and the representation of hourly extremes. In addition, 2 km simulations replicate the observed super-adiabatic scaling at precipitation stations, i.e. peak hourly events increase faster with temperature than the Clausius-Clapeyron scaling of 7%/K (see Ban et al. 2014). Convection-resolving climate-change scenarios are conducted using control (1991-2000) and scenario (2081-2090) simulations driven by a CMIP5 GCM (the MPI-ESM-LR) under the IPCC RCP8.5 scenario. Comparison between the 12 and 2 km resolutions, with parameterized and explicit convection respectively, reveals close agreement in terms of mean summer precipitation amounts (decrease by 30%) and regarding slight increases of heavy day-long events (amounting to 15% for the 90th percentile of wet-day precipitation). However, the different resolutions yield large differences regarding extreme hourly precipitation, with the 2 km version projecting substantially faster increases of heavy hourly precipitation events (about 30% increases for 90th-percentile hourly events). Ban, N., J. Schmidli and C. Schär (2014): Evaluation of the convection-resolving regional climate modeling approach in decade-long simulations. J. Geophys. Res. Atmos., 119, 7889-7907, doi:10.1002/2014JD021478

  8. Version 3 of the SMAP Level 4 Soil Moisture Product

    NASA Technical Reports Server (NTRS)

    Reichle, Rolf; Liu, Qing; Ardizzone, Joe; Crow, Wade; De Lannoy, Gabrielle; Kolassa, Jana; Kimball, John; Koster, Randy

    2017-01-01

    The NASA Soil Moisture Active Passive (SMAP) Level 4 Soil Moisture (L4_SM) product provides 3-hourly, 9-km resolution, global estimates of surface (0-5 cm) and root zone (0-100 cm) soil moisture, as well as related land surface states and fluxes, from 31 March 2015 to the present with a latency of 2.5 days. The ensemble-based L4_SM algorithm is a variant of the Goddard Earth Observing System version 5 (GEOS-5) land data assimilation system and ingests SMAP L-band (1.4 GHz) Level 1 brightness temperature observations into the Catchment land surface model. The soil moisture analysis is non-local (spatially distributed), performs downscaling from the 36-km resolution of the observations to that of the model, and respects the relative uncertainties of the modeled and observed brightness temperatures. Prior to assimilation, a climatological rescaling is applied to the assimilated brightness temperatures using a 6-year record of SMOS observations. A new feature in Version 3 of the L4_SM data product is the use of 2 years of SMAP observations for rescaling where SMOS observations are not available because of radio frequency interference, which expands the impact of SMAP observations on the L4_SM estimates into large regions of northern Africa and Asia. This presentation investigates the performance and data assimilation diagnostics of the Version 3 L4_SM data product. The L4_SM soil moisture estimates meet the 0.04 m3/m3 (unbiased) RMSE requirement. We further demonstrate that there is little bias in the soil moisture analysis. Finally, we illustrate where the assimilation system overestimates or underestimates the actual errors in the system.
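    The climatological rescaling step can be illustrated with a simple moment-matching sketch (the operational L4_SM system uses a more elaborate procedure, such as CDF matching against the SMOS record; the temperatures and target climatology below are hypothetical):

```python
from statistics import mean, stdev

def rescale(obs, ref_mean, ref_std):
    """Shift and scale observations so their mean and standard
    deviation match a reference climatology -- a simplified stand-in
    for the climatological rescaling applied before assimilation."""
    m, s = mean(obs), stdev(obs)
    return [ref_mean + (x - m) * ref_std / s for x in obs]

# Hypothetical brightness temperatures (K) rescaled to a target
# climatology with mean 252 K and standard deviation 2.5 K:
tb = [250.0, 255.0, 260.0]
print(rescale(tb, 252.0, 2.5))
```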

  9. Robust effects of cloud superparameterization on simulated daily rainfall intensity statistics across multiple versions of the Community Earth System Model

    DOE PAGES

    Kooperman, Gabriel J.; Pritchard, Michael S.; Burt, Melissa A.; ...

    2016-02-01

    This study evaluates several important statistics of daily rainfall, based on frequency and amount distributions, as simulated by a global climate model whose precipitation does not depend on convective parameterization, the Super-Parameterized Community Atmosphere Model (SPCAM). Three superparameterized and conventional versions of CAM, coupled within the Community Earth System Model (CESM1 and CCSM4), are compared against two modern rainfall products (GPCP 1DD and TRMM 3B42) to discriminate robust effects of superparameterization that emerge across multiple versions. The geographic pattern of annual-mean rainfall is mostly insensitive to superparameterization, with only slight improvements in the double-ITCZ bias. However, unfolding intensity distributions reveal several improvements in the character of rainfall simulated by SPCAM. The rainfall rate that delivers the most accumulated rain (i.e., the amount mode) is systematically too weak in all versions of CAM relative to TRMM 3B42 and does not improve with horizontal resolution. It is improved by superparameterization, though, with higher modes in regions of tropical wave, Madden-Julian Oscillation, and monsoon activity. Superparameterization produces better representations of extreme rates compared to TRMM 3B42, without the sensitivity to horizontal resolution seen in CAM. SPCAM produces more dry days over land and fewer over the ocean. Updates to CAM's low cloud parameterizations have narrowed the frequency peak of light rain, converging toward SPCAM. Poleward of 50°, where more rainfall is produced by resolved-scale processes in CAM, few differences discriminate the rainfall properties of the two models. Lastly, these results are discussed in light of their implications for future rainfall changes in response to climate forcing.

  10. Comparison and Computational Performance of Tsunami-HySEA and MOST Models for the LANTEX 2013 scenario

    NASA Astrophysics Data System (ADS)

    González-Vida, Jose M.; Macías, Jorge; Mercado, Aurelio; Ortega, Sergio; Castro, Manuel J.

    2017-04-01

    The Tsunami-HySEA model is used to simulate the Caribbean LANTEX 2013 scenario (LANTEX is the acronym for Large AtlaNtic Tsunami EXercise, which is carried out annually). The numerical simulation of the propagation and inundation phases is performed with both models, but using different mesh resolutions and nested meshes. Some comparisons with the MOST tsunami model available at the University of Puerto Rico (UPR) are made. Both models compare well for propagating tsunami waves in the open sea, producing very similar results. In near-shore shallow waters, Tsunami-HySEA should be compared with the inundation version of MOST, since the propagation version of MOST is limited to deeper waters. For the inundation phase, a 1 arc-sec (approximately 30 m) resolution mesh covering all of Puerto Rico is used, and a three-level nested-mesh technique is implemented. In the inundation phase, larger differences between model results are observed. Nevertheless, the most striking difference resides in computational time: Tsunami-HySEA is coded to exploit the GPU architecture, and can produce a 4 h simulation on a 60 arc-sec resolution grid for the whole Caribbean Sea in less than 4 min with a single general-purpose GPU, and in as little as 11 s with 32 general-purpose GPUs. In the inundation stage with nested meshes, approximately 8 hours of wall-clock time are needed for a 2 h simulation on a single GPU (versus more than 2 days for the MOST inundation, running three different parts of the island—West, Center, East—at the same time due to memory limitations in MOST). When domain decomposition techniques are finally implemented, by breaking the computational domain into sub-domains and assigning a GPU to each (the multi-GPU Tsunami-HySEA version), the wall-clock time decreases significantly, allowing high-resolution inundation modelling in very short computational times; with eight GPUs, for example, the wall-clock time drops to around 1 hour. Moreover, these computational times are obtained using general-purpose GPU hardware.
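    The quoted timings imply the following speedup and parallel-efficiency arithmetic, a back-of-the-envelope check using only the figures in the abstract (~4 min on one GPU versus 11 s on 32 GPUs for the 60 arc-sec Caribbean propagation run):

```python
def speedup(t_serial, t_parallel):
    """Ratio of single-device to multi-device wall-clock time."""
    return t_serial / t_parallel

def efficiency(t1, tn, n):
    """Parallel efficiency: achieved speedup divided by device count
    (1.0 would mean perfect linear scaling)."""
    return t1 / (n * tn)

t1, t32 = 4 * 60, 11  # seconds: ~4 min on 1 GPU, 11 s on 32 GPUs
print(round(speedup(t1, t32), 1))       # multi-GPU speedup over one GPU
print(round(efficiency(t1, t32, 32), 2))  # fraction of ideal 32x scaling
```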

  11. VALIDATION OF A CLINICAL ASSESSMENT OF SPECTRAL RIPPLE RESOLUTION FOR COCHLEAR-IMPLANT USERS

    PubMed Central

    Drennan, Ward R.; Anderson, Elizabeth S.; Won, Jong Ho; Rubinstein, Jay T.

    2013-01-01

    Objectives Non-speech psychophysical tests of spectral resolution, such as the spectral-ripple discrimination task, have been shown to correlate with speech recognition performance in cochlear implant (CI) users (Henry et al., 2005; Won et al. 2007, 2011; Drennan et al. 2008; Anderson et al. 2011). However, these tests are best suited for use in the research laboratory setting and are impractical for clinical use. A test of spectral resolution that is quicker and could more easily be implemented in the clinical setting has been developed. The objectives of this study were 1) To determine if this new clinical ripple test would yield individual results equivalent to the longer, adaptive version of the ripple discrimination test; 2) To evaluate test-retest reliability for the clinical ripple measure; and 3) To examine the relationship between clinical ripple performance and monosyllabic word recognition in quiet for a group of CI listeners. Design Twenty-eight CI recipients participated in the study. Each subject was tested on both the adaptive and the clinical versions of spectral ripple discrimination, as well as CNC word recognition in quiet. The adaptive version of spectral ripple employed a 2-up, 1-down procedure for determining spectral ripple discrimination threshold. The clinical ripple test used a method of constant stimuli, with trials for each of 12 fixed ripple densities occurring six times in random order. Results from the clinical ripple test (proportion correct) were then compared to ripple discrimination thresholds (in ripples per octave) from the adaptive test. Results The clinical ripple test showed strong concurrent validity, evidenced by a good correlation between clinical ripple and adaptive ripple results (r=0.79), as well as a correlation with word recognition (r = 0.7). Excellent test-retest reliability was also demonstrated with a high test-retest correlation (r = 0.9). 
Conclusions The clinical ripple test is a reliable non-linguistic measure of spectral resolution, optimized for use with cochlear implant users in a clinical setting. The test might be useful as a diagnostic tool or as a possible surrogate outcome measure for evaluating treatment effects in hearing. PMID:24552679
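    The validity and reliability figures above are Pearson correlations; a minimal sketch of that computation, with hypothetical scores standing in for the study data:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation, as used to compare
    clinical-ripple scores with adaptive thresholds and word scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

# Hypothetical paired results (not the study data): proportion correct
# on the clinical test vs. adaptive threshold in ripples per octave.
clinical = [0.55, 0.60, 0.72, 0.80, 0.91]
adaptive = [1.1, 1.4, 2.0, 2.6, 3.3]
print(round(pearson_r(clinical, adaptive), 2))
```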

  12. How Decisions Evolve: The Temporal Dynamics of Action Selection

    ERIC Educational Resources Information Center

    Scherbaum, Stefan; Dshemuchadse, Maja; Fischer, Rico; Goschke, Thomas

    2010-01-01

    To study the process of decision-making under conflict, researchers typically analyze response latency and accuracy. However, these tools provide little evidence regarding how the resolution of conflict unfolds over time. Here, we analyzed the trajectories of mouse movements while participants performed a continuous version of a spatial conflict…

  13. Validation of the design of a high resolution all-reflection Michelson interferometer for atmospheric spectroscopy

    NASA Astrophysics Data System (ADS)

    Carlson, Scott M.

    1993-06-01

    The design of a high resolution plane-grating all-reflection Michelson interferometer for ionospheric spectroscopy was analyzed using ray tracing techniques. This interferometer produces an interference pattern whose spatial frequency is wavelength dependent. The instrument is intended for remote observations of the atomic oxygen triplet emission line profile at 1304 Å in the thermosphere from sounding rocket or satellite platforms. The device was modeled using the PC-based ray tracing application DART, and the results were analyzed through Fourier techniques using the Windows PC version of the Interactive Data Language (IDL). Through these methods, instrument resolution, resolving power, and bandpass were determined. An analysis of the effects of aperture size and shape on instrument performance was also conducted.

  14. Brokering technologies to realize the hydrology scenario in NSF BCube

    NASA Astrophysics Data System (ADS)

    Boldrini, Enrico; Easton, Zachary; Fuka, Daniel; Pearlman, Jay; Nativi, Stefano

    2015-04-01

    In the National Science Foundation (NSF) BCube project, an international team composed of cyberinfrastructure experts, geoscientists, social scientists and educators is working together to explore the use of brokering technologies, initially focusing on four domains: hydrology, oceans, polar, and weather. In the hydrology domain, environmental models are fundamental to understanding the behaviour of hydrological systems. A specific model usually requires datasets coming from different disciplines for its initialization (e.g. elevation models from Earth observation, weather data from atmospheric sciences, etc.). Scientific datasets are usually available on heterogeneous publishing services, such as inventory and access services (e.g. OGC Web Coverage Service, THREDDS Data Server, etc.). Indeed, datasets are published according to different protocols; moreover, they usually come in different formats, resolutions, and Coordinate Reference Systems (CRSs): in short, different grid environments, depending on the original data and the publishing service's processing capabilities. Scientists can thus be impeded by the burden of discovering, accessing and normalizing the desired datasets to the grid environment required by the model. These technological tasks of course divert scientists from their main, scientific goals. The GI-axe brokering framework has been tried in a hydrology scenario where scientists needed to compare a particular hydrological model with two different input datasets (digital elevation models): the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) dataset, v.2, and the Shuttle Radar Topography Mission (SRTM) dataset, v.3. These datasets were published by means of Hyrax Server technology, which can provide NetCDF files at their original resolution and CRS. Scientists had their model running on ArcGIS, so the main goal was to import the datasets using the available ArcPy library, with EPSG:4326 at a common grid resolution as the reference system, so that model outputs could be compared. ArcPy, however, is able to access only GeoTIFF datasets published by an OGC Web Coverage Service (WCS). The GI-axe broker was therefore deployed between the client application and the data providers. It was configured to broker the two different Hyrax service endpoints and republish the data content through a WCS interface for use by the ArcPy library. Finally, scientists were able to easily run the model and to concentrate on comparing the different results obtained with each input dataset. The use of a third-party broker to perform such technological tasks has also shown the potential advantage of increasing the repeatability of a study among different researchers.
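    The grid normalization the broker performs can be illustrated with a toy nearest-neighbour resampler (a real broker also reprojects between CRSs and handles georeferencing; this sketch only changes grid shape, and the elevation values are hypothetical):

```python
def resample_nearest(grid, out_rows, out_cols):
    """Nearest-neighbour resampling of a 2D grid to a new shape --
    a minimal stand-in for normalizing two datasets to the common
    grid environment a model expects."""
    in_rows, in_cols = len(grid), len(grid[0])
    return [
        [grid[r * in_rows // out_rows][c * in_cols // out_cols]
         for c in range(out_cols)]
        for r in range(out_rows)
    ]

# Hypothetical 2x2 elevation patch (m) upsampled to a 4x4 grid:
dem = [[10, 20],
       [30, 40]]
print(resample_nearest(dem, 4, 4))
```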

  15. First THEMIS Infrared and Visible Images of Mars

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This picture shows both a visible and a thermal infrared image taken by the thermal emission imaging system on NASA's 2001 Mars Odyssey spacecraft on November 2, 2001. The images were taken as part of the ongoing calibration and testing of the camera system as the spacecraft orbited Mars on its 13th revolution of the planet.

    The visible wavelength image, shown on the right in black and white, was obtained using one of the instrument's five visible filters. The spacecraft was approximately 22,000 kilometers (about 13,600 miles) above Mars looking down toward the south pole when this image was acquired. It is late spring in the martian southern hemisphere.

    The thermal infrared image, center, shows the temperature of the surface in color. The circular feature seen in blue is the extremely cold martian south polar carbon dioxide ice cap. The instrument has measured a temperature of minus 120 degrees Celsius (minus 184 degrees Fahrenheit) on the south polar ice cap. The polar cap is more than 900 kilometers (540 miles) in diameter at this time.

    The visible image shows additional details along the edge of the ice cap, as well as atmospheric hazes near the cap. The view of the surface appears hazy due to dust that still remains in the martian atmosphere from the massive martian dust storms that have occurred over the past several months.

    The infrared image covers a length of over 6,500 kilometers (3,900 miles), spanning the planet from limb to limb, with a resolution of approximately 5.5 kilometers per picture element, or pixel (3.4 miles per pixel), at the point directly beneath the spacecraft. The visible image has a resolution of approximately 1 kilometer per pixel (0.6 miles per pixel) and covers an area roughly the size of the states of Arizona and New Mexico combined.

    An annotated version of the image is available for download at the same resolution in TIFF format (a 5.2 MB file).

    NASA's Jet Propulsion Laboratory, Pasadena, Calif., manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The thermal-emission imaging system was developed at Arizona State University, Tempe, with Raytheon Santa Barbara Remote Sensing, Santa Barbara, Calif. Lockheed Martin Astronautics, Denver, is the prime contractor for the project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  16. The Snow Data System at NASA JPL

    NASA Astrophysics Data System (ADS)

    Horn, J.; Painter, T. H.; Bormann, K. J.; Rittger, K.; Brodzik, M. J.; Skiles, M.; Burgess, A. B.; Mattmann, C. A.; Ramirez, P.; Joyce, M.; Goodale, C. E.; McGibbney, L. J.; Zimdars, P.; Yaghoobi, R.

    2017-12-01

    The Snow Data System at NASA JPL includes data processing pipelines built with open source software, Apache 'Object Oriented Data Technology' (OODT). Processing is carried out in parallel across a high-powered computing cluster. The pipelines use input data from satellites such as MODIS, VIIRS and Landsat. They apply algorithms to the input data to produce a variety of outputs in GeoTIFF format. These outputs include daily data for SCAG (Snow Cover And Grain size) and DRFS (Dust Radiative Forcing in Snow), along with 8-day composites and MODICE annual minimum snow and ice calculations. This poster will describe the Snow Data System, its outputs and their uses and applications. It will also highlight recent advancements to the system and plans for the future.

  17. The Snow Data System at NASA JPL

    NASA Astrophysics Data System (ADS)

    Joyce, M.; Laidlaw, R.; Painter, T. H.; Bormann, K. J.; Rittger, K.; Brodzik, M. J.; Skiles, M.; Burgess, A. B.; Mattmann, C. A.; Ramirez, P.; Goodale, C. E.; McGibbney, L. J.; Zimdars, P.; Yaghoobi, R.

    2016-12-01

    The Snow Data System at NASA JPL includes data processing pipelines built with open source software, Apache 'Object Oriented Data Technology' (OODT). Processing is carried out in parallel across a high-powered computing cluster. The pipelines use input data from satellites such as MODIS, VIIRS and Landsat. They apply algorithms to the input data to produce a variety of outputs in GeoTIFF format. These outputs include daily data for SCAG (Snow Cover And Grain size) and DRFS (Dust Radiative Forcing in Snow), along with 8-day composites and MODICE annual minimum snow and ice calculations. This poster will describe the Snow Data System, its outputs and their uses and applications. It will also highlight recent advancements to the system and plans for the future.

  18. Terrain - Umbra Package v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oppel, Fred; Hart, Brian; Rigdon, James Brian

    This library contains modules that read terrain files (e.g., OpenFlight, Open Scene Graph IVE, GeoTIFF Image) and that read and manage ESRI terrain datasets. All data is stored and managed in Open Scene Graph (OSG). The terrain system accesses OSG and provides elevation data and access to meta-data such as soil types, and enables linears, areals and buildings to be placed in a terrain. These geometry objects include box, point, path, polygon (region), and sector modules. Utilities are available for clamping objects to the terrain and accessing line-of-sight (LOS) information. This package includes managed C++ wrapper code (TerrainWrapper) to enable C# applications, such as OpShed and UTU, to incorporate this library.

  19. Regional Climate Simulation with a Variable Resolution Stretched Grid GCM: The Regional Down-Scaling Effects

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.; Takacs, Lawrence L.; Suarez, Max; Sawyer, William; Govindaraju, Ravi C.

    1999-01-01

    The results obtained with the variable-resolution stretched grid (SG) GEOS GCM (Goddard Earth Observing System General Circulation Model) are discussed, with emphasis on the regional down-scaling effects and their dependence on the stretched grid design and parameters. A variable-resolution SG-GCM and SG-DAS, using a global stretched grid with fine resolution over an area of interest, is a viable new approach to regional and subregional climate studies and applications. The stretched grid approach is an ideal tool for representing regional-to-global scale interactions, and an alternative to the widely used nested grid approach introduced a decade ago as a pioneering step in regional climate modeling. The GEOS SG-GCM is used for simulations of the anomalous U.S. climate events of the 1988 drought and 1993 flood, with enhanced regional resolution. The height, low-level jet, precipitation and other diagnostic patterns are successfully simulated and show efficient down-scaling over the area of interest, the U.S. An imitation of the nested grid approach is performed using the developed SG-DAS (Data Assimilation System), which incorporates the SG-GCM. The SG-DAS is run with data withheld over the area of interest. This design imitates the nested grid framework, with boundary conditions provided from analyses. No boundary-condition buffer is needed in this case, due to the global domain of integration used for the SG-GCM and SG-DAS. Experiments based on the newly developed versions of the GEOS SG-GCM and SG-DAS, with finer 0.5-degree (and higher) regional resolution, are briefly discussed, and the major aspects of parallelization of the SG-GCM code are outlined. The key objectives of the study are: 1) obtaining efficient down-scaling over the area of interest with fine and very fine resolution; 2) providing consistent interactions between regional and global scales, including consistent representation of regional energy and water balances; and 3) providing high computational efficiency for future SG-GCM and SG-DAS versions using parallel codes.

  20. USAID Expands eMODIS Coverage for Famine Early Warning

    NASA Astrophysics Data System (ADS)

    Jenkerson, C.; Meyer, D. J.; Evenson, K.; Merritt, M.

    2011-12-01

    Food security in countries at risk is monitored by U.S. Agency for International Development (USAID) through its Famine Early Warning Systems Network (FEWS NET) using many methods including Moderate Resolution Imaging Spectroradiometer (MODIS) data processed by U.S. Geological Survey (USGS) into eMODIS Normalized Difference Vegetation Index (NDVI) products. Near-real time production is used comparatively with trends derived from the eMODIS archive to operationally monitor vegetation anomalies indicating threatened cropland and rangeland conditions. eMODIS production over Central America and the Caribbean (CAMCAR) began in 2009, and processes 10-day NDVI composites every 5 days from surface reflectance inputs produced using predicted spacecraft and climatology information at Land and Atmosphere Near real time Capability for Earth Observing Systems (EOS) (LANCE). These expedited eMODIS composites are backed by a parallel archive of precision-based NDVI calculated from surface reflectance data ordered through Level 1 and Atmosphere Archive and Distribution System (LAADS). Success in the CAMCAR region led to the recent expansion of eMODIS production to include Africa in 2010, and Central Asia in 2011. Near-real time 250-meter products are available for each region on the last day of an acquisition interval (generally before midnight) from an anonymous file transfer protocol (FTP) distribution site (ftp://emodisftp.cr.usgs.gov/eMODIS). The FTP site concurrently hosts the regional historical collections (2000 to present) which are also searchable using the USGS Earth Explorer (http://edcsns17.cr.usgs.gov/NewEarthExplorer). As eMODIS coverage continues to grow, these geographically gridded, georeferenced tagged image file format (GeoTIFF) NDVI composites increase their utility as effective tools for operational monitoring of near-real time vegetation data against historical trends.
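    The NDVI calculation and compositing behind these products can be sketched as follows. Maximum-value compositing is a common approach for such multi-day windows (the exact eMODIS compositing rule may differ), and the reflectances below are hypothetical:

```python
def ndvi(red, nir):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel from
    red and near-infrared surface reflectances."""
    return [(n - r) / (n + r) for r, n in zip(red, nir)]

def max_value_composite(scenes):
    """Composite several NDVI scenes by keeping each pixel's highest
    value across the window, which tends to suppress cloud-affected
    observations."""
    return [max(vals) for vals in zip(*scenes)]

# Hypothetical reflectances for a 3-pixel row on two acquisition days:
day1 = ndvi([0.10, 0.12, 0.30], [0.40, 0.50, 0.35])
day2 = ndvi([0.09, 0.20, 0.28], [0.45, 0.25, 0.42])
print(max_value_composite([day1, day2]))
```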

  1. cp-R, an interface to the R programming language for clinical laboratory method comparisons.

    PubMed

    Holmes, Daniel T

    2015-02-01

    Clinical scientists frequently need to compare two different bioanalytical methods as part of assay validation/monitoring. As a matter of necessity, regression methods for quantitative comparison in clinical chemistry, hematology and other clinical laboratory disciplines must allow for error in both the x and y variables. Traditionally, the methods popularized by 1) Deming and 2) Passing and Bablok have been recommended. While commercial tools exist, no simple open-source tool is available. The purpose of this work was to develop an entirely open-source, GUI-driven program for bioanalytical method comparisons, capable of performing these regression methods and able to produce highly customized graphical output. The GUI is written in Python and PyQt4, with R scripts performing the regression and graphical functions. The program can be run from source code or as a pre-compiled binary executable. The software performs three forms of regression and offers weighting where applicable. Confidence bands of the regression are calculated using bootstrapping for the Deming and Passing-Bablok methods. Users can customize regression plots according to the tools available in R and can produce output in any of the jpg, png, tiff, or bmp raster formats at any desired resolution, or in the ps and pdf vector formats. Bland-Altman plots and some regression diagnostic plots are also generated. Correctness of regression parameter estimates was confirmed against existing R packages. The program allows for rapid and highly customizable graphical output capable of conforming to the publication requirements of any clinical chemistry journal. Quick method comparisons can also be performed and the results cut and pasted into spreadsheet or word-processing applications. We present a simple and intuitive open-source tool for quantitative method comparison in a clinical laboratory environment. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
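    The Deming fit the program performs can be sketched in a few lines, shown here in Python rather than the program's R backend; the paired assay results are hypothetical:

```python
from math import sqrt
from statistics import mean

def deming(x, y, lam=1.0):
    """Deming regression slope and intercept, allowing error in both
    variables. lam is the ratio of the y- to x-error variances
    (lam=1.0 gives orthogonal regression)."""
    mx, my = mean(x), mean(y)
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = (syy - lam * sxx
             + sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx

# Hypothetical paired results from two assay methods:
method_a = [1.0, 2.1, 3.0, 4.2, 5.1]
method_b = [1.1, 2.0, 3.2, 4.1, 5.3]
slope, intercept = deming(method_a, method_b)
print(round(slope, 2), round(intercept, 2))
```

In practice, the confidence bands the abstract mentions would be obtained by bootstrapping this fit over resampled pairs.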

  2. Enhanced automated platform for 2D characterization of RFID communications

    NASA Astrophysics Data System (ADS)

    Vuza, Dan Tudor; Vlǎdescu, Marian

    2016-12-01

    The characterization of the quality of communication between an RFID reader and a transponder, at all expected positions of the latter on the reader antenna, is of primary importance for evaluating the performance of an RFID system. Continuing the line of instruments developed for this purpose by the authors, the present work proposes an enhanced version of a previously introduced automated platform for 2D evaluation. By featuring higher mechanical speed, the new version makes it possible to obtain 2D communication maps at a resolution that would have been prohibitive, in terms of test duration, with the previous version. The list of measurement procedures that can be executed with the platform has also been enlarged, with additions such as determining the variation of the magnetic coupling between transponder and antenna across the antenna surface, and using transponder simulators to evaluate the quality of communication.

  3. Flow measurements in a water tunnel using a holocinematographic velocimeter

    NASA Technical Reports Server (NTRS)

    Weinstein, Leonard M.; Beeler, George B.

    1987-01-01

    Dual-view holographic movies were used to examine complex flows with full three-space and time resolution. This approach, which tracks the movement of small tracer particles in water, is termed holocinematographic velocimetry (HCV). A small prototype of a new water tunnel was used to demonstrate proof-of-concept for the HCV. After utilizing a conventional flow visualization apparatus with a laser light sheet to illuminate tracer particles to evaluate flow quality of the prototype tunnel, a simplified version of the HCV was employed to demonstrate the capabilities of the approach. Results indicate that a full-scale version of the water tunnel and a high performance version of the HCV should be able to check theoretical and numerical modeling of complex flows and examine the mechanisms operative in turbulent and vortex flow control concepts, providing an entirely unique instrument capable, for the first time, of simultaneous three-space and time measurements in turbulent flow.

  4. Evaluating the Ocean Component of the US Navy Earth System Model

    NASA Astrophysics Data System (ADS)

    Zamudio, L.

    2017-12-01

    Ocean currents, temperature, and salinity observations are used to evaluate the ocean component of the US Navy Earth System Model. The ocean and atmosphere components of the system are an eddy-resolving (1/12.5° equatorial resolution) version of the HYbrid Coordinate Ocean Model (HYCOM), and a T359L50 version of the NAVy Global Environmental Model (NAVGEM), respectively. The system was integrated in hindcast mode and the ocean results are compared against unassimilated observations, a stand-alone version of HYCOM, and the Generalized Digital Environment Model ocean climatology. The different observation types used in the system evaluation are: drifting buoys, temperature profiles, salinity profiles, and acoustical proxies (mixed layer depth, sonic layer depth, below layer gradient, and acoustical trapping). To evaluate the system's performance in each different metric, a scorecard is used to translate the system's errors into scores, which provide an indication of the system's skill in both space and time.

  5. Recovering Swift-XRT Energy Resolution through CCD Charge Trap Mapping

    NASA Technical Reports Server (NTRS)

    Pagani, C.; Beardmore, A. P.; Abbey, A. F.; Mountford, C.; Osborne, J. P.; Capalbi, M.; Perri, M.; Angelini, L.; Burrows, D. N.; Campana, S.; hide

    2012-01-01

    The X-ray telescope on board the Swift satellite for gamma-ray burst astronomy has been exposed to the radiation of the space environment since launch in November 2004. Radiation causes damage to the detector, with the generation of dark current and charge trapping sites that result in the degradation of the spectral resolution and an increase of the instrumental background. The Swift team has a dedicated calibration program with the goal of recovering a significant proportion of the lost spectroscopic performance. Calibration observations of supernova remnants with strong emission lines are analysed to map the detector charge traps and to derive position-dependent corrections to the measured photon energies. We have achieved a substantial recovery in the XRT resolution by implementing these corrections in an updated version of the Swift XRT gain file and in corresponding improvements to the Swift XRT HEAsoft software. We provide illustrations of the impact of the enhanced energy resolution, and show that we have recovered most of the spectral resolution lost since launch.

  6. Validation of a vector version of the 6S radiative transfer code for atmospheric correction of satellite data. Part I: Path radiance

    NASA Astrophysics Data System (ADS)

    Kotchenova, Svetlana Y.; Vermote, Eric F.; Matarrese, Raffaella; Klemm, Frank J., Jr.

    2006-09-01

    A vector version of the 6S (Second Simulation of a Satellite Signal in the Solar Spectrum) radiative transfer code (6SV1), which enables accounting for radiation polarization, has been developed and validated against a Monte Carlo code, Coulson's tabulated values, and MOBY (Marine Optical Buoy System) water-leaving reflectance measurements. The developed code was also tested against the scalar codes SHARM, DISORT, and MODTRAN to evaluate its performance in scalar mode and the influence of polarization. The obtained results have shown a good agreement of 0.7% in comparison with the Monte Carlo code, 0.2% for Coulson's tabulated values, and 0.001-0.002 for the 400-550 nm region for the MOBY reflectances. Ignoring the effects of polarization led to large errors in calculated top-of-atmosphere reflectances: more than 10% for a molecular atmosphere and up to 5% for an aerosol atmosphere. This new version of 6S is intended to replace the previous scalar version used for calculation of lookup tables in the MODIS (Moderate Resolution Imaging Spectroradiometer) atmospheric correction algorithm.

  7. Validation of a vector version of the 6S radiative transfer code for atmospheric correction of satellite data. Part I: path radiance.

    PubMed

    Kotchenova, Svetlana Y; Vermote, Eric F; Matarrese, Raffaella; Klemm, Frank J

    2006-09-10

    A vector version of the 6S (Second Simulation of a Satellite Signal in the Solar Spectrum) radiative transfer code (6SV1), which enables accounting for radiation polarization, has been developed and validated against a Monte Carlo code, Coulson's tabulated values, and MOBY (Marine Optical Buoy System) water-leaving reflectance measurements. The developed code was also tested against the scalar codes SHARM, DISORT, and MODTRAN to evaluate its performance in scalar mode and the influence of polarization. The obtained results have shown a good agreement of 0.7% in comparison with the Monte Carlo code, 0.2% for Coulson's tabulated values, and 0.001-0.002 for the 400-550 nm region for the MOBY reflectances. Ignoring the effects of polarization led to large errors in calculated top-of-atmosphere reflectances: more than 10% for a molecular atmosphere and up to 5% for an aerosol atmosphere. This new version of 6S is intended to replace the previous scalar version used for calculation of lookup tables in the MODIS (Moderate Resolution Imaging Spectroradiometer) atmospheric correction algorithm.

  8. Regional Data Assimilation Using a Stretched-Grid Approach and Ensemble Calculations

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, M. S.; Takacs, L. L.; Govindaraju, R. C.; Atlas, Robert (Technical Monitor)

    2002-01-01

    The global variable-resolution stretched-grid (SG) version of the Goddard Earth Observing System (GEOS) Data Assimilation System (DAS), incorporating the GEOS SG-GCM (Fox-Rabinovitz 2000, Fox-Rabinovitz et al. 2001a,b), has been developed and tested as an efficient tool for producing regional analyses and diagnostics with enhanced mesoscale resolution. The major area of interest with enhanced regional resolution used in the different SG-DAS experiments is a rectangle over the U.S. with 50 or 60 km horizontal resolution. The analyses and diagnostics are produced for all mandatory levels from the surface to 0.2 hPa. Because of the SG approach, the assimilated regional mesoscale products are consistent with global-scale circulation characteristics. The stretched-grid and basic uniform-grid DASs use the same number of global grid points and are compared in terms of regional product quality.

  9. Libraries of High and Mid-Resolution Spectra of F, G, K, and M Field Stars

    NASA Astrophysics Data System (ADS)

    Montes, D.

    1998-06-01

    I have compiled here the three libraries of high and mid-resolution optical spectra of late-type stars I have recently published. The libraries include F, G, K and M field stars, from dwarfs to giants. The spectral coverage is from 3800 to 1000 Å, with spectral resolution ranging from 0.09 to 3.0 Å. These spectra include many of the spectral lines most widely used as optical and near-infrared indicators of chromospheric activity. The spectra have been obtained with the aim of providing a library of high and mid-resolution spectra to be used in the study of active chromosphere stars by applying a spectral subtraction technique. However, the data set presented here can also be utilized in a wide variety of ways. A digital version of all the fully reduced spectra is available via FTP and the World Wide Web (WWW) in FITS format.

  10. Design for an efficient dynamic climate model with realistic geography

    NASA Technical Reports Server (NTRS)

    Suarez, M. J.; Abeles, J.

    1984-01-01

    Studies of long-term climate sensitivity that include realistic atmospheric dynamics are severely restricted by the expense of integrating atmospheric general circulation models. Taking as an example models used at GSFC, a dynamic model of much lower horizontal or vertical resolution is an alternative. The model of Held and Suarez uses only two levels in the vertical and, although it has conventional grid resolution in the meridional direction, horizontal resolution is reduced by keeping only a few degrees of freedom in the zonal wavenumber spectrum. Without zonally asymmetric forcing, this model simulates a day in roughly 1/2 second on a CRAY. The model under discussion is a fully finite-differenced, zonally asymmetric version of the Held-Suarez model. It is anticipated that speeds of a few seconds per simulated day can be obtained, roughly 50 times faster than moderate-resolution, multilayer GCMs.

  11. Evaluation of LIS-based Soil Moisture and Evapotranspiration in the Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Jung, H. C.; Kang, D. H.; Kim, E. J.; Yoon, Y.; Kumar, S.; Peters-Lidard, C. D.; Baeck, S. H.; Hwang, E.; Chae, H.

    2017-12-01

    K-water, the South Korean national water agency, is a government-funded agency for water resource development that provides both civil and industrial water in South Korea. K-water is interested in exploring how Earth remote sensing and modeling can help its tasks. In this context, the NASA Land Information System (LIS) is implemented to simulate land surface processes in the Korean Peninsula. The Noah land surface model with Multi-Parameterization, version 3.6 (Noah-MP) is used to reproduce the water budget variables on a 1 km spatial resolution grid with a daily temporal resolution. The Modern-Era Retrospective analysis for Research and Applications, version 2 (MERRA-2) dataset is used to force the system. The rainfall data are spatially downscaled using the high-resolution WorldClim precipitation climatology. The other meteorological inputs (i.e., air temperature, humidity, pressure, winds, radiation) are also downscaled by statistical methods (i.e., lapse rate, slope-aspect). Additional model experiments are conducted with local rainfall datasets and soil maps, replacing the downscaled MERRA-2 precipitation field and the hybrid STATSGO/FAO soil texture, respectively. For the evaluation of model performance, daily soil moisture and evapotranspiration measurements at several stations are compared to the LIS-based outputs. This study demonstrates that application of NASA's LIS can enhance drought and flood prediction capabilities in South Asia and Korea.

  12. Troposphere-Stratosphere Connections in Recent Northern Winters in NASA GEOS Assimilated Datasets

    NASA Technical Reports Server (NTRS)

    Pawson, Steven

    2000-01-01

    The northern winter stratosphere displays a wide range of interannual variability, much of which is believed to result from the response to the damping of upward-propagating waves. However, there is considerable (growing) evidence that the stratospheric state can also impact the tropospheric circulation. This issue will be examined using datasets generated in the Data Assimilation Office (DAO) at NASA's Goddard Space Flight Center. Just as the tropospheric circulation in each of these years was dominated by differing synoptic-scale structures, the stratospheric polar vortex also displayed different evolutions. The two extremes are the winter 1998/1999, when the stratosphere underwent a series of warming events (including two major warmings), and the winter 1999/2000, which was dominated by a persistent, cold polar vortex, often distorted by a dominant blocking pattern in the troposphere. This study will examine several operational and research-level versions of the DAO's systems. The 70-level ''TRMM'' system with a resolution of 2-by-2.5 degrees and the 48-level, 1-by-1-degree resolution ''Terra'' system were operational in 1998/1999 and 1999/2000, respectively. Research versions of the system used a 48-level, 2-by-2.5-degree configuration, which facilitates studies of the impact of vertical resolution. The study includes checks against independent datasets and error analyses, as well as the main issue of troposphere-stratosphere interactions.

  13. NOAA News Online (Story 2249)

    Science.gov Websites

    NOAA image of Susan Solomon in her office in Boulder, Colo. June 23, 2004 - Susan Solomon, a leading atmospheric scientist at the NOAA Aeronomy Laboratory in Boulder, Colo., was awarded…

  14. Proteopedia Entry: The Large Ribosomal Subunit of "Haloarcula Marismortui"

    ERIC Educational Resources Information Center

    Decatur, Wayne A.

    2010-01-01

    This article presents a "Proteopedia" page that shows the refined version of the structure of the "Haloarcula" large ribosomal subunit as solved by the laboratories of Thomas Steitz and Peter Moore. The landmark structure is of great impact as it is the first atomic-resolution structure of the highly conserved ribosomal subunit which harbors…

  15. Moderate Resolution Imaging Spectroradiometer (MODIS) MOD21 Land Surface Temperature and Emissivity Algorithm Theoretical Basis Document

    NASA Technical Reports Server (NTRS)

    Hulley, G.; Malakar, N.; Hughes, T.; Islam, T.; Hook, S.

    2016-01-01

    This document outlines the theory and methodology for generating the Moderate Resolution Imaging Spectroradiometer (MODIS) Level-2 daily daytime and nighttime 1-km land surface temperature (LST) and emissivity product using the Temperature Emissivity Separation (TES) algorithm. The MODIS-TES (MOD21_L2) product will include the LST and emissivity for three MODIS thermal infrared (TIR) bands 29, 31, and 32, and will be generated for data from the NASA-EOS AM and PM platforms. This is version 1.0 of the ATBD, and the goal is to maintain a 'living' version of this document with changes made when necessary. The current standard baseline MODIS LST products (MOD11*) are derived from the generalized split-window (SW) algorithm (Wan and Dozier 1996), which produces a 1-km LST product and two classification-based emissivities for bands 31 and 32; and a physics-based day/night algorithm (Wan and Li 1997), which produces a 5-km (C4) and 6-km (C5) LST product and emissivity for seven MODIS bands: 20, 22, 23, 29, 31-33.

  16. Evaluation of NCMRWF unified model vertical cloud structure with CloudSat over the Indian summer monsoon region

    NASA Astrophysics Data System (ADS)

    Jayakumar, A.; Mamgain, Ashu; Jisesh, A. S.; Mohandas, Saji; Rakhi, R.; Rajagopal, E. N.

    2016-05-01

    Representation of rainfall distribution and monsoon circulation in the high-resolution versions of the NCMRWF Unified Model (NCUM-REG) for the short-range forecasting of extreme rainfall events depends strongly on key factors such as vertical cloud distribution, convection, and the convection/cloud relationship in the model. Hence it is highly relevant to evaluate the vertical structure of cloud and precipitation of the model over the monsoon environment. In this regard, we utilized the capabilities of CloudSat data over a long observational period by conditioning it on the synoptic situation of the model simulation period. Simulations were run at 4-km grid length both with and without convective parameterization. Since the sample of CloudSat overpasses through the monsoon domain is small, the aforementioned methodology may qualitatively evaluate the vertical cloud structure for the model simulation period. It is envisaged that the present study will open up the possibility of further improvement in the high-resolution version of NCUM in the tropics for Indian summer monsoon rainfall events.

  17. GEOS S2S-2_1 File Specification: GMAO Seasonal and Sub-Seasonal Forecast Output

    NASA Technical Reports Server (NTRS)

    Kovach, Robin M.; Marshak, Jelena; Molod, Andrea; Nakada, Kazumi

    2018-01-01

    The NASA GMAO seasonal (9 months) and subseasonal (45 days) forecasts are produced with the Goddard Earth Observing System (GEOS) Atmosphere-Ocean General Circulation Model and Data Assimilation System Version S2S-2_1. The new system replaces version S2S-1.0, described in Borovikov et al. (2017), and includes upgrades to many components of the system. The atmospheric model includes an upgrade from a pre-MERRA-2 version running on a latitude-longitude grid at approx. 1 degree resolution to a current version running on a cubed sphere grid at approximately 1/2 degree resolution. The important developments are related to the dynamical core (Putman et al., 2011), the moist physics (''two-moment microphysics'' of Barahona et al., 2014) and the cryosphere (Cullather et al., 2014). As in the previous GMAO S2S system, the land model is that of Koster et al. (2000). GMAO S2S-2_1 now includes the Goddard Chemistry Aerosol Radiation and Transport (GOCART, Colarco et al., 2010) single-moment interactive aerosol model, which predicts dust, sea salt, and several species of carbon and sulfate aerosols. The previous version of GMAO S2S specified aerosol amounts from climatology, which were used to inform the atmospheric radiation only. The ocean model includes an upgrade from MOM4 to MOM5 (Griffies 2012), and continues to be run on the tripolar grid at approximately 1/2 degree resolution in the tropics with 40 vertical levels. As in S2S-1.0, the sea ice model is from the Los Alamos Sea Ice model (CICE4, Hunke and Lipscomb 2010). The Ocean Data Assimilation System (ODAS) has been upgraded from the one described in Borovikov et al. (2017) to one that uses a modified version of the Penny (2014) Local Ensemble Transform Kalman Filter (LETKF), and now assimilates along-track altimetry. The ODAS also nudges to MERRA-2 SST and sea ice boundary conditions.
The atmospheric data assimilation fields used to constrain the atmosphere in the ODAS have been upgraded from MERRA to a MERRA-2-like system. The system is initialized using a MERRA-2-like atmospheric reanalysis (Gelaro et al. 2017) and the GMAO S2S-2_1 ocean analysis. Forecast ensemble members are produced from initial states at 5-day intervals, with additional members based on perturbations of the atmospheric and ocean states. Both subseasonal and seasonal forecasts are submitted to the National MultiModel Ensemble (NMME) project, and are part of the US/Canada multimodel seasonal forecasts (http://www.cpc.ncep.noaa.gov/products/NMME/). A large suite of retrospective forecasts (''hindcasts'') has been completed and contributes to the calculation of the model's baseline climatology and drift, anomalies from which are the basis of the seasonal forecasts.

  18. Super-Resolution Imaging of Molecular Emission Spectra and Single Molecule Spectral Fluctuations

    PubMed Central

    Mlodzianoski, Michael J.; Curthoys, Nikki M.; Gunewardene, Mudalige S.; Carter, Sean; Hess, Samuel T.

    2016-01-01

    Localization microscopy can image nanoscale cellular details. To address biological questions, the ability to distinguish multiple molecular species simultaneously is invaluable. Here, we present a new version of fluorescence photoactivation localization microscopy (FPALM) which detects the emission spectrum of each localized molecule, and can quantify changes in emission spectrum of individual molecules over time. This information can allow for a dramatic increase in the number of different species simultaneously imaged in a sample, and can create super-resolution maps showing how single molecule emission spectra vary with position and time in a sample. PMID:27002724

  19. Performance Improvements of the CYCOFOS Flow Model

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Hari; Moulitsas, Irene; Syrakos, Alexandros; Zodiatis, George; Nikolaides, Andreas; Hayes, Daniel; Georgiou, Georgios C.

    2013-04-01

    The CYCOFOS (Cyprus Coastal Ocean Forecasting and Observing System) has been operational since early 2002, providing daily sea current, temperature, salinity, and sea level forecasts for the next 4 and 10 days to end-users in the Levantine Basin; these are necessary for operational applications in marine safety, particularly oil spill and floating object predictions. The CYCOFOS flow model, like most of the coastal and sub-regional operational hydrodynamic forecasting systems of MONGOOS (the Mediterranean Oceanographic Network for the Global Ocean Observing System), is based on the POM (Princeton Ocean Model). CYCOFOS is nested within the MyOcean Mediterranean regional forecasting data and uses SKIRON and ECMWF for surface forcing. The increasing demand for ever higher resolution data to meet coastal and offshore downstream applications motivated the parallelization of the CYCOFOS POM model. This development was carried out in the frame of the IPcycofos project, funded by the Cyprus Research Promotion Foundation. Parallel processing provides a viable solution to satisfy these demands without sacrificing accuracy or omitting any physical phenomena. Prior to the IPcycofos project, there had been several attempts to parallelise the POM, for example MP-POM. The existing parallel code models rely on specific outdated hardware architectures and associated software. The objective of the IPcycofos project is to produce an operational parallel version of the CYCOFOS POM code that can replicate the results of the serial version of the POM code used in CYCOFOS. The parallelization of the CYCOFOS POM model uses the Message Passing Interface (MPI), implemented on commodity computing clusters running open-source software and not depending on any specialized vendor hardware. The parallel CYCOFOS POM code is constructed in a modular fashion, allowing a fast re-locatable downscaled implementation.
The MPI implementation takes advantage of the Cartesian nature of the POM mesh and uses built-in MPI routines to split the mesh, using a weighting scheme, along longitude and latitude among the processors. Each processor works on its part of the model based on domain decomposition techniques. The new parallel CYCOFOS POM code has been benchmarked against the serial POM version of CYCOFOS for speed, accuracy, and resolution, and the results are more than satisfactory. With a higher-resolution CYCOFOS Levantine model domain, the forecasts need much less time than with the coarser serial CYCOFOS POM version, with identical accuracy.
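    The weighted split of a Cartesian mesh along one axis can be illustrated with a small sketch. This is a generic weighted contiguous partition, with illustrative names, not the actual IPcycofos code:

```python
# Sketch of a weighted 1-D block decomposition: partition grid rows
# (or columns) into contiguous blocks of roughly equal total weight,
# e.g. weighting by the number of wet (ocean) points per row so that
# each MPI rank receives a comparable amount of work. Illustrative
# only; the IPcycofos weighting scheme may differ.

def split_weighted(weights, nparts):
    """Partition indices 0..len(weights)-1 into nparts contiguous
    blocks of roughly equal total weight; returns (start, end) pairs."""
    total = float(sum(weights))
    target = total / nparts
    bounds, start, acc = [], 0, 0.0
    for i, w in enumerate(weights):
        acc += w
        remaining_rows = len(weights) - (i + 1)
        remaining_parts = nparts - len(bounds) - 1
        # close this block once it reaches its share of the weight,
        # keeping at least one row for every remaining part
        if acc >= target and remaining_parts > 0 and remaining_rows >= remaining_parts:
            bounds.append((start, i + 1))
            start, acc = i + 1, 0.0
    bounds.append((start, len(weights)))
    return bounds
```

    A two-dimensional decomposition applies the same split independently along latitude and longitude; each rank then integrates its own (row-block, column-block) subdomain and exchanges halo points with its Cartesian neighbours.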

  20. CHARRON: Code for High Angular Resolution of Rotating Objects in Nature

    NASA Astrophysics Data System (ADS)

    Domiciano de Souza, A.; Zorec, J.; Vakili, F.

    2012-12-01

    Rotation is one of the fundamental physical parameters governing stellar physics and evolution. At the same time, spectrally resolved optical/IR long-baseline interferometry has proven to be an important observing tool for measuring many physical effects linked to rotation, in particular stellar flattening, gravity darkening, and differential rotation. In order to interpret the high angular resolution observations from modern spectro-interferometers, such as VLTI/AMBER and VEGA/CHARA, we have developed an interferometry-oriented numerical model: CHARRON (Code for High Angular Resolution of Rotating Objects in Nature). We present here the characteristics of CHARRON, which is faster (≃10-30 s per model) and thus more adapted to model-fitting than the first version of the code presented by Domiciano de Souza et al. (2002).
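    The gravity darkening mentioned above follows, in its classical form, the von Zeipel law T_eff ∝ g_eff^β with β = 0.25 for radiative envelopes. A minimal illustration (a textbook relation and a function name of our choosing, not CHARRON's actual prescription):

```python
# Classical von Zeipel gravity darkening: T_eff = T_pole * (g/g_pole)**beta.
# beta = 0.25 holds for radiative envelopes (von Zeipel 1924); smaller
# values are commonly used for convective envelopes. Textbook
# illustration only, not the prescription implemented in CHARRON.

def von_zeipel_teff(t_pole_k, g_ratio, beta=0.25):
    """Local effective temperature [K] for a local-to-polar effective
    gravity ratio g_ratio = g_eff/g_pole, with 0 < g_ratio <= 1."""
    if not 0.0 < g_ratio <= 1.0:
        raise ValueError("g_ratio must be in (0, 1]")
    return t_pole_k * g_ratio ** beta
```

    For a rapid rotator whose equatorial effective gravity is half the polar value, the equatorial temperature drops to about 84% of the polar one, which is the surface-brightness contrast interferometers are sensitive to.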

  1. Design and Performance of A High Resolution Micro-Spec: An Integrated Sub-Millimeter Spectrometer

    NASA Technical Reports Server (NTRS)

    Barrentine, Emily M.; Cataldo, Giuseppe; Brown, Ari D.; Ehsan, Negar; Noroozian, Omid; Stevenson, Thomas R.; U-Yen, Kongpop; Wollack, Edward J.; Moseley, S. Harvey

    2016-01-01

    Micro-Spec is a compact sub-millimeter (approximately 100 GHz--1.1 THz) spectrometer which uses low loss superconducting microstrip transmission lines and a single-crystal silicon dielectric to integrate all of the components of a diffraction grating spectrometer onto a single chip. We have already successfully evaluated the performance of a prototype Micro-Spec, with spectral resolving power R=64. Here we present our progress towards developing a higher resolution Micro-Spec, which would enable the first science returns in a balloon flight version of this instrument. We describe modifications to the design in scaling from a R=64 to a R=256 instrument, as well as the ultimate performance limits and design concerns when scaling this instrument to higher resolutions.

  2. Towards breaking the spatial resolution barriers: An optical flow and super-resolution approach for sea ice motion estimation

    NASA Astrophysics Data System (ADS)

    Petrou, Zisis I.; Xian, Yang; Tian, YingLi

    2018-04-01

    Estimation of sea ice motion at fine scales is important for a number of regional and local level applications, including modeling of sea ice distribution, ocean-atmosphere and climate dynamics, as well as safe navigation and sea operations. In this study, we propose an optical flow and super-resolution approach to accurately estimate motion from remote sensing images at a higher spatial resolution than the original data. First, an external example learning-based super-resolution method is applied on the original images to generate higher resolution versions. Then, an optical flow approach is applied on the higher resolution images, identifying sparse correspondences and interpolating them to extract a dense motion vector field with continuous values and subpixel accuracies. Our proposed approach is successfully evaluated on passive microwave, optical, and Synthetic Aperture Radar data, proving appropriate for multi-sensor applications and different spatial resolutions. The approach estimates motion with similar or higher accuracy than the original data, while increasing the spatial resolution by up to eight times. In addition, the adopted optical flow component outperforms a state-of-the-art pattern matching method. Overall, the proposed approach results in accurate motion vectors with unprecedented spatial resolutions of up to 1.5 km for passive microwave data covering the entire Arctic and 20 m for radar data, and proves promising for numerous scientific and operational applications.
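    The step of interpolating sparse correspondences into a dense motion field can be sketched with inverse-distance weighting. This is a generic stand-in for the interpolation stage, not the authors' exact scheme, and the function name is ours:

```python
import numpy as np

# Sketch: interpolate sparse motion vectors (e.g. from feature matches)
# to a dense per-pixel field by inverse-distance weighting. A generic
# illustration of sparse-to-dense interpolation, not the paper's
# actual optical flow interpolation method.

def dense_motion(points, vectors, shape, eps=1e-6, power=2.0):
    """points: (N, 2) row/col of sparse matches; vectors: (N, 2) motion
    at those points; shape: (H, W) output grid. Returns (H, W, 2)."""
    H, W = shape
    rows, cols = np.mgrid[0:H, 0:W]
    grid = np.stack([rows, cols], axis=-1).astype(float)        # (H, W, 2)
    # squared distance from every grid cell to every sparse point
    d2 = ((grid[:, :, None, :] - points[None, None, :, :]) ** 2).sum(-1)
    w = 1.0 / (d2 + eps) ** (power / 2.0)                       # (H, W, N)
    w /= w.sum(axis=-1, keepdims=True)                          # normalize
    return np.einsum('hwn,nc->hwc', w, vectors)
```

    Near a sparse match the field reproduces that match's vector; far from all matches it blends them smoothly, giving the continuous, subpixel-valued field described above.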

  3. Analysis of CrIS ATMS and AIRS AMSU Data Using Scientifically Equivalent Retrieval Algorithms

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Kouvaris, Louis; Iredell, Lena; Blaisdell, John

    2016-01-01

    Monthly mean August 2014 Version-6.28 AIRS and CrIS products agree well with OMPS and CERES, and reasonably well with each other. Version-6.28 CrIS total precipitable water is biased dry compared to AIRS. AIRS and CrIS Version-6.36 water vapor products are both improved compared to Version-6.28. Version-6.36 AIRS and CrIS total precipitable water also shows improved agreement with each other. AIRS Version-6.36 total ozone agrees even better with OMPS than does AIRS Version-6.28, and gives reasonable results during polar winter where OMPS does not generate products. CrIS and ATMS are high spectral resolution IR and Microwave atmospheric sounders currently flying on the SNPP satellite, and are also scheduled for flight on future NPOESS satellites. CrIS/ATMS have similar sounding capabilities to those of the AIRS/AMSU sounder suite flying on EOS Aqua. The objective of this research is to develop and implement scientifically equivalent AIRS/AMSU and CrIS/ATMS retrieval algorithms with the goal of generating a continuous data record of AIRS/AMSU and CrIS/ATMS level-3 data products with a seamless transition between them in time. To achieve this, monthly mean AIRS/AMSU and CrIS/ATMS retrieved products, and more importantly their interannual differences, should show excellent agreement with each other. The currently operational AIRS Science Team Version-6 retrieval algorithm has generated 14 years of level-3 data products. A scientifically improved AIRS Version-7 retrieval algorithm is expected to become operational in 2017. We see significant improvements in water vapor and ozone in Version-7 retrieval methodology compared to Version-6. We are working toward finalization and implementation of scientifically equivalent AIRS/AMSU and CrIS/ATMS Version-7 retrieval algorithms to be used for the eventual processing of all AIRS/AMSU and CrIS/ATMS data. 
The latest version of our retrieval algorithm is Version-6.36, which includes almost all the improvements we want in Version-7. Version-6.28 has been used to process both AIRS and CrIS data for August 2014. This poster compares August 2014 monthly mean Version-6.28 AIRS/AMSU and CrIS/ATMS products with each other, and also with monthly mean products obtained using AIRS Version-6. AIRS and CrIS results using Version-6.36 are presented for April 15, 2016. These demonstrate further improvements since Version-6.28. The new results also show improved agreement of Version-6.36 AIRS and CrIS products with each other. Version-6.36 is not yet optimized for CrIS ozone products.

  4. Evaluation of cool season precipitation event characteristics over the Northeast US in a suite of downscaled climate model hindcasts

    NASA Astrophysics Data System (ADS)

    Loikith, Paul C.; Waliser, Duane E.; Kim, Jinwon; Ferraro, Robert

    2017-08-01

    Cool season precipitation event characteristics are evaluated across a suite of downscaled climate models over the northeastern US. Downscaled hindcast simulations are produced by dynamically downscaling the Modern-Era Retrospective Analysis for Research and Applications version 2 (MERRA2) using the National Aeronautics and Space Administration (NASA)-Unified Weather Research and Forecasting (WRF) regional climate model (RCM) and the Goddard Earth Observing System Model, Version 5 (GEOS-5) global climate model. NU-WRF RCM simulations are produced at 24, 12, and 4-km horizontal resolutions using a range of spectral nudging schemes, while the MERRA2 global downscaled run is provided at 12.5-km. All model runs are evaluated using four metrics designed to capture key features of precipitation events: event frequency, event intensity, event total, and event duration. Overall, the downscaling approaches result in a reasonable representation of many of the key features of precipitation events over the region; however, considerable biases exist in the magnitude of each metric. Based on this evaluation there is no clear indication that higher resolution simulations produce more realistic results in general; however, many small-scale features such as orographic enhancement of precipitation are only captured at higher resolutions, suggesting some added value over coarser resolutions. While the differences between simulations produced using nudging and no nudging are small, there is some improvement in model fidelity when nudging is introduced, especially at a cutoff wavelength of 600 km compared to 2000 km. Based on the results of this evaluation, dynamical regional downscaling using NU-WRF results in a more realistic representation of precipitation event climatology than the global downscaling of MERRA2 using GEOS-5.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nair, Shyam; Hartley, Damon S; Hays, Ross D

    LEAF Version 2.0 is a framework comprising three models: RUSLE2, WEPS, and AGNPS. The framework can predict row crop, crop residue, and energy crop yields at sub-field resolution for various combinations of soil, climate, crop management, and residue harvesting practices. It estimates the loss of soil, carbon, and nutrients to the atmosphere, to the groundwater, and to runoff. It also models the overland flow of water and washed-off sediments, nutrients, and other chemicals to provide estimates of sediment, nutrient, and chemical loadings to water bodies within a watershed. The AGNPS model and wash-off calculations are the new additions to this version of LEAF. Development of the LEAF software is supported by DOE's BETO program.

  6. A student manual for promoting mental health among high school students.

    PubMed

    Gigantesco, Antonella; Del Re, Debora; Cascavilla, Isabella

    2013-01-01

    We describe a school program based on a student manual for promoting mental health and preventing mental illness. A preliminary version of the manual was assessed for face validity by two focus groups. The final version was evaluated for acceptability among 253 students in 10 high schools and 1 middle school in Italy. The manual included 18 chapters (or "units") which address skills for enabling students to cope with their daily lives: communication skills, problem-solving, assertive skills, negotiation, stress management, anger management and conflict resolution. The manual was found acceptable by the high school students. The effectiveness of the manual in actually promoting mental health and preventing mental illness is currently being evaluated.

  7. SEAPAK user's guide, version 2.0. Volume 2: Descriptions of programs

    NASA Technical Reports Server (NTRS)

    Mcclain, Charles R.; Darzi, Michael; Firestone, James K.; Fu, Gary; Yeh, Eueng-Nan; Endres, Daniel L.

    1991-01-01

    SEAPAK is a user-interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions were made since version 1.0, and the ancillary environmental data analysis module was greatly expanded. The package continues to be user friendly and user interactive. Also, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, thus allowing for large quantities of data to be ingested and analyzed.

  8. SEAPAK user's guide, version 2.0. Volume 1: System description

    NASA Technical Reports Server (NTRS)

    Mcclain, Charles R.; Darzi, Michael; Firestone, James K.; Fu, Gary; Yeh, Eueng-Nan; Endres, Daniel L.

    1991-01-01

    SEAPAK is a user-interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions were made to version 1.0 of the guide, and the ancillary environmental data analysis module was expanded. The package continues to emphasize user friendliness and user-interactive data analyses. Additionally, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, allowing large quantities of data to be ingested and analyzed in background.

  9. New global fire emission estimates and evaluation of volatile organic compounds

    Treesearch

    C. Wiedinmyer; L. K. Emmons; S. K. Akagi; R. J. Yokelson; J. J. Orlando; J. A. Al-Saadi; A. J. Soja

    2010-01-01

    A daily, high-resolution, global fire emissions model has been built to estimate emissions from open burning for air quality modeling applications: The Fire INventory from NCAR (FINN version 1). The model framework uses daily fire detections from the MODIS instruments and updated emission factors, specifically for speciated non-methane organic compounds (NMOC). Global...

  10. Atmosphere surface storm track response to resolved ocean mesoscale in two sets of global climate model experiments

    NASA Astrophysics Data System (ADS)

    Small, R. Justin; Msadek, Rym; Kwon, Young-Oh; Booth, James F.; Zarzycki, Colin

    2018-05-01

    It has been hypothesized that the ocean mesoscale (particularly ocean fronts) can affect the strength and location of the overlying extratropical atmospheric storm track. In this paper, we examine whether resolving ocean fronts in global climate models indeed leads to significant improvement in the simulated storm track, defined using low level meridional wind. Two main sets of experiments are used: (i) global climate model Community Earth System Model version 1 with non-eddy-resolving standard resolution or with ocean eddy-resolving resolution, and (ii) the same but with the GFDL Climate Model version 2. In case (i), it is found that higher ocean resolution leads to a reduction of a very warm sea surface temperature (SST) bias at the east coasts of the U.S. and Japan seen in standard resolution models. This in turn leads to a reduction of storm track strength near the coastlines, by up to 20%, and a better location of the storm track maxima, over the western boundary currents as observed. In case (ii), the change in absolute SST bias in these regions is less notable, and there are modest (10% or less) increases in surface storm track, and smaller changes in the free troposphere. In contrast, in the southern Indian Ocean, case (ii) shows most sensitivity to ocean resolution, and this coincides with a larger change in mean SST as ocean resolution is changed. Where the ocean resolution does make a difference, it consistently brings the storm track closer in appearance to that seen in ERA-Interim Reanalysis data. Overall, for the range of ocean model resolutions used here (1° versus 0.1°) we find that the differences in SST gradient have a small effect on the storm track strength whilst changes in absolute SST between experiments can have a larger effect. 
The latter affects the land-sea contrast, air-sea stability, surface latent heat flux, and the boundary layer baroclinicity in such a way as to reduce storm track activity adjacent to the western boundary in the N. Hemisphere storm tracks, but strengthens the storm track over the southern Indian Ocean. A note of caution is that the results are sensitive to the choice of storm track metric. The results are contrasted with those from a high resolution coupled simulation where the SST is smoothed for the purposes of computing air-sea fluxes, an alternative method of testing sensitivity to SST gradients.

  11. Impact of numerical choices on water conservation in the E3SM Atmosphere Model Version 1 (EAM V1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Kai; Rasch, Philip J.; Taylor, Mark A.

    The conservation of total water is an important numerical feature for global Earth system models. Even small conservation problems in the water budget can lead to systematic errors in century-long simulations for sea level rise projection. This study quantifies and reduces various sources of water conservation error in the atmosphere component of the Energy Exascale Earth System Model. Several sources of water conservation error have been identified during the development of the version 1 (V1) model. The largest errors result from the numerical coupling between the resolved dynamics and the parameterized sub-grid physics. A hybrid coupling using different methods for fluid dynamics and tracer transport provides a reduction of water conservation error by a factor of 50 at 1° horizontal resolution as well as consistent improvements at other resolutions. The second largest error source is the use of an overly simplified relationship between the surface moisture flux and latent heat flux at the interface between the host model and the turbulence parameterization. This error can be prevented by applying the same (correct) relationship throughout the entire model. Two additional types of conservation error that result from correcting the surface moisture flux and clipping negative water concentrations can be avoided by using mass-conserving fixers. With all four error sources addressed, the water conservation error in the V1 model is negligible and insensitive to the horizontal resolution. The associated changes in the long-term statistics of the main atmospheric features are small. A sensitivity analysis is carried out to show that the magnitudes of the conservation errors decrease strongly with temporal resolution but increase with horizontal resolution. The increased vertical resolution in the new model results in a very thin model layer at the Earth's surface, which amplifies the conservation error associated with the surface moisture flux correction.
We note that for some of the identified error sources, the proposed fixers are remedies rather than solutions to the problems at their roots. Future improvements in time integration would be beneficial for this model.
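    The mass-conserving fixers mentioned above can be illustrated with a minimal sketch (the function and variable names here are hypothetical, not EAM code): after a step that spuriously creates or destroys water, the field is rescaled so its area-weighted global total matches the expected budget.

```python
def apply_mass_fixer(q, area, expected_total):
    """Rescale per-cell water amounts q so the area-weighted global total
    matches expected_total (a multiplicative global fixer)."""
    current_total = sum(qi * ai for qi, ai in zip(q, area))
    if current_total == 0.0:
        return list(q)
    scale = expected_total / current_total
    return [qi * scale for qi in q]

# Toy example: a physics step spuriously created extra water in one cell.
q = [1.0, 2.0, 3.0]          # per-cell water amounts
area = [1.0, 1.0, 2.0]       # cell areas (weights)
expected = sum(qi * ai for qi, ai in zip(q, area))  # budget before the step
q[0] *= 1.09                 # conservation error introduced by the step
fixed = apply_mass_fixer(q, area, expected)
```

    A multiplicative fixer like this is a remedy in the sense the abstract notes: it restores the global budget without addressing where the error arose.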

  12. CALIPSO IIR Version 2 Level 1b calibrated radiances: analysis and reduction of residual biases in the Northern Hemisphere

    NASA Astrophysics Data System (ADS)

    Garnier, Anne; Trémas, Thierry; Pelon, Jacques; Lee, Kam-Pui; Nobileau, Delphine; Gross-Colzy, Lydwine; Pascal, Nicolas; Ferrage, Pascale; Scott, Noëlle A.

    2018-04-01

    Version 2 of the Level 1b calibrated radiances of the Imaging Infrared Radiometer (IIR) on board the Cloud-Aerosol Lidar and Infrared Satellite Observation (CALIPSO) satellite has been released recently. This new version incorporates corrections of small but systematic seasonal calibration biases previously revealed in Version 1 data products mostly north of 30° N. These biases - of different amplitudes in the three IIR channels 8.65 µm (IIR1), 10.6 µm (IIR2), and 12.05 µm (IIR3) - were made apparent by a striping effect in images of IIR inter-channel brightness temperature differences (BTDs) and through seasonal warm biases of nighttime IIR brightness temperatures in the 30-60° N latitude range. The latter were highlighted through observed and simulated comparisons with similar channels of the Moderate Resolution Imaging Spectroradiometer (MODIS) on board the Aqua spacecraft. To characterize the calibration biases affecting Version 1 data, a semi-empirical approach is developed, which is based on the in-depth analysis of the IIR internal calibration procedure in conjunction with observations such as statistical comparisons with similar MODIS/Aqua channels. Two types of calibration biases are revealed: an equalization bias affecting part of the individual IIR images and a global bias affecting the radiometric level of each image. These biases are observed only when the temperature of the instrument increases, and they are found to be functions of elapsed time since night-to-day transition, regardless of the season. Correction coefficients of Version 1 radiances could thus be defined and implemented in the Version 2 code. As a result, the striping effect seen in Version 1 is significantly attenuated in Version 2. Systematic discrepancies between nighttime and daytime IIR-MODIS BTDs in the 30-60° N latitude range in summer are reduced from 0.2 K in Version 1 to 0.1 K in Version 2 for IIR1-MODIS29. 
For IIR2-MODIS31 and IIR3-MODIS32, they are reduced from 0.4 K to close to zero, except for IIR3-MODIS32 in June, where the night-minus-day difference is around -0.1 K.

  13. Validation of the CHIRPS Satellite Rainfall Estimates over Eastern of Africa

    NASA Astrophysics Data System (ADS)

    Dinku, T.; Funk, C. C.; Tadesse, T.; Ceccato, P.

    2017-12-01

    Long and temporally consistent rainfall time series are essential in climate analyses and applications. Rainfall data from station observations are inadequate over many parts of the world due to sparse or non-existent observation networks, or limited reporting of gauge observations. As a result, satellite rainfall estimates have been used as an alternative or as a supplement to station observations. However, many satellite-based rainfall products with long time series suffer from coarse spatial and temporal resolutions and inhomogeneities caused by variations in satellite inputs. There are some satellite rainfall products with reasonably consistent time series, but they are often limited to specific geographic areas. The Climate Hazards Group Infrared Precipitation (CHIRP) and CHIRP combined with station observations (CHIRPS) are recently produced satellite-based rainfall products with relatively high spatial and temporal resolutions and quasi-global coverage. In this study, CHIRP and CHIRPS were evaluated over East Africa at daily, dekadal (10-day) and monthly time scales. The evaluation was done by comparing the satellite products with rain gauge data from about 1200 stations, an unprecedented number of validation stations for this region. The results provide a unique region-wide understanding of how satellite products perform over different climatic/geographic (lowlands, mountainous regions, and coastal) regions. The CHIRP and CHIRPS products were also compared with two similar satellite rainfall products: the African Rainfall Climatology version 2 (ARC2) and the latest release of the Tropical Applications of Meteorology using Satellite data (TAMSAT). The results show that both CHIRP and CHIRPS products are significantly better than ARC2, with higher skill and low or no bias. These products were also found to be slightly better than the latest version of the TAMSAT product.
A comparison was also done between the latest release of the TAMSAT product (TAMSAT3) and the earlier version (TAMSAT2), which showed that the latest version is a substantial improvement over the previous one, particularly with regard to the bias statistics.

  14. Summary of the Validation of the Second Version of the Aster Gdem

    NASA Astrophysics Data System (ADS)

    Meyer, D. J.; Tachikawa, T.; Abrams, M.; Crippen, R.; Krieger, T.; Gesch, D.; Carabajal, C.

    2012-07-01

    On October 17, 2011, NASA and the Ministry of Economy, Trade and Industry (METI) of Japan released the second version of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Model (GDEM) to users worldwide at no charge as a contribution to the Global Earth Observing System of Systems (GEOSS). The first version of the ASTER GDEM, released on June 29, 2009, was compiled from over 1.2 million scene-based DEMs covering land surfaces between 83°N and 83°S latitudes. The second version (GDEM2) incorporates 260,000 additional scenes to improve coverage, a smaller correlation kernel to yield higher spatial resolution, and improved water masking. As with GDEM1, US and Japanese partners collaborated to validate GDEM2. Its absolute accuracy was within -0.20 meters on average when compared against 18,000 geodetic control points over the conterminous US (CONUS), with an accuracy of 17 meters at the 95% confidence level. The Japan study noted the GDEM2 differed from the 10-meter national elevation grid by -0.7 meters over bare areas, and by 7.4 meters over forested areas. The CONUS study noted a similar result, with the GDEM2 determined to be about 8 meters above the 1 arc-second US National Elevation Database (NED) over most forested areas, and more than a meter below NED over bare areas. A global ICESat study found the GDEM2 to be on average within 3 meters of altimeter-derived control. The Japan study noted a horizontal displacement of 0.23 pixels in GDEM2. A study from the US National Geospatial Intelligence Agency also determined horizontal displacement and vertical accuracy as compared to the 1 arc-second Shuttle Radar Topography Mission DEM. US and Japanese studies estimated the horizontal resolution of the GDEM2 to be between 71 and 82 meters. Finally, the number of voids and artifacts noted in GDEM1 was substantially reduced in GDEM2.
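    As context for the accuracy figures quoted above: when elevation errors are approximately normally distributed, the accuracy at 95% confidence (LE95) is about 1.96 times the error standard deviation. A minimal illustrative sketch, not the validation teams' actual code:

```python
import math

def vertical_accuracy_le95(errors):
    """Summarize DEM vertical errors (DEM minus control heights) as mean
    bias, RMSE, and linear error at 95% confidence, assuming roughly
    normal errors so that LE95 ~= 1.96 * sigma."""
    n = len(errors)
    mean = sum(errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    sigma = math.sqrt(sum((e - mean) ** 2 for e in errors) / n)
    return mean, rmse, 1.96 * sigma

mean, rmse, le95 = vertical_accuracy_le95([-1.0, 1.0, -1.0, 1.0])
```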

  15. Accurate and general treatment of electrostatic interaction in Hamiltonian adaptive resolution simulations

    NASA Astrophysics Data System (ADS)

    Heidari, M.; Cortes-Huerto, R.; Donadio, D.; Potestio, R.

    2016-10-01

    In adaptive resolution simulations the same system is concurrently modeled with different resolution in different subdomains of the simulation box, thereby enabling an accurate description in a small but relevant region, while the rest is treated with a computationally parsimonious model. In this framework, electrostatic interaction, whose accurate treatment is a crucial aspect in the realistic modeling of soft matter and biological systems, represents a particularly acute problem due to the intrinsic long-range nature of Coulomb potential. In the present work we propose and validate the usage of a short-range modification of Coulomb potential, the Damped shifted force (DSF) model, in the context of the Hamiltonian adaptive resolution simulation (H-AdResS) scheme. This approach, which is here validated on bulk water, ensures a reliable reproduction of the structural and dynamical properties of the liquid, and enables a seamless embedding in the H-AdResS framework. The resulting dual-resolution setup is implemented in the LAMMPS simulation package, and its customized version employed in the present work is made publicly available.
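    For reference, the DSF potential discussed above has a standard closed form (the Fennell-Gezelter expression), shifted so that both the potential and its derivative vanish at the cutoff. A minimal sketch with the Coulomb prefactor set to 1; the damping parameter and cutoff values used below are arbitrary illustrations:

```python
import math

def dsf_potential(r, qq, alpha, rc):
    """Damped shifted force (DSF) Coulomb pair potential for charge
    product qq, damping parameter alpha, and cutoff rc (Coulomb
    constant = 1). Both the potential and the force go smoothly to
    zero at the cutoff rc."""
    if r > rc:
        return 0.0
    shift = math.erfc(alpha * rc) / rc
    force_shift = (math.erfc(alpha * rc) / rc ** 2
                   + 2.0 * alpha / math.sqrt(math.pi)
                   * math.exp(-(alpha * rc) ** 2) / rc)
    return qq * (math.erfc(alpha * r) / r - shift + force_shift * (r - rc))
```

    The smooth truncation is what makes the interaction effectively short-ranged and hence straightforward to embed in a dual-resolution H-AdResS setup.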

  16. Cosmic ray isotope measurements with a new Cerenkov X total energy telescope

    NASA Technical Reports Server (NTRS)

    Webber, W. R.; Kish, J. C.; Schrier, D. A.

    1985-01-01

    Measurements of the isotopic composition of cosmic-ray nuclei with Z = 7-20 are reported. These measurements were made with a new version of a Cerenkov x total E telescope. Path length and uniformity corrections are made to all counters to an RMS level of 1%. Since the Cerenkov counter is crucial to mass measurements using the C x E technique, special care was taken to optimize the resolution of the 2.4 cm thick Pilot 425 Cerenkov counter. This counter exhibited a beta = 1 muon equivalent LED resolution of 24%, corresponding to a total of 90 p.e. collected at the first dynodes of the photomultiplier tubes.
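    As a plausibility check (not part of the original analysis): the quoted 24% resolution is close to what pure Poisson photoelectron statistics predict for roughly 90 photoelectrons, since FWHM/mean ≈ 2.355/√N:

```python
import math

def poisson_fwhm_resolution(n_pe):
    """Fractional FWHM resolution expected from Poisson statistics of
    n_pe collected photoelectrons: 2.355 * sigma/mean = 2.355/sqrt(n_pe)."""
    return 2.355 / math.sqrt(n_pe)

resolution = poisson_fwhm_resolution(90)  # ~0.248, i.e. ~24.8% FWHM
```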

  17. Assessing Mesoscale Material Response via High-Resolution Line-Imaging VISAR

    NASA Astrophysics Data System (ADS)

    Furnish, M. D.; Trott, W. M.; Mason, J.; Podsednik, J.; Reinhart, W. D.; Hall, C.

    2004-07-01

    Of special promise for providing dynamic mesoscale response data is the line-imaging VISAR, an instrument for providing spatially resolved velocity histories in dynamic experiments. We have prepared a line-imaging VISAR system capable of spatial resolution in the 10 - 20 micron range. We are applying this instrument to selected experiments on a compressed gas gun, chosen to provide initial data for several problems of interest, including: (1) pore-collapse in single-crystal copper (70 micron diameter hole; 2 different versions); and (2) response of a welded joint in dissimilar materials (Ta, Nb) to ramp loading relative to that of a compression joint.

  18. Moderate-resolution sea surface temperature data for the nearshore North Pacific

    USGS Publications Warehouse

    Payne, Meredith C.; Reusser, Deborah A.; Lee, Henry; Brown, Cheryl A.

    2011-01-01

    Coastal sea surface temperature (SST) is an important environmental characteristic in determining the suitability of habitat for nearshore marine and estuarine organisms. This publication describes and provides access to an easy-to-use coastal SST dataset for ecologists, biogeographers, oceanographers, and other scientists conducting research on nearshore marine habitats or processes. The data cover the Temperate Northern Pacific Ocean as defined by the 'Marine Ecosystems of the World' (MEOW) biogeographic schema developed by The Nature Conservancy. The spatial resolution of the SST data is 4-km grid cells within 20 km of the shore. The data span a 29-year period - from September 1981 to December 2009. These SST data were derived from Advanced Very High Resolution Radiometer (AVHRR) instrument measurements compiled into monthly means as part of the Pathfinder versions 5.0 and 5.1 (PFSST V50 and V51) Project. The processing methods used to transform the data from their native Hierarchical Data Format Scientific Data Set (HDF SDS) to georeferenced, spatial datasets capable of being read into geographic information systems (GIS) software are explained. In addition, links are provided to examples of scripts involved in the data processing steps. The scripts were written in the Python programming language, which is supported by ESRI's ArcGIS version 9 or later. The processed data files are also provided in text (.csv) and Access 2003 Database (.mdb) formats. All data except the raster files include attributes identifying realm, province, and ecoregion as defined by the MEOW classification schema.
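    The core of the grid-to-GIS conversion described above can be sketched as follows (illustrative only; the actual Pathfinder processing scripts read HDF SDS files and target ESRI's ArcGIS): flatten a gridded SST field into (lon, lat, value) records suitable for a .csv point layer, skipping fill values. Cell-center coordinates on a regular grid are assumed here.

```python
def grid_to_point_records(sst, lon0, lat0, cell_deg, fill=-999.0):
    """Convert a 2-D SST grid (rows ordered south to north) into
    (lon, lat, sst) tuples at cell centers, dropping fill values."""
    records = []
    for i, row in enumerate(sst):
        for j, val in enumerate(row):
            if val == fill:
                continue
            records.append((lon0 + (j + 0.5) * cell_deg,
                            lat0 + (i + 0.5) * cell_deg,
                            val))
    return records

recs = grid_to_point_records([[10.0, -999.0], [12.0, 13.0]], 0.0, 0.0, 1.0)
```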

  19. Validating GPM-based Multi-satellite IMERG Products Over South Korea

    NASA Astrophysics Data System (ADS)

    Wang, J.; Petersen, W. A.; Wolff, D. B.; Ryu, G. H.

    2017-12-01

    Accurate precipitation estimates derived from space-borne satellite measurements are critical for a wide variety of applications such as water budget studies, and prevention or mitigation of natural hazards caused by extreme precipitation events. This study validates the near-real-time Early Run, Late Run and the research-quality Final Run Integrated Multi-Satellite Retrievals for GPM (IMERG) using Korean Quantitative Precipitation Estimation (QPE). The Korean QPE data are at a 1-hour temporal resolution and 1-km by 1-km spatial resolution, and were developed by the Korea Meteorological Administration (KMA) from a Real-time ADjusted Radar-AWS (Automatic Weather Station) Rainrate (RAD-RAR) system utilizing eleven radars over the Republic of Korea. The validation is conducted by comparing Version-04A IMERG (Early, Late and Final Runs) with Korean QPE over the area (124.5E-130.5E, 32.5N-39N) at various spatial and temporal scales during March 2014 through November 2016. The comparisons demonstrate the reasonably good ability of Version-04A IMERG products in estimating precipitation over South Korea's complex topography, which consists mainly of hills and mountains as well as large coastal plains. Based on these data, the Early Run, Late Run and Final Run IMERG precipitation estimates higher than 0.1 mm/h are about 20.1%, 7.5% and 6.1% higher than Korean QPE at 0.1° and 1-hour resolutions. Detailed comparison results are available at https://wallops-prf.gsfc.nasa.gov/KoreanQPE.V04/index.html
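    The threshold-conditioned differences quoted above can be computed along these lines (an illustrative sketch, not the study's code; the 0.1 mm/h threshold is taken from the abstract):

```python
def percent_bias_above_threshold(sat, ref, thresh=0.1):
    """Mean percent difference of satellite rain rates relative to a
    reference QPE, using only pairs where the satellite estimate
    exceeds thresh (mm/h)."""
    pairs = [(s, r) for s, r in zip(sat, ref) if s > thresh]
    s_mean = sum(s for s, _ in pairs) / len(pairs)
    r_mean = sum(r for _, r in pairs) / len(pairs)
    return 100.0 * (s_mean - r_mean) / r_mean

# Toy matched series of rain rates (mm/h) at the same grid cells and hours.
bias = percent_bias_above_threshold([1.2, 0.05, 2.4], [1.0, 0.5, 2.0])
```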

  20. Evaluation of snowmelt simulation in the Weather Research and Forecasting model

    NASA Astrophysics Data System (ADS)

    Jin, Jiming; Wen, Lijuan

    2012-05-01

    The objective of this study is to better understand and improve snowmelt simulations in the advanced Weather Research and Forecasting (WRF) model by coupling it with the Community Land Model (CLM) Version 3.5. Both WRF and CLM are developed by the National Center for Atmospheric Research. The automated Snow Telemetry (SNOTEL) station data over the Columbia River Basin in the northwestern United States are used to evaluate snowmelt simulations generated with the coupled WRF-CLM model. These SNOTEL data include snow water equivalent (SWE), precipitation, and temperature. The simulations cover the period of March through June 2002 and focus mostly on the snowmelt season. Initial results show that when compared to observations, WRF-CLM significantly improves the simulations of SWE, which is underestimated when the release version of WRF is coupled with the Noah and Rapid Update Cycle (RUC) land surface schemes, in which snow physics is oversimplified. Further analysis shows that more realistic snow surface energy allocation in CLM is an important process that results in improved snowmelt simulations when compared to that in Noah and RUC. Additional simulations with WRF-CLM at different horizontal spatial resolutions indicate that accurate description of topography is also vital to SWE simulations. WRF-CLM at 10 km resolution produces the most realistic SWE simulations when compared to those produced with coarser spatial resolutions in which SWE is remarkably underestimated. The coupled WRF-CLM provides an important tool for research and forecasts in weather, climate, and water resources at regional scales.

  1. AIRS Science Accomplishments Version 4.0/Plans for Version 5

    NASA Technical Reports Server (NTRS)

    Pagano, Thomas S.; Aumann, Hartmut; Elliott, Denis; Granger, Stephanie; Kahn, Brian; Eldering, Annmarie; Irion, Bill; Fetzer, Eric; Olsen, Ed; Lee, Sung-Yung

    2006-01-01

    This talk covers accomplishments with AIRS data: what we have learned from almost three years of data, what part of this is emerging in Version 4.0, what part we would like to see filtering into Version 5.0, and what part constitutes limitations in the AIRS requirements, such as spectral and spatial resolution, which have to be deferred to the wish list for the next-generation hyperspectral sounder. The AIRS calibration accuracy at the 100 mK level and stability at the 6 mK/year level are remarkable. They establish the unique capability of a cooled grating array spectrometer in Earth orbit for climate research. Data which are sufficiently clear to match the radiometric accuracy of the instrument have a yield of less than 1%. This is acceptable for calibration. The 2616/cm window channel combined with the RTG.SST for the tropical ocean allows excellent assessment of radiometric calibration accuracy and stability. For absolute calibration verification, 100 mK is the limit due to cloud contamination. The 10 micron window channels can be used for stability assessment, but accuracy is limited to 300 mK due to water continuum absorption uncertainties.

  2. Additions to Mars Global Reference Atmospheric Model (MARS-GRAM)

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; James, Bonnie

    1992-01-01

    Three major additions or modifications were made to the Mars Global Reference Atmospheric Model (Mars-GRAM): (1) in addition to the interactive version, a new batch version is available, which uses NAMELIST input, and is completely modular, so that the main driver program can easily be replaced by any calling program, such as a trajectory simulation program; (2) both the interactive and batch versions now have an option for treating local-scale dust storm effects, rather than just the global-scale dust storms in the original Mars-GRAM; and (3) the Zurek wave perturbation model was added, to simulate the effects of tidal perturbations, in addition to the random (mountain wave) perturbation model of the original Mars-GRAM. A minor modification was also made which allows heights to go 'below' local terrain height and return 'realistic' pressure, density, and temperature, and not the surface values, as returned by the original Mars-GRAM. This feature will allow simulations of Mars rover paths which might go into local 'valley' areas which lie below the average height of the present, rather coarse-resolution, terrain height data used by Mars-GRAM. Sample input and output of both the interactive and batch versions of Mars-GRAM are presented.

  3. Additions to Mars Global Reference Atmospheric Model (Mars-GRAM)

    NASA Technical Reports Server (NTRS)

    Justus, C. G.

    1991-01-01

    Three major additions or modifications were made to the Mars Global Reference Atmospheric Model (Mars-GRAM): (1) in addition to the interactive version, a new batch version is available, which uses NAMELIST input, and is completely modular, so that the main driver program can easily be replaced by any calling program, such as a trajectory simulation program; (2) both the interactive and batch versions now have an option for treating local-scale dust storm effects, rather than just the global-scale dust storms in the original Mars-GRAM; and (3) the Zurek wave perturbation model was added, to simulate the effects of tidal perturbations, in addition to the random (mountain wave) perturbation model of the original Mars-GRAM. A minor modification has also been made which allows heights to go below local terrain height and return realistic pressure, density, and temperature (not the surface values) as returned by the original Mars-GRAM. This feature will allow simulations of Mars rover paths which might go into local valley areas which lie below the average height of the present, rather coarse-resolution, terrain height data used by Mars-GRAM. Sample input and output of both the interactive and batch version of Mars-GRAM are presented.

  4. Technical Report Series on Global Modeling and Data Assimilation. Volume 14; A Comparison of GEOS Assimilated Data with FIFE Observations

    NASA Technical Reports Server (NTRS)

    Bosilovich, Michael G.; Suarez, Max J. (Editor); Schubert, Siegfried D.

    1998-01-01

    First ISLSCP Field Experiment (FIFE) observations have been used to validate the near-surface properties of various versions of the Goddard Earth Observing System (GEOS) Data Assimilation System. The site-averaged FIFE data set extends from May 1987 through November 1989, allowing the investigation of several time scales, including the annual cycle, daily means and diurnal cycles. Furthermore, the development of the daytime convective planetary boundary layer is presented for several days. Monthly variations of the surface energy budget during the summer of 1988 demonstrate the effect of the prescribed surface soil wetness boundary conditions. GEOS data come from the first frozen version of the assimilation system (GEOS-1 DAS) and two experimental versions of GEOS (v. 2.0 and 2.1) with substantially greater vertical resolution and other changes that influence the boundary layer. This report provides a baseline for future versions of the GEOS data assimilation system that will incorporate a state-of-the-art land surface parameterization. Several suggestions are proposed to improve the generality of future comparisons. These include the use of more diverse field experiment observations and an estimate of gridpoint heterogeneity from the new land surface parameterization.

  5. Fundamental techniques for resolution enhancement of average subsampled images

    NASA Astrophysics Data System (ADS)

    Shen, Day-Fann; Chiu, Chui-Wen

    2012-07-01

    Although single image resolution enhancement, otherwise known as super-resolution, is widely regarded as an ill-posed inverse problem, we re-examine the fundamental relationship between a high-resolution (HR) image acquisition module and its low-resolution (LR) counterpart. Analysis shows that partial HR information is attenuated, but still exists in its LR version, through the fundamental averaging-and-subsampling process. As a result, we propose a modified Laplacian filter (MLF) and an intensity correction process (ICP) as the pre- and post-process, respectively, used with an interpolation algorithm to partially restore the attenuated information in a super-resolution (SR) enhanced image. Experiments show that the proposed MLF and ICP provide significant and consistent quality improvements on all 10 test images with three well-known interpolation methods, including bilinear, bi-cubic, and the SR graphical user interface program provided by Ecole Polytechnique Federale de Lausanne. The proposed MLF and ICP are simple in implementation and generally applicable to all average-subsampled LR images. MLF and ICP, separately or together, can be integrated into most interpolation methods that attempt to restore the original HR contents. Finally, the idea of MLF and ICP can also be applied to average-subsampled one-dimensional signals.
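    The averaging-and-subsampling acquisition model that the analysis above rests on can be sketched as follows (MLF and ICP are the authors' methods and are not reproduced here; this shows only the HR-to-LR forward model):

```python
def average_subsample(hr, factor=2):
    """Forward model relating an HR image to its LR version: each LR
    pixel is the mean of a factor-by-factor block of HR pixels.
    hr is a 2-D list of intensities."""
    h, w = len(hr), len(hr[0])
    lr = []
    for i in range(0, h - h % factor, factor):
        row = []
        for j in range(0, w - w % factor, factor):
            block = [hr[i + di][j + dj]
                     for di in range(factor) for dj in range(factor)]
            row.append(sum(block) / (factor * factor))
        lr.append(row)
    return lr

lr = average_subsample([[1, 3], [5, 7]])  # a single 2x2 block
```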

  6. Hyperspectral imagery super-resolution by compressive sensing inspired dictionary learning and spatial-spectral regularization.

    PubMed

    Huang, Wei; Xiao, Liang; Liu, Hongyi; Wei, Zhihui

    2015-01-19

    Due to instrumental and imaging optics limitations, it is difficult to acquire high spatial resolution hyperspectral imagery (HSI). Super-resolution (SR) imaging aims at inferring high quality images of a given scene from degraded versions of the same scene. This paper proposes a novel hyperspectral imagery super-resolution (HSI-SR) method via dictionary learning and spatial-spectral regularization. The main contributions of this paper are twofold. First, inspired by the compressive sensing (CS) framework, for learning the high resolution dictionary, we encourage stronger sparsity on image patches and promote smaller coherence between the learned dictionary and sensing matrix. Thus, a sparsity- and incoherence-restricted dictionary learning method is proposed to achieve more efficient sparse representation. Second, a variational regularization model combining a spatial sparsity regularization term and a new local spectral similarity preserving term is proposed to integrate the spectral and spatial-contextual information of the HSI. Experimental results show that the proposed method can effectively recover spatial information and better preserve spectral information. The high spatial resolution HSI reconstructed by the proposed method outperforms the results of other well-known methods in terms of both objective measurements and visual evaluation.

  7. Initial evaluation of the Celesteion large-bore PET/CT scanner in accordance with the NEMA NU2-2012 standard and the Japanese guideline for oncology FDG PET/CT data acquisition protocol version 2.0.

    PubMed

    Kaneta, Tomohiro; Ogawa, Matsuyoshi; Motomura, Nobutoku; Iizuka, Hitoshi; Arisawa, Tetsu; Hino-Shishikura, Ayako; Yoshida, Keisuke; Inoue, Tomio

    2017-10-11

    The goal of this study was to evaluate the performance of the Celesteion positron emission tomography/computed tomography (PET/CT) scanner, which is characterized by a large bore and time-of-flight (TOF) function, in accordance with the NEMA NU-2 2012 standard and version 2.0 of the Japanese guideline for oncology fluorodeoxyglucose PET/CT data acquisition protocol. Spatial resolution, sensitivity, count rate characteristic, scatter fraction, energy resolution, TOF timing resolution, and image quality were evaluated according to the NEMA NU-2 2012 standard. Phantom experiments were performed using an 18F solution and an IEC body phantom of the type described in the NEMA NU-2 2012 standard. The minimum scanning time required for the detection of a 10-mm hot sphere with a 4:1 target-to-background ratio, the phantom noise-equivalent count (NECphantom), % background variability (N10mm), % contrast (QH,10mm), and recovery coefficient (RC) were calculated according to the Japanese guideline. The measured spatial resolution ranged from 4.5- to 5-mm full width at half maximum (FWHM). The sensitivity and scatter fraction were 3.8 cps/kBq and 37.3%, respectively. The peak noise-equivalent count rate was 70 kcps at 29.6 kBq/mL in the phantom. The system energy resolution was 12.4% and the TOF timing resolution was 411 ps at FWHM. Minimum scanning times of 2, 7, 6, and 2 min per bed position, respectively, are recommended for the visual score, NECphantom, N10mm, and the QH,10mm to N10mm ratio (QNR) by the Japanese guideline. The RC of a 10-mm-diameter sphere was 0.49, which exceeded the minimum recommended value. The Celesteion large-bore PET/CT system had low sensitivity and NEC, but good spatial and time resolution when compared to other PET/CT scanners. The QNR met the recommended values of the Japanese guideline even at 2 min.
The Celesteion is therefore thought to provide acceptable image quality with 2 min/bed position acquisition, which is the most common scan protocol in Japan.
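
    The noise-equivalent count and scatter-fraction figures reported above follow standard definitions: in the simplest form NEC = T²/(T + S + R) for trues T, scatter S, and randoms R (some variants apply a factor to the randoms term), and SF = S/(S + T). A minimal sketch of those two formulas; the count rates below are illustrative numbers, not measured Celesteion data:

```python
def noise_equivalent_count_rate(trues, scatter, randoms):
    """Simplest NEC formula: NEC = T^2 / (T + S + R).
    Variants weight the randoms term (e.g., 2R with delayed-window subtraction)."""
    return trues ** 2 / (trues + scatter + randoms)

def scatter_fraction(scatter, trues):
    """Scatter fraction SF = S / (S + T)."""
    return scatter / (scatter + trues)

# Illustrative count rates in kcps (hypothetical, not from the abstract)
t, s, r = 100.0, 60.0, 40.0
nec = noise_equivalent_count_rate(t, s, r)  # 100^2 / 200 = 50.0 kcps
sf = scatter_fraction(s, t)                 # 60 / 160 = 0.375
```

    With no scatter or randoms, NEC reduces to the trues rate, which is why NEC is read as the "effective" counting performance after noise penalties.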

  8. Features Based Assessments of Warm Season Convective Precipitation Forecasts From the High Resolution Rapid Refresh Model

    NASA Astrophysics Data System (ADS)

    Bytheway, Janice L.

    Forecast models have seen vast improvements in recent years, via increased spatial and temporal resolution, rapid updating, assimilation of more observational data, and continued development and improvement of the representation of the atmosphere. One such model is the High Resolution Rapid Refresh (HRRR) model, a 3 km, hourly-updated, convection-allowing model that has been in development since 2010 and running operationally over the contiguous US since 2014. In 2013, the HRRR became the only US model to assimilate radar reflectivity via diabatic assimilation, a process in which the observed reflectivity is used to induce a latent heating perturbation in the model initial state in order to produce precipitation in those areas where it is indicated by the radar. In order to support the continued development and improvement of the HRRR model with regard to forecasts of convective precipitation, the concept of an assessment is introduced. The assessment process aims to connect model output with observations by first validating model performance then attempting to connect that performance to model assumptions, parameterizations and processes to identify areas for improvement. Observations from remote sensing platforms such as radar and satellite can provide valuable information about three-dimensional storm structure and microphysical properties for use in the assessment, including estimates of surface rainfall, hydrometeor types and size distributions, and column moisture content. A features-based methodology is used to identify warm season convective precipitating objects in the 2013, 2014, and 2015 versions of HRRR precipitation forecasts, Stage IV multisensor precipitation products, and Global Precipitation Measurement (GPM) core satellite observations. 
Quantitative precipitation forecasts (QPFs) are evaluated for biases in hourly rainfall intensity, total rainfall, and areal coverage in both the US Central Plains (29-49N, 85-105W) and the US Mountain West (29-49N, 105-125W). Features identified in the model and in Stage IV were tracked through time in order to evaluate forecasts through several hours of the forecast period. The 2013 version of the model was found to produce significantly stronger convective storms than observed, with a slight southerly displacement from the observed storms during the peak hours of convective activity (17-00 UTC). This version of the model also displayed a strong relationship between atmospheric water vapor content and cloud thickness over the Central Plains. In the 2014 and 2015 versions of the model, storms in the western US were found to be smaller and weaker than observed, and satellite products (brightness temperatures and reflectivities) simulated using model output indicated that many of the forecast storms contained too much ice above the freezing level. Model upgrades intended to decrease the biases seen in early versions include changes to the reflectivity assimilation, the addition of sub-grid-scale cloud parameterizations, changes to the representation of surface processes, and the addition of aerosol processes to the microphysics. The effects of these changes are evident in each successive version of the model, with reduced intensity biases, elimination of the southerly bias, and improved representation of the onset of convection.

  9. PC-SEAPAK user's guide, version 4.0

    NASA Technical Reports Server (NTRS)

    Mcclain, Charles R.; Fu, Gary; Darzi, Michael; Firestone, James K.

    1992-01-01

    PC-SEAPAK is designed to provide a complete and affordable capability for processing and analysis of NOAA Advanced Very High Resolution Radiometer (AVHRR) and Nimbus-7 Coastal Zone Color Scanner (CZCS) data. Since the release of version 3.0 over a year ago, significant revisions were made to the AVHRR and CZCS programs and to the statistical data analysis module, and a number of new programs were added. This new version has 114 procedures listed in its menus. The package continues to emphasize user-friendliness and interactive data analysis. Additionally, because the scientific goals of the ocean color research being conducted have shifted to larger space and time scales, batch processing capabilities were enhanced, allowing large quantities of data to be easily ingested and analyzed. The development of PC-SEAPAK was paralleled by two other activities that were influential and assistive: the global CZCS processing effort at GSFC and the continued development of VAX-SEAPAK. SEAPAK incorporates the instrument calibration and supports all levels of data available from the CZCS archive.

  10. What You Need to Know About the OMI NO2 Data Product for Air Quality Studies

    NASA Technical Reports Server (NTRS)

    Celarier, E. A.; Gleason, J. F.; Bucsela, E. J.; Brinksma, E.; Veefkind, J. P.

    2007-01-01

    The standard nitrogen dioxide (NO2) data product, produced from measurements by the Ozone Monitoring Instrument (OMI), is publicly available online from the NASA GES DISC facility. Important data fields include total and tropospheric column densities, as well as collocated data for cloud fraction and cloud top height, surface albedo, and snow/ice coverage, at the resolution of the OMI instrument (12 km x 26 km at nadir). The retrieved NO2 data have been validated, principally under clear-sky conditions. The first public-release version has been available since September 2006. An improved version of the data product, which includes a number of new data fields and improved estimates of the retrieval uncertainties, will be released by the end of 2007. This talk will describe the standard NO2 data product, including details that are essential for the use of the data in air quality studies. We will also describe the principal improvements in the new version of the data product.

  11. Detection of grapes in natural environment using HOG features in low resolution images

    NASA Astrophysics Data System (ADS)

    Škrabánek, Pavel; Majerík, Filip

    2017-07-01

    Detection of grapes in real-life images is important in various viticulture applications. A grape detector based on an SVM classifier combined with a HOG descriptor has proven very efficient at detecting white varieties in high-resolution images. Nevertheless, the high time complexity of this approach made it unsuitable for real-time applications, even when a detector of simplified structure was used. Thus, we examined the possibility of applying the simplified version to images of lower resolution. For this purpose, we designed a method to search for the detector setting that gives the best trade-off between time complexity and performance. In order to provide precise evaluation results, we formed new extended datasets. We discovered that even when applied to low-resolution images, the simplified detector, with an appropriate setting of all tuneable parameters, was competitive with other state-of-the-art solutions. We concluded that the detector is qualified for real-time detection of grapes in real-life images.
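
    The HOG-plus-SVM pipeline named above can be illustrated with a toy gradient-orientation descriptor. This is a from-scratch NumPy sketch, not the paper's detector: the cell size, bin count, and input patch are illustrative choices, and a practical system would feed such descriptors from an optimized HOG implementation into a trained SVM classifier:

```python
import numpy as np

def hog_descriptor(image, cell=8, bins=9):
    """Toy HOG-style descriptor: per-cell histograms of gradient orientation,
    weighted by gradient magnitude, using unsigned angles in [0, 180)."""
    gy, gx = np.gradient(image.astype(float))   # row- and column-wise gradients
    mag = np.hypot(gx, gy)                      # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0
    h, w = image.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            a = ang[i:i + cell, j:j + cell].ravel()
            m = mag[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            # L2 normalization makes each cell histogram contrast-invariant
            feats.append(hist / (np.linalg.norm(hist) + 1e-9))
    return np.concatenate(feats)

# A 32x32 patch with 8x8 cells gives (32/8)^2 = 16 cells x 9 bins = 144 features
patch = np.random.default_rng(0).random((32, 32))
desc = hog_descriptor(patch)
```

    Shrinking the input resolution shrinks the number of cells, which is the source of the speed-versus-performance trade-off the detector setting must balance.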

  12. Combined self-learning based single-image super-resolution and dual-tree complex wavelet transform denoising for medical images

    NASA Astrophysics Data System (ADS)

    Yang, Guang; Ye, Xujiong; Slabaugh, Greg; Keegan, Jennifer; Mohiaddin, Raad; Firmin, David

    2016-03-01

    In this paper, we propose a novel self-learning based single-image super-resolution (SR) method, which is coupled with dual-tree complex wavelet transform (DTCWT) based denoising to better recover high-resolution (HR) medical images. Unlike previous methods, this self-learning based SR approach enables us to reconstruct HR medical images from a single low-resolution (LR) image without extra training on HR image datasets in advance. The relationships between the given image and its scaled down versions are modeled using support vector regression with sparse coding and dictionary learning, without explicitly assuming reoccurrence or self-similarity across image scales. In addition, we perform DTCWT based denoising to initialize the HR images at each scale instead of simple bicubic interpolation. We evaluate our method on a variety of medical images. Both quantitative and qualitative results show that the proposed approach outperforms bicubic interpolation and state-of-the-art single-image SR methods while effectively removing noise.

  13. The high resolution stereo camera (HRSC): acquisition of multi-spectral 3D-data and photogrammetric processing

    NASA Astrophysics Data System (ADS)

    Neukum, Gerhard; Jaumann, Ralf; Scholten, Frank; Gwinner, Klaus

    2017-11-01

    At the Institute of Space Sensor Technology and Planetary Exploration of the German Aerospace Center (DLR), the High Resolution Stereo Camera (HRSC) has been designed for international missions to the planet Mars. For more than three years, an airborne version of this camera, the HRSC-A, has been successfully applied in many flight campaigns and in a variety of different applications. It combines 3D capabilities and high resolution with multispectral data acquisition. Variable resolutions can be generated depending on the camera control settings. A high-end GPS/INS system, in combination with the multi-angle image information, yields precise, high-frequency orientation data for the acquired image lines. In order to handle these data, a completely automated photogrammetric processing system has been developed, which allows multispectral 3D image products to be generated for large areas with planimetric and height accuracies in the decimeter range. This accuracy has been confirmed by detailed investigations.

  14. Evaluation of snow modeling with Noah and Noah-MP land surface models in NCEP GFS/CFS system

    NASA Astrophysics Data System (ADS)

    Dong, J.; Ek, M. B.; Wei, H.; Meng, J.

    2017-12-01

    The land surface serves as the lower boundary forcing in the global forecast system (GFS) and climate forecast system (CFS), mediating interactions between the land and the atmosphere. Understanding the underlying land model physics is key to improving weather and seasonal prediction skill. Upgrades to land model physics (e.g., release of newer versions of a land model), different land initializations, changes in the parameterization schemes used in the land model (e.g., land physics parameterization options), and changes in how the land impact is handled (e.g., a physics ensemble approach) all make it necessary to re-run climate prediction experiments to examine their impact. The current NASA LIS (version 7) integrates NOAA operational land surface and hydrological models (NCEP's Noah, versions 2.7.1 to 3.6, and the future Noah-MP), high-resolution satellite and observational data, and land data assimilation tools. The newer versions of the Noah LSM used in operational models have a variety of enhancements compared with older versions, and Noah-MP allows for different physics parameterization options whose choice can have a large impact on the physical processes underlying seasonal predictions. These impacts need to be re-examined before implementation into NCEP operational systems. A set of offline numerical experiments driven by GFS forecast forcing has been conducted to evaluate snow modeling against daily Global Historical Climatology Network (GHCN) observations.

  15. 'To Trust or Not to Trust--Pupils' Ways of Judging Information Encountered in a Socio-Scientific Issue.

    ERIC Educational Resources Information Center

    Kolsto, Stein Dankert

    2001-01-01

    Describes a qualitative study in which 16-year-old Norwegian pupils dealt with a socio-scientific issue. Investigates aspects of students' decision-making concerning a local version of the well-known controversial issue of whether or not power transmission lines increase the risk for childhood leukemia. Some of the resolution strategies imply that…

  16. Knowledge-based image data management - An expert front-end for the BROWSE facility

    NASA Technical Reports Server (NTRS)

    Stoms, David M.; Star, Jeffrey L.; Estes, John E.

    1988-01-01

    An intelligent user interface being added to the NASA-sponsored BROWSE testbed facility is described. BROWSE is a prototype system designed to explore issues involved in locating image data in distributed archives and displaying low-resolution versions of that imagery at a local terminal. For prototyping, the initial application is the remote sensing of forest and range land.

  17. Simulation of the present-day climate with the climate model INMCM5

    NASA Astrophysics Data System (ADS)

    Volodin, E. M.; Mortikov, E. V.; Kostrykin, S. V.; Galin, V. Ya.; Lykossov, V. N.; Gritsun, A. S.; Diansky, N. A.; Gusev, A. V.; Iakovlev, N. G.

    2017-12-01

    In this paper we present the fifth generation of the INMCM climate model that is being developed at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INMCM5). The most important changes with respect to the previous version (INMCM4) were made in the atmospheric component of the model. Its vertical resolution was increased to resolve the upper stratosphere and the lower mesosphere. A more sophisticated parameterization of condensation and cloudiness formation was introduced as well. An aerosol module was incorporated into the model. The upgraded oceanic component has a modified dynamical core optimized for better implementation on parallel computers and has two times higher resolution in both horizontal directions. Analysis of the present-day climatology of the INMCM5 (based on the data of the historical run for 1979-2005) shows moderate improvements in the reproduction of basic circulation characteristics with respect to the previous version. Biases in the near-surface temperature and precipitation are slightly reduced compared with INMCM4, as are biases in oceanic temperature, salinity and sea surface height. The most notable improvement over INMCM4 is the capability of the new model to reproduce the equatorial stratospheric quasi-biennial oscillation and the statistics of sudden stratospheric warmings.

  18. GEOS-5 Seasonal Forecast System: ENSO Prediction Skill and Bias

    NASA Technical Reports Server (NTRS)

    Borovikov, Anna; Kovach, Robin; Marshak, Jelena

    2018-01-01

    The GEOS-5 AOGCM known as S2S-1.0 has been in service from June 2012 through January 2018 (Borovikov et al. 2017). The atmospheric component of S2S-1.0 is Fortuna-2.5, the same as that used for the Modern-Era Retrospective Analysis for Research and Applications (MERRA), but with adjusted parameterization of moist processes and turbulence. The ocean component is the Modular Ocean Model version 4 (MOM4). The sea ice component is the Community Ice CodE, version 4 (CICE). The land surface model is a catchment-based hydrological model coupled to a multi-layer snow model. The AGCM uses a Cartesian grid with a 1 deg × 1.25 deg horizontal resolution and 72 hybrid vertical levels with the uppermost level at 0.01 hPa. The nominal resolution of the OGCM tripolar grid is 1/2 deg, with a meridional equatorial refinement to 1/4 deg. In the coupled model initialization, selected atmospheric variables are constrained with MERRA. The Goddard Earth Observing System integrated Ocean Data Assimilation System (GEOS-iODAS) is used for both ocean state and sea ice initialization. SST, temperature and salinity profiles, and sea ice concentration were assimilated.

  19. Distributed MRI reconstruction using Gadgetron-based cloud computing.

    PubMed

    Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S

    2015-03-01

    To expand the open source Gadgetron reconstruction framework to support distributed computing and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and l1-SPIRiT reconstruction of nine high temporal resolution real-time, cardiac short-axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm³ isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed computing enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.

  20. Effect of Convection on the Tropical Tropopause Layer over the Tropical Americas

    NASA Technical Reports Server (NTRS)

    Pittman, Jasna; Robertson, Franklin

    2007-01-01

    Water vapor and ozone are the most important gases that regulate the radiative balance of the Tropical Tropopause Layer (TTL). Their radiative contribution dictates the height within the TTL and the rate at which air either ascends into the tropical stratosphere or subsides back into the tropical troposphere. The details of the mechanisms that control their concentration, however, are poorly understood. One such mechanism is convection that reaches into the TTL. In this study, we will present evidence from space-borne observations of the impact that convection has on water vapor, ozone, and temperature in the TTL over the Tropical Americas, where deep and overshooting convection have the highest frequency of occurrence in the tropics. We explore the effect of convective systems such as hurricanes during the 2005 season using version 1.5 data from the Microwave Limb Sounder (MLS) on Aura, and more recent tropical systems using the newly released version 2 data with higher vertical resolution. In order to provide the horizontal extent and the vertical structure of the convective systems, we use data from the Moderate Resolution Imaging Spectroradiometer (MODIS) on Aqua, the Microwave Humidity Sensor (MHS) on NOAA-18, and CloudSat when available.

  1. Resolution enhancement using a new multiple-pulse decoupling sequence for quadrupolar nuclei.

    PubMed

    Delevoye, L; Trébosc, J; Gan, Z; Montagne, L; Amoureux, J-P

    2007-05-01

    A new decoupling composite pulse sequence is proposed to remove the broadening on spin S=1/2 magic-angle spinning (MAS) spectra arising from the scalar coupling with a quadrupolar nucleus I. It is illustrated on the (31)P spectrum of an aluminophosphate, AlPO(4)-14, which is broadened by the presence of (27)Al/(31)P scalar couplings. The multiple-pulse (MP) sequence has the advantage over continuous-wave (CW) irradiation of efficiently annulling the scalar dephasing without reintroducing the dipolar interaction. The MP decoupling sequence is first described in a rotor-synchronised version (RS-MP) in which only one parameter needs to be adjusted. It clearly avoids the dipolar recoupling and thereby achieves better resolution than the CW sequence. In a second, improved version, the MP sequence is studied experimentally in the vicinity of the perfect rotor-synchronisation conditions. The full width at half maximum (FWHM) linewidth of 65 Hz with (27)Al CW decoupling decreases to 48 Hz with RS-MP decoupling and to 30 Hz with rotor-asynchronised MP (RA-MP) decoupling. The main phenomena are explained using both experimental results and numerical simulations.

  2. Rationale and study design of the Prospective comparison of Angiotensin Receptor neprilysin inhibitor with Angiotensin receptor blocker MEasuring arterial sTiffness in the eldERly (PARAMETER) study.

    PubMed

    Williams, Bryan; Cockcroft, John R; Kario, Kazuomi; Zappe, Dion H; Cardenas, Pamela; Hester, Allen; Brunel, Patrick; Zhang, Jack

    2014-02-04

    Hypertension in elderly people is characterised by elevated systolic blood pressure (SBP) and increased pulse pressure (PP), which indicate large artery ageing and stiffness. LCZ696, a first-in-class angiotensin receptor neprilysin inhibitor (ARNI), is being developed to treat hypertension and heart failure. The Prospective comparison of Angiotensin Receptor neprilysin inhibitor with Angiotensin receptor blocker MEasuring arterial sTiffness in the eldERly (PARAMETER) study will assess the efficacy of LCZ696 versus olmesartan on aortic stiffness and central aortic haemodynamics. In this 52-week multicentre study, patients with hypertension aged ≥60 years with a mean sitting (ms) SBP ≥150 to <180 mm Hg and a PP >60 mm Hg will be randomised to once-daily LCZ696 200 mg or olmesartan 20 mg for 4 weeks, followed by a forced titration to double the initial doses for the next 8 weeks. At 12-24 weeks, if the BP target has not been attained (msSBP <140 and ms diastolic BP <90 mm Hg), amlodipine (2.5-5 mg) and subsequently hydrochlorothiazide (6.25-25 mg) can be added. The primary and secondary endpoints are changes from baseline in central aortic systolic pressure (CASP) and central aortic PP (CAPP) at week 12, respectively. Other secondary endpoints are the changes in CASP and CAPP at week 52. A sample size of 432 randomised patients is estimated to ensure a power of 90% to assess the superiority of LCZ696 over olmesartan at week 12 in the change from baseline of mean CASP, assuming an SD of 19 mm Hg, a difference of 6.5 mm Hg and a 15% dropout rate. The primary variable will be analysed using a two-way analysis of covariance. The study was initiated in December 2012 and final results are expected in 2015. The results of this study will impact the design of future phase III studies assessing cardiovascular protection. EUDract number 2012-002899-14 and ClinicalTrials.gov NCT01692301.

  3. Data publication and sharing using the SciDrive service

    NASA Astrophysics Data System (ADS)

    Mishin, Dmitry; Medvedev, D.; Szalay, A. S.; Plante, R. L.

    2014-01-01

    Despite recent progress in scientific data storage, a public storage and sharing system for relatively small scientific datasets is still lacking. These are the collections forming the "long tail" of the power-law distribution of dataset sizes. The aggregated size of the long-tail data is comparable to the size of all data collections from large archives, and the value of the data is significant. The SciDrive project's main goal is to provide the scientific community with a place to reliably and freely store such data and make it accessible to the broad scientific community. The primary target audience of the project is the astronomy community, and it will be extended to other fields. We aim to create a simple way of publishing a dataset, which can then be shared with other people. The data owner controls the permissions to modify and access the data, and can assign a group of users or open access to everyone. The data contained in the dataset are automatically recognized by a background process. Known data formats are extracted according to the user's settings. Currently, tabular data can be automatically extracted to the user's MyDB table, where the user can make SQL queries against the dataset and merge it with other public CasJobs resources. Other data formats can be processed using a set of plugins that upload the data or metadata to user-defined side services. The current implementation targets some of the data formats commonly used by the astronomy community, including FITS, ASCII and Excel tables, TIFF images, and YT simulation data archives. Along with generic metadata, format-specific metadata are also processed. For example, basic information about celestial objects is extracted from FITS files and TIFF images, if present. A 100 TB implementation has just been put into production at Johns Hopkins University.
The system features a public data storage REST service supporting the VOSpace 2.0 and Dropbox protocols, an HTML5 web portal, a command-line client, and a standalone Java client that synchronizes a local folder with the remote storage. We use the VAO SSO (Single Sign-On) service from NCSA for user authentication, which provides free registration for everyone.

  4. Global Fiducials Program Imagery: New Opportunities for Geospatial Research, Outreach, and Education

    NASA Astrophysics Data System (ADS)

    Price, S. D.

    2012-12-01

    MOLNIA, Bruce F., PRICE, Susan D., and KING, Stephen E., U.S. Geological Survey (USGS), 562 National Center, Reston, VA 20192, sprice@usgs.gov The Civil Applications Committee (CAC), operated by the U.S. Geological Survey (USGS), is the Federal interagency committee that facilitates Federal civil agency access to U.S. National Systems space-based electro-optical (EO) imagery for natural disaster response; global change investigations; ecosystem monitoring; mapping, charting, and geodesy; and related topics. The CAC's Global Fiducials Program (GFP) has overseen the systematic collection of high-resolution imagery to provide geospatial data time series spanning a decade or more at carefully selected sites to study and monitor changes, and to facilitate a comprehensive understanding of dynamic and sensitive areas of our planet. Since 2008, more than 4,500 one-meter resolution EO images which comprise time series from 85 GFP sites have been released for unrestricted public use. Initial site selections were made by Federal and academic scientists based on each site's unique history, susceptibility, or environmental value. For each site, collection strategies were carefully defined to maximize information extraction capabilities. This consistency enhances our ability to understand Earth's dynamic processes and long-term trends. Individual time series focus on Arctic sea ice change; temperate glacier behavior; mid-continent wetland dynamics; barrier island response to hurricanes; coastline evolution; wildland fire recovery; Long-Term Ecological Research (LTER) site processes; and many other topics. The images are available from a USGS website at no cost, in an orthorectified GeoTIFF format with supporting metadata, making them ideal for use in Earth science education and GIS projects. New on-line tools provide enhanced analysis of this time-series imagery.
For additional information go to http://gfp.usgs.gov or http://gfl.usgs.gov. Bering Glacier is the largest and longest glacier in continental North America, with a length of 190 km, a width of 40 km, and an area of about 5,000 km2. In the nine years between the 1996 image and the 2005 image, parts of the terminus retreated by more than 5 km and thinned by as much as 100 m. Long-term monitoring of Bering Glacier will enable scientists to better understand the dynamics of surging glaciers as well as how the changing Alaska climate is affecting temperate glacier environments.

  5. Uncertainties associated to the representation of surface processes in impact studies. A study in the Mediterranean area.

    NASA Astrophysics Data System (ADS)

    Quéguiner, Solen; Martin, Eric; Lafont, Sébastien; Calvet, Jean-Christophe; Faroux, Stéphanie

    2010-05-01

    In the framework of assessing the impact of climate change, the uncertainty associated with the direct effect of CO2 on plant physiology has seldom been addressed, while other sources of uncertainty have been studied more, such as those related to climate modeling or the downscaling method. A few studies are available at the global or continental scale. The purpose of this study is to quantify this effect in a regional study focused on the Mediterranean area of France. The Safran-Isba-Modcou chain was used. This chain is composed of a meteorological analysis system (SAFRAN), a land surface model describing the exchanges with the atmosphere (ISBA) and a hydrogeological model (MODCOU), and has already been used in many studies in France. The present study focuses on the uncertainties related to the representation of the carbon cycle and photosynthesis in the surface model. Two versions of ISBA were used and compared. The standard version simulates the mass and energy exchanges between the continental surface (including vegetation and snow) and the atmosphere. In this version, the LAI (Leaf Area Index) is provided by the ECOCLIMAP2 database and the vegetation is divided into 12 types. The A-gs version accounts for the process of photosynthesis, taking into account the vegetation's assimilation of CO2 at the atmospheric concentration, and simulates the evolution of the biomass and the LAI. The domain studied is the French Mediterranean basin, in which a sub-domain was defined (latitude < 45°N and height < 1000 m) in order to identify the lowland area pertaining to a Mediterranean climate. The study focuses on the impact of climate change on the surface variables (LAI, water balance) and the discharges. The periods chosen to compare the changes are the end of the 20th century (1995-2005) and the end of the 21st century (2090-2099).
A first comparison is made for the present climate between the model versions and the observations of discharges, using two types of meteorological forcing: SAFRAN, and data from a continuous high-resolution climate scenario based on scenario A2 with a coupled atmosphere-Mediterranean Sea GCM. This scenario was further downscaled to the resolution of the study (a grid mesh of 8x8 km) using a quantile-quantile correction method. For the present climate, the comparison shows a delay in the development of the vegetation simulated by ISBA-A-gs, causing an underestimation of evaporation and an overestimation of discharges in the spring compared with the observations and the standard version of ISBA. In the future climate, the explicit response of vegetation to the CO2 concentration in the ISBA A-gs version gives a different answer for the surface water budget and flow than the standard version of ISBA. This difference is especially visible in the southern area, where the impact on the flow is increased and the impact on evaporation is decreased, showing the interest of using a CO2-responsive version of ISBA for impact studies.
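
    The quantile-quantile correction mentioned above is, in its simplest form, empirical quantile mapping: each scenario value is replaced by the observed value at the same empirical quantile of a reference period. A minimal NumPy sketch under that assumption; the synthetic "observed" and "model" samples below are illustrative, not the study's data:

```python
import numpy as np

def quantile_map(model_values, model_ref, obs_ref, n_quantiles=100):
    """Empirical quantile mapping: map each model value to the observed
    value occupying the same quantile in the reference period."""
    q = np.linspace(0.0, 1.0, n_quantiles)
    model_q = np.quantile(model_ref, q)  # model reference quantiles
    obs_q = np.quantile(obs_ref, q)      # observed reference quantiles
    # Interpolate: model value -> quantile position -> observed value
    return np.interp(model_values, model_q, obs_q)

rng = np.random.default_rng(42)
obs = rng.gamma(2.0, 2.0, size=5000)  # synthetic "observed" reference sample
model = obs * 1.3 + 0.5               # same sample with a systematic bias
corrected = quantile_map(model, model, obs)
```

    For future-climate use, the same mapping built on the reference period is applied to the scenario values, which preserves the simulated change signal while removing the systematic bias of the climate model.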

  6. File Specification for the MERRA Aerosol Reanalysis (MERRAero): MODIS AOD Assimilation based on a MERRA Replay

    NASA Technical Reports Server (NTRS)

    Da Silva, A. M.; Randles, C. A.; Buchard, V.; Darmenov, A.; Colarco, P. R.; Govindaraju, R.

    2015-01-01

    This document describes the gridded output files produced by the Goddard Earth Observing System version 5 (GEOS-5) Goddard Aerosol Assimilation System (GAAS) from July 2002 through December 2014. The MERRA Aerosol Reanalysis (MERRAero) is produced with the hydrostatic version of the GEOS-5 Atmospheric Global Climate Model (AGCM). In addition to standard meteorological parameters (wind, temperature, moisture, surface pressure), this simulation includes 15 aerosol tracers (dust, sea-salt, sulfate, black and organic carbon), ozone, carbon monoxide and carbon dioxide. This model simulation is driven by prescribed sea-surface temperature and sea-ice, daily volcanic and biomass burning emissions, as well as high-resolution inventories of anthropogenic emission sources. Meteorology is replayed from the MERRA Reanalysis.

  7. MODTRAN6: a major upgrade of the MODTRAN radiative transfer code

    NASA Astrophysics Data System (ADS)

    Berk, Alexander; Conforti, Patrick; Kennett, Rosemary; Perkins, Timothy; Hawes, Frederick; van den Bosch, Jeannette

    2014-06-01

The MODTRAN6 radiative transfer (RT) code is a major advancement over earlier versions of the MODTRAN atmospheric transmittance and radiance model. This version of the code incorporates modern software architecture including an application programming interface, enhanced physics features including a line-by-line algorithm, a supplementary physics toolkit, and new documentation. The application programming interface has been developed for ease of integration into user applications. The MODTRAN code has been restructured towards a modular, object-oriented architecture to simplify upgrades as well as facilitate integration with other developers' codes. MODTRAN now includes a line-by-line algorithm for high resolution RT calculations as well as coupling to optical scattering codes for easy implementation of custom aerosols and clouds.

  8. Performance evaluation of D-SPECT: a novel SPECT system for nuclear cardiology

    NASA Astrophysics Data System (ADS)

    Erlandsson, Kjell; Kacperski, Krzysztof; van Gramberg, Dean; Hutton, Brian F.

    2009-05-01

    D-SPECT (Spectrum Dynamics, Israel) is a novel SPECT system for cardiac perfusion studies. Based on CZT detectors, region-centric scanning, high-sensitivity collimators and resolution recovery, it offers potential advantages over conventional systems. A series of measurements were made on a β-version D-SPECT system in order to evaluate its performance in terms of energy resolution, scatter fraction, sensitivity, count rate capability and resolution. Corresponding measurements were also done on a conventional SPECT system (CS) for comparison. The energy resolution of the D-SPECT system at 140 keV was 5.5% (CS: 9.25%), the scatter fraction 30% (CS: 34%), the planar sensitivity 398 s-1 MBq-1 per head (99mTc, 10 cm) (CS: 72 s-1 MBq-1), and the tomographic sensitivity in the heart region was in the range 647-1107 s-1 MBq-1 (CS: 141 s-1 MBq-1). The count rate increased linearly with increasing activity up to 1.44 M s-1. The intrinsic resolution was equal to the pixel size, 2.46 mm (CS: 3.8 mm). The average reconstructed resolution using the standard clinical filter was 12.5 mm (CS: 13.7 mm). The D-SPECT has superior sensitivity to that of a conventional system with similar spatial resolution. It also has excellent energy resolution and count rate characteristics, which should prove useful in dynamic and dual radionuclide studies.

  9. Diagnostic accuracy of different display types in detection of recurrent caries under restorations by using CBCT.

    PubMed

    Baltacıoĝlu, İsmail H; Eren, Hakan; Yavuz, Yasemin; Kamburoğlu, Kıvanç

To assess the in vitro diagnostic ability of CBCT images using seven different display types in the detection of recurrent caries. Our study comprised 128 extracted human premolar and molar teeth. 8 groups each containing 16 teeth were obtained as follows: (1) Black Class I (Occlusal) amalgam filling without caries; (2) Black Class I (Occlusal) composite filling without caries; (3) Black Class II (Proximal) amalgam filling without caries; (4) Black Class II (Proximal) composite filling without caries; (5) Black Class I (Occlusal) amalgam filling with caries; (6) Black Class I (Occlusal) composite filling with caries; (7) Black Class II (Proximal) amalgam filling with caries; and (8) Black Class II (Proximal) composite filling with caries. Teeth were imaged using a 100 × 90 mm field of view at three different voxel sizes of a CBCT unit (Planmeca ProMax® 3D ProFace™; Planmeca, Helsinki, Finland). CBCT TIFF images were opened and viewed using custom-designed software for computers on different display types. Intra- and interobserver agreements were calculated. The highest area under the receiver operating characteristic curve (Az) values for each image type, observer, reading and restoration were compared using z-tests against Az = 0.5. The significance level was set at p = 0.05. We found poor and moderate agreements. In general, higher Az values were found when software and a medical diagnostic monitor were utilized. For Observer 2, Az values were statistically significantly higher when software was used on a medical monitor [p = 0.036, p = 0.015 and p = 0.002 for normal-resolution mode (0.200 mm3 voxel size), high-resolution mode (0.150 mm3 voxel size) and low-resolution mode (0.400 mm3 voxel size), respectively]. No statistically significant differences were found among other display types for all modes (p > 0.05). In general, no difference was found among the 3 different voxel sizes (p > 0.05).
In general, higher Az values were obtained for composite restorations than for amalgam restorations for all observers. For Observer 1, Az values for composite restorations were statistically significantly higher than those of amalgam restorations for MacBook and iPhone (Apple Inc., Cupertino, CA) assessments (p = 0.002 and p = 0.048, respectively). Higher Az values were observed with medical monitors when used with dedicated software compared to other display types which performed similarly in the diagnosis of recurrent caries under restorations. In addition, observers performed better in detection of recurrent caries when assessing composite restorations than amalgams.
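
    The z-test named above compares an observer's Az against the chance value of 0.5. A minimal sketch of that computation, assuming the Az is obtained as a Mann-Whitney rank statistic and its standard error from the Hanley-McNeil approximation; the scores below are synthetic, not the study's data:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    sound  = rng.normal(2.0, 1.0, 16)   # observer scores, 16 teeth without caries
    caries = rng.normal(3.5, 1.0, 16)   # observer scores, 16 teeth with caries

    # Az = P(score for a carious tooth > score for a sound tooth), ties counted half
    az = (np.mean(caries[:, None] > sound[None, :])
          + 0.5 * np.mean(caries[:, None] == sound[None, :]))

    # Hanley-McNeil approximation to the standard error of the ROC area
    n1, n2 = len(caries), len(sound)
    q1, q2 = az / (2.0 - az), 2.0 * az**2 / (1.0 + az)
    se = np.sqrt((az * (1 - az) + (n1 - 1) * (q1 - az**2)
                  + (n2 - 1) * (q2 - az**2)) / (n1 * n2))

    z = (az - 0.5) / se   # |z| > 1.96 rejects Az = 0.5 at p = 0.05
    ```

    With 16 teeth per group, as in each study subgroup, even a moderate score separation yields a z well above 1.96.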

  10. Diagnostic accuracy of different display types in detection of recurrent caries under restorations by using CBCT

    PubMed Central

    Baltacıoĝlu, İsmail H; Eren, Hakan; Yavuz, Yasemin

    2016-01-01

Objectives: To assess the in vitro diagnostic ability of CBCT images using seven different display types in the detection of recurrent caries. Methods: Our study comprised 128 extracted human premolar and molar teeth. 8 groups each containing 16 teeth were obtained as follows: (1) Black Class I (Occlusal) amalgam filling without caries; (2) Black Class I (Occlusal) composite filling without caries; (3) Black Class II (Proximal) amalgam filling without caries; (4) Black Class II (Proximal) composite filling without caries; (5) Black Class I (Occlusal) amalgam filling with caries; (6) Black Class I (Occlusal) composite filling with caries; (7) Black Class II (Proximal) amalgam filling with caries; and (8) Black Class II (Proximal) composite filling with caries. Teeth were imaged using a 100 × 90 mm field of view at three different voxel sizes of a CBCT unit (Planmeca ProMax® 3D ProFace™; Planmeca, Helsinki, Finland). CBCT TIFF images were opened and viewed using custom-designed software for computers on different display types. Intra- and interobserver agreements were calculated. The highest area under the receiver operating characteristic curve (Az) values for each image type, observer, reading and restoration were compared using z-tests against Az = 0.5. The significance level was set at p = 0.05. Results: We found poor and moderate agreements. In general, higher Az values were found when software and a medical diagnostic monitor were utilized. For Observer 2, Az values were statistically significantly higher when software was used on a medical monitor [p = 0.036, p = 0.015 and p = 0.002 for normal-resolution mode (0.200 mm3 voxel size), high-resolution mode (0.150 mm3 voxel size) and low-resolution mode (0.400 mm3 voxel size), respectively]. No statistically significant differences were found among other display types for all modes (p > 0.05). In general, no difference was found among the 3 different voxel sizes (p > 0.05).
In general, higher Az values were obtained for composite restorations than for amalgam restorations for all observers. For Observer 1, Az values for composite restorations were statistically significantly higher than those of amalgam restorations for MacBook and iPhone (Apple Inc., Cupertino, CA) assessments (p = 0.002 and p = 0.048, respectively). Conclusions: Higher Az values were observed with medical monitors when used with dedicated software compared to other display types which performed similarly in the diagnosis of recurrent caries under restorations. In addition, observers performed better in detection of recurrent caries when assessing composite restorations than amalgams. PMID:27319604

  11. Break-In, Performance, and Endurance Tests Results on Fixed Displacement Hydraulic Fluid Power Vane Pumps.

    DTIC Science & Technology

    1982-07-15

[Scanned test-data tables; OCR residue largely unrecoverable. Legible figures: maximum mechanical efficiency 91.22%, minimum mechanical efficiency 90.17%, average mechanical efficiency 90.68%.]

  12. Software for hyperspectral, joint photographic experts group (.JPG), portable network graphics (.PNG) and tagged image file format (.TIFF) segmentation

    NASA Astrophysics Data System (ADS)

    Bruno, L. S.; Rodrigo, B. P.; Lucio, A. de C. Jorge

    2016-10-01

This paper presents a system that applies a Multilayer Perceptron neural network to the segmentation of drone-acquired agricultural images. The application allows a supervising user to train the classes that will later be interpreted by the neural network. These classes are generated manually, with pre-selected attributes, in the application. After attribute selection, a segmentation process extracts the relevant information for different types of images, RGB or hyperspectral. The application can extract geographical coordinates from the image metadata, georeferencing all pixels in the image. Despite the excessive memory consumption of regions of interest in hyperspectral images, it is possible to perform segmentation using bands chosen by the user, which can be combined in different ways to obtain different results.
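
    The supervised pixel-classification idea can be sketched as follows. This is an illustrative toy (synthetic two-class pixel samples and a hand-rolled single-hidden-layer perceptron), not the authors' software:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic labelled training pixels (hypothetical classes, for illustration):
    # class 0 = bare soil (brownish), class 1 = vegetation (greenish), RGB in [0, 1]
    soil = rng.normal([0.4, 0.3, 0.2], 0.05, size=(200, 3))
    veg  = rng.normal([0.2, 0.6, 0.2], 0.05, size=(200, 3))
    X = np.vstack([soil, veg])
    y = np.repeat([0.0, 1.0], 200)

    # One hidden layer of 8 sigmoid units, trained by plain gradient descent
    sig = lambda t: 1.0 / (1.0 + np.exp(-t))
    W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
    W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

    for _ in range(3000):
        h = sig(X @ W1 + b1)                 # hidden activations
        p = sig(h @ W2 + b2).ravel()         # predicted P(class 1)
        g = ((p - y) / len(y))[:, None]      # cross-entropy gradient w.r.t. logit
        gh = (g @ W2.T) * h * (1.0 - h)      # backpropagated hidden gradient
        W2 -= 2.0 * (h.T @ g); b2 -= 2.0 * g.sum(0)
        W1 -= 2.0 * (X.T @ gh); b1 -= 2.0 * gh.sum(0)

    pred = (sig(sig(X @ W1 + b1) @ W2 + b2).ravel() > 0.5)
    accuracy = float((pred == (y > 0.5)).mean())
    ```

    In the application described above, the same trained network would then be evaluated per pixel over a whole image, with features drawn from RGB or user-selected hyperspectral bands.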

  13. A software platform for the analysis of dermatology images

    NASA Astrophysics Data System (ADS)

    Vlassi, Maria; Mavraganis, Vlasios; Asvestas, Panteleimon

    2017-11-01

The purpose of this paper is to present a software platform, developed in the Python programming environment, that can be used for the processing and analysis of dermatology images. The platform provides the capability to read a file that contains a dermatology image, and supports image formats such as Windows bitmaps, JPEG, JPEG2000, portable network graphics and TIFF. Furthermore, it provides suitable tools for selecting, either manually or automatically, a region of interest (ROI) on the image. The automated selection of a ROI includes filtering for smoothing the image, followed by thresholding. The proposed software platform has a friendly and clear graphical user interface and could be a useful second-opinion tool for a dermatologist. Furthermore, it could be used to classify images from other anatomical parts, such as breast or lung, after proper re-training of the classification algorithms.
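
    A minimal sketch of the automated ROI pipeline described above (smoothing followed by thresholding), in the platform's own language, Python. The synthetic lesion image, the 3x3 mean filter and the threshold rule are illustrative assumptions, not the platform's actual code:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    img = rng.normal(0.3, 0.05, (64, 64))   # synthetic skin background
    img[20:40, 20:40] += 0.4                # synthetic lesion patch

    # Smoothing step: 3x3 mean filter via edge-padded shifted sums
    pad = np.pad(img, 1, mode="edge")
    smooth = sum(pad[i:i + 64, j:j + 64] for i in range(3) for j in range(3)) / 9.0

    # Thresholding step: keep pixels well above the smoothed global mean
    mask = smooth > smooth.mean() + 0.2
    roi_pixels = int(mask.sum())
    ```

    The resulting boolean mask plays the role of the automatically selected ROI; a manual ROI would instead come from user-drawn contours.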

  14. Digital seismic-reflection data from western Rhode Island Sound, 1980

    USGS Publications Warehouse

    McMullen, K.Y.; Poppe, L.J.; Soderberg, N.K.

    2009-01-01

    During 1980, the U.S. Geological Survey (USGS) conducted a seismic-reflection survey in western Rhode Island Sound aboard the Research Vessel Neecho. Data from this survey were recorded in analog form and archived at the USGS Woods Hole Science Center's Data Library. Due to recent interest in the geology of Rhode Island Sound and in an effort to make the data more readily accessible while preserving the original paper records, the seismic data from this cruise were scanned and converted to Tagged Image File Format (TIFF) images and SEG-Y data files. Navigation data were converted from U.S. Coast Guard Long Range Aids to Navigation (LORAN-C) time delays to latitudes and longitudes, which are available in Environmental Systems Research Institute, Inc. (ESRI) shapefile format and as eastings and northings in space-delimited text format.

  15. Detailed Hydraulic Assessment Using a High-Resolution Piezocone Coupled to the GeoVIS

    DTIC Science & Technology

    2008-07-01

    statistical means can be performed. • Repeat K comparisons at a highly permeable site. The piezocone is capable of estimating K in soils of higher...Version 1, March 2006, 131 pp. Ferritto, J.M., 1997. Seismic Design Criteria for Soil Liquefaction , NFESC Technical Report TR-2077-SHR, June, 1997... Soil Type and Well Design Logs................................................................... 20 Figure 6. Interpolated Three-Dimensional Head

  16. U. S. GODAE: Global Ocean Prediction with the HYbrid Coordinate Ocean Model

    DTIC Science & Technology

    2009-01-01

    2008). There are three major contributors to the strength of the Gulf Stream, (1) the wind forcing, (2) the Atlantic meridional overturning ...Smith, 2007. Resolution convergence and sensitivity studies with North Atlantic circulation models. Part I. The western boundary current system...σ-z coordinates, and (3) a baroclinic version of ADvanced CIRCulation (ADCIRC), the latter an unstructured grid model for baroclinic coastal

  17. META-X Design Flow Tools

    DTIC Science & Technology

    2013-04-01

    Forces can be computed at specific angular positions, and geometrical parameters can be evaluated. Much higher resolution models are required, along...composition engines (C#, C++, Python, Java ) Desert operates on the CyPhy model, converting from a design space alternative structure to a set of design...consists of scripts to execute dymola, post-processing of results to create metrics, and general management of the job sequence. An earlier version created

  18. Taking Wave Prediction to New Levels: Wavewatch 3

    DTIC Science & Technology

    2016-01-01

    features such as surf and rip currents , conditions that affect special operations, amphibious assaults, and logistics over the shore. Changes in...The Navy’s current version of WAVEWATCH Ill features the capability of operating with gridded domains of multiple resolution simultaneously, ranging...Netherlands. Its current form, WAVEWATCH Ill, was developed at NOAA’s National Center for Environmental Prediction. The model is free and open source

  19. NextGen Avionics Roadmap, Version 1.2

    DTIC Science & Technology

    2010-09-21

    Based Aviation Rulemaking Com- mittee ( PARC ). This document is aimed at bring- ing these different proposed changes together into one perspective so...In capabili- ties will allow greater throughput at non- radar , non-tow- ered airports, increasing safety and efficiencies for general aviation (GA...integrity and resolution terrain databases to reduce Controlled Flight into Terrain (CFIT). ADS-B increases surveillance areas beyond today’s radar

  20. Evaluation of a 12-km Satellite-Era Reanalysis of Surface Mass Balance for the Greenland Ice Sheet

    NASA Astrophysics Data System (ADS)

    Cullather, R. I.; Nowicki, S.; Zhao, B.; Max, S.

    2016-12-01

The recent contribution to sea level change from the Greenland Ice Sheet is thought to be strongly driven by surface processes including melt and runoff. Global reanalyses are a potential means of reconstructing the historical time series of ice sheet surface mass balance (SMB), but lack the spatial resolution needed to resolve ablation areas along the periphery of the ice sheet. In this work, the Modern-Era Retrospective analysis for Research and Applications, version 2 (MERRA-2) is used to examine the spatial and temporal variability of surface melt over the Greenland Ice Sheet. MERRA-2 is produced for the period 1980 to the present at a grid spacing of ½° latitude by ⅝° longitude, and includes snow hydrology processes: compaction, meltwater percolation and refreezing, runoff, and a prognostic surface albedo. The configuration of the MERRA-2 system allows the background model - the Goddard Earth Observing System model, version 5 (GEOS-5) - to be carried in phase space through analyzed states via the computation of analysis increments, a capability referred to as "replay". Here, a MERRA-2 replay integration is conducted in which atmospheric forcing fields are interpolated and adjusted to sub-atmospheric-grid-scale resolution. These adjustments include lapse-rate effects on temperature, humidity, precipitation, and other atmospheric variables that are known to have a strong elevation dependency over ice sheets. The surface coupling is performed such that mass and energy are conserved. The atmospheric forcing influences the surface representation, which operates on land surface tiles with an approximate 12-km spacing. This produces a high-resolution, downscaled SMB which is interactively coupled to the reanalysis model. We compare the downscaled SMB product with other reanalyses, regional climate model values, and a second MERRA-2 replay in which the background model has been replaced with a 12-km, non-hydrostatic version of GEOS-5.
The assessment focuses on regional changes in SMB and SMB components, the identification of changes and temporal variability in the SMB equilibrium line, and the relation between SMB and other climate variables related to general circulation.
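
    One of the elevation adjustments described above, the lapse-rate correction of temperature to tile elevation, can be sketched as follows. The constant 6.5 K/km rate and the tile elevations are illustrative assumptions; MERRA-2's actual adjustment is more involved:

    ```python
    import numpy as np

    LAPSE_RATE = 6.5e-3   # K per metre (standard-atmosphere value, an assumption here)

    def downscale_temperature(t_coarse, z_coarse, z_tile):
        """Shift a coarse-grid temperature to tile elevation with a fixed lapse rate."""
        return t_coarse - LAPSE_RATE * (z_tile - z_coarse)

    # A coarse cell at 2000 m containing land surface tiles from 1500 m to 2500 m:
    z_tiles = np.array([1500.0, 2000.0, 2500.0])
    t_tiles = downscale_temperature(263.0, 2000.0, z_tiles)
    ```

    Tiles below the coarse-cell elevation come out warmer and tiles above it colder, which is what lets the downscaled forcing resolve ablation areas along the ice sheet margin.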

  1. Whole-brain background-suppressed pCASL MRI with 1D-accelerated 3D RARE Stack-Of-Spirals readout

    PubMed Central

    Vidorreta, Marta; Wang, Ze; Chang, Yulin V.; Wolk, David A.; Fernández-Seara, María A.; Detre, John A.

    2017-01-01

    Arterial Spin Labeled (ASL) perfusion MRI enables non-invasive, quantitative measurements of tissue perfusion, and has a broad range of applications including brain functional imaging. However, ASL suffers from low signal-to-noise ratio (SNR), limiting image resolution. Acquisitions using 3D readouts are optimal for background-suppression of static signals, but can be SAR intensive and typically suffer from through-plane blurring. In this study, we investigated the use of accelerated 3D readouts to obtain whole-brain, high-SNR ASL perfusion maps and reduce SAR deposition. Parallel imaging was implemented along the partition-encoding direction in a pseudo-continuous ASL sequence with background-suppression and 3D RARE Stack-Of-Spirals readout, and its performance was evaluated in three small cohorts. First, both non-accelerated and two-fold accelerated single-shot versions of the sequence were evaluated in healthy volunteers during a motor-photic task, and the performance was compared in terms of temporal SNR, GM-WM contrast, and statistical significance of the detected activation. Secondly, single-shot 1D-accelerated imaging was compared to a two-shot accelerated version to assess benefits of SNR and spatial resolution for applications in which temporal resolution is not paramount. Third, the efficacy of this approach in clinical populations was assessed by applying the single-shot 1D-accelerated version to a larger cohort of elderly volunteers. Accelerated data demonstrated the ability to detect functional activation at the subject level, including cerebellar activity, without loss in the perfusion signal temporal stability and the statistical power of the activations. The use of acceleration also resulted in increased GM-WM contrast, likely due to reduced through-plane partial volume effects, that were further attenuated with the use of two-shot readouts. 
In a clinical cohort, image quality remained excellent, and expected effects of age and sex on cerebral blood flow could be detected. The sequence is freely available upon request for academic use and could benefit a broad range of cognitive and clinical neuroscience research. PMID:28837640

  2. Low cost paths to binary optics

    NASA Technical Reports Server (NTRS)

    Nelson, Arthur; Domash, Lawrence

    1993-01-01

Application of binary optics has been limited to a few major laboratories because of the limited availability of fabrication facilities such as e-beam machines and the lack of standardized design software. Foster-Miller has attempted to identify low cost approaches to medium-resolution binary optics using readily available computer and fabrication tools, primarily for the use of students and experimenters in optical computing. An early version of our system, MacBEEP, made use of an optimized laser film recorder from the commercial typesetting industry with 10 micron resolution. This report is an update on our current efforts to design and build a second generation MacBEEP, which aims at 1 micron resolution and multiple phase levels. Trials included a low cost scanning electron microscope in microlithography mode, and alternative laser inscribers or photomask generators. Our current software approach is based on Mathematica and PostScript compatibility.

  3. High resolution regional climate simulation of the Hawaiian Islands - Validation of the historical run from 2003 to 2012

    NASA Astrophysics Data System (ADS)

    Xue, L.; Newman, A. J.; Ikeda, K.; Rasmussen, R.; Clark, M. P.; Monaghan, A. J.

    2016-12-01

A high-resolution (a 1.5 km grid spacing domain nested within a 4.5 km grid spacing domain) 10-year regional climate simulation over the entire Hawaiian archipelago is being conducted at the National Center for Atmospheric Research (NCAR) using the Weather Research and Forecasting (WRF) model version 3.7.1. Numerical sensitivity simulations of the Hawaiian Rainband Project (HaRP, a field experiment from July to August in 1990) showed that the simulated precipitation properties are sensitive to initial and lateral boundary conditions, sea surface temperature (SST), land surface models, vertical resolution and cloud droplet concentration. Validations of model-simulated statistics of the trade wind inversion, temperature, wind field, cloud cover, and precipitation over the islands against various observations from soundings, satellites, weather stations and rain gauges during the period from 2003 to 2012 will be presented at the meeting.

  4. Fiber-optic extrinsic Fabry-Perot interferometer strain sensor with <50 pm displacement resolution using three-wavelength digital phase demodulation.

    PubMed

    Schmidt, M; Werther, B; Fuerstenau, N; Matthias, M; Melz, T

    2001-04-09

A fiber-optic extrinsic Fabry-Perot interferometer strain sensor (EFPI-S) of ls = 2.5 cm sensor length using three-wavelength digital phase demodulation is demonstrated to exhibit <50 pm displacement resolution (<2 nm/m strain resolution) when measuring the cross expansion of a PZT-ceramic plate. The sensing (single-mode downlead) and reflecting fibers are fused into a 150/360 µm capillary fiber, where the fusion points define the sensor length. Readout is performed using an improved version of the previously described three-wavelength digital phase demodulation method employing an arctan phase-stepping algorithm. In the recent experiments the strain sensitivity was varied via the mapping of the arctan lookup table to the 16-bit DA-converter range, from 188.25 k /V (6 Volt range 1130 k ) to 11.7 k /Volt (range 70 k ).
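
    The arctan phase-stepping idea can be illustrated with the generic three-step phase-shifting identity (phase offsets of 0, 2π/3 and 4π/3); the sensor's actual three-wavelength scheme and its calibration are not reproduced here:

    ```python
    import numpy as np

    phi_true = np.linspace(-3.0, 3.0, 201)   # interferometric phase (rad)
    A, B = 1.0, 0.5                          # mean intensity and fringe contrast

    # Three intensity samples of the same fringe, phase-stepped by 2*pi/3
    I1 = A + B * np.cos(phi_true)
    I2 = A + B * np.cos(phi_true + 2 * np.pi / 3)
    I3 = A + B * np.cos(phi_true + 4 * np.pi / 3)

    # Arctan demodulation: sine and cosine quadratures from the three samples
    phi_rec = np.arctan2(np.sqrt(3.0) * (I3 - I2), 2 * I1 - I2 - I3)
    max_err = float(np.max(np.abs(phi_rec - phi_true)))
    ```

    The arctan2 recovery is exact up to the usual 2π ambiguity, which is why practical readouts combine it with a lookup table and phase unwrapping.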

  5. Efficient methods for implementation of multi-level nonrigid mass-preserving image registration on GPUs and multi-threaded CPUs.

    PubMed

    Ellingwood, Nathan D; Yin, Youbing; Smith, Matthew; Lin, Ching-Long

    2016-04-01

    Faster and more accurate methods for registration of images are important for research involved in conducting population-based studies that utilize medical imaging, as well as improvements for use in clinical applications. We present a novel computation- and memory-efficient multi-level method on graphics processing units (GPU) for performing registration of two computed tomography (CT) volumetric lung images. We developed a computation- and memory-efficient Diffeomorphic Multi-level B-Spline Transform Composite (DMTC) method to implement nonrigid mass-preserving registration of two CT lung images on GPU. The framework consists of a hierarchy of B-Spline control grids of increasing resolution. A similarity criterion known as the sum of squared tissue volume difference (SSTVD) was adopted to preserve lung tissue mass. The use of SSTVD consists of the calculation of the tissue volume, the Jacobian, and their derivatives, which makes its implementation on GPU challenging due to memory constraints. The use of the DMTC method enabled reduced computation and memory storage of variables with minimal communication between GPU and Central Processing Unit (CPU) due to ability to pre-compute values. The method was assessed on six healthy human subjects. Resultant GPU-generated displacement fields were compared against the previously validated CPU counterpart fields, showing good agreement with an average normalized root mean square error (nRMS) of 0.044±0.015. Runtime and performance speedup are compared between single-threaded CPU, multi-threaded CPU, and GPU algorithms. Best performance speedup occurs at the highest resolution in the GPU implementation for the SSTVD cost and cost gradient computations, with a speedup of 112 times that of the single-threaded CPU version and 11 times over the twelve-threaded version when considering average time per iteration using a Nvidia Tesla K20X GPU. 
The proposed GPU-based DMTC method outperforms its multi-threaded CPU version in terms of runtime. Total registration runtime was reduced to 2.9 min with the GPU version, compared with 12.8 min for the twelve-threaded CPU version and 112.5 min for a single-threaded CPU. Furthermore, the GPU implementation discussed in this work can be adapted to other cost functions that require calculation of first derivatives. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
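
    A hedged sketch of the SSTVD similarity criterion named above: the summed squared difference in per-voxel tissue volume, with tissue fraction estimated from CT intensity using the conventional air and tissue CT numbers. The images, voxel volumes and clipping below are illustrative assumptions, not the paper's implementation:

    ```python
    import numpy as np

    HU_AIR, HU_TISSUE = -1000.0, 55.0   # conventional CT numbers for air and tissue

    def tissue_fraction(hu):
        """Fraction of a voxel occupied by tissue, estimated from its CT number."""
        return np.clip((hu - HU_AIR) / (HU_TISSUE - HU_AIR), 0.0, 1.0)

    rng = np.random.default_rng(2)
    img1 = rng.uniform(-950.0, -500.0, (8, 8, 8))    # fixed lung CT block (HU)
    img2 = img1 + rng.normal(0.0, 10.0, img1.shape)  # stand-in for the warped image
    v1 = v2 = 1.0                                    # voxel volumes (mm^3)

    # SSTVD cost: squared difference of per-voxel tissue volumes, summed
    sstvd = float(np.sum((v1 * tissue_fraction(img1)
                          - v2 * tissue_fraction(img2)) ** 2))
    ```

    In the actual registration, img2 would be the moving image resampled through the B-spline transform and v2 the Jacobian-scaled voxel volume, which is what makes the criterion mass-preserving.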

  6. Dual Resolution Images from Paired Fingerprint Cards

    National Institute of Standards and Technology Data Gateway

    NIST Dual Resolution Images from Paired Fingerprint Cards (Web, free access)   NIST Special Database 30 is being distributed for use in development and testing of fingerprint compression and fingerprint matching systems. The database allows the user to develop and evaluate data compression algorithms for fingerprint images scanned at both 19.7 ppmm (500 dpi) and 39.4 ppmm (1000 dpi). The data consist of 36 ten-print paired cards with both the rolled and plain images scanned at 19.7 and 39.4 pixels per mm. A newer version of the compression/decompression software on the CDROM can be found at the website http://www.nist.gov/itl/iad/ig/nigos.cfm as part of the NBIS package.
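
    The two scan resolutions quoted are the same measurements in different units; a quick check of the ppmm-to-dpi conversion:

    ```python
    # 19.7 and 39.4 pixels per mm, at 25.4 mm per inch, give the quoted dpi values
    MM_PER_INCH = 25.4
    dpi_500 = 19.7 * MM_PER_INCH    # close to 500 dpi
    dpi_1000 = 39.4 * MM_PER_INCH   # close to 1000 dpi
    ```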

  7. Hubble Space Telescope faint object camera instrument handbook (Post-COSTAR), version 5.0

    NASA Technical Reports Server (NTRS)

    Nota, A. (Editor); Jedrzejewski, R. (Editor); Greenfield, P. (Editor); Hack, W. (Editor)

    1994-01-01

    The faint object camera (FOC) is a long-focal-ratio, photon-counting device capable of taking high-resolution two-dimensional images of the sky up to 14 by 14 arc seconds squared in size with pixel dimensions as small as 0.014 by 0.014 arc seconds squared in the 1150 to 6500 A wavelength range. Its performance approaches that of an ideal imaging system at low light levels. The FOC is the only instrument on board the Hubble Space Telescope (HST) to fully use the spatial resolution capabilities of the optical telescope assembly (OTA) and is one of the European Space Agency's contributions to the HST program.

  8. A numerical study of the axisymmetric Couette-Taylor problem using a fast high-resolution second-order central scheme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kupferman, R.

    The author presents a numerical study of the axisymmetric Couette-Taylor problem using a finite difference scheme. The scheme is based on a staggered version of a second-order central-differencing method combined with a discrete Hodge projection. The use of central-differencing operators obviates the need to trace the characteristic flow associated with the hyperbolic terms. The result is a simple and efficient scheme which is readily adaptable to other geometries and to more complicated flows. The scheme exhibits competitive performance in terms of accuracy, resolution, and robustness. The numerical results agree accurately with linear stability theory and with previous numerical studies.
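
    The second-order central-differencing operator at the heart of the scheme can be illustrated directly (the staggered grid and discrete Hodge projection are not reproduced here); refining the grid by 2x should reduce the truncation error by roughly 4x:

    ```python
    import numpy as np

    def central_diff(f, h):
        """Second-order central difference du/dx on interior grid points."""
        return (f[2:] - f[:-2]) / (2.0 * h)

    # Differentiate sin(x) on two grids and compare with the exact cos(x)
    x_coarse = np.linspace(0.0, 1.0, 51);  h1 = x_coarse[1] - x_coarse[0]
    x_fine   = np.linspace(0.0, 1.0, 101); h2 = x_fine[1] - x_fine[0]

    err1 = np.max(np.abs(central_diff(np.sin(x_coarse), h1) - np.cos(x_coarse[1:-1])))
    err2 = np.max(np.abs(central_diff(np.sin(x_fine),   h2) - np.cos(x_fine[1:-1])))
    order = float(np.log2(err1 / err2))   # near 2 for a second-order scheme
    ```

    The same stencil applied to the hyperbolic terms is what lets the scheme avoid characteristic tracing altogether.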

  9. Conservative classical and quantum resolution limits for incoherent imaging

    NASA Astrophysics Data System (ADS)

    Tsang, Mankei

    2018-06-01

I propose classical and quantum limits to the statistical resolution of two incoherent optical point sources from the perspective of minimax parameter estimation. Unlike earlier results based on the Cramér-Rao bound (CRB), the limits proposed here, based on the worst-case error criterion and a Bayesian version of the CRB, are valid for any biased or unbiased estimator and obey photon-number scalings that are consistent with the behaviours of actual estimators. These results prove that, from the minimax perspective, the spatial-mode demultiplexing measurement scheme recently proposed by Tsang, Nair, and Lu [Phys. Rev. X 6, 031033 (2016)] remains superior to direct imaging for sufficiently high photon numbers.

  10. Documentation for the machine-readable version of A Library of Stellar Spectra (Jacoby, Hunter and Christian 1984)

    NASA Technical Reports Server (NTRS)

    Warren, W. H., Jr.

    1984-01-01

The machine-readable library as it is currently being distributed from the Astronomical Data Center is described. The library contains digital spectra for 161 stars of spectral classes O through M and luminosity classes 1, 3 and 5 in the wavelength range 3510 A to 7427 A. The resolution is approximately 4.5 A, while the typical photometric uncertainty of each resolution element is approximately 1 percent and broadband variations are 3 percent. The documentation includes a format description, a table of the characteristics of the magnetic tape file, and a sample listing of logical records exactly as they are recorded on the tape.

  11. Chandra Images the Seething Cauldron of Starburst Galaxy

    NASA Astrophysics Data System (ADS)

    2000-01-01

    NASA's Chandra X-ray Observatory has imaged the core of the nearest starburst galaxy, Messier 82 (M82). The observatory has revealed a seething cauldron of exploding stars, neutron stars, black holes, 100 million degree gas, and a powerful galactic wind. The discovery will be presented by a team of scientists from Carnegie Mellon University, Pittsburgh, Penn., Pennsylvania State University, University Park, and the University of Michigan, Ann Arbor, on January 14 at the 195th national meeting of the American Astronomical Society. "In the disk of our Milky Way Galaxy, stars form and die in a relatively calm fashion like burning embers in a campfire," said Richard Griffiths, Professor of Astrophysics at Carnegie Mellon University. "But in a starburst galaxy, star birth and death are more like explosions in a fireworks factory." Short-lived massive stars in a starburst galaxy produce supernova explosions, which heat the interstellar gas to millions of degrees, and leave behind neutron stars and black holes. These explosions emit light in the X rays rather than in visible light. Because the superhot components inside starburst galaxies are complex and sometimes confusing, astronomers need an X-ray-detecting telescope with the highest focusing power (spatial resolution) to clearly discriminate the various structures. "NASA's Chandra X-ray Observatory is the perfect tool for studying starburst galaxies since it has the critical combination of high-resolution optics and good sensitivity to penetrating X rays," said Gordon Garmire, the Evan Pugh Professor of Astronomy and Astrophysics at Pennsylvania State University, and head of the team that conceived and built Chandra's Advanced CCD Imaging Spectrograph (ACIS) X-ray camera, which acquired the data. 
    Many intricate structures missed by earlier satellite observatories are now visible in the ACIS image, including more than twenty powerful X-ray binary systems that contain a normal star in a close orbit around a neutron star or a black hole. "Several sources are so bright that they are probably black holes, perhaps left over from past starburst episodes," Garmire explained. The astronomers report that the X-ray emitting gas in the galaxy's core region has a surprisingly hot temperature. "Determining the source of high-energy X rays from M82 may elucidate whether starburst galaxies throughout the universe contribute significantly to the X-ray background radiation that pervades intergalactic space," said Griffiths. "The image also shows a chimney-like structure at the base of the galactic wind, which may help us understand how metal-rich starburst gas is dispersed into intergalactic space." "What we don't see may be as important as what we do see," said Garmire. "There is no indication of a single, high-luminosity, compact X-ray source from a supermassive black hole at the very center of the galaxy, although considerable evidence exists that such central black holes are present in many or most galaxies." The astronomers note that recent optical and infrared data suggest most galaxies were starbursts when the universe was young and that their galactic winds may have distributed carbon, oxygen, iron and other heavy atoms that now pervade the Universe. The starburst in M82 is thought to have been caused by a near collision with a large spiral galaxy, M81, about 100 million years ago. At a distance of 11 million light years, M82 is the closest starburst galaxy to our Milky Way Galaxy and provides the best view of this type of galactic structure, which may have played a critical role in the early history of the Universe. The Chandra image was taken with the Advanced CCD Imaging Spectrometer (ACIS) on September 20, 1999 in an observation that lasted about 13½ hours.
    ACIS was built by Penn State University and the Massachusetts Institute of Technology, Cambridge. To follow Chandra's progress or download images, visit the Chandra sites at http://chandra.harvard.edu/photo/2000/0094/index.html and http://chandra.nasa.gov. NASA's Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program. TRW, Inc., Redondo Beach, CA, is the prime contractor for the spacecraft. The Smithsonian's Chandra X-ray Center controls science and flight operations from Cambridge, MA. High-resolution digital versions of the X-ray image (JPG, 300 dpi TIFF) are available at the Internet site listed above.

  12. Intermediate frequency atmospheric disturbances: A dynamical bridge connecting western U.S. extreme precipitation with East Asian cold surges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Tianyu; Evans, Katherine J.; Deng, Yi

    In this study, an atmospheric river (AR) detection algorithm is developed to investigate the downstream modulation of eastern North Pacific ARs by another weather extreme, the East Asian cold surge (EACS), in both reanalysis data and high-resolution global model simulations. It is shown that following the peak of an EACS, atmospheric disturbances of intermediate frequency (IF; 10–30 day period) are excited downstream. This leads to the formation of a persistent cyclonic circulation anomaly over the eastern North Pacific that dramatically enhances the AR occurrence probability and the surface precipitation over the western U.S. between 30° N and 50° N. A diagnosis of the local geopotential height tendency further confirms the essential role of IF disturbances in establishing the observed persistent anomaly. This downstream modulation effect is then examined in two simulations of the National Center for Atmospheric Research Community Climate System Model version 4 with different horizontal resolutions (T85 and T341) for the same period (1979–2005). The connection between EACS and AR is much better captured by the T341 version of the model, mainly due to a better representation of the scale interaction and the characteristics of IF atmospheric disturbances in the higher-resolution model. The findings here suggest that faithful representations of scale interaction in a global model are critical for modeling and predicting the occurrences of hydrological extremes in the western U.S. and for understanding their potential future changes.
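    The 10–30 day intermediate-frequency band discussed above is typically isolated with a bandpass filter. A minimal, stdlib-only sketch (a crude difference of centered running means, not the filtering actually used in the study) illustrates the idea on synthetic daily data:

```python
import math

def running_mean(x, window):
    """Centered running mean; the window shrinks near the series edges."""
    half = window // 2
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def intermediate_band(x, short=11, long_=31):
    """Crude 10-30 day bandpass on daily data: an 11-day mean removes the
    fast variability, a 31-day mean keeps only the slow variability, and
    their difference retains the intermediate band."""
    return [s - l for s, l in zip(running_mean(x, short), running_mean(x, long_))]

days = range(365)
sig20 = [math.sin(2 * math.pi * t / 20) for t in days]  # in-band (20-day)
sig05 = [math.sin(2 * math.pi * t / 5) for t in days]   # too fast
sig90 = [math.sin(2 * math.pi * t / 90) for t in days]  # too slow

amp = lambda x: max(abs(v) for v in x[60:-60])  # peak amplitude away from edges
print(round(amp(intermediate_band(sig20)), 2),
      round(amp(intermediate_band(sig05)), 2),
      round(amp(intermediate_band(sig90)), 2))
```

    The 20-day signal passes with most of its amplitude, while the 5-day and 90-day signals are strongly attenuated; production analyses would use a proper Lanczos or Butterworth filter instead.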

  13. A sensitivity analysis of cloud properties to CLUBB parameters in the single-column Community Atmosphere Model (SCAM5)

    DOE PAGES

    Guo, Zhun; Wang, Minghuai; Qian, Yun; ...

    2014-08-13

    In this study, we investigate the sensitivity of simulated shallow cumulus and stratocumulus clouds to selected tunable parameters of Cloud Layers Unified by Binormals (CLUBB) in the single-column version of the Community Atmosphere Model version 5 (SCAM5). A quasi-Monte Carlo (QMC) sampling approach is adopted to effectively explore the high-dimensional parameter space, and a generalized linear model is used to study the responses of simulated cloud fields to the tunable parameters. One stratocumulus and two shallow convection cases are configured at both coarse and fine vertical resolutions. Our results show that most of the variance in the simulated cloud fields can be explained by a small number of tunable parameters. The parameters related to the Newtonian and buoyancy-damping terms of the total water flux are found to be the most influential for stratocumulus. For shallow cumulus, the most influential parameters are those related to the skewness of vertical velocity, reflecting the strong coupling between cloud properties and dynamics in this regime. The influential parameters in the stratocumulus case are sensitive to the choice of vertical resolution, while little sensitivity is found for the shallow convection cases, because the eddy mixing length (or dissipation time scale) plays a more important role and depends more strongly on the vertical resolution in stratocumulus than in shallow convection. The influential parameters remain almost unchanged when the number of tunable parameters increases from 16 to 35. This study improves understanding of the behavior of CLUBB associated with parameter uncertainties.
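    The sample-then-regress workflow described above can be sketched generically. Everything below is invented for illustration: the toy response function stands in for SCAM5, and a simple Latin hypercube stands in for the QMC design; the variance share per parameter comes from a linear main-effect fit, as in the study's generalized linear model approach.

```python
import random

random.seed(42)

def latin_hypercube(n_samples, n_params):
    """Stratified space-filling design on [0, 1]^d; a simple stand-in for
    the quasi-Monte Carlo sampling used in such studies."""
    columns = []
    for _ in range(n_params):
        column = [(i + random.random()) / n_samples for i in range(n_samples)]
        random.shuffle(column)
        columns.append(column)
    return [tuple(col[i] for col in columns) for i in range(n_samples)]

def toy_cloud_response(p):
    """Hypothetical surrogate model: the output is dominated by parameter 0."""
    return 5.0 * p[0] + 1.0 * p[1] + 0.2 * p[2]

def variance_explained(samples, y):
    """Share of output variance attributed to each parameter by a linear
    main-effect fit (one regression slope per parameter)."""
    n = len(y)
    y_mean = sum(y) / n
    y_var = sum((v - y_mean) ** 2 for v in y) / n
    shares = []
    for j in range(len(samples[0])):
        xj = [s[j] for s in samples]
        x_mean = sum(xj) / n
        x_var = sum((v - x_mean) ** 2 for v in xj) / n
        cov = sum((a - x_mean) * (b - y_mean) for a, b in zip(xj, y)) / n
        slope = cov / x_var
        shares.append(slope ** 2 * x_var / y_var)
    return shares

X = latin_hypercube(256, 3)
y = [toy_cloud_response(p) for p in X]
print([round(s, 3) for s in variance_explained(X, y)])  # parameter 0 dominates
```

    With near-orthogonal design columns the shares sum to roughly one, mirroring the paper's finding that a few parameters carry most of the variance.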

  14. A new synoptic scale resolving global climate simulation using the Community Earth System Model

    NASA Astrophysics Data System (ADS)

    Small, R. Justin; Bacmeister, Julio; Bailey, David; Baker, Allison; Bishop, Stuart; Bryan, Frank; Caron, Julie; Dennis, John; Gent, Peter; Hsu, Hsiao-ming; Jochum, Markus; Lawrence, David; Muñoz, Ernesto; diNezio, Pedro; Scheitlin, Tim; Tomas, Robert; Tribbia, Joseph; Tseng, Yu-heng; Vertenstein, Mariana

    2014-12-01

    High-resolution global climate modeling holds the promise of capturing planetary-scale climate modes and small-scale (regional and sometimes extreme) features simultaneously, including their mutual interaction. This paper discusses a new state-of-the-art high-resolution Community Earth System Model (CESM) simulation that was performed with these goals in mind. The atmospheric component was run at 0.25° grid spacing and the ocean component at 0.1°. One hundred years of "present-day" simulation were completed. Major results were that the annual mean sea surface temperature (SST) in the equatorial Pacific and El Niño-Southern Oscillation variability were well simulated compared to standard-resolution models. Tropical and southern Atlantic SST also had much reduced bias compared to previous versions of the model. In addition, the high resolution of the model enabled small-scale features of the climate system to be represented, such as air-sea interaction over ocean frontal zones, mesoscale systems generated by the Rockies, and tropical cyclones. Associated single-component runs and standard-resolution coupled runs are used to help attribute the strengths and weaknesses of the fully coupled run. The high-resolution run employed 23,404 cores, cost 250 thousand processor-hours per simulated year, and achieved about two simulated years per day on the NCAR-Wyoming supercomputer "Yellowstone."
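    The quoted throughput figures are mutually consistent, as a quick arithmetic check (using only values from the abstract) shows:

```python
cores = 23_404                   # cores used by the coupled run
cost_per_sim_year = 250_000      # processor-hours per simulated year
wallclock_hours_per_day = 24

# core-hours delivered per wall-clock day, divided by the cost of one year
simulated_years_per_day = cores * wallclock_hours_per_day / cost_per_sim_year
print(round(simulated_years_per_day, 2))  # consistent with "about two simulated years per day"
```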

  15. GPU-accelerated two dimensional synthetic aperture focusing for photoacoustic microscopy

    NASA Astrophysics Data System (ADS)

    Liu, Siyu; Feng, Xiaohua; Gao, Fei; Jin, Haoran; Zhang, Ruochong; Luo, Yunqi; Zheng, Yuanjin

    2018-02-01

    Acoustic resolution photoacoustic microscopy (AR-PAM) generally suffers from a limited depth of focus, which has been extended by synthetic aperture focusing techniques (SAFTs). However, for three-dimensional AR-PAM, the current one-dimensional (1D) SAFT and its improved variants, such as cross-shaped SAFT, do not provide isotropic resolution in the lateral direction, so the full potential of the SAFT remains untapped. To this end, a two-dimensional (2D) SAFT with a fast computing architecture is proposed in this work. As explained by geometric modeling and Fourier acoustics theory, 2D-SAFT provides the narrowest post-focusing capability and thus achieves the best lateral resolution. Compared with previous 1D-SAFT techniques, the proposed 2D-SAFT improved the lateral resolution by at least 1.7 times and the signal-to-noise ratio (SNR) by about 10 dB in both simulations and experiments. Moreover, the improved 2D-SAFT algorithm is accelerated by a graphics processing unit, which reduces the long reconstruction time to only a few seconds. The proposed 2D-SAFT is demonstrated to outperform previously reported 1D SAFTs in depth of focus, imaging resolution, and SNR, with high computational efficiency. This work facilitates future studies on in vivo deep, high-resolution photoacoustic microscopy beyond several centimeters.
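    A 2D SAFT of this general family is a delay-and-sum over neighboring scan positions, with delays set by the virtual-detector (transducer focus) geometry. The following is a minimal single-threaded sketch with invented toy parameters, not the authors' GPU implementation:

```python
import math

def saft_2d(rf, pitch, c, fs, z_focus_idx):
    """Minimal 2D delay-and-sum synthetic aperture focusing sketch.

    rf[ix][iy] is the A-line (list of samples) recorded at scan position
    (ix, iy); pitch is the scan step, c the speed of sound, fs the sampling
    rate, and z_focus_idx the sample index of the transducer focus, which
    acts as a virtual detector. Apodization and GPU batching are omitted.
    """
    nx, ny, nt = len(rf), len(rf[0]), len(rf[0][0])
    half = 2  # half-width of the synthetic aperture, in scan steps
    out = [[[0.0] * nt for _ in range(ny)] for _ in range(nx)]
    for ix in range(nx):
        for iy in range(ny):
            for it in range(nt):
                depth = (it - z_focus_idx) * c / (2 * fs)  # distance from focus
                acc, cnt = 0.0, 0
                for dx in range(-half, half + 1):
                    for dy in range(-half, half + 1):
                        jx, jy = ix + dx, iy + dy
                        if not (0 <= jx < nx and 0 <= jy < ny):
                            continue
                        r = math.hypot(dx * pitch, dy * pitch)
                        # extra two-way path through the virtual detector
                        tau = 2 * (math.hypot(depth, r) - abs(depth)) / c
                        jt = it + round(tau * fs) * (1 if depth >= 0 else -1)
                        if 0 <= jt < nt:
                            acc += rf[jx][jy][jt]
                            cnt += 1
                out[ix][iy][it] = acc / max(cnt, 1)
    return out

# Toy point target 3 samples below the focus; units chosen (c = 2, fs = 1,
# pitch = 4) so that every delay lands exactly on a sample index
rf = [[[0.0] * 30 for _ in range(5)] for _ in range(5)]
for jx in range(5):
    for jy in range(5):
        r = math.hypot((jx - 2) * 4.0, (jy - 2) * 4.0)
        rf[jx][jy][round(10 + math.hypot(3.0, r))] = 1.0
focused = saft_2d(rf, pitch=4.0, c=2.0, fs=1.0, z_focus_idx=10)
print(focused[2][2][13])  # all 25 A-lines add coherently at the target voxel
```

    Because the synthetic aperture extends in both scan directions, the coherent gain is isotropic in the lateral plane, which is the qualitative advantage the abstract claims over 1D and cross-shaped SAFT.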

  16. Improving spectral resolution in spatial encoding dimension of single-scan nuclear magnetic resonance 2D spin echo correlated spectroscopy

    NASA Astrophysics Data System (ADS)

    Lin, Liangjie; Wei, Zhiliang; Yang, Jian; Lin, Yanqin; Chen, Zhong

    2014-11-01

    The spatial encoding technique can be used to accelerate the acquisition of multi-dimensional nuclear magnetic resonance spectra. However, with this technique, trade-offs must be made between the spectral width and the resolution in the spatial encoding dimension (F1 dimension), making it difficult to cover large spectral widths while preserving acceptable resolution in spatial encoding spectra. In this study, a selective shifting method is proposed to overcome this drawback. The method narrows spectral widths and improves spectral resolution in the spatial encoding dimension by selectively shifting certain peaks in spectra of the ultrafast version of spin echo correlated spectroscopy (UFSECSY). It can also serve as a powerful tool for obtaining high-resolution correlated spectra in inhomogeneous magnetic fields, owing to the resistance to F1-dimension inhomogeneity inherited from UFSECSY. Theoretical derivations and experiments have been carried out to demonstrate the performance of the proposed method. Results show that the spectral width in the spatial encoding dimension can be reduced by shortening the distances between cross peaks and axial peaks, and that the expected resolution improvement is achieved. Finally, the unshifted spectrum can be recovered readily by post-processing.

  17. Development of alternative versions of the Logical Memory subtest of the WMS-R for use in Brazil

    PubMed Central

    Bolognani, Silvia Adriana Prado; Miranda, Monica Carolina; Martins, Marjorie; Rzezak, Patricia; Bueno, Orlando Francisco Amodeo; de Camargo, Candida Helena Pires; Pompeia, Sabine

    2015-01-01

    The Logical Memory test of the Wechsler Memory Scale is one of the most frequently used standardized tests for assessing verbal memory and consists of two separate short stories, each containing 25 idea units. Problems with practice effects arise when re-testing a patient, as these stories may be remembered from previous assessments. Therefore, alternative versions of the test stimuli should be developed to minimize learning effects when repeated testing is required for longitudinal evaluations of patients. Objective: To present three alternative stories for each of the original stories frequently used in Brazil (Ana Soares and Roberto Mota) and to show their similarity in terms of content, structure and linguistic characteristics. Methods: The alternative stories were developed according to the following criteria: overall structure or thematic content (presentation of the character, conflict, aggravation or complements, and resolution); specific structure (sex of the character, location and occupation, details of what happened); formal structure (number of words, characters, verbs and nouns); and readability. Results: The alternative stories and scoring criteria are presented in comparison to the original WMS stories (Brazilian version). Conclusion: The alternative stories presented here correspond well thematically and structurally to the Brazilian versions of the original stories. PMID:29213955

  18. Evaluation of Abdominal CT Image Quality Using a New Version of Vendor-Specific Model-Based Iterative Reconstruction

    PubMed Central

    Jensen, Corey T.; Telesmanich, Morgan E.; Wagner-Bartak, Nicolaus A.; Liu, Xinming; Rong, John; Szklaruk, Janio; Qayyum, Aliya; Wei, Wei; Chandler, Adam G.; Tamm, Eric P.

    2016-01-01

    Purpose: To qualitatively and quantitatively compare abdominal CT images reconstructed with a new version of model-based iterative reconstruction (Veo 3.0; GE Healthcare) to those created with Veo 2.0. Materials & Methods: This retrospective study was approved by our IRB and was HIPAA compliant. The raw data from 29 consecutive patients who had undergone abdominal CT scanning were used to reconstruct four sets of 3.75 mm axial images: Veo 2.0, Veo 3.0 standard, Veo 3.0 with 5% resolution preference (RP05) and Veo 3.0 with 20% resolution preference (RP20). A slice thickness optimization of 3.75 mm and the texture feature were selected for the Veo 3.0 reconstructions. The images were reviewed by three independent readers in a blinded, randomized fashion using a 5-point Likert scale and a 5-point comparative scale. Multiple 2D circular regions of interest were defined for noise and contrast-to-noise ratio (CNR) measurements. Line profiles were drawn across the 7 lp/cm bar pattern of the Catphan 600 phantom for spatial resolution evaluation. Results: The Veo 3.0 standard image set was scored better than Veo 2.0 in terms of artifacts (mean difference 0.43, 95% CI 0.25-0.6, P<0.0001), overall image quality (mean difference 0.87, 95% CI 0.62-1.13, P<0.0001) and qualitative resolution (mean difference 0.9, 95% CI 0.69-1.1, P<0.0001). While the Veo 3.0 standard and RP05 presets were preferred across most categories, the Veo 3.0 RP20 series ranked best for bone detail. Image noise and spatial resolution increased along a spectrum, with Veo 2.0 the lowest and RP20 the highest. Conclusion: Veo 3.0 enhances imaging evaluation relative to Veo 2.0; readers preferred the Veo 3.0 image appearance despite the associated mild increases in image noise. These results provide suggested parameters to be used clinically and a basis for future evaluations such as focal lesion detection in the oncology setting. PMID:27529683
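    The contrast-to-noise ratio used in such ROI measurements is a standard quantity; a small sketch with hypothetical HU samples (the paper's exact ROI protocol is not reproduced here):

```python
import statistics

def contrast_to_noise(roi_lesion, roi_background):
    """CNR as commonly defined in CT image-quality studies:
    |mean_lesion - mean_background| / sd_background."""
    contrast = abs(statistics.mean(roi_lesion) - statistics.mean(roi_background))
    noise = statistics.stdev(roi_background)  # sample standard deviation
    return contrast / noise

# Hypothetical Hounsfield-unit samples from two circular ROIs
lesion = [95, 102, 98, 101, 97, 99, 103, 100]
background = [58, 63, 60, 57, 62, 61, 59, 60]
print(round(contrast_to_noise(lesion, background), 1))  # -> 19.7
```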

  19. High-resolution observations of small-scale gravity waves and turbulence features in the OH airglow layer

    NASA Astrophysics Data System (ADS)

    Sedlak, René; Hannawald, Patrick; Schmidt, Carsten; Wüst, Sabine; Bittner, Michael

    2016-12-01

    A new version of the Fast Airglow Imager (FAIM) for the detection of atmospheric waves in the OH airglow layer has been set up at the German Remote Sensing Data Center (DFD) of the German Aerospace Center (DLR) at Oberpfaffenhofen (48.09° N, 11.28° E), Germany. The spatial resolution of the instrument is 17 m per pixel in the zenith direction, with a field of view (FOV) of 11.1 km × 9.0 km at the OH layer height of ca. 87 km. Since November 2015, the system has been in operation in two different setups (zenith angles of 46° and 0°) with a temporal resolution of 2.5 to 2.8 s. In a first case study we present observations of two small wave-like features that might be attributed to gravity wave instabilities. In order to spectrally analyse harmonic structures even on small spatial scales, down to 550 m horizontal wavelength, we made use of the maximum entropy method (MEM), since this method exhibits an excellent wavelength resolution. MEM further allows analysing relatively short data series, which considerably helps to reduce problems such as stationarity of the underlying data series from a statistical point of view. We present an observation of the subsequent decay of well-organized wave fronts into eddies, which we tentatively interpret as an indication of the onset of turbulence. Another remarkable event, which demonstrates the technical capabilities of the instrument, was observed during the night of 4-5 April 2016. It reveals the disintegration of a rather homogeneous brightness variation into several filaments moving in different directions and with different speeds. It resembles the formation of a vortex with a horizontal axis of rotation, likely related to a vertical wind shear. This case shows a notable similarity to what is expected from theoretical modelling of Kelvin-Helmholtz instabilities (KHIs).
    The comparatively high spatial resolution of the presented new version of the FAIM provides new insights into the structure of atmospheric wave instabilities and turbulent processes. Infrared imaging of wave dynamics on the sub-kilometre scale in the airglow layer supports the findings of theoretical simulations and modelling.
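    The maximum entropy method mentioned above is, in its classical Burg form, compact enough to sketch. This is a generic textbook implementation applied to an invented 0.1 cycles-per-sample test signal, not the FAIM analysis code:

```python
import math
import random

def burg_ar(x, order):
    """Burg (maximum entropy) autoregressive coefficient estimate."""
    n = len(x)
    f, b = list(x), list(x)  # forward and backward prediction errors
    a = [1.0]
    err = sum(v * v for v in x) / n
    for m in range(order):
        num = -2.0 * sum(f[i] * b[i - 1] for i in range(m + 1, n))
        den = sum(f[i] ** 2 + b[i - 1] ** 2 for i in range(m + 1, n))
        k = num / den  # reflection coefficient, |k| <= 1
        new_f, new_b = f[:], b[:]
        for i in range(m + 1, n):
            new_f[i] = f[i] + k * b[i - 1]
            new_b[i] = b[i - 1] + k * f[i]
        f, b = new_f, new_b
        a = [1.0] + [a[j] + k * a[m + 1 - j] for j in range(1, m + 1)] + [k]
        err *= 1.0 - k * k
    return a, err

def mem_psd(a, err, freq):
    """AR spectral estimate at a normalized frequency (cycles/sample)."""
    re = sum(c * math.cos(2 * math.pi * freq * j) for j, c in enumerate(a))
    im = sum(c * math.sin(2 * math.pi * freq * j) for j, c in enumerate(a))
    return err / (re * re + im * im)

# Short, noisy record of a 0.1 cycles/sample wave: MEM's strength on short
# series is exactly what the abstract exploits for short spatial profiles
random.seed(7)
x = [math.sin(2 * math.pi * 0.1 * t) + 0.1 * random.gauss(0.0, 1.0)
     for t in range(64)]
coeffs, err = burg_ar(x, 4)
grid = [i / 1000.0 for i in range(1, 500)]
peak = max(grid, key=lambda fr: mem_psd(coeffs, err, fr))
print(peak)  # sits close to 0.1
```

    The sharp peak from only 64 samples illustrates why MEM suits short, quasi-stationary segments better than a plain periodogram.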

  20. The Chorus Conflict and Loss of Separation Resolution Algorithms

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Hagen, George E.; Maddalon, Jeffrey M.

    2013-01-01

    The Chorus software is designed to investigate near-term, tactical conflict and loss of separation detection and resolution concepts for air traffic management. This software is currently being used in two different problem domains: en-route self-separation and sense-and-avoid for unmanned aircraft systems. This paper describes the core resolution algorithms that are part of Chorus. The combination of several features of the Chorus program distinguishes this software from other approaches to conflict and loss of separation resolution. First, the program stores a history of state information over time, which enables it to handle communication dropouts and take advantage of previous input data. Second, the underlying conflict algorithms find resolutions that solve the most urgent conflict, but also seek to prevent secondary conflicts with the other aircraft. Third, if the program is run on multiple aircraft, and two aircraft maneuver at the same time, the result will be implicitly coordinated. This implicit coordination property is established by ensuring that a resolution produced by Chorus will comply with a mathematically defined criterion whose correctness has been formally verified. Fourth, the program produces both instantaneous solutions and kinematic solutions, which are based on simple acceleration models. Finally, the program provides resolutions for recovery from loss of separation. Versions of this software are implemented in both Java and C++.
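    A standard building block underlying such conflict detection is the closest-point-of-approach test. The sketch below is the textbook version, not the Chorus resolution logic itself (whose coordination criteria are formally verified):

```python
import math

def cpa_conflict(rel_pos, rel_vel, sep):
    """Horizontal closest-point-of-approach conflict test.

    rel_pos: relative position (NM); rel_vel: relative velocity (kt);
    sep: required horizontal separation (NM).
    Returns (time of closest approach in hours, conflict flag).
    """
    sx, sy = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    # minimize |s + v t| over t >= 0 (only look forward in time)
    t_cpa = 0.0 if v2 == 0.0 else max(0.0, -(sx * vx + sy * vy) / v2)
    dist = math.hypot(sx + vx * t_cpa, sy + vy * t_cpa)
    return t_cpa, dist < sep

# Head-on geometry: 10 NM apart, 480 kt closure -> loss of separation in 75 s
print(cpa_conflict((10.0, 0.0), (-480.0, 0.0), 5.0))
# Offset geometry: 6 NM lateral miss distance -> no conflict against 5 NM
print(cpa_conflict((10.0, 6.0), (-480.0, 0.0), 5.0))
```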

  1. Dynamic coupling of regional atmosphere to biosphere in the new generation regional climate system model REMO-iMOVE

    NASA Astrophysics Data System (ADS)

    Wilhelm, C.; Rechid, D.; Jacob, D.

    2013-05-01

    The main objective of this study is the coupling of the regional climate model REMO to a third-generation land surface scheme and the evaluation of the new model version of REMO, called REMO with interactive MOsaic-based VEgetation: REMO-iMOVE. Attention is paid to the documentation of the technical aspects of the new model constituents and the coupling mechanism. We compare simulation results of REMO-iMOVE and of the reference version REMO2009 to investigate the sensitivity of the regional model to the new land surface scheme. An 11-year climate model run (1995-2005) with both model versions, forced with ECMWF ERA-Interim lateral boundary conditions, was carried out over Europe at 0.44° resolution to represent present-day European climate. The results of these experiments are compared to multiple temperature, precipitation, heat flux and leaf area index observation data sets to determine the differences between the model versions. The new model version also has the ability to model net primary productivity for the given plant functional types. This new feature is evaluated against literature values of net primary productivity of different plant species in European climatic regions. The new model version REMO-iMOVE is able to model the European climate with the same quality as the parent model version REMO2009. The differences in the results of the two model versions stem from differences in the dynamics of vegetation cover and density and can be distinct in some regions, owing to the influence of these parameters on the surface heat and moisture fluxes. The modeled inter-annual variability in the phenology, as well as the net primary productivity, lies within the range of observations and literature values for most European regions. This study also reveals the need for a more sophisticated soil moisture representation in the newly developed REMO-iMOVE to treat the differences between plant functional types.
    This becomes especially important if the model is to be used in dynamic vegetation studies.

  2. weather@home 2: validation of an improved global-regional climate modelling system

    NASA Astrophysics Data System (ADS)

    Guillod, Benoit P.; Jones, Richard G.; Bowery, Andy; Haustein, Karsten; Massey, Neil R.; Mitchell, Daniel M.; Otto, Friederike E. L.; Sparrow, Sarah N.; Uhe, Peter; Wallom, David C. H.; Wilson, Simon; Allen, Myles R.

    2017-05-01

    Extreme weather events can have large impacts on society and, in many regions, are expected to change in frequency and intensity with climate change. Owing to the relatively short observational record, climate models are useful tools as they allow for the generation of a larger sample of extreme events, the attribution of recent events to anthropogenic climate change, and the projection of changes in such events into the future. The modelling system known as weather@home, consisting of a global climate model (GCM) with a nested regional climate model (RCM) and driven by sea surface temperatures, allows one to generate a very large ensemble with the help of volunteer distributed computing. This is a key tool for understanding many aspects of extreme events. Here, a new version of the weather@home system (weather@home 2) with a higher-resolution RCM over Europe is documented and a broad validation of the climate is performed. The new model includes a more recent land-surface scheme in both GCM and RCM, where subgrid-scale land-surface heterogeneity is newly represented using tiles, and an increase in RCM resolution from 50 to 25 km. The GCM performs similarly to the previous version, with some improvements in the representation of mean climate. The European RCM temperature biases are overall reduced, in particular the warm bias over eastern Europe, but large biases remain. Precipitation is improved over the Alps in summer, with mixed changes in other regions and seasons. The model is shown to represent the main classes of regional extreme events reasonably well and shows a good sensitivity to its drivers. In particular, given the improvements in this version of the weather@home system, it is likely that more reliable statements can be made about impacts, especially at more localized scales.

  3. Strehl-constrained reconstruction of post-adaptive optics data and the Software Package AIRY, v. 6.1

    NASA Astrophysics Data System (ADS)

    Carbillet, Marcel; La Camera, Andrea; Deguignet, Jérémy; Prato, Marco; Bertero, Mario; Aristidi, Éric; Boccacci, Patrizia

    2014-08-01

    We first briefly present the latest version of the Software Package AIRY, version 6.1, a CAOS-based tool which includes various deconvolution methods, accelerations, regularizations, super-resolution, boundary effects reduction, point-spread function extraction/extrapolation, stopping rules, and constraints in the case of iterative blind deconvolution (IBD). Then, we focus on a new formulation of our Strehl-constrained IBD, here quantitatively compared to the original formulation for simulated near-infrared data of an 8-m class telescope equipped with adaptive optics (AO), showing their equivalence. Next, we extend the application of the original method to the visible domain with simulated data of an AO-equipped 1.5-m telescope, also testing the robustness of the method with respect to the Strehl ratio estimation.
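    Of the deconvolution methods a package like AIRY implements, the classical Richardson-Lucy iteration is simple enough to sketch in 1-D (illustrative only; AIRY adds acceleration, regularization, super-resolution, and boundary-effect handling):

```python
def correlate_same(signal, kernel):
    """'Same'-size correlation with zero padding; for the symmetric kernel
    used below this coincides with convolution."""
    n, k = len(signal), len(kernel)
    half = k // 2
    out = []
    for i in range(n):
        s = 0.0
        for j in range(k):
            idx = i + j - half
            if 0 <= idx < n:
                s += signal[idx] * kernel[j]
        out.append(s)
    return out

def richardson_lucy(observed, psf, iterations=200):
    """Classical Richardson-Lucy iteration (1-D toy version):
    multiplicative updates that preserve non-negativity and flux."""
    psf_mirror = psf[::-1]
    estimate = [sum(observed) / len(observed)] * len(observed)  # flat start
    for _ in range(iterations):
        blurred = correlate_same(estimate, psf)
        ratio = [o / max(b, 1e-12) for o, b in zip(observed, blurred)]
        correction = correlate_same(ratio, psf_mirror)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

# Blur a point source with a known PSF, then recover it
psf = [0.25, 0.5, 0.25]
truth = [0.0] * 21
truth[10] = 1.0
observed = correlate_same(truth, psf)
restored = richardson_lucy(observed, psf)
print(max(range(len(restored)), key=lambda i: restored[i]))  # peak back at index 10
```

    Blind deconvolution, as in AIRY's IBD, alternates such updates between the object and the PSF, which is where constraints like the Strehl ratio become necessary.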

  4. Hubble Space Telescope faint object spectrograph instrument handbook, version 5.0

    NASA Technical Reports Server (NTRS)

    Kinney, A. L. (Editor)

    1994-01-01

    This version of the FOS Instrument Handbook is for the refurbished telescope, which is affected by an increase in throughput, especially for the smaller apertures, a decrease in efficiency due to the extra reflections of the COSTAR optics, and a change in focal length. The improved PSF affects all exposure time calculations due to better aperture throughputs and increases the spectral resolution. The extra reflections of COSTAR decrease the efficiency by 10-20 percent. The change in focal length affects the aperture sizes as projected on the sky. The aperture designations that are already in use both in the exposure logsheets and in the project data base (PDB) have not been changed. Apertures are referred to here by their size, followed by the designation used on the exposure logsheet.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosch, R.; Boutin, J. Y.; Le Breton, J. P.

    This article describes x-ray imaging with grazing-incidence microscopes, developed for the experimental program carried out on the Ligne d'Integration Laser (LIL) facility [J. P. Le Breton et al., Inertial Fusion Sciences and Applications 2001 (Elsevier, Paris, 2002), pp. 856-862] (24 kJ, UV at 0.35 µm). The design includes a large target-to-microscope distance (400-700 mm) required by the x-ray ablation issues anticipated on the Laser MegaJoule facility [P. A. Holstein et al., Laser Part. Beams 17, 403 (1999)] (1.8 MJ), which is under construction. Two eight-image Kirkpatrick-Baez microscopes [P. Kirkpatrick and A. V. Baez, J. Opt. Soc. Am. 38, 766 (1948)] with different spectral wavelength ranges and with a 400 mm source-to-mirror distance image the target on a custom-built framing camera (time resolution of ~80 ps). The soft x-ray version of the microscope is sensitive below 1 keV and its spatial resolution is better than 30 µm over a 2-mm-diam region. The hard x-ray version has a 10 µm resolution over an 800-µm-diam region and is sensitive in the 1-5 keV energy range. Two other x-ray microscopes based on an association of toroidal/spherical surfaces (T/S microscopes) produce an image on a streak camera with a spatial resolution better than 30 µm over a 3 mm field of view in the direction of the camera slit. These two microscopes have been designed to have maximum sensitivity in the 0.1-1 and 1-5 keV energy ranges, respectively. We present the original design of these four microscopes and their tests on a dc x-ray tube in the laboratory. The diagnostics were successfully used in the first LIL experiments early in 2005. Results of soft x-ray imaging of a radiative jet during conical shaped laser interaction are shown.

  6. Experiments on tropical stratospheric mean-wind variations in a spectral general circulation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamilton, K.; Yuan, L.

    1992-12-15

    A 30-level version of the rhomboidal-15 GFDL spectral climate model was constructed with roughly 2-km vertical resolution. This model fails to produce a realistic quasi-biennial oscillation (QBO) in the tropical stratosphere. Several simulations were conducted in which the zonal-mean winds and temperatures in the equatorial lower and middle stratosphere were instantaneously perturbed and the model was integrated while the mean state relaxed toward its equilibrium. The time scale for the mean wind relaxation varied from over one month at 40 km to a few months in the lower stratosphere. The wind relaxations in the model also displayed the downward phase propagation characteristic of QBO wind reversals, and mean wind anomalies of opposite sign to the imposed perturbation appear at higher levels. In the GCM the downward propagation is clear only above about 20 mb. Detailed investigations were made of the zonal-mean zonal momentum budget in the equatorial stratosphere. The mean flow relaxations above 20 mb were mostly driven by the vertical Eliassen-Palm flux convergence. The anomalies in the horizontal Eliassen-Palm fluxes from extratropical planetary waves were found to be the dominant effect forcing the mean flow to its equilibrium at altitudes below 20 mb. The vertical eddy momentum fluxes near the equator in the model were decomposed using space-time Fourier analysis. While the total fluxes associated with easterly and westerly waves are comparable to those used in simple mechanistic models of the QBO, the GCM has its flux spread over a broad range of wavenumbers and phase speeds. The effects of vertical resolution were studied by repeating part of the control integration with a 69-level version of the model with greatly enhanced vertical resolution in the lower and middle stratosphere. The results showed that there is almost no sensitivity of the simulation in the tropical stratosphere to the increased vertical resolution.

  7. Experimental Study of an Advanced Concept of Moderate-resolution Holographic Spectrographs

    NASA Astrophysics Data System (ADS)

    Muslimov, Eduard; Valyavin, Gennady; Fabrika, Sergei; Musaev, Faig; Galazutdinov, Gazinur; Pavlycheva, Nadezhda; Emelianov, Eduard

    2018-07-01

    We present the results of an experimental study of an advanced moderate-resolution spectrograph based on a cascade of narrow-band holographic gratings. The main goal of the project is to achieve a moderately high spectral resolution, with R up to 5000, simultaneously across the 4300–6800 Å visible spectral range on a single standard CCD, together with an increased throughput. The experimental study consisted of (1) resolution and image quality tests performed using the solar spectrum, and (2) a total throughput test performed for a number of wavelengths using a calibrated lab monochromator. The measured spectral resolving power reaches values over R > 4000, while the experimental throughput is as high as 55%, which agrees well with the modeling results. Comparing the obtained characteristics of the spectrograph under consideration with the best existing spectrographs, we conclude that this concept is a very competitive and inexpensive alternative to existing spectrographs of the given class. We propose several astrophysical applications for the instrument and discuss the prospect of creating its full-scale version.
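    The quoted resolving power translates directly into resolution-element widths via the standard definition R = λ/Δλ (a generic relation, not a calculation from the paper):

```python
def resolution_element(wavelength, resolving_power):
    """Width of one spectral resolution element for R = lambda / delta_lambda."""
    return wavelength / resolving_power

# Element widths implied by R = 5000 across the 4300-6800 Angstrom band
for wl in (4300.0, 5500.0, 6800.0):
    print(f"{wl:.0f} A -> {resolution_element(wl, 5000):.2f} A")
```

    At R = 5000 the element width grows from about 0.9 Å at the blue end to about 1.4 Å at the red end, which is why a single CCD covering the whole band is a notable feature.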

  8. Comparison of auditory temporal resolution between monolingual Persian and bilingual Turkish-Persian individuals.

    PubMed

    Omidvar, Shaghayegh; Jafari, Zahra; Tahaei, Ali Akbar; Salehi, Masoud

    2013-04-01

    The aims of this study were to prepare a Persian version of the temporal resolution test using the method of Phillips et al (1994) and Stuart and Phillips (1996), and to compare the word-recognition performance in the presence of continuous and interrupted noise as well as the temporal resolution abilities between monolingual (ML) Persian and bilingual (BL) Turkish-Persian young adults. Word-recognition scores (WRSs) were obtained in quiet and in the presence of background competing continuous and interrupted noise at signal-to-noise ratios (SNRs) of -20, -10, 0, and 10 dB. Two groups of 33 ML Persian and 36 BL Turkish-Persian volunteers participated. WRSs significantly differed between ML and BL subjects at four sensation levels in the presence of continuous and interrupted noise. However, the difference in the release from masking between ML and BL subjects was not significant at the studied SNRs. BL Turkish-Persian listeners seem to show poorer performance when responding to Persian words in continuous and interrupted noise. However, bilingualism may not affect auditory temporal resolution ability.

  9. X-Ray Diffractive Optics

    NASA Technical Reports Server (NTRS)

    Dennis, Brian; Li, Mary; Skinner, Gerald

    2013-01-01

    X-ray optics were fabricated with the capability of imaging solar x-ray sources with better than 0.1 arcsecond angular resolution, over an order of magnitude finer than is currently possible. Such images would provide a new window into the little-understood energy release and particle acceleration regions in solar flares. They constitute one of the most promising ways to probe these regions in the solar atmosphere with the sensitivity and angular resolution needed to better understand the physical processes involved. A circular slit structure with widths as fine as 0.85 micron, etched in a silicon wafer 8 microns thick, forms a phase zone plate version of a Fresnel lens capable of focusing approximately 6 keV x-rays. The focal length of the 3-cm diameter lenses is 100 m, and the angular resolution capability is better than 0.1 arcsecond. Such phase zone plates were fabricated in Goddard's Detector Development Lab (DDL) and tested at the Goddard 600-m x-ray test facility. The test data verified that the desired angular resolution and throughput efficiency were achieved.
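The zone-plate geometry can be sanity-checked with the textbook Fresnel relations (r_n ≈ √(nλf), outermost zone width ≈ r_N/2N); the 6 keV design energy and 100 m focal length follow the abstract, while the code itself is an illustration rather than the authors' design tool:

```python
import math

def photon_wavelength_m(energy_kev: float) -> float:
    """lambda = hc / E, with hc ~= 1.2398 keV nm."""
    return 1.2398 / energy_kev * 1e-9

# Zone-plate geometry from the abstract: 3 cm diameter, f = 100 m, ~6 keV.
lam = photon_wavelength_m(6.0)              # ~2.07e-10 m
r_max = 0.015                               # outer radius, m
n_zones = r_max ** 2 / (lam * 100.0)        # from r_n ~ sqrt(n * lam * f)
outer_zone_width_m = r_max / (2 * n_zones)  # finest structure, at the rim
print(n_zones, outer_zone_width_m)
```

The outermost zone width comes out near 0.7 micron, comparable to the 0.85 micron finest slit width quoted above.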

  10. Long-range speckle imaging theory, simulation, and brassboard results

    NASA Astrophysics Data System (ADS)

    Riker, Jim F.; Tyler, Glenn A.; Vaughn, Jeff L.

    2017-09-01

    In the SPIE 2016 Unconventional Imaging session, the authors laid out a breakthrough theory for active array imaging that exploits the speckle return to generate a high-resolution picture of the target. Since then, we have pursued that theory in long-range (<1000-km) engagement scenarios and shown how we can obtain that high-resolution image of the target using only a few illuminators, or by using many illuminators. There is a trade of illuminators versus receivers, but many combinations provide the same synthetic aperture resolution. We will discuss that trade, along with the corresponding radiometric and speckle-imaging signal-to-noise ratios (SNRs) for geometries that can fit on relatively small aircraft, such as an Unmanned Aerial Vehicle (UAV). Furthermore, we have simulated the performance of the technique, and we have created a laboratory version of the approach that is able to obtain high-resolution speckle imagery. The principal results presented in this paper are the SNRs for both the radiometric and the speckle-imaging portions of the problem, and the simulated results obtained for representative arrays.

  11. How does increasing horizontal resolution in a global climate model improve the simulation of aerosol-cloud interactions?

    DOE PAGES

    Ma, Po-Lun; Rasch, Philip J.; Wang, Minghuai; ...

    2015-06-23

    We report results from the Community Atmosphere Model Version 5 run at horizontal grid spacings of 2, 1, 0.5, and 0.25°, with the meteorology nudged toward the Year Of Tropical Convection analysis; cloud simulators and collocated A-Train satellite observations are used to explore the resolution dependence of aerosol-cloud interactions. The higher-resolution model produces results that agree better with observations, showing an increase in the susceptibility of cloud droplet size, indicating a stronger first aerosol indirect forcing (AIF), and a decrease in the susceptibility of precipitation probability, suggesting a weaker second AIF. The resolution sensitivities of AIF are attributed to those of the droplet nucleation and precipitation parameterizations. Finally, the annual average AIF in the Northern Hemisphere midlatitudes (where most anthropogenic emissions occur) in the 0.25° model is reduced by about 1 W m-2 (-30%) compared to the 2° model, leading to a 0.26 W m-2 reduction (-15%) in the global annual average AIF.

  12. Very high resolution surface mass balance over Greenland modeled by the regional climate model MAR with a downscaling technique

    NASA Astrophysics Data System (ADS)

    Kittel, Christoph; Lang, Charlotte; Agosta, Cécile; Prignon, Maxime; Fettweis, Xavier; Erpicum, Michel

    2016-04-01

    This study presents surface mass balance (SMB) results at 5 km resolution over the Greenland ice sheet from the regional climate model MAR. Here, we use the latest MAR version (v3.6), in which the land-ice module (SISVAT), using a high-resolution (5 km) grid for surface variables, is fully coupled to the MAR atmospheric module running at a lower resolution of 10 km. This online downscaling technique makes it possible to correct MAR's near-surface temperature and humidity with an elevation-based gradient before forcing SISVAT. The 10 km precipitation is not corrected. Corrections are strongest over the ablation zone, where the topography varies most. The model was forced by ERA-Interim between 1979 and 2014. We will show the advantages of using an online SMB downscaling technique with respect to an offline downscaling extrapolation based on local SMB vertical gradients. Results at 5 km show better agreement with the PROMICE surface mass balance database than the extrapolated 10 km MAR SMB results.
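The elevation-based correction can be sketched as a simple lapse-rate adjustment. This is an illustration of the idea only, not MAR/SISVAT code; the lapse-rate value and the toy grids are assumptions for the example:

```python
import numpy as np

LAPSE_RATE_K_PER_M = -6.5e-3  # standard atmospheric lapse rate (assumed here)

def downscale_temperature(t_coarse, z_coarse, z_fine, lapse=LAPSE_RATE_K_PER_M):
    """Shift coarse-grid temperature by the elevation difference to the fine grid."""
    return t_coarse + lapse * (z_fine - z_coarse)

t10 = np.array([268.0, 270.0])    # K, on the 10 km atmospheric grid
z10 = np.array([1500.0, 1200.0])  # m, coarse-grid surface elevation
z5  = np.array([1700.0, 1100.0])  # m, 5 km surface-grid elevation
print(downscale_temperature(t10, z10, z5))  # cooler where the fine grid is higher
```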

  13. US Navy Global and Regional Wave Modeling

    DTIC Science & Technology

    2014-09-01

    Future plans call for increasing the resolution to 0.5 degree, upgrading to WW3 version 4, and including the ...NAVOCEANO WW3 system is in the early stages, and a number of key shortcomings have been identified for future improvement. The multigrid system...J. Shriver, R. Helber, P. Spence, S. Carroll, O.M. Smedstad, and B. Lunde. 2011. Validation Test Report for the Navy Coupled Ocean

  14. Aladin Lite: Lightweight sky atlas for browsers

    NASA Astrophysics Data System (ADS)

    Boch, Thomas

    2014-02-01

    Aladin Lite is a lightweight version of the Aladin tool, running in the browser and geared towards simple visualization of a sky region. It allows visualization of image surveys (JPEG multi-resolution HEALPix all-sky surveys) and permits superimposing tabular data (VOTable) and footprints (STC-S). Aladin Lite is powered by HTML5 canvas technology, is easily embeddable in any web page, and can also be controlled through a JavaScript API.

  15. Hubble Space Telescope: Wide field and planetary camera instrument handbook. Version 2.1

    NASA Technical Reports Server (NTRS)

    Griffiths, Richard (Editor)

    1990-01-01

    An overview is presented of the development and construction of the Wide Field and Planetary Camera (WF/PC). The WF/PC is a dual two-dimensional spectrophotometer with rudimentary polarimetric and transmission-grating capabilities. The instrument operates from 1150 to 11000 A with a resolution of 0.1 arcsec per pixel or 0.043 arcsec per pixel. Data products and standard calibration methods are briefly summarized.

  16. A measurement of global event shape distributions in the hadronic decays of the Z 0

    NASA Astrophysics Data System (ADS)

    Akrawy, M. Z.; Alexander, G.; Allison, J.; Allport, P. P.; Anderson, K. J.; Armitage, J. C.; Arnison, G. T. J.; Ashton, P.; Azuelos, G.; Baines, J. T. M.; Ball, A. H.; Banks, J.; Barker, G. J.; Barlow, R. J.; Batley, J. R.; Becker, J.; Behnke, T.; Bell, K. W.; Bella, G.; Bethke, S.; Biebel, O.; Binder, U.; Bloodworth, L. J.; Bock, P.; Breuker, H.; Brown, R. M.; Brun, R.; Buijs, A.; Burckhart, H. J.; Capiluppi, P.; Carnegie, R. K.; Carter, A. A.; Carter, J. R.; Chang, C. Y.; Charlton, D. G.; Chrin, J. T. M.; Cohen, I.; Collins, W. J.; Conboy, J. E.; Couch, M.; Coupland, M.; Cuffiani, M.; Dado, S.; Dallavalle, G. M.; Debu, P.; Deninno, M. M.; Dieckmann, A.; Dittmar, M.; Dixit, M. S.; Duchovni, E.; Duerdoth, I. P.; Dumas, D.; El Mamouni, H.; Elcombe, P. A.; Estabrooks, P. G.; Etzion, E.; Fabbri, F.; Farthouat, P.; Fischer, H. M.; Fong, D. G.; French, M. T.; Fukunaga, C.; Gaidot, A.; Ganel, O.; Gary, J. W.; Gascon, J.; Geddes, N. I.; Gee, C. N. P.; Geich-Gimbel, C.; Gensler, S. W.; Gentit, F. X.; Giacomelli, G.; Gibson, V.; Gibson, W. R.; Gillies, J. D.; Goldberg, J.; Goodrick, M. J.; Gorn, W.; Granite, D.; Gross, E.; Grosse-Wiesmann, P.; Grunhaus, J.; Hagedorn, H.; Hagemann, J.; Hansroul, M.; Hargrove, C. K.; Hart, J.; Hattersley, P. M.; Hauschild, M.; Hawkes, C. M.; Heflin, E.; Hemingway, R. J.; Heuer, R. D.; Hill, J. C.; Hillier, S. J.; Ho, C.; Hobbs, J. D.; Hobson, P. R.; Hochman, D.; Holl, B.; Homer, R. J.; Hou, S. R.; Howarth, C. P.; Hughes-Jones, R. E.; Igo-Kemenes, P.; Ihssen, H.; Imrie, D. C.; Jawahery, A.; Jeffreys, P. W.; Jeremie, H.; Jimack, M.; Jobes, M.; Jones, R. W. L.; Jovanovic, P.; Karlen, D.; Kawagoe, K.; Kawamoto, T.; Kellogg, R. G.; Kennedy, B. W.; Kleinwort, C.; Klem, D. E.; Knop, G.; Kobayashi, T.; Kokott, T. P.; Köpke, L.; Kowalewski, R.; Kreutzmann, H.; von Krogh, J.; Kroll, J.; Kuwano, M.; Kyberd, P.; Lafferty, G. D.; Lamarche, F.; Larson, W. J.; Lasota, M. M. B.; Layter, J. G.; Le Du, P.; Leblanc, P.; Lee, A. 
M.; Lellouch, D.; Lennert, P.; Lessard, L.; Levinson, L.; Lloyd, S. L.; Loebinger, F. K.; Lorah, J. M.; Lorazo, B.; Losty, M. J.; Ludwig, J.; Lupu, N.; Ma, J.; MacBeth, A. A.; Mannelli, M.; Marcellini, S.; Maringer, G.; Martin, A. J.; Martin, J. P.; Mashimo, T.; Mättig, P.; Maur, U.; McMahon, T. J.; McPherson, A. C.; Meijers, F.; Menszner, D.; Merritt, F. S.; Mes, H.; Michelini, A.; Middleton, R. P.; Mikenberg, G.; Miller, D. J.; Milstene, C.; Minowa, M.; Mohr, W.; Montanari, A.; Mori, T.; Moss, M. W.; Murphy, P. G.; Murray, W. J.; Nellen, B.; Nguyen, H. H.; Nozaki, M.; O'Dowd, A. J. P.; O'Neale, S. W.; O'Neill, B. P.; Oakham, F. G.; Odorici, F.; Ogg, M.; Oh, H.; Oreglia, M. J.; Orito, S.; Pansart, J. P.; Patrick, G. N.; Pawley, S. J.; Pfister, P.; Pilcher, J. E.; Pinfold, J. L.; Plane, D. E.; Poli, B.; Pouladdej, A.; Pritchard, P. W.; Quast, G.; Raab, J.; Redmond, M. W.; Rees, D. L.; Regimbald, M.; Riles, K.; Roach, C. M.; Robins, S. A.; Rollnik, A.; Roney, J. M.; Rossberg, S.; Rossi, A. M.; Routenburg, P.; Runge, K.; Runolfsson, O.; Sanghera, S.; Sansum, R. A.; Sasaki, M.; Saunders, B. J.; Schaile, A. D.; Schaile, O.; Schappert, W.; Scharff-Hansen, P.; von der Schmitt, H.; Schreiber, S.; Schwarz, J.; Shapira, A.; Shen, B. C.; Sherwood, P.; Simon, A.; Siroli, G. P.; Skuja, A.; Smith, A. M.; Smith, T. J.; Snow, G. A.; Spreadbury, E. J.; Springer, R. W.; Sproston, M.; Stephens, K.; Stier, H. E.; Ströhmer, R.; Strom, D.; Takeda, H.; Takeshita, T.; Tsukamoto, T.; Turner, M. F.; Tysarczyk-Niemeyer, G.; van den Plas, D.; Vandalen, G. J.; Vasseur, G.; Virtue, C. J.; Wagner, A.; Wahl, C.; Ward, C. P.; Ward, D. R.; Waterhouse, J.; Watkins, P. M.; Watson, A. T.; Watson, N. K.; Weber, M.; Weisz, S.; Wermes, N.; Weymann, M.; Wilson, G. W.; Wilson, J. A.; Wingerter, I.; Winterer, V.-H.; Wood, N. C.; Wotton, S.; Wuensch, B.; Wyatt, T. R.; Yaari, R.; Yang, Y.; Yekutieli, G.; Yoshida, T.; Zeuner, W.; Zorn, G. T.

    1990-12-01

    We present measurements of global event shape distributions in the hadronic decays of the Z 0. The data sample, corresponding to an integrated luminosity of about 1.3 pb-1, was collected with the OPAL detector at LEP. Most of the experimental distributions we present are unfolded for the finite acceptance and resolution of the OPAL detector. Through comparison with our unfolded data, we tune the parameter values of several Monte Carlo computer programs which simulate perturbative QCD and the hadronization of partons. Jetset version 7.2, Herwig version 3.4 and Ariadne version 3.1 all provide good descriptions of the experimental distributions. In addition, they describe lower-energy data with the parameter values adjusted at the Z 0 energy. A complete second-order matrix element Monte Carlo program with a modified perturbation scale is also compared to our 91 GeV data and its parameter values are adjusted. We obtained an unfolded value for the mean charged multiplicity of 21.28±0.04±0.84, where the first error is statistical and the second is systematic.
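The quoted ±0.04 (statistical) and ±0.84 (systematic) uncertainties combine, under the usual assumption of independent error sources (standard practice, not a step stated in the abstract), in quadrature:

```python
import math

# Total uncertainty on the mean charged multiplicity, errors combined
# in quadrature assuming independence.
stat, syst = 0.04, 0.84
total = math.hypot(stat, syst)  # sqrt(stat**2 + syst**2)
print(round(total, 2))          # the systematic error dominates
```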

  17. Release of the World Digital Magnetic Anomaly Map version 2 (WDMAM v2) scheduled

    NASA Astrophysics Data System (ADS)

    Dyment, Jérôme; Lesur, Vincent; Choi, Yujin; Hamoudi, Mohamed; Thébault, Erwan; Catalan, Manuel

    2015-04-01

    The World Digital Magnetic Anomaly Map is an international initiative carried out under the auspices of the International Association of Geomagnetism and Aeronomy (IAGA) and the Commission for the Geological Map of the World (CGMW). A first version of the map was published and distributed eight years ago (WDMAM v1; Korhonen et al., 2007). After a call for an improved second version of the map in 2011, the slow process of data compilation, map preparation, evaluation and finalization is near completion, and the WDMAM v2 will be released at the International Union of Geodesy and Geophysics (IUGG) meeting to be held in Prague in June-July 2015. In this presentation we display several shortcomings of the WDMAM v1, on both continental and oceanic areas, that are hopefully alleviated in the WDMAM v2, and discuss the process leading to the new map. We reiterate a long-standing call for aeromagnetic and marine magnetic data contributions, and explore future directions to pursue the effort toward a more complete, higher-resolution magnetic anomaly map of the World.

  18. Capabilities of current wildfire models when simulating topographical flow

    NASA Astrophysics Data System (ADS)

    Kochanski, A.; Jenkins, M.; Krueger, S. K.; McDermott, R.; Mell, W.

    2009-12-01

    Accurate predictions of the growth, spread and suppression of wild fires rely heavily on the correct prediction of the local wind conditions and the interactions between the fire and the local ambient airflow. Resolving local flows, often strongly affected by topographical features like hills, canyons and ridges, is a prerequisite for accurate simulation and prediction of fire behaviors. In this study, we present the results of high-resolution numerical simulations of the flow over a smooth hill, performed using (1) the NIST WFDS (WUI or Wildland-Urban-Interface version of the FDS or Fire Dynamic Simulator), and (2) the LES version of the NCAR Weather Research and Forecasting (WRF-LES) model. The WFDS model is in the initial stages of development for application to wind flow and fire spread over complex terrain. The focus of the talk is to assess how well simple topographical flow is represented by WRF-LES and the current version of WFDS. If sufficient progress has been made prior to the meeting then the importance of the discrepancies between the predicted and measured winds, in terms of simulated fire behavior, will be examined.

  19. Spectra of late type dwarf stars of known abundance for stellar population models

    NASA Technical Reports Server (NTRS)

    Oconnell, R. W.

    1990-01-01

    The project consisted of two parts. The first was to obtain new low-dispersion, long-wavelength, high S/N IUE spectra of F-G-K dwarf stars with previously determined abundances, temperatures, and gravities. To insure high quality, the spectra are either trailed, or multiple exposures are taken within the large aperture. Second, the spectra are assembled into a library which combines the new data with existing IUE Archive data to yield mean spectral energy distributions for each important type of star. My principal responsibility is the construction and maintenance of this UV spectral library. It covers the spectral range 1200-3200A and is maintained in two parts: a version including complete wavelength coverage at the full spectral resolution of the Low Resolution cameras; and a selected bandpass version, consisting of the mean flux in pre-selected 20A bands. These bands are centered on spectral features or continuum regions of special utility - e.g. the C IV lambda 1550 or Mg II lambda 2800 feature. In the middle-UV region, special emphasis is given to those features (including continuum 'breaks') which are most useful in the study of F-G-K star spectra in the integrated light of old stellar populations.
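The 20 A band-averaging step described above can be sketched as follows; this is a toy illustration with a flat test spectrum and an assumed 2 A sampling grid, not the library's actual pipeline:

```python
import numpy as np

def band_mean_flux(wave, flux, center_a, width_a=20.0):
    """Mean flux inside a band of width_a Angstroms centered on center_a."""
    mask = np.abs(wave - center_a) <= width_a / 2
    return float(flux[mask].mean())

wave = np.arange(1200.0, 3200.0, 2.0)      # hypothetical 2 A sampling grid
flux = np.ones_like(wave)                  # flat test spectrum
print(band_mean_flux(wave, flux, 2800.0))  # band centered on Mg II lambda 2800
```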

  20. High-resolution genetic maps of Eucalyptus improve Eucalyptus grandis genome assembly.

    PubMed

    Bartholomé, Jérôme; Mandrou, Eric; Mabiala, André; Jenkins, Jerry; Nabihoudine, Ibouniyamine; Klopp, Christophe; Schmutz, Jeremy; Plomion, Christophe; Gion, Jean-Marc

    2015-06-01

    Genetic maps are key tools in genetic research as they constitute the framework for many applications, such as quantitative trait locus analysis, and support the assembly of genome sequences. The resequencing of the two parents of a cross between Eucalyptus urophylla and Eucalyptus grandis was used to design a single nucleotide polymorphism (SNP) array of 6000 markers evenly distributed along the E. grandis genome. The genotyping of 1025 offspring enabled the construction of two high-resolution genetic maps containing 1832 and 1773 markers, with an average marker interval of 0.45 and 0.5 cM for E. grandis and E. urophylla, respectively. The comparison between the genetic maps and the reference genome highlighted that 85% of the regions were collinear. A total of 43 noncollinear regions and 13 nonsyntenic regions were detected and corrected in the new genome assembly. This improved version contains 4943 scaffolds totalling 691.3 Mb, of which 88.6% were captured by the 11 chromosomes. The mapping data were also used to investigate the effect of population size and number of markers on linkage mapping accuracy. This study provides the most reliable linkage maps for Eucalyptus and version 2.0 of the E. grandis genome. © 2014 CIRAD. New Phytologist © 2014 New Phytologist Trust.
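The quoted average marker interval is simply map length divided by the number of adjacent-marker intervals; a sketch using a hypothetical total map length (the abstract does not report one):

```python
def average_marker_interval_cm(map_length_cm: float, n_markers: int) -> float:
    """Mean spacing between adjacent markers on a linkage map."""
    return map_length_cm / (n_markers - 1)

# A hypothetical ~824 cM map with the 1832 markers cited above
# reproduces the ~0.45 cM average interval.
print(round(average_marker_interval_cm(824.0, 1832), 2))
```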

  1. Cosmic Pressure Fronts Mapped by Chandra

    NASA Astrophysics Data System (ADS)

    2000-03-01

    A colossal cosmic "weather system" produced by the collision of two giant clusters of galaxies has been imaged by NASA's Chandra X-ray Observatory. For the first time, the pressure fronts in the system can be traced in detail, and they show a bright, but relatively cool 50 million degree Celsius central region embedded in a large elongated cloud of 70 million degree Celsius gas, all of which is roiling in a faint "atmosphere" of 100 million degree Celsius gas. "We can compare this to an intergalactic cold front," said Maxim Markevitch of the Harvard-Smithsonian Center for Astrophysics, Cambridge, Mass. and leader of the international team involved in the analysis of the observations. "A major difference is that in this case, cold means 70 million degree Celsius." The gas clouds are in the core of a galaxy cluster known as Abell 2142. The cluster is six million light years across and contains hundreds of galaxies and enough gas to make a thousand more. It is one of the most massive objects in the universe. Galaxy clusters grow to vast sizes as smaller clusters are pulled inward under the influence of gravity. They collide and merge over the course of billions of years, releasing tremendous amounts of energy that heats the cluster gas to 100 million degrees Celsius. The Chandra data provide the first detailed look at the late stages of this merger process. Previously, scientists had used the German-US Roentgensatellite to produce a broad brush picture of the cluster. The elongated shape of the bright cloud suggested that two clouds were in the process of coalescing into one, but the details remained unclear. Chandra is able to measure variations of temperature, density, and pressure with unprecedented resolution. "Now we can begin to understand the physics of these mergers, which are among the most energetic events in the universe," said Markevitch. 
"The pressure and density maps of the cluster show a sharp boundary that can only exist in the moving environment of a merger." With this information scientists can make a comparison with computer simulations of cosmic mergers. This comparison, which is in the early stages, shows that this merger has progressed to an advanced stage. Strong shock waves predicted by the theory for the initial collision of clusters are not observed. It appears likely that these sub-clusters have collided two or three times in a billion years or more, and have nearly completed their merger. The observations were made on August 20, 1999 using the Advanced CCD Imaging Spectrometer (ACIS). The team involved scientists from Harvard-Smithsonian; the Massachusetts Institute of Technology, Cambridge; NASA's Marshall Space Flight Center, Huntsville, Ala.; the University of Hawaii, Honolulu; the University of Birmingham, U.K.; the University of Wollongong, Australia; the Space Research Organization Netherlands; the University of Rome, Italy; and the Russian Academy of Sciences. The results will be published in an upcoming issue of the Astrophysical Journal. The ACIS instrument was built for NASA by the Massachusetts Institute of Technology, Cambridge, and Pennsylvania State University, University Park. NASA's Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program. TRW, Inc., Redondo Beach, Calif., is the prime contractor for the spacecraft. The Smithsonian's Chandra X-ray Center controls science and flight operations from Cambridge, Mass. For images connected to this release, and to follow Chandra's progress, visit the Chandra site at: http://chandra.harvard.edu/photo/2000/a2142/index.html AND http://chandra.nasa.gov High resolution digital versions of the X-ray image (JPG, 300 dpi TIFF) are available at the Internet sites listed above. This image will be available on NASA Video File which airs at noon, 3:00 p.m., 6:00 p.m., 9:00 p.m. and midnight Eastern Time. 
NASA Television is available on GE-2, transponder 9C at 85 degrees West longitude, with vertical polarization. Frequency is on 3880.0 megahertz, with audio on 6.8 megahertz.

  2. Geoinformation web-system for processing and visualization of large archives of geo-referenced data

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Okladnikov, I. G.; Titov, A. G.; Shulgina, T. M.

    2010-12-01

    A working model of an information-computational system aimed at scientific research in the area of climate change is presented. The system will allow processing and analysis of large archives of geophysical data obtained both from observations and modeling. Accumulated experience in developing information-computational web-systems providing computational processing and visualization of large archives of geo-referenced data was used during the implementation (Gordov et al, 2007; Okladnikov et al, 2008; Titov et al, 2009). The functional capabilities of the system comprise a set of procedures for mathematical and statistical analysis, processing and visualization of data. At present, five archives of data are available for processing: the 1st and 2nd editions of the NCEP/NCAR Reanalysis, the ECMWF ERA-40 Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, and the NOAA-CIRES XX Century Global Reanalysis Version I. To provide data processing functionality, a computational modular kernel and a class library providing data access for computational modules were developed. Currently, a set of computational modules for the climate change indices approved by WMO is available. A special module providing visualization of results and export to Encapsulated PostScript, GeoTIFF and ESRI shape files was also developed. As the technological basis for representation of cartographical information on the Internet, the GeoServer software conforming to OpenGIS standards is used. GIS functionality has been integrated with web-portal software to provide a basis for the web portal's development as a part of the geoinformation web-system. Such a geoinformation web-system is a next step in the development of applied information-telecommunication systems, offering specialists from various scientific fields unique opportunities to perform reliable analysis of heterogeneous geophysical data using approved computational algorithms. 
It will allow a wide range of researchers to work with geophysical data without specific programming knowledge and to concentrate on solving their specific tasks. The system would be of special importance for education in climate change domain. This work is partially supported by RFBR grant #10-07-00547, SB RAS Basic Program Projects 4.31.1.5 and 4.31.2.7, SB RAS Integration Projects 4 and 9.
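For context on the GeoTIFF export mentioned above: the georeferencing a GeoTIFF carries can be expressed as a six-term affine geotransform in the GDAL convention, mapping pixel indices to map coordinates. This is a generic illustration of that convention, not code from the described system; the grid origin and spacing are made-up example values:

```python
# GDAL-style geotransform: (top-left x, pixel width, row rotation,
#                           top-left y, column rotation, negative pixel height)
def geotransform(origin_x, origin_y, pixel_w, pixel_h):
    return (origin_x, pixel_w, 0.0, origin_y, 0.0, -pixel_h)

def pixel_to_map(gt, col, row):
    """Map pixel (col, row) to map coordinates via the affine transform."""
    x = gt[0] + col * gt[1] + row * gt[2]
    y = gt[3] + col * gt[4] + row * gt[5]
    return x, y

# Hypothetical reanalysis-style grid: origin 60E, 75N; 2.5 x 2.0 degree cells.
gt = geotransform(60.0, 75.0, 2.5, 2.0)
print(pixel_to_map(gt, 10, 4))  # 10 columns east, 4 rows south of the origin
```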

  3. Comparison of the progressive resolution optimizer and photon optimizer in VMAT optimization for stereotactic treatments.

    PubMed

    Liu, Han; Sintay, Benjamin; Pearman, Keith; Shang, Qingyang; Hayes, Lane; Maurer, Jacqueline; Vanderstraeten, Caroline; Wiant, David

    2018-05-20

    The photon optimization (PO) algorithm was recently released by Varian Medical Systems to improve volumetric modulated arc therapy (VMAT) optimization within Eclipse (Version 13.5). The purpose of this study is to compare the PO algorithm with its predecessor, the progressive resolution optimizer (PRO), for lung SBRT and brain SRS treatments. A total of 30 patients were selected retrospectively. Previously, all the plans were generated with the PRO algorithm within Eclipse Version 13.6. In the new version of the PO algorithm (Version 15), dynamic conformal arcs (DCA) were first conformed to the target, then VMAT inverse planning was performed to achieve the desired dose distributions. PTV coverages were forced to be identical for the same patient for a fair comparison. SBRT plan quality was assessed based on selected dose-volume parameters, including the conformity index, V20 for lung, V30Gy for chest wall, and D0.035cc for other critical organs. SRS plan quality was evaluated based on the conformity index and normal tissue volumes encompassed by the 12 and 6 Gy isodose lines (V12 and V6). The modulation complexity score (MCS) was used to compare the plan complexity of the two algorithms. No statistically significant differences between the PRO and PO algorithms were found for any of the dosimetric parameters studied, which indicates both algorithms produce comparable plan quality. Significant improvements in the gamma passing rate (increased from 97.0% to 99.2% for SBRT and 96.1% to 98.4% for SRS), MCS (average increase of 0.15 for SBRT and 0.10 for SRS), and delivery efficiency (MU reduction of 29.8% for SBRT and 28.3% for SRS) were found for the PO algorithm. MCS showed a strong correlation with the gamma passing rate, and an inverse correlation with total MUs used. The PO algorithm offers comparable plan quality to the PRO, while minimizing MLC complexity, thereby improving the delivery efficiency and accuracy. © 2018 The Authors. 
Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  4. LandEx - Fast, FOSS-Based Application for Query and Retrieval of Land Cover Patterns

    NASA Astrophysics Data System (ADS)

    Netzel, P.; Stepinski, T.

    2012-12-01

    The amount of satellite-based spatial data is continuously increasing, making the development of efficient data search tools a priority. The bulk of existing research on searching satellite-gathered data concentrates on images and is based on the concept of Content-Based Image Retrieval (CBIR); however, available solutions are not efficient and robust enough to be put to use as deployable web-based search tools. Here we report on the development of a practical, deployable tool that searches classified images rather than raw images. LandEx (Landscape Explorer) is a GeoWeb-based tool for Content-Based Pattern Retrieval (CBPR) of land cover patterns contained within the National Land Cover Dataset 2006 (NLCD2006). The USGS-developed NLCD2006 is derived from Landsat multispectral images; it covers the entire conterminous U.S. at a resolution of 30 meters/pixel and depicts 16 land cover classes. The size of NLCD2006 is about 10 Gpixels (161,000 x 100,000 pixels). LandEx is a multi-tier GeoWeb application based on Open Source Software. Its main components are GeoExt/OpenLayers (user interface), GeoServer (OGC WMS, WCS and WPS server), and GRASS (calculation engine). LandEx performs search using a query-by-example approach: the user selects a reference scene (exhibiting a chosen pattern of land cover classes) and the tool produces, in real time, a map indicating the degree of similarity between the reference pattern and all local patterns across the U.S. A scene's pattern is encapsulated by a 2D histogram of classes and sizes of single-class clumps. Pattern similarity is based on the notion of mutual information. The resultant similarity map can be viewed and navigated in a web browser, or it can be downloaded as a GeoTIFF file for more in-depth analysis. LandEx is available at http://sil.uc.edu
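The mutual-information notion underlying the similarity measure can be sketched as follows; this illustrates the quantity itself on toy joint histograms, not LandEx's actual implementation:

```python
import numpy as np

def mutual_information(joint_counts):
    """Mutual information (bits) of a joint histogram of two variables."""
    p = joint_counts / joint_counts.sum()
    px = p.sum(axis=1, keepdims=True)   # marginal of the row variable
    py = p.sum(axis=0, keepdims=True)   # marginal of the column variable
    nz = p > 0                          # skip empty bins (0 * log 0 = 0)
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

# Perfectly dependent binary variables carry 1 bit; independent ones, 0 bits.
print(mutual_information(np.array([[5, 0], [0, 5]])))     # 1.0
print(mutual_information(np.array([[25, 25], [25, 25]]))) # 0.0
```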

  5. Cartographic Production for the FLaSH Map Study: Generation of Rugosity Grids, 2008

    USGS Publications Warehouse

    Robbins, Lisa L.; Knorr, Paul O.; Hansen, Mark

    2010-01-01

    Project Summary This series of raster data is a U.S. Geological Survey (USGS) Data Series release from the Florida Shelf Habitat Project (FLaSH). This disc contains two raster images in Environmental Systems Research Institute, Inc. (ESRI) raster grid format, jpeg image format, and Geo-referenced Tagged Image File Format (GeoTIFF). Data is also provided in non-image ASCII format. Rugosity grids at two resolutions (250 m and 1000 m) were generated for West Florida shelf waters to 250 m using a custom algorithm that follows the methods of Valentine and others (2004). The Methods portion of this document describes the specific steps used to generate the raster images. Rugosity, also referred to as roughness, ruggedness, or the surface-area ratio (Riley and others, 1999; Wilson and others, 2007), is a visual and quantitative measurement of terrain complexity, a common variable in ecological habitat studies. The rugosity of an area can affect biota by influencing habitat, providing shelter from elements, determining the quantity and type of living space, influencing the type and quantity of flora, affecting predator-prey relationships by providing cover and concealment, and, as an expression of vertical relief, can influence local environmental conditions such as temperature and moisture. In the marine environment rugosity can furthermore influence current flow rate and direction, increase the residence time of water in an area through eddying and current deflection, influence local water conditions such as chemistry, turbidity, and temperature, and influence the rate and nature of sedimentary deposition. State-of-the-art computer-mapping techniques and data-processing tools were used to develop shelf-wide raster and vector data layers. 
The Florida Shelf Habitat (FLaSH) Mapping Project (http://coastal.er.usgs.gov/flash) endeavors to locate available data, identify data gaps, synthesize existing information, and expand our understanding of geologic processes in our dynamic coastal and marine systems.
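As a simplified illustration of the surface-area-ratio idea described above (a slope-based approximation, not the Valentine and others (2004) algorithm used to generate the published grids):

```python
import numpy as np

def rugosity(dem, cell_size):
    """Mean per-cell surface-area ratio, sqrt(1 + slope_x^2 + slope_y^2).
    A flat surface gives exactly 1.0; rougher terrain gives larger values."""
    dzdy, dzdx = np.gradient(dem, cell_size)
    return float(np.sqrt(1.0 + dzdx ** 2 + dzdy ** 2).mean())

flat = np.zeros((4, 4))
ramp = np.fromfunction(lambda i, j: 3.0 * j, (4, 4))  # uniform slope of 3
print(rugosity(flat, 1.0))  # 1.0
print(rugosity(ramp, 1.0))  # sqrt(10) ~ 3.162
```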

  6. Optimal frequency domain textural edge detection filter

    NASA Technical Reports Server (NTRS)

    Townsend, J. K.; Shanmugan, K. S.; Frost, V. S.

    1985-01-01

    An optimal frequency domain textural edge detection filter is developed and its performance evaluated. For the given model and filter bandwidth, the filter maximizes the amount of output image energy placed within a specified resolution interval centered on the textural edge. Filter derivation is based on relating textural edge detection to tonal edge detection via the complex low-pass equivalent representation of narrowband bandpass signals and systems. The filter is specified in terms of the prolate spheroidal wave functions translated in frequency. Performance is evaluated using the asymptotic approximation version of the filter. This evaluation demonstrates satisfactory filter performance for ideal and nonideal textures. In addition, the filter can be adjusted to detect textural edges in noisy images at the expense of edge resolution.
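The construction "prolate spheroidal wave functions translated in frequency" can be sketched in discrete form with Slepian (DPSS) sequences; the sequence length, time-bandwidth product, and center frequency below are illustrative assumptions, and SciPy's `dpss` stands in for the continuous prolate functions of the paper:

```python
import numpy as np
from scipy.signal.windows import dpss

M, NW, f0 = 64, 2.5, 0.25           # length, time-bandwidth product, band center
w = dpss(M, NW)                     # lowpass prototype: maximal energy concentration
n = np.arange(M)
h = w * np.cos(2 * np.pi * f0 * n)  # translate the passband to f0 cycles/sample
H = np.abs(np.fft.rfft(h, 1024))
print(np.argmax(H) / 1024)          # peak response lands near f0
```

Modulating the maximally concentrated lowpass sequence places the filter's energy around the chosen texture frequency, which is the sense in which the paper's filter concentrates output energy near the textural edge.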

  7. Sources and pathways of the upscale effects on the Southern Hemisphere jet in MPAS-CAM4 variable-resolution simulations

    DOE PAGES

    Sakaguchi, Koichi; Lu, Jian; Leung, L. Ruby; ...

    2016-10-22

    Impacts of regional grid refinement on large-scale circulations (“upscale effects”) were detected in a previous study that used the Model for Prediction Across Scales-Atmosphere coupled to the physics parameterizations of the Community Atmosphere Model version 4. The strongest upscale effect was identified in the Southern Hemisphere jet during austral winter. This study examines the detailed underlying processes by comparing two simulations at quasi-uniform resolutions of 30 and 120 km to three variable-resolution simulations in which the horizontal grids are regionally refined to 30 km in North America, South America, or Asia from 120 km elsewhere. In all the variable-resolution simulations, precipitation increases in convective areas inside the high-resolution domains, as in the reference quasi-uniform high-resolution simulation. With grid refinement encompassing the tropical Americas, the increased condensational heating expands the local divergent circulations (Hadley cell) meridionally such that their descending branch is shifted poleward, which also pushes the baroclinically unstable regions, momentum flux convergence, and the eddy-driven jet poleward. This teleconnection pathway is not found in the reference high-resolution simulation due to a strong resolution sensitivity of cloud radiative forcing that dominates the aforementioned teleconnection signals. The regional refinement over Asia enhances Rossby wave sources and strengthens the upper level southerly flow, both facilitating the cross-equatorial propagation of stationary waves. Evidence indicates that this teleconnection pathway is also found in the reference high-resolution simulation. Lastly, the result underlines the intricate diagnoses needed to understand the upscale effects in global variable-resolution simulations, with implications for science investigations using the computationally efficient modeling framework.

  8. Sources and pathways of the upscale effects on the Southern Hemisphere jet in MPAS-CAM4 variable-resolution simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakaguchi, Koichi; Lu, Jian; Leung, L. Ruby

    Impacts of regional grid refinement on large-scale circulations (“upscale effects”) were detected in a previous study that used the Model for Prediction Across Scales-Atmosphere coupled to the physics parameterizations of the Community Atmosphere Model version 4. The strongest upscale effect was identified in the Southern Hemisphere jet during austral winter. This study examines the detailed underlying processes by comparing two simulations at quasi-uniform resolutions of 30 and 120 km to three variable-resolution simulations in which the horizontal grids are regionally refined to 30 km in North America, South America, or Asia from 120 km elsewhere. In all the variable-resolution simulations, precipitation increases in convective areas inside the high-resolution domains, as in the reference quasi-uniform high-resolution simulation. With grid refinement encompassing the tropical Americas, the increased condensational heating expands the local divergent circulations (Hadley cell) meridionally such that their descending branch is shifted poleward, which also pushes the baroclinically unstable regions, momentum flux convergence, and the eddy-driven jet poleward. This teleconnection pathway is not found in the reference high-resolution simulation due to a strong resolution sensitivity of cloud radiative forcing that dominates the aforementioned teleconnection signals. The regional refinement over Asia enhances Rossby wave sources and strengthens the upper level southerly flow, both facilitating the cross-equatorial propagation of stationary waves. Evidence indicates that this teleconnection pathway is also found in the reference high-resolution simulation. Lastly, the result underlines the intricate diagnoses needed to understand the upscale effects in global variable-resolution simulations, with implications for science investigations using the computationally efficient modeling framework.

  9. Process-level improvements in CMIP5 models and their impact on tropical variability, the Southern Ocean, and monsoons

    NASA Astrophysics Data System (ADS)

    Lauer, Axel; Jones, Colin; Eyring, Veronika; Evaldsson, Martin; Hagemann, Stefan; Mäkelä, Jarmo; Martin, Gill; Roehrig, Romain; Wang, Shiyu

    2018-01-01

    The performance of updated versions of the four earth system models (ESMs) CNRM, EC-Earth, HadGEM, and MPI-ESM is assessed in comparison to their predecessor versions used in Phase 5 of the Coupled Model Intercomparison Project. The Earth System Model Evaluation Tool (ESMValTool) is applied to evaluate selected climate phenomena in the models against observations. This is the first systematic application of the ESMValTool to assess and document the progress made during an extensive model development and improvement project. This study focuses on the South Asian monsoon (SAM) and the West African monsoon (WAM), the coupled equatorial climate, and Southern Ocean clouds and radiation, which are known to exhibit systematic biases in present-day ESMs. The analysis shows that the tropical precipitation in three out of four models is clearly improved. Two of three updated coupled models show an improved representation of tropical sea surface temperatures with one coupled model not exhibiting a double Intertropical Convergence Zone (ITCZ). Simulated cloud amounts and cloud-radiation interactions are improved over the Southern Ocean. Improvements are also seen in the simulation of the SAM and WAM, although systematic biases remain in regional details and the timing of monsoon rainfall. Analysis of simulations with EC-Earth at different horizontal resolutions from T159 up to T1279 shows that the synoptic-scale variability in precipitation over the SAM and WAM regions improves with higher model resolution. The results suggest that the reasonably good agreement of modeled and observed mean WAM and SAM rainfall in lower-resolution models may be a result of unrealistic intensity distributions.

  10. MODTRAN3: Suitability as a flux-divergence code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, G.P.; Chetwynd, J.H.; Wang, J.

    1995-04-01

    The Moderate Resolution Atmospheric Radiance and Transmittance Model (MODTRAN3) is the developmental version of MODTRAN and MODTRAN2. The Geophysics Directorate, Phillips Laboratory, released a beta version of this model in October 1994. It encompasses all the capabilities of LOWTRAN7, the historic 20 cm⁻¹ resolution (full width at half maximum, FWHM) radiance code, but incorporates a much more sensitive molecular band model with 2 cm⁻¹ resolution. The band model is based directly upon the HITRAN spectral parameters, including both temperature and pressure (line shape) dependencies. Validation against full Voigt line-by-line calculations (e.g., FASCODE) has shown excellent agreement. In addition, simple timing runs demonstrate potential improvement of more than a factor of 100 for a typical 500 cm⁻¹ spectral interval and comparable vertical layering. Not only is MODTRAN an excellent band model for "full path" calculations (that is, radiance and/or transmittance from point A to point B), but it replicates layer-specific quantities to a very high degree of accuracy. Such layer quantities, derived from ratios and differences of longer path MODTRAN calculations from point A to adjacent layer boundaries, can be used to provide inversion algorithm weighting functions or similarly formulated quantities. One of the most exciting new applications is the rapid calculation of reliable IR cooling rates, including species, altitude, and spectral distinctions, as well as the standard spectrally integrated quantities. Comparisons with prior line-by-line cooling rate calculations are excellent, and the techniques can be extended to incorporate global climatologies of both standard and trace atmospheric species.

  11. Moderate Resolution Imaging Spectroradiometer (MODIS) Overview

    USGS Publications Warehouse


    2008-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) is an instrument that collects remotely sensed data used by scientists for monitoring, modeling, and assessing the effects of natural processes and human actions on the Earth's surface. The continual calibration of the MODIS instruments, the refinement of algorithms used to create higher-level products, and the ongoing product validation make MODIS images a valuable time series (2000-present) of geophysical and biophysical land-surface measurements. Carried on two National Aeronautics and Space Administration (NASA) Earth Observing System (EOS) satellites, MODIS acquires morning (EOS-Terra) and afternoon (EOS-Aqua) views almost daily. Terra data acquisitions began in February 2000 and Aqua data acquisitions began in July 2002. Land data are generated only as higher-level products, removing the burden of common types of data processing from the user community. MODIS-based products describing ecological dynamics, radiation budget, and land cover are projected onto a sinusoidal mapping grid and distributed as 10- by 10-degree tiles at 250-, 500-, or 1,000-meter spatial resolution. Some products are also created on a 0.05-degree geographic grid to support climate modeling studies. All MODIS products are distributed in the Hierarchical Data Format-Earth Observing System (HDF-EOS) file format and are available through file transfer protocol (FTP) or on digital video disc (DVD) media. Versions 4 and 5 of MODIS land data products are currently available and represent 'validated' collections defined in stages of accuracy that are based on the number of field sites and time periods for which the products have been validated. Version 5 collections incorporate the longest time series of both Terra and Aqua MODIS data products.
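
    The 10- by 10-degree sinusoidal tiling mentioned above can be illustrated by mapping a latitude/longitude to its tile indices. This is a sketch against commonly documented MODIS grid constants; verify the exact values against the MODIS land product grid documentation before relying on them.

```python
import math

# MODIS sinusoidal grid constants: sphere radius and 10-degree tile size
# in metres. Treat the exact figures as assumptions to confirm against
# the official grid documentation.
R = 6371007.181
TILE = 1111950.5196666666   # 10 degrees of latitude on this sphere
H_TILES, V_TILES = 36, 18

def modis_tile(lat, lon):
    """Return (h, v) tile indices for a lat/lon in degrees."""
    x = R * math.radians(lon) * math.cos(math.radians(lat))  # sinusoidal x
    y = R * math.radians(lat)                                # sinusoidal y
    h = int((x + H_TILES / 2 * TILE) // TILE)   # columns count from the west
    v = int((V_TILES / 2 * TILE - y) // TILE)   # rows count from the north
    return h, v

print(modis_tile(5.0, 20.0))   # → (19, 8)
```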

  12. Reexamination of group velocities of structured light pulses

    NASA Astrophysics Data System (ADS)

    Saari, Peeter

    2018-06-01

    Recently, a series of theoretical and experimental papers on free-space propagation of pulsed Laguerre-Gaussian and Bessel beams was published, which reached contradictory and controversial results about group velocities of such pulses. Depending on the measurement scheme, the group velocity can be defined differently. We analyze how different versions of group velocity are related to the measurable travel time (time of flight) of the pulse between input (source) and output (detecting) planes. The analysis is tested on a theoretical model—the Bessel-Gauss pulse whose propagation path exhibits both subluminal and superluminal regions. Our main conclusion from resolving the contradictions in the literature is that different versions of group velocity are appropriate, depending on whether or not the beam is hollow and how the pulse is recorded in the output plane—integrally or with spatial resolution.
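
    Why different "versions" of group velocity arise for such beams can be seen by evaluating dω/dk_z under two different constraints on the beam geometry. The wavenumber and angle values below are illustrative assumptions, not figures from the paper.

```python
import numpy as np

c = 299_792_458.0  # vacuum speed of light, m/s

omega = np.linspace(2.0e15, 2.2e15, 2001)   # angular frequency, rad/s

# (a) Fixed radial wavenumber k_r: k_z = sqrt((omega/c)^2 - k_r^2),
# so the axial group velocity d(omega)/dk_z is subluminal.
k_r = 2.0e6                                  # radial wavenumber, 1/m (assumed)
kz_fixed_kr = np.sqrt((omega / c) ** 2 - k_r ** 2)
vg_fixed_kr = np.gradient(omega, kz_fixed_kr)

# (b) Fixed cone angle theta (X-wave dispersion): k_z = (omega/c) cos(theta),
# giving the well-known superluminal value c / cos(theta).
theta = np.deg2rad(10.0)
kz_fixed_theta = (omega / c) * np.cos(theta)
vg_fixed_theta = np.gradient(omega, kz_fixed_theta)

print(np.all(vg_fixed_kr < c), np.all(vg_fixed_theta > c))   # True True
```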

  13. NO2 Total and Tropospheric Vertical Column Densities from OMI on EOS Aura: Update

    NASA Technical Reports Server (NTRS)

    Gleason, J.F.; Bucsela, E.J.; Celarier, E.A.; Veefkind, J.P.; Kim, S.W.; Frost, G.F.

    2009-01-01

    The Ozone Monitoring Instrument (OMI), which is on the EOS AURA satellite, retrieves vertical column densities (VCDs) of NO2, along with those of several other trace gases. The relatively high spatial resolution and daily global coverage of the instrument make it particularly well-suited to monitoring tropospheric pollution at scales on the order of 20 km. The OMI NO2 algorithm distinguishes polluted regions from background stratospheric NO2 using a separation algorithm that relies on the smoothly varying stratospheric NO2 and estimations of both stratospheric and tropospheric air mass factors (AMFs). Version 1 of OMI NO2 data has been released for public use. An overview of OMI NO2 data, some recent results and a description of the improvements for version 2 of the algorithm will be presented.
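
    The stratosphere-troposphere separation sketched in the abstract ultimately reduces to a slant-to-vertical column conversion with separate air mass factors. The numbers below are hypothetical and only illustrate the arithmetic, not the OMI algorithm's actual retrieval.

```python
# Hypothetical column amounts and air mass factors (AMFs); the relation
# itself is the standard slant-to-vertical conversion the separation
# scheme relies on.
scd_total = 9.0e15    # total slant column density, molecules/cm^2
vcd_strat = 3.0e15    # estimated stratospheric vertical column
amf_strat = 2.5       # stratospheric air mass factor
amf_trop = 1.5        # tropospheric air mass factor

scd_trop = scd_total - amf_strat * vcd_strat   # residual (tropospheric) slant column
vcd_trop = scd_trop / amf_trop                 # tropospheric vertical column
print(f"{vcd_trop:.2e}")   # 1.00e+15
```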

  14. The implementation of NEMS GFS Aerosol Component (NGAC) Version 1.0 for global dust forecasting at NOAA/NCEP

    PubMed Central

    Lu, Cheng-Hsuan; da Silva, Arlindo; Wang, Jun; Moorthi, Shrinivas; Chin, Mian; Colarco, Peter; Tang, Youhua; Bhattacharjee, Partha S.; Chen, Shen-Po; Chuang, Hui-Ya; Juang, Hann-Ming Henry; McQueen, Jeffery; Iredell, Mark

    2018-01-01

    The NOAA National Centers for Environmental Prediction (NCEP) implemented NEMS GFS Aerosol Component (NGAC) for global dust forecasting in collaboration with NASA Goddard Space Flight Center (GSFC). NGAC Version 1.0 has been providing 5 day dust forecasts at 1°×1° resolution on a global scale, once per day at 00:00 Coordinated Universal Time (UTC), since September 2012. This is the first global system capable of interactive atmosphere aerosol forecasting at NCEP. The implementation of NGAC V1.0 reflects an effective and efficient transitioning of NASA research advances to NCEP operations, paving the way for NCEP to provide global aerosol products serving a wide range of stakeholders as well as to allow the effects of aerosols on weather forecasts and climate prediction to be considered. PMID:29652411

  15. The implementation of NEMS GFS Aerosol Component (NGAC) Version 1.0 for global dust forecasting at NOAA/NCEP.

    PubMed

    Lu, Cheng-Hsuan; da Silva, Arlindo; Wang, Jun; Moorthi, Shrinivas; Chin, Mian; Colarco, Peter; Tang, Youhua; Bhattacharjee, Partha S; Chen, Shen-Po; Chuang, Hui-Ya; Juang, Hann-Ming Henry; McQueen, Jeffery; Iredell, Mark

    2016-01-01

    The NOAA National Centers for Environmental Prediction (NCEP) implemented NEMS GFS Aerosol Component (NGAC) for global dust forecasting in collaboration with NASA Goddard Space Flight Center (GSFC). NGAC Version 1.0 has been providing 5 day dust forecasts at 1°×1° resolution on a global scale, once per day at 00:00 Coordinated Universal Time (UTC), since September 2012. This is the first global system capable of interactive atmosphere aerosol forecasting at NCEP. The implementation of NGAC V1.0 reflects an effective and efficient transitioning of NASA research advances to NCEP operations, paving the way for NCEP to provide global aerosol products serving a wide range of stakeholders as well as to allow the effects of aerosols on weather forecasts and climate prediction to be considered.

  16. CALIPSO lidar calibration at 532 nm: version 4 nighttime algorithm

    NASA Astrophysics Data System (ADS)

    Kar, Jayanta; Vaughan, Mark A.; Lee, Kam-Pui; Tackett, Jason L.; Avery, Melody A.; Garnier, Anne; Getzewich, Brian J.; Hunt, William H.; Josset, Damien; Liu, Zhaoyan; Lucker, Patricia L.; Magill, Brian; Omar, Ali H.; Pelon, Jacques; Rogers, Raymond R.; Toth, Travis D.; Trepte, Charles R.; Vernier, Jean-Paul; Winker, David M.; Young, Stuart A.

    2018-03-01

    Data products from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) on board Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) were recently updated following the implementation of new (version 4) calibration algorithms for all of the Level 1 attenuated backscatter measurements. In this work we present the motivation for and the implementation of the version 4 nighttime 532 nm parallel channel calibration. The nighttime 532 nm calibration is the most fundamental calibration of CALIOP data, since all of CALIOP's other radiometric calibration procedures - i.e., the 532 nm daytime calibration and the 1064 nm calibrations during both nighttime and daytime - depend either directly or indirectly on the 532 nm nighttime calibration. The accuracy of the 532 nm nighttime calibration has been significantly improved by raising the molecular normalization altitude from 30-34 km to the upper possible signal acquisition range of 36-39 km to substantially reduce stratospheric aerosol contamination. Due to the greatly reduced molecular number density and consequently reduced signal-to-noise ratio (SNR) at these higher altitudes, the signal is now averaged over a larger number of samples using data from multiple adjacent granules. Additionally, an enhanced strategy for filtering the radiation-induced noise from high-energy particles was adopted. Further, the meteorological model used in the earlier versions has been replaced by the improved Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), model. An aerosol scattering ratio of 1.01 ± 0.01 is now explicitly used for the calibration altitude. These modifications lead to globally revised calibration coefficients which are, on average, 2-3 % lower than in previous data releases. 
Further, the new calibration procedure is shown to eliminate biases at high altitudes that were present in earlier versions and consequently leads to an improved representation of stratospheric aerosols. Validation results using airborne lidar measurements are also presented. Biases relative to collocated measurements acquired by the Langley Research Center (LaRC) airborne High Spectral Resolution Lidar (HSRL) are reduced from 3.6 % ± 2.2 % in the version 3 data set to 1.6 % ± 2.4 % in the version 4 release.
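
    The molecular normalization step described above amounts to ratioing the noise-averaged high-altitude signal against a modelled molecular backscatter profile and dividing out the assumed scattering ratio of 1.01. A synthetic sketch; all profile values are made-up stand-ins, not CALIOP data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic range-scaled signal and molecular model in the 36-39 km
# normalization region (real inputs come from Level 0 data and MERRA-2).
z = np.linspace(36.0, 39.0, 30)                 # altitude, km
beta_mol = 6e-4 * np.exp(-(z - 36.0) / 7.5)     # modelled attenuated backscatter
true_C = 4.2e4                                  # "true" calibration coefficient
signal = true_C * 1.01 * beta_mol * (1 + 0.05 * rng.standard_normal(z.size))

# Calibration coefficient: averaged measured-to-model ratio, divided by
# the assumed aerosol scattering ratio R = 1.01 at the calibration altitude.
C = np.mean(signal / beta_mol) / 1.01
print(abs(C / true_C - 1) < 0.05)   # recovered to within the added noise
```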

  17. Quasi-Global Precipitation as Depicted in the GPCPV2.2 and TMPA V7

    NASA Technical Reports Server (NTRS)

    Huffman, George J.; Bolvin, David T.; Nelkin, Eric J.; Adler, Robert F.

    2012-01-01

    After a lengthy incubation period, the year 2012 saw the release of the Global Precipitation Climatology Project (GPCP) Version 2.2 monthly dataset and the TRMM Multi-satellite Precipitation Analysis (TMPA) Version 7. One primary feature of the new data sets is that DMSP SSMIS data are now used, which entailed a great deal of development work to overcome calibration issues. In addition, the GPCP V2.2 included a slight upgrade to the gauge analysis input datasets, particularly over China, while the TMPA V7 saw more-substantial upgrades: 1) The gauge analysis record in Version 6 used the (older) GPCP monitoring product through April 2005 and the CAMS analysis thereafter, which introduced an inhomogeneity. Version 7 uses the Version 6 GPCC Full analysis, switching to the Version 4 Monitoring analysis thereafter. 2) The inhomogeneously processed AMSU record in Version 6 is uniformly processed in Version 7. 3) The TMI and SSMI input data have been upgraded to the GPROF2010 algorithm. The global-change, water cycle, and other user communities are acutely interested in how these data sets compare, as consistency between differently processed, long-term, quasi-global data sets provides some assurance that the statistics computed from them provide a good representation of the atmosphere's behavior. Within resolution differences, the two data sets agree well over land as the gauge data (which tend to dominate the land results) are the same in both. Over ocean the results differ more because the satellite products used for calibration are based on very different algorithms and the dominant input data sets are different. The time series of tropical (30 N-S) ocean average precipitation shows that the TMPA V7 follows the TMI-PR Combined Product calibrator, although running approximately 5% higher on average. The GPCP and TMPA time series are fairly consistent, although the GPCP runs approximately 10% lower than the TMPA, and has a somewhat larger interannual variation. 
In addition, the GPCP and TMPA interannual variations show an apparent phase shift, with GPCP running a few months later. Additional diagnostics will include mean maps and selected scatter plots.
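
    Comparisons like the tropical-ocean time-series ratio quoted above reduce to area-weighted (cosine of latitude) means of the two gridded fields. A synthetic sketch with a ~10% offset built in; the fields and values are illustrative, not GPCP or TMPA data.

```python
import numpy as np

def tropical_mean(field, lats):
    """Area-weighted (cos latitude) mean over 30N-30S for a lat x lon grid."""
    w = np.cos(np.deg2rad(lats)) * (np.abs(lats) <= 30.0)
    return (field * w[:, None]).sum() / (w.sum() * field.shape[1])

lats = np.arange(-89.5, 90.0, 1.0)
rng = np.random.default_rng(0)
base = 3.0 + rng.random((lats.size, 360))     # synthetic precipitation, mm/day
gpcp, tmpa = base * 0.95, base * 1.05         # one product ~10% below the other

g, t = tropical_mean(gpcp, lats), tropical_mean(tmpa, lats)
print(round(g / t, 3))   # 0.905, i.e. the first runs ~10% lower
```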

  18. Accounting for observation uncertainties in an evaluation metric of low latitude turbulent air-sea fluxes: application to the comparison of a suite of IPSL model versions

    NASA Astrophysics Data System (ADS)

    Servonnat, Jérôme; Găinuşă-Bogdan, Alina; Braconnot, Pascale

    2017-09-01

    Turbulent momentum and heat (sensible heat and latent heat) fluxes at the air-sea interface are key components of the overall energetics of the Earth's climate. The evaluation of these fluxes in climate models remains difficult because of the large uncertainties associated with the reference products. In this paper we present an objective metric accounting for reference uncertainties to evaluate the annual cycle of the low-latitude turbulent fluxes of a suite of IPSL climate models. The metric consists of a Hotelling T² test between the simulated and observed fields in a reduced space characterized by the dominant modes of variability that are common to both the model and the reference, taking into account the observational uncertainty. The test is thus more severe when uncertainties are small, as is the case for sea surface temperature (SST). The results of the test show that for almost all variables and all model versions the model-reference differences are not zero. It is not possible to distinguish between model versions for sensible heat and meridional wind stress, likely due to the large observational uncertainties. All model versions share similar biases for the different variables. There is no improvement between the reference versions of the IPSL model used for CMIP3 and CMIP5. The test also reveals that the higher horizontal resolution fails to improve the representation of the turbulent surface fluxes compared to the other versions. The representation of the fluxes is further degraded in a version with improved atmospheric physics, with an amplification of some of the biases in the Indian Ocean and in the Intertropical Convergence Zone. The ranking of the model versions for the turbulent fluxes is not correlated with the ranking found for SST. 
This highlights that, although SST gradients are important for large-scale atmospheric circulation patterns, other factors such as wind speed and air-sea temperature contrast play an important role in the representation of turbulent fluxes.
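
    A minimal sketch of the Hotelling T² test at the heart of the metric, applied to synthetic projections on a few dominant modes. This omits the paper's specific treatment of observational uncertainty in the reduced space; the data are fabricated for illustration.

```python
import numpy as np
from scipy import stats

def hotelling_t2(samples, mu0):
    """One-sample Hotelling T^2 test that the mean vector equals mu0."""
    n, p = samples.shape
    diff = samples.mean(axis=0) - mu0
    S = np.cov(samples, rowvar=False)          # sample covariance matrix
    t2 = n * diff @ np.linalg.solve(S, diff)
    f_stat = t2 * (n - p) / (p * (n - 1))      # F-distributed under H0
    return t2, stats.f.sf(f_stat, p, n - p)

# Synthetic "model minus reference" projections on three dominant modes,
# with a genuine bias on the first mode.
rng = np.random.default_rng(2)
proj = rng.normal([1.0, 0.0, 0.0], 1.0, size=(40, 3))
t2, p = hotelling_t2(proj, np.zeros(3))
print(p < 0.05)   # the model-reference difference is detectably non-zero
```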

  19. New Collections of Aura Atmospheric data Products at the GES DISC

    NASA Technical Reports Server (NTRS)

    Johnson, James; Ahmad, Suraiya; Gerasimov, Irina; Lepthoukh, Gregory

    2008-01-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is the primary archive of atmospheric composition data from the Aura Ozone Monitoring Instrument (OMI), Microwave Limb Sounder (MLS), and High-Resolution Dynamics Limb Sounder (HIRDLS) instruments. The most recent versions of Aura OMI, MLS, and HIRDLS data are available free to the public (http://disc.gsfc.nasa.gov/Aura). TES data are at the ASDC (http://eosweb.larc.nasa.gov).

  20. Pion and proton showers in the CALICE scintillator-steel analogue hadron calorimeter

    NASA Astrophysics Data System (ADS)

    Bilki, B.; Repond, J.; Xia, L.; Eigen, G.; Thomson, M. A.; Ward, D. R.; Benchekroun, D.; Hoummada, A.; Khoulaki, Y.; Chang, S.; Khan, A.; Kim, D. H.; Kong, D. J.; Oh, Y. D.; Blazey, G. C.; Dyshkant, A.; Francis, K.; Lima, J. G. R.; Salcido, R.; Zutshi, V.; Salvatore, F.; Kawagoe, K.; Miyazaki, Y.; Sudo, Y.; Suehara, T.; Tomita, T.; Ueno, H.; Yoshioka, T.; Apostolakis, J.; Dannheim, D.; Folger, G.; Ivantchenko, V.; Klempt, W.; Lucaci-Timoce, A.-I.; Ribon, A.; Schlatter, D.; Sicking, E.; Uzhinskiy, V.; Giraud, J.; Grondin, D.; Hostachy, J.-Y.; Morin, L.; Brianne, E.; Cornett, U.; David, D.; Ebrahimi, A.; Falley, G.; Gadow, K.; Göttlicher, P.; Günter, C.; Hartbrich, O.; Hermberg, B.; Karstensen, S.; Krivan, F.; Krüger, K.; Lu, S.; Lutz, B.; Morozov, S.; Morgunov, V.; Neubüser, C.; Reinecke, M.; Sefkow, F.; Smirnov, P.; Tran, H. L.; Buhmann, P.; Garutti, E.; Laurien, S.; Matysek, M.; Ramilli, M.; Briggl, K.; Eckert, P.; Harion, T.; Munwes, Y.; Schultz-Coulon, H.-Ch.; Shen, W.; Stamen, R.; Norbeck, E.; Northacker, D.; Onel, Y.; van Doren, B.; Wilson, G. W.; Wing, M.; Combaret, C.; Caponetto, L.; Eté, R.; Grenier, G.; Han, R.; Ianigro, J. C.; Kieffer, R.; Laktineh, I.; Lumb, N.; Mathez, H.; Mirabito, L.; Petrukhin, A.; Steen, A.; Berenguer Antequera, J.; Calvo Alamillo, E.; Fouz, M.-C.; Marin, J.; Puerta-Pelayo, J.; Verdugo, A.; Corriveau, F.; Bobchenko, B.; Chistov, R.; Chadeeva, M.; Danilov, M.; Drutskoy, A.; Epifantsev, A.; Markin, O.; Mironov, D.; Mizuk, R.; Novikov, E.; Rusinov, V.; Tarkovsky, E.; Besson, D.; Buzhan, P.; Ilyin, A.; Popova, E.; Gabriel, M.; Kiesling, C.; van der Kolk, N.; Simon, F.; Soldner, C.; Szalay, M.; Tesar, M.; Weuste, L.; Amjad, M. 
S.; Bonis, J.; Callier, S.; Conforti di Lorenzo, S.; Cornebise, P.; Dulucq, F.; Fleury, J.; Frisson, T.; Martin-Chassard, G.; Pöschl, R.; Raux, L.; Richard, F.; Rouëné, J.; Seguin-Moreau, N.; de la Taille, Ch.; Anduze, M.; Boudry, V.; Brient, J.-C.; Clerc, C.; Cornat, R.; Frotin, M.; Gastaldi, F.; Matthieu, A.; Mora de Freitas, P.; Musat, G.; Ruan, M.; Videau, H.; Zacek, J.; Cvach, J.; Gallus, P.; Havranek, M.; Janata, M.; Kvasnicka, J.; Lednicky, D.; Marcisovsky, M.; Polak, I.; Popule, J.; Tomasek, L.; Tomasek, M.; Sicho, P.; Smolik, J.; Vrba, V.; Zalesak, J.; Jeans, D.; Weber, S.

    2015-04-01

    Showers produced by positive hadrons in the highly granular CALICE scintillator-steel analogue hadron calorimeter were studied. The experimental data were collected at CERN and FNAL for single particles with initial momenta from 10 to 80 GeV/c. The calorimeter response and resolution, as well as the spatial characteristics of shower development, are compared between test beam data and GEANT4 version 9.6 simulations for proton- and pion-induced showers.

  1. Toward 10-km mesh global climate simulations

    NASA Astrophysics Data System (ADS)

    Ohfuchi, W.; Enomoto, T.; Takaya, K.; Yoshioka, M. K.

    2002-12-01

    An atmospheric general circulation model (AGCM) that runs very efficiently on the Earth Simulator (ES) was developed. The ES is a gigantic vector-parallel computer with a peak performance of 40 Tflops. The AGCM, named AFES (AGCM for ES), was based on version 5.4.02 of an AGCM developed jointly by the Center for Climate System Research, the University of Tokyo, and the Japanese National Institute for Environmental Sciences. AFES, however, was totally rewritten in Fortran 90 and MPI, while the original AGCM was written in Fortran 77 and not capable of parallel computing. AFES achieved 26 Tflops (about 65% of the peak performance of the ES) at a resolution of T1279L96 (10-km horizontal resolution and 500-m vertical resolution in the middle troposphere to lower stratosphere). Some results of 10- to 20-day global simulations will be presented. At this moment, only short-term simulations are possible due to data storage limitations. As tens-of-teraflops computing is achieved, petabyte data storage is necessary to conduct climate-type simulations at this super-high global resolution. Some possibilities for future research topics in global super-high-resolution climate simulations will be discussed. Target topics include mesoscale structures and self-organization of the Baiu-Meiyu front over Japan, cyclogenesis over the North Pacific, and typhoons around Japan. Improvement in local precipitation with increasing horizontal resolution will also be demonstrated.

  2. Computational imaging through a fiber-optic bundle

    NASA Astrophysics Data System (ADS)

    Lodhi, Muhammad A.; Dumas, John Paul; Pierce, Mark C.; Bajwa, Waheed U.

    2017-05-01

    Compressive sensing (CS) has proven to be a viable method for reconstructing high-resolution signals from low-resolution measurements. Integrating CS principles into an optical system allows for higher-resolution imaging using lower-resolution sensor arrays. In contrast to prior work on CS-based imaging, our focus in this paper is on imaging through fiber-optic bundles, in which manufacturing constraints limit individual fiber spacing to around 2 μm. This limitation essentially renders fiber-optic bundles low-resolution sensors with relatively few resolvable points per unit area. These fiber bundles are often used in minimally invasive medical instruments for viewing tissue at macro and microscopic levels. While the compact nature and flexibility of fiber bundles allow for excellent tissue access in vivo, imaging through fiber bundles does not provide the fine details of tissue features that are demanded in some medical situations. Our hypothesis is that adapting existing CS principles to fiber bundle-based optical systems will overcome the resolution limitation inherent in fiber-bundle imaging. In a previous paper we examined the practical challenges involved in implementing a highly parallel version of the single-pixel camera while focusing on synthetic objects. This paper extends the same architecture to fiber-bundle imaging under incoherent illumination and addresses some practical issues associated with imaging physical objects. Additionally, we model the optical non-idealities in the system to reduce modelling errors.
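
    The CS principle the paper builds on, recovering a sparse high-resolution scene from fewer, lower-resolution measurements, can be sketched with a generic sparse-recovery solver. This uses ISTA on synthetic data, not the authors' architecture; the dimensions and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# A k-sparse "scene" observed through m < n random measurements,
# the basic setting a CS-based fiber-bundle imager exploits.
n, m, k = 200, 80, 5
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0.0, 1.0, k)
A = rng.normal(0.0, 1.0 / np.sqrt(m), (m, n))   # random sensing matrix
y = A @ x_true                                  # low-dimensional measurements

# ISTA: proximal gradient descent on 0.5*||Ax - y||^2 + lam*||x||_1.
step = 1.0 / np.linalg.norm(A, 2) ** 2
lam = 1e-3
x = np.zeros(n)
for _ in range(5000):
    g = x - step * A.T @ (A @ x - y)                          # gradient step
    x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft threshold

print(f"relative reconstruction error: "
      f"{np.linalg.norm(x - x_true) / np.linalg.norm(x_true):.3f}")
```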

  3. Single Photon Counting Large Format Imaging Sensors with High Spatial and Temporal Resolution

    NASA Astrophysics Data System (ADS)

    Siegmund, O. H. W.; Ertley, C.; Vallerga, J. V.; Cremer, T.; Craven, C. A.; Lyashenko, A.; Minot, M. J.

    High-time-resolution astronomical and remote sensing applications have been addressed with microchannel-plate-based imaging, photon-time-tagging detector sealed-tube schemes. These are being realized with the advent of cross-strip readout techniques with high-performance encoding electronics and atomic layer deposited (ALD) microchannel plate technologies. Sealed-tube devices up to 20 cm square have now been successfully implemented with sub-nanosecond timing and imaging. The objective is to provide sensors with large areas (25 cm² to 400 cm²) with spatial resolutions of <20 μm FWHM and timing resolutions of <100 ps for dynamic imaging. New high-efficiency photocathodes for the visible regime are discussed, which also allow response down to below 150 nm for UV sensing. Borosilicate MCPs are providing high performance and, when processed with ALD techniques, are providing order-of-magnitude lifetime improvements and enhanced photocathode stability. New developments include UV/visible photocathodes, ALD MCPs, and high-resolution cross-strip anodes for 100 mm detectors. Tests with 50 mm format cross-strip readouts suitable for Planacon devices show spatial resolutions better than 20 μm FWHM, with good image linearity, while using low gain (~10⁶). Current cross-strip encoding electronics can accommodate event rates of >5 MHz and event timing accuracy of 100 ps. High-performance ASIC versions of these electronics are in development with better event rate, power, and mass suitable for spaceflight instruments.
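
    The way cross-strip readouts reach resolutions far below the strip pitch is by centroiding the charge footprint of each event. A toy sketch; the pitch and charge-cloud width are assumed values, not Planacon specifications.

```python
import numpy as np

def strip_centroid(charges, pitch_um=1600.0):
    """Event position (in um) from the charge footprint on a strip set.

    Cross-strip anodes divide each event's charge over several strips;
    centroiding the measured charges recovers the position to a small
    fraction of the strip pitch, which is how ~20 um FWHM resolution
    can come out of millimetre-scale strips. Pitch is an assumed value.
    """
    strips = np.arange(charges.size)
    return pitch_um * (charges @ strips) / charges.sum()

# A Gaussian charge cloud centred 40% of the way between strips 3 and 4.
x_true = 3.4
strips = np.arange(8)
charges = np.exp(-0.5 * ((strips - x_true) / 1.2) ** 2)
print(abs(strip_centroid(charges) - 1600.0 * x_true) < 40.0)  # sub-strip accuracy
```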

  4. Sparse coded image super-resolution using K-SVD trained dictionary based on regularized orthogonal matching pursuit.

    PubMed

    Sajjad, Muhammad; Mehmood, Irfan; Baik, Sung Wook

    2015-01-01

    Image super-resolution (SR) plays a vital role in medical imaging, allowing a more efficient and effective diagnosis process. Diagnosis is usually difficult and inaccurate from low-resolution (LR) and noisy images, and resolution enhancement through conventional interpolation methods strongly affects the precision of subsequent processing steps, such as segmentation and registration. Therefore, we propose an efficient sparse coded image SR reconstruction technique using a trained dictionary. We apply a simple and efficient regularized version of orthogonal matching pursuit (ROMP) to seek the coefficients of the sparse representation. ROMP has the transparency and greediness of OMP and the robustness of L1-minimization, which enhance the dictionary learning process to capture feature descriptors such as oriented edges and contours from complex images like brain MRIs. The sparse coding part of the K-SVD dictionary training procedure is modified by substituting ROMP for OMP. The dictionary update stage allows simultaneously updating an arbitrary number of atoms and vectors of sparse coefficients. In SR reconstruction, ROMP is used to determine the vector of sparse coefficients for the underlying patch. The recovered representations are then applied to the trained dictionary, and finally, an optimization leads to a high-quality, high-resolution output. Experimental results demonstrate that the super-resolution reconstruction quality of the proposed scheme is comparatively better than that of other state-of-the-art schemes.
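
    A compact sketch of the ROMP selection rule described above: per iteration, take the strongest correlations, keep a group of comparable magnitude (max ≤ 2× min), and refit the support by least squares. An orthonormal dictionary is used so the demo recovers the sparse code exactly; the real K-SVD setting learns an overcomplete dictionary from training patches.

```python
import numpy as np

def romp(D, y, sparsity):
    """Regularized Orthogonal Matching Pursuit (illustrative sketch).

    Like OMP, but each iteration selects a *group* of atoms whose
    correlations are comparable (max <= 2x min) and carry maximal
    energy, then refits the current support by least squares.
    """
    support, residual = [], y.copy()
    while len(support) < sparsity:
        u = np.abs(D.T @ residual)
        top = np.argsort(u)[::-1][:sparsity]        # strongest candidates
        best, best_energy = [int(top[0])], -1.0
        for i in range(len(top)):                   # best comparable run
            group = [int(top[i])]
            for j in range(i + 1, len(top)):
                if u[top[i]] <= 2.0 * u[top[j]]:
                    group.append(int(top[j]))
                else:
                    break
            energy = float((u[group] ** 2).sum())
            if energy > best_energy:
                best, best_energy = group, energy
        support = sorted(set(support) | set(best))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

# Orthonormal dictionary keeps the demo exact.
rng = np.random.default_rng(3)
D, _ = np.linalg.qr(rng.normal(size=(128, 128)))
x_true = np.zeros(128)
x_true[[5, 40, 77]] = [1.5, -2.0, 0.8]
x_hat = romp(D, D @ x_true, sparsity=3)
print(np.allclose(x_hat, x_true, atol=1e-6))   # exact sparse code recovered
```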

  5. Detection of proximal caries using digital radiographic systems with different resolutions.

    PubMed

    Nikneshan, Sima; Abbas, Fatemeh Mashhadi; Sabbagh, Sedigheh

    2015-01-01

    Dental radiography is an important tool for the detection of caries, and digital radiography is the latest advancement in this regard. Spatial resolution is a characteristic of digital receptors used to describe image quality. This diagnostic accuracy study compared two digital radiographic systems at three different resolutions for the detection of noncavitated proximal caries. Seventy premolar teeth were mounted in 14 gypsum blocks. Digora Optime and RVG Access were used for obtaining digital radiographs. Six observers evaluated the proximal surfaces in radiographs at each resolution to determine the depth of caries on a 4-point scale. The teeth were then histologically sectioned, and the results of the histologic analysis were taken as the gold standard. Data were entered into SPSS version 18 software, and the Kruskal-Wallis test was used for data analysis; P < 0.05 was considered statistically significant. No significant difference was found between the resolutions for detection of proximal caries (P > 0.05). The RVG Access system had the highest specificity (87.7%) and Digora Optime at high resolution had the lowest (84.2%). Furthermore, Digora Optime had higher sensitivity for detecting caries extending beyond the outer half of enamel. Judgments of oral radiologists on the depth of caries were more reliable than those of restorative dentistry specialists. The three resolutions of Digora Optime and RVG Access had similar accuracy in the detection of noncavitated proximal caries.

  6. A new version of Visual tool for estimating the fractal dimension of images

    NASA Astrophysics Data System (ADS)

    Grossu, I. V.; Felea, D.; Besliu, C.; Jipa, Al.; Bordeianu, C. C.; Stan, E.; Esanu, T.

    2010-04-01

    This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images (Grossu et al., 2009 [1]). The earlier version was limited to bi-dimensional sets of points stored in bitmap files. The application has been extended to work also with comma-separated-values files and three-dimensional images.

    New version program summary
    Program title: Fractal Analysis v02
    Catalogue identifier: AEEG_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 9999
    No. of bytes in distributed program, including test data, etc.: 4 366 783
    Distribution format: tar.gz
    Programming language: MS Visual Basic 6.0
    Computer: PC
    Operating system: MS Windows 98 or later
    RAM: 30 M
    Classification: 14
    Catalogue identifier of previous version: AEEG_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 1999
    Does the new version supersede the previous version?: Yes
    Nature of problem: Estimating the fractal dimension of 2D and 3D images.
    Solution method: Optimized implementation of the box-counting algorithm.
    Reasons for new version: The previous version was limited to bitmap image files. The new application was extended to work with objects stored in comma-separated-values (csv) files. The main advantages are: easier integration with other applications (csv is a widely used, simple text format); lower resource consumption and improved performance (only the information of interest, the "black points", is stored); higher resolution (point coordinates are loaded into Visual Basic double variables [2]); and the possibility of storing three-dimensional objects (e.g. the 3D Sierpinski gasket). In this version the optimized box-counting algorithm [1] was extended to the three-dimensional case.
    Summary of revisions: The application interface was changed from SDI (single document interface) to MDI (multi-document interface). One form was added to provide a graphical user interface for the new functionalities (fractal analysis of 2D and 3D images stored in csv files).
    Additional comments: User-friendly graphical interface; easy deployment mechanism.
    Running time: To a first approximation, the algorithm is linear.
    References:
    [1] I.V. Grossu, C. Besliu, M.V. Rusu, Al. Jipa, C.C. Bordeianu, D. Felea, Comput. Phys. Comm. 180 (2009) 1999-2001.
    [2] F. Balena, Programming Microsoft Visual Basic 6.0, Microsoft Press, US, 1999.
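    The box-counting method at the core of the program can be sketched in a few lines (an illustrative Python re-implementation of the general algorithm, not the distributed Visual Basic code; all names are ours):

```python
import math
import random

def box_count_dimension(points, eps_list):
    """Estimate the box-counting dimension of a 2D or 3D point set.

    `points` plays the role of the "black points" loaded from a csv
    file in the program above; `eps_list` is a decreasing sequence of
    box edge lengths.  The dimension is the slope of a least-squares
    fit of log N(eps) against log(1/eps)."""
    xs, ys = [], []
    for eps in eps_list:
        # Map each point to the index of the box containing it and
        # count the distinct occupied boxes.
        boxes = {tuple(int(math.floor(c / eps)) for c in p) for p in points}
        xs.append(math.log(1.0 / eps))
        ys.append(math.log(len(boxes)))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Chaos-game sample of the Sierpinski triangle, whose true dimension
# is log 3 / log 2 ~ 1.585.
random.seed(0)
verts = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]
x, y = 0.3, 0.3
pts = []
for _ in range(20000):
    vx, vy = random.choice(verts)
    x, y = (x + vx) / 2, (y + vy) / 2
    pts.append((x, y))
dim = box_count_dimension(pts[100:], [2.0 ** -k for k in range(2, 7)])
print(round(dim, 2))
```

    The same function handles the three-dimensional case unchanged, since the box index is built per coordinate.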

  7. Enhanced Seismic Imaging of Turbidite Deposits in Chicontepec Basin, Mexico

    NASA Astrophysics Data System (ADS)

    Chavez-Perez, S.; Vargas-Meleza, L.

    2007-05-01

    We test, as postprocessing tools, a combination of migration deconvolution and geometric attributes to attack the complex problems of reflector resolution and detection in migrated seismic volumes. Migration deconvolution has been empirically shown to be an effective approach for enhancing the illumination of migrated images, which are blurred versions of the subsurface reflectivity distribution, by decreasing imaging artifacts, improving spatial resolution, and alleviating acquisition footprint problems. We utilize migration deconvolution as a means to improve the quality and resolution of 3D prestack time migrated results from Chicontepec basin, Mexico, a very relevant portion of the producing onshore sector of Pemex, the Mexican petroleum company. Seismic data covers the Agua Fria, Coapechaca, and Tajin fields. It exhibits acquisition footprint problems, migration artifacts and a severe lack of resolution in the target area, where turbidite deposits need to be characterized between major erosional surfaces. Vertical resolution is about 35 m and the main hydrocarbon plays are turbidite beds no more than 60 m thick. We also employ geometric attributes (e.g., coherent energy and curvature), computed after migration deconvolution, to detect and map out depositional features, and help design development wells in the area. Results of this workflow show imaging enhancement and allow us to identify meandering channels and individual sand bodies, previously undistinguishable in the original seismic migrated images.

  8. Modelling Precipitation and Temperature Extremes: The Importance of Horizontal Resolution

    NASA Astrophysics Data System (ADS)

    Shields, C. A.; Kiehl, J. T.; Meehl, G. A.

    2013-12-01

    Understanding Earth's water cycle on a warming planet is of critical importance in society's ability to adapt to climate change. Extreme weather events, such as floods, heat waves, and drought will likely change with the water cycle as greenhouse gases continue to rise. Location, duration, and intensity of extreme events can be studied using complex earth system models. Here, we employ the fully coupled Community Earth System Model (CESM1.0) to evaluate extreme event impacts for different possible future forcing scenarios. Simulations applying the Representative Concentration Pathway (RCP) scenarios 2.6 and 8.5 were chosen to bracket the range of model responses. Because extreme weather events happen on a regional scale, there is a tendency to favor using higher resolution models, i.e. models that can represent regional features with greater accuracy. Within the CESM1.0 framework, we evaluate both the standard 1 degree resolution (1 degree atmosphere/land coupled to 1 degree ocean/sea ice), and the higher 0.5 degree resolution version (0.5 degree atmosphere/land coupled to 1 degree ocean/sea ice), focusing on extreme precipitation events, heat waves, and droughts. We analyze a variety of geographical regions, but generally find that benefits from increased horizontal resolution are most significant on the regional scale.

  9. Challenges in the development of very high resolution Earth System Models for climate science

    NASA Astrophysics Data System (ADS)

    Rasch, Philip J.; Xie, Shaocheng; Ma, Po-Lun; Lin, Wuyin; Wan, Hui; Qian, Yun

    2017-04-01

    The authors represent the 20+ members of the ACME atmosphere development team. The US Department of Energy (DOE) has, like many other organizations around the world, identified the need for an Earth System Model capable of rapid completion of decade- to century-length simulations at very high (vertical and horizontal) resolution with good climate fidelity. Two years ago DOE initiated a multi-institution effort called ACME (Accelerated Climate Modeling for Energy) to meet this extraordinary challenge, targeting a model eventually capable of running at 10-25 km horizontal and 20-400 m vertical resolution through the troposphere on exascale computational platforms, at speeds sufficient to complete 5+ simulated years per day. I will outline the challenges our team has encountered in developing the atmosphere component of this model, and the strategies we have been using to tune and debug a model that we can barely afford to run on today's computational platforms. These strategies include: 1) evaluation at lower resolutions; 2) ensembles of short simulations to explore parameter space and perform rough tuning and evaluation; 3) use of regionally refined versions of the model for probing high-resolution model behavior at less expense; 4) use of "auto-tuning" methodologies for model tuning; and 5) brute-force long climate simulations.

  10. Clinical trial: the treatment of gastro-oesophageal reflux disease in primary care--prospective randomized comparison of rabeprazole 20 mg with esomeprazole 20 and 40 mg.

    PubMed

    Eggleston, A; Katelaris, P H; Nandurkar, S; Thorpe, P; Holtmann, G

    2009-05-01

    A trial of empirical PPI therapy is usual practice for most patients with symptoms of gastro-oesophageal reflux disease (GERD) in primary care. Our aim was to determine if the 4-week efficacy of rabeprazole 20 mg for resolving heartburn and regurgitation symptoms is non-inferior to esomeprazole 40 mg or 20 mg. In all, 1392 patients were randomized to rabeprazole 20 mg, esomeprazole 20 mg or esomeprazole 40 mg once daily. Patients, doctors and assessors were blinded. Symptom resolution data were collected on days 0-7 and day 28 using the Patient Assessment of Upper Gastrointestinal Disorders Symptom Severity Index, with a shortened version used on days 8-27. Rabeprazole 20 mg was non-inferior to esomeprazole 40 mg for complete resolution of regurgitation and satisfactory resolution of heartburn and regurgitation. For complete heartburn resolution, the efficacy of rabeprazole 20 mg and esomeprazole 40 mg was statistically indistinguishable, although the non-inferiority test was inconclusive. Rabeprazole 20 mg was non-inferior to esomeprazole 20 mg for all outcomes. In uninvestigated GERD patients, rabeprazole 20 mg was non-inferior to esomeprazole 40 mg for complete and satisfactory relief of regurgitation and satisfactory relief of heartburn, and not different for complete resolution of heartburn.

  11. Vorticity-divergence semi-Lagrangian global atmospheric model SL-AV20: dynamical core

    NASA Astrophysics Data System (ADS)

    Tolstykh, Mikhail; Shashkin, Vladimir; Fadeev, Rostislav; Goyman, Gordey

    2017-05-01

    SL-AV (semi-Lagrangian, based on the absolute vorticity equation) is a global hydrostatic atmospheric model. Its latest version, SL-AV20, provides the global operational medium-range weather forecast with 20 km resolution over Russia. Lower-resolution configurations of SL-AV20 are being tested for seasonal prediction and climate modeling. The article presents the model's dynamical core. Its main features are a vorticity-divergence formulation on an unstaggered grid, high-order finite-difference approximations, semi-Lagrangian semi-implicit discretization and a reduced latitude-longitude grid with variable resolution in latitude. The accuracy of SL-AV20 numerical solutions using the reduced lat-lon grid and variable resolution in latitude is tested with two idealized test cases. Accuracy and stability of SL-AV20 in the presence of orography forcing are tested using the mountain-induced Rossby wave test case. The results of all three tests are in good agreement with other published model solutions. It is shown that the use of the reduced grid does not significantly affect the accuracy up to a 25 % reduction in the number of grid points with respect to the regular grid. Variable resolution in latitude allows us to improve the accuracy of a solution in the region of interest.
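    The point-count saving of a reduced latitude-longitude grid can be illustrated with a toy calculation (a sketch with illustrative numbers, not the SL-AV20 grid definition; this full cosine thinning is more aggressive than the up-to-25 % reduction quoted above):

```python
import math

def grid_sizes(dlat_deg, nlon_equator):
    """Compare point counts of a regular lat-lon grid and a reduced
    grid in which the number of longitudes per latitude circle shrinks
    with cos(latitude), down to a small minimum near the poles."""
    nlat = int(round(180.0 / dlat_deg)) + 1
    lats = [-90.0 + i * dlat_deg for i in range(nlat)]
    regular = nlon_equator * nlat
    reduced = sum(max(4, int(round(nlon_equator * math.cos(math.radians(lat)))))
                  for lat in lats)
    return regular, reduced

reg, red = grid_sizes(1.0, 360)   # 1-degree grid, 360 longitudes at the equator
saved = 1.0 - red / reg
print(reg, red, round(saved, 2))  # fraction of points saved
```

    Operational reduced grids typically thin the longitudes more conservatively than cos(latitude) to preserve accuracy, which is why the achievable saving without accuracy loss is smaller than this idealized figure.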

  12. Hurricane Intensity Forecasts with a Global Mesoscale Model on the NASA Columbia Supercomputer

    NASA Technical Reports Server (NTRS)

    Shen, Bo-Wen; Tao, Wei-Kuo; Atlas, Robert

    2006-01-01

    It is known that General Circulation Models (GCMs) have insufficient resolution to accurately simulate hurricane near-eye structure and intensity. The increasing capabilities of high-end computers (e.g., the NASA Columbia Supercomputer) have changed this. In 2004, the finite-volume General Circulation Model at 1/4 degree resolution, double the resolution used by most operational NWP centers at that time, was implemented and run to obtain promising landfall predictions for major hurricanes (e.g., Charley, Frances, Ivan, and Jeanne). In 2005, we successfully implemented the 1/8 degree version and demonstrated its performance on intensity forecasts with hurricane Katrina (2005). It is found that the 1/8 degree model is capable of simulating the radius of maximum wind and the near-eye wind structure, and thereby producing promising intensity forecasts. In this study, we further evaluate the model's performance on intensity forecasts of hurricanes Ivan, Jeanne, and Karl in 2004. Suggestions for further model development are made at the end.

  13. Classification of Clouds and Deep Convection from GEOS-5 Using Satellite Observations

    NASA Technical Reports Server (NTRS)

    Putman, William; Suarez, Max

    2010-01-01

    With the increased resolution of global atmospheric models and the push toward global cloud resolving models, model output has come to resemble satellite observations strikingly closely. As we progress with our adaptation of the Goddard Earth Observing System Model, Version 5 (GEOS-5) as a high-resolution cloud-system-resolving model, evaluation of cloud properties and deep convection requires in-depth analysis beyond a visual comparison. Outgoing long-wave radiation (OLR) provides a sufficient comparison with infrared (IR) satellite imagery to isolate areas of deep convection. We have adopted a binning technique to generate a series of histograms of OLR which classify the presence and fraction of clear sky versus deep convection in the tropics, and which can be compared with a similar analysis of IR imagery from composite Geostationary Operational Environmental Satellite (GOES) observations. We will present initial results that have been used to evaluate the amount of deep convective parameterization required within the model as we move toward cloud-system-resolving resolutions of 10 to 1 km globally.
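    The OLR binning idea can be sketched as follows (synthetic data and illustrative bin edges, not the GEOS-5 analysis values; low OLR marks cold, high cloud tops):

```python
import random

def olr_histogram(olr_values, edges):
    """Bin OLR values (W m^-2) into histogram classes; low OLR (cold
    cloud tops) is a proxy for deep convection, high OLR for clear sky."""
    counts = [0] * (len(edges) - 1)
    for v in olr_values:
        for i in range(len(edges) - 1):
            if edges[i] <= v < edges[i + 1]:
                counts[i] += 1
                break
    return counts

random.seed(0)
# Synthetic tropical "scene": convective pixels near 200 W m^-2 and
# clear-sky pixels near 280 W m^-2.
olr = ([random.gauss(200.0, 15.0) for _ in range(300)]
       + [random.gauss(280.0, 10.0) for _ in range(700)])
edges = [120.0, 200.0, 240.0, 320.0]   # deep / moderate convection / clear
counts = olr_histogram(olr, edges)
deep_fraction = counts[0] / len(olr)
clear_fraction = counts[2] / len(olr)
print(counts, round(deep_fraction, 2), round(clear_fraction, 2))
```

    Applying the same binning to model OLR and to brightness-temperature-equivalent IR imagery yields directly comparable histograms of convective versus clear-sky fractions.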

  14. An efficient multi-resolution GA approach to dental image alignment

    NASA Astrophysics Data System (ADS)

    Nassar, Diaa Eldin; Ogirala, Mythili; Adjeroh, Donald; Ammar, Hany

    2006-02-01

    Automating the process of postmortem identification of individuals using dental records is receiving increased attention in forensic science, especially with the large volume of victims encountered in mass disasters. Dental radiograph alignment is a key step in automating the dental identification process. In this paper, we address the problem of dental radiograph alignment using a Multi-Resolution Genetic Algorithm (MR-GA) approach. We use location and orientation information of edge points as features; we assume that affine transformations suffice to restore geometric discrepancies between two images of a tooth; we efficiently search the 6D space of affine parameters using a GA applied progressively across multi-resolution image versions; and we use a Hausdorff distance measure to compute the similarity between a reference tooth and a query tooth subject to a candidate alignment transform. Testing results based on 52 teeth-pair images suggest that our algorithm converges to reasonable solutions in more than 85% of the test cases, with most of the error in the remaining cases due to excessive misalignments.
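    The Hausdorff similarity measure at the heart of the search can be sketched directly (a minimal Python illustration with toy point sets of our own; the edge extraction and the GA over affine parameters are omitted):

```python
import math

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two point sets: the worst
    distance from any point in one set to its nearest neighbour in the
    other.  An alignment search would minimise this value over the six
    affine parameters applied to the query set."""
    def directed(P, Q):
        return max(min(math.dist(p, q) for q in Q) for p in P)
    return max(directed(A, B), directed(B, A))

# Edge points of a "reference tooth" and a slightly displaced "query".
ref = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
qry = [(0.0, 0.1), (1.0, 0.0), (1.0, 1.2)]
print(round(hausdorff(ref, qry), 3))  # 0.2
```

    Because the measure is a max over nearest-neighbour distances, it penalises the single worst-matched edge point, which is what makes it a strict fitness function for the GA.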

  15. The Next-generation Berkeley High Resolution NO2 (BEHR NO2) Retrieval: Design and Preliminary Emissions Constraints

    NASA Astrophysics Data System (ADS)

    Laughner, J.; Cohen, R. C.

    2017-12-01

    Recent work has identified a number of assumptions made in NO2 retrievals that lead to biases in the retrieved NO2 column density. These include the treatment of the surface as an isotropic reflector, the absence of lightning NO2 in high resolution a priori profiles, and the use of monthly averaged a priori profiles. We present a new release of the Berkeley High Resolution (BEHR) OMI NO2 retrieval based on the new NASA Standard Product (version 3) that addresses these assumptions by: accounting for surface anisotropy by using a BRDF albedo product, using an updated method of regridding NO2 data, and revised NO2 a priori profiles that better account for lightning NO2 and daily variation in the profile shape. We quantify the effect these changes have on the retrieved NO2 column densities and the resultant impact these updates have on constraints of urban NOx emissions for select cities throughout the United States.

  16. VizieR Online Data Catalog: The FIRST Survey Catalog, Version 2014Dec17 (Helfand+ 2015)

    NASA Astrophysics Data System (ADS)

    Helfand, D. J.; White, R. L.; Becker, R. H.

    2015-05-01

    The Faint Images of the Radio Sky at Twenty centimeters (FIRST) survey began in 1993. It uses the VLA (Very Large Array, a facility of the National Radio Astronomy Observatory (NRAO)) at a frequency of 1.4 GHz, and it is slated to cover 10,000 deg2 of the North and South Galactic Caps to a sensitivity of about 1 mJy with an angular resolution of about 5''. The images produced by an automated mapping pipeline have pixels of 1.8'', a typical rms of 0.15 mJy, and a resolution of 5''; the images are available on the Internet (see the FIRST home page at http://sundog.stsci.edu/ for details). The source catalogue is derived from the images. This catalog, from the 1993 through 2011 observations, contains 946,432 sources from the north and south Galactic caps. It covers a total of 10,575 square degrees of the sky (8444 square degrees in the north and 2131 square degrees in the south). In this version of the catalog, images taken in the new EVLA configuration have been re-reduced using shallower CLEAN thresholds in order to reduce the "CLEAN bias" in those images. Also, the EVLA images are not co-added with older VLA images, to avoid problems resulting from the different frequencies and noise properties of the configurations. That leads to small gaps in the sky coverage at boundaries between the EVLA and VLA regions. As a result, the area covered by this release of the catalog is about 60 square degrees smaller than that of the earlier release (13Jun05, also available here as the "first13.dat" file), and the total number of sources is reduced by nearly 25,000. The previous version of the catalog does have sources in the overlap regions, but their flux densities are considered unreliable due to calibration errors. The flux densities should be more accurate in this catalog, biases are smaller, and the incidence of spurious sources is also reduced. Over most of the survey area, the detection limit is 1 mJy.
    A region along the equatorial strip (RA=21.3 to 3.3 hr, Dec=-1 to 1 deg) has a deeper detection threshold because two epochs of observation were combined; the typical detection threshold in this region is 0.75 mJy. There are approximately 4,500 sources below the 1 mJy threshold used for most previous versions of the catalog. Previous versions of the catalog are available at http://sundog.stsci.edu/first/catalogs/.

  17. Rationale and study design of the Prospective comparison of Angiotensin Receptor neprilysin inhibitor with Angiotensin receptor blocker MEasuring arterial sTiffness in the eldERly (PARAMETER) study

    PubMed Central

    Williams, Bryan; Cockcroft, John R; Kario, Kazuomi; Zappe, Dion H; Cardenas, Pamela; Hester, Allen; Brunel, Patrick; Zhang, Jack

    2014-01-01

    Introduction Hypertension in elderly people is characterised by elevated systolic blood pressure (SBP) and increased pulse pressure (PP), which indicate large artery ageing and stiffness. LCZ696, a first-in-class angiotensin receptor neprilysin inhibitor (ARNI), is being developed to treat hypertension and heart failure. The Prospective comparison of Angiotensin Receptor neprilysin inhibitor with Angiotensin receptor blocker MEasuring arterial sTiffness in the eldERly (PARAMETER) study will assess the efficacy of LCZ696 versus olmesartan on aortic stiffness and central aortic haemodynamics. Methods and analysis In this 52-week multicentre study, patients with hypertension aged ≥60 years with a mean sitting (ms) SBP ≥150 to <180 and a PP >60 mm Hg will be randomised to once daily LCZ696 200 mg or olmesartan 20 mg for 4 weeks, followed by a forced-titration to double the initial doses for the next 8 weeks. At 12–24 weeks, if the BP target has not been attained (msSBP <140 and ms diastolic BP <90 mm Hg), amlodipine (2.5–5 mg) and subsequently hydrochlorothiazide (6.25–25 mg) can be added. The primary and secondary endpoints are changes from baseline in central aortic systolic pressure (CASP) and central aortic PP (CAPP) at week 12, respectively. Other secondary endpoints are the changes in CASP and CAPP at week 52. A sample size of 432 randomised patients is estimated to ensure a power of 90% to assess the superiority of LCZ696 over olmesartan at week 12 in the change from baseline of mean CASP, assuming an SD of 19 mm Hg, a difference of 6.5 mm Hg and a 15% dropout rate. The primary variable will be analysed using a two-way analysis of covariance. Ethics and dissemination The study was initiated in December 2012 and final results are expected in 2015. The results of this study will impact the design of future phase III studies assessing cardiovascular protection. Clinical trials identifier EUDract number 2012-002899-14 and ClinicalTrials.gov NCT01692301. 
PMID:24496699

  18. 'Lyell' Panorama inside Victoria Crater (False Color)

    NASA Technical Reports Server (NTRS)

    2008-01-01

    During four months prior to the fourth anniversary of its landing on Mars, NASA's Mars Exploration Rover Opportunity examined rocks inside an alcove called 'Duck Bay' in the western portion of Victoria Crater. The main body of the crater appears in the upper right of this stereo panorama, with the far side of the crater lying about 800 meters (half a mile) away. Bracketing that part of the view are two promontories on the crater's rim at either side of Duck Bay. They are 'Cape Verde,' about 6 meters (20 feet) tall, on the left, and 'Cabo Frio,' about 15 meters (50 feet) tall, on the right. The rest of the image, other than sky and portions of the rover, is ground within Duck Bay.

    Opportunity's targets of study during the last quarter of 2007 were rock layers within a band exposed around the interior of the crater, about 6 meters (20 feet) from the rim. Bright rocks within the band are visible in the foreground of the panorama. The rover science team assigned informal names to three subdivisions of the band: 'Steno,' 'Smith,' and 'Lyell.'

    This view combines many images taken by Opportunity's panoramic camera (Pancam) from the 1,332nd through 1,379th Martian days, or sols, of the mission (Oct. 23 to Dec. 11, 2007). Images taken through Pancam filters centered on wavelengths of 753 nanometers, 535 nanometers and 432 nanometers were mixed to produce this view, which is presented in a false-color stretch to bring out subtle color differences in the scene. Some visible patterns in dark and light tones are the result of combining frames that were affected by dust on the front sapphire window of the rover's camera.

    Opportunity landed on Jan. 25, 2004, Universal Time (Jan. 24, Pacific Time), inside a much smaller crater about 6 kilometers (4 miles) north of Victoria Crater, to begin a surface mission designed to last 3 months and drive about 600 meters (0.4 mile).

  19. Sensitivity studies of high-resolution RegCM3 simulations of precipitation over the European Alps: the effect of lateral boundary conditions and domain size

    NASA Astrophysics Data System (ADS)

    Nadeem, Imran; Formayer, Herbert

    2016-11-01

    A suite of high-resolution (10 km) simulations was performed with the International Centre for Theoretical Physics (ICTP) Regional Climate Model (RegCM3) to study the effect of various lateral boundary conditions (LBCs), domain size, and intermediate domains on simulated precipitation over the Great Alpine Region. The boundary conditions used were the ECMWF ERA-Interim Reanalysis with grid spacing 0.75∘, the ECMWF ERA-40 Reanalysis with grid spacings 1.125 and 2.5∘, and finally the 2.5∘ NCEP/DOE AMIP-II Reanalysis. The model was run in one-way nesting mode with direct nesting of the high-resolution RCM (horizontal grid spacing Δx = 10 km) within the driving reanalysis, with one intermediate-resolution nest (Δx = 30 km) between the high-resolution RCM and the reanalysis forcings, and also with two intermediate-resolution nests (Δx = 90 km and Δx = 30 km) for simulations forced with LBCs of 2.5∘ resolution. Additionally, the impact of domain size was investigated. The results of the multiple simulations were evaluated using different analysis techniques, e.g., Taylor diagrams and a newly defined statistical parameter, called the Skill-Score, for the evaluation of daily precipitation simulated by the model. It was found that domain size has the major impact on the results, while different resolutions and versions of the LBCs, e.g., 1.125∘ ERA-40 and 0.75∘ ERA-Interim, do not produce significantly different results. It was also noticed that direct nesting with a reasonable domain size seems to be the most adequate method for reproducing precipitation over complex terrain, while introducing intermediate-resolution nests seems to degrade the results.

  20. Impact of the "Symmetric Instability of the Computational Kind" at mesoscale- and submesoscale-permitting resolutions

    NASA Astrophysics Data System (ADS)

    Ducousso, Nicolas; Le Sommer, J.; Molines, J.-M.; Bell, M.

    2017-12-01

    The energy- and enstrophy-conserving momentum advection scheme (EEN) used over the last 10 years in NEMO is subject to a spurious numerical instability. This instability, referred to as the Symmetric Instability of the Computational Kind (SICK), arises from a discrete imbalance between the two components of the vector-invariant form of momentum advection. The properties of this instability and the method for removing it were documented by Hollingsworth et al. (1983), but the extent to which the SICK may interfere with processes of interest at mesoscale- and submesoscale-permitting resolutions is still unknown. In this paper, the impact of the SICK in realistic ocean model simulations is assessed by comparing model integrations with different versions of the EEN momentum advection scheme. Investigations are undertaken with a global mesoscale-permitting resolution (1/4°) configuration and with a regional North Atlantic Ocean submesoscale-permitting resolution (1/60°) configuration. At both resolutions, the instability is found to alter primarily the most energetic current systems, such as equatorial jets, western boundary currents and coherent vortices. The impact of the SICK is found to increase with model resolution, with a noticeable impact at mesoscale-permitting resolution and a dramatic impact at submesoscale-permitting resolution. The SICK is shown to distort the normal functioning of current systems by redirecting the slow energy transfer between balanced motions into a spurious energy transfer to internal inertia-gravity waves and to dissipation. Our results indicate that the SICK is likely to have significantly corrupted NEMO solutions (when run with the EEN scheme) at mesoscale-permitting and finer resolutions over the last 10 years.

  1. An Accuracy Assessment of the CALIOP/CALIPSO Version 2/Version 3 Daytime Aerosol Extinction Product Based on a Detailed Multi-Sensor, Multi-Platform Case Study

    NASA Technical Reports Server (NTRS)

    Kacenelenbogen, M.; Vaughan, M. A.; Redemann, J.; Hoff, R. M.; Rogers, R. R.; Ferrare, R. A.; Russell, P. B.; Hostetler, C. A.; Hair, J. W.; Holben, B. N.

    2011-01-01

    The Cloud Aerosol LIdar with Orthogonal Polarization (CALIOP), on board the CALIPSO platform, has measured profiles of total attenuated backscatter coefficient (level 1 products) since June 2006. CALIOP's level 2 products, such as the aerosol backscatter and extinction coefficient profiles, are retrieved using a complex succession of automated algorithms. The goal of this study is to help identify potential shortcomings in the CALIOP version 2 level 2 aerosol extinction product and to illustrate some of the motivation for the changes that have been introduced in the next version of CALIOP data (version 3, released in June 2010). To help illustrate the potential factors contributing to the uncertainty of the CALIOP aerosol extinction retrieval, we focus on a one-day, multi-instrument, multi-platform comparison study during the CALIPSO and Twilight Zone (CATZ) validation campaign on 4 August 2007. On that day, we observe consistency in the Aerosol Optical Depth (AOD) values recorded by four different instruments (the spaceborne MODerate resolution Imaging Spectroradiometer, MODIS: 0.67, and POLarization and Directionality of the Earth's Reflectances, POLDER: 0.58; the airborne High Spectral Resolution Lidar, HSRL: 0.52; and the ground-based AErosol RObotic NETwork, AERONET: 0.48 to 0.73), while the CALIOP AOD is a factor of two lower (0.32 at 532 nm). This case study illustrates the following potential sources of uncertainty in the CALIOP AOD: (i) CALIOP's low signal-to-noise ratio (SNR), leading to misclassification and/or missed identification of aerosol layers, especially close to the Earth's surface; (ii) cloud contamination of the CALIOP version 2 aerosol backscatter and extinction profiles; (iii) potentially erroneous assumptions about the aerosol extinction-to-backscatter ratio (Sa) used in CALIOP's extinction retrievals; and (iv) calibration coefficient biases in the CALIOP daytime attenuated backscatter coefficient profiles. 
    The use of the version 3 CALIOP extinction retrieval for our case study seems to partially fix factor (i), although the aerosol extinction retrieved by CALIOP is still somewhat lower than the profile measured by HSRL; the cloud contamination (ii) appears to be corrected; and no particular change is apparent in the observation-based CALIOP Sa value (iii). Our case study also showed very little difference between version 2 and version 3 CALIOP attenuated backscatter coefficient profiles, indicating a minor change in the calibration scheme (iv).

  2. Extended and refined multi sensor reanalysis of total ozone for the period 1970-2012

    NASA Astrophysics Data System (ADS)

    van der A, R. J.; Allaart, M. A. F.; Eskes, H. J.

    2015-07-01

    The ozone multi-sensor reanalysis (MSR) is a multi-decadal ozone column data record constructed using all available ozone column satellite data sets, surface Brewer and Dobson observations and a data assimilation technique with detailed error modelling. The result is a high-resolution time series of 6-hourly global ozone column fields and forecast error fields that may be used for ozone trend analyses as well as detailed case studies. The ozone MSR is produced in two steps. First, the latest reprocessed versions of all available ozone column satellite data sets are collected and then corrected for biases as a function of solar zenith angle (SZA), viewing zenith angle (VZA), time (trend), and stratospheric temperature, using surface observations of the ozone column from Brewer and Dobson spectrophotometers from the World Ozone and Ultraviolet Radiation Data Centre (WOUDC). Subsequently the de-biased satellite observations are assimilated within the ozone chemistry and data assimilation model TMDAM. The MSR2 (MSR version 2) reanalysis upgrade described in this paper consists of an ozone record for the 43-year period 1970-2012. The chemistry transport model and data assimilation system have been adapted to improve the resolution, error modelling and processing speed. Backscatter ultraviolet (BUV) satellite observations have been included for the period 1970-1977. The total record is extended by 13 years compared to the first version of the ozone multi-sensor reanalysis, MSR1. The latest total ozone retrievals of 15 satellite instruments are used: BUV-Nimbus4, TOMS-Nimbus7, TOMS-EP, SBUV-7, -9, -11, -14, -16, -17, -18, -19, GOME, SCIAMACHY, OMI and GOME-2. The resolution of the model runs, assimilation and output is increased from 2° × 3° to 1° × 1°. The analysis is driven by 3-hourly meteorology from the ERA-Interim reanalysis of the European Centre for Medium-Range Weather Forecasts (ECMWF) starting from 1979, and ERA-40 before that date. 
The chemistry parameterization has been updated. The performance of the MSR2 analysis is studied with the help of observation-minus-forecast (OmF) departures from the data assimilation, by comparisons with the individual station observations and with ozone sondes. The OmF statistics show that the mean bias of the MSR2 analyses is less than 1 % with respect to de-biased satellite observations after 1979.
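The de-biasing step described above can be sketched as a regression of collocated satellite-minus-ground ozone-column differences against SZA, VZA and time. The linear form and all names below are illustrative assumptions, not the operational MSR2 code, which also includes a stratospheric-temperature term and detailed error modelling.

```python
import numpy as np

def fit_bias(sat_du, ground_du, sza_deg, vza_deg, t_years):
    """Least-squares fit of satellite-minus-ground column differences (DU)
    against SZA, VZA and a linear time trend. Illustrative sketch only."""
    diff = sat_du - ground_du                       # collocated differences
    X = np.column_stack([np.ones_like(sza_deg), sza_deg, vza_deg, t_years])
    coef, *_ = np.linalg.lstsq(X, diff, rcond=None)
    return coef                                     # [offset, dSZA, dVZA, trend]

def debias(sat_du, sza_deg, vza_deg, t_years, coef):
    """Subtract the fitted bias from the satellite columns."""
    X = np.column_stack([np.ones_like(sza_deg), sza_deg, vza_deg, t_years])
    return sat_du - X @ coef
```

Applied to each satellite record before assimilation, this kind of correction ties the multi-sensor record to the ground-based Brewer/Dobson reference.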

  3. Simulations of Astrophysical Jets in Dense Environments

    NASA Astrophysics Data System (ADS)

    Krause, Martin; Gaibler, Volker; Camenzind, Max

    We have simulated the interaction of jets with a galactic wind at high resolution using the magnetohydrodynamics code NIRVANA on the NEC SX-6 at the HLRS. This setup may describe a typical situation for the starbursting radio galaxies of the early universe. The results show a clear resolution dependence in the expected way, but the formed clumps are denser than expected from linear extrapolation. We also report our recent progress in the adaptation of the magnetic part of NIRVANA to the SX-6. The code is now fully tuned to the machine and reached more than 3 Gflops. We plan to use this new code version to extend our study of magnetized jets down to very low jet densities. This should be especially applicable to the conditions in the young universe.

  4. Multivariate curve resolution of incomplete fused multiset data from chromatographic and spectrophotometric analyses for drug photostability studies.

    PubMed

    De Luca, Michele; Ragno, Gaetano; Ioele, Giuseppina; Tauler, Romà

    2014-07-21

An advanced and powerful chemometric approach is proposed for the analysis of incomplete multiset data obtained by fusion of hyphenated liquid chromatographic DAD/MS data with UV spectrophotometric data from acid-base titration and kinetic degradation experiments. Column- and row-wise augmented data blocks were combined and simultaneously processed by means of a new version of the multivariate curve resolution-alternating least squares (MCR-ALS) technique, including the simultaneous analysis of incomplete multiset data from different instrumental techniques. The proposed procedure was applied to the detailed study of the kinetic photodegradation process of the amiloride (AML) drug. All chemical species involved in the degradation and equilibrium reactions were resolved and the pH-dependent kinetic pathway described. Copyright © 2014 Elsevier B.V. All rights reserved.
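As a rough illustration of the MCR-ALS core, the sketch below alternately solves the bilinear model D = C S for concentration profiles C and spectra S under a non-negativity constraint, on a single complete matrix. The actual analysis in the paper handles row- and column-wise augmented, incomplete multiset blocks; this toy version only shows the alternating-least-squares idea, with invented names.

```python
import numpy as np

def mcr_als(D, n_components, n_iter=200, seed=0):
    """Toy MCR-ALS: factor D (rows: mixtures, cols: wavelengths) into
    non-negative concentration profiles C and spectra S."""
    rng = np.random.default_rng(seed)
    C = rng.random((D.shape[0], n_components))       # initial concentrations
    for _ in range(n_iter):
        S = np.linalg.lstsq(C, D, rcond=None)[0]     # spectra given C
        S = np.clip(S, 0, None)                      # non-negativity
        C = np.linalg.lstsq(S.T, D.T, rcond=None)[0].T
        C = np.clip(C, 0, None)
    return C, S
```

On noiseless, exactly bilinear data the alternation converges to a factorization that reproduces D; real applications add closure and selectivity constraints to reduce rotational ambiguity.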

  5. Comparison and validation of gridded precipitation datasets for Spain

    NASA Astrophysics Data System (ADS)

    Quintana-Seguí, Pere; Turco, Marco; Míguez-Macho, Gonzalo

    2016-04-01

In this study, two gridded precipitation datasets are compared and validated in Spain: the recently developed SAFRAN dataset and the Spain02 dataset. These are validated using rain gauges and they are also compared to the low-resolution ERA-Interim reanalysis. The SAFRAN precipitation dataset has been recently produced using the SAFRAN meteorological analysis, which is extensively used in France (Durand et al. 1993, 1999; Quintana-Seguí et al. 2008; Vidal et al., 2010) and which has recently been applied to Spain (Quintana-Seguí et al., 2015). SAFRAN uses an optimal interpolation (OI) algorithm and all available rain gauges from the Spanish State Meteorological Agency (Agencia Estatal de Meteorología, AEMET). The product has a spatial resolution of 5 km and spans from September 1979 to August 2014. This dataset has been produced mainly to be used in large-scale hydrological applications. Spain02 (Herrera et al. 2012, 2015) is another high-quality precipitation dataset for Spain, based on a dense network of quality-controlled stations, with different versions at different resolutions. In this study we used the version with a resolution of 0.11°. The product spans from 1971 to 2010. Spain02 is well tested and widely used, mainly, but not exclusively, for RCM model validation and statistical downscaling. ERA-Interim is a well-known global reanalysis with a spatial resolution of ≈79 km. It has been included in the comparison because it is a widely used product for continental- and global-scale studies, and also for smaller-scale studies in data-poor countries. Thus, its comparison with higher-resolution products of a data-rich country, such as Spain, allows us to quantify the errors made when using such datasets for national-scale studies, in line with some of the objectives of the EU-FP7 eartH2Observe project. The comparison shows that SAFRAN and Spain02 perform similarly, even though their underlying principles are different. 
Both products are substantially better than ERA-Interim, which has a much coarser representation of the relief, a factor that is crucial for precipitation. These results are a contribution to the Spanish Case Study of the eartH2Observe project, which is focused on the simulation of drought processes in Spain using Land-Surface Models (LSM). This study will also be helpful for the Spanish MARCO project, which aims at improving the ability of RCMs to simulate hydrometeorological extremes.
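The station-based validation described above reduces, at each gauge, to standard point metrics between the gauge series and the value of the nearest grid cell. A minimal sketch (function and variable names are ours, not from the study):

```python
import numpy as np

def validate(gauge, gridded):
    """Bias, RMSE and Pearson correlation between a gauge series and the
    collocated gridded-product series."""
    gauge = np.asarray(gauge, float)
    gridded = np.asarray(gridded, float)
    bias = np.mean(gridded - gauge)
    rmse = np.sqrt(np.mean((gridded - gauge) ** 2))
    corr = np.corrcoef(gauge, gridded)[0, 1]
    return bias, rmse, corr
```

Aggregating such metrics over all gauges and seasons is what allows products like SAFRAN, Spain02 and ERA-Interim to be ranked objectively.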

  6. Dynamic grid refinement for partial differential equations on parallel computers

    NASA Technical Reports Server (NTRS)

    Mccormick, S.; Quinlan, D.

    1989-01-01

    The fast adaptive composite grid method (FAC) is an algorithm that uses various levels of uniform grids to provide adaptive resolution and fast solution of PDEs. An asynchronous version of FAC, called AFAC, that completely eliminates the bottleneck to parallelism is presented. This paper describes the advantage that this algorithm has in adaptive refinement for moving singularities on multiprocessor computers. This work is applicable to the parallel solution of two- and three-dimensional shock tracking problems.

  7. Scientific Studies of the High-Latitude Ionosphere with the Ionosphere Dynamics and ElectroDynamics - Data Assimilation (IDED-DA) Model

    DTIC Science & Technology

    2014-09-23

conduct simulations with a high-latitude data assimilation model. The specific objectives are to study magnetosphere-ionosphere (M-I) coupling processes ... based on three physics-based models, including a magnetosphere-ionosphere (M-I) electrodynamics model, an ionosphere model, and a magnetic ... inversion code. The ionosphere model is a high-resolution version of the Ionosphere Forecast Model (IFM), which is a 3-D, multi-ion model of the ionosphere

  8. An atlas of stellar spectra between 2.00 and 2.45 micrometers (Arnaud, Gilmore, and Collier Cameron 1989)

    NASA Technical Reports Server (NTRS)

    Warren, Wayne N., Jr.

    1990-01-01

The machine-readable version of the atlas, as it is currently being distributed from the Astronomical Data Center, is described. The atlas represents a collection of spectra in the wavelength range 2.00 to 2.45 microns having a resolution of approximately 0.02 micron. The sample of 73 stars includes a supergiant, giants, dwarfs, and subdwarfs with a chemical abundance range of about -2 to +0.5 dex.

  9. North Atlantic Tropical Cyclones: historical simulations and future changes with the new high-resolution Arpege AGCM.

    NASA Astrophysics Data System (ADS)

    Pilon, R.; Chauvin, F.; Palany, P.; Belmadani, A.

    2017-12-01

A new version of the variable high-resolution Meteo-France Arpege atmospheric general circulation model (AGCM) has been developed for tropical cyclone (TC) studies, with a focus on the North Atlantic basin, where the model horizontal resolution is 15 km. Ensemble historical AMIP (Atmospheric Model Intercomparison Project)-type simulations (1965-2014) and future projections (2020-2080) under the IPCC (Intergovernmental Panel on Climate Change) representative concentration pathway (RCP) 8.5 scenario have been produced. A TC-like vortex tracking algorithm is used to investigate TC activity and variability. TC frequency, genesis, geographical distribution and intensity are examined. Historical simulations are compared to best-track and reanalysis datasets. Model TC frequency is generally realistic but tends to be too high during the first decade of the historical simulations. Biases appear to originate from both the tracking algorithm and the model climatology. Nevertheless, the model is able to simulate intense TCs, corresponding to category 5 hurricanes, extremely well in the North Atlantic, where grid resolution is highest. Interaction between developing TCs and vertical wind shear is shown to be a contributing factor for TC variability. Future changes in TC activity and properties are also discussed.
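TC-like vortex tracking schemes of this kind typically start by flagging grid points where low-level relative vorticity exceeds a threshold and is a local maximum. A toy sketch of that detection step (the threshold and the 8-neighbour rule are illustrative, not the algorithm used in the paper):

```python
import numpy as np

def find_vortex_candidates(vort, threshold=1e-4):
    """Return (i, j) grid indices where relative vorticity (s^-1) exceeds the
    threshold and is a local maximum over its 8 neighbours -- candidate TC
    centres on the model grid."""
    hits = []
    ni, nj = vort.shape
    for i in range(1, ni - 1):
        for j in range(1, nj - 1):
            v = vort[i, j]
            if v > threshold and v == vort[i - 1:i + 2, j - 1:j + 2].max():
                hits.append((i, j))
    return hits
```

Real trackers then chain such candidates in time and apply warm-core and wind-speed criteria before counting a track as a TC.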

  10. Derivation and Error Analysis of the Earth Magnetic Anomaly Grid at 2 arc min Resolution Version 3 (EMAG2v3)

    NASA Astrophysics Data System (ADS)

    Meyer, B.; Chulliat, A.; Saltus, R.

    2017-12-01

    The Earth Magnetic Anomaly Grid at 2 arc min resolution version 3, EMAG2v3, combines marine and airborne trackline observations, satellite data, and magnetic observatory data to map the location, intensity, and extent of lithospheric magnetic anomalies. EMAG2v3 includes over 50 million new data points added to NCEI's Geophysical Database System (GEODAS) in recent years. The new grid relies only on observed data, and does not utilize a priori geologic structure or ocean-age information. Comparing this grid to other global magnetic anomaly compilations (e.g., EMAG2 and WDMAM), we can see that the inclusion of a priori ocean-age patterns forces an artificial linear pattern to the grid; the data-only approach allows for greater complexity in representing the evolution along oceanic spreading ridges and continental margins. EMAG2v3 also makes use of the satellite-derived lithospheric field model MF7 in order to accurately represent anomalies with wavelengths greater than 300 km and to create smooth grid merging boundaries. The heterogeneous distribution of errors in the observations used in compiling the EMAG2v3 was explored, and is reported in the final distributed grid. This grid is delivered at both 4 km continuous altitude above WGS84, as well as at sea level for all oceanic and coastal regions.

  11. High resolution observations of small-scale gravity waves and turbulence features in the OH airglow layer

    NASA Astrophysics Data System (ADS)

    Sedlak, René; Hannawald, Patrick; Schmidt, Carsten; Wüst, Sabine; Bittner, Michael

    2017-04-01

A new version of the Fast Airglow Imager (FAIM) for the detection of atmospheric waves in the OH airglow layer has been set up at the German Remote Sensing Data Centre (DFD) of the German Aerospace Centre (DLR) at Oberpfaffenhofen (48.09° N, 11.28° E), Germany. The spatial resolution of the instrument is 17 m/pixel in zenith direction with a field of view (FOV) of 11.1 km x 9.0 km at the OH layer height of ca. 87 km. Since November 2015, the system has been in operation in two different setups (zenith angles 46° and 0°) with a temporal resolution of 2.5 to 2.8 s. In a first case study we present observations of two small wave-like features that might be attributed to gravity wave instabilities. In order to spectrally analyse harmonic structures even on small spatial scales down to 550 m horizontal wavelength, we made use of the Maximum Entropy Method (MEM) since this method exhibits an excellent wavelength resolution. MEM further allows analysing relatively short data series, which considerably helps to reduce problems with the stationarity of the underlying data series from a statistical point of view. We present an observation of the subsequent decay of well-organized wave fronts into eddies, which we tentatively interpret as an indication of the onset of turbulence. Another remarkable event which demonstrates the technical capabilities of the instrument was observed during the night of 4th to 5th April 2016. It reveals the disintegration of a rather homogeneous brightness variation into several filaments moving in different directions and with different speeds. It resembles the formation of a vortex with a horizontal axis of rotation likely related to a vertical wind shear. This case shows a notable similarity to what is expected from theoretical modelling of Kelvin-Helmholtz instabilities (KHIs). 
The comparatively high spatial resolution of the presented new version of the FAIM airglow imager provides new insights into the structure of atmospheric wave instabilities and turbulent processes. Infrared imaging of wave dynamics on the sub-kilometre scale in the airglow layer supports the findings of theoretical simulations and modelling. Parts of this research received funding from the Bavarian State Ministry of the Environment and Consumer Protection.
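The Maximum Entropy Method used for spectral analysis here is commonly implemented via Burg's algorithm, which fits an autoregressive model whose spectrum can resolve sharp peaks even on short records. A compact sketch, where the model order and frequency grid are arbitrary illustration choices:

```python
import numpy as np

def burg(x, order):
    """Burg's method: AR coefficients a (a[0] = 1) and prediction-error power."""
    x = np.asarray(x, float)
    a = np.array([1.0])
    e = np.dot(x, x) / len(x)
    f, b = x[1:].copy(), x[:-1].copy()        # forward/backward prediction errors
    for _ in range(order):
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([a, [0.0]])[::-1]
        e *= 1.0 - k * k
        f, b = (f + k * b)[1:], (b + k * f)[:-1]
    return a, e

def mem_spectrum(a, e, freqs):
    """MEM power spectrum P(f) = e / |A(f)|^2 at normalised frequencies."""
    A = np.exp(-2j * np.pi * np.outer(freqs, np.arange(len(a)))) @ a
    return e / np.abs(A) ** 2
```

The short-record behaviour is exactly what makes MEM attractive for localized airglow structures: a few hundred samples suffice for a sharp spectral peak.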

  12. Calibrated, Enhanced-Resolution Brightness Temperature Earth System Data Record: A New Era for Gridded Passive Microwave Data

    NASA Astrophysics Data System (ADS)

    Hardman, M.; Brodzik, M. J.; Long, D. G.

    2017-12-01

Since 1978, the satellite passive microwave data record has been a mainstay of remote sensing of the cryosphere, providing twice-daily, near-global spatial coverage for monitoring changes in hydrologic and cryospheric parameters that include precipitation, soil moisture, surface water, vegetation, snow water equivalent, sea ice concentration and sea ice motion. Until recently, the available global gridded passive microwave data sets had not been produced consistently: various projections (equal-area, polar stereographic) and gridding techniques were used, along with varying temporal sampling and a mix of Level 2 source data versions. In addition, not all data from all sensors had been processed completely, and none had been processed in one consistent way. Furthermore, the original gridding techniques were relatively primitive, producing 25 km grids in the original EASE-Grid definition, which is not easily accommodated in modern software packages. As part of NASA MEaSUREs, we have reprocessed all data from the SMMR, SSM/I-SSMIS and AMSR-E instruments, using the most mature Level 2 data. The Calibrated, Enhanced-Resolution Brightness Temperature (CETB) Earth System Data Record (ESDR) gridded data are now available from the NSIDC DAAC. The data are distributed as netCDF files that comply with CF-1.6 and ACDD-1.3 conventions. The data have been produced on EASE-Grid 2.0 projections at a smoothed 25 km resolution and at spatially enhanced resolutions up to 3.125 km, depending on channel frequency, using the radiometer version of the Scatterometer Image Reconstruction (rSIR) method. We expect this newly produced data set to enable scientists to carry out trend analyses in coastal regions, marginal ice zones and mountainous terrain that were not possible with the previous gridded passive microwave data. 
The use of the EASE-Grid 2.0 definition and netCDF-CF formatting allows users to extract compliant GeoTIFF images and provides for easy import and correct reprojection in many standard packages. As a consistently processed, high-quality satellite passive microwave ESDR, we expect this data set to replace earlier gridded passive microwave data sets and to pave the way for new insights from higher-resolution derived geophysical products.
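The GeoTIFF compatibility rests on the grids being fully described by a simple affine geotransform. As a sketch, the published EASE-Grid 2.0 Northern Hemisphere definition (a 720 × 720 grid of exactly 25 km cells spanning ±9,000,000 m about the pole, with the enhanced grids nesting by halving the cell size) yields GDAL-convention parameters as below; the function name is ours:

```python
def ease2_north_geotransform(cell_km):
    """GDAL-style geotransform (ulx, xres, 0, uly, 0, -yres) and grid side
    length for an EASE-Grid 2.0 Northern Hemisphere grid of the given cell
    size, assuming the published +/- 9,000,000 m extent about the pole."""
    half_extent = 9_000_000.0                  # metres from pole to grid edge
    res = cell_km * 1000.0                     # cell size in metres
    n = int(round(2 * half_extent / res))      # cells per side (720 at 25 km)
    return (-half_extent, res, 0.0, half_extent, 0.0, -res), n
```

Because the nested 12.5, 6.25 and 3.125 km grids share the same extent, the same six numbers (with a halved cell size) are all a GeoTIFF writer needs alongside the projection definition.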

  13. High-resolution confocal imaging of wall ingrowth deposition in plant transfer cells: Semi-quantitative analysis of phloem parenchyma transfer cell development in leaf minor veins of Arabidopsis.

    PubMed

    Nguyen, Suong T T; McCurdy, David W

    2015-04-23

    Transfer cells (TCs) are trans-differentiated versions of existing cell types designed to facilitate enhanced membrane transport of nutrients at symplasmic/apoplasmic interfaces. This transport capacity is conferred by intricate wall ingrowths deposited secondarily on the inner face of the primary cell wall, hence promoting the potential trans-membrane flux of solutes and consequently assigning TCs as having key roles in plant growth and productivity. However, TCs are typically positioned deep within tissues and have been studied mostly by electron microscopy. Recent advances in fluorophore labelling of plant cell walls using a modified pseudo-Schiff-propidium iodide (mPS-PI) staining procedure in combination with high-resolution confocal microscopy have allowed visualization of cellular details of individual tissue layers in whole mounts, hence enabling study of tissue and cellular architecture without the need for tissue sectioning. Here we apply a simplified version of the mPS-PI procedure for confocal imaging of cellulose-enriched wall ingrowths in vascular TCs at the whole tissue level. The simplified mPS-PI staining procedure produced high-resolution three-dimensional images of individual cell types in vascular bundles and, importantly, wall ingrowths in phloem parenchyma (PP) TCs in minor veins of Arabidopsis leaves and companion cell TCs in pea. More efficient staining of tissues was obtained by replacing complex clearing procedures with a simple post-fixation bleaching step. We used this modified procedure to survey the presence of PP TCs in other tissues of Arabidopsis including cotyledons, cauline leaves and sepals. This high-resolution imaging enabled us to classify different stages of wall ingrowth development in Arabidopsis leaves, hence enabling semi-quantitative assessment of the extent of wall ingrowth deposition in PP TCs at the whole leaf level. 
Finally, we conducted a defoliation experiment as an example of using this approach to statistically analyze responses of PP TC development to leaf ablation. Use of a modified mPS-PI staining technique resulted in high-resolution confocal imaging of polarized wall ingrowth deposition in TCs. This technique can be used in place of conventional electron microscopy and opens new possibilities to study mechanisms determining polarized deposition of wall ingrowths and use reverse genetics to identify regulatory genes controlling TC trans-differentiation.

  14. A tool to include gamma analysis software into a quality assurance program.

    PubMed

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance-to-agreement (DTA) and dose-difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
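For reference, the gamma metric these packages implement combines the dose-difference and distance-to-agreement criteria into a single pass/fail value per point. A toy 1-D, global-normalisation version (names and defaults are ours, not from any tested package):

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
    """Global gamma at each evaluated point: minimise the combined
    (distance/DTA)^2 + (dose difference/DD)^2 metric over the reference.
    dd is a fraction of the reference maximum; dta is in the same units
    as the positions (e.g. mm). Gamma <= 1 counts as a pass."""
    norm = dd * np.max(ref_dose)               # global dose-difference criterion
    gammas = []
    for p, d in zip(eval_pos, eval_dose):
        g2 = ((ref_pos - p) / dta) ** 2 + ((ref_dose - d) / norm) ** 2
        gammas.append(np.sqrt(g2.min()))
    return np.array(gammas)
```

The resolution sensitivity the study reports comes from the discrete minimisation over reference points: coarse sampling overestimates gamma unless the distributions are interpolated first.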

  15. Multi-GPU maximum entropy image synthesis for radio astronomy

    NASA Astrophysics Data System (ADS)

    Cárcamo, M.; Román, P. E.; Casassus, S.; Moral, V.; Rannou, F. R.

    2018-01-01

The maximum entropy method (MEM) is a well-known deconvolution technique in radio interferometry. This method solves a non-linear optimization problem with an entropy regularization term. Other heuristics such as CLEAN are faster but highly user-dependent. Nevertheless, MEM has the following advantages: it is unsupervised, it has a statistical basis, and it offers better resolution and image quality under certain conditions. This work presents a high-performance GPU version of non-gridding MEM, which is tested using real and simulated data. We propose a single-GPU and a multi-GPU implementation for single- and multi-spectral data, respectively. We also make use of the Peer-to-Peer and Unified Virtual Addressing features of newer GPUs, which allow multiple GPUs to be exploited transparently and efficiently. Several ALMA data sets are used to demonstrate the effectiveness in imaging and to evaluate GPU performance. The results show that speedups of 1000 to 5000 over a sequential version can be achieved, depending on data and image size. This allows the HD142527 CO(6-5) short-baseline data set to be reconstructed in 2.1 min, instead of the 2.5 days taken by a sequential version on a CPU.
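Schematically, MEM image synthesis minimises a chi-squared data term plus an entropy regulariser over a positive image. The tiny gradient-descent sketch below illustrates that objective on a linear toy problem; it is not the GPU algorithm of the paper, and all parameters are arbitrary:

```python
import numpy as np

def mem_solve(data, A, lam=0.01, lr=0.1, n_iter=2000):
    """Minimise ||A x - data||^2 + lam * sum(x log x) over a positive image x
    by plain gradient descent. A is the (toy) measurement operator."""
    x = np.full(A.shape[1], max(data.mean(), 1.0))   # flat positive start
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ x - data) + lam * (np.log(x) + 1.0)
        x = np.clip(x - lr * grad, 1e-8, None)       # keep brightness positive
    return x
```

In the non-gridding formulation, A maps image pixels directly to ungridded visibilities, which is exactly the dense, embarrassingly parallel step that benefits from GPUs.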

  16. Long wave infrared (8 to 14 microns) hyperspectral imager based on an uncooled thermal camera and the traditional CI block interferometer (SI-LWIR-UC)

    NASA Astrophysics Data System (ADS)

    Cabib, Dario; Lavi, Moshe; Gil, Amir; Milman, Uri

    2011-06-01

Since the early '90s CI has been involved in the development of FTIR hyperspectral imagers based on a Sagnac or similar type of interferometer. CI also pioneered the commercialization of such hyperspectral imagers in those years. After having developed a visible version based on a CCD in the early '90s (taken on by a spin-off company for biomedical applications) and a 3 to 5 micron infrared version based on a cooled InSb camera in 2008, it is now developing an LWIR version based on an uncooled camera for the 8 to 14 micron range. In this paper we present the design features and expected performance of the system. The instrument is designed to be rugged for field use, and to yield a relatively high spectral resolution of 8 cm-1, an IFOV of 0.5 mrad, a 640x480 pixel spectral cube in less than a minute, and a noise equivalent spectral radiance of 40 nW/cm2/sr/cm-1 at 10 μm. The actual measured performance will be presented in a future paper.

  17. A simple and compact mechanical velocity selector of use to analyze/select molecular alignment in supersonic seeded beams

    NASA Astrophysics Data System (ADS)

    Pirani, F.; Cappelletti, D.; Vecchiocattivi, F.; Vattuone, L.; Gerbi, A.; Rocca, M.; Valbusa, U.

    2004-02-01

A light and compact mechanical velocity selector, of novel design, for applications in supersonic molecular-beam studies has been developed. It represents a simplified version of the traditional, 50-year-old slotted-disk velocity selector. Taking advantage of new materials and improved machining techniques, the new version has been realized with only two rotating slotted disks, driven by an electrical motor with adjustable frequency of rotation, and thus has a much smaller weight and size with respect to the original design, which may allow easier implementation in most of the available molecular-beam apparatuses. This new type of selector, which maintains a sufficiently high velocity resolution, has been developed for sampling molecules with different degrees of rotational alignment, like those emerging from a seeded supersonic expansion. This sampling is the crucial step in realizing new molecular-beam experiments to study the effect of molecular alignment in collisional processes.
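The operating principle of a two-disk selector is that only molecules whose flight time over the disk separation matches the rotation of the slot offset are transmitted, giving a selected speed v = 2πfL/φ for separation L, rotation frequency f and angular slot offset φ. A one-line sketch with purely illustrative numbers:

```python
import math

def selected_speed(freq_hz, separation_m, offset_rad):
    """Speed (m/s) transmitted by a two-disk slotted selector: the molecule
    covers the disk separation in the time the disks rotate by the slot
    offset, so v = 2*pi*f*L / phi."""
    return 2.0 * math.pi * freq_hz * separation_m / offset_rad
```

Tuning the motor frequency f thus scans the transmitted speed linearly, which is how such a selector samples different velocity (and hence alignment) classes of the seeded beam.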

  18. The Implementation of NEMS GFS Aerosol Component (NGAC) Version 1.0 for Global Dust Forecasting at NOAA NCEP

    NASA Technical Reports Server (NTRS)

    Lu, Cheng-Hsuan; Da Silva, Arlindo M.; Wang, Jun; Moorthi, Shrinivas; Chin, Mian; Colarco, Peter; Tang, Youhua; Bhattacharjee, Partha S.; Chen, Shen-Po; Chuang, Hui-Ya; hide

    2016-01-01

The NOAA National Centers for Environmental Prediction (NCEP) implemented the NOAA Environmental Modeling System (NEMS) Global Forecast System (GFS) Aerosol Component (NGAC) for global dust forecasting in collaboration with NASA Goddard Space Flight Center (GSFC). NGAC Version 1.0 has been providing 5-day dust forecasts at 1° × 1° resolution on a global scale, once per day at 00:00 Coordinated Universal Time (UTC), since September 2012. This is the first global system capable of interactive atmosphere-aerosol forecasting at NCEP. The implementation of NGAC V1.0 reflects an effective and efficient transitioning of NASA research advances to NCEP operations, paving the way for NCEP to provide global aerosol products serving a wide range of stakeholders, as well as to allow the effects of aerosols on weather forecasts and climate prediction to be considered.

  19. Mars Global Reference Atmospheric Model (Mars-GRAM) Version 3.8: Users Guide

    NASA Astrophysics Data System (ADS)

    Justus, C. G.; James, B. F.

    1999-05-01

Mars Global Reference Atmospheric Model (Mars-GRAM) Version 3.8 is presented and its new features are discussed. Mars-GRAM uses new values of planetary reference ellipsoid radii, gravity term, and rotation rate (consistent with current JPL values) and includes centrifugal effects on gravity. The model now uses NASA Ames Global Circulation Model low-resolution topography. Curvature corrections are applied to winds, and limits based on the speed of sound are applied. Altitude of the F1 ionization peak and density scale height, including effects of the change of molecular weight with altitude, are computed. A check is performed to disallow temperatures below the CO2 sublimation point. This memorandum includes instructions on obtaining Mars-GRAM source code and data files and running the program. Sample input and output are provided. An example of incorporating Mars-GRAM as an atmospheric subroutine in a trajectory code is also given.

  20. SLIDE - a web-based tool for interactive visualization of large-scale -omics data.

    PubMed

    Ghosh, Soumita; Datta, Abhik; Tan, Kaisen; Choi, Hyungwon

    2018-06-28

Data visualization is often regarded as a post hoc step for verifying statistically significant results in the analysis of high-throughput data sets. This common practice leaves a large amount of raw data behind, from which more information can be extracted. However, existing solutions do not provide capabilities to explore large-scale raw datasets using biologically sensible queries, nor do they allow user-interaction-based real-time customization of graphics. To address these drawbacks, we have designed an open-source, web-based tool called Systems-Level Interactive Data Exploration, or SLIDE, to visualize large-scale -omics data interactively. SLIDE's interface makes it easier for scientists to explore quantitative expression data in multiple resolutions in a single screen. SLIDE is publicly available under a BSD license both as an online version and as a stand-alone version at https://github.com/soumitag/SLIDE. Supplementary information is available at Bioinformatics online.

  1. Mars Global Reference Atmospheric Model (Mars-GRAM) Version 3.8: Users Guide

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; James, B. F.

    1999-01-01

Mars Global Reference Atmospheric Model (Mars-GRAM) Version 3.8 is presented and its new features are discussed. Mars-GRAM uses new values of planetary reference ellipsoid radii, gravity term, and rotation rate (consistent with current JPL values) and includes centrifugal effects on gravity. The model now uses NASA Ames Global Circulation Model low-resolution topography. Curvature corrections are applied to winds, and limits based on the speed of sound are applied. Altitude of the F1 ionization peak and density scale height, including effects of the change of molecular weight with altitude, are computed. A check is performed to disallow temperatures below the CO2 sublimation point. This memorandum includes instructions on obtaining Mars-GRAM source code and data files and running the program. Sample input and output are provided. An example of incorporating Mars-GRAM as an atmospheric subroutine in a trajectory code is also given.

  2. Technical Report Series on Global Modeling and Data Assimilation. Volume 20; The Climate of the FVCCM-3 Model

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); Chang, Yehui; Schubert, Siegfried D.; Lin, Shian-Jiann; Nebuda, Sharon; Shen, Bo-Wen

    2001-01-01

This document describes the climate of version 1 of the NASA-NCAR model developed at the Data Assimilation Office (DAO). The model consists of a new finite-volume dynamical core and an implementation of the NCAR Community Climate Model (CCM-3) physical parameterizations. The version of the model examined here was integrated at a resolution of 2 degrees latitude by 2.5 degrees longitude and 32 levels. The results are based on a simulation that was forced with observed sea surface temperature and sea ice for the period 1979-1995, and are compared with NCEP/NCAR reanalyses and various other observational data sets. The results include an assessment of seasonal means, subseasonal transients including the Madden-Julian Oscillation, and interannual variability. The quantities include zonal and meridional winds, temperature, specific humidity, geopotential height, stream function, velocity potential, precipitation, sea level pressure, and cloud radiative forcing.

  3. Pattern recognition analysis of polar clouds during summer and winter

    NASA Technical Reports Server (NTRS)

    Ebert, Elizabeth E.

    1992-01-01

    A pattern recognition algorithm is demonstrated which classifies eighteen surface and cloud types in high-latitude AVHRR imagery based on several spectral and textural features, then estimates the cloud properties (fractional coverage, albedo, and brightness temperature) using a hybrid histogram and spatial coherence technique. The summertime version of the algorithm uses both visible and infrared data (AVHRR channels 1-4), while the wintertime version uses only infrared data (AVHRR channels 3-5). Three days of low-resolution AVHRR imagery from the Arctic and Antarctic during January and July 1984 were analyzed for cloud type and fractional coverage. The analysis showed significant amounts of high cloudiness in the Arctic during one day in winter. The Antarctic summer scene was characterized by heavy cloud cover in the southern ocean and relatively clear conditions in the continental interior. A large region of extremely low brightness temperatures in East Antarctica during winter suggests the presence of polar stratospheric cloud.
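The hybrid histogram / spatial-coherence estimate of fractional coverage reduces, for a single infrared channel, to linear mixing between the clear-sky and overcast brightness temperatures identified in the histogram. A hedged sketch of that step; the endmember values and names are invented for illustration:

```python
def cloud_fraction(t_obs, t_clear, t_cloud):
    """Fractional cloud cover of a partly cloudy pixel from linear mixing of
    clear-sky and overcast brightness temperatures (K), clamped to [0, 1]."""
    f = (t_clear - t_obs) / (t_clear - t_cloud)
    return min(max(f, 0.0), 1.0)
```

With the endmembers fixed per scene by the histogram analysis, applying this per pixel yields the fractional-coverage maps reported for the Arctic and Antarctic scenes.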

  4. Major modes of short-term climate variability in the newly developed NUIST Earth System Model (NESM)

    NASA Astrophysics Data System (ADS)

    Cao, Jian; Wang, Bin; Xiang, Baoqiang; Li, Juan; Wu, Tianjie; Fu, Xiouhua; Wu, Liguang; Min, Jinzhong

    2015-05-01

    A coupled earth system model (ESM) has been developed at the Nanjing University of Information Science and Technology (NUIST) by using version 5.3 of the European Centre Hamburg Model (ECHAM), version 3.4 of the Nucleus for European Modelling of the Ocean (NEMO), and version 4.1 of the Los Alamos sea ice model (CICE). The model is referred to as NUIST ESM1 (NESM1). Comprehensive and quantitative metrics are used to assess the model's major modes of climate variability most relevant to subseasonal-to-interannual climate prediction. The model's assessment is placed in a multi-model framework. The model yields a realistic annual mean and annual cycle of equatorial SST, and a reasonably realistic precipitation climatology, but has difficulty in capturing the spring-fall asymmetry and monsoon precipitation domains. The ENSO mode is reproduced well with respect to its spatial structure, power spectrum, phase locking to the annual cycle, and spatial structures of the central Pacific (CP)-ENSO and eastern Pacific (EP)-ENSO; however, the equatorial SST variability, biennial component of ENSO, and the amplitude of CP-ENSO are overestimated. The model captures realistic intraseasonal variability patterns, the vertical-zonal structures of the first two leading predictable modes of Madden-Julian Oscillation (MJO), and its eastward propagation; but the simulated MJO speed is significantly slower than observed. Compared with the T42 version, the high resolution version (T159) demonstrates improved simulation with respect to the climatology, interannual variance, monsoon-ENSO lead-lag correlation, spatial structures of the leading mode of the Asian-Australian monsoon rainfall variability, and the eastward propagation of the MJO.

  5. DMI's Baltic Sea Coastal operational forecasting system

    NASA Astrophysics Data System (ADS)

    Murawski, Jens; Berg, Per; Weismann Poulsen, Jacob

    2017-04-01

    Operational forecasting must bridge the gap between the large scales of the driving weather systems and the local, human scales of the model applications. The limit of what can be represented by local models has been continuously pushed to higher and higher spatial resolution, with the aim of better resolving the local dynamics and of describing processes that could only be parameterised in older versions, and with the ultimate goal of improving forecast quality. Current hardware trends demand a stronger focus on the development of efficient, highly parallelised software and require a refactoring of the code with a solid focus on portable performance. The performance gained can be used to run high-resolution models with larger coverage. Together with the development of efficient two-way nesting routines, this has made it possible to approach the near-coastal zone with model applications that run in a time-effective way. Denmark's Meteorological Institute uses the HBM(1) ocean circulation model for applications that cover the entire Baltic Sea and North Sea with an integrated model set-up spanning horizontal resolutions from 1 nm for the entire Baltic Sea to approx. 200 m in local fjords (Limfjord). For the next model generation, the high-resolution set-ups will be extended, and new high-resolution domains in coastal zones are either implemented or being tested for operational use. For the first time it will be possible to cover large stretches of the Baltic coastal zone with sufficiently high resolution to model the local hydrodynamics adequately. (1) HBM stands for HIROMB-BOOS-Model, where HIROMB stands for "High Resolution Model for the Baltic Sea" and BOOS for "Baltic Operational Oceanography System".

  6. Future changes in regional precipitation simulated by a half-degree coupled climate model: Sensitivity to horizontal resolution

    DOE PAGES

    Shields, Christine A.; Kiehl, Jeffrey T.; Meehl, Gerald A.

    2016-06-02

    The global fully coupled half-degree Community Climate System Model Version 4 (CCSM4) was integrated for a suite of climate change ensemble simulations including five historical runs, five Representative Concentration Pathway 8.5 (RCP8.5) runs, and a long Pre-Industrial control run. This study focuses on precipitation at regional scales and its sensitivity to horizontal resolution. The half-degree historical CCSM4 simulations are compared to observations, where relevant, and to the standard 1° CCSM4. Both the half-degree and 1° resolutions are coupled to a nominal 1° ocean. North American and South Asian/Indian monsoon regimes are highlighted because these regimes demonstrate improvements due to higher resolution, primarily because of better-resolved topography. Agriculturally sensitive areas are analyzed and include the Southwest, Central, and Southeast U.S., Southern Europe, and Australia. Both mean and extreme precipitation are discussed for convective and large-scale precipitation processes. Convective precipitation tends to decrease with increasing resolution and large-scale precipitation tends to increase. Improvements for the half-degree agricultural regions can be found for mean and extreme precipitation in the Southeast U.S., Southern Europe, and Australian regions. Climate change responses differ between the model resolutions for the U.S. Southwest/Central regions and are seasonally dependent in the Southeast and Australian regions. Both resolutions project a clear drying signal across Southern Europe due to increased greenhouse warming. As a result, differences between resolutions tied to the representation of convective and large-scale precipitation play an important role in the character of the climate change and depend on regional influences.

  7. Lunar Polar Illumination for Power Analysis

    NASA Technical Reports Server (NTRS)

    Fincannon, James

    2008-01-01

    This paper presents illumination analyses using the latest Earth-based radar digital elevation model (DEM) of the lunar south pole and an independently developed analytical tool. These results enable the optimum sizing of solar/energy storage lunar surface power systems since they quantify the timing and durations of illuminated and shadowed periods. Filtering and manual editing of the DEM based on comparisons with independent imagery were performed, and a reduced-resolution version of the DEM was produced to reduce the analysis time. A comparison of the DEM with lunar limb imagery was performed in order to validate the absolute heights over the polar latitude range, the accuracy of which affects the impact of long-range, shadow-casting terrain. Average illumination and energy storage duration maps of the south pole region are provided for the worst and best case lunar day using the reduced-resolution DEM. Average illumination fractions and energy storage durations are presented for candidate low energy storage duration south pole sites. The best site identified using the reduced-resolution DEM required a 62 hr energy storage duration using a fast recharge power system. Solar and horizon terrain elevations as well as illumination fraction profiles are presented for the best identified site, and the data for both the reduced-resolution and high-resolution DEMs are compared. High-resolution maps for three low energy storage duration areas are presented showing energy storage duration for the worst case lunar day, surface height, and maximum absolute surface slope.

  8. SoilGrids250m: Global gridded soil information based on machine learning

    PubMed Central

    Mendes de Jesus, Jorge; Heuvelink, Gerard B. M.; Ruiperez Gonzalez, Maria; Kilibarda, Milan; Blagotić, Aleksandar; Shangguan, Wei; Wright, Marvin N.; Geng, Xiaoyuan; Bauer-Marschallinger, Bernhard; Guevara, Mario Antonio; Vargas, Rodrigo; MacMillan, Robert A.; Batjes, Niels H.; Leenaars, Johan G. B.; Ribeiro, Eloi; Wheeler, Ichsani; Mantel, Stephan; Kempen, Bas

    2017-01-01

    This paper describes the technical development and accuracy assessment of the most recent and improved version of the SoilGrids system at 250m resolution (June 2016 update). SoilGrids provides global predictions for standard numeric soil properties (organic carbon, bulk density, Cation Exchange Capacity (CEC), pH, soil texture fractions and coarse fragments) at seven standard depths (0, 5, 15, 30, 60, 100 and 200 cm), in addition to predictions of depth to bedrock and distribution of soil classes based on the World Reference Base (WRB) and USDA classification systems (ca. 280 raster layers in total). Predictions were based on ca. 150,000 soil profiles used for training and a stack of 158 remote sensing-based soil covariates (primarily derived from MODIS land products, SRTM DEM derivatives, climatic images and global landform and lithology maps), which were used to fit an ensemble of machine learning methods (random forest and gradient boosting and/or multinomial logistic regression) as implemented in the R packages ranger, xgboost, nnet and caret. The results of 10-fold cross-validation show that the ensemble models explain between 56% (coarse fragments) and 83% (pH) of variation with an overall average of 61%. Improvements in the relative accuracy considering the amount of variation explained, in comparison to the previous version of SoilGrids at 1 km spatial resolution, range from 60 to 230%. Improvements can be attributed to: (1) the use of machine learning instead of linear regression, (2) considerable investments in preparing finer-resolution covariate layers and (3) the insertion of additional soil profiles. Further development of SoilGrids could include refinement of methods to incorporate input uncertainties and derivation of posterior probability distributions (per pixel), and further automation of spatial modeling so that soil maps can be generated for potentially hundreds of soil variables.
    Another area of future research is the development of methods for multiscale merging of SoilGrids predictions with local and/or national gridded soil products (e.g. up to 50 m spatial resolution) so that increasingly more accurate, complete and consistent global soil information can be produced. SoilGrids are available under the Open Database License. PMID:28207752
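    The ensemble step described above (random forest plus gradient boosting, scored by 10-fold cross-validation) can be sketched as follows. This is a minimal illustration in Python with scikit-learn on synthetic covariates; the paper itself uses the R packages ranger, xgboost, nnet and caret, so the library and data here are stand-in assumptions.

```python
# Minimal sketch of a two-model regression ensemble with 10-fold CV,
# in the spirit of the SoilGrids accuracy assessment (scikit-learn
# stands in for the R packages used by the paper).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for soil covariates (e.g. MODIS and DEM derivatives).
X, y = make_regression(n_samples=500, n_features=20, noise=10.0, random_state=0)

rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
gb = GradientBoostingRegressor(random_state=0).fit(X, y)

# Simple ensemble: average the two models' predictions.
pred = 0.5 * (rf.predict(X) + gb.predict(X))

# 10-fold cross-validated variance explained (R^2), the paper's metric.
r2 = cross_val_score(rf, X, y, cv=10, scoring="r2").mean()
```

    In the real system each of the ca. 280 output layers gets its own fitted ensemble, with the covariate stack shared across properties.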

  9. The Mars Climate Database (MCD version 5.3)

    NASA Astrophysics Data System (ADS)

    Millour, Ehouarn; Forget, Francois; Spiga, Aymeric; Vals, Margaux; Zakharov, Vladimir; Navarro, Thomas; Montabone, Luca; Lefevre, Franck; Montmessin, Franck; Chaufray, Jean-Yves; Lopez-Valverde, Miguel; Gonzalez-Galindo, Francisco; Lewis, Stephen; Read, Peter; Desjean, Marie-Christine; MCD/GCM Development Team

    2017-04-01

    Our Global Circulation Model (GCM) simulates the atmospheric environment of Mars. It is developed at LMD (Laboratoire de Meteorologie Dynamique, Paris, France) in close collaboration with several teams in Europe (LATMOS, France; the University of Oxford; The Open University; the Instituto de Astrofisica de Andalucia), and with the support of ESA (European Space Agency) and CNES (French Space Agency). GCM outputs are compiled to build the Mars Climate Database, a freely available tool useful for the scientific and engineering communities. The Mars Climate Database (MCD) has over the years been distributed to more than 300 teams around the world. The latest series of reference simulations have been compiled in a new version (v5.3) of the MCD, released in the first half of 2017. To summarize, MCD v5.3 provides: - Climatologies over a series of synthetic dust scenarios: standard (climatology) year, cold (i.e. low dust), warm (i.e. dusty atmosphere) and dust storm, all topped by various cases of extreme UV solar inputs (low, mean or maximum). These scenarios have been derived from a home-made, instrument-derived (TES, THEMIS, MCS, MER) dust climatology of the last 8 Martian years. The MCD also provides simulation outputs (MY24-31) representative of these actual years. - Mean values and statistics of the main meteorological variables (atmospheric temperature, density, pressure and winds), as well as surface pressure and temperature, CO2 ice cover, thermal and solar radiative fluxes, dust column opacity and mixing ratio, [H2O] vapor and ice columns, and concentrations of many species: [CO], [O2], [O], [N2], [H2], [O3], ...
    - A high resolution mode which combines high resolution (32 pixels/degree) MOLA topography records and Viking Lander 1 pressure records with raw lower-resolution GCM results to yield, within the restrictions of the procedure, high resolution values of atmospheric variables. - The possibility of reconstructing realistic conditions by combining the provided climatology with additional large-scale and small-scale perturbation schemes. At EGU, we will report on the latest improvements in the Mars Climate Database, with comparisons with available measurements from orbit (e.g. TES, MCS) and landers (Viking, Phoenix, MSL).

  10. Evaluation of Abdominal Computed Tomography Image Quality Using a New Version of Vendor-Specific Model-Based Iterative Reconstruction.

    PubMed

    Jensen, Corey T; Telesmanich, Morgan E; Wagner-Bartak, Nicolaus A; Liu, Xinming; Rong, John; Szklaruk, Janio; Qayyum, Aliya; Wei, Wei; Chandler, Adam G; Tamm, Eric P

    2017-01-01

    To qualitatively and quantitatively compare abdominal computed tomography (CT) images reconstructed with a new version of model-based iterative reconstruction (Veo 3.0; GE Healthcare) to those created with Veo 2.0. This retrospective study was approved by our institutional review board and was Health Insurance Portability and Accountability Act compliant. The raw data from 29 consecutive patients who had undergone CT abdomen scanning were used to reconstruct 4 sets of 3.75-mm axial images: Veo 2.0, Veo 3.0 standard, Veo 3.0 5% resolution preference (RP), and Veo 3.0 20% RP. A slice-thickness optimization of 3.75 mm and a texture feature were selected for the Veo 3.0 reconstructions. The images were reviewed by 3 independent readers in a blinded, randomized fashion using a 5-point Likert scale and a 5-point comparative scale. Multiple 2-dimensional circular regions of interest were defined for noise and contrast-to-noise ratio measurements. Line profiles were drawn across the 7 lp/cm bar pattern of the CatPhan 600 phantom for spatial resolution evaluation. The Veo 3.0 standard image set was scored better than Veo 2.0 in terms of artifacts (mean difference, 0.43; 95% confidence interval [95% CI], 0.25-0.6; P < 0.0001), overall image quality (mean difference, 0.87; 95% CI, 0.62-1.13; P < 0.0001) and qualitative resolution (mean difference, 0.9; 95% CI, 0.69-1.1; P < 0.0001). Although the Veo 3.0 standard and RP05 presets were preferred across most categories, the Veo 3.0 RP20 series ranked best for bone detail. Image noise and spatial resolution increased along a spectrum with Veo 2.0 the lowest and RP20 the highest. Veo 3.0 enhances imaging evaluation relative to Veo 2.0; readers preferred Veo 3.0 image appearance despite the associated mild increases in image noise. These results provide suggested parameters to be used clinically and as a basis for future evaluations, such as focal lesion detection, in the oncology setting.
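    The contrast-to-noise ratio measured from the circular regions of interest reduces to a simple ratio of ROI statistics. A minimal sketch, with synthetic Hounsfield-unit samples standing in for the clinical ROIs (the values and ROI names are illustrative, not from the study):

```python
# Hedged sketch: contrast-to-noise ratio (CNR) from two regions of
# interest. Synthetic HU samples stand in for pixels inside circular
# ROIs placed on the CT series.
import numpy as np

rng = np.random.default_rng(0)
roi_lesion = rng.normal(90.0, 12.0, size=500)   # HU samples in a lesion ROI
roi_backgd = rng.normal(60.0, 12.0, size=500)   # HU samples in background tissue

def cnr(roi_a, roi_b):
    """CNR = |mean_a - mean_b| / background noise (sample SD)."""
    return abs(roi_a.mean() - roi_b.mean()) / roi_b.std(ddof=1)

value = cnr(roi_lesion, roi_backgd)
```

    A reconstruction that lowers the background SD while preserving the mean attenuation difference raises the CNR, which is why noise and CNR are reported side by side above.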

  11. SoilGrids250m: Global gridded soil information based on machine learning.

    PubMed

    Hengl, Tomislav; Mendes de Jesus, Jorge; Heuvelink, Gerard B M; Ruiperez Gonzalez, Maria; Kilibarda, Milan; Blagotić, Aleksandar; Shangguan, Wei; Wright, Marvin N; Geng, Xiaoyuan; Bauer-Marschallinger, Bernhard; Guevara, Mario Antonio; Vargas, Rodrigo; MacMillan, Robert A; Batjes, Niels H; Leenaars, Johan G B; Ribeiro, Eloi; Wheeler, Ichsani; Mantel, Stephan; Kempen, Bas

    2017-01-01

    This paper describes the technical development and accuracy assessment of the most recent and improved version of the SoilGrids system at 250m resolution (June 2016 update). SoilGrids provides global predictions for standard numeric soil properties (organic carbon, bulk density, Cation Exchange Capacity (CEC), pH, soil texture fractions and coarse fragments) at seven standard depths (0, 5, 15, 30, 60, 100 and 200 cm), in addition to predictions of depth to bedrock and distribution of soil classes based on the World Reference Base (WRB) and USDA classification systems (ca. 280 raster layers in total). Predictions were based on ca. 150,000 soil profiles used for training and a stack of 158 remote sensing-based soil covariates (primarily derived from MODIS land products, SRTM DEM derivatives, climatic images and global landform and lithology maps), which were used to fit an ensemble of machine learning methods (random forest and gradient boosting and/or multinomial logistic regression) as implemented in the R packages ranger, xgboost, nnet and caret. The results of 10-fold cross-validation show that the ensemble models explain between 56% (coarse fragments) and 83% (pH) of variation with an overall average of 61%. Improvements in the relative accuracy considering the amount of variation explained, in comparison to the previous version of SoilGrids at 1 km spatial resolution, range from 60 to 230%. Improvements can be attributed to: (1) the use of machine learning instead of linear regression, (2) considerable investments in preparing finer-resolution covariate layers and (3) the insertion of additional soil profiles. Further development of SoilGrids could include refinement of methods to incorporate input uncertainties and derivation of posterior probability distributions (per pixel), and further automation of spatial modeling so that soil maps can be generated for potentially hundreds of soil variables.
    Another area of future research is the development of methods for multiscale merging of SoilGrids predictions with local and/or national gridded soil products (e.g. up to 50 m spatial resolution) so that increasingly more accurate, complete and consistent global soil information can be produced. SoilGrids are available under the Open Database License.

  12. Continuation of SAGE and MLS High-Resolution Ozone Profiles with the Suomi NPP OMPS Limb Profiler

    NASA Astrophysics Data System (ADS)

    Kramarova, N. A.; Bhartia, P. K.; Moy, L.; Chen, Z.; Frith, S. M.

    2015-12-01

    The Ozone Mapper and Profiler Suite (OMPS) Limb Profiler (LP) onboard the Suomi NPP satellite is designed to measure ozone profiles with a high vertical resolution (~2 km) and dense spatial sampling (~1° latitude). The LP sensor represents a new generation of US ozone profile instruments, with a follow-up limb instrument planned onboard the Joint Polar Satellite System 2 (JPSS-2) in 2021. In this study we will examine the suitability of using LP profiles to continue the EOS climate ozone profile record from the SAGE and MLS datasets. First, we evaluate the accuracy in determining the LP tangent height by analyzing measured and calculated radiances. The accurate estimation of the tangent height is critical for limb observations. Several methods were explored to estimate the uncertainties in the LP tangent height registration, and the results will be briefly summarized in this presentation. Version 2 of LP data, released in May 2014, includes a static adjustment of ~1.5 km and a dynamic tangent height adjustment within each orbit. A recent analysis of Version 2 Level 1 radiances revealed a 100 m step in the tangent height that occurred on 26 April 2013, due to a switch to two star trackers in determining spacecraft position. In addition, a ~200 m shift in the tangent height along each orbit was detected. These uncertainties in tangent height registration can affect the stability of the LP ozone record. Therefore, the second step in our study includes a validation of LP ozone profiles against correlative satellite ozone measurements (Aura MLS, ACE-FTS, OSIRIS, and SBUV) with a focus on time-dependent changes. We estimate relative drifts between OMPS LP and correlative ozone records to evaluate the stability of the LP measurements. We also test the tangent height corrections found in the internal analysis of Version 2 measurements to determine their effect on the long-term stability of the LP ozone record.

  13. The Community Climate System Model Version 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gent, Peter R.; Danabasoglu, Gokhan; Donner, Leo J.

    The fourth version of the Community Climate System Model (CCSM4) was recently completed and released to the climate community. This paper describes developments to all the CCSM components, and documents fully coupled pre-industrial control runs compared to the previous version, CCSM3. Using the standard atmosphere and land resolution of 1° results in the sea surface temperature biases in the major upwelling regions being comparable to the 1.4° resolution CCSM3. Two changes to the deep convection scheme in the atmosphere component result in the CCSM4 producing El Nino/Southern Oscillation variability with a much more realistic frequency distribution than the CCSM3, although the amplitude is too large compared to observations. They also improve the representation of the Madden-Julian Oscillation, and the frequency distribution of tropical precipitation. A new overflow parameterization in the ocean component leads to an improved simulation of the deep ocean density structure, especially in the North Atlantic. Changes to the CCSM4 land component lead to a much improved annual cycle of water storage, especially in the tropics. The CCSM4 sea ice component uses much more realistic albedos than the CCSM3, and the Arctic sea ice concentration is improved in the CCSM4. An ensemble of 20th century simulations produces an excellent match to the observed September Arctic sea ice extent from 1979 to 2005. The CCSM4 ensemble mean increase in globally-averaged surface temperature between 1850 and 2005 is larger than the observed increase by about 0.4°C. This is consistent with the fact that the CCSM4 does not include a representation of the indirect effects of aerosols, although other factors may come into play. The CCSM4 still has significant biases, such as the mean precipitation distribution in the tropical Pacific Ocean, too much low cloud in the Arctic, and the latitudinal distributions of short-wave and long-wave cloud forcings.

  14. C-GLORSv5: an improved multipurpose global ocean eddy-permitting physical reanalysis

    NASA Astrophysics Data System (ADS)

    Storto, Andrea; Masina, Simona

    2016-11-01

    Global ocean reanalyses combine in situ and satellite ocean observations with a general circulation ocean model to estimate the time-evolving state of the ocean, and they represent a valuable tool for a variety of applications, ranging from climate monitoring and process studies to downstream applications, initialization of long-range forecasts and regional studies. The purpose of this paper is to document the recent upgrade of C-GLORS (version 5), the latest ocean reanalysis produced at the Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), which covers the meteorological satellite era (1980-present) and is updated in delayed-time mode. The reanalysis is run at eddy-permitting resolution (1/4° horizontal resolution and 50 vertical levels) and consists of a three-dimensional variational data assimilation system, a surface nudging and a bias correction scheme. With respect to the previous version (v4), C-GLORSv5 contains a number of improvements. In particular, background- and observation-error covariances have been retuned, allowing a flow-dependent inflation in the globally averaged background-error variance. An additional constraint on the Arctic sea-ice thickness was introduced, leading to a realistic ice volume evolution. Finally, the bias correction scheme and the initialization strategy were retuned. Results document that the new reanalysis outperforms the previous version in many aspects, especially in representing the variability of global heat content and associated steric sea level in the last decade, the top 80 m ocean temperature biases and root mean square errors, and the Atlantic Ocean meridional overturning circulation; slight worsening in high-latitude salinity and deep-ocean temperature emerges, though, providing the motivation for further tuning of the reanalysis system. The dataset is available in NetCDF format at doi:10.1594/PANGAEA.857995.
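    The three-dimensional variational assimilation at the core of such a system minimizes the standard 3D-Var cost function J(x) = (x-x_b)^T B^-1 (x-x_b) + (y-Hx)^T R^-1 (y-Hx). A minimal textbook sketch for a linear observation operator, using the equivalent Kalman-gain form of the analysis (illustrative only, not the actual C-GLORS implementation):

```python
# Hedged sketch: textbook 3D-Var analysis step with a linear
# observation operator H. For linear H, the minimizer of the 3D-Var
# cost function is x_a = x_b + K (y - H x_b), K = B H^T (H B H^T + R)^-1.
import numpy as np

n, m = 4, 2                                       # state size, obs count
x_b = np.array([1.0, 2.0, 3.0, 4.0])              # background state
B = np.eye(n)                                     # background-error covariance
R = 0.5 * np.eye(m)                               # observation-error covariance
H = np.array([[1.0, 0, 0, 0],
              [0, 0, 1.0, 0]])                    # observe components 0 and 2
y = np.array([1.5, 2.5])                          # observations

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)      # Kalman gain
x_a = x_b + K @ (y - H @ x_b)                     # analysis state
```

    Note how the unobserved components (indices 1 and 3) are untouched here because B is diagonal; in a real ocean system the off-diagonal structure of B spreads each innovation to neighbouring grid points and levels.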

  15. Quantification of terrestrial ecosystem carbon dynamics in the conterminous United States combining a process-based biogeochemical model and MODIS and AmeriFlux data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Min; Zhuang, Qianlai; Cook, D.

    2011-08-31

    Satellite remote sensing provides continuous temporal and spatial information on terrestrial ecosystems. Using these remote sensing data together with eddy flux measurements and biogeochemical models, such as the Terrestrial Ecosystem Model (TEM), should provide a more adequate quantification of carbon dynamics of terrestrial ecosystems. Here we use Moderate Resolution Imaging Spectroradiometer (MODIS) Enhanced Vegetation Index (EVI), Land Surface Water Index (LSWI) and carbon flux data of AmeriFlux to conduct such a study. We first modify the gross primary production (GPP) modeling in TEM by incorporating EVI and LSWI to account for the effects of changes in canopy photosynthetic capacity, phenology and water stress. Second, we parameterize and verify the new version of TEM with eddy flux data. We then apply the model to the conterminous United States over the period 2000-2005 at a 0.05° × 0.05° spatial resolution. We find that the new version of TEM improves over the previous version and generally captures the expected temporal and spatial patterns of regional carbon dynamics. We estimate that regional GPP is between 7.02 and 7.78 PgC yr⁻¹, net primary production (NPP) ranges from 3.81 to 4.38 PgC yr⁻¹ and net ecosystem production (NEP) varies within 0.08-0.73 PgC yr⁻¹ over the period 2000-2005 for the conterminous United States. The uncertainty due to parameterization is 0.34, 0.65 and 0.18 PgC yr⁻¹ for the regional estimates of GPP, NPP and NEP, respectively. The effects of extreme climate and disturbances such as the severe drought in 2002 and destructive Hurricane Katrina in 2005 were captured by the model. Our study provides a new independent and more adequate measure of carbon fluxes for the conterminous United States, which will benefit studies of carbon-climate feedback and inform policy-making on carbon management and climate.
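    A GPP formulation of the kind described, with EVI entering as the vegetation term and LSWI as a water-stress scalar within a light-use-efficiency framework, can be sketched as follows. The functional forms and parameter values here are illustrative assumptions, not TEM's actual equations:

```python
# Hedged sketch of a light-use-efficiency GPP model with EVI and LSWI
# scalars, in the spirit of the TEM modification described above.
# All functional forms and parameters are illustrative, not TEM's own.
def gpp(par, evi, lswi, t_air, eps_max=0.6, lswi_max=0.6,
        t_min=0.0, t_opt=20.0, t_max=40.0):
    """GPP (gC m-2 d-1) = eps_max * f(T) * W(LSWI) * EVI * PAR."""
    # Temperature scalar in [0, 1], peaking at t_opt.
    if t_air <= t_min or t_air >= t_max:
        f_t = 0.0
    else:
        f_t = ((t_air - t_min) * (t_air - t_max)) / (
            (t_air - t_min) * (t_air - t_max) - (t_air - t_opt) ** 2)
    # Water-stress scalar from LSWI, capped at 1.
    w = min((1.0 + lswi) / (1.0 + lswi_max), 1.0)
    return eps_max * f_t * w * evi * par

example = gpp(par=10.0, evi=0.5, lswi=0.2, t_air=20.0)
```

    With the values above (PAR = 10 MJ m-2 d-1, EVI = 0.5, LSWI = 0.2, air temperature at the optimum), the temperature scalar is 1 and the water scalar 0.75, so the toy GPP is 2.25.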

  16. Parallelization of a Fully-Distributed Hydrologic Model using Sub-basin Partitioning

    NASA Astrophysics Data System (ADS)

    Vivoni, E. R.; Mniszewski, S.; Fasel, P.; Springer, E.; Ivanov, V. Y.; Bras, R. L.

    2005-12-01

    A primary obstacle towards advances in watershed simulations has been the limited computational capacity available to most models. The growing trend of model complexity, data availability and physical representation has not been matched by adequate developments in computational efficiency. This situation has created a serious bottleneck which limits existing distributed hydrologic models to small domains and short simulations. In this study, we present novel developments in the parallelization of a fully-distributed hydrologic model. Our work is based on the TIN-based Real-time Integrated Basin Simulator (tRIBS), which provides continuous hydrologic simulation using a multiple resolution representation of complex terrain based on a triangulated irregular network (TIN). While the use of TINs reduces computational demand, the sequential version of the model is currently limited over large basins (>10,000 km2) and long simulation periods (>1 year). To address this, a parallel MPI-based version of the tRIBS model has been implemented and tested using high performance computing resources at Los Alamos National Laboratory. Our approach utilizes domain decomposition based on sub-basin partitioning of the watershed. A stream reach graph based on the channel network structure is used to guide the sub-basin partitioning. Individual sub-basins or sub-graphs of sub-basins are assigned to separate processors to carry out internal hydrologic computations (e.g. rainfall-runoff transformation). Routed streamflow from each sub-basin forms the major hydrologic data exchange along the stream reach graph. Individual sub-basins also share subsurface hydrologic fluxes across adjacent boundaries. We demonstrate how the sub-basin partitioning provides computational feasibility and efficiency for a set of test watersheds in northeastern Oklahoma. 
We compare the performance of the sequential and parallelized versions to highlight the efficiency gained as the number of processors increases. We also discuss how the coupled use of TINs and parallel processing can lead to feasible long-term simulations in regional watersheds while preserving basin properties at high-resolution.
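    The sub-basin-to-processor assignment can be illustrated with a simple greedy load balancer over estimated per-sub-basin work (e.g. TIN node counts). This is a deliberate simplification of the tRIBS partitioning, which additionally uses the stream reach graph to keep connected sub-basins together; the sub-basin names and cell counts below are made up:

```python
# Hedged sketch: assign sub-basins to processors by greedy load
# balancing on estimated work. A simplification of the tRIBS
# sub-basin partitioning, not its actual algorithm.
import heapq

def partition(subbasin_cells, n_procs):
    """subbasin_cells: {subbasin_id: cell_count}. Returns
    {subbasin_id: processor}, always filling the least-loaded processor."""
    heap = [(0, p) for p in range(n_procs)]      # (current load, proc id)
    heapq.heapify(heap)
    assignment = {}
    # Placing the largest sub-basins first yields a better balance.
    for sb, cells in sorted(subbasin_cells.items(), key=lambda kv: -kv[1]):
        load, p = heapq.heappop(heap)
        assignment[sb] = p
        heapq.heappush(heap, (load + cells, p))
    return assignment

cells = {"A": 500, "B": 300, "C": 200, "D": 100}
assign = partition(cells, 2)
```

    After assignment, sub-basins on different processors exchange only routed streamflow along the reach graph and subsurface fluxes across shared boundaries, which is what keeps the MPI communication volume low.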

  17. AIRS Version 6 Products and Data Services at NASA GES DISC

    NASA Astrophysics Data System (ADS)

    Ding, F.; Savtchenko, A. K.; Hearty, T. J.; Theobald, M. L.; Vollmer, B.; Esfandiari, E.

    2013-12-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is the home of processing, archiving, and distribution services for data from the Atmospheric Infrared Sounder (AIRS) mission. The AIRS mission is entering its 11th year of global observations of the atmospheric state, including temperature and humidity profiles, outgoing longwave radiation, cloud properties, and trace gases. The GES DISC, in collaboration with the AIRS Project, released data from the Version 6 algorithm in early 2013. The new algorithm represents a significant improvement over previous versions in terms of greater stability, yield, and quality of products. Among the most substantial advances are: improved soundings of Tropospheric and Sea Surface Temperatures; larger improvements with increasing cloud cover; improved retrievals of surface spectral emissivity; near-complete removal of spurious temperature bias trends seen in earlier versions; substantially improved retrieval yield (i.e., number of soundings accepted for output) for climate studies; AIRS-Only retrievals with comparable accuracy to AIRS+AMSU (Advanced Microwave Sounding Unit) retrievals; and more realistic hemispheric seasonal variability and global distribution of carbon monoxide. The GES DISC is working to bring the distribution services up-to-date with these new developments. Our focus is on popular services, like variable subsetting and quality screening, which are impacted by the new elements in Version 6. Other developments in visualization services, such as Giovanni, Near-Real Time imagery, and a granule-map viewer, are progressing along with the introduction of the new data; each service presents its own challenge. This presentation will demonstrate the most significant improvements in Version 6 AIRS products, such as newly added variables (higher resolution outgoing longwave radiation, new cloud property products, etc.), the new quality control schema, and improved retrieval yields. 
We will also demonstrate the various distribution and visualization services for AIRS data products. The cloud properties, model physics, and water and energy cycles research communities are invited to take advantage of the improvements in Version 6 AIRS products and the various services at GES DISC which provide them.

  18. Highlights of the Version 8 SBUV and TOMS Datasets Released at this Symposium

    NASA Technical Reports Server (NTRS)

    Bhartia, Pawan K.; McPeters, Richard D.; Flynn, Lawrence E.; Wellemeyer, Charles G.

    2004-01-01

    Last October was the 25th anniversary of the launch of the SBUV and TOMS instruments on NASA's Nimbus-7 satellite. Total ozone and ozone profile datasets produced by these and subsequent instruments constitute a quarter-century-long record. Over time we have released several versions of these datasets to incorporate advances in UV radiative transfer, inverse modeling, and instrument characterization. In this meeting we are releasing datasets produced from the version 8 algorithms. They replace the previous versions (V6 SBUV, and V7 TOMS) released about a decade ago. About a dozen companion papers in this meeting provide details of the new algorithms and intercomparison of the new data with external data. In this paper we present key features of the new algorithm, and discuss how the new results differ from those released previously. We show that the new datasets have better internal consistency and also agree better with external datasets. A key feature of the V8 SBUV algorithm is that the climatology has no influence on inter-annual variability and trends; it only affects the mean values and, to a limited extent, the seasonal dependence. By contrast, climatology does have some influence on TOMS total O3 trends, particularly at large solar zenith angles. For this reason, and also because the TOMS record has gaps and EP/TOMS is suffering from data quality problems, we recommend using SBUV total ozone data for applications where the high spatial resolution of TOMS is not essential.

  19. Hi-Corrector: a fast, scalable and memory-efficient package for normalizing large-scale Hi-C data.

    PubMed

    Li, Wenyuan; Gong, Ke; Li, Qingjiao; Alber, Frank; Zhou, Xianghong Jasmine

    2015-03-15

    Genome-wide proximity ligation assays, e.g. Hi-C and its variant TCC, have recently become important tools to study spatial genome organization. Removing biases from chromatin contact matrices generated by such techniques is a critical preprocessing step of subsequent analyses. The continuing decline of sequencing costs has led to an ever-improving resolution of Hi-C data, resulting in very large matrices of chromatin contacts. Such large matrices, however, pose a great challenge to the memory usage and speed of normalization. Therefore, there is an urgent need for fast and memory-efficient methods for normalization of Hi-C data. We developed Hi-Corrector, an easy-to-use, open source implementation of the Hi-C data normalization algorithm. Its salient features are (i) scalability: the software is capable of normalizing Hi-C data of any size in reasonable time; (ii) memory efficiency: the sequential version can run on any single computer with very limited memory; (iii) speed: the parallel version can run very fast on multiple computing nodes with limited local memory. The sequential version is implemented in ANSI C and can be easily compiled on any system; the parallel version is implemented in ANSI C with the MPI library (a standardized and portable parallel environment designed for solving large-scale scientific problems). The package is freely available at http://zhoulab.usc.edu/Hi-Corrector/. © The Author 2014. Published by Oxford University Press.
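    The normalization implemented by Hi-Corrector belongs to the iterative-correction (matrix balancing) family: multiplicative row/column biases are removed by repeatedly rescaling until all row sums of the symmetric contact matrix are equal. A minimal serial NumPy sketch of that idea (not the package's ANSI C/MPI code):

```python
# Hedged sketch: iterative correction (matrix balancing) of a symmetric
# chromatin contact matrix. Each pass divides by the normalized row
# sums symmetrically, driving all row sums toward a common value.
import numpy as np

def iterative_correction(M, n_iter=50):
    """Return the balanced matrix and the accumulated per-bin bias."""
    W = M.astype(float).copy()
    bias = np.ones(len(W))
    for _ in range(n_iter):
        s = W.sum(axis=1)
        s = s / s.mean()            # keep the overall scale roughly fixed
        s[s == 0] = 1.0             # leave empty rows untouched
        W = W / np.outer(s, s)      # symmetric update
        bias *= s
    return W, bias

rng = np.random.default_rng(1)
raw = rng.poisson(20, size=(6, 6)).astype(float)
raw = raw + raw.T                   # contact matrices are symmetric
corrected, bias = iterative_correction(raw)
```

    The parallel version distributes row blocks of the matrix across MPI ranks; since each pass only needs the global row-sum vector, the per-node memory stays bounded by one block.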

  20. Surface mineral maps of Afghanistan derived from HyMap imaging spectrometer data, version 2

    USGS Publications Warehouse

    Kokaly, Raymond F.; King, Trude V.V.; Hoefen, Todd M.

    2013-01-01

    This report presents a new version of surface mineral maps derived from HyMap imaging spectrometer data collected over Afghanistan in the fall of 2007. This report also describes the processing steps applied to the imaging spectrometer data. The 218 individual flight lines composing the Afghanistan dataset, covering more than 438,000 square kilometers, were georeferenced to a mosaic of orthorectified Landsat images. The HyMap data were converted from radiance to reflectance using a radiative transfer program in combination with ground-calibration sites and a network of cross-cutting calibration flight lines. The U.S. Geological Survey Material Identification and Characterization Algorithm (MICA) was used to generate two thematic maps of surface minerals: a map of iron-bearing minerals and other materials, which have their primary absorption features at the shorter wavelengths of the reflected solar wavelength range, and a map of carbonates, phyllosilicates, sulfates, altered minerals, and other materials, which have their primary absorption features at the longer wavelengths of the reflected solar wavelength range. In contrast to the original version, version 2 of these maps is provided at full resolution of 23-meter pixel size. The thematic maps, MICA summary images, and the material fit and depth images are distributed in digital files linked to this report, in a format readable by remote sensing software and Geographic Information Systems (GIS). The digital files can be downloaded from http://pubs.usgs.gov/ds/787/downloads/.

  1. A variable vertical resolution weather model with an explicitly resolved planetary boundary layer

    NASA Technical Reports Server (NTRS)

    Helfand, H. M.

    1981-01-01

    A version of the fourth-order weather model incorporating surface wind stress data from SEASAT-A scatterometer observations is presented. The Monin-Obukhov similarity theory is used to relate winds at the top of the surface layer to surface wind stress. A reasonable approximation of the surface fluxes of heat, moisture, and momentum is obtainable using this method. A Richardson number adjustment scheme based on the ideas of Chang is used to allow for turbulence effects.

  2. A high throughput spectral image microscopy system

    NASA Astrophysics Data System (ADS)

    Gesley, M.; Puri, R.

    2018-01-01

    A high throughput spectral image microscopy system is configured for rapid detection of rare cells in large populations. To overcome the rate limits of flow cytometry and its reliance on fluorophore tags, the system architecture integrates sample mechanical handling, signal processors, and optics in a non-confocal version of light absorption and scattering spectroscopic microscopy. Spectral images with native contrast do not require the use of exogenous stains to render cells at submicron resolution. Structure may be characterized without restriction to cell clusters of differentiation.

  3. Software Management System

    NASA Technical Reports Server (NTRS)

    1994-01-01

    A software management system, originally developed for Goddard Space Flight Center (GSFC) by Century Computing, Inc., has evolved from a menu- and command-oriented system to a state-of-the-art user interface development system supporting high resolution graphics workstations. The Transportable Applications Environment (TAE) was initially distributed through COSMIC and backed by a TAE support office at GSFC. In 1993, Century Computing assumed the support and distribution functions and began marketing TAE Plus, the system's latest version. The software is easy to use and does not require programming experience.

  4. MINOS 5.0 User’s Guide.

    DTIC Science & Technology

    1983-12-01

    Systems Optimization Laboratory. MINOS 5.0 USER'S GUIDE by...encouragement from George Dantzig and the benefit of his modeling activity within SOL, notably on the energy-economic model PILOT. We thank him warmly for...provided by running various versions of MINOS during their work on PILOT. (We note that PILOT has grown to 1500 constraints and 4000 variables, and now has

  5. Performance and quality assessment of the global ocean eddy-permitting physical reanalysis GLORYS2V4.

    NASA Astrophysics Data System (ADS)

    Garric, Gilles; Parent, Laurent; Greiner, Eric; Drévillon, Marie; Hamon, Mathieu; Lellouche, Jean-Michel; Régnier, Charly; Desportes, Charles; Le Galloudec, Olivier; Bricaud, Clement; Drillet, Yann; Hernandez, Fabrice; Le Traon, Pierre-Yves

    2017-04-01

    The purpose of this presentation is to give an overview of the recent upgrade of GLORYS2 (version 4, GLORYS2V4 hereafter), the latest ocean reanalysis produced at Mercator Ocean, which covers the altimetry era (1993-2015) in the framework of the Copernicus Marine Environment Monitoring Service (CMEMS; http://marine.copernicus.eu/). The reanalysis is run at eddy-permitting resolution (¼° horizontal resolution and 75 vertical levels) with the NEMO model and driven at the surface by the ERA-Interim reanalysis from ECMWF (European Centre for Medium-Range Weather Forecasts). The reanalysis system uses a multi-data and multivariate reduced-order Kalman filter based on the singular extended evolutive Kalman (SEEK) filter formulation together with a 3D-VAR large-scale bias correction. The assimilated observations are along-track satellite altimetry, sea surface temperature, sea ice concentration and in situ profiles of temperature and salinity. With respect to the previous version (GLORYS2V3), GLORYS2V4 contains a number of improvements. In particular: a) new initial temperature and salinity conditions derived from the EN4 data base, with a better mass equilibrium with altimetry; b) the use of the updated delayed-mode CORA in situ observations from CMEMS; c) a new hybrid Mean Dynamic Topography (MDT) for the assimilation scheme, referenced over the 1993-2013 period; d) a better observation operator for altimetry observations in the data assimilation scheme; e) a correction of large-scale ERA-Interim atmospheric surface (precipitation and radiative) fluxes, as in GLORYS2V3 but toward a new satellite data set; and f) an update of the climatological runoff data base using the latest version of Dai's 2009 data set for the global ocean, together with a better account of freshwater fluxes from polar ice sheets' glaciers.
The presentation will show that the new reanalysis outperforms the previous version in many aspects, such as biases and root mean squared errors, and especially in representing the variability of global heat and salt content and the associated steric sea level over the last two decades. The dataset is available in NetCDF format and the GLORYS2V4 best analysis products are distributed on the CMEMS data portal.

  6. Assessment of Precipitation Trends over Europe by Comparing ERA-20C with a New Homogenized Observational GPCC Dataset

    NASA Astrophysics Data System (ADS)

    Rustemeier, E.; Ziese, M.; Meyer-Christoffer, A.; Finger, P.; Schneider, U.; Becker, A.

    2015-12-01

    Reliable data is essential for robust climate analysis. The ERA-20C reanalysis was developed during the projects ERA-CLIM and ERA-CLIM2. These projects focus on multi-decadal reanalyses of the global climate system. To ensure data quality and provide end users with information about uncertainties in these products, the 4th work package of ERA-CLIM2 deals with the quality assessment of the products, including quality control and error estimation. In doing so, the monthly totals of the ERA-20C reanalysis are compared to two corresponding Global Precipitation Climatology Centre (GPCC) products: the Full Data Reanalysis Version 7 and the new HOMogenized PRecipitation Analysis of European in-situ data (HOMPRA Europe). The ERA-20C reanalysis was produced with ECMWF's IFS version Cy38r1 at a spatial resolution of about 125 km. It covers the time period 1900 to 2010. Only surface observations are assimilated, namely marine winds and pressure. This allows the comparison with independent, non-assimilated data. The GPCC Full Data Reanalysis Version 7 comprises monthly land-surface precipitation from approximately 75,000 rain gauges covering the time period 1901-2013. For this paper, the version with 1° resolution is utilized. For trend analysis, a monthly European subset of the ERA-20C reanalysis is investigated, spanning the years 1951-2005. The European subset is compared to the new homogenized GPCC data set HOMPRA Europe. The latter is based on a collective of 5373 homogenized monthly rain gauge time series, carefully chosen from the GPCC archive of precipitation data. For the spatial and temporal evaluation of ERA-20C, global scores on monthly, seasonal and annual time scales are calculated. These include contingency table scores and correlation, along with spatial scores such as the fractional skill score. Unsurprisingly, the regions with the strongest deviations are those of data scarcity, mountainous regions with their windward and lee effects, and monsoon regions.
They all exhibit strong biases throughout their series, and severe shifts in the means. The new HOMPRA Europe data set is useful in particular for trend analysis. Therefore it is compared to a monthly European subset of the ERA-20C reanalysis for the same period, i.e. the years 1951-2005, to study the ERA-20C capability in reproducing observed trends across Europe.

  7. Development of fine-resolution analyses and expanded large-scale forcing properties. Part II: Scale-awareness and application to single-column model experiments

    DOE PAGES

    Feng, Sha; Vogelmann, Andrew M.; Li, Zhijin; ...

    2015-01-20

    Fine-resolution three-dimensional fields have been produced using the Community Gridpoint Statistical Interpolation (GSI) data assimilation system for the U.S. Department of Energy's Atmospheric Radiation Measurement Program (ARM) Southern Great Plains region. The GSI system is implemented in a multi-scale data assimilation framework using the Weather Research and Forecasting model at a cloud-resolving resolution of 2 km. From the fine-resolution three-dimensional fields, large-scale forcing is derived explicitly at grid-scale resolution; a subgrid-scale dynamic component is derived separately, representing subgrid-scale horizontal dynamic processes. Analyses show that the subgrid-scale dynamic component is often a major component relative to the large-scale forcing for grid scales larger than 200 km. The single-column model (SCM) of the Community Atmospheric Model version 5 (CAM5) is used to examine the impact of the grid-scale and subgrid-scale dynamic components on simulated precipitation and cloud fields associated with a mesoscale convective system. It is found that grid-scale size impacts simulated precipitation, resulting in an overestimation for grid scales of about 200 km but an underestimation for smaller grids. The subgrid-scale dynamic component has an appreciable impact on the simulations, suggesting that grid-scale and subgrid-scale dynamic components should be considered in the interpretation of SCM simulations.

  8. Downscaling SMAP Radiometer Soil Moisture over the CONUS using Soil-Climate Information and Ensemble Learning

    NASA Astrophysics Data System (ADS)

    Abbaszadeh, P.; Moradkhani, H.

    2017-12-01

    Soil moisture contributes significantly to the improvement of weather and climate forecasts and to understanding terrestrial ecosystem processes. It is a key hydrologic variable in agricultural drought monitoring, flood modeling and irrigation management. While satellite retrievals can provide unprecedented information on soil moisture at global scale, the products are generally at coarse spatial resolutions (25-50 km). This often hampers their use in regional or local studies, which normally require data at finer resolution. This work presents a new framework based on an ensemble learning method, using soil-climate information derived from remote-sensing and ground-based observations, to downscale the level 3 daily composite version (L3_SM_P) of SMAP radiometer soil moisture over the Continental U.S. (CONUS) to 1 km spatial resolution. In the proposed method, a suite of remotely sensed and in situ data sets, in addition to soil texture information and topography data, among others, were used. The downscaled product was validated against in situ soil moisture measurements collected from a limited number of core validation sites and several hundred sparse soil moisture networks throughout the CONUS. The results indicate a great potential of the proposed methodology to derive fine-resolution soil moisture information applicable to fine-resolution hydrologic modeling, data assimilation and other regional studies.
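The abstract names only "an ensemble learning method", so the sketch below stands in with a simple bagged ensemble of least-squares linear models on synthetic data; the predictors and target are hypothetical and illustrate only the downscaling-by-regression idea, not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def bagged_linear_fit(X, y, n_models=25):
    """Tiny bagging ensemble: each member is a least-squares linear
    model fit on a bootstrap resample; coefficients are averaged."""
    n = len(y)
    Xb = np.column_stack([X, np.ones(n)])      # add an intercept column
    coefs = []
    for _ in range(n_models):
        idx = rng.integers(0, n, n)            # bootstrap resample
        c, *_ = np.linalg.lstsq(Xb[idx], y[idx], rcond=None)
        coefs.append(c)
    return np.mean(coefs, axis=0)

def bagged_predict(coef, X):
    return np.column_stack([X, np.ones(len(X))]) @ coef

# Synthetic stand-ins: 4 hypothetical fine-scale predictors (e.g.
# elevation, clay fraction, NDVI, land surface temperature) mapped to
# a made-up fine-scale soil moisture target.
n = 500
X = rng.random((n, 4))
true_w = np.array([0.5, -0.2, 0.3, 0.1])
y = X @ true_w + 0.01 * rng.standard_normal(n)

coef = bagged_linear_fit(X[:400], y[:400])
pred = bagged_predict(coef, X[400:])
rmse = np.sqrt(np.mean((pred - y[400:]) ** 2))
print(f"hold-out RMSE: {rmse:.3f}")
```

In a real downscaling pipeline the members would be nonlinear learners trained on coarse-pixel averages and applied at the 1 km predictor grid; the hold-out evaluation above mirrors the validation against independent in situ networks.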

  9. The ALMA correlator

    NASA Astrophysics Data System (ADS)

    Escoffier, R. P.; Comoretto, G.; Webber, J. C.; Baudry, A.; Broadwell, C. M.; Greenberg, J. H.; Treacy, R. R.; Cais, P.; Quertier, B.; Camino, P.; Bos, A.; Gunst, A. W.

    2007-02-01

    Aims: The Atacama Large Millimeter Array (ALMA) is an international astronomy facility to be used for detecting and imaging all types of astronomical sources at millimeter and submillimeter wavelengths at a 5000-m elevation site in the Atacama Desert of Chile. Our main aims are to describe the correlator subsystem, the part of the ALMA system that combines the signals from up to 64 remote individual radio antennas and forms them into a single instrument, and to emphasize the high spectral resolution and the configuration flexibility available with the ALMA correlator. Methods: The main digital signal processing features and a block diagram of the correlator being constructed for the ALMA radio astronomy observatory are presented. Tables of observing modes and spectral resolutions offered by the correlator system are given together with some examples of multi-resolution spectral modes. Results: The correlator is delivered by quadrants and the first quadrant is being tested while most of the other printed circuit cards required by the system have been produced. In its final version the ALMA correlator will process the outputs of up to 64 antennas using an instantaneous bandwidth of 8 GHz in each of two polarizations per antenna. In the frequency division mode, unrivalled spectral flexibility together with very high resolution (3.8 kHz) and up to 8192 spectral points are achieved. In the time division mode high time resolution is available with minimum data dump rates of 16 ms for all cross-products.

  10. Timing Characterization of Helium-4 Fast Neutron Detector with EJ-309 Organic Liquid Scintillator

    NASA Astrophysics Data System (ADS)

    Liang, Yinong; Zhu, Ting; Enqvist, Andreas

    2018-01-01

    Recently, Helium-4 gas fast neutron scintillation detectors have been used in time-sensitive measurements, such as time-of-flight and multiplicity counting. In this paper, a set of time-aligned signals was acquired in a coincidence measurement using the Helium-4 gas detectors and EJ-309 liquid scintillators. The high-speed digitizer system implements a trigger moving average window (MAW) unit combined with its constant fraction discriminator (CFD) feature. It can calculate a "time offset" to the timestamp value to obtain a higher-resolution timestamp (50 ps), finer than the digitizer's native time resolution (4 ns) [1]. The digitized waveforms were saved to the computer hard drive and post-processed with digital analysis code to determine the difference in their arrival times. The full width at half maximum (FWHM) of the Gaussian fit was used to examine the resolution. For the cascade decay of Cobalt-60 (1.17 and 1.33 MeV), the first version of the Helium-4 detector, with two Hamamatsu R580 photomultipliers (PMTs) installed at either end of the cylindrical gas chamber (20 cm in length and 4.4 cm in diameter), has a time resolution of about 3.139 ns FWHM. With improved knowledge of the timing performance, the Helium-4 scintillation detectors are excellent for neutron energy spectrometry applications requiring high temporal and energy resolutions.
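The FWHM quoted above relates to the fitted Gaussian width through FWHM = 2*sqrt(2 ln 2) * sigma ≈ 2.355 * sigma. A minimal numeric sketch, with simulated arrival-time differences standing in for the measured ones:

```python
import numpy as np

FWHM_PER_SIGMA = 2 * np.sqrt(2 * np.log(2))   # ≈ 2.3548

rng = np.random.default_rng(1)
# Simulated arrival-time differences (ns) between the two signals; the
# Gaussian width plays the role of the combined detector timing jitter.
dt = rng.normal(loc=0.0, scale=1.333, size=20000)

sigma_est = dt.std(ddof=1)         # a moment estimate stands in for a fit
fwhm = FWHM_PER_SIGMA * sigma_est  # lands near the 3.139 ns quoted above
print(f"estimated FWHM: {fwhm:.3f} ns")
```

In the real analysis the histogram of arrival-time differences is fitted with a Gaussian and the FWHM is read from the fitted width rather than from a sample standard deviation.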

  11. Comparison of the GOSAT TANSO-FTS TIR CH4 volume mixing ratio vertical profiles with those measured by ACE-FTS, ESA MIPAS, IMK-IAA MIPAS, and 16 NDACC stations

    NASA Astrophysics Data System (ADS)

    Olsen, Kevin S.; Strong, Kimberly; Walker, Kaley A.; Boone, Chris D.; Raspollini, Piera; Plieninger, Johannes; Bader, Whitney; Conway, Stephanie; Grutter, Michel; Hannigan, James W.; Hase, Frank; Jones, Nicholas; de Mazière, Martine; Notholt, Justus; Schneider, Matthias; Smale, Dan; Sussmann, Ralf; Saitoh, Naoko

    2017-10-01

    The primary instrument on the Greenhouse gases Observing SATellite (GOSAT) is the Thermal And Near infrared Sensor for carbon Observations (TANSO) Fourier transform spectrometer (FTS). TANSO-FTS uses three short-wave infrared (SWIR) bands to retrieve total columns of CO2 and CH4 along its optical line of sight and one thermal infrared (TIR) channel to retrieve vertical profiles of CO2 and CH4 volume mixing ratios (VMRs) in the troposphere. We examine version 1 of the TANSO-FTS TIR CH4 product by comparing co-located CH4 VMR vertical profiles from two other remote-sensing FTS systems: the Canadian Space Agency's Atmospheric Chemistry Experiment FTS (ACE-FTS) on SCISAT (version 3.5) and the European Space Agency's Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) on Envisat (ESA ML2PP version 6 and IMK-IAA reduced-resolution version V5R_CH4_224/225), as well as 16 ground stations with the Network for the Detection of Atmospheric Composition Change (NDACC). This work follows an initial inter-comparison study over the Arctic, which incorporated a ground-based FTS at the Polar Environment Atmospheric Research Laboratory (PEARL) at Eureka, Canada, and focuses on tropospheric and lower-stratospheric measurements made at middle and tropical latitudes between 2009 and 2013 (mid-2012 for MIPAS). For comparison, vertical profiles from all instruments are interpolated onto a common pressure grid, and smoothing is applied to ACE-FTS, MIPAS, and NDACC vertical profiles. Smoothing is needed to account for differences between the vertical resolution of each instrument and differences in the dependence on a priori profiles. The smoothing operators use the TANSO-FTS a priori and averaging kernels in all cases. 
We present zonally averaged mean CH4 differences between each instrument and TANSO-FTS with and without smoothing, and we examine their information content, their sensitive altitude range, their correlation, their a priori dependence, and the variability within each data set. Partial columns are calculated from the VMR vertical profiles, and their correlations are examined. We find that the TANSO-FTS vertical profiles agree with the ACE-FTS and both MIPAS retrievals' vertical profiles within 4 % (± ~40 ppbv) below 15 km when smoothing is applied to the profiles from instruments with finer vertical resolution, but that the relative differences can increase to the order of 25 % when no smoothing is applied. Computed partial columns are tightly correlated for each pair of data sets. We investigate whether the difference between TANSO-FTS and other CH4 VMR data products varies with latitude. Our study reveals a small dependence of around 0.1 % per 10 degrees of latitude, with smaller differences over the tropics and greater differences towards the poles.

  12. WRF-Cordex simulations for Europe: mean and extreme precipitation for present and future climates

    NASA Astrophysics Data System (ADS)

    Cardoso, Rita M.; Soares, Pedro M. M.; Miranda, Pedro M. A.

    2013-04-01

    The Weather Research and Forecast (WRF-ARW) model, version 3.3.1, was used to perform the European-domain Cordex simulations at 50 km resolution. A first simulation, forced by ERA-Interim (1989-2009), was carried out to evaluate the model's performance in representing mean and extreme precipitation in the present European climate. This evaluation is based on the comparison of WRF results against the ECAD regular gridded dataset of daily precipitation. Results are comparable to recent studies with other models for the European region at this resolution. For the same domain, control and future scenario (RCP8.5) simulations were performed to assess the climate change impact on mean and extreme precipitation. These regional simulations were forced by EC-EARTH model results and encompass the periods 1960-2006 and 2006-2100, respectively.

  13. MODTRAN: a moderate resolution model for LOWTRAN. Technical report, 12 May 1986-11 May 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berk, A.; Bernstein, L.S.; Robertson, D.C.

    1987-07-08

    This interim technical report describes a new band-model formulation for the LOWTRAN 6 atmospheric transmittance/radiation computer code. Band-model parameters for H2O, CO2, O3, CO, CH4, O2, and N2 were calculated using the 1986 HITRAN line atlas. They were calculated for 1 cm-1 bins from 0 to 17,900 cm-1 and at five temperatures from 200 to 300 K. This transmittance model and associated subroutines were integrated into LOWTRAN 6. The spectral resolution of this new option is better than 5 cm-1 (FWHM). A preliminary version of the code was delivered to AFGL for testing. Validation against FASCOD2 calculations will be the emphasis for the remainder of this effort.
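The quoted 5 cm-1 (FWHM) resolution can be pictured as convolving a finely gridded spectrum with a Gaussian slit function. The sketch below shows only that degradation step on a made-up spectrum; it is not MODTRAN's band-model formulation.

```python
import numpy as np

def gaussian_smooth(spectrum, step, fwhm):
    """Degrade a finely gridded spectrum to a coarser resolution by
    convolving with a normalized Gaussian slit function of given FWHM."""
    sigma = fwhm / (2 * np.sqrt(2 * np.log(2)))
    half = int(4 * sigma / step)
    x = np.arange(-half, half + 1) * step
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()                     # unit-area slit function
    return np.convolve(spectrum, kernel, mode="same")

# Made-up transmittance on a 1 cm^-1 grid: fully transparent except for
# one saturated absorption line.
t = np.ones(200)
t[100] = 0.0
smoothed = gaussian_smooth(t, step=1.0, fwhm=5.0)
print(f"line-core transmittance after smoothing: {smoothed[100]:.2f}")
```

After smoothing, the saturated line is shallower and spread over several bins, which is the qualitative effect of moving from monochromatic line-by-line results to a moderate-resolution band model.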

  14. Reprocessing 30 years of ISCCP: Addressing satellite intercalibration for deriving a long-term cloud climatology

    NASA Astrophysics Data System (ADS)

    Young, A. H.; Knapp, K. R.; Inamdar, A.; Hankins, W. B.; Rossow, W. B.

    2017-12-01

    The International Satellite Cloud Climatology Project (ISCCP) has made significant changes in preparation for a reprocessing at NOAA's NCEI. This presentation will highlight these changes and the resulting new cloud products along with the challenges faced to address satellite intercalibration issues. The intercalibration challenges are largely due to the product's reliance on satellite observations from both polar orbiting (LEO) and geostationary (GEO) satellites. The presentation will also focus on the new products (ISCCP-H) which are reprocessed at a higher spatial resolution than previous versions (ISCCP-D) due to the use of higher resolution input data (e.g., 10 km geostationary and 4 km AVHRR data). Improvements, caveats, and a comparison against the predecessor D-Series product will also be presented. ISCCP-H data is now available at: https://www.ncdc.noaa.gov/isccp

  15. mrtailor: a tool for PDB-file preparation for the generation of external restraints.

    PubMed

    Gruene, Tim

    2013-09-01

    Model building starting from, for example, a molecular-replacement solution with low sequence similarity introduces model bias, which can be difficult to detect, especially at low resolution. The program mrtailor removes low-similarity regions from a template PDB file according to sequence similarity between the target sequence and the template sequence and maps the target sequence onto the PDB file. The modified PDB file can be used to generate external restraints for low-resolution refinement with reduced model bias and can be used as a starting point for model building and refinement. The program can call ProSMART [Nicholls et al. (2012), Acta Cryst. D68, 404-417] directly in order to create external restraints suitable for REFMAC5 [Murshudov et al. (2011), Acta Cryst. D67, 355-367]. Both a command-line version and a GUI exist.

  16. Streamflow simulation for continental-scale river basins

    NASA Astrophysics Data System (ADS)

    Nijssen, Bart; Lettenmaier, Dennis P.; Liang, Xu; Wetzel, Suzanne W.; Wood, Eric F.

    1997-04-01

    A grid network version of the two-layer variable infiltration capacity (VIC-2L) macroscale hydrologic model is described. VIC-2L is a hydrologically based soil-vegetation-atmosphere transfer scheme designed to represent the land surface in numerical weather prediction and climate models. The grid network scheme allows streamflow to be predicted for large continental rivers. Off-line (observed and estimated surface meteorological and radiative forcings) applications of the model to the Columbia River (1° latitude-longitude spatial resolution) and Delaware River (0.5° resolution) are described. The model performed quite well in both applications, reproducing the seasonal hydrograph and annual flow volumes to within a few percent. Difficulties in reproducing observed streamflow in the arid portion of the Snake River basin are attributed to groundwater-surface water interactions, which are not modeled by VIC-2L.

  17. Estimating top-of-atmosphere thermal infrared radiance using MERRA-2 atmospheric data

    NASA Astrophysics Data System (ADS)

    Kleynhans, Tania; Montanaro, Matthew; Gerace, Aaron; Kanan, Christopher

    2017-05-01

    Thermal infrared satellite images have been widely used in environmental studies. However, satellites have limited temporal resolution, e.g., 16 days for Landsat or 1 to 2 days for Terra MODIS. This paper investigates the use of the Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2) reanalysis data product, produced by NASA's Global Modeling and Assimilation Office (GMAO), to predict global top-of-atmosphere (TOA) thermal infrared radiance. The high temporal resolution of the MERRA-2 data product presents opportunities for novel research and applications. Various methods were applied to estimate TOA radiance from MERRA-2 variables, namely (1) a parameterized physics-based method, (2) linear regression models and (3) non-linear support vector regression. Model prediction accuracy was evaluated using temporally and spatially coincident Moderate Resolution Imaging Spectroradiometer (MODIS) thermal infrared data as reference data. This research found that support vector regression with a radial basis function kernel produced the lowest error rates. Sources of error are discussed and defined. Further research is currently being conducted to train deep learning models to predict TOA thermal radiance.
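To keep the example dependency-free, the sketch below uses RBF kernel ridge regression, a close relative of the RBF support vector regression named above, on synthetic data; the inputs are hypothetical stand-ins for MERRA-2 variables, not the paper's actual predictors.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    """Gaussian (RBF) kernel matrix between two sets of points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, gamma, lam):
    """Kernel ridge regression: solve (K + lam*I) alpha = y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def krr_predict(alpha, X_train, X_new, gamma):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

rng = np.random.default_rng(2)
# Hypothetical MERRA-2-style predictors mapped to a smooth synthetic
# "TOA radiance" so the nonlinear fit has something to recover.
X = rng.random((200, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2

alpha = krr_fit(X[:150], y[:150], gamma=5.0, lam=1e-3)
pred = krr_predict(alpha, X[:150], X[150:], gamma=5.0)
rmse = np.sqrt(np.mean((pred - y[150:]) ** 2))
print(f"hold-out RMSE: {rmse:.3f}")
```

Support vector regression differs in its epsilon-insensitive loss and sparse set of support vectors, but both methods expand the prediction in the same RBF kernel, which is what lets them capture the nonlinear radiance dependence that a linear regression misses.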

  18. Upgrade of the compact neutron spectrometer for high flux environments

    NASA Astrophysics Data System (ADS)

    Osipenko, M.; Bellucci, A.; Ceriale, V.; Corsini, D.; Gariano, G.; Gatti, F.; Girolami, M.; Minutoli, S.; Panza, F.; Pillon, M.; Ripani, M.; Trucchi, D. M.

    2018-03-01

    In this paper a new version of the 6Li-based neutron spectrometer for high-flux environments is described. The new spectrometer was built with commercial single-crystal chemical vapour deposition diamonds of electronic grade. These crystals feature better charge collection as well as higher radiation hardness. New metal contacts approaching ohmic conditions were deposited on the diamonds, suppressing the build-up of space charge observed in the previous prototypes. New passive preamplification of the signal on the detector side was implemented to improve its resolution. This preamplification is based on an RF transformer, which is not sensitive to high neutron flux. The compact mechanical design reduces the detector to a tube 1 cm in diameter and 13 cm long. The spectrometer was tested in the thermal column of a TRIGA reactor and at a DD neutron generator. The test results indicate an energy resolution of 300 keV (FWHM), reduced to 72 keV (RMS) when excluding energy loss, and a coincidence timing resolution of 160 ps (FWHM). The measured data are in agreement with Geant4 simulations except for a larger energy-loss tail, presumably related to imperfections of the metal contacts and glue expansion.

  19. Towards a new multiscale air quality transport model using the fully unstructured anisotropic adaptive mesh technology of Fluidity (version 4.1.9)

    NASA Astrophysics Data System (ADS)

    Zheng, J.; Zhu, J.; Wang, Z.; Fang, F.; Pain, C. C.; Xiang, J.

    2015-10-01

    An integrated method of advanced anisotropic hr-adaptive mesh and discretization numerical techniques has been applied, for the first time, to the modelling of multiscale advection-diffusion problems, based on a discontinuous Galerkin/control volume discretization on unstructured meshes. Compared with existing air quality models, which are typically based on static structured grids with a local nesting technique, the anisotropic hr-adaptive model has the advantage of adapting the mesh according to the evolving pollutant distribution and flow features. That is, the mesh resolution can be adjusted dynamically to simulate the pollutant transport process accurately and effectively. To illustrate the capability of the anisotropic adaptive unstructured mesh model, three benchmark numerical experiments have been set up for two-dimensional (2-D) advection phenomena. Comparisons have been made between the results obtained using uniform-resolution meshes and anisotropic adaptive-resolution meshes. Performance achieved in 3-D simulation of power plant plumes indicates that this new adaptive multiscale model has the potential to provide accurate air quality modelling solutions effectively.

  20. The ChIP-exo Method: Identifying Protein-DNA Interactions with Near Base Pair Precision.

    PubMed

    Perreault, Andrea A; Venters, Bryan J

    2016-12-23

    Chromatin immunoprecipitation (ChIP) is an indispensable tool in the fields of epigenetics and gene regulation that isolates specific protein-DNA interactions. ChIP coupled to high throughput sequencing (ChIP-seq) is commonly used to determine the genomic location of proteins that interact with chromatin. However, ChIP-seq is hampered by relatively low mapping resolution of several hundred base pairs and high background signal. The ChIP-exo method is a refined version of ChIP-seq that substantially improves upon both resolution and noise. The key distinction of the ChIP-exo methodology is the incorporation of lambda exonuclease digestion in the library preparation workflow to effectively footprint the left and right 5' DNA borders of the protein-DNA crosslink site. The ChIP-exo libraries are then subjected to high throughput sequencing. The resulting data can be leveraged to provide unique and ultra-high resolution insights into the functional organization of the genome. Here, we describe the ChIP-exo method that we have optimized and streamlined for mammalian systems and next-generation sequencing-by-synthesis platform.

  1. Radiometric resolution enhancement by lossy compression as compared to truncation followed by lossless compression

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Manohar, Mareboyana

    1994-01-01

    Recent advances in imaging technology make it possible to obtain imagery data of the Earth at high spatial, spectral and radiometric resolutions from Earth-orbiting satellites. The rate at which the data are collected from these satellites can far exceed the channel capacity of the data downlink. Reducing the data rate to within the channel capacity can often require painful trade-offs in which certain scientific returns are sacrificed for the sake of others. In this paper we model the sacrifice of radiometric resolution by dropping a specified number of least significant bits from each data pixel and compressing the remaining bits using an appropriate lossless compression technique. We call this approach 'truncation followed by lossless compression' or TLLC. We compare the TLLC approach with applying a lossy compression technique to the data for reducing the data rate to the channel capacity, and demonstrate that each of three different lossy compression techniques (JPEG/DCT, VQ and Model-Based VQ) gives a better effective radiometric resolution than TLLC for a given channel rate.
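The TLLC baseline is easy to reproduce in miniature: drop the least significant bits of each pixel, then apply a generic lossless coder. The image below is synthetic, and zlib stands in for whatever lossless technique the study actually used.

```python
import zlib
import numpy as np

def tllc_size(data, dropped_bits):
    """Truncation followed by lossless compression (TLLC): discard the
    least significant bits of each pixel, then zlib-compress the rest."""
    truncated = (data >> dropped_bits).astype(np.uint8)
    return len(zlib.compress(truncated.tobytes(), 9))

rng = np.random.default_rng(3)
# Synthetic 8-bit "image": a smooth ramp whose two low bits are noise.
base = np.tile(np.arange(256, dtype=np.uint8), (64, 1))
noise = rng.integers(0, 4, size=base.shape, dtype=np.uint8)
img = (base & 0b11111100) | noise

full = tllc_size(img, 0)   # lossless compression of all 8 bits
tllc = tllc_size(img, 2)   # drop the 2 noisy LSBs first
print(tllc < full)         # the truncated image compresses far better
```

This captures why TLLC reduces the rate: the low-order bits carry most of the incompressible noise. The paper's point is that a well-designed lossy coder spends the same bit budget more effectively than this uniform bit truncation.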

  2. A daily, 1 km resolution data set of downscaled Greenland ice sheet surface mass balance (1958-2015)

    NASA Astrophysics Data System (ADS)

    Noël, Brice; van de Berg, Willem Jan; Machguth, Horst; Lhermitte, Stef; Howat, Ian; Fettweis, Xavier; van den Broeke, Michiel R.

    2016-10-01

    This study presents a data set of daily, 1 km resolution Greenland ice sheet (GrIS) surface mass balance (SMB) covering the period 1958-2015. Applying corrections for elevation, bare ice albedo and accumulation bias, the high-resolution product is statistically downscaled from the native daily output of the polar regional climate model RACMO2.3 at 11 km. The data set includes all individual SMB components projected to a down-sampled version of the Greenland Ice Mapping Project (GIMP) digital elevation model and ice mask. The 1 km mask better resolves narrow ablation zones, valley glaciers, fjords and disconnected ice caps. Relative to the 11 km product, the more detailed representation of isolated glaciated areas leads to increased precipitation over the southeastern GrIS. In addition, the downscaled product shows a significant increase in runoff owing to better resolved low-lying marginal glaciated regions. The combined corrections for elevation and bare ice albedo markedly improve model agreement with a newly compiled data set of ablation measurements.
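    The elevation part of such a statistical downscaling can be sketched as follows. The linear SMB-elevation gradient and the toy grids below are hypothetical stand-ins; the actual RACMO2.3 downscaling derives local gradients from the model output itself:

    ```python
    import numpy as np

    # Hypothetical SMB-elevation gradient (mm w.e. per m of elevation).
    GRADIENT = -2.5

    def downscale_smb(smb_coarse, elev_coarse, elev_fine, factor):
        """Downscale coarse SMB onto a finer DEM in two steps:
        1) replicate each coarse cell onto the fine grid;
        2) correct for the difference between the fine DEM and the
           coarse model topography using a linear elevation gradient."""
        smb_fine = np.kron(smb_coarse, np.ones((factor, factor)))
        elev_model = np.kron(elev_coarse, np.ones((factor, factor)))
        return smb_fine + GRADIENT * (elev_fine - elev_model)

    smb_c = np.array([[100.0, -200.0], [50.0, -400.0]])    # coarse SMB (mm w.e.)
    elev_c = np.array([[1500.0, 600.0], [1200.0, 300.0]])  # coarse topography (m)
    rng = np.random.default_rng(1)
    elev_f = np.kron(elev_c, np.ones((4, 4))) + rng.normal(0, 50, (8, 8))  # fine DEM
    smb_f = downscale_smb(smb_c, elev_c, elev_f, factor=4)
    ```

    Low-lying fine cells (negative elevation anomaly) receive a less negative SMB correction, which is how marginal ablation zones gain resolved runoff.
    
    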

  3. The Open-source Data Inventory for Anthropogenic CO2, version 2016 (ODIAC2016): a global monthly fossil fuel CO2 gridded emissions data product for tracer transport simulations and surface flux inversions

    NASA Astrophysics Data System (ADS)

    Oda, Tomohiro; Maksyutov, Shamil; Andres, Robert J.

    2018-01-01

    The Open-source Data Inventory for Anthropogenic CO2 (ODIAC) is a global high-spatial-resolution gridded emissions data product that distributes carbon dioxide (CO2) emissions from fossil fuel combustion. The emissions spatial distributions are estimated at a 1 × 1 km spatial resolution over land using power plant profiles (emissions intensity and geographical location) and satellite-observed nighttime lights. This paper describes the year 2016 version of the ODIAC emissions data product (ODIAC2016) and presents analyses that help guide data users, especially for atmospheric CO2 tracer transport simulations and flux inversion analysis. Since the original publication in 2011, we have made modifications to our emissions modeling framework in order to deliver a comprehensive global gridded emissions data product. Major changes from the 2011 publication are (1) the use of emissions estimates made by the Carbon Dioxide Information Analysis Center (CDIAC) at the Oak Ridge National Laboratory (ORNL) by fuel type (solid, liquid, gas, cement manufacturing, gas flaring, and international aviation and marine bunkers); (2) the use of multiple spatial emissions proxies by fuel type, such as (a) nighttime light data specific to gas flaring and (b) ship/aircraft fleet tracks; and (3) the inclusion of emissions temporal variations. Using global fuel consumption data, we extrapolated the CDIAC emissions estimates for the recent years and produced the ODIAC2016 emissions data product covering 2000-2015. Our emissions data can be viewed as an extended version of the CDIAC gridded emissions data product, which should allow data users to impose global fossil fuel emissions in a more comprehensive manner than the original CDIAC product. Our new emissions modeling framework allows us to produce future versions of the ODIAC emissions data product on a timely schedule. Such capability has become more significant given the shutdown of CDIAC at ORNL. The ODIAC data product could play an important role in supporting carbon cycle science, especially modeling studies with space-based CO2 data collected in near real time by ongoing carbon observing missions such as the Japanese Greenhouse gases Observing SATellite (GOSAT), NASA's Orbiting Carbon Observatory-2 (OCO-2), and upcoming missions. The ODIAC emissions data product, including the latest version of the ODIAC emissions data (ODIAC2017, 2000-2016), is distributed from http://db.cger.nies.go.jp/dataset/ODIAC/ with a DOI (https://doi.org/10.17595/20170411.001).
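    The core of a proxy-based disaggregation like ODIAC's is mass-conserving allocation: a national (or fuel-type) total is spread over grid cells in proportion to the proxy. The sketch below uses toy numbers, not ODIAC's actual inputs:

    ```python
    import numpy as np

    def disaggregate(national_total, nightlights, mask):
        """Distribute an emissions total over grid cells in proportion to
        a spatial proxy (here nighttime-light radiance), restricted to
        cells inside the country mask. The cell values sum back to the
        national total, so mass is conserved."""
        weights = np.where(mask, nightlights, 0.0)
        weights = weights / weights.sum()
        return national_total * weights

    lights = np.array([[0.0, 5.0], [10.0, 85.0]])  # proxy radiance (arbitrary units)
    mask = np.array([[True, True], [True, True]])  # cells belonging to the country
    grid = disaggregate(1000.0, lights, mask)      # e.g. one country/fuel total
    ```

    Point sources (power plants) would be placed directly at their known coordinates first, with only the remaining non-point total allocated by the proxy.
    
    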

  4. Chandra Discovers Elusive "Hot Bubble" in Planetary Nebula

    NASA Astrophysics Data System (ADS)

    2000-06-01

    NASA's Chandra X-ray Observatory has imaged for the first time a "hot bubble" of gas surrounding a dying, Sun-like star. This large region of very hot gas in the planetary nebula BD+30 3639 has a peculiar shape and contains elements produced in the core of the dying star. "The new Chandra image offers conclusive proof for the existence of the 'hot bubble' that theorists have long predicted," said Professor Joel Kastner, of the Chester F. Carlson Center for Imaging Science at the Rochester Institute of Technology. Kastner leads a team of scientists who reported on this observation at the 196th national meeting of the American Astronomical Society in Rochester, New York. The Chandra image shows a region of 3-million-degree Celsius gas that appears to fit inside the shell of ionized gas seen by the Hubble Space Telescope. The optical and X-ray emitting regions of BD+30 3639, which lies between 5000 and 8000 light years away, are roughly one million times the volume of our solar system. A planetary nebula (so called because it looks like a planet when viewed with a small telescope) is formed when a dying red giant star puffs off its outer layer, leaving behind a hot core that will eventually collapse to form a dense star called a white dwarf. According to theory, a "hot bubble" is formed when a new, two-million-mile-per-hour wind emanating from the hot core rams into the ejected atmosphere, producing energetic shocks and heating the interaction region to temperatures of millions of degrees. Previous X-ray observations hinted that X rays might be coming from a region larger than the central star, but it remained for Chandra to provide definite proof. The shape of the X-ray emission was a surprise to the researchers. "This suggests that the red giant atmosphere was not ejected symmetrically," said Kastner. "It might be pointing to an unseen companion star." The spectrum shows a large abundance of neon in the X-ray-emitting gas. This indicates that the gas contained in the hot bubble was dredged up from the deepest layers of the central star, where nuclear fusion altered its chemical composition prior to its being ejected. Thus the Chandra data may offer new insight into the process whereby dying stars enrich the Milky Way in fusion products. The observation was made in March 2000 using the Advanced CCD Imaging Spectrometer (ACIS). Kastner's collaborators on the project are Prof. Noam Soker of the University of Haifa, Israel; Prof. Saul Rappaport of MIT; Dr. Ruth Knill-Dgani of the University of Texas, Austin; and Dr. Saeqa Vrtilek of the Harvard-Smithsonian Center for Astrophysics. The ACIS instrument was built for NASA by the Massachusetts Institute of Technology, Cambridge, and Pennsylvania State University, University Park. NASA's Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program. TRW, Inc., Redondo Beach, Calif., is the prime contractor for the spacecraft. The Smithsonian's Chandra X-ray Center controls science and flight operations from Cambridge, Mass. High resolution digital versions of the X-ray image (JPG, 300 dpi TIFF) and other information associated with this release are available on the Internet at: http://chandra.harvard.edu AND http://chandra.nasa.gov

  5. Spectacular X-ray Jet Points Toward Cosmic Energy Booster

    NASA Astrophysics Data System (ADS)

    2000-06-01

    NASA's Chandra X-ray Observatory has revealed a spectacular luminous spike of X rays that emanates from the vicinity of a giant black hole in the center of the radio galaxy Pictor A. The spike, or jet, is due to a beam of particles that streaks across hundreds of thousands of light years of intergalactic space toward a brilliant X-ray hot spot that marks its end point. The hot spot is at least 800 thousand light years (8 times the diameter of our Milky Way galaxy) away from where the jet originates. It is thought to represent the advancing head of the jet, which brightens conspicuously where it plows into the tenuous gas of intergalactic space. The jet, powered by the giant black hole, originates from a region of space no bigger than the solar system. "Both the brightness and the spectrum of the X rays are very different from what theory predicts," Professor Andrew Wilson reported today at the 196th national meeting of the American Astronomical Society in Rochester, New York. Wilson, of the University of Maryland, College Park, along with Dr. Patrick Shopbell and Dr. Andrew Young, also of the University of Maryland, is submitting an article on this research to the Astrophysical Journal. "The Chandra observations are telling us that something out there is producing many more high-energy particles than we expected," said Wilson. One possible explanation for the X rays is that shock waves along the side and head of the X-ray jet are accelerating electrons and possibly protons to speeds close to that of light. In the process the electrons are boosted to energies as high as 100 million times their own rest mass energy. These electrons lose their energy rapidly as they produce X rays, so this could be the first direct evidence of this process so far outside a galaxy. The hot spot has been seen with optical and radio telescopes. Radio telescopes have also observed a faint jet.
Jets are thought to be produced by the extreme electromagnetic forces created by magnetized gas swirling toward a black hole. Although most of the material falls into the black hole, some can be ejected at extremely high speeds. Magnetic fields spun out by these forces can extend over vast distances and may help explain the narrowness of the jet. The Chandra observation of Pictor A was made on January 18, 2000 for eight hours using the Advanced CCD Imaging Spectrometer (ACIS). The ACIS instrument was built for NASA by the Massachusetts Institute of Technology, Cambridge, and Pennsylvania State University, University Park. NASA's Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program. TRW, Inc., Redondo Beach, Calif., is the prime contractor for the spacecraft. The Smithsonian's Chandra X-ray Center controls science and flight operations from Cambridge, Mass. Images associated with this release are available on the World Wide Web at: http://chandra.harvard.edu AND http://chandra.nasa.gov High resolution digital versions of the X-ray image (JPG, 300 dpi TIFF) are available at the Internet sites listed above. This image will be available on NASA Video File which airs at noon, 3:00 p.m., 6:00 p.m., 9:00 p.m. and midnight Eastern Time. NASA Television is available on GE-2, transponder 9C at 85 degrees West longitude, with vertical polarization. Frequency is on 3880.0 megahertz, with audio on 6.8 megahertz.

  6. HUBBLE'S INFRARED GALAXY GALLERY

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Astronomers have used the NASA Hubble Space Telescope to produce an infrared 'photo essay' of spiral galaxies. By penetrating the dust clouds swirling around the centers of these galaxies, the telescope's infrared vision is offering fresh views of star birth. These six images, taken with the Near Infrared Camera and Multi-Object Spectrometer, showcase different views of spiral galaxies, from a face-on image of an entire galaxy to a close-up of a core. The top row shows spirals at diverse angles, from face-on, (left); to slightly tilted, (center); to edge-on, (right). The bottom row shows close-ups of the hubs of three galaxies. In these images, red corresponds to glowing hydrogen, the raw material for star birth. The red knots outlining the curving spiral arms in NGC 5653 and NGC 3593, for example, pinpoint rich star-forming regions where the surrounding hydrogen gas is heated by intense ultraviolet radiation from young, massive stars. In visible light, many of these regions can be hidden from view by the clouds of gas and dust in which they were born. The glowing hydrogen found inside the cores of these galaxies, as in NGC 6946, may be due to star birth; radiation from active galactic nuclei (AGN), which are powered by massive black holes; or a combination of both. White is light from middle-age stars. Clusters of stars appear as white dots, as in NGC 2903. The galaxy cores are mostly white because of their dense concentration of stars. The dark material seen in these images is dust. These galaxies are part of a Hubble census of about 100 spiral galaxies. Astronomers at Space Telescope Science Institute took these images to fill gaps in the scheduling of a campaign using the NICMOS-3 camera. The data were non-proprietary, and were made available to the entire astronomical community. Filters: Three filters were used: red, blue, and green. Red represents emission at the Paschen Alpha line (light from glowing hydrogen) at a wavelength of 1.87 microns. 
Blue shows the galaxies in near-infrared light, measured between 1.4 and 1.8 microns (H-band emission). Green is a mixture of the two. Distance of galaxies from Earth: NGC 5653 - 161 million light-years; NGC 3593 - 28 million light-years; NGC 891 - 24 million light-years; NGC 4826 - 19 million light-years; NGC 2903 - 25 million light-years; and NGC 6946 - 20 million light-years. Credits: Torsten Boeker, Space Telescope Science Institute, and NASA. NOTE TO EDITORS: Image files and photo caption are available on the Internet at: http://oposite.stsci.edu/pubinfo/pr/1999/10 or via links in http://oposite.stsci.edu/pubinfo/latest.html and http://oposite.stsci.edu/pubinfo/pictures.html Higher-resolution digital versions (300 dpi JPEG and TIFF) of the release photo are available at: http://oposite.stsci.edu/pubinfo/pr/1999/10/extra-photos.html STScI press releases and other information are available automatically by sending an Internet electronic mail message to pio-request@stsci.edu. In the body of the message (not the subject line) users should type the word 'subscribe' (don't use quotes). The system will respond with a confirmation of the subscription, and users will receive new press releases as they are issued. To unsubscribe, send mail to pio-request@stsci.edu. Leave the subject line blank, and type 'unsubscribe' (don't use quotes) in the body of the message.

  7. [Development of an original computer program FISHMet: use for molecular cytogenetic diagnosis and genome mapping by fluorescent in situ hybridization (FISH)].

    PubMed

    Iurov, Iu B; Khazatskiĭ, I A; Akindinov, V A; Dovgilov, L V; Kobrinskiĭ, B A; Vorsanova, S G

    2000-08-01

    The original software FISHMet has been developed and tested to improve the efficiency of diagnosis of hereditary diseases caused by chromosome aberrations and of chromosome mapping by the fluorescent in situ hybridization (FISH) method. The program allows creation and analysis of pseudocolor chromosome images and hybridization signals under Windows 95, and supports computer analysis and editing of the results of pseudocolor in situ hybridization, including successive superposition of the initial black-and-white images captured through fluorescent filters (blue, green, and red), and editing of each image individually or of a summary pseudocolor image in BMP, TIFF, and JPEG formats. Components of the computer image analysis system (LOMO, Leitz Ortoplan, and Axioplan fluorescent microscopes; COHU 4910 and Sanyo VCB-3512P CCD cameras; Miro-Video, Scion LG-3, and VG-5 image capture boards; and Pentium 100 and Pentium 200 computers) and specialized software for image capture and visualization (Scion Image PC and Video-Cup) were used with good results in the study.

  8. Lava and Snow on Klyuchevskaya Volcano [high res]

    NASA Image and Video Library

    2013-09-20

    This false-color (shortwave infrared, near infrared, green) satellite image reveals an active lava flow on the western slopes of Klyuchevskaya Volcano. Klyuchevskaya is one of several active volcanoes on the Kamchatka Peninsula in far eastern Russia. The lava flow itself is bright red. Snow on Klyuchevskaya and nearby mountains is cyan, while bare ground and volcanic debris are gray or brown. Vegetation is green. The image was collected by the Operational Land Imager (OLI) on Landsat 8 on September 9, 2013. NASA Earth Observatory image by Jesse Allen and Robert Simmon. More info: 1.usa.gov/1evspH7
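    A false-color composite like this one is just a band-to-display-channel mapping with a contrast stretch: SWIR to red (so hot lava glows red), NIR to green, and green to blue (so snow, bright in green but dark in SWIR/NIR, renders cyan). A minimal sketch with synthetic bands; the 2-98 percentile stretch is an assumption, not Earth Observatory's exact processing:

    ```python
    import numpy as np

    def false_color(swir, nir, green):
        """Stack SWIR/NIR/green bands into an RGB composite after
        stretching each band to the [0, 1] display range."""
        def stretch(band):
            lo, hi = np.percentile(band, (2, 98))
            return np.clip((band - lo) / (hi - lo), 0.0, 1.0)
        return np.dstack([stretch(swir), stretch(nir), stretch(green)])

    rng = np.random.default_rng(2)
    swir, nir, green = (rng.random((32, 32)) for _ in range(3))  # synthetic bands
    rgb = false_color(swir, nir, green)  # shape (32, 32, 3), values in [0, 1]
    ```
    
    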

  9. Global Long-Term SeaWiFS Deep Blue Aerosol Products available at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Sayer, A. M.; Bettenhausen, Corey; Wei, Jennifer C.; Ostrenga, Dana M.; Vollmer, Bruce E.; Hsu, Nai-Yung; Kempler, Steven J.

    2012-01-01

    Long-term climate data records of aerosols are needed to improve understanding of air quality and radiative forcing, among many other applications. The Sea-viewing Wide Field-of-view Sensor (SeaWiFS) provides a well-calibrated global 13-year (1997-2010) record of top-of-atmosphere radiance, suitable for use in retrieval of atmospheric aerosol optical depth (AOD). Recently, global aerosol products derived from SeaWiFS with the Deep Blue algorithm (SWDB) have become available for the entire mission, as part of the NASA Making Earth System Data Records for Use in Research Environments (MEaSUREs) program. The latest Deep Blue algorithm retrieves aerosol properties not only over bright desert surfaces, but also over vegetated surfaces, oceans, and inland water bodies. Comparisons with AERONET observations have shown that the data are suitable for quantitative scientific use [1],[2]. The resolution of Level 2 pixels is 13.5 × 13.5 km² at the center of the swath. Level 3 daily and monthly data are composed from the best-quality Level 2 pixels at resolutions of both 0.5° × 0.5° and 1.0° × 1.0°. Focusing on the southwest Asia region, this presentation shows seasonal variations of AOD and results of comparing 5 years (2003-2007) of AOD from SWDB (Version 3) and MODIS Aqua (Version 5.1) for the Dark Target (MYD-DT) and Deep Blue (MYD-DB) algorithms.
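    Level 3 composition of the kind described, averaging only best-quality Level 2 retrievals into fixed latitude-longitude bins, can be sketched as follows (toy pixels; the quality flag and threshold are hypothetical, not the SWDB product's actual flags):

    ```python
    import numpy as np

    def grid_level3(lat, lon, aod, quality, res=0.5, min_quality=2):
        """Average best-quality Level 2 pixels into a global
        res-degree latitude/longitude grid; empty cells become NaN."""
        keep = quality >= min_quality
        iy = ((lat[keep] + 90.0) // res).astype(int)
        ix = ((lon[keep] + 180.0) // res).astype(int)
        ny, nx = int(180 / res), int(360 / res)
        total = np.zeros((ny, nx))
        count = np.zeros((ny, nx))
        np.add.at(total, (iy, ix), aod[keep])   # unbuffered accumulation per cell
        np.add.at(count, (iy, ix), 1)
        with np.errstate(invalid="ignore", divide="ignore"):
            return np.where(count > 0, total / count, np.nan)

    lat = np.array([10.1, 10.2, 10.3, -5.0])
    lon = np.array([60.1, 60.2, 60.3, 30.0])
    aod = np.array([0.2, 0.4, 0.6, 0.3])
    qual = np.array([3, 3, 1, 3])   # third pixel fails the quality screen
    l3 = grid_level3(lat, lon, aod, qual)
    ```
    
    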

  10. The National Solar Radiation Data Base (NSRDB)

    DOE PAGES

    Sengupta, Manajit; Xie, Yu; Lopez, Anthony; ...

    2018-03-19

    The National Solar Radiation Data Base (NSRDB), consisting of solar radiation and meteorological data over the United States and regions of the surrounding countries, is a publicly open dataset that has been created and disseminated during the last 23 years. This paper briefly reviews the complete package of surface observations, models, and satellite data used for the latest version of the NSRDB as well as improvements in the measurement and modeling technologies deployed in the NSRDB over the years. The current NSRDB provides solar irradiance at a 4-km horizontal resolution for each 30-min interval from 1998 to 2016 computed by the National Renewable Energy Laboratory's (NREL's) Physical Solar Model (PSM) and products from the National Oceanic and Atmospheric Administration's (NOAA's) Geostationary Operational Environmental Satellite (GOES), the National Ice Center's (NIC's) Interactive Multisensor Snow and Ice Mapping System (IMS), and the National Aeronautics and Space Administration's (NASA's) Moderate Resolution Imaging Spectroradiometer (MODIS) and Modern Era Retrospective analysis for Research and Applications, version 2 (MERRA-2). The NSRDB irradiance data have been validated and shown to agree with surface observations with mean percentage biases within 5% and 10% for global horizontal irradiance (GHI) and direct normal irradiance (DNI), respectively. The data can be freely accessed via https://nsrdb.nrel.gov or through an application programming interface (API). During the last 23 years, the NSRDB has been widely used by an ever-growing group of researchers and industry both directly and through tools such as NREL's System Advisor Model.

  11. MethBank 3.0: a database of DNA methylomes across a variety of species.

    PubMed

    Li, Rujiao; Liang, Fang; Li, Mengwei; Zou, Dong; Sun, Shixiang; Zhao, Yongbing; Zhao, Wenming; Bao, Yiming; Xiao, Jingfa; Zhang, Zhang

    2018-01-04

    MethBank (http://bigd.big.ac.cn/methbank) is a database that integrates high-quality DNA methylomes across a variety of species and provides an interactive browser for visualization of methylation data. Here, we present an updated implementation of MethBank (version 3.0) that incorporates more DNA methylomes from multiple species and is equipped with enhanced functionality for data annotation and friendlier web interfaces for data presentation, search, and visualization. MethBank 3.0 features large-scale integration of high-quality methylomes, involving 34 consensus reference methylomes derived from a large number of human samples, 336 single-base-resolution methylomes from different developmental stages and/or tissues of five plants, and 18 single-base-resolution methylomes from gametes and early embryos at multiple stages of two animals. Additionally, it is enhanced by improved functionality for data annotation, which enables systematic identification of methylation sites closely associated with age, sites with constant methylation levels across different ages, differentially methylated promoters, age-specific differentially methylated cytosines/regions, and methylated CpG islands. Moreover, MethBank provides online tools to estimate human methylation age and to identify differentially methylated promoters. Taken together, MethBank is upgraded with significant improvements and advances over the previous version, which is of great help for deciphering DNA methylation regulatory mechanisms in epigenetic studies. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  12. Reproducibility of myelin content-based human habenula segmentation at 3 Tesla.

    PubMed

    Kim, Joo-Won; Naidich, Thomas P; Joseph, Joshmi; Nair, Divya; Glasser, Matthew F; O'Halloran, Rafael; Doucet, Gaelle E; Lee, Won Hee; Krinsky, Hannah; Paulino, Alejandro; Glahn, David C; Anticevic, Alan; Frangou, Sophia; Xu, Junqian

    2018-03-26

    In vivo morphological study of the human habenula, a pair of small epithalamic nuclei adjacent to the dorsomedial thalamus, has recently gained significant interest for its role in reward and aversion processing. However, segmenting the habenula from in vivo magnetic resonance imaging (MRI) is challenging due to the habenula's small size and low anatomical contrast. Although manual and semi-automated habenula segmentation methods have been reported, the test-retest reproducibility of the segmented habenula volume and the consistency of the boundaries of habenula segmentation have not been investigated. In this study, we evaluated the intra- and inter-site reproducibility of in vivo human habenula segmentation from 3T MRI (0.7-0.8 mm isotropic resolution) using our previously proposed semi-automated myelin contrast-based method and its fully-automated version, as well as a previously published manual geometry-based method. The habenula segmentation using our semi-automated method showed consistent boundary definition (high Dice coefficient, low mean distance, and moderate Hausdorff distance) and reproducible volume measurement (low coefficient of variation). Furthermore, the habenula boundary in our semi-automated segmentation from 3T MRI agreed well with that in the manual segmentation from 7T MRI (0.5 mm isotropic resolution) of the same subjects. Overall, our proposed semi-automated habenula segmentation showed reliable and reproducible habenula localization, while its fully-automated version offers an efficient way for large sample analysis. © 2018 Wiley Periodicals, Inc.
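    The reported agreement metrics are standard. A minimal sketch of the Dice coefficient (boundary/overlap consistency) and the coefficient of variation (volume reproducibility) on toy 2-D masks; the paper's masks are 3-D and the volumes below are made up:

    ```python
    import numpy as np

    def dice(a: np.ndarray, b: np.ndarray) -> float:
        """Dice overlap between two binary segmentation masks:
        2|A ∩ B| / (|A| + |B|); 1.0 means identical masks."""
        inter = np.logical_and(a, b).sum()
        return 2.0 * inter / (a.sum() + b.sum())

    def coeff_of_variation(volumes) -> float:
        """Coefficient of variation (sample std / mean) of repeated
        volume measurements from test-retest scans."""
        v = np.asarray(volumes, dtype=float)
        return v.std(ddof=1) / v.mean()

    scan1 = np.zeros((10, 10), dtype=bool); scan1[2:6, 2:6] = True  # 16 voxels
    scan2 = np.zeros((10, 10), dtype=bool); scan2[3:7, 2:6] = True  # shifted one row
    overlap = dice(scan1, scan2)                   # 2*12 / (16+16) = 0.75
    cv = coeff_of_variation([32.1, 30.8, 31.5])    # hypothetical habenula volumes (mm^3)
    ```
    
    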

  13. Demonstration of Effects on Tropical Cyclone Forecasts with a High Resolution Global Model from Variation in Cumulus Convection Parameterization

    NASA Technical Reports Server (NTRS)

    Miller, Timothy L.; Robertson, Franklin R.; Cohen, Charles; Mackaro, Jessica

    2009-01-01

    The Goddard Earth Observing System Model, Version 5 (GEOS-5) is a system of models developed at Goddard Space Flight Center to support NASA's Earth science research in data analysis, observing system modeling and design, climate and weather prediction, and basic research. The work presented used GEOS-5 with 0.25° horizontal resolution and 72 vertical levels (up to 0.01 hPa) resolving both the troposphere and stratosphere, with levels packed more closely near the surface. The model includes explicit (grid-scale) moist physics as well as convective parameterization schemes. Results will be presented that demonstrate a strong dependence of the modeled development of a strong hurricane on the type of convective parameterization scheme used. The previous standard (default) option in the model was the Relaxed Arakawa-Schubert (RAS) scheme, which uses a quasi-equilibrium closure. In the cases shown, this scheme does not permit the efficient development of a strong storm in comparison with observations. When this scheme is replaced by a modified version of the Kain-Fritsch scheme, originally developed for use on grids with intervals of order 25 km such as the present one, the storm is able to develop to a much greater extent, closer to reality. Details of the two cases will be shown to elucidate the differences in the two modeled storms.

  14. The National Solar Radiation Data Base (NSRDB)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Manajit; Xie, Yu; Lopez, Anthony

    The National Solar Radiation Data Base (NSRDB), consisting of solar radiation and meteorological data over the United States and regions of the surrounding countries, is a publicly open dataset that has been created and disseminated during the last 23 years. This paper briefly reviews the complete package of surface observations, models, and satellite data used for the latest version of the NSRDB as well as improvements in the measurement and modeling technologies deployed in the NSRDB over the years. The current NSRDB provides solar irradiance at a 4-km horizontal resolution for each 30-min interval from 1998 to 2016 computed by the National Renewable Energy Laboratory's (NREL's) Physical Solar Model (PSM) and products from the National Oceanic and Atmospheric Administration's (NOAA's) Geostationary Operational Environmental Satellite (GOES), the National Ice Center's (NIC's) Interactive Multisensor Snow and Ice Mapping System (IMS), and the National Aeronautics and Space Administration's (NASA's) Moderate Resolution Imaging Spectroradiometer (MODIS) and Modern Era Retrospective analysis for Research and Applications, version 2 (MERRA-2). The NSRDB irradiance data have been validated and shown to agree with surface observations with mean percentage biases within 5% and 10% for global horizontal irradiance (GHI) and direct normal irradiance (DNI), respectively. The data can be freely accessed via https://nsrdb.nrel.gov or through an application programming interface (API). During the last 23 years, the NSRDB has been widely used by an ever-growing group of researchers and industry both directly and through tools such as NREL's System Advisor Model.

  15. TimeSet: A computer program that accesses five atomic time services on two continents

    NASA Technical Reports Server (NTRS)

    Petrakis, P. L.

    1993-01-01

    TimeSet is a shareware program for accessing digital time services by telephone. At its initial release, it was capable of capturing time signals only from the U.S. Naval Observatory to set a computer's clock. Later the ability to synchronize with the National Institute of Standards and Technology was added. Now, in Version 7.10, TimeSet is able to access three additional telephone time services in Europe - in Sweden, Austria, and Italy - making a total of five official services addressable by the program. A companion program, TimeGen, allows yet another source of telephone time data strings for callers equipped with TimeSet version 7.10. TimeGen synthesizes UTC time data strings in the Naval Observatory's format from an accurately set and maintained DOS computer clock, and transmits them to callers. This allows an unlimited number of 'freelance' time generating stations to be created. Timesetting from TimeGen is made feasible by the advent of Becker's RighTime, a shareware program that learns the drift characteristics of a computer's clock and continuously applies a correction to keep it accurate, and also brings 0.01-second resolution to the DOS clock. With clock regulation by RighTime and periodic update calls by the TimeGen station to an official time source via TimeSet, TimeGen offers the same degree of accuracy within the resolution of the computer clock as any official atomic time source.
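    The RighTime idea, learning a clock's drift rate from successive reference synchronizations and continuously removing the accumulated error, can be sketched as follows (hypothetical numbers; the actual program's drift model and DOS timer handling are more elaborate):

    ```python
    def drift_rate(t_ref0, t_clk0, t_ref1, t_clk1):
        """Estimate clock drift (seconds of clock error per second of
        reference time) from two synchronizations against a time service."""
        return ((t_clk1 - t_clk0) - (t_ref1 - t_ref0)) / (t_ref1 - t_ref0)

    def corrected(t_clk, t_clk0, rate):
        """Continuously corrected clock reading: rescale the elapsed
        local time to remove the learned drift."""
        elapsed = t_clk - t_clk0
        return t_clk0 + elapsed / (1.0 + rate)

    # A clock that gains 2 s over one day (86400 s of reference time):
    rate = drift_rate(0.0, 0.0, 86400.0, 86402.0)
    half_day = corrected(43201.0, 0.0, rate)   # raw clock reading at mid-day
    # half_day ≈ 43200.0 once the accumulated drift is removed
    ```
    
    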

  16. Active x-ray optics for Generation-X, the next high resolution x-ray observatory

    NASA Astrophysics Data System (ADS)

    Elvis, Martin; Brissenden, R. J.; Fabbiano, G.; Schwartz, D. A.; Reid, P.; Podgorski, W.; Eisenhower, M.; Juda, M.; Phillips, J.; Cohen, L.; Wolk, S.

    2006-06-01

    X-rays provide one of the few bands through which we can study the epoch of reionization, when the first galaxies, black holes and stars were born. To reach the sensitivity required to image these first discrete objects in the universe needs a major advance in X-ray optics. Generation-X (Gen-X) is currently the only X-ray astronomy mission concept that addresses this goal. Gen-X aims to improve substantially on the Chandra angular resolution and to do so with substantially larger effective area. These two goals can only be met if a mirror technology can be developed that yields high angular resolution at much lower mass/unit area than the Chandra optics, matching that of Constellation-X (Con-X). We describe an approach to this goal based on active X-ray optics that correct the mid-frequency departures from an ideal Wolter optic on-orbit. We concentrate on the problems of sensing figure errors, calculating the corrections required, and applying those corrections. The time needed to make this in-flight calibration is reasonable. A laboratory version of these optics has already been developed by others and is successfully operating at synchrotron light sources. With only a moderate investment in these optics the goals of Gen-X resolution can be realized.
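    On-orbit figure correction of the kind described typically reduces to a linear least-squares problem: given a matrix of actuator influence functions (how each actuator deforms the sensed figure), solve for the commands that best cancel the measured figure error. A minimal sketch with a random, entirely hypothetical influence matrix:

    ```python
    import numpy as np

    def actuator_commands(influence, figure_error):
        """Least-squares actuator solve: find commands x minimizing
        || influence @ x + figure_error ||, i.e. the residual figure
        error after the actuators deform the mirror."""
        x, *_ = np.linalg.lstsq(influence, -figure_error, rcond=None)
        return x

    rng = np.random.default_rng(3)
    A = rng.normal(size=(50, 8))   # 50 figure-sensor samples, 8 actuators
    true_cmd = rng.normal(size=8)
    error = A @ true_cmd           # a figure error the actuators can fully correct
    cmd = actuator_commands(A, error)
    residual = A @ cmd + error     # ~0 for a correctable error
    ```

    In practice the sensed mid-frequency figure error is only partially in the actuators' span, so the residual sets the achievable angular resolution.
    
    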

  17. Sparse representation based image interpolation with nonlocal autoregressive modeling.

    PubMed

    Dong, Weisheng; Zhang, Lei; Lukac, Rastislav; Shi, Guangming

    2013-04-01

    Sparse representation has proven to be a promising approach to image super-resolution, where the low-resolution (LR) image is usually modeled as the down-sampled version of its high-resolution (HR) counterpart after blurring. When the blurring kernel is the Dirac delta function, i.e., the LR image is directly down-sampled from its HR counterpart without blurring, the super-resolution problem becomes an image interpolation problem. In such cases, however, conventional sparse representation models (SRM) become less effective, because the data fidelity term fails to constrain the local image structures. In natural images, fortunately, the many nonlocal patches similar to a given patch can provide a nonlocal constraint on the local structure. In this paper, we incorporate image nonlocal self-similarity into SRM for image interpolation. More specifically, a nonlocal autoregressive model (NARM) is proposed and taken as the data fidelity term in SRM. We show that the NARM-induced sampling matrix is less coherent with the representation dictionary, and consequently makes SRM more effective for image interpolation. Our extensive experimental results demonstrate that the proposed NARM-based image interpolation method can effectively reconstruct edge structures and suppress jaggy/ringing artifacts, achieving the best image interpolation results so far in terms of PSNR as well as perceptual quality metrics such as SSIM and FSIM.
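    The delta-kernel observation model is worth seeing concretely: with no blur, the LR image is a direct decimation of the HR image, so the data-fidelity term pins down only one in every factor² HR pixels and says nothing about the rest — which is why the extra nonlocal constraint is needed. A minimal sketch:

    ```python
    import numpy as np

    def observe(hr, factor=2):
        """LR observation when the blur kernel is the Dirac delta:
        the LR image is the HR image directly decimated by `factor`,
        so recovering HR from LR is exactly image interpolation."""
        return hr[::factor, ::factor]

    hr = np.arange(64, dtype=float).reshape(8, 8)
    lr = observe(hr)
    # Every LR pixel coincides with one HR pixel; the 3 of every 4
    # HR pixels that are dropped are unconstrained by data fidelity.
    ```
    
    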

  18. Evolution of miniature detectors and focal plane arrays for infrared sensors

    NASA Astrophysics Data System (ADS)

    Watts, Louis A.

    1993-06-01

    Sensors that are sensitive in the infrared spectral region have been under continuous development since the World War II era. A quest for the military advantage of 'seeing in the dark' has pushed thermal imaging technology toward high spatial and temporal resolution for night vision equipment, fire control, search track, and seeker 'homing' guidance sensing devices. Similarly, scientific applications have pushed spectral resolution for chemical analysis, remote sensing of earth resources, and astronomical exploration applications. As a result of these developments, focal plane arrays (FPA) are now available with sufficient sensitivity for both high spatial and narrow bandwidth spectral resolution imaging over large fields of view. Such devices, combined with emerging opto-electronic developments in integrated FPA data processing techniques, can yield miniature sensors capable of imaging reflected sunlight in the near IR and emitted thermal energy in the midwave (MWIR) and longwave (LWIR) IR spectral regions. Robotic space sensors equipped with advanced versions of these FPAs will provide high resolution 'pictures' of their surroundings, perform remote analysis of solid, liquid, and gas matter, or selectively look for 'signatures' of specific objects. Evolutionary trends and projections of future low power micro detector FPA developments for day/night operation or use in adverse viewing conditions are presented in the following text.

  19. Evolution of miniature detectors and focal plane arrays for infrared sensors

    NASA Technical Reports Server (NTRS)

    Watts, Louis A.

    1993-01-01

    Sensors that are sensitive in the infrared spectral region have been under continuous development since the World War II era. A quest for the military advantage of 'seeing in the dark' has pushed thermal imaging technology toward high spatial and temporal resolution for night vision equipment, fire control, search track, and seeker 'homing' guidance sensing devices. Similarly, scientific applications have pushed spectral resolution for chemical analysis, remote sensing of earth resources, and astronomical exploration applications. As a result of these developments, focal plane arrays (FPA) are now available with sufficient sensitivity for both high spatial and narrow bandwidth spectral resolution imaging over large fields of view. Such devices, combined with emerging opto-electronic developments in integrated FPA data processing techniques, can yield miniature sensors capable of imaging reflected sunlight in the near IR and emitted thermal energy in the midwave (MWIR) and longwave (LWIR) IR spectral regions. Robotic space sensors equipped with advanced versions of these FPAs will provide high resolution 'pictures' of their surroundings, perform remote analysis of solid, liquid, and gas matter, or selectively look for 'signatures' of specific objects. Evolutionary trends and projections of future low power micro detector FPA developments for day/night operation or use in adverse viewing conditions are presented in the following text.

  20. Detection and Attribution of Regional Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bala, G; Mirin, A

    2007-01-19

    We developed a high resolution global coupled modeling capability to perform breakthrough studies of regional climate change. The atmospheric component in our simulation uses a 1° latitude × 1.25° longitude grid, the finest resolution ever used for the NCAR coupled climate model CCSM3. Substantial testing and slight retuning was required to get an acceptable control simulation. The major accomplishment is the validation of this new high resolution configuration of CCSM3. There are major improvements in our simulation of the surface wind stress and sea ice thickness distribution in the Arctic. Surface wind stress and ocean circulation in the Antarctic Circumpolar Current are also improved. Our results demonstrate that the FV version of the CCSM coupled model is a state-of-the-art climate model whose simulation capabilities are in the class of those used for IPCC assessments. We have also provided 1000 years of model data to Scripps Institution of Oceanography to estimate the natural variability of stream flow in California. In the future, our global model simulations will provide boundary data to a high-resolution mesoscale model that will be used at LLNL. The mesoscale model would dynamically downscale the GCM climate to regional scale on climate time scales.

Top