Abatzoglou, John T; Dobrowski, Solomon Z; Parks, Sean A; Hegewisch, Katherine C
2018-01-09
We present TerraClimate, a dataset of high-spatial-resolution (1/24°, ~4 km) monthly climate and climatic water balance for global terrestrial surfaces from 1958 to 2015. TerraClimate uses climatically aided interpolation, combining high-spatial-resolution climatological normals from the WorldClim dataset with coarser-resolution time-varying (i.e., monthly) data from other sources to produce a monthly dataset of precipitation, maximum and minimum temperature, wind speed, vapor pressure, and solar radiation. TerraClimate additionally produces monthly surface water balance datasets using a water balance model that incorporates reference evapotranspiration, precipitation, temperature, and interpolated plant-extractable soil water capacity. These data provide important inputs for ecological and hydrological studies at global scales that require high-spatial-resolution, time-varying climate and climatic water balance data. We validated spatiotemporal aspects of TerraClimate using annual temperature, precipitation, and calculated reference evapotranspiration from station data, as well as annual runoff from streamflow gauges. TerraClimate datasets showed notable improvement in overall mean absolute error and increased spatial realism relative to coarser-resolution gridded datasets.
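The climatically aided interpolation described above can be illustrated with a toy sketch (an assumption-laden simplification, not the TerraClimate code): a coarse-resolution monthly anomaly, taken relative to the coarse climatology, is upsampled and added to a high-resolution climatological normal.

```python
import numpy as np

def upsample_nearest(field, factor):
    """Nearest-neighbour upsampling of a 2-D array by an integer factor."""
    return np.repeat(np.repeat(field, factor, axis=0), factor, axis=1)

def climatically_aided_interp(hires_normal, coarse_month, coarse_normal, factor):
    """Add the coarse monthly departure from climatology to a high-res normal."""
    anomaly = coarse_month - coarse_normal
    return hires_normal + upsample_nearest(anomaly, factor)

# Toy example: a 2x2 coarse grid downscaled onto a 4x4 high-res normal.
hires_normal = np.full((4, 4), 10.0)        # high-res climatological normal
coarse_normal = np.full((2, 2), 10.0)       # coarse climatological normal
coarse_month = coarse_normal + 1.5          # this month is uniformly 1.5 warmer
hires_month = climatically_aided_interp(hires_normal, coarse_month,
                                        coarse_normal, factor=2)
```

A real implementation would interpolate the anomaly smoothly and treat multiplicative variables such as precipitation differently; nearest-neighbour upsampling keeps the sketch minimal.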
Mapping and spatiotemporal analysis tool for hydrological data: Spellmap
USDA-ARS's Scientific Manuscript database
Lack of data management and analyses tools is one of the major limitations to effectively evaluate and use large datasets of high-resolution atmospheric, surface, and subsurface observations. High spatial and temporal resolution datasets better represent the spatiotemporal variability of hydrologica...
Enhancing Conservation with High Resolution Productivity Datasets for the Conterminous United States
NASA Astrophysics Data System (ADS)
Robinson, Nathaniel Paul
Human driven alteration of the earth's terrestrial surface is accelerating through land use changes, intensification of human activity, climate change, and other anthropogenic pressures. These changes occur at broad spatio-temporal scales, challenging our ability to effectively monitor and assess the impacts and subsequent conservation strategies. While satellite remote sensing (SRS) products enable monitoring of the earth's terrestrial surface continuously across space and time, the practical applications for conservation and management of these products are limited. Often the processes driving ecological change occur at fine spatial resolutions and are undetectable given the resolution of available datasets. Additionally, the links between SRS data and ecologically meaningful metrics are weak. Recent advances in cloud computing technology along with the growing record of high resolution SRS data enable the development of SRS products that quantify ecologically meaningful variables at relevant scales applicable for conservation and management. The focus of my dissertation is to improve the applicability of terrestrial gross and net primary productivity (GPP/NPP) datasets for the conterminous United States (CONUS). In chapter one, I develop a framework for creating high resolution datasets of vegetation dynamics. I use the entire archive of Landsat 5, 7, and 8 surface reflectance data and a novel gap filling approach to create spatially continuous 30 m, 16-day composites of the normalized difference vegetation index (NDVI) from 1986 to 2016. In chapter two, I integrate this with other high resolution datasets and the MOD17 algorithm to create the first high resolution GPP and NPP datasets for CONUS. I demonstrate the applicability of these products for conservation and management, showing the improvements beyond currently available products. 
In chapter three, I utilize this dataset to evaluate the relationships between land ownership and terrestrial production across the CONUS domain. The main results of this work are three publicly available datasets: 1) 30 m Landsat NDVI; 2) 250 m MODIS based GPP and NPP; and 3) 30 m Landsat based GPP and NPP. My goal is that these products prove useful for the wider scientific, conservation, and land management communities as we continue to strive for better conservation and management practices.
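The NDVI and gap-filling ideas in the abstract can be sketched minimally (the function names and the linear temporal interpolation are illustrative assumptions; the dissertation's gap-filling approach is more sophisticated):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def fill_gaps(series):
    """Linearly interpolate NaN gaps in a per-pixel NDVI time series."""
    series = np.asarray(series, dtype=float)
    idx = np.arange(series.size)
    valid = ~np.isnan(series)
    return np.interp(idx, idx[valid], series[valid])

# One 16-day composite is cloud-masked (NaN) and gets filled.
obs = np.array([0.2, 0.4, np.nan, 0.8])
filled = fill_gaps(obs)
```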
T1-weighted in vivo human whole brain MRI dataset with an ultrahigh isotropic resolution of 250 μm.
Lüsebrink, Falk; Sciarra, Alessandro; Mattern, Hendrik; Yakupov, Renat; Speck, Oliver
2017-03-14
We present an ultrahigh resolution in vivo human brain magnetic resonance imaging (MRI) dataset. It consists of T1-weighted whole brain anatomical data acquired at 7 Tesla with a nominal isotropic resolution of 250 μm of a single young healthy Caucasian subject and was recorded using prospective motion correction. The raw data amounts to approximately 1.2 TB and was acquired in eight hours total scan time. The resolution of this dataset is far beyond any previously published in vivo structural whole brain dataset. Its potential use is to build an in vivo MR brain atlas. Methods for image reconstruction and image restoration can be improved as the raw data is made available. Pre-processing and segmentation procedures can possibly be enhanced for high magnetic field strength and ultrahigh resolution data. Furthermore, potential resolution induced changes in quantitative data analysis can be assessed, e.g., cortical thickness or volumetric measures, as high quality images with an isotropic resolution of 1 and 0.5 mm of the same subject are included in the repository as well.
NASA Astrophysics Data System (ADS)
Ota, Junko; Umehara, Kensuke; Ishimaru, Naoki; Ohno, Shunsuke; Okamoto, Kentaro; Suzuki, Takanori; Shirai, Naoki; Ishida, Takayuki
2017-02-01
As the capability of high-resolution displays grows, high-resolution images are often required in Computed Tomography (CT). However, acquiring high-resolution images takes a higher radiation dose and a longer scanning time. In this study, we applied the Sparse-coding-based Super-Resolution (ScSR) method to generate high-resolution images without increasing the radiation dose. We prepared an over-complete dictionary that learned the mapping between low- and high-resolution patches, and sought a sparse representation of each patch of the low-resolution input. These coefficients were used to generate the high-resolution output. For evaluation, 44 CT cases were used as the test dataset. We up-sampled images by factors of 2 or 4 and compared the image quality of the ScSR scheme with that of bilinear and bicubic interpolation, the traditional interpolation schemes. We also compared the image quality obtained with three learning datasets: a total of 45 CT images, 91 non-medical images, and 93 chest radiographs were used for dictionary preparation, respectively. Image quality was evaluated by measuring peak signal-to-noise ratio (PSNR) and structural similarity (SSIM). The differences in PSNR and SSIM between the ScSR method and the interpolation methods were statistically significant. Visual assessment confirmed that the ScSR method generated sharp high-resolution images, whereas the conventional interpolation methods generated over-smoothed images. Comparing the three training datasets, there were no significant differences among the CT, CXR, and non-medical datasets. These results suggest that ScSR provides a robust approach for up-sampling CT images and yields substantially higher image quality in the enlarged CT images.
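PSNR, one of the two quality metrics named above, is straightforward to compute; the sketch below is the generic definition, not the study's evaluation code (SSIM, which needs local image statistics, is omitted):

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images of equal shape."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

ref = np.zeros((8, 8))
noisy = ref + 1.0          # uniform error of one grey level -> MSE = 1
score = psnr(ref, noisy)   # 10 * log10(255^2 / 1) = 20 * log10(255)
```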
McKenzie, Grant; Janowicz, Krzysztof
2017-01-01
Gaining access to inexpensive, high-resolution, up-to-date, three-dimensional road network data is a top priority beyond research, as such data would fuel applications in industry, governments, and the broader public alike. Road network data are openly available via user-generated content such as OpenStreetMap (OSM) but lack the resolution required for many tasks, e.g., emergency management. More importantly, however, few publicly available data offer information on elevation and slope. For most parts of the world, up-to-date digital elevation products with a resolution of less than 10 meters are a distant dream and, if available, those datasets have to be matched to the road network through an error-prone process. In this paper we present a radically different approach by deriving road network elevation data from massive amounts of in-situ observations extracted from user-contributed data from an online social fitness tracking application. While each individual observation may be of low-quality in terms of resolution and accuracy, taken together they form an accurate, high-resolution, up-to-date, three-dimensional road network that excels where other technologies such as LiDAR fail, e.g., in case of overpasses, overhangs, and so forth. In fact, the 1m spatial resolution dataset created in this research based on 350 million individual 3D location fixes has an RMSE of approximately 3.11m compared to a LiDAR-based ground-truth and can be used to enhance existing road network datasets where individual elevation fixes differ by up to 60m. In contrast, using interpolated data from the National Elevation Dataset (NED) results in 4.75m RMSE compared to the base line. We utilize Linked Data technologies to integrate the proposed high-resolution dataset with OpenStreetMap road geometries without requiring any changes to the OSM data model.
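The core aggregation idea above, reducing many noisy elevation fixes to a per-cell statistic and scoring against a reference grid with RMSE, can be sketched as follows (cell size, the median statistic, and all values are toy assumptions, not the paper's pipeline):

```python
import numpy as np

def gridded_median(x, y, z, cell, nx, ny):
    """Bin point elevations into an ny-by-nx grid and take per-cell medians."""
    grid = np.full((ny, nx), np.nan)
    ix, iy = (x // cell).astype(int), (y // cell).astype(int)
    for j in range(ny):
        for i in range(nx):
            sel = (ix == i) & (iy == j)
            if sel.any():
                grid[j, i] = np.median(z[sel])
    return grid

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

rng = np.random.default_rng(0)
truth = np.array([[100.0, 110.0], [120.0, 130.0]])   # reference elevations (m)
# 200 fixes per cell with 3 m noise, mimicking crowd-sourced GPS elevations.
xs, ys, zs = [], [], []
for j in range(2):
    for i in range(2):
        xs.append(rng.uniform(i, i + 1, 200))
        ys.append(rng.uniform(j, j + 1, 200))
        zs.append(truth[j, i] + rng.normal(0, 3, 200))
x, y, z = np.concatenate(xs), np.concatenate(ys), np.concatenate(zs)
dem = gridded_median(x, y, z, cell=1.0, nx=2, ny=2)
err = rmse(dem, truth)   # well below the per-fix 3 m noise
```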
Policy makers need to understand how land cover change alters storm water regimes, yet existing methods do not fully utilize newly available datasets to quantify storm water changes at a landscape-scale. Here, we use high-resolution, remotely-sensed land cover, imperviousness, an...
Large Scale Flood Risk Analysis using a New Hyper-resolution Population Dataset
NASA Astrophysics Data System (ADS)
Smith, A.; Neal, J. C.; Bates, P. D.; Quinn, N.; Wing, O.
2017-12-01
Here we present the first national-scale flood risk analyses, using high-resolution Facebook Connectivity Lab population data and data from a hyper-resolution flood hazard model. In recent years the field of large-scale hydraulic modelling has been transformed by new remotely sensed datasets, improved process representation, highly efficient flow algorithms and increases in computational power. These developments have allowed flood risk analysis to be undertaken in previously unmodelled territories and at continental to global scales. Flood risk analyses are typically conducted via the integration of modelled water depths with an exposure dataset. Over large scales and in data-poor areas, these exposure data typically take the form of a gridded population dataset, estimating population density using remotely sensed data and/or locally available census data. The local nature of flooding dictates that, for robust flood risk analysis, both hazard and exposure data should sufficiently resolve local-scale features. Global flood frameworks now enable flood hazard data to be produced at 90 m resolution, resulting in a mismatch with available population datasets, which are typically more coarsely resolved. Moreover, these exposure data are typically focused on urban areas and struggle to represent rural populations. In this study we integrate a new population dataset with a global flood hazard model. The population dataset was produced by the Connectivity Lab at Facebook, providing gridded population data at 5 m resolution, a resolution increase over previous countrywide datasets of multiple orders of magnitude. Flood risk analyses undertaken over a number of developing countries are presented, along with a comparison with analyses undertaken using pre-existing population datasets.
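The hazard-exposure integration step described above reduces, in its simplest form, to summing population in cells where modelled depth exceeds a threshold; the sketch below uses invented toy grids:

```python
import numpy as np

def exposed_population(depth, population, threshold=0.0):
    """People in cells where modelled water depth exceeds the threshold."""
    return float(population[depth > threshold].sum())

depth = np.array([[0.0, 0.5], [1.2, 0.0]])       # modelled depths (m)
pop = np.array([[120.0, 80.0], [50.0, 200.0]])   # gridded population counts
at_risk = exposed_population(depth, pop)         # 80 + 50 people
```

In practice the two grids must first share a common resolution, which is exactly the mismatch the abstract discusses.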
NASA Astrophysics Data System (ADS)
Li, Tao; Zheng, Xiaogu; Dai, Yongjiu; Yang, Chi; Chen, Zhuoqi; Zhang, Shupeng; Wu, Guocan; Wang, Zhonglei; Huang, Chengcheng; Shen, Yan; Liao, Rongwei
2014-09-01
As part of a joint effort to construct an atmospheric forcing dataset for mainland China with high spatiotemporal resolution, a new approach is proposed to construct gridded near-surface temperature, relative humidity, wind speed and surface pressure at a resolution of 1 km × 1 km. The approach comprises two steps: (1) fit a partial thin-plate smoothing spline, with orography and reanalysis data as explanatory variables, to ground-based observations to estimate a trend surface; (2) apply a simple kriging procedure to the residuals for trend-surface correction. The proposed approach is applied to observations collected at approximately 700 stations over mainland China. The generated forcing fields are compared with the corresponding components of the National Centers for Environmental Prediction (NCEP) Climate Forecast System Reanalysis dataset and the Princeton meteorological forcing dataset. The comparison shows that, both within the station network and at the resolutions of the two gridded datasets, the interpolation errors of the proposed approach are markedly smaller than those of the two gridded datasets.
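The two-step scheme can be caricatured in a few lines (hedged heavily: a linear fit on elevation stands in for the partial thin-plate smoothing spline, and inverse-distance weighting stands in for simple kriging; all station values are synthetic):

```python
import numpy as np

def idw(sx, sy, sv, gx, gy, power=2.0):
    """Inverse-distance-weighted interpolation of station values to grid points."""
    d2 = (gx[:, None] - sx) ** 2 + (gy[:, None] - sy) ** 2
    w = 1.0 / np.maximum(d2, 1e-12) ** (power / 2)
    return (w * sv).sum(axis=1) / w.sum(axis=1)

# Synthetic stations: temperature falls 6.5 K per km of elevation.
elev_s = np.array([0.0, 500.0, 1000.0, 1500.0])
temp_s = 20.0 - 6.5e-3 * elev_s
sx = np.array([0.0, 1.0, 0.0, 1.0])
sy = np.array([0.0, 0.0, 1.0, 1.0])

# Step 1: trend surface from orography (linear fit as a stand-in).
slope, intercept = np.polyfit(elev_s, temp_s, 1)
resid = temp_s - (intercept + slope * elev_s)

# Step 2: interpolate residuals to a target grid point with known elevation.
gx, gy, g_elev = np.array([0.5]), np.array([0.5]), np.array([750.0])
estimate = intercept + slope * g_elev + idw(sx, sy, resid, gx, gy)
```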
Open Science CBS Neuroimaging Repository: Sharing ultra-high-field MR images of the brain.
Tardif, Christine Lucas; Schäfer, Andreas; Trampel, Robert; Villringer, Arno; Turner, Robert; Bazin, Pierre-Louis
2016-01-01
Magnetic resonance imaging at ultra high field opens the door to quantitative brain imaging at sub-millimeter isotropic resolutions. However, novel image processing tools to analyze these new rich datasets are lacking. In this article, we introduce the Open Science CBS Neuroimaging Repository: a unique repository of high-resolution and quantitative images acquired at 7 T. The motivation for this project is to increase interest for high-resolution and quantitative imaging and stimulate the development of image processing tools developed specifically for high-field data. Our growing repository currently includes datasets from MP2RAGE and multi-echo FLASH sequences from 28 and 20 healthy subjects respectively. These datasets represent the current state-of-the-art in in-vivo relaxometry at 7 T, and are now fully available to the entire neuroimaging community.
James, Eric P.; Benjamin, Stanley G.; Marquis, Melinda
2016-10-28
A new gridded dataset for wind and solar resource estimation over the contiguous United States has been derived from hourly updated 1-h forecasts from the National Oceanic and Atmospheric Administration High-Resolution Rapid Refresh (HRRR) 3-km model, composited over a three-year period (approximately 22 000 forecast model runs). The unique dataset features hourly data assimilation, and provides physically consistent wind and solar estimates for the renewable energy industry. The wind resource dataset shows strong similarity to that previously provided by a Department of Energy-funded study, and it includes estimates in southern Canada and northern Mexico. The solar resource dataset represents an initial step towards application-specific fields such as global horizontal and direct normal irradiance. This combined dataset will continue to be augmented with new forecast data from the advanced HRRR atmospheric/land-surface model.
NASA Astrophysics Data System (ADS)
Silvestro, Francesco; Parodi, Antonio; Campo, Lorenzo
2017-04-01
The characterization of hydrometeorological extremes, both in terms of rainfall and streamflow, in a given region plays a key role in the environmental monitoring provided by flood alert services. In recent years, meteorological simulations (both near-real-time and historical reanalyses) have become available at increasing spatial and temporal resolutions, making possible long-period hydrological reanalyses in which the meteorological dataset is used as input to distributed hydrological models. In this work, a very high resolution meteorological reanalysis dataset, namely Express-Hydro (CIMA, ISAC-CNR, GAUSS Special Project PR45DE), was employed as input to the hydrological model Continuum in order to produce long time series of streamflows for the Liguria territory, located in the northern part of Italy. The original dataset covers the whole of Europe over the 1979-2008 period, at 4 km spatial resolution and 3-hour time resolution. Analyses comparing the rainfall estimated by the dataset with observations (available from the local raingauge network) were carried out, and a bias correction was also performed in order to better match the observed climatology. An extreme-value analysis was eventually carried out on the streamflow time series obtained from the simulations, comparing them with the results of the same hydrological model fed with the observed rainfall time series. The results of the analysis are shown and discussed.
Evaluation of precipitation extremes over the Asian domain: observation and modelling studies
NASA Astrophysics Data System (ADS)
Kim, In-Won; Oh, Jaiho; Woo, Sumin; Kripalani, R. H.
2018-04-01
In this study, a comparison of the precipitation extremes exhibited by seven reference datasets is made to ascertain whether the inferences based on these datasets agree or differ. These seven datasets, roughly grouped into three categories, i.e. rain-gauge based (APHRODITE, CPC-UNI), satellite based (TRMM, GPCP1DD) and reanalysis based (ERA-Interim, MERRA, and JRA55), have a common data period of 1998-2007. The focus is on precipitation extremes in the summer monsoon rainfall over South Asia, East Asia and Southeast Asia. Measures of extreme precipitation include percentile thresholds, the frequency of extreme precipitation events and other quantities. Results reveal that the differences in extremes among the datasets are small over South Asia and East Asia, but large differences appear over the Southeast Asian region including the maritime continent. Furthermore, precipitation data appear to be most consistent over East Asia among the seven datasets. Decadal trends in extreme precipitation are consistent with known results over South and East Asia. No trends in extreme precipitation events are exhibited over Southeast Asia. Outputs of the Coupled Model Intercomparison Project Phase 5 (CMIP5) simulations are categorized as high-, medium- and low-resolution models. The regions displaying maximum intensity of extreme precipitation appear to depend on model resolution: high-resolution models simulate maximum intensity of extreme precipitation over the Indian sub-continent, medium-resolution models over northeast India and South China, and low-resolution models over Bangladesh, Myanmar and Thailand. In summary, there are differences in extreme precipitation statistics among the seven datasets considered here and among the 29 CMIP5 model outputs.
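Percentile-threshold measures of extreme precipitation, such as those used above, can be computed generically as follows (the 1 mm wet-day cutoff and the 95th percentile are common conventions assumed here, not necessarily the paper's exact definitions):

```python
import numpy as np

def extreme_stats(daily_mm, wet_day_mm=1.0, pct=95.0):
    """Percentile threshold over wet days and the count of days exceeding it."""
    wet = daily_mm[daily_mm >= wet_day_mm]
    threshold = np.percentile(wet, pct)
    n_extreme = int((daily_mm > threshold).sum())
    return threshold, n_extreme

# Ten toy years of skewed daily rainfall.
rng = np.random.default_rng(1)
rain = rng.gamma(shape=0.5, scale=10.0, size=3650)
thr, n_ext = extreme_stats(rain)
```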
Sorichetta, Alessandro; Hornby, Graeme M.; Stevens, Forrest R.; Gaughan, Andrea E.; Linard, Catherine; Tatem, Andrew J.
2015-01-01
The Latin America and the Caribbean region is one of the most urbanized regions in the world, with a total population of around 630 million that is expected to increase by 25% by 2050. In this context, detailed and contemporary datasets accurately describing the distribution of residential population in the region are required for measuring the impacts of population growth, monitoring changes, supporting environmental and health applications, and planning interventions. To support these needs, an open access archive of high-resolution gridded population datasets was created through disaggregation of the most recent official population count data available for 28 countries located in the region. These datasets are described here along with the approach and methods used to create and validate them. For each country, population distribution datasets, having a resolution of 3 arc seconds (approximately 100 m at the equator), were produced for the population count year, as well as for 2010, 2015, and 2020. All these products are available both through the WorldPop Project website and the WorldPop Dataverse Repository. PMID:26347245
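The disaggregation of an official population count onto grid cells can be sketched with a minimal dasymetric weighting example (the weights are invented; the actual WorldPop approach derives its weighting layer from many covariates):

```python
import numpy as np

def disaggregate(total, weights):
    """Spread a unit-level count over cells in proportion to a weight layer."""
    w = np.asarray(weights, dtype=float)
    return total * w / w.sum()   # volume-preserving: cells sum back to `total`

unit_total = 10000.0
weights = np.array([[0.0, 1.0], [3.0, 6.0]])  # relative settlement likelihood
pop_grid = disaggregate(unit_total, weights)
```

The key property is that the gridded cells sum exactly to the official count, so census totals are preserved.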
NASA Astrophysics Data System (ADS)
Hoffmeister, Dirk; Kramm, Tanja; Curdt, Constanze; Maleki, Sedigheh; Khormali, Farhad; Kehl, Martin
2016-04-01
The Iranian loess plateau is covered by loess deposits up to 70 m thick. Tectonic uplift triggered deep erosion and valley incision into the loess and underlying marine deposits. Soil development strongly relates to the aspect of these incised slopes, because on northern slopes vegetation protects the soil surface against erosion and facilitates the formation and preservation of a Cambisol, whereas on south-facing slopes soils were probably eroded and weakly developed Entisols formed. While the whole area is intensively stocked with sheep and goats, rain-fed cropping of winter wheat is practiced on the valley floors. For most of the year, the soil surface is unprotected against rainfall, which is one of the factors promoting soil erosion and serious flooding. However, little information is available on soil distribution, plant cover and the geomorphological evolution of the plateau, as well as on potentials and problems in land use. Thus, digital landform and soil mapping is needed. As a prerequisite for digital landform and soil mapping, four different landform classification methods were compared and evaluated. These geomorphometric classifications were run at two different scales. For the whole area, ASTER GDEM and SRTM datasets (30 m pixel resolution) were used. Likewise, two high-resolution digital elevation models were derived from Pléiades satellite stereo-imagery (< 1 m pixel resolution, 10 by 10 km). The high-resolution information of this dataset was aggregated to datasets at 5 and 10 m scale. The applied classification methods are the Geomorphons approach, an object-based image approach, the topographic position index and a mainly slope-based approach. The accuracy of the classification was checked with a location-referenced image dataset obtained in a field survey (n ~ 150) in September 2015. The accuracy of the DEMs was compared to measured DGPS trenches and map-based elevation data.
The overall accuracy of the landform classification based on the high-resolution DEM is approximately 70% at 5 m resolution and >58% at 10 m resolution. For the 30 m resolution datasets, the achieved accuracy is approximately 40%, as several small-scale features are not recognizable at this resolution. Thus, for an accurate differentiation between the important landform types, high-resolution datasets are necessary for this strongly shaped area. One major problem of this approach is that each method derives different classes with varying class annotations. The results of this evaluation will be taken into account in the derivation of landform and soil maps.
NHDPlusHR: A national geospatial framework for surface-water information
Viger, Roland; Rea, Alan H.; Simley, Jeffrey D.; Hanson, Karen M.
2016-01-01
The U.S. Geological Survey is developing a new geospatial hydrographic framework for the United States, called the National Hydrography Dataset Plus High Resolution (NHDPlusHR), that integrates a diversity of the best-available information, robustly supports ongoing dataset improvements, enables hydrographic generalization to derive alternate representations of the network while maintaining feature identity, and supports modern scientific computing and Internet accessibility needs. This framework is based on the High Resolution National Hydrography Dataset, the Watershed Boundaries Dataset, and elevation from the 3-D Elevation Program, and will provide an authoritative, high precision, and attribute-rich geospatial framework for surface-water information for the United States. Using this common geospatial framework will provide a consistent basis for indexing water information in the United States, eliminate redundancy, and harmonize access to, and exchange of water information.
High-resolution near real-time drought monitoring in South Asia
NASA Astrophysics Data System (ADS)
Aadhar, Saran; Mishra, Vimal
2017-10-01
Droughts in South Asia affect food and water security and pose challenges for millions of people. For policy making, planning, and management of water resources at sub-basin or administrative levels, high-resolution datasets of precipitation and air temperature are required in near real time. We develop a high-resolution (0.05°) bias-corrected precipitation and temperature dataset that can be used to monitor near-real-time drought conditions over South Asia. Moreover, the dataset can be used to monitor climatic extremes (heat and cold waves, dry and wet anomalies) in South Asia. A distribution mapping method was applied to correct bias in precipitation and air temperature, and performed well compared to an alternative bias correction based on linear scaling. Bias-corrected precipitation and temperature data were used to estimate the Standardized Precipitation Index (SPI) and Standardized Precipitation Evapotranspiration Index (SPEI) to assess historical and current drought conditions in South Asia. We evaluated drought severity and extent against satellite-based Normalized Difference Vegetation Index (NDVI) anomalies and the satellite-derived Drought Severity Index (DSI) at 0.05°. The bias-corrected high-resolution data effectively capture observed drought conditions, as shown by the satellite-based drought estimates. A high-resolution near-real-time dataset can provide valuable information for decision-making at district and sub-basin levels.
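Distribution (quantile) mapping, the bias-correction family named above, can be sketched empirically (a generic illustration on synthetic data, not the authors' implementation):

```python
import numpy as np

def quantile_map(raw, train_raw, train_obs):
    """Empirical quantile mapping: replace each raw value by the observed
    value at the same quantile of the training distributions."""
    q = np.searchsorted(np.sort(train_raw), raw, side="right") / train_raw.size
    return np.quantile(train_obs, np.clip(q, 0.0, 1.0))

rng = np.random.default_rng(2)
obs = rng.gamma(2.0, 5.0, 5000)   # "observed" precipitation sample
raw = obs * 0.5                   # model biased low by a factor of two
corrected = quantile_map(raw[:100], raw, obs)   # bias largely removed
```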
NASA Astrophysics Data System (ADS)
Dube, Timothy; Mutanga, Onisimo
2015-03-01
Aboveground biomass estimation is critical in understanding forest contribution to regional carbon cycles. Despite the successful application of high spatial and spectral resolution sensors in aboveground biomass (AGB) estimation, there are challenges related to high acquisition costs, small area coverage, multicollinearity and limited availability. These challenges hamper successful regional-scale AGB quantification. The aim of this study was to assess the utility of the newly-launched medium-resolution multispectral Landsat 8 Operational Land Imager (OLI) dataset, with its large swath width, in quantifying AGB in a forest plantation. We applied different sets of spectral analysis (test I: spectral bands; test II: spectral vegetation indices; and test III: spectral bands + spectral vegetation indices) in testing the utility of Landsat 8 OLI using two non-parametric algorithms: stochastic gradient boosting and random forest ensembles. The results of the study show that the medium-resolution multispectral Landsat 8 OLI dataset provides better AGB estimates for Eucalyptus dunii, Eucalyptus grandis and Pinus taeda, especially when using the extracted spectral information together with the derived spectral vegetation indices. We also noted that incorporating the optimal subset of the most important selected Landsat 8 OLI bands improved AGB accuracies. We compared Landsat 8 OLI AGB estimates with Landsat 7 ETM+ estimates, and the latter yielded lower estimation accuracies. Overall, this study demonstrates the potential of applying the relatively affordable and readily available Landsat 8 OLI dataset, with its large swath width (185 km), in precisely estimating AGB. This strength of the Landsat OLI dataset is crucial especially in sub-Saharan Africa, where high-resolution remote sensing data availability remains a challenge.
Compartmentalized Low-Rank Recovery for High-Resolution Lipid Unsuppressed MRSI
Bhattacharya, Ipshita; Jacob, Mathews
2017-01-01
Purpose: To introduce a novel algorithm for the recovery of high-resolution magnetic resonance spectroscopic imaging (MRSI) data with minimal lipid leakage artifacts from a dual-density spiral acquisition. Methods: The reconstruction of MRSI data from dual-density spiral data is formulated as a compartmental low-rank recovery problem. The MRSI dataset is modeled as the sum of metabolite and lipid signals, which are support-limited to the brain and extracranial regions, respectively, in addition to being orthogonal to each other. The reconstruction is posed as an optimization problem, which is solved using iterative reweighted nuclear norm minimization. Results: Comparisons of the scheme against a dual-resolution reconstruction algorithm on numerical phantom and in vivo datasets demonstrate its ability to provide higher spatial resolution and lower lipid leakage artifacts. The experiments demonstrate the ability of the scheme to recover the metabolite maps from lipid-unsuppressed datasets with echo time (TE) = 55 ms. Conclusion: The proposed reconstruction method and data acquisition strategy provide an efficient way to achieve high-resolution metabolite maps without lipid suppression. The algorithm would be beneficial for fast metabolic mapping and extension to multislice acquisitions. PMID:27851875
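The core building block behind nuclear-norm-based low-rank recovery of this kind is singular value thresholding, the proximal operator that iterative (reweighted) nuclear norm minimization applies repeatedly. The sketch below demonstrates only that step on a synthetic rank-2 matrix; it is not the authors' compartmentalized algorithm, and the matrix sizes, noise level, and threshold are illustrative assumptions:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: soft-threshold the singular values of M,
    the proximal operator of the nuclear norm used inside such solvers."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_thresh = np.maximum(s - tau, 0.0)
    return (U * s_thresh) @ Vt

rng = np.random.default_rng(1)
# A rank-2 "signal" matrix (e.g. a compact metabolite subspace) plus noise.
signal = rng.normal(size=(40, 2)) @ rng.normal(size=(2, 30))
noisy = signal + 0.1 * rng.normal(size=signal.shape)

denoised = svt(noisy, tau=2.0)
rank_after = int((np.linalg.svd(denoised, compute_uv=False) > 1e-8).sum())
```

With the threshold above the largest noise singular value, the two signal components survive while the noise components are zeroed, recovering a low-rank estimate.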
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oubeidillah, Abdoul A; Kao, Shih-Chieh; Ashfaq, Moetasim
2014-01-01
To extend geographical coverage, refine spatial resolution, and improve modeling efficiency, a computation- and data-intensive effort was conducted to organize a comprehensive hydrologic dataset with post-calibrated model parameters for hydro-climate impact assessment. Several key inputs for hydrologic simulation, including meteorological forcings, soil, land class, vegetation, and elevation, were collected from multiple best-available data sources and organized for 2107 hydrologic subbasins (8-digit hydrologic units, HUC8s) in the conterminous United States at a refined 1/24° (~4 km) spatial resolution. Using high-performance computing for intensive model calibration, a high-resolution parameter dataset was prepared for the macro-scale Variable Infiltration Capacity (VIC) hydrologic model. The VIC simulation was driven by DAYMET daily meteorological forcing and was calibrated against USGS WaterWatch monthly runoff observations for each HUC8. The results showed that this new parameter dataset can support reasonable runoff simulation at most US HUC8 subbasins. Based on this exhaustive calibration effort, it is now possible to accurately estimate the resources required for further model improvement across the entire conterminous United States. We anticipate that through this hydrologic parameter dataset, the repeated effort of fundamental data processing can be lessened, so that research efforts can emphasize the more challenging task of assessing climate change impacts. The pre-organized model parameter dataset will be provided to interested parties to support further hydro-climate impact assessment.
Interpolation of diffusion weighted imaging datasets.
Dyrby, Tim B; Lundell, Henrik; Burke, Mark W; Reislev, Nina L; Paulson, Olaf B; Ptito, Maurice; Siebner, Hartwig R
2014-12-01
Diffusion weighted imaging (DWI) is used to study white-matter fibre organisation, orientation and structural connectivity by means of fibre reconstruction algorithms and tractography. In clinical settings, limited scan time compromises the possibility of achieving high image resolution for finer anatomical details and sufficient signal-to-noise ratio for reliable fibre reconstruction. We assessed the potential benefits of interpolating DWI datasets to a higher image resolution before fibre reconstruction using a diffusion tensor model. Simulations of straight and curved crossing tracts smaller than or equal to the voxel size showed that conventional higher-order interpolation methods improved the geometrical representation of white-matter tracts with reduced partial volume effect (PVE), except at tract boundaries. Simulations and interpolation of ex-vivo monkey brain DWI datasets revealed that conventional interpolation methods fail to disentangle fine anatomical details if PVE is too pronounced in the original data. For validation, we used ex-vivo DWI datasets acquired at various image resolutions as well as Nissl-stained sections. Increasing the image resolution by a factor of eight yielded finer geometrical resolution and more anatomical detail in complex regions such as tract boundaries and cortical layers, which are normally only visualized at higher image resolutions. Similar results were found with a typical clinical human DWI dataset. However, a possible bias in quantitative values imposed by the interpolation method should be considered. The results indicate that conventional interpolation methods can be successfully applied to DWI datasets to mine anatomical details that are normally seen only at higher resolutions, which will aid tractography and microstructural mapping of tissue compartments. Copyright © 2014. Published by Elsevier Inc.
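The interpolation step assessed in this study can be sketched with a simple 2D upsampler. The bilinear scheme below is a stand-in for the conventional higher-order methods the authors evaluated; the coarse map with a sharp "tract boundary" and the upsampling factor are invented for illustration:

```python
import numpy as np

def upsample2d(img, factor):
    """Bilinear upsampling of a 2D array by an integer factor (a simple
    stand-in for the higher-order interpolation methods assessed here)."""
    ny, nx = img.shape
    y = np.linspace(0, ny - 1, ny * factor)
    x = np.linspace(0, nx - 1, nx * factor)
    y0 = np.clip(np.floor(y).astype(int), 0, ny - 2)
    x0 = np.clip(np.floor(x).astype(int), 0, nx - 2)
    wy = (y - y0)[:, None]          # fractional row offsets
    wx = (x - x0)[None, :]          # fractional column offsets
    tl = img[np.ix_(y0, x0)]
    tr = img[np.ix_(y0, x0 + 1)]
    bl = img[np.ix_(y0 + 1, x0)]
    br = img[np.ix_(y0 + 1, x0 + 1)]
    return (1 - wy) * ((1 - wx) * tl + wx * tr) + wy * ((1 - wx) * bl + wx * br)

# A coarse anisotropy-like map with a sharp tract boundary down the middle.
coarse = np.zeros((8, 8))
coarse[:, 4:] = 0.8
fine = upsample2d(coarse, 2)
```

Note how the interpolated boundary is smeared across intermediate values, the partial-volume-like behaviour at tract boundaries that the paper highlights as a limitation.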
High-Resolution Digital Terrain Models of the Sacramento/San Joaquin Delta Region, California
Coons, Tom; Soulard, Christopher E.; Knowles, Noah
2008-01-01
The U.S. Geological Survey (USGS) Western Region Geographic Science Center, in conjunction with the USGS Water Resources Western Branch of Regional Research, has developed a high-resolution elevation dataset covering the Sacramento/San Joaquin Delta region of California. The elevation data were compiled photogrammetrically from aerial photography (May 2002) with a scale of 1:15,000. The resulting dataset has a 10-meter horizontal resolution grid of elevation values. The vertical accuracy was determined to be 1 meter. Two versions of the elevation data are available: the first dataset has all water coded as zero, whereas the second dataset has bathymetry data merged with the elevation data. The projection of both datasets is set to UTM Zone 10, NAD 1983. The elevation data are clipped into files that spatially approximate 7.5-minute USGS quadrangles, with about 100 meters of overlap to facilitate combining the files into larger regions without data gaps. The files are named after the 7.5-minute USGS quadrangles that cover the same general spatial extent. File names that include a suffix (_b) indicate that the bathymetry data are included (for example, sac_east versus sac_east_b). These files are provided in ESRI Grid format.
Given the relatively high cost of mapping impervious surfaces at regional scales, substantial effort is being expended in the development of moderate-resolution, satellite-based methods for estimating impervious surface area (ISA). To rigorously assess the accuracy of these data ...
High-Resolution Near Real-Time Drought Monitoring in South Asia
NASA Astrophysics Data System (ADS)
Aadhar, S.; Mishra, V.
2017-12-01
Droughts in South Asia affect food and water security and pose challenges for millions of people. For policy making, planning and management of water resources at the sub-basin or administrative levels, high-resolution precipitation and air temperature datasets are required in near real time. Here we develop high-resolution (0.05°) bias-corrected precipitation and temperature data that can be used to monitor near-real-time drought conditions over South Asia. Moreover, the dataset can be used to monitor climatic extremes (heat waves, cold waves, dry and wet anomalies) in South Asia. A distribution mapping method was applied to correct bias in precipitation and air temperature (maximum and minimum), and performed well compared to an alternative bias correction method based on linear scaling. Bias-corrected precipitation and temperature data were used to estimate the Standardized Precipitation Index (SPI) and the Standardized Precipitation Evapotranspiration Index (SPEI) to assess historical and current drought conditions in South Asia. We evaluated drought severity and extent against satellite-based Normalized Difference Vegetation Index (NDVI) anomalies and the satellite-driven Drought Severity Index (DSI) at 0.05°. We find that the bias-corrected high-resolution data effectively capture observed drought conditions, as shown by the satellite-based drought estimates. This high-resolution near-real-time dataset can provide valuable information for decision-making at district and sub-basin levels.
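The distribution mapping named above can be sketched as empirical quantile mapping: values are pushed through the model CDF and pulled back through the observed CDF, so the corrected distribution matches the observations. The numpy sketch below uses synthetic gamma-distributed precipitation; the gamma parameters and the dry-bias factor are assumptions for illustration, not values from the study:

```python
import numpy as np

def quantile_map(model, obs, values):
    """Empirical distribution mapping: map `values` from the model's
    empirical CDF onto the observed empirical CDF."""
    model_sorted = np.sort(model)
    obs_sorted = np.sort(obs)
    # Non-exceedance probability of each value under the model CDF.
    p = np.searchsorted(model_sorted, values, side="right") / len(model_sorted)
    # Invert the observed CDF at those probabilities.
    return np.interp(p, np.linspace(0.0, 1.0, len(obs_sorted)), obs_sorted)

rng = np.random.default_rng(2)
obs = rng.gamma(2.0, 3.0, 5000)           # "true" station precipitation
model = rng.gamma(2.0, 3.0, 5000) * 0.7   # raw gridded product with a dry bias
corrected = quantile_map(model, obs, model)
```

After mapping, the corrected values inherit the observed distribution, so the systematic dry bias in the mean (and in the tails, which matter for SPI/SPEI) is largely removed.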
NASA Astrophysics Data System (ADS)
Aires, Filipe; Miolane, Léo; Prigent, Catherine; Pham Duc, Binh; Papa, Fabrice; Fluet-Chouinard, Etienne; Lehner, Bernhard
2017-04-01
The Global Inundation Extent from Multi-Satellites (GIEMS) provides multi-year monthly variations of the global surface water extent at 25 km × 25 km resolution, derived from multiple satellite observations. Its spatial resolution is usually compatible with climate model outputs and with global land surface model grids, but is clearly not adequate for local applications that require the characterization of small individual water bodies. There is today a strong demand for high-resolution inundation extent datasets, for a large variety of applications such as water management, regional hydrological modeling, or the analysis of mosquito-related diseases. A new procedure is introduced to downscale the GIEMS low-spatial-resolution inundation estimates to a 3 arc-second (90 m) dataset. The methodology is based on topography and hydrography information from the HydroSHEDS database. A new floodability index is adopted, and an innovative smoothing procedure is developed to ensure a smooth transition, in the high-resolution maps, between the low-resolution boxes from GIEMS. Topography information is relevant for natural hydrology environments controlled by elevation, but is more limited in human-modified basins. However, the proposed downscaling approach is compatible with forthcoming fusion with other, more pertinent satellite information in these difficult regions. The resulting GIEMS-D3 database is the only high-spatial-resolution inundation database available globally at the monthly time scale over the 1993-2007 period. GIEMS-D3 is assessed by analyzing its spatial and temporal variability, and evaluated by comparison with other independent satellite observations from visible (Google Earth and Landsat), infrared (MODIS) and active microwave (SAR) sensors.
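The topography-based downscaling idea, marking the most floodable fine pixels wet until the coarse cell's inundated fraction is reproduced, can be sketched per coarse cell. The elevation-derived floodability index below is a deliberate simplification of the paper's index and does not include its smoothing across coarse-cell boundaries:

```python
import numpy as np

def downscale_cell(fraction, floodability):
    """Area-conserving downscaling of one coarse cell: wet the fine pixels
    with the highest floodability until the coarse inundated fraction is met."""
    n = floodability.size
    n_wet = int(round(fraction * n))
    order = np.argsort(floodability.ravel())[::-1]  # most floodable first
    wet = np.zeros(n, dtype=bool)
    wet[order[:n_wet]] = True
    return wet.reshape(floodability.shape)

rng = np.random.default_rng(3)
# Toy floodability: inverted relative elevation (low terrain floods first).
elevation = rng.uniform(0.0, 100.0, (10, 10))
floodability = elevation.max() - elevation

# GIEMS reports this coarse cell as 25% inundated; distribute it at 90 m scale.
wet = downscale_cell(0.25, floodability)
```

The wet mask conserves the coarse-cell inundated area exactly while placing the water in the topographically most plausible fine pixels.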
NASA Astrophysics Data System (ADS)
van Osnabrugge, Bart; Weerts, Albrecht; Uijlenhoet, Remko
2017-04-01
Gridded areal precipitation, one of the most important hydrometeorological input variables for initial state estimation in operational hydrological forecasting, is available for the River Rhine basin in the form of raster datasets (e.g. HYRAS and EOBS). These datasets are compiled off-line at a daily time step using station data with the highest possible spatial density. However, such a product is not available operationally and at an hourly discretisation. We therefore constructed an hourly gridded precipitation dataset at 1.44 km² resolution for the Rhine basin, for the period from 1998 to present, with a REGNIE-like interpolation procedure (Weerts et al., 2008) applied to both a low- and a high-density rain gauge network. The datasets were validated against daily HYRAS (Rauthe et al., 2013) and EOBS (Haylock et al., 2008) data. The main goal of the operational procedure is to emulate the HYRAS dataset as closely as possible, as the daily HYRAS dataset is used in the off-line calibration of the hydrological model. Our main findings are that, even with low station density, the spatial patterns found in the HYRAS dataset are well reproduced. With low station density (years 1999-2006) our dataset underestimates precipitation compared to HYRAS and EOBS, notably during winter. However, interpolation based on the same set of stations overestimates precipitation compared to EOBS for the years 2006-2014. This discrepancy disappears when switching to the high station density. We also analyse the robustness of the hourly precipitation fields by comparing against stations not used during interpolation. Specific issues encountered with the data when creating the gridded precipitation fields are highlighted. Finally, the datasets are used to drive hourly and daily gridded WFLOW_HBV models of the Rhine at the same spatial resolution.
Haylock, M. R., N. Hofstra, A. M. G. Klein Tank, E. J. Klok, P. D. Jones and M. New, 2008: A European daily high-resolution gridded dataset of surface temperature and precipitation. J. Geophys. Res. (Atmospheres), 113, D20119, doi:10.1029/2008JD10201. Rauthe, M., Steiner, H., Riediger, U., Mazurkiewicz, A., Gratzki, A., 2013: A Central European precipitation climatology - Part 1: Generation and validation of a high-resolution gridded daily data set (HYRAS). Meteorologische Zeitschrift, 22(3), 235-256. Weerts, A. H., D. Meißner, and S. Rademacher, 2008: Input data rainfall-runoff model operational system FEWS-NL & FEWS-DE. Technical report, Deltares.
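The gauge-to-grid step in such a procedure can be sketched with plain inverse-distance weighting. This is a generic stand-in for the REGNIE-like background/anomaly interpolation, not the operational scheme itself; the station coordinates and hourly totals below are invented:

```python
import numpy as np

def idw(stations_xy, values, grid_xy, power=2.0):
    """Inverse-distance-weighted interpolation of station values onto grid
    points (a generic stand-in for a REGNIE-like gauge interpolation)."""
    # Pairwise distances: (n_grid, n_stations).
    d = np.linalg.norm(grid_xy[:, None, :] - stations_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)          # avoid division by zero at station points
    w = d ** -power
    return (w @ values) / w.sum(axis=1)

stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
precip = np.array([2.0, 4.0, 6.0, 8.0])   # hourly gauge totals, mm
grid = np.array([[0.5, 0.5], [0.0, 0.0]])  # a cell centre and a station point
est = idw(stations, precip, grid)
```

At the cell centre, equidistant from all gauges, the estimate is their mean; at a station location, the estimate collapses to the station value, which is the behaviour the leave-out validation against withheld stations probes.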
The impact of the resolution of meteorological datasets on catchment-scale drought studies
NASA Astrophysics Data System (ADS)
Hellwig, Jost; Stahl, Kerstin
2017-04-01
Gridded meteorological datasets provide the basis for studying drought at a range of scales, including catchment-scale drought studies in hydrology. They are readily available for studying past weather conditions and often serve real-time monitoring as well. As these datasets differ in spatial/temporal coverage and spatial/temporal resolution, most studies face a tradeoff between these features. Our investigation examines whether biases occur when studying drought at the catchment scale with low-resolution input data. To that end, a comparison among the datasets HYRAS (covering Central Europe, 1x1 km grid, daily data, 1951-2005), E-OBS (Europe, 0.25° grid, daily data, 1950-2015) and GPCC (global, 0.5° grid, monthly data, 1901-2013) was carried out. Generally, biases in precipitation increase with decreasing resolution. The most important variations are found during summer. In the low mountain ranges of Central Europe, the coarser-resolution datasets (E-OBS, GPCC) overestimate dry days and underestimate total precipitation, since they are not able to describe high spatial variability. However, relative measures like the correlation coefficient reveal good consistency of dry and wet periods, both for absolute precipitation values and for standardized indices like the Standardized Precipitation Index (SPI) or the Standardized Precipitation Evapotranspiration Index (SPEI). In particular, the most severe droughts derived from the different datasets match very well. These results indicate that absolute values from coarse-resolution datasets applied at the catchment scale might be critical to use for hydrological drought assessment, whereas relative measures for determining periods of drought are more trustworthy. Therefore, drought studies that downscale meteorological data should carefully consider their data needs and focus on relative measures for dry periods where these suffice for the task.
Collection and Analysis of Crowd Data with Aerial, Rooftop, and Ground Views
2014-11-10
[Fragmentary abstract] ...collected these datasets using different aircraft. The Erista 8 HL OctaCopter is a heavy-lift aerial platform capable of carrying high-resolution cinema ... is another high-resolution, cinema-grade camera capable of capturing video at 4K resolution at 30 frames per ... Blackmagic Production Camera ... crowd counting using 4K cameras and high-resolution, cinema-grade digital video.
NASA Astrophysics Data System (ADS)
Van Gordon, M.; Van Gordon, S.; Min, A.; Sullivan, J.; Weiner, Z.; Tappan, G. G.
2017-12-01
Using support vector machine (SVM) learning and high-accuracy hand-classified maps, we have developed a publicly available land cover classification tool for the West African Sahel. Our classifier produces high-resolution, regionally calibrated land cover maps for the Sahel, representing a significant contribution to the data available for this region. Global land cover products are unreliable for the Sahel, and accurate land cover data for the region are sparse. To address this gap, the U.S. Geological Survey and the Regional Center for Agriculture, Hydrology and Meteorology (AGRHYMET) in Niger produced high-quality land cover maps for the region via hand-classification of Landsat images. This method produces highly accurate maps, but the time and labor required constrain the spatial and temporal resolution of the data products. By using these hand-classified maps alongside SVM techniques, we successfully increase the resolution of the land cover maps by 1-2 orders of magnitude, from 2 km spatial and decadal temporal resolution to 30 m and annual resolution. These high-resolution, regionally calibrated land cover datasets, along with the classifier we developed to produce them, lay the foundation for major advances in studies of land surface processes in the region. The datasets will provide more accurate inputs for food security modeling, hydrologic modeling, analyses of land cover change, and climate change adaptation efforts. The land cover classification tool will be publicly available for use in creating additional West African land cover datasets from future remote sensing data and can be adapted for use in other parts of the world.
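The SVM training step can be sketched with a minimal linear SVM fitted by sub-gradient descent on the regularized hinge loss. The two-band pixel features and class means below are invented toy data; the study's actual classifier, feature set, and hand-classified training maps are not reproduced here:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=300):
    """Toy linear SVM: sub-gradient descent on hinge loss + L2 penalty.
    Labels y must be in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        margin = y * (X @ w + b)
        mask = margin < 1.0          # samples violating the margin
        if mask.any():
            gw = lam * w - (y[mask, None] * X[mask]).mean(axis=0)
            gb = -float(y[mask].mean())
        else:
            gw, gb = lam * w, 0.0
        w -= lr * gw
        b -= lr * gb
    return w, b

rng = np.random.default_rng(4)
# Toy pixel features (red, NIR reflectance) for two land cover classes:
# bare soil (higher red, lower NIR) vs. vegetation (lower red, higher NIR).
bare = rng.normal([0.35, 0.15], 0.03, (100, 2))
veg = rng.normal([0.15, 0.35], 0.03, (100, 2))
X = np.vstack([bare, veg])
y = np.r_[-np.ones(100), np.ones(100)]

w, b = train_linear_svm(X, y)
accuracy = float((np.sign(X @ w + b) == y).mean())
```

Once trained on labeled pixels from the hand-classified maps, such a classifier can be applied to every pixel of a new scene, which is what enables the jump to 30 m annual maps.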
NASA Astrophysics Data System (ADS)
Cruden, A. R.; Vollgger, S.
2016-12-01
The emerging capability of UAV photogrammetry combines a simple and cost-effective method of acquiring digital aerial images with advanced computer vision algorithms that compute spatial datasets from a sequence of overlapping digital photographs taken from various viewpoints. Depending on flight altitude and camera setup, sub-centimeter spatial resolution orthophotographs and textured dense point clouds can be achieved. By digitally mapping such high-resolution spatial datasets, orientation data can be collected for detailed structural analysis in a fraction of the time, and with higher fidelity, compared to traditional mapping techniques. Here we describe a photogrammetric workflow applied to a structural study of folds and fractures within alternating layers of sandstone and mudstone at a coastal outcrop in SE Australia. We surveyed this location using a downward-looking digital camera mounted on a commercially available multi-rotor UAV that autonomously followed waypoints at a set altitude and speed to ensure sufficient image overlap, minimal motion blur and an appropriate resolution. The use of surveyed ground control points allowed us to produce a geo-referenced 3D point cloud and an orthophotograph from hundreds of digital images at a spatial resolution < 10 mm per pixel, with cm-scale location accuracy. Orientation data of brittle and ductile structures were semi-automatically extracted from these high-resolution datasets using open-source software. This resulted in an extensive and statistically relevant orientation dataset that was used to 1) interpret the progressive development of folds and faults in the region, and 2) generate a 3D structural model that underlines the complex internal structure of the outcrop and quantifies spatial variations in fold geometries. Overall, our work highlights how UAV photogrammetry can contribute new insights to structural analysis.
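Semi-automatic extraction of orientation data from a point cloud typically reduces to fitting planes to point patches and converting each plane's normal to dip and dip direction. A minimal sketch, with a plane fitted by SVD; the synthetic "bedding plane" geometry and noise level are assumptions, and the software used in the study is not reproduced:

```python
import numpy as np

def plane_orientation(points):
    """Fit a plane to a 3D point patch (x=east, y=north, z=up) by SVD and
    return (dip, dip_direction) in degrees."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    n = vt[-1]                       # normal = direction of least variance
    if n[2] < 0:
        n = -n                       # make the normal point upward
    dip = np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0)))
    # For an upward normal, its horizontal component points downdip;
    # azimuth measured clockwise from north.
    dip_dir = np.degrees(np.arctan2(n[0], n[1])) % 360.0
    return float(dip), float(dip_dir)

# Synthetic bedding plane dipping 30 degrees toward the east (azimuth 090).
rng = np.random.default_rng(5)
xy = rng.uniform(-1.0, 1.0, (300, 2))
z = -np.tan(np.radians(30.0)) * xy[:, 0] + rng.normal(0.0, 0.005, 300)
dip, dip_dir = plane_orientation(np.column_stack([xy, z]))
```

Applying this to many small patches of the dense cloud yields the kind of statistically large orientation dataset described above.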
A reanalysis dataset of the South China Sea.
Zeng, Xuezhi; Peng, Shiqiu; Li, Zhijin; Qi, Yiquan; Chen, Rongyu
2014-01-01
Ocean reanalysis provides a temporally continuous and spatially gridded four-dimensional estimate of the ocean state for a better understanding of the ocean dynamics and its spatial/temporal variability. Here we present a 19-year (1992-2010) high-resolution ocean reanalysis dataset of the upper ocean in the South China Sea (SCS) produced from an ocean data assimilation system. A wide variety of observations, including in-situ temperature/salinity profiles, ship-measured and satellite-derived sea surface temperatures, and sea surface height anomalies from satellite altimetry, are assimilated into the outputs of an ocean general circulation model using a multi-scale incremental three-dimensional variational data assimilation scheme, yielding a daily high-resolution reanalysis dataset of the SCS. Comparisons between the reanalysis and independent observations support the reliability of the dataset. The presented dataset provides the research community of the SCS an important data source for studying the thermodynamic processes of the ocean circulation and meso-scale features in the SCS, including their spatial and temporal variability.
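The variational update at the heart of such an assimilation system can be sketched in a single analysis step that blends a model background with observations, weighted by their error covariances. The toy 3D-Var below is a schematic of that idea, not the multi-scale incremental scheme used for this reanalysis; the state size, observation values, and covariances are all assumptions:

```python
import numpy as np

# Toy 3D-Var analysis: xa = xb + K (y - H xb), with
# K = B H^T (H B H^T + R)^(-1), the optimal gain for the assumed covariances.
n = 5
xb = np.full(n, 20.0)                      # background SST along a section (degC)

H = np.zeros((2, n))                       # observation operator:
H[0, 1] = 1.0                              # ... in-situ profile at grid point 1
H[1, 3] = 1.0                              # ... and at grid point 3
y = np.array([21.0, 19.0])                 # the two observations

# Background error covariance with spatial correlation (length scale 2 cells),
# so information spreads from observed to unobserved points.
idx = np.arange(n)
B = 1.0 * np.exp(-np.abs(np.subtract.outer(idx, idx)) / 2.0)
R = 0.25 * np.eye(2)                       # observation error covariance

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
xa = xb + K @ (y - H @ xb)                 # analysis state
```

The analysis moves partway toward each observation (by an amount set by B versus R) and also adjusts neighbouring unobserved points through the background correlations, which is how a reanalysis produces a spatially coherent gridded estimate.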
NASA Astrophysics Data System (ADS)
Nesbit, P. R.; Hugenholtz, C.; Durkin, P.; Hubbard, S. M.; Kucharczyk, M.; Barchyn, T.
2016-12-01
Remote sensing and digital mapping have begun to revolutionize geologic mapping in recent years, owing to their potential to provide high-resolution 3D models of outcrops that assist interpretation, visualization, and accurate measurement of inaccessible areas. However, for stratigraphic mapping in complex terrain, it is difficult to acquire sufficiently detailed information over wide spatial coverage with conventional techniques. We demonstrate the potential of a UAV and Structure from Motion (SfM) photogrammetric approach for improving 3D stratigraphic mapping in complex badland topography. Our case study is performed in Dinosaur Provincial Park (Alberta, Canada), mapping Late Cretaceous fluvial meander belt deposits of the Dinosaur Park Formation amidst a succession of steeply sloping hills and abundant drainages, which create a challenge for stratigraphic mapping. The UAV-SfM dataset (2 cm spatial resolution) is compared directly with a combined satellite and aerial LiDAR dataset (30 cm spatial resolution) to reveal the advantages and limitations of each before presenting a workflow that uses the dense point cloud from the UAV-SfM dataset for analysis. The UAV-SfM dense point cloud minimizes distortion, preserves 3D structure, and records an RGB attribute, adding potential value for future studies. The proposed UAV-SfM workflow allows high-spatial-resolution remote sensing of stratigraphy in complex topographic environments. This extended capability can add value to field observations and has the potential to be integrated with subsurface petroleum models.
Individual Brain Charting, a high-resolution fMRI dataset for cognitive mapping.
Pinho, Ana Luísa; Amadon, Alexis; Ruest, Torsten; Fabre, Murielle; Dohmatob, Elvis; Denghien, Isabelle; Ginisty, Chantal; Becuwe-Desmidt, Séverine; Roger, Séverine; Laurier, Laurence; Joly-Testault, Véronique; Médiouni-Cloarec, Gaëlle; Doublé, Christine; Martins, Bernadette; Pinel, Philippe; Eger, Evelyn; Varoquaux, Gaël; Pallier, Christophe; Dehaene, Stanislas; Hertz-Pannier, Lucie; Thirion, Bertrand
2018-06-12
Functional Magnetic Resonance Imaging (fMRI) has furthered brain mapping on perceptual, motor, and higher-level cognitive functions. However, to date, no data collection has systematically addressed the functional mapping of cognitive mechanisms at a fine spatial scale. The Individual Brain Charting (IBC) project is a high-resolution multi-task fMRI dataset intended to provide the objective basis toward a comprehensive functional atlas of the human brain. The data refer to a cohort of 12 participants performing many different tasks. The large amount of task-fMRI data on the same subjects yields a precise mapping of the underlying functions, free from both inter-subject and inter-site variability. The present article gives a detailed description of the first release of the IBC dataset. It comprises a dozen tasks, addressing both low- and high-level cognitive functions. This openly available dataset is thus intended to become a reference for cognitive brain mapping.
NASA Astrophysics Data System (ADS)
Quintana-Seguí, Pere; Turco, Marco; Herrera, Sixto; Miguez-Macho, Gonzalo
2017-04-01
Offline land surface model (LSM) simulations are useful for studying the continental hydrological cycle. Because of the nonlinearities in the models, the results are very sensitive to the quality of the meteorological forcing; thus, high-quality gridded datasets of screen-level meteorological variables are needed. Precipitation datasets are particularly difficult to produce due to the inherent spatial and temporal heterogeneity of that variable. They do, however, have a large impact on the simulations, and it is thus necessary to evaluate their quality carefully. This paper assesses the quality of two high-resolution precipitation datasets for Spain at the daily time scale: the new SAFRAN-based dataset and Spain02. SAFRAN is a meteorological analysis system that was designed to force LSMs and has recently been extended to the entirety of Spain for a long period of time (1979/1980-2013/2014). Spain02 is a daily precipitation dataset for Spain created mainly to validate regional climate models. In addition, ERA-Interim is included in the comparison to show the differences between local high-resolution and global low-resolution products. The study compares the different precipitation analyses with rain gauge data and assesses their temporal and spatial similarities to the observations. The validation of SAFRAN with independent data shows that it is a robust product. SAFRAN and Spain02 have very similar scores, although the latter slightly surpasses the former. The scores are robust with altitude and throughout the year, except perhaps in summer, when diminished skill is observed. As expected, SAFRAN and Spain02 perform better than ERA-Interim, which has difficulty capturing the effects of the relief on precipitation due to its low resolution. However, ERA-Interim reproduces spells remarkably well, in contrast to the low skill shown by the high-resolution products.
The high-resolution gridded products overestimate the number of precipitation days, which is a problem that affects SAFRAN more than Spain02 and is likely caused by the interpolation method. Both SAFRAN and Spain02 underestimate high precipitation events, but SAFRAN does so more than Spain02. The overestimation of low precipitation events and the underestimation of intense episodes will probably have hydrological consequences once the data are used to force a land surface or hydrological model.
High-Resolution Data for a Low-Resolution World
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brady, Brendan Williams
2016-05-10
In the past 15 years, the upper section of Cañon de Valle has been severely altered by wildfires and subsequent runoff events. Loss of root structures on high-angle slopes results in debris flow and sediment accumulation in the narrow canyon bottom. The original intent of the study described here was to better understand the changes in watershed soil elevations over the course of several post-fire years. An elevation dataset from 5 years post-Cerro Grande fire was compared to high-resolution LiDAR data from 14 years post-Cerro Grande fire (also 3 years post-Las Conchas fire). The following analysis was motivated by a problematic comparison of these datasets of unlike resolution, and therefore focuses on what the data reveal about themselves. The objective of this study is to highlight the effects vegetation can have on remote sensing data intended to measure ground surface elevation.
Robinson, Nathaniel; Allred, Brady; Jones, Matthew; ...
2017-08-21
Satellite-derived vegetation indices (VIs) are broadly used in ecological research, ecosystem modeling, and land surface monitoring. The Normalized Difference Vegetation Index (NDVI), perhaps the most utilized VI, has countless applications across ecology, forestry, agriculture, wildlife, biodiversity, and other disciplines. Calculating satellite-derived NDVI is not always straightforward, however, as satellite remote sensing datasets are inherently noisy due to cloud and atmospheric contamination, data processing failures, and instrument malfunction. Readily available NDVI products that account for these complexities are generally at coarse resolution; high-resolution NDVI datasets are not conveniently accessible, and developing them often presents numerous technical and methodological challenges. Here, we address this deficiency by producing a Landsat-derived, high-resolution (30 m), long-term (30+ years) NDVI dataset for the conterminous United States. We use Google Earth Engine, a planetary-scale cloud-based geospatial analysis platform, for processing the Landsat data and distributing the final dataset. We use a climatology-driven approach to fill missing data and validate the dataset with established remote sensing products at multiple scales. We provide access to the composites through a simple web application, allowing users to customize key parameters appropriate for their application, question, and region of interest.
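The two core operations, computing NDVI and climatology-driven gap filling, can be sketched in a few lines. The monthly series, seasonal cycle, and gap positions below are synthetic stand-ins; the actual product operates on 30 m Landsat pixels in Google Earth Engine:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def climatology_fill(series, months):
    """Fill masked (NaN, e.g. cloud-contaminated) values with the long-term
    mean of the same calendar month -- climatology-driven filling in spirit."""
    filled = series.copy()
    for m in np.unique(months):
        sel = months == m
        clim = np.nanmean(series[sel])          # monthly climatology
        filled[sel & np.isnan(series)] = clim   # fill only the gaps
    return filled

# Three years of synthetic monthly NDVI with a seasonal cycle.
months = np.tile(np.arange(1, 13), 3)
series = 0.4 + 0.2 * np.sin(2.0 * np.pi * (months - 4) / 12.0)
series[[5, 14, 30]] = np.nan                    # a few cloud-obscured months
filled = climatology_fill(series, months)
```

Filling from the same-month climatology preserves the seasonal cycle, which is why such gaps can be bridged without smearing phenology across seasons.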
NASA Astrophysics Data System (ADS)
Brown, I.; Wennbom, M.
2013-12-01
Climate change, population growth and changes in traditional lifestyles have led to instabilities in traditional demarcations between neighboring ethnic and religious groups in the Sahel region. This has resulted in a number of conflicts as groups resort to arms to settle disputes. Such disputes often centre on, or are justified by, competition for resources. The conflict in Darfur has been controversially explained by resource scarcity resulting from climate change. Here we analyse established methods of using satellite imagery to assess vegetation health in Darfur. Multi-decadal time series of observations are available using low spatial resolution visible-near infrared imagery. Typically, normalized difference vegetation index (NDVI) analyses are produced to describe changes in vegetation 'greenness' or 'health'. Such approaches have been widely used to evaluate the long term development of vegetation in relation to climate variations across a wide range of environments from the Arctic to the Sahel. These datasets typically measure peak NDVI observed over a given interval and may introduce bias. It is furthermore unclear how the spatial organization of sparse vegetation may affect low resolution NDVI products. We develop and assess alternative measures of vegetation including descriptors of the growing season, wetness and resource availability. Expanding the range of parameters used in the analysis reduces our dependence on peak NDVI. Furthermore, these descriptors provide a better characterization of the growing season than the single NDVI measure. Using multi-sensor data we combine high temporal/moderate spatial resolution data with low temporal/high spatial resolution data to improve the spatial representativity of the observations and to provide improved spatial analysis of vegetation patterns. The approach places the high resolution observations in the NDVI context space using a longer time series of lower resolution imagery.
The vegetation descriptors derived are evaluated using independent high spatial resolution datasets that reveal the pattern and health of vegetation at metre scales. We also use climate variables to support the interpretation of these data. We conclude that the spatio-temporal patterns in Darfur vegetation and climate datasets suggest that labelling the conflict a climate-change conflict is inaccurate and premature.
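A minimal sketch of growing-season descriptors of the kind discussed above (start, end, length, and peak of the season from an NDVI time series); the amplitude-threshold rule and all names are illustrative assumptions, not the authors' definitions:

```python
def season_descriptors(ndvi, frac=0.5):
    # Summarize a single growing season from an NDVI time series:
    # start/end are the first/last time steps at or above a threshold
    # set at `frac` of the seasonal amplitude above the minimum.
    lo, hi = min(ndvi), max(ndvi)
    thresh = lo + frac * (hi - lo)
    above = [i for i, v in enumerate(ndvi) if v >= thresh]
    return {"start": above[0], "end": above[-1],
            "length": above[-1] - above[0] + 1, "peak": hi}
```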
A New High Resolution Climate Dataset for Climate Change Impacts Assessments in New England
NASA Astrophysics Data System (ADS)
Komurcu, M.; Huber, M.
2016-12-01
Assessing regional impacts of climate change (such as changes in extreme events, land surface hydrology, water resources, energy, ecosystems and economy) requires much higher resolution climate variables than those available from global model projections. While it is possible to run global models at higher resolution, the high computational cost associated with such simulations prevents their use in this manner. To alleviate this problem, dynamical downscaling offers a method to deliver higher resolution climate variables. As part of an NSF EPSCoR funded interdisciplinary effort to assess climate change impacts on New Hampshire ecosystems, hydrology and economy (the New Hampshire Ecosystems and Society project), we create a unique high-resolution climate dataset for New England. We dynamically downscale global model projections under a high impact emissions scenario using the Weather Research and Forecasting (WRF) model with three nested grids of 27, 9 and 3 km horizontal resolution, with the highest resolution innermost grid focused over New England. We prefer dynamical downscaling over other methods such as statistical downscaling because it employs physical equations to progressively simulate climate variables as atmospheric processes interact with surface processes, emissions, radiation, clouds, precipitation and other model components, rather than imposing fixed relationships between variables. In addition to simulating mean changes in regional climate, dynamical downscaling also allows for the simulation of climate extremes that significantly alter climate change impacts. We simulate three time slices: 2006-2015, 2040-2060 and 2080-2100.
This new high-resolution climate dataset, with more than 200 variables saved at hourly intervals for the highest resolution domain and at six-hourly intervals for the outer two domains, along with the model input and restart files used in our WRF simulations, will be made publicly available to the broader scientific community to support in-depth climate change impacts assessments for New England. We present results focusing on future changes in New England extreme events.
Enhancing GIS Capabilities for High Resolution Earth Science Grids
NASA Astrophysics Data System (ADS)
Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.
2017-12-01
Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. 
In addition, we provide a general overview of OCGIS's parallel subsetting capabilities, including challenges in the design and implementation of a scientific data subsetter.
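The chunked-processing idea can be illustrated with a toy example: coarsening a fine grid by equal-area block averaging (a highly simplified stand-in for conservative regridding), applied one row chunk at a time so that peak memory stays bounded. All names are ours, not ESMF or OCGIS API calls:

```python
def block_mean(grid, f):
    # Coarsen a fine grid by averaging over f x f blocks; with
    # equal-area cells this mimics conservative (mass-preserving)
    # regridding in spirit, though real grids need true cell areas.
    return [[sum(grid[i * f + a][j * f + b]
                 for a in range(f) for b in range(f)) / (f * f)
             for j in range(len(grid[0]) // f)]
            for i in range(len(grid) // f)]

def chunked_block_mean(grid, f, rows_per_chunk):
    # Process row chunks independently, as in spatial-subset chunking;
    # rows_per_chunk must be a multiple of f so chunks align to blocks.
    out = []
    for i in range(0, len(grid), rows_per_chunk):
        out.extend(block_mean(grid[i:i + rows_per_chunk], f))
    return out
```

Because each chunk's output depends only on that chunk's rows, the chunked result is identical to the one-shot result, which is the property that lets subsetting remove memory ceilings.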
Gangodagamage, Chandana; Wullschleger, Stan
2014-07-03
The dataset represents microtopographic characterization of the ice-wedge polygon landscape in Barrow, Alaska. Three microtopographic features are delineated using a 0.25 m high resolution digital elevation dataset derived from LiDAR. The troughs, rims, and centers are the three categories in this classification scheme. The polygon troughs are the surface expression of the ice wedges and lie at lower elevations than the polygon interior. The elevated shoulders of the polygon interior immediately adjacent to the troughs are the polygon rims for low-center polygons. In the case of high-center polygons, these features are the topographic highs. In this classification scheme, both topographic highs and rims are treated as polygon rims. The next version of the dataset will include a more refined classification scheme with separate classes for rims and topographic highs. The interior part of the polygon adjacent to the rims forms the polygon centers.
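The trough/rim/center scheme amounts to thresholding elevation relative to the local polygon surface; a minimal sketch, where the cutoff values are purely illustrative and not the dataset's actual rules:

```python
def classify_microtopography(rel_elev, trough_cut=-0.1, rim_cut=0.1):
    # Classify cells by elevation relative to the local polygon mean.
    # Cells well below it are troughs, well above it are rims (or
    # topographic highs), and the remainder are centers.
    labels = []
    for z in rel_elev:
        if z < trough_cut:
            labels.append("trough")
        elif z > rim_cut:
            labels.append("rim")
        else:
            labels.append("center")
    return labels
```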
NASA Astrophysics Data System (ADS)
Madricardo, Fantina; Foglini, Federica; Kruss, Aleksandra; Ferrarin, Christian; Pizzeghello, Nicola Marco; Murri, Chiara; Rossi, Monica; Bajo, Marco; Bellafiore, Debora; Campiani, Elisabetta; Fogarin, Stefano; Grande, Valentina; Janowski, Lukasz; Keppel, Erica; Leidi, Elisa; Lorenzetti, Giuliano; Maicu, Francesco; Maselli, Vittorio; Mercorella, Alessandra; Montereale Gavazzi, Giacomo; Minuzzo, Tiziano; Pellegrini, Claudio; Petrizzo, Antonio; Prampolini, Mariacristina; Remia, Alessandro; Rizzetto, Federica; Rovere, Marzia; Sarretta, Alessandro; Sigovini, Marco; Sinapi, Luigi; Umgiesser, Georg; Trincardi, Fabio
2017-09-01
Tidal channels are crucial for the functioning of wetlands, though their morphological properties, which are relevant for seafloor habitats and flow, have been understudied so far. Here, we release a dataset composed of Digital Terrain Models (DTMs) extracted from a total of 2,500 linear kilometres of high-resolution multibeam echosounder (MBES) data collected in 2013 covering the entire network of tidal channels and inlets of the Venice Lagoon, Italy. The dataset comprises also the backscatter (BS) data, which reflect the acoustic properties of the seafloor, and the tidal current fields simulated by means of a high-resolution three-dimensional unstructured hydrodynamic model. The DTMs and the current fields help define how morphological and benthic properties of tidal channels are affected by the action of currents. These data are of potential broad interest not only to geomorphologists, oceanographers and ecologists studying the morphology, hydrodynamics, sediment transport and benthic habitats of tidal environments, but also to coastal engineers and stakeholders for cost-effective monitoring and sustainable management of this peculiar shallow coastal system.
Bayesian Peptide Peak Detection for High Resolution TOF Mass Spectrometry.
Zhang, Jianqiu; Zhou, Xiaobo; Wang, Honghui; Suffredini, Anthony; Zhang, Lin; Huang, Yufei; Wong, Stephen
2010-11-01
In this paper, we address the issue of peptide ion peak detection for high resolution time-of-flight (TOF) mass spectrometry (MS) data. A novel Bayesian peptide ion peak detection method is proposed for TOF data with resolution of 10 000-15 000 full width at half-maximum (FWHM). MS spectra exhibit distinct characteristics at this resolution, which are captured in a novel parametric model. Based on the proposed parametric model, a Bayesian peak detection algorithm based on Markov chain Monte Carlo (MCMC) sampling is developed. The proposed algorithm is tested on both simulated and real datasets. The results show a significant improvement in detection performance over a commonly employed method. The results also agree with experts' visual inspection. Moreover, better detection consistency is achieved across MS datasets from patients with the same pathological condition.
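The MCMC approach can be sketched, in highly simplified form, as a random-walk Metropolis sampler for the centroid of a single idealized Gaussian peak; the parametric model in the paper is richer, and every parameter value below is illustrative:

```python
import math
import random

def peak_model(x, mu, amp=1.0, sigma=0.2):
    # Idealized single-peak shape; real TOF peaks at 10k-15k FWHM
    # resolution are much narrower and not exactly Gaussian.
    return amp * math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def log_likelihood(mu, xs, ys, noise=0.1):
    # Gaussian-noise likelihood of the observed spectrum given mu.
    return -0.5 * sum(((y - peak_model(x, mu)) / noise) ** 2
                      for x, y in zip(xs, ys))

def metropolis_peak(xs, ys, mu0, steps=2000, prop=0.05, seed=1):
    # Random-walk Metropolis over the unknown peak centroid mu.
    random.seed(seed)
    mu, ll = mu0, log_likelihood(mu0, xs, ys)
    samples = []
    for _ in range(steps):
        cand = mu + random.gauss(0.0, prop)
        cll = log_likelihood(cand, xs, ys)
        if cll - ll > math.log(random.random()):  # accept/reject
            mu, ll = cand, cll
        samples.append(mu)
    return samples
```

The posterior mean of the sampled centroids then serves as the peak-location estimate.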
NASA Astrophysics Data System (ADS)
Martin-Hernandez, Natalia; Vicente-Serrano, Sergio; Azorin-Molina, Cesar; Begueria-Portugues, Santiago; Reig-Gracia, Fergus; Zabalza-Martínez, Javier
2017-04-01
We have analysed trends in the Normalized Difference Vegetation Index (NDVI) in the Iberian Peninsula and the Balearic Islands over the period 1981-2015 using a new high resolution dataset derived from the entire archive of available NOAA-AVHRR images (IBERIAN NDVI dataset). After complete processing, including geocoding, calibration, cloud removal, topographic correction and temporal filtering, we obtained bi-weekly time series. To assess the accuracy of the new IBERIAN NDVI time series, we compared their temporal variability and trends with those reported by the GIMMS 3g and MODIS (MOD13A3) NDVI datasets. In general, the IBERIAN NDVI agreed well with these two products while offering higher spatial resolution than the GIMMS dataset and covering two more decades than the MODIS dataset. Using the IBERIAN NDVI dataset, we analysed NDVI trends by means of the non-parametric Mann-Kendall test and the Theil-Sen slope estimator. On average, vegetation in the study area shows an increase over the last decades. However, there are local spatial differences: the main increase has been recorded in humid regions of the north of the Iberian Peninsula. The statistical techniques allow detection of abrupt and gradual changes in different land cover types during the analysed period. These changes are related to human activity through land transformation (from dry to irrigated land), land abandonment and forest recovery.
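The two trend statistics named above are simple to state; a minimal pure-Python sketch (without the variance-based significance test that normally accompanies Mann-Kendall):

```python
def mann_kendall_s(x):
    # Mann-Kendall S statistic: sum over all pairs (i < j) of
    # sign(x[j] - x[i]). S > 0 indicates an increasing trend.
    n = len(x)
    return sum((x[j] > x[i]) - (x[j] < x[i])
               for i in range(n) for j in range(i + 1, n))

def theil_sen_slope(x):
    # Theil-Sen estimator: median of all pairwise slopes, a robust
    # measure of trend magnitude per time step.
    slopes = sorted((x[j] - x[i]) / (j - i)
                    for i in range(len(x)) for j in range(i + 1, len(x)))
    m = len(slopes)
    return slopes[m // 2] if m % 2 else 0.5 * (slopes[m // 2 - 1] + slopes[m // 2])
```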
NASA Astrophysics Data System (ADS)
Liebel, L.; Körner, M.
2016-06-01
In optical remote sensing, spatial resolution of images is crucial for numerous applications. Space-borne systems are most likely to be affected by a lack of spatial resolution, due to their natural disadvantage of a large distance between the sensor and the sensed object. Thus, methods for single-image super resolution are desirable to exceed the limits of the sensor. Apart from assisting visual inspection of datasets, post-processing operations (e.g., segmentation or feature extraction) can benefit from detailed and distinguishable structures. In this paper, we show that recently introduced state-of-the-art approaches for single-image super resolution of conventional photographs, making use of deep learning techniques such as convolutional neural networks (CNNs), can successfully be applied to remote sensing data. With a huge amount of training data available, end-to-end learning is reasonably easy to apply and can achieve results unattainable using conventional handcrafted algorithms. We trained our CNN on a specifically designed, domain-specific dataset in order to take into account the special characteristics of multispectral remote sensing data. This dataset consists of publicly available SENTINEL-2 images featuring 13 spectral bands, a ground resolution of up to 10 m, and a high radiometric resolution, thus satisfying our requirements in terms of quality and quantity. In experiments, we obtained results superior to competing approaches trained on generic image sets, which failed to reasonably scale satellite images with a high radiometric resolution, as well as to conventional interpolation methods.
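For context, a super-resolution result is typically judged against a naive enlargement baseline using a reconstruction metric; a minimal sketch with nearest-neighbour upsampling and PSNR (our choice of baseline and metric for illustration, not necessarily the paper's):

```python
import math

def upsample_nearest(lr, scale):
    # Naive nearest-neighbour enlargement -- the kind of baseline a
    # super-resolution CNN is expected to beat.
    return [[lr[i // scale][j // scale]
             for j in range(len(lr[0]) * scale)]
            for i in range(len(lr) * scale)]

def psnr(ref, est, peak=1.0):
    # Peak signal-to-noise ratio in dB; higher means closer to ref.
    n = len(ref) * len(ref[0])
    mse = sum((r - e) ** 2 for rr, ee in zip(ref, est)
              for r, e in zip(rr, ee)) / n
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)
```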
Development and application of GIS-based PRISM integration through a plugin approach
NASA Astrophysics Data System (ADS)
Lee, Woo-Seop; Chun, Jong Ahn; Kang, Kwangmin
2014-05-01
A PRISM (Parameter-elevation Regressions on Independent Slopes Model) QGIS plugin was developed on the Quantum GIS platform in this study. The plugin provides user-friendly graphical user interfaces (GUIs) so that users can generate gridded meteorological data at high resolution (1 km × 1 km) with ease. The software is designed to run on a personal computer and requires neither internet access nor a sophisticated computer system. The proposed PRISM QGIS plugin is a hybrid statistical-geographic model system that uses coarse resolution datasets (APHRODITE datasets in this study) together with digital elevation data to generate fine-resolution gridded precipitation. To validate the performance of the software, the Prek Thnot River Basin in Kandal, Cambodia was selected for application. Error measures, namely RMSE (Root Mean Square Error) and MAPE (Mean Absolute Percentage Error), were used to evaluate the performance of the developed plugin; the resulting RMSE and MAPE were 2.76 mm and 4.2%, respectively. This study suggests that the plugin can be used to generate high resolution precipitation datasets for hydrological and climatological studies in watersheds where observed weather datasets are limited.
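The two error measures reported above are standard; a minimal sketch, assuming paired observed/simulated precipitation series:

```python
def rmse(obs, sim):
    # Root Mean Square Error, in the units of the data (mm here).
    return (sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs)) ** 0.5

def mape(obs, sim):
    # Mean Absolute Percentage Error, in percent; assumes obs != 0.
    return 100.0 * sum(abs((o - s) / o) for o, s in zip(obs, sim)) / len(obs)
```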
NASA Astrophysics Data System (ADS)
Mosier, T. M.; Hill, D. F.; Sharp, K. V.
2013-12-01
High spatial resolution time-series data are critical for many hydrological and earth science studies. Multiple groups have developed historical and forecast datasets of high-resolution monthly time-series for regions of the world such as the United States (e.g. PRISM for hindcast data and MACA for long-term forecasts); however, analogous datasets have not been available for most data scarce regions. The current work fills this data need by producing and freely distributing hindcast and forecast time-series datasets of monthly precipitation and mean temperature for all global land surfaces, gridded at a 30 arc-second resolution. The hindcast data are constructed through a Delta downscaling method, using as inputs 0.5 degree monthly time-series and 30 arc-second climatology global weather datasets developed by Willmott & Matsuura and WorldClim, respectively. The forecast data are formulated using a similar downscaling method, but with an additional step to remove bias from the climate variable's probability distribution over each region of interest. The downscaling package is designed to be compatible with a number of general circulation models (GCMs) (e.g. GCMs developed for the IPCC AR4 report and CMIP5), and is presently implemented using time-series data from the NCAR CESM1 model in conjunction with 30 arc-second future decadal climatologies distributed by the Consultative Group on International Agricultural Research. The resulting downscaled datasets are 30 arc-second time-series forecasts of monthly precipitation and mean temperature available for all global land areas. As an example of these data, historical and forecast 30 arc-second monthly time-series from 1950 through 2070 are created and analyzed for the region encompassing Pakistan. For this case study, forecast datasets corresponding to the future Representative Concentration Pathway (RCP) 4.5 and 8.5 scenarios developed by the IPCC are presented and compared.
This exercise highlights a range of potential meteorological trends for the Pakistan region and more broadly serves to demonstrate the utility of the presented 30 arc-second monthly precipitation and mean temperature datasets for use in data scarce regions.
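The Delta downscaling step described above can be sketched for a single coarse cell: the coarse monthly departure from the coarse climatology is imposed on the fine-resolution climatology. Additive anomalies for temperature and multiplicative ratios for precipitation are the common convention, assumed here rather than taken from the paper:

```python
def delta_downscale_temperature(coarse_month, coarse_clim, fine_clim):
    # Additive anomaly: coarse monthly departure from the coarse
    # climatology, added onto each fine-resolution climatology cell.
    anomaly = coarse_month - coarse_clim
    return [[c + anomaly for c in row] for row in fine_clim]

def delta_downscale_precipitation(coarse_month, coarse_clim, fine_clim):
    # Multiplicative ratio, so downscaled precipitation stays >= 0.
    ratio = coarse_month / coarse_clim if coarse_clim > 0 else 0.0
    return [[c * ratio for c in row] for row in fine_clim]
```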
Gesch, D.; Williams, J.; Miller, W.
2001-01-01
Elevation models produced from Shuttle Radar Topography Mission (SRTM) data will be the most comprehensive, consistently processed, highest resolution topographic dataset ever produced for the Earth's land surface. Many applications that currently use elevation data will benefit from the increased availability of data with higher accuracy, quality, and resolution, especially in poorly mapped areas of the globe. SRTM data will be produced as seamless data, thereby avoiding many of the problems inherent in existing multi-source topographic databases. Serving as precursors to SRTM datasets, the U.S. Geological Survey (USGS) has produced and is distributing seamless elevation datasets that facilitate scientific use of elevation data over large areas. GTOPO30 is a global elevation model with a 30 arc-second resolution (approximately 1-kilometer). The National Elevation Dataset (NED) covers the United States at a resolution of 1 arc-second (approximately 30-meters). Due to their seamless format and broad area coverage, both GTOPO30 and NED represent an advance in the usability of elevation data, but each still includes artifacts from the highly variable source data used to produce them. The consistent source data and processing approach for SRTM data will result in elevation products that will be a significant addition to the current availability of seamless datasets, specifically for many areas outside the U.S. One application that demonstrates some advantages that may be realized with SRTM data is delineation of land surface drainage features (watersheds and stream channels). Seamless distribution of elevation data in which a user interactively specifies the area of interest and order parameters via a map server is already being successfully demonstrated with existing USGS datasets. Such an approach for distributing SRTM data is ideal for a dataset that undoubtedly will be of very high interest to the spatial data user community.
Labay, Keith A.; Haeussler, Peter J.
2008-01-01
A new Digital Elevation Model was created using the best available high-resolution topography and multibeam bathymetry surrounding the area of Seward, Alaska. Datasets of (1) LIDAR topography collected for the Kenai Watershed Forum, (2) Seward harbor soundings from the U.S. Army Corps of Engineers, and (3) multibeam bathymetry from the National Oceanic and Atmospheric Administration contributed to the final combined product. These datasets were placed into a common coordinate system, horizontal datum, vertical datum, and data format prior to being combined. The projected coordinate system of Universal Transverse Mercator Zone 6 North American Datum of 1927 was used for the horizontal coordinates. Z-values in meters were referenced to the tidal datum of Mean High Water. Gaps between the datasets were interpolated to create the final seamless 5-meter grid covering the area of interest around Seward, Alaska.
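The merge-then-interpolate workflow can be sketched in miniature: combine co-registered datasets in priority order, then fill remaining gaps from valid neighbours (the real product interpolates far more carefully); all names are ours:

```python
def merge_dems(layers):
    # layers: list of equally sized grids, highest priority first;
    # each output cell takes the first non-None value available.
    return [[next((z for z in cells if z is not None), None)
             for cells in zip(*rows)]
            for rows in zip(*layers)]

def fill_gaps(grid):
    # One-pass gap fill: a None cell becomes the mean of its valid
    # 4-neighbours, a crude stand-in for proper interpolation.
    nr, nc = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for i in range(nr):
        for j in range(nc):
            if out[i][j] is None:
                nbrs = [grid[a][b]
                        for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                        if 0 <= a < nr and 0 <= b < nc and grid[a][b] is not None]
                if nbrs:
                    out[i][j] = sum(nbrs) / len(nbrs)
    return out
```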
A Two-Stream Deep Fusion Framework for High-Resolution Aerial Scene Classification
Yu, Yunlong; Liu, Fuxian
2018-01-01
One of the challenging problems in understanding high-resolution remote sensing images is aerial scene classification. A well-designed feature representation method and classifier can improve classification accuracy. In this paper, we construct a new two-stream deep architecture for aerial scene classification. First, we use two pretrained convolutional neural networks (CNNs) as feature extractor to learn deep features from the original aerial image and the processed aerial image through saliency detection, respectively. Second, two feature fusion strategies are adopted to fuse the two different types of deep convolutional features extracted by the original RGB stream and the saliency stream. Finally, we use the extreme learning machine (ELM) classifier for final classification with the fused features. The effectiveness of the proposed architecture is tested on four challenging datasets: UC-Merced dataset with 21 scene categories, WHU-RS dataset with 19 scene categories, AID dataset with 30 scene categories, and NWPU-RESISC45 dataset with 45 challenging scene categories. The experimental results demonstrate that our architecture gets a significant classification accuracy improvement over all state-of-the-art references. PMID:29581722
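The extreme learning machine used as the final classifier has a compact core: a random, untrained hidden layer with a least-squares linear readout. A minimal NumPy sketch under that standard formulation (hyperparameters and names are illustrative, and the paper's input features would be the fused CNN features):

```python
import numpy as np

def elm_train(X, Y, n_hidden=20, seed=0):
    # Random hidden weights are drawn once and never trained;
    # only the linear readout `beta` is fit, by least squares.
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                      # hidden activations
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta):
    # Class = argmax over the one-hot regression outputs.
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
```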
NASA Astrophysics Data System (ADS)
Javernick, L.; Bertoldi, W.; Redolfi, M.
2017-12-01
Accessing or acquiring high quality, low-cost topographic data has never been easier due to recent developments in the photogrammetric technique of Structure-from-Motion (SfM). Researchers can acquire the necessary SfM imagery with various platforms, capturing millimetre resolution and accuracy, or large-scale areas with the help of unmanned platforms. Such datasets, in combination with numerical modelling, have opened up new opportunities to study the physical and ecological relationships of river environments. While a numerical model's overall predictive accuracy is most influenced by topography, proper model calibration requires hydraulic and morphological data; however, rich hydraulic and morphological datasets remain scarce. This lack of field and laboratory data has limited model advancement through the inability to properly calibrate, assess the sensitivity of, and validate model performance. However, new time-lapse imagery techniques have shown success in identifying instantaneous sediment transport in flume experiments and in improving hydraulic model calibration. With new capabilities to capture high resolution spatial and temporal datasets of flume experiments, there is a need to further assess model performance. To address this demand, this research used braided river flume experiments and captured time-lapse observations of sediment transport and repeat SfM elevation surveys to provide unprecedented spatial and temporal datasets. Through newly created metrics that quantified observed and modeled activation, deactivation, and bank erosion rates, the numerical model Delft3D was calibrated. This increased temporal data, with both high-resolution time series and long-term temporal coverage, provided significantly improved calibration routines that refined calibration parameterization. Model results show that there is a trade-off between achieving quantitative statistical and qualitative morphological representations.
Specifically, simulations tuned for statistical agreement struggled to represent braided planforms (evolving toward meandering), while parameterizations that preserved braiding produced exaggerated activation and bank erosion rates. Marie Sklodowska-Curie Individual Fellowship: River-HMV, 656917
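One of the simplest metrics in the spirit of the activation rates described above is the fraction of cells whose elevation changed by more than a threshold between repeat surveys; a sketch with the threshold as a free parameter (the paper's metrics are more elaborate):

```python
def active_fraction(dem0, dem1, thresh):
    # Share of cells with |dz| > thresh between two co-registered
    # DEM surveys -- a crude proxy for morphological activation.
    n = act = 0
    for r0, r1 in zip(dem0, dem1):
        for z0, z1 in zip(r0, r1):
            n += 1
            act += abs(z1 - z0) > thresh
    return act / n
```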
The Status of the NASA MEaSUREs Combined ASTER and MODIS Emissivity Over Land (CAMEL) Products
NASA Astrophysics Data System (ADS)
Borbas, E. E.; Feltz, M.; Hulley, G. C.; Knuteson, R. O.; Hook, S. J.
2017-12-01
As part of a NASA MEaSUREs Land Surface Temperature and Emissivity project, the University of Wisconsin, Space Science and Engineering Center and NASA's Jet Propulsion Laboratory have developed a global monthly mean emissivity Earth System Data Record (ESDR). The CAMEL ESDR was produced by merging two current state-of-the-art emissivity datasets: the UW-Madison MODIS Infrared emissivity dataset (UWIREMIS) and the JPL ASTER Global Emissivity Dataset v4 (GEDv4). The dataset includes monthly global records of emissivity and its uncertainty at 13 hinge points between 3.6 and 14.3 µm, together with Principal Component Analysis (PCA) coefficients, at 5-kilometer resolution for the years 2003 to 2015. A high spectral resolution algorithm is also provided for HSR applications. The dataset is currently being tested in sounder retrieval algorithms (e.g., CrIS, IASI) and has already been implemented in RTTOV-12 for immediate use in numerical weather modeling and data assimilation. This poster will present the current status of the dataset.
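Storing spectra as PCA coefficients, as the CAMEL record does, relies on standard projection and reconstruction; a minimal NumPy sketch (shapes, names, and the SVD route are ours, not the product's algorithm):

```python
import numpy as np

def pca_fit(spectra, k):
    # Fit k principal components to an (n_spectra, n_channels) matrix.
    mean = spectra.mean(axis=0)
    _, _, Vt = np.linalg.svd(spectra - mean, full_matrices=False)
    return mean, Vt[:k]

def pca_project(spectra, mean, comps):
    # Compress each spectrum to k coefficients.
    return (spectra - mean) @ comps.T

def pca_reconstruct(coeffs, mean, comps):
    # Recover an approximate emissivity spectrum from coefficients.
    return coeffs @ comps + mean
```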
a Novel Framework for Remote Sensing Image Scene Classification
NASA Astrophysics Data System (ADS)
Jiang, S.; Zhao, H.; Wu, W.; Tan, Q.
2018-04-01
High resolution remote sensing (HRRS) image scene classification aims to label an image with a specific semantic category. HRRS images contain more details of ground objects and their spatial distribution patterns than low spatial resolution images. Scene classification can bridge the gap between low-level features and high-level semantics, and can be applied in urban planning, target detection, and other fields. This paper proposes a novel framework for HRRS image scene classification that combines a convolutional neural network (CNN) and XGBoost, using the CNN as a feature extractor and XGBoost as the classifier. The framework is evaluated on two different HRRS image datasets, the UC-Merced dataset and the NWPU-RESISC45 dataset, achieving satisfying accuracies of 95.57% and 83.35%, respectively. These experimental results show that the framework is effective for remote sensing image classification. Furthermore, we believe this framework will be practical for further HRRS scene classification, since it requires less training time.
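The feature-extractor-plus-boosted-classifier pipeline can be sketched as follows. This is a minimal illustration under stated assumptions: scikit-learn's GradientBoostingClassifier stands in for XGBoost, a simple pooling function stands in for the CNN feature extractor, and synthetic images stand in for the UC-Merced scenes.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def extract_features(images):
    # Stand-in for a CNN feature extractor: global mean/std/percentile
    # pooling per image. A real pipeline would use the penultimate-layer
    # activations of a trained CNN instead.
    flat = images.reshape(len(images), -1)
    return np.column_stack([flat.mean(1), flat.std(1),
                            np.percentile(flat, 25, axis=1),
                            np.percentile(flat, 75, axis=1)])

rng = np.random.default_rng(0)
# Synthetic "scenes": two classes with different brightness statistics
class0 = rng.normal(0.3, 0.05, size=(100, 16, 16))
class1 = rng.normal(0.7, 0.05, size=(100, 16, 16))
X = extract_features(np.concatenate([class0, class1]))
y = np.array([0] * 100 + [1] * 100)

clf = GradientBoostingClassifier(n_estimators=50, random_state=0)
clf.fit(X[::2], y[::2])            # train on even-indexed samples
acc = clf.score(X[1::2], y[1::2])  # evaluate on held-out odd-indexed samples
```

The design point the abstract makes is simply that the learned features and the classifier are decoupled, so the boosted-tree stage trains quickly once features are extracted.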
Khomri, Bilal; Christodoulidis, Argyrios; Djerou, Leila; Babahenini, Mohamed Chaouki; Cheriet, Farida
2018-05-01
Retinal vessel segmentation plays an important role in the diagnosis of eye diseases and is considered one of the most challenging tasks in computer-aided diagnosis (CAD) systems. The main goal of this study was to propose a method for blood-vessel segmentation that could deal with the problem of detecting vessels of varying diameters in high- and low-resolution fundus images. We proposed to use the particle swarm optimization (PSO) algorithm to improve the multiscale line detection (MSLD) method. The PSO algorithm was applied to find the best arrangement of scales in the MSLD method and to handle the problem of multiscale response recombination. The performance of the proposed method was evaluated on two low-resolution (DRIVE and STARE) and one high-resolution (HRF) fundus image datasets. The data include healthy (H) and diabetic retinopathy (DR) cases. The proposed approach improved the sensitivity rate over the MSLD by 4.7% for the DRIVE dataset and by 1.8% for the STARE dataset. For the high-resolution dataset, the proposed approach achieved an 87.09% sensitivity rate, whereas the MSLD method achieved an 82.58% sensitivity rate at the same specificity level. When only the smallest vessels were considered, the proposed approach improved the sensitivity rate by 11.02% and by 4.42% for the healthy and the diabetic cases, respectively. Integrating the proposed method in a comprehensive CAD system for DR screening would allow the reduction of false positives due to missed small vessels misclassified as red lesions. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
High resolution population distribution maps for Southeast Asia in 2010 and 2015.
Gaughan, Andrea E; Stevens, Forrest R; Linard, Catherine; Jia, Peng; Tatem, Andrew J
2013-01-01
Spatially accurate, contemporary data on human population distributions are vitally important to many applied and theoretical researchers. The Southeast Asia region has undergone rapid urbanization and population growth over the past decade, yet existing spatial population distribution datasets covering the region are based principally on population count data from censuses circa 2000, with often insufficient spatial resolution or input data to map settlements precisely. Here we outline approaches to construct a database of GIS-linked circa 2010 census data and methods used to construct fine-scale (∼100 meters spatial resolution) population distribution datasets for each country in the Southeast Asia region. Landsat-derived settlement maps and land cover information were combined with ancillary datasets on infrastructure to model population distributions for 2010 and 2015. These products were compared with those from two other methods used to construct commonly used global population datasets. Results indicate mapping accuracies are consistently higher when incorporating land cover and settlement information into the AsiaPop modelling process. Using existing data, it is possible to produce detailed, contemporary and easily updatable population distribution datasets for Southeast Asia. The 2010 and 2015 datasets produced are freely available as a product of the AsiaPop Project and can be downloaded from: www.asiapop.org.
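The core dasymetric idea behind such population mapping can be sketched in a few lines: a census count for an administrative unit is spread over its pixels in proportion to per-land-cover-class weights. The class codes and weight values below are hypothetical illustrations, not the AsiaPop model's actual parameters.

```python
import numpy as np

def dasymetric_redistribute(census_count, landcover, weights):
    # Allocate one administrative unit's population count to its pixels
    # in proportion to land-cover-class weights.
    w = np.vectorize(weights.get)(landcover).astype(float)
    return census_count * w / w.sum()

landcover = np.array([[1, 1, 2],
                      [2, 3, 3]])      # 1=urban, 2=cropland, 3=forest (hypothetical codes)
weights = {1: 10.0, 2: 2.0, 3: 0.5}   # hypothetical per-class density weights
pop = dasymetric_redistribute(900.0, landcover, weights)
```

By construction the pixel values sum back to the census count, so the method refines spatial detail without altering unit totals.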
NASA Astrophysics Data System (ADS)
Polverari, F.; Talone, M.; Crapolicchio, R.; Levy, G.; Marzano, F.
2013-12-01
The European Remote-sensing Satellite (ERS)-2 scatterometer provides wind retrievals over the ocean. To satisfy the need for a high-quality and homogeneous set of scatterometer measurements, the European Space Agency (ESA) has developed the Advanced Scatterometer Processing System (ASPS) project, with which a long-term dataset of new ERS-2 wind products, at an enhanced resolution of 25 km, has been generated by reprocessing the entire ERS mission. This paper presents the main results of the validation of this new dataset using in situ measurements provided by the Prediction and Research Moored Array in the Tropical Atlantic (PIRATA). The comparison indicates that, on average, the scatterometer data agree well with buoy measurements; however, the scatterometer tends to overestimate low winds and underestimate high winds.
Downscaling global precipitation for local applications - a case for the Rhine basin
NASA Astrophysics Data System (ADS)
Sperna Weiland, Frederiek; van Verseveld, Willem; Schellekens, Jaap
2017-04-01
Within the EU FP7 project eartH2Observe, a global Water Resources Re-analysis (WRR) is being developed. This re-analysis consists of meteorological and hydrological water balance variables with global coverage, spanning the period 1979-2014 at 0.25-degree resolution (Schellekens et al., 2016). The dataset can be of special interest in regions with limited in-situ data availability, yet for local-scale analysis, particularly in mountainous regions, a resolution of 0.25 degrees may be too coarse and downscaling the data to a higher resolution may be required. A downscaling toolbox has been built that includes spatial downscaling of precipitation based on the global WorldClim dataset, available at 1 km resolution as a monthly climatology (Hijmans et al., 2005). The inputs to the downscaling tool are either the global eartH2Observe WRR1 and WRR2 datasets based on the WFDEI correction methodology (Weedon et al., 2014) or the global Multi-Source Weighted-Ensemble Precipitation (MSWEP) dataset (Beck et al., 2016). Here we present a validation of the datasets over the Rhine catchment by means of a distributed hydrological model (wflow; Schellekens et al., 2014) using a number of precipitation scenarios. (1) We start by running the model using the local reference dataset derived by spatial interpolation of gauge observations. We then use (2) the MSWEP dataset at its native 0.25-degree resolution, followed by (3) MSWEP downscaled with the WorldClim dataset and finally (4) MSWEP downscaled with the local reference dataset. The validation is based on comparison of the modeled river discharges as well as rainfall statistics. We expect that downscaling the MSWEP dataset to higher resolution with the WorldClim data will increase its performance. To test the performance of the downscaling routine, we compare the run with MSWEP data downscaled with the local dataset against the run based on the local dataset itself. - Beck, H. E., et al., 2016. MSWEP: 3-hourly 0.25° global gridded precipitation (1979-2015) by merging gauge, satellite, and reanalysis data. Hydrol. Earth Syst. Sci. Discuss., doi:10.5194/hess-2016-236, accepted for final publication. - Hijmans, R. J., et al., 2005. Very high resolution interpolated climate surfaces for global land areas. International Journal of Climatology 25: 1965-1978. - Schellekens, J., et al., 2016. A global water resources ensemble of hydrological models: the eartH2Observe Tier-1 dataset. Earth Syst. Sci. Data Discuss., doi:10.5194/essd-2016-55, under review. - Schellekens, J., et al., 2014. Rapid setup of hydrological and hydraulic models using OpenStreetMap and the SRTM derived digital elevation model. Environmental Modelling & Software. - Weedon, G. P., et al., 2014. The WFDEI meteorological forcing data set: WATCH Forcing Data methodology applied to ERA-Interim reanalysis data. Water Resources Research, 50, doi:10.1002/2014WR015638.
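One common climatologically aided downscaling scheme consistent with the approach described (the actual eartH2Observe toolbox implementation may differ) multiplies each coarse monthly value by the ratio of the high-resolution climatological normal to its coarse-grid aggregate, which preserves the coarse-cell mean:

```python
import numpy as np

def downscale_precip(coarse_month, hires_clim, factor):
    # coarse_month: coarse-grid monthly precipitation
    # hires_clim:   high-resolution climatological normal (e.g. WorldClim) for that month
    # factor:       resolution ratio per axis
    h, w = hires_clim.shape
    # Aggregate the high-res climatology to the coarse grid ...
    clim_coarse = hires_clim.reshape(h // factor, factor,
                                     w // factor, factor).mean(axis=(1, 3))
    # ... then scale each high-res cell by its anomaly from the coarse normal.
    ratio = hires_clim / np.kron(clim_coarse, np.ones((factor, factor)))
    return np.kron(coarse_month, np.ones((factor, factor))) * ratio

coarse = np.array([[10.0]])                 # one coarse cell: 10 mm this month
clim = np.array([[2.0, 4.0], [6.0, 8.0]])   # high-res normals (mean 5 mm)
fine = downscale_precip(coarse, clim, 2)
```

Because the ratio field averages to one within each coarse cell, total precipitation per coarse cell is conserved while the sub-grid spatial pattern of the climatology is imprinted.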
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Xiaoma; Zhou, Yuyu; Asrar, Ghassem R.
High-spatiotemporal-resolution land surface temperature (LST) datasets are increasingly needed in a variety of fields such as ecology, hydrology, meteorology, epidemiology, and energy systems. The Moderate Resolution Imaging Spectroradiometer (MODIS) LST product is one such widely used dataset, but it has a large number of missing values, primarily because of clouds. Gapfilling the missing values is an important approach to creating high-spatiotemporal-resolution LST datasets; however, current gapfilling methods have limitations in terms of accuracy and the time required to assemble the data over large areas (e.g., at national and continental levels). In this study, we developed a three-step hybrid method, integrating daily merging, spatiotemporal gapfilling, and temporal interpolation, to create a high-spatiotemporal-resolution LST dataset from the four daily LST observations of the two MODIS instruments on the Terra and Aqua satellites. We applied this method to urban and surrounding areas of the conterminous U.S. for 2010. Evaluation of the gapfilled LST product indicates a root mean squared error (RMSE) of 3.3 K for mid-daytime (1:30 pm) and 2.7 K for mid-nighttime (1:30 am) observations. The method can be easily extended to other years and regions and is also applicable to other satellite products. This seamless daily (mid-daytime and mid-nighttime) LST product with 1 km spatial resolution is of great value for studying the effects of urbanization (e.g., the urban heat island) and the related impacts on people, ecosystems, energy systems, and other infrastructure in cities.
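The temporal-interpolation step of such a gapfilling chain can be illustrated on a single pixel's time series. Plain linear interpolation is shown here as a sketch; the study's actual third step may use a different interpolant.

```python
import numpy as np

def fill_temporal_gaps(series):
    # Fill NaN gaps along the time axis by linear interpolation between
    # the nearest valid observations.
    t = np.arange(series.size)
    good = ~np.isnan(series)
    filled = series.copy()
    filled[~good] = np.interp(t[~good], t[good], series[good])
    return filled

# One pixel's daily LST (K) with cloud-induced gaps
lst = np.array([290.0, np.nan, 294.0, np.nan, np.nan, 300.0])
filled = fill_temporal_gaps(lst)
```

In the hybrid method this step runs last, so it only has to bridge the gaps that survive the daily-merging and spatiotemporal-gapfilling stages.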
NASA Astrophysics Data System (ADS)
Lary, D. J.
2013-12-01
A Big Data case study is described in which multiple datasets from several satellites, high-resolution global meteorological data, social media, and in-situ observations are combined using machine learning on a distributed cluster with an automated workflow. The resulting global particulate dataset is relevant to global public health studies and would not be possible to produce without the use of multiple big datasets, in-situ data, and machine learning. To greatly reduce development time and enhance functionality, a high-level language capable of parallel processing (Matlab) was used. Key considerations for the system are high-speed access due to the large data volume, persistence of the large data volumes, and a precise process-time scheduling capability.
An Intercomparison of Large-Extent Tree Canopy Cover Geospatial Datasets
NASA Astrophysics Data System (ADS)
Bender, S.; Liknes, G.; Ruefenacht, B.; Reynolds, J.; Miller, W. P.
2017-12-01
As a member of the Multi-Resolution Land Characteristics Consortium (MRLC), the U.S. Forest Service (USFS) is responsible for producing and maintaining the tree canopy cover (TCC) component of the National Land Cover Database (NLCD). The NLCD-TCC data are available for the conterminous United States (CONUS), coastal Alaska, Hawai'i, Puerto Rico, and the U.S. Virgin Islands. The most recent official version of the NLCD-TCC data is based primarily on reference data from 2010-2011 and is part of the multi-component 2011 version of the NLCD. NLCD data are updated on a five-year cycle. The USFS is currently producing the next official version (2016) of the NLCD-TCC data for the United States, which will be made publicly available in early 2018. In this presentation, we describe the model inputs, modeling methods, and tools used to produce the 30-m NLCD-TCC data. Several tree cover datasets at 30-m and finer resolutions have become available in recent years due to advancements in Earth observation data availability, computing, and sensors. We compare multiple tree cover datasets of similar resolution to the NLCD-TCC data. We also aggregate the tree class from fine-resolution land cover datasets to a percent canopy value on a 30-m pixel, in order to compare the fine-resolution datasets to the datasets created directly from 30-m Landsat data. The extent of the tree canopy cover datasets included in the study ranges from global and national to the state level. Preliminary investigation of multiple tree cover datasets over the CONUS indicates a high amount of spatial variability. For example, in a comparison of the NLCD-TCC and the Global Land Cover Facility's Landsat Tree Cover Continuous Fields (2010) data by MRLC mapping zones, the zone-level root mean-square deviation ranges from 2% to 39% (mean=17%, median=15%). The analysis outcomes are expected to inform USFS decisions with regard to the next cycle (2021) of NLCD-TCC production.
NASA Astrophysics Data System (ADS)
Newman, A. J.; Clark, M. P.; Nijssen, B.; Wood, A.; Gutmann, E. D.; Mizukami, N.; Longman, R. J.; Giambelluca, T. W.; Cherry, J.; Nowak, K.; Arnold, J.; Prein, A. F.
2016-12-01
Gridded precipitation and temperature products are inherently uncertain due to myriad factors, including interpolation from a sparse observation network, measurement representativeness, and measurement errors. Despite this, uncertainty is typically not quantified, or is estimated in a dataset-specific way with little general applicability across datasets. The lack of quantitative uncertainty estimates for hydrometeorological forcing fields limits their utility for land surface and hydrologic modeling techniques such as data assimilation, probabilistic forecasting, and verification. To address this gap, we have developed a first-of-its-kind gridded, observation-based ensemble of precipitation and temperature at a daily increment for the period 1980-2012 over the United States (including Alaska and Hawaii). A longer, higher-resolution version (1970-present, 1/16th degree) has also been implemented to support real-time hydrologic monitoring and prediction in several regional US domains. We will present the development and evaluation of the dataset, along with initial applications for ensemble data assimilation and probabilistic evaluation of high-resolution regional climate model simulations. We will also present results on the new high-resolution products for Alaska and Hawaii (2 km and 250 m, respectively), completing the first ensemble observation-based product suite for all 50 states. Finally, we will present plans to improve the ensemble dataset, focusing on the methods used for station interpolation and ensemble generation, as well as methods to fuse station data with numerical weather prediction model output.
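The ensemble idea can be sketched by perturbing an interpolated field with its estimated interpolation uncertainty. Independent Gaussian noise is used here purely for illustration; the actual product generates spatially and temporally correlated members.

```python
import numpy as np

rng = np.random.default_rng(42)

def ensemble_members(mean, stderr, n=100):
    # Each member = interpolated field + noise scaled by the per-cell
    # interpolation standard error (uncorrelated Gaussian, as a sketch).
    return mean + stderr * rng.standard_normal((n,) + mean.shape)

tmean = np.array([[10.0, 12.0], [11.0, 13.0]])  # interpolated daily Tmax (degC)
terr = np.array([[0.5, 0.8], [0.6, 1.2]])       # interpolation std. error (degC)
ens = ensemble_members(tmean, terr, n=500)
```

Downstream users can then run a hydrologic model once per member, turning forcing uncertainty into a spread of simulated discharges suitable for data assimilation or probabilistic verification.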
A high-resolution European dataset for hydrologic modeling
NASA Astrophysics Data System (ADS)
Ntegeka, Victor; Salamon, Peter; Gomes, Goncalo; Sint, Hadewij; Lorini, Valerio; Thielen, Jutta
2013-04-01
There is an increasing demand for large-scale hydrological models, not only for modeling the impact of climate change on water resources but also for disaster risk assessments and flood or drought early warning systems. These large-scale models need to be calibrated and verified against large amounts of observations in order to judge their ability to predict the future. However, the creation of large-scale datasets is challenging, as it requires the collection, harmonization, and quality checking of large amounts of observations; for this reason, only a limited number of such datasets exist. In this work, we present a pan-European, high-resolution gridded dataset of meteorological observations (EFAS-Meteo), designed to drive a large-scale hydrological model. Similar European and global gridded datasets already exist, such as HadGHCND (Caesar et al., 2006), the JRC MARS-STAT database (van der Goot and Orlandi, 2003) and the E-OBS gridded dataset (Haylock et al., 2008); however, none of these provide a similarly high spatial resolution and/or a complete set of variables to force a hydrologic model. EFAS-Meteo contains daily maps of precipitation, surface temperature (mean, minimum and maximum), wind speed and vapour pressure at a spatial grid resolution of 5 x 5 km for the period 1 January 1990 - 31 December 2011. It furthermore contains radiation, calculated using a staggered approach depending on the availability of sunshine duration, cloud cover and minimum and maximum temperature, as well as evapotranspiration (potential, bare-soil and open-water evapotranspiration). The potential evapotranspiration was calculated using the Penman-Monteith equation with the above-mentioned meteorological variables. The dataset was created as part of the development of the European Flood Awareness System (EFAS) and has been continuously updated throughout the last years.
The dataset variables are used as inputs to the hydrological calibration and validation of EFAS, as well as for establishing long-term discharge "proxy" climatologies which can in turn be used for statistical analysis to derive return periods or other time-series derivatives. In addition, this dataset will be used to assess climatological trends in Europe. Unfortunately, to date no baseline dataset at the European scale exists against which to test the quality of the data presented here; a comparison against other existing datasets can therefore only be an indication of data quality. Due to availability, a comparison was made for precipitation and temperature only, arguably the most important meteorological drivers for hydrologic models. A variety of analyses was undertaken at country scale against data reported to EUROSTAT and the E-OBS dataset. The comparison revealed that while the datasets showed overall similar temporal and spatial patterns, there were some differences in magnitude, especially for precipitation. It is not straightforward to identify the specific cause of these differences; however, in most cases the comparatively low observation station density appears to be the principal reason.
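The Penman-Monteith potential evapotranspiration mentioned above can be written down directly in its standard FAO-56 reference form; the input values below are illustrative placeholders, not taken from EFAS-Meteo.

```python
import math

def penman_monteith_et0(t_mean, u2, rn, g, ea, pressure=101.3):
    """FAO-56 Penman-Monteith reference evapotranspiration (mm/day).

    t_mean: mean air temperature (degC); u2: wind speed at 2 m (m/s);
    rn: net radiation (MJ m-2 day-1); g: soil heat flux (MJ m-2 day-1);
    ea: actual vapour pressure (kPa); pressure: atmospheric pressure (kPa).
    """
    es = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))  # saturation vapour pressure
    delta = 4098.0 * es / (t_mean + 237.3) ** 2                # slope of the es curve
    gamma = 0.000665 * pressure                                # psychrometric constant
    num = 0.408 * delta * (rn - g) \
        + gamma * 900.0 / (t_mean + 273.0) * u2 * (es - ea)
    return num / (delta + gamma * (1.0 + 0.34 * u2))

# Illustrative mid-latitude summer day
et0 = penman_monteith_et0(t_mean=20.0, u2=2.0, rn=15.0, g=0.0, ea=1.4)
```

For these inputs the equation yields roughly 5 mm/day, a plausible reference-ET value for a temperate summer day.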
National Hydrography Dataset (NHD)
2001-01-01
The National Hydrography Dataset (NHD) is a feature-based database that interconnects and uniquely identifies the stream segments or reaches that make up the nation's surface water drainage system. NHD data was originally developed at 1:100,000 scale and exists at that scale for the whole country. High resolution NHD adds detail to the original 1:100,000-scale NHD. (Data for Alaska, Puerto Rico and the Virgin Islands was developed at high-resolution, not 1:100,000 scale.) Like the 1:100,000-scale NHD, high resolution NHD contains reach codes for networked features and isolated lakes, flow direction, names, stream level, and centerline representations for areal water bodies. Reaches are also defined to represent waterbodies and the approximate shorelines of the Great Lakes, the Atlantic and Pacific Oceans and the Gulf of Mexico. The NHD also incorporates the National Spatial Data Infrastructure framework criteria set out by the Federal Geographic Data Committee.
Bouhrara, Mustapha; Reiter, David A; Sexton, Kyle W; Bergeron, Christopher M; Zukley, Linda M; Spencer, Richard G
2017-11-01
We applied our recently introduced Bayesian analytic method to achieve clinically-feasible in-vivo mapping of the proteoglycan water fraction (PgWF) of human knee cartilage with improved spatial resolution and stability as compared to existing methods. Multicomponent driven equilibrium single-pulse observation of T1 and T2 (mcDESPOT) datasets were acquired from the knees of two healthy young subjects and one older subject with previous knee injury. Each dataset was processed using Bayesian Monte Carlo (BMC) analysis incorporating a two-component tissue model. We assessed the performance and reproducibility of BMC and of the conventional analysis of stochastic region contraction (SRC) in the estimation of PgWF. Stability of the BMC analysis of PgWF was tested by comparing independent high-resolution (HR) datasets from each of the two young subjects. Unlike SRC, the BMC-derived maps from the two HR datasets were essentially identical. Furthermore, SRC maps showed substantial random variation in estimated PgWF, and mean values that differed from those obtained using BMC. In addition, PgWF maps derived from conventional low-resolution (LR) datasets exhibited partial volume and magnetic susceptibility effects. These artifacts were absent in HR PgWF images. Finally, our analysis showed regional variation in PgWF estimates, and substantially higher values in the younger subjects as compared to the older subject. BMC-mcDESPOT permits HR in-vivo mapping of PgWF in human knee cartilage in a clinically-feasible acquisition time. HR mapping reduces the impact of partial volume and magnetic susceptibility artifacts compared to LR mapping. Finally, BMC-mcDESPOT demonstrated excellent reproducibility in the determination of PgWF. Published by Elsevier Inc.
On the impact of large angle CMB polarization data on cosmological parameters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lattanzi, Massimiliano; Mandolesi, Nazzareno; Natoli, Paolo
We study the impact of the large-angle CMB polarization datasets publicly released by the WMAP and Planck satellites on the estimation of cosmological parameters of the ΛCDM model. To complement large-angle polarization, we consider the high-resolution (or 'high-ℓ') CMB datasets from either WMAP or Planck, as well as CMB lensing as traced by Planck's measured four-point correlation function. In the case of WMAP, we compute the large-angle polarization likelihood starting over from low-resolution frequency maps and their covariance matrices, and perform our own foreground mitigation technique, which includes, as a possible alternative, Planck 353 GHz data to trace polarized dust. We find that the latter choice induces a downward shift in the optical depth τ, roughly of order 2σ, robust to the choice of the complementary high-resolution dataset. When the Planck 353 GHz data are consistently used to minimize polarized dust emission, WMAP and Planck 70 GHz large-angle polarization data are in remarkable agreement: by combining them we find τ = 0.066 (+0.012/−0.013), again very stable against the particular choice of high-ℓ data. We find that the amplitude of primordial fluctuations A_s, notoriously degenerate with τ, is the parameter second most affected by the assumptions on polarized dust removal, but the other parameters are also affected, typically between 0.5 and 1σ. In particular, cleaning dust with Planck's 353 GHz data imposes a 1σ downward shift in the value of the Hubble constant H_0, significantly contributing to the tension reported between CMB-based and direct measurements of the present expansion rate. On the other hand, we find that the appearance of the so-called low-ℓ anomaly, a well-known tension between the high- and low-resolution CMB anisotropy amplitude, is not significantly affected by the details of large-angle polarization or by the particular high-ℓ dataset employed.
A new global anthropogenic heat estimation based on high-resolution nighttime light data
Yang, Wangming; Luan, Yibo; Liu, Xiaolei; Yu, Xiaoyong; Miao, Lijuan; Cui, Xuefeng
2017-01-01
Consumption of fossil fuel resources leads to global warming and climate change. Apart from the negative impact of greenhouse gases on the climate, the increasing emission of anthropogenic heat from energy consumption also has significant impacts on urban ecosystems and the surface energy balance. The objective of this work is to develop a new method of estimating the global anthropogenic heat budget and to validate it on the global scale with a high-precision, high-resolution dataset. A statistical algorithm was applied to estimate the annual mean anthropogenic heat (AH-DMSP) from 1992 to 2010 at 1 × 1 km² spatial resolution for the entire planet. AH-DMSP was validated at both provincial and city scales, and results indicate that our dataset performs well at both. Compared with other global anthropogenic heat datasets, AH-DMSP has higher precision and finer spatial distribution. Although there are some limitations, AH-DMSP provides reliable, multi-scale anthropogenic heat information, which can be used for further research on regional or global climate change and urban ecosystems. PMID:28829436
Local X-ray Computed Tomography Imaging for Mineralogical and Pore Characterization
NASA Astrophysics Data System (ADS)
Mills, G.; Willson, C. S.
2015-12-01
Sample size, material properties and image resolution are all tradeoffs that must be considered when imaging porous media samples with X-ray computed tomography. In many natural and engineered samples, pore and throat sizes span several orders of magnitude and are often correlated with the material composition. Local tomography is a nondestructive technique that images a subvolume within a larger specimen at high resolution and uses low-resolution tomography data from the larger specimen to reduce reconstruction error. The high-resolution subvolume data can be used to extract important fine-scale properties but, due to the additional noise associated with the truncated dataset, segmentation of different materials and mineral phases is a challenge. The low-resolution data of a larger specimen are typically of much higher quality, making material characterization much easier. In addition, imaging a larger domain allows mm-scale bulk properties and heterogeneities to be determined. In this research, a sandstone core, 7 mm in diameter and ~15 mm in length, was scanned twice. The first scan covered the entire diameter and length of the specimen at an image voxel resolution of 4.1 μm. The second scan covered a subvolume, ~1.3 mm in length and ~2.1 mm in diameter, at an image voxel resolution of 1.08 μm. After image processing and segmentation, the pore network structure and mineralogical features were extracted from the low-resolution dataset. Due to the noise in the truncated high-resolution dataset, several image processing approaches were applied prior to image segmentation and extraction of the pore network structure and mineralogy. Results from the different truncated-tomography segmented datasets are compared to each other to evaluate the potential of each approach in identifying the different solid phases from the original 16-bit dataset.
The truncated tomography segmented data sets were also compared to the whole-core tomography segmented data set in two ways: (1) assessment of the porosity and pore size distribution at different scales; and (2) comparison of the mineralogical composition and distribution. Finally, registration of the two datasets will be used to show how the pore structure and mineralogy details at the two scales can be used to supplement each other.
A high-resolution 7-Tesla fMRI dataset from complex natural stimulation with an audio movie.
Hanke, Michael; Baumgartner, Florian J; Ibe, Pierre; Kaule, Falko R; Pollmann, Stefan; Speck, Oliver; Zinke, Wolf; Stadler, Jörg
2014-01-01
Here we present a high-resolution functional magnetic resonance imaging (fMRI) dataset - 20 participants recorded at high field strength (7 Tesla) during prolonged stimulation with an auditory feature film ("Forrest Gump"). In addition, a comprehensive set of auxiliary data (T1w, T2w, DTI, susceptibility-weighted image, angiography) as well as measurements to assess technical and physiological noise components have been acquired. An initial analysis confirms that these data can be used to study common and idiosyncratic brain response patterns to complex auditory stimulation. Among the potential uses of this dataset are the study of auditory attention and cognition, language and music perception, and social perception. The auxiliary measurements enable a large variety of additional analysis strategies that relate functional response patterns to structural properties of the brain. Alongside the acquired data, we provide source code and detailed information on all employed procedures - from stimulus creation to data analysis. In order to facilitate replicative and derived works, only free and open-source software was utilized.
Shtull-Trauring, E; Bernstein, N
2018-05-01
Agriculture is the largest global consumer of freshwater. As the volume of international trade continues to rise, so does the understanding that trade of water-intensive crops from areas with high precipitation to arid regions can help mitigate water scarcity, highlighting the importance of crop water accounting. Virtual water, or the Water Footprint [WF] of agricultural crops, is a powerful indicator for assessing the extent of water use by plants, contamination of water bodies by agricultural practices, and the trade between countries that underlies any international trade of crops. Most available studies of virtual-water flows by import/export of agricultural commodities were based on global databases, which are considered to be of limited accuracy. The present study analyzes the WF of crop production, import, and export at the country level, using Israel as a case study and comparing data from two high-resolution local databases and two global datasets. Results for the local datasets show a WF of ~1200 Million Cubic Meters [MCM]/year for total crop production, ~1000 MCM/year for import and ~250 MCM/year for export. Fruits and vegetables comprise ~80% of the export WF (~200 MCM/year), ~50% of crop production and only ~20% of imports. Economic Water Productivity [EWP] ($/m³) for fruits and vegetables is 1.5 times higher than for other crops. Moreover, the results based on local and global datasets differed significantly, demonstrating the importance of developing high-resolution local datasets based on local crop coefficients. Performing high-resolution WF analysis can help in developing agricultural policies that support low-WF/high-EWP crops and limit the export of high-WF/low-EWP crops where water availability is limited. Copyright © 2017 Elsevier B.V. All rights reserved.
Rapid, semi-automatic fracture and contact mapping for point clouds, images and geophysical data
NASA Astrophysics Data System (ADS)
Thiele, Samuel T.; Grose, Lachlan; Samsu, Anindita; Micklethwaite, Steven; Vollgger, Stefan A.; Cruden, Alexander R.
2017-12-01
The advent of large digital datasets from unmanned aerial vehicle (UAV) and satellite platforms now challenges our ability to extract information across multiple scales in a timely manner, often meaning that the full value of the data is not realised. Here we adapt a least-cost-path solver and specially tailored cost functions to rapidly interpolate structural features between manually defined control points in point cloud and raster datasets. We implement the method in the geographic information system QGIS and the point cloud and mesh processing software CloudCompare. Using these implementations, the method can be applied to a variety of three-dimensional (3-D) and two-dimensional (2-D) datasets, including high-resolution aerial imagery, digital outcrop models, digital elevation models (DEMs) and geophysical grids. We demonstrate the algorithm with four diverse applications in which we extract (1) joint and contact patterns in high-resolution orthophotographs, (2) fracture patterns in a dense 3-D point cloud, (3) earthquake surface ruptures of the Greendale Fault associated with the Mw7.1 Darfield earthquake (New Zealand) from high-resolution light detection and ranging (lidar) data, and (4) oceanic fracture zones from bathymetric data of the North Atlantic. The approach improves the consistency of the interpretation process while retaining expert guidance and achieves significant improvements (35-65 %) in digitisation time compared to traditional methods. Furthermore, it opens up new possibilities for data synthesis and can quantify the agreement between datasets and an interpretation.
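The least-cost-path core of the method can be sketched with a standard Dijkstra solver over a cost raster; the toy cost function below (raw cell values) stands in for the specially tailored cost functions described in the abstract:

```python
import heapq
import numpy as np

def least_cost_path(cost, start, end):
    """Dijkstra least-cost path between two control points on a cost raster.
    A simplified stand-in for the tailored cost functions described above."""
    rows, cols = cost.shape
    dist = np.full(cost.shape, np.inf)
    prev = {}
    dist[start] = cost[start]
    heap = [(cost[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == end:
            break
        if d > dist[r, c]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk back from end to start to recover the path
    path, node = [end], end
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Toy cost raster: low values along the middle row mimic a fracture trace
cost = np.ones((5, 5)) * 10.0
cost[2, :] = 1.0
path = least_cost_path(cost, (2, 0), (2, 4))
print(path)  # follows the low-cost middle row
```

In the published implementations the control points are the manually digitised clicks, and the cost raster is derived from image gradients or point-cloud attributes rather than raw values.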
NASA Astrophysics Data System (ADS)
Li, Z.; Clark, E. P.
2017-12-01
Large-scale and fine-resolution riverine bathymetry data are critical for flood inundation modeling but are not available over the continental United States (CONUS). Previously we implemented bankfull hydraulic geometry based approaches to simulate bathymetry for individual rivers using NHDPlus v2.1 data and the 10 m National Elevation Dataset (NED). USGS has recently developed High Resolution NHD data (NHDPlus HR Beta) (USGS, 2017), and this enhanced dataset shows significantly improved spatial correspondence with the 10 m DEM. In this study, we used this high-resolution data, specifically NHDFlowline and NHDArea, to create bathymetry/terrain for CONUS river channels and floodplains. A software package, NHDPlus Inundation Modeler v5.0 Beta, was developed for this project as an Esri ArcGIS hydrological analysis extension. With the updated tools, the raw 10 m DEM was first hydrologically treated to remove artificial blockages (e.g., overpasses, bridges, and even roadways) using low-pass moving window filters. Cross sections were then automatically constructed along each flowline to extract elevation from the hydrologically treated DEM. In this study, river channel shapes were approximated using quadratic curves to reduce uncertainties from commonly used trapezoids. We calculated underwater channel elevation at each cross-section sampling point using bankfull channel dimensions that were estimated from physiographic province/division based regression equations (Bieger et al. 2015). These elevation points were then interpolated to generate a bathymetry raster. The simulated bathymetry raster was integrated with the USGS NED and the Coastal National Elevation Database (CoNED) (wherever available) to make a seamless terrain-bathymetry dataset. Channel bathymetry was also integrated into the HAND (Height Above Nearest Drainage) dataset to improve large-scale inundation modeling. The generated terrain-bathymetry was processed at the Watershed Boundary Dataset Hydrologic Unit 4 (WBDHU4) level.
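The quadratic cross-section approximation mentioned above can be sketched as a parabola hung between the two bank points; the bankfull width and depth below are hypothetical, whereas the study estimates them from the regression equations (Bieger et al. 2015):

```python
import numpy as np

def quadratic_channel(width, depth, n=11):
    """Approximate a channel cross section with a quadratic (parabolic) curve,
    as an alternative to a trapezoid. width/depth are bankfull dimensions
    (hypothetical here; the study derives them from regression equations)."""
    x = np.linspace(-width / 2, width / 2, n)   # across-channel coordinate (m)
    # Depth below the banks: maximal at the centre, zero at both banks
    z = -depth * (1.0 - (2.0 * x / width) ** 2)
    return x, z

x, z = quadratic_channel(width=20.0, depth=2.0)
print(z.min())  # -2.0 at the thalweg (channel centre)
```

Sampling such curves along each flowline cross section and interpolating the resulting points is what produces the bathymetry raster described above.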
Nanomaterial datasets to advance tomography in scanning transmission electron microscopy
Levin, Barnaby D. A.; Padgett, Elliot; Chen, Chien-Chun; Scott, M. C.; Xu, Rui; Theis, Wolfgang; Jiang, Yi; Yang, Yongsoo; Ophus, Colin; Zhang, Haitao; Ha, Don-Hyung; Wang, Deli; Yu, Yingchao; Abruña, Hector D.; Robinson, Richard D.; Ercius, Peter; Kourkoutis, Lena F.; Miao, Jianwei; Muller, David A.; Hovden, Robert
2016-06-07
Electron tomography in materials science has flourished with the demand to characterize nanoscale materials in three dimensions (3D). Access to experimental data is vital for developing and validating reconstruction methods that improve resolution and reduce radiation dose requirements. This work presents five high-quality scanning transmission electron microscope (STEM) tomography datasets in order to address the critical need for open access data in this field. The datasets represent the current limits of experimental technique, are of high quality, and contain materials with structural complexity. Included are tomographic series of a hyperbranched Co2P nanocrystal, platinum nanoparticles on a carbon nanofibre imaged over the complete 180° tilt range, a platinum nanoparticle and a tungsten needle both imaged at atomic resolution by equal slope tomography, and a through-focal tilt series of PtCu nanoparticles. A volumetric reconstruction from every dataset is provided for comparison and development of post-processing and visualization techniques. Researchers interested in creating novel data processing and reconstruction algorithms will now have access to state of the art experimental test data.
NASA Technical Reports Server (NTRS)
Kaplan, Michael L.; Lin, Yuh-Lang
2004-01-01
During the research project, sounding datasets were generated for the regions surrounding 9 major airports: Dallas, TX; Boston, MA; New York, NY; Chicago, IL; St. Louis, MO; Atlanta, GA; Miami, FL; San Francisco, CA; and Los Angeles, CA. The numerical simulation of winter and summer environments during which no instrument flight rule impact was occurring at these 9 terminals was performed using the most contemporary version of the Terminal Area PBL Prediction System (TAPPS) model, nested from 36 km to 6 km to 1 km horizontal resolution with very detailed vertical resolution in the planetary boundary layer. The soundings from the 1 km model were archived at 30 minute intervals over a 24 hour period, and the vertically dependent variables as well as derived quantities (i.e., 3-dimensional wind components, temperature, pressure, mixing ratio, turbulence kinetic energy, and eddy dissipation rate) were then interpolated to 5 m vertical resolution up to 1000 m above ground level. After partial validation against field experiment datasets for Dallas, as well as larger scale and much coarser resolution observations at the other 8 airports, these sounding datasets were sent to NASA for use in the Virtual Air Space and Modeling program. The application of these datasets is to determine representative airport weather environments for diagnosing the response of simulated wake vortices to realistic atmospheric environments. These virtual datasets are based on large scale observed atmospheric initial conditions that are dynamically interpolated in space and time. The 1 km nested-grid simulated datasets provide a very coarse and highly smoothed representation of airport environment meteorological conditions.
Details concerning the airport surface forcing are virtually absent from these simulated datasets. However, the observed background atmospheric processes have been compared to the simulated fields, and the fields were found to accurately replicate the flows surrounding the airports wherever coarse verification data, as well as airport-scale datasets, were available.
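The interpolation of archived model soundings to a regular 5 m vertical grid up to 1000 m above ground level can be sketched with simple linear interpolation; the model level heights and wind values below are hypothetical:

```python
import numpy as np

# Interpolating model-level winds onto a regular 5 m vertical grid up to
# 1000 m above ground level, as done for the sounding archive above.
# Model levels and values here are hypothetical.
z_model = np.array([10.0, 50.0, 150.0, 400.0, 1000.0])  # level heights (m AGL)
u_model = np.array([2.0, 4.5, 6.0, 8.0, 10.0])          # u-wind (m/s)

z_fine = np.arange(0.0, 1000.0 + 5.0, 5.0)    # 5 m grid, 0-1000 m AGL
u_fine = np.interp(z_fine, z_model, u_model)  # piecewise-linear interpolation
print(u_fine.shape)  # (201,)
```

Note that `np.interp` clamps values below the lowest model level; a production version would extrapolate with a surface-layer profile instead.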
Soil Erosion map of Europe based on high resolution input datasets
NASA Astrophysics Data System (ADS)
Panagos, Panos; Borrelli, Pasquale; Meusburger, Katrin; Ballabio, Cristiano; Alewell, Christine
2015-04-01
Modelling soil erosion in the European Union is of major importance for agro-environmental policies. Soil erosion estimates are important inputs for the Common Agricultural Policy (CAP) and the implementation of the Soil Thematic Strategy. Based on the findings of a recent pan-European data collection through the EIONET network, it was concluded that most Member States apply the empirical Revised Universal Soil Loss Equation (RUSLE) for modelling soil erosion at the national level. This model was therefore chosen for the pan-European soil erosion risk assessment; it is based on 6 input factors. Compared to past approaches, each of the factors is modelled using the latest pan-European datasets, expertise and data from Member States, and high-resolution remote sensing data. The soil erodibility (K-factor) is modelled using the recently published LUCAS topsoil database with 20,000 point measurements, incorporating the surface stone cover, which can reduce the K-factor by 15%. The rainfall erosivity dataset (R-factor) has been implemented using high temporal resolution rainfall data from more than 1,500 precipitation stations well distributed across Europe. The cover-management factor (C-factor) incorporates crop statistics and management practices such as cover crops, tillage practices and plant residues. The slope length and steepness (combined LS-factor) is based on the first-ever 25 m Digital Elevation Model (DEM) of Europe. Finally, the support practices factor (P-factor) is modelled for the first time at this scale, taking into account the 270,000 LUCAS earth observations and the Good Agricultural and Environmental Condition (GAEC) requirements that farmers have to follow in Europe. The high-resolution input layers produce the final soil erosion risk map at 100 m resolution and allow policy makers to run future land use, management and climate change scenarios.
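RUSLE combines the six factors named above multiplicatively, A = R × K × LS × C × P; a minimal sketch with hypothetical factor values, including the 15% stone-cover reduction of the K-factor mentioned in the abstract:

```python
# RUSLE estimates average annual soil loss as the product of its factors:
#   A = R * K * LS * C * P
# All factor values below are hypothetical, for illustration only.
R  = 700.0   # rainfall erosivity (MJ mm ha^-1 h^-1 yr^-1)
K  = 0.03    # soil erodibility (t ha h ha^-1 MJ^-1 mm^-1)
LS = 1.2     # slope length and steepness (dimensionless)
C  = 0.25    # cover-management (dimensionless)
P  = 0.9     # support practices (dimensionless)

A = R * K * LS * C * P
print(f"Estimated soil loss: {A:.2f} t/ha/yr")

# Effect of the 15% stone-cover reduction of K mentioned above:
A_stony = R * (K * 0.85) * LS * C * P
print(f"With 15% stone-cover reduction of K: {A_stony:.2f} t/ha/yr")
```

In the pan-European assessment each factor is a 100 m gridded layer, so the same multiplication is applied cell by cell.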
NASA Astrophysics Data System (ADS)
Nord, G.; Braud, I.; Boudevillain, B.; Gérard, S.; Molinié, G.; Vandervaere, J. P.; Huza, J.; Le Coz, J.; Dramais, G.; Legout, C.; Berne, A.; Grazioli, J.; Raupach, T.; Van Baelen, J.; Wijbrans, A.; Delrieu, G.; Andrieu, J.; Caliano, M.; Aubert, C.; Teuling, R.; Le Boursicaud, R.; Branger, F.; Vincendon, B.; Horner, I.
2014-12-01
A comprehensive hydrometeorological dataset is presented spanning the period 1 Jan 2011-31 Dec 2014 to improve the understanding and simulation of the hydrological processes leading to flash floods in a mesoscale catchment (Auzon, 116 km2) of the Mediterranean region. The specificity of the dataset is its high space-time resolution, especially concerning rainfall and the hydrological response which is particularly adapted to the highly spatially variable rainfall events that may occur in this region. This type of dataset is rare in scientific literature because of the quantity and type of sensors for meteorology and surface hydrology. Rainfall data include continuous precipitation measured by rain-gages (5 min time step for the research network of 21 rain-gages and 1h time step for the operational network of 9 rain-gages), S-band Doppler dual-polarization radar (1 km2, 5 min resolution), and disdrometers (11 sensors working at 1 min time step). During the special observation period (SOP-1) and enhanced observation period (Sep-Dec 2012, Sep-Dec 2013) of the HyMeX (Hydrological Cycle in the Mediterranean Experiment) project, two X-band radars provided precipitation measurements at very fine spatial and temporal scales (1 ha, 5 min). Meteorological data are taken from the operational surface weather observation stations of Meteo France at the hourly time resolution (6 stations in the region of interest). The monitoring of surface hydrology and suspended sediment is multi-scale and based on nested catchments. Three hydrometric stations measure water discharge and additional physico-chemical variables at a 2-10 min time resolution. Two experimental plots monitor overland flow and erosion at 1 min time resolution on a hillslope with vineyard. A network of 11 gauges continuously measures water level and temperature in headwater subcatchments at a time resolution of 2-5 min. 
A network of soil moisture sensors enables the continuous measurement of soil volumetric water content at 20 min time resolution at 9 sites. Additionally, opportunistic observations (soil moisture measurements and stream gauging) were performed during floods between 2012 and 2014. The data are appropriate for understanding rainfall variability, improving areal rainfall estimation, and advancing distributed hydrological modelling.
NASA Astrophysics Data System (ADS)
Garcia Galiano, S. G.; Giraldo Osorio, J. D.; Nguyen, P.; Hsu, K. L.; Braithwaite, D.; Olmos, P.; Sorooshian, S.
2015-12-01
Studying Spain's long-term variability and changing trends in rainfall, given its unique position in the Mediterranean basin (i.e., the latitudinal gradient from north to south and its orographic variation), can provide valuable insight into how the hydroclimatology of the region has changed. A recently released high-resolution satellite-based global daily precipitation climate dataset, PERSIANN-CDR (Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks - Climate Data Record), provided the opportunity to conduct such a study. It covers the period 01/01/1983 to date, at 0.25° resolution. In areas without a dense network of rain-gauges, the PERSIANN-CDR dataset could be useful for assessing the reliability of regional climate models (RCMs), in order to build a robust RCM ensemble for reducing the uncertainties in climate and hydrological projections. However, before using this dataset for RCM evaluation, an assessment of the performance of PERSIANN-CDR against in-situ observations is necessary. The high-resolution gridded daily rain-gauge dataset named Spain02 was employed in this study. The variable Dry Spell Length (DSL), considering 1 mm and 10 mm as thresholds of daily rainfall, and the time period 1988-2007 were defined for the study. A procedure for improving the consistency and homogeneity between the two datasets was applied. The assessment is based on distributional similarity, and well-known statistical tests (the two-sample Kolmogorov-Smirnov and Chi-Square tests) are used as fitting criteria. The results demonstrate a good fit of PERSIANN-CDR over the whole of Spain for the 10 mm/day threshold. However, for the 1 mm/day threshold, PERSIANN-CDR compares well with the Spain02 dataset in areas with high rainfall values (north of Spain), while in semiarid areas (southeast of Spain) there is strong overestimation of short DSLs. Overall, PERSIANN-CDR demonstrates its robustness in the simulation of DSLs for the highest thresholds.
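The two-sample Kolmogorov-Smirnov statistic used as a fitting criterion is the maximum distance between the two empirical CDFs; a minimal numpy sketch on synthetic dry-spell-length samples (the real comparison uses the PERSIANN-CDR and Spain02 DSL distributions, and a full test would also compute a p-value):

```python
import numpy as np

def ks_two_sample(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum distance between
    the empirical CDFs of the two samples (statistic only, no p-value)."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    # Empirical CDF of each sample evaluated on the pooled sample grid
    cdf_a = np.searchsorted(a, grid, side="right") / a.size
    cdf_b = np.searchsorted(b, grid, side="right") / b.size
    return np.abs(cdf_a - cdf_b).max()

rng = np.random.default_rng(0)
# Hypothetical dry-spell-length samples from two datasets (days)
dsl_sat   = rng.exponential(scale=5.0, size=500)
dsl_gauge = rng.exponential(scale=5.0, size=500)
print(ks_two_sample(dsl_sat, dsl_gauge))  # small value: similar distributions
```

`scipy.stats.ks_2samp` provides the same statistic together with a p-value and would normally be preferred in practice.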
Fluid Lensing based Machine Learning for Augmenting Earth Science Coral Datasets
NASA Astrophysics Data System (ADS)
Li, A.; Instrella, R.; Chirayath, V.
2016-12-01
Recently, there has been increased interest in monitoring the effects of climate change upon the world's marine ecosystems, particularly coral reefs. These delicate ecosystems are especially threatened due to their sensitivity to ocean warming and acidification, leading to unprecedented levels of coral bleaching and die-off in recent years. However, current global aquatic remote sensing datasets are unable to quantify changes in marine ecosystems at spatial and temporal scales relevant to their growth. In this project, we employ various supervised and unsupervised machine learning algorithms to augment existing datasets from NASA's Earth Observing System (EOS), using high-resolution airborne imagery. This method utilizes NASA's ongoing airborne campaigns as well as its spaceborne assets to collect remote sensing data over these afflicted regions, and employs Fluid Lensing algorithms to resolve optical distortions caused by the fluid surface, producing cm-scale resolution imagery of these diverse ecosystems from airborne platforms. Support Vector Machines (SVMs) and K-means clustering methods were applied to satellite imagery at 0.5 m resolution, producing segmented maps that classify coral based on percent cover and morphology. Compared to a previous study using multidimensional maximum a posteriori (MAP) estimation to separate these features in high-resolution airborne datasets, SVMs are able to achieve above 75% accuracy when augmented with existing MAP estimates, while unsupervised methods such as K-means achieve roughly 68% accuracy, verified against manually segmented reference data provided by a marine biologist. This effort thus has broad applications for coastal remote sensing, helping marine biologists quantify behavioral trends spanning large areas and longer timescales, and assess the health of coral reefs worldwide.
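The unsupervised branch of such a workflow can be illustrated with a minimal K-means implementation on synthetic "spectral" features; this is a sketch, not the study's pipeline, which also uses SVMs and MAP estimates:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal Lloyd's-algorithm K-means for unsupervised pixel clustering
    (a sketch of the technique, not the study's implementation)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(X.shape[0], k, replace=False)]
    for _ in range(iters):
        # Assign each feature vector to its nearest centre
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # Recompute centres as cluster means (skip empty clusters)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two well-separated synthetic "spectral" clusters (e.g. coral vs sand)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (100, 3)), rng.normal(1, 0.1, (100, 3))])
labels, _ = kmeans(X, k=2)
print(len(set(labels[:100])), len(set(labels[100:])))  # one label per half
```

For real imagery a library implementation (e.g. scikit-learn's `KMeans`) with multiple restarts would be the usual choice.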
An assessment of differences in gridded precipitation datasets in complex terrain
NASA Astrophysics Data System (ADS)
Henn, Brian; Newman, Andrew J.; Livneh, Ben; Daly, Christopher; Lundquist, Jessica D.
2018-01-01
Hydrologic modeling and other geophysical applications are sensitive to precipitation forcing data quality, and there are known challenges in spatially distributing gauge-based precipitation over complex terrain. We conduct a comparison of six high-resolution, daily and monthly gridded precipitation datasets over the Western United States. We compare the long-term average spatial patterns and interannual variability of water-year total precipitation, as well as multi-year trends in precipitation, across the datasets. We find that the greatest absolute differences among datasets occur in high-elevation areas and in the maritime mountain ranges of the Western United States, while the greatest percent differences among datasets relative to annual total precipitation occur in arid and rain-shadowed areas. Differences between datasets in some high-elevation areas exceed 200 mm yr-1 on average, and relative differences range from 5 to 60% across the Western United States. In areas of high topographic relief, true uncertainties and biases are likely higher than the differences among the datasets; we present evidence of this based on streamflow observations. Precipitation trends in the datasets differ in magnitude and sign at smaller scales, and are sensitive to how temporal inhomogeneities in the underlying precipitation gauge data are handled.
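The absolute and percent difference measures used in such comparisons can be sketched as follows (precipitation values are hypothetical); note how arid cells can show large percent differences despite small absolute ones, as the abstract describes:

```python
import numpy as np

# Cell-by-cell differences between two gridded water-year precipitation
# fields (values hypothetical, mm/yr): a wet maritime cell, an arid cell,
# and two intermediate cells.
p_a = np.array([[2000.0, 150.0], [800.0, 300.0]])
p_b = np.array([[1780.0, 120.0], [760.0, 330.0]])

abs_diff = np.abs(p_a - p_b)                     # mm/yr
pct_diff = 100.0 * abs_diff / ((p_a + p_b) / 2)  # relative to the cell mean
print(abs_diff)
print(pct_diff.round(1))
```

Here the wet cell has the largest absolute difference (220 mm/yr) but the arid cell has the largest percent difference, mirroring the spatial pattern reported above.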
Digital Astronaut Photography: A Discovery Dataset for Archaeology
NASA Technical Reports Server (NTRS)
Stefanov, William L.
2010-01-01
Astronaut photography acquired from the International Space Station (ISS) using commercial off-the-shelf cameras offers a freely-accessible source for high to very high resolution (4-20 m/pixel) visible-wavelength digital data of Earth. Since ISS Expedition 1 in 2000, over 373,000 images of the Earth-Moon system (including land surface, ocean, atmospheric, and lunar images) have been added to the Gateway to Astronaut Photography of Earth online database (http://eol.jsc.nasa.gov ). Handheld astronaut photographs vary in look angle, time of acquisition, solar illumination, and spatial resolution. These attributes of digital astronaut photography result from a unique combination of ISS orbital dynamics, mission operations, camera systems, and the individual skills of the astronaut. The variable nature of astronaut photography makes the dataset uniquely useful for archaeological applications in comparison with more traditional nadir-viewing multispectral datasets acquired from unmanned orbital platforms. For example, surface features such as trenches, walls, ruins, urban patterns, and vegetation clearing and regrowth patterns may be accentuated by low sun angles and oblique viewing conditions (Fig. 1). High spatial resolution digital astronaut photographs can also be used with sophisticated land cover classification and spatial analysis approaches like Object Based Image Analysis, increasing the potential for use in archaeological characterization of landscapes and specific sites.
Moving towards Hyper-Resolution Hydrologic Modeling
NASA Astrophysics Data System (ADS)
Rouf, T.; Maggioni, V.; Houser, P.; Mei, Y.
2017-12-01
Developing a predictive capability for terrestrial hydrology across landscapes, with water, energy and nutrients as the drivers of these dynamic systems, faces the challenge of scaling meter-scale process understanding to practical modeling scales. Hyper-resolution land surface modeling can provide a framework for addressing science questions that we are not able to answer with coarse modeling scales. In this study, we develop a hyper-resolution forcing dataset from coarser resolution products using a physically based downscaling approach. These downscaling techniques rely on correlations with landscape variables, such as topography, roughness, and land cover. A proof-of-concept has been implemented over the Oklahoma domain, where high-resolution observations are available for validation purposes. Hourly NLDAS (North America Land Data Assimilation System) forcing data (i.e., near-surface air temperature, pressure, and humidity) have been downscaled to 500 m resolution over the study area for 2015-present. Results show that correlation coefficients between the downscaled temperature dataset and ground observations are consistently higher than those between the NLDAS temperature data at their native resolution and ground observations. Not only are the correlation coefficients higher, but the deviation around the 1:1 line in the density scatterplots is also smaller for the downscaled dataset than for the original one with respect to the ground observations. Results are therefore encouraging, as they demonstrate that the 500 m temperature dataset agrees well with the ground information and can be adopted to force the land surface model for soil moisture estimation. The study has been expanded to wind speed and direction, incident longwave and shortwave radiation, pressure, and precipitation. Precipitation is well known to vary dramatically with elevation and orography.
Therefore, we are pursuing a downscaling technique based on both topographical and vegetation characteristics.
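A common physically based ingredient of such temperature downscaling is an elevation adjustment with a lapse rate; the sketch below uses the standard-atmosphere value and hypothetical elevations, not the study's fitted relationships with topography, roughness, and land cover:

```python
import numpy as np

# Topographic downscaling of air temperature with a constant lapse rate
# (a simplified, hypothetical version of the physically based approach above).
LAPSE_RATE = -6.5e-3  # K per metre, standard-atmosphere value

def downscale_temperature(t_coarse, z_coarse, z_fine):
    """Adjust a coarse-cell temperature (K) to fine-grid elevations (m)."""
    return t_coarse + LAPSE_RATE * (z_fine - z_coarse)

# One coarse NLDAS-like cell (mean elevation 300 m) mapped onto a
# fine elevation grid (values in metres; hypothetical)
z_fine = np.array([[250.0, 300.0], [350.0, 500.0]])
t_fine = downscale_temperature(t_coarse=290.0, z_coarse=300.0, z_fine=z_fine)
print(t_fine)  # warmer below, cooler above the coarse-cell mean elevation
```

A fitted, spatially varying lapse rate (or a regression on multiple landscape variables) replaces the constant in more realistic implementations.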
NASA Astrophysics Data System (ADS)
Midekisa, A.; Bennet, A.; Gething, P. W.; Holl, F.; Andrade-Pacheco, R.; Savory, D. J.; Hugh, S. J.
2016-12-01
Spatially detailed and temporally dynamic land use land cover data are necessary to monitor the state of the land surface for various applications. Yet, such data at a continental to global scale are lacking. Here, we developed high-resolution (30 m) annual land use land cover layers for continental Africa using Google Earth Engine. To capture ground-truth training data, high-resolution satellite imagery was visually inspected and used to identify 7,212 sample Landsat pixels that were comprised entirely of one of seven land use land cover classes (water, man-made impervious surface, high biomass, low biomass, rock, sand and bare soil). For model validation purposes, 80% of points from each class were used as training data, with 20% withheld as a validation dataset. Cloud-free Landsat 7 annual composites for 2000 to 2015 were generated, and spectral bands from the Landsat images were then extracted for each of the training and validation sample points. In addition to the Landsat spectral bands, spectral indices such as the normalized difference vegetation index (NDVI) and normalized difference water index (NDWI) were used as covariates in the model. Additionally, calibrated night-time light imagery from the National Oceanic and Atmospheric Administration (NOAA) was included as a covariate. A decision tree classification algorithm was applied to predict the 7 land cover classes for the period 2000 to 2015 using the training dataset. Using the validation dataset, classification accuracy, including omission and commission errors, was computed for each land cover class. Model results showed that the overall classification accuracy was high (88%). This high-resolution land cover product developed for continental Africa will be available for public use and can potentially enhance the ability to monitor and study the state of the Earth's surface.
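The spectral indices used as covariates follow standard band-ratio definitions; the sketch below computes NDVI and NDWI for hypothetical reflectances and applies a toy two-rule decision over them (the study trains a full decision tree on many more covariates):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)

def ndwi(green, nir):
    """Normalized difference water index (McFeeters formulation)."""
    return (green - nir) / (green + nir)

# Hypothetical Landsat surface reflectances for three pixels:
# open water, dense vegetation, bare soil
green = np.array([0.10, 0.05, 0.15])
red   = np.array([0.08, 0.04, 0.20])
nir   = np.array([0.04, 0.50, 0.25])

v, w = ndvi(nir, red), ndwi(green, nir)
# A toy two-rule decision over the indices (illustrative only; the study's
# classifier is trained on bands, indices and night-time lights together)
labels = np.where(w > 0, "water", np.where(v > 0.4, "high biomass", "other"))
print(labels)
```

With thresholds like these, high NDWI flags water and high NDVI flags dense vegetation, which is why both indices are informative covariates for the classifier.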
Evaluation of the Global Land Data Assimilation System (GLDAS) air temperature data products
Ji, Lei; Senay, Gabriel B.; Verdin, James P.
2015-01-01
There is a high demand for agrohydrologic models to use gridded near-surface air temperature data as the model input for estimating regional and global water budgets and cycles. The Global Land Data Assimilation System (GLDAS), developed by combining simulation models with observations, provides a long-term gridded meteorological dataset at the global scale. However, the GLDAS air temperature products have not been comprehensively evaluated, although the accuracy of the products was assessed in limited areas. In this study, the daily 0.25° resolution GLDAS air temperature data are compared with two reference datasets: 1) 1-km-resolution gridded Daymet data (2002 and 2010) for the conterminous United States and 2) global meteorological observations (2000-11) archived from the Global Historical Climatology Network (GHCN). The comparison of the GLDAS datasets with the GHCN datasets, including 13,511 weather stations, indicates a fairly high accuracy of the GLDAS data for daily temperature. The quality of the GLDAS air temperature data, however, is not always consistent across different regions of the world; for example, some areas in Africa and South America show relatively low accuracy. Spatial and temporal analyses reveal a high agreement between the GLDAS and Daymet daily air temperature datasets, although spatial details in high mountainous areas are not sufficiently estimated by the GLDAS data. The evaluation demonstrates that the GLDAS air temperature estimates are generally accurate, but caution should be taken when the data are used in mountainous areas or places with sparse weather stations.
Wainwright, Haruko M; Seki, Akiyuki; Chen, Jinsong; Saito, Kimiaki
2017-02-01
This paper presents a multiscale data integration method to estimate the spatial distribution of air dose rates in the regional scale around the Fukushima Daiichi Nuclear Power Plant. We integrate various types of datasets, such as ground-based walk and car surveys, and airborne surveys, all of which have different scales, resolutions, spatial coverage, and accuracy. This method is based on geostatistics to represent spatial heterogeneous structures, and also on Bayesian hierarchical models to integrate multiscale, multi-type datasets in a consistent manner. The Bayesian method allows us to quantify the uncertainty in the estimates, and to provide the confidence intervals that are critical for robust decision-making. Although this approach is primarily data-driven, it has great flexibility to include mechanistic models for representing radiation transport or other complex correlations. We demonstrate our approach using three types of datasets collected at the same time over Fukushima City in Japan: (1) coarse-resolution airborne surveys covering the entire area, (2) car surveys along major roads, and (3) walk surveys in multiple neighborhoods. Results show that the method can successfully integrate three types of datasets and create an integrated map (including the confidence intervals) of air dose rates over the domain in high resolution. Moreover, this study provides us with various insights into the characteristics of each dataset, as well as radiocaesium distribution. In particular, the urban areas show high heterogeneity in the contaminant distribution due to human activities as well as large discrepancy among different surveys due to such heterogeneity. Copyright © 2016 Elsevier Ltd. All rights reserved.
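At the heart of such multiscale data fusion is weighting each survey by its precision; the sketch below shows the conjugate-Gaussian special case for a single location (survey values and uncertainties are hypothetical, and the paper's Bayesian hierarchical model additionally handles spatial correlation via geostatistics):

```python
import numpy as np

# Precision-weighted fusion of multiscale dose-rate observations at one
# location: a minimal stand-in for the Bayesian hierarchical model above.
# Values and uncertainties are hypothetical (units: microSv/h).
means  = np.array([0.30, 0.42, 0.38])  # airborne, car, walk surveys
sigmas = np.array([0.10, 0.05, 0.02])  # assumed 1-sigma survey errors

w = 1.0 / sigmas**2                       # precision weights
posterior_mean  = np.sum(w * means) / np.sum(w)
posterior_sigma = np.sqrt(1.0 / np.sum(w))
print(f"{posterior_mean:.3f} +/- {posterior_sigma:.3f} microSv/h")
```

The fused estimate leans heavily toward the low-uncertainty walk survey, and the posterior sigma directly supplies the confidence interval that the paper emphasises for decision-making.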
q-Space Upsampling Using x-q Space Regularization.
Chen, Geng; Dong, Bin; Zhang, Yong; Shen, Dinggang; Yap, Pew-Thian
2017-09-01
Acquisition time in diffusion MRI increases with the number of diffusion-weighted images that need to be acquired. Particularly in clinical settings, scan time is limited and only a sparse coverage of the vast q-space is possible. In this paper, we show how non-local self-similar information in the x-q space of diffusion MRI data can be harnessed for q-space upsampling. More specifically, we establish the relationships between signal measurements in x-q space using a patch matching mechanism that caters to unstructured data. We then encode these relationships in a graph and use it to regularize an inverse problem associated with recovering a high q-space resolution dataset from its low-resolution counterpart. Experimental results indicate that the high-resolution datasets reconstructed using the proposed method exhibit greater quality, both quantitatively and qualitatively, than those obtained using conventional methods, such as interpolation using spherical radial basis functions (SRBFs).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lekadir, Karim, E-mail: karim.lekadir@upf.edu; Hoogendoorn, Corné; Armitage, Paul
Purpose: This paper presents a statistical approach for the prediction of trabecular bone parameters from low-resolution multisequence magnetic resonance imaging (MRI) in children, thus addressing the limitations of high-resolution modalities such as HR-pQCT, including the significant exposure of young patients to radiation and the limited applicability of such modalities to peripheral bones in vivo. Methods: A statistical predictive model is constructed from a database of MRI and HR-pQCT datasets, to relate the low-resolution MRI appearance in the cancellous bone to the trabecular parameters extracted from the high-resolution images. The description of the MRI appearance is achieved between subjects by using a collection of feature descriptors, which describe the texture properties inside the cancellous bone, and which are invariant to the geometry and size of the trabecular areas. The predictive model is built by fitting to the training data a nonlinear partial least squares regression between the input MRI features and the output trabecular parameters. Results: Detailed validation based on a sample of 96 datasets shows correlations >0.7 between the trabecular parameters predicted from low-resolution multisequence MRI based on the proposed statistical model and the values extracted from high-resolution HR-pQCT. Conclusions: The obtained results indicate the promise of the proposed predictive technique for the estimation of trabecular parameters in children from multisequence MRI, thus reducing the need for high-resolution radiation-based scans for a fragile population that is under development and growth.
NASA's Earth Science Use of Commercially Available Remote Sensing Datasets: Cover Image
NASA Technical Reports Server (NTRS)
Underwood, Lauren W.; Goward, Samuel N.; Fearon, Matthew G.; Fletcher, Rose; Garvin, Jim; Hurtt, George
2008-01-01
The cover image incorporates high resolution stereo pairs acquired from the DigitalGlobe(R) QuickBird sensor. It shows a digital elevation model of Meteor Crater, Arizona, at approximately 1.3-meter point spacing. Image analysts used the Leica Photogrammetry Suite to produce the DEM. The outside portion was computed from two QuickBird panchromatic scenes acquired in October 2006, while an Optech laser scan dataset was used for the crater's interior elevations. The crater's terrain model and image drape were created in a NASA Constellation Program project focused on simulating lunar surface environments for prototyping and testing lunar surface mission analysis and planning tools. This work exemplifies NASA's Scientific Data Purchase legacy and commercial high resolution imagery applications, as scientists use commercial high resolution data to examine lunar-analog Earth landscapes for advanced planning and trade studies for future lunar surface activities. Other applications include landscape dynamics related to volcanism, hydrologic events, climate change, and ice movement.
A Review of Land-Cover Mapping Activities in Coastal Alabama and Mississippi
Smith, Kathryn E.L.; Nayegandhi, Amar; Brock, John C.
2010-01-01
INTRODUCTION Land-use and land-cover (LULC) data provide important information for environmental management. Data pertaining to land-cover and land-management activities are a common requirement for spatial analyses, such as watershed modeling, climate change, and hazard assessment. In coastal areas, land development, storms, and shoreline modification amplify the need for frequent and detailed land-cover datasets. The northern Gulf of Mexico coastal area is no exception. The impact of severe storms, increases in urban area, dramatic changes in land cover, and loss of coastal-wetland habitat all indicate a vital need for reliable and comparable land-cover data. Four main attributes define a land-cover dataset: the date/time of data collection, the spatial resolution, the type of classification, and the source data. The source data are the foundation dataset used to generate LULC classification and are typically remotely sensed data, such as aerial photography or satellite imagery. These source data have a large influence on the final LULC data product, so much so that one can classify LULC datasets into two general groups: LULC data derived from aerial photography and LULC data derived from satellite imagery. The final LULC data can be converted from one format to another (for instance, vector LULC data can be converted into raster data for analysis purposes, and vice versa), but each subsequent dataset maintains the imprint of the source medium within its spatial accuracy and data features. The source data will also influence the spatial and temporal resolution, as well as the type of classification. The intended application of the LULC data typically defines the type of source data and methodology, with satellite imagery being selected for large landscapes (state-wide, national data products) and repeatability (environmental monitoring and change analysis). 
The coarse spatial scale and lack of refined land-use categories are typical drawbacks to satellite-based land-use classifications. Aerial photography is typically selected for smaller landscapes (watershed-basin scale), for greater definition of the land-use categories, and for increased spatial resolution. Disadvantages of using photography include time-consuming digitization, high costs for imagery collection, and lack of seasonal data. Recently, the availability of high-resolution satellite imagery has generated a new category of LULC data product. These new datasets have similar strengths to the aerial-photo-based LULC in that they possess the potential for refined definition of land-use categories and increased spatial resolution but also have the benefit of satellite-based classifications, such as repeatability for change analysis. LULC classification based on high-resolution satellite imagery is still in the early stages of development but merits greater attention because environmental-monitoring and landscape-modeling programs rely heavily on LULC data. This publication summarizes land-use and land-cover mapping activities for Alabama and Mississippi coastal areas within the U.S. Geological Survey (USGS) Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility Project boundaries. Existing LULC datasets will be described, as well as imagery data sources and ancillary data that may provide ground-truth or satellite training data for a forthcoming land-cover classification. Finally, potential areas for a high-resolution land-cover classification in the Alabama-Mississippi region will be identified.
NASA Astrophysics Data System (ADS)
Nozawa, T.
2016-12-01
Recently, the Japan Aerospace Exploration Agency (JAXA) has developed a new long-term snow cover extent (SCE) product using Advanced Very High Resolution Radiometer (AVHRR) and Moderate Resolution Imaging Spectroradiometer (MODIS) data spanning from the 1980s to date. This new product (JAXA/SCE) has higher spatial resolution and a smaller commission error than the traditional SCE dataset of the National Oceanic and Atmospheric Administration (NOAA/SCE). Continuity of the algorithm is another strong point of JAXA/SCE. According to the new JAXA/SCE dataset, the Eurasian SCE has been retreating significantly since the 1980s, especially in late spring and early summer. Here, we investigate the impacts of early summer Eurasian snow cover change on atmospheric circulation in the Northern mid-latitudes, especially over East Asia, using the new JAXA/SCE dataset and several reanalysis datasets. We will present results on the relationships between the early summer SCE anomaly over Eurasia and changes in atmospheric circulations, such as upper-level zonal jets (changes in strength, position, etc.), over East Asia.
Cluster Active Archive: lessons learnt
NASA Astrophysics Data System (ADS)
Laakso, H. E.; Perry, C. H.; Taylor, M. G.; Escoubet, C. P.; Masson, A.
2010-12-01
The ESA Cluster Active Archive (CAA) was opened to the public in February 2006 after an initial three-year development phase. It provides access (both a web GUI and a command-line tool are available) to the calibrated full-resolution datasets of the four-satellite Cluster mission. The data archive is publicly accessible and suitable for science use and publication by the world-wide scientific community. There are more than 350 datasets from each spacecraft, including high-resolution magnetic and electric DC and AC fields as well as full 3-dimensional electron and ion distribution functions and moments from a few eV to hundreds of keV. The Cluster mission has been in operation since February 2001; although the CAA can provide access to some recent observations, the ingestion of other datasets can be delayed by a few years due to the large and difficult calibration routines of aging detectors. The quality of the datasets is a central concern of the CAA. Having the same instrument on four spacecraft allows cross-instrument comparisons and provides confidence in some of the instrumental calibration parameters. Furthermore, it is highly important that many physical parameters are measured by more than one instrument, which allows extensive and continuous cross-calibration analyses. In addition, some of the instruments can be regarded as absolute or reference measurements for other instruments. The CAA avoids mission-specific acronyms and concepts as much as possible and tends to use more generic terms in describing the datasets and their contents, in order to ease the usage of the CAA data by “non-Cluster” scientists. Currently the CAA has more than 1000 users, and every month more than 150 different users log in to the CAA to plot and/or download observations. The users download about 1 terabyte of data every month.
The CAA has separated the graphical tool from the download tool because full-resolution datasets can be visualized in many ways, so there is no one-to-one correspondence between graphical products and full-resolution datasets. The CAA encourages users to contact the CAA team about all kinds of issues, whether they concern the user interface, the content of the datasets, the quality of the observations, or the provision of new types of services. The CAA runs regular annual reviews of the data products and user services in order to improve the quality and usability of the CAA system for the world-wide user community. The CAA is continuously being upgraded in terms of datasets and services.
Non-model-based correction of respiratory motion using beat-to-beat 3D spiral fat-selective imaging.
Keegan, Jennifer; Gatehouse, Peter D; Yang, Guang-Zhong; Firmin, David N
2007-09-01
To demonstrate the feasibility of retrospective beat-to-beat correction of respiratory motion, without the need for a respiratory motion model. A high-resolution three-dimensional (3D) spiral black-blood scan of the right coronary artery (RCA) of six healthy volunteers was acquired over 160 cardiac cycles without respiratory gating. One spiral interleaf was acquired per cardiac cycle, prior to each of which a complete low-resolution fat-selective 3D spiral dataset was acquired. The respiratory motion (3D translation) on each cardiac cycle was determined by cross-correlating a region of interest (ROI) in the fat around the artery in the low-resolution datasets with that on a reference end-expiratory dataset. The measured translations were used to correct the raw data of the high-resolution spiral interleaves. Beat-to-beat correction provided consistently good results, with the image quality being better than that obtained with a fixed superior-inferior tracking factor of 0.6 and better than (N = 5) or equal to (N = 1) that achieved using a subject-specific retrospective 3D translation motion model. Non-model-based correction of respiratory motion using 3D spiral fat-selective imaging is feasible, and in this small group of volunteers produced better-quality images than a subject-specific retrospective 3D translation motion model. (c) 2007 Wiley-Liss, Inc.
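The per-beat translation estimate underlying such a correction can be illustrated with a generic FFT-based cross-correlation; this is a simplified stand-in for the paper's ROI-based procedure, and the volumes below are synthetic:

```python
import numpy as np

def translation_3d(reference, moving):
    """Estimate the integer-voxel 3D shift that, applied to `moving`
    (as in np.roll), best aligns it with `reference`, using circular
    FFT-based cross-correlation."""
    cc = np.fft.ifftn(np.fft.fftn(reference) * np.conj(np.fft.fftn(moving))).real
    shift = np.unravel_index(np.argmax(cc), cc.shape)
    # Map shifts beyond half the volume size to negative displacements.
    return tuple(int(s - n) if s > n // 2 else int(s) for s, n in zip(shift, cc.shape))

# Hypothetical low-resolution volume and a copy displaced by (2, -1, 3)
# voxels (simulated here with a circular roll).
rng = np.random.default_rng(1)
reference = rng.normal(size=(16, 16, 16))
moving = np.roll(reference, shift=(-2, 1, -3), axis=(0, 1, 2))
estimated = translation_3d(reference, moving)
```

A subvoxel estimate would require interpolating around the correlation peak, and real data would be cropped to a region of interest around the artery first.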
NASA Astrophysics Data System (ADS)
Labzovskii, Lev D.; Papayannis, Alexandros; Binietoglou, Ioannis; Banks, Robert F.; Baldasano, Jose M.; Toanca, Florica; Tzanis, Chris G.; Christodoulakis, John
2018-02-01
Accurate continuous measurements of relative humidity (RH) vertical profiles in the lower troposphere have become a significant scientific challenge. In recent years, synergies of various ground-based remote sensing instruments have been successfully used for RH vertical profiling, which has resulted in improved spatial resolution and, in some cases, improved measurement accuracy. Some studies have also suggested the use of high-resolution model simulations as input datasets for RH vertical profiling techniques. In this paper we apply two synergetic methods for RH profiling: the synergy of lidar with a microwave radiometer, and the synergy of lidar with high-resolution atmospheric modeling. The two methods are employed for RH retrieval between 100 and 6000 m with increased spatial resolution, based on datasets from the HygrA-CD (Hygroscopic Aerosols to Cloud Droplets) campaign conducted in Athens, Greece, from May to June 2014. RH profiles from the synergetic methods are then compared with those retrieved using single instruments or simulated by high-resolution models. Our proposed technique for RH profiling improves the statistical agreement with reference radiosoundings by 27 % when the lidar-radiometer approach is used (in comparison with radiometer measurements) and by 15 % when the lidar-model approach is used (in comparison with WRF model simulations). The mean uncertainty of RH due to temperature bias in RH profiling was ~4.34 % for the lidar-radiometer method and ~1.22 % for the lidar-model method. However, the maximum uncertainty in RH retrievals due to temperature bias showed that the lidar-model method is more reliable at heights greater than 2000 m. Overall, our results demonstrate the capability of both combined methods for daytime measurements at heights between 100 and 6000 m when lidar-radiometer or lidar-WRF combined datasets are available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levin, Barnaby D. A.; Padgett, Elliot; Chen, Chien-Chun
Electron tomography in materials science has flourished with the demand to characterize nanoscale materials in three dimensions (3D). Access to experimental data is vital for developing and validating reconstruction methods that improve resolution and reduce radiation dose requirements. This work presents five high-quality scanning transmission electron microscope (STEM) tomography datasets in order to address the critical need for open access data in this field. The datasets represent the current limits of experimental technique, are of high quality, and contain materials with structural complexity. Included are tomographic series of a hyperbranched Co2P nanocrystal, platinum nanoparticles on a carbon nanofibre imaged over the complete 180° tilt range, a platinum nanoparticle and a tungsten needle both imaged at atomic resolution by equal slope tomography, and a through-focal tilt series of PtCu nanoparticles. A volumetric reconstruction from every dataset is provided for comparison and development of post-processing and visualization techniques. Researchers interested in creating novel data processing and reconstruction algorithms will now have access to state of the art experimental test data.
A high-resolution 7-Tesla fMRI dataset from complex natural stimulation with an audio movie
Hanke, Michael; Baumgartner, Florian J.; Ibe, Pierre; Kaule, Falko R.; Pollmann, Stefan; Speck, Oliver; Zinke, Wolf; Stadler, Jörg
2014-01-01
Here we present a high-resolution functional magnetic resonance (fMRI) dataset – 20 participants recorded at high field strength (7 Tesla) during prolonged stimulation with an auditory feature film (“Forrest Gump”). In addition, a comprehensive set of auxiliary data (T1w, T2w, DTI, susceptibility-weighted image, angiography) as well as measurements to assess technical and physiological noise components have been acquired. An initial analysis confirms that these data can be used to study common and idiosyncratic brain response patterns to complex auditory stimulation. Among the potential uses of this dataset are the study of auditory attention and cognition, language and music perception, and social perception. The auxiliary measurements enable a large variety of additional analysis strategies that relate functional response patterns to structural properties of the brain. Alongside the acquired data, we provide source code and detailed information on all employed procedures – from stimulus creation to data analysis. In order to facilitate replicative and derived works, only free and open-source software was utilized. PMID:25977761
NASA Astrophysics Data System (ADS)
Senanayake, I. P.; Yeo, I. Y.; Tangdamrongsub, N.; Willgoose, G. R.; Hancock, G. R.; Wells, T.; Fang, B.; Lakshmi, V.
2017-12-01
Long-term soil moisture datasets at high spatial resolution are important in agricultural, hydrological, and climatic applications. Soil moisture estimates can be obtained from satellite remote sensing observations. However, satellite soil moisture data are typically available at coarse spatial resolutions (several tens of km) and therefore require further downscaling. Different satellite soil moisture products have to be conjointly employed in developing a consistent time series of high-resolution soil moisture, while the discrepancies amongst different satellite retrievals need to be resolved. This study aims to downscale three different satellite soil moisture products, the Soil Moisture and Ocean Salinity (SMOS, 25 km), the Soil Moisture Active Passive (SMAP, 36 km) and the SMAP-Enhanced (9 km), and to conduct an inter-comparison of the downscaled results. The downscaling approach is based on the relationship between the diurnal temperature difference and the daily mean soil moisture content. The approach is applied to two sub-catchments (Krui and Merriwa River) of the Goulburn River catchment in the Upper Hunter region (NSW, Australia) to estimate soil moisture at 1 km resolution for 2015. The three coarse-resolution soil moisture products and their downscaled results will be validated with in-situ observations obtained from the Scaling and Assimilation of Soil Moisture and Streamflow (SASMAS) network. The spatial and temporal patterns of the downscaled results will also be analysed. This study will provide the necessary insights for data selection and bias correction to maintain the consistency of a long-term high-resolution soil moisture dataset. The results will assist in developing a time series of high-resolution soil moisture data over south-eastern Australia.
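The core of such a downscaling approach, fitting a relationship between diurnal temperature difference and daily mean soil moisture at the coarse scale and then evaluating it at the fine scale, can be sketched as follows. A linear relation and synthetic numbers are assumed purely for illustration; the study's actual regression form is not specified here:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical coarse-pixel samples: diurnal temperature difference dT (K)
# and daily mean soil moisture (m3/m3). Larger dT is assumed to indicate
# drier soil, hence the negative synthetic slope.
dT_coarse = rng.uniform(5.0, 15.0, size=200)
sm_coarse = 0.40 - 0.02 * dT_coarse + 0.005 * rng.normal(size=200)

# Fit the regression on the coarse grid ...
slope, intercept = np.polyfit(dT_coarse, sm_coarse, 1)

# ... then apply it to fine-scale (e.g. 1 km) diurnal temperature
# differences to estimate fine-scale soil moisture.
dT_fine = np.array([6.0, 10.0, 14.0])
sm_fine = intercept + slope * dT_fine
```

The fine-scale estimates would then be bias-corrected against the coarse pixel average and validated against in-situ probes.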
Comparison and validation of gridded precipitation datasets for Spain
NASA Astrophysics Data System (ADS)
Quintana-Seguí, Pere; Turco, Marco; Míguez-Macho, Gonzalo
2016-04-01
In this study, two gridded precipitation datasets are compared and validated in Spain: the recently developed SAFRAN dataset and the Spain02 dataset. These are validated using rain gauges, and they are also compared to the low-resolution ERA-Interim reanalysis. The SAFRAN precipitation dataset was produced recently using the SAFRAN meteorological analysis, which is extensively used in France (Durand et al. 1993, 1999; Quintana-Seguí et al. 2008; Vidal et al., 2010) and which has recently been applied to Spain (Quintana-Seguí et al., 2015). SAFRAN uses an optimal interpolation (OI) algorithm and uses all available rain gauges from the Spanish State Meteorological Agency (Agencia Estatal de Meteorología, AEMET). The product has a spatial resolution of 5 km and spans from September 1979 to August 2014. This dataset has been produced mainly to be used in large scale hydrological applications. Spain02 (Herrera et al. 2012, 2015) is another high quality precipitation dataset for Spain based on a dense network of quality-controlled stations, and it has different versions at different resolutions. In this study we used the version with a resolution of 0.11°. The product spans from 1971 to 2010. Spain02 is well tested and widely used, mainly, but not exclusively, for RCM model validation and statistical downscaling. ERA-Interim is a well-known global reanalysis with a spatial resolution of ~79 km. It has been included in the comparison because it is a widely used product for continental- and global-scale studies, as well as for smaller scale studies in data-poor countries. Thus, its comparison with higher resolution products of a data-rich country, such as Spain, allows us to quantify the errors made when using such datasets for national-scale studies, in line with some of the objectives of the EU-FP7 eartH2Observe project. The comparison shows that SAFRAN and Spain02 perform similarly, even though their underlying principles are different.
Both products are largely better than ERA-Interim, which has a much coarser representation of the relief, which is crucial for precipitation. These results are a contribution to the Spanish Case Study of the eartH2Observe project, which is focused on the simulation of drought processes in Spain using Land-Surface Models (LSM). This study will also be helpful in the Spanish MARCO project, which aims at improving the ability of RCMs to simulate hydrometeorological extremes.
Wave equation datuming applied to marine OBS data and to land high resolution seismic profiling
NASA Astrophysics Data System (ADS)
Barison, Erika; Brancatelli, Giuseppe; Nicolich, Rinaldo; Accaino, Flavio; Giustiniani, Michela; Tinivella, Umberta
2011-03-01
One key step in seismic data processing flows is the computation of static corrections, which relocate shots and receivers to the same datum plane and remove near-surface weathering effects. We applied a standard static correction and a wave equation datuming and compared the obtained results in two case studies: 1) a sparse ocean bottom seismometer dataset for deep crustal prospecting; 2) a high resolution land reflection dataset for hydrogeological investigation. In both cases, a detailed velocity field, obtained by tomographic inversion of the first breaks, was adopted to relocate shots and receivers to the datum plane. The results emphasize the importance of wave equation datuming to properly handle complex near-surface conditions. In the first dataset, the deployed ocean bottom seismometers were relocated to sea level (shot positions) and a standard processing sequence was subsequently applied to the output. In the second dataset, the application of wave equation datuming allowed us to remove coherent noise, such as ground roll, and to improve the image quality with respect to the application of static correction. The comparison of the two approaches shows that the main reflecting markers are better resolved when the wave equation datuming procedure is adopted.
Hu, Hao; Hong, Xingchen; Terstriep, Jeff; Liu, Yan; Finn, Michael P.; Rush, Johnathan; Wendel, Jeffrey; Wang, Shaowen
2016-01-01
Geospatial data, often embedded with geographic references, are important to many application and science domains, and represent a major type of big data. The increased volume and diversity of geospatial data have caused serious usability issues for researchers in various scientific domains, which call for innovative cyberGIS solutions. To address these issues, this paper describes a cyberGIS community data service framework to facilitate geospatial big data access, processing, and sharing based on a hybrid supercomputer architecture. Through the collaboration between the CyberGIS Center at the University of Illinois at Urbana-Champaign (UIUC) and the U.S. Geological Survey (USGS), a community data service for accessing, customizing, and sharing digital elevation model (DEM) and its derived datasets from the 10-meter national elevation dataset, namely TopoLens, is created to demonstrate the workflow integration of geospatial big data sources, computation, analysis needed for customizing the original dataset for end user needs, and a friendly online user environment. TopoLens provides online access to precomputed and on-demand computed high-resolution elevation data by exploiting the ROGER supercomputer. The usability of this prototype service has been acknowledged in community evaluation.
NASA Technical Reports Server (NTRS)
Ruane, Alex C.; Goldberg, Richard; Chryssanthacopoulos, James
2014-01-01
The AgMERRA and AgCFSR climate forcing datasets provide daily, high-resolution, continuous, meteorological series over the 1980-2010 period designed for applications examining the agricultural impacts of climate variability and climate change. These datasets combine daily resolution data from retrospective analyses (the Modern-Era Retrospective Analysis for Research and Applications, MERRA, and the Climate Forecast System Reanalysis, CFSR) with in situ and remotely-sensed observational datasets for temperature, precipitation, and solar radiation, leading to substantial reductions in bias in comparison to a network of 2324 agricultural-region stations from the Hadley Integrated Surface Dataset (HadISD). Results compare favorably against the original reanalyses as well as the leading climate forcing datasets (Princeton, WFD, WFD-EI, and GRASP), and AgMERRA distinguishes itself with substantially improved representation of daily precipitation distributions and extreme events owing to its use of the MERRA-Land dataset. These datasets also peg relative humidity to the maximum temperature time of day, allowing for more accurate representation of the diurnal cycle of near-surface moisture in agricultural models. AgMERRA and AgCFSR enable a number of ongoing investigations in the Agricultural Model Intercomparison and Improvement Project (AgMIP) and related research networks, and may be used to fill gaps in historical observations as well as a basis for the generation of future climate scenarios.
A downscaled 1 km dataset of daily Greenland ice sheet surface mass balance components (1958-2014)
NASA Astrophysics Data System (ADS)
Noel, B.; Van De Berg, W. J.; Fettweis, X.; Machguth, H.; Howat, I. M.; van den Broeke, M. R.
2015-12-01
The current spatial resolution in regional climate models (RCMs), typically around 5 to 20 km, remains too coarse to accurately reproduce the spatial variability in surface mass balance (SMB) components over the narrow ablation zones, marginal outlet glaciers and neighbouring ice caps of the Greenland ice sheet (GrIS). In these topographically rough terrains, the SMB components are highly dependent on local variations in topography. However, the relatively low-resolution elevation and ice mask prescribed in RCMs contribute to a significant underestimation of melt and runoff in these regions due to unresolved valley glaciers and fjords. Therefore, near-km resolution topography is essential to better capture SMB variability in these spatially restricted regions. We present a 1 km resolution dataset of daily GrIS SMB covering the period 1958-2014, which is statistically downscaled from data of the polar regional climate model RACMO2.3 at 11 km, using an elevation dependence. The dataset includes all individual SMB components projected on the elevation and ice mask from the GIMP DEM, down-sampled to 1 km. Daily runoff and sublimation are interpolated to the 1 km topography using a local regression to elevation valid for each day specifically; daily precipitation is bi-linearly downscaled without elevation corrections. The daily SMB dataset is then reconstructed by summing downscaled precipitation, sublimation and runoff. High-resolution elevation and ice mask allow for properly resolving the narrow ablation zones and valley glaciers at the GrIS margins, leading to a significant increase in runoff estimates. In these regions, and especially over narrow glacier tongues, the downscaled products improve on the original RACMO2.3 outputs by better representing local SMB patterns through a gradual ablation increase towards the GrIS margins.
We discuss the impact of downscaling on the SMB components in a case study for a spatially restricted region, where large elevation discrepancies are observed between both resolutions. Owing to generally enhanced runoff in the GrIS ablation zone, the evaluation of daily downscaled SMB against ablation measurements, collected at in-situ measuring sites derived from a newly compiled ablation dataset, shows a better agreement with observations relative to native RACMO2.3 SMB at 11 km.
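The reconstruction described above, a per-day elevation regression for runoff, precipitation without elevation correction, and SMB assembled from the components (taking sublimation and runoff as mass losses), might look schematically like this. All numbers are invented; the real product uses RACMO2.3 fields and the GIMP DEM:

```python
import numpy as np

def downscale_runoff(elev_coarse, runoff_coarse, elev_fine):
    """Downscale one day of runoff with a linear regression against
    elevation, fitted on the coarse grid and evaluated on the fine grid."""
    slope, intercept = np.polyfit(elev_coarse, runoff_coarse, 1)
    return np.maximum(intercept + slope * elev_fine, 0.0)  # no negative runoff

# Hypothetical coarse (11 km) cells and fine (1 km) cells for one day.
elev_coarse = np.array([200.0, 600.0, 1000.0, 1400.0])   # m a.s.l.
runoff_coarse = np.array([40.0, 28.0, 16.0, 4.0])        # mm w.e. per day
elev_fine = np.array([100.0, 500.0, 900.0, 1300.0, 1700.0])

runoff_fine = downscale_runoff(elev_coarse, runoff_coarse, elev_fine)

# Precipitation is interpolated without elevation correction; sublimation is
# downscaled like runoff (held constant here for brevity). Daily SMB is then
# reconstructed from the components.
precip_fine = np.full_like(elev_fine, 3.0)
subl_fine = np.full_like(elev_fine, 0.2)
smb_fine = precip_fine - subl_fine - runoff_fine
```

Low-elevation fine cells inherit large runoff (strongly negative SMB), while the highest cell is runoff-free, mirroring the sharpened ablation gradient the downscaled product aims to capture.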
Three-dimensional scanning transmission electron microscopy of biological specimens.
de Jonge, Niels; Sougrat, Rachid; Northan, Brian M; Pennycook, Stephen J
2010-02-01
A three-dimensional (3D) reconstruction of the cytoskeleton and a clathrin-coated pit in mammalian cells has been achieved from a focal-series of images recorded in an aberration-corrected scanning transmission electron microscope (STEM). The specimen was a metallic replica of the biological structure comprising Pt nanoparticles 2-3 nm in diameter, with a high stability under electron beam radiation. The 3D dataset was processed by an automated deconvolution procedure. The lateral resolution was 1.1 nm, set by pixel size. Particles differing by only 10 nm in vertical position were identified as separate objects with greater than 20% dip in contrast between them. We refer to this value as the axial resolution of the deconvolution or reconstruction, the ability to recognize two objects, which were unresolved in the original dataset. The resolution of the reconstruction is comparable to that achieved by tilt-series transmission electron microscopy. However, the focal-series method does not require mechanical tilting and is therefore much faster. 3D STEM images were also recorded of the Golgi ribbon in conventional thin sections containing 3T3 cells with a comparable axial resolution in the deconvolved dataset.
Hydrologic Derivatives for Modeling and Analysis—A new global high-resolution database
Verdin, Kristine L.
2017-07-17
The U.S. Geological Survey has developed a new global high-resolution hydrologic derivative database. Loosely modeled on the HYDRO1k database, this new database, entitled Hydrologic Derivatives for Modeling and Analysis, provides comprehensive and consistent global coverage of topographically derived raster layers (digital elevation model data, flow direction, flow accumulation, slope, and compound topographic index) and vector layers (streams and catchment boundaries). The coverage of the data is global, and the underlying digital elevation model is a hybrid of three datasets: HydroSHEDS (Hydrological data and maps based on SHuttle Elevation Derivatives at multiple Scales), GMTED2010 (Global Multi-resolution Terrain Elevation Data 2010), and the SRTM (Shuttle Radar Topography Mission). For most of the globe south of 60°N., the raster resolution of the data is 3 arc-seconds, corresponding to the resolution of the SRTM. For the areas north of 60°N., the resolution is 7.5 arc-seconds (the highest resolution of the GMTED2010 dataset) except for Greenland, where the resolution is 30 arc-seconds. The streams and catchments are attributed with Pfafstetter codes, based on a hierarchical numbering system, that carry important topological information. This database is appropriate for use in continental-scale modeling efforts. The work described in this report was conducted by the U.S. Geological Survey in cooperation with the National Aeronautics and Space Administration Goddard Space Flight Center.
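The topological payoff of the Pfafstetter attribution described above can be sketched in a few lines: a digit-wise comparison of two basin codes tells whether one basin drains through the other. The rule below is one common formulation (at the first differing digit, the upstream code's digit must be larger and every remaining digit of the downstream code must be odd, i.e., an interbasin at each deeper level); the function name and examples are illustrative.

```python
def drains_through(a: str, b: str) -> bool:
    """True if water leaving basin `a` flows downstream through
    basin `b`, per the usual Pfafstetter digit rule."""
    n = min(len(a), len(b))
    i = 0
    while i < n and a[i] == b[i]:
        i += 1                       # skip the shared prefix
    if i == len(a) or i == len(b):
        return False                 # one code nests inside the other
    if int(a[i]) <= int(b[i]):
        return False                 # `a` is not farther upstream
    # `b` must be an interbasin (odd digit) at every deeper level.
    return all(int(d) % 2 == 1 for d in b[i:])

print(drains_through("884", "8839"))  # True: tributary 4 passes interbasin 39
```

No tree traversal or geometry is needed; this is exactly the "important topological information" the codes carry.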
High resolution global gridded data for use in population studies
NASA Astrophysics Data System (ADS)
Lloyd, Christopher T.; Sorichetta, Alessandro; Tatem, Andrew J.
2017-01-01
Recent years have seen substantial growth in openly available satellite and other geospatial data layers, which represent a range of metrics relevant to global human population mapping at fine spatial scales. The specifications of such data differ widely and therefore the harmonisation of data layers is a prerequisite to constructing detailed and contemporary spatial datasets which accurately describe population distributions. Such datasets are vital to measure impacts of population growth, monitor change, and plan interventions. To this end the WorldPop Project has produced an open access archive of 3 and 30 arc-second resolution gridded data. Four tiled raster datasets form the basis of the archive: (i) Viewfinder Panoramas topography clipped to Global ADMinistrative area (GADM) coastlines; (ii) a matching ISO 3166 country identification grid; (iii) a country area grid; and (iv) a slope layer. Further layers include transport networks, landcover, nightlights, precipitation, travel time to major cities, and waterways. The datasets and production methodology are described here. The archive can be downloaded both from the WorldPop Dataverse Repository and the WorldPop Project website.
High resolution global gridded data for use in population studies.
Lloyd, Christopher T; Sorichetta, Alessandro; Tatem, Andrew J
2017-01-31
Recent years have seen substantial growth in openly available satellite and other geospatial data layers, which represent a range of metrics relevant to global human population mapping at fine spatial scales. The specifications of such data differ widely and therefore the harmonisation of data layers is a prerequisite to constructing detailed and contemporary spatial datasets which accurately describe population distributions. Such datasets are vital to measure impacts of population growth, monitor change, and plan interventions. To this end the WorldPop Project has produced an open access archive of 3 and 30 arc-second resolution gridded data. Four tiled raster datasets form the basis of the archive: (i) Viewfinder Panoramas topography clipped to Global ADMinistrative area (GADM) coastlines; (ii) a matching ISO 3166 country identification grid; (iii) a country area grid; and (iv) a slope layer. Further layers include transport networks, landcover, nightlights, precipitation, travel time to major cities, and waterways. The datasets and production methodology are described here. The archive can be downloaded both from the WorldPop Dataverse Repository and the WorldPop Project website.
NASA Astrophysics Data System (ADS)
Flores, A. N.; Smith, K.; LaPorte, P.
2011-12-01
Applications like flood forecasting, military trafficability assessment, and slope stability analysis necessitate the use of models capable of resolving hydrologic states and fluxes at the spatial scales of hillslopes (e.g., tens to hundreds of meters). These models typically require precipitation forcings at spatial scales of kilometers or better and time intervals of hours. Yet in the especially rugged terrain that typifies much of the western US and much of the developing world, precipitation data at these spatiotemporal resolutions are difficult to come by. Ground-based weather radars have significant problems in high-relief settings and are sparsely located, leaving significant gaps in coverage and high uncertainties. Precipitation gages provide accurate data at points but are very sparsely located, and their placement is often not representative, yielding significant coverage gaps in both a spatial and a physiographic sense. Numerical weather prediction efforts have made precipitation data, including critically important information on precipitation phase, available globally and in near real time. However, these datasets present watershed modelers with two problems: (1) the spatial scales of many of these datasets are tens of kilometers or coarser, and (2) the numerical weather models used to generate them include a land surface parameterization that in some circumstances can significantly affect precipitation predictions. We report on the development of a regional precipitation dataset for Idaho that leverages: (1) a dataset derived from a numerical weather prediction model, (2) gages within Idaho that report hourly precipitation data, and (3) a long-term precipitation climatology dataset. Hourly precipitation estimates from the Modern Era Retrospective-analysis for Research and Applications (MERRA) are stochastically downscaled from their native resolution (1/2° × 2/3°) to a resolution of approximately 1 km using a hybrid orographic and statistical model.
Downscaled precipitation realizations are conditioned on hourly observations from reporting gages and then conditioned again on the Parameter-elevation Regressions on Independent Slopes Model (PRISM) at the monthly timescale to reflect orographic precipitation trends common to watersheds of the Western US. While this methodology potentially introduces cross-pollination of errors due to the re-use of precipitation gage data, it nevertheless achieves an ensemble-based precipitation estimate and appropriate measures of uncertainty at a spatiotemporal resolution appropriate for watershed modeling.
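The final conditioning step, forcing the downscaled hourly fields to reproduce a monthly (PRISM-informed) total, can be sketched per month as a cell-by-cell rescaling; array names and shapes here are assumptions for illustration.

```python
import numpy as np

def condition_on_monthly(hourly, monthly_target, eps=1e-9):
    """Rescale an (n_hours, ny, nx) stack of downscaled hourly
    precipitation so each cell's monthly total matches a target
    monthly field (e.g., PRISM-informed); dry cells stay zero."""
    total = hourly.sum(axis=0)                          # (ny, nx) monthly sum
    ratio = np.where(total > eps, monthly_target / np.maximum(total, eps), 0.0)
    return hourly * ratio                               # broadcasts over time
```

Rescaling preserves the downscaled sub-monthly pattern while imposing the orographic monthly climatology, which is the essence of the conditioning described above.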
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, Mathew; Marshall, Matthew J.; Miller, Erin A.
2014-08-26
Understanding the interactions of structured microbial communities known as “biofilms” with other complex matrices is possible through X-ray microtomography imaging of the biofilms. Feature detection and image processing for this type of data focus on efficiently identifying and segmenting biofilms and bacteria in the datasets. The datasets are very large and often require manual intervention due to low contrast between objects and high noise levels. New software is therefore required for the effective interpretation and analysis of the data. This work describes the development and application of tools to analyze and visualize high-resolution X-ray microtomography datasets.
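A minimal sketch of the kind of segmentation step involved, assuming a NumPy volume: threshold above the background level, label connected components, and discard small specks as noise. The real pipeline must cope with far lower contrast; the threshold rule and cutoffs here are illustrative.

```python
import numpy as np
from scipy import ndimage

def segment_volume(vol, k=2.0, min_voxels=10):
    """Threshold a noisy tomography volume at mean + k*std and
    return labeled connected components above a size cutoff."""
    mask = vol > vol.mean() + k * vol.std()
    labels, n = ndimage.label(mask)
    # Measure component sizes and drop those smaller than min_voxels.
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    keep_ids = np.nonzero(sizes >= min_voxels)[0] + 1
    return np.where(np.isin(labels, keep_ids), labels, 0)
```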
NASA Astrophysics Data System (ADS)
Barbarossa, Valerio; Huijbregts, Mark A. J.; Beusen, Arthur H. W.; Beck, Hylke E.; King, Henry; Schipper, Aafke M.
2018-03-01
Streamflow data are highly relevant for a variety of socio-economic as well as ecological analyses and applications, but a high-resolution global streamflow dataset is still lacking. We created FLO1K, a consistent streamflow dataset at a resolution of 30 arc seconds (~1 km) and global coverage. FLO1K comprises mean, maximum and minimum annual flow for each year in the period 1960-2015, provided as spatially continuous gridded layers. We mapped streamflow by means of artificial neural network (ANN) regression. An ensemble of ANNs was fitted on monthly streamflow observations from 6600 monitoring stations worldwide; minimum and maximum annual flows thus represent the lowest and highest mean monthly flows for a given year. As covariates we used upstream-catchment physiography (area, surface slope, elevation) and year-specific climatic variables (precipitation, temperature, potential evapotranspiration, aridity index and seasonality indices). Confronting the maps with independent data indicated good agreement (R2 values up to 91%). FLO1K delivers essential data for freshwater ecology and water resources analyses at a global scale and yet high spatial resolution.
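As an illustration of the ANN-regression mechanics (not the FLO1K model itself), here is a tiny one-hidden-layer network fitted by full-batch gradient descent to synthetic covariates; all names, sizes and the toy target are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the covariates (e.g., area, slope, elevation,
# precipitation); the target is a toy nonlinear function of them.
X = rng.normal(size=(200, 4))
y = (1.5 * X[:, 0] + 0.5 * X[:, 1] ** 2).reshape(-1, 1)

# One-hidden-layer MLP regressor trained by gradient descent.
W1 = rng.normal(scale=0.5, size=(4, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

lr, losses = 0.05, []
for _ in range(500):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    pred = h @ W2 + b2                  # network output
    err = pred - y
    losses.append(float((err ** 2).mean()))
    g2 = 2 * err / len(X)               # d(MSE)/d(pred)
    gW2, gb2 = h.T @ g2, g2.sum(0)
    gz = (g2 @ W2.T) * (1 - h ** 2)     # backprop through tanh
    gW1, gb1 = X.T @ gz, gz.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
```

An ensemble, as in the paper, would repeat this fit with different initializations and average the predictions.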
Yang, Y X; Teo, S-K; Van Reeth, E; Tan, C H; Tham, I W K; Poh, C L
2015-08-01
Accurate visualization of lung motion is important in many clinical applications, such as radiotherapy of lung cancer. Advances in imaging modalities [e.g., computed tomography (CT) and MRI] have allowed dynamic imaging of lung and lung tumor motion. However, each imaging modality has its advantages and disadvantages. The study presented in this paper aims at generating a synthetic 4D-CT dataset for lung cancer patients by combining the continuous three-dimensional (3D) motion captured by 4D-MRI with the high spatial resolution captured by CT, using the authors' proposed approach. A novel hybrid approach based on deformable image registration (DIR) and finite element method simulation was developed to fuse a static 3D-CT volume (acquired under breath-hold) with the 3D motion information extracted from a 4D-MRI dataset, creating a synthetic 4D-CT dataset. The study focuses on imaging of the lung and lung tumors. When the synthetic 4D-CT datasets were compared with the acquired 4D-CT datasets of six lung cancer patients on the basis of 420 landmarks, accurate results (average error <2 mm) were achieved with the proposed approach. The hybrid approach achieved a 40% error reduction (based on landmark assessment) over using only DIR techniques. The synthetic 4D-CT dataset generated has high spatial resolution, shows excellent lung detail, and is able to show movement of the lung and lung tumor over multiple breathing cycles.
2016-01-01
Background: Metabarcoding is becoming a common tool used to assess and compare the diversity of organisms in environmental samples. Identification of OTUs is one of the critical steps in the process, and several taxonomy assignment methods have been proposed to accomplish this task. This publication evaluates the quality of reference datasets, along with several alignment and phylogeny inference methods, used in one of the taxonomy assignment methods, called the tree-based approach. This approach assigns anonymous OTUs to taxonomic categories based on the relative placements of OTUs and reference sequences on the cladogram and the support that these placements receive. New information: In the tree-based taxonomy assignment approach, reliable identification of anonymous OTUs is based on their placement in monophyletic and highly supported clades together with identified reference taxa. It therefore requires a high-quality reference dataset. The resolution of phylogenetic trees is strongly affected by the presence of erroneous sequences as well as by the alignment and phylogeny inference methods used in the process. Two preparation steps are essential for the successful application of the tree-based taxonomy assignment approach. First, curated collections of genetic information do include erroneous sequences; these have a detrimental effect on the resolution of the cladograms used in the tree-based approach and must be identified and excluded from the reference dataset beforehand. Second, various combinations of multiple sequence alignment and phylogeny inference methods provide cladograms with different topologies and bootstrap support; these combinations need to be tested in order to determine the one that gives the highest resolution for the particular reference dataset. Completing the above preparation steps is expected to decrease the number of unassigned OTUs and thus improve the results of the tree-based taxonomy assignment approach. PMID:27932919
Holovachov, Oleksandr
2016-01-01
Metabarcoding is becoming a common tool used to assess and compare the diversity of organisms in environmental samples. Identification of OTUs is one of the critical steps in the process, and several taxonomy assignment methods have been proposed to accomplish this task. This publication evaluates the quality of reference datasets, along with several alignment and phylogeny inference methods, used in one of the taxonomy assignment methods, called the tree-based approach. This approach assigns anonymous OTUs to taxonomic categories based on the relative placements of OTUs and reference sequences on the cladogram and the support that these placements receive. In the tree-based taxonomy assignment approach, reliable identification of anonymous OTUs is based on their placement in monophyletic and highly supported clades together with identified reference taxa. It therefore requires a high-quality reference dataset. The resolution of phylogenetic trees is strongly affected by the presence of erroneous sequences as well as by the alignment and phylogeny inference methods used in the process. Two preparation steps are essential for the successful application of the tree-based taxonomy assignment approach. Curated collections of genetic information do include erroneous sequences. These sequences have a detrimental effect on the resolution of the cladograms used in the tree-based approach. They must be identified and excluded from the reference dataset beforehand. Various combinations of multiple sequence alignment and phylogeny inference methods provide cladograms with different topologies and bootstrap support. These combinations of methods need to be tested in order to determine the one that gives the highest resolution for the particular reference dataset. Completing the above-mentioned preparation steps is expected to decrease the number of unassigned OTUs and thus improve the results of the tree-based taxonomy assignment approach.
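The decision rule of the tree-based approach can be caricatured in a few lines: an OTU inherits a taxon name when it falls in a sufficiently supported clade whose identified reference members are taxonomically uniform. The clade encoding, the `ref:` label convention, and the support threshold below are all illustrative; processing the most nested clades first yields the most specific assignment.

```python
def assign_otus(clades, min_support=0.8):
    """Toy tree-based assignment. `clades` is a list of
    (bootstrap_support, set_of_tip_labels) pairs, ordered from most
    nested to least nested; reference tips look like 'ref:Taxon'."""
    assignments = {}
    for support, tips in clades:
        if support < min_support:
            continue                       # poorly supported clade: skip
        taxa = {t.split(":", 1)[1] for t in tips if t.startswith("ref:")}
        if len(taxa) == 1:                 # references agree on one taxon
            label = taxa.pop()
            for otu in (t for t in tips if not t.startswith("ref:")):
                assignments.setdefault(otu, label)
    return assignments
```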
The optimization of high resolution topographic data for 1D hydrodynamic models
NASA Astrophysics Data System (ADS)
Ales, Ronovsky; Michal, Podhoranyi
2016-06-01
The main focus of our research presented in this paper is to optimize and use high-resolution topographical data (HRTD) for hydrological modelling. The optimization is done by generating an adaptive mesh: we measure the distance between a coarse mesh and the surface of the dataset and adapt the mesh so that the geometry stays as close to the initial resolution as possible. The technique described in this paper enables the computation of very accurate 1D hydrodynamic models. We use the HEC-RAS software as a solver. For comparison, we consider the number of generated cells/grid elements (in the whole discretization domain and in selected cross sections) with respect to preserving the accuracy of the computational domain. Generation of the mesh for hydrodynamic modelling depends strongly on domain size and domain resolution. The topographical dataset used in this paper was created by LiDAR surveying and captures a 5.9 km long section of the catchment of the river Olše. We studied crucial changes in topography for the generated mesh. The assessment was done with commonly used statistical and visualization methods.
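The mesh-adaptation idea, keeping the geometry close to the high-resolution surface with as few nodes as possible, can be sketched for a single 1D cross section. This greedy insertion scheme (akin to Douglas-Peucker simplification) is an illustration, not the authors' exact algorithm; the tolerance is an assumed value in metres.

```python
import numpy as np

def adapt_cross_section(x_hr, z_hr, tol=0.05, max_iter=50):
    """Start from the two endpoints and repeatedly insert the
    high-resolution point farthest (vertically) from the current
    piecewise-linear cross section, until the worst deviation
    drops below `tol`. Returns the kept point indices."""
    keep = [0, len(x_hr) - 1]
    for _ in range(max_iter):
        ks = sorted(keep)
        err = np.abs(np.interp(x_hr, x_hr[ks], z_hr[ks]) - z_hr)
        worst = int(err.argmax())
        if err[worst] < tol:
            break                      # mesh is within tolerance everywhere
        keep.append(worst)
    return np.array(sorted(keep))
```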
The optimization of high resolution topographic data for 1D hydrodynamic models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ales, Ronovsky, E-mail: ales.ronovsky@vsb.cz; Michal, Podhoranyi
2016-06-08
The main focus of our research presented in this paper is to optimize and use high-resolution topographical data (HRTD) for hydrological modelling. The optimization is done by generating an adaptive mesh: we measure the distance between a coarse mesh and the surface of the dataset and adapt the mesh so that the geometry stays as close to the initial resolution as possible. The technique described in this paper enables the computation of very accurate 1D hydrodynamic models. We use the HEC-RAS software as a solver. For comparison, we consider the number of generated cells/grid elements (in the whole discretization domain and in selected cross sections) with respect to preserving the accuracy of the computational domain. Generation of the mesh for hydrodynamic modelling depends strongly on domain size and domain resolution. The topographical dataset used in this paper was created by LiDAR surveying and captures a 5.9 km long section of the catchment of the river Olše. We studied crucial changes in topography for the generated mesh. The assessment was done with commonly used statistical and visualization methods.
A closer look at temperature changes with remote sensing
NASA Astrophysics Data System (ADS)
Metz, Markus; Rocchini, Duccio; Neteler, Markus
2014-05-01
Temperature is a main driver for important ecological processes. Time series temperature data provide key environmental indicators for various applications and research fields. High spatial and temporal resolution is crucial in order to perform detailed analyses in various fields of research. While meteorological station data are commonly used, they often lack completeness or are not distributed in a representative way. Remotely sensed thermal images from polar-orbiting satellites are considered to be a good alternative to the scarce meteorological data as they offer almost continuous coverage of the Earth with very high temporal resolution. A drawback of temperature data obtained by satellites is the occurrence of gaps (due to clouds, aerosols) that must be filled. We have reconstructed a seamless and gap-free time series for land surface temperature (LST) at continental scale for Europe from MODIS LST products (Moderate Resolution Imaging Spectroradiometer instruments onboard the Terra and Aqua satellites), keeping the temporal resolution of four records per day and enhancing the spatial resolution from 1 km to 250 m. Here we present a new procedure to reconstruct MODIS LST time series with unprecedented detail in space and time, at the same time providing continental coverage. Our method constitutes a unique new combination of weighted temporal averaging with statistical modeling and spatial interpolation. We selected as auxiliary variables datasets which are globally available in order to propose a worldwide reproducible method. Compared to existing similar datasets, the substantial quantitative difference translates to a qualitative difference in applications and results. We consider both our dataset and the new procedure for its creation to be of utmost interest to a broad interdisciplinary audience. Moreover, we provide examples for its implications and applications, such as disease risk assessment, epidemiology, environmental monitoring, and temperature anomalies.
In the near future, aggregated derivatives of our dataset (following the BIOCLIM variable scheme) will be freely made online available for direct usage in GIS based applications.
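Of the three ingredients combined above, the weighted temporal-averaging step is the easiest to illustrate: a cloud gap in a per-pixel LST series is filled from neighbouring time steps. A toy sketch, where the window length and weights are assumptions:

```python
import numpy as np

def fill_gaps_temporal(series, weights=(0.5, 1.0, 0.5)):
    """Fill NaN gaps in a per-pixel LST time series with a weighted
    average of neighbouring time steps (temporal averaging only; the
    full method also uses statistical modelling and spatial
    interpolation)."""
    filled = series.copy()
    half = len(weights) // 2
    for t in np.flatnonzero(np.isnan(series)):
        lo, hi = max(0, t - half), min(len(series), t + half + 1)
        window = series[lo:hi]
        w = np.asarray(weights[half - (t - lo): half + (hi - t)])
        valid = ~np.isnan(window)          # the gap itself is excluded
        if valid.any():
            filled[t] = np.average(window[valid], weights=w[valid])
    return filled
```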
NASA Astrophysics Data System (ADS)
Candela, S. G.; Howat, I.; Noh, M. J.; Porter, C. C.; Morin, P. J.
2016-12-01
In the last decade, high-resolution satellite imagery has become an increasingly accessible tool for geoscientists to quantify changes in the Arctic land surface due to geophysical, ecological and anthropogenic processes. However, the trade-off between spatial coverage and spatiotemporal resolution has limited detailed, process-level change detection over large (i.e., continental) scales. The ArcticDEM project utilized over 300,000 Worldview image pairs to produce a nearly 100%-coverage elevation model (above 60°N), offering the first polar, high-spatial-resolution (2-8 m by region) dataset, often with multiple repeats in areas of particular interest to geoscientists. A dataset of this size (nearly 250 TB) offers endless new avenues of scientific inquiry, but quickly becomes unmanageable computationally and logistically for the computing resources available to the average scientist. Here we present TopoDiff, a framework for a generalized, automated workflow that requires minimal input from the end user about a study site, and utilizes cloud computing resources to provide a temporally sorted and differenced dataset, ready for geostatistical analysis. This hands-off approach allows the end user to focus on the science, without having to manage thousands of files or petabytes of data. At the same time, TopoDiff provides a consistent and accurate workflow for image sorting, selection, and co-registration, enabling cross-comparisons between research projects.
Tang, Yunqing; Dai, Luru; Zhang, Xiaoming; Li, Junbai; Hendriks, Johnny; Fan, Xiaoming; Gruteser, Nadine; Meisenberg, Annika; Baumann, Arnd; Katranidis, Alexandros; Gensch, Thomas
2015-01-01
Single molecule localization based super-resolution fluorescence microscopy offers significantly higher spatial resolution than predicted by Abbe’s resolution limit for far field optical microscopy. Such super-resolution images are reconstructed from wide-field or total internal reflection single molecule fluorescence recordings. Discrimination between emission of single fluorescent molecules and background noise fluctuations remains a great challenge in current data analysis. Here we present a real-time, and robust single molecule identification and localization algorithm, SNSMIL (Shot Noise based Single Molecule Identification and Localization). This algorithm is based on the intrinsic nature of noise, i.e., its Poisson or shot noise characteristics and a new identification criterion, QSNSMIL, is defined. SNSMIL improves the identification accuracy of single fluorescent molecules in experimental or simulated datasets with high and inhomogeneous background. The implementation of SNSMIL relies on a graphics processing unit (GPU), making real-time analysis feasible as shown for real experimental and simulated datasets. PMID:26098742
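The core shot-noise idea can be sketched simply: under Poisson statistics the background fluctuates with a standard deviation of sqrt(background), so a pixel is a candidate single-molecule signal only if it exceeds the background by several such deviations. This sketch shows only that underlying idea, not the full QSNSMIL criterion; `k` is an assumed detection factor.

```python
import numpy as np

def candidate_pixels(frame, background, k=3.0):
    """Flag pixels whose counts exceed the (possibly inhomogeneous)
    background by more than k shot-noise standard deviations."""
    return frame > background + k * np.sqrt(np.maximum(background, 0.0))
```

Because the threshold scales with sqrt(background), the same `k` applies across bright and dim regions, which is what makes a shot-noise criterion robust to inhomogeneous background.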
A High-Resolution Merged Wind Dataset for DYNAMO: Progress and Future Plans
NASA Technical Reports Server (NTRS)
Lang, Timothy J.; Mecikalski, John; Li, Xuanli; Chronis, Themis; Castillo, Tyler; Hoover, Kacie; Brewer, Alan; Churnside, James; McCarty, Brandi; Hein, Paul;
2015-01-01
In order to support research on optimal data assimilation methods for the Cyclone Global Navigation Satellite System (CYGNSS), launching in 2016, work has been ongoing to produce a high-resolution merged wind dataset for the Dynamics of the Madden Julian Oscillation (DYNAMO) field campaign, which took place during late 2011/early 2012. The winds are produced by assimilating DYNAMO observations into the Weather Research and Forecasting (WRF) three-dimensional variational (3DVAR) system. Data sources from the DYNAMO campaign include the upper-air sounding network, radial velocities from the radar network, vector winds from the Advanced Scatterometer (ASCAT) and Oceansat-2 Scatterometer (OSCAT) satellite instruments, the NOAA High Resolution Doppler Lidar (HRDL), and several others. To prepare them for 3DVAR, significant additional quality-control work is being done for the currently available TOGA and SMART-R radar datasets, including automatically dealiasing radial velocities and correcting for intermittent TOGA antenna azimuth angle errors. The assimilated winds are being made available as model output fields from WRF on two separate grids with different horizontal resolutions - a 3-km grid focusing on the main DYNAMO quadrilateral (i.e., Gan Island, the R/V Revelle, the R/V Mirai, and Diego Garcia), and a 1-km grid focusing on the Revelle. The wind dataset is focused on three separate approximately 2-week periods during the Madden Julian Oscillation (MJO) onsets that occurred in October, November, and December 2011. Work is ongoing to convert the 10-m surface winds from these model fields to simulated CYGNSS observations using the CYGNSS End-To-End Simulator (E2ES), and these simulated satellite observations are being compared to radar observations of DYNAMO precipitation systems to document the anticipated ability of CYGNSS to provide information on the relationships between surface winds and oceanic precipitation at the mesoscale level.
This research will improve our understanding of the future utility of CYGNSS for documenting key MJO processes.
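One quality-control step mentioned above, dealiasing radial velocities, can be sketched for a single ray: velocities folded into the Nyquist interval are unfolded by adding the multiple of 2·Vn that best matches the previous gate. This simple gate-to-gate scheme illustrates the idea and is not the campaign's actual algorithm.

```python
import numpy as np

def dealias_ray(vr, v_nyquist):
    """Unfold one ray of radial velocities gate by gate: shift each
    gate by the multiple of 2*v_nyquist that brings it closest to
    the previous (already unfolded) gate."""
    out = np.array(vr, dtype=float)
    for i in range(1, len(out)):
        n = np.round((out[i - 1] - out[i]) / (2.0 * v_nyquist))
        out[i] += 2.0 * v_nyquist * n
    return out
```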
A gridded hourly rainfall dataset for the UK applied to a national physically-based modelling system
NASA Astrophysics Data System (ADS)
Lewis, Elizabeth; Blenkinsop, Stephen; Quinn, Niall; Freer, Jim; Coxon, Gemma; Woods, Ross; Bates, Paul; Fowler, Hayley
2016-04-01
An hourly gridded rainfall product has great potential for use in many hydrological applications that require high temporal resolution meteorological data. One important example of this is flood risk management, with flooding in the UK highly dependent on sub-daily rainfall intensities amongst other factors. Knowledge of sub-daily rainfall intensities is therefore critical to designing hydraulic structures or flood defences to appropriate levels of service. Sub-daily rainfall rates are also essential inputs for flood forecasting, allowing for estimates of peak flows and stage for flood warning and response. In addition, an hourly gridded rainfall dataset has significant potential for practical applications such as better representation of extremes and pluvial flash flooding, validation of high resolution climate models and improving the representation of sub-daily rainfall in weather generators. A new 1km gridded hourly rainfall dataset for the UK has been created by disaggregating the daily Gridded Estimates of Areal Rainfall (CEH-GEAR) dataset using comprehensively quality-controlled hourly rain gauge data from over 1300 observation stations across the country. Quality control measures include identification of frequent tips, daily accumulations and dry spells, comparison of daily totals against the CEH-GEAR daily dataset, and nearest neighbour checks. The quality control procedure was validated against historic extreme rainfall events and the UKCP09 5km daily rainfall dataset. General use of the dataset has been demonstrated by testing the sensitivity of a physically-based hydrological modelling system for Great Britain to the distribution and rates of rainfall and potential evapotranspiration. 
Of the sensitivity tests undertaken, the largest improvements in model performance were seen when an hourly gridded rainfall dataset was combined with potential evapotranspiration disaggregated to hourly intervals, with 61% of catchments showing an increase in NSE between observed and simulated streamflows as a result of more realistic sub-daily meteorological forcing.
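The gauge-based disaggregation idea above can be sketched per grid cell and day: the daily gridded total supplies the amount, and a nearby gauge's hourly record supplies the sub-daily pattern. A toy version (the operational dataset involves far more quality control than this):

```python
import numpy as np

def disaggregate_day(daily_total, gauge_hourly, eps=1e-9):
    """Split a daily gridded precipitation total into 24 hourly
    values using the fractional hourly pattern of a nearby gauge;
    if the gauge recorded nothing, spread the total uniformly."""
    gauge_hourly = np.asarray(gauge_hourly, dtype=float)
    s = gauge_hourly.sum()
    if s < eps:
        return np.full(24, daily_total / 24.0)
    return daily_total * gauge_hourly / s
```

By construction the 24 hourly values always re-sum to the daily gridded total, so the disaggregation cannot create or destroy rainfall.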
Developing High-resolution Soil Database for Regional Crop Modeling in East Africa
NASA Astrophysics Data System (ADS)
Han, E.; Ines, A. V. M.
2014-12-01
The most readily available soil data for regional crop modeling in Africa is the World Inventory of Soil Emission Potentials (WISE) dataset, which has 1125 soil profiles for the world but does not extensively cover Ethiopia, Kenya, Uganda and Tanzania in East Africa. Another available dataset is HC27 (Harvest Choice by IFPRI), in a gridded format (10 km) but composed of generic soil profiles based on only three criteria (texture, rooting depth, and organic carbon content). In this paper, we present the development and application of a high-resolution (1 km), gridded soil database for regional crop modeling in East Africa. Basic soil information is extracted from the Africa Soil Information Service (AfSIS), which provides essential soil properties (bulk density, soil organic carbon, soil pH and percentages of sand, silt and clay) for six standardized soil layers (5, 15, 30, 60, 100 and 200 cm) at 1 km resolution. Soil hydraulic properties (e.g., field capacity and wilting point) are derived from the AfSIS soil dataset using well-proven pedo-transfer functions and are customized for DSSAT-CSM soil data requirements. The crop model is used to evaluate crop yield forecasts using the new high-resolution soil database, compared with WISE and HC27. We also present the results of DSSAT loosely coupled with a hydrologic model (VIC) to assimilate root-zone soil moisture. Creating a grid-based soil database that provides consistent soil input for two different models (DSSAT and VIC) is a critical part of this work. The created soil database is expected to contribute to future applications of DSSAT crop simulation in East Africa, where food security is highly vulnerable.
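The pedo-transfer step can be illustrated with a toy function mapping texture and organic matter to wilting point and field capacity. The linear form and the coefficients below are placeholders for demonstration only, not a published pedo-transfer function; a real application would use a validated PTF (e.g., of the Saxton-Rawls family).

```python
def water_retention(sand, clay, om=2.0):
    """Illustrative pedo-transfer sketch: estimate wilting point and
    field capacity (volumetric water fractions) from sand and clay
    percentages and organic matter. COEFFICIENTS ARE PLACEHOLDERS."""
    wilting = 0.026 + 0.005 * clay + 0.0158 * om
    field_cap = 0.2576 - 0.002 * sand + 0.0036 * clay + 0.0299 * om
    return wilting, field_cap
```

Plant-extractable water capacity for a layer is then (field capacity - wilting point) times layer thickness, which is how such grids feed a crop or water-balance model.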
Copes, Lynn E.; Lucas, Lynn M.; Thostenson, James O.; Hoekstra, Hopi E.; Boyer, Doug M.
2016-01-01
A dataset of high-resolution microCT scans of primate skulls (crania and mandibles) and certain postcranial elements was collected to address questions about primate skull morphology. The sample consists of 489 scans taken from 431 specimens, representing 59 species from most primate families. These data have transformative reuse potential, as such datasets are necessary for conducting high-powered research into primate evolution but require significant time and funding to collect. Similar datasets were previously available only to select research groups across the world. The physical specimens are vouchered at Harvard’s Museum of Comparative Zoology. The data collection took place at the Center for Nanoscale Systems at Harvard. The dataset is archived on MorphoSource.org. Though this is the largest high-fidelity comparative dataset yet available, its provisioning on a web archive that allows unlimited researcher contributions promises a future with vastly increased digital collections available at researchers’ fingertips. PMID:26836025
Statistical and Spatial Analysis of Bathymetric Data for the St. Clair River, 1971-2007
Bennion, David
2009-01-01
To address questions concerning ongoing geomorphic processes in the St. Clair River, selected bathymetric datasets spanning 36 years were analyzed. Comparisons of recent high-resolution datasets covering the upper river indicate a highly variable, active environment. Although statistical and spatial comparisons of the datasets show that some changes to the channel size and shape have taken place during the study period, uncertainty associated with various survey methods and interpolation processes limit the statistically certain results. The methods used to spatially compare the datasets are sensitive to small variations in position and depth that are within the range of uncertainty associated with the datasets. Characteristics of the data, such as the density of measured points and the range of values surveyed, can also influence the results of spatial comparison. With due consideration of these limitations, apparently active and ongoing areas of elevation change in the river are mapped and discussed.
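The uncertainty-aware comparison described above can be sketched as a grid-differencing step in which apparent changes smaller than the combined survey uncertainty are masked out; the array names and the ~95% factor are illustrative.

```python
import numpy as np

def significant_change(depth_a, depth_b, sigma_a, sigma_b, k=1.96):
    """Difference two co-registered bathymetric grids and keep only
    cells where the change exceeds the combined survey uncertainty
    (propagated in quadrature); everything else becomes NaN."""
    diff = depth_b - depth_a
    limit = k * np.sqrt(sigma_a ** 2 + sigma_b ** 2)
    return np.where(np.abs(diff) > limit, diff, np.nan)
```

This is why the report emphasizes that small apparent changes are "within the range of uncertainty": they fall below `limit` and should not be interpreted as geomorphic change.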
Lukeš, Tomáš; Pospíšil, Jakub; Fliegel, Karel; Lasser, Theo; Hagen, Guy M
2018-03-01
Super-resolution single molecule localization microscopy (SMLM) is a method for achieving resolution beyond the classical limit in optical microscopes (approx. 200 nm laterally). Yellow fluorescent protein (YFP) has been used for super-resolution single molecule localization microscopy, but less frequently than other fluorescent probes. Working with YFP in SMLM is a challenge because a lower number of photons are emitted per molecule compared with organic dyes, which are more commonly used. Publicly available experimental data can facilitate development of new data analysis algorithms. Four complete, freely available single molecule super-resolution microscopy datasets on YFP-tagged growth factor receptors expressed in a human cell line are presented, including both raw and analyzed data. We report methods for sample preparation, for data acquisition, and for data analysis, as well as examples of the acquired images. We also analyzed the SMLM datasets using a different method: super-resolution optical fluctuation imaging (SOFI). The two modes of analysis offer complementary information about the sample. A fifth single molecule super-resolution microscopy dataset acquired with the dye Alexa 532 is included for comparison purposes. This dataset has potential for extensive reuse. Complete raw data from SMLM experiments have typically not been published. The YFP data exhibit low signal-to-noise ratios, making data analysis a challenge. These datasets will be useful to investigators developing their own algorithms for SMLM, SOFI, and related methods. The data will also be useful for researchers investigating growth factor receptors such as ErbB3.
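To give a flavor of how SOFI analysis differs from localization: the simplest SOFI image (second order, zero time lag) is just the temporal variance of each pixel over the frame stack, so blinking emitters stand out against steady background. A minimal sketch:

```python
import numpy as np

def sofi2(stack):
    """Second-order, zero-lag SOFI image: the per-pixel temporal
    variance over an (n_frames, ny, nx) stack. Constant background
    has zero variance and is suppressed; fluctuating (blinking)
    emitters survive."""
    return np.var(stack.astype(float), axis=0)
```

Real SOFI implementations use higher-order cumulants and cross-correlations between pixels, but the variance image already shows the fluctuation-weighting principle.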
High-Resolution Topography and its Implications for the Formation of Europa's Ridged Plains
NASA Astrophysics Data System (ADS)
Leonard, E. J.; Pappalardo, R. T.; Yin, A.; Patthoff, D. A.; Schenk, P.
2015-12-01
The Galileo Solid State Imager (SSI) recorded nine very high-resolution frames—eight at 12 m/pixel and one at 6 m/pixel—during the E12 flyby of Europa in Dec. 1997. To understand the implications for the small-scale structure and evolution of Europa, we mosaicked these frames (observations 12ESMOTTLE01 and 02, incidence ≈18°, emission ≈77°) into their regional context (part of observation 11ESREGMAP01, 220 m/pixel, incidence ≈74°, emission ≈23°). The topography data, created from overlaps in the image mosaic, are sparse and segmented over the high-resolution frames but are connected by the underlying regional-resolution topography. The high-resolution topography (24 m/pixel) is among the best in the current Europan dataset. From this dataset we ascertain the root mean square (RMS) slope for some of the most common Europan surface features in a new region. We also employ a Fourier transform method previously used on Ganymede and on other areas of Europa (Patel et al., 1999 JGR) to derive characteristic wavelengths for the subunits of the ubiquitous ridged plains terrain. These results have important implications for differentiating between possible formation mechanisms—extensional tilt blocks (Pappalardo et al., 1995 JGR) or folds (Leonard et al., 2015 LPSC Abstract)—and for potential future missions. We extend this method to another high-resolution region imaged in the E12 orbit, WEDGES01 and 02, with the specific goal of investigating how variations in ridged plains morphology relate across the surface of Europa.
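The RMS-slope statistic used above is straightforward to compute from a gridded DEM. The following is a minimal numerical sketch (not the authors' pipeline), assuming a DEM array at the 24 m/pixel posting quoted in the abstract:

```python
import numpy as np

def rms_slope(dem, spacing):
    """RMS slope (degrees) of a DEM given grid spacing in metres.

    dem: 2D array of elevations (m); spacing: pixel size (m).
    """
    dz_dy, dz_dx = np.gradient(dem, spacing)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    return float(np.sqrt(np.mean(slope ** 2)))

# Sanity check: a plane dipping 45 degrees in x has an RMS slope of 45 degrees.
x = np.arange(50) * 24.0                 # 24 m/pixel, as in the E12 topography
plane = np.tile(x, (50, 1))              # dz/dx = 1 -> 45 degree slope everywhere
print(round(rms_slope(plane, 24.0), 1))  # 45.0
```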
Christopher Daly; Melissa E. Slater; Joshua A. Roberti; Stephanie H. Laseter; Lloyd W. Swift
2017-01-01
A 69-station, densely spaced rain gauge network was maintained over the period 1951–1958 in the Coweeta Hydrologic Laboratory, located in the southern Appalachians in western North Carolina, USA. This unique dataset was used to develop the first digital seasonal and annual precipitation maps for the Coweeta basin, using elevation regression functions and...
NASA Astrophysics Data System (ADS)
Kawano, N.; Varquez, A. C. G.; Dong, Y.; Kanda, M.
2016-12-01
Numerical models such as the Weather Research and Forecasting model coupled with a single-layer Urban Canopy Model (WRF-UCM) are powerful tools for investigating the urban heat island. Urban parameters such as average building height (Have), plan area index (λp), and frontal area index (λf) are necessary inputs for the model. These parameters are generally assumed to be uniform in WRF-UCM, which leads to an unrealistic urban representation. Distributed urban parameters can instead be incorporated into WRF-UCM to represent urban effects in detail. The problem is that distributed building information is not readily available for most megacities, especially in developing countries; furthermore, acquiring real building parameters often requires a huge amount of time and money. In this study, we investigated the potential of globally available satellite-captured datasets for estimating the parameters Have, λp, and λf. The global datasets comprised a high-spatial-resolution population dataset (LandScan, by Oak Ridge National Laboratory), nighttime lights (NOAA), and vegetation fraction (NASA). True samples of Have, λp, and λf were acquired from actual building footprints derived from satellite images and 3D building databases of Tokyo, New York, Paris, Melbourne, Istanbul, Jakarta, and other cities. Regression equations were then derived by block-averaging spatial pairs of the real parameters and the global datasets. Results show that two regression curves are necessary to estimate Have and λf from the combination of population and nightlights, depending on the city's level of development; Gross Domestic Product (GDP) can serve as the index for deciding which equation to use for a given city. λp, on the other hand, has less dependence on GDP but shows a negative relationship to vegetation fraction.
Finally, these regressions, together with readily available high-resolution global datasets, provide a simple but precise approximation of urban parameters that can be used to estimate their global distribution for later incorporation into a weather model, allowing a global understanding of urban climate (Global Urban Climatology). Acknowledgment: This research was supported by the Environment Research and Technology Development Fund (S-14) of the Ministry of the Environment, Japan.
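The regression step described above can be sketched as follows. The training pairs and the functional form here are hypothetical (the study fits separate curves per GDP class, and its actual coefficients are not given in the abstract); this only illustrates fitting building height against population and nightlight predictors:

```python
import numpy as np

# Hypothetical block-averaged training pairs: (population density,
# nightlight intensity) -> average building height Have (m).
pop   = np.array([500., 2000., 8000., 20000., 50000.])
light = np.array([10., 25., 40., 55., 63.])
have  = np.array([4., 8., 15., 25., 40.])

# Ordinary least squares on log-population and nightlights.
X = np.column_stack([np.log(pop), light, np.ones_like(pop)])
coef, *_ = np.linalg.lstsq(X, have, rcond=None)

def estimate_have(p, l):
    """Estimate average building height from population and nightlights."""
    return coef[0] * np.log(p) + coef[1] * l + coef[2]

print(round(float(estimate_have(8000., 40.)), 1))
```

Denser, brighter blocks should map to taller average buildings under any sensible fit of this kind.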
NASA Astrophysics Data System (ADS)
Ng, Z. F.; Gisen, J. I.; Akbari, A.
2018-03-01
Topography is an important input for flood inundation modelling. However, it is often difficult to obtain high-resolution topography that provides accurate elevation information. Fortunately, some open-source topography datasets with reasonable resolution are available, such as SRTM and ASTER-GDEM. In Malaysia, particularly in Kuantan, modelling research on the floodplain is still lacking. This research aims (a) to investigate the suitability of ASTER-GDEM for 1D-2D flood inundation modelling of the Kuantan River Basin, and (b) to generate a flood inundation map for the basin. The ASTER-GDEM topography was used to derive the physical characteristics of the watershed, to perform rainfall-runoff modelling for hydrological studies, and to delineate the flood inundation area in Flood Modeller. The results show that the 30 m resolution ASTER-GDEM is applicable as an input for 1D-2D flood modelling: the simulated water level for the 2013 event has an NSE of 0.644 and an RMSE of 1.259. In conclusion, ASTER-GDEM can serve as an alternative topography dataset for flood inundation modelling, although the flood levels obtained from the hydraulic model show low accuracy in flat urban areas.
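The NSE and RMSE scores quoted above are standard goodness-of-fit metrics for simulated versus observed water levels. A minimal sketch with illustrative data (not the Kuantan 2013 record):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, <0 is worse than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

def rmse(obs, sim):
    """Root mean square error in the units of the data."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

# Illustrative water levels (m).
observed  = [1.2, 2.5, 4.8, 6.1, 5.0, 3.2]
simulated = [1.0, 2.9, 4.1, 5.5, 5.6, 3.0]
print(round(nse(observed, simulated), 3), round(rmse(observed, simulated), 3))
# prints: 0.912 0.492
```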
A self-trained classification technique for producing 30 m percent-water maps from Landsat data
Rover, Jennifer R.; Wylie, Bruce K.; Ji, Lei
2010-01-01
Small bodies of water can be mapped with moderate-resolution satellite data using methods where water is mapped as subpixel fractions using field measurements or high-resolution images as training datasets. A new method, developed from a regression-tree technique, uses a 30 m Landsat image for training the regression tree that, in turn, is applied to the same image to map subpixel water. The self-trained method was evaluated by comparing the percent-water map with three other maps generated from established percent-water mapping methods: (1) a regression-tree model trained with a 5 m SPOT 5 image, (2) a regression-tree model based on endmembers and (3) a linear unmixing classification technique. The results suggest that subpixel water fractions can be accurately estimated when high-resolution satellite data or intensively interpreted training datasets are not available, which increases our ability to map small water bodies or small changes in lake size at a regional scale.
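The self-training idea above can be illustrated on synthetic data: pixels that look purely water or purely land label themselves, and a model fitted to those labels predicts subpixel fractions for the mixed pixels. This sketch uses a linear fit as a stand-in for the paper's regression tree, with invented thresholds and reflectance values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-band scene: water is dark, land is bright.
frac_true = rng.uniform(0.0, 1.0, 1000)          # true subpixel water fraction
reflect = 0.30 - 0.25 * frac_true + rng.normal(0.0, 0.01, 1000)

# Self-training: apparently pure pixels supply their own labels.
water = reflect < 0.07                            # confident water -> fraction 1
land = reflect > 0.28                             # confident land  -> fraction 0
X = np.concatenate([reflect[water], reflect[land]])
y = np.concatenate([np.ones(water.sum()), np.zeros(land.sum())])

# Linear least squares as a stand-in for the regression tree.
A = np.column_stack([X, np.ones_like(X)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = np.clip(coef[0] * reflect + coef[1], 0.0, 1.0)
print(round(float(np.mean(np.abs(pred - frac_true))), 3))  # small mean error
```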
Rapid Target Detection in High Resolution Remote Sensing Images Using Yolo Model
NASA Astrophysics Data System (ADS)
Wu, Z.; Chen, X.; Gao, Y.; Li, Y.
2018-04-01
Object detection in high-resolution remote sensing images is a fundamental and challenging problem in remote sensing imagery analysis for civil and military applications, because complex neighboring environments can cause recognition algorithms to mistake irrelevant ground objects for targets. Deep Convolutional Neural Networks (DCNNs) are the current focus of object detection research owing to their powerful feature extraction, and have achieved state-of-the-art results in computer vision. The common DCNN-based detection pipeline consists of region proposal, CNN feature extraction, region classification, and post-processing. The YOLO model instead frames object detection as a regression problem: a single CNN predicts bounding boxes and class probabilities end-to-end, which makes prediction faster. In this paper, a YOLO-based model is used for object detection in high-resolution remote sensing images. Experiments on the NWPU VHR-10 dataset and on our airport/airplane dataset collected from Google Earth show that, compared with the common pipeline, the proposed model speeds up the detection process while maintaining good accuracy.
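Detection accuracy in experiments like these is conventionally scored by intersection over union (IoU) between predicted and ground-truth boxes. A minimal, generic implementation (not code from the paper):

```python
def iou(a, b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))   # 25 / 175, about 0.143
```

A detection is typically counted as correct when IoU with a ground-truth box exceeds a threshold such as 0.5.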
NASA Astrophysics Data System (ADS)
Doolittle, D. F.; Gharib, J. J.; Mitchell, G. A.
2015-12-01
Detailed photographic imagery and bathymetric maps of the seafloor acquired by deep submergence vehicles such as Autonomous Underwater Vehicles (AUVs) and Remotely Operated Vehicles (ROVs) are expanding how scientists and the public view and ultimately understand the seafloor and the processes that modify it. Several recently acquired optical and acoustic datasets, collected during ECOGIG (Ecosystem Impacts of Oil and Gas Inputs to the Gulf) and other Gulf of Mexico expeditions using the National Institute for Undersea Science and Technology (NIUST) Eagle Ray and Mola Mola AUVs, have been fused with lower-resolution data to create unique three-dimensional geovisualizations. Included in these data are multi-scale and multi-resolution visualizations over hydrocarbon seeps and seep-related features. Resolution of the data ranges from tens of mm to tens of m. When multi-resolution data are integrated into a single three-dimensional visual environment, new insights into seafloor and seep processes can be obtained from the intuitive nature of three-dimensional data exploration. We provide examples and demonstrate how multibeam bathymetry, seafloor backscatter data, sub-bottom profiler data, textured photomosaics, and hull-mounted multibeam midwater acoustic imagery are combined into a series of three-dimensional geovisualizations of actively seeping sites and associated chemosynthetic communities. From these combined and merged datasets, insights on seep community structure, morphology, ecology, fluid migration dynamics, and process geomorphology can be investigated from new spatial perspectives. Such datasets also promote valuable inter-comparisons of sensor resolution and performance.
High resolution infrared datasets useful for validating stratospheric models
NASA Technical Reports Server (NTRS)
Rinsland, Curtis P.
1992-01-01
An important objective of the High Speed Research Program (HSRP) is to support research in the atmospheric sciences that will improve the basic understanding of the circulation and chemistry of the stratosphere and lead to an interim assessment of the impact of a projected fleet of High Speed Civil Transports (HSCT's) on the stratosphere. As part of this work, critical comparisons between models and existing high quality measurements are planned. These comparisons will be used to test the reliability of current atmospheric chemistry models. Two suitable sets of high resolution infrared measurements are discussed.
4D very high-resolution topography monitoring of surface deformation using UAV-SfM framework.
NASA Astrophysics Data System (ADS)
Clapuyt, François; Vanacker, Veerle; Schlunegger, Fritz; Van Oost, Kristof
2016-04-01
In recent years, exploratory research has shown that UAV-based image acquisition is suitable for environmental remote sensing and monitoring. Image acquisition with cameras mounted on a UAV can be performed at very high spatial resolution and high temporal frequency in the most dynamic environments. Combined with the Structure-from-Motion (SfM) algorithm, the UAV-SfM framework provides digital surface models (DSMs) that are highly accurate when compared to other very-high-resolution topographic datasets and highly reproducible for repeated measurements over the same study area. In this study, we assess (1) differential movement of the Earth's surface and (2) the sediment budget of a complex earthflow located in the Central Swiss Alps, based on three topographic datasets acquired over a period of two years. At each of three time steps, we acquired aerial photographs with a standard reflex camera mounted on a low-cost, lightweight UAV. The image datasets were processed with the Structure-from-Motion algorithm to reconstruct a 3D dense point cloud representing the topography. Outputs were georeferenced using ground control points (GCPs) surveyed in the field with RTK GPS. Finally, a digital elevation model of difference (DoD) was computed to assess topographic changes between the three acquisition dates, while surface displacements were quantified using image correlation techniques. Our results show that the DoD captures surface deformation at cm-scale resolution. The mean annual displacement of the earthflow is about 3.6 m, and the forefront of the landslide advanced by ca. 30 meters over a period of 18 months. The 4D analysis permits identification of the direction and velocity of Earth-surface movement.
Stable topographic ridges condition the direction of the flow, with the highest downslope movement on steep slopes and diffuse movement, due to lateral sediment flux, in the central part of the earthflow.
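The DoD computation described above reduces to differencing two co-registered DEMs, masking changes below a level of detection, and summing erosion and deposition volumes. A minimal sketch (the 0.05 m level of detection here is an assumed illustrative value, not the study's):

```python
import numpy as np

def dem_of_difference(dem_t1, dem_t0, cell_size, lod=0.05):
    """DoD between two co-registered DEMs (t1 - t0).

    cell_size: pixel size (m); lod: level of detection (m) below which
    change is treated as noise and masked out.
    """
    dod = dem_t1 - dem_t0
    dod[np.abs(dod) < lod] = 0.0
    cell_area = cell_size ** 2
    erosion = dod[dod < 0].sum() * cell_area      # m^3 (negative)
    deposition = dod[dod > 0].sum() * cell_area   # m^3
    return dod, erosion, deposition

# Toy 1 m grid: 0.5 m of deposition over a 2x2-cell patch.
t0 = np.zeros((4, 4))
t1 = t0.copy()
t1[:2, :2] += 0.5
_, ero, dep = dem_of_difference(t1, t0, cell_size=1.0)
print(ero, dep)   # 0.0 2.0
```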
Mapping and Visualization of Storm-Surge Dynamics for Hurricane Katrina and Hurricane Rita
Gesch, Dean B.
2009-01-01
The damages caused by the storm surges from Hurricane Katrina and Hurricane Rita were significant and occurred over broad areas. Storm-surge maps are among the most useful geospatial datasets for hurricane recovery, impact assessments, and mitigation planning for future storms. Surveyed high-water marks were used to generate a maximum storm-surge surface for Hurricane Katrina extending from eastern Louisiana to Mobile Bay, Alabama. The interpolated surface was intersected with high-resolution lidar elevation data covering the study area to produce a highly detailed digital storm-surge inundation map. The storm-surge dataset and related data are available for display and query in a Web-based viewer application. A unique water-level dataset from a network of portable pressure sensors deployed in the days just prior to Hurricane Rita's landfall captured the hurricane's storm surge. The recorded sensor data provided water-level measurements with a very high temporal resolution at surveyed point locations. The resulting dataset was used to generate a time series of storm-surge surfaces that documents the surge dynamics in a new, spatially explicit way. The temporal information contained in the multiple storm-surge surfaces can be visualized in a number of ways to portray how the surge interacted with and was affected by land surface features. Spatially explicit storm-surge products can be useful for a variety of hurricane impact assessments, especially studies of wetland and land changes where knowledge of the extent and magnitude of storm-surge flooding is critical.
NASA Astrophysics Data System (ADS)
Pisana, Francesco; Henzler, Thomas; Schönberg, Stefan; Klotz, Ernst; Schmidt, Bernhard; Kachelrieß, Marc
2017-03-01
Dynamic CT perfusion acquisitions are intrinsically high-dose examinations due to repeated scanning. To keep radiation dose under control, relatively noisy images are acquired. Noise is then further amplified during the extraction of functional parameters from post-processing of the voxels' time-attenuation curves (TACs), so a smoothing filter is normally employed to better visualize perfusion abnormalities, at the cost of spatial resolution. In this study we propose a new method to detect perfusion abnormalities that preserves both high spatial resolution and high CNR. We first perform a singular value decomposition (SVD) of the original noisy spatio-temporal data matrix to extract basis functions of the TACs. We then iteratively cluster the voxels based on a smoothed version of the three most significant singular vectors. Finally, we create high-spatial-resolution 3D volumes in which each voxel is assigned its distance from the centroid of each cluster, showing how functionally similar each voxel is to the others. The method was tested on three noisy clinical datasets: a brain perfusion case with an occlusion of the left internal carotid artery, a healthy brain perfusion case, and a liver case with an enhancing lesion. Our method detected all perfusion abnormalities with higher spatial precision than the functional maps obtained with a commercially available software package. We conclude that this method could be employed to obtain a rapid qualitative indication of functional abnormalities in low-dose dynamic CT perfusion datasets. The method is robust with respect to both spatial and temporal noise and does not require any special a priori assumptions.
While more robust to noise, and offering higher spatial resolution and CNR than the functional maps, our method is not quantitative; its potential use in clinical routine could be as a second reader to assist in map evaluation, or to guide dataset smoothing before the modeling step.
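The core of the approach above is an SVD of the voxels-by-time matrix followed by clustering in the subspace of the leading singular vectors. A synthetic sketch (invented curves and noise levels, and simple two-means clustering standing in for the paper's iterative clustering):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic time-attenuation curves: 200 voxels x 30 time points,
# two perfusion behaviours (normal vs delayed/reduced enhancement).
t = np.linspace(0.0, 30.0, 30)
normal = np.exp(-(t - 10.0) ** 2 / 20.0)
abnormal = 0.5 * np.exp(-(t - 18.0) ** 2 / 40.0)
labels = np.array([0] * 120 + [1] * 80)
tacs = np.where(labels[:, None] == 0, normal, abnormal)
tacs = tacs + rng.normal(0.0, 0.05, tacs.shape)

# SVD of the voxels-by-time matrix; project onto the top singular vectors.
U, s, Vt = np.linalg.svd(tacs, full_matrices=False)
scores = U[:, :3] * s[:3]                 # per-voxel coordinates

# Two-means clustering on the scores.
c = scores[[0, -1]].copy()                # init from one voxel of each kind
for _ in range(20):
    d = np.linalg.norm(scores[:, None] - c[None], axis=2)
    assign = d.argmin(axis=1)
    c = np.array([scores[assign == k].mean(axis=0) for k in (0, 1)])

# The clusters should recover the two perfusion behaviours (up to label swap).
agree = (assign == labels).mean()
print(round(max(agree, 1 - agree), 2))
```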
Preliminary interpretation of high resolution 3D seismic data from offshore Mt. Etna, Italy
NASA Astrophysics Data System (ADS)
Gross, F.; Krastel, S.; Chiocci, F. L.; Ridente, D.; Cukur, D.; Bialas, J.; Papenberg, C. A.; Crutchley, G.; Koch, S.
2013-12-01
In order to gain knowledge about subsurface structures and their correlation with seafloor expressions, a hydro-acoustic dataset was collected during RV Meteor Cruise M86/2 (December 2011/January 2012) in the Messina Straits and offshore Mt. Etna. Offshore Mt. Etna in particular, the data reveal an obvious connection between subsurface structures and previously known morphological features at the seafloor. Therefore, a high-resolution 3D seismic dataset was acquired between Riposto Ridge and Catania Canyon, close to the shore of eastern Sicily. The study area is characterized by a major structural high, which hosts several ridge-like features at the seafloor. These features are connected to a SW-NE trending fault system. The ridges are bent in their NE-SW direction and host major escarpments at the seafloor. Furthermore, they are located directly next to a massive amphitheater structure offshore Mt. Etna with slope gradients of up to 35°, which is interpreted as the remnant of a massive submarine mass wasting event off Sicily. The new 3D seismic dataset allows an in-depth analysis of the ongoing deformation of the east flank of Mt. Etna.
Managing the explosion of high resolution topography in the geosciences
NASA Astrophysics Data System (ADS)
Crosby, Christopher; Nandigam, Viswanath; Arrowsmith, Ramon; Phan, Minh; Gross, Benjamin
2017-04-01
Centimeter to decimeter-scale 2.5 to 3D sampling of the Earth surface topography coupled with the potential for photorealistic coloring of point clouds and texture mapping of meshes enables a wide range of science applications. Not only is the configuration and state of the surface as imaged valuable, but repeat surveys enable quantification of topographic change (erosion, deposition, and displacement) caused by various geologic processes. We are in an era of ubiquitous point clouds that come from both active sources such as laser scanners and radar as well as passive scene reconstruction via structure from motion (SfM) photogrammetry. With the decreasing costs of high-resolution topography (HRT) data collection, via methods such as SfM and UAS-based laser scanning, the number of researchers collecting these data is increasing. These "long-tail" topographic data are of modest size but great value, and challenges exist to making them widely discoverable, shared, annotated, cited, managed and archived. Presently, there are no central repositories or services to support storage and curation of these datasets. The U.S. National Science Foundation funded OpenTopography (OT) Facility employs cyberinfrastructure including large-scale data management, high-performance computing, and service-oriented architectures, to provide efficient online access to large HRT (mostly lidar) datasets, metadata, and processing tools. With over 225 datasets and 15,000 registered users, OT is well positioned to provide curation for community collected high-resolution topographic data. OT has developed a "Community DataSpace", a service built on a low cost storage cloud (e.g. AWS S3) to make it easy for researchers to upload, curate, annotate and distribute their datasets. The system's ingestion workflow will extract metadata from data uploaded; validate it; assign a digital object identifier (DOI); and create a searchable catalog entry, before publishing via the OT portal. 
The OT Community DataSpace enables wider discovery and utilization of these HRT datasets via the OT portal and via sources that federate the OT data catalog; it promotes citation and, most importantly, increases the impact of investments in data, catalyzing scientific discovery.
A community dataspace for distribution and processing of "long tail" high resolution topography data
NASA Astrophysics Data System (ADS)
Crosby, C. J.; Nandigam, V.; Arrowsmith, R.
2016-12-01
Topography is a fundamental observable for Earth and environmental science and engineering, and high-resolution topography (HRT) is revolutionary for Earth science. Cyberinfrastructure that enables users to discover, manage, share, and process these data increases the impact of investments in data collection and catalyzes scientific discovery. The National Science Foundation-funded OpenTopography (OT, www.opentopography.org) employs cyberinfrastructure that includes large-scale data management, high-performance computing, and service-oriented architectures, providing researchers with efficient online access to large HRT (mostly lidar) datasets, metadata, and processing tools. HRT data are collected from satellite, airborne, and terrestrial platforms at increasingly finer resolutions, greater accuracy, and shorter repeat times. There has been a steady increase in OT data holdings due to partnerships and collaborations with various organizations within the academic NSF domain and beyond. With the decreasing costs of HRT data collection, via methods such as Structure from Motion, the number of researchers collecting these data is increasing. Researchers collecting these "long-tail" topography data (of modest size but great value) face impediments, especially the costs associated with making the data widely discoverable, shared, annotated, cited, managed, and archived. Because there are no existing central repositories or services to support storage and curation of these datasets, much of this data is isolated and difficult to locate and preserve. To overcome these barriers and provide efficient centralized access to these high-impact datasets, OT is developing a "Community DataSpace", a service built on a low-cost storage cloud (e.g. AWS S3) that makes it easy for researchers to upload, curate, annotate, and distribute their datasets.
The system's ingestion workflow will extract metadata from uploaded data; validate it; assign a digital object identifier (DOI); and create a searchable catalog entry, before publishing via the OT portal. The OT Community DataSpace will enable wider discovery and utilization of these datasets via the OT portal and via sources that federate the OT data catalog (e.g. data.gov), promote citation, and, more importantly, increase the impact of investments in these data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Y. X.; Van Reeth, E.; Poh, C. L., E-mail: clpoh@ntu.edu.sg
2015-08-15
Purpose: Accurate visualization of lung motion is important in many clinical applications, such as radiotherapy of lung cancer. Advancement in imaging modalities [e.g., computed tomography (CT) and MRI] has allowed dynamic imaging of lung and lung tumor motion. However, each imaging modality has its advantages and disadvantages. The study presented in this paper aims at generating a synthetic 4D-CT dataset for lung cancer patients by combining the continuous three-dimensional (3D) motion captured by 4D-MRI with the high spatial resolution captured by CT, using the authors' proposed approach. Methods: A novel hybrid approach based on deformable image registration (DIR) and finite element method simulation was developed to fuse a static 3D-CT volume (acquired under breath-hold) and the 3D motion information extracted from a 4D-MRI dataset, creating a synthetic 4D-CT dataset. Results: The study focuses on imaging of lung and lung tumor. Comparing the synthetic 4D-CT dataset with the acquired 4D-CT dataset of six lung cancer patients based on 420 landmarks, accurate results (average error <2 mm) were achieved using the authors' proposed approach. Their hybrid approach achieved a 40% error reduction (based on landmark assessment) over using only DIR techniques. Conclusions: The synthetic 4D-CT dataset generated has high spatial resolution, has excellent lung details, and is able to show movement of lung and lung tumor over multiple breathing cycles.
NASA Astrophysics Data System (ADS)
Nowicki, S. A.; Skuse, R. J.
2012-12-01
High-resolution ecological and climate modeling requires quantification of surface characteristics such as rock abundance, soil induration, and surface roughness at fine scale, since these features can affect the micro- and macro-habitat of a given area and ultimately determine the assemblage of plant and animal species that may occur there. Our objective is to develop quantitative data layers of thermophysical properties of the entire Mojave Desert Ecoregion for applications to habitat modeling being conducted by the USGS Western Ecological Research Center. These research efforts are focused on developing habitat models and a better physical understanding of the Mojave Desert, which have implications for the development of solar and wind energy resources, military installation expansion, and residential development planned for the Mojave. There is thus a need to improve our understanding of the mechanical composition and thermal characteristics of natural and modified surfaces in the southwestern US at as high a resolution as possible. Since the Mojave is a sparsely vegetated, arid landscape with little precipitation, remote sensing-based thermophysical analyses using Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) day and nighttime imagery are ideal for determining the physical properties of the surface. New mosaicking techniques for thermal imagery acquired at different dates, seasons, and temperatures have allowed for the highest-resolution mosaics yet generated, at 100 m/pixel, for thermal infrared wavelengths. Among our contributions is the development of seamless day and night ASTER mosaics of land surface temperatures that are calibrated to coincident Moderate Resolution Imaging Spectroradiometer (MODIS) observations, producing both a seamless mosaic and quantitative temperatures across a region that varies spectrally and thermophysically over a large number of orbit tracks.
Products derived from this dataset include surface rock abundance, apparent thermal inertia, and diurnal/seasonal thermal regime. Additionally, the combination of moderate- and high-resolution thermal observations is used to map the spatial and temporal variation of significant rain storms that intermittently increase surface moisture. The resulting thermally derived layers are being combined with composition, vegetation, and surface reflectance datasets to map the Mojave at the highest VNIR resolution (20 m/pixel) and compared to currently available lower-resolution datasets.
Prolonged Instability Prior to a Regime Shift
Regime shifts are generally defined as the point of 'abrupt' change in the state of a system. However, a seemingly abrupt transition can be the product of a system reorganization that has been ongoing much longer than is evident in statistical analysis of a single component of the system. Using both univariate and multivariate statistical methods, we tested a long-term, high-resolution paleoecological dataset with a known change in species assemblage for a regime shift. Analysis of this dataset with Fisher Information and multivariate time series modeling showed that there was a ~2000-year period of instability prior to the regime shift. This period of instability and the subsequent regime shift coincide with regional climate change, indicating that the system was undergoing extrinsic forcing. Paleoecological records offer a unique opportunity to test tools for the detection of thresholds and stable states, and thus to examine the long-term stability of ecosystems over periods of multiple millennia. This manuscript explores various methods of assessing the transition between alternative states in an ecological system described by a long-term, high-resolution paleoecological dataset.
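The instability-before-shift idea can be illustrated with a simpler early-warning statistic than the Fisher Information used in the paper: rolling-window variance, which rises as a system destabilises ahead of a transition. The series below is synthetic, and rolling variance here is a generic stand-in, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic proxy record: a stable state whose variance grows as the
# system destabilises, followed by an abrupt shift in mean (the regime shift).
n = 1000
series = rng.normal(0.0, 1.0, n) * np.linspace(1.0, 3.0, n)
series[800:] += 10.0                       # the shift itself

def rolling_var(x, window):
    """Variance in a sliding window; a classic early-warning statistic."""
    return np.array([x[i:i + window].var() for i in range(len(x) - window)])

# Look only at the pre-shift record: variance rises well before the shift.
v = rolling_var(series[:800], window=100)
print(v[-1] > 2.0 * v[0])
```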
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Donald F.; Schulz, Carl; Konijnenburg, Marco
High-resolution Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometry imaging enables the spatial mapping and identification of biomolecules from complex surfaces. The need for long time-domain transients, and thus large raw file sizes, results in a large amount of raw data ("big data") that must be processed efficiently and rapidly. This can be compounded by large-area imaging and/or high-spatial-resolution imaging. For FT-ICR, data processing and data reduction must not compromise the high mass resolution afforded by the mass spectrometer. The continuous-mode "Mosaic Datacube" approach allows high-mass-resolution visualization (0.001 Da) of mass spectrometry imaging data, but requires additional processing compared to feature-based processing. We describe the use of distributed computing for processing FT-ICR MS imaging datasets, with generation of continuous-mode Mosaic Datacubes for high-mass-resolution visualization. An eight-fold improvement in processing time is demonstrated using a Dutch nationally available cloud service.
The resolution sensitivity of the South Asian monsoon and Indo-Pacific in a global 0.35° AGCM
NASA Astrophysics Data System (ADS)
Johnson, Stephanie J.; Levine, Richard C.; Turner, Andrew G.; Martin, Gill M.; Woolnough, Steven J.; Schiemann, Reinhard; Mizielinski, Matthew S.; Roberts, Malcolm J.; Vidale, Pier Luigi; Demory, Marie-Estelle; Strachan, Jane
2016-02-01
The South Asian monsoon is one of the most significant manifestations of the seasonal cycle. It directly impacts nearly one third of the world's population and also has substantial global influence. Using 27-year integrations of a high-resolution atmospheric general circulation model (Met Office Unified Model), we study changes in South Asian monsoon precipitation and circulation when horizontal resolution is increased from approximately 200-40 km at the equator (N96-N512, 1.9°-0.35°). The high resolution, integration length and ensemble size of the dataset make this the most extensive dataset used to evaluate the resolution sensitivity of the South Asian monsoon to date. We find a consistent pattern of JJAS precipitation and circulation changes as resolution increases, which include a slight increase in precipitation over peninsular India, changes in Indian and Indochinese orographic rain bands, increasing wind speeds in the Somali Jet, increasing precipitation over the Maritime Continent islands and decreasing precipitation over the northern Maritime Continent seas. To diagnose which resolution-related processes cause these changes, we compare them to published sensitivity experiments that change regional orography and coastlines. Our analysis indicates that improved resolution of the East African Highlands results in the improved representation of the Somali Jet and further suggests that improved resolution of orography over Indochina and the Maritime Continent results in more precipitation over the Maritime Continent islands at the expense of reduced precipitation further north. We also evaluate the resolution sensitivity of monsoon depressions and lows, which contribute more precipitation over northeast India at higher resolution. We conclude that while increasing resolution at these scales does not solve the many monsoon biases that exist in GCMs, it has a number of small, beneficial impacts.
Automatic optimization high-speed high-resolution OCT retinal imaging at 1μm
NASA Astrophysics Data System (ADS)
Cua, Michelle; Liu, Xiyun; Miao, Dongkai; Lee, Sujin; Lee, Sieun; Bonora, Stefano; Zawadzki, Robert J.; Mackenzie, Paul J.; Jian, Yifan; Sarunic, Marinko V.
2015-03-01
High-resolution OCT retinal imaging is important in providing visualization of various retinal structures to aid researchers in better understanding the pathogenesis of vision-robbing diseases. However, conventional optical coherence tomography (OCT) systems involve a trade-off between lateral resolution and depth of focus. In this report, we present the development of a focus-stacking OCT system with automatic optimization for high-resolution, extended-focal-range clinical retinal imaging. A variable-focus liquid lens was added to correct for defocus in real time. GPU-accelerated segmentation and optimization provided real-time layer-specific en face visualization as well as depth-specific focus adjustment. After optimization, multiple volumes focused at different depths were acquired, registered, and stitched together to yield a single, high-resolution focus-stacked dataset. Using this system, we show high-resolution images of the optic nerve head (ONH), from which we extracted clinically relevant parameters such as nerve fiber layer thickness and lamina cribrosa microarchitecture.
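The per-pixel merging step described above can be sketched generically. The following is a minimal illustration of focus stacking, not the authors' registration and stitching pipeline: each output pixel is taken from whichever input image is sharpest at that location, using local variance as an assumed sharpness measure.

```python
import numpy as np

def local_variance(img, k=5):
    """Sharpness proxy: variance in a k x k neighborhood of each pixel."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="reflect")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].var()
    return out

def focus_stack(images, k=5):
    """Merge images focused at different depths by keeping, per pixel,
    the value from the most in-focus image in the stack."""
    stack = np.stack(images)                                # (n, H, W)
    sharpness = np.stack([local_variance(im, k) for im in stack])
    best = np.argmax(sharpness, axis=0)                     # (H, W)
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]
```

In practice the inputs would first be registered so that features align across the stack; this sketch assumes already-aligned images.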
High resolution global gridded data for use in population studies
Lloyd, Christopher T.; Sorichetta, Alessandro; Tatem, Andrew J.
2017-01-01
Recent years have seen substantial growth in openly available satellite and other geospatial data layers, which represent a range of metrics relevant to global human population mapping at fine spatial scales. The specifications of such data differ widely, and therefore the harmonisation of data layers is a prerequisite to constructing detailed and contemporary spatial datasets that accurately describe population distributions. Such datasets are vital to measure the impacts of population growth, monitor change, and plan interventions. To this end, the WorldPop Project has produced an open access archive of 3 and 30 arc-second resolution gridded data. Four tiled raster datasets form the basis of the archive: (i) Viewfinder Panoramas topography clipped to Global ADMinistrative area (GADM) coastlines; (ii) a matching ISO 3166 country identification grid; (iii) a country area grid; and (iv) a slope layer. Further layers include transport networks, landcover, nightlights, precipitation, travel time to major cities, and waterways. The datasets and production methodology are described here. The archive can be downloaded from both the WorldPop Dataverse Repository and the WorldPop Project website. PMID:28140386
NASA Astrophysics Data System (ADS)
Slinskey, E. A.; Loikith, P. C.; Waliser, D. E.; Goodman, A.
2017-12-01
Extreme precipitation events are associated with numerous societal and environmental impacts. Furthermore, anthropogenic climate change is projected to alter precipitation intensity across portions of the Continental United States (CONUS). Therefore, a spatial understanding and intuitive means of monitoring extreme precipitation over time is critical. Towards this end, we apply an event-based indicator, developed as part of NASA's support of the ongoing efforts of the US National Climate Assessment, which assigns categories to extreme precipitation events based on 3-day storm totals as a basis for dataset intercomparison. To assess observational uncertainty across a wide range of historical precipitation measurement approaches, we intercompare in situ station data from the Global Historical Climatology Network (GHCN), satellite-derived precipitation data from NASA's Tropical Rainfall Measuring Mission (TRMM), gridded in situ station data from the Parameter-elevation Regressions on Independent Slopes Model (PRISM), global reanalysis from NASA's Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), and regional reanalysis with gauge data assimilation from NCEP's North American Regional Reanalysis (NARR). Results suggest considerable variability across the five-dataset suite in the frequency, spatial extent, and magnitude of extreme precipitation events. Consistent with expectations, higher-resolution datasets were found to resemble station data best and to capture a greater frequency of high-end extreme events relative to lower-resolution datasets. The degree of dataset agreement varies regionally; however, all datasets successfully capture the seasonal cycle of precipitation extremes across the CONUS. These intercomparison results provide additional insight into observational uncertainty and the ability of a range of precipitation measurement and analysis products to capture extreme precipitation event climatology.
While the event category threshold is fixed in this analysis, preliminary results from the development of a flexible categorization scheme, that scales with grid resolution, are presented.
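The event-based categorization described above can be sketched roughly as follows: 3-day running storm totals are binned into ordinal categories. The threshold values here are invented for illustration only; the actual NCA indicator thresholds are not given in the abstract.

```python
# Assumed category cut-offs (mm) for illustration; not the NCA values.
THRESHOLDS_MM = [100.0, 150.0, 200.0, 300.0, 400.0]

def rolling_3day_totals(daily_precip_mm):
    """3-day running totals over a daily precipitation series (mm)."""
    return [sum(daily_precip_mm[i:i + 3])
            for i in range(len(daily_precip_mm) - 2)]

def event_category(total_mm):
    """Ordinal category for a 3-day total (0 = below the lowest threshold)."""
    return sum(1 for t in THRESHOLDS_MM if total_mm >= t)
```

A flexible scheme, as hinted at in the abstract, would scale `THRESHOLDS_MM` with the grid resolution of each dataset rather than keeping them fixed.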
High-resolution grids of hourly meteorological variables for Germany
NASA Astrophysics Data System (ADS)
Krähenmann, S.; Walter, A.; Brienen, S.; Imbery, F.; Matzarakis, A.
2018-02-01
We present a 1 km² gridded German dataset of hourly surface climate variables covering the period 1995 to 2012. The dataset comprises 12 variables, including temperature, dew point, cloud cover, wind speed and direction, global and direct shortwave radiation, down- and up-welling longwave radiation, sea level pressure, relative humidity, and vapour pressure. It was constructed statistically from station data, satellite observations, and model data, and is outstanding in terms of spatial and temporal resolution and in the number of climate variables. For each variable, we employed the most suitable gridding method and combined the best of several information sources, including station records, satellite-derived data, and data from a regional climate model. A module to estimate urban heat island intensity was integrated for air and dew point temperature. Owing to the low density of available SYNOP stations, the gridded dataset does not capture all variations that may occur at a resolution of 1 km². This applies to areas of complex terrain (for all variables), and in particular to wind speed and the radiation parameters. To achieve maximum precision, we used all observational information when it was available. This, however, leads to inhomogeneities in station network density and affects the long-term consistency of the dataset. A first climate analysis for Germany was conducted. The Rhine River Valley, for example, exhibited more than 100 summer days in 2003, whereas in 1996 the number was low everywhere in Germany. The dataset is useful for applications in various climate-related studies, hazard management, and solar or wind energy applications, and it is available via doi:10.5676/DWD_CDC/TRY_Basis_v001.
NASA Astrophysics Data System (ADS)
Lussana, Cristian; Saloranta, Tuomo; Skaugen, Thomas; Magnusson, Jan; Tveito, Ole Einar; Andersen, Jess
2018-02-01
Gridded climate datasets based solely on observations are widely used in the atmospheric sciences; our focus in this paper is on climate and hydrology. On the Norwegian mainland, seNorge2 provides high-resolution fields of daily total precipitation for applications requiring long-term datasets at regional or national level, where the challenge is to simulate small-scale processes, which often take place in complex terrain. The dataset constitutes a valuable meteorological input for snow and hydrological simulations; it is updated daily and presented on a high-resolution grid (1 km grid spacing). The climate archive goes back to 1957. The spatial interpolation scheme builds upon classical methods, such as optimal interpolation and successive-correction schemes. An original approach based on (spatial) scale-separation concepts has been implemented, which uses geographical coordinates and elevation as complementary information in the interpolation. seNorge2 daily precipitation fields represent local precipitation features at spatial scales of a few kilometers, depending on the station network density. In the surroundings of a station, or in dense station areas, the predictions are quite accurate even for intense precipitation. For most of the grid points, the performance is comparable to or better than that of a state-of-the-art pan-European dataset (E-OBS), because of the higher effective resolution of seNorge2. However, in very data-sparse areas, such as the mountainous region of southern Norway, seNorge2 underestimates precipitation because it does not make use of enough geographical information to compensate for the lack of observations. The evaluation of seNorge2 as the meteorological forcing for the seNorge snow model and the DDD (Distance Distribution Dynamics) rainfall-runoff model shows that both models have been able to make profitable use of seNorge2, partly because of the automatic calibration procedure they incorporate for precipitation.
The seNorge2 dataset 1957-2015 is available at https://doi.org/10.5281/zenodo.845733. Daily updates from 2015 onwards are available at http://thredds.met.no/thredds/catalog/metusers/senorge2/seNorge2/provisional_archive/PREC1d/gridded_dataset/catalog.html.
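The successive-correction idea mentioned above can be illustrated with a minimal sketch: a first-guess field is repeatedly nudged toward station observations, with the influence radius shrinking on each pass so that later passes restore finer-scale detail. The radii and Gaussian weights below are illustrative assumptions, not seNorge2's actual configuration (which also exploits elevation and geographic coordinates).

```python
import math

def successive_correction(grid_xy, background, stations, radii=(50.0, 20.0, 10.0)):
    """grid_xy: list of (x, y) grid points (km); background: first-guess
    values at those points; stations: list of (x, y, observed_value).
    Each pass adds distance-weighted observation increments to the field."""
    field = list(background)
    for R in radii:
        # Observation increments relative to the current analysis,
        # evaluated at the nearest grid point (simplified).
        increments = []
        for sx, sy, obs in stations:
            nearest = min(range(len(grid_xy)),
                          key=lambda i: (grid_xy[i][0] - sx) ** 2
                                        + (grid_xy[i][1] - sy) ** 2)
            increments.append((sx, sy, obs - field[nearest]))
        for i, (gx, gy) in enumerate(grid_xy):
            num = den = 0.0
            for sx, sy, inc in increments:
                d2 = (gx - sx) ** 2 + (gy - sy) ** 2
                if d2 < (2.0 * R) ** 2:          # influence cut-off at 2R
                    w = math.exp(-d2 / (2.0 * R * R))
                    num += w * inc
                    den += w
            if den > 0.0:
                field[i] += num / den
    return field
```

With each pass, stations correct a smaller neighborhood, so the analysis converges to the observations near stations while keeping the smooth background in data-sparse areas.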
Guo, Xiaoyi; Zhang, Hongyan; Wu, Zhengfang; Zhao, Jianjun; Zhang, Zhengxiang
2017-01-01
Time series of the Normalized Difference Vegetation Index (NDVI) derived from multiple satellite sensors are crucial data for studying vegetation dynamics. The Land Long Term Data Record Version 4 (LTDR V4) NDVI dataset was recently released at a 0.05 × 0.05° spatial resolution and daily temporal resolution. In this study, annual NDVI time series composited from the LTDR V4 and Moderate Resolution Imaging Spectroradiometer (MODIS) NDVI datasets (MOD13C1) are compared and evaluated for the period from 2001 to 2014 in China. The spatial patterns of the NDVI generally match between the LTDR V4 and MOD13C1 datasets. The transitional zone between high and low NDVI values generally matches the boundary between semi-arid and sub-humid regions. A significant and high coefficient of determination is found between the two datasets according to a pixel-based correlation analysis. The spatially averaged NDVI of LTDR V4 is characterized by a much weaker positive regression slope than that of MOD13C1 because of changes in NOAA AVHRR sensors between 2005 and 2006. The NDVI values of LTDR V4 were consistently higher than those of MOD13C1 in western China, due to the relatively low atmospheric water vapor content there, while the opposite was observed in eastern China. In total, 18.54% of the LTDR V4 NDVI pixels exhibit significant trends, whereas 35.79% of the MOD13C1 NDVI pixels show significant trends. Good agreement is observed between the significant trends of the two datasets in the Northeast Plain, Bohai Economic Rim, Loess Plateau, and Yangtze River Delta. By contrast, the datasets disagree in the northwestern desert regions and southern China. A trend analysis of the regression slope values by vegetation type shows good agreement between the LTDR V4 and MOD13C1 datasets.
This study demonstrates the spatial and temporal consistencies and discrepancies between the AVHRR LTDR and MODIS MOD13C1 NDVI products in China, which could provide useful information for the choice of NDVI products in subsequent studies of vegetation dynamics. PMID:28587266
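The per-pixel trend analysis described above (regression slopes plus a significance test) can be sketched for a single pixel's annual NDVI series: fit an ordinary least-squares slope against year and form a t statistic on the slope, which would then be compared against a critical value to flag a "significant" trend. This is a generic OLS sketch, not the authors' exact procedure.

```python
import math

def ols_slope(years, ndvi):
    """OLS slope of NDVI vs. year for one pixel, plus the t statistic
    on the slope (slope / standard error)."""
    n = len(years)
    mx = sum(years) / n
    my = sum(ndvi) / n
    sxx = sum((x - mx) ** 2 for x in years)
    sxy = sum((x - mx) * (y - my) for x, y in zip(years, ndvi))
    slope = sxy / sxx
    intercept = my - slope * mx
    # Residual-based standard error of the slope.
    resid = [y - (intercept + slope * x) for x, y in zip(years, ndvi)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    t = slope / se if se > 0 else float("inf")
    return slope, t
```

Applying this to every pixel and counting those whose |t| exceeds the critical value reproduces percentages of the kind reported (18.54% vs. 35.79%).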
Extreme flood event analysis in Indonesia based on rainfall intensity and recharge capacity
NASA Astrophysics Data System (ADS)
Narulita, Ida; Ningrum, Widya
2018-02-01
Indonesia is very vulnerable to flood disasters because it experiences heavy rainfall throughout the year. Flooding is considered the most significant hazard because it causes social, economic, and human losses. The purpose of this study is to analyze extreme flood events based on satellite rainfall datasets, in order to understand the rainfall characteristics (rainfall intensity, rainfall pattern, etc.) preceding flood disasters for the monsoonal, equatorial, and local rainfall types. Recharge capacity is analyzed using land cover and soil distribution. The data used in this study are the CHIRPS satellite rainfall dataset, at 0.05° spatial resolution and daily temporal resolution; the GSMaP satellite rainfall dataset operated by JAXA, at 1-hour temporal resolution and 0.1° spatial resolution; and land use and soil distribution maps for the recharge capacity analysis. The rainfall characteristics before flooding and the recharge capacity analysis are expected to provide important information for flood mitigation in Indonesia.
Data-driven gating in PET: Influence of respiratory signal noise on motion resolution.
Büther, Florian; Ernst, Iris; Frohwein, Lynn Johann; Pouw, Joost; Schäfers, Klaus Peter; Stegger, Lars
2018-05-21
Data-driven gating (DDG) approaches for positron emission tomography (PET) are interesting alternatives to conventional hardware-based gating methods. In DDG, the measured PET data themselves are utilized to calculate a respiratory signal that is subsequently used for gating purposes. The success of gating is then highly dependent on the statistical quality of the PET data. In this study, we investigate how this quality determines signal noise, and thus motion resolution, in clinical PET scans using a center-of-mass-based (COM) DDG approach, specifically with regard to motion management of target structures in future radiotherapy planning applications. PET list mode datasets acquired in one bed position of 19 different radiotherapy patients undergoing pretreatment [18F]FDG PET/CT or [18F]FDG PET/MRI were included in this retrospective study. All scans were performed over a region with organs (myocardium, kidneys) or tumor lesions of high tracer uptake and under free breathing. Aside from the original list mode data, datasets with progressively decreasing PET statistics were generated. From these, COM DDG signals were derived for subsequent amplitude-based gating of the original list mode file. The apparent respiratory shift d from end-expiration to end-inspiration was determined from the gated images and expressed as a function of the signal-to-noise ratio SNR of the determined gating signals. This relation was tested against an additional 25 [18F]FDG PET/MRI list mode datasets, where high-precision MR navigator-like respiratory signals were available as reference signals for respiratory gating of the PET data, and against data from a dedicated thorax phantom scan. All 19 original high-quality list mode datasets demonstrated the same behavior in terms of motion resolution when reducing the amount of list mode events for DDG signal generation.
Ratios and directions of respiratory shifts between end-respiratory gates and the respective nongated image were constant over all statistics levels. Motion resolution d/d_max could be modeled as d/d_max = 1 - exp(-1.52 (SNR - 1)^0.52), with d_max as the actual respiratory shift. Determining d_max from d and SNR in the 25 test datasets and the phantom scan demonstrated no significant differences from the MR navigator-derived shift values and the predefined shift, respectively. The SNR can thus serve as a general metric to assess the success of COM-based DDG, even across different scanners and patients. The derived formula for motion resolution can be used to estimate the actual motion extent reasonably well in cases of limited PET raw data statistics. This may be of interest for individualized radiotherapy treatment planning of target structures subject to respiratory motion. © 2018 American Association of Physicists in Medicine.
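The empirical motion-resolution model quoted above lends itself to a small numerical sketch: compute the recovered fraction d/d_max from the gating-signal SNR, and invert it to estimate the true shift d_max from a measured shift d. Only the formula comes from the abstract; the function names and the SNR ≤ 1 guard are our assumptions.

```python
import math

def motion_resolution_fraction(snr):
    """Recovered fraction of the true respiratory shift, d/d_max,
    per the empirical model d/d_max = 1 - exp(-1.52 (SNR - 1)^0.52)."""
    if snr <= 1.0:
        return 0.0  # assumed: no motion resolved at or below SNR = 1
    return 1.0 - math.exp(-1.52 * (snr - 1.0) ** 0.52)

def estimate_true_shift(d_measured, snr):
    """Invert the model: estimate d_max from the apparent shift d and SNR."""
    frac = motion_resolution_fraction(snr)
    if frac <= 0.0:
        raise ValueError("SNR too low to resolve motion")
    return d_measured / frac
```

This inversion is exactly the use case the abstract proposes: correcting an apparent shift measured from noisy, low-statistics gating signals back toward the actual motion extent.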
Intercomparison of three microwave/infrared high resolution line-by-line radiative transfer codes
NASA Astrophysics Data System (ADS)
Schreier, F.; Garcia, S. Gimeno; Milz, M.; Kottayil, A.; Höpfner, M.; von Clarmann, T.; Stiller, G.
2013-05-01
An intercomparison of three line-by-line (lbl) codes developed independently for atmospheric sounding - ARTS, GARLIC, and KOPRA - has been performed for a thermal infrared nadir sounding application assuming a HIRS-like (High resolution Infrared Radiation Sounder) setup. Radiances for the HIRS infrared channels and a set of 42 atmospheric profiles from the "Garand dataset" have been computed. Results of this intercomparison and a discussion of the reasons for the observed differences are presented.
NASA Astrophysics Data System (ADS)
Shukla, Shraddhanand; Funk, Chris; Peterson, Pete; McNally, Amy; Dinku, Tufa; Barbosa, Humberto; Paredes-Trejo, Franklin; Pedreros, Diego; Husak, Greg
2017-04-01
A high-quality, long-term, high-resolution precipitation dataset is key to supporting drought-related risk management and food security early warning. Here, we present the Climate Hazards group InfraRed Precipitation with Stations (CHIRPS) v2.0, developed by scientists at the University of California, Santa Barbara and the U.S. Geological Survey Earth Resources Observation and Science Center under the direction of the Famine Early Warning Systems Network (FEWS NET). CHIRPS is a quasi-global precipitation product made available at daily to seasonal time scales with a spatial resolution of 0.05° and a 1981 to near real-time period of record. We begin by describing the three main components of CHIRPS - a high-resolution climatology, time-varying cold cloud duration precipitation estimates, and in situ precipitation estimates - and how they are combined. We then present a validation of this dataset and describe how CHIRPS is being disseminated and used in different applications, such as large-scale hydrologic models and crop water balance models. Validation of CHIRPS has focused on comparisons with precipitation products with global coverage, long periods of record, and near real-time availability, such as the CPC-Unified, CFS Reanalysis, and ECMWF datasets, as well as datasets such as GPCC and GPCP that incorporate high-quality in situ data from places such as Uganda, Colombia, and the Sahel. CHIRPS is shown to have low systematic errors (bias) and low mean absolute errors. We find that CHIRPS performance is quite similar to that of research-quality products like GPCC and GPCP, but with higher resolution and lower latency. We also present results from independent validation studies focused on South America and East Africa. CHIRPS is currently being used to drive the FEWS NET Land Data Assimilation System (FLDAS), which incorporates multiple hydrologic models, and the Water Requirement Satisfaction Index (WRSI), a widely used crop water balance model.
The outputs (such as soil moisture and runoff) from these models are being used for real-time drought monitoring in Africa. With support from USAID FEWS NET, CHG/USGS has developed a two-way strategy to disseminate CHIRPS and related products (e.g., FLDAS, WRSI) and to incorporate contributed station data. For example, we are currently working with partners in Mexico (Conagua), Southern Africa (SASSCAL), Colombia (IDEAM), Nigeria (Kukua), Somalia (SWALIM), and Ethiopia (NMA). These institutions provide in situ observations that enhance CHIRPS, and CHIRPS provides feedback on data quality. CHIRPS data are then placed in a web-accessible geospatial database, from which partners in these countries can access CHIRPS and other outputs and display this information using web-based mapping tools. This provides a win-win collaboration, leading to improved globally accessible precipitation estimates and improved climate services in developing nations.
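The validation statistics named above, systematic error (bias) and mean absolute error, are simple to compute against co-located station observations; a minimal sketch might look like this (function name and interface are assumptions).

```python
def bias_and_mae(estimates, observations):
    """Bias (mean error) and mean absolute error of gridded precipitation
    estimates against co-located station observations."""
    errors = [e - o for e, o in zip(estimates, observations)]
    bias = sum(errors) / len(errors)
    mae = sum(abs(err) for err in errors) / len(errors)
    return bias, mae
```

A product can have near-zero bias (over- and under-estimates cancel) while still carrying substantial MAE, which is why the abstract reports both.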
Color imaging of Mars by the High Resolution Imaging Science Experiment (HiRISE)
Delamere, W.A.; Tornabene, L.L.; McEwen, A.S.; Becker, K.; Bergstrom, J.W.; Bridges, N.T.; Eliason, E.M.; Gallagher, D.; Herkenhoff, K. E.; Keszthelyi, L.; Mattson, S.; McArthur, G.K.; Mellon, M.T.; Milazzo, M.; Russell, P.S.; Thomas, N.
2010-01-01
HiRISE has been producing a large number of scientifically useful color products of Mars and other planetary objects. The three broad spectral bands, coupled with the highly sensitive 14-bit detectors and time delay integration, enable detection of subtle color differences. The very high spatial resolution of HiRISE can augment mineralogic interpretations based on multispectral (THEMIS) and hyperspectral datasets (TES, OMEGA, and CRISM) and thereby enable detailed geologic and stratigraphic interpretations at meter scales. In addition to providing some examples of color images and their interpretation, we describe the processing techniques used to produce them and note some of the minor artifacts in the output. We also provide an example of how HiRISE color products can be effectively used to expand mineral and lithologic mapping provided by CRISM data products that are backed by other spectral datasets. The utility of high-quality color data for understanding geologic processes on Mars has been one of the major successes of HiRISE. © 2009 Elsevier Inc.
One-way coupling of an atmospheric and a hydrologic model in Colorado
Hay, L.E.; Clark, M.P.; Pagowski, M.; Leavesley, G.H.; Gutowski, W.J.
2006-01-01
This paper examines the accuracy of high-resolution nested mesoscale model simulations of surface climate. The nesting capabilities of the atmospheric fifth-generation Pennsylvania State University (PSU)-National Center for Atmospheric Research (NCAR) Mesoscale Model (MM5) were used to create high-resolution, 5-yr climate simulations (from 1 October 1994 through 30 September 1999), starting with a coarse nest of 20 km for the western United States. During this 5-yr period, two finer-resolution nests (5 and 1.7 km) were run over the Yampa River basin in northwestern Colorado. Raw and bias-corrected daily precipitation and maximum and minimum temperature time series from the three MM5 nests were used as input to the U.S. Geological Survey's distributed hydrologic model [the Precipitation Runoff Modeling System (PRMS)] and were compared with PRMS results using measured climate station data. The distributed capabilities of PRMS were provided by partitioning the Yampa River basin into hydrologic response units (HRUs). In addition to the classic polygon method of HRU definition, HRUs for PRMS were defined based on the three MM5 nests. This resulted in 16 datasets being tested using PRMS. The input datasets were derived using measured station data and raw and bias-corrected MM5 20-, 5-, and 1.7-km output distributed to (1) polygon HRUs and (2) 20-, 5-, and 1.7-km gridded HRUs, respectively. Each dataset was calibrated independently, using a multiobjective, stepwise automated procedure. Final results showed a general increase in the accuracy of simulated runoff with an increase in HRU resolution. In all steps of the calibration procedure, the station-based simulations of runoff showed higher accuracy than the MM5-based simulations, although the accuracy of the MM5 simulations was close to that of station data for the high-resolution nests. Further work is warranted to identify the causes of the biases in MM5 local climate simulations and to develop methods to remove them. © 2006 American Meteorological Society.
A phenome-wide examination of neural and cognitive function.
Poldrack, R A; Congdon, E; Triplett, W; Gorgolewski, K J; Karlsgodt, K H; Mumford, J A; Sabb, F W; Freimer, N B; London, E D; Cannon, T D; Bilder, R M
2016-12-06
This data descriptor outlines a shared neuroimaging dataset from the UCLA Consortium for Neuropsychiatric Phenomics, which focused on understanding the dimensional structure of memory and cognitive control (response inhibition) functions in both healthy individuals (130 subjects) and individuals with neuropsychiatric disorders including schizophrenia (50 subjects), bipolar disorder (49 subjects), and attention deficit/hyperactivity disorder (43 subjects). The dataset includes an extensive set of task-based fMRI assessments, resting fMRI, structural MRI, and high angular resolution diffusion MRI. The dataset is shared through the OpenfMRI project, and is formatted according to the Brain Imaging Data Structure (BIDS) standard.
NASA Astrophysics Data System (ADS)
Ramage, J. M.; Brodzik, M. J.; Hardman, M.; Troy, T. J.
2017-12-01
Snow is a vital part of the terrestrial hydrological cycle and a crucial resource for people and ecosystems. In mountainous regions snow is extensive, variable, and challenging to document. Snow melt timing and duration are important factors affecting the transfer of snow mass to soil moisture and runoff. Passive microwave brightness temperature (Tb) changes at 36 and 18 GHz are a sensitive way to detect snow melt onset owing to their sensitivity to the abrupt change in emissivity; they are widely used over large icefields and high-latitude watersheds. The coarse resolution (~25 km) of historically available data has precluded effective use in high-relief, heterogeneous regions, and gaps between swaths also create temporal data gaps at lower latitudes. New enhanced-resolution data products generated with a radiometer version of the scatterometer image reconstruction (rSIR) technique are available at the original frequencies. We use these Calibrated Enhanced-Resolution Brightness Temperature (CETB) Earth System Data Records (ESDRs) to evaluate existing snow melt detection algorithms that have been used in other environments, including the cross-polarized gradient ratio (XPGR) and diurnal amplitude variations (DAV) approaches. We use the 36/37 GHz (3.125 km resolution) and 18/19 GHz (6.25 km resolution) vertically and horizontally polarized datasets from the Special Sensor Microwave Imager (SSM/I) and the Advanced Microwave Scanning Radiometer for EOS (AMSR-E) and evaluate them for use in this high-relief environment. The new data are used to assess glacier and snow melt records in the Hunza River Basin (area ~13,000 sq. km, located at 36°N, 74°E), a tributary to the Upper Indus Basin, Pakistan. We compare the melt timing results visually and quantitatively to the corresponding EASE-Grid 2.0 25-km dataset, SRTM topography, and surface temperatures from station and reanalysis data.
The new dataset is coarser than the topography, but is able to differentiate signals of melt/refreeze timing for different altitudes and land cover in this remote area, where snow melt and glacier discharge pose significant hazards. The improved spatial resolution, enhanced to ~3-6 km while retaining twice-daily observations, is a key improvement for fully analyzing snowpack melt characteristics in remote mountainous regions.
NASA Astrophysics Data System (ADS)
Johnson, M.; Ramage, J. M.; Troy, T. J.; Brodzik, M. J.
2017-12-01
Understanding the timing of snowmelt is critical for water resources management in snow-dominated watersheds. Passive microwave remote sensing has been used to estimate melt-refreeze events through brightness temperature observations taken with sensors such as the Special Sensor Microwave Imager (SSM/I) and the Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E). Previous studies were limited to lower-resolution (~25 km) datasets, making it difficult to quantify the snowpack in heterogeneous, high-relief areas. This study investigates the use of newly available passive microwave calibrated, enhanced-resolution brightness temperatures (CETB) produced at the National Snow and Ice Data Center to estimate melt timing at much higher spatial resolution (~3-6 km). CETB datasets generated from SSM/I and AMSR-E records will be used to examine three mountainous basins in Colorado. The CETB datasets retain twice-daily (day/night) observations of brightness temperatures; we therefore employ the diurnal amplitude variation (DAV) method to detect melt onset and melt occurrences, to determine whether algorithms developed for legacy data remain valid with the improved CETB dataset. We compare melt variability with nearby stream discharge records to determine an optimum melt onset algorithm using the newly reprocessed data. This study investigates the effectiveness of the CETB product for several locations in Colorado (North Park, Rabbit Ears, Fraser) that were the sites of previous ground and airborne surveys during the NASA Cold Land Processes Field Experiment (CLPX 2002-2003). In summary, this work lays the foundation for utilizing higher-resolution reprocessed CETB data for snow evolution more broadly, across a range of environments. The new processing methods and improved spatial resolution will enable hydrologists to better analyze trends in snow-dominated mountainous watersheds for more effective water resources management.
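The DAV approach named in these abstracts can be sketched as follows: a melt day shows a large day/night brightness-temperature swing together with a warm daytime Tb. The two thresholds and the run-length definition of melt onset below are illustrative assumptions, not the studies' calibrated values.

```python
import numpy as np

# Illustrative thresholds (kelvin); actual values are tuned per sensor/region.
TB_THRESHOLD_K = 252.0   # assumed minimum daytime Tb indicating a wet snowpack
DAV_THRESHOLD_K = 10.0   # assumed |day - night| amplitude indicating melt-refreeze

def dav_melt_flags(tb_day_k, tb_night_k):
    """Boolean melt flag per day from day/night Tb series (kelvin)."""
    tb_day = np.asarray(tb_day_k, dtype=float)
    tb_night = np.asarray(tb_night_k, dtype=float)
    dav = np.abs(tb_day - tb_night)
    return (dav >= DAV_THRESHOLD_K) & (tb_day >= TB_THRESHOLD_K)

def melt_onset_index(melt_flags, run_length=3):
    """Index starting the first run of `run_length` consecutive melt days,
    or -1 if no such run exists (an assumed definition of melt onset)."""
    count = 0
    for i, flag in enumerate(melt_flags):
        count = count + 1 if flag else 0
        if count == run_length:
            return i - run_length + 1
    return -1
```

The twice-daily CETB observations are what make the day/night difference in `dav_melt_flags` available at 3-6 km; legacy 25-km products supported the same computation but smeared the signal across heterogeneous terrain.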
Sturdivant, Emily; Lentz, Erika; Thieler, E. Robert; Farris, Amy; Weber, Kathryn; Remsen, David P.; Miner, Simon; Henderson, Rachel
2017-01-01
The vulnerability of coastal systems to hazards such as storms and sea-level rise is typically characterized using a combination of ground and manned airborne systems that have limited spatial or temporal scales. Structure-from-motion (SfM) photogrammetry applied to imagery acquired by unmanned aerial systems (UAS) offers a rapid and inexpensive means to produce high-resolution topographic and visual reflectance datasets that rival existing lidar and imagery standards. Here, we use SfM to produce an elevation point cloud, an orthomosaic, and a digital elevation model (DEM) from data collected by UAS at a beach and wetland site in Massachusetts, USA. We apply existing methods to (a) determine the position of shorelines and foredunes using a feature extraction routine developed for lidar point clouds and (b) map land cover from the rasterized surfaces using a supervised classification routine. In both analyses, we experimentally vary the input datasets to understand the benefits and limitations of UAS-SfM for coastal vulnerability assessment. We find that (a) geomorphic features are extracted from the SfM point cloud with near-continuous coverage and sub-meter precision, better than was possible from a recent lidar dataset covering the same area; and (b) land cover classification is greatly improved by including topographic data with visual reflectance, but changes to resolution (when <50 cm) have little influence on the classification accuracy.
The Transition of NASA EOS Datasets to WFO Operations: A Model for Future Technology Transfer
NASA Technical Reports Server (NTRS)
Darden, C.; Burks, J.; Jedlovec, G.; Haines, S.
2007-01-01
The collocation of a National Weather Service (NWS) Forecast Office with atmospheric scientists from NASA/Marshall Space Flight Center (MSFC) in Huntsville, Alabama has afforded a unique opportunity for science sharing and technology transfer. Specifically, the NWS office in Huntsville has interacted closely with research scientists within the SPoRT (Short-term Prediction Research and Transition) Center at MSFC. One significant technology transfer that has reaped dividends is the transition of unique NASA EOS polar orbiting datasets into NWS field operations. NWS forecasters primarily rely on the AWIPS (Advanced Weather Information and Processing System) decision support system for their day-to-day forecast and warning decision making. Unfortunately, the transition of data from operational polar orbiters or low inclination orbiting satellites into AWIPS has been relatively slow for a variety of reasons. The ability to integrate these high resolution NASA datasets into operations has yielded several benefits. The MODIS (MODerate resolution Imaging Spectroradiometer) instrument flying on the Aqua and Terra satellites provides a broad spectrum of multispectral observations at resolutions as fine as 250 m. Forecasters routinely utilize these datasets to locate fine lines, boundaries, smoke plumes, locations of fog or haze fields, and other mesoscale features. In addition, these important datasets have been transitioned to other WFOs for a variety of local uses. For instance, WFO Great Falls, Montana utilizes the MODIS snow cover product for hydrologic planning purposes, while several coastal offices utilize the output from the MODIS and AMSR-E instruments to supplement observations in the data-sparse regions of the Gulf of Mexico and western Atlantic. In the short term, these datasets have benefited local WFOs in a variety of ways. 
In the longer term, the process by which these unique datasets were successfully transitioned to operations will benefit the planning and implementation of products and datasets derived from both NPP and NPOESS. This presentation will provide a brief overview of current WFO usage of satellite data, the transition of datasets between SPoRT and the NWS, and lessons learned for future transition efforts.
Large-scale imputation of epigenomic datasets for systematic annotation of diverse human tissues.
Ernst, Jason; Kellis, Manolis
2015-04-01
With hundreds of epigenomic maps, the opportunity arises to exploit the correlated nature of epigenetic signals, across both marks and samples, for large-scale prediction of additional datasets. Here, we undertake epigenome imputation by leveraging such correlations through an ensemble of regression trees. We impute 4,315 high-resolution signal maps, of which 26% are also experimentally observed. Imputed signal tracks show overall similarity to observed signals and surpass experimental datasets in consistency, recovery of gene annotations and enrichment for disease-associated variants. We use the imputed data to detect low-quality experimental datasets, to find genomic sites with unexpected epigenomic signals, to define high-priority marks for new experiments and to delineate chromatin states in 127 reference epigenomes spanning diverse tissues and cell types. Our imputed datasets provide the most comprehensive human regulatory region annotation to date, and our approach and the ChromImpute software constitute a useful complement to large-scale experimental mapping of epigenomic information.
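The ensemble-of-regression-trees idea behind the imputation can be illustrated in miniature. Each "tree" below is a depth-1 stump fit on a bootstrap resample, and the ensemble mean imputes a target signal track from correlated predictor tracks; ChromImpute itself is far richer than this sketch, and all names here are illustrative:

```python
import numpy as np

# Toy ensemble-of-trees imputation: predict a target epigenomic signal
# at each position from correlated predictor signals by averaging many
# bootstrap-fit regression stumps (depth-1 trees).

rng = np.random.default_rng(0)

def fit_stump(x, y):
    """Best single-feature threshold split minimizing squared error."""
    best = None
    for j in range(x.shape[1]):
        for t in np.unique(x[:, j]):
            left, right = y[x[:, j] <= t], y[x[:, j] > t]
            if len(left) == 0 or len(right) == 0:
                continue
            err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, t, left.mean(), right.mean())
    if best is None:                      # degenerate sample: constant predictors
        return 0, -np.inf, y.mean(), y.mean()
    _, j, t, lo, hi = best
    return j, t, lo, hi

def predict_stump(stump, x):
    j, t, lo, hi = stump
    return np.where(x[:, j] <= t, lo, hi)

def impute(x_train, y_train, x_new, n_trees=25):
    """Mean of stumps fit on bootstrap resamples (a mini ensemble)."""
    preds = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(y_train), len(y_train))
        stump = fit_stump(x_train[idx], y_train[idx])
        preds.append(predict_stump(stump, x_new))
    return np.mean(preds, axis=0)
```

In the real setting, `x_train` columns would be observed marks in the same sample and the same mark in other samples, exactly the correlation structure the abstract describes exploiting.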
NASA Technical Reports Server (NTRS)
Claverie, Martin; Matthews, Jessica L.; Vermote, Eric F.; Justice, Christopher O.
2016-01-01
In land surface models, which are used to evaluate the role of vegetation in the context of global climate change and variability, LAI and FAPAR play a key role, specifically with respect to the carbon and water cycles. The AVHRR-based LAI/FAPAR dataset offers daily temporal resolution, an improvement over previous products. This climate data record is based on a carefully calibrated and corrected land surface reflectance dataset to provide a high-quality, consistent time series suitable for climate studies. It spans from mid-1981 to the present. Further, this operational dataset is available in near real-time, allowing use for monitoring purposes. The algorithm relies on artificial neural networks calibrated using the MODIS LAI/FAPAR dataset. Evaluation based on cross-comparison with MODIS products and in situ data shows the dataset is consistent and reliable, with overall uncertainties of 1.03 and 0.15 for LAI and FAPAR, respectively. However, a clear saturation effect is observed in the broadleaf forest biomes with high LAI (greater than 4.5) and FAPAR (greater than 0.8) values.
NASA's Applied Remote Sensing Training (ARSET) Webinar Series
Atmospheric Science Data Center
2018-01-30
Wednesday, January 17, 2018: Data Analysis Tools for High Resolution Air Quality Satellite Datasets. For the agenda, registration, and additional course information, please access https://go.nasa.gov/2jmhRVD
A Spiral-Based Downscaling Method for Generating 30 m Time Series Image Data
NASA Astrophysics Data System (ADS)
Liu, B.; Chen, J.; Xing, H.; Wu, H.; Zhang, J.
2017-09-01
The spatial detail and update frequency of land cover data are important factors for land surface dynamic monitoring applications at high spatial resolution. However, the fragmented patches and seasonal variability of some land cover types (e.g., small crop fields, wetlands) make the generation of land cover data labor-intensive and difficult. Utilizing high spatial resolution multi-temporal image data is a possible solution. Unfortunately, the spatial and temporal resolutions of available remote sensing data such as the Landsat or MODIS datasets can hardly satisfy the minimum mapping unit and the update frequency of current land cover mapping at the same time. Generating a high resolution time series may be a compromise to cover this shortage in the land cover updating process. One popular approach is to downscale multi-temporal MODIS data with high spatial resolution auxiliary data such as Landsat. However, the usual manner of downscaling pixels based on a fixed window may lead to an underdetermined problem in heterogeneous areas, resulting in uncertainty for some high spatial resolution pixels; the downscaled multi-temporal data therefore can hardly reach the spatial resolution of Landsat data. A spiral-based method is introduced to downscale image data of low spatial and high temporal resolution to high spatial and high temporal resolution. By searching for similar pixels in the adjacent region along a spiral, a pixel set is built up pixel by pixel. Solving the linear system with this pixel set largely prevents the underdetermined problem. Using ordinary least squares, the method inverts the endmember values of the linear system, and the high spatial resolution image is reconstructed band by band from the high spatial resolution class map and the endmember values. 
The high spatial resolution time series is then formed from these images. A simulated experiment and a remote sensing image downscaling experiment were conducted. In the simulated experiment, the 30 m class map dataset GlobeLand30 was adopted to investigate how well the underdetermined problem is avoided in the downscaling procedure, and a comparison between the spiral and window approaches was conducted. Further, MODIS NDVI and Landsat image data were used to generate a 30 m NDVI time series in the remote sensing image downscaling experiment. The simulated experiment results showed that the proposed method performs robustly when downscaling pixels in heterogeneous regions and is superior to traditional window-based methods. The high resolution time series generated may benefit the mapping and updating of land cover data.
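The linear-unmixing core of the method can be sketched as follows. Each coarse pixel's value is modeled as a mixture of class endmembers weighted by the class fractions from the 30 m class map, and ordinary least squares recovers the endmember values once the gathered pixel set makes the system overdetermined. The spiral search itself is elided here, and the fractions and values are illustrative:

```python
import numpy as np

# Downscaling core: coarse value = class fractions · class endmembers.
# Gathering several neighbouring coarse pixels (the spiral's job) makes
# F e = v overdetermined; OLS then recovers the endmember values e.

def class_fractions(class_block, n_classes):
    """Fraction of each class inside one coarse pixel's footprint."""
    counts = np.bincount(class_block.ravel(), minlength=n_classes)
    return counts / class_block.size

def solve_endmembers(fraction_rows, coarse_values):
    """OLS solve of F e = v for the per-class endmember values e."""
    F = np.asarray(fraction_rows, dtype=float)
    v = np.asarray(coarse_values, dtype=float)
    e, *_ = np.linalg.lstsq(F, v, rcond=None)
    return e

# Illustrative: two classes, three coarse pixels with known fractions
# mixing the (unknown) endmember values 0.2 and 0.8.
F = [[0.75, 0.25], [0.25, 0.75], [0.5, 0.5]]
v = [0.75 * 0.2 + 0.25 * 0.8,
     0.25 * 0.2 + 0.75 * 0.8,
     0.5 * 0.2 + 0.5 * 0.8]
e = solve_endmembers(F, v)
```

Writing the recovered endmember value of each class back to the 30 m class-map pixels then reconstructs the high resolution image band by band, as the abstract describes.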
NASA Astrophysics Data System (ADS)
Williamson, A.; Newman, A. V.
2017-12-01
Finite fault inversions utilizing multiple datasets have become commonplace for large earthquakes when data are available. The mixture of geodetic datasets such as Global Navigation Satellite System (GNSS) and InSAR observations, seismic waveforms, and, when applicable, tsunami waveforms from Deep-ocean Assessment and Reporting of Tsunami (DART) gauges provides slightly different observations that, when incorporated together, lead to a more robust model of the fault slip distribution. The merging of different datasets is of particular importance along subduction zones, where direct observations of seafloor deformation over the rupture area are extremely limited. Instead, instrumentation measures related ground motion from tens to hundreds of kilometers away. The distance from the event and the dataset type lead to a variable degree of resolution, affecting the ability to accurately model the spatial distribution of slip. This study analyzes the spatial resolution attained individually from geodetic and tsunami datasets as well as from a combined dataset. We constrain the importance of the distance between estimated parameters and observed data, and how it varies between land-based and open-ocean datasets. Analysis focuses on accurately scaled subduction zone synthetic models as well as on the relationship between slip and data in recent large subduction zone earthquakes. This study shows that datasets sensitive to seafloor deformation, such as open-ocean tsunami waveforms or seafloor geodetic instrumentation, can provide unique offshore resolution for understanding most large, and particularly tsunamigenic, megathrust earthquake activity. In most environments, we simply lack the capability to resolve static displacements using land-based geodetic observations.
Vanderhoof, Melanie; Fairaux, Nicole; Beal, Yen-Ju G.; Hawbaker, Todd J.
2017-01-01
The Landsat Burned Area Essential Climate Variable (BAECV), developed by the U.S. Geological Survey (USGS), capitalizes on the long temporal availability of Landsat imagery to identify burned areas across the conterminous United States (CONUS) (1984–2015). Adequate validation of such products is critical for their proper usage and interpretation. Validation of coarse-resolution products often relies on independent data derived from moderate-resolution sensors (e.g., Landsat). Validation of Landsat products, in turn, is challenging because there is no corresponding source of high-resolution, multispectral imagery that has been systematically collected in space and time over the entire temporal extent of the Landsat archive. Because of this, comparison between high-resolution images and Landsat science products can help increase users' confidence in the Landsat science products, but may not, alone, be adequate. In this paper, we demonstrate an approach to systematically validate the Landsat-derived BAECV product. Burned area extent was mapped for Landsat image pairs using a manually trained, semi-automated algorithm whose output was manually edited, across 28 path/rows and five different years (1988, 1993, 1998, 2003, 2008). Three datasets were independently developed by three analysts and integrated on a pixel-by-pixel basis, requiring that at least one, two, or all three analysts agreed a pixel was burned. We found that errors within our Landsat reference dataset could be minimized by using the rendition of the dataset in which pixels were mapped as burned if at least two of the three analysts agreed. BAECV errors of omission and commission for the detection of burned pixels averaged 42% and 33%, respectively, for CONUS across all five validation years. 
Errors of omission and commission were lowest across the western CONUS, for example in the shrub and scrublands of the Arid West (31% and 24%, respectively), and highest in the grasslands and agricultural lands of the Great Plains in central CONUS (62% and 57%, respectively). The BAECV product detected most (> 65%) fire events > 10 ha across the western CONUS (Arid and Mountain West ecoregions). Our approach and results demonstrate that a thorough validation of Landsat science products can be completed with independent Landsat-derived reference data, but could be strengthened by the use of complementary sources of high-resolution data.
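The analyst-agreement rule used to build the reference dataset reduces to a per-pixel vote, sketched here with illustrative 2x2 maps:

```python
import numpy as np

# Reference-data integration: three analysts each produce a binary
# burned/unburned map, and a pixel enters the reference dataset as
# burned when at least k analysts agree (the paper found k = 2
# minimized reference-dataset errors).

def integrate_analysts(maps, min_agree=2):
    """maps: list of equally shaped 0/1 arrays; returns 0/1 consensus."""
    stack = np.stack([np.asarray(m) for m in maps])
    return (stack.sum(axis=0) >= min_agree).astype(int)

# Illustrative 2x2 burned maps from three analysts.
a = np.array([[1, 0], [1, 1]])
b = np.array([[1, 0], [0, 1]])
c = np.array([[0, 0], [0, 1]])
consensus = integrate_analysts([a, b, c])
```

Varying `min_agree` from 1 to 3 reproduces the "one to all three analysts" renditions the paper compares.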
MSWEP V2 global 3-hourly 0.1° precipitation: methodology and quantitative appraisal
NASA Astrophysics Data System (ADS)
Beck, H.; Yang, L.; Pan, M.; Wood, E. F.; William, L.
2017-12-01
Here, we present Multi-Source Weighted-Ensemble Precipitation (MSWEP) V2, the first fully global gridded precipitation (P) dataset with a 0.1° spatial resolution. The dataset covers the period 1979-2016, has a 3-hourly temporal resolution, and was derived by optimally merging a wide range of data sources based on gauges (WorldClim, GHCN-D, GSOD, and others), satellites (CMORPH, GridSat, GSMaP, and TMPA 3B42RT), and reanalyses (ERA-Interim, JRA-55, and NCEP-CFSR). MSWEP V2 implements some major improvements over V1, such as (i) the correction of distributional P biases using cumulative distribution function matching, (ii) increasing the spatial resolution from 0.25° to 0.1°, (iii) the inclusion of ocean areas, (iv) the addition of NCEP-CFSR P estimates, (v) the addition of thermal infrared-based P estimates for the pre-TRMM era, (vi) the addition of 0.1° daily interpolated gauge data, (vii) the use of a daily gauge correction scheme that accounts for regional differences in the 24-hour accumulation period of gauges, and (viii) extension of the data record to 2016. The gauge-based assessment of the reanalysis and satellite P datasets, necessary for establishing the merging weights, revealed that the reanalysis datasets strongly overestimate the P frequency for the entire globe, and that the satellite (resp. reanalysis) datasets consistently performed better at low (high) latitudes. Compared to other state-of-the-art P datasets, MSWEP V2 exhibits more plausible global patterns in mean annual P, percentiles, and annual number of dry days, and better resolves the small-scale variability over topographically complex terrain. Other P datasets appear to consistently underestimate P amounts over mountainous regions. Long-term mean P estimates for the global, land, and ocean domains based on MSWEP V2 are 959, 796, and 1026 mm/yr, respectively, in close agreement with the best previous published estimates.
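The correction of distributional P biases by cumulative distribution function matching (improvement i above) can be sketched as an empirical quantile mapping; MSWEP V2's actual scheme is more elaborate, and the arrays here are illustrative:

```python
import numpy as np

# CDF matching: map each source value through its own empirical CDF
# onto the quantiles of a reference (gauge-based) distribution. The
# corrected series inherits the reference distribution while keeping
# the source's rank order.

def cdf_match(source, reference):
    src = np.asarray(source, dtype=float)
    ref_sorted = np.sort(np.asarray(reference, dtype=float))
    # Empirical non-exceedance probability of each source value
    ranks = np.argsort(np.argsort(src))
    p = (ranks + 0.5) / src.size
    # Map those probabilities onto the reference quantiles
    ref_p = (np.arange(ref_sorted.size) + 0.5) / ref_sorted.size
    return np.interp(p, ref_p, ref_sorted)
```

For example, a satellite P series that is uniformly biased high relative to gauges is pulled onto the gauge distribution without reshuffling wet and dry ranks.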
Sentinel-2 data exploitation with ESA's Sentinel-2 Toolbox
NASA Astrophysics Data System (ADS)
Gascon, Ferran; Ramoino, Fabrizzio; deanos, Yves-louis
2017-04-01
The Sentinel-2 Toolbox is a project kicked off by ESA in early 2014, under the umbrella of the ESA SEOM programme, with the aim of providing a tool for visualizing, analysing, and processing Sentinel-2 datasets. The toolbox is an extension of the SeNtinel Application Platform (SNAP), a project resulting from the effort of the developers of the Sentinel-1, Sentinel-2, and Sentinel-3 toolboxes to provide a single common application framework suited to the mixed exploitation of SAR, high resolution optical, and medium resolution optical datasets. All three development teams collaborate to drive the evolution of the common SNAP framework in a developer forum. In this triplet, the Sentinel-2 Toolbox is dedicated to enhancing SNAP support for high resolution optical imagery. It is a multi-mission toolbox, already providing support for Sentinel-2, RapidEye, Deimos, and SPOT 1 to SPOT 5 datasets. In terms of processing algorithms, SNAP provides tools specific to the Sentinel-2 mission: an atmospheric correction module, Sen2Cor, integrated into the toolbox, which provides scene classification, atmospheric correction, and cirrus detection and correction, with output L2A products that open seamlessly in the toolbox; a multitemporal synthesis processor (L3); a biophysical products processor (L2B); a water processor; a deforestation detector; OTB tools integration; and the SNAP Engine for cloud exploitation, along with a set of more generic tools for high resolution optical data exploitation. Together with the generic functionalities of SNAP, this provides an ideal environment for designing multi-mission processing chains and producing value-added products from raw datasets. The uses of SNAP are manifold: the desktop application offers rich interactive visualization, analysis, and processing of data, and all tools available from SNAP can be accessed via the command line through the Graph Processing Framework (GPT), the kernel of the SNAP processing engine. 
This makes it a perfect candidate for driving the processing of data on servers for bulk processing.
Scaling up: What coupled land-atmosphere models can tell us about critical zone processes
NASA Astrophysics Data System (ADS)
FitzGerald, K. A.; Masarik, M. T.; Rudisill, W. J.; Gelb, L.; Flores, A. N.
2017-12-01
A significant limitation to extending our knowledge of critical zone (CZ) evolution and function is a lack of hydrometeorological information at sufficiently fine spatial and temporal resolutions to resolve topo-climatic gradients, and of adequate spatial and temporal extent to capture a range of climatic conditions across ecoregions. Research at critical zone observatories (CZOs) suggests hydrometeorological stores and fluxes exert key controls on processes such as hydrologic partitioning and runoff generation, landscape evolution, soil formation, biogeochemical cycling, and vegetation dynamics. However, advancing fundamental understanding of CZ processes necessitates understanding how hydrometeorological drivers vary across space and time. As a result of recent advances in computational capabilities it has become possible, although still computationally expensive, to simulate hydrometeorological conditions via high resolution coupled land-atmosphere models. Using the Weather Research and Forecasting (WRF) model, we developed a high spatiotemporal resolution dataset extending from water year 1987 to the present for the Snake River Basin in the northwestern USA, including the Reynolds Creek and Dry Creek Experimental Watersheds, both part of the Reynolds Creek CZO, as well as a range of other ecosystems including shrubland desert, montane forests, and alpine tundra. Drawing from hypotheses generated by work at these sites and across the CZO network, we use the resulting dataset in combination with CZO observations and publicly available datasets to provide insights regarding hydrologic partitioning, vegetation distribution, and erosional processes. This dataset provides key context for interpreting and reconciling what observations obtained at particular sites reveal about underlying CZ structure and function. 
While this dataset does not extend to future climates, the same modeling framework can be used to dynamically downscale coarse global climate model output to scales relevant to CZ processes. This presents an opportunity to better characterize the impact of climate change on the CZ. We also argue that opportunities exist beyond the one-way flow of information, and that what we learn at CZOs has the potential to contribute significantly to improved Earth system models.
High Quality Data for Grid Integration Studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clifton, Andrew; Draxl, Caroline; Sengupta, Manajit
As variable renewable power penetration levels increase in power systems worldwide, renewable integration studies are crucial to ensure continued economic and reliable operation of the power grid. The existing electric grid infrastructure in the US in particular poses significant limitations on wind power expansion. In this presentation we will shed light on requirements for grid integration studies as far as wind and solar energy are concerned. Because wind and solar plants are strongly impacted by weather, high-resolution and high-quality weather data are required to drive power system simulations. Future data sets will have to push the limits of numerical weather prediction to yield these high-resolution data sets, and wind data will have to be time-synchronized with solar data. Current wind and solar integration data sets are presented. The Wind Integration National Dataset (WIND) Toolkit is the largest and most complete grid integration data set publicly available to date. A meteorological data set, wind power production time series, and simulated forecasts created using the Weather Research and Forecasting Model, run on a 2-km grid over the continental United States at a 5-min resolution, are now publicly available for more than 126,000 land-based and offshore wind power production sites. The National Solar Radiation Database (NSRDB) is a similar high temporal- and spatial-resolution database of 18 years of solar resource data for North America and India. The need for high-resolution weather data pushes modeling towards finer scales and closer synchronization. We also present how we anticipate such data sets developing in the future, their benefits, and the challenges of using and disseminating such large amounts of data.
Semantic labeling of high-resolution aerial images using an ensemble of fully convolutional networks
NASA Astrophysics Data System (ADS)
Sun, Xiaofeng; Shen, Shuhan; Lin, Xiangguo; Hu, Zhanyi
2017-10-01
High-resolution remote sensing data classification has been a challenging and promising research topic in the remote sensing community. In recent years, with the rapid advances of deep learning, remarkable progress has been made in this field, facilitating a transition from hand-crafted feature design to automatic end-to-end learning. An ensemble learning method based on deep fully convolutional networks (FCNs) is proposed to label high-resolution aerial images. To fully tap the potential of FCNs, both the Visual Geometry Group network and a deeper residual network, ResNet, are employed. Furthermore, to enlarge the training samples with diversity and gain better generalization, in addition to the commonly used data augmentation methods in the literature (e.g., rotation, multiscale, and aspect ratio), aerial images from other datasets are also collected for cross-scene learning. Finally, we combine these learned models to form an effective FCN ensemble and refine the results using a fully connected conditional random field graph model. Experiments on the ISPRS 2-D Semantic Labeling Contest dataset show that the proposed end-to-end classification method achieves an overall accuracy of 90.7%, a state-of-the-art result in the field.
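The ensembling step, before the CRF refinement, amounts to averaging the per-pixel class probabilities of the member networks and taking the argmax; the tiny probability maps below are stand-ins for real FCN softmax outputs:

```python
import numpy as np

# FCN ensembling by softmax averaging: per-pixel class probabilities
# from several trained networks are averaged, and the argmax over
# classes gives the fused label map. The paper's fully connected CRF
# refinement is omitted in this sketch.

def ensemble_labels(prob_maps):
    """prob_maps: list of (H, W, C) softmax outputs; returns (H, W) labels."""
    mean_prob = np.mean(np.stack(prob_maps), axis=0)
    return np.argmax(mean_prob, axis=-1)

# Two stand-in networks, a 2x1 image, two classes.
m1 = np.array([[[0.6, 0.4]], [[0.2, 0.8]]])
m2 = np.array([[[0.7, 0.3]], [[0.4, 0.6]]])
labels = ensemble_labels([m1, m2])
```

Averaging probabilities rather than hard labels lets a confident member outvote an uncertain one, which is one common motivation for this fusion choice.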
High-resolution satellite imagery for mesoscale meteorological studies
NASA Technical Reports Server (NTRS)
Johnson, David B.; Flament, Pierre; Bernstein, Robert L.
1994-01-01
In this article high-resolution satellite imagery from a variety of meteorological and environmental satellites is compared. Digital datasets from Geostationary Operational Environmental Satellite (GOES), National Oceanic and Atmospheric Administration (NOAA), Defense Meteorological Satellite Program (DMSP), Landsat, and Satellite Pour l'Observation de la Terre (SPOT) satellites were archived as part of the 1990 Hawaiian Rainband Project (HaRP) and form the basis of the comparisons. During HaRP, GOES geostationary satellite coverage was marginal, so the main emphasis is on the polar-orbiting satellites.
NASA Astrophysics Data System (ADS)
Cosh, M. H.; Jackson, T. J.; Colliander, A.; Bindlish, R.; McKee, L.; Goodrich, D. C.; Prueger, J. H.; Hornbuckle, B. K.; Coopersmith, E. J.; Holifield Collins, C.; Smith, J.
2016-12-01
With the launch of the Soil Moisture Active Passive (SMAP) mission in 2015, a new era of soil moisture monitoring began. Soil moisture is available on a near-daily basis at a 36 km resolution for the globe. But this dataset is only valuable if its products are accurate and reliable. Therefore, in order to demonstrate the accuracy of the soil moisture product, NASA enacted an extensive calibration and validation program with many in situ soil moisture networks contributing data across a variety of landscape regimes. However, not all questions can be answered by these networks. As a result, two intensive field experiments were executed to provide more detailed reference points for calibration and validation. Multi-week field campaigns were conducted in Arizona and Iowa at the USDA Agricultural Research Service Walnut Gulch and South Fork Experimental Watersheds, respectively. Aircraft observations were made to provide a high resolution data product. Soil moisture, soil roughness, and vegetation data were collected at high resolution to provide a downscaled dataset to compare against aircraft and satellite estimates.
NASA Astrophysics Data System (ADS)
Kasai, K.; Shiomi, K.; Konno, A.; Tadono, T.; Hori, M.
2016-12-01
Global observation of greenhouse gases such as carbon dioxide (CO2) and methane (CH4) with high spatio-temporal resolution, and accurate estimation of their sources and sinks, are important for understanding greenhouse gas dynamics. The Greenhouse Gases Observing Satellite (GOSAT) has observed column-averaged dry-air mole fractions of CO2 (XCO2) and CH4 (XCH4) for over 7 years since January 2009, with a wide swath but sparse pointing. The Orbiting Carbon Observatory-2 (OCO-2) has jointly observed XCO2 on orbit since July 2014, with a narrow swath but high resolution. We use two retrieved datasets as GOSAT observation data: the ACOS GOSAT/TANSO-FTS Level 2 Full Product by NASA/JPL, and the NIES TANSO-FTS L2 column amount (SWIR). Using these GOSAT datasets and the OCO-2 L2 Full Product, the biases among datasets, local sources and sinks, and the temporal variability of greenhouse gases are clarified. In addition, CarbonTracker, a global model of atmospheric CO2 and CH4 developed by NOAA/ESRL, is analyzed to compare satellite observation data with atmospheric model data. Before analysis, outliers are screened using the quality flag, outcome flag, and warn level over land and sea. Time series of XCO2 and XCH4 are obtained globally from the satellite observation and atmospheric model datasets, and functions expressing typical inter-annual and seasonal variation are fitted to each spatial grid cell. Anomalous events in XCO2 and XCH4 are then extracted from the difference between each time series and the fitted function. Regional emission and absorption events are analyzed through the time series variation of the satellite observation data and by comparison with the atmospheric model data.
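The fit-and-residual anomaly extraction can be sketched with an assumed minimal functional form, a linear trend plus one annual harmonic; the study's actual fitted functions are not specified here:

```python
import numpy as np

# Anomaly extraction for one grid cell: fit a trend-plus-annual-harmonic
# model to the XCO2 time series by least squares, and take anomalies as
# the residuals from the fitted function. The functional form here is
# an assumption, not the study's.

def fit_seasonal_trend(t_years, y):
    """Fit y ~ a + b*t + c*cos(2*pi*t) + d*sin(2*pi*t); return design, coefs."""
    t = np.asarray(t_years, dtype=float)
    X = np.column_stack([np.ones_like(t), t,
                         np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return X, coef

def anomalies(t_years, y):
    """Residuals of the time series from the fitted seasonal trend."""
    X, coef = fit_seasonal_trend(t_years, y)
    return np.asarray(y, dtype=float) - X @ coef
```

A value sitting far from the fitted curve, for instance a localized emission event, then stands out as a large residual.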
The Impact of Estimating High-Resolution Tropospheric Gradients on Multi-GNSS Precise Positioning
Zhou, Feng; Li, Xingxing; Li, Weiwei; Chen, Wen; Dong, Danan; Wickert, Jens; Schuh, Harald
2017-01-01
Benefiting from the modernized US Global Positioning System (GPS), the revitalized Russian GLObal NAvigation Satellite System (GLONASS), and the newly developed Chinese BeiDou Navigation Satellite System (BDS) and European Galileo, the multi-constellation Global Navigation Satellite System (GNSS) has emerged as a powerful tool not only in positioning, navigation, and timing (PNT), but also in remote sensing of the atmosphere and ionosphere. Both precise positioning and the derivation of atmospheric parameters can benefit from multi-GNSS observations. In this contribution, extensive evaluations are conducted with multi-GNSS datasets collected from 134 globally distributed ground stations of the International GNSS Service (IGS) Multi-GNSS Experiment (MGEX) network in July 2016. The datasets are processed in six different constellation combinations, i.e., GPS-, GLONASS-, and BDS-only, GPS + GLONASS, GPS + BDS, and GPS + GLONASS + BDS + Galileo precise point positioning (PPP). Tropospheric gradients are estimated with eight different temporal resolutions, from 1 h to 24 h, to investigate the impact of estimating high-resolution gradients on position estimates. The standard deviation (STD) is used as an indicator of positioning repeatability. The results show that estimating tropospheric gradients with high temporal resolution achieves better positioning performance than the traditional strategy in which tropospheric gradients are estimated on a daily basis. Moreover, the impact of estimating tropospheric gradients with different temporal resolutions at various elevation cutoff angles (from 3° to 20°) is investigated. With increasing elevation cutoff angles, the improvement in positioning repeatability decreases. PMID:28368346
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Kumar, Sujay V.; Srikishen, Jayanthi; Jedlovec, Gary J.
2010-01-01
One of the most challenging weather forecast problems in the southeastern U.S. is daily summertime pulse-type convection. During the summer, atmospheric flow and forcing are generally weak in this region; thus, convection typically initiates in response to local forcing along sea/lake breezes, and other discontinuities often related to horizontal gradients in surface heating rates. Numerical simulations of pulse convection usually have low skill, even in local predictions at high resolution, due to the inherent chaotic nature of these precipitation systems. Forecast errors can arise from assumptions within parameterization schemes, model resolution limitations, and uncertainties in both the initial state of the atmosphere and land surface variables such as soil moisture and temperature. For this study, it is hypothesized that high-resolution, consistent representations of surface properties such as soil moisture, soil temperature, and sea surface temperature (SST) are necessary to better simulate the interactions between the surface and atmosphere, and ultimately improve predictions of summertime pulse convection. This paper describes a sensitivity experiment using the Weather Research and Forecasting (WRF) model. Interpolated land and ocean surface fields from a large-scale model are replaced with high-resolution datasets provided by unique NASA assets in an experimental simulation: the Land Information System (LIS) and Moderate Resolution Imaging Spectroradiometer (MODIS) SSTs. The LIS is run in an offline mode for several years at the same grid resolution as the WRF model to provide compatible land surface initial conditions in an equilibrium state. The MODIS SSTs provide detailed analyses of SSTs over the oceans and large lakes compared to current operational products. 
The WRF model runs initialized with the LIS+MODIS datasets result in a reduction in the overprediction of rainfall areas; however, the skill is almost equally low in both experiments using traditional verification methodologies. Output from object-based verification within NCAR's Model Evaluation Tools reveals that the WRF runs initialized with LIS+MODIS data consistently generated precipitation objects that better matched observed precipitation objects, especially at higher precipitation intensities. The LIS+MODIS runs produced on average a 4% increase in matched precipitation areas and a simultaneous 4% decrease in unmatched areas during three months of daily simulations.
NASA Astrophysics Data System (ADS)
Koma, Zsófia; Székely, Balázs; Dorninger, Peter; Rasztovits, Sascha; Roncat, Andreas; Zámolyi, András; Krawczyk, Dominik; Pfeifer, Norbert
2014-05-01
Aerial imagery derivatives collected by Unmanned Aerial Vehicle (UAV) technology can be used as input for the generation of high resolution digital terrain model (DTM) data, as can the Terrestrial Laser Scanning (TLS) method. Both types of datasets are suitable for detailed geological and geomorphometric analysis, because the data provide micro-topographical and structural geological information. Our study compares the geological information that can be extracted from the high resolution DTMs produced by each technique, and attempts to determine which technology is more effective for geological and geomorphological analysis. The measurements were taken at the Doren landslide (Vorarlberg, Austria), a complex rotational landslide situated in the Alpine molasse foreland. Several formations (Kojen Formation, Würmian glacial moraine sediments, Weissach Formation) were tectonized there in the course of the Alpine orogeny (Oberhauser et al., 2007). The typical fault direction is WSW-ENE. The UAV measurements, carried out simultaneously with the TLS campaign, focused on the landslide scarp. The original image resolution was 4 mm/pixel. Image matching was implemented at pyramid level 2 and the achieved resolution of the DTM was 0.05 m. The TLS dataset includes 18 scan positions and more than 300 million points for the whole landslide area. The achieved DTM has 0.2 m resolution. The steps of the geological and geomorphological analysis were: (1) visual interpretation based on field work and geological maps, (2) quantitative DTM analysis. In the quantitative analysis, input data provided by the different kinds of DTMs were used for further parameter calculations (e.g. slope, aspect, sigmaZ). In the next step an automatic classification method was used for the detection of faults and classification of different parts of the landslide.
The conclusion was that for visual geological interpretation the UAV datasets are better, because the high resolution texture information allows for the extraction of digital geomorphological indicators. For quantitative analysis both datasets are informative, but the TLS DTM has the advantage of providing additional information on faults beneath the vegetation cover. These studies were carried out partly in the framework of the Hybrid 3D project financed by the Austrian Research Promotion Agency (FFG) and Von-Oben and 4D-IT; the contribution of ZsK was partly funded by the Campus Hungary Internship TÁMOP-424B1; BSz contributed partly as an Alexander von Humboldt Research Fellow.
NASA Astrophysics Data System (ADS)
Nord, Guillaume; Boudevillain, Brice; Berne, Alexis; Branger, Flora; Braud, Isabelle; Dramais, Guillaume; Gérard, Simon; Le Coz, Jérôme; Legoût, Cédric; Molinié, Gilles; Van Baelen, Joel; Vandervaere, Jean-Pierre; Andrieu, Julien; Aubert, Coralie; Calianno, Martin; Delrieu, Guy; Grazioli, Jacopo; Hachani, Sahar; Horner, Ivan; Huza, Jessica; Le Boursicaud, Raphaël; Raupach, Timothy H.; Teuling, Adriaan J.; Uber, Magdalena; Vincendon, Béatrice; Wijbrans, Annette
2017-03-01
A comprehensive hydrometeorological dataset is presented spanning the period 1 January 2011-31 December 2014 to improve the understanding of the hydrological processes leading to flash floods and the relation between rainfall, runoff, erosion and sediment transport in a mesoscale catchment (Auzon, 116 km2) of the Mediterranean region. Badlands are present in the Auzon catchment and well connected to high-gradient channels of bedrock rivers, which promotes the transfer of suspended solids downstream. The number of observed variables, the various sensors involved (both in situ and remote) and the space-time resolution (~km2, ~min) of this comprehensive dataset make it a unique contribution to research communities focused on hydrometeorology, surface hydrology and erosion. Given that rainfall is highly variable in space and time in this region, the observation system enables assessment of the hydrological response to rainfall fields. Indeed, (i) rainfall data are provided by rain gauges (both a research network of 21 rain gauges with a 5 min time step and an operational network of 10 rain gauges with a 5 min or 1 h time step), S-band Doppler dual-polarization radars (1 km2, 5 min resolution), disdrometers (16 sensors working at 30 s or 1 min time step) and Micro Rain Radars (5 sensors, 100 m height resolution). Additionally, during the special observation period (SOP-1) of the HyMeX (Hydrological Cycle in the Mediterranean Experiment) project, two X-band radars provided precipitation measurements at very fine spatial and temporal scales (1 ha, 5 min). (ii) Other meteorological data are taken from the operational surface weather observation stations of Météo-France (including 2 m air temperature, atmospheric pressure, 2 m relative humidity, 10 m wind speed and direction, global radiation) at the hourly time resolution (six stations in the region of interest).
(iii) The monitoring of surface hydrology and suspended sediment is multi-scale and based on nested catchments. Three hydrometric stations estimate water discharge at a 2-10 min time resolution. Two of these stations also measure additional physico-chemical variables (turbidity, temperature, conductivity) and water samples are collected automatically during floods, allowing further geochemical characterization of water and suspended solids. Two experimental plots monitor overland flow and erosion at 1 min time resolution on a hillslope with vineyard. A network of 11 sensors installed in the intermittent hydrographic network continuously measures water level and water temperature in headwater subcatchments (from 0.17 to 116 km2) at a time resolution of 2-5 min. A network of soil moisture sensors enables the continuous measurement of soil volumetric water content at 20 min time resolution at 9 sites. Additionally, concomitant observations (soil moisture measurements and stream gauging) were performed during floods between 2012 and 2014. Finally, this dataset is considered appropriate for understanding the rainfall variability in time and space at fine scales, improving areal rainfall estimations and progressing in distributed hydrological and erosion modelling. DOI of the referenced dataset: doi:10.6096/MISTRALS-HyMeX.1438.
Large-Scale Astrophysical Visualization on Smartphones
NASA Astrophysics Data System (ADS)
Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.
2011-07-01
Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets on the order of several petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover, educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.
NASA Astrophysics Data System (ADS)
Chaney, N.; Wood, E. F.
2014-12-01
The increasing accessibility of high-resolution land data (< 100 m) and high performance computing allows improved parameterizations of subgrid hydrologic processes in macroscale land surface models. Continental scale fully distributed modeling at these spatial scales is possible; however, its practicality for operational use is still unknown due to uncertainties in input data, model parameters, and storage requirements. To address these concerns, we propose a modeling framework that provides the spatial detail of a fully distributed model yet maintains the benefits of a semi-distributed model. In this presentation we will introduce DTOPLATS-MP, a coupling between the NOAH-MP land surface model and the Dynamic TOPMODEL hydrologic model. This new model captures a catchment's spatial heterogeneity by clustering high-resolution land datasets (soil, topography, and land cover) into hundreds of hydrologically similar units (HSUs). A prior DEM analysis defines the connections between each HSU. At each time step, the 1D land surface model updates each HSU; the HSUs then interact laterally via the subsurface and surface. When compared to the fully distributed form of the model, this framework allows a significant decrease in computation and storage while providing most of the same information and enabling parameter transferability. As a proof of concept, we will show how this new modeling framework can be run over CONUS at a 30-meter spatial resolution. For each catchment in the WBD HUC-12 dataset, the model is run between 2002 and 2012 using available high-resolution continental scale land and meteorological datasets over CONUS (dSSURGO, NLCD, NED, and NCEP Stage IV). For each catchment, the model is run with 1000 model parameter sets obtained from a Latin hypercube sample. This exercise will illustrate the feasibility of running the model operationally at continental scales while accounting for model parameter uncertainty.
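The HSU clustering step can be sketched with a minimal k-means over per-pixel terrain features. The abstract does not name the clustering algorithm, so k-means, the synthetic features, and k = 2 (rather than the hundreds of HSUs used in practice) are all illustrative assumptions:

```python
import numpy as np

def kmeans_hsu(features, k, n_iter=50):
    """Cluster per-pixel land-surface features (e.g. slope, soil class,
    land-cover code) into k hydrologically similar units (HSUs) using a
    minimal Lloyd's k-means with farthest-point initialization."""
    X = np.asarray(features, dtype=float)
    centers = [X[0]]
    for _ in range(k - 1):
        # next seed: the pixel farthest from all existing seeds
        d = np.min(np.linalg.norm(X[:, None, :] - np.asarray(centers)[None, :, :],
                                  axis=2), axis=1)
        centers.append(X[d.argmax()])
    centers = np.asarray(centers)
    for _ in range(n_iter):
        # assign each pixel to the nearest center, then update centers
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two synthetic, well-separated terrain types (hypothetical feature values).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(5.0, 0.1, (20, 2))])
labels, centers = kmeans_hsu(X, k=2)
assert labels[0] != labels[20]
```

In the framework described above, each resulting HSU would then be updated by the 1D land surface model and exchange water laterally with connected HSUs.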
NASA Astrophysics Data System (ADS)
Williams, C.; Kniveton, D.; Layberry, R.
2009-04-01
It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. In this research, high resolution satellite derived rainfall data from the Microwave Infra-Red Algorithm (MIRA) are used as a basis for undertaking model experiments using a state-of-the-art regional climate model. The MIRA dataset covers the period from 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. Once the model's ability to reproduce extremes has been assessed, idealised regions of sea surface temperature (SST) anomalies are used to force the model, with the overall aim of investigating the ways in which SST anomalies influence rainfall extremes over southern Africa. In this paper, results from sensitivity testing of the regional climate model's domain size are briefly presented, before a comparison of simulated daily rainfall from the model with the satellite-derived dataset. Secondly, simulations of current climate and rainfall extremes from the model are compared to the MIRA dataset at daily timescales. 
Finally, the results from the idealised SST experiments are presented, suggesting highly nonlinear associations between rainfall extremes and remote SST anomalies.
NASA Astrophysics Data System (ADS)
Cherubini, Francesco; Hu, Xiangping; Vezhapparambu, Sajith; Strømman, Anders
2017-04-01
Surface albedo, a key parameter of the Earth's climate system, has high variability in space, time, and land cover and its parameterization is among the most important variables in climate models. The lack of extensive estimates for model improvement is one of the main limitations for accurately quantifying the influence of surface albedo changes on the planetary radiation balance. We use multi-year satellite retrievals of MODIS surface albedo (MCD43A3), high resolution land cover maps, and meteorological records to characterize albedo variations in Norway across latitude, seasons, land cover type, and topography. We then use this dataset to elaborate semi-empirical models to predict albedo values as a function of tree species, age, volume and climate variables like temperature and snow water equivalents (SWE). Given the complexity of the dataset and model formulation, we apply an innovative non-linear programming approach simultaneously coupled with linear un-mixing. The MODIS albedo products are at a resolution of about 500 m and 8 days. The land cover maps provide vegetation structure information on relative abundance of tree species, age, and biomass volumes at 16 m resolution (for both deciduous and coniferous species). Daily observations of meteorological information on air temperature and SWE are produced at 1 km resolution from interpolation of meteorological weather stations in Norway. These datasets have different resolution and projection, and are harmonized by identifying, for each MODIS pixel, the intersecting land cover polygons and the percentage area of the MODIS pixel represented by each land cover type. We then filter the subplots according to the following criteria: i) at least 96% of the total pixel area is covered by a single land cover class (either forest or cropland); ii) if forest area, at least 98% of the forest area is covered by spruce, deciduous or pine. 
Forested pixels are then categorized as spruce, deciduous, or pine dominant if the fraction of the respective tree species is greater than 75%. Results show averages of albedo estimates for forests and cropland depicting spatial (along a latitudinal gradient) and temporal (daily, monthly, and seasonal) variations across Norway. As the case study region is a country with heterogeneous topography, we also study the sensitivity of the albedo estimates to the slope and aspect of the terrain. The mathematical programming approach uses a variety of functional forms, constraints and variables, leading to many different model outputs. There are several models with relatively high performances, allowing for a flexibility in the model selection, with different model variants suitable for different situations. This approach produces albedo predictions at the same resolution of the land cover dataset (16 m, notably higher than the MODIS estimates), can incorporate changes in climate conditions, and is robust to cross-validation between different locations. By integrating satellite measurements and high-resolution vegetation maps, we can thus produce semi-empirical models that can predict albedo values for boreal forests using a variety of input variables representing climate and/or vegetation structure. Further research can explore the possible advantages of its implementation in land surface schemes over existing approaches.
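The pixel-screening rules (i) and (ii) and the 75% dominance threshold above lend themselves to a direct sketch. The function name and the "mixed forest" fallback label are illustrative assumptions, not terms from the paper:

```python
def classify_pixel(cover_fracs, species_fracs=None):
    """Apply the screening rules to one MODIS pixel.

    cover_fracs:   dict of land-cover class -> fraction of pixel area.
    species_fracs: dict of tree species -> fraction of the forest area
                   (required only when the pixel is forest).
    Returns 'cropland', a dominant species name, 'mixed forest'
    (hypothetical label), or None when the pixel is rejected.
    """
    cls, frac = max(cover_fracs.items(), key=lambda kv: kv[1])
    if frac < 0.96 or cls not in ("forest", "cropland"):
        return None                      # criterion (i): nearly pure pixel
    if cls == "cropland":
        return "cropland"
    main = sum(species_fracs.get(s, 0.0) for s in ("spruce", "deciduous", "pine"))
    if main < 0.98:
        return None                      # criterion (ii): known species only
    sp, sp_frac = max(species_fracs.items(), key=lambda kv: kv[1])
    return sp if sp_frac > 0.75 else "mixed forest"   # 75% dominance rule

assert classify_pixel({"forest": 0.97, "water": 0.03},
                      {"spruce": 0.9, "pine": 0.1}) == "spruce"
assert classify_pixel({"forest": 0.90, "water": 0.10}) is None
assert classify_pixel({"cropland": 0.99, "forest": 0.01}) == "cropland"
```

Each surviving pixel would then contribute one (albedo, climate, vegetation) sample to the semi-empirical model fitting.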
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; LaCasse, Katherine M.; Santanello, Joseph A., Jr.; Lapenta, William M.; Peters-Lidard, Christa D.
2007-01-01
The exchange of energy and moisture between the Earth's surface and the atmospheric boundary layer plays a critical role in many hydrometeorological processes. Accurate and high-resolution representations of surface properties such as sea-surface temperature (SST), vegetation, soil temperature and moisture content, and ground fluxes are necessary to better understand the Earth-atmosphere interactions and improve numerical predictions of weather and climate phenomena. The NASA/NWS Short-term Prediction Research and Transition (SPoRT) Center is currently investigating the potential benefits of assimilating high-resolution datasets derived from the NASA Moderate Resolution Imaging Spectroradiometer (MODIS) instruments using the Weather Research and Forecasting (WRF) model and the Goddard Space Flight Center Land Information System (LIS). The LIS is a software framework that integrates satellite and ground-based observational and modeled data along with multiple land surface models (LSMs) and advanced computing tools to accurately characterize land surface states and fluxes. The LIS can be run uncoupled to provide a high-resolution land surface initial condition, and can also be run in a coupled mode with WRF to integrate surface and soil quantities using any of the LSMs available in LIS. The LIS also includes the ability to optimize the initialization of surface and soil variables by tuning the spin-up time period and atmospheric forcing parameters, which cannot be done in the standard WRF. Among the datasets available from MODIS, a leaf-area index field and composite SST analysis are used to improve the lower boundary and initial conditions of the LIS/WRF coupled model over both land and water. Experiments will be conducted to measure the potential benefits from using the coupled LIS/WRF model over the Florida peninsula during May 2004.
This month experienced relatively benign weather conditions, which will allow the experiments to focus on the local and mesoscale impacts of the high-resolution MODIS datasets and optimized soil and surface initial conditions. Follow-on experiments will examine the utility of such an optimized WRF configuration for more complex weather scenarios such as convective initiation. This paper will provide an overview of the experiment design and present preliminary results from selected cases in May 2004.
On-line 3D motion estimation using low resolution MRI
NASA Astrophysics Data System (ADS)
Glitzner, M.; de Senneville, B. Denis; Lagendijk, J. J. W.; Raaymakers, B. W.; Crijns, S. P. M.
2015-08-01
Image processing such as deformable image registration finds its way into radiotherapy as a means to track non-rigid anatomy. With the advent of magnetic resonance imaging (MRI) guided radiotherapy, intrafraction anatomy snapshots become technically feasible. MRI provides the needed tissue signal for high-fidelity image registration. However, acquisitions, especially in 3D, take a considerable amount of time. Pushing towards real-time adaptive radiotherapy, MRI needs to be accelerated without degrading the quality of information. In this paper, we investigate the impact of image resolution on the quality of motion estimations. Potentially, spatially undersampled images yield comparable motion estimations. At the same time, their acquisition times would reduce greatly due to the sparser sampling. In order to substantiate this hypothesis, exemplary 4D datasets of the abdomen were downsampled gradually. Subsequently, spatiotemporal deformations are extracted consistently using the same motion estimation for each downsampled dataset. Errors between the original and the respectively downsampled version of the dataset are then evaluated. Compared to ground-truth, results show high similarity of deformations estimated from downsampled image data. Using a dataset with (2.5 mm)³ voxel size, deformation fields could be recovered well up to a downsampling factor of 2, i.e. (5 mm)³. In a therapy guidance scenario, MRI imaging speed could accordingly increase approximately fourfold, with acceptable loss of estimated motion quality.
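The downsample-and-compare evaluation can be sketched as follows. Block averaging stands in for acquiring at coarser resolution, and the deformation fields and millimetre values are synthetic placeholders, not the paper's registration output:

```python
import numpy as np

def downsample(vol, factor):
    """Block-average a 3D volume by an integer factor along each axis
    (a crude stand-in for acquiring at coarser spatial resolution)."""
    z, y, x = (s - s % factor for s in vol.shape)
    v = vol[:z, :y, :x]
    v = v.reshape(z // factor, factor, y // factor, factor, x // factor, factor)
    return v.mean(axis=(1, 3, 5))

def deformation_error(dvf_ref, dvf_est):
    """Mean 3D endpoint error (mm) between two deformation vector fields
    sampled on the same grid, arrays of shape (..., 3)."""
    return np.linalg.norm(dvf_ref - dvf_est, axis=-1).mean()

vol = np.zeros((8, 8, 8)); vol[2:6, 2:6, 2:6] = 1.0
low = downsample(vol, 2)
assert low.shape == (4, 4, 4)
assert np.isclose(low.sum() * 2**3, vol.sum())   # block averaging conserves the mean
dvf = np.zeros((4, 4, 4, 3)); dvf[..., 0] = 1.0  # hypothetical 1 mm uniform shift
assert deformation_error(dvf, np.zeros_like(dvf)) == 1.0
```

In the study's workflow, `dvf_ref` would come from registering the full-resolution 4D data and `dvf_est` from the downsampled version, with the error tracked as the downsampling factor grows.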
Gangodagamage, Chandana; Wullschleger, Stan
2014-07-03
This dataset represents a map of the high center (HC) and low center (LC) polygon boundaries delineated from high resolution LiDAR data for the Arctic coastal plain at Barrow, Alaska. The polygon troughs are considered the surface expression of the ice wedges. The troughs lie at lower elevations than the polygon interiors. The trough widths were initially identified from the LiDAR data, and the boundary between two polygons was assumed to be located along the lowest elevations of the trough between them.
NASA Astrophysics Data System (ADS)
Dunn, R. J. H.; Willett, K. M.; Thorne, P. W.; Woolley, E. V.; Durre, I.; Dai, A.; Parker, D. E.; Vose, R. S.
2012-10-01
This paper describes the creation of HadISD: an automatically quality-controlled synoptic resolution dataset of temperature, dewpoint temperature, sea-level pressure, wind speed, wind direction and cloud cover from global weather stations for 1973-2011. The full dataset consists of over 6000 stations, with 3427 long-term stations deemed to have sufficient sampling and quality for climate applications requiring sub-daily resolution. As with other surface datasets, coverage is heavily skewed towards Northern Hemisphere mid-latitudes. The dataset is constructed from a large pre-existing ASCII flatfile data bank that represents over a decade of substantial effort at data retrieval, reformatting and provision. These raw data have had varying levels of quality control applied to them by individual data providers. The work proceeded in several steps: merging stations with multiple reporting identifiers; reformatting to netCDF; quality control; and then filtering to form a final dataset. Particular attention has been paid to maintaining true extreme values where possible within an automated, objective process. Detailed validation has been performed on a subset of global stations and also on UK data using known extreme events to help finalise the QC tests. Further validation was performed on a selection of extreme events world-wide (Hurricane Katrina in 2005, the cold snap in Alaska in 1989 and heat waves in SE Australia in 2009). Some very initial analyses are performed to illustrate some of the types of problems to which the final data could be applied. Although the filtering has removed the poorest station records, no attempt has been made to homogenise the data thus far, due to the complexity of retaining the true distribution of high-resolution data when applying adjustments. Hence non-climatic, time-varying errors may still exist in many of the individual station records and care is needed in inferring long-term trends from these data. 
This dataset will allow the study of high frequency variations of temperature, pressure and humidity on a global basis over the last four decades. Both individual extremes and the overall population of extreme events could be investigated in detail to allow for comparison with past and projected climate. A version-control system has been constructed for this dataset to allow for the clear documentation of any updates and corrections in the future.
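Two toy checks in the spirit of the automated quality control described above might look like the following; the thresholds are illustrative and the actual HadISD suite applies many more, carefully tuned tests:

```python
import numpy as np

def range_check(temps_c, lo=-90.0, hi=60.0):
    """Flag observations outside plausible world-record temperature bounds
    (bounds here are illustrative, not HadISD's)."""
    t = np.asarray(temps_c, dtype=float)
    return (t < lo) | (t > hi)

def spike_check(temps_c, max_jump=15.0):
    """Flag an observation whose jumps from BOTH neighbours exceed
    max_jump (degC) -- a crude single-point spike test that tries to
    retain genuine extremes, which persist across several reports."""
    t = np.asarray(temps_c, dtype=float)
    flags = np.zeros(t.shape, dtype=bool)
    flags[1:-1] = (np.abs(t[1:-1] - t[:-2]) > max_jump) & \
                  (np.abs(t[1:-1] - t[2:]) > max_jump)
    return flags

obs = [12.0, 12.5, 58.0, 12.8, 13.0, 12.6]
assert not range_check(obs).any()                       # 58 degC is extreme but plausible
assert list(spike_check(obs)) == [False, False, True, False, False, False]
```

The double-sided neighbour condition illustrates the paper's stated concern: a true heat-wave extreme, supported by adjacent reports, is kept, while an isolated instrument spike is flagged.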
NASA Astrophysics Data System (ADS)
Xiong, Qiufen; Hu, Jianglin
2013-05-01
The minimum/maximum (Min/Max) temperature in the Yangtze River valley is decomposed into a climatic mean and an anomaly component. A spatial interpolation is developed which combines a 3D thin-plate spline scheme for the climatological mean with a 2D Barnes scheme for the anomaly component to create a daily Min/Max temperature dataset. The climatic mean field is obtained by the 3D thin-plate spline scheme because the decrease of Min/Max temperature with elevation is robust and reliable on a long time-scale. The anomaly field is only weakly related to elevation, so the anomaly component is adequately analyzed by the 2D Barnes procedure, which is computationally efficient and readily tunable. With this hybrid interpolation method, a daily Min/Max temperature dataset that covers the domain from 99°E to 123°E and from 24°N to 36°N at 0.1° longitudinal and latitudinal resolution is obtained by utilizing daily Min/Max temperature data from three kinds of station observations — the national reference climatological stations, the basic meteorological observing stations and the ordinary meteorological observing stations — in 15 provinces and municipalities in the Yangtze River valley from 1971 to 2005. The error of the gridded dataset is assessed by examining cross-validation statistics. The results show that the daily Min/Max temperature interpolation not only has a high correlation coefficient (0.99) and interpolation efficiency (0.98) but also a mean bias error of 0.00 °C. For the maximum temperature, the root mean square error is 1.1 °C and the mean absolute error is 0.85 °C. For the minimum temperature, the root mean square error is 0.89 °C and the mean absolute error is 0.67 °C.
Thus, the new dataset provides the distribution of Min/Max temperature over the Yangtze River valley with realistic, successive gridded data with 0.1° × 0.1° spatial resolution and daily temporal scale. The primary factors influencing the dataset precision are elevation and terrain complexity. In general, the gridded dataset has a relatively high precision in plains and flatlands and a relatively low precision in mountainous areas.
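The anomaly-analysis step can be illustrated with a single Barnes pass: a Gaussian-distance-weighted average of station anomalies at each grid point. Operational Barnes schemes add successive correction passes with a convergence parameter, and the thin-plate-spline climatology would be added back afterwards; the stations and values below are hypothetical:

```python
import numpy as np

def barnes_pass(grid_pts, obs_pts, obs_vals, kappa):
    """One Barnes pass: at each grid point, average the station
    anomalies with Gaussian weights w = exp(-r^2 / kappa)."""
    g = np.asarray(grid_pts, float)[:, None, :]   # (G, 1, 2)
    o = np.asarray(obs_pts, float)[None, :, :]    # (1, N, 2)
    r2 = ((g - o) ** 2).sum(axis=2)               # squared distances, (G, N)
    w = np.exp(-r2 / kappa)
    return (w * np.asarray(obs_vals, float)).sum(axis=1) / w.sum(axis=1)

# Hypothetical anomaly stations (deg lon, deg lat) with anomalies in degC.
obs_pts = [(0.0, 0.0), (1.0, 0.0)]
obs_vals = [1.0, 3.0]
field = barnes_pass([(0.5, 0.0)], obs_pts, obs_vals, kappa=0.5)
assert np.isclose(field[0], 2.0)   # midpoint: equal weights -> mean anomaly
```

The parameter `kappa` controls the smoothing length scale, which is what makes the Barnes procedure "readily tunable" as the abstract notes.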
NASA Astrophysics Data System (ADS)
Rettmann, M. E.; Holmes, D. R., III; Gunawan, M. S.; Ge, X.; Karwoski, R. A.; Breen, J. F.; Packer, D. L.; Robb, R. A.
2012-03-01
Geometric analysis of the left atrium and pulmonary veins is important for studying reverse structural remodeling following cardiac ablation therapy. It has been shown that the left atrium decreases in volume and the pulmonary vein ostia decrease in diameter following ablation therapy. Most analysis techniques, however, require laborious manual tracing of image cross-sections. Pulmonary vein diameters are typically measured at the junction between the left atrium and pulmonary veins, called the pulmonary vein ostia, with manually drawn lines on volume renderings or on image cross-sections. In this work, we describe a technique for making semi-automatic measurements of the left atrium and pulmonary vein ostial diameters from high resolution CT scans and multi-phase datasets. The left atrium and pulmonary veins are segmented from a CT volume using a 3D volume approach and cut planes are interactively positioned to separate the pulmonary veins from the body of the left atrium. The cut plane is also used to compute the pulmonary vein ostial diameter. Validation experiments are presented which demonstrate the ability to repeatedly measure left atrial volume and pulmonary vein diameters from high resolution CT scans, as well as the feasibility of this approach for analyzing dynamic, multi-phase datasets. In the high resolution CT scans the left atrial volume measurements show high repeatability with approximately 4% intra-rater repeatability and 8% inter-rater repeatability. Intra- and inter-rater repeatability for pulmonary vein diameter measurements range from approximately 2 to 4 mm. For the multi-phase CT datasets, differences in left atrial volumes between a standard slice-by-slice approach and the proposed 3D volume approach are small, with percent differences on the order of 3% to 6%.
Hubbard, B.E.; Crowley, J.K.
2005-01-01
Hyperspectral data coverage from the EO-1 Hyperion sensor was useful for calibrating Advanced Land Imager (ALI) and Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) images of a volcanic terrane area of the Chilean-Bolivian Altiplano. Following calibration, the ALI and ASTER datasets were co-registered and joined to produce a 13-channel reflectance cube spanning the visible to shortwave infrared (0.4-2.4 μm). Eigenanalysis and comparison of the Hyperion data with the ALI + ASTER reflectance data, as well as mapping results using various ALI + ASTER data subsets, provided insights into the information dimensionality of all the data. In particular, the high spectral resolution, low signal-to-noise Hyperion data were only marginally better for mineral mapping than the merged 13-channel, low spectral resolution, high signal-to-noise ALI + ASTER dataset. Neither the Hyperion nor the combined ALI + ASTER datasets had sufficient information dimensionality for mapping the diverse range of surface materials exposed on the Altiplano. However, it is possible to optimize the use of the multispectral data for mineral-mapping purposes by careful data subsetting, and by employing other appropriate image-processing strategies.
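One way to gauge information dimensionality by eigenanalysis, as mentioned above, is to count the principal components needed to reach a chosen variance fraction; the synthetic two-endmember pixels and the 99% threshold below are illustrative assumptions, not Altiplano data:

```python
import numpy as np

def effective_dimensionality(spectra, var_frac=0.99):
    """Eigenanalysis of the band covariance matrix: number of principal
    components needed to explain var_frac of the total variance."""
    X = np.asarray(spectra, float)
    X = X - X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    evals = np.sort(np.linalg.eigvalsh(cov))[::-1]   # descending eigenvalues
    cum = np.cumsum(evals) / evals.sum()
    return int(np.searchsorted(cum, var_frac) + 1)

rng = np.random.default_rng(0)
# Hypothetical 13-band pixels that are mixtures of only 2 endmembers (+ noise):
# their covariance should be dominated by very few components.
em = rng.random((2, 13))
ab = rng.random((500, 2)); ab /= ab.sum(axis=1, keepdims=True)
pixels = ab @ em + 1e-4 * rng.normal(size=(500, 13))
assert effective_dimensionality(pixels) <= 3
```

A scene whose eigenvalue spectrum collapses onto a handful of components in this way would, as the abstract argues, lack the dimensionality to separate a diverse range of surface materials.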
Analysis of Specular Reflections Off Geostationary Satellites
NASA Astrophysics Data System (ADS)
Jolley, A.
2016-09-01
Many photometric studies of artificial satellites have attempted to define procedures that minimise the size of datasets required to infer information about satellites. However, it is unclear whether deliberately limiting the size of datasets significantly reduces the potential for information to be derived from them. In 2013 an experiment was conducted using a 14 inch Celestron CG-14 telescope to gain multiple night-long, high temporal resolution datasets of six geostationary satellites [1]. This experiment produced evidence of complex variations in the spectral energy distribution (SED) of reflections off satellite surface materials, particularly during specular reflections. Importantly, specific features relating to the SED variations could only be detected with high temporal resolution data. An update is provided regarding the nature of SED and colour variations during specular reflections, including how some of the variables involved contribute to these variations. Results show that care must be taken when comparing observed spectra to a spectral library for the purpose of material identification; a spectral library that uses wavelength as the only variable will be unable to capture changes that occur to a material's reflected spectra with changing illumination and observation geometry. Conversely, colour variations with changing illumination and observation geometry might provide an alternative means of determining material types.
Buytaert, Jan A. N.; Salih, Wasil H. M.; Dierick, Manuel; Jacobs, Patric; Dirckx, Joris J. J.
2011-12-01
In order to improve realism in middle ear (ME) finite-element modeling (FEM), comprehensive and precise morphological data are needed. To date, micro-scale X-ray computed tomography (μCT) recordings have been used as geometric input data for FEM models of the ME ossicles. Previously, attempts were made to obtain these data on ME soft tissue structures as well. However, due to low X-ray absorption of soft tissue, quality of these images is limited. Another popular approach is using histological sections as data for 3D models, delivering high in-plane resolution for the sections, but the technique is destructive in nature and registration of the sections is difficult. We combine data from high-resolution μCT recordings with data from high-resolution orthogonal-plane fluorescence optical-sectioning microscopy (OPFOS), both obtained on the same gerbil specimen. State-of-the-art μCT delivers high-resolution data on the 3D shape of ossicles and other ME bony structures, while the OPFOS setup generates data of unprecedented quality both on bone and soft tissue ME structures. Each of these techniques is tomographic and non-destructive and delivers sets of automatically aligned virtual sections. The datasets coming from different techniques need to be registered with respect to each other. By combining both datasets, we obtain a complete high-resolution morphological model of all functional components in the gerbil ME. The resulting 3D model can be readily imported in FEM software and is made freely available to the research community. In this paper, we discuss the methods used, present the resulting merged model, and discuss the morphological properties of the soft tissue structures, such as muscles and ligaments.
A seamless, high-resolution digital elevation model (DEM) of the north-central California coast
Foxgrover, Amy C.; Barnard, Patrick L.
2012-01-01
A seamless, 2-meter resolution digital elevation model (DEM) of the north-central California coast has been created from the most recent high-resolution bathymetric and topographic datasets available. The DEM extends approximately 150 kilometers along the California coastline, from Half Moon Bay north to Bodega Head. Coverage extends inland to an elevation of +20 meters and offshore to at least the 3 nautical mile limit of state waters. This report describes the procedures of DEM construction, details the input data sources, and provides the DEM for download in both ESRI Arc ASCII and GeoTIFF file formats with accompanying metadata.
Detection and Monitoring of Oil Spills Using Moderate/High-Resolution Remote Sensing Images.
Li, Ying; Cui, Can; Liu, Zexi; Liu, Bingxin; Xu, Jin; Zhu, Xueyuan; Hou, Yongchao
2017-07-01
Current marine oil spill detection and monitoring methods using high-resolution remote sensing imagery are quite limited. This study presented a new bottom-up and top-down visual saliency model. The dataset comprised oil spill imagery from Landsat 8, GF-1, MAMS, and HJ-1. A simplified, graph-based visual saliency model was used to extract bottom-up saliency, identifying ocean regions containing objects of high visual saliency. A spectral similarity match model was used to obtain top-down saliency, distinguishing oil regions from other salient interference on the basis of their spectra. The regions of interest containing oil spills were integrated using these complementary saliency detection steps. Then, a genetic neural network was used to complete the image classification. These steps increased the speed of analysis: for the test dataset, the average running time of the entire process to detect regions of interest was 204.56 s. During image segmentation, the oil spill was extracted using a genetic neural network. The classification results showed that the method had a low false-alarm rate (high accuracy of 91.42%) and was able to increase the speed of the detection process (fast runtime of 19.88 s). The test image dataset was composed of different types of features over large areas under complicated imaging conditions, and the proposed model proved robust in complex sea conditions.
Poppenga, Sandra K.; Worstell, Bruce B.; Stoker, Jason M.; Greenlee, Susan K.
2009-01-01
The U.S. Geological Survey (USGS) has taken the lead in the creation of a valuable remote sensing product by incorporating digital elevation models (DEMs) derived from Light Detection and Ranging (lidar) into the National Elevation Dataset (NED), the elevation layer of 'The National Map'. High-resolution lidar-derived DEMs provide the accuracy needed to systematically quantify and fully integrate surface flow including flow direction, flow accumulation, sinks, slope, and a dense drainage network. In 2008, 1-meter resolution lidar data were acquired in Minnehaha County, South Dakota. The acquisition was a collaborative effort between Minnehaha County, the city of Sioux Falls, and the USGS Earth Resources Observation and Science (EROS) Center. With the newly acquired lidar data, USGS scientists generated high-resolution DEMs and surface flow features. This report compares lidar-derived surface flow features in Minnehaha County to 30- and 10-meter elevation data previously incorporated in the NED and ancillary hydrography datasets. Surface flow features generated from lidar-derived DEMs are consistently integrated with elevation and are important in understanding surface-water movement to better detect surface-water runoff, flood inundation, and erosion. Many topographic and hydrologic applications will benefit from the increased availability of accurate, high-quality, and high-resolution surface-water data. The remotely sensed data provide topographic information and data integration capabilities needed for meeting current and future human and environmental needs.
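The flow-direction product mentioned above is conventionally derived with the D8 scheme: each cell drains to its steepest-descent neighbour. A minimal sketch follows, assuming a small in-memory DEM; it ignores edge cells and is not the USGS production implementation.

```python
import numpy as np

def d8_flow_direction(dem):
    """For each interior cell, return the index (0-7) of the steepest
    downslope neighbour under the D8 scheme; -1 marks sinks and edges.
    Diagonal drops are divided by sqrt(2) to account for distance."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    dist = [np.sqrt(2), 1, np.sqrt(2), 1, 1, np.sqrt(2), 1, np.sqrt(2)]
    rows, cols = dem.shape
    fdir = np.full((rows, cols), -1, dtype=int)
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            drops = [(dem[r, c] - dem[r + dr, c + dc]) / d
                     for (dr, dc), d in zip(offsets, dist)]
            best = int(np.argmax(drops))
            if drops[best] > 0:          # no positive drop: the cell is a sink
                fdir[r, c] = best
    return fdir

# tilted plane: every interior cell should drain due east (index 4)
dem = np.array([[3, 2, 1, 0]] * 4, dtype=float)
fd = d8_flow_direction(dem)
```

Flow accumulation, sinks, and drainage networks are then built on top of this direction grid.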
NASA Astrophysics Data System (ADS)
McDonald, K. C.; Jensen, K.; Alvarez, J.; Azarderakhsh, M.; Schroeder, R.; Podest, E.; Chapman, B. D.; Zimmermann, R.
2015-12-01
We have been assembling a global-scale Earth System Data Record (ESDR) of natural Inundated Wetlands to facilitate investigations on their role in climate, biogeochemistry, hydrology, and biodiversity. The ESDR comprises (1) Fine-resolution (100 meter) maps, delineating wetland extent, vegetation type, and seasonal inundation dynamics for regional to continental-scale areas, and (2) global coarse-resolution (~25 km), multi-temporal mappings of inundated area fraction (Fw) across multiple years. During March 2013, the NASA/JPL L-band polarimetric airborne imaging radar, UAVSAR, conducted airborne studies over regions of South America including portions of the western Amazon basin. We collected UAVSAR datasets over regions of the Amazon basin during that time to support systematic analyses of error sources related to the Inundated Wetlands ESDR. UAVSAR datasets were collected over Pacaya Samiria, Peru, Madre de Dios, Peru, and the Napo River in Ecuador. We derive landcover classifications from the UAVSAR datasets emphasizing wetlands regions, identifying regions of open water and inundated vegetation. We compare the UAVSAR-based datasets with those comprising the ESDR to assess uncertainty associated with the high resolution and the coarse resolution ESDR components. Our goal is to create an enhanced ESDR of inundated wetlands with statistically robust uncertainty estimates. The ESDR documentation will include a detailed breakdown of error sources and associated uncertainties within the data record. This work was carried out in part within the framework of the ALOS Kyoto & Carbon Initiative. PALSAR data were provided by JAXA/EORC and the Alaska Satellite Facility. Portions of this work were conducted at the Jet Propulsion Laboratory, California Institute of Technology under contract to the National Aeronautics and Space Administration.
A new global 1-km dataset of percentage tree cover derived from remote sensing
DeFries, R.S.; Hansen, M.C.; Townshend, J.R.G.; Janetos, A.C.; Loveland, Thomas R.
2000-01-01
Accurate assessment of the spatial extent of forest cover is a crucial requirement for quantifying the sources and sinks of carbon from the terrestrial biosphere. In the more immediate context of the United Nations Framework Convention on Climate Change, implementation of the Kyoto Protocol calls for estimates of carbon stocks for a baseline year as well as for subsequent years. Data sources from country level statistics and other ground-based information are based on varying definitions of 'forest' and are consequently problematic for obtaining spatially and temporally consistent carbon stock estimates. By combining two datasets previously derived from the Advanced Very High Resolution Radiometer (AVHRR) at 1 km spatial resolution, we have generated a prototype global map depicting percentage tree cover and associated proportions of trees with different leaf longevity (evergreen and deciduous) and leaf type (broadleaf and needleleaf). The product is intended for use in terrestrial carbon cycle models, in conjunction with other spatial datasets such as climate and soil type, to obtain more consistent and reliable estimates of carbon stocks. The percentage tree cover dataset is available through the Global Land Cover Facility at the University of Maryland at http://glcf.umiacs.umd.edu.
NASA Astrophysics Data System (ADS)
Zou, Xiaoliang; Zhao, Guihua; Li, Jonathan; Yang, Yuanxi; Fang, Yong
2016-06-01
With the rapid development of sensor technology, high spatial resolution imagery and airborne Lidar point clouds can now be captured, making classification, extraction, evaluation and analysis of a broad range of object features possible. High resolution imagery, Lidar datasets and parcel maps can be widely used as information carriers for classification, enabling refined classification of urban land cover. This paper presents an approach to object based image analysis (OBIA) combining high spatial resolution imagery and airborne Lidar point clouds. The workflow for urban land cover is designed with four components. Firstly, a colour-infrared TrueOrtho photo and laser point clouds were pre-processed to derive the parcel map of water bodies and the nDSM, respectively. Secondly, image objects are created via multi-resolution image segmentation, integrating the scale parameter and the colour and shape properties with a compactness criterion, so that the image is subdivided into separate object regions. Thirdly, image objects are classified on the basis of the segmentation and a rule set in the form of a knowledge decision tree, into six classes: water bodies, low vegetation/grass, tree, low building, high building, and road. Finally, to assess the validity of the classification results for the six classes, accuracy assessment is performed by comparing randomly distributed reference points from the TrueOrtho imagery with the classification results, forming the confusion matrix and calculating the overall accuracy and Kappa coefficient. The study area is the Vaihingen/Enz test site, with a patch of test datasets drawn from the benchmark of the ISPRS WG III/4 test project. The classification results show high overall accuracy for most types of urban land cover: overall accuracy is 89.5% and the Kappa coefficient is 0.865.
The OBIA approach provides an effective and convenient way to combine high resolution imagery and Lidar ancillary data for classification of urban land cover.
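The overall accuracy and Kappa coefficient quoted above come directly from the confusion matrix; the computation can be sketched as follows. The matrix values here are made up for illustration, not the Vaihingen results.

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix
    (rows = reference classes, columns = predicted classes)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                                 # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2   # chance agreement
    return po, (po - pe) / (1 - pe)

# illustrative 3-class matrix
cm = [[50, 2, 3],
      [4, 40, 1],
      [2, 3, 45]]
oa, kappa = accuracy_and_kappa(cm)
```

Kappa discounts the agreement expected by chance, which is why it is always lower than overall accuracy for an imperfect classifier.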
The effect of spatial resolution on water scarcity estimates in Australia
NASA Astrophysics Data System (ADS)
Gevaert, Anouk; Veldkamp, Ted; van Dijk, Albert; Ward, Philip
2017-04-01
Water scarcity is an important global issue with severe socio-economic consequences, and its occurrence is likely to increase in many regions due to population growth, economic development and climate change. This has prompted a number of global and regional studies to identify areas that are vulnerable to water scarcity and to determine how this vulnerability will change in the future. A drawback of these studies, however, is that they typically have coarse spatial resolutions. Here, we studied the effect of increasing the spatial resolution of water scarcity estimates in Australia, and the Murray-Darling Basin in particular. This was achieved by calculating the water stress index (WSI), an indicator showing the ratio of water use to water availability, at 0.5 and 0.05 degree resolution for the period 1990-2010. Monthly water availability data were based on outputs of the Australian Water Resources Assessment Landscape model (AWRA-L), which was run at both spatial resolutions and at a daily time scale. Water use information was obtained from a monthly 0.5 degree global dataset that distinguishes between water consumption for irrigation, livestock, industrial and domestic uses. The data were downscaled to 0.05 degree by dividing the sectoral water uses over the areas covered by relevant land use types using a high resolution (0.5 km) land use dataset. The monthly WSIs at high and low resolution were then used to evaluate differences in the patterns of water scarcity frequency and intensity. In this way, we assess to what extent increasing the spatial resolution can improve the identification of vulnerable areas and thereby assist in the development of strategies to lower this vulnerability. The results of this study provide insight into the scalability of water scarcity estimates and the added value of high resolution water scarcity information in water resources management.
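The two operations at the core of the study above, the use-to-availability ratio and the land-use-weighted downscaling of sectoral water use, can be sketched in a few lines. All numbers and function names below are illustrative, not the study's actual data or code.

```python
import numpy as np

def water_stress_index(use, availability):
    """WSI = water use / water availability per grid cell; cells with
    no availability are masked with NaN rather than dividing by zero."""
    avail = np.where(availability > 0, availability, np.nan)
    return use / avail

def downscale_use(coarse_use, weight):
    """Spread one coarse cell's sectoral water use over its fine cells
    in proportion to a land-use weight grid (e.g. irrigated area)."""
    return coarse_use * weight / weight.sum()

# one coarse cell's use of 100 units, spread over a 2x2 fine grid
use_fine = downscale_use(100.0, np.array([[1.0, 3.0], [0.0, 1.0]]))
wsi = water_stress_index(use_fine, np.full((2, 2), 50.0))
```

Note that the downscaling is mass-conserving: the fine-cell uses sum back to the coarse total, which is what keeps the high- and low-resolution WSI fields comparable.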
NASA Technical Reports Server (NTRS)
Ott, L.; Putman, B.; Collatz, J.; Gregg, W.
2012-01-01
Column CO2 observations from current and future remote sensing missions represent a major advancement in our understanding of the carbon cycle and are expected to help constrain source and sink distributions. However, data assimilation and inversion methods are challenged by the difference in scale between models and observations. OCO-2 footprints represent an area of several square kilometers, while NASA's future ASCENDS lidar mission is likely to have an even smaller footprint. In contrast, the resolution of models used in global inversions is typically hundreds of kilometers, and grid cells often cover areas that include combinations of land, ocean and coastal areas and areas of significant topographic, land cover, and population density variations. To improve understanding of the scales of atmospheric CO2 variability and the representativeness of satellite observations, we will present results from a global, 10-km simulation of meteorology and atmospheric CO2 distributions performed using NASA's GEOS-5 general circulation model. This resolution, typical of mesoscale atmospheric models, represents an order of magnitude increase over typical global simulations of atmospheric composition, allowing new insight into small scale CO2 variations across a wide range of surface flux and meteorological conditions. The simulation includes high resolution flux datasets provided by NASA's Carbon Monitoring System Flux Pilot Project at half degree resolution that have been down-scaled to 10-km using remote sensing datasets. Probability distribution functions are calculated over larger areas more typical of global models (100-400 km) to characterize subgrid-scale variability in these models. Particular emphasis is placed on coastal regions and regions containing megacities and fires to evaluate the ability of coarse resolution models to represent these small scale features.
Additionally, model output is sampled using averaging kernels characteristic of the OCO-2 and ASCENDS measurement concepts to create realistic pseudo-datasets. Pseudo-data are averaged over coarse model grid cell areas to better understand the ability of measurements to characterize CO2 distributions and spatial gradients on both short (daily to weekly) and long (monthly to seasonal) time scales.
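The subgrid-variability characterization described above amounts to aggregating the fine 10-km field into coarse cells and summarizing the distribution within each. A minimal numpy sketch, using per-cell mean and standard deviation as a simple stand-in for full probability distribution functions:

```python
import numpy as np

def subgrid_stats(field, block):
    """Aggregate a 2-D fine-resolution field into coarse cells of
    `block` x `block` fine cells, returning each coarse cell's mean
    and standard deviation over its fine members."""
    r, c = field.shape
    blocks = field.reshape(r // block, block, c // block, block)
    blocks = blocks.transpose(0, 2, 1, 3).reshape(r // block, c // block, -1)
    return blocks.mean(axis=-1), blocks.std(axis=-1)

fine = np.arange(16, dtype=float).reshape(4, 4)   # toy fine-resolution CO2 field
mean, std = subgrid_stats(fine, 2)                # 2x2 coarse grid
```

A full analysis would replace the mean/std summary with histograms per coarse cell, but the aggregation step is the same.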
Mahmoudzadeh, Amir Pasha; Kashou, Nasser H.
2013-01-01
Interpolation has become a default operation in image processing and medical imaging and is one of the important factors in the success of an intensity-based registration method. Interpolation is needed if the fractional unit of motion is not matched and located on the high resolution (HR) grid. The purpose of this work is to present a systematic evaluation of eight standard interpolation techniques (trilinear, nearest neighbor, cubic Lagrangian, quintic Lagrangian, heptic Lagrangian, windowed Sinc, B-spline 3rd order, and B-spline 4th order) and to compare the effect of cost functions (least squares (LS), normalized mutual information (NMI), normalized cross correlation (NCC), and correlation ratio (CR)) for optimized automatic image registration (OAIR) on 3D spoiled gradient recalled (SPGR) magnetic resonance images (MRI) of the brain acquired using a 3T GE MR scanner. Subsampling was performed in the axial, sagittal, and coronal directions to emulate three low resolution datasets. Afterwards, the low resolution datasets were upsampled using different interpolation methods, and they were then compared to the high resolution data. The mean squared error, peak signal-to-noise ratio, joint entropy, and cost functions were computed for quantitative assessment of the method. Magnetic resonance image scans and joint histogram were used for qualitative assessment of the method. PMID:24000283
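The subsample-upsample-compare protocol above can be sketched with plain numpy. This is an illustrative stand-in only: it uses nearest-neighbour upsampling on a random image and does not reproduce the paper's Lagrangian or B-spline kernels.

```python
import numpy as np

def mse_psnr(hr, rec):
    """Mean squared error and peak signal-to-noise ratio between a
    high-resolution reference and a reconstructed image."""
    err = np.mean((hr - rec) ** 2)
    psnr = 10 * np.log10(hr.max() ** 2 / err) if err > 0 else np.inf
    return err, psnr

rng = np.random.default_rng(0)
hr = rng.random((32, 32))                # stand-in for the HR acquisition
lo = hr[::2, ::2]                        # emulate low-resolution sampling
rec_nn = np.repeat(np.repeat(lo, 2, axis=0), 2, axis=1)  # nearest neighbour
err, psnr = mse_psnr(hr, rec_nn)
```

Swapping `rec_nn` for other reconstructions (spline, windowed sinc, etc.) and comparing the resulting MSE/PSNR pairs is the quantitative comparison the study performs.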
NASA Astrophysics Data System (ADS)
Hou, C. Y.; Dattore, R.; Peng, G. S.
2014-12-01
The National Center for Atmospheric Research's Global Climate Four-Dimensional Data Assimilation (CFDDA) Hourly 40km Reanalysis dataset is a dynamically downscaled dataset with high temporal and spatial resolution. The dataset contains three-dimensional hourly analyses in netCDF format for the global atmospheric state from 1985 to 2005 on a 40km horizontal grid (0.4° grid increment) with 28 vertical levels, providing good representation of local forcing and diurnal variation of processes in the planetary boundary layer. This project aimed to make the dataset publicly available, accessible, and usable in order to provide a unique resource to allow and promote studies of new climate characteristics. When the curation project started, it had been five years since the data files were generated. Also, although the Principal Investigator (PI) had generated a user document at the end of the project in 2009, the document had not been maintained. Furthermore, the PI had moved to a new institution, and the remaining team members were reassigned to other projects. These factors made data curation especially challenging in the areas of verifying data quality, harvesting metadata descriptions, and documenting provenance information. As a result, the project's curation process found that: a data curator's skill and knowledge helped inform decisions, such as file format, file structure, and workflow documentation, that had a significant, positive impact on the ease of the dataset's management and long-term preservation; use of data curation tools, such as the Data Curation Profiles Toolkit's guidelines, revealed important information for promoting the data's usability and enhancing preservation planning; and involving data curators during each stage of the data curation life cycle, instead of only at the end, could improve the curation process's efficiency.
Overall, the project showed that proper resources invested in the curation process would give datasets the best chance to fulfill their potential to help with new climate pattern discovery.
NASA Astrophysics Data System (ADS)
Brandl, C.; Reece, R.; Bayer, J.; Bales, M. K.
2016-12-01
Bonaire is located on the Bonaire microplate between the Caribbean and South American plates, and is part of the Netherlands Leeward Antilles as well as the ABC Islands, along with Aruba and Curacao. As the major tectonic plates move, they stress the microplate, which causes deformation in the form of faulting. This study utilizes legacy seismic reflection data combined with a recent nearshore survey to study tectonic deformation in the basins surrounding Bonaire. Our legacy data cover a large portion of the ABC Islands; one dataset is a 1981 multichannel seismic (MCS) WesternGeco survey and the other is a 1971 USGS survey that we converted from print to SEGY. The modern dataset (2013) is a high-resolution MCS survey acquired off the western coast of Bonaire. We will use the legacy datasets to validate previous interpretations in the nearshore environment and extend these interpretations to the deepwater basins. Faults influenced by regional tectonics are more evident in deepwater basins because of their lateral continuity and the offset of thick sedimentary strata. A recent study of nearshore Bonaire utilizing the high-resolution seismic dataset interpreted several NE-SW dipping normal faults, which may correspond to regional extension. However, the influence is not clear, perhaps due to a lack of data or the nearshore nature of the dataset. Analysis of the legacy datasets shows several areas in the surrounding basins with faults dipping NE-SW. Further analysis may reinforce observations made in the nearshore environment. Studying the tectonics of Bonaire can provide insight into the evolution of the region and help better define the effect of regional tectonic forces on the microplate. This study also shows the benefit of using legacy seismic datasets, which are publicly available but stored as print or film, in conjunction with modern data.
They can provide value to a modern study by expanding the scope of available data as well as increasing the number of questions a study can address.
A high resolution spatial population database of Somalia for disease risk mapping.
Linard, Catherine; Alegana, Victor A; Noor, Abdisalan M; Snow, Robert W; Tatem, Andrew J
2010-09-14
Millions of Somali have been deprived of basic health services due to the unstable political situation of their country. Attempts are being made to reconstruct the health sector, in particular to estimate the extent of infectious disease burden. However, any approach that requires the use of modelled disease rates requires reasonable information on population distribution. In a low-income country such as Somalia, population data are lacking, are of poor quality, or become outdated rapidly. Modelling methods are therefore needed for the production of contemporary and spatially detailed population data. Here land cover information derived from satellite imagery and existing settlement point datasets were used for the spatial reallocation of populations within census units. We used simple and semi-automated methods that can be implemented with free image processing software to produce an easily updatable gridded population dataset at 100 × 100 meters spatial resolution. The 2010 population dataset was matched to administrative population totals projected by the UN. Comparison tests between the new dataset and existing population datasets revealed important differences in population size distributions, and in population at risk of malaria estimates. These differences are particularly important in more densely populated areas and strongly depend on the settlement data used in the modelling approach. The results show that it is possible to produce detailed, contemporary and easily updatable settlement and population distribution datasets of Somalia using existing data. The 2010 population dataset produced is freely available as a product of the AfriPop Project and can be downloaded from: http://www.afripop.org.
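The spatial reallocation step above is dasymetric mapping: a census unit's population total is spread over its grid cells in proportion to land-cover weights. A minimal sketch, with class codes and weight values that are purely illustrative (not AfriPop's actual parameters):

```python
import numpy as np

def reallocate_population(census_total, landcover, weights):
    """Dasymetric redistribution: each cell receives a share of the
    census unit's population proportional to its land-cover weight.
    `landcover` is a grid of class codes; `weights` maps class code
    to a relative population density."""
    w = np.vectorize(weights.get)(landcover).astype(float)
    if w.sum() == 0:
        raise ValueError("no populated land cover in this census unit")
    return census_total * w / w.sum()

landcover = np.array([[2, 2, 1],
                      [1, 0, 0]])          # 2=built-up, 1=rural, 0=water
pop = reallocate_population(900.0, landcover, {0: 0.0, 1: 1.0, 2: 3.5})
```

The redistribution conserves the census total by construction, so gridded estimates still match the administrative totals they were derived from.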
Santos, Eduardo Jose Melos Dos; McCabe, Antony; Gonzalez-Galarza, Faviel F; Jones, Andrew R; Middleton, Derek
2016-03-01
The Allele Frequencies Net Database (AFND) is a freely accessible database which stores population frequencies for alleles or genes of the immune system in worldwide populations. Herein we introduce two new tools. We have defined new classifications of data (gold, silver and bronze) to assist users in identifying the most suitable populations for their tasks. The gold standard datasets are defined by allele frequencies summing to 1, sample sizes >50 and high resolution genotyping, while silver standard datasets do not meet gold standard genotyping resolution and/or sample size criteria. The bronze standard datasets are those that could not be classified under the silver or gold standards. The gold standard includes >500 datasets covering over 3 million individuals from >100 countries at one or more of the following loci: HLA-A, -B, -C, -DPA1, -DPB1, -DQA1, -DQB1 and -DRB1 - with all loci except DPA1 present in more than 220 datasets. Three out of 12 geographic regions have low representation (the majority of their countries having less than five datasets) and the Central Asia region has no representation. There are 18 countries that are not represented by any gold standard datasets but are represented by at least one dataset that is either silver or bronze standard. We also briefly summarize the data held by AFND for KIR genes, alleles and their ligands. Our second new component is a data submission tool to assist users in the collection of the genotypes of the individuals (raw data), facilitating submission of short population reports to Human Immunology, as well as simplifying the submission of population demographics and frequency data.
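The gold/silver/bronze rules quoted above can be written as a small decision function. The criteria come straight from the abstract (frequencies summing to 1, sample size >50, high-resolution genotyping); the numeric tolerance on the frequency sum is an assumption of this sketch, not an AFND parameter.

```python
def classify_dataset(freq_sum, sample_size, high_resolution, tol=0.015):
    """AFND-style data standards: 'gold' requires allele frequencies
    summing to 1 (within an assumed tolerance), n > 50, and
    high-resolution genotyping; 'silver' relaxes resolution and/or
    sample size but still requires frequencies summing to 1;
    everything else is 'bronze'."""
    sums_to_one = abs(freq_sum - 1.0) <= tol
    if sums_to_one and sample_size > 50 and high_resolution:
        return "gold"
    if sums_to_one:
        return "silver"
    return "bronze"
```

For example, a dataset with complete frequencies but only 30 individuals typed at low resolution would classify as silver under this reading of the criteria.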
Estimation of wind regime from combination of RCM and NWP data in the Gulf of Riga (Baltic Sea)
NASA Astrophysics Data System (ADS)
Sile, T.; Sennikovs, J.; Bethers, U.
2012-04-01
The Gulf of Riga is a semi-enclosed gulf located in the eastern part of the Baltic Sea. Reliable wind climate data are crucial for the development of wind energy. The objective of this study is to create high resolution wind parameter datasets for the Gulf of Riga using climate and numerical weather prediction (NWP) models as an alternative to observation-based methods, with the expectation of benefiting from the comparison of different approaches. The models used for the estimation of the wind regime are an ensemble of Regional Climate Models (RCM, ENSEMBLES, 23 runs are considered) and high resolution NWP data. Future projections provided by RCMs are of interest; however, their spatial resolution is unsatisfactory. We describe a method of spatial refinement of RCM data using NWP data to resolve small scale features. We apply the method of RCM bias correction (Sennikovs and Bethers, 2009), previously used for temperature and precipitation, to wind data, using NWP data instead of observations. The refinement function is calculated for the contemporary climate (1981-2010) and then applied to RCM near-future (2021-2050) projections to produce a dataset with the same resolution as the NWP data. This method corrects for RCM biases that were shown to be present in the initial analysis; inter-model statistical analysis was carried out to estimate uncertainty. Using the datasets produced by this method, the current and future projections of wind speed and wind energy density are calculated. Acknowledgments: This research is part of the GORWIND (The Gulf of Riga as a Resource for Wind Energy) project (EU34711). The ENSEMBLES data used in this work was funded by the EU FP6 Integrated Project ENSEMBLES (Contract number 505539), whose support is gratefully acknowledged.
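Bias corrections of this kind are often implemented as empirical quantile mapping: each RCM value is replaced by the reference (here NWP) value at the same quantile of the historical distribution. The sketch below illustrates that generic idea; it is not the exact form of the Sennikovs and Bethers (2009) method, and the wind-speed data are synthetic.

```python
import numpy as np

def quantile_map(rcm_hist, ref_hist, rcm_future):
    """Empirical quantile mapping: look up each future RCM value's
    quantile in the historical RCM distribution and map it to the
    reference value at the same quantile."""
    q = np.linspace(0, 1, 101)
    rcm_q = np.quantile(rcm_hist, q)
    ref_q = np.quantile(ref_hist, q)
    return np.interp(rcm_future, rcm_q, ref_q)

rng = np.random.default_rng(1)
rcm_hist = rng.gamma(2.0, 3.0, 5000)          # synthetic biased wind speeds
ref_hist = rcm_hist * 1.2 + 0.5               # "reference" with a known bias
corrected = quantile_map(rcm_hist, ref_hist, rcm_hist)
```

With the linear bias planted here, the mapping recovers the reference transformation; on real data the learned mapping is nonlinear and quantile-dependent, which is exactly why it is applied quantile by quantile.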
Challenges in Extracting Information From Large Hydrogeophysical-monitoring Datasets
NASA Astrophysics Data System (ADS)
Day-Lewis, F. D.; Slater, L. D.; Johnson, T.
2012-12-01
Over the last decade, new automated geophysical data-acquisition systems have enabled collection of increasingly large and information-rich geophysical datasets. Concurrent advances in field instrumentation, web services, and high-performance computing have made real-time processing, inversion, and visualization of large three-dimensional tomographic datasets practical. Geophysical-monitoring datasets have provided high-resolution insights into diverse hydrologic processes including groundwater/surface-water exchange, infiltration, solute transport, and bioremediation. Despite the high information content of such datasets, extraction of quantitative or diagnostic hydrologic information is challenging. Visual inspection and interpretation for specific hydrologic processes is difficult for datasets that are large, complex, and (or) affected by forcings (e.g., seasonal variations) unrelated to the target hydrologic process. New strategies are needed to identify salient features in spatially distributed time-series data and to relate temporal changes in geophysical properties to hydrologic processes of interest while effectively filtering unrelated changes. Here, we review recent work using time-series and digital-signal-processing approaches in hydrogeophysics. Examples include applications of cross-correlation, spectral, and time-frequency (e.g., wavelet and Stockwell transforms) approaches to (1) identify salient features in large geophysical time series; (2) examine correlation or coherence between geophysical and hydrologic signals, even in the presence of non-stationarity; and (3) condense large datasets while preserving information of interest. Examples demonstrate analysis of large time-lapse electrical tomography and fiber-optic temperature datasets to extract information about groundwater/surface-water exchange and contaminant transport.
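The cross-correlation analyses mentioned above typically reduce to finding the lag at which a geophysical time series best tracks a hydrologic forcing. A self-contained sketch with synthetic data (the signal names are illustrative only):

```python
import numpy as np

def best_lag(x, y, max_lag):
    """Lag (in samples) at which y correlates best with x, scanning
    -max_lag..max_lag; a positive lag means y trails x."""
    lags = list(range(-max_lag, max_lag + 1))
    scores = []
    for L in lags:
        if L > 0:
            scores.append(np.corrcoef(x[:-L], y[L:])[0, 1])
        elif L < 0:
            scores.append(np.corrcoef(x[-L:], y[:L])[0, 1])
        else:
            scores.append(np.corrcoef(x, y)[0, 1])
    return lags[int(np.argmax(scores))]

t = np.arange(200)
stage = np.sin(2 * np.pi * t / 50)            # synthetic river-stage forcing
# synthetic geophysical response: stage delayed by 7 samples, plus noise
response = np.roll(stage, 7) + 0.05 * np.random.default_rng(2).normal(size=200)
lag = best_lag(stage, response, 15)
```

Recovering the planted 7-sample delay is the simplest case; the spectral and time-frequency methods cited above extend this idea to non-stationary signals where a single global lag is not meaningful.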
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Katherine J; Hack, James J; Truesdale, John
A new high-resolution (0.9° x 1.25° in the horizontal) global tropospheric aerosol dataset with monthly resolution is generated using the finite-volume configuration of the Community Atmosphere Model (CAM4) coupled to a bulk aerosol model and forced with recent estimates of surface emissions for the latter part of the twentieth century. The surface emissions dataset is constructed from the Coupled Model Inter-comparison Project (CMIP5) decadal-resolution surface emissions dataset, extended to include the REanalysis of TROpospheric chemical composition (RETRO) wildfire monthly emissions dataset. Experiments forced with the new tropospheric aerosol dataset and conducted using the spectral configuration of CAM4 with a T85 truncation (1.4° x 1.4°) with prescribed twentieth-century observed sea surface temperature, sea ice and greenhouse gases reveal that variations in tropospheric aerosol levels can induce significant regional climate variability on inter-annual timescales. Regression analyses over the tropical Atlantic and Africa reveal that increasing dust aerosols can cool the North African landmass and shift convection southwards from West Africa into the Gulf of Guinea in the spring season in the simulations. Further, we find that increasing carbonaceous aerosols emanating from the southwestern African savannas can cool the region significantly and increase the marine stratocumulus cloud cover over the southeast tropical Atlantic Ocean by aerosol-induced diabatic heating of the free troposphere above the low clouds. Experiments conducted with CAM4 coupled to a slab ocean model suggest that present-day aerosols can shift the ITCZ southwards over the tropical Atlantic and can reduce the ocean mixed layer temperature beneath the increased marine stratocumulus clouds in the southeastern tropical Atlantic.
User's Guide for MapIMG 2: Map Image Re-projection Software Package
Finn, Michael P.; Trent, Jason R.; Buehler, Robert A.
2006-01-01
BACKGROUND Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in commercial software packages, but implementation with data other than points requires specific adaptation of the transformation equations or prior preparation of the data to allow the transformation to succeed. It seems that some of these packages use the U.S. Geological Survey's (USGS) General Cartographic Transformation Package (GCTP) or similar point transformations without adaptation to the specific characteristics of raster data (Usery and others, 2003a). Usery and others (2003b) compiled and tabulated the accuracy of categorical areas in projected raster datasets of global extent. Based on the shortcomings identified in these studies, geographers and applications programmers at the USGS expanded and evolved a USGS software package, MapIMG, for raster map projection transformation (Finn and Trent, 2004). Daniel R. Steinwand of Science Applications International Corporation, National Center for Earth Resources Observation and Science, originally developed MapIMG for the USGS, basing it on GCTP. Through previous and continuing efforts at the USGS' National Geospatial Technical Operations Center, this program has been transformed from an application based on command line input into a software package based on a graphical user interface for Windows, Linux, and other UNIX machines.
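The point-to-point transformation that GCTP-style packages implement amounts to evaluating a pair of projection equations per coordinate. A minimal sketch using the sinusoidal (equal-area) projection, with an assumed spherical radius, illustrates the operation; for rasters, every cell (not just points) must be transformed and resampled, which is the adaptation MapIMG provides.

```python
import math

R = 6371007.181  # spherical radius in metres (an assumed value, not from GCTP)

def sinusoidal_forward(lon_deg, lat_deg):
    """Forward sinusoidal projection: geographic degrees -> projected metres."""
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    return R * lon * math.cos(lat), R * lat

def sinusoidal_inverse(x, y):
    """Inverse sinusoidal projection: projected metres -> geographic degrees."""
    lat = y / R
    lon = x / (R * math.cos(lat))
    return math.degrees(lon), math.degrees(lat)

x, y = sinusoidal_forward(10.0, 50.0)
lon, lat = sinusoidal_inverse(x, y)  # round-trips to (10.0, 50.0)
```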
NASA Astrophysics Data System (ADS)
Abul Ehsan Bhuiyan, Md; Nikolopoulos, Efthymios I.; Anagnostou, Emmanouil N.; Quintana-Seguí, Pere; Barella-Ortiz, Anaïs
2018-02-01
This study investigates the use of a nonparametric, tree-based model, quantile regression forests (QRF), for combining multiple global precipitation datasets and characterizing the uncertainty of the combined product. We used the Iberian Peninsula as the study area, with a study period spanning 11 years (2000-2010). Inputs to the QRF model included three satellite precipitation products, CMORPH, PERSIANN, and 3B42 (V7); an atmospheric reanalysis precipitation and air temperature dataset; satellite-derived near-surface daily soil moisture data; and a terrain elevation dataset. We calibrated the QRF model for two seasons and two terrain elevation categories and used it to generate precipitation ensembles for these conditions. Evaluation of the combined product was based on a high-resolution, ground-reference precipitation dataset (SAFRAN) available at 5-km, 1-h resolution. Furthermore, to evaluate relative improvements and the overall impact of the combined product on hydrological response, we used the generated ensemble to force a distributed hydrological model (the SURFEX land surface model and the RAPID river routing scheme) and compared its streamflow simulation results with the corresponding simulations from the individual global precipitation and reference datasets. We concluded that the proposed technique could generate realizations that successfully encapsulate the reference precipitation and provide significant improvement in streamflow simulations, with reductions in systematic and random error on the order of 20-99 % and 44-88 %, respectively, when considering the ensemble mean.
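The QRF model itself is beyond a short sketch, but the evaluation criterion above — whether a generated ensemble "encapsulates" the reference precipitation — can be illustrated with synthetic data: compute ensemble quantiles per grid cell and check how often the reference falls inside them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 50 ensemble realizations of precipitation at 100 cells,
# and an independent "reference" drawn from the same distribution
ensemble = rng.gamma(shape=2.0, scale=3.0, size=(50, 100))
reference = rng.gamma(shape=2.0, scale=3.0, size=100)

# 5th-95th percentile envelope of the ensemble, per cell
q05 = np.quantile(ensemble, 0.05, axis=0)
q95 = np.quantile(ensemble, 0.95, axis=0)

# Fraction of cells where the reference lies inside the envelope
coverage = float(np.mean((reference >= q05) & (reference <= q95)))
```

For a well-calibrated ensemble this coverage should be close to the nominal 90 % of the chosen envelope.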
Statistical Projections for Multi-resolution, Multi-dimensional Visual Data Exploration and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoa T. Nguyen; Stone, Daithi; E. Wes Bethel
2016-01-01
An ongoing challenge in visual exploration and analysis of large, multi-dimensional datasets is how to present useful, concise information to a user for specific visualization tasks. Typical approaches to this problem have proposed either reduced-resolution versions of data, or projections of data, or both. These approaches still have limitations, such as high computational cost or the introduction of error. In this work, we explore the use of a statistical metric as the basis for both projections and reduced-resolution versions of data, with a particular focus on preserving one key trait in data, namely variation. We use two different case studies to explore this idea: one that uses a synthetic dataset, and another that uses a large ensemble collection produced by an atmospheric modeling code to study long-term changes in global precipitation. The primary finding of our work is that, in terms of preserving the variation signal inherent in data, a statistical measure more faithfully preserves this key characteristic across both multi-dimensional projections and multi-resolution representations than a methodology based upon averaging.
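The core idea — reducing resolution with a statistic that preserves variation rather than averaging it away — can be sketched as a block reduction over a synthetic 2-D field:

```python
import numpy as np

def block_reduce(data, factor, stat):
    """Reduce a 2-D field by `factor` along each axis with the given statistic."""
    h, w = data.shape
    blocks = data.reshape(h // factor, factor, w // factor, factor)
    return stat(blocks, axis=(1, 3))

rng = np.random.default_rng(1)
field = rng.normal(size=(64, 64))  # synthetic high-variation field

low_mean = block_reduce(field, 8, np.mean)  # averaging damps the variation signal
low_std = block_reduce(field, 8, np.std)    # a statistic that carries variation forward
```

For a zero-mean noisy field, the block means collapse toward zero while the block standard deviations retain the field's variability, which is the behaviour the study exploits.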
Super-resolution reconstruction of MR image with a novel residual learning network algorithm
NASA Astrophysics Data System (ADS)
Shi, Jun; Liu, Qingping; Wang, Chaofeng; Zhang, Qi; Ying, Shihui; Xu, Haoyu
2018-04-01
Spatial resolution is one of the key parameters of magnetic resonance imaging (MRI). The image super-resolution (SR) technique offers an alternative approach to improve the spatial resolution of MRI due to its simplicity. Convolutional neural networks (CNN)-based SR algorithms have achieved state-of-the-art performance, in which the global residual learning (GRL) strategy is now commonly used due to its effectiveness for learning image details for SR. However, the partial loss of image details usually happens in a very deep network due to the degradation problem. In this work, we propose a novel residual learning-based SR algorithm for MRI, which combines both multi-scale GRL and shallow network block-based local residual learning (LRL). The proposed LRL module works effectively in capturing high-frequency details by learning local residuals. One simulated MRI dataset and two real MRI datasets have been used to evaluate our algorithm. The experimental results show that the proposed SR algorithm achieves superior performance to all of the other compared CNN-based SR algorithms in this work.
The SeaFlux Turbulent Flux Dataset Version 1.0 Documentation
NASA Technical Reports Server (NTRS)
Clayson, Carol Anne; Roberts, J. Brent; Bogdanoff, Alec S.
2012-01-01
Under the auspices of the World Climate Research Programme (WCRP) Global Energy and Water cycle EXperiment (GEWEX) Data and Assessment Panel (GDAP), the SeaFlux Project was created to investigate producing a high-resolution satellite-based dataset of surface turbulent fluxes over the global oceans. The most current release of the SeaFlux product is Version 1.0; this represents the initial release of turbulent surface heat fluxes and associated near-surface variables, including a diurnally varying sea surface temperature.
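Turbulent surface heat fluxes of the kind SeaFlux provides are classically estimated with bulk aerodynamic formulas. The sketch below uses illustrative constant coefficients and made-up input values, not the SeaFlux algorithm itself:

```python
# Bulk aerodynamic estimate of surface sensible heat flux:
#   H = rho * cp * C_H * U * (T_s - T_a)   [W m^-2]
RHO_AIR = 1.2    # kg m^-3, near-surface air density (assumed constant)
CP_AIR = 1004.0  # J kg^-1 K^-1, specific heat of dry air
CH = 1.2e-3      # dimensionless bulk transfer coefficient (typical open-ocean value)

def sensible_heat_flux(wind_speed, t_sea, t_air):
    """Sensible heat flux in W m^-2 from bulk variables (SI units, K)."""
    return RHO_AIR * CP_AIR * CH * wind_speed * (t_sea - t_air)

# Hypothetical conditions: 8 m/s wind, sea 1.5 K warmer than air
flux = sensible_heat_flux(wind_speed=8.0, t_sea=300.0, t_air=298.5)
```

Real flux algorithms replace the constant transfer coefficient with stability-dependent parameterizations; this is only the structural form.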
HIGH-RESOLUTION DATASET OF URBAN CANOPY PARAMETERS FOR HOUSTON, TEXAS
Urban dispersion and air quality simulation models applied at various horizontal scales require different levels of fidelity for specifying the characteristics of the underlying surfaces. As the modeling scales approach the neighborhood level (~1 km horizontal grid spacing), the...
Carreer, William J.; Flight, Robert M.; Moseley, Hunter N. B.
2013-01-01
New metabolomics applications of ultra-high resolution and accuracy mass spectrometry can provide thousands of detectable isotopologues, with the number of potentially detectable isotopologues increasing exponentially with the number of stable isotopes used in newer isotope tracing methods like stable isotope-resolved metabolomics (SIRM) experiments. This huge increase in usable data requires software capable of correcting the large number of isotopologue peaks resulting from SIRM experiments in a timely manner. We describe the design of a new algorithm and software system capable of handling these high volumes of data, while including quality control methods for maintaining data quality. We validate this new algorithm against a previous single isotope correction algorithm in a two-step cross-validation. Next, we demonstrate the algorithm and correct for the effects of natural abundance for both 13C and 15N isotopes on a set of raw isotopologue intensities of UDP-N-acetyl-D-glucosamine derived from a 13C/15N-tracing experiment. Finally, we demonstrate the algorithm on a full omics-level dataset. PMID:24404440
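A single-isotope natural-abundance correction of the kind being generalized here can be written as a lower-triangular linear system: a binomial matrix maps corrected intensities to observed ones, and forward substitution inverts it. The sketch below handles 13C only, for a hypothetical 6-carbon compound; the multi-isotope algorithm in the paper extends this idea.

```python
import math

P13C = 0.0107  # natural abundance of 13C (assumed constant)

def correction_matrix(n_carbons):
    """M[i][j]: probability that a species with j labelled carbons is observed
    with i heavy carbons, given natural abundance on the n-j unlabelled ones."""
    n = n_carbons
    m = [[0.0] * (n + 1) for _ in range(n + 1)]
    for j in range(n + 1):
        for k in range(n - j + 1):
            m[j + k][j] = math.comb(n - j, k) * P13C**k * (1 - P13C)**(n - j - k)
    return m

def correct(observed):
    """Solve M x = observed by forward substitution (M is lower triangular)."""
    n = len(observed) - 1
    m = correction_matrix(n)
    x = [0.0] * (n + 1)
    for i in range(n + 1):
        x[i] = (observed[i] - sum(m[i][j] * x[j] for j in range(i))) / m[i][i]
    return x

# Synthetic check: a purely unlabelled species of intensity 100 spread across
# isotopologue peaks by natural abundance, then corrected back
m6 = correction_matrix(6)
observed = [m6[i][0] * 100.0 for i in range(7)]
corrected = correct(observed)
```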
Stoker, Jason M.; Tyler, Dean J.; Turnipseed, D. Phil; Van Wilson, K.; Oimoen, Michael J.
2009-01-01
Hurricane Katrina was one of the largest natural disasters in U.S. history. Due to the sheer size of the affected areas, an unprecedented regional analysis at very high resolution and accuracy was needed to properly quantify and understand the effects of the hurricane and the storm tide. Many disparate sources of lidar data were acquired and processed for varying environmental reasons by pre- and post-Katrina projects. The datasets were in several formats and projections and were processed to varying phases of completion, and as a result the task of producing a seamless digital elevation dataset required a high level of coordination, research, and revision. To create a seamless digital elevation dataset, many technical issues had to be resolved before producing the desired 1/9-arc-second (3-meter) grid needed as the map base for projecting the Katrina peak storm tide throughout the affected coastal region. This report presents the methodology that was developed to construct seamless digital elevation datasets from multipurpose, multi-use, and disparate lidar datasets, and describes an easily accessible Web application for viewing the maximum storm tide caused by Hurricane Katrina in southeastern Louisiana, Mississippi, and Alabama.
Endalamaw, Abraham; Bolton, W. Robert; Young-Robertson, Jessica M.; ...
2017-09-14
Modeling hydrological processes in the Alaskan sub-arctic is challenging because of the extreme spatial heterogeneity in soil properties and vegetation communities. Nevertheless, modeling and predicting hydrological processes is critical in this region due to its vulnerability to the effects of climate change. Coarse-spatial-resolution datasets used in land surface modeling pose a new challenge in simulating the spatially distributed and basin-integrated processes since these datasets do not adequately represent the small-scale hydrological, thermal, and ecological heterogeneity. The goal of this study is to improve the prediction capacity of mesoscale to large-scale hydrological models by introducing a small-scale parameterization scheme, which better represents the spatial heterogeneity of soil properties and vegetation cover in the Alaskan sub-arctic. The small-scale parameterization schemes are derived from observations and a sub-grid parameterization method in the two contrasting sub-basins of the Caribou Poker Creek Research Watershed (CPCRW) in Interior Alaska: one nearly permafrost-free (LowP) sub-basin and one permafrost-dominated (HighP) sub-basin. The sub-grid parameterization method used in the small-scale parameterization scheme is derived from the watershed topography. We found that observed soil thermal and hydraulic properties – including the distribution of permafrost and vegetation cover heterogeneity – are better represented in the sub-grid parameterization method than in the coarse-resolution datasets. Parameters derived from the coarse-resolution datasets and from the sub-grid parameterization method are implemented into the variable infiltration capacity (VIC) mesoscale hydrological model to simulate runoff, evapotranspiration (ET), and soil moisture in the two sub-basins of the CPCRW.
Simulated hydrographs based on the small-scale parameterization capture most of the peak and low flows, with similar accuracy in both sub-basins, compared to simulated hydrographs based on the coarse-resolution datasets. On average, the small-scale parameterization scheme improves the total runoff simulation by up to 50 % in the LowP sub-basin and by up to 10 % in the HighP sub-basin relative to the large-scale parameterization. This study shows that the proposed sub-grid parameterization method can be used to improve the performance of mesoscale hydrological models in Alaskan sub-arctic watersheds.
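The essence of such a sub-grid parameterization — carrying a fractional permafrost cover per coarse model cell rather than a single dominant class — can be sketched by block-averaging a fine-scale mask (values below are synthetic, not CPCRW data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical fine-scale mask, 8x finer than the model grid:
# 1 = permafrost, 0 = permafrost-free
fine = (rng.random((32, 32)) < 0.3).astype(float)

f = 8  # fine cells per coarse cell along each axis
coarse_fraction = fine.reshape(4, f, 4, f).mean(axis=(1, 3))

# What a coarse-resolution dataset would typically store instead:
dominant_class = (coarse_fraction >= 0.5).astype(float)
```

The fractional field conserves the basin-wide permafrost extent exactly, while the dominant-class field can lose heterogeneous cells entirely; that information loss is what degrades the coarse-parameterization simulations.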
Variability in Tropospheric Ozone over China Derived from Assimilated GOME-2 Ozone Profiles
NASA Astrophysics Data System (ADS)
van Peet, J. C. A.; van der A, R. J.; Kelder, H. M.
2016-08-01
A tropospheric ozone dataset is derived from assimilated GOME-2 ozone profiles for 2008. Ozone profiles are retrieved with the OPERA algorithm, using the optimal estimation method. The retrievals are done at a spatial resolution of 160×160 km on 16 layers ranging from the surface up to 0.01 hPa. By using the averaging kernels in the data assimilation, the algorithm maintains the high-resolution vertical structure of the model while being constrained by observations with a lower vertical resolution.
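Applying an averaging kernel is a linear operation: the retrieved profile equals the a priori plus the kernel applied to the departure from it. A minimal sketch with a hypothetical 16-layer profile and a simple row-normalized smoothing kernel (not the OPERA kernels):

```python
import numpy as np

n = 16
x_apriori = np.full(n, 50.0)  # a priori ozone profile (arbitrary units)
# "True" profile with a peak near layer 10
x_true = 50.0 + 30.0 * np.exp(-0.5 * ((np.arange(n) - 10) / 2.0) ** 2)

# Illustrative averaging kernel: each layer is a Gaussian-weighted mean
# of its neighbours, rows normalized to sum to 1
A = np.exp(-0.5 * ((np.arange(n)[:, None] - np.arange(n)[None, :]) / 1.5) ** 2)
A /= A.sum(axis=1, keepdims=True)

# What a low-vertical-resolution instrument would "see" of the true profile
x_retrieved = x_apriori + A @ (x_true - x_apriori)
```

The retrieved peak is smoothed (lower and broader than the true one), which is exactly why the kernels must be carried into the assimilation rather than treating retrievals as point estimates.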
NASA Astrophysics Data System (ADS)
Dube, Timothy; Mutanga, Onisimo
2016-09-01
Reliable and accurate mapping and extraction of key forest indicators of ecosystem development and health, such as aboveground biomass (AGB) and aboveground carbon stocks (AGCS), is critical in understanding forests' contribution to the local, regional, and global carbon cycle. This information is critical in assessing forest contribution towards ecosystem functioning and services, as well as their conservation status. This work aimed at assessing the applicability of the high-resolution 8-band WorldView-2 multispectral dataset together with environmental variables in quantifying AGB and aboveground carbon stocks for three forest plantation species, i.e. Eucalyptus dunii (ED), Eucalyptus grandis (EG) and Pinus taeda (PT), in uMgeni Catchment, South Africa. Specifically, the strength of the WorldView-2 sensor in terms of its improved imaging abilities is examined as an independent dataset and in conjunction with selected environmental variables. The results have demonstrated that the integration of high-resolution 8-band WorldView-2 multispectral data with environmental variables provides improved AGB and AGCS estimates, when compared to the use of spectral data as an independent dataset. The use of integrated datasets yielded a high R2 value of 0.88 and RMSEs of 10.05 t ha-1 and 5.03 t C ha-1 for E. dunii AGB and carbon stocks, whereas the use of spectral data as an independent dataset yielded slightly weaker results, producing an R2 value of 0.73 and RMSEs of 18.57 t ha-1 and 9.29 t C ha-1. Similarly, highly accurate results (R2 value of 0.73 and RMSE values of 27.30 t ha-1 and 13.65 t C ha-1) were observed from the estimation of inter-species AGB and carbon stocks. Overall, the findings of this work have shown that the integration of new generation multispectral datasets with environmental variables provides a robust toolset required for the accurate and reliable retrieval of forest aboveground biomass and carbon stocks in densely forested terrestrial ecosystems.
Kalwij, Jesse M; Robertson, Mark P; Ronk, Argo; Zobel, Martin; Pärtel, Meelis
2014-01-01
Much ecological research relies on existing multispecies distribution datasets. Such datasets, however, can vary considerably in quality, extent, resolution or taxonomic coverage. We provide a framework for a spatially-explicit evaluation of geographical representation within large-scale species distribution datasets, using the comparison of an occurrence atlas with a range atlas dataset as a working example. Specifically, we compared occurrence maps for 3773 taxa from the widely-used Atlas Florae Europaeae (AFE) with digitised range maps for 2049 taxa of the lesser-known Atlas of North European Vascular Plants. We calculated the level of agreement at a 50-km spatial resolution using average latitudinal and longitudinal species range, and area of occupancy. Agreement in species distribution was calculated and mapped using Jaccard similarity index and a reduced major axis (RMA) regression analysis of species richness between the entire atlases (5221 taxa in total) and between co-occurring species (601 taxa). We found no difference in distribution ranges or in the area of occupancy frequency distribution, indicating that atlases were sufficiently overlapping for a valid comparison. The similarity index map showed high levels of agreement for central, western, and northern Europe. The RMA regression confirmed that geographical representation of AFE was low in areas with a sparse data recording history (e.g., Russia, Belarus and the Ukraine). For co-occurring species in south-eastern Europe, however, the Atlas of North European Vascular Plants showed remarkably higher richness estimations. Geographical representation of atlas data can be much more heterogeneous than often assumed. Level of agreement between datasets can be used to evaluate geographical representation within datasets. Merging atlases into a single dataset is worthwhile in spite of methodological differences, and helps to fill gaps in our knowledge of species distribution ranges. 
Species distribution dataset mergers, such as the one exemplified here, can serve as a baseline towards comprehensive species distribution datasets.
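The agreement measure used above, the Jaccard similarity index, is straightforward to compute from sets of occupied grid cells (the cell indices below are hypothetical, not AFE data):

```python
def jaccard(cells_a, cells_b):
    """Jaccard similarity of two sets of occupied 50-km grid cells."""
    a, b = set(cells_a), set(cells_b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical occurrence cells for one taxon in two atlases
atlas_a = {(10, 4), (10, 5), (11, 5), (12, 6)}
atlas_b = {(10, 5), (11, 5), (12, 6), (13, 6)}

agreement = jaccard(atlas_a, atlas_b)  # shared cells / all occupied cells
```

Mapping this index per region, as the study does, turns a pair of atlases into a spatially explicit picture of where their geographical representation agrees.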
Retinal optical coherence tomography at 1 μm with dynamic focus control and axial motion tracking
NASA Astrophysics Data System (ADS)
Cua, Michelle; Lee, Sujin; Miao, Dongkai; Ju, Myeong Jin; Mackenzie, Paul J.; Jian, Yifan; Sarunic, Marinko V.
2016-02-01
High-resolution optical coherence tomography (OCT) retinal imaging is important to noninvasively visualize the various retinal structures to aid in better understanding of the pathogenesis of vision-robbing diseases. However, conventional OCT systems have a trade-off between lateral resolution and depth-of-focus. In this report, we present the development of a focus-stacking OCT system with automatic focus optimization for high-resolution, extended-focal-range clinical retinal imaging by incorporating a variable-focus liquid lens into the sample arm optics. Retinal layer tracking and selection was performed using a graphics processing unit accelerated processing platform for focus optimization, providing real-time layer-specific en face visualization. After optimization, multiple volumes focused at different depths were acquired, registered, and stitched together to yield a single, high-resolution focus-stacked dataset. Using this system, we show high-resolution images of the retina and optic nerve head, from which we extracted clinically relevant parameters such as the nerve fiber layer thickness and lamina cribrosa microarchitecture.
Generation of High Resolution Global DSM from ALOS PRISM
NASA Astrophysics Data System (ADS)
Takaku, J.; Tadono, T.; Tsutsui, K.
2014-04-01
Panchromatic Remote-sensing Instrument for Stereo Mapping (PRISM), one of the onboard sensors carried by the Advanced Land Observing Satellite (ALOS), was designed to generate worldwide topographic data from its optical stereoscopic observations. The sensor consists of three independent panchromatic radiometers, viewing forward, nadir, and backward at 2.5 m ground resolution, producing a triplet stereoscopic image along its track. The sensor observed a huge volume of stereo imagery all over the world during the mission life of the satellite, from 2006 through 2011. We have semi-automatically processed Digital Surface Model (DSM) data from the image archives in some limited areas. The height accuracy of the dataset was estimated at less than 5 m (rms) from evaluation against ground control points (GCPs) or reference DSMs derived from Light Detection and Ranging (LiDAR). We then decided to process global DSM datasets from all available archives of PRISM stereo images by the end of March 2016. This paper briefly reports on the latest processing algorithms for the global DSM datasets as well as preliminary results for some test sites. The accuracies and error characteristics of the datasets are analyzed and discussed for various fields through comparison with existing global datasets such as Ice, Cloud, and land Elevation Satellite (ICESat) data and Shuttle Radar Topography Mission (SRTM) data, as well as with GCPs and reference airborne LiDAR DSMs.
Modelling Biophysical Parameters of Maize Using Landsat 8 Time Series
NASA Astrophysics Data System (ADS)
Dahms, Thorsten; Seissiger, Sylvia; Conrad, Christopher; Borg, Erik
2016-06-01
Open and free access to multi-frequent high-resolution data (e.g. Sentinel-2) will fortify agricultural applications based on satellite data. The temporal and spatial resolution of these remote sensing datasets directly affects the applicability of remote sensing methods, for instance a robust retrieval of biophysical parameters over the entire growing season at very high geometric resolution. In this study we use machine learning methods to predict biophysical parameters, namely the fraction of absorbed photosynthetically active radiation (FPAR), the leaf area index (LAI), and the chlorophyll content, from high resolution remote sensing. 30 Landsat 8 OLI scenes were available for our study region in Mecklenburg-Western Pomerania, Germany. In-situ data were collected weekly to bi-weekly on 18 maize plots throughout the summer season of 2015. The study aims at an optimized prediction of biophysical parameters and the identification of the best explaining spectral bands and vegetation indices. For this purpose, we used the entire in-situ dataset from 24.03.2015 to 15.10.2015. Random forests and conditional inference forests were used because of their strong exploratory and predictive character. Variable importance measures allowed us to analyse the relation between the biophysical parameters and the spectral response, and the performance of the two approaches over the development of the plant stock. Classical random forest regression outperformed conditional inference forests, in particular when modelling the biophysical parameters over the entire growing period. For example, modelling biophysical parameters of maize for the entire vegetation period using random forests yielded: FPAR: R² = 0.85, RMSE = 0.11; LAI: R² = 0.64, RMSE = 0.9; and chlorophyll content (SPAD): R² = 0.80, RMSE = 4.9.
Our results demonstrate the great potential in using machine-learning methods for the interpretation of long-term multi-frequent remote sensing datasets to model biophysical parameters.
Hair-bundle proteomes of avian and mammalian inner-ear utricles
Wilmarth, Phillip A.; Krey, Jocelyn F.; Shin, Jung-Bum; Choi, Dongseok; David, Larry L.; Barr-Gillespie, Peter G.
2015-01-01
Examination of multiple proteomics datasets within or between species increases the reliability of protein identification. We report here proteomes of inner-ear hair bundles from three species (chick, mouse, and rat), which were collected on LTQ or LTQ Velos ion-trap mass spectrometers; the constituent proteins were quantified using MS2 intensities, which are the summed intensities of all peptide fragmentation spectra matched to a protein. The data are available via ProteomeXchange with identifiers PXD002410 (chick LTQ), PXD002414 (chick Velos), PXD002415 (mouse Velos), and PXD002416 (rat LTQ). The two chick bundle datasets compared favourably to a third, already-described chick bundle dataset, which was quantified using MS1 peak intensities, the summed intensities of peptides identified by high-resolution mass spectrometry (PXD000104; updated analysis in PXD002445). The mouse bundle dataset described here was comparable to a different mouse bundle dataset quantified using MS1 intensities (PXD002167). These six datasets will be useful for identifying the core proteome of vestibular hair bundles. PMID:26645194
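MS2-intensity quantification as described — summing the intensities of all fragmentation spectra matched to a protein — reduces to a grouped sum. The peptide-spectrum matches and protein names below are hypothetical:

```python
# Hypothetical (protein, fragment-spectrum intensity) peptide-spectrum matches
psms = [
    ("MYO7A", 1.2e6), ("MYO7A", 8.0e5),
    ("ESPN", 4.5e5),
    ("ACTB", 2.0e6), ("ACTB", 1.5e6), ("ACTB", 5.0e5),
]

# MS2 intensity per protein: sum of all matched spectrum intensities
ms2_intensity = {}
for protein, intensity in psms:
    ms2_intensity[protein] = ms2_intensity.get(protein, 0.0) + intensity

# Relative abundance as a fraction of the total summed signal
total = sum(ms2_intensity.values())
relative = {p: v / total for p, v in ms2_intensity.items()}
```

The MS1-based quantification mentioned for the other datasets differs only in what is summed (precursor peak intensities rather than fragment spectra), not in this grouping structure.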
Detecting and Quantifying Forest Change: The Potential of Existing C- and X-Band Radar Datasets.
Tanase, Mihai A; Ismail, Ismail; Lowell, Kim; Karyanto, Oka; Santoro, Maurizio
2015-01-01
This paper evaluates the opportunity provided by global interferometric radar datasets for monitoring deforestation, degradation, and forest regrowth in tropical and semi-arid environments. The paper describes an easy-to-implement method for detecting forest spatial changes and estimating their magnitude. The datasets were acquired by space-borne high-spatial-resolution radar missions at near-global scales, and are thus significant for monitoring systems developed under the United Nations Framework Convention on Climate Change (UNFCCC). The approach presented in this paper was tested in two areas located in Indonesia and Australia. Forest change estimation was based on differences between a reference dataset acquired in February 2000 by the Shuttle Radar Topography Mission (SRTM) and TanDEM-X mission (TDM) datasets acquired in 2011 and 2013. The synergy between SRTM and TDM datasets allowed not only identifying changes in forest extent but also estimating their magnitude with respect to the reference through variations in forest height.
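The change-detection core of this approach is a per-pixel difference of the two surface models, thresholded to flag large height losses. The heights and the threshold below are synthetic assumptions for illustration, not values from the paper:

```python
import numpy as np

# Synthetic 2x2 surface heights in metres
srtm_2000 = np.array([[30.0, 28.0],
                      [25.0, 27.0]])  # reference epoch (SRTM, Feb 2000)
tdm_2013 = np.array([[12.0, 27.5],
                     [26.0, 5.0]])    # later epoch (TDM)

change = tdm_2013 - srtm_2000          # negative = surface-height loss

LOSS_THRESHOLD = -10.0  # m; assumed height drop large enough to flag clearing
deforested = change <= LOSS_THRESHOLD  # boolean change mask
```

Because both missions sense the canopy surface, the magnitude of the negative difference also gives a first-order estimate of the height of forest removed.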
NASA Astrophysics Data System (ADS)
Anker, Y.; Hershkovitz, Y.; Gasith, A.; Ben-Dor, E.
2011-12-01
Although remote sensing of fluvial ecosystems is well developed, the tradeoff between spectral and spatial resolutions prevents its application in small streams (<3 m width). In the current study, a remote sensing approach for monitoring and research of small stream ecosystems was developed. The method is based on differentiation between two indicative vegetation species of the ecosystem flora. Because the channel, when studied, was covered mostly by a filamentous green alga (Cladophora glomerata) and watercress (Nasturtium officinale), these species were chosen as indicative; common reed (Phragmites australis) was nonetheless also classified in order to exclude it from the stream ROI. The procedure included: A. For both section- and habitat-scale classifications, acquisition of aerial digital RGB datasets. B. For section-scale classification, hyperspectral (HSR) dataset acquisition. C. For calibration, HSR reflectance measurements of specific ground targets in close proximity to each dataset acquisition swath. D. For habitat-scale classification, manual in-stream flora grid-transect classification. The digital RGB datasets were converted to reflectance units by spectral calibration against colored reference plates. These red, green, blue, white, and black EVA foam reference plates were measured by an ASD field spectrometer and each was given a spectral value. Each spectral value was later applied to the spectral calibration and radiometric correction of a spectral RGB (SRGB) cube. Spectral calibration of the HSR dataset was done using the empirical line method, based on reference values of progressive grey-scale targets. Differentiation between the vegetation species was done by supervised classification for both the HSR and the SRGB datasets. This procedure used the Spectral Angle Mapper function with the spectral pattern of each vegetation species as a spectral end member.
Comparison between the two remote sensing techniques and between the SRGB classification and the in-situ transects indicates that: A. Stream vegetation classification resolution is about 4 cm by the SRGB method compared to about 1 m by HSR. Moreover, this resolution is also higher than of the manual grid transect classification. B. The SRGB method is by far the most cost-efficient. The combination of spectral information (rather than the cognitive color) and high spatial resolution of aerial photography provides noise filtration and better sub-water detection capabilities than the HSR technique. C. Only the SRGB method applies for habitat and section scales; hence, its application together with in-situ grid transects for validation, may be optimal for use in similar scenarios.
(For the comparison between the two techniques, the HSR dataset was first degraded to 17 bands with the same spectral range as the RGB dataset, and also to a dataset with 3 equivalent bands.)
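The Spectral Angle Mapper classification used here assigns each pixel to the end member whose spectrum subtends the smallest angle with the pixel spectrum. A minimal sketch with hypothetical 3-band (SRGB-like) spectra:

```python
import math

def spectral_angle(a, b):
    """Spectral angle (radians) between a pixel spectrum and an end member."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    # clamp for floating-point safety before acos
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def classify(pixel, endmembers):
    """Assign the end member with the smallest spectral angle to the pixel."""
    return min(endmembers, key=lambda name: spectral_angle(pixel, endmembers[name]))

# Hypothetical reflectance end members for the three target species
endmembers = {
    "Cladophora": [0.10, 0.35, 0.08],
    "Nasturtium": [0.12, 0.28, 0.15],
    "Phragmites": [0.20, 0.30, 0.18],
}

label = classify([0.09, 0.33, 0.07], endmembers)
```

Because the angle depends only on spectral shape, not brightness, SAM tolerates the illumination differences typical of sub-water and shaded stream pixels.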
RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system
Jensen, Tue V.; Pinson, Pierre
2017-01-01
Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset, open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation. PMID:29182600
Mapping regional soil water erosion risk in the Brittany-Loire basin for water management agency
NASA Astrophysics Data System (ADS)
Degan, Francesca; Cerdan, Olivier; Salvador-Blanes, Sébastien; Gautier, Jean-Noël
2014-05-01
Soil water erosion is one of the main degradation processes affecting soils through the removal of soil particles from the surface. The impacts on the environment and agricultural areas are diverse: water pollution, crop yield depression, organic matter loss and reduction in water storage capacity. There is therefore a strong need to produce maps at the regional scale to help environmental policy makers and soil and water management bodies mitigate the effects of water and soil pollution. Our approach aims to model and map soil erosion risk at regional scale (155 000 km²) and high spatial resolution (50 m) in the Brittany-Loire basin. The factors responsible for soil erosion differ according to the spatial and temporal scales considered. The regional scale entails challenges concerning the availability of homogeneous datasets, the spatial resolution of results, and the variety of erosion processes and agricultural practices. We chose to improve the MESALES model (Le Bissonnais et al., 2002) to map soil erosion risk, because it was developed specifically for water erosion in agricultural fields in temperate areas. The MESALES model consists of a decision tree that gives, for each combination of factors, the corresponding class of soil erosion risk. Four factors that determine soil erosion risk are considered: soils, land cover, climate and topography. The first main improvement of the model consists in using newly available datasets that are more accurate than the initial ones. The datasets used cover the whole study area homogeneously. The soil dataset has a 1:1 000 000 scale, and attributes such as texture, soil type, rock fragment content and parent material are used. The climate dataset has a spatial resolution of 8 km and daily (mm/day) temporal resolution over 12 years. The elevation dataset has a spatial resolution of 50 m. Three different land cover datasets are used, of which the finest spatial resolution is 50 m over three years.
Using these datasets, four erosion factors are characterized and quantified: the soil factors (soil sealing, erodibility and runoff), the rate of land cover over three years for each season and for 77 land use classes, the topographic factor (slope and drainage area) and the climate hazard (seasonal rainfall amount and erosivity). These modifications of the original MESALES model allow erosion risk to be better represented for arable and bare land. We validated model results through stakeholder consultations and meetings across the study area, and the model was then revised to take the validation results into account. Results are provided at a spatial resolution of 1 km and then aggregated into 2121 catchments. An erosion risk map for each season and an annual erosion risk map are produced. These new maps allow the 2121 catchments to be ranked into three erosion risk classes. In the annual erosion risk map, 347 catchments have the highest erosion risk, corresponding to 16% of the total Brittany-Loire basin area. The water management agency now uses these maps to identify priority areas and to plan specific preservation practices.
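A MESALES-style decision tree assigns a risk class to each combination of factor classes. The rule set below is a toy illustration only: all class names, thresholds and scores are invented for the sketch, and the actual MESALES tree is far more detailed.

```python
def erosion_risk(soil_erodibility, land_cover, slope_pct, rainfall_erosivity):
    """Toy decision rules mapping four erosion factors to a risk class (1-3).

    soil_erodibility: "low" | "medium" | "high"
    land_cover: "forest" | "grassland" | "arable" | "bare"
    slope_pct: slope in percent; rainfall_erosivity: seasonal erosivity index.
    All categories and thresholds are illustrative, not the MESALES values.
    """
    score = 0
    score += {"low": 0, "medium": 1, "high": 2}[soil_erodibility]
    score += {"forest": 0, "grassland": 1, "arable": 2, "bare": 3}[land_cover]
    score += 2 if slope_pct > 10 else (1 if slope_pct > 3 else 0)
    score += 1 if rainfall_erosivity > 50 else 0
    if score >= 6:
        return 3  # high risk
    if score >= 3:
        return 2  # medium risk
    return 1      # low risk
```

Applied per grid cell and season, such a rule set yields the seasonal risk maps that are then aggregated to catchments.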
NASA Technical Reports Server (NTRS)
Townshend, John R.; Masek, Jeffrey G.; Huang, ChengQuan; Vermote, Eric F.; Gao, Feng; Channan, Saurabh; Sexton, Joseph O.; Feng, Min; Narasimhan, Ramghuram; Kim, Dohyung;
2012-01-01
The compilation of global Landsat datasets and the ever-lowering costs of computing now make it feasible to monitor the Earth's land cover at Landsat resolutions of 30 m. In this article, we describe methods to create global products of forest cover and cover change at Landsat resolutions. There are nevertheless many challenges in ensuring the creation of high-quality products, and we propose various ways in which they can be overcome. Among the challenges are the need for atmospheric correction, incorrect calibration coefficients in some of the datasets, the different phenologies between compilations, the need for terrain correction, the lack of consistent reference data for training and accuracy assessment, and the need for highly automated characterization and change detection. We propose and evaluate the creation and use of surface reflectance products, improved selection of scenes to reduce phenological differences, terrain illumination correction, automated training selection, and the use of information extraction procedures robust to errors in training data, along with several other measures. At several stages we use Moderate Resolution Imaging Spectroradiometer (MODIS) data and products to assist our analysis. A global working prototype product of forest cover and forest cover change is included.
Automated quantification of surface water inundation in wetlands using optical satellite imagery
DeVries, Ben; Huang, Chengquan; Lang, Megan W.; Jones, John W.; Huang, Wenli; Creed, Irena F.; Carroll, Mark L.
2017-01-01
We present a fully automated and scalable algorithm for quantifying surface water inundation in wetlands. Requiring no external training data, our algorithm estimates sub-pixel water fraction (SWF) over large areas and long time periods using Landsat data. We tested our SWF algorithm over three wetland sites across North America, including the Prairie Pothole Region, the Delmarva Peninsula and the Everglades, representing a gradient of inundation and vegetation conditions. We estimated SWF at 30-m resolution with accuracies ranging from a normalized root-mean-square-error of 0.11 to 0.19 when compared with various high-resolution ground and airborne datasets. SWF estimates were more sensitive to subtle inundated features compared to previously published surface water datasets, accurately depicting water bodies, large heterogeneously inundated surfaces, narrow water courses and canopy-covered water features. Despite this enhanced sensitivity, several sources of errors affected SWF estimates, including emergent or floating vegetation and forest canopies, shadows from topographic features, urban structures and unmasked clouds. The automated algorithm described in this article allows for the production of high temporal resolution wetland inundation data products to support a broad range of applications.
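The reported normalized root-mean-square error can be computed as below. Note that the normalization convention (dividing by the range of the reference values) is an assumption for this sketch; other normalizations (by the mean, for example) are also in use.

```python
import numpy as np

def nrmse(estimated, reference):
    """Root-mean-square error normalized by the range of the reference values.

    estimated, reference: array-like sub-pixel water fractions (0..1).
    """
    est = np.asarray(estimated, dtype=float)
    ref = np.asarray(reference, dtype=float)
    rmse = np.sqrt(np.mean((est - ref) ** 2))
    return rmse / (ref.max() - ref.min())
```

With water fractions bounded in [0, 1], an NRMSE of 0.11 to 0.19 corresponds to a typical error of roughly 11 to 19 percentage points of inundated area per pixel.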
Wavelet data compression for archiving high-resolution icosahedral model data
NASA Astrophysics Data System (ADS)
Wang, N.; Bao, J.; Lee, J.
2011-12-01
With the increase of the resolution of global circulation models, it becomes ever more important to develop highly effective solutions for archiving the huge datasets produced by those models. While lossless data compression guarantees the accuracy of the restored data, it can achieve only a limited reduction in data size. Wavelet-transform-based data compression offers significant potential for data size reduction, and it has been shown to be very effective in transmitting data for remote visualization. However, for data archive purposes, a detailed study has to be conducted to evaluate its impact on datasets that will be used in further numerical computations. In this study, we carried out two sets of experiments, for the summer and winter seasons. An icosahedral-grid weather model and a highly efficient wavelet data compression software package were used. Initial conditions were compressed and input to the model, which was run out to 10 days. The forecast results were then compared to the forecasts from the model run with the original, uncompressed initial conditions. Several visual comparisons, as well as statistics of the numerical comparisons, are presented. These results indicate that, for a specified minimum accuracy loss, wavelet data compression achieves significant data size reduction while maintaining minimal numerical impact on the datasets. In addition, some issues are discussed concerning increasing archive efficiency while retaining a complete set of metadata for each archived file.
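The principle of wavelet-based lossy compression (transform, discard small coefficients, reconstruct) can be illustrated with a single-level Haar transform. The study used a dedicated multi-level wavelet codec; this NumPy sketch only shows the idea, and the `keep_fraction` parameter is an invented knob standing in for the codec's accuracy setting.

```python
import numpy as np

def haar_compress(signal, keep_fraction=0.25):
    """One-level Haar wavelet transform; zero out the smallest-magnitude
    coefficients, then reconstruct. Illustrative lossy compression only."""
    x = np.asarray(signal, dtype=float)
    assert x.size % 2 == 0, "signal length must be even for one Haar level"
    avg = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    det = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    coeffs = np.concatenate([avg, det])
    k = int(np.ceil(keep_fraction * coeffs.size))
    thresh = np.sort(np.abs(coeffs))[-k]     # keep the k largest magnitudes
    coeffs[np.abs(coeffs) < thresh] = 0.0    # discard small coefficients
    avg, det = coeffs[: x.size // 2], coeffs[x.size // 2:]
    out = np.empty_like(x)                   # inverse Haar transform
    out[0::2] = (avg + det) / np.sqrt(2)
    out[1::2] = (avg - det) / np.sqrt(2)
    return out
```

In an archive setting, only the retained (sparse) coefficients would be stored; smooth model fields concentrate energy in few coefficients, which is why the reconstruction error stays small.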
EnviroAtlas - Percent Stream Buffer Zone As Natural Land Cover for the Conterminous United States
This EnviroAtlas dataset shows the percentage of land area within a 30 meter buffer zone along the National Hydrography Dataset (NHD) high resolution stream network, and along water bodies such as lakes and ponds that are connected via flow to the streams, that is classified as forest land cover, modified forest land cover, and natural land cover using the 2006 National Land Cover Dataset (NLCD), for each Watershed Boundary Dataset (WBD) 12-digit hydrologic unit (HUC) in the conterminous United States. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
Umehara, Kensuke; Ota, Junko; Ishida, Takayuki
2017-10-18
In this study, the super-resolution convolutional neural network (SRCNN) scheme, an emerging deep-learning-based super-resolution method for enhancing image resolution in chest CT images, was applied and evaluated as a post-processing approach. For evaluation, 89 chest CT cases were sampled from The Cancer Imaging Archive and divided randomly into 45 training cases and 44 external test cases. The SRCNN was trained using the training dataset. With the trained SRCNN, a high-resolution image was reconstructed from a low-resolution image, which was down-sampled from an original test image. For quantitative evaluation, two image quality metrics were measured and compared to those of conventional linear interpolation methods. The image restoration quality of the SRCNN scheme was significantly higher than that of the linear interpolation methods (p < 0.001 or p < 0.05). The high-resolution image reconstructed by the SRCNN scheme was highly restored and comparable to the original reference image, in particular for ×2 magnification. These results indicate that the SRCNN scheme significantly outperforms linear interpolation methods for enhancing image resolution in chest CT images, and suggest that SRCNN may become a potential solution for generating high-resolution CT images from standard CT images.
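The abstract does not name the two image quality metrics; peak signal-to-noise ratio (PSNR) is a common choice for super-resolution evaluation and is shown here as an assumed example of such a metric.

```python
import numpy as np

def psnr(img, ref, max_val=255.0):
    """Peak signal-to-noise ratio in dB between a restored image and its
    reference; higher is better, infinite for a perfect reconstruction."""
    img = np.asarray(img, dtype=float)
    ref = np.asarray(ref, dtype=float)
    mse = np.mean((img - ref) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)
```

Comparing PSNR of the SRCNN output against that of bilinear or bicubic interpolation of the same low-resolution input is the standard way such restoration gains are quantified.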
Wong, Gerard; Leckie, Christopher; Kowalczyk, Adam
2012-01-15
Feature selection is a key concept in machine learning for microarray datasets, where the number of features, represented by probesets, is typically several orders of magnitude larger than the available sample size. Computational tractability is a key challenge for feature selection algorithms handling very high-dimensional datasets beyond a hundred thousand features, such as datasets produced on single nucleotide polymorphism (SNP) microarrays. In this article, we present a novel feature set reduction approach that enables scalable feature selection on datasets with hundreds of thousands of features and beyond. Our approach enables more efficient handling of higher-resolution datasets to achieve better disease subtype classification of samples for potentially more accurate diagnosis and prognosis, allowing clinicians to make more informed decisions with regard to patient treatment options. We applied our feature set reduction approach to several publicly available cancer SNP array datasets and evaluated its performance in terms of multiclass predictive classification accuracy over different cancer subtypes, speedup in execution, and scalability with respect to sample size and array resolution. Feature Set Reduction (FSR) was able to reduce the dimensions of an SNP array dataset by more than two orders of magnitude while achieving at least equal, and in most cases superior, predictive classification performance over that achieved on features selected by existing feature selection methods alone. An examination of the biological relevance of frequently selected features from FSR-reduced feature sets revealed strong enrichment in association with cancer. FSR was implemented in MATLAB R2010b and is available at http://ww2.cs.mu.oz.au/~gwong/FSR.
NASA Astrophysics Data System (ADS)
Pan, J.; Durand, M. T.; Jiang, L.; Liu, D.
2017-12-01
The newly processed NASA MEaSUREs Calibrated Enhanced-Resolution Brightness Temperature (CETB) dataset, reconstructed using the antenna measurement response function (MRF), is considered to offer significantly improved fine-resolution measurements, with better georegistration for time-series observations and an equivalent field of view (FOV) for frequencies with the same nominal spatial resolution. We aim to explore its potential for global snow observation, and therefore to test its performance for characterizing snow properties, especially snow water equivalent (SWE), over large areas. In this research, two candidate SWE algorithms are tested in China for the years 2005 to 2010 using the reprocessed TB from the Advanced Microwave Scanning Radiometer for EOS (AMSR-E), with the results evaluated against daily snow depth measurements at over 700 national synoptic stations. The first algorithm is the SWE retrieval algorithm used for the FengYun (FY)-3 Microwave Radiation Imager. It uses multi-channel TB to calculate SWE for three major snow regions in China, with coefficients adapted for different land cover types. The second algorithm is the newly established Bayesian Algorithm for SWE Estimation with Passive Microwave measurements (BASE-PM). It uses a physically based snow radiative transfer model to find the histogram of the most likely snow properties matching the multi-frequency TB from 10.65 to 90 GHz. It provides a rough estimate of snow depth and grain size at the same time, and showed a 30 mm SWE RMS error against ground radiometer measurements at Sodankylä. This study is the first attempt to test it spatially with satellite data. The use of this algorithm benefits from the high resolution and the spatial consistency between frequencies embedded in the new dataset. This research will answer three questions. First, to what extent can CETB increase the heterogeneity in the mapped SWE?
Second, will the SWE estimation error statistics be improved using this high-resolution dataset? Third, how will the SWE retrieval accuracy be improved using CETB and the new SWE retrieval techniques?
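Multi-channel passive-microwave SWE retrievals of the kind described above are often built on the classic spectral-difference form (a Chang-type relation between the 19 and 37 GHz brightness temperatures). The single-coefficient sketch below is illustrative only; the FY-3 algorithm uses region- and land-cover-specific coefficients, and BASE-PM replaces this entirely with a Bayesian radiative-transfer inversion.

```python
def swe_spectral_difference(tb19h, tb37h, coeff=4.8):
    """Chang-type SWE estimate (mm) from the 19H - 37H brightness
    temperature difference (K). Snow scatters the 37 GHz signal more
    strongly, so a deeper snowpack gives a larger difference.
    The coefficient 4.8 mm/K is the classic textbook value, used here
    purely for illustration."""
    return max(0.0, coeff * (tb19h - tb37h))
```

In an operational algorithm, the coefficient would vary by snow region and land cover class, and negative differences (no dry snow signal) map to zero SWE as here.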
NASA Astrophysics Data System (ADS)
Ikeshima, D.; Yamazaki, D.; Yoshikawa, S.; Kanae, S.
2015-12-01
The specification of the worldwide distribution of water bodies is important for understanding the hydrological cycle. The Global 3-second Water Body Map (G3WBM) is a global-scale map indicating the distribution of water bodies at 90 m resolution (http://hydro.iis.u-tokyo.ac.jp/~yamadai/G3WBM/index.html). This dataset was built mainly to identify the width of river channels, one of the major uncertainties in continental-scale river hydrodynamics models. To survey the true width of a river channel, the water body map distinguishes permanent water bodies from temporary water bodies, thereby separating river channels from floodplains. However, rivers with narrower widths, which are the common case, cannot be observed in this map. The goal of this research is therefore to update the G3WBM algorithm and enhance the resolution to 30 m. Although the 30 m resolution water body map uses a similar algorithm to G3WBM, there are many technical issues attributable to the relatively high resolution, such as the lack of a digital elevation map at the same high resolution, the contamination of satellite imagery by sub-pixel scale objects, and the invisibility of well-vegetated water bodies such as swamps. To manage these issues, this research used more than 30,000 satellite images from the Landsat Global Land Survey (GLS) and the recently released Shuttle Radar Topography Mission (SRTM) 1 arc-second (30 m) digital elevation map. The effect of aerosols, which scatter the reflected sunlight and disturb the acquired image, was also considered. With these revisions, the global water body distribution was established at a more precise resolution.
Gesch, Dean B.
2009-01-01
The importance of sea-level rise in shaping coastal landscapes is well recognized within the earth science community, but as with many natural hazards, communicating the risks associated with sea-level rise remains a challenge. Topography is a key parameter that influences many of the processes involved in coastal change, and thus, up-to-date, high-resolution, high-accuracy elevation data are required to model the coastal environment. Maps of areas subject to potential inundation have great utility to planners and managers concerned with the effects of sea-level rise. However, most of the maps produced to date are simplistic representations derived from older, coarse elevation data. In the last several years, vast amounts of high quality elevation data derived from lidar have become available. Because of their high vertical accuracy and spatial resolution, these lidar data are an excellent source of up-to-date information from which to improve identification and delineation of vulnerable lands. Four elevation datasets of varying resolution and accuracy were processed to demonstrate that the improved quality of lidar data leads to more precise delineation of coastal lands vulnerable to inundation. A key component of the comparison was to calculate and account for the vertical uncertainty of the elevation datasets. This comparison shows that lidar allows for a much more detailed delineation of the potential inundation zone when compared to other types of elevation models. It also shows how the certainty of the delineation of lands vulnerable to a given sea-level rise scenario is much improved when derived from higher resolution lidar data.
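Delineating the potential inundation zone while accounting for the DEM's vertical uncertainty can be sketched as below. The specific formulation (raising the elevation threshold by 1.96 times the RMSE for a ~95% confidence bound) is a common convention and an assumption here, not necessarily the paper's exact treatment.

```python
import numpy as np

def inundation_zone(dem, sea_level_rise, vertical_rmse, confidence_z=1.96):
    """Boolean mask of cells vulnerable to a sea-level rise scenario.

    dem: elevation grid (m above current sea level)
    sea_level_rise: scenario rise (m)
    vertical_rmse: DEM vertical accuracy (m); a larger RMSE widens the
    zone that must be flagged to maintain the stated confidence level.
    """
    threshold = sea_level_rise + confidence_z * vertical_rmse
    return np.asarray(dem, dtype=float) <= threshold
```

This makes the paper's point concrete: lidar DEMs, with RMSE on the order of decimeters rather than meters, shrink the uncertainty buffer dramatically and so delineate the vulnerable zone far more precisely than coarse elevation models.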
Baptiste Dafflon
2015-04-07
Low-altitude remote sensing dataset including DEM and RGB mosaic for the AB corridor (July 13, 2013) and the L2 corridor (July 21, 2013). Processing flowchart for each corridor: ground control points (GCPs; 20.3 cm square white targets, every 20 m) surveyed with RTK GPS; acquisition of RGB pictures using a kite-based platform; Structure-from-Motion reconstruction using hundreds of pictures and the GCP coordinates; export of the DEM and RGB mosaic in GeoTIFF format (NAD 83, 2012 geoid, UTM zone 4 north) with a pixel resolution of about 2 cm and x, y, z accuracy in the centimeter range (less than 10 cm). Accuracy and resolution are highest inside the GCP zones: L2 corridor (500 x 20 m) and AB corridor (500 x 40 m). The DEM will be updated once all GCPs have been measured. Only the zones between GCPs are accurate, although the full mosaic is provided.
A global wind resource atlas including high-resolution terrain effects
NASA Astrophysics Data System (ADS)
Hahmann, Andrea; Badger, Jake; Olsen, Bjarke; Davis, Neil; Larsen, Xiaoli; Badger, Merete
2015-04-01
Currently no accurate global wind resource dataset is available to fill the needs of policy makers and strategic energy planners. Evaluating wind resources directly from coarse-resolution reanalysis datasets underestimates the true wind energy resource, as the small-scale spatial variability of winds is missing. This missing variability can account for a large part of the local wind resource. Crucially, it is the windiest sites that suffer the largest wind resource errors: in simple terrain the windiest sites may be underestimated by 25%; in complex terrain the underestimate can be as large as 100%. The small-scale spatial variability of winds can be modelled using novel statistical methods and by application of established microscale models within WAsP, developed at DTU Wind Energy. We present the framework for a single global methodology, which is relatively fast and economical to complete. The method employs reanalysis datasets, which are downscaled to high-resolution wind resource datasets via a so-called generalization step and microscale modelling using WAsP. This method will create the first global wind atlas (GWA) covering all land areas (except Antarctica) and a 30 km coastal zone over water. Verification of the GWA estimates will be done at carefully selected test regions, against verified estimates from mesoscale modelling and satellite synthetic aperture radar (SAR). This verification exercise will also help in estimating the uncertainty of the new wind climate dataset. Uncertainty will be assessed as a function of spatial aggregation. It is expected that the uncertainty at verification sites will be larger than that of dedicated assessments, but it will be reduced at levels of aggregation appropriate for energy planning and, importantly, much improved relative to what is used today.
In this presentation we discuss the methodology used, which includes the generalization of wind climatologies, and the differences in local and spatially aggregated wind resources that result from using different reanalyses in the various verification regions. A prototype web interface for public access to the data will also be showcased.
NASA Astrophysics Data System (ADS)
Yu, H.; Gu, H.
2017-12-01
A novel multivariate seismic formation pressure prediction methodology is presented, which incorporates high-resolution seismic velocity data from prestack AVO inversion and petrophysical data (porosity and shale volume) derived from poststack seismic motion inversion. In contrast to traditional seismic formation pressure prediction methods, the proposed methodology is based on a multivariate pressure prediction model and uses a trace-by-trace multivariate regression analysis on seismic-derived petrophysical properties to calibrate the model parameters, in order to make accurate predictions with higher resolution in both the vertical and lateral directions. With the prestack time migration velocity as the initial velocity model, an AVO inversion was first applied to the prestack dataset to obtain high-resolution, higher-frequency seismic velocity to be used as the velocity input for seismic pressure prediction, together with the density dataset used to calculate an accurate overburden pressure (OBP). Seismic Motion Inversion (SMI) is an inversion technique based on Markov chain Monte Carlo simulation; both structural variability and the similarity of seismic waveforms are used to incorporate well log data to characterize the variability of the property to be obtained. In this research, porosity and shale volume are first interpreted on well logs and then combined with poststack seismic data using SMI to build porosity and shale volume datasets for seismic pressure prediction. A multivariate effective stress model is used to convert the velocity, porosity and shale volume datasets to effective stress. After a thorough study of the regional stratigraphic and sedimentary characteristics, a regional normally compacted interval model is built, and the coefficients of the multivariate prediction model are determined in a trace-by-trace multivariate regression analysis on the petrophysical data. The coefficients are used to convert the velocity, porosity and shale volume datasets to effective stress and then, with the OBP, to calculate formation pressure. Application of the proposed methodology to a research area in the East China Sea shows that the method can bridge the gap between seismic and well-log pressure prediction, giving predicted pressures close to pressure measurements from well testing.
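The calibration and conversion steps can be sketched as follows. The linear form sigma = a0 + a1·V + a2·phi + a3·Vsh is an illustrative stand-in for the paper's multivariate effective stress model (whose exact parameterization is not given here), while the final step is the standard Terzaghi relation: pore pressure = overburden pressure minus effective stress.

```python
import numpy as np

def fit_effective_stress(velocity, porosity, vshale, stress):
    """Least-squares calibration of a linear multivariate effective stress
    model: sigma = a0 + a1*V + a2*phi + a3*Vsh (illustrative form only).
    Returns the fitted coefficients [a0, a1, a2, a3]."""
    X = np.column_stack([np.ones_like(velocity), velocity, porosity, vshale])
    coef, *_ = np.linalg.lstsq(X, stress, rcond=None)
    return coef

def pore_pressure(overburden, effective_stress):
    """Terzaghi relation: pore pressure = overburden - effective stress."""
    return overburden - effective_stress
```

In the trace-by-trace scheme, such a fit would be performed within the normally compacted interval of each trace, and the resulting coefficients then applied to the full velocity, porosity and shale-volume volumes.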
In situ observations of the isotopic composition of methane at the Cabauw tall tower site
NASA Astrophysics Data System (ADS)
Röckmann, Thomas; Eyer, Simon; van der Veen, Carina; Popa, Maria E.; Tuzson, Béla; Monteil, Guillaume; Houweling, Sander; Harris, Eliza; Brunner, Dominik; Fischer, Hubertus; Zazzeri, Giulia; Lowry, David; Nisbet, Euan G.; Brand, Willi A.; Necki, Jaroslav M.; Emmenegger, Lukas; Mohn, Joachim
2016-08-01
High-precision analyses of the isotopic composition of methane in ambient air can potentially be used to discriminate between different source categories. Due to the complexity of isotope ratio measurements, such analyses have generally been performed in the laboratory on air samples collected in the field. This poses a limitation on the temporal resolution at which the isotopic composition can be monitored with reasonable logistical effort. Here we present the performance of a dual isotope ratio mass spectrometric system (IRMS) and a quantum cascade laser absorption spectroscopy (QCLAS)-based technique for in situ analysis of the isotopic composition of methane under field conditions. Both systems were deployed at the Cabauw Experimental Site for Atmospheric Research (CESAR) in the Netherlands and performed in situ, high-frequency (approx. hourly) measurements for a period of more than 5 months. The IRMS and QCLAS instruments were in excellent agreement with a slight systematic offset of (+0.25 ± 0.04) ‰ for δ13C and (-4.3 ± 0.4) ‰ for δD. This was corrected for, yielding a combined dataset with more than 2500 measurements of both δ13C and δD. The high-precision and high-temporal-resolution dataset not only reveals the overwhelming contribution of isotopically depleted agricultural CH4 emissions from ruminants at the Cabauw site but also allows the identification of specific events with elevated contributions from more enriched sources such as natural gas and landfills. The final dataset was compared to model calculations using the global model TM5 and the mesoscale model FLEXPART-COSMO. The results of both models agree better with the measurements when the TNO-MACC emission inventory is used in the models than when the EDGAR inventory is used. This suggests that high-resolution isotope measurements have the potential to further constrain the methane budget when they are performed at multiple sites that are representative for the entire European domain.
NASA Astrophysics Data System (ADS)
Berg, W. K.
2016-12-01
The Global Precipitation Measurement (GPM) mission Core Observatory, launched in February 2014, provides a number of advances for satellite monitoring of precipitation, including a dual-frequency radar, high-frequency channels on the GPM Microwave Imager (GMI), and coverage over middle and high latitudes. The GPM concept, however, is about producing unified precipitation retrievals from a constellation of microwave radiometers to provide approximately 3-hourly global sampling. This involves intercalibration of the input brightness temperatures from the constellation radiometers, development of an a priori precipitation database using observations from the state-of-the-art GPM radiometer and radars, and accounting for sensor differences in the retrieval algorithm in a physically consistent way. Efforts by the GPM inter-satellite calibration working group (XCAL team) and the radiometer algorithm team to create unified precipitation retrievals from the GPM radiometer constellation were fully implemented in the current version 4 GPM precipitation products. These include precipitation estimates from a total of seven conical-scanning and six cross-track-scanning radiometers, as well as high spatial and temporal resolution global level 3 gridded products. Work is now underway to extend this unified constellation-based approach to the combined TRMM/GPM data record starting in late 1997. The goal is to create a long-term global precipitation dataset employing these state-of-the-art calibration and retrieval algorithm approaches. This new long-term global precipitation dataset will incorporate the physics provided by the combined GPM GMI and DPR sensors into the a priori database, extend prior TRMM constellation observations to high latitudes, and expand the available TRMM precipitation data to the full constellation of available conical and cross-track scanning radiometers.
This combined TRMM/GPM precipitation data record will thus provide a high-quality, high-temporal-resolution global dataset for use in a wide variety of weather and climate research applications.
GRDC. A Collaborative Framework for Radiological Background and Contextual Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quiter, Brian J.; Ramakrishnan, Lavanya; Bandstra, Mark S.
The Radiation Mobile Analysis Platform (RadMAP) is unique in its capability to collect both high-quality radiological data from gamma-ray and fast-neutron detectors and a broad array of contextual data that includes positioning and stance data, weather sensor data, high-resolution 3D LiDAR data, and visual and hyperspectral imagery. The datasets obtained from RadMAP are both voluminous and complex and require analyses from highly diverse communities within both the national laboratory and academic communities. Maintaining a high level of transparency will enable analysis products to further enrich the RadMAP dataset. It is in this spirit of open and collaborative data that the RadMAP team proposed to collect, calibrate, and make available online data from the RadMAP system. The Berkeley Data Cloud (BDC) is a cloud-based data management framework that enables web-based data browsing and visualization, and connects curated datasets to custom workflows such that analysis products can be managed and disseminated while maintaining user access rights. BDC enables cloud-based analyses of large datasets in a manner that simulates real-time data collection, such that BDC can be used to test algorithm performance on real and source-injected datasets. Using the BDC framework, a subset of the RadMAP datasets has been disseminated via the Gamma Ray Data Cloud (GRDC), hosted at the National Energy Research Scientific Computing (NERSC) Center, enabling data access for over 40 users at 10 institutions.
NASA Technical Reports Server (NTRS)
Quattrochi, D. A.; Lapenta, W. M.; Crosson, W. L.; Estes, M. G., Jr.; Limaye, A.; Kahn, M.
2006-01-01
Local and state agencies are responsible for developing state implementation plans to meet National Ambient Air Quality Standards. Numerical models used for this purpose simulate the transport and transformation of criteria pollutants and their precursors. The specification of land use/land cover (LULC) plays an important role in controlling modeled surface meteorology and emissions. NASA researchers have worked with partners and Atlanta stakeholders to incorporate an improved high-resolution LULC dataset for the Atlanta area within their modeling system and to assess meteorological and air quality impacts of Urban Heat Island (UHI) mitigation strategies. The new LULC dataset provides a more accurate representation of land use, has the potential to improve model accuracy, and facilitates prediction of LULC changes. Use of the new LULC dataset for two summertime episodes improved meteorological forecasts, with an existing daytime cold bias of approximately 3 °C reduced by 30%. Model performance for ozone prediction did not show improvement. In addition, LULC changes due to Atlanta area urbanization were predicted through 2030, for which model simulations predict higher urban air temperatures. The incorporation of UHI mitigation strategies partially offset this warming trend. The data and modeling methods used are generally applicable to other U.S. cities.
A Comparative Study of Point Cloud Data Collection and Processing
NASA Astrophysics Data System (ADS)
Pippin, J. E.; Matheney, M.; Gentle, J. N., Jr.; Pierce, S. A.; Fuentes-Pineda, G.
2016-12-01
Over the past decade, there has been dramatic growth in the acquisition of publicly funded high-resolution topographic data for scientific, environmental, engineering and planning purposes. These data sets are valuable for applications of interest across a large and varied user community. However, because of the large volumes of data produced by high-resolution mapping technologies and the expense of aerial data collection, it is often difficult to collect and distribute these datasets. Furthermore, the data can be technically challenging to process, requiring software and computing resources not readily available to many users. This study presents a comparison of advanced computing hardware and software used to collect and process point cloud datasets, such as LIDAR scans. Activities included implementation and testing of open-source libraries and applications for point cloud data processing, such as MeshLab, Blender, PDAL, and PCL. Additionally, a suite of commercial-scale applications, Skanect and CloudCompare, were applied to raw datasets. Handheld hardware solutions, a Structure Scanner and an Xbox 360 Kinect V1, were tested for their ability to scan at three field locations. The resulting projects successfully scanned and processed subsurface karst features ranging from small stalactites to large rooms, as well as a surface waterfall feature. Outcomes support the feasibility of rapid sensing in 3D at field scales.
Edmands, William M B; Barupal, Dinesh K; Scalbert, Augustin
2015-03-01
MetMSLine represents a complete collection of functions in the R programming language as an accessible GUI for biomarker discovery in large-scale liquid-chromatography high-resolution mass spectral datasets from acquisition through to final metabolite identification, forming a backend to the output of any peak-picking software such as XCMS. MetMSLine automatically creates subdirectories, data tables and relevant figures at the following steps: (i) signal smoothing, normalization, filtration and noise transformation (PreProc.QC.LSC.R); (ii) PCA and automatic outlier removal (Auto.PCA.R); (iii) automatic regression, biomarker selection, hierarchical clustering and cluster ion/artefact identification (Auto.MV.Regress.R); (iv) biomarker-MS/MS fragmentation spectra matching and fragment/neutral loss annotation (Auto.MS.MS.match.R) and (v) semi-targeted metabolite identification based on a list of theoretical masses obtained from public databases (DBAnnotate.R). All source code and suggested parameters are available in an un-encapsulated layout on http://wmbedmands.github.io/MetMSLine/. Readme files and a synthetic dataset of both X-variables (simulated LC-MS data), Y-variables (simulated continuous variables) and metabolite theoretical masses are also available on our GitHub repository. © The Author 2014. Published by Oxford University Press.
Edmands, William M. B.; Barupal, Dinesh K.; Scalbert, Augustin
2015-01-01
Summary: MetMSLine represents a complete collection of functions in the R programming language as an accessible GUI for biomarker discovery in large-scale liquid-chromatography high-resolution mass spectral datasets from acquisition through to final metabolite identification, forming a backend to the output of any peak-picking software such as XCMS. MetMSLine automatically creates subdirectories, data tables and relevant figures at the following steps: (i) signal smoothing, normalization, filtration and noise transformation (PreProc.QC.LSC.R); (ii) PCA and automatic outlier removal (Auto.PCA.R); (iii) automatic regression, biomarker selection, hierarchical clustering and cluster ion/artefact identification (Auto.MV.Regress.R); (iv) biomarker-MS/MS fragmentation spectra matching and fragment/neutral loss annotation (Auto.MS.MS.match.R) and (v) semi-targeted metabolite identification based on a list of theoretical masses obtained from public databases (DBAnnotate.R). Availability and implementation: All source code and suggested parameters are available in an un-encapsulated layout on http://wmbedmands.github.io/MetMSLine/. Readme files and a synthetic dataset of both X-variables (simulated LC-MS data), Y-variables (simulated continuous variables) and metabolite theoretical masses are also available on our GitHub repository. Contact: ScalbertA@iarc.fr PMID:25348215
Satellite remote sensing of fine particulate air pollutants over Indian mega cities
NASA Astrophysics Data System (ADS)
Sreekanth, V.; Mahesh, B.; Niranjan, K.
2017-11-01
Against the backdrop of the need for high spatio-temporal resolution data on PM2.5 mass concentrations for health and epidemiological studies over India, empirical relations between Aerosol Optical Depth (AOD) and PM2.5 mass concentrations are established over five Indian mega cities. These relations are intended to predict surface PM2.5 mass concentrations from high-resolution columnar AOD datasets. The current study utilizes multi-city public domain PM2.5 data (from the US Consulate and Embassy's air monitoring program) and MODIS AOD, spanning almost four years. PM2.5 is found to be positively correlated with AOD. Station-wise linear regression analysis has shown spatially varying regression coefficients. The analysis was repeated after eliminating data from seasons prone to elevated aerosol loading, which improved the correlation coefficient. The impact of day-to-day variability in local meteorological conditions on the AOD-PM2.5 relationship has been explored by performing a multiple regression analysis. A cross-validation approach for the multiple regression analysis, with three years of data as the training dataset and one year of data as the validation dataset, yielded an R value of ∼0.63. The study concludes by discussing factors that could further improve the relationship.
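The multiple-regression-with-cross-validation step described above can be sketched as follows. This is a minimal illustration on synthetic data: the covariates (boundary-layer height, relative humidity) and all coefficients are assumptions for the sketch, not the study's actual predictors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily records: AOD plus two hypothetical meteorological
# covariates; the PM2.5 relationship below is invented for illustration.
n = 4 * 365
aod = rng.uniform(0.1, 1.2, n)
blh = rng.uniform(0.3, 2.0, n)      # boundary-layer height, km
rh = rng.uniform(20.0, 90.0, n)     # relative humidity, %
pm25 = 80.0 * aod / blh + 0.2 * rh + rng.normal(0.0, 8.0, n)

# Design matrix with an intercept term.
X = np.column_stack([np.ones(n), aod, blh, rh])

# Train on the first three "years", validate on the held-out fourth year,
# mirroring the paper's cross-validation split.
split = 3 * 365
beta, *_ = np.linalg.lstsq(X[:split], pm25[:split], rcond=None)
pred = X[split:] @ beta

# Pearson R between predicted and observed PM2.5 on the validation year.
r = float(np.corrcoef(pred, pm25[split:])[0, 1])
print(round(r, 2))
```

The intercept column makes the least-squares fit equivalent to ordinary multiple linear regression; swapping in other meteorological covariates only changes the columns of `X`.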
Burkhard, Silja Barbara
2018-01-01
Development of specialized cells and structures in the heart is regulated by spatially restricted molecular pathways. Disruptions in these pathways can cause severe congenital cardiac malformations or functional defects. To better understand these pathways and how they regulate cardiac development, we used tomo-seq, combining high-throughput RNA-sequencing with tissue sectioning, to establish a genome-wide expression dataset with high spatial resolution for the developing zebrafish heart. Analysis of the dataset revealed over 1100 genes differentially expressed in sub-compartments. Pacemaker cells in the sinoatrial region induce heart contractions, but little is known about the mechanisms underlying their development. Using our transcriptome map, we identified spatially restricted Wnt/β-catenin signaling activity in pacemaker cells, which was controlled by Islet-1 activity. Moreover, Wnt/β-catenin signaling controls heart rate by regulating the pacemaker cellular response to parasympathetic stimuli. Thus, this high-resolution transcriptome map incorporating all cell types in the embryonic heart can expose spatially restricted molecular pathways critical for specific cardiac functions. PMID:29400650
Data Mining and Optimization Tools for Developing Engine Parameters Tools
NASA Technical Reports Server (NTRS)
Dhawan, Atam P.
1998-01-01
This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an Engine Condition Monitoring System. With the total budget of $5,000, Tricia and I studied the problem domain for developing an Engine Condition Monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset, which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop Genetic Algorithm (GA)-based search programs, which were written in C++ and used to demonstrate the capability of a GA in searching for an optimal solution in noisy datasets. Based on the study and discussions with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search. Wavelet processing is proposed to create a coarse-resolution representation of the data, providing two advantages in a GA-based search: 1. We will have less data to begin with when forming search sub-spaces. 2. The search gains robustness against noise, because at every level of wavelet-based decomposition the signal is split into low-pass and high-pass components.
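The proposed scheme, coarsening the data with the low-pass half of a Haar wavelet step and then running a GA over the coarse level, can be sketched as below. The signal, fitness definition, and GA parameters are all illustrative assumptions, not the project's actual engine data or code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy 1-D "engine parameter" signal with a broad optimum near index 600.
n = 1024
x = np.arange(n)
signal = np.exp(-((x - 600.0) / 120.0) ** 2) + rng.normal(0.0, 0.3, n)

def haar_coarsen(s, levels):
    """Low-pass Haar step repeated `levels` times: average adjacent pairs.
    This both shrinks the search space and suppresses the noise."""
    for _ in range(levels):
        s = 0.5 * (s[0::2] + s[1::2])
    return s

coarse = haar_coarsen(signal, 3)   # 1024 -> 128 samples, noise reduced

def ga_argmax(fitness, size, pop=30, gens=40):
    """Minimal GA over integer positions in [0, size): tournament
    selection, blend (averaging) crossover, random-reset mutation."""
    pos = rng.integers(0, size, pop)
    for _ in range(gens):
        fit = fitness[pos]
        # Tournament selection of parents.
        a, b = rng.integers(0, pop, (2, pop))
        parents = np.where(fit[a] > fit[b], pos[a], pos[b])
        # Blend crossover: average two randomly paired parents.
        c, d = rng.integers(0, pop, (2, pop))
        children = (parents[c] + parents[d]) // 2
        # Mutation: reset ~10% of children to random positions.
        m = rng.random(pop) < 0.1
        children[m] = rng.integers(0, size, int(m.sum()))
        pos = children
    return int(pos[np.argmax(fitness[pos])])

# Search the coarse pyramid level, then map back to full resolution;
# each coarse sample spans 2**3 = 8 original points.
best_coarse = ga_argmax(coarse, coarse.size)
best_full = best_coarse * 8
print(best_full)
```

A full multi-resolution search would then re-run the GA in a window of the next-finer level around `best_full`, which is exactly the advantage the report claims for the pyramid.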
Neural Networks as a Tool for Constructing Continuous NDVI Time Series from AVHRR and MODIS
NASA Technical Reports Server (NTRS)
Brown, Molly E.; Lary, David J.; Vrieling, Anton; Stathakis, Demetris; Mussa, Hamse
2008-01-01
The long-term Advanced Very High Resolution Radiometer-Normalized Difference Vegetation Index (AVHRR-NDVI) record provides a critical historical perspective on vegetation dynamics necessary for global change research. Despite the proliferation of new sources of global, moderate resolution vegetation datasets, the remote sensing community is still struggling to create datasets derived from multiple sensors that allow the simultaneous use of spectral vegetation indices for time series analysis. To overcome the non-stationary aspect of NDVI, we use an artificial neural network (ANN) to map the NDVI indices from AVHRR to those from MODIS, using atmospheric, surface type and sensor-specific inputs to account for the differences between the sensors. The NDVI dynamics and range of MODIS NDVI data at one degree are matched and extended through the AVHRR record. Four years of overlap between the two sensors are used to train a neural network to remove atmospheric and sensor-specific effects on the AVHRR NDVI. In this paper, we present the resulting continuous dataset, its relationship to MODIS data, and a validation of the product.
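The sensor-to-sensor mapping idea can be illustrated with a small one-hidden-layer network in plain NumPy. The synthetic "AVHRR" and "MODIS" NDVI relationship and the two covariates below are invented for the sketch; they stand in for the atmospheric and sensor-specific inputs the paper uses, and the network is far smaller than an operational ANN.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic training set: AVHRR NDVI plus two hypothetical covariates
# (solar-zenith and water-vapour terms, scaled to [0, 1]); the "true"
# MODIS NDVI is a nonlinear correction of the AVHRR value.
n = 2000
avhrr = rng.uniform(0.0, 0.9, n)
sza = rng.uniform(0.0, 1.0, n)
wv = rng.uniform(0.0, 1.0, n)
modis = avhrr + 0.1 * avhrr * sza - 0.05 * wv * (1 - avhrr)

X = np.column_stack([avhrr, sza, wv])
y = modis[:, None]

# One-hidden-layer MLP trained by full-batch gradient descent on MSE.
h = 16
W1 = rng.normal(0, 0.5, (3, h)); b1 = np.zeros(h)
W2 = rng.normal(0, 0.5, (h, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(3000):
    z = np.tanh(X @ W1 + b1)          # hidden activations
    out = z @ W2 + b2                 # linear output layer
    err = out - y
    # Backpropagated gradients of the mean-squared error.
    gW2 = z.T @ err / n; gb2 = err.mean(0)
    dz = (err @ W2.T) * (1 - z ** 2)
    gW1 = X.T @ dz / n; gb1 = dz.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

rmse = float(np.sqrt(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))
print(round(rmse, 3))
```

Once trained on the overlap period, the same forward pass applied to pre-MODIS AVHRR inputs is what extends the MODIS-like record backward in time.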
Enhancing Hi-C data resolution with deep convolutional neural network HiCPlus.
Zhang, Yan; An, Lin; Xu, Jie; Zhang, Bo; Zheng, W Jim; Hu, Ming; Tang, Jijun; Yue, Feng
2018-02-21
Although Hi-C technology is one of the most popular tools for studying 3D genome organization, due to sequencing cost, the resolution of most Hi-C datasets is coarse, and they cannot be used to link distal regulatory elements to their target genes. Here we develop HiCPlus, a computational approach based on a deep convolutional neural network, to infer high-resolution Hi-C interaction matrices from low-resolution Hi-C data. We demonstrate that HiCPlus can impute interaction matrices highly similar to the original ones while only using 1/16 of the original sequencing reads. We show that the models learned from one cell type can be applied to make predictions in other cell or tissue types. Our work not only provides a computational framework to enhance Hi-C data resolution but also reveals features underlying the formation of 3D chromatin interactions.
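A super-resolution CNN of this kind can be sketched as a forward pass over an interaction-matrix patch: a feature-extraction layer, a nonlinear mapping layer, and a reconstruction layer. The kernel sizes, channel counts, and random weights below are placeholder assumptions, not the trained HiCPlus model.

```python
import numpy as np

rng = np.random.default_rng(3)

def conv2d(img, kernels):
    """'Same'-padded 2-D convolution of a (C, H, W) tensor with a
    (K, C, kh, kw) kernel bank, computed naively for clarity."""
    k, c, kh, kw = kernels.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((0, 0), (ph, ph), (pw, pw)))
    h, w = img.shape[1:]
    out = np.zeros((k, h, w))
    for i in range(h):
        for j in range(w):
            patch = padded[:, i:i + kh, j:j + kw]
            out[:, i, j] = np.tensordot(kernels, patch, axes=3)
    return out

# A 40x40 low-resolution Hi-C contact-count patch (toy Poisson counts,
# standing in for a matrix built from 1/16 of the reads).
patch = rng.poisson(2.0, (1, 40, 40)).astype(float)

# Three conv layers in an SRCNN-like layout: feature extraction,
# nonlinear mapping, reconstruction.  Weights are random placeholders.
w1 = rng.normal(0, 0.1, (8, 1, 9, 9))
w2 = rng.normal(0, 0.1, (8, 8, 1, 1))
w3 = rng.normal(0, 0.1, (1, 8, 5, 5))

f1 = np.maximum(conv2d(patch, w1), 0.0)   # ReLU
f2 = np.maximum(conv2d(f1, w2), 0.0)      # ReLU
enhanced = conv2d(f2, w3)                 # predicted high-res patch

print(enhanced.shape)
```

Training would fit the kernel weights so that `enhanced` matches the deeply sequenced high-resolution matrix for the same patch; the same-padding keeps the output the same size as the input, so predictions for overlapping patches can be tiled back into a genome-wide matrix.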
NASA Astrophysics Data System (ADS)
Kamarinas, I.; Julian, J.; Owsley, B.; de Beurs, K.; Hughes, A.
2014-12-01
Water quality is dictated by interactions among geomorphic processes, vegetation characteristics, weather patterns, and anthropogenic land uses over multiple spatio-temporal scales. In order to understand how changes in climate and land use impact river water quality, a suite of data with high temporal resolution over a long period is needed. Further, all of these data must be analyzed with respect to connectivity to the river, thus requiring high spatial resolution data. Here, we present how changes in climate and land use over the past 25 years have affected water quality in the 268 sq. km Hoteo River catchment in New Zealand. Hydro-climatic data included daily solar radiation, temperature, soil moisture, rainfall, drought indices, and runoff at 5-km resolution. Land cover changes were measured every 8 days at 30-m resolution by fusing Landsat and MODIS satellite imagery. Water quality was assessed using 15-min turbidity (2011-2014) and monthly data for a suite of variables (1990-2014). Watershed connectivity was modeled using a corrected 15-m DEM and a high-resolution drainage network. Our analyses revealed that this catchment experiences cyclical droughts that, when combined with intense land uses such as livestock grazing and plantation-forest harvesting, leave many areas in the catchment disturbed (i.e., exposed soil) that are connected to the river through surface runoff. As a result, flow-normalized turbidity was elevated during droughts and remained relatively low during wet periods. For example, disturbed land area decreased from 9% to 4% over 2009-2013, which was a relatively wet period. During the extreme drought of 2013, disturbed area increased to 6% in less than a year, due mainly to slow pasture recovery after heavy stocking rates.
The relationships found in this study demonstrate that high spatiotemporal resolution land cover datasets are very important to understanding the interactions between landscape and climate, and how these interactions affect water quality.
Lunar Observer Laser Altimeter observations for lunar base site selection
NASA Technical Reports Server (NTRS)
Garvin, James B.; Bufton, Jack L.
1992-01-01
One of the critical datasets for optimal selection of future lunar landing sites is local- to regional-scale topography. Lunar base site selection will require such data for both engineering and scientific operations purposes. The Lunar Geoscience Orbiter or Lunar Observer is the ideal precursory science mission from which to obtain this required information. We suggest that a simple laser altimeter instrument could be employed to measure local-scale slopes, heights, and depths of lunar surface features important to lunar base planning and design. For this reason, we have designed and are currently constructing a breadboard of a Lunar Observer Laser Altimeter (LOLA) instrument capable of acquiring contiguous-footprint topographic profiles with both 30-m and 300-m along-track resolution. This instrument meets all the severe weight, power, size, and data rate limitations imposed by Observer-class spacecraft. In addition, LOLA would be capable of measuring the within-footprint vertical roughness of the lunar surface, and the 1.06-micron relative surface reflectivity at normal incidence. We have used airborne laser altimeter data for a few representative lunar analog landforms to simulate and analyze LOLA performance in a 100-km lunar orbit. We demonstrate that this system in its highest resolution mode (30-m diameter footprints) would quantify the topography of all but the very smallest lunar landforms. At its global mapping resolution (300-m diameter footprints), LOLA would establish the topographic context for lunar landing site selection by providing the basis for constructing a 1-2 km spatial resolution global, geodetic topographic grid that would contain a high density of observations (e.g., approximately 1000 observations per each 1 deg by 1 deg cell at the lunar equator). 
The high spatial and vertical resolution measurements made with a LOLA-class instrument on a precursory Lunar Observer would be highly synergistic with high-resolution imaging datasets, and will allow for direct quantification of critical slopes, heights, and depths of features visible in images of potential lunar base sites.
Acoustically Mounted Microcrystals Yield High Resolution X-ray Structures
Soares, Alexei S.; Engel, Matthew A.; Stearns, Richard; Datwani, Sammy; Olechno, Joe; Ellson, Richard; Skinner, John M.; Allaire, Marc; Orville, Allen M.
2011-01-01
We demonstrate a general strategy to determine structures from showers of microcrystals. It uses acoustic droplet ejection (ADE) to transfer 2.5 nanoliter droplets from the surface of microcrystal slurries, through the air, and onto mounting micromesh pins. Individual microcrystals are located by raster-scanning a several micron X-ray beam across the cryocooled micromeshes. X-ray diffraction datasets merged from several micron-sized crystals are used to solve 1.8 Å resolution crystal structures. PMID:21542590
High-resolution MR imaging for dental impressions: a feasibility study.
Boldt, Julian; Rottner, Kurt; Schmitter, Marc; Hopfgartner, Andreas; Jakob, Peter; Richter, Ernst-Jürgen; Tymofiyeva, Olga
2018-04-01
Magnetic resonance imaging is an emerging technology in dental medicine. While low-resolution MRI has especially provided means to examine the temporomandibular joint due to its anatomic inaccessibility, the goal of this study was to assess whether high-resolution MRI is capable of delivering a dataset sufficiently precise to serve as a digital impression of human teeth. An informed and consenting patient in need of dental restoration with fixed partial dentures was chosen as the subject. Two prepared teeth were measured using MRI, and the dataset was subjected to mathematical processing before Fourier transformation. After reconstruction, a 3D file was generated and fed into an existing industry-standard CAD/CAM process. A framework for a fixed dental prosthesis was digitally modeled and manufactured by laser sintering. The fit in situ was found to be acceptable by current clinical standards, which allowed permanent placement of the fixed prosthesis. Using a clinical whole-body MR scanner with the addition of custom add-on hardware, contrast enhancement, and data post-processing, sufficient resolution and signal-to-noise ratio were achieved to allow fabrication of a dental restoration in an acquisition time comparable to the setting time of common dental impression materials. Furthermore, the measurement was well tolerated. The method described herein can be regarded as proof of principle that MRI is a promising option for digital impressions when fixed partial dentures are required.
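The reconstruction step, an inverse Fourier transform of the acquired k-space data into a magnitude image that can then be segmented for the CAD/CAM pipeline, can be illustrated on a toy 2-D phantom. The phantom, noise level, and threshold below are assumptions for the sketch, not the study's acquisition parameters.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy 2-D "tooth" phantom: a bright disc in a dark background.
n = 64
yy, xx = np.mgrid[:n, :n]
phantom = ((xx - 32) ** 2 + (yy - 32) ** 2 < 12 ** 2).astype(float)

# MR scanners sample k-space; simulate it as the 2-D FFT of the object
# plus complex acquisition noise.
kspace = np.fft.fft2(phantom) + (rng.normal(0, 0.5, (n, n))
                                 + 1j * rng.normal(0, 0.5, (n, n)))

# Reconstruction: inverse FFT, then the magnitude image; a binary mask
# of it is what a CAD/CAM pipeline could mesh into a 3-D surface.
image = np.abs(np.fft.ifft2(kspace))
mask = image > 0.5

# Dice overlap between the segmented mask and the true phantom.
dice = 2 * (mask & (phantom > 0)).sum() / (mask.sum() + (phantom > 0).sum())
print(round(float(dice), 2))
```

In the real study the data are 3-D and the "mathematical processing before Fourier transformation" (e.g., filtering and gridding of the raw samples) happens between acquisition and this inverse-transform step.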
NASA Astrophysics Data System (ADS)
Campisano, C. J.; Dimaggio, E. N.; Arrowsmith, J. R.; Kimbel, W. H.; Reed, K. E.; Robinson, S. E.; Schoville, B. J.
2008-12-01
Understanding the geographic, temporal, and environmental contexts of human evolution requires the ability to compare wide-ranging datasets collected from multiple research disciplines. Paleoanthropological field-research projects are notoriously independent administratively, even in regions of high transdisciplinary importance. As a result, valuable opportunities for the integration of new and archival datasets spanning diverse archaeological assemblages, paleontological localities, and stratigraphic sequences are often neglected, which limits the range of research questions that can be addressed. Using geoinformatic tools, we integrate spatially, temporally, and semantically disparate paleoanthropological and geological datasets from the Hadar sedimentary basin of the Afar Rift, Ethiopia. Applying newly integrated data to investigations of fossil-rich sediments will provide the geospatial framework critical for addressing fundamental questions concerning hominins and their paleoenvironmental context. We present a preliminary cyberinfrastructure for data management that will allow scientists, students, and interested citizens to interact with, integrate, and visualize data from the Afar region. Examples of our initial integration efforts include generating a regional high-resolution satellite imagery base layer for georeferencing, standardizing and compiling multiple project datasets, and digitizing paper maps. We also demonstrate how the robust datasets generated from our work are being incorporated into a new, digital module for Arizona State University's Hadar Paleoanthropology Field School - modernizing field data collection methods, on-the-fly data visualization and query, and subsequent analysis and interpretation.
Armed with a fully fused database tethered to high-resolution satellite imagery, we can more accurately reconstruct spatial and temporal paleoenvironmental conditions and efficiently address key scientific questions, such as those regarding the relative importance of internal and external ecological, climatological, and tectonic forcings on evolutionary change in the fossil record. In close association with colleagues working in neighboring project areas, this work advances multidisciplinary and collaborative research, training, and long-range antiquities conservation in the Hadar region.
Leaf Area Index Estimation Using Chinese GF-1 Wide Field View Data in an Agriculture Region.
Wei, Xiangqin; Gu, Xingfa; Meng, Qingyan; Yu, Tao; Zhou, Xiang; Wei, Zheng; Jia, Kun; Wang, Chunmei
2017-07-08
Leaf area index (LAI) is an important vegetation parameter that characterizes leaf density and canopy structure, and plays an important role in global change studies, land surface process simulation and agriculture monitoring. The wide field view (WFV) sensor on board the Chinese GF-1 satellite can acquire multi-spectral data with decametric spatial resolution, high temporal resolution and wide coverage, which are valuable data sources for dynamic monitoring of LAI. Therefore, an automatic LAI estimation algorithm for GF-1 WFV data was developed based on a radiative transfer model, and the estimation accuracy of the developed algorithm was assessed in an agriculture region with maize as the dominant crop type. The radiative transfer model was first used to simulate the physical relationship between canopy reflectance and LAI under different soil and vegetation conditions, forming the training sample dataset. Then, neural networks (NNs) were used to develop the LAI estimation algorithm from the training sample dataset, with the green, red and near-infrared band reflectances of GF-1 WFV data as the input variables and the corresponding LAI as the output variable. The validation results using field LAI measurements in the agriculture region indicated that the LAI estimation algorithm could achieve satisfactory results (R² = 0.818, RMSE = 0.50). In addition, the developed LAI estimation algorithm has the potential to operationally generate LAI datasets using GF-1 WFV land surface reflectance data, which could provide high spatial and temporal resolution LAI data for agriculture, ecosystem and environmental management research.
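The simulate-then-invert idea can be sketched with a toy forward model standing in for the radiative-transfer simulations, and a polynomial least-squares fit standing in for the neural network. Every coefficient below is invented for illustration; a real implementation would use a canopy radiative-transfer model such as the paper's and an NN regressor.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy forward model (stand-in for radiative transfer): red reflectance
# decays and NIR saturates with increasing LAI, over a variable soil
# background.  Coefficients are illustrative only.
n = 5000
lai = rng.uniform(0.0, 6.0, n)
soil = rng.uniform(0.05, 0.25, n)
red = soil * np.exp(-0.6 * lai) + 0.02 + rng.normal(0, 0.005, n)
nir = (0.45 * (1 - np.exp(-0.5 * lai)) + soil * np.exp(-0.5 * lai)
       + rng.normal(0, 0.005, n))
green = 0.5 * (red + 0.2 * nir) + rng.normal(0, 0.005, n)

def features(g, r, nr):
    """Band reflectances plus NDVI terms as regression features."""
    ndvi = (nr - r) / (nr + r)
    return np.column_stack([np.ones_like(g), g, r, nr, ndvi, ndvi ** 2])

# Fit the inverse mapping (reflectance -> LAI) on simulated samples,
# then evaluate on a held-out set, as the paper does with field data.
split = 4000
F = features(green, red, nir)
beta, *_ = np.linalg.lstsq(F[:split], lai[:split], rcond=None)
pred = F[split:] @ beta
rmse = float(np.sqrt(np.mean((pred - lai[split:]) ** 2)))
print(round(rmse, 2))
```

The residual error at high LAI reflects NDVI saturation, which is exactly why an NN on multiple bands, as in the paper, outperforms simple index-based inversions.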
Prototype global burnt area algorithm using the AVHRR-LTDR time series
NASA Astrophysics Data System (ADS)
López-Saldaña, Gerardo; Pereira, José Miguel; Aires, Filipe
2013-04-01
One of the main limitations of products derived from remotely sensed data is the length of the data records available for climate studies. The Advanced Very High Resolution Radiometer (AVHRR) long-term data record (LTDR) comprises a daily global atmospherically corrected surface reflectance dataset at 0.05° spatial resolution and is available for the 1981-1999 time period. Fire is a strong cause of land surface change and of emissions of greenhouse gases around the globe. A global long-term identification of areas affected by fire is needed to analyze trends and fire-climate relationships. A burnt area algorithm can be seen as a change-point detection problem, where there is an abrupt change in surface reflectance due to biomass burning. Using the AVHRR-LTDR dataset, a time series of bidirectional reflectance distribution function (BRDF)-corrected surface reflectance was generated from the daily observations, constraining the BRDF model inversion with a climatology of BRDF parameters derived from 12 years of MODIS data. Burnt areas were identified using a t-test on the pre- and post-fire reflectance values together with a change-point detection algorithm; spectral constraints were then applied to flag changes caused by natural land processes such as vegetation seasonality or flooding. Additional temporal constraints focus on the persistence of the affected areas. Initial results for the year 1998, selected because of a positive fire anomaly, show spatio-temporal coherence, but further analysis is required, and a formal, rigorous validation will be applied using burn scars identified from high-resolution datasets.
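The core detection step, a t-test on pre- versus post-fire reflectance at a candidate change point, can be sketched as follows. Welch's t statistic is implemented directly here, and the reflectance values and decision threshold are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Daily NIR surface reflectance for one pixel: biomass burning causes
# an abrupt, persistent drop (values are illustrative).
pre = rng.normal(0.30, 0.02, 30)    # 30 days before the candidate date
post = rng.normal(0.22, 0.02, 30)   # 30 days after

def welch_t(a, b):
    """Welch's t statistic for two independent samples with
    possibly unequal variances."""
    va, vb = a.var(ddof=1) / a.size, b.var(ddof=1) / b.size
    return float((a.mean() - b.mean()) / np.sqrt(va + vb))

t = welch_t(pre, post)

# A large positive t (reflectance dropped) flags a candidate burn; the
# paper's spectral and persistence constraints would then filter out
# seasonal vegetation changes and flooding.
burn_candidate = t > 4.0
print(burn_candidate)
```

Sliding this test along the time series and taking the date that maximizes |t| turns it into the change-point detector the abstract describes.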
Advances in Remote Sensing for Vegetation Dynamics and Agricultural Management
NASA Technical Reports Server (NTRS)
Tucker, Compton; Puma, Michael
2015-01-01
Spaceborne remote sensing has led to great advances in the global monitoring of vegetation. For example, the NASA Global Inventory Modeling and Mapping Studies (GIMMS) group has developed widely used datasets from the Advanced Very High Resolution Radiometer (AVHRR) sensors as well as the Moderate Resolution Imaging Spectroradiometer (MODIS) map imagery and normalized difference vegetation index datasets. These data are valuable for analyzing vegetation trends and variability at the regional and global levels. Numerous studies have investigated such trends and variability for both natural vegetation (e.g., re-greening of the Sahel, shifts in the Eurasian boreal forest, Amazonian drought sensitivity) and crops (e.g., impacts of extremes on agricultural production). Here, a critical overview is presented on recent developments and opportunities in the use of remote sensing for monitoring vegetation and crop dynamics.
Exploratory visualization of astronomical data on ultra-high-resolution wall displays
NASA Astrophysics Data System (ADS)
Pietriga, Emmanuel; del Campo, Fernando; Ibsen, Amanda; Primet, Romain; Appert, Caroline; Chapuis, Olivier; Hempel, Maren; Muñoz, Roberto; Eyheramendy, Susana; Jordan, Andres; Dole, Hervé
2016-07-01
Ultra-high-resolution wall displays feature a very high pixel density over a large physical surface, which makes them well suited to the collaborative, exploratory visualization of large datasets. We introduce FITS-OW, an application designed for such wall displays that enables astronomers to navigate large collections of FITS images, query astronomical databases, and display detailed, complementary data and documents about multiple sources simultaneously. We describe how astronomers interact with their data using both the wall's touch-sensitive surface and handheld devices. We also report on the technical challenges we addressed in terms of distributed graphics rendering and data sharing over the computer clusters that drive wall displays.
Estimating Vegetation Height from WorldView-02 and ArcticDEM Data for Broad Ecological Applications
NASA Astrophysics Data System (ADS)
Meddens, A. J.; Vierling, L. A.; Eitel, J.; Jennewein, J. S.; White, J. C.; Wulder, M.
2017-12-01
Boreal and arctic regions are warming at an unprecedented rate, and at a rate higher than in other regions across the globe. Ecological processes are highly responsive to temperature, and therefore substantial changes in these northern ecosystems are expected. Recently, NASA initiated the Arctic-Boreal Vulnerability Experiment (ABoVE), a large-scale field campaign that aims to gain a better understanding of how the Arctic responds to environmental change. High-resolution data products that quantify vegetation structure and function will improve efforts to assess these environmental change impacts. Our objective was to develop and test an approach that allows for mapping vegetation height at a 5 m grid cell resolution across the ABoVE domain. To accomplish this, we selected three study areas across a north-south gradient in Alaska, representing an area of approximately 130 km². We developed a RandomForest modeling approach for predicting vegetation height using the ArcticDEM (a digital surface model produced across the Arctic by the Polar Geospatial Center) and high-resolution multispectral satellite data (WorldView-2), in conjunction with aerial lidar data for calibration and validation. Vegetation height was successfully predicted across the three study areas and evaluated using an independent dataset, with R² ranging from 0.58 to 0.76 and RMSEs ranging from 1.8 to 2.4 m. This predicted vegetation height dataset also led to the development of a digital terrain model from the ArcticDEM digital surface model by removing canopy heights from the surface heights. Our results show potential to establish a high-resolution pan-arctic vegetation height map, which will provide useful information to a broad range of ongoing and future ecological research in high northern latitudes.
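The terrain-model derivation mentioned above (removing predicted canopy heights from the digital surface model) is simple arithmetic on co-registered grids. A toy sketch, with an invented sloping terrain and vegetation layer standing in for ArcticDEM and the RandomForest height predictions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy co-registered 5 m grids: a sloping "true" terrain surface and a
# predicted vegetation height layer (zero over bare ground).
terrain = np.fromfunction(lambda i, j: 100.0 + 0.5 * i + 0.2 * j, (50, 50))
veg = np.where(rng.random((50, 50)) < 0.6,
               rng.uniform(1.0, 20.0, (50, 50)), 0.0)

# An ArcticDEM-style digital surface model records the top of the canopy.
dsm = terrain + veg

# Subtracting predicted canopy heights from the surface model yields the
# digital terrain model, as described in the abstract.
dtm = dsm - veg

# With perfect height predictions the DTM recovers the terrain exactly
# (up to floating-point rounding); real prediction error (RMSE 1.8-2.4 m
# here) propagates directly into the derived DTM.
print(float(np.abs(dtm - terrain).max()))
```

In practice `veg` carries the RandomForest's prediction error, so the derived terrain model inherits roughly the same vertical uncertainty over vegetated cells.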
NASA Astrophysics Data System (ADS)
Dahl, E.; Chanover, N.; Voelz, D.; Kuehn, D.; Strycker, P.
2016-12-01
Jupiter's upper atmosphere is a highly dynamic system in which clouds and storms change color, shape, and size on variable timescales. The exact mechanism by which the deep atmosphere affects these changes in the uppermost cloud deck is still unknown. However, with Juno's arrival in July 2016, it is now possible to take detailed observations of the deep atmosphere with the spacecraft's Microwave Radiometer. By taking detailed optical measurements of Jupiter's uppermost cloud deck in conjunction with these microwave observations, we can provide a context in which to better understand these observations. Ultimately, we can utilize these two complementary datasets in order to thoroughly characterize Jupiter's atmosphere in terms of its vertical cloud structure, color distribution, and dynamical state throughout the Juno era. These optical data will also provide a complement to the near-IR sensitivity of the Jovian InfraRed Auroral Mapper and will expand on the limited spectral coverage of JunoCam. In order to obtain high spectral resolution images of Jupiter's atmosphere in the optical regime we use the New Mexico State University Acousto-optic Imaging Camera (NAIC). NAIC's acousto-optic tunable filter allows us to take hyperspectral image cubes of Jupiter from 450-950 nm at an average spectral resolution (λ/dλ) of 242. We present a preliminary analysis of two datasets obtained with NAIC at the Apache Point Observatory 3.5-m telescope: one pre-Juno dataset from March 2016 and the other from November 2016. From these data we derive low-resolution optical spectra of the Great Red Spot and a representative belt and zone to compare with previous work and laboratory measurements of candidate chromophore materials. Additionally, we compare these two datasets to inspect how the atmosphere has changed since before Juno arrived at Jupiter. NASA supported this work through award number NNX15AP34A.
NASA Astrophysics Data System (ADS)
Morin, Efrat; Marra, Francesco; Peleg, Nadav; Mei, Yiwen; Anagnostou, Emmanouil N.
2017-04-01
Rainfall frequency analysis is used to quantify the probability of occurrence of extreme rainfall and is traditionally based on rain gauge records. The limited spatial coverage of rain gauges is insufficient to sample the spatiotemporal variability of extreme rainfall and to provide the areal information required by management and design applications. Conversely, remote sensing instruments, even if quantitatively uncertain, offer coverage and spatiotemporal detail that allow these issues to be overcome. In recent years, remote sensing datasets have begun to be used for frequency analyses, taking advantage of increased record lengths and quantitative adjustments of the data. However, studies so far have made use of concepts and techniques developed for rain gauge (i.e. point or multiple-point) data and have been validated by comparison with gauge-derived analyses. These procedures add further sources of uncertainty, prevent data uncertainties from being isolated from methodological ones, and prevent full exploitation of the available information. In this study, we step out of the gauge-centered concept, presenting a direct comparison between at-site Intensity-Duration-Frequency (IDF) curves derived from different remote sensing datasets at corresponding spatial scales, temporal resolutions and record lengths. We analyzed 16 years of homogeneously corrected and gauge-adjusted C-band weather radar estimates, high-resolution CMORPH, and gauge-adjusted high-resolution CMORPH over the Eastern Mediterranean. Results of this study include: (a) good spatial correlation between radar and satellite IDFs (~0.7 for 2-5 year return periods); (b) consistent correlation and dispersion in the raw and gauge-adjusted CMORPH; (c) bias that is almost uniform with return period for 12-24 h durations; (d) radar identifying thicker-tailed distributions than CMORPH, with the tail of the distributions depending on the spatial and temporal scales.
These results demonstrate the potential of remote sensing datasets for rainfall frequency analysis in support of management (e.g. warning and early-warning systems) and design (e.g. sewer design, large-scale drainage planning) applications.
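The at-site IDF concept rests on fitting an extreme-value distribution to a series of annual maxima and reading intensities off its quantiles. A minimal sketch, assuming a Gumbel (EV1) model and a synthetic 16-year record (the abstract does not specify the fitting procedure used):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical 16-year record of annual maximum rain intensity (mm/h) for one
# pixel and one duration, standing in for radar or CMORPH estimates.
annual_maxima = rng.gumbel(loc=20.0, scale=6.0, size=16)

# At-site frequency analysis: fit a Gumbel distribution, then read off the
# intensity for return period T at the non-exceedance quantile 1 - 1/T.
loc, scale = stats.gumbel_r.fit(annual_maxima)
return_periods = (2, 5, 10, 25)
intensities = [float(stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale))
               for T in return_periods]
for T, i in zip(return_periods, intensities):
    print(T, round(i, 1))
```

Repeating this fit pixel by pixel over a gridded dataset yields the spatially distributed IDF curves that the study compares across sensors.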
High-resolution daily gridded datasets of air temperature and wind speed for Europe
NASA Astrophysics Data System (ADS)
Brinckmann, S.; Krähenmann, S.; Bissolli, P.
2015-08-01
New high-resolution datasets of near-surface daily air temperature (minimum, maximum and mean) and daily mean wind speed for Europe (the CORDEX domain) are provided for the period 2001-2010 for the purpose of regional model validation in the framework of DecReg, a sub-project of the German MiKlip project, which aims to develop decadal climate predictions. The main input data sources are hourly SYNOP observations, partly supplemented by station data from the ECA&D dataset (http://www.ecad.eu). These data are quality tested to eliminate erroneous data and various kinds of inhomogeneities. Grids at a resolution of 0.044° (5 km) are derived by spatial interpolation of these station data over the CORDEX area. For temperature interpolation, a modified version of the regression kriging method developed by Krähenmann et al. (2011) is used. First, predictor fields of altitude, continentality and zonal mean temperature are used for a regression applied to monthly station data. In a second and third step, the residuals of the monthly regression and the deviations of the daily data from the monthly averages are interpolated using simple kriging. For wind speed, a new method based on the concept used for temperature was developed, involving predictor fields of exposure, roughness length, coastal distance and ERA-Interim reanalysis wind speed at 850 hPa. Interpolation uncertainty is estimated by means of the kriging variance and regression uncertainties. Furthermore, to assess the quality of the final daily grid data, cross validation is performed. Explained variance ranges from 70 to 90 % for monthly temperature and from 50 to 60 % for monthly wind speed. The resulting RMSE for the final daily grid data amounts to 1-2 °C for the daily temperature parameters and 1-1.5 m s-1 for daily mean wind speed, depending on season. The datasets presented in this article are published at http://dx.doi.org/10.5676/DWD_CDC/DECREG0110v1.
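The two-stage logic of regression kriging (regress station values on predictor fields, then spatially interpolate the residuals) can be illustrated with a toy example. For brevity the kriging step is replaced here by inverse-distance weighting, and the station data, coordinates and lapse rate are all synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical station data: monthly mean temperature driven mostly by altitude.
n_sta = 50
alt = rng.uniform(0.0, 2000.0, n_sta)        # station altitude (m)
xy = rng.uniform(0.0, 100.0, (n_sta, 2))     # station coordinates (km)
temp = 15.0 - 0.0065 * alt + rng.normal(0.0, 0.5, n_sta)

# Step 1: regression against the predictor field (here altitude only).
A = np.column_stack([np.ones(n_sta), alt])
coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
residuals = temp - A @ coef

# Step 2: spread the residuals to a target grid cell by inverse-distance
# weighting (a stand-in for the simple-kriging step of the actual method).
def interpolate(cell_xy, cell_alt):
    d = np.linalg.norm(xy - cell_xy, axis=1) + 1e-6
    w = 1.0 / d**2
    res = np.sum(w * residuals) / np.sum(w)
    return float(coef[0] + coef[1] * cell_alt + res)

print(round(interpolate(np.array([50.0, 50.0]), 800.0), 1))
```

The regression captures the systematic predictor-driven signal, while the residual interpolation restores local anomalies the predictors miss; kriging additionally supplies the variance used for the uncertainty estimates mentioned above.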
Lagomasino, David; Fatoyinbo, Temilola; Lee, SeungKuk; Feliciano, Emanuelle; Trettin, Carl; Simard, Marc
2016-04-01
Canopy height is one of the strongest predictors of biomass and carbon in forested ecosystems. Additionally, mangrove ecosystems represent one of the most concentrated carbon reservoirs, and they are rapidly degrading as a result of deforestation, development, and hydrologic manipulation. Accurate Canopy Height Models (CHM) over mangrove forest can therefore provide crucial information for monitoring and verification protocols. We compared four CHMs derived from independent remotely sensed imagery and identified potential errors and bias between measurement types. CHMs were derived from three spaceborne datasets, Very High Resolution (VHR) stereophotogrammetry, the TerraSAR-X add-on for Digital Elevation Measurement (TanDEM-X), and the Shuttle Radar Topography Mission (SRTM), and from lidar data acquired from an airborne platform. Each dataset exhibited different error characteristics related to spatial resolution, sensor sensitivities, and reference frames. Canopies over 10 m were accurately predicted by all CHMs, while the distributions of canopy height were best predicted by the VHR CHM. Depending on the guidelines and strategies needed for monitoring and verification activities, coarse-resolution CHMs could be used to track canopy height at regional and global scales, with finer-resolution imagery used to validate and monitor critical areas undergoing rapid changes.
NASA Astrophysics Data System (ADS)
Nedimovic, M. R.; Mountain, G. S.; Austin, J. A., Jr.; Fulthorpe, C.; Aali, M.; Baldwin, K.; Bhatnagar, T.; Johnson, C.; Küçük, H. M.; Newton, A.; Stanley, J.
2015-12-01
In June-July 2015, we acquired the first 3D/2D hybrid (short/long streamer) multichannel seismic (MCS) reflection dataset. These data were collected simultaneously across IODP Exp. 313 drillsites off New Jersey using R/V Langseth and cover ~95% of the planned 12x50 km box. Despite the large survey area, the lateral and vertical resolution of the 3D dataset is almost an order of magnitude higher than for data gathered in standard petroleum exploration. Such high resolution was made possible by the collection of common midpoint (CMP) lines whose combined length is ~3 times the Earth's circumference (~120,000 profile km) and by a source rich in high frequencies. We present details on the data acquisition, ongoing data analysis, and preliminary results. The science driving this project is presented by Mountain et al. The 3D component of this innovative survey used an athwartship cross cable, extended laterally by 2 barovanes roughly 357.5 m apart and trailed by 24 50-m P-Cables spaced ~12.5 m apart with a near-trace offset of 53 m. Each P-Cable had 8 single-hydrophone groups spaced at 6.25 m, for a total of 192 channels. Record length was 4 s and sample rate 0.5 ms, with no low-cut and an 824 Hz high-cut filter. We ran 77 sail lines spaced ~150 m. Receiver locations were determined using 2 GPS receivers mounted on floats and 2 compasses and depth sensors per streamer. Streamer depths varied from 2.1 to 3.7 m. The 2D component used a single 3 km streamer with 240 9-hydrophone groups spaced at 12.5 m, towed astern with a near-trace offset of 229 m. The record length was 4 s and sample rate 0.5 ms, with a low-cut filter at 2 Hz and high-cut at 412 Hz. Receiver locations were recorded using GPS at the head float and tail buoy, combined with 12 bird compasses spaced ~300 m. Nominal streamer depth was 4.5 m. The source for both systems was a 700 in3 linear array of 4 Bolt air guns suspended at 4.5 m towing depth, 271.5 m behind the ship's stern. Shot spacing was 12.5 m.
Data analysis to prestack time migration is being carried out by Absolute Imaging, a commercial company. The shipboard QC analysis and brute stacks indicate that the final product will be superb. Key advantages of the hybrid 3D/2D dataset are: (1) Velocity control from the 2D long-streamer data combined with the ultra-high resolution of the P-Cable 3D dataset; (2) Opportunity for prestack and poststack attribute analysis.
CscoreTool: fast Hi-C compartment analysis at high resolution.
Zheng, Xiaobin; Zheng, Yixian
2018-05-01
Genome-wide chromosome conformation capture (Hi-C) has revealed that the eukaryotic genome can be partitioned into A and B compartments with distinctive chromatin and transcription features. The current Principal Component Analysis (PCA)-based method for A/B compartment prediction from Hi-C data requires substantial CPU time and memory. We report the development of a method, CscoreTool, which enables fast and memory-efficient determination of A/B compartments at high resolution, even in datasets with low sequencing depth. CscoreTool is available at https://github.com/scoutzxb/CscoreTool (contact: xzheng@carnegiescience.edu). Supplementary data are available at Bioinformatics online.
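The PCA-based compartment call that CscoreTool accelerates can be sketched on a toy contact map: compute a correlation matrix from the (normalized) Hi-C matrix and assign A/B by the sign of the leading eigenvector. The matrix below is synthetic and the normalization is deliberately crude; real pipelines use observed/expected normalization:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy intrachromosomal Hi-C matrix with two interleaved compartments:
# bins of like type (A with A, B with B) contact each other more often.
n = 40
labels = np.tile([1, -1], n // 2)             # true A(+1)/B(-1) pattern
base = 1.0 + 0.5 * np.outer(labels, labels)   # plaid contact structure
hic = rng.poisson(base * 20).astype(float)
hic = (hic + hic.T) / 2.0                     # enforce symmetry

# PCA-style compartment call: correlation matrix of the contact map,
# then the sign of the eigenvector with the largest eigenvalue.
corr = np.corrcoef(hic / hic.mean())
vals, vecs = np.linalg.eigh(corr)
pc1 = vecs[:, np.argmax(vals)]
compartments = np.sign(pc1)

# The eigenvector's global sign is arbitrary; check agreement up to a flip.
agreement = max(np.mean(compartments == labels), np.mean(compartments == -labels))
print(agreement)
```

The memory cost of building and diagonalizing this dense correlation matrix at fine bin sizes is exactly what motivates a lighter-weight estimator like CscoreTool.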
Clickstream data yields high-resolution maps of science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bollen, Johan; Van De Sompel, Herbert; Hagberg, Aric
2009-01-01
Intricate maps of science have been created from citation data to visualize the structure of scientific activity. However, most scientific publications are now accessed online. Scholarly web portals record detailed log data at a scale that exceeds the number of all existing citations combined. Such log data are recorded immediately upon publication and keep track of the sequences of user requests (clickstreams) issued by a variety of users across many different domains. Given these advantages of log datasets over citation data, we investigate whether they can produce high-resolution, more current maps of science.
NASA Technical Reports Server (NTRS)
Vila, Daniel; deGoncalves, Luis Gustavo; Toll, David L.; Rozante, Jose Roberto
2008-01-01
This paper describes a comprehensive assessment of a new high-resolution, high-quality gauge-satellite-based analysis of daily precipitation over continental South America during 2004. The methodology is based on a combination of additive and multiplicative bias correction schemes chosen to obtain the lowest bias when compared with the observed values. Inter-comparison and cross-validation tests have been carried out for the control algorithm (the TMPA real-time algorithm) and different merging schemes: additive bias correction (ADD), ratio bias correction (RAT) and the TMPA research version, for months in different seasons and for different network densities. All of the merging schemes produce better results than the control algorithm, but when gauge datasets of finer temporal (daily) and spatial (regional network) scale are included in the analysis, the improvement is remarkable. The Combined Scheme (CoSch) consistently presents the best performance among the five techniques, and this remains true when a degraded daily gauge network is used instead of the full dataset. The technique appears to be a suitable tool for producing real-time, high-resolution, high-quality gauge-satellite-based analyses of daily precipitation over land in regional domains.
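The additive (ADD) and ratio (RAT) corrections named above can be illustrated with toy collocated satellite and gauge values; the exact formulas in the paper may differ (e.g. corrections may be applied locally rather than over a whole region):

```python
import numpy as np

# Hypothetical collocated daily values: satellite (TMPA-like) estimates and
# rain-gauge observations over one region (mm/day).
sat = np.array([4.0, 10.0, 0.5, 7.0, 2.0])
gauge = np.array([5.0, 12.0, 1.0, 8.0, 2.5])

# Additive bias correction (ADD): shift by the mean gauge-satellite difference.
add_corrected = sat + (gauge.mean() - sat.mean())

# Ratio bias correction (RAT): scale by the gauge/satellite ratio, which
# preserves zero-rain values and the relative shape of the field.
rat_corrected = sat * (gauge.sum() / sat.sum())

for name, field in [("raw", sat), ("ADD", add_corrected), ("RAT", rat_corrected)]:
    print(name, round(float(np.mean(field - gauge)), 2))
```

Both schemes remove the regional mean bias; they differ in how the correction is distributed across light and heavy rain days, which is why a combined scheme can outperform either alone.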
Parallel Visualization of Large-Scale Aerodynamics Calculations: A Case Study on the Cray T3E
NASA Technical Reports Server (NTRS)
Ma, Kwan-Liu; Crockett, Thomas W.
1999-01-01
This paper reports the performance of a parallel volume rendering algorithm for visualizing a large-scale, unstructured-grid dataset produced by a three-dimensional aerodynamics simulation. This dataset, containing over 18 million tetrahedra, allows us to extend our performance results to a problem which is more than 30 times larger than the one we examined previously. This high resolution dataset also allows us to see fine, three-dimensional features in the flow field. All our tests were performed on the Silicon Graphics Inc. (SGI)/Cray T3E operated by NASA's Goddard Space Flight Center. Using 511 processors, a rendering rate of almost 9 million tetrahedra/second was achieved with a parallel overhead of 26%.
EnviroAtlas - 303(d) Impairments by 12-digit HUC for the Conterminous United States
This EnviroAtlas dataset depicts the total length of stream or river flowlines that have impairments submitted to the EPA by states under section 303(d) of the Clean Water Act. It also contains the total lengths of streams, rivers, and canals, total waterbody area, and stream density (stream length per area) from the US Geological Survey's high-resolution National Hydrography Dataset (NHD). This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
Remote visual analysis of large turbulence databases at multiple scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pulido, Jesus; Livescu, Daniel; Kanov, Kalin
2018-06-15
The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence Database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and to over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.
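The wavelet-compression idea at the core of such a framework (decompose, discard small coefficients, ship the rest) can be sketched with a hand-rolled Haar transform on a synthetic 1-D field; the actual framework's wavelet family and thresholding rules are not specified in the abstract:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical 1-D slice of a turbulence field: a smooth large-scale signal
# plus small-scale fluctuations.
x = np.linspace(0.0, 4.0 * np.pi, 256)
field = np.sin(x) + 0.05 * rng.normal(size=x.size)

def haar_step(a):
    """One level of the orthonormal Haar transform: averages and details."""
    s = (a[0::2] + a[1::2]) / np.sqrt(2.0)
    d = (a[0::2] - a[1::2]) / np.sqrt(2.0)
    return s, d

# Multi-level decomposition: repeatedly split into coarse averages + details.
coeffs, approx = [], field.copy()
while approx.size > 1:
    approx, detail = haar_step(approx)
    coeffs.append(detail)
coeffs.append(approx)

# "Compress" by zeroing small detail coefficients, as a wavelet-compression
# framework would before shipping data to a remote client.
flat = np.concatenate(coeffs)
threshold = 0.05
kept = int(np.sum(np.abs(flat) > threshold))
print(f"kept {kept} of {flat.size} coefficients")
```

Because the transform is hierarchical, the same coefficient stream also supports the multi-resolution analysis mentioned above: a client can reconstruct a coarse field from the leading levels alone and refine on demand.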
In-situ observations of the isotopic composition of methane at the Cabauw tall tower site
NASA Astrophysics Data System (ADS)
Röckmann, Thomas; Eyer, Simon; van der Veen, Carina; E Popa, Maria; Tuzson, Béla; Monteil, Guillaume; Houweling, Sander; Harris, Eliza; Brunner, Dominik; Fischer, Hubertus; Zazzeri, Giulia; Lowry, David; Nisbet, Euan G.; Brand, Willi A.; Necki, Jaroslav M.; Emmenegger, Lukas; Mohn, Joachim
2017-04-01
High-precision analyses of the isotopic composition of methane in ambient air can potentially be used to discriminate between different source categories. Due to the complexity of isotope ratio measurements, such analyses have generally been performed in the laboratory on air samples collected in the field. This limits the temporal resolution at which the isotopic composition can be monitored with reasonable logistical effort. Here we present the performance of a dual isotope ratio mass spectrometric (IRMS) system and a quantum cascade laser absorption spectroscopy (QCLAS)-based technique for in-situ analysis of the isotopic composition of methane under field conditions. Both systems were deployed at the Cabauw Experimental Site for Atmospheric Research (CESAR) in the Netherlands and performed in-situ, high-frequency (approx. hourly) measurements for a period of more than 5 months. The IRMS and QCLAS instruments were in excellent agreement, with slight systematic offsets of +0.05 ± 0.03 ‰ for δ13C-CH4 and -3.6 ± 0.4 ‰ for δD-CH4. These were corrected for, yielding a combined dataset with more than 2500 measurements of both δ13C and δD. The high-precision, high-temporal-resolution dataset not only reveals the overwhelming contribution of isotopically depleted agricultural CH4 emissions from ruminants at the Cabauw site, but also allows the identification of specific events with elevated contributions from more enriched sources such as natural gas and landfills. The final dataset was compared to model calculations using the global model TM5 and the mesoscale model FLEXPART-COSMO. The results of both models agree better with the measurements when the TNO-MACC emission inventory is used than when the EDGAR inventory is used. This suggests that high-resolution isotope measurements have the potential to further constrain the methane budget, provided they are performed at multiple sites representative of the entire European domain.
Gao, Mingxing; Xu, Xiwei; Klinger, Yann; van der Woerd, Jerome; Tapponnier, Paul
2017-08-15
The recent dramatic increase in millimeter- to centimeter-resolution topographic datasets obtained via multi-view photogrammetry raises the possibility of mapping detailed offset geomorphology and constraining the spatial characteristics of active faults. Here, for the first time, we applied this new method to acquire high-resolution imagery and generate topographic data along the Altyn Tagh fault, which is located in a remote, high-elevation area and preserves ancient earthquake surface ruptures. A digital elevation model (DEM) with a resolution of 0.065 m and an orthophoto with a resolution of 0.016 m were generated from these images. We identified piercing markers and reconstructed offsets based on both the orthoimage and the topography. The high-resolution UAV data were used to accurately measure the most recent seismic offset, which we determined to be 7 ± 1 m. Combined with high-resolution satellite imagery, we measured cumulative offsets of 15 ± 2 m, 20 ± 2 m, and 30 ± 2 m, which may be due to multiple paleo-earthquakes. UAV mapping can therefore provide fine-scale data for the assessment of seismic hazards.
Fast Spatio-Temporal Data Mining from Large Geophysical Datasets
NASA Technical Reports Server (NTRS)
Stolorz, P.; Mesrobian, E.; Muntz, R.; Santos, J. R.; Shek, E.; Yi, J.; Mechoso, C.; Farrara, J.
1995-01-01
Use of the UCLA CONQUEST (CONtent-based Querying in Space and Time) system for automatic cyclone extraction and for detection of spatio-temporal blocking conditions on massively parallel processors (MPPs) is reviewed. CONQUEST is a data analysis environment for knowledge discovery and data mining that supports high-resolution climate modeling.
Smucker, Nathan J; Kuhn, Anne; Charpentier, Michael A; Cruz-Quinones, Carlos J; Elonen, Colleen M; Whorley, Sarah B; Jicha, Terri M; Serbst, Jonathan R; Hill, Brian H; Wehr, John D
2016-03-01
Watershed management and policies affecting downstream ecosystems benefit from identifying relationships between land cover and water quality. However, different data sources can create dissimilarities in land cover estimates and models that characterize ecosystem responses. We used a spatially balanced stream study (1) to effectively sample development and urban stressor gradients while representing the extent of a large coastal watershed (>4400 km²), (2) to document differences between estimates of watershed land cover using 30-m resolution national land cover database (NLCD) and <1-m resolution land cover data, and (3) to determine if predictive models and relationships between water quality and land cover differed when using these two land cover datasets. Increased concentrations of nutrients, anions, and cations had similarly significant correlations with increased watershed percent impervious cover (IC), regardless of data resolution. The NLCD underestimated percent forest for 71/76 sites by a mean of 11 % and overestimated percent wetlands for 71/76 sites by a mean of 8 %. The NLCD almost always underestimated IC at low development intensities and overestimated IC at high development intensities. As a result of underestimated IC, regression models using NLCD data predicted mean background concentrations of NO3⁻ and Cl⁻ that were 475 and 177 %, respectively, of those predicted when using finer resolution land cover data. Our sampling design could help states and other agencies seeking to create monitoring programs and indicators responsive to anthropogenic impacts. Differences between land cover datasets could affect resource protection due to misguided management targets, watershed development and conservation practices, or water quality criteria.
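The effect described above, an underestimated impervious-cover gradient inflating the regression intercept (the predicted background concentration at 0% IC), can be reproduced with synthetic numbers; the values below are illustrative, not the study's data:

```python
import numpy as np

# Hypothetical watershed impervious cover (%) for the same sites as estimated
# from coarse (NLCD-like) and fine (<1 m) land cover data: the coarse product
# underestimates IC at the low-development end.
ic_fine = np.array([1.0, 3.0, 6.0, 12.0, 25.0, 40.0])
ic_coarse = np.array([0.2, 1.0, 3.5, 10.0, 27.0, 45.0])
no3 = 0.2 + 0.05 * ic_fine            # synthetic stream NO3 response (mg/L)

# Fit NO3 ~ IC with each dataset and compare the predicted "background"
# concentration, i.e. the intercept at 0% impervious cover.
slope_f, intercept_f = np.polyfit(ic_fine, no3, 1)
slope_c, intercept_c = np.polyfit(ic_coarse, no3, 1)
print(round(float(intercept_f), 3), round(float(intercept_c), 3))
```

Because the coarse IC values are compressed at the low end, the fitted line must rise from a higher intercept to reach the same concentrations, mirroring the inflated background predictions the study reports.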
NASA SPoRT Initialization Datasets for Local Model Runs in the Environmental Modeling System
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; LaFontaine, Frank J.; Molthan, Andrew L.; Carcione, Brian; Wood, Lance; Maloney, Joseph; Estupinan, Jeral; Medlin, Jeffrey M.; Blottman, Peter; Rozumalski, Robert A.
2011-01-01
The NASA Short-term Prediction Research and Transition (SPoRT) Center has developed several products for its National Weather Service (NWS) partners that can be used to initialize local model runs within the Weather Research and Forecasting (WRF) Environmental Modeling System (EMS). These real-time datasets consist of surface-based information updated at least once per day, and are produced in a composite or gridded product that is easily incorporated into the WRF EMS. The primary goal for making these NASA datasets available to the WRF EMS community is to provide timely and high-quality information at a spatial resolution comparable to that used in the local model configurations (i.e., convection-allowing scales). The current suite of SPoRT products supported in the WRF EMS includes a Sea Surface Temperature (SST) composite, a Great Lakes sea-ice extent, a Greenness Vegetation Fraction (GVF) composite, and Land Information System (LIS) gridded output. The SPoRT SST composite is a blend of primarily the Moderate Resolution Imaging Spectroradiometer (MODIS) infrared and Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E) data for non-precipitation coverage over the oceans at 2-km resolution. The composite includes a special lake surface temperature analysis over the Great Lakes using contributions from the Remote Sensing Systems temperature data. The Great Lakes Environmental Research Laboratory Ice Percentage product is used to create a sea-ice mask in the SPoRT SST composite. The sea-ice mask is produced daily (in-season) at 1.8-km resolution and identifies ice percentage from 0 to 100% in 10% increments, with values above 90% flagged as ice.
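The sea-ice mask construction described above (ice percentage quantized to 10% increments, with cells above 90% flagged as ice) can be sketched as follows; the input field is hypothetical and the quantization rule is an assumed rounding scheme:

```python
import numpy as np

# Hypothetical ice-percentage field (%), standing in for the GLERL product.
ice = np.array([[0.0, 12.0, 47.0],
                [63.0, 88.0, 97.0]])

# Quantize to 10% increments, then flag cells above 90% as ice, mirroring
# how the SPoRT SST composite builds its sea-ice mask.
ice_binned = np.round(ice / 10.0) * 10.0
ice_mask = ice > 90.0
print(ice_binned.tolist())
print(ice_mask.tolist())
```

In the composite, cells flagged this way would be excluded from (or specially treated in) the blended SST analysis.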
Accessing, Utilizing and Visualizing NASA Remote Sensing Data for Malaria Modeling and Surveillance
NASA Technical Reports Server (NTRS)
Kiang, Richard K.; Adimi, Farida; Kempler, Steven
2007-01-01
This poster presentation reviews the use of NASA remote sensing data to extract environmental information for modeling malaria transmission. The authors discuss remote sensing data from Landsat, the Advanced Very High Resolution Radiometer (AVHRR), the Moderate Resolution Imaging Spectroradiometer (MODIS), the Tropical Rainfall Measuring Mission (TRMM), the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), the Earth Observing-1 (EO-1) Advanced Land Imager (ALI), and the Seasonal to Interannual Earth Science Information Partner (SIESIP) dataset.
NASA Astrophysics Data System (ADS)
Steiner, N.; McDonald, K. C.; Podest, E.; Dinardo, S. J.; Miller, C. E.
2016-12-01
Freeze/thaw and hydrologic cycling have important influence over surface processes and carbon cycling in Arctic ecosystems. The seasonal freezing and thawing of soils bracket negative and positive modes of CO2 and CH4 flux of the bulk landscape. Hydrologic processes, such as seasonal inundation of thawed tundra, create a complex microtopography where greenhouse-gas sources and sinks occur over short distances. Because of this high spatial variability, hydrologic features must be mapped at fine resolution. These mappings can then be compared to local- and regional-scale observations of surface conditions, such as temperature and freeze/thaw state, to create better estimates of these important surface fields. The Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) monitors carbon gas cycling in Alaska using aircraft-deployed gas sampling instruments along with remote sensing observations of the land surface condition. A nadir-pointed, forward-looking infrared (FLIR) imager mounted on the CARVE aircraft is used to measure upwelling mid-infrared spectral radiance at 3-5 microns. The FLIR instrument was operated during the spring, summer and fall seasons, 2013 through 2015. The instantaneous field of view (IFOV) of the FLIR instrument allows for sub-meter resolution from a height of 500 m. High-resolution data products allow for the discrimination of individual landscape components such as soil, vegetation and surface water features in the image footprint. We assess the effectiveness of the FLIR thermal images in monitoring thawing and inundation processes at very high resolutions. We analyze FLIR datasets over focused study areas, emphasizing the utility of the FLIR data for detailed land surface characterization as related to surface moisture and temperature. Emphasis is given to the Barrow CMDL station site, employing the tram-based data collections there. We will also examine potential at other high-latitude sites of interest, e.g. Atqasuk and Ivotuk, Alaska, and tundra polygon sites under study by collaborators at UT Austin. The combination of high-resolution temperature observations with associated estimates of temperature from other instruments can be used to discriminate hydrologic from temperature features in the mid-infrared to produce a high-resolution hydrology product.
NASA Astrophysics Data System (ADS)
Lehmann, Jan Rudolf Karl; Zvara, Ondrej; Prinz, Torsten
2015-04-01
The biological invasion of Australian Acacia species into natural ecosystems outside Australia often has a negative impact on native and endemic plant species and the related biodiversity. In Brazil, the Atlantic rainforest of Bahia and Espirito Santo forms an associated type of ecosystem, the Mussununga. Today, this biologically diverse ecosystem is negatively affected by the invasion of Acacia mangium and Acacia auriculiformis, both introduced to Brazil by agroforestry to increase the production of pulp and high-grade woods. In order to detect the distribution of Acacia species and to monitor the expansion of this invasion, the use of high-resolution imagery acquired with an autonomous Unmanned Aerial System (UAS) proved to be a very promising approach. In this study, two types of datasets, CIR and RGB, were collected, since each provides different information: the CIR imagery carries spectral signatures related to plants, whereas the RGB imagery captures surface characteristics. Orthophoto-mosaics and DSM/DTM for both datasets were extracted. RGB/IHS transformations of the imagery's colour space were utilized, as well as the NDVIblue index in the case of the CIR imagery, to discriminate plant associations. Next, two test areas were defined in order to validate OBIA rule sets using eCognition software. For the RGB dataset, a rule set based on elevation was developed to distinguish between high vegetation (including Acacia) and low vegetation (including soils). For the CIR dataset, high vegetation was classified using the Nearest Neighbour algorithm; the IHS information was used to mask shadows, soils and low vegetation, and a further Nearest Neighbour classification was used to distinguish between Acacia and other high vegetation types. Finally, an accuracy assessment was performed using a confusion matrix. The IHS information proved helpful for Acacia detection, while the surface elevation information from the RGB dataset helped distinguish between low and high vegetation types. The successful use of a fixed-wing UAS proved to be a reliable and flexible technique for acquiring ecologically sensitive data over wide areas and through extended UAS flight missions.
Exploring Antarctic Land Surface Temperature Extremes Using Condensed Anomaly Databases
NASA Astrophysics Data System (ADS)
Grant, Glenn Edwin
Satellite observations have revolutionized the Earth Sciences and climate studies. However, data and imagery continue to accumulate at an accelerating rate, and efficient tools for data discovery, analysis, and quality checking lag behind. In particular, studies of long-term, continental-scale processes at high spatiotemporal resolutions are especially problematic. The traditional technique of downloading an entire dataset and using customized analysis code is often impractical or consumes too many resources. The Condensate Database Project was envisioned as an alternative method for data exploration and quality checking. The project's premise was that much of the data in any satellite dataset is unneeded and can be eliminated, compacting massive datasets into more manageable sizes. Dataset sizes are further reduced by retaining only anomalous data of high interest. Hosting the resulting "condensed" datasets in high-speed databases enables immediate availability for queries and exploration. Proof of the project's success relied on demonstrating that the anomaly database methods can enhance and accelerate scientific investigations. The hypothesis of this dissertation is that the condensed datasets are effective tools for exploring many scientific questions, spurring further investigations and revealing important information that might otherwise remain undetected. This dissertation uses condensed databases containing 17 years of Antarctic land surface temperature anomalies as its primary data. The study demonstrates the utility of the condensate database methods by discovering new information. In particular, the process revealed critical quality problems in the source satellite data. The results are used as the starting point for four case studies, investigating Antarctic temperature extremes, cloud detection errors, and the teleconnections between Antarctic temperature anomalies and climate indices. 
The results confirm the hypothesis that the condensate databases are a highly useful tool for Earth Science analyses. Moreover, the quality checking capabilities provide an important method for independent evaluation of dataset veracity.
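The project's core idea of retaining only anomalous data can be sketched in a few lines. The z-score threshold rule and the toy values below are assumptions for illustration, not the dissertation's actual condensation criteria:

```python
import numpy as np

def condense_anomalies(obs, clim_mean, clim_std, n_sigma=2.0):
    """Keep only observations whose anomaly from climatology exceeds
    n_sigma standard deviations; return (indices, anomalies) for storage."""
    anomalies = obs - clim_mean
    mask = np.abs(anomalies) > n_sigma * clim_std
    return np.flatnonzero(mask), anomalies[mask]

# Toy example: one retained anomaly out of six observations
obs = np.array([10.0, 11.0, 9.5, 25.0, 10.2, 9.8])
idx, anom = condense_anomalies(obs, clim_mean=10.0, clim_std=1.0)
print(idx, anom)  # [3] [15.]
```

Storing only the retained indices and anomaly values, rather than the full field, is what shrinks a massive satellite dataset to a size a database can serve interactively.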
Cortical fibers orientation mapping using in-vivo whole brain 7 T diffusion MRI.
Gulban, Omer F; De Martino, Federico; Vu, An T; Yacoub, Essa; Uğurbil, Kamil; Lenglet, Christophe
2018-05-10
Diffusion MRI of the cortical gray matter is challenging because the micro-environment probed by water molecules is much more complex than within the white matter. High spatial and angular resolutions are therefore necessary to uncover anisotropic diffusion patterns and laminar structures, which provide complementary (e.g. to anatomical and functional MRI) microstructural information about cortical architectonics. Several ex-vivo and in-vivo MRI studies have recently addressed this question, though predominantly with an emphasis on specific cortical areas. There is currently no whole brain in-vivo data leveraging multi-shell diffusion MRI acquisition at high spatial resolution, and depth dependent analysis, to characterize the complex organization of cortical fibers. Here, we present unique in-vivo human 7T diffusion MRI data, and a dedicated cortical depth dependent analysis pipeline. We leverage the high spatial (1.05 mm isotropic) and angular (198 diffusion gradient directions) resolution of this whole brain dataset to improve cortical fiber orientation mapping, and study neurite (axon and/or dendrite) trajectories across cortical depths. Tangential fibers in superficial cortical depths and crossing fiber configurations in deep cortical depths are identified. Fibers gradually inserting into the gyral walls are visualized, which contributes to mitigating the gyral bias effect. Quantitative radiality maps and histograms in individual subjects and cortex-based aligned datasets further support our results. Copyright © 2018 Elsevier Inc. All rights reserved.
A high resolution atlas of gene expression in the domestic sheep (Ovis aries)
Farquhar, Iseabail L.; Young, Rachel; Lefevre, Lucas; Pridans, Clare; Tsang, Hiu G.; Afrasiabi, Cyrus; Watson, Mick; Whitelaw, C. Bruce; Freeman, Tom C.; Archibald, Alan L.; Hume, David A.
2017-01-01
Sheep are a key source of meat, milk and fibre for the global livestock sector, and an important biomedical model. Global analysis of gene expression across multiple tissues has aided genome annotation and supported functional annotation of mammalian genes. We present a large-scale RNA-Seq dataset representing all the major organ systems from adult sheep and from several juvenile, neonatal and prenatal developmental time points. The Ovis aries reference genome (Oar v3.1) includes 27,504 genes (20,921 protein coding), of which 25,350 (19,921 protein coding) had detectable expression in at least one tissue in the sheep gene expression atlas dataset. Network-based cluster analysis of this dataset grouped genes according to their expression pattern. The principle of ‘guilt by association’ was used to infer the function of uncharacterised genes from their co-expression with genes of known function. We describe the overall transcriptional signatures present in the sheep gene expression atlas and assign those signatures, where possible, to specific cell populations or pathways. The findings are related to innate immunity by focusing on clusters with an immune signature, and to the advantages of cross-breeding by examining the patterns of genes exhibiting the greatest expression differences between purebred and crossbred animals. This high-resolution gene expression atlas for sheep is, to our knowledge, the largest transcriptomic dataset from any livestock species to date. It provides a resource to improve the annotation of the current reference genome for sheep, presenting a model transcriptome for ruminants and insight into gene, cell and tissue function at multiple developmental stages. PMID:28915238
A high resolution atlas of gene expression in the domestic sheep (Ovis aries).
Clark, Emily L; Bush, Stephen J; McCulloch, Mary E B; Farquhar, Iseabail L; Young, Rachel; Lefevre, Lucas; Pridans, Clare; Tsang, Hiu G; Wu, Chunlei; Afrasiabi, Cyrus; Watson, Mick; Whitelaw, C Bruce; Freeman, Tom C; Summers, Kim M; Archibald, Alan L; Hume, David A
2017-09-01
Sheep are a key source of meat, milk and fibre for the global livestock sector, and an important biomedical model. Global analysis of gene expression across multiple tissues has aided genome annotation and supported functional annotation of mammalian genes. We present a large-scale RNA-Seq dataset representing all the major organ systems from adult sheep and from several juvenile, neonatal and prenatal developmental time points. The Ovis aries reference genome (Oar v3.1) includes 27,504 genes (20,921 protein coding), of which 25,350 (19,921 protein coding) had detectable expression in at least one tissue in the sheep gene expression atlas dataset. Network-based cluster analysis of this dataset grouped genes according to their expression pattern. The principle of 'guilt by association' was used to infer the function of uncharacterised genes from their co-expression with genes of known function. We describe the overall transcriptional signatures present in the sheep gene expression atlas and assign those signatures, where possible, to specific cell populations or pathways. The findings are related to innate immunity by focusing on clusters with an immune signature, and to the advantages of cross-breeding by examining the patterns of genes exhibiting the greatest expression differences between purebred and crossbred animals. This high-resolution gene expression atlas for sheep is, to our knowledge, the largest transcriptomic dataset from any livestock species to date. It provides a resource to improve the annotation of the current reference genome for sheep, presenting a model transcriptome for ruminants and insight into gene, cell and tissue function at multiple developmental stages.
Se-SAD serial femtosecond crystallography datasets from selenobiotinyl-streptavidin
Yoon, Chun Hong; DeMirci, Hasan; Sierra, Raymond G.; Dao, E. Han; Ahmadi, Radman; Aksit, Fulya; Aquila, Andrew L.; Batyuk, Alexander; Ciftci, Halilibrahim; Guillet, Serge; Hayes, Matt J.; Hayes, Brandon; Lane, Thomas J.; Liang, Meng; Lundström, Ulf; Koglin, Jason E.; Mgbam, Paul; Rao, Yashas; Rendahl, Theodore; Rodriguez, Evan; Zhang, Lindsey; Wakatsuki, Soichi; Boutet, Sébastien; Holton, James M.; Hunter, Mark S.
2017-01-01
We provide a detailed description of selenobiotinyl-streptavidin (Se-B SA) co-crystal datasets recorded using the Coherent X-ray Imaging (CXI) instrument at the Linac Coherent Light Source (LCLS) for selenium single-wavelength anomalous diffraction (Se-SAD) structure determination. Se-B SA was chosen as the model system for its high affinity between biotin and streptavidin where the sulfur atom in the biotin molecule (C10H16N2O3S) is substituted with selenium. The dataset was collected at three different transmissions (100, 50, and 10%) using a serial sample chamber setup which allows for two sample chambers, a front chamber and a back chamber, to operate simultaneously. Diffraction patterns from Se-B SA were recorded to a resolution of 1.9 Å. The dataset is publicly available through the Coherent X-ray Imaging Data Bank (CXIDB) and also on LCLS compute nodes as a resource for research and algorithm development. PMID:28440794
Se-SAD serial femtosecond crystallography datasets from selenobiotinyl-streptavidin
NASA Astrophysics Data System (ADS)
Yoon, Chun Hong; Demirci, Hasan; Sierra, Raymond G.; Dao, E. Han; Ahmadi, Radman; Aksit, Fulya; Aquila, Andrew L.; Batyuk, Alexander; Ciftci, Halilibrahim; Guillet, Serge; Hayes, Matt J.; Hayes, Brandon; Lane, Thomas J.; Liang, Meng; Lundström, Ulf; Koglin, Jason E.; Mgbam, Paul; Rao, Yashas; Rendahl, Theodore; Rodriguez, Evan; Zhang, Lindsey; Wakatsuki, Soichi; Boutet, Sébastien; Holton, James M.; Hunter, Mark S.
2017-04-01
We provide a detailed description of selenobiotinyl-streptavidin (Se-B SA) co-crystal datasets recorded using the Coherent X-ray Imaging (CXI) instrument at the Linac Coherent Light Source (LCLS) for selenium single-wavelength anomalous diffraction (Se-SAD) structure determination. Se-B SA was chosen as the model system for its high affinity between biotin and streptavidin where the sulfur atom in the biotin molecule (C10H16N2O3S) is substituted with selenium. The dataset was collected at three different transmissions (100, 50, and 10%) using a serial sample chamber setup which allows for two sample chambers, a front chamber and a back chamber, to operate simultaneously. Diffraction patterns from Se-B SA were recorded to a resolution of 1.9 Å. The dataset is publicly available through the Coherent X-ray Imaging Data Bank (CXIDB) and also on LCLS compute nodes as a resource for research and algorithm development.
Long Term Cloud Property Datasets From MODIS and AVHRR Using the CERES Cloud Algorithm
NASA Technical Reports Server (NTRS)
Minnis, Patrick; Bedka, Kristopher M.; Doelling, David R.; Sun-Mack, Sunny; Yost, Christopher R.; Trepte, Qing Z.; Bedka, Sarah T.; Palikonda, Rabindra; Scarino, Benjamin R.; Chen, Yan;
2015-01-01
Cloud properties play a critical role in climate change. Monitoring cloud properties over long time periods is needed to detect changes and to validate and constrain models. The Clouds and the Earth's Radiant Energy System (CERES) project has developed several cloud datasets from Aqua and Terra MODIS data to better interpret broadband radiation measurements and improve understanding of the role of clouds in the radiation budget. The algorithms applied to MODIS data have been adapted to utilize various combinations of channels on the Advanced Very High Resolution Radiometer (AVHRR) on the long-term time series of NOAA and MetOp satellites to provide a new cloud climate data record. These datasets can be useful for a variety of studies. This paper presents results of the MODIS and AVHRR analyses covering the period from 1980 to 2014. Validation and comparisons with other datasets are also given.
Large uncertainties in observed daily precipitation extremes over land
NASA Astrophysics Data System (ADS)
Herold, Nicholas; Behrangi, Ali; Alexander, Lisa V.
2017-01-01
We explore uncertainties in observed daily precipitation extremes over the terrestrial tropics and subtropics (50°S-50°N) based on five commonly used products: the Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) dataset, the Global Precipitation Climatology Centre-Full Data Daily (GPCC-FDD) dataset, the Tropical Rainfall Measuring Mission (TRMM) multi-satellite research product (T3B42 v7), the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Climate Data Record (PERSIANN-CDR), and the Global Precipitation Climatology Project's One-Degree Daily (GPCP-1DD) dataset. We use the precipitation indices R10mm and Rx1day, developed by the Expert Team on Climate Change Detection and Indices, to explore the behavior of "moderate" and "extreme" extremes, respectively. In order to assess the sensitivity of extreme precipitation to different grid sizes we perform our calculations on four common spatial resolutions (0.25° × 0.25°, 1° × 1°, 2.5° × 2.5°, and 3.75° × 2.5°). The impact of the chosen "order of operation" in calculating these indices is also determined. Our results show that moderate extremes are relatively insensitive to product and resolution choice, while extreme extremes can be very sensitive. For example, at 0.25° × 0.25° quasi-global mean Rx1day values vary from 37 mm in PERSIANN-CDR to 62 mm in T3B42. We find that the interproduct spread becomes prominent at resolutions of 1° × 1° and finer, thus establishing a minimum effective resolution at which observational products agree. Without improvements in interproduct spread, these exceedingly large observational uncertainties at high spatial resolution may limit the usefulness of model evaluations. As has been found previously, resolution sensitivity can be largely eliminated by applying an order of operation where indices are calculated prior to regridding. 
However, this approach is not appropriate when true area averages are desired (e.g., for model evaluations).
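The ETCCDI index definitions used above, and the order-of-operation sensitivity, can be illustrated with a toy NumPy sketch (the index definitions are standard; the block-average regridding and synthetic daily fields are simplifications, not the study's actual regridding scheme):

```python
import numpy as np

def rx1day(precip):
    """Rx1day: maximum 1-day precipitation over the time axis (axis 0)."""
    return np.max(precip, axis=0)

def r10mm(precip):
    """R10mm: number of days with precipitation >= 10 mm."""
    return np.sum(precip >= 10.0, axis=0)

def regrid_mean(field, factor):
    """Crude conservative regrid: block-average factor x factor cells."""
    ny, nx = field.shape
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

# Synthetic daily precipitation: 4 days on a 4 x 4 grid
rng = np.random.default_rng(0)
p = rng.gamma(2.0, 5.0, size=(4, 4, 4))

# Order of operation matters for extremes:
index_then_regrid = regrid_mean(rx1day(p), 2)
regrid_then_index = rx1day(np.stack([regrid_mean(day, 2) for day in p]))
# the max of cell means can never exceed the mean of cell maxima
assert np.all(regrid_then_index <= index_then_regrid + 1e-12)
```

The closing assertion captures why the order matters for extreme extremes: regridding before computing the index systematically damps Rx1day, which is why calculating indices prior to regridding largely removes the resolution sensitivity.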
NASA Astrophysics Data System (ADS)
Tang, U. W.; Wang, Z. S.
2008-10-01
Each city has its unique urban form. The importance of urban form for sustainable development has been recognized in recent years. Traditionally, air quality modelling in a city is done at the mesoscale with grid resolutions of kilometers, regardless of its urban form. This paper introduces a GIS-based air quality and noise model system developed to study the built environment of highly compact urban forms. Compared with a traditional mesoscale air quality model system, the present model system has a higher spatial resolution, down to individual buildings along both sides of the street. Applying the developed model system to the Macao Peninsula, with its highly compact urban forms, the average spatial resolution of input and output data is as high as 174 receptor points per km². Based on this high-spatial-resolution input/output dataset, this study shows that even highly compact urban forms can be fragmented at a very small geographic scale of less than 3 km², owing to the significant temporal variation of urban development. The variation of urban form in each fragment in turn affects air dispersion and traffic conditions, and thus air quality and noise, at a measurable scale.
Predicting the Location of Human Perirhinal Cortex, Brodmann's area 35, from MRI
Augustinack, Jean C.; Huber, Kristen E.; Stevens, Allison A.; Roy, Michelle; Frosch, Matthew P.; van der Kouwe, André J.W.; Wald, Lawrence L.; Van Leemput, Koen; McKee, Ann; Fischl, Bruce
2012-01-01
The perirhinal cortex (Brodmann's area 35) is a multimodal area that is important for normal memory function. Specifically, perirhinal cortex is involved in detection of novel objects and manifests neurofibrillary tangles in Alzheimer's disease very early in disease progression. We scanned ex vivo brain hemispheres at standard resolution (1 mm × 1 mm × 1 mm) to construct pial/white matter surfaces in FreeSurfer and scanned again at high resolution (120 μm × 120 μm × 120 μm) to determine cortical architectural boundaries. After labeling perirhinal area 35 in the high resolution images, we mapped the high resolution labels to the surface models to localize area 35 in fourteen cases. We validated the area boundaries determined using histological Nissl staining. To test the accuracy of the probabilistic mapping, we measured the Hausdorff distance between the predicted and true labels and found that the median Hausdorff distance was 4.0 mm for left hemispheres (n = 7) and 3.2 mm for right hemispheres (n = 7) across subjects. To show the utility of perirhinal localization, we mapped our labels to a subset of the Alzheimer's Disease Neuroimaging Initiative dataset and found decreased cortical thickness measures in mild cognitive impairment and Alzheimer's disease compared to controls in the predicted perirhinal area 35. Our ex vivo probabilistic mapping of perirhinal cortex provides histologically validated, automated and accurate labeling of architectonic regions in the medial temporal lobe, and facilitates the analysis of atrophic changes in a large dataset for earlier detection and diagnosis. PMID:22960087
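The Hausdorff distance used above to score predicted against true labels can be computed by brute force for modest point sets. A sketch with hypothetical boundary coordinates (a real pipeline would operate on surface vertices and might use a spatial index for speed):

```python
import numpy as np

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two point sets, given as
    (n, 2) and (m, 2) coordinate arrays."""
    # pairwise Euclidean distances, shape (n, m)
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    # max over each directed nearest-neighbor distance
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Hypothetical boundary points of a predicted vs. reference label (mm units)
pred = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
true = np.array([[0.0, 1.0], [1.0, 1.0], [2.0, 5.0]])
print(hausdorff(pred, true))  # 5.0
```

Because the metric takes the worst-case nearest-neighbor distance in either direction, a single outlying reference point (here at (2, 5)) dominates the result even when most of the boundary matches closely.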
The sensitivity of ecosystem service models to choices of input data and spatial resolution
Bagstad, Kenneth J.; Cohen, Erika; Ancona, Zachary H.; McNulty, Steven; Sun, Ge
2018-01-01
Although ecosystem service (ES) modeling has progressed rapidly in the last 10–15 years, comparative studies on data and model selection effects have become more common only recently. Such studies have drawn mixed conclusions about whether different data and model choices yield divergent results. In this study, we compared the results of different models to address these questions at national, provincial, and subwatershed scales in Rwanda. We compared results for carbon, water, and sediment as modeled using InVEST and WaSSI using (1) land cover data at 30 and 300 m resolution and (2) three different input land cover datasets. WaSSI and simpler InVEST models (carbon storage and annual water yield) were relatively insensitive to the choice of spatial resolution, but more complex InVEST models (seasonal water yield and sediment regulation) produced large differences when applied at differing resolution. Six out of nine ES metrics (InVEST annual and seasonal water yield and WaSSI) gave similar predictions for at least two different input land cover datasets. Despite differences in mean values when using different data sources and resolution, we found significant and highly correlated results when using Spearman's rank correlation, indicating consistent spatial patterns of high and low values. Our results confirm and extend conclusions of past studies, showing that in certain cases (e.g., simpler models and national-scale analyses), results can be robust to data and modeling choices. For more complex models, those with different output metrics, and subnational to site-based analyses in heterogeneous environments, data and model choices may strongly influence study findings.
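Spearman's rank correlation, used above to show that spatial patterns agree even when mean values differ, is simply the Pearson correlation of the ranks. A tie-free sketch with hypothetical per-watershed outputs from two models:

```python
import numpy as np

def spearman(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks.
    (Minimal version; assumes no tied values.)"""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx * ry).sum() / np.sqrt((rx**2).sum() * (ry**2).sum()))

# Hypothetical per-watershed carbon estimates from two models: different
# magnitudes and units but identical spatial ordering -> rho = 1
invest = np.array([120.0, 80.0, 200.0, 150.0])
wassi = np.array([1.9, 1.1, 3.5, 2.4])
print(spearman(invest, wassi))  # 1.0
```

Because only the ordering enters the statistic, two models can disagree substantially on magnitudes yet still yield a perfect rank correlation, which is exactly the "consistent spatial patterns of high and low values" finding above.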
Development of a superconducting bulk magnet for NMR and MRI.
Nakamura, Takashi; Tamada, Daiki; Yanagi, Yousuke; Itoh, Yoshitaka; Nemoto, Takahiro; Utumi, Hiroaki; Kose, Katsumi
2015-10-01
A superconducting bulk magnet composed of six vertically stacked annular single-domain c-axis-oriented Eu-Ba-Cu-O crystals was energized to 4.74 T using a conventional superconducting magnet for high-resolution NMR spectroscopy. Shim coils, gradient coils, and radio frequency coils for high resolution NMR and MRI were installed in the 23 mm-diameter room-temperature bore of the bulk magnet. A 6.9 ppm peak-to-peak homogeneous region suitable for MRI was achieved in the central cylindrical region (6.2 mm diameter, 9.1 mm length) of the bulk magnet by using a single layer shim coil. A 21 Hz spectral resolution that can be used for high resolution NMR spectroscopy was obtained in the central cylindrical region (1.3 mm diameter, 4 mm length) of the bulk magnet by using a multichannel shim coil. A clear 3D MR image dataset of a chemically fixed mouse fetus with (50 μm)³ voxel resolution was obtained in 5.5 h. We therefore conclude that the cryogen-free superconducting bulk magnet developed in this study is useful for high-resolution desktop NMR, MRI and mobile NMR devices. Copyright © 2015 Elsevier Inc. All rights reserved.
Landschoff, Jannes; Du Plessis, Anton; Griffiths, Charles L
2015-01-01
Brooding brittle stars have a special mode of reproduction whereby they retain their eggs and juveniles inside respiratory body sacs called bursae. In the past, studying this phenomenon required disturbance of the sample by dissecting the adult. This caused irreversible damage and made the sample unsuitable for future studies. Micro X-ray computed tomography (μCT) is a promising technique, not only to visualise juveniles inside the bursae, but also to keep the sample intact and make the dataset of the scan available for future reference. Seven μCT scans of five freshly fixed (70 % ethanol) individuals, representing three differently sized brittle star species, provided adequate image quality to determine the numbers, sizes and postures of internally brooded young, as well as anatomy and morphology of adults. No staining agents were necessary to achieve high-resolution, high-contrast images, which permitted visualisations of both calcified and soft tissue. The raw data (projection and reconstruction images) are publicly available for download from GigaDB. Brittle stars of all sizes are suitable candidates for μCT imaging. This explicitly adds a new technique to the suite of tools available for studying the development of internally brooded young. The purpose of applying the technique was to visualise juveniles inside the adult, but because of the universally good quality of the dataset, the images can also be used for anatomical or comparative morphology-related studies of adult structures.
NASA Astrophysics Data System (ADS)
Freychet, N.; Duchez, A.; Wu, C.-H.; Chen, C.-A.; Hsu, H.-H.; Hirschi, J.; Forryan, A.; Sinha, B.; New, A. L.; Graham, T.; Andrews, M. B.; Tu, C.-Y.; Lin, S.-J.
2017-02-01
This work investigates the variability of extreme weather events (drought spells, DS15, and daily heavy rainfall, PR99) over East Asia. It particularly focuses on the large scale atmospheric circulation associated with high occurrence levels of these extreme events. Two observational datasets (APHRODITE and PERSIANN) are compared with two high-resolution global climate models (HiRAM and HadGEM3-GC2) and an ensemble of other lower resolution climate models from CMIP5. We first evaluate the performance of the high resolution models. They both exhibit good skill in reproducing extreme events, especially when compared with CMIP5 results. Significant differences exist between the two observational datasets, highlighting the difficulty of obtaining a clear estimate of extreme events. The link between the variability of the extremes and the large scale circulation is investigated, on monthly and interannual timescales, using composite and correlation analyses. Both extreme indices DS15 and PR99 are significantly linked to the low level wind intensity over East Asia, i.e. the monsoon circulation. It is also found that DS15 events are strongly linked to the surface temperature over the Siberian region and to the land-sea pressure contrast, while PR99 events are linked to the sea surface temperature anomalies over the West North Pacific. These results illustrate the importance of the monsoon circulation for extremes over East Asia. The dependence on surface temperature over the continent and on sea surface temperature raises the question of the extent to which they could affect the occurrence of extremes over tropical regions in future projections.
Establishment and analysis of High-Resolution Assimilation Dataset of water-energy cycle over China
NASA Astrophysics Data System (ADS)
Wen, Xiaohang; Liao, Xiaohan; Dong, Wenjie; Yuan, Wenping
2015-04-01
For better prediction and understanding of the water-energy exchange process and land-atmosphere interaction, in-situ observed meteorological data acquired from the China Meteorological Administration (CMA) were assimilated into the Weather Research and Forecasting (WRF) model over China. Monthly Green Vegetation Coverage (GVF) data, calculated from the Normalized Difference Vegetation Index (NDVI) of the Earth Observing System Moderate-Resolution Imaging Spectroradiometer (EOS-MODIS), and Digital Elevation Model (DEM) data from the Shuttle Radar Topography Mission (SRTM) were also integrated into the model. From these runs, the High-Resolution Assimilation Dataset of the water-energy cycle over China (HRADC) was produced. This dataset includes, at 25 km horizontal resolution and 3-hourly intervals, near-surface meteorological data such as air temperature, humidity, ground temperature, and pressure at 19 levels, soil temperature and soil moisture at 4 levels, green vegetation coverage, latent heat flux, sensible heat flux, and ground heat flux. In this study, we 1) briefly introduce the cycling 3D-Var assimilation method; and 2) compare the HRADC results for meteorological elements such as 2 m temperature, precipitation and ground temperature with gridded observation data from CMA and with Global Land Data Assimilation System (GLDAS) output from the National Aeronautics and Space Administration (NASA). We find that the 2 m temperature results are improved compared with the control simulation and effectively reproduce the observed patterns, and that the simulated ground temperature, 0-10 cm soil temperature and specific humidity are close to the GLDAS outputs. Root mean square errors are lower in the assimilation run than in the control run, and the assimilated ground temperature, 0-10 cm soil temperature, radiation and surface fluxes agree well with the GLDAS outputs over China.
The HRADC can be used in further research on long-term climatic effects and the characteristics of the water-energy cycle over China.
Beyond RGB: Very high resolution urban remote sensing with multimodal deep networks
NASA Astrophysics Data System (ADS)
Audebert, Nicolas; Le Saux, Bertrand; Lefèvre, Sébastien
2018-06-01
In this work, we investigate various methods to deal with semantic labeling of very high resolution multi-modal remote sensing data. In particular, we study how deep fully convolutional networks can be adapted to deal with multi-modal and multi-scale remote sensing data for semantic labeling. Our contributions are threefold: (a) we present an efficient multi-scale approach to leverage both a large spatial context and the high resolution data, (b) we investigate early and late fusion of Lidar and multispectral data, (c) we validate our methods on two public datasets with state-of-the-art results. Our results indicate that late fusion makes it possible to recover errors stemming from ambiguous data, while early fusion allows for better joint-feature learning, but at the cost of higher sensitivity to missing data.
NASA Astrophysics Data System (ADS)
Larson, Timothy P.; Schou, Jesper
2018-02-01
Building upon our previous work, in which we analyzed smoothed and subsampled velocity data from the Michelson Doppler Imager (MDI), we extend our analysis to unsmoothed, full-resolution MDI data. We also present results from the Helioseismic and Magnetic Imager (HMI), in both full resolution and processed to be a proxy for the low-resolution MDI data. We find that the systematic errors that we saw previously, namely peaks in both the high-latitude rotation rate and the normalized residuals of odd a-coefficients, are almost entirely absent in the two full-resolution analyses. Furthermore, we find that both systematic errors seem to depend almost entirely on how the input images are apodized, rather than on resolution or smoothing. Using the full-resolution HMI data, we confirm our previous findings regarding the effect of using asymmetric profiles on mode parameters, and also find that they occasionally result in more stable fits. We also confirm our previous findings regarding discrepancies between 360-day and 72-day analyses. We further investigate a six-month period previously seen in f-mode frequency shifts using the low-resolution datasets, this time accounting for solar-cycle dependence using magnetic-field data. Both HMI and MDI saw prominent six-month signals in the frequency shifts, but we were surprised to discover that the strongest signal at that frequency occurred in the mode coverage for the low-resolution proxy. Finally, a comparison of mode parameters from HMI and MDI shows that the frequencies and a-coefficients agree closely, encouraging the concatenation of the two datasets.
DSM-Based Orientation of Large Stereo Satellite Image Blocks
NASA Astrophysics Data System (ADS)
d'Angelo, P.; Reinartz, P.
2012-07-01
High resolution stereo satellite imagery is well suited for the creation of digital surface models (DSM). A system for highly automated and operational DSM and orthoimage generation based on CARTOSAT-1 imagery is presented, with emphasis on fully automated georeferencing. The proposed system processes level-1 stereo scenes using the rational polynomial coefficients (RPC) universal sensor model. The RPC are derived from orbit and attitude information and have a much lower accuracy than the ground resolution of approximately 2.5 m. In order to use the images for orthorectification or DSM generation, an affine RPC correction is required. In this paper, GCP are automatically derived from lower resolution reference datasets (Landsat ETM+ Geocover and SRTM DSM). The traditional method of collecting the lateral position from a reference image and interpolating the corresponding height from the DEM ignores the higher lateral accuracy of the SRTM dataset. Our method avoids this drawback by using an RPC correction based on DSM alignment, resulting in improved geolocation of both DSM and ortho images. A scene-based method and a bundle block adjustment based correction are developed and evaluated for a test site covering the northern part of Italy, for which 405 Cartosat-1 stereo pairs are available. Both methods are tested against independent ground truth, which indicates a lateral error of 10 meters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Xiaoma; Zhou, Yuyu; Asrar, Ghassem R.
High spatiotemporal resolution air temperature (Ta) datasets are increasingly needed for assessing the impact of temperature change on people, ecosystems, and energy systems, especially in urban domains. However, such datasets are not widely available because of the large spatiotemporal heterogeneity of Ta caused by complex biophysical and socioeconomic factors such as built infrastructure and human activities. In this study, we developed a 1-km gridded dataset of daily minimum Ta (Tmin) and maximum Ta (Tmax), and the associated uncertainties, in urban and surrounding areas in the conterminous U.S. for the 2003–2016 period. Daily geographically weighted regression (GWR) models were developed and used to interpolate Ta using 1 km daily land surface temperature and elevation as explanatory variables. The leave-one-out cross-validation approach indicates that our method performs reasonably well, with root mean square errors of 2.1 °C and 1.9 °C, mean absolute errors of 1.5 °C and 1.3 °C, and R² of 0.95 and 0.97, for Tmin and Tmax, respectively. The resulting dataset reasonably captures the spatial heterogeneity of Ta in urban areas, and also effectively captures the urban heat island (UHI) phenomenon whereby Ta rises with increasing urban development (i.e., impervious surface area). The new dataset is valuable for studying environmental impacts of urbanization such as UHI and other related effects (e.g., on building energy consumption and human health). The proposed methodology also shows potential for building a long-term record of Ta worldwide, to fill the data gap that currently exists for studies of urban systems.
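The leave-one-out cross-validation metrics reported above (RMSE, MAE, R²) follow a standard pattern: hold out one station, fit on the rest, predict it, repeat. In this sketch, plain ordinary least squares stands in for the daily GWR models, and the synthetic LST/elevation/Ta values are assumptions for illustration:

```python
import numpy as np

def loocv_metrics(X, y):
    """Leave-one-out cross-validation of an OLS fit (a stand-in for the
    daily GWR models): for each station, fit on all others, predict the
    held-out value, then score RMSE / MAE / R^2 over the predictions."""
    n = len(y)
    preds = np.empty(n)
    A = np.column_stack([np.ones(n), X])  # intercept + predictors
    for i in range(n):
        keep = np.arange(n) != i
        coef, *_ = np.linalg.lstsq(A[keep], y[keep], rcond=None)
        preds[i] = A[i] @ coef
    err = preds - y
    rmse = np.sqrt(np.mean(err**2))
    mae = np.mean(np.abs(err))
    r2 = 1.0 - np.sum(err**2) / np.sum((y - y.mean())**2)
    return rmse, mae, r2

# Synthetic stations: Ta driven by land surface temperature and elevation
rng = np.random.default_rng(1)
lst = rng.uniform(0, 40, 50)
elev = rng.uniform(0, 2000, 50)
ta = 0.8 * lst - 0.006 * elev + rng.normal(0, 0.5, 50)
rmse, mae, r2 = loocv_metrics(np.column_stack([lst, elev]), ta)
assert r2 > 0.9 and rmse < 1.0
```

A true GWR fit would additionally weight the training stations by distance to the prediction point; the scoring step is unchanged.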
Status and Preliminary Evaluation for Chinese Re-Analysis Datasets
NASA Astrophysics Data System (ADS)
bin, zhao; chunxiang, shi; tianbao, zhao; dong, si; jingwei, liu
2016-04-01
Based on the operational T639L60 spectral model combined with the Hybrid_GSI assimilation system, and using meteorological observations including radiosondes, buoys, and satellites, a set of Chinese Re-Analysis (CRA) datasets is being developed by the National Meteorological Information Center (NMIC) of the China Meteorological Administration (CMA). The datasets are run at 30 km (0.28° latitude/longitude) resolution, which is higher than that of most existing reanalysis datasets. The reanalysis is intended to enhance the accuracy of historical synoptic analysis and to support detailed investigation of various weather and climate systems. The reanalysis is currently at the stage of preliminary experimental analysis. One year of forecast data, from June 2013 to May 2014, has been simulated and used in synoptic and climate evaluation. We first examine the model's prediction ability with the new assimilation system and find that it represents a significant improvement in the Northern and Southern hemispheres: owing to the addition of new satellite data, upper-level prediction is clearly improved relative to the operational T639L60 model, and overall prediction stability is enhanced. In the climatological analysis, compared with the ERA-40, NCEP/NCAR, and NCEP/DOE reanalyses, surface temperature is simulated somewhat low over land and high over the ocean, 850-hPa specific humidity shows a weakened anomaly, and the zonal wind anomaly is concentrated in the equatorial tropics. Meanwhile, the reanalysis dataset reproduces various climate indices well, such as the subtropical high index and the East-Asia subtropical Summer Monsoon Index (ESMI), and especially the Indian and western North Pacific monsoon indices. We will further improve the assimilation system and dynamical simulation performance to obtain a 40-year (1979-2018) reanalysis dataset, which will provide a more comprehensive basis for synoptic and climate diagnosis.
Buttenfield, B.P.; Stanislawski, L.V.; Brewer, C.A.
2011-01-01
This paper reports on generalization and data modeling to create reduced-scale versions of the National Hydrography Dataset (NHD) for dissemination through The National Map, the primary data delivery portal for USGS. Our approach distinguishes local differences in physiographic factors, to demonstrate that knowledge about varying terrain (mountainous, hilly or flat) and varying climate (dry or humid) can support decisions about algorithms, parameters, and processing sequences to create generalized, smaller scale data versions which preserve distinct hydrographic patterns in these regions. We work with multiple subbasins of the NHD that provide a range of terrain and climate characteristics. Specifically tailored generalization sequences are used to create simplified versions of the high resolution data, which were compiled for 1:24,000 scale mapping. Results are evaluated cartographically and metrically against a medium resolution benchmark version compiled for 1:100,000, developing coefficients of linear and areal correspondence.
NASA Astrophysics Data System (ADS)
Hedrick, A.; Marshall, H.-P.; Winstral, A.; Elder, K.; Yueh, S.; Cline, D.
2014-06-01
Repeated Light Detection and Ranging (LiDAR) surveys are quickly becoming the de facto method for measuring spatial variability of montane snowpacks at high resolution. This study examines the potential of a 750 km2 LiDAR-derived dataset of snow depths, collected during the 2007 northern Colorado Cold Lands Processes Experiment (CLPX-2), as a validation source for an operational hydrologic snow model. The SNOw Data Assimilation System (SNODAS) model framework, operated by the US National Weather Service, combines a physically-based energy-and-mass-balance snow model with satellite, airborne and automated ground-based observations to provide daily estimates of snowpack properties at nominally 1 km resolution over the coterminous United States. Because SNODAS assimilates most available observations, independent validation data with substantial geographic coverage are scarce. Within twelve distinctive 500 m × 500 m study areas located throughout the survey swath, ground crews performed approximately 600 manual snow depth measurements during each of the CLPX-2 LiDAR acquisitions. This supplied a dataset for constraining the uncertainty of upscaled LiDAR estimates of snow depth at the 1 km SNODAS resolution, resulting in a root-mean-square difference of 13 cm. Upscaled LiDAR snow depths were then compared to the SNODAS estimates over the entire study area for the dates of the LiDAR flights. The remotely-sensed snow depths provided a more spatially continuous comparison dataset and agreed more closely with the model estimates than the in situ measurements alone. Finally, the results revealed three distinct areas where the differences between LiDAR observations and SNODAS estimates were most drastic, suggesting natural processes specific to these regions as causal influences on model uncertainty.
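Upscaling fine-resolution LiDAR snow depths to a coarser model grid and comparing via root-mean-square difference could be sketched as below; block averaging is an assumed aggregation scheme, since the study's exact upscaling method is not given here:

```python
import numpy as np

def block_mean(depths, factor):
    """Aggregate a fine-resolution snow-depth grid to a coarser grid by
    block averaging (e.g., many LiDAR cells up to one ~1 km model cell)."""
    ny, nx = depths.shape
    assert ny % factor == 0 and nx % factor == 0, "grid must tile evenly"
    return depths.reshape(ny // factor, factor,
                          nx // factor, factor).mean(axis=(1, 3))

def rmsd(a, b):
    """Root-mean-square difference between two co-registered fields."""
    d = np.asarray(a, float) - np.asarray(b, float)
    return float(np.sqrt(np.nanmean(d ** 2)))
```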
NASA Astrophysics Data System (ADS)
Pangaluru, K.; Velicogna, I.; Ciraci, E.; Mohajerani, Y.
2017-12-01
The Indus, Ganges and Brahmaputra (IGB) basins supply water for both domestic and agricultural demands, the latter of which is the mainstay of the Indian economy. Here, we use the high-resolution High Asia Refined analysis (HAR) rainfall datasets to study the spatial and temporal behavior of rainfall over the mountainous areas of the IGB basins over the period from 2001 to 2014. The HAR precipitation data are validated against observational (GPCP, CRU and CPC) and satellite (TRMM_3B43) datasets for the same period. We find that the relative differences between the HAR model and the satellite and gauge-based datasets vary between -9% and 67% for the seasonal mean and between 1% and 26% for the annual mean for all basins. The correlation between the HAR model and the observational datasets lies between 0.5 and 0.9 for all seasons. Spatial variations and monthly magnitudes of gridded precipitation trends are calculated using the Mann-Kendall (MK) test and the Theil-Sen approach (TSA), respectively. We find significant positive precipitation trends over the IGB basins in the annual and monsoon time frames, as opposed to the winter and fall seasons.
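The Mann-Kendall statistic and Theil-Sen slope used for the trend analysis have simple pairwise definitions; a minimal sketch (omitting the variance term needed to turn S into a significance test):

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: sum of signs of all pairwise
    later-minus-earlier differences in the series."""
    x = np.asarray(x, float)
    s = 0
    for i in range(len(x) - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    return s

def theil_sen_slope(x):
    """Theil-Sen slope: median of all pairwise slopes (per time step)."""
    x = np.asarray(x, float)
    n = len(x)
    slopes = [(x[j] - x[i]) / (j - i)
              for i in range(n - 1) for j in range(i + 1, n)]
    return float(np.median(slopes))
```

A monotonically increasing series gives the maximum S of n(n-1)/2 and a positive slope; the sign of S matches the sign of the trend.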
Estimating rice yield from MODIS-Landsat fusion data in Taiwan
NASA Astrophysics Data System (ADS)
Chen, C. R.; Chen, C. F.; Nguyen, S. T.
2017-12-01
Rice production monitoring with remote sensing is an important activity in Taiwan due to official initiatives. Yield estimation is a challenge in Taiwan because rice fields are small and fragmented. High spatiotemporal satellite data providing phenological information on rice crops are thus required for this monitoring purpose. This research aims to develop data fusion approaches to integrate daily Moderate Resolution Imaging Spectroradiometer (MODIS) and Landsat data for rice yield estimation in Taiwan. In this study, the low-resolution MODIS LST and emissivity data are used as reference data sources to obtain high-resolution LST from Landsat data using the mixed-pixel analysis technique, and the time-series EVI data were derived from the fusion of MODIS and Landsat spectral band data using the STARFM method. The simulated LST and EVI showed close agreement with the reference data. The rice-yield model was established using EVI and LST data based on information on rice crop phenology collected from 371 ground survey sites across the country in 2014. The results achieved from the fusion datasets, compared with the reference data, indicated a close relationship between the two datasets, with a coefficient of determination (R2) of 0.75 and root mean square error (RMSE) of 338.7 kg, more accurate than the results using the coarse-resolution MODIS LST data (R2 = 0.71 and RMSE = 623.82 kg). For the comparison of total production, 64 towns located in the western part of Taiwan were used. The results also confirmed that the model using fusion datasets produced more accurate results (R2 = 0.95 and RMSE = 1,243 tons) than that using the coarse-resolution MODIS data (R2 = 0.91 and RMSE = 1,749 tons). This study demonstrates the application of MODIS-Landsat fusion data for rice yield estimation at the township level in Taiwan.
The results obtained from the methods used in this study could be useful to policymakers, and the methods are transferable to other regions of the world for rice yield estimation.
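A rice-yield model built from EVI and LST predictors could, in its simplest linear form, be fitted by least squares as below. The model form and function names here are illustrative assumptions; the paper does not specify its regression structure:

```python
import numpy as np

def fit_yield_model(evi, lst, yield_obs):
    """Least-squares fit of a hypothetical linear yield model:
    yield ~ b0 + b1*EVI + b2*LST."""
    X = np.column_stack([np.ones(len(evi)), evi, lst])
    beta, *_ = np.linalg.lstsq(X, yield_obs, rcond=None)
    return beta

def predict_yield(beta, evi, lst):
    """Predict yield from fitted coefficients and new EVI/LST values."""
    X = np.column_stack([np.ones(len(evi)), evi, lst])
    return X @ beta
```

Model skill would then be summarized with R2 and RMSE against ground-surveyed yields, as reported in the abstract.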
NASA Astrophysics Data System (ADS)
Coppola, E.; Fantini, A.; Raffaele, F.; Torma, C. Z.; Bacer, S.; Giorgi, F.; Ahrens, B.; Dubois, C.; Sanchez, E.; Verdecchia, M.
2017-12-01
We assess the statistics of different daily precipitation indices in ensembles of Med-CORDEX and EURO-CORDEX experiments at high resolution (grid spacing of ~0.11°, or RCM11) and medium resolution (grid spacing of ~0.44°, or RCM44) with regional climate models (RCMs) driven by the ERA-Interim reanalysis of observations for the period 1989-2008. The assessment is carried out by comparison with a set of high resolution observation datasets for 9 European subregions. The statistics analyzed include quantitative metrics for mean precipitation, daily precipitation Probability Density Functions (PDFs), daily precipitation intensity, frequency, 95th percentile and 95th percentile of dry spell length. We assess both an ensemble including all Med-CORDEX and EURO-CORDEX models and one including the Med-CORDEX models alone. For the All Models ensembles, the RCM11 ensemble shows a remarkable performance in reproducing the spatial patterns and seasonal cycle of mean precipitation over all regions, with a consistent and marked improvement compared to the RCM44 ensemble and the ERA-Interim reanalysis. Good consistency with observations by the RCM11 ensemble (and a substantial improvement compared to RCM44 and ERA-Interim) is found also for the daily precipitation PDFs, mean intensity and, to a lesser extent, the 95th percentile. However, for some regions the RCM11 ensemble overestimates the occurrence of very high intensity events, while for one region the models underestimate the occurrence of the largest extremes. The RCM11 ensemble still shows a general tendency to underestimate the dry day frequency and 95th percentile of dry spell length over wetter regions, with only a marginal improvement compared to the lower resolution models. This indicates that the problem of the excessive production of low precipitation events found in many climate models persists even at relatively high resolutions, at least in wet climate regimes.
Concerning the Med-CORDEX model ensembles, we find that their performance is of similar quality to that of the all-models ensembles over the Mediterranean regions analyzed. Finally, we stress the need for consistent and quality-checked fine scale observation datasets for the assessment of RCMs run at increasingly high horizontal resolutions.
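The daily precipitation indices assessed above (mean, wet-day frequency, intensity, 95th percentile, dry spell length) can all be computed directly from a daily series. A sketch assuming the common 1 mm/day wet-day threshold, which the studies may define differently:

```python
import numpy as np

def precip_indices(pr, wet_thresh=1.0):
    """Compute basic daily precipitation indices from a series pr (mm/day):
    mean, wet-day frequency, wet-day intensity, wet-day 95th percentile,
    and longest dry spell (consecutive days below wet_thresh)."""
    pr = np.asarray(pr, float)
    wet = pr >= wet_thresh
    mean_pr = float(pr.mean())
    wet_freq = float(wet.mean())
    intensity = float(pr[wet].mean()) if wet.any() else 0.0
    p95 = float(np.percentile(pr[wet], 95)) if wet.any() else 0.0
    # Longest run of consecutive dry days.
    longest, run = 0, 0
    for w in wet:
        run = 0 if w else run + 1
        longest = max(longest, run)
    return mean_pr, wet_freq, intensity, p95, longest
```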
Iglesias, Juan Eugenio; Augustinack, Jean C; Nguyen, Khoa; Player, Christopher M; Player, Allison; Wright, Michelle; Roy, Nicole; Frosch, Matthew P; McKee, Ann C; Wald, Lawrence L; Fischl, Bruce; Van Leemput, Koen
2015-07-15
Automated analysis of MRI data of the subregions of the hippocampus requires computational atlases built at a higher resolution than those that are typically used in current neuroimaging studies. Here we describe the construction of a statistical atlas of the hippocampal formation at the subregion level using ultra-high resolution, ex vivo MRI. Fifteen autopsy samples were scanned at 0.13 mm isotropic resolution (on average) using customized hardware. The images were manually segmented into 13 different hippocampal substructures using a protocol specifically designed for this study; precise delineations were made possible by the extraordinary resolution of the scans. In addition to the subregions, manual annotations for neighboring structures (e.g., amygdala, cortex) were obtained from a separate dataset of in vivo, T1-weighted MRI scans of the whole brain (1mm resolution). The manual labels from the in vivo and ex vivo data were combined into a single computational atlas of the hippocampal formation with a novel atlas building algorithm based on Bayesian inference. The resulting atlas can be used to automatically segment the hippocampal subregions in structural MRI images, using an algorithm that can analyze multimodal data and adapt to variations in MRI contrast due to differences in acquisition hardware or pulse sequences. The applicability of the atlas, which we are releasing as part of FreeSurfer (version 6.0), is demonstrated with experiments on three different publicly available datasets with different types of MRI contrast. 
The results show that the atlas and companion segmentation method: 1) can segment T1 and T2 images, as well as their combination; 2) can replicate findings on mild cognitive impairment based on high-resolution T2 data; and 3) can discriminate between Alzheimer's disease subjects and elderly controls with 88% accuracy in standard resolution (1 mm) T1 data, significantly outperforming the atlas in FreeSurfer version 5.3 (86% accuracy) and classification based on whole hippocampal volume (82% accuracy). Copyright © 2015. Published by Elsevier Inc.
A framework for global river flood risk assessment
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Bouwman, A.; Ward, P. J.; Jongman, B.
2012-04-01
There is an increasing need for strategic global assessments of flood risks. Such assessments may be required by: (a) International Financing Institutes and Disaster Management Agencies, to evaluate where, when, and which investments in flood risk mitigation are most required; (b) (re-)insurers, who need to determine their required coverage capital; and (c) large companies, to account for risks of regional investments. In this contribution, we propose a framework for global river flood risk assessment. The framework combines coarse-resolution hazard probability distributions, derived from global hydrological model runs (typical scale about 0.5 degree resolution), with high resolution estimates of exposure indicators. The high resolution is required because floods typically occur at a much smaller scale than the typical resolution of global hydrological models, and exposure indicators such as population, land use and economic value generally are strongly variable in space and time. The framework therefore estimates hazard at a high resolution (~1 km2) by using (a) global forcing datasets of the current (or, in scenario mode, future) climate; (b) a global hydrological model; (c) a global flood routing model; and (d) importantly, a flood spatial downscaling routine. This results in probability distributions of annual flood extremes as an indicator of flood hazard, at the appropriate resolution. A second component of the framework combines the hazard probability distribution with classical flood impact models (e.g. damage, affected GDP, affected population) to establish indicators for flood risk. The framework can be applied with a large number of datasets and models, and the sensitivities of such choices can be evaluated by the user. The framework is applied using the global hydrological model PCR-GLOBWB, combined with a global flood routing model.
Downscaling of the hazard probability distributions to 1 km2 resolution is performed with a new downscaling algorithm, applied to a number of target regions. We demonstrate the use of impact models in these regions based on global GDP, population, and land use maps. In this application, we show sensitivities of the estimated risks with regard to the use of different climate input datasets, decisions made in the downscaling algorithm, and different approaches to establishing distributed estimates of GDP and asset exposure to flooding.
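Combining a hazard probability distribution with an impact model typically reduces, for each cell or region, to integrating damage over exceedance probability to obtain expected annual damage. A minimal sketch under that standard risk-curve formulation, which is not necessarily the framework's exact implementation:

```python
import numpy as np

def expected_annual_damage(probs, damages):
    """Trapezoidal integration of a risk curve given paired annual
    exceedance probabilities and damages for several return periods."""
    probs = np.asarray(probs, float)
    damages = np.asarray(damages, float)
    # Order events from frequent (high probability) to rare (low probability).
    order = np.argsort(probs)[::-1]
    p, d = probs[order], damages[order]
    # Sum trapezoid areas: mean damage of adjacent events times probability gap.
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * (p[:-1] - p[1:])))
```

For example, damage rising from 0 at the 10-year event to 100 at the 100-year event contributes 0.5 × (0 + 100) × (0.1 − 0.01) = 4.5 per year.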
Potential for using regional and global datasets for national scale ecosystem service modelling
NASA Astrophysics Data System (ADS)
Maxwell, Deborah; Jackson, Bethanna
2016-04-01
Ecosystem service models are increasingly being used by planners and policy makers to inform policy development and decisions about national-level resource management. Such models allow ecosystem services to be mapped and quantified, and subsequent changes to these services to be identified and monitored. In some cases, the impact of small scale changes can be modelled at a national scale, providing more detailed information to decision makers about where to best focus investment and management interventions, while moving toward national goals and/or targets. National scale modelling often uses national (or local) data (for example, soils, landcover and topographical information) as input. However, there are some places where fine resolution and/or high quality national datasets cannot be easily obtained, or do not even exist. In the absence of such detailed information, regional or global datasets could be used as input to such models. There are questions, however, about the usefulness of these coarser resolution datasets and the extent to which inaccuracies in the data may degrade predictions of existing and potential ecosystem service provision and subsequent decision making. Using LUCI (the Land Utilisation and Capability Indicator) as an example predictive model, we examine how the reliability of predictions changes when national datasets of soil, landcover and topography are substituted with coarser scale regional and global datasets. We specifically look at how LUCI's predictions of water services, such as flood risk, flood mitigation, erosion and water quality, change when national data inputs are replaced by regional and global datasets. Using the Conwy catchment, Wales, as a case study, the land cover products compared are the UK's Land Cover Map (2007), the European CORINE land cover map and the ESA global land cover map.
Soils products include the National Soil Map of England and Wales (NatMap) and the European Soils Database. NEXTMap elevation data, which cover the UK and parts of continental Europe, are compared to the global ASTER and SRTM30 topographical products. While the regional and global datasets can be used to fill gaps in data requirements, their coarser resolution means that there is greater aggregation of information over larger areas. This loss of detail impacts the reliability of model output, particularly where significant discrepancies between datasets exist. The implications of this loss of detail for spatial planning and decision making are discussed. Finally, the need for better nationally and globally available data, to allow LUCI and other ecosystem service models to become more globally applicable, is highlighted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kronewitter, Scott R.; Slysz, Gordon W.; Marginean, Ioan
2014-05-31
Dense LC-MS datasets have convoluted extracted ion chromatograms with multiple chromatographic peaks that cloud the differentiation between intact compounds (with their overlapping isotopic distributions), peaks due to in-source ion fragmentation, and noise. Making this differentiation is critical in glycomics datasets because chromatographic peaks correspond to different intact glycan structural isomers. The GlyQ-IQ software is targeted, chromatography-centric software designed for chromatogram and mass spectra data processing and subsequent glycan composition annotation. The targeted analysis approach offers several key advantages over traditional LC-MS data processing and annotation algorithms. A priori information about the individual target's elemental composition allows for exact isotope profile modeling, improving feature detection and increasing sensitivity by focusing chromatogram generation and peak fitting on the isotopic species in the distribution having the highest intensity and data quality. Glycan target annotation is corroborated by glycan family relationships and in-source fragmentation detection. The GlyQ-IQ software is developed in this work (Part 1) and was used to profile N-glycan compositions from human serum LC-MS datasets. The companion manuscript GlyQ-IQ Part 2 discusses developments in human serum N-glycan sample preparation, glycan isomer separation, and glycan electrospray ionization. A case study is presented to demonstrate how GlyQ-IQ identifies and removes confounding chromatographic peaks from high mannose glycan isomers from human blood serum. In addition, GlyQ-IQ was used to generate a broad N-glycan profile from a high resolution (100K/60K) nESI-LC-MS/MS dataset including CID and HCD fragmentation acquired on a Velos Pro mass spectrometer. 101 glycan compositions and 353 isomer peaks were detected from a single sample.
99% of the GlyQ-IQ glycan-feature assignments passed manual validation and are backed with high resolution mass spectra and mass accuracies below 7 ppm.
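The sub-7-ppm mass accuracy criterion mentioned above is a standard relative-error check on observed versus theoretical m/z; a trivial sketch (function names are illustrative, not GlyQ-IQ's API):

```python
def ppm_error(observed_mz, theoretical_mz):
    """Mass accuracy in parts per million."""
    return (observed_mz - theoretical_mz) / theoretical_mz * 1e6

def within_tolerance(observed_mz, theoretical_mz, tol_ppm=7.0):
    """Annotation filter analogous to a 'below 7 ppm' acceptance criterion."""
    return abs(ppm_error(observed_mz, theoretical_mz)) < tol_ppm
```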
NASA Astrophysics Data System (ADS)
Jackson, C.; Sava, E.; Cervone, G.
2017-12-01
Hurricane Harvey has been noted as the wettest cyclone on record for the US, as well as the most destructive (so far) of the 2017 hurricane season. An entire year's worth of rainfall occurred over the course of a few days. The city of Houston was greatly impacted as the storm lingered over the city for five days, causing a record-breaking 50+ inches of rain as well as severe damage from flooding. Flood model simulations were performed to reconstruct the event in order to better understand, assess, and predict flooding dynamics for the future. Additionally, a number of remote sensing platforms and on-the-ground instruments that provide near real-time data have also been used for flood identification, monitoring, and damage assessment. Although both flood models and remote sensing techniques are able to identify inundated areas, rapid and accurate flood prediction at high spatio-temporal resolution remains a challenge. Thus a methodological approach that fuses the two techniques can help to better validate what is being modeled and observed. Recent advancements in techniques for fusing remote sensing with near real-time heterogeneous datasets have allowed emergency responders to more efficiently extract increasingly precise and relevant knowledge from the available information. In this work, the use of multiple sources of contributed data, coupled with remotely sensed and open source geospatial datasets, is demonstrated to generate an understanding of potential damage assessment for the floods after Hurricane Harvey in Harris County, Texas. The feasibility of integrating multiple sources at different temporal and spatial resolutions into hydrodynamic models for flood inundation simulations is assessed. Furthermore, the contributed datasets are compared against a reconstructed flood extent generated from the Flood2D-GPU model.
ArcticDEM Validation and Accuracy Assessment
NASA Astrophysics Data System (ADS)
Candela, S. G.; Howat, I.; Noh, M. J.; Porter, C. C.; Morin, P. J.
2017-12-01
ArcticDEM comprises a growing inventory of Digital Elevation Models (DEMs) covering all land above 60°N. As of August 2017, ArcticDEM had openly released 2-m resolution, individual DEMs covering over 51 million km2, which includes areas of repeat coverage for change detection, as well as over 15 million km2 of 5-m resolution seamless mosaics. By the end of the project, over 80 million km2 of 2-m DEMs will be produced, averaging four repeats of the 20 million km2 Arctic landmass. ArcticDEM is produced from sub-meter resolution, stereoscopic imagery using open source software (SETSM) on the NCSA Blue Waters supercomputer. These DEMs have known biases of several meters due to errors in the sensor models generated from satellite positioning. These systematic errors are removed through three-dimensional registration to high-precision Lidar or other control datasets. ArcticDEM is registered to seasonally-subsetted ICESat elevations due to their global coverage and high reported accuracy (~10 cm). The vertical accuracy of ArcticDEM is then obtained from the statistics of the fit to the ICESat point cloud, which averages -0.01 m ± 0.07 m. ICESat, however, has a relatively coarse measurement footprint (~70 m), which may impact the precision of the registration. Further, the ICESat data predate the ArcticDEM imagery by a decade, so that temporal changes in the surface may also impact the registration. Finally, biases may exist between the different sensors in the ArcticDEM constellation. Here we assess the accuracy of ArcticDEM and the ICESat registration through comparison to multiple high-resolution airborne lidar datasets that were acquired within one year of the imagery used in ArcticDEM. We find the ICESat dataset performs as anticipated, introducing no systematic bias during the coregistration process and reducing vertical errors to within the uncertainty of the airborne lidars.
Preliminary sensor comparisons show no significant difference after coregistration, suggesting that there is no sensor bias between platforms and all data are suitable for analysis without further correction. Here we will present accuracy assessments, observations and comparisons over diverse terrain in parts of Alaska and Greenland.
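The registration statistics quoted above (e.g., -0.01 m ± 0.07 m) are the mean and spread of DEM-minus-reference elevation differences at common points. A simple vertical-only sketch; ArcticDEM's actual registration is three-dimensional, so this is a deliberately reduced illustration:

```python
import numpy as np

def vertical_bias_and_spread(dem_z, ref_z):
    """Mean vertical bias and standard deviation between DEM elevations
    and reference elevations (e.g., ICESat or airborne lidar) at
    co-located points, ignoring missing values."""
    diff = np.asarray(dem_z, float) - np.asarray(ref_z, float)
    diff = diff[np.isfinite(diff)]
    return float(diff.mean()), float(diff.std())

def remove_bias(dem_z, bias):
    """Apply the simplest (vertical-only) registration correction."""
    return np.asarray(dem_z, float) - bias
```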
Influence of reanalysis datasets on dynamically downscaling the recent past
NASA Astrophysics Data System (ADS)
Moalafhi, Ditiro B.; Evans, Jason P.; Sharma, Ashish
2017-08-01
Multiple reanalysis datasets currently exist that can provide boundary conditions for dynamical downscaling and simulating local hydro-climatic processes at finer spatial and temporal resolutions. Previous work has suggested that two reanalysis alternatives provide the best lateral boundary conditions for downscaling over southern Africa. This study dynamically downscales these reanalyses (ERA-I and MERRA) over southern Africa to a high resolution (10 km) grid using the WRF model. Simulations cover the period 1981-2010. Multiple observation datasets were used for both surface temperature and precipitation to account for observational uncertainty when assessing results. Generally, temperature is simulated quite well, except over the Namibian coastal plain, where the simulations show anomalously warm temperatures related to the failure to propagate the influence of the cold Benguela current inland. Precipitation tends to be overestimated in high altitude areas and most of southern Mozambique. This could be attributed to challenges in handling complex topography and capturing large-scale circulation patterns. While MERRA-driven WRF exhibits slightly less bias in temperature, especially for La Niña years, ERA-I-driven simulations are on average superior in terms of RMSE. When considering multiple variables and metrics, ERA-I is found to produce the best simulation of the climate over the domain. The influence of the regional model appears to be large enough to overcome the small difference in relative errors present in the lateral boundary conditions derived from these two reanalyses.
The NLCD-MODIS land cover-albedo database integrates high-quality MODIS albedo observations with areas of homogeneous land cover from NLCD. The spatial resolution (pixel size) of the database is 480 m × 480 m, aligned to the standard USGS Albers Equal-Area projection. The spatial extent of the database is the continental United States. This dataset is associated with the following publication: Wickham, J., C.A. Barnes, and T. Wade. Combining NLCD and MODIS to Create a Land Cover-Albedo Dataset for the Continental United States. Remote Sensing of Environment, 170: 143-153, (2015).
NASA Astrophysics Data System (ADS)
Yamamoto, Kristina H.; Anderson, Sharolyn J.; Sutton, Paul C.
2015-10-01
Sea turtle nesting beaches in southeastern Florida were evaluated for changes from 1999 to 2005 using LiDAR datasets. Changes in beach volume were correlated with changes in several elevation-derived characteristics, such as elevation and slope. In addition, these changes in beach geomorphology were correlated with changes in nest success, illustrating that beach alterations may affect sea turtle nesting behavior. The ability to use LiDAR datasets to quickly and efficiently conduct beach comparisons for habitat use represents another benefit of this high spatial resolution data.
Dynamic Moss Observed with Hi-C
NASA Technical Reports Server (NTRS)
Alexander, Caroline; Winebarger, Amy; Morton, Richard; Savage, Sabrina
2014-01-01
The High-resolution Coronal Imager (Hi-C), flown on 11 July 2012, has revealed an unprecedented level of detail and substructure within the solar corona. Hi--C imaged a large active region (AR11520) with 0.2-0.3'' spatial resolution and 5.5s cadence over a 5 minute period. An additional dataset with a smaller FOV, the same resolution, but with a higher temporal cadence (1s) was also taken during the rocket flight. This dataset was centered on a large patch of 'moss' emission that initially seemed to show very little variability. Image processing revealed this region to be much more dynamic than first thought with numerous bright and dark features observed to appear, move and disappear over the 5 minute observation. Moss is thought to be emission from the upper transition region component of hot loops so studying its dynamics and the relation between the bright/dark features and underlying magnetic features is important to tie the interaction of the different atmospheric layers together. Hi-C allows us to study the coronal emission of the moss at the smallest scales while data from SDO/AIA and HMI is used to give information on these structures at different heights/temperatures. Using the high temporal and spatial resolution of Hi-C the observed moss features were tracked and the distribution of displacements, speeds, and sizes were measured. This allows us to comment on both the physical processes occurring within the dynamic moss and the scales at which these changes are occurring.
Practical considerations for coil-wrapped Distributed Temperature Sensing setups
NASA Astrophysics Data System (ADS)
Solcerova, Anna; van Emmerik, Tim; Hilgersom, Koen; van de Giesen, Nick
2015-04-01
Fiber-optic Distributed Temperature Sensing (DTS) has been applied widely in hydrological and meteorological systems. For example, DTS has been used to measure streamflow, groundwater, soil moisture and temperature, air temperature, and lake energy fluxes. Many of these applications require a spatial monitoring resolution smaller than the minimum resolution of the DTS device. Therefore, measuring at these resolutions requires a custom-made setup. To obtain both high temporal and high spatial resolution temperature measurements, fiber-optic cable is often wrapped around, and glued to, a coil, for example a PVC conduit. For these setups, it is often assumed that the construction characteristics (e.g., the coil material, shape, diameter) do not influence the DTS temperature measurements significantly. This study compares DTS datasets obtained during four measurement campaigns. The datasets were acquired using different setups, allowing us to investigate the influence of the construction characteristics on the monitoring results. This comparative study suggests that the construction material, shape, diameter, and method of attachment can have a significant influence on the results. We present a qualitative and quantitative approximation of the errors introduced through the selection of the construction, e.g., choice of coil material, influence of solar radiation, coil diameter, and cable attachment method. Our aim is to provide insight into factors that influence DTS measurements, which designers of future DTS measurement setups can take into account. Moreover, we present a number of solutions to minimize these errors for improved temperature retrieval using DTS.
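The resolution gain from coil wrapping is simple geometry: one wrap consumes roughly π·D of fibre while advancing the measurement by one winding pitch vertically. A minimal sketch of that calculation (the 0.1 m conduit diameter and 5 mm pitch below are illustrative numbers, not values from the study):

```python
import math

def coil_vertical_resolution(sampling_interval_m, coil_diameter_m, pitch_m):
    """Effective vertical resolution of a coil-wrapped DTS cable.

    One wrap consumes ~pi * D of cable while advancing one pitch
    vertically, so a device sampling every `sampling_interval_m`
    along the fibre resolves sampling_interval_m * pitch / (pi * D)
    in the vertical.
    """
    cable_per_metre_height = math.pi * coil_diameter_m / pitch_m
    return sampling_interval_m / cable_per_metre_height

# A 1 m sampling interval wrapped on a 0.1 m conduit with a 5 mm pitch
# yields roughly 1.6 cm vertical resolution:
dz = coil_vertical_resolution(1.0, 0.10, 0.005)
```

The same relation also shows why construction details matter: a larger coil diameter packs more cable per metre of height, trading total coverable height for finer resolution.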
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Kumar, Sujay V.; Kuligowski, Robert J.; Langston, Carrie
2013-01-01
This paper and poster present a description of the current real-time SPoRT-LIS run over the southeastern CONUS, which provides high-resolution land surface initialization grids for local numerical model forecasts at NWS forecast offices. The LIS hourly output also offers a supplemental dataset to aid in situational awareness for convective initiation forecasts, assessing flood potential, and monitoring drought at fine scales. It is a goal of SPoRT and several NWS forecast offices to expand the LIS to an entire CONUS domain, so that LIS output can be utilized by NWS Western Region offices, among others. To ensure this expansion provides high-quality land surface output, SPoRT tested new precipitation datasets in LIS as alternative forcing to the current radar+gauge Stage IV product. The NMQ product showed precipitation and soil moisture patterns comparable to the Stage IV product, but suffered from radar gaps in the intermountain West and incorrectly set values to zero instead of missing in the data-void regions of Mexico and Canada. The other dataset tested was the next-generation GOES-R QPE algorithm, which exhibited a high bias in both coverage and intensity of accumulated precipitation relative to the control (NLDAS-2), Stage IV, and NMQ simulations. The resulting root zone soil moisture was substantially higher in most areas.
High Resolution Mapping of Wetland Ecosystems SPOT-5 Take 5 for Evaluation of Sentinel-2
NASA Astrophysics Data System (ADS)
Ade, Christiana; Hestir, Erin L.; Khanna, Shruti; Ustin, Susan L.
2016-08-01
Around the world wetlands are critical to human societies and ecosystems, providing services such as habitat, water, food and fiber, flood and nutrient control, and cultural, recreational and religious value. However, the dynamic nature of tidal wetlands makes measuring ecosystem responses to climate change, seasonal inundation regimes, and anthropogenic disturbance from current and previous Earth observing sensors challenging due to limited spatial and temporal resolutions. Sentinel-2 will directly address this challenge by providing high spatial resolution data with a frequent revisit time. This pilot study aims to develop methodology for future Sentinel-2 products and highlight the variability of tidal wetland ecosystems, thereby demonstrating the necessity of improved spatial and, particularly, temporal resolution. Here the simulated Sentinel-2 dataset from the SPOT-5 Take 5 experiment reveals the capacity of the new sensor to simultaneously assess tidal wetland ecosystem phenology and water quality in inland waters.
NASA Astrophysics Data System (ADS)
Kganyago, Mahlatse; Odindi, John; Adjorlolo, Clement; Mhangara, Paidamoyo
2018-05-01
Globally, there is a paucity of accurate information on the spatial distribution and patch sizes of Invasive Alien Plant (IAP) species. Such information is needed to aid optimisation of control mechanisms to prevent further spread of IAPs and minimize their impacts. Recent studies have shown the capability of very high spatial (<1 m) and spectral resolution (<10 nm) data for discriminating vegetation species. However, very high spatial resolution may introduce significant intra-species spectral variability and result in reduced mapping accuracy, while higher spectral resolution data are commonly limited to smaller areas, are costly and computationally expensive. Alternatively, medium and high spatial resolution data are available at low or no cost but have rarely been evaluated for their potential in determining invasion patterns relevant for invasion ecology and aiding effective IAP management. In this study, medium and high resolution datasets from the Landsat Operational Land Imager (OLI) and SPOT 6 sensors, respectively, were evaluated for mapping the distribution and patch sizes of the IAP Parthenium hysterophorus in the savannah landscapes of KwaZulu-Natal, South Africa. A Support Vector Machine (SVM) classifier was used for the classification of both datasets. Results indicated that SPOT 6 had a higher overall accuracy (86%) than OLI (83%) in mapping P. hysterophorus. The study found larger distributions and patch sizes in OLI than in SPOT 6, likely because of P. hysterophorus expansion during the interval between image acquisitions and because the coarser pixels were insufficient to delineate gaps inside larger patches. On the other hand, SPOT 6 showed better capabilities of delineating gaps and boundaries of patches, and hence gave better estimates of distribution and patch sizes.
Overall, the study showed that OLI may be suitable for mapping well-established patches for the purpose of large scale monitoring, while SPOT 6 can be used for mapping small patches and prioritising them for eradication to prevent further spread at a landscape scale.
NASA Astrophysics Data System (ADS)
Zolina, Olga; Simmer, Clemens; Kapala, Alice; Mächel, Hermann; Gulev, Sergey; Groisman, Pavel
2014-05-01
We present new high-resolution daily precipitation grids developed at the Meteorological Institute, University of Bonn and the German Weather Service (DWD) under the STAMMEX project (Spatial and Temporal Scales and Mechanisms of Extreme Precipitation Events over Central Europe). The daily precipitation grids have been developed from the daily-observing precipitation network of DWD, which runs one of the world's densest rain gauge networks, comprising more than 7500 stations. Several quality-controlled daily gridded products with homogenized sampling were developed, covering the periods 1931-onwards (0.5 degree resolution), 1951-onwards (0.25 degree and 0.5 degree), and 1971-2000 (0.1 degree). Different methods were tested to select the gridding methodology that minimizes errors of integral grid estimates over hilly terrain. Besides daily precipitation values with uncertainty estimates (which include standard estimates of the kriging uncertainty as well as error estimates derived by a bootstrapping algorithm), the STAMMEX datasets include a variety of statistics that characterize the temporal and spatial dynamics of the precipitation distribution (quantiles, extremes, wet/dry spells, etc.). Comparisons with existing continental-scale daily precipitation grids (e.g., CRU, ECA E-OBS, GCOS), which include considerably fewer observations than those used in STAMMEX, demonstrate the added value of high-resolution grids for extreme rainfall analyses. These data exhibit spatial variability patterns and trends in precipitation extremes that are missed or incorrectly reproduced over Central Europe by coarser resolution grids based on sparser networks. The STAMMEX dataset can be used for high-quality climate diagnostics of precipitation variability, as a reference for reanalyses and remotely-sensed precipitation products (including the upcoming Global Precipitation Mission products), and as input into regional climate and operational weather forecast models.
We will present numerous applications of the STAMMEX grids, spanning from case studies of the major Central European floods to long-term changes in different precipitation statistics, including those accounting for the alternation of dry and wet periods and precipitation intensities associated with prolonged rainy episodes.
Selkowitz, D.J.
2010-01-01
Shrub cover appears to be increasing across many areas of the Arctic tundra biome, and increasing shrub cover in the Arctic has the potential to significantly impact global carbon budgets and the global climate system. For most of the Arctic, however, there is no existing baseline inventory of shrub canopy cover, as existing maps of Arctic vegetation provide little information about the density of shrub cover at a moderate spatial resolution across the region. Remotely-sensed fractional shrub canopy maps can provide this necessary baseline inventory of shrub cover. In this study, we compare the accuracy of fractional shrub canopy (> 0.5 m tall) maps derived from multi-spectral, multi-angular, and multi-temporal datasets from Landsat imagery at 30 m spatial resolution, Moderate Resolution Imaging SpectroRadiometer (MODIS) imagery at 250 m and 500 m spatial resolution, and MultiAngle Imaging Spectroradiometer (MISR) imagery at 275 m spatial resolution for a 1067 km² study area in Arctic Alaska. The study area is centered at 69° N, ranges in elevation from 130 to 770 m, is composed primarily of rolling topography with gentle slopes less than 10°, and is free of glaciers and perennial snow cover. Shrubs > 0.5 m in height cover 2.9% of the study area and are primarily confined to patches associated with specific landscape features. Reference fractional shrub canopy is determined from in situ shrub canopy measurements and a high spatial resolution IKONOS image swath. Regression tree models are constructed to estimate fractional canopy cover at 250 m using different combinations of input data from Landsat, MODIS, and MISR. Results indicate that multi-spectral data provide substantially more accurate estimates of fractional shrub canopy cover than multi-angular or multi-temporal data.
Higher spatial resolution datasets also provide more accurate estimates of fractional shrub canopy cover (aggregated to moderate spatial resolutions) than lower spatial resolution datasets, an expected result for a study area where most shrub cover is concentrated in narrow patches associated with rivers, drainages, and slopes. Including the middle infrared bands available from Landsat and MODIS in the regression tree models (in addition to the four standard visible and near-infrared spectral bands) typically results in a slight boost in accuracy. Including the multi-angular red band data available from MISR in the regression tree models, however, typically boosts accuracy more substantially, resulting in moderate resolution fractional shrub canopy estimates approaching the accuracy of estimates derived from the much higher spatial resolution Landsat sensor. Given the poor availability of snow and cloud-free Landsat scenes in many areas of the Arctic and the promising results demonstrated here by the MISR sensor, MISR may be the best choice for large area fractional shrub canopy mapping in the Alaskan Arctic for the period 2000-2009.
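As a rough illustration of the regression-tree approach described above, the sketch below fits a tiny CART-style tree to synthetic band reflectances. The bands, the NDVI-driven "truth", and all thresholds are invented stand-ins, not the study's data or code:

```python
import numpy as np

def fit_tree(X, y, depth=4, min_leaf=10):
    """Tiny CART-style regression tree: greedily pick the feature/threshold
    split that minimizes the summed squared error of the two children."""
    if depth == 0 or len(y) < 2 * min_leaf:
        return {"leaf": float(y.mean())}
    best = None
    for f in range(X.shape[1]):
        for t in np.quantile(X[:, f], [0.25, 0.5, 0.75]):
            left = X[:, f] <= t
            nl = int(left.sum())
            if min_leaf <= nl <= len(y) - min_leaf:
                sse = ((y[left] - y[left].mean()) ** 2).sum() \
                    + ((y[~left] - y[~left].mean()) ** 2).sum()
                if best is None or sse < best[0]:
                    best = (sse, f, float(t))
    if best is None:
        return {"leaf": float(y.mean())}
    _, f, t = best
    left = X[:, f] <= t
    return {"feat": f, "thr": t,
            "lo": fit_tree(X[left], y[left], depth - 1, min_leaf),
            "hi": fit_tree(X[~left], y[~left], depth - 1, min_leaf)}

def predict(tree, x):
    while "leaf" not in tree:
        tree = tree["lo"] if x[tree["feat"]] <= tree["thr"] else tree["hi"]
    return tree["leaf"]

# synthetic red/NIR reflectances and a toy canopy-fraction response
rng = np.random.default_rng(0)
red = rng.uniform(0.02, 0.15, 400)
nir = rng.uniform(0.15, 0.45, 400)
ndvi = (nir - red) / (nir + red)
canopy = np.clip(0.8 * ndvi - 0.3 + rng.normal(0, 0.02, 400), 0, 1)

X = np.column_stack([red, nir, ndvi])
tree = fit_tree(X, canopy)
preds = np.array([predict(tree, x) for x in X])
```

In practice one would use a mature implementation with pruning or ensembling, but the greedy split-on-SSE loop is the core of the regression-tree technique the study applies to Landsat, MODIS, and MISR inputs.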
Science Enabling Applications of Gridded Radiances and Products
NASA Astrophysics Data System (ADS)
Goldberg, M.; Wolf, W.; Zhou, L.
2005-12-01
New generations of hyperspectral sounders and imagers are not only providing vastly improved information to monitor, assess and predict the Earth's environment, they also provide tremendous volumes of data to manage. Key management challenges include data processing, distribution, archiving and utilization. At the NOAA/NESDIS Office of Research and Applications, we have started to address the challenge of utilizing high-volume satellite data by thinning observations and developing gridded datasets from the observations made by the NASA AIRS, AMSU and MODIS instruments. We have developed techniques for intelligent thinning of AIRS data for numerical weather prediction by selecting the clearest AIRS 14 km field of view within a 3 x 3 array. The selection uses high spatial resolution 1 km MODIS data which are spatially convolved to the AIRS field of view. The MODIS cloud masks and AIRS cloud tests are used to select the clearest field of view. During real-time processing the data are thinned and gridded to support monitoring, validation and scientific studies. Products from AIRS, which include profiles of temperature, water vapor and ozone and cloud-corrected infrared radiances for more than 2000 channels, are derived from a single AIRS/AMSU field of regard, which is a 3 x 3 array of AIRS footprints (each with a 14 km spatial resolution) collocated with a single AMSU footprint (42 km). One of our key gridded datasets is a daily 3 x 3 degree latitude/longitude grid which contains the nearest AIRS/AMSU field of regard with respect to the center of each 3 x 3 lat/lon grid cell. This particular gridded dataset is 1/40 the size of the full resolution data. This gridded dataset is the type of product that can be used to support algorithm validation and improvements. It also provides a very economical approach for reprocessing, testing and improving algorithms for climate studies without having to reprocess the full resolution data stored at the DAAC.
For example, on a single-CPU workstation, all the AIRS derived products can be computed from a single year of gridded data in 5 days. This relatively short turnaround time, which can be reduced to about 3 hours by using a cluster of 40 PowerPC G5 processors, allows for repeated reprocessing at the PI's home institution before substantial investments are made to reprocess the full resolution datasets archived at the DAAC. In other words, the full resolution data need not be reprocessed until the science community has tested and selected the optimal algorithm on the gridded data. Development and applications of gridded radiances and products will be discussed. The applications can be provided as part of a web-based service.
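The "clearest of 3 x 3" thinning step lends itself to a short sketch. The block below is a toy version under simplifying assumptions: a plain 2-D cloud-fraction score per field of view stands in for the combined MODIS cloud masks and AIRS cloud tests:

```python
import numpy as np

def thin_clearest(cloud_fraction, radiance, block=3):
    """For each block x block array of fields of view, keep the radiance
    of the FOV with the lowest cloud-fraction score (a toy version of
    the 'clearest of 3x3' thinning described above)."""
    ny, nx = cloud_fraction.shape
    ny, nx = ny - ny % block, nx - nx % block  # trim ragged edges
    out = np.empty((ny // block, nx // block))
    for j in range(0, ny, block):
        for i in range(0, nx, block):
            cf = cloud_fraction[j:j + block, i:i + block]
            jj, ii = np.unravel_index(np.argmin(cf), cf.shape)
            out[j // block, i // block] = radiance[j + jj, i + ii]
    return out

# toy 9x9 granule: random cloudiness scores, synthetic radiances
cf = np.random.default_rng(1).uniform(size=(9, 9))
rad = np.arange(81.0).reshape(9, 9)
thinned = thin_clearest(cf, rad)  # one clearest value per 3x3 block
```

The real selection operates on convolved 1 km MODIS data per 14 km AIRS footprint, but the reduction pattern, one retained observation per 3 x 3 array, is the same.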
Network analysis reveals multiscale controls on streamwater chemistry
Kevin J. McGuire; Christian E. Torgersen; Gene E. Likens; Donald C. Buso; Winsor H. Lowe; Scott W. Bailey
2014-01-01
By coupling synoptic data from a basin-wide assessment of streamwater chemistry with network-based geostatistical analysis, we show that spatial processes differentially affect biogeochemical condition and pattern across a headwater stream network. We analyzed a high-resolution dataset consisting of 664 water samples collected every 100 m throughout 32 tributaries in...
NASA Technical Reports Server (NTRS)
Rogers, D.; Christensen, P.; Bandfield, J. L.
2001-01-01
MGS TES data is used at high resolution to map small regions of basalt in Mars' northern hemisphere. With the exception of 2 outliers, the northern extent of the highland basalt appears to correspond with the northern edge of the cratered highlands. Additional information is contained in the original extended abstract.
NASA Astrophysics Data System (ADS)
Elarab, Manal; Ticlavilca, Andres M.; Torres-Rua, Alfonso F.; Maslova, Inga; McKee, Mac
2015-12-01
Precision agriculture requires high-resolution information to enable greater precision in the management of inputs to production. Actionable information about crop and field status must be acquired at high spatial resolution and at a temporal frequency appropriate for timely responses. In this study, high spatial resolution imagery was obtained through the use of a small, unmanned aerial system called AggieAir™. Simultaneously with the AggieAir flights, intensive ground sampling for plant chlorophyll was conducted at precisely determined locations. This study reports the application of a relevance vector machine coupled with cross validation and backward elimination to a dataset composed of reflectance from high-resolution multi-spectral imagery (VIS-NIR), thermal infrared imagery, and vegetative indices, in conjunction with in situ SPAD measurements from which chlorophyll concentrations were derived, to estimate chlorophyll concentration from remotely sensed data at 15-cm resolution. The results indicate that a relevance vector machine with a thin plate spline kernel type and kernel width of 5.4, having LAI, NDVI, thermal and red bands as the selected set of inputs, can be used to spatially estimate chlorophyll concentration with a root-mean-squared-error of 5.31 μg cm-2, efficiency of 0.76, and 9 relevance vectors.
The Spectral Image Processing System (SIPS): Software for integrated analysis of AVIRIS data
NASA Technical Reports Server (NTRS)
Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.
1992-01-01
The Spectral Image Processing System (SIPS) is a software package developed by the Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, in response to a perceived need to provide integrated tools for analysis of imaging spectrometer data both spectrally and spatially. SIPS was specifically designed to deal with data from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the High Resolution Imaging Spectrometer (HIRIS), but was tested with other datasets including the Geophysical and Environmental Research Imaging Spectrometer (GERIS), GEOSCAN images, and Landsat TM. SIPS was developed using the 'Interactive Data Language' (IDL). It takes advantage of high speed disk access and fast processors running under the UNIX operating system to provide rapid analysis of entire imaging spectrometer datasets. SIPS allows analysis of single or multiple imaging spectrometer data segments at full spatial and spectral resolution. It also allows visualization and interactive analysis of image cubes derived from quantitative analysis procedures such as absorption band characterization and spectral unmixing. SIPS consists of three modules: SIPS Utilities, SIPS_View, and SIPS Analysis. SIPS version 1.1 is described below.
High-resolution data on the impact of warming on soil CO2 efflux from an Asian monsoon forest
Liang, Naishen; Teramoto, Munemasa; Takagi, Masahiro; Zeng, Jiye
2017-01-01
This paper describes a project for evaluation of global warming's impacts on soil carbon dynamics in Japanese forest ecosystems. We started a soil warming experiment in late 2008 in a 55-year-old evergreen broad-leaved forest at the boundary between the subtropical and warm-temperate biomes in southern Japan. We used infrared carbon-filament heat lamps to increase soil temperature by about 2.5 °C at a depth of 5 cm and continuously recorded CO2 emission from the soil surface using a multichannel automated chamber system. Here, we present details of the experimental processes and datasets for the CO2 emission rate, soil temperature, and soil moisture from control, trenched, and warmed trenched plots. The long duration of the study and its high resolution make the datasets valuable for use in, or development of, coupled climate-ecosystem models: to tune their dynamic behaviour, and to provide mean parameters for the decomposition of soil organic carbon in support of future predictions of soil carbon sequestration. PMID:28291228
Map Matching and Real World Integrated Sensor Data Warehousing (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burton, E.
2014-02-01
The inclusion of interlinked temporal and spatial elements within integrated sensor data enables a tremendous degree of flexibility when analyzing multi-component datasets. The presentation illustrates how to warehouse, process, and analyze high-resolution integrated sensor datasets to support complex system analysis at the entity and system levels. The example cases presented utilize in-vehicle sensor system data to assess vehicle performance, while integrating a map matching algorithm to link vehicle data to roads to demonstrate the enhanced analysis possible via interlinking data elements. Furthermore, in addition to the flexibility provided, the examples presented illustrate concepts of maintaining proprietary operational information (Fleet DNA) and the privacy of study participants (Transportation Secure Data Center) while producing widely distributed data products. Should real-time operational data be logged at high resolution across multiple infrastructure types, map matched to the associated infrastructure, and distributed using a similar approach, dependencies between urban infrastructure components could be better understood. This understanding is especially crucial for the cities of the future, where transportation will rely more on grid infrastructure to support its energy demands.
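A map matching step like the one described can be reduced, in its most naive form, to snapping each GPS fix to the closest road segment. The sketch below is that naive geometric version only; production map matchers also use heading, road topology, and HMM-style sequence constraints:

```python
import math

def snap_to_segment(p, a, b):
    """Project point p onto segment a-b; return (snapped_point, distance)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:           # degenerate zero-length segment
        t = 0.0
    else:                        # clamp the projection parameter to [0, 1]
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    sx, sy = ax + t * dx, ay + t * dy
    return (sx, sy), math.hypot(px - sx, py - sy)

def map_match(point, road_segments):
    """Naive map matching: snap a GPS fix to the closest road segment."""
    return min((snap_to_segment(point, a, b) + (idx,)
                for idx, (a, b) in enumerate(road_segments)),
               key=lambda r: r[1])

# two toy east-west roads; the fix at (3, 1) snaps to the nearer one
roads = [((0, 0), (10, 0)), ((0, 5), (10, 5))]
snapped, dist, road_id = map_match((3.0, 1.0), roads)
```

Once each fix carries a `road_id`, vehicle records can be joined to road attributes in the warehouse, which is the interlinking the presentation describes.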
NASA Technical Reports Server (NTRS)
Lolli, Simone; Welton, Ellsworth J.; Campbell, James R.; Eloranta, Edwin; Holben, Brent N.; Chew, Boon Ning; Salinas, Santo V.
2014-01-01
From August 2012 to February 2013 a High Spectral Resolution Lidar (HSRL; 532 nm) was deployed at the National University of Singapore near a NASA Micro Pulse Lidar NETwork (MPLNET; 527 nm) site. A primary objective of the MPLNET lidar project is the production and dissemination of reliable Level 1 measurements and Level 2 retrieval products. This paper characterizes and quantifies error in Level 2 aerosol optical property retrievals conducted through inversion techniques that derive backscattering and extinction coefficients from MPLNET elastic single-wavelength datasets. MPLNET Level 2 retrievals for aerosol optical depth and extinction/backscatter coefficient profiles are compared with corresponding HSRL datasets, for which the instrument collects direct measurements of each using a unique optical configuration that segregates aerosol and cloud backscattered signal from molecular signal. The intercomparison is performed, and error matrices reported, for the lower (0-5 km) and upper (>5 km) troposphere, respectively, to distinguish uncertainties observed within and above the MPLNET instrument optical overlap regime.
Müllenbroich, M Caroline; Silvestri, Ludovico; Onofri, Leonardo; Costantini, Irene; Hoff, Marcel Van't; Sacconi, Leonardo; Iannello, Giulio; Pavone, Francesco S
2015-10-01
Comprehensive mapping and quantification of neuronal projections in the central nervous system requires high-throughput imaging of large volumes with microscopic resolution. To this end, we have developed a confocal light-sheet microscope that has been optimized for three-dimensional (3-D) imaging of structurally intact clarified whole-mount mouse brains. We describe the optical and electromechanical arrangement of the microscope and give details on the organization of the microscope management software. The software orchestrates all components of the microscope, coordinates critical timing and synchronization, and has been written in a versatile and modular structure using the LabVIEW language. It can easily be adapted and integrated to other microscope systems and has been made freely available to the light-sheet community. The tremendous amount of data routinely generated by light-sheet microscopy further requires novel strategies for data handling and storage. To complete the full imaging pipeline of our high-throughput microscope, we further elaborate on big data management from streaming of raw images up to stitching of 3-D datasets. The mesoscale neuroanatomy imaged at micron-scale resolution in those datasets allows characterization and quantification of neuronal projections in unsectioned mouse brains.
NASA Astrophysics Data System (ADS)
Laiti, Lavinia; Giovannini, Lorenzo; Zardi, Dino
2015-04-01
The accurate assessment of the solar radiation available at the Earth's surface is essential for a wide range of energy-related applications, such as the design of solar power plants, water heating systems and energy-efficient buildings, as well as in the fields of climatology, hydrology, ecology and agriculture. The characterization of solar radiation is particularly challenging in areas of complex orography, where topographic shadowing and altitude effects, together with local weather phenomena, greatly increase the spatial and temporal variability of this variable. At present, approaches ranging from interpolation of surface measurements, to orographic downscaling of satellite data, to numerical model simulations are adopted for mapping solar radiation. In this contribution a high-resolution (200 m) solar atlas for the Trentino region (Italy) is presented, which was recently developed on the basis of hourly observations of global radiation collected from the local radiometric stations during the period 2004-2012. Monthly and annual climatological irradiation maps were obtained by the combined use of a GIS-based clear-sky model (r.sun module of GRASS GIS) and geostatistical interpolation techniques (kriging). Moreover, satellite radiation data derived by the MeteoSwiss HelioMont algorithm (2 km resolution) were used for missing-data reconstruction and for the final mapping, thus integrating ground-based and remote-sensing information. The results are compared with existing solar resource datasets, such as the PVGIS dataset, produced by the Joint Research Center Institute for Energy and Transport, and the HelioMont dataset, in order to evaluate the accuracy of the different datasets available for the region of interest.
Datasets, Technologies and Products from the NASA/NOAA Electronic Theater 2002
NASA Technical Reports Server (NTRS)
Hasler, A. Fritz; Starr, David (Technical Monitor)
2001-01-01
An in-depth look at the Earth Science datasets used in the E-Theater visualizations will be presented. This will include the satellite orbits, platforms, scan patterns, the size, temporal and spatial resolution, and compositing techniques used to obtain the datasets, as well as the spectral bands utilized.
Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.
2012-01-01
Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394
Wells, Darren M; French, Andrew P; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein I; Bennett, Malcolm J; Pridmore, Tony P
2012-06-05
Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana.
SDCLIREF - A sub-daily gridded reference dataset
NASA Astrophysics Data System (ADS)
Wood, Raul R.; Willkofer, Florian; Schmid, Franz-Josef; Trentini, Fabian; Komischke, Holger; Ludwig, Ralf
2017-04-01
Climate change is expected to impact the intensity and frequency of hydrometeorological extreme events. In order to adequately capture and analyze extreme rainfall events, in particular when assessing flood and flash flood situations, data is required at high spatial and sub-daily resolution, which is often not available in sufficient density and over extended time periods. The ClimEx project (Climate Change and Hydrological Extreme Events) addresses the alteration of hydrological extreme events under climate change conditions. In order to differentiate between a clear climate change signal and the limits of natural variability, unique single-model Regional Climate Model ensembles (CRCM5 driven by CanESM2, RCP8.5) were created for a European and a North-American domain, each comprising 50 members of 150 years (1951-2100). In combination with the CORDEX database, this newly created ClimEx ensemble is a one-of-a-kind model dataset to analyze changes of sub-daily extreme events. For the purpose of bias-correcting the regional climate model ensembles, as well as for the baseline calibration and validation of hydrological catchment models, a new sub-daily (3 h) high-resolution (500 m) gridded reference dataset (SDCLIREF) was created for a domain covering the Upper Danube and Main watersheds (~100,000 km²). As the sub-daily observations lack a continuous time series for the reference period 1980-2010, the need arose for a suitable method to bridge the gaps in the discontinuous time series. The Method of Fragments (Sharma and Srikanthan (2006); Westra et al. (2012)) was applied to transform daily observations to sub-daily rainfall events, extending the time series and densifying the station network.
Prior to applying the Method of Fragments and creating the gridded dataset using rigorous interpolation routines, data collection of observations, operated by several institutions in three countries (Germany, Austria, Switzerland), and the subsequent quality control of the observations was carried out. Among others, the quality control checked for steps, extensive dry seasons, temporal consistency and maximum hourly values. The resulting SDCLIREF dataset provides a robust precipitation reference for hydrometeorological applications in unprecedented high spatio-temporal resolution. References: Sharma, A.; Srikanthan, S. (2006): Continuous Rainfall Simulation: A Nonparametric Alternative. In: 30th Hydrology and Water Resources Symposium 4-7 December 2006, Launceston, Tasmania. Westra, S.; Mehrotra, R.; Sharma, A.; Srikanthan, R. (2012): Continuous rainfall simulation. 1. A regionalized subdaily disaggregation approach. In: Water Resour. Res. 48 (1). DOI: 10.1029/2011WR010489.
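The core of the Method of Fragments can be sketched in a few lines: a donor day's sub-daily values are converted to fractions of its daily total (the "fragments") and rescaled to the daily total of the day being disaggregated. This toy version omits the donor-day selection (matching by season, daily amount, and neighbouring stations) that the real method depends on:

```python
import numpy as np

def disaggregate_daily(daily_total, donor_subdaily):
    """Method-of-Fragments style disaggregation (simplified sketch):
    scale a donor day's sub-daily pattern to the target daily total.

    The 'fragments' are the donor's sub-daily values expressed as
    fractions of the donor's own daily sum, so the output always
    sums back to `daily_total`.
    """
    donor = np.asarray(donor_subdaily, dtype=float)
    if donor.sum() == 0:
        return np.zeros_like(donor)   # dry donor day: nothing to distribute
    fragments = donor / donor.sum()
    return daily_total * fragments

# donor station recorded eight 3-hourly values on a 'similar' day;
# a 16.0 mm daily observation is spread following that pattern
donor = [0, 0, 1.2, 3.6, 2.4, 0.8, 0, 0]
subdaily = disaggregate_daily(16.0, donor)
```

Because the fragments only carry the within-day shape, the daily observation itself is conserved exactly, which is what makes the approach attractive for bridging gaps in sub-daily records with the much denser daily network.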
Multiplatform observations enabling albedo retrievals with high temporal resolution
NASA Astrophysics Data System (ADS)
Riihelä, Aku; Manninen, Terhikki; Key, Jeffrey; Sun, Qingsong; Sütterlin, Melanie; Lattanzio, Alessio; Schaaf, Crystal
2017-04-01
In this paper we show that combining observations from different polar orbiting satellite families (such as AVHRR and MODIS) is physically justifiable and technically feasible. Our proposed approach will lead to surface albedo retrievals at higher temporal resolution than the state of the art, with comparable or better accuracy. This study is carried out in the World Meteorological Organization (WMO) Sustained and coordinated processing of Environmental Satellite data for Climate Monitoring (SCOPE-CM) project SCM-02 (http://www.scope-cm.org/projects/scm-02/). Following a spectral homogenization of the Top-of-Atmosphere reflectances of bands 1 & 2 from AVHRR and MODIS, both observation datasets are atmospherically corrected with a coherent atmospheric profile and algorithm. The resulting surface reflectances are then fed into an inversion of the RossThick-LiSparse-Reciprocal surface bidirectional reflectance distribution function (BRDF) model. The results of the inversion (BRDF kernels) may then be integrated to estimate various surface albedo quantities. A key principle here is that the larger number of valid surface observations with multiple satellites allows us to invert the BRDF coefficients within a shorter time span, enabling the monitoring of relatively rapid surface phenomena such as snowmelt. The proposed multiplatform approach is expected to bring benefits in particular to the observation of the albedo of the polar regions, where persistent cloudiness and long atmospheric path lengths present challenges to satellite-based retrievals. Following a similar logic, the retrievals over tropical regions with high cloudiness should also benefit from the method. We present results from a demonstrator dataset of a global combined AVHRR-GAC and MODIS dataset covering the year 2010. The retrieved surface albedo is compared against quality-monitored in situ albedo observations from the Baseline Surface Radiation Network (BSRN). 
Additionally, the combined retrieval dataset is compared against the MODIS C6 albedo/BRDF datasets to assess the quality of the multiplatform approach against the current state of the art. This approach is not limited to AVHRR and MODIS observations. Provided that the spectral homogenization produces an acceptably good match, any instrument observing the Earth's surface in the visible and near-infrared wavelengths could, in principle, be included to further enhance the temporal resolution and accuracy of the retrievals. The SCOPE-CM initiative provides a potential framework for such expansion in the future.
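The kernel-based retrieval described above reduces, per pixel and band, to a small linear least-squares problem. The sketch below assumes the RossThick and LiSparse-Reciprocal kernel values have already been computed from each observation's sun/view geometry, and uses the published angular integrals of those kernels for the white-sky albedo; it is a simplified illustration, not the SCOPE-CM processing code.

```python
import numpy as np

def invert_brdf(refl, k_vol, k_geo):
    """Least-squares inversion of the RossThick-LiSparse-Reciprocal model,
    R = f_iso + f_vol * K_vol + f_geo * K_geo, for one pixel and band.
    refl, k_vol, k_geo are per-observation arrays; the kernel values are
    assumed precomputed from each observation's sun/view geometry."""
    A = np.column_stack([np.ones_like(k_vol), k_vol, k_geo])
    f, *_ = np.linalg.lstsq(A, refl, rcond=None)
    return f  # (f_iso, f_vol, f_geo)

def white_sky_albedo(f):
    """Bihemispherical (white-sky) albedo from the kernel weights, using the
    published angular integrals of the RossThick and LiSparse-R kernels."""
    f_iso, f_vol, f_geo = f
    return f_iso + 0.189184 * f_vol - 1.377622 * f_geo
```

With more satellites contributing valid observations, the design matrix fills faster, which is exactly why the multiplatform approach shortens the inversion window.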
NASA Technical Reports Server (NTRS)
Lagomasino, David; Fatoyinbo, Temilola; Lee, SeungKuk; Feliciano, Emanuelle; Trettin, Carl; Simard, Marc
2016-01-01
Canopy height is one of the strongest predictors of biomass and carbon in forested ecosystems. Additionally, mangrove ecosystems represent one of the most concentrated carbon reservoirs that are rapidly degrading as a result of deforestation, development, and hydrologic manipulation. Therefore, accurate Canopy Height Models (CHM) over mangrove forests can provide crucial information for monitoring and verification protocols. We compared four CHMs derived from independent remotely sensed imagery and identified potential errors and bias between measurement types. CHMs were derived from three spaceborne datasets: Very-High Resolution (VHR) stereophotogrammetry, the TerraSAR-X add-on for Digital Elevation Measurement (TanDEM-X), and the Shuttle Radar Topography Mission (SRTM), and from lidar data acquired from an airborne platform. Each dataset exhibited different error characteristics that were related to spatial resolution, sensitivities of the sensors, and reference frames. Canopies over 10 meters were accurately predicted by all CHMs, while the distributions of canopy height were best predicted by the VHR CHM. Depending on the guidelines and strategies needed for monitoring and verification activities, coarse-resolution CHMs could be used to track canopy height at regional and global scales, with finer-resolution imagery used to validate and monitor critical areas undergoing rapid changes.
Lagomasino, David; Fatoyinbo, Temilola; Lee, SeungKuk; Feliciano, Emanuelle; Trettin, Carl; Simard, Marc
2017-01-01
Canopy height is one of the strongest predictors of biomass and carbon in forested ecosystems. Additionally, mangrove ecosystems represent one of the most concentrated carbon reservoirs that are rapidly degrading as a result of deforestation, development, and hydrologic manipulation. Therefore, accurate Canopy Height Models (CHM) over mangrove forests can provide crucial information for monitoring and verification protocols. We compared four CHMs derived from independent remotely sensed imagery and identified potential errors and bias between measurement types. CHMs were derived from three spaceborne datasets: Very-High Resolution (VHR) stereophotogrammetry, the TerraSAR-X add-on for Digital Elevation Measurement (TanDEM-X), and the Shuttle Radar Topography Mission (SRTM), and from lidar data acquired from an airborne platform. Each dataset exhibited different error characteristics that were related to spatial resolution, sensitivities of the sensors, and reference frames. Canopies over 10 m were accurately predicted by all CHMs, while the distributions of canopy height were best predicted by the VHR CHM. Depending on the guidelines and strategies needed for monitoring and verification activities, coarse-resolution CHMs could be used to track canopy height at regional and global scales, with finer-resolution imagery used to validate and monitor critical areas undergoing rapid changes. PMID:29629207
NASA Astrophysics Data System (ADS)
Mao, D.; Torrence, M. H.; Mazarico, E.; Neumann, G. A.; Smith, D. E.; Zuber, M. T.
2016-12-01
LRO has been in a polar lunar orbit for 7 years since its launch in June 2009. Seven instruments are onboard LRO to perform global and detailed geophysical, geological and geochemical mapping of the Moon, some at very high spatial resolution. To take full advantage of the high-resolution LRO datasets from these instruments, the spacecraft orbit must be reconstructed precisely. Baseline LRO tracking was provided by NASA's White Sands station in New Mexico and a commercial network, the Universal Space Network (USN), providing up to 20 hours per day of almost continuous S-band radio frequency link to LRO. The USN stations produce S-band range data with a 0.4 m precision and Doppler data with a 0.8 mm/s precision. Using the S-band tracking data together with the high-resolution gravity field model from the GRAIL mission, definitive LRO orbit solutions are obtained with an accuracy of 10 m in total position and 0.5 m radially. Confirmed by the 0.50-m high-resolution NAC images from the LROC team, these orbits serve as the LRO orbit "truth". In addition to the S-band data, one-way Laser Ranging (LR) to LRO provides a unique LRO optical tracking dataset spanning 5 years, from June 2009 to September 2014. Ten international satellite laser ranging stations contributed over 4,000 hours of LR data with 0.05-0.10 m normal-point precision. Another set of high-precision LRO tracking data is provided by the Deep Space Network (DSN), which produces radiometric tracking data more precise than the USN S-band data. In the last two years of the LRO mission, the temporal coverage of the USN data has decreased significantly. We show that LR and DSN data can be a good supplement to the baseline tracking data for orbit reconstruction.
Soil erodibility in Europe: a high-resolution dataset based on LUCAS.
Panagos, Panos; Meusburger, Katrin; Ballabio, Cristiano; Borrelli, Pasqualle; Alewell, Christine
2014-05-01
The greatest obstacle to soil erosion modelling at larger spatial scales is the lack of data on soil characteristics. One key parameter for modelling soil erosion is the soil erodibility, expressed as the K-factor in the widely used soil erosion model, the Universal Soil Loss Equation (USLE) and its revised version (RUSLE). The K-factor, which expresses the susceptibility of a soil to erode, is related to soil properties such as organic matter content, soil texture, soil structure and permeability. With the Land Use/Cover Area frame Survey (LUCAS) soil survey in 2009, a pan-European soil dataset became available for the first time, consisting of around 20,000 points across 25 Member States of the European Union. The aim of this study is the generation of a harmonised high-resolution soil erodibility map (with a grid cell size of 500 m) for the 25 EU Member States. Soil erodibility was calculated for the LUCAS survey points using the nomograph of Wischmeier and Smith (1978). A Cubist regression model was applied to correlate spatial data such as latitude, longitude, remotely sensed and terrain features in order to develop a high-resolution soil erodibility map. The mean K-factor for Europe was estimated at 0.032 t ha h ha⁻¹ MJ⁻¹ mm⁻¹ with a standard deviation of 0.009 t ha h ha⁻¹ MJ⁻¹ mm⁻¹. The resulting soil erodibility dataset compared well with published local and regional soil erodibility data. However, the incorporation of the protective effect of surface stone cover, which is usually not considered in soil erodibility calculations, resulted in an average 15% decrease of the K-factor. The exclusion of this effect in K-factor calculations is likely to result in an overestimation of soil erosion, particularly for the Mediterranean countries, where the highest percentages of surface stone cover were observed. Copyright © 2014. Published by Elsevier B.V.
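The point-wise K-factor computation behind this map can be illustrated with the algebraic form of the Wischmeier and Smith (1978) nomograph. This is the textbook formula only; the paper's actual workflow adds further steps (e.g. the stone-cover correction and the Cubist spatial model), and the function name is invented for the example.

```python
def k_factor(silt_vfs, clay, om, struct, perm):
    """Soil erodibility K (SI units: t ha h ha^-1 MJ^-1 mm^-1) from the
    algebraic form of the Wischmeier & Smith (1978) nomograph.

    silt_vfs : % silt + very fine sand
    clay     : % clay
    om       : % organic matter
    struct   : soil structure class (1-4)
    perm     : permeability class (1-6)
    """
    m = silt_vfs * (100.0 - clay)            # particle-size parameter
    k_us = (2.1e-4 * m**1.14 * (12.0 - om)
            + 3.25 * (struct - 2) + 2.5 * (perm - 3)) / 100.0
    return 0.1317 * k_us                     # US customary -> SI conversion
```

For a loamy soil (40% silt plus very fine sand, 20% clay, 2% organic matter, average structure and permeability) this yields roughly 0.027, close to the European mean of 0.032 reported above.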
NASA Astrophysics Data System (ADS)
Leydsman-McGinty, E. I.; Ramsey, R. D.; McGinty, C.
2013-12-01
The Remote Sensing/GIS Laboratory at Utah State University, in cooperation with the United States Environmental Protection Agency, is quantifying impervious surfaces for three watershed sub-basins in Utah. The primary objective of developing watershed-scale quantifications of impervious surfaces is to provide an indicator of potential impacts to wetlands that occur within the Wasatch Front and along the Great Salt Lake. A geospatial layer of impervious surfaces can assist state agencies involved with Utah's Wetlands Program Plan (WPP) in understanding the impacts of impervious surfaces on wetlands, as well as support them in carrying out goals and actions identified in the WPP. The three watershed sub-basins, Lower Bear-Malad, Lower Weber, and Jordan, span the highly urbanized Wasatch Front and are consistent with focal areas in need of wetland monitoring and assessment as identified in Utah's WPP. Geospatial layers of impervious surface currently exist in the form of national and regional land cover datasets; however, these datasets are too coarse to be utilized in fine-scale analyses. In addition, the pixel-based image processing techniques used to develop these coarse datasets have proven insufficient in smaller scale or detailed studies, particularly when applied to high-resolution satellite imagery or aerial photography. Therefore, object-based image analysis techniques are being implemented to develop the geospatial layer of impervious surfaces. Object-based image analysis techniques employ a combination of both geospatial and image processing methods to extract meaningful information from high-resolution imagery. Spectral, spatial, textural, and contextual information is used to group pixels into image objects and then subsequently used to develop rule sets for image classification. 
eCognition, an object-based image analysis software program, is being utilized in conjunction with one-meter resolution National Agriculture Imagery Program (NAIP) aerial photography from 2011.
Advanced Multivariate Inversion Techniques for High Resolution 3D Geophysical Modeling (Invited)
NASA Astrophysics Data System (ADS)
Maceira, M.; Zhang, H.; Rowe, C. A.
2009-12-01
We focus on the development and application of advanced multivariate inversion techniques to generate a realistic, comprehensive, and high-resolution 3D model of the seismic structure of the crust and upper mantle that satisfies several independent geophysical datasets. Building on previous efforts of joint inversion using surface wave dispersion measurements, gravity data, and receiver functions, we have added a fourth dataset, seismic body wave P and S travel times, to the simultaneous joint inversion method. We present a 3D seismic velocity model of the crust and upper mantle of northwest China resulting from the simultaneous, joint inversion of these four data types. Surface wave dispersion measurements are primarily sensitive to seismic shear-wave velocities, but at shallow depths it is difficult to obtain high-resolution velocities and to constrain the structure due to the depth-averaging of the more easily modeled, longer-period surface waves. Gravity inversions have the greatest resolving power at shallow depths, and they provide constraints on rock density variations. Moreover, while surface wave dispersion measurements are primarily sensitive to vertical shear-wave velocity averages, body wave receiver functions are sensitive to shear-wave velocity contrasts and vertical travel times. Addition of the fourth dataset, consisting of seismic travel-time data, helps to constrain the shear wave velocities both vertically and horizontally in the model cells crossed by the ray paths. Incorporation of both P and S body wave travel times allows us to invert for both P and S velocity structure, capitalizing on empirical relationships between each wave type's seismic velocity and rock density, thus eliminating the need for ad hoc assumptions regarding Poisson ratios. Our new tomography algorithm is a modification of the Maceira and Ammon joint inversion code, in combination with the Zhang and Thurber TomoDD (double-difference tomography) program.
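The core idea of simultaneous joint inversion, several datasets constraining one model vector through weighted least squares, can be sketched as follows. This is a toy linear illustration, not the authors' tomography code (which handles nonlinear sensitivity kernels, ray tracing and regularization); the function name and interface are invented for the example.

```python
import numpy as np

def joint_inversion(systems):
    """Simultaneous least-squares solution of several linearized datasets
    constraining one model vector m.  Each entry of `systems` is (G, d, w):
    design matrix, data vector and scalar weight.  Rows are scaled by
    sqrt(w), so the solution minimizes  sum_i w_i * ||d_i - G_i m||^2."""
    G_all = np.vstack([np.sqrt(w) * G for G, d, w in systems])
    d_all = np.concatenate([np.sqrt(w) * d for G, d, w in systems])
    m, *_ = np.linalg.lstsq(G_all, d_all, rcond=None)
    return m
```

The weights let one dataset (e.g. gravity at shallow depths) dominate where it resolves best while another (e.g. travel times along ray paths) constrains the cells it actually samples.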
NASA Astrophysics Data System (ADS)
Evans, Ben; Moeller, Iris; Smith, Geoff; Spencer, Tom
2017-04-01
Saltmarshes provide valuable ecosystem services and are protected. Nevertheless, they are generally thought to be declining in extent in North West Europe and beyond. The drivers of this decline and its variability are complex and inadequately described. When considering management for future ecosystem service provision, it is important to understand why, where, and to what extent areal decline is likely to occur. Physically-based morphological modelling of fine-sediment systems is in its infancy, and the models and the necessary expertise and facilities to run and validate them are rarely directly accessible to practitioners. This paper uses an accessible and easily applied data-driven modelling approach for the quantitative estimation of current marsh system status and likely future marsh development. Central to this approach are monitoring datasets providing high-resolution spatial data and the recognition that antecedent morphology exerts a principal control on future landform change (morphodynamic feedback). Further, current morphology can also be regarded as an integrated response of the intertidal system to the process environment. It may therefore also represent proxy information on historical conditions beyond the period of observational records. Novel methods are developed to extract quantitative morphological information from aerial photographic, LiDAR and satellite datasets. Morphometric indices are derived relating to the functional configuration of landform units that go beyond previous efforts and basic description of extent. The incorporation of morphometric indices derived from existing monitoring datasets is shown to improve the performance of statistical models for predicting salt marsh evolution, but wider applications and benefits are expected. The indices are useful landscape descriptors when assessing system status and may provide relatively robust measures for comparison against historical datasets.
They are also valuable metrics when considering how the landscape delivers ecosystem services and are essential for the testing and validation of morphological models of salt marshes and other systems.
NASA Astrophysics Data System (ADS)
Arendt, A. A.; Houser, P.; Kapnick, S. B.; Kargel, J. S.; Kirschbaum, D.; Kumar, S.; Margulis, S. A.; McDonald, K. C.; Osmanoglu, B.; Painter, T. H.; Raup, B. H.; Rupper, S.; Tsay, S. C.; Velicogna, I.
2017-12-01
The High Mountain Asia Team (HiMAT) is an assembly of 13 research groups funded by NASA to improve understanding of cryospheric and hydrological changes in High Mountain Asia (HMA). Our project goals are to quantify historical and future variability in weather and climate over the HMA, partition the components of the water budget across HMA watersheds, explore physical processes driving changes, and predict couplings and feedbacks between physical and human systems through assessment of hazards and downstream impacts. These objectives are being addressed through analysis of remote sensing datasets combined with modeling and assimilation methods to enable data integration across multiple spatial and temporal scales. Our work to date has focused on developing improved high resolution precipitation, snow cover and snow water equivalence products through a variety of statistical uncertainty analysis, dynamical downscaling and assimilation techniques. These and other high resolution climate products are being used as input and validation for an assembly of land surface and General Circulation Models. To quantify glacier change in the region we have calculated multidecadal mass balances of a subset of HMA glaciers by comparing commercial satellite imagery with earlier elevation datasets. HiMAT is using these tools and datasets to explore the impact of atmospheric aerosols and surface impurities on surface energy exchanges, to determine drivers of glacier and snowpack melt rates, and to improve our capacity to predict future hydrological variability. Outputs from the climate and land surface assessments are being combined with landslide and glacier lake inventories to refine our ability to predict hazards in the region. Economic valuation models are also being used to assess impacts on water resources and hydropower. 
Field data of atmospheric aerosol, radiative flux and glacier lake conditions are being collected to provide ground validation for models and remote sensing products. In this presentation we will discuss initial results and outline plans for a scheduled release of our datasets and findings to the broader community. We will also describe our methods for cross-team collaboration through the adoption of cloud computing and data integration tools.
Automated brain tumor segmentation using spatial accuracy-weighted hidden Markov Random Field.
Nie, Jingxin; Xue, Zhong; Liu, Tianming; Young, Geoffrey S; Setayesh, Kian; Guo, Lei; Wong, Stephen T C
2009-09-01
A variety of algorithms have been proposed for brain tumor segmentation from multi-channel sequences; however, most of them require isotropic or pseudo-isotropic resolution of the MR images. Although co-registration and interpolation of low-resolution sequences, such as T2-weighted images, onto the space of the high-resolution image, such as the T1-weighted image, can be performed prior to segmentation, the results are usually limited by partial volume effects due to interpolation of the low-resolution images. To improve the quality of tumor segmentation in clinical applications, where low-resolution sequences are commonly used together with high-resolution images, we propose an algorithm based on a Spatial accuracy-weighted Hidden Markov random field and Expectation maximization (SHE) approach for both automated tumor and enhanced-tumor segmentation. SHE incorporates the spatial interpolation accuracy of the low-resolution images into the optimization procedure of the Hidden Markov Random Field (HMRF) to segment tumor using multi-channel MR images with different resolutions, e.g., high-resolution T1-weighted and low-resolution T2-weighted images. In experiments, we evaluated this algorithm using a set of simulated multi-channel brain MR images with known ground-truth tissue segmentation and also applied it to a dataset of MR images obtained during clinical trials of brain tumor chemotherapy. The results show that SHE achieves more accurate tumor segmentation than conventional multi-channel segmentation algorithms.
NASA Technical Reports Server (NTRS)
Chelton, Dudley B.; Schlax, Michael G.
1994-01-01
A formalism is presented for determining the wavenumber-frequency transfer function associated with an irregularly sampled multidimensional dataset. This transfer function reveals the filtering characteristics and aliasing patterns inherent in the sample design. In combination with information about the spectral characteristics of the signal, the transfer function can be used to quantify the spatial and temporal resolution capability of the dataset. Application of the method to idealized Geosat altimeter data (i.e., neglecting measurement errors and data dropouts) shows that the Geosat orbit configuration is capable of resolving scales of about 3 deg in latitude and longitude by about 30 days.
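A minimal illustration of this kind of sampling-pattern diagnostic: the spectral window of an irregular set of sample times shows where energy aliases and how broad the central peak (the resolution limit) is. This 1-D sketch is a simplified stand-in for the paper's multidimensional wavenumber-frequency transfer function formalism, with an invented function name.

```python
import numpy as np

def spectral_window(times, freqs):
    """Spectral window of an irregular 1-D sample pattern,
    W(f) = |(1/N) * sum_n exp(-2*pi*i*f*t_n)|^2.
    Peaks at f != 0 mark frequencies that the sampling aliases onto the
    mean; the width of the central peak limits the resolvable scales."""
    t = np.asarray(times, dtype=float)
    ph = np.exp(-2j * np.pi * np.outer(freqs, t))
    return np.abs(ph.mean(axis=1)) ** 2
```

For uniform sampling the window has unit peaks at multiples of the sampling frequency (the classical aliases); irregular sampling smears those peaks, which is exactly what a transfer-function analysis of an orbit's ground-track pattern quantifies.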
Impact relevance and usability of high resolution climate modeling and data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arnott, James C.
2016-10-30
The Aspen Global Change Institute hosted a technical science workshop entitled “Impact Relevance and Usability of High-Resolution Climate Modeling and Datasets” on August 2-7, 2015 in Aspen, CO. Kate Calvin (Pacific Northwest National Laboratory), Andrew Jones (Lawrence Berkeley National Laboratory) and Jean-François Lamarque (NCAR) served as co-chairs for the workshop. The meeting included the participation of 29 scientists for a total of 145 participant days. Following the workshop, the co-chairs authored a meeting report published in Eos on April 27, 2016. Insights from the workshop directly contributed to the formation of a new DOE-supported project co-led by workshop co-chair Andy Jones. A subset of meeting participants continues to work on a publication on institutional innovations that can support the usability of high-resolution modeling, among other sources of climate information.
A hierarchical network-based algorithm for multi-scale watershed delineation
NASA Astrophysics Data System (ADS)
Castronova, Anthony M.; Goodall, Jonathan L.
2014-11-01
Watershed delineation is a process for defining the land area that contributes surface water flow to a single outlet point. It is commonly used in water resources analysis to define the domain in which hydrologic process calculations are applied. There has been a growing effort over the past decade to improve surface elevation measurements in the U.S., which has had a significant impact on the accuracy of hydrologic calculations. Traditional watershed processing on these elevation rasters, however, becomes more burdensome as data resolution increases. As a result, processing of these datasets can be troublesome on standard desktop computers. This challenge has resulted in numerous works that aim to provide high-performance computing solutions for large data, high-resolution data, or both. This work proposes an efficient watershed delineation algorithm for use in desktop computing environments that leverages existing data, the U.S. Geological Survey (USGS) National Hydrography Dataset Plus (NHD+), and open-source software tools to construct watershed boundaries. This approach makes use of U.S. national-level hydrography data that has been precomputed using raster processing algorithms coupled with quality control routines. Our approach uses carefully arranged data and mathematical graph theory to traverse river networks and identify catchment boundaries. We demonstrate this new watershed delineation technique, compare its accuracy with traditional algorithms that derive watersheds solely from digital elevation models, and then extend our approach to address subwatershed delineation. Our findings suggest that the open-source hierarchical network-based delineation procedure presented in this work is a promising approach to watershed delineation that can be used to summarize publicly available datasets for hydrologic model input pre-processing.
Through our analysis, we explore the benefits of reusing the NHD+ datasets for watershed delineation, and find that our technique offers greater flexibility and extensibility than traditional raster algorithms.
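The graph-traversal core of a network-based delineation can be sketched as a reverse breadth-first search over an NHD+-style "flows-into" table: the watershed of an outlet is its own catchment plus everything reachable against the flow direction. This is a hypothetical minimal version; the actual implementation must also handle flow divergences, coastlines and catchment-geometry merging.

```python
from collections import defaultdict, deque

def upstream_catchments(flowlines, outlet):
    """Collect every catchment draining to `outlet` by walking an
    NHD+-style flow table upstream.

    flowlines : dict mapping each catchment ID to the ID it flows into
    outlet    : catchment ID whose watershed is wanted
    """
    upstream = defaultdict(list)              # reverse adjacency: to -> [from]
    for frm, to in flowlines.items():
        upstream[to].append(frm)
    seen, queue = {outlet}, deque([outlet])
    while queue:                              # breadth-first, against the flow
        node = queue.popleft()
        for up in upstream[node]:
            if up not in seen:
                seen.add(up)
                queue.append(up)
    return seen
```

Because the per-catchment raster work is precomputed once nationally, this traversal is cheap enough for interactive, desktop-scale use, which is the efficiency argument made above.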
NASA Astrophysics Data System (ADS)
Zittis, G.; Bruggeman, A.; Camera, C.; Hadjinicolaou, P.; Lelieveld, J.
2017-07-01
Climate change is expected to substantially influence precipitation amounts and distribution. To improve simulations of extreme rainfall events, we analyzed the performance of different convection and microphysics parameterizations of the WRF (Weather Research and Forecasting) model at very high horizontal resolutions (12, 4 and 1 km). Our study focused on the eastern Mediterranean climate change hot-spot. Five extreme rainfall events over Cyprus were identified from observations and were dynamically downscaled from the ERA-Interim (EI) dataset with WRF. We applied an objective ranking scheme, using a 1-km gridded observational dataset over Cyprus and six different performance metrics, to investigate the skill of the WRF configurations. We evaluated the rainfall timing and amounts for the different resolutions, and discussed the observational uncertainty for the particular extreme events by comparing three gridded precipitation datasets (E-OBS, APHRODITE and CHIRPS). Simulations with WRF capture rainfall over the eastern Mediterranean reasonably well for three of the five selected extreme events. For these three cases, the WRF simulations improved upon the ERA-Interim data, which strongly underestimate the rainfall extremes over Cyprus. The best model performance is obtained for the January 1989 event, simulated with an average bias of 4% and a modified Nash-Sutcliffe efficiency of 0.72 for the 5-member ensemble of the 1-km simulations. We found overall added value for the convection-permitting simulations, especially over high-elevation regions. Interestingly, for some cases the intermediate 4-km nest was found to outperform the 1-km simulations for the low-elevation coastal parts of Cyprus. Finally, we identified significant and inconsistent discrepancies between the three state-of-the-art gridded precipitation datasets for the tested events, highlighting the observational uncertainty in the region.
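Two of the performance metrics used in ranking schemes like the one above, relative bias and Nash-Sutcliffe efficiency, are easy to state in code. The sketch below uses the standard definitions; the j=1 absolute-difference variant is one common "modified" Nash-Sutcliffe form, and whether it matches the paper's exact metric is an assumption.

```python
import numpy as np

def percent_bias(sim, obs):
    """Relative total bias (%) of simulated vs. observed rainfall."""
    return 100.0 * (np.sum(sim) - np.sum(obs)) / np.sum(obs)

def nse(sim, obs, j=2):
    """Nash-Sutcliffe efficiency: 1 for a perfect match, 0 for a model no
    better than the observed mean.  j=2 is the classical squared form; j=1
    is a 'modified' absolute-difference variant (Krause et al., 2005)."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    num = np.sum(np.abs(sim - obs) ** j)
    den = np.sum(np.abs(obs - obs.mean()) ** j)
    return 1.0 - num / den
```

Combining several such metrics into an objective ranking reduces the risk that one configuration is favored merely because it happens to optimize a single score.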
High-resolution mapping of global surface water and its long-term changes
NASA Astrophysics Data System (ADS)
Pekel, J. F.; Cottam, A.; Gorelick, N.; Belward, A.
2016-12-01
The location and persistence of surface water is both affected by climate and human activity and in turn affects climate, biological diversity and human wellbeing. Global datasets documenting surface water location and seasonality have been produced, but measuring long-term changes at high resolution remains a challenge. To address the dynamic nature of water, the European Commission's Joint Research Centre (JRC), working with the Google Earth Engine (GEE) team, has processed every pixel acquired by Landsat 5, 7 and 8 between 16 March 1984 and 10 October 2015 (more than 3,000,000 Landsat scenes, representing more than 1,823 terabytes of data). The resulting dataset records the months and years in which water was present over the 32-year period, where occurrence changed, and what form those changes took in terms of seasonality and persistence, documenting intra-annual persistence, inter-annual variability and trends. This validated dataset shows that the impacts of climate change and climate oscillations on surface water occurrence can be measured, and that evidence can be gathered on how surface water is altered by human activities. The dataset is freely available, and we anticipate that it will provide valuable information to those working on security of water supply for agriculture, industry and human consumption, on water-related disaster reduction and recovery, and on the study of waterborne pollution and disease spread. The maps will also improve surface boundary condition setting in climate and weather models, improve carbon emissions estimates, inform regional climate change impact studies, delimit wetlands for biodiversity and determine desertification trends. Issues such as dam building (and less widespread dam removal), disappearing rivers, the geopolitics of water distribution and coastal erosion are also addressed.
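The occurrence statistic underlying such a dataset can be sketched as a per-pixel ratio of water detections to valid (cloud-free) observations. This is an illustrative reimplementation, not the JRC/GEE production code, which additionally normalizes for seasonal sampling and sensor differences.

```python
import numpy as np

def water_occurrence(water, valid):
    """Per-pixel surface-water occurrence (%) from a stack of monthly
    classifications: months detected as water divided by months with a
    valid observation.  `water` and `valid` are boolean arrays of shape
    (n_months, ny, nx)."""
    n_valid = valid.sum(axis=0)
    n_water = (water & valid).sum(axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        occ = 100.0 * n_water / n_valid
    return occ  # NaN where a pixel was never validly observed
```

Splitting the same ratio by calendar month or by year is what turns this single occurrence layer into the seasonality and change products described above.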
OpenTopography: Enabling Online Access to High-Resolution Lidar Topography Data and Processing Tools
NASA Astrophysics Data System (ADS)
Crosby, Christopher; Nandigam, Viswanath; Baru, Chaitan; Arrowsmith, J. Ramon
2013-04-01
High-resolution topography data acquired with lidar (light detection and ranging) technology are revolutionizing the way we study the Earth's surface and overlying vegetation. These data, collected from airborne, tripod, or mobile-mounted scanners have emerged as a fundamental tool for research on topics ranging from earthquake hazards to hillslope processes. Lidar data provide a digital representation of the earth's surface at a resolution sufficient to appropriately capture the processes that contribute to landscape evolution. The U.S. National Science Foundation-funded OpenTopography Facility (http://www.opentopography.org) is a web-based system designed to democratize access to earth science-oriented lidar topography data. OpenTopography provides free, online access to lidar data in a number of forms, including the raw point cloud and associated geospatial-processing tools for customized analysis. The point cloud data are co-located with on-demand processing tools to generate digital elevation models, and derived products and visualizations which allow users to quickly access data in a format appropriate for their scientific application. The OpenTopography system is built using a service-oriented architecture (SOA) that leverages cyberinfrastructure resources at the San Diego Supercomputer Center at the University of California San Diego to allow users, regardless of expertise level, to access these massive lidar datasets and derived products for use in research and teaching. OpenTopography hosts over 500 billion lidar returns covering 85,000 km2. These data are all in the public domain and are provided by a variety of partners under joint agreements and memoranda of understanding with OpenTopography. Partners include national facilities such as the NSF-funded National Center for Airborne Lidar Mapping (NCALM), as well as non-governmental organizations and local, state, and federal agencies. 
OpenTopography has become a hub for high-resolution topography resources. Datasets hosted by other organizations, as well as lidar-specific software, can be registered into the OpenTopography catalog, providing users a "one-stop shop" for such information. With several thousand active users, OpenTopography is an excellent example of a mature Spatial Data Infrastructure system that is enabling access to challenging data for research, education and outreach. Ongoing OpenTopography design and development work includes the archive and publication of datasets using digital object identifiers (DOIs); creation of a more flexible and scalable high-performance environment for processing of large datasets; expanded support for satellite and terrestrial lidar; and creation of a "pluggable" infrastructure for third-party programs and algorithms. OpenTopography has successfully created a facility for sharing lidar data. In the project's next phase, we are working to enable equally easy and successful sharing of services for processing and analysis of these data.
Wu, Jiayi; Ma, Yong-Bei; Congdon, Charles; Brett, Bevin; Chen, Shuobing; Xu, Yaofang; Ouyang, Qi; Mao, Youdong
2017-01-01
Structural heterogeneity in single-particle cryo-electron microscopy (cryo-EM) data represents a major challenge for high-resolution structure determination. Unsupervised classification may serve as the first step in the assessment of structural heterogeneity. However, traditional algorithms for unsupervised classification, such as K-means clustering and maximum likelihood optimization, may classify images into wrong classes with decreasing signal-to-noise ratio (SNR) in the image data, while demanding increased computational cost. Overcoming these limitations requires further development of clustering algorithms for high-performance cryo-EM data processing. Here we introduce an unsupervised single-particle clustering algorithm derived from a statistical manifold learning framework called generative topographic mapping (GTM). We show that unsupervised GTM clustering improves classification accuracy by about 40% in the absence of input references for data with lower SNRs. Applications to several experimental datasets suggest that our algorithm can detect subtle structural differences among classes via a hierarchical clustering strategy. After code optimization for a high-performance computing (HPC) environment, our software implementation was able to generate thousands of reference-free class averages within hours in a massively parallel fashion, which allows a significant improvement in ab initio 3D reconstruction and assists in the computational purification of homogeneous datasets for high-resolution visualization. PMID:28786986
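The abstract compares GTM against reference-free baselines such as K-means. A toy, self-contained sketch of that baseline on synthetic noisy "particle" images (the templates, noise level, and deterministic initialization are our own illustrative choices, not the paper's code; GTM itself instead fits a nonlinear latent-variable manifold):

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(data, k=2, iters=30):
    """Plain k-means with deterministic farthest-point initialization:
    a reference-free clustering baseline on row-vector images."""
    centers = [data[0]]
    while len(centers) < k:  # next center = point farthest from current ones
        d = np.min([((data - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(data[int(d.argmax())])
    centers = np.array(centers)
    for _ in range(iters):
        d = ((data[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        centers = np.array([data[labels == j].mean(0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels

# Two toy 8x8 "class averages" plus Gaussian noise standing in for cryo-EM SNR.
t0 = np.zeros((8, 8)); t0[2:6, 2:6] = 1.0   # square particle
t1 = np.zeros((8, 8)); t1[:, 3:5] = 1.0     # bar particle
truth = rng.integers(0, 2, 200)
imgs = np.stack([t0 if c == 0 else t1 for c in truth])
imgs = imgs + rng.normal(0.0, 0.2, imgs.shape)
labels = kmeans(imgs.reshape(200, -1))
# Cluster labels are permutation-invariant; score the better alignment.
acc = max((labels == truth).mean(), (labels != truth).mean())
```

At this noise level K-means separates the two classes cleanly; the paper's point is that its accuracy degrades faster than GTM's as the SNR drops.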
Analysis of long term trends of precipitation estimates acquired using radar network in Turkey
NASA Astrophysics Data System (ADS)
Tugrul Yilmaz, M.; Yucel, Ismail; Kamil Yilmaz, Koray
2016-04-01
Precipitation estimates, a vital input to many hydrological and agricultural studies, can be obtained from many different platforms (ground station-, radar-, model-, and satellite-based). Satellite- and model-based estimates are spatially continuous, but they lack the high-resolution information many applications require. Station-based values are actual precipitation observations, but by nature they are point data. These observations may be interpolated, yet such end products can have large errors over remote locations whose climate or topography differs from the areas where stations are installed. Radars have the particular advantage of providing high-spatial-resolution information over land, although the accuracy of radar-based precipitation estimates depends on the Z-R relationship, mountain blockage, target distance from the radar, spurious echoes from anomalous propagation of the radar beam, bright-band contamination, and ground clutter. A viable way to obtain spatially and temporally consistent high-resolution precipitation information is to merge radar and station data, taking advantage of each retrieval platform. An optimally merged product is particularly important in Turkey, where complex topography exerts strong controls on the precipitation regime and in turn hampers observation efforts. There are currently 10 weather radars over Turkey (7 more are planned) that have been collecting precipitation information since 2007. This study aims to optimally merge radar precipitation data with station-based observations to introduce a station-radar blended precipitation product. This study was supported by TUBITAK fund # 114Y676.
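A common first step in radar-gauge merging of this kind is a mean-field bias correction, rescaling the radar grid so it agrees with the gauges on average. A hedged sketch (the function name and simple multiplicative scheme are ours; operational blends also interpolate the residual gauge-radar differences spatially):

```python
import numpy as np

def mean_field_bias_merge(radar, gauge_rc, gauge_val):
    """Rescale a radar rainfall grid by the mean gauge/radar ratio at
    the gauge pixels: the simplest radar-gauge merging step.

    radar     : 2-D rainfall grid.
    gauge_rc  : list of (row, col) pixel indices of the gauges.
    gauge_val : gauge-measured rainfall at those locations.
    """
    rows, cols = zip(*gauge_rc)
    at_gauges = radar[list(rows), list(cols)]
    mask = at_gauges > 0                       # avoid dividing by dry pixels
    bias = np.sum(np.asarray(gauge_val)[mask]) / np.sum(at_gauges[mask])
    return radar * bias
```

The single scalar factor removes the domain-wide bias; spatially varying errors (range dependence, beam blockage) require the residual interpolation step.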
Prototype Global Burnt Area Algorithm Using a Multi-sensor Approach
NASA Astrophysics Data System (ADS)
López Saldaña, G.; Pereira, J.; Aires, F.
2013-05-01
One of the main limitations of products derived from remotely sensed data is the length of the data records available for climate studies. The Advanced Very High Resolution Radiometer (AVHRR) long-term data record (LTDR) comprises a daily global atmospherically corrected surface reflectance dataset at 0.05° spatial resolution, available for the 1981-1999 period. The Moderate Resolution Imaging Spectroradiometer (MODIS) instrument has been in orbit on the Terra platform since late 1999 and on Aqua since mid-2002; surface reflectance products, MYD09CMG and MOD09CMG, are available at 0.05° spatial resolution. Fire is a strong cause of land surface change and of greenhouse gas emissions around the globe. A global long-term identification of areas affected by fire is needed to analyze trends and fire-climate relationships. A burnt-area algorithm can be viewed as a change-point detection problem in which an abrupt change in surface reflectance occurs due to biomass burning. Using the AVHRR LTDR and the aforementioned MODIS products, a time series of bidirectional reflectance distribution function (BRDF)-corrected surface reflectance was generated from the daily observations, constraining the BRDF model inversion with a climatology of BRDF parameters derived from 12 years of MODIS data. Burnt areas were identified using a t-test on the pre- and post-fire reflectance values and a change-point detection algorithm; spectral constraints were then applied to flag changes caused by natural land processes such as vegetation seasonality or flooding. Additional temporal constraints focus on the persistence of the affected areas. Initial results for the years 1998 to 2002 show spatio-temporal coherence, but further analysis is required, and a formal, rigorous validation will be performed using burn scars identified from high-resolution datasets.
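The burnt-area detection step rests on a t-test between pre- and post-fire reflectance samples. A minimal sketch using Welch's t statistic with a fixed threshold (a stand-in for the authors' full test; the names and threshold value are illustrative):

```python
import numpy as np

def welch_t(pre, post):
    """Welch's t statistic for two independent samples (pre minus post)."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    se2 = pre.var(ddof=1) / len(pre) + post.var(ddof=1) / len(post)
    return (pre.mean() - post.mean()) / np.sqrt(se2)

def burn_candidate(pre, post, t_crit=4.0):
    """Flag a pixel whose post-fire reflectance drops significantly
    below its pre-fire sample. A fixed t threshold stands in for a
    proper p-value here; the spectral and temporal-persistence screens
    described in the abstract would follow in the full algorithm."""
    return bool(welch_t(pre, post) > t_crit)
```

A full implementation would evaluate the statistic against the t-distribution with Welch-Satterthwaite degrees of freedom rather than a fixed cutoff.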
NASA Astrophysics Data System (ADS)
Zhang, Xuezhen; Xiong, Zhe; Zheng, Jingyun; Ge, Quansheng
2018-02-01
The climate change impact assessment and adaptation research community needs regional high-spatial-resolution meteorological data. This study produced two downscaled precipitation datasets at a spatial resolution as high as 3 km by 3 km for the Heihe River Basin (HRB) from 2011 to 2014, using the Weather Research and Forecasting (WRF) model nested with Final Analysis (FNL) data from the National Centers for Environmental Prediction (NCEP) and with ERA-Interim from the European Centre for Medium-Range Weather Forecasts (ECMWF) (hereafter FNLexp and ERAexp, respectively). Both downscaling simulations generally reproduced the observed spatial patterns of precipitation. However, users should keep in mind that the two downscaled datasets do not exactly match the observations. Compared with the remote sensing-based estimate, FNLexp produced a bias in the location of heavy precipitation centers. Compared with ground gauge-based measurements for the warm season (May to September), ERAexp produced more precipitation (root-mean-square error (RMSE) = 295.4 mm across the 43 sites) and more heavy rainfall days, while FNLexp produced less precipitation (RMSE = 115.6 mm) and fewer heavy rainfall days. Both ERAexp and FNLexp produced considerably more precipitation for the cold season (October to April), with RMSE values of 119.5 and 32.2 mm, respectively, and more heavy precipitation days. Along with simulating more heavy precipitation days, both FNLexp and ERAexp also simulated stronger extreme precipitation. Sensitivity experiments show that the simulation bias is much more sensitive to microphysical parameterizations than to the spatial resolution of the topography data. For the HRB, applying the WSM3 scheme may improve the performance of the WRF model.
Building Change Detection in Very High Resolution Satellite Stereo Image Time Series
NASA Astrophysics Data System (ADS)
Tian, J.; Qin, R.; Cerra, D.; Reinartz, P.
2016-06-01
There is an increasing demand for robust methods for urban sprawl monitoring. The steadily increasing number of high-resolution and multi-view sensors makes it possible to produce datasets with high temporal and spatial resolution; however, less effort has been dedicated to employing very high resolution (VHR) satellite image time series (SITS) to monitor building changes with higher accuracy. In addition, these VHR data are often acquired by different sensors. The objective of this research is to propose a robust time-series analysis method for VHR stereo imagery. First, the spatio-temporal information of the stereo imagery and the digital surface models (DSMs) generated from it are combined, and building probability maps (BPMs) are calculated for all acquisition dates. In the second step, an object-based change analysis is performed on derivative features of the BPM sets. The change consistency between the object and pixel levels is checked to remove outlier pixels. Results are assessed on six pairs of VHR satellite images acquired within a time span of 7 years. The evaluation results demonstrate the efficiency of the proposed method.
Using high-resolution displays for high-resolution cardiac data.
Goodyer, Christopher; Hodrien, John; Wood, Jason; Kohl, Peter; Brodlie, Ken
2009-07-13
The ability to perform fast, accurate, high-resolution visualization is fundamental to improving our understanding of anatomical data. As data volumes increase with improvements in scanning technology, the methods applied to visualization must evolve. In this paper, we address the interactive display of data from high-resolution magnetic resonance imaging of a rabbit heart and subsequent histological imaging. We describe a visualization environment involving a tiled liquid crystal display (LCD) wall and associated software, which provides an interactive and intuitive user interface. The oView software is an OpenGL application written for the VR Juggler environment. This environment abstracts displays and devices away from the application itself, aiding portability between different systems, from desktop PCs to multi-tiled display walls. Portability between display walls has been demonstrated through its use on walls at the universities of both Leeds and Oxford. We discuss important factors to consider for interactive two-dimensional display of large three-dimensional datasets, including the use of intuitive input devices and level-of-detail aspects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bautista, Julian E.; Busca, Nicolas G.; Bailey, Stephen
We describe mock datasets generated to simulate the high-redshift quasar sample in Data Release 11 (DR11) of the SDSS-III Baryon Oscillation Spectroscopic Survey (BOSS). The mock spectra contain Lyα forest correlations useful for studying the 3D correlation function, including Baryon Acoustic Oscillations (BAO). They also include astrophysical effects such as quasar continuum diversity and high-density absorbers, instrumental effects such as noise and spectral resolution, as well as imperfections introduced by the SDSS pipeline treatment of the raw data. The Lyα forest BAO analysis of the BOSS collaboration, described in Delubac et al. 2014, has used these mock datasets to develop and cross-check analysis procedures prior to performing the BAO analysis on real data, and for continued systematic cross-checks. Tests presented here show that the simulations reproduce the important characteristics of real spectra sufficiently well. These mock datasets will be made available together with the data at the time of Data Release 11.
Image Quality Ranking Method for Microscopy
Koho, Sami; Fazeli, Elnaz; Eriksson, John E.; Hänninen, Pekka E.
2016-01-01
Automated analysis of microscope images is necessitated by the increasing need for high-resolution follow-up of events in time. Manually finding the right images to analyze, or to eliminate from data analysis, is a common day-to-day problem in microscopy research today, and the constantly growing size of image datasets does not help matters. We propose a simple method and a software tool for sorting images within a dataset according to their relative quality. We demonstrate the applicability of our method in finding good-quality images in a STED microscope sample-preparation-optimization image dataset. The results are validated by comparison to subjective opinion scores, as well as five state-of-the-art blind image quality assessment methods. We also show how our method can be applied to eliminate useless out-of-focus images in a High-Content-Screening experiment. We further evaluate the ability of our image quality ranking method to detect out-of-focus images by extensive simulations and by comparing its performance against previously published, well-established microscopy autofocus metrics. PMID:27364703
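Image quality ranking of the kind described can be illustrated with a simple blind focus metric: the variance of a Laplacian response, which grows with image sharpness. This is a stand-in for the paper's actual ranking score (the function names and metric choice are ours):

```python
import numpy as np

def sharpness(img):
    """Variance of a 4-neighbour Laplacian response: a simple blind
    focus metric (higher = sharper; constant images score 0)."""
    img = np.asarray(img, float)
    lap = (-4 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def rank_images(images):
    """Indices of images ordered from sharpest to most blurred."""
    return sorted(range(len(images)),
                  key=lambda i: sharpness(images[i]), reverse=True)
```

Out-of-focus frames in a screening run would sink to the bottom of such a ranking, which is the elimination use case the abstract describes.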
Leveling data in geochemical mapping: scope of application, pros and cons of existing methods
NASA Astrophysics Data System (ADS)
Pereira, Benoît; Vandeuren, Aubry; Sonnet, Philippe
2017-04-01
Geochemical mapping has successfully met a range of needs, from mineral exploration to environmental management. In Europe and around the world, numerous geochemical datasets already exist. These datasets may originate from geochemical mapping projects or from the collection of sample analyses requested by environmental protection regulatory bodies. Combining datasets can be highly beneficial for establishing geochemical maps with increased resolution and/or coverage area. However, this practice requires assessing the equivalence between datasets and, if needed, applying data leveling to remove possible biases between them. In the literature, several procedures for assessing dataset equivalence and leveling data have been proposed. Daneshfar & Cameron (1998) proposed a method for leveling two adjacent datasets, while Pereira et al. (2016) proposed two methods for leveling datasets that contain records located within the same geographical area. Each method requires its own set of assumptions (underlying data populations, spatial distribution of data, etc.). Here we discuss the scope of application, pros, cons, and practical recommendations for each method. This work is illustrated with several case studies in Wallonia (southern Belgium) and in Europe involving trace element geochemical datasets. References: Daneshfar, B. & Cameron, E. (1998), Leveling geochemical data between map sheets, Journal of Geochemical Exploration 63(3), 189-201. Pereira, B.; Vandeuren, A.; Govaerts, B. B. & Sonnet, P. (2016), Assessing dataset equivalence and leveling data in geochemical mapping, Journal of Geochemical Exploration 168, 36-48.
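As a concrete illustration of leveling, one simple correction applies a multiplicative factor derived from the two datasets' central tendencies. A sketch (the name is ours; it assumes a purely multiplicative bias between datasets covering comparable populations, one of several cases the cited methods distinguish):

```python
import numpy as np

def level_by_median_ratio(reference, target):
    """Level `target` onto `reference` with a single multiplicative
    factor (ratio of medians). Valid only when the bias between
    datasets is multiplicative and the underlying element-concentration
    populations are comparable; the cited methods provide tests for
    whether that assumption holds."""
    factor = float(np.median(reference)) / float(np.median(target))
    return np.asarray(target, float) * factor, factor
```

Additive biases, or biases that vary across the concentration range, call for the more elaborate procedures the abstract reviews.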
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley T.; Case, Jonathan L.; Molthan, Andrew L.
2012-01-01
The Short-term Prediction Research and Transition (SPoRT) Center is a collaborative partnership between NASA and operational forecasting partners, including a number of National Weather Service forecast offices. SPoRT provides real-time NASA products and capabilities to help its partners address specific operational forecast challenges. One challenge that forecasters face is using guidance from local and regional deterministic numerical models configured at convection-allowing resolution to help assess a variety of mesoscale/convective-scale phenomena such as sea-breezes, local wind circulations, and mesoscale convective weather potential on a given day. While guidance from convection-allowing models has proven valuable in many circumstances, the potential exists for model improvements by incorporating more representative land-water surface datasets, and by assimilating retrieved temperature and moisture profiles from hyper-spectral sounders. In order to help increase the accuracy of deterministic convection-allowing models, SPoRT produces real-time, 4-km CONUS forecasts using a configuration of the Weather Research and Forecasting (WRF) model (hereafter SPoRT-WRF) that includes unique NASA products and capabilities including 4-km resolution soil initialization data from the Land Information System (LIS), 2-km resolution SPoRT SST composites over oceans and large water bodies, high-resolution real-time Green Vegetation Fraction (GVF) composites derived from the Moderate-resolution Imaging Spectroradiometer (MODIS) instrument, and retrieved temperature and moisture profiles from the Atmospheric Infrared Sounder (AIRS) and Infrared Atmospheric Sounding Interferometer (IASI). NCAR's Model Evaluation Tools (MET) verification package is used to generate statistics of model performance compared to in situ observations and rainfall analyses for three months during the summer of 2012 (June-August). 
Detailed analyses of specific severe weather outbreaks during the summer will be presented to assess the potential added-value of the SPoRT datasets and data assimilation methodology compared to a WRF configuration without the unique datasets and data assimilation.
A 150 year record of annual Bristlecone Pine 14C from the second millennium BC
NASA Astrophysics Data System (ADS)
Pearson, Charlotte; Salzer, Matthew; Brewer, Peter; Hodgins, Gregory; Jull, A. J. Timothy; Lange, Todd; Cruz, Richard; Brown, David; Boswijk, Gretel
2017-04-01
The Interdisciplinary Chronology of Civilizations Project (ICCP) at the University of Arizona (UA) aims to resolve longstanding chronological issues for Aegean and Near Eastern archaeology. A central component of this work is the production of annual-resolution sequences of 14C from securely anchored tree-ring chronologies. Contemporary growth rings from Northern and Southern Hemisphere locations will be tested against a dataset of consecutive annual-resolution 14C measurements from tree rings of securely dated North American bristlecone pine (Pinus longaeva D.K. Bailey). These data will be used in a number of ways: to investigate potential issues with the current IntCal dataset due to interpolation, smoothing, or the inclusion of annual-scale rapid changes in 14C; to identify 14C offsets; to evaluate whether annual determinations of 14C present sufficient advantages for dating to justify the substantial costs of creating an annual-resolution calibration curve; to explore whether the degree of reproducibility between species and growth locations justifies the construction of regional curves or allows us to pioneer 'annual-resolution wiggle-matching' to anchor substantial floating tree-ring chronologies from Mediterranean archaeological contexts; and, if new rapid changes in 14C (aka 'spikes') are discovered, to use these to achieve the same goal. The initial focus of the project is the first and second millennium BC. From this period we present 150 annual 14C determinations from bristlecone pine and explore preliminary findings based on comparisons with the existing IntCal dataset, decadal data from the Mediterranean, and some contemporary years from Irish oak (Quercus spp.) and New Zealand kauri (Agathis australis (D. Don) Lindl.). This work, in combination with results from another UA project team (see abstract by Jull et al.), helps confirm the potential of the bristlecone pine archive for high-resolution radiocarbon research.
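Wiggle-matching anchors a floating annual 14C series by sliding it along a calibration curve to the offset of best fit. A toy least-squares version (our own sketch; real implementations weight each year by its measurement uncertainty and report a dating probability):

```python
import numpy as np

def wiggle_match(floating, curve):
    """Slide a floating annual 14C series along a reference curve and
    return the integer offset (in years) minimizing the mean squared
    mismatch between the series and the curve segment."""
    floating = np.asarray(floating, float)
    n = len(floating)
    sse = [np.mean((curve[o:o + n] - floating) ** 2)
           for o in range(len(curve) - n + 1)]
    return int(np.argmin(sse))
```

Sharp annual-scale features ('spikes') make the minimum far more pronounced, which is why the abstract singles them out as anchoring opportunities.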
High resolution wetland mapping in West Siberian taiga zone for methane emission inventory
NASA Astrophysics Data System (ADS)
Terentieva, I. E.; Glagolev, M. V.; Lapshina, E. D.; Sabrekov, A. F.; Maksyutov, S. S.
2015-12-01
High-latitude wetlands are important for understanding climate change risks because these environments sink carbon and emit methane. The fine-scale heterogeneity of wetland landscapes poses challenges for producing greenhouse gas flux inventories based on point observations. To reduce uncertainties at the regional scale, we mapped wetlands and water bodies in the taiga zone of West Siberia on a scene-by-scene basis using supervised classification of Landsat imagery. The training dataset was based on high-resolution images and field data collected at 28 test areas. The classification scheme was aimed at methane inventory applications and included 7 wetland ecosystem types composing 9 wetland complexes in different proportions. Accuracy assessment based on 1082 validation polygons of 10 × 10 pixels indicated an overall map accuracy of 79 %. The total area of wetlands and water bodies was estimated at 52.4 Mha, or 4-12 % of the global wetland area. Ridge-hollow complexes prevail in West Siberia's taiga, occupying 33 % of the domain, followed by forested bogs or "ryams" (23 %), ridge-hollow-lake complexes (16 %), open fens (8 %), palsa complexes (7 %), open bogs (5 %), patterned fens (4 %), and swamps (4 %). Oligotrophic environments dominate the wetland ecosystems, while fens cover only 14 % of the area. Because of the significant update in wetland ecosystem coverage, a considerable re-evaluation of total CH4 emissions from the region is expected. The new Landsat-based map of West Siberian taiga wetlands provides a benchmark for validating coarse-resolution global land cover products and high-latitude wetland datasets.
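The reported 79 % overall accuracy comes from a confusion matrix over the validation polygons. A minimal sketch of that computation (the function name is ours):

```python
import numpy as np

def overall_accuracy(truth, predicted, n_classes):
    """Overall accuracy (trace / total) from a confusion matrix built
    over validation samples, as in polygon-based map assessment."""
    cm = np.zeros((n_classes, n_classes), int)
    for t, p in zip(truth, predicted):
        cm[t, p] += 1
    return np.trace(cm) / cm.sum(), cm
```

The off-diagonal entries of the returned matrix also give the per-class commission and omission errors that a full accuracy assessment would report.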
An empirical understanding of triple collocation evaluation measure
NASA Astrophysics Data System (ADS)
Scipal, Klaus; Doubkova, Marcela; Hegyova, Alena; Dorigo, Wouter; Wagner, Wolfgang
2013-04-01
The triple collocation method is an advanced evaluation method that has been used in the soil moisture field for only about half a decade. The method requires three datasets with independent error structures that represent the same phenomenon. Its main advantages are that it a) does not require a reference dataset assumed to represent the truth, b) limits the effect of random and systematic errors in the other two datasets, and c) simultaneously assesses the errors of all three datasets. The objective of this presentation is to assess the triple collocation error (Tc) of the ASAR Global Mode Surface Soil Moisture (GM SSM) 1 km dataset and to highlight problems with the method related to its ability to cancel the effect of errors in ancillary datasets. In particular, the goals are a) to investigate trends in Tc related to the change in spatial resolution from 5 to 25 km, b) to investigate trends in Tc related to the choice of hydrological model, and c) to study the relationship between Tc and other absolute evaluation measures (namely RMSE and error propagation, EP). The triple collocation method is implemented using ASAR GM, AMSR-E, and a model (AWRA-L, GLDAS-NOAH, or ERA-Interim). First, the significance of the relationship between the three soil moisture datasets was tested, a prerequisite for the triple collocation method. Second, trends in Tc related to the choice of the third reference dataset and to scale were assessed. For this purpose, the triple collocation was repeated, replacing AWRA-L with two globally available model reanalysis datasets operating at different spatial resolutions (ERA-Interim and GLDAS-NOAH). Finally, the retrieved results were compared to the RMSE and EP evaluation measures. Our results demonstrate that the Tc method does not eliminate the random and time-variant systematic errors of the second and third datasets used. Possible reasons include a) that the Tc method cannot fully function with datasets at very different spatial resolutions, and b) that the errors were not fully independent as initially assumed.
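The covariance-based form of triple collocation estimates each dataset's error variance from the three pairwise covariances, assuming zero-mean, mutually independent errors. A sketch (notation ours):

```python
import numpy as np

def triple_collocation(x, y, z):
    """Covariance-based triple collocation error estimation for three
    collocated datasets, assuming zero-mean, mutually independent
    errors: sigma_x^2 = Var(x) - Cov(x,y) * Cov(x,z) / Cov(y,z),
    and cyclically for y and z."""
    c = np.cov(np.vstack([x, y, z]))   # rows are the three variables
    ex = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
    ey = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
    ez = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
    return ex, ey, ez
```

On synthetic data built as a common "truth" plus independent noise, the recovered variances approach the true noise variances; violating the independence assumption, as the abstract observes, biases these estimates.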
Large-Scale Image Analytics Using Deep Learning
NASA Astrophysics Data System (ADS)
Ganguly, S.; Nemani, R. R.; Basu, S.; Mukhopadhyay, S.; Michaelis, A.; Votava, P.
2014-12-01
High-resolution land cover classification maps are needed to increase the accuracy of current land ecosystem and climate model outputs. Few studies demonstrate the state of the art in deriving very high resolution (VHR) land cover products, and most methods rely heavily on commercial software that is difficult to scale to large regions of study (e.g., continents to the globe). Complexities in present approaches relate to (a) scalability of the algorithm, (b) large image data processing (compute and memory intensive), (c) computational cost, (d) massively parallel architecture, and (e) machine learning automation. In addition, VHR satellite datasets are of the order of terabytes, and features extracted from these datasets are of the order of petabytes. In our present study, we acquired the National Agriculture Imagery Program (NAIP) dataset for the continental United States at a spatial resolution of 1 m. These data come as image tiles (a total of a quarter million image scenes with ~60 million pixels each) and have a total size of ~100 terabytes for a single acquisition. Features extracted from the entire dataset would amount to ~8-10 petabytes. In our proposed approach, we implemented a novel semi-automated machine learning algorithm rooted in the principles of "deep learning" to delineate the percentage of tree cover. To perform image analytics in such a granular system, it is mandatory to devise an intelligent archiving and query system for image retrieval, file structuring, metadata processing, and filtering of all available image scenes. Using the Open NASA Earth Exchange (NEX) initiative, a partnership with Amazon Web Services (AWS), we developed an end-to-end architecture for designing the database and the deep belief network (following the DistBelief computing model) to solve the grand challenge of scaling this process across the quarter million NAIP tiles that cover the entire continental United States. The AWS core components we use are DynamoDB with S3 for database query and storage, the ElastiCache shared memory architecture for image segmentation, Elastic MapReduce (EMR) for image feature extraction, and memory-optimized Elastic Compute Cloud (EC2) instances for the learning algorithm.
Avulsion research using flume experiments and highly accurate and temporal-rich SfM datasets
NASA Astrophysics Data System (ADS)
Javernick, L.; Bertoldi, W.; Vitti, A.
2017-12-01
SfM's ability to produce high-quality, large-scale digital elevation models (DEMs) of complicated and rapidly evolving systems has made it a valuable technique for low-budget researchers and practitioners. While SfM has provided valuable datasets that capture single-flood-event DEMs, there is an increasing scientific need for higher temporal resolution datasets that can quantify evolutionary processes rather than pre- and post-flood snapshots. However, the dangerous field conditions during flood events and image-matching challenges (e.g., wind, rain) prevent quality SfM image acquisition. Conversely, flume experiments offer opportunities to document flood events, but achieving DEMs consistent and accurate enough to detect subtle changes in dry and inundated areas remains a challenge for SfM (e.g., parabolic error signatures). This research aimed to investigate the impact of naturally occurring and manipulated avulsions on braided river morphology and on the encroachment of floodplain vegetation, using laboratory experiments. This required DEMs with millimeter accuracy and precision, at a temporal resolution able to capture the processes; SfM was chosen as the most practical method. Through a redundant local network design and a meticulous ground control point (GCP) survey with a Leica total station in red-laser configuration (reported 2 mm accuracy), the SfM residuals compared to separate ground-truthing data gave mean errors of 1.5 mm (accuracy) and standard deviations of 1.4 mm (precision), without parabolic error signatures. Lighting in the flume was limited to uniform, oblique, filtered LED strips, which removed glint and thus improved bed elevation mean errors to 4 mm; errors were further reduced using open-source software for refraction correction.
The obtained datasets have made it possible to quantify how small flood events with avulsion can have morphologic and vegetation impacts similar to those of large flood events without avulsion. Further, this research highlights the potential of SfM in the laboratory and its ability to document physical and biological processes at greater spatial and temporal resolution. Marie Sklodowska-Curie Individual Fellowship: River-HMV, 656917
Capability of AVHRR data in discriminating rangeland cover mixtures
Senay, Gabriel B.; Elliott, R.L.
2002-01-01
A combination of high-temporal-resolution Advanced Very High Resolution Radiometer (AVHRR) data and high-spatial-resolution Map Information Analysis and Display System (MIADS) land use/land cover data from the United States Department of Agriculture-Natural Resources Conservation Service (USDA-NRCS) was used to investigate the feasibility of using the combined dataset for regional evapotranspiration (ET) studies. It was shown that the biweekly maximum Normalized Difference Vegetation Index (NDVI) composite AVHRR data were capable of discriminating rangelands with different tree and shrub species. AVHRR data also showed potential to distinguish canopy cover differences within a mix of similar species. The combination of MIADS data and AVHRR data can be used to study the temporal dynamics of various cover types for use in regional ET estimates.
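The biweekly maximum NDVI composites mentioned above follow the classic maximum-value compositing rule: per pixel, keep the largest NDVI within each window, which suppresses cloud- and atmosphere-depressed values. A sketch (the function name and array layout are ours):

```python
import numpy as np

def max_value_composite(ndvi_stack, period=14):
    """Maximum-value compositing: per pixel, keep the largest NDVI in
    each `period`-day window of a (days, rows, cols) stack, which
    suppresses cloud- and atmosphere-depressed values."""
    n_days = ndvi_stack.shape[0]
    return np.stack([ndvi_stack[s:s + period].max(axis=0)
                     for s in range(0, n_days, period)])
```

A 28-day stack with a 14-day period thus yields two composite images, one per biweekly window.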
NASA Astrophysics Data System (ADS)
Xu, Z.; Rhoades, A.; Johansen, H.; Ullrich, P. A.; Collins, W. D.
2017-12-01
Dynamical downscaling is widely used to properly characterize the regional surface heterogeneities that shape the local hydroclimatology. However, the factors in dynamical downscaling, including the refinement of model horizontal resolution, the large-scale forcing datasets, and the dynamical cores, have not been fully evaluated. Two cutting-edge global-to-regional downscaling methods are used to assess these factors, specifically the variable-resolution Community Earth System Model (VR-CESM) and the Weather Research and Forecasting (WRF) regional climate model, at different horizontal resolutions (28, 14, and 7 km). Two groups of WRF simulations are driven by either the NCEP reanalysis dataset (WRF_NCEP) or VR-CESM outputs (WRF_VRCESM) to evaluate the effects of the large-scale forcing datasets. The impact of the dynamical core is assessed by comparing the VR-CESM simulations to the coupled WRF_VRCESM simulations with the same physical parameterizations and similar grid domains. The simulated hydroclimatology (i.e., total precipitation, snow cover, snow water equivalent (SWE), and surface temperature) is compared with reference datasets. The large-scale forcing datasets are critical to the WRF simulations for accurately simulating total precipitation, SWE, and snow cover, but not surface temperature. Both the WRF and VR-CESM results highlight that no significant benefit to the simulated hydroclimatology is found by refining horizontal resolution from 28 to 7 km. Simulated surface temperature is sensitive to the choice of dynamical core: WRF generally simulates higher temperatures than VR-CESM, alleviating the systematic cold bias of DJF temperatures over the California mountain region but overestimating JJA temperatures in California's Central Valley.
The partitioning of total nitrate (TNO3) and total ammonium (TNH4) between gas and aerosol phases is studied with two thermodynamic equilibrium models, ISORROPIA and AIM, and three datasets: high time-resolution measurement data from the 1999 Atlanta SuperSite Experiment and from...
Metzger, Marc J.; Bunce, Robert G.H.; Jongman, Rob H.G.; Sayre, Roger G.; Trabucco, Antonio; Zomer, Robert
2013-01-01
Main conclusions: The GEnS provides a robust spatial analytical framework for the aggregation of local observations, identification of gaps in current monitoring efforts and systematic design of complementary and new monitoring and research. The dataset is available for non-commercial use through the GEO portal (http://www.geoportal.org).
Mesocell study area snow distributions for the Cold Land Processes Experiment (CLPX)
Glen E. Liston; Christopher A. Hiemstra; Kelly Elder; Donald W. Cline
2008-01-01
The Cold Land Processes Experiment (CLPX) had a goal of describing snow-related features over a wide range of spatial and temporal scales. This required linking disparate snow tools and datasets into one coherent, integrated package. Simulating realistic high-resolution snow distributions and features requires a snow-evolution modeling system (SnowModel) that can...
NASA Astrophysics Data System (ADS)
Baru, C.; Arrowsmith, R.; Crosby, C.; Nandigam, V.; Phan, M.; Cowart, C.
2012-04-01
OpenTopography is a cyberinfrastructure-based facility for online access to high-resolution topography data and tools. The project is an outcome of the Geosciences Network (GEON), a US-funded research project that investigated the use of cyberinfrastructure to support research and education in the geosciences. OpenTopography provides online access to large LiDAR point cloud datasets along with services for processing these data. Users can generate custom DEMs by invoking OpenTopography's DEM services with custom parameter values, track the progress of their jobs, and retain job information and outputs in a private myOpenTopo area. Data available at OpenTopography are provided by a variety of data acquisition groups under joint agreements and memoranda of understanding (MoU), including national facilities such as the National Center for Airborne Laser Mapping as well as local, state, and federal agencies. OpenTopography is also being designed as a hub for high-resolution topography resources: datasets and services available at other locations can be registered here, providing a "one-stop shop" for such information. We will describe the OpenTopography system architecture and its current feature set, including the service-oriented architecture, a job-tracking database, and social networking features. We will also describe several design and development activities underway to archive and publish datasets using digital object identifiers (DOIs); create a more flexible and scalable high-performance environment for processing large datasets; extend support for satellite-based and terrestrial lidar as well as synthetic aperture radar (SAR) data; and create a "pluggable" infrastructure for third-party services. OpenTopography has successfully created a facility for sharing lidar data. In the next phase, we are developing a facility that will also enable equally easy and successful sharing of services related to these data.
A global dataset of crowdsourced land cover and land use reference data.
Fritz, Steffen; See, Linda; Perger, Christoph; McCallum, Ian; Schill, Christian; Schepaschenko, Dmitry; Duerauer, Martina; Karner, Mathias; Dresel, Christopher; Laso-Bayas, Juan-Carlos; Lesiv, Myroslava; Moorthy, Inian; Salk, Carl F; Danylo, Olha; Sturn, Tobias; Albrecht, Franziska; You, Liangzhi; Kraxner, Florian; Obersteiner, Michael
2017-06-13
Global land cover is an essential climate variable and a key biophysical driver for earth system models. While remote sensing technology, particularly satellites, has played a key role in providing land cover datasets, large discrepancies have been noted among the available products. Global land use is typically more difficult to map and in many cases cannot be remotely sensed. In-situ or ground-based data and high-resolution imagery are thus an important requirement for producing accurate land cover and land use datasets, and this is precisely what is lacking. Here we describe the global land cover and land use reference data derived from the Geo-Wiki crowdsourcing platform via four campaigns. These global datasets provide information on human impact, land cover disagreement, wilderness, and land cover and land use. Hence, they are relevant for the scientific community that requires reference data for global satellite-derived products, as well as those interested in monitoring global terrestrial ecosystems in general.
Se-SAD serial femtosecond crystallography datasets from selenobiotinyl-streptavidin
Yoon, Chun Hong; DeMirci, Hasan; Sierra, Raymond G.; ...
2017-04-25
We provide a detailed description of selenobiotinyl-streptavidin (Se-B SA) co-crystal datasets recorded using the Coherent X-ray Imaging (CXI) instrument at the Linac Coherent Light Source (LCLS) for selenium single-wavelength anomalous diffraction (Se-SAD) structure determination. Se-B SA was chosen as the model system for the high affinity between biotin and streptavidin, where the sulfur atom in the biotin molecule (C10H16N2O3S) is substituted with selenium. The dataset was collected at three different transmissions (100, 50, and 10%) using a serial sample chamber setup which allows two sample chambers, a front chamber and a back chamber, to operate simultaneously. Diffraction patterns from Se-B SA were recorded to a resolution of 1.9 Å. The dataset is publicly available through the Coherent X-ray Imaging Data Bank (CXIDB) and also on LCLS compute nodes as a resource for research and algorithm development.
Crystal cryocooling distorts conformational heterogeneity in a model Michaelis complex of DHFR
Keedy, Daniel A.; van den Bedem, Henry; Sivak, David A.; Petsko, Gregory A.; Ringe, Dagmar; Wilson, Mark A.; Fraser, James S.
2014-01-01
Most macromolecular X-ray structures are determined from cryocooled crystals, but it is unclear whether cryocooling distorts functionally relevant flexibility. Here we compare independently acquired pairs of high-resolution datasets of a model Michaelis complex of dihydrofolate reductase (DHFR), collected by separate groups at both room and cryogenic temperatures. These datasets allow us to isolate the differences between experimental procedures and between temperatures. Our analyses of multiconformer models and time-averaged ensembles suggest that cryocooling suppresses and otherwise modifies side-chain and main-chain conformational heterogeneity, quenching dynamic contact networks. Despite some idiosyncratic differences, most changes from room temperature to cryogenic temperature are conserved and likely reflect temperature-dependent solvent remodeling. Both cryogenic datasets point to additional conformations not evident in the corresponding room-temperature datasets, suggesting that cryocooling does not merely trap pre-existing conformational heterogeneity. Our results demonstrate that crystal cryocooling consistently distorts the energy landscape of DHFR, a paragon for understanding functional protein dynamics. PMID:24882744
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draxl, C.; Hodge, B. M.; Orwig, K.
2013-10-01
Regional wind integration studies in the United States require detailed wind power output data at many locations to simulate how the power system will operate under high-penetration scenarios. The wind datasets that serve as inputs to such studies must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, and must be time-synchronized with available load profiles. The Wind Integration National Dataset (WIND) Toolkit described in this paper fulfills these requirements. A wind resource dataset, wind power production time series, and simulated forecasts from a numerical weather prediction model run on a nationwide 2-km grid at 5-min resolution will be made publicly available for more than 110,000 onshore and offshore wind power production sites.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Getman, Dan; Bush, Brian; Inman, Danny
Data used by the National Renewable Energy Laboratory (NREL) in energy analysis are often produced by industry and licensed or purchased for analysis. While this practice provides needed flexibility in selecting data for analysis, it presents challenges in understanding the differences among multiple, ostensibly similar, datasets. As options for source data become more varied, it is important to be able to articulate why certain datasets were chosen and to ensure those chosen include the data that best meet the boundaries and/or limitations of a particular analysis. This report represents the first of three phases of research intended to develop methods to quantitatively assess and compare both input datasets and the results of analyses performed at NREL. This capability is critical to identifying tipping points in the costs or benefits of achieving high spatial and temporal resolution of input data.
SUMER-IRIS Observations of AR11875
NASA Astrophysics Data System (ADS)
Schmit, Donald; Innes, Davina
2014-05-01
We present results of the first joint observing campaign of IRIS and SOHO/SUMER. While the IRIS datasets provide information on the chromosphere and transition region, SUMER provides complementary diagnostics of the corona. On 2013-10-24, we observed an active region, AR11875, and the surrounding plage for approximately 4 hours using rapid-cadence observing programs. These datasets include spectra from a small C-class flare which occurred in conjunction with an Ellerman-bomb-type event. Our analysis focuses on how the high spatial resolution and slit-jaw imaging capabilities of IRIS shed light on the unresolved structure of transient events in the SUMER catalog.
NASA Astrophysics Data System (ADS)
Hiebl, Johann; Frei, Christoph
2018-04-01
Spatial precipitation datasets that are long-term consistent, highly resolved and extend over several decades are an increasingly popular basis for modelling and monitoring environmental processes and for planning tasks in hydrology, agriculture, energy resources management, etc. Here, we present a gridded dataset of daily precipitation for Austria meant to promote such applications. It has a grid spacing of 1 km, extends back to 1961 and is continuously updated. It is constructed with the classical two-tier analysis, involving separate interpolations for mean monthly precipitation and daily relative anomalies. The former was accomplished by kriging with topographic predictors as external drift, utilising 1249 stations; the latter is based on angular distance weighting and uses 523 stations. The input station network was kept largely stationary over time to avoid artefacts in long-term consistency. Example cases suggest that the new analysis is at least as plausible as previously existing datasets. Cross-validation and comparison against experimental high-resolution observations (WegenerNet) suggest that the accuracy of the dataset depends on interpretation. Users interpreting grid point values as point estimates must expect systematic overestimates for light and underestimates for heavy precipitation, as well as substantial random errors; grid point estimates are typically within a factor of 1.5 of in situ observations. Interpreting grid point values as area mean values, conditional biases are reduced and the magnitude of random errors is considerably smaller. Together with a similar dataset of temperature, the new dataset (SPARTACUS) is an interesting basis for modelling environmental processes, studying climate change impacts and monitoring the climate of Austria.
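The two-tier analysis described above can be sketched in its simplest form: a high-resolution field of monthly normals is multiplied by an interpolated field of daily relative anomalies to recover the daily precipitation grid. The function name and toy grids below are hypothetical illustrations of the recombination step only, not the SPARTACUS implementation (which interpolates each tier from station data).

```python
import numpy as np

def two_tier_daily_precip(monthly_normal, daily_anomaly):
    """Recombine a high-resolution monthly-normal grid (mm/day) with a
    daily relative-anomaly grid (dimensionless ratio) on the same mesh."""
    return monthly_normal * daily_anomaly

# Toy 2x2 grids: normals in mm/day, anomalies as ratios to the normal
normal = np.array([[3.0, 5.0], [2.0, 4.0]])
anomaly = np.array([[0.5, 1.2], [0.0, 2.0]])
print(two_tier_daily_precip(normal, anomaly))
```

The appeal of the split is that the normals carry the fine topographic structure (dense station set, kriging with elevation as drift), while the anomalies, which vary smoothly in space, can be interpolated from a sparser but temporally complete network.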
Recommended GIS Analysis Methods for Global Gridded Population Data
NASA Astrophysics Data System (ADS)
Frye, C. E.; Sorichetta, A.; Rose, A.
2017-12-01
When using geographic information systems (GIS) to analyze gridded, i.e., raster, population data, analysts need a detailed understanding of several factors that affect raster data processing and, thus, the accuracy of the results. Global raster data are most often provided in an unprojected state, usually in the WGS 1984 geographic coordinate system. Most GIS functions and tools evaluate data based on overlay relationships (area) or proximity (distance). Area and distance for global raster data can be calculated either directly on the various earth ellipsoids or after transforming the data to equal-area/equidistant projected coordinate systems so that all locations are analyzed equally. However, unlike when projecting vector data, not all projected coordinate systems can support such analyses equally, and transforming raster data from one coordinate space to another often results in unmanaged loss of data through a process called resampling. Resampling determines which values to use in the result dataset given an imperfect locational match with the input dataset(s). Cell size or resolution, registration, resampling method, statistical type, and whether the raster represents continuous or discrete information all potentially influence the quality of the result. Gridded population data represent estimates of population in each raster cell, and this presentation will provide guidelines for accurately transforming population rasters for analysis in GIS. Resampling affects the display of high-resolution global gridded population data, and we will discuss how to properly handle pyramid creation using the Aggregate tool with the sum option to create overviews for mosaic datasets.
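The sum-option aggregation mentioned above can be sketched with NumPy: coarsening a population raster must sum, not average or nearest-neighbor-sample, the people in each block, so that the total population is conserved. This is a minimal illustration of the idea behind the Aggregate tool with the SUM option, not the tool itself; the function name and toy grid are assumptions.

```python
import numpy as np

def aggregate_sum(pop, factor):
    """Coarsen a population raster by an integer factor, summing the
    people in each factor x factor block so the grand total is preserved."""
    rows, cols = pop.shape
    assert rows % factor == 0 and cols % factor == 0
    # Split the grid into blocks and sum within each block
    blocks = pop.reshape(rows // factor, factor, cols // factor, factor)
    return blocks.sum(axis=(1, 3))

pop = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 population grid
coarse = aggregate_sum(pop, 2)
assert coarse.sum() == pop.sum()  # no people gained or lost
```

A bilinear or nearest-neighbor overview, by contrast, reports per-cell values at the coarse resolution and therefore understates the population visible at low zoom levels.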
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; LaFontaine, Frank J.; Kumar, Sujay V.; Peters-Lidard, Christa D.
2012-01-01
Since June 2010, the NASA Short-term Prediction Research and Transition (SPoRT; Goodman et al. 2004; Darden et al. 2010; Stano et al. 2012; Fuell et al. 2012) Center has been generating a real-time Normalized Difference Vegetation Index (NDVI) and corresponding Green Vegetation Fraction (GVF) composite based on reflectances from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) instrument. This dataset is generated at 0.01° resolution across the Continental United States (CONUS) and updated daily. The goal of producing such a vegetation dataset is to improve on the default climatological GVF dataset in land surface and numerical weather prediction models, in order to better simulate heat and moisture exchange between the land surface and the planetary boundary layer. Details of the SPoRT/MODIS vegetation composite algorithm are presented in Case et al. (2011). Vegetation indices such as GVF and Leaf Area Index (LAI) are used by land surface models (LSMs) to represent the horizontal and vertical density of plant vegetation (Gutman and Ignatov 1998), in order to calculate transpiration, interception and radiative shading. Both of these indices are related to the NDVI; however, there is an inherent ambiguity in determining GVF and LAI simultaneously from NDVI, as described in Gutman and Ignatov (1998). One practice is to specify the LAI while allowing the GVF to vary both spatially and temporally, as is done in the Noah LSM (Chen and Dudhia 2001; Ek et al. 2003). Operational versions of Noah within several of the National Centers for Environmental Prediction (NCEP) global and regional modeling systems hold the LAI fixed while the GVF varies according to a global monthly climatology. This GVF climatology was derived from NDVI data from the NOAA Advanced Very High Resolution Radiometer (AVHRR) polar-orbiting satellite, using information from 1985 to 1991 (Gutman and Ignatov 1998; Jiang et al. 2010). Representing data at the mid-point of every month, the climatological dataset is on a grid with 0.144° (16 km) spatial resolution and is distributed with the community WRF model (Ek et al. 2003; Jiang et al. 2010; Skamarock et al. 2008).
Transmembrane helix prediction: a comparative evaluation and analysis.
Cuthbertson, Jonathan M; Doyle, Declan A; Sansom, Mark S P
2005-06-01
The prediction of transmembrane (TM) helices plays an important role in the study of membrane proteins, given the relatively small number (approximately 0.5% of the PDB) of high-resolution structures for such proteins. We used two datasets (one redundant and one non-redundant) of high-resolution structures of membrane proteins to evaluate and analyse TM helix prediction. The redundant (non-redundant) dataset contains the structures of 434 (268) TM helices from 112 (73) polypeptide chains. Of the 434 helices in the redundant dataset, 20 may be classified as 'half-TM' as they are too short to span a lipid bilayer. We compared 13 TM helix prediction methods, evaluating each method using per-segment, per-residue and termini scores. Four methods consistently performed well: SPLIT4, TMHMM2, HMMTOP2 and TMAP. However, even the best methods were in error by, on average, about two turns of helix at the TM helix termini. The best- and worst-case predictions for individual proteins were analysed. In particular, the performance of the various methods, and of a consensus prediction method, was compared for a number of proteins (e.g. SecY, ClC, KvAP) containing half-TM helices. The difficulty of predicting half-TM helices suggests that current prediction methods successfully embody the two-state model of membrane protein folding, but do not accommodate a third stage in which, e.g., short helices and re-entrant loops fold within a bundle of stable TM helices.
Datasets on hub-height wind speed comparisons for wind farms in California.
Wang, Meina; Ullrich, Paul; Millstein, Dev
2018-08-01
This article describes the data supporting the research article entitled "The future of wind energy in California: Future projections with the Variable-Resolution CESM" [1], with reference number RENE_RENE-D-17-03392. Datasets from the Variable-Resolution CESM, Det Norske Veritas Germanischer Lloyd Virtual Met, MERRA-2, CFSR, NARR, ISD surface observations, and upper-air sounding observations were used to calculate and compare hub-height wind speed at multiple major wind farms across California. Information on hub-height wind speed interpolation and power curves at each wind farm site is also presented. All datasets, except Det Norske Veritas Germanischer Lloyd Virtual Met, are publicly available for future analysis.
Modelling land use change in the Ganga basin
NASA Astrophysics Data System (ADS)
Moulds, Simon; Mijic, Ana; Buytaert, Wouter
2014-05-01
Over recent decades the green revolution in India has driven substantial environmental change. Modelling experiments have identified northern India as a "hot spot" of land-atmosphere coupling strength during the boreal summer. However, there is a wide range of sensitivity of atmospheric variables to soil moisture between individual climate models. The lack of a comprehensive land use change dataset to force climate models has been identified as a major contributor to model uncertainty. This work aims to construct a monthly time series dataset of land use change for the period 1966 to 2007 for northern India to improve the quantification of regional hydrometeorological feedbacks. The Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on board the Aqua and Terra satellites provides near-continuous remotely sensed datasets from 2000 to the present day. However, the quality and availability of satellite products before 2000 is poor. To complete the dataset MODIS images are extrapolated back in time using the Conversion of Land Use and its Effects at Small regional extent (CLUE-S) modelling framework, recoded in the R programming language to overcome limitations of the original interface. Non-spatial estimates of land use area published by the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) for the study period, available on an annual, district-wise basis, are used as a direct model input. Land use change is allocated spatially as a function of biophysical and socioeconomic drivers identified using logistic regression. The dataset will provide an essential input to a high-resolution, physically-based land-surface model to generate the lower boundary condition to assess the impact of land use change on regional climate.
Vanderhoof, Melanie; Distler, Hayley; Lang, Megan W.; Alexander, Laurie C.
2018-01-01
The dependence of downstream waters on upstream ecosystems necessitates an improved understanding of watershed-scale hydrological interactions, including connections between wetlands and streams. Evaluating such connections is challenging when (1) accurate and complete datasets of wetland and stream locations are not available and (2) natural variability in surface-water extent influences the frequency and duration of wetland/stream connectivity. The Upper Choptank River watershed on the Delmarva Peninsula in eastern Maryland and Delaware is dominated by a high density of small, forested wetlands. In this analysis, wetland/stream surface-water connections were quantified using multiple wetland and stream datasets, including headwater streams and depressions mapped from a lidar-derived digital elevation model. Surface-water extent was mapped across the watershed for spring 2015 using Landsat-8, Radarsat-2 and Worldview-3 imagery. The frequency of wetland/stream connections increased as a more complete and accurate stream dataset was used and surface-water extent was included, in particular when the spatial resolution of the imagery was finer (i.e., <10 m). Depending on the datasets used, 12–60% of wetlands by count (21–93% of wetlands by area) experienced surface-water interactions with streams during spring 2015. This translated into a range of 50–94% of the watershed contributing direct surface-water runoff to streamflow. This finding suggests that our interpretation of the frequency and duration of wetland/stream connections will be influenced not only by the spatial and temporal characteristics of wetlands, streams and potential flowpaths, but also by the completeness, accuracy and resolution of the input datasets.
Pengra, Bruce; Long, Jordan; Dahal, Devendra; Stehman, Stephen V.; Loveland, Thomas R.
2015-01-01
The methodology for the selection, creation, and application of a global remote sensing validation dataset using high-resolution commercial satellite data is presented. High-resolution data are obtained for a stratified random sample of 500 primary sampling units (5 km × 5 km sample blocks), where stratification based on Köppen climate classes is used to distribute the sample globally among biomes. The high-resolution data are classified into categorical land cover maps using an analyst-mediated classification workflow. Our initial application of these data is to evaluate a global 30 m Landsat-derived, continuous-field tree cover product. For this application, the categorical reference classification produced at 2 m resolution is converted to percent tree cover per 30 m pixel (secondary sampling unit) for comparison to Landsat-derived estimates of tree cover. We provide example results (based on a subsample of 25 sample blocks in South America) illustrating basic analyses of agreement that can be produced from these reference data. Commercial high-resolution data availability and data quality are shown to provide a viable means of validating continuous-field tree cover. When completed, the reference classifications for the full sample of 500 blocks will be released for public use.
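The conversion step above, from a fine categorical map to percent tree cover per coarse pixel, amounts to block-averaging a tree/non-tree mask. The sketch below uses a 2x coarsening on a toy grid purely for illustration (the actual workflow aggregates 2 m classes into 30 m pixels, a factor of 15); the function name and class codes are assumptions.

```python
import numpy as np

def percent_tree_cover(classified, tree_class, factor):
    """Convert a fine-resolution categorical map to percent tree cover
    per coarse pixel by block-averaging a binary tree mask."""
    mask = (classified == tree_class).astype(float)
    rows, cols = mask.shape
    assert rows % factor == 0 and cols % factor == 0
    blocks = mask.reshape(rows // factor, factor, cols // factor, factor)
    return 100.0 * blocks.mean(axis=(1, 3))

# Toy 4x4 map with class 1 = tree, aggregated to 2x2 coarse pixels
cmap = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 1],
                 [0, 0, 1, 1],
                 [0, 0, 1, 1]])
print(percent_tree_cover(cmap, tree_class=1, factor=2))
```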
Spatial Downscaling of Alien Species Presences using Machine Learning
NASA Astrophysics Data System (ADS)
Daliakopoulos, Ioannis N.; Katsanevakis, Stelios; Moustakas, Aristides
2017-07-01
Large-scale, high-resolution data on alien species distributions are essential for spatially explicit assessments of their environmental and socio-economic impacts, and for management interventions aimed at mitigation. However, these data are often unavailable. This paper presents a method that relies on Random Forest (RF) models to distribute alien species presence counts over a finer-resolution grid, thus achieving spatial downscaling. A sufficiently large number of RF models are trained using random subsets of the dataset as predictors, in a bootstrapping approach, to account for the uncertainty introduced by the subset selection. The method is tested with an approximately 8×8 km² grid containing floral alien species presences and several climatic, habitat and land-use covariates for the Mediterranean island of Crete, Greece. Alien species presence is aggregated at 16×16 km² and used as a predictor of presence at the original resolution, thus simulating spatial downscaling. Potential explanatory variables included habitat types, land cover richness, endemic species richness, soil type, temperature, precipitation, and freshwater availability. Uncertainty in the spatial downscaling of alien species occurrences was also assessed, and true/false presences and absences were quantified. The approach is promising for downscaling alien species datasets of larger spatial scale but coarse resolution, where the underlying environmental information is available at a finer resolution than the alien species data. Furthermore, the RF architecture allows tuning towards operationally optimal sensitivity and specificity, thus providing a decision support tool for designing a resource-efficient alien species census.
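The bootstrapped RF ensemble described above can be sketched with scikit-learn: many Random Forests are fit on resampled subsets of the training data, and the spread of their predictions gives an uncertainty estimate alongside the downscaled mean. Everything here, the synthetic covariates, the Poisson-like presence counts, the ensemble size, is a hypothetical stand-in for the paper's Crete dataset, not its actual setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins: fine-grid covariates and presence counts to downscale
n_fine = 200
X = rng.random((n_fine, 5))                                 # climate/habitat/land-use covariates
y = (10 * X[:, 0] + rng.poisson(2, n_fine)).astype(float)   # presence counts

# Bootstrap ensemble: each RF sees a resampled subset of the training rows,
# capturing the uncertainty introduced by subset selection
predictions = []
for seed in range(20):
    idx = rng.choice(n_fine, size=n_fine, replace=True)
    rf = RandomForestRegressor(n_estimators=50, random_state=seed)
    rf.fit(X[idx], y[idx])
    predictions.append(rf.predict(X))

pred = np.mean(predictions, axis=0)    # downscaled estimate per fine cell
spread = np.std(predictions, axis=0)   # bootstrap uncertainty per fine cell
```

Thresholding `pred` against observed presences is what would yield the true/false presence and absence counts used to tune sensitivity and specificity.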
Los Angeles and San Diego Margin High-Resolution Multibeam Bathymetry and Backscatter Data
Dartnell, Peter; Gardner, James V.; Mayer, Larry A.; Hughes-Clarke, John E.
2004-01-01
The U.S. Geological Survey, in cooperation with the University of New Hampshire and the University of New Brunswick, mapped the nearshore regions off Los Angeles and San Diego, California using multibeam echosounders. Multibeam bathymetry and co-registered, corrected acoustic backscatter were collected in water depths ranging from about 3 to 900 m offshore Los Angeles and from about 17 to 1230 m offshore San Diego. Continuous, 16-m-spatial-resolution, GIS-ready data for the entire Los Angeles and San Diego Margins are available online as separate USGS Open-File Reports. For ongoing research, the USGS has processed sub-regions within these datasets at finer resolutions; the resolution of each sub-region was determined by the density of soundings within the region. This Open-File Report contains the finer-resolution multibeam bathymetry and acoustic backscatter data that the USGS Western Region Coastal and Marine Geology Team had processed into GIS-ready formats as of April 2004. The data are available in ArcInfo GRID and XYZ formats. See the Los Angeles or San Diego maps for the sub-region locations. These datasets in their present form were not originally intended for publication; the bathymetry and backscatter contain data-collection and processing artifacts, and the data are being made public to fulfill a Freedom of Information Act request. Care must be taken not to confuse artifacts with real seafloor morphology and acoustic backscatter.
Online Visualization and Analysis of Merged Global Geostationary Satellite Infrared Dataset
NASA Technical Reports Server (NTRS)
Liu, Zhong; Ostrenga, D.; Leptoukh, G.; Mehta, A.
2008-01-01
The NASA Goddard Earth Sciences Data Information Services Center (GES DISC) is home of Tropical Rainfall Measuring Mission (TRMM) data archive. The global merged IR product also known as the NCEP/CPC 4-km Global (60 degrees N - 60 degrees S) IR Dataset, is one of TRMM ancillary datasets. They are globally merged (60 degrees N - 60 degrees S) pixel-resolution (4 km) IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 and GMS). The availability of data from METEOSAT-5, which is located at 63E at the present time, yields a unique opportunity for total global (60 degrees N- 60 degrees S) coverage. The GES DISC has collected over 8 years of the data beginning from February of 2000. This high temporal resolution dataset can not only provide additional background information to TRMM and other satellite missions, but also allow observing a wide range of meteorological phenomena from space, such as, mesoscale convection systems, tropical cyclones, hurricanes, etc. The dataset can also be used to verify model simulations. Despite that the data can be downloaded via ftp, however, its large volume poses a challenge for many users. A single file occupies about 70 MB disk space and there is a total of approximately 73,000 files (approximately 4.5 TB) for the past 8 years. In order to facilitate data access, we have developed a web prototype to allow users to conduct online visualization and analysis of this dataset. With a web browser and few mouse clicks, users can have a full access to over 8 year and over 4.5 TB data and generate black and white IR imagery and animation without downloading any software and data. In short, you can make your own images! Basic functions include selection of area of interest, single imagery or animation, a time skip capability for different temporal resolution and image size. 
Users can save an animation as a file (animated GIF) and import it into other presentation software, such as Microsoft PowerPoint. The prototype will be integrated into GIOVANNI, and existing GIOVANNI capabilities, such as data download and Google Earth KMZ export, will be available. Users will also be able to access other data products in the GIOVANNI family.
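The archive-volume figures quoted above can be checked with back-of-the-envelope arithmetic (a sanity check of the stated numbers, not part of the prototype):

```python
# Rough volume check for the merged IR archive, using the figures quoted
# in the abstract (~70 MB per file, ~73,000 files over 8 years).
n_files = 73_000
mb_per_file = 70

total_mb = n_files * mb_per_file        # 5,110,000 MB
total_tb = total_mb / 1024**2           # binary terabytes
```

The result, about 4.9 TB, matches the quoted "approximately 4.5 TB" to within rounding of the per-file size.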
Hernández-Ceballos, M A; Skjøth, C A; García-Mozo, H; Bolívar, J P; Galán, C
2014-12-01
Airborne pollen transport at micro-, meso-gamma and meso-beta scales must be studied with atmospheric models, and this is especially relevant in complex terrain. In such cases, the accuracy of these models is mainly determined by the spatial resolution of the underlying meteorological dataset. This work examines how meteorological datasets determine the results obtained from atmospheric transport models used to describe pollen transport in the atmosphere. We investigate the effect of spatial resolution when computing backward trajectories with the HYSPLIT model, using meteorological datasets from the WRF model at 27, 9 and 3 km resolution and from the GDAS files at 1° resolution. This work characterizes the atmospheric transport of Olea pollen in a region with complex flows. The results show that the complex terrain affects the trajectories and that this effect varies with the different meteorological datasets. Overall, the change from GDAS to WRF-ARW inputs improves the analyses with the HYSPLIT model, thereby increasing the understanding of the pollen episode. The results indicate that a spatial resolution of at least 9 km is needed to simulate atmospheric flows that are considerably affected by the relief of the landscape. They suggest that appropriate meteorological files should be chosen when atmospheric models are used to characterize the atmospheric transport of pollen at micro-, meso-gamma and meso-beta scales. Furthermore, at these scales, the results are believed to be generally applicable to related areas such as the description of atmospheric transport of radionuclides or the definition of nuclear-radioactivity emergency preparedness.
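The backward-trajectory idea at the core of this analysis can be sketched in a few lines. This is a deliberately minimal illustration with a synthetic, steady 2-D wind field; HYSPLIT itself performs full 3-D, time-varying integration with real meteorological fields, so everything here (grid, field, step size) is made up:

```python
import numpy as np

# Minimal backward-trajectory sketch: a parcel is stepped backward through a
# steady 2-D wind field sampled on a regular grid, using bilinear
# interpolation.  A coarser grid smooths the flow and shifts the path, which
# is why the driving dataset's resolution matters so much.

def bilinear(field, x, y):
    """Bilinearly interpolate `field` (indexed [y, x]) at fractional (x, y)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, field.shape[1] - 1)
    y1 = min(y0 + 1, field.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = field[y0, x0] * (1 - fx) + field[y0, x1] * fx
    bot = field[y1, x0] * (1 - fx) + field[y1, x1] * fx
    return top * (1 - fy) + bot * fy

def back_trajectory(u, v, start, n_steps, dt=1.0):
    """Integrate backward in time: x_{k-1} = x_k - u*dt (forward Euler)."""
    x, y = start
    path = [(x, y)]
    for _ in range(n_steps):
        x -= dt * bilinear(u, x, y)
        y -= dt * bilinear(v, x, y)
        x = min(max(x, 0.0), u.shape[1] - 1.0)   # stay on the grid
        y = min(max(y, 0.0), u.shape[0] - 1.0)
        path.append((x, y))
    return np.array(path)

# Synthetic rotational wind field on a 50 x 50 grid, centered at (25, 25)
n = 50
yy, xx = np.mgrid[0:n, 0:n].astype(float)
u = -(yy - n / 2) * 0.05
v = (xx - n / 2) * 0.05
path = back_trajectory(u, v, start=(35.0, 25.0), n_steps=100, dt=0.5)
```

With the rotational field the parcel traces an arc around the grid center; repeating the run with the field subsampled to a coarser grid would illustrate the resolution sensitivity the study quantifies.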
High-Performance Tiled WMS and KML Web Server
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2007-01-01
This software is an Apache 2.0 module implementing a high-performance map server to support interactive map viewers and virtual planet client software. It can be used in applications that require access to very-high-resolution geolocated images, such as GIS, virtual planet applications, and flight simulators. It serves Web Map Service (WMS) requests that comply with a given request grid from an existing tile dataset. It also generates the KML super-overlay configuration files required to access the WMS image tiles.
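A tiled WMS server can answer only requests whose bounding boxes coincide with its precomputed tile grid. The lookup such a server performs can be sketched as below; the grid parameters (36-degree level-0 tiles, power-of-two refinement) are hypothetical, since the module's actual request grid is configuration-driven:

```python
# Sketch of the tile-index lookup behind a tiled WMS server (hypothetical
# grid: level-0 tiles span 36 degrees; each level halves the span).

def tile_index(lon, lat, level, tile_deg=36.0):
    """Map a lon/lat to (col, row) on a power-of-two tile pyramid."""
    span = tile_deg / (2 ** level)
    col = int((lon + 180.0) // span)   # columns count east from -180
    row = int((90.0 - lat) // span)    # rows count south from +90
    return col, row

# A GetMap request is served from cache only if its bbox lands exactly on
# one of these tiles, e.g. the level-3 tile containing Los Angeles:
col, row = tile_index(-118.2, 34.1, level=3)
```

Requests that do not align with the grid would be rejected or resampled by a conventional WMS; serving only aligned tiles is what makes the server fast.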
Downscaling of Remotely Sensed Land Surface Temperature with multi-sensor based products
NASA Astrophysics Data System (ADS)
Jeong, J.; Baik, J.; Choi, M.
2016-12-01
Remotely sensed satellite data provide a bird's-eye view, which allows us to understand the spatiotemporal behavior of hydrologic variables at global scale. In particular, geostationary satellites continuously observing specific regions are useful for monitoring fluctuations of hydrologic variables as well as meteorological factors. However, problems remain regarding spatial resolution, namely whether fine-scale land cover can be represented at the spatial resolution of the satellite sensor, especially in areas of complex topography. To solve these problems, many researchers have tried to establish relationships among various hydrological factors and to combine images from multiple sensors to downscale land surface products. One geostationary satellite, the Communication, Ocean and Meteorological Satellite (COMS), carries a Meteorological Imager (MI) and a Geostationary Ocean Color Imager (GOCI). MI, performing the meteorological mission, produces Rainfall Intensity (RI), Land Surface Temperature (LST), and many other products every 15 minutes. Even though it has high temporal resolution, the low spatial resolution of MI data is treated as a major research problem in many studies. This study suggests a methodology to downscale the 4 km LST datasets derived from MI to a finer resolution (500 m) by using GOCI datasets over Northeast Asia. The Normalized Difference Vegetation Index (NDVI), recognized as a variable with a significant relationship to LST, is chosen to estimate LST at the finer resolution. Each pixel of NDVI and LST is separated according to land cover provided by the MODerate resolution Imaging Spectroradiometer (MODIS) to achieve a more accurate relationship. The downscaled LST is compared with LST observed by the Automated Synoptic Observing System (ASOS) to assess its accuracy. The downscaled LST results of this study, coupled with the advantages of geostationary satellites, can be applied to observe hydrologic processes efficiently.
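The regression-based downscaling idea can be sketched on synthetic data. This follows the general spirit of NDVI-sharpening (TsHARP-type) methods: fit LST against NDVI at the coarse scale, apply the fit at the fine scale, and add back the coarse residual so coarse-scale means are preserved. The per-land-cover stratification the study uses is omitted, and all numbers below are synthetic:

```python
import numpy as np

# NDVI-based LST downscaling sketch: coarse-scale fit, fine-scale
# application, residual correction.  Synthetic scene; not the study's code.

rng = np.random.default_rng(0)
factor = 8                                   # e.g. 4 km -> 500 m
fine_ndvi = rng.uniform(0.1, 0.9, size=(64, 64))

# "Truth" for the synthetic test: cooler where vegetation is denser
fine_lst_true = 320.0 - 25.0 * fine_ndvi + rng.normal(0, 0.3, (64, 64))

def block_mean(a, f):
    h, w = a.shape
    return a.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

coarse_lst = block_mean(fine_lst_true, factor)   # what the 4-km sensor sees
coarse_ndvi = block_mean(fine_ndvi, factor)

# 1) fit LST ~ a*NDVI + b at the coarse scale
a, b = np.polyfit(coarse_ndvi.ravel(), coarse_lst.ravel(), 1)
# 2) apply at the fine scale, 3) add back the coarse residual
pred_coarse = a * coarse_ndvi + b
residual = np.kron(coarse_lst - pred_coarse, np.ones((factor, factor)))
fine_lst = a * fine_ndvi + b + residual

rmse = np.sqrt(np.mean((fine_lst - fine_lst_true) ** 2))
```

The residual step guarantees that re-aggregating the downscaled field reproduces the coarse observations exactly, which is the usual consistency requirement for this family of methods.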
NASA Astrophysics Data System (ADS)
Ludwig, V. S.; Istomina, L.; Spreen, G.
2017-12-01
Arctic sea ice concentration (SIC), the fraction of a grid cell that is covered by sea ice, is relevant for a multitude of branches: physics (heat/momentum exchange), chemistry (gas exchange), biology (photosynthesis), navigation (location of pack ice) and others. It has been observed from passive microwave (PMW) radiometers on satellites continuously since 1979, providing an almost 40-year time series. However, the resolution is typically limited to 25 km, which is good enough for climate studies but too coarse to properly resolve the ice edge or to show leads. The highest resolution from PMW sensors today is 5 km, from the AMSR2 89 GHz channels. Thermal infrared (TIR) and visible (VIS) measurements provide much higher resolutions, between 1 km (TIR) and 30 m (VIS, regional daily coverage). The higher resolutions come at the cost of depending on cloud-free fields of view (TIR and VIS) and daylight (VIS). We present a merged product of ASI-AMSR2 SIC (PMW) and MODIS SIC (TIR) at a nominal resolution of 1 km. This product benefits from both the independence of PMW from cloud coverage and the high resolution of the TIR data. An independent validation dataset has been produced from manually selected, cloud-free Landsat VIS data at 30 m resolution. This dataset is used to evaluate the performance of the merged SIC dataset. Our results show that the merged product resolves features which are smeared out in the PMW data while benefitting from the PMW data in cloudy cases, and is thus indeed more than the sum of its parts.
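The masking logic behind such a merge can be illustrated in a toy form: use the fine TIR retrieval where the scene is cloud-free and fall back to the replicated coarse PMW value under cloud. The published merging scheme is more sophisticated than this; grid sizes and the cloud mask below are entirely synthetic:

```python
import numpy as np

# Toy PMW/TIR sea-ice-concentration merge: TIR where cloud-free, PMW
# (replicated to the fine grid) under cloud.  Illustration only.

rng = np.random.default_rng(1)
factor = 5                                   # e.g. 5 km PMW -> 1 km TIR
pmw = rng.uniform(0, 100, size=(20, 20))     # coarse SIC, percent
tir = rng.uniform(0, 100, size=(100, 100))   # fine SIC, percent
cloud = rng.random((100, 100)) < 0.4         # True where TIR is cloudy

pmw_fine = np.kron(pmw, np.ones((factor, factor)))   # replicate to fine grid
merged = np.where(cloud, pmw_fine, tir)
```

Every fine pixel thus carries the best available estimate: high-resolution TIR when visible, all-weather PMW otherwise.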
Comparing Goldstone Solar System Radar Earth-based Observations of Mars with Orbital Datasets
NASA Technical Reports Server (NTRS)
Haldemann, A. F. C.; Larsen, K. W.; Jurgens, R. F.; Slade, M. A.
2005-01-01
The Goldstone Solar System Radar (GSSR) has collected a self-consistent set of delay-Doppler near-nadir radar echo data from Mars since 1988. Prior to the Mars Global Surveyor (MGS) Mars Orbiter Laser Altimeter (MOLA) global topography for Mars, these radar data provided local elevation information, along with radar scattering information with global coverage. Two kinds of GSSR Mars delay-Doppler data exist: low (5 km x 150 km) resolution and, more recently, high (5 to 10 km) spatial resolution. Radar data, and non-imaging delay-Doppler data in particular, require significant processing to extract the elevation, reflectivity and roughness of the reflecting surface. Interpretation of these parameters, while limited by the complexities of electromagnetic scattering, provides information directly relevant to geophysical and geomorphic analyses of Mars. In this presentation we demonstrate how to compare GSSR delay-Doppler data to other Mars datasets, including some idiosyncrasies of the radar data. Additional information is included in the original extended abstract.
Surficial geology and benthic habitat of the German Bank seabed, Scotian Shelf, Canada
Todd, Brian J.; Kostylev, Vladimir E.
2011-01-01
To provide the scientific context for management of a newly opened scallop fishing ground, surficial geology and benthic habitats were mapped on German Bank on the southern Scotian Shelf off Atlantic Canada. To provide a seamless regional dataset, multibeam sonar surveys covered 5320 square kilometres of the bank in water depths of 30–250 m and provided 5 m horizontal resolution bathymetry and backscatter strength. Geoscience data included high-resolution geophysical profiles (seismic reflection and sidescan sonar) and seabed sediment samples. Geological interpretation revealed that bedrock is exposed at the seafloor on much of German Bank and is overlain in places by glacial and postglacial sediment. Biological data included seafloor video transects and photographs from which 127 taxa of visible megabenthos were identified. Trawl bycatch data were obtained from government annual research surveys. Statistical analysis of these two datasets and a suite of oceanographic environmental variables demonstrated that significantly different fauna exist on bedrock, glacial sediment and postglacial sediment.
Investigating Bacterial-Animal Symbioses with Light Sheet Microscopy
Taormina, Michael J.; Jemielita, Matthew; Stephens, W. Zac; Burns, Adam R.; Troll, Joshua V.; Parthasarathy, Raghuveer; Guillemin, Karen
2014-01-01
Microbial colonization of the digestive tract is a crucial event in vertebrate development, required for maturation of host immunity and establishment of normal digestive physiology. Advances in genomic, proteomic, and metabolomic technologies are providing a more detailed picture of the constituents of the intestinal habitat, but these approaches lack the spatial and temporal resolution needed to characterize the assembly and dynamics of microbial communities in this complex environment. We report the use of light sheet microscopy to provide high-resolution imaging of bacterial colonization of the zebrafish intestine. The methodology allows us to characterize bacterial population dynamics across the entire organ and the behaviors of individual bacterial and host cells throughout the colonization process. The large four-dimensional datasets generated by these imaging approaches require new strategies for image analysis. When integrated with other “omics” datasets, information about the spatial and temporal dynamics of microbial cells within the vertebrate intestine will provide new mechanistic insights into how microbial communities assemble and function within hosts. PMID:22983029
NASA Astrophysics Data System (ADS)
Willkofer, Florian; Wood, Raul R.; Schmid, Josef; von Trentini, Fabian; Ludwig, Ralf
2016-04-01
The ClimEx project (Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec) focuses on the effects of climate change on hydro-meteorological extreme events and their implications for water management in Bavaria and Québec. It builds on the conjoint analysis of a large ensemble of the CRCM5, driven by 50 members of the CanESM2, and the latest information provided through the CORDEX initiative, to better assess the influence of natural climate variability and climatic change on the dynamics of extreme events. A critical point in the entire project is the preparation of a meteorological reference dataset with the temporal (1-6 h) and spatial (500 m) resolution required to evaluate hydrological extreme events in mesoscale river basins. For Bavaria, a first reference dataset (daily, 1 km) used for bias correction of RCM data was created by combining raster-based data (E-OBS [1], HYRAS [2], MARS [3]) and interpolated station data using the meteorological interpolation schemes of the hydrological model WaSiM [4]. Apart from its coarse temporal and spatial resolution, this mosaic of different data sources is considered rather inconsistent and hence not applicable to the modeling of hydrological extreme events. Thus, the objective is to create a dataset with hourly data of temperature, precipitation, radiation, relative humidity and wind speed, which is then used for bias correction of the RCM data driving the hydrological models of the river basins. To this end, daily data are disaggregated to hourly time steps using the 'method of fragments' approach [5], based on available training stations. The disaggregation chooses fragments of daily values from observed hourly datasets, based on similarities in magnitude and in the behavior of previous and subsequent events.
The choice of a certain reference station (hourly data, provision of fragments) for disaggregating daily station data (application of fragments) is crucial, and several methods will be tested to achieve a sound spatial interpolation. The entire methodology shall be applicable to existing or newly developed datasets. References: [1] Haylock, M.R., N. Hofstra, A.M.G. Klein Tank, E.J. Klok, P.D. Jones and M. New. A European daily high-resolution gridded dataset of surface temperature and precipitation. J. Geophys. Res. (Atmospheres) (2008), 113, D20119, doi:10.1029/2008JD10201. [2] Rauthe, M., Steiner, H., Riediger, U., Mazurkiewicz, A. and A. Gratzki. A Central European precipitation climatology - Part I: Generation and validation of a high-resolution gridded daily data set (HYRAS). Meteorologische Zeitschrift (2013), 22/3, p. 238-256. [3] MARS-AGRI4CAST. AGRI4CAST Interpolated Meteorological Data. http://mars.jrc.ec.europa.eu/mars/About-us/AGRI4CAST/Data-distribution/AGRI4CAST-Interpolated-Meteorological-Data, 2007, last accessed May 10th, 2013. [4] Schulla, J. Model Description WaSiM - Water balance Simulation Model. 2015, available at: http://wasim.ch/en/products/wasim_description.htm. [5] Sharma, A. and S. Srikanthan. Continuous Rainfall Simulation: A Nonparametric Alternative. 30th Hydrology and Water Resources Symposium, Launceston, Tasmania, 4-7 December, 2006.
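The core of the method of fragments can be sketched in a few lines: a day's total is distributed over 24 hours by borrowing the hourly fractions ("fragments") of the observed day with the most similar daily total. The cited approach also matches on the behavior of preceding and following days; that refinement, and all data below, are simplified/synthetic:

```python
import numpy as np

# Minimal "method of fragments" daily-to-hourly disaggregation sketch.
# Similarity here is daily-total distance only (the full method matches
# on neighboring-day behavior as well).

rng = np.random.default_rng(2)

# Training archive: observed hourly precipitation for 200 days (synthetic)
hourly_obs = rng.gamma(0.3, 2.0, size=(200, 24))
daily_obs = hourly_obs.sum(axis=1)

def disaggregate(daily_value, hourly_obs, daily_obs):
    i = np.argmin(np.abs(daily_obs - daily_value))   # most similar day
    total = daily_obs[i]
    if total == 0:
        return np.zeros(24)
    fragments = hourly_obs[i] / total                # hourly fractions, sum 1
    return daily_value * fragments                   # rescale to target total

hourly = disaggregate(12.5, hourly_obs, daily_obs)
```

By construction the 24 disaggregated values sum exactly to the daily total, so mass is conserved at the daily scale.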
A global distributed basin morphometric dataset
NASA Astrophysics Data System (ADS)
Shen, Xinyi; Anagnostou, Emmanouil N.; Mei, Yiwen; Hong, Yang
2017-01-01
Basin morphometry is vital information for relating storms to hydrologic hazards such as landslides and floods. In this paper we present the first comprehensive global dataset of distributed basin morphometry at 30 arc-second resolution. The dataset includes nine prime morphometric variables; in addition, we present formulas for generating twenty-one additional morphometric variables based on combinations of the prime variables. The dataset can aid different applications, including studies of land-atmosphere interaction and modelling of floods and droughts for sustainable water management. The validity of the dataset has been consolidated by successfully reproducing Hack's law.
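Hack's law relates main-stream length L to drainage area A as L = C * A^h, with the exponent h typically near 0.6. Reproducing it from a basin dataset amounts to a least-squares fit in log-log space; the data below are synthetic stand-ins for the dataset's basins:

```python
import numpy as np

# Fitting Hack's law L = C * A^h by linear regression in log-log space.
# Synthetic basins generated with h = 0.6, C = 1.4 plus lognormal noise.

rng = np.random.default_rng(3)
area = 10 ** rng.uniform(1, 5, size=500)                        # km^2
length = 1.4 * area ** 0.6 * np.exp(rng.normal(0, 0.05, 500))   # km

h, logC = np.polyfit(np.log(area), np.log(length), 1)
C = np.exp(logC)
```

Recovering h close to the generating exponent is the kind of consistency check the abstract refers to as "reproducing Hack's law".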
Global River Water Temperature Modelling at Hyper-Resolution
NASA Astrophysics Data System (ADS)
Wanders, N.; van Vliet, M. T. H.; Wada, Y.; Van Beek, L. P.
2017-12-01
The temperature of river water plays a crucial role in many physical, chemical and biological aquatic processes. The influence of changing water temperatures is not only felt locally, but also has regional and downstream impacts. Sectors that might be affected by sudden or gradual changes in water temperature include energy production, industry and recreation. Although detailed information on this environmental variable is very important, high-resolution simulations of water temperature on a large scale are currently lacking. Here we present a novel hyper-resolution water temperature dataset at the global scale. We developed the 1-D energy routing model WARM to simulate river temperature for the period 1980-2014 at 10 km and 50 km resolution. The WARM model accounts for surface water abstraction, reservoirs, riverine flooding and the formation of ice, enabling a realistic representation of water temperature. The water temperature simulations have been validated against 358 river monitoring stations globally for the period 1980 to 2014. The results indicate that the increase in resolution significantly improves simulation performance, with a decrease in the water temperature RMSE from 3.5°C to 3.0°C and an increase in the mean correlation of the daily discharge simulations from R=0.4 to 0.6. We find an average global increase in water temperature of 0.22°C per decade between 1960 and 2014, with increasing trends towards the end of the simulation period. The trends are strongest for maxima in the Northern Hemisphere (0.62°C per decade) and for minima in the Southern Hemisphere (0.45°C per decade). Finally, we show the impact of major heatwaves and drought events on water temperature and water availability. The high resolution not only improves model performance; it also increases the relevance of the simulations for local and regional scale studies and impact assessments.
This new global water temperature dataset could help to develop decision-support systems related to water quality with increased precision and accuracy.
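The validation statistics quoted above (RMSE and mean correlation R against station observations) are straightforward to compute for any station record; the helper below uses made-up example temperatures, not the study's data:

```python
import numpy as np

# Generic station-validation metrics: root-mean-square error and Pearson
# correlation between simulated and observed series.

def rmse(sim, obs):
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return np.sqrt(np.mean((sim - obs) ** 2))

def pearson_r(sim, obs):
    return np.corrcoef(sim, obs)[0, 1]

obs = np.array([4.0, 8.5, 15.2, 21.0, 18.3, 9.9])   # water temps, degC
sim = np.array([5.1, 7.9, 14.0, 22.4, 17.5, 11.0])

err = rmse(sim, obs)
r = pearson_r(sim, obs)
```

Aggregating these two numbers over all 358 stations would give exactly the kind of summary statistics reported in the abstract.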
Small scale denitrification variability in riparian zones: Results from a high-resolution dataset
NASA Astrophysics Data System (ADS)
Gassen, Niklas; Knöller, Kay; Musolff, Andreas; Popp, Felix; Lüders, Tillmann; Stumpp, Christine
2017-04-01
Riparian zones are important compartments at the interface between groundwater and surface water where biogeochemical processes like denitrification are often enhanced. Nitrate loads of either groundwater entering a stream through the riparian zone or stream water infiltrating into the riparian zone can be substantially reduced. These processes are spatially and temporally highly variable, making it difficult to capture solute variability, estimate realistic turnover rates and thus quantify integral mass removal. A crucial step towards a more detailed characterization is to monitor solutes on a scale that adequately resembles the highly heterogeneous distribution and on which the processes occur. We measured biogeochemical parameters at high spatial resolution within a riparian corridor of a German lowland river system over the course of one year. Samples were taken from three newly developed high-resolution multi-level wells with a maximum vertical resolution of 5 cm and analyzed for major ions, DOC and N and O isotopes. Sediment recovered during installation of the wells was analyzed for specific denitrifying enzymes. Results showed a distinct depth zonation of hydrochemistry within the shallow alluvial aquifer, with a 1 m thick zone just below the water table with lower nitrate concentrations and EC values similar to the nearby river. Conservative parameters were consistent between the three wells, but nitrate was highly variable. In addition, spots with low nitrate concentrations showed isotopic and microbial evidence of higher denitrification activity. The depth zonation was observed throughout the year, with stronger temporal variations of nitrate concentrations just below the water table than in deeper layers. Nitrate isotopes showed a clear seasonal trend in denitrification activity (high in summer, low in winter).
Our dataset gives new insight into river-groundwater exchange processes and shows the highly heterogeneous distribution of denitrification in riparian zones, both in time and space. With these new insights, we are able to improve our understanding of spatial scaling of denitrification processes. This leads to a better prediction and improved management strategies for buffer mechanisms in riparian zones.
Flexible high-resolution display systems for the next generation of radiology reading rooms
NASA Astrophysics Data System (ADS)
Caban, Jesus J.; Wood, Bradford J.; Park, Adrian
2007-03-01
A flexible, scalable, high-resolution display system is presented to support the next generation of radiology reading rooms or interventional radiology suites. The project aims to create an environment for radiologists that will simultaneously facilitate image interpretation, analysis, and understanding while lowering visual and cognitive stress. Displays currently in use present radiologists with technical challenges to exploring complex datasets that we seek to address. These include resolution and brightness, display and ambient lighting differences, and degrees of complexity in addition to side-by-side comparison of time-variant and 2D/3D images. We address these issues through a scalable projector-based system that uses our custom-designed geometrical and photometrical calibration process to create a seamless, bright, high-resolution display environment that can reduce the visual fatigue commonly experienced by radiologists. The system we have designed uses an array of casually aligned projectors to cooperatively increase overall resolution and brightness. Images from a set of projectors in their narrowest zoom are combined at a shared projection surface, thus increasing the global "pixels per inch" (PPI) of the display environment. Two primary challenges - geometric calibration and photometric calibration - remained to be resolved before our high-resolution display system could be used in a radiology reading room or procedure suite. In this paper we present a method that accomplishes those calibrations and creates a flexible high-resolution display environment that appears seamless, sharp, and uniform across different devices.
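The photometric side of multi-projector calibration can be illustrated in one dimension: in the strip where two projectors overlap, each contributes a complementary intensity ramp so the summed brightness stays uniform. Real systems use measured per-pixel response and gamma maps; the linear ramps and geometry below are an idealized, made-up case:

```python
import numpy as np

# Idealized edge blending for two overlapping projectors (1-D cross
# section): left projector fades out across the overlap while the right
# fades in, so the summed output is flat.

width, overlap = 100, 20                 # columns per projector, overlap

ramp_down = np.linspace(1, 0, overlap)   # left projector fades out
ramp_up = 1.0 - ramp_down                # right projector fades in

left = np.ones(width);  left[-overlap:] = ramp_down
right = np.ones(width); right[:overlap] = ramp_up

# Combined brightness across the 180-column display:
total = np.zeros(2 * width - overlap)
total[:width] += left
total[width - overlap:] += right
```

The geometric half of the calibration (warping each projector's image onto the shared surface) would be handled separately, e.g. with per-projector homographies.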
Ambiguity of Quality in Remote Sensing Data
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Leptoukh, Greg
2010-01-01
This slide presentation reviews some of the issues in quality of remote sensing data. Data "quality" is used in several different contexts in remote sensing data, with quite different meanings. At the pixel level, quality typically refers to a quality control process exercised by the processing algorithm, not an explicit declaration of accuracy or precision. File-level quality is usually a statistical summary of the pixel-level quality but is of doubtful use for scenes covering large areal extents. Quality at the dataset or product level, on the other hand, usually refers to how accurately the dataset is believed to represent the physical quantities it purports to measure. This assessment often bears, at best, an indirect relationship to pixel-level quality. In addition to ambiguity at different levels of granularity, ambiguity is endemic within levels. Pixel-level quality terms vary widely, as do recommendations for use of these flags. At the dataset/product level, quality for low-resolution gridded products is often extrapolated from validation campaigns using high-spatial-resolution swath data, a suspect practice at best. Making use of quality at all levels is complicated by the dependence on application needs. We will present examples of the various meanings of quality in remote sensing data and possible ways forward toward a more unified and usable quality framework.
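Pixel-level quality is typically delivered as a packed bit field rather than an accuracy figure, and "using" it means masking and shifting bits. The layout below (two cloud-state bits, one land/water bit) is hypothetical, not any particular product's specification:

```python
import numpy as np

# Decoding a hypothetical pixel-level QA bit field:
#   bits 0-1: cloud state (0 clear, 1 cloudy, 2 mixed, 3 not determined)
#   bit 2:    surface type (0 water, 1 land)

qa = np.array([0b000, 0b001, 0b101, 0b110], dtype=np.uint8)

cloud_state = qa & 0b11          # extract bits 0-1
land = (qa >> 2) & 0b1           # extract bit 2

usable = cloud_state == 0        # e.g. keep only clear-sky pixels
```

Note that `usable` here encodes a processing decision ("the algorithm judged this pixel clear"), not a statement about retrieval accuracy, which is exactly the ambiguity the presentation highlights.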
Bahrami, Sheyda; Shamsi, Mousa
2017-01-01
Functional magnetic resonance imaging (fMRI) is a popular method to probe the functional organization of the brain using hemodynamic responses. In this method, volume images of the entire brain are obtained with very good spatial resolution but low temporal resolution. However, they always suffer from high dimensionality in the face of classification algorithms. In this work, we combine a support vector machine (SVM) with a self-organizing map (SOM) to obtain a feature-based classification: the SOM is used for feature extraction and for labeling the datasets, and a linear-kernel SVM is then used for detecting the active areas. The SOM has two major advantages: (i) it reduces the dimension of the datasets, lowering computational complexity, and (ii) it is useful for identifying brain regions with small onset differences in hemodynamic responses. Our non-parametric model is compared with parametric and non-parametric methods. We use simulated fMRI datasets and block-design inputs, and consider a contrast-to-noise ratio (CNR) of 0.6 for the simulated datasets. The simulated fMRI dataset has 1-4% contrast in active areas. The accuracy of our proposed method is 93.63% and the error rate is 6.37%.
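The SOM's role as a dimensionality reducer can be sketched with a bare-bones 1-D map in NumPy: high-dimensional voxel time courses are mapped onto a small set of prototype units, whose indices then serve as compact features or labels for a downstream classifier. This is a generic textbook SOM with made-up data, not the authors' implementation:

```python
import numpy as np

# Minimal 1-D self-organizing map: online updates with a shrinking
# learning rate and Gaussian neighborhood, used to compress voxel time
# courses into a handful of prototypes.

rng = np.random.default_rng(4)

def train_som(data, n_units=9, epochs=20, lr0=0.5, sigma0=2.0):
    """Return (n_units, n_features) prototype vectors."""
    w = data[rng.choice(len(data), n_units, replace=False)].astype(float)
    grid = np.arange(n_units)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)            # learning rate decays
        sigma = sigma0 * (1 - epoch / epochs) + 1e-3  # neighborhood shrinks
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))        # best match
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
            w += lr * h[:, None] * (x - w)
    return w

# Synthetic "voxel" data: 100 samples of 50-point time courses
data = rng.normal(0, 1, (100, 50))
prototypes = train_som(data)
labels = np.array([np.argmin(((prototypes - x) ** 2).sum(axis=1))
                   for x in data])
```

In the paper's pipeline, the reduced features would then be fed to a linear-kernel SVM (e.g. scikit-learn's `SVC(kernel="linear")`) to detect active areas.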
The Importance of Chaos and Lenticulae on Europa for the JIMO Mission
NASA Technical Reports Server (NTRS)
Spaun, Nicole A.
2003-01-01
The Galileo Solid State Imaging (SSI) experiment provided high-resolution images of Europa's surface, allowing identification of surface features barely distinguishable at Voyager's resolution. SSI revealed the visible pitting on Europa's surface to be due to large disrupted features, chaos, and smaller sub-circular patches, lenticulae. Chaos features contain a hummocky matrix material and commonly contain dislocated blocks of ridged plains. Lenticulae are morphologically interrelated and can be divided into three classes: domes, spots, and micro-chaos. Domes are broad, upwarped features that generally do not disrupt the texture of the ridged plains. Spots are areas of low albedo that are generally smooth in texture compared to other units. Micro-chaos are disrupted features with a hummocky matrix material, resembling that observed within chaos regions. Chaos and lenticulae are ubiquitous in the SSI regional map observations, which average approximately 200 meters per pixel (m/pxl) in resolution, and appear in several of the ultra-high resolution (i.e., better than 50 m/pxl) images of Europa as well. SSI also provided a number of multi-spectral observations of chaos and lenticulae. Using this dataset we have undertaken a thorough study of the morphology, size, spacing, stratigraphy, and color of chaos and lenticulae to determine their properties and evaluate models of their formation. Geological mapping indicates that chaos and micro-chaos have a similar internal morphology of in-situ degradation, suggesting that a similar process operated during their formation. The size distribution indicates a dominant size of 4-8 km in diameter for features containing hummocky material (i.e., chaos and micro-chaos), with a dominant spacing of 15-36 km. Chaos and lenticulae are generally among the stratigraphically youngest features observed on the surface, suggesting a recent change in resurfacing style.
Also, the reddish non-icy materials on Europa's surface are highly concentrated in many chaos and lenticulae features. Nonetheless, a complete global map of the distribution of chaos and lenticulae is not possible with the SSI dataset. Only <20% of the surface has been imaged at 200 m/pxl or better resolution, mostly in the near-equatorial regions. Color and ultra-high-resolution images have much less surface coverage. Thus we suggest that full global imaging of Europa at 200 m/pxl or better resolution, preferably in multi-spectral wavelengths, should be a high priority for the JIMO mission.
NASA Astrophysics Data System (ADS)
Tarquini, S.; Nannipieri, L.; Favalli, M.; Fornaciai, A.; Vinci, S.; Doumaz, F.
2012-04-01
Digital elevation models (DEMs) are fundamental in any kind of environmental or morphological study. DEMs are obtained from a variety of sources and generated in several ways. Nowadays, a few global-coverage elevation datasets are available for free (e.g., SRTM, http://www.jpl.nasa.gov/srtm; ASTER, http://asterweb.jpl.nasa.gov/). When the matrix of a DEM is also used for computational purposes, the choice of the elevation dataset that best suits the target of the study is crucial. Recently, the increasing use of DEM-based numerical simulation tools (e.g. for gravity-driven mass flows) would largely benefit from higher resolution/higher accuracy topography than that available at planetary scale. Similar elevation datasets are neither easily nor freely available for all countries worldwide. Here we introduce a new web resource which makes a 10 m resolution DEM of the whole Italian territory freely available (for research purposes only). The creation of this elevation dataset was presented by Tarquini et al. (2007). The DEM was obtained in triangular irregular network (TIN) format starting from heterogeneous vector datasets, mostly consisting of elevation contour lines and elevation points derived from several sources. The input vector database was carefully cleaned up to obtain an improved seamless TIN, refined by using the DEST algorithm to improve the Delaunay tessellation. The whole TINITALY/01 DEM was converted to grid format (10 m cell size) according to a tiled structure composed of 193 square tiles, each 50 km on a side. The grid database consists of more than 3 billion cells and occupies almost 12 GB of disk space. A web-GIS has been created (http://tinitaly.pi.ingv.it/) where a seamless layer of full-resolution (10 m) images obtained from the whole DEM (both in color-shaded and anaglyph mode) is open for browsing. Accredited users are allowed to download the elevation dataset.
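The quoted grid figures are easy to cross-check arithmetically: 193 tiles of 50 km x 50 km at 10 m cell size. The bytes-per-cell factor below is inferred from the stated 12 GB, not taken from the source:

```python
# Consistency check of the TINITALY/01 grid figures (193 tiles, 50-km
# tile side, 10-m cells).  The ~2.5 bytes/cell is an inferred storage
# density, not a documented format detail.

tiles = 193
cells_per_side = 50_000 // 10            # 5,000 cells per tile side
cells_total = tiles * cells_per_side ** 2

gb = cells_total * 2.5 / 1024**3         # implied on-disk size
```

The cell count comes out at about 4.8 billion, consistent with "more than 3 billion cells", and at roughly 2.5 bytes per cell the total lands near the stated 12 GB.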
Detection of grapes in natural environment using HOG features in low resolution images
NASA Astrophysics Data System (ADS)
Škrabánek, Pavel; Majerík, Filip
2017-07-01
Detection of grapes in real-life images is important in various viticulture applications. A grape detector based on an SVM classifier combined with a HOG descriptor has proven very efficient for detecting white varieties in high-resolution images. Nevertheless, the high time complexity of this approach made it unsuitable for real-time applications, even when a detector of simplified structure was used. Thus, we examined applying the simplified version to images of lower resolution. For this purpose, we designed a method to search for the detector setting that gives the best trade-off between time complexity and performance. In order to provide precise evaluation results, we formed new extended datasets. We discovered that, even when applied to low-resolution images, the simplified detector, with an appropriate setting of all tuneable parameters, was competitive with other state-of-the-art solutions. We conclude that the detector is qualified for real-time detection of grapes in real-life images.
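The HOG descriptor at the heart of such a detector reduces, for each image cell, to a magnitude-weighted histogram of gradient orientations. The sketch below computes that core for a single cell; a full HOG (as provided by, e.g., scikit-image) additionally tiles cells and normalizes over blocks:

```python
import numpy as np

# Gradient-orientation histogram of a single image cell, the building
# block of a HOG descriptor (block normalization and cell tiling omitted).

def cell_orientation_histogram(cell, n_bins=9):
    """Unsigned-gradient orientation histogram, L2-normalized."""
    gy, gx = np.gradient(cell.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0       # unsigned: 0..180
    bins = np.minimum((ang / (180.0 / n_bins)).astype(int), n_bins - 1)
    hist = np.zeros(n_bins)
    for b, m in zip(bins.ravel(), mag.ravel()):
        hist[b] += m                                   # magnitude-weighted
    return hist / (np.linalg.norm(hist) + 1e-9)

# A vertical edge puts all gradient energy in the 0-degree bin
cell = np.zeros((8, 8)); cell[:, 4:] = 1.0
hist = cell_orientation_histogram(cell)
```

Lowering the image resolution shrinks the number of cells per window, which is exactly where the time-complexity vs. performance trade-off studied in the paper comes from.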
High-resolution digital brain atlases: a Hubble telescope for the brain.
Jones, Edward G; Stone, James M; Karten, Harvey J
2011-05-01
We describe the implementation of a method for digitizing brain tissue sections containing normal and experimental data at microscopic resolution, and for making the content readily accessible online. Web-accessible brain atlases and virtual microscopes for online examination can be developed using existing computer and internet technologies. The resulting databases, made up of hierarchically organized, multiresolution images, enable rapid, seamless navigation through the vast image datasets generated by high-resolution scanning. Tools for visualization and annotation of virtual microscope slides enable remote and universal data sharing. Interactive visualization of a complete series of brain sections digitized at subneuronal levels of resolution offers fine-grain and large-scale localization and quantification of many aspects of neural organization and structure. The method is straightforward and replicable; it can increase accessibility and facilitate sharing of neuroanatomical data. It provides an opportunity for capturing and preserving irreplaceable, archival neurohistological collections and making them available to all scientists in perpetuity, if resources can be obtained from hitherto uninterested agencies of scientific support. © 2011 New York Academy of Sciences.
NASA Astrophysics Data System (ADS)
Zhu, X.; Wen, X.; Zheng, Z.
2017-12-01
For better prediction and understanding of land-atmosphere interaction, in-situ meteorological observations from the China Meteorological Administration (CMA) were assimilated into the Weather Research and Forecasting (WRF) model, together with monthly Green Vegetation Coverage (GVF) data calculated from the Normalized Difference Vegetation Index (NDVI) of the Earth Observing System Moderate-Resolution Imaging Spectroradiometer (EOS-MODIS), and Digital Elevation Model (DEM) data from the Shuttle Radar Topography Mission (SRTM). From these runs the WRF model produced a High-Resolution Assimilation Dataset of the water-energy cycle in China (HRADC). This dataset has a horizontal resolution of 25 km and includes near-surface meteorological data such as air temperature, humidity, wind vectors and pressure (19 levels); soil temperature and moisture (four levels); surface temperature; downward/upward short/longwave radiation; and 3-h latent, sensible and ground heat fluxes. In this study, we 1) briefly introduce the cycling 3D-Var assimilation method and 2) compare meteorological elements, such as 2 m temperature and precipitation generated by the HRADC, with gridded observation data from CMA, and surface temperature and specific humidity with Global Land Data Assimilation System (GLDAS) output data from the National Aeronautics and Space Administration (NASA). We found that the satellite-derived GVF from MODIS was higher over southeast China than the model default throughout the year. The simulated soil temperature, net radiation and surface energy fluxes from the HRADC are improved relative to the control simulation and are close to GLDAS outputs. The values of net radiation from HRADC are higher than the GLDAS outputs, and the differences are large in the east region but smaller in northwest China and on the Qinghai-Tibet Plateau.
The spatial distribution of the sensible heat flux and the ground heat flux from HRADC is consistent with the GLDAS outputs in summer. In general, the simulated results from HRADC are an improvement on the control simulation and can present the characteristics of the spatial and temporal variation of the water-energy cycle in China.
NASA Astrophysics Data System (ADS)
Thomas, Ian; Murphy, Paul; Fenton, Owen; Shine, Oliver; Mellander, Per-Erik; Dunlop, Paul; Jordan, Phil
2015-04-01
A new phosphorus index (PI) tool is presented which aims to improve the identification of critical source areas (CSAs) of phosphorus (P) losses from agricultural land to surface waters. In a novel approach, the PI incorporates topographic indices rather than watercourse proximity as proxies for runoff risk, to account for the dominant control of topography on runoff-generating areas and P transport pathways. Runoff propensity and hydrological connectivity are modelled using the Topographic Wetness Index (TWI) and Network Index (NI) respectively, utilising high resolution digital elevation models (DEMs) derived from Light Detection and Ranging (LiDAR) to capture the influence of micro-topographic features on runoff pathways. Additionally, the PI attempts to improve risk estimates of particulate P losses by incorporating an erosion factor that accounts for fine-scale topographic variability within fields. Erosion risk is modelled using the Unit Stream Power Erosion Deposition (USPED) model, which integrates DEM-derived upslope contributing area and Universal Soil Loss Equation (USLE) factors. The PI was developed using field, sub-field and sub-catchment scale datasets of P source, mobilisation and transport factors, for four intensive agricultural catchments in Ireland representing different agri-environmental conditions. Datasets included soil test P concentrations, degree of P saturation, soil attributes, land use, artificial subsurface drainage locations, and 2 m resolution LiDAR DEMs resampled from 0.25 m resolution data. All factor datasets were integrated within a Geographical Information System (GIS) and rasterised to 2 m resolution. For each factor, values were categorised and assigned relative risk scores which ranked P loss potential. Total risk scores were calculated for each grid cell using a component formulation, which summed the products of weighted factor risk scores for runoff and erosion pathways. 
Results showed that the new PI was able to predict in-field risk variability and hence was able to identify CSAs at the sub-field scale. PI risk estimates and component scores were analysed at catchment and subcatchment scales, and validated using measured dissolved, particulate and total P losses at subcatchment snapshot sites and gauging stations at catchment outlets. The new PI provides CSA delineations at higher precision compared to conventional PIs, and more robust P transport risk estimates. The tool can be used to target cost-effective mitigation measures for P management within single farm units and wider catchments.
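The Topographic Wetness Index used as a runoff-risk proxy above has a standard closed form, TWI = ln(a / tan β), with a the specific catchment area and β the local slope. A minimal sketch on the 2 m DEM grid described above (the guard value for flat cells is our choice):

```python
import numpy as np

def twi(upslope_area_m2, slope_deg, cell_m=2.0):
    # Topographic Wetness Index: ln(a / tan(beta)), where a is the specific
    # catchment area (upslope contributing area per unit contour width).
    a = np.asarray(upslope_area_m2, float) / cell_m
    tan_b = np.tan(np.radians(np.asarray(slope_deg, float)))
    return np.log(a / np.maximum(tan_b, 1e-6))   # guard against flat cells

# Flat, convergent cells score much wetter than steep, divergent ones:
wet = twi(4000.0, 1.0)   # large upslope area, gentle slope -> high TWI
dry = twi(8.0, 20.0)     # small upslope area, steep slope  -> low TWI
print(float(wet), float(dry))
```

High-TWI cells are the candidate runoff-generating areas that the PI combines with P source factors to flag critical source areas.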
Signal to Noise Ratio for Different Gridded Rainfall Products of Indian Monsoon
NASA Astrophysics Data System (ADS)
Nehra, P.; Shastri, H. K.; Ghosh, S.; Mishra, V.; Murtugudde, R. G.
2014-12-01
Gridded rainfall datasets provide useful information on the spatial and temporal distribution of precipitation over a region. For India, there are 3 gridded rainfall products available, from the India Meteorological Department (IMD), the Tropical Rainfall Measuring Mission (TRMM) and the Asian Precipitation - Highly Resolved Observational Data Integration towards Evaluation of Water Resources (APHRODITE); these compile precipitation information obtained through satellite-based measurements and ground station data. The gridded rainfall data from IMD are available at spatial resolutions of 1°, 0.5° and 0.25°, whereas TRMM and APHRODITE are available at 0.25°. Here, we employ 7 years (1998-2004) of the common time period among the 3 products for the south-west monsoon season, i.e., June to September. Examining the temporal mean and standard deviation of the 3 products, we observe substantial variation among them at 1° resolution, whereas at 0.25° resolution all the datasets are nearly identical. We determine the Signal to Noise Ratio (SNR) of the 3 products at 1° and 0.25° resolution based on a noise separation technique adopting horizontal separation of the power spectrum generated with the Fast Fourier Transform (FFT). A methodology is developed for threshold-based separation of signal and noise from the power spectrum, treating the noise as white. The ratio of the signal variance to the noise variance gives the SNR. Determination of SNR for different regions over the country shows the highest SNR for APHRODITE at 0.25° resolution. The eastern part of India has the highest SNR in all cases considered, whereas the northernmost and southernmost Indian regions have the lowest SNR. An increasing linear trend is observed between the SNR values and the spatial variance of the corresponding region. The relationship between the computed SNR values and the interpolation method used in each dataset is analyzed.
The SNR analysis provides an effective tool to evaluate the gridded precipitation data products. However detailed analysis is needed to determine the processes that lead to these SNR distributions so that the quality of the gridded rainfall data products can be further improved and transferability of the gridding algorithms can be explored to produce a unified high-quality rainfall dataset.
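The threshold-based split of the power spectrum can be sketched as follows. This is a crude illustration of the idea on a synthetic series, not the paper's methodology: the white-noise floor is estimated here with the spectrum median (our choice of threshold), and noise leakage above the floor will bias the estimate high:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 512
t = np.arange(n)
# synthetic "rainfall" series: two periodic components plus white noise
signal = 2.0 * np.sin(2 * np.pi * t / 64) + 1.0 * np.sin(2 * np.pi * t / 16)
x = signal + rng.standard_normal(n)        # white noise, unit variance

p = np.abs(np.fft.rfft(x)) ** 2 / n        # one-sided power spectrum
noise_level = np.median(p)                 # white noise -> roughly flat floor
sig_power = np.sum(np.maximum(p - noise_level, 0.0))   # power above the floor
noise_power = noise_level * p.size         # flat floor summed over all bins
snr = sig_power / noise_power
print(round(float(snr), 2))
```

The two sinusoidal peaks stand far above the flat noise floor, so the variance ratio comes out well above 1, as expected for a series with real spatial-temporal structure.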
NASA Astrophysics Data System (ADS)
Parisi, L.; Calo, M.; Luzio, D.; Sulli, A.
2011-12-01
In this work we present Vp and Vs velocity models of the crust and uppermost mantle beneath the Sicilian-Tyrrhenian region (Southern Italy). We applied the double-difference tomography of Zhang and Thurber (2003), further optimized by the post-processing Weighted Average Model (WAM) method (Calò et al., 2009; Calò, 2009). The tomographic method was applied to three datasets. The first dataset contains 31270 P- and 13588 S-wave absolute times and 73022 P- and 27893 S-wave differential times from earthquakes that occurred between 1981 and 2005 and were recorded by 192 stations. The second dataset is composed of 27668 P- and 11183 S-wave absolute times and 63296 P- and 29683 S-wave differential times from earthquakes that occurred between January 2006 and December 2009 and were recorded by 140 stations. The third dataset results from merging the two datasets described above. After assessing the results obtained by inverting the three datasets, we constructed the final Vp and Vs models as syntheses of all results using the WAM method. Checkerboard tests indicate that the horizontal resolution allows recovery of velocity structures 20 km wide in the southern Tyrrhenian Sea and north-eastern Sicily, whereas anomalies from 40 to 70 km wide are restored in the southern part of Sicily, the Ionian Sea and the Sicily Channel. Vertical resolution is 3 km in the shallower parts of the models (down to about 20 km) and 8-10 km in the deeper ones (down to 50 km). Furthermore, a Vp-Vs correlation analysis was performed to assess the minimum threshold of DWS (Toomey and Foulger, 1986) that ensures sufficient reliability of the seismic velocity distributions. These preliminary results show highly resolved Vp and Vs models and provide new constraints on the lithospheric structures of the study area.
SAR image dataset of military ground targets with multiple poses for ATR
NASA Astrophysics Data System (ADS)
Belloni, Carole; Balleri, Alessio; Aouf, Nabil; Merlet, Thomas; Le Caillec, Jean-Marc
2017-10-01
Automatic Target Recognition (ATR) is the task of automatically detecting and classifying targets. Recognition using Synthetic Aperture Radar (SAR) images is attractive because SAR images can be acquired at night and under any weather conditions, whereas optical sensors operating in the visible band do not have this capability. Existing SAR ATR algorithms have mostly been evaluated using the MSTAR dataset [1]. The problem with MSTAR is that some of the proposed ATR methods have shown good classification performance even when targets were hidden [2], suggesting the presence of a bias in the dataset. Evaluations of SAR ATR techniques are currently challenging due to the lack of publicly available data in the SAR domain. In this paper, we present a high-resolution SAR dataset consisting of images of a set of ground military target models taken at various aspect angles. The dataset can be used for a fair evaluation and comparison of SAR ATR algorithms. We applied the Inverse Synthetic Aperture Radar (ISAR) technique to echoes from targets rotating on a turntable and illuminated with a stepped-frequency waveform. The targets in the database consist of four variants of two 1.7 m-long models of T-64 and T-72 tanks. The gun, the turret position and the depression angle are varied to form 26 different sequences of images. The emitted signal spanned the frequency range from 13 GHz to 18 GHz, achieving a bandwidth of 5 GHz sampled with 4001 frequency points. The resolution obtained, relative to the size of the model targets, is comparable to typical values obtained with airborne SAR systems. Single-polarization (Horizontal-Horizontal) images are generated using the backprojection algorithm [3]. A total of 1480 images are produced using a 20° integration angle. The images in the dataset are organized into suggested training and testing sets to facilitate a standard evaluation of SAR ATR algorithms.
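The waveform parameters quoted above imply the standard stepped-frequency figures of merit, which can be checked directly (the formulas are textbook radar relations, not taken from the paper):

```python
# Range resolution and unambiguous range window for the stepped-frequency
# waveform described above: 13-18 GHz in 4001 steps.
c = 299_792_458.0                 # speed of light, m/s
B = 5e9                           # total bandwidth, Hz
n_steps = 4001                    # frequency samples
df = B / (n_steps - 1)            # frequency step: 1.25 MHz

range_resolution = c / (2 * B)    # ~3 cm: very fine relative to 1.7 m models
unambiguous_range = c / (2 * df)  # ~120 m range window, ample for a turntable
print(range_resolution, unambiguous_range)
```

A ~3 cm range resolution on a 1.7 m model target is what makes the dataset's resolution "comparable to typical values obtained with airborne SAR systems" at full scale.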
Characterization of ASTER GDEM Elevation Data over Vegetated Area Compared with Lidar Data
NASA Technical Reports Server (NTRS)
Ni, Wenjian; Sun, Guoqing; Ranson, Kenneth J.
2013-01-01
Current research based on aerial or spaceborne stereo images with very high resolution (less than 1 meter) has demonstrated that it is possible to derive vegetation height from stereo imagery. The second version of the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM) is a state-of-the-art global elevation dataset developed from stereo images. However, the resolution of ASTER stereo images (15 meters) is much coarser than that of aerial stereo images, and the ASTER GDEM is a product compiled from stereo images acquired over 10 years, a time span over which forest disturbance as well as forest growth are inevitable. In this study, the features of ASTER GDEM over vegetated areas under both flat and mountainous conditions were investigated by comparison with lidar data. The factors considered as possibly affecting the extraction of vegetation canopy height include (1) co-registration of digital elevation models (DEMs); (2) spatial resolution of the DEMs; (3) spatial vegetation structure; and (4) terrain slope. The results show that accurate co-registration between the ASTER GDEM and the National Elevation Dataset (NED) is necessary over mountainous areas. The correlation between ASTER GDEM minus NED and vegetation canopy height improved from 0.328 to 0.43 when resolution was degraded from 1 arc-second to 5 arc-seconds, and further improved to 0.6 when only homogeneous vegetated areas were considered.
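The resolution-degrading step, and why it raises the correlation, can be illustrated with block averaging on synthetic grids. This is our toy demonstration (the smooth "canopy" field and noise level are invented), not the study's data: averaging suppresses per-pixel DEM noise while a spatially smooth signal survives aggregation:

```python
import numpy as np

def block_mean(grid, f):
    # Aggregate an (n, m) grid to (n//f, m//f) by averaging f x f blocks.
    n, m = grid.shape
    return grid[:n - n % f, :m - m % f].reshape(n // f, f, m // f, f).mean(axis=(1, 3))

rng = np.random.default_rng(0)
yy, xx = np.mgrid[:25, :25]
height = 15 + 10 * np.sin(xx / 5.0) * np.cos(yy / 7.0)    # smooth canopy field, m
diff = height + 5.0 * rng.standard_normal(height.shape)   # "GDEM minus NED": signal + noise

r_fine = np.corrcoef(diff.ravel(), height.ravel())[0, 1]
r_coarse = np.corrcoef(block_mean(diff, 5).ravel(),
                       block_mean(height, 5).ravel())[0, 1]
print(round(r_fine, 2), round(r_coarse, 2))
```

Averaging 5 × 5 blocks cuts the noise standard deviation by a factor of 5, so the coarse-resolution correlation exceeds the fine-resolution one, mirroring the 0.328-to-0.43 improvement reported above.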
A new method to generate a high-resolution global distribution map of lake chlorophyll
Sayers, Michael J; Grimm, Amanda G.; Shuchman, Robert A.; Deines, Andrew M.; Bunnell, David B.; Raymer, Zachary B; Rogers, Mark W.; Woelmer, Whitney; Bennion, David; Brooks, Colin N.; Whitley, Matthew A.; Warner, David M.; Mychek-Londer, Justin G.
2015-01-01
A new method was developed, evaluated, and applied to generate a global dataset of growing-season chlorophyll-a (chl) concentrations in 2011 for freshwater lakes. Chl observations from freshwater lakes are valuable for estimating lake productivity as well as assessing the role that these lakes play in carbon budgets. The standard 4 km NASA OceanColor L3 chlorophyll concentration products generated from MODIS and MERIS sensor data are not sufficiently representative of global chl values because they can only resolve larger lakes, which generally have lower chl concentrations than lakes of smaller surface area. Our new methodology utilizes the 300 m-resolution MERIS full-resolution full-swath (FRS) global dataset as input and does not rely on the land mask used to generate standard NASA products, which masks many lakes that are otherwise resolvable in MERIS imagery. The new method produced chl concentration values for 78,938 and 1,074 lakes in the northern and southern hemispheres, respectively. For lakes visible in the MERIS composite, the mean chl was 19.2 ± 19.2 mg m−3, the median was 13.3 mg m−3, and the interquartile range was 3.90–28.6 mg m−3. The accuracy of the MERIS-derived values was assessed by comparison with temporally near-coincident and globally distributed in situ measurements from the literature (n = 185, RMSE = 9.39, R2 = 0.72). This represents the first global-scale dataset of satellite-derived chl estimates for medium to large lakes.
An effective approach for gap-filling continental scale remotely sensed time-series
Weiss, Daniel J.; Atkinson, Peter M.; Bhatt, Samir; Mappin, Bonnie; Hay, Simon I.; Gething, Peter W.
2014-01-01
The archives of imagery and modeled data products derived from remote sensing programs with high temporal resolution provide powerful resources for characterizing inter- and intra-annual environmental dynamics. The impressive depth of available time-series from such missions (e.g., MODIS and AVHRR) affords new opportunities for improving data usability by leveraging spatial and temporal information inherent to longitudinal geospatial datasets. In this research we develop an approach for filling gaps in imagery time-series that result primarily from cloud cover, which is particularly problematic in forested equatorial regions. Our approach consists of two complementary gap-filling algorithms and a variety of run-time options that allow users to balance the competing demands of model accuracy and processing time. We applied the gap-filling methodology to MODIS Enhanced Vegetation Index (EVI) and daytime and nighttime Land Surface Temperature (LST) datasets for the African continent for 2000–2012, with a 1 km spatial resolution and an 8-day temporal resolution. We validated the method by introducing and filling artificial gaps, and then comparing the original data with model predictions. Our approach achieved R2 values above 0.87 even for pixels within introduced gaps 500 km wide. Furthermore, the structure of our approach allows estimation of the error associated with each gap-filled pixel based on the distance to the non-gap pixels used to model its fill value, thus providing a mechanism for including uncertainty associated with the gap-filling process in downstream applications of the resulting datasets. PMID:25642100
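The core idea, filling gaps from neighboring observations and attaching a distance-based error proxy to each filled pixel, can be sketched in one dimension. This is a deliberately minimal stand-in (plain linear temporal interpolation) for the paper's two more elaborate algorithms:

```python
import numpy as np

def fill_gaps(series):
    # Fill NaN gaps in a 1-D time series by linear temporal interpolation,
    # and report each step's distance to the nearest observed step as a
    # simple uncertainty proxy (0 for observed values).
    s = np.asarray(series, float)
    t = np.arange(s.size)
    good = ~np.isnan(s)
    filled = np.interp(t, t[good], s[good])
    dist = np.array([0 if g else int(np.min(np.abs(t[good] - i)))
                     for i, g in zip(t, good)])
    return filled, dist

# Hypothetical 8-day EVI samples with cloud gaps:
evi = [0.2, 0.25, np.nan, np.nan, 0.4, np.nan, 0.5]
filled, dist = fill_gaps(evi)
print(filled.round(3), dist)
```

The `dist` array plays the role described above: filled values far from any observation carry larger expected error, and that information can propagate to downstream analyses.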
Analysis of multispectral and hyperspectral longwave infrared (LWIR) data for geologic mapping
NASA Astrophysics Data System (ADS)
Kruse, Fred A.; McDowell, Meryl
2015-05-01
Multispectral MODIS/ASTER Airborne Simulator (MASTER) data and Hyperspectral Thermal Emission Spectrometer (HyTES) data covering the 8 - 12 μm spectral range (longwave infrared or LWIR) were analyzed for an area near Mountain Pass, California. Decorrelation-stretched images were initially used to highlight spectral differences between geologic materials. Both datasets were atmospherically corrected using the ISAC method, and the Normalized Emissivity approach was used to separate temperature and emissivity. The MASTER data had 10 LWIR spectral bands and approximately 35-meter spatial resolution and covered a larger area than the HyTES data, which were collected with 256 narrow (approximately 17 nm-wide) spectral bands at approximately 2.3-meter spatial resolution. Spectra for key spatially coherent, spectrally determined geologic units in overlap areas were overlain and visually compared to determine similarities and differences. Endmember spectra were extracted from both datasets using n-dimensional scatterplotting and compared to emissivity spectral libraries for identification. Endmember distributions and abundances were then mapped using Mixture-Tuned Matched Filtering (MTMF), a partial unmixing approach. Multispectral results demonstrate separation of silica-rich vs. non-silicate materials, with distinct mapping of carbonate areas and general correspondence to the regional geology. Hyperspectral results illustrate refined mapping of silicates with distinction between similar units based on the position, character, and shape of high-resolution emission minima near 9 μm. Calcite and dolomite were separated, identified, and mapped using HyTES based on a shift of the main carbonate emissivity minimum from approximately 11.3 μm to 11.2 μm, respectively. Both datasets demonstrate the utility of LWIR spectral remote sensing for geologic mapping.
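A decorrelation stretch, used above as the first-look visualization, can be sketched as a whitening transform: rotate the bands into principal components, equalize the component variances, and rotate back. This is a generic sketch on synthetic correlated bands, not the study's processing chain:

```python
import numpy as np

def decorrelation_stretch(cube):
    # Whiten an (h, w, bands) image cube: after the transform, the bands are
    # mean-zero, unit-variance, and mutually uncorrelated, which exaggerates
    # the subtle spectral differences between correlated bands.
    h, w, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X - mu, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    W = vecs @ np.diag(1.0 / np.sqrt(np.maximum(vals, 1e-12))) @ vecs.T
    return ((X - mu) @ W).reshape(h, w, b)

# Three strongly correlated synthetic bands (typical of neighboring LWIR bands):
rng = np.random.default_rng(0)
base = rng.standard_normal((32, 32, 1))
cube = np.concatenate([base + 0.1 * rng.standard_normal((32, 32, 1))
                       for _ in range(3)], axis=2)
out = decorrelation_stretch(cube)
corr = np.corrcoef(out.reshape(-1, 3), rowvar=False)
print(np.abs(corr - np.eye(3)).max())   # off-diagonal correlations near zero
```

For display, the whitened bands are typically rescaled to a common contrast range before being assigned to RGB channels.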
Zhao, Guangjun; Wang, Xuchu; Niu, Yanmin; Tan, Liwen; Zhang, Shao-Xiang
2016-01-01
Cryosection brain images in the Chinese Visible Human (CVH) dataset contain rich anatomical structure information because of their high resolution (e.g., 0.167 mm per pixel). Fast and accurate segmentation of these images into white matter, gray matter, and cerebrospinal fluid plays a critical role in analyzing and measuring the anatomical structures of the human brain. However, most existing automated segmentation methods are designed for computed tomography or magnetic resonance imaging data, and they may not be applicable to cryosection images due to the imaging difference. In this paper, we propose a supervised learning-based CVH brain tissue segmentation method that uses stacked autoencoders (SAEs) to automatically learn deep feature representations. Specifically, our model includes two successive parts, where two three-layer SAEs take image patches as input to learn the complex anatomical feature representation, and these features are then sent to a Softmax classifier for inferring the labels. Experimental results validated the effectiveness of our method and showed that it outperformed four other classical brain tissue detection strategies. Furthermore, we reconstructed three-dimensional surfaces of these tissues, which show their potential for exploring the high-resolution anatomical structures of the human brain. PMID:27057543
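The patch-encoder-softmax inference pipeline described above can be sketched with plain numpy. Everything here is illustrative: the layer sizes and class count are invented, and the weights are randomly initialized rather than pretrained by reconstruction and fine-tuned as in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, layers):
    # Pass patches through stacked encoder layers (tanh nonlinearity assumed).
    for W, b in layers:
        x = np.tanh(x @ W + b)
    return x

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))   # numerically stable
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical shapes: 9x9 patches, two encoder layers, 4 tissue classes
patch_dim, h1, h2, n_classes = 9 * 9, 64, 32, 4
layers = [(0.1 * rng.standard_normal((patch_dim, h1)), np.zeros(h1)),
          (0.1 * rng.standard_normal((h1, h2)), np.zeros(h2))]
W_out = 0.1 * rng.standard_normal((h2, n_classes))

patches = rng.random((5, patch_dim))               # five flattened patches
probs = softmax(encode(patches, layers) @ W_out)   # per-class probabilities
labels = probs.argmax(axis=1)                      # predicted tissue per patch
print(probs.shape, labels)
```

In the real method, each encoder layer's weights come from unsupervised autoencoder pretraining on patches before the softmax stage is trained on labeled data.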
Wootton, J Timothy; Pfister, Catherine A; Forester, James D
2008-12-02
Increasing global concentrations of atmospheric CO2 are predicted to decrease ocean pH, with potentially severe impacts on marine food webs, but empirical data documenting ocean pH over time are limited. In a high-resolution dataset spanning 8 years, pH at a north-temperate coastal site declined with increasing atmospheric CO2 levels and varied substantially in response to biological processes and physical conditions that fluctuate over multiple time scales. Applying a method to link environmental change to species dynamics via multispecies Markov chain models reveals strong links between in situ benthic species dynamics and variation in ocean pH, with calcareous species generally performing more poorly than noncalcareous species in years with low pH. The models project the long-term consequences of these dynamic changes, which predict substantial shifts in the species dominating the habitat as a consequence of both direct effects of reduced calcification and indirect effects arising from the web of species interactions. Our results indicate that pH decline is proceeding at a more rapid rate than previously predicted in some areas, and that this decline has ecological consequences for nearshore benthic ecosystems.
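The long-term projection step in a multispecies Markov chain model amounts to iterating a transition matrix to its stationary composition. A minimal sketch with an invented three-state matrix (the real models estimate pH-dependent transition probabilities from field data):

```python
import numpy as np

# Column-stochastic transition matrix among three occupancy states
# (numbers invented for illustration; each column sums to 1):
P = np.array([[0.6, 0.2, 0.1],    # e.g., a calcareous species
              [0.3, 0.7, 0.2],    # a noncalcareous species
              [0.1, 0.1, 0.7]])   # bare substrate

state = np.array([1/3, 1/3, 1/3])  # arbitrary starting composition
for _ in range(200):               # iterate to the stationary distribution
    state = P @ state
print(state.round(3), state.sum())
```

The stationary composition (here 7/24, 11/24, 6/24) is independent of the starting state; comparing stationary compositions under low-pH versus high-pH transition matrices is what yields the projected habitat shifts described above.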
Prinyakupt, Jaroonrut; Pluempitiwiriyawej, Charnchai
2015-06-30
Blood smear microscopic images are routinely investigated by haematologists to diagnose most blood diseases. However, the task is quite tedious and time consuming. Automatic detection and classification of white blood cells within such images can accelerate the process tremendously. In this paper we propose a system to locate white blood cells within microscopic blood smear images, segment them into nucleus and cytoplasm regions, extract suitable features and, finally, classify them into five types: basophil, eosinophil, neutrophil, lymphocyte and monocyte. Two sets of blood smear images were used in this study's experiments. Dataset 1, collected from Rangsit University, consisted of normal peripheral blood slides under a light microscope with 100× magnification; 555 images with 601 white blood cells were captured by a Nikon DS-Fi2 high-definition color camera and saved in JPG format at a size of 960 × 1,280 pixels and a resolution of 15 pixels per 1 μm. In dataset 2, 477 cropped white blood cell images were downloaded from CellaVision.com. They are in JPG format at a size of 360 × 363 pixels, with a resolution estimated at 10 pixels per 1 μm. The proposed system comprises a pre-processing step, nucleus segmentation, cell segmentation, feature extraction, feature selection and classification. The segmentation algorithm exploits white blood cells' morphological properties and the calibrated size of a real cell relative to the image resolution. The segmentation process combines thresholding, morphological operations and ellipse curve fitting. Consequently, several features were extracted from the segmented nucleus and cytoplasm regions. Prominent features were then chosen by a greedy search algorithm called sequential forward selection. Finally, with a set of selected prominent features, both linear and naïve Bayes classifiers were applied for performance comparison. This system was tested on normal peripheral blood smear slide images from two datasets.
Two sets of comparison were performed: segmentation and classification. The automatically segmented results were compared to those obtained manually by a haematologist. It was found that the proposed method is consistent and coherent in both datasets, with Dice similarities of 98.9% and 91.6% for the average segmented nucleus and cell regions, respectively. Furthermore, the overall correct classification rate is about 98% and 94% for the linear and naïve Bayes models, respectively. The proposed system, based on normal white blood cell morphology and its characteristics, was applied to two different datasets. The results of the calibrated segmentation process on both datasets are fast, robust, efficient and coherent. Meanwhile, the classification of normal white blood cells into five types shows high sensitivity in both linear and naïve Bayes models, with slightly better results for the linear classifier.
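The greedy sequential forward selection mentioned above can be sketched as follows. This is a generic illustration on synthetic features, not the paper's pipeline: the scorer here is a toy between-class/within-class scatter ratio rather than classifier accuracy:

```python
import numpy as np

def separation(X, y):
    # Toy score: between-class distance over within-class scatter
    # for a two-class problem, computed on the selected feature columns.
    g0, g1 = X[y == 0], X[y == 1]
    between = np.sum((g0.mean(axis=0) - g1.mean(axis=0)) ** 2)
    within = g0.var(axis=0).sum() + g1.var(axis=0).sum() + 1e-12
    return between / within

def sfs(X, y, k):
    # Sequential forward selection: greedily add the feature whose
    # inclusion most improves the score, until k features are chosen.
    chosen, remaining = [], list(range(X.shape[1]))
    while len(chosen) < k:
        best = max(remaining, key=lambda j: separation(X[:, chosen + [j]], y))
        chosen.append(best)
        remaining.remove(best)
    return chosen

rng = np.random.default_rng(0)
y = np.repeat([0, 1], 50)
informative = y[:, None] + 0.3 * rng.standard_normal((100, 2))  # features 0, 1
noise = rng.standard_normal((100, 3))                           # features 2-4
X = np.hstack([informative, noise])
picked = sfs(X, y, 2)
print(picked)
```

On this synthetic example the two class-informative features are selected and the pure-noise features are left out, which is the behavior the selection step relies on when pruning the extracted nucleus and cytoplasm features.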
Evaluation of registration accuracy between Sentinel-2 and Landsat 8
NASA Astrophysics Data System (ADS)
Barazzetti, Luigi; Cuca, Branka; Previtali, Mattia
2016-08-01
Starting from June 2015, Sentinel-2A has been delivering high-resolution optical images (ground resolution up to 10 meters) to provide global coverage of the Earth's land surface every 10 days. The planned launch of Sentinel-2B, along with the integration of Landsat images, will provide time series with an unprecedented revisit time, indispensable for numerous monitoring applications in which high-resolution multi-temporal information is required; these include agriculture, water bodies and natural hazards, to name a few. However, the combined use of multi-temporal images requires accurate geometric registration, i.e., pixel-to-pixel correspondence for terrain-corrected products. This paper presents an analysis of spatial co-registration accuracy for several datasets of Sentinel-2 and Landsat 8 images distributed around the world. Images were compared with digital correlation techniques for image matching, obtaining an evaluation of registration accuracy with an affine transformation as the geometric model. Results demonstrate that sub-pixel accuracy was achieved between the 10 m resolution Sentinel-2 band (band 3) and the 15 m resolution panchromatic Landsat band (band 8).
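A standard building block of the image matching used above is phase correlation. The sketch below recovers an integer pixel translation between two images; it is a generic illustration, not the paper's method, which fits a full affine model and reaches sub-pixel accuracy:

```python
import numpy as np

def phase_correlation_shift(a, b):
    # Estimate the (dy, dx) translation that maps image b onto image a by
    # locating the peak of the normalized cross-power spectrum.
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    cross = A * np.conj(B)
    r = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(r), r.shape)
    n, m = a.shape
    # unwrap circular shifts into signed offsets
    return (dy - n if dy > n // 2 else dy, dx - m if dx > m // 2 else dx)

rng = np.random.default_rng(3)
img = rng.standard_normal((64, 64))             # stand-in for a Landsat chip
shifted = np.roll(img, (5, -3), axis=(0, 1))    # stand-in for a Sentinel-2 chip
shift = phase_correlation_shift(shifted, img)
print(shift)
```

Estimating many such local shifts across a scene and fitting an affine transformation to them is one common way to obtain the registration model evaluated in the paper.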
Du, Jia-Qiang; Shu, Jian-Min; Wang, Yue-Hui; Li, Ying-Chang; Zhang, Lin-Bo; Guo, Yang
2014-02-01
Consistent NDVI time series are a basic prerequisite for long-term monitoring of land surface properties. Advanced very high resolution radiometer (AVHRR) measurements provide the longest record of continuous global satellite measurements sensitive to live green vegetation, while the moderate resolution imaging spectroradiometer (MODIS) is a more recent sensor with high spatial and temporal resolution. Understanding the relationship between AVHRR-derived NDVI and MODIS NDVI is critical to continued long-term monitoring of ecological resources. NDVI time series acquired by the global inventory modeling and mapping studies (GIMMS) project and Terra MODIS were compared over the same time period (2000 to 2006) at four scales of the Qinghai-Tibet Plateau (whole region, sub-region, biome and pixel) to assess the level of agreement in terms of absolute values and dynamic change, by independently assessing the performance of GIMMS and MODIS NDVI using 495 Landsat samples of 20 km × 20 km covering the major land cover types. High correlations existed between the two datasets at all four scales, indicating their mostly equal capability of capturing seasonal and monthly phenological variations (mostly at the 0.001 significance level). Similarities between the two datasets differed significantly among vegetation types. Relatively low correlation coefficients and large differences in NDVI values between the two datasets were found for dense vegetation types, including broadleaf forest and needleleaf forest, yet the correlations were strong and the deviations small in more homogeneous vegetation types, such as meadow, steppe and crop. 82% of the study area was characterized by strong consistency between GIMMS and MODIS NDVI at the pixel scale. In the comparison of absolute values of Landsat NDVI vs. GIMMS and MODIS NDVI, MODIS NDVI performed slightly better than GIMMS NDVI, whereas in the comparison of temporal change values, the GIMMS dataset performed best.
Consistent with the comparison of GIMMS and MODIS NDVI, the agreement across the three datasets clearly differed among vegetation types. In dynamic changes, differences between Landsat and MODIS NDVI were smaller than those between Landsat and GIMMS NDVI for forest, but Landsat and GIMMS NDVI agreed better for grass and crop. The results suggest that the spatial patterns and dynamic trends of GIMMS NDVI are in overall acceptable agreement with MODIS NDVI, and that it may be feasible to integrate historical GIMMS NDVI with more recent MODIS NDVI to provide continuity of NDVI products. The accuracy of merging the historical AVHRR record with more modern MODIS NDVI data strongly depends on vegetation type, season and phenological period, and spatial scale. The integration of the two datasets for needleleaf forest, broadleaf forest, and for all vegetation types in the phenological transition periods of spring and autumn should be treated with caution.
NASA Astrophysics Data System (ADS)
Shih, Hsuan-Chang; Hwang, Cheinway; Barriot, Jean-Pierre; Mouyen, Maxime; Corréia, Pascal; Lequeux, Didier; Sichoix, Lydie
2015-08-01
For the first time, we carry out an airborne gravity survey and collect new land gravity data over the islands of Tahiti and Moorea in French Polynesia, in the South Pacific Ocean. The new land gravity data are registered with GPS-derived coordinates, network-adjusted and outlier-edited, resulting in a mean standard error of 17 μGal. A crossover analysis of the airborne gravity data indicates a mean gravity accuracy of 1.7 mGal. New marine gravity around the two islands is derived from Geosat/GM, ERS-1/GM, Jason-1/GM, and Cryosat-2 altimeter data. A new 1-arc-second digital topography model is constructed and used to compute topographic gravitational effects. To use EGM08 over Tahiti and Moorea, the optimal degree of spherical harmonic expansion is 1500. The fusion of the gravity datasets is made by band-limited least-squares collocation, which best integrates datasets of different accuracies and spatial resolutions. The new high-resolution gravity and geoid grids are constructed on a 9-arc-second grid. Assessments of the grids against measurements of ground gravity and geometric geoidal height result in RMS differences of 0.9 mGal and 0.4 cm, respectively. The geoid model allows 1-cm orthometric height determination by GPS and Lidar and yields a consistent height datum for Tahiti and Moorea. The new Bouguer anomalies show gravity highs and lows in the centers and land-sea zones of the two islands, allowing further studies of the density structure and volcanism in the region.
Rea, Alan; Skinner, Kenneth D.
2012-01-01
The U.S. Geological Survey Hawaii StreamStats application uses an integrated suite of raster and vector geospatial datasets to delineate and characterize watersheds. The geospatial datasets used to delineate and characterize watersheds on the StreamStats website, and the methods used to develop the datasets are described in this report. The datasets for Hawaii were derived primarily from 10 meter resolution National Elevation Dataset (NED) elevation models, and the National Hydrography Dataset (NHD), using a set of procedures designed to enforce the drainage pattern from the NHD into the NED, resulting in an integrated suite of elevation-derived datasets. Additional sources of data used for computing basin characteristics include precipitation, land cover, soil permeability, and elevation-derivative datasets. The report also includes links for metadata and downloads of the geospatial datasets.
NASA Astrophysics Data System (ADS)
Sankey, T.; Donald, J.; McVay, J.
2015-12-01
High resolution remote sensing images and datasets are typically acquired at a large cost, which poses a major challenge for many scientists. Northern Arizona University recently acquired a custom-engineered, cutting-edge UAV, and we can now generate our own images with the instrument. The UAV has a unique capability to carry a large payload, including a hyperspectral sensor, which images the Earth surface in over 350 spectral bands at 5 cm resolution, and a lidar scanner, which images the land surface and vegetation in three dimensions. Both sensors represent the newest available technology with very high resolution, precision, and accuracy. Using the UAV sensors, we are monitoring the effects of regional forest restoration treatment efforts. Individual tree canopy width and height are measured in the field and via the UAV sensors. The high-resolution UAV images are then used to segment individual tree canopies and to derive 3-dimensional estimates. The UAV image-derived variables are then correlated to the field-based measurements and scaled to satellite-derived tree canopy measurements. The relationships between the field-based and UAV-derived estimates are then extrapolated to a larger area to scale the tree canopy dimensions and to estimate tree density within restored and control forest sites.
Self-organizing maps: a versatile tool for the automatic analysis of untargeted imaging datasets.
Franceschi, Pietro; Wehrens, Ron
2014-04-01
MS-based imaging approaches allow for location-specific identification of chemical components in biological samples, opening up possibilities of much more detailed understanding of biological processes and mechanisms. Data analysis, however, is challenging, mainly because of the sheer size of such datasets. This article presents a novel approach based on self-organizing maps, extending previous work in order to be able to handle the large number of variables present in high-resolution mass spectra. The key idea is to generate prototype images, representing spatial distributions of ions, rather than prototypical mass spectra. This allows for a two-stage approach, first generating typical spatial distributions and associated m/z bins, and later analyzing the interesting bins in more detail using accurate masses. The possibilities and advantages of the new approach are illustrated on an in-house dataset of apple slices. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
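The core idea, prototype images rather than prototype mass spectra, can be illustrated with a tiny self-organizing map in which each input vector is one flattened ion image (one per m/z bin), so bins with similar spatial distributions land on the same unit. This is a generic SOM sketch, not the authors' implementation; grid size, learning rate and neighborhood schedule are arbitrary choices:

```python
import numpy as np

def train_som(images, grid=(3, 3), epochs=30, lr0=0.5, sigma0=1.5, seed=0):
    """Tiny self-organizing map over flattened ion images.

    Each prototype is itself an image, so each trained unit represents a
    typical spatial distribution shared by a group of m/z bins.
    """
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    protos = rng.random((n_units, images.shape[1]))
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])])
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 1e-3   # shrinking neighborhood
        for x in images:
            bmu = np.argmin(((protos - x) ** 2).sum(axis=1))  # best matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))
            protos += lr * h[:, None] * (x - protos)
    return protos

def assign(images, protos):
    """Map each ion image to its best-matching SOM unit."""
    return np.argmin(((images[:, None, :] - protos[None]) ** 2).sum(-1), axis=1)
```

The interesting units (and their associated m/z bins) would then be inspected in detail using the accurate masses, as described in the abstract.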
DeFelice, Thomas P.; Lloyd, D.; Meyer, D.J.; Baltzer, T. T.; Piraina, P.
2003-01-01
An atmospheric correction algorithm developed for the 1 km Advanced Very High Resolution Radiometer (AVHRR) global land dataset was modified to include a near real-time total column water vapour data input field to account for the natural variability of atmospheric water vapour. The real-time data input field used for this study is the Television and Infrared Observational Satellite (TIROS) Operational Vertical Sounder (TOVS) Pathfinder A global total column water vapour dataset. It was validated prior to its use in the AVHRR atmospheric correction process using two North American AVHRR scenes, namely 13 June and 28 November 1996. The validation results are consistent with those reported by others and entail a comparison between TOVS, radiosonde, experimental sounding, microwave radiometer, and data from a hand-held sunphotometer. The use of this data layer as input to the AVHRR atmospheric correction process is discussed.
Optimized multiple linear mappings for single image super-resolution
NASA Astrophysics Data System (ADS)
Zhang, Kaibing; Li, Jie; Xiong, Zenggang; Liu, Xiuping; Gao, Xinbo
2017-12-01
Learning piecewise linear regression has been recognized as an effective way to perform example learning-based single image super-resolution (SR) in the literature. In this paper, we employ an expectation-maximization (EM) algorithm to further improve the SR performance of our previous multiple linear mappings (MLM) based SR method. In the training stage, the proposed method starts with a set of linear regressors obtained by the MLM-based method, and then jointly optimizes the clustering results and the low- and high-resolution subdictionary pairs for the regression functions using the metric of reconstruction error. In the test stage, we select the optimal regressor for SR reconstruction by accumulating the reconstruction errors of the m-nearest neighbors in the training set. Thorough experiments carried out on six publicly available datasets demonstrate that the proposed SR method can yield high-quality images with finer details and sharper edges in terms of both quantitative and perceptual image quality assessments.
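The alternation described above, jointly optimizing cluster assignments and linear regressors by reconstruction error, can be sketched as a k-plane-style EM loop. This is a schematic reading of the method, not the authors' code; patch extraction, subdictionaries and the m-nearest-neighbor test stage are omitted, and all names are illustrative:

```python
import numpy as np

def fit_mlm(lo_patches, hi_patches, k=2, iters=10, seed=0):
    """EM-style optimization sketch for multiple linear mappings (MLM).

    Alternates (E-step) assigning each low/high-resolution patch pair to the
    linear regressor with the smallest reconstruction error, and (M-step)
    refitting each regressor by least squares on its assigned pairs.
    """
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, len(lo_patches))
    maps = [np.zeros((lo_patches.shape[1], hi_patches.shape[1]))] * k
    for _ in range(iters):
        for c in range(k):
            idx = labels == c
            # Refit only when the cluster has enough pairs for a stable fit.
            if idx.sum() >= lo_patches.shape[1]:
                maps[c], *_ = np.linalg.lstsq(
                    lo_patches[idx], hi_patches[idx], rcond=None)
        # Reassign each pair to the regressor with smallest reconstruction error.
        errs = np.stack([((lo_patches @ m - hi_patches) ** 2).sum(1)
                         for m in maps], axis=1)
        labels = errs.argmin(1)
    return maps, labels
```

Like k-means, each alternation does not increase the total reconstruction error, which is why the EM-style refinement improves on the initial clustering.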
NASA Astrophysics Data System (ADS)
Oaida, C. M.; Andreadis, K.; Reager, J. T., II; Famiglietti, J. S.; Levoe, S.
2017-12-01
Accurately estimating how much snow water equivalent (SWE) is stored in mountainous regions characterized by complex terrain and snowmelt-driven hydrologic cycles is both greatly desirable and a major challenge. Mountain snowpack exhibits high variability across a broad range of spatial and temporal scales due to a multitude of physical and climatic factors, making it difficult to observe or estimate in its entirety. Combining remotely sensed data and high resolution hydrologic modeling through data assimilation (DA) has the potential to provide a spatially and temporally continuous SWE dataset at horizontal scales that capture sub-grid snow spatial variability and are also relevant to stakeholders such as water resource managers. Here, we present the evaluation of a new snow DA approach that uses a Local Ensemble Transform Kalman Filter (LETKF) in tandem with the Variable Infiltration Capacity macro-scale hydrologic model across the Western United States, at a daily temporal resolution and a horizontal resolution of 1.75 km x 1.75 km. The LETKF is chosen for its relative simplicity, ease of implementation, and computational efficiency and scalability. The modeling/DA system assimilates daily MODIS Snow Covered Area and Grain Size (MODSCAG) fractional snow cover, and has been developed to efficiently calculate SWE estimates over extended periods of time and over large regional-scale areas at relatively high spatial resolution, ultimately producing a snow reanalysis-type dataset. Here we focus on the assessment of SWE produced by the DA scheme over several basins in California's Sierra Nevada mountain range where Airborne Snow Observatory data are available, during the last five water years (2013-2017), which include both one of the driest and one of the wettest years.
Comparison against such a spatially distributed SWE observational product provides a greater understanding of the model's ability to estimate SWE and SWE spatial variability, and highlights under which conditions snow cover DA can add value in estimating SWE.
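The ensemble update at the heart of such a scheme can be illustrated with a scalar stochastic EnKF standing in for the LETKF actually used: the SWE ensemble is adjusted through its sample covariance with the modeled fractional snow-covered area (fSCA) when an fSCA observation arrives. All variable names and the observation-error value below are illustrative:

```python
import numpy as np

def enkf_update(swe_ens, fsca_ens, fsca_obs, obs_err, seed=0):
    """Scalar stochastic ensemble Kalman update sketch.

    Adjusts an ensemble of SWE states using an observed fractional snow
    cover, with gain K = cov(SWE, fSCA) / (var(fSCA) + r).  The LETKF of
    the paper is replaced by this simpler perturbed-observation form.
    """
    rng = np.random.default_rng(seed)
    cov_xy = np.cov(swe_ens, fsca_ens)[0, 1]          # state-obs covariance
    var_y = np.var(fsca_ens, ddof=1)                  # obs-space ensemble variance
    K = cov_xy / (var_y + obs_err ** 2)               # Kalman gain
    perturbed = fsca_obs + rng.normal(0, obs_err, size=swe_ens.shape)
    return swe_ens + K * (perturbed - fsca_ens)
```

When the observed snow cover exceeds the ensemble's modeled snow cover (and the two state variables co-vary positively), the update pulls the SWE ensemble upward, which is the mechanism by which snow cover DA can add value in estimating SWE.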
Aubert, Alice H; Thrun, Michael C; Breuer, Lutz; Ultsch, Alfred
2016-08-30
High-frequency, in-situ monitoring provides large environmental datasets. These datasets will likely bring new insights into landscape functioning and process-scale understanding. However, tailored data analysis methods are necessary. Here, we detach our analysis from the temporal analysis usually performed in hydrology to determine whether it is possible to infer general rules regarding hydrochemistry from available large datasets. We combined a 2-year in-stream nitrate concentration time series (time resolution of 15 min) with concurrent hydrological, meteorological and soil moisture data. We removed the low-frequency variations through low-pass filtering, which suppressed seasonality. We then analyzed the high-frequency variability component using Pareto Density Estimation, which to our knowledge has not previously been applied in hydrology. The resulting distribution of nitrate concentrations revealed three normally distributed modes: low, medium and high. Studying the environmental conditions for each mode revealed the main control of nitrate concentration: the saturation state of the riparian zone. We found low nitrate concentrations under conditions of hydrological connectivity and dominant denitrifying biological processes, and high nitrate concentrations under hydrological recession conditions and dominant nitrifying biological processes. These results generalize our understanding of hydro-biogeochemical nitrate flux controls and bring useful information to the development of nitrogen process-based models at the landscape scale.
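The two analysis steps, low-pass removal of seasonality followed by density-based detection of modes, can be sketched as below. A centered moving average stands in for the paper's low-pass filter, and a plain Gaussian kernel density estimate stands in for Pareto Density Estimation; both substitutions are simplifications:

```python
import numpy as np

def high_frequency_component(series, window=97):
    """Remove low-frequency variation with a centered moving average
    (a simple stand-in for the low-pass filter used in the paper)."""
    kernel = np.ones(window) / window
    pad = window // 2
    padded = np.pad(series, pad, mode="edge")
    trend = np.convolve(padded, kernel, mode="valid")
    return series - trend

def density_modes(values, bandwidth=0.2, grid_size=256):
    """Gaussian KDE on a grid (hedged stand-in for Pareto Density Estimation),
    returning the locations of the local maxima of the estimated density."""
    grid = np.linspace(values.min(), values.max(), grid_size)
    dens = np.exp(-0.5 * ((grid[:, None] - values[None, :]) / bandwidth) ** 2).sum(1)
    dens /= dens.sum()
    # Interior local maxima of the smoothed density.
    interior = (dens[1:-1] > dens[:-2]) & (dens[1:-1] > dens[2:])
    return grid[1:-1][interior]
```

Applied to the filtered nitrate series, each detected mode would then be linked back to the concurrent hydrological and soil-moisture conditions.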
Wong, Kim; Navarro, José Fernández; Bergenstråhle, Ludvig; Ståhl, Patrik L; Lundeberg, Joakim
2018-06-01
Spatial Transcriptomics (ST) is a method that combines high-resolution tissue imaging with high-throughput transcriptome sequencing data. The sequencing data must be aligned with the images for correct visualization, a process that involves several manual steps. Here we present ST Spot Detector, a web tool that automates and facilitates this alignment through a user-friendly interface. jose.fernandez.navarro@scilifelab.se. Supplementary data are available at Bioinformatics online.
NASA Astrophysics Data System (ADS)
Zhang, K.; Han, B.; Mansaray, L. R.; Xu, X.; Guo, Q.; Jingfeng, H.
2017-12-01
Synthetic aperture radar (SAR) instruments on board satellites are valuable for high-resolution wind field mapping, especially for coastal studies. Since the launch of Sentinel-1A on April 3, 2014, followed by Sentinel-1B on April 25, 2016, large amounts of C-band SAR data have been added to a growing accumulation of SAR datasets (ERS-1/2, RADARSAT-1/2, ENVISAT). These new developments are of great significance for a wide range of applications in coastal sea areas, especially for high spatial resolution wind resource assessment, in which the accuracy of retrieved wind fields is crucial. It has recently been reported that wind speeds can also be retrieved from C-band cross-polarized SAR images, an important complement to wind speed retrieval from co-polarization. However, there is no consensus on the optimal resolution for wind speed retrieval from cross-polarized SAR images. This paper presents a comparison strategy for investigating the influence of spatial resolution on sea surface wind speed retrieval accuracy with cross-polarized SAR images. First, for wind speeds retrieved from VV-polarized images, the optimal geophysical C-band model (CMOD) function was selected among four CMOD functions. Second, the most suitable C-band cross-polarized ocean (C-2PO) model was selected between two C-2POs for the VH-polarized image dataset. Then, the VH-polarized wind speeds retrieved by the selected C-2PO were compared, at different spatial resolutions, with the VV-polarized sea surface wind speeds retrieved using the optimal CMOD, which served as reference. Results show that the VH-polarized wind speed retrieval accuracy increases rapidly as the spatial resolution is coarsened from 100 m to 1000 m, with a drop in RMSE of 42%. However, the improvement in wind speed retrieval accuracy levels off as the resolution is further coarsened from 1000 m to 5000 m. 
This demonstrates that a pixel spacing of 1 km may be the best compromise between spatial resolution and wind speed retrieval accuracy for cross-polarized images obtained from the RADARSAT-2 fine quad-polarization mode. Fig. 1 illustrates the variation of the statistical parameters Bias, Corr, R2, RMSE and STD as a function of spatial resolution.
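The resolution experiment can be mimicked by aggregating a retrieved wind field to successively coarser grids with block means and scoring each against a reference field, a minimal sketch of the comparison strategy rather than of the actual SAR retrieval:

```python
import numpy as np

def block_average(field, factor):
    """Aggregate a 2-D field to coarser resolution via factor x factor block means."""
    h, w = field.shape
    h2, w2 = h // factor * factor, w // factor * factor
    f = field[:h2, :w2]  # trim so the grid divides evenly
    return f.reshape(h2 // factor, factor, w2 // factor, factor).mean(axis=(1, 3))

def rmse(a, b):
    """Root-mean-square error between two fields of equal shape."""
    return float(np.sqrt(np.mean((a - b) ** 2)))
```

Because block averaging suppresses speckle-like noise, RMSE against the reference drops as the grid coarsens, mirroring the 100 m to 1000 m behavior reported above, while past some scale little further accuracy is gained.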
Color image guided depth image super resolution using fusion filter
NASA Astrophysics Data System (ADS)
He, Jin; Liang, Bin; He, Ying; Yang, Jun
2018-04-01
Depth cameras are currently playing an important role in many areas. However, most of them can only obtain low-resolution (LR) depth images. Color cameras can easily provide high-resolution (HR) color images. Using a color image as a guide image is an efficient way to obtain a HR depth image. In this paper, we propose a depth image super resolution (SR) algorithm, which uses a HR color image as a guide image and a LR depth image as input. We use a fusion filter combining a guided filter and an edge-based joint bilateral filter to obtain the HR depth image. Our experimental results on the Middlebury 2005 datasets show that our method can provide better quality in HR depth images both numerically and visually.
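A joint (cross) bilateral filter, one common building block of such guided fusion filters, weights depth samples by both spatial distance and similarity in the high-resolution guide image, so depth edges follow color edges. The sketch below is a brute-force illustration of that building block, not the paper's exact fusion of guided and edge-based joint bilateral filtering:

```python
import numpy as np

def joint_bilateral_filter(depth, guide, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Joint bilateral filter sketch: smooth a depth image with range weights
    taken from a high-resolution guide (e.g., color intensity) image."""
    h, w = depth.shape
    out = np.zeros_like(depth, dtype=float)
    for i in range(h):
        for j in range(w):
            i0, i1 = max(0, i - radius), min(h, i + radius + 1)
            j0, j1 = max(0, j - radius), min(w, j + radius + 1)
            di, dj = np.meshgrid(np.arange(i0, i1) - i,
                                 np.arange(j0, j1) - j, indexing="ij")
            ws = np.exp(-(di ** 2 + dj ** 2) / (2 * sigma_s ** 2))   # spatial weight
            wr = np.exp(-((guide[i0:i1, j0:j1] - guide[i, j]) ** 2)
                        / (2 * sigma_r ** 2))                        # guide-range weight
            wgt = ws * wr
            out[i, j] = (wgt * depth[i0:i1, j0:j1]).sum() / wgt.sum()
    return out
```

Where the guide image has a sharp edge, the range weight suppresses contributions from across the edge, so depth discontinuities are preserved rather than blurred.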
NASA Astrophysics Data System (ADS)
Duarte, Débora; Santos, Joana; Terrinha, Pedro; Brito, Pedro; Noiva, João; Ribeiro, Carlos; Roque, Cristina
2017-04-01
More than 300 nautical miles of multichannel seismic reflection data were acquired within the scope of the ASTARTE project (Assessment, Strategy and Risk Reduction for Tsunamis in Europe), off Quarteira, Algarve, South Portugal. The main goal of this very high resolution multichannel seismic survey was to obtain high-resolution images of the sedimentary record in order to discern the existence of high-energy events, possibly tsunami backwash deposits associated with large-magnitude earthquakes generated at the Africa-Eurasia plate boundary. The seismic dataset was processed at the Instituto Português do Mar e da Atmosfera (IPMA) with the SeisSpace PROMAX seismic processing software. A tailor-made processing flow was applied, focusing on the removal of the seafloor multiple and on the enhancement of the superficial layers. A sparker source, with 300 J of energy and a firing rate of 0.5 s, was used onboard the Xunauta, an 18 m long vessel. The preliminary seismostratigraphic interpretation of the Algarve ASTARTE seismic dataset allowed the identification of a complex sequence of seismic units of progradational and aggradational bodies, as well as Mass Transport Deposits (MTD). The MTD package of sediments has a very complex internal structure, is about 20 m thick, and is apparently spatially controlled by an escarpment probably associated with past sea-level lowstands. The MTD extends over an area approximately parallel to an ancient coastline, measuring >30 km (alongshore) x 5 km (across). Acknowledgements: This work was developed as part of the project ASTARTE, supported by grant agreement No 603839 of the European Union's Seventh Framework Programme (FP7). The Instituto Português do Mar e da Atmosfera acknowledges support by Landmark Graphics (SeisWorks) via the Landmark University Grant Program.
NASA Astrophysics Data System (ADS)
Mandolesi, E.; Jones, A. G.; Roux, E.; Lebedev, S.
2009-12-01
Recently, several studies have examined the correlation between diverse geophysical datasets. Magnetotelluric (MT) data are used to map the electrical conductivity structure beneath the Earth's surface, but one of the problems of the MT method is its lack of resolution in mapping zones beneath a region of high conductivity. Joint inversion of different datasets in which a common structure is recognizable reduces non-uniqueness and may improve the quality of interpretation when the datasets are sensitive to different physical properties that share an underlying common structure. A common structure is recognized if the changes of physical properties occur at the same spatial locations. Common structure may be recognized in 1D inversion of seismic and MT datasets, and numerous authors have shown that a 2D common structure may also lead to an improvement in inversion quality when datasets are jointly inverted. In this presentation, a tool to constrain MT 2D inversion with phase velocities of surface-wave seismic data (SW) is proposed, and is being developed and tested on synthetic data. The results obtained suggest that a joint inversion scheme could be applied with success along a profile for which data are compatible with a 2D MT model.
Exploring Sedimentary Basins with High Frequency Receiver Function: the Dublin Basin Case Study
NASA Astrophysics Data System (ADS)
Licciardi, A.; Piana Agostinetti, N.
2015-12-01
The Receiver Function (RF) method is a widely applied seismological tool for imaging crustal and lithospheric structures beneath a single seismic station with one to tens of kilometers of vertical resolution. However, detailed information about the upper crust (0-10 km depth) can also be retrieved by increasing the frequency content of the analyzed RF dataset (reaching a vertical resolution finer than 0.5 km). This information includes the depths of velocity contrasts and the S-wave velocities within layers, as well as the presence and location of seismic anisotropy or dipping interfaces (e.g., induced by faulting) at depth. These observables provide valuable constraints on the structural setting and properties of sedimentary basins for both scientific and industrial applications. To test the capabilities of RF analysis for this high-resolution application, six broadband seismic stations have been deployed across the southwestern margin of the Dublin Basin (DB), Ireland, whose geothermal potential has been investigated in recent years. With an inter-station distance of about 1 km, this closely spaced array has been designed to provide a clear picture of the structural transition between the margin and the inner portion of the basin. In this study, a Bayesian approach is used to retrieve the posterior probability distributions of S-wave velocity at depth beneath each seismic station. A multi-frequency RF dataset is analyzed, and RFs and curves of apparent velocity are jointly inverted to better constrain absolute velocity variations. A pseudo-2D section is built to observe the lateral changes in elastic properties across the margin of the basin, with a focus on the shallow portion of the crust. Moreover, by means of the harmonic decomposition technique, the azimuthal variations in the RF dataset are isolated and interpreted in terms of anisotropy and dipping interfaces associated with the major fault system in the area. 
These results are compared with the available information from previous seismic active surveys in the area, including boreholes data.
Estimating sowing and harvest dates based on the Asian summer monsoon
NASA Astrophysics Data System (ADS)
Mathison, Camilla; Deva, Chetan; Falloon, Pete; Challinor, Andrew J.
2018-05-01
Sowing and harvest dates are a significant source of uncertainty within crop models, especially for regions where high-resolution data are unavailable or, as is the case in future climate runs, where no data are available at all. Global datasets are not always able to distinguish when wheat is grown in tropical and subtropical regions, and they are also often coarse in resolution. South Asia is one such region where large spatial variation means higher-resolution datasets are needed, together with greater clarity for the timing of the main wheat growing season. Agriculture in South Asia is closely associated with the dominating climatological phenomenon, the Asian summer monsoon (ASM). Rice and wheat are two highly important crops for the region, with rice being mainly cultivated in the wet season during the summer monsoon months and wheat during the dry winter. We present a method for estimating the crop sowing and harvest dates for rice and wheat using the ASM onset and retreat. The aim of this method is to provide a more accurate alternative to the global datasets of cropping calendars than is currently available and generate more representative inputs for climate impact assessments. We first demonstrate that there is skill in the model prediction of monsoon onset and retreat for two downscaled general circulation models (GCMs) by comparing modelled precipitation with observations. We then calculate and apply sowing and harvest rules for rice and wheat for each simulation to climatological estimates of the monsoon onset and retreat for a present day period. We show that this method reproduces the present day sowing and harvest dates for most parts of India. The application of the method to two future simulations demonstrates that the estimated sowing and harvest dates are successfully modified to ensure that the growing season remains consistent with the internal model climate. 
The study therefore provides a useful way of modelling potential growing season adaptations to changes in future climate.
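The rule-based idea can be sketched as: estimate monsoon onset and retreat from a daily precipitation series, then key sowing and harvest to those dates. The onset definition (a persistence-window mean exceeding a threshold) and the offsets below are deliberately simple placeholders, not the paper's calibrated rules:

```python
import numpy as np

def monsoon_onset_retreat(precip, threshold=5.0, persist=10):
    """Estimate monsoon onset/retreat days from daily precipitation (mm/day).

    Onset is the first day from which the `persist`-day mean exceeds the
    threshold; retreat is the end of the last such window.  This is a
    simplified stand-in for the onset definition used in the paper.
    """
    means = np.convolve(precip, np.ones(persist) / persist, mode="valid")
    wet = np.where(means > threshold)[0]
    if len(wet) == 0:
        return None, None
    return int(wet[0]), int(wet[-1] + persist - 1)

def crop_calendar(onset, retreat):
    """Rule-of-thumb calendar: rice is sown at monsoon onset and harvested at
    retreat; wheat is sown shortly after retreat (dry winter season).
    The 15-day offset is an illustrative placeholder."""
    return {"rice_sow": onset, "rice_harvest": retreat, "wheat_sow": retreat + 15}
```

Because the dates are derived from the model's own monsoon, the growing season shifts consistently with the internal model climate in future simulations, which is the property highlighted above.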
NASA Astrophysics Data System (ADS)
Lague, D.
2014-12-01
High Resolution Topographic (HRT) datasets are predominantly stored and analyzed as 2D raster grids of elevations (i.e., Digital Elevation Models). Raster grid processing is common in GIS software and benefits from a large library of fast algorithms dedicated to geometrical analysis, drainage network computation and topographic change measurement. Yet, all instruments and methods currently generating HRT datasets (e.g., ALS, TLS, SFM, stereo satellite imagery) natively output 3D unstructured point clouds that are (i) non-regularly sampled, (ii) incomplete (e.g., submerged parts of river channels are rarely measured), and (iii) include 3D elements (e.g., vegetation, vertical features such as river banks or cliffs) that cannot be accurately described in a DEM. Interpolating the raw point cloud onto a 2D grid generally results in a loss of positional accuracy and spatial resolution, and introduces interpolation effects that are difficult to control. Here I demonstrate how studying earth surface topography and processes directly on native 3D point cloud datasets offers several advantages over raster-based methods: point cloud methods preserve the accuracy of the original data, can better handle the evaluation of uncertainty associated with topographic change measurements, and are more suitable for studying vegetation characteristics and steep features of the landscape. In this presentation, I will illustrate and compare point-cloud-based and raster-based workflows with various examples involving ALS, TLS and SFM for the analysis of bank erosion processes in bedrock and alluvial rivers, rockfall statistics (including rockfall volume estimation directly from point clouds) and the interaction of vegetation, hydraulics and sedimentation in salt marshes. These workflows use two recently published algorithms for point cloud classification (CANUPO) and point cloud comparison (M3C2), now implemented in the open source software CloudCompare.
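A minimal example of working directly on the native 3D points is a brute-force cloud-to-cloud nearest-neighbor distance, a greatly simplified stand-in for the M3C2 comparison (no surface normals, no local averaging, no uncertainty estimate), which nonetheless avoids gridding entirely:

```python
import numpy as np

def nearest_neighbor_distances(cloud_a, cloud_b):
    """Per-point distance from each point of cloud_a (n x 3) to the nearest
    point of cloud_b (m x 3), computed directly on the 3-D coordinates.
    Brute force: O(n*m); real workflows use spatial indexing (k-d trees)."""
    diff = cloud_a[:, None, :] - cloud_b[None, :, :]
    return np.sqrt((diff ** 2).sum(-1)).min(axis=1)
```

For two surveys of the same surface, such distances quantify topographic change without the positional-accuracy loss introduced by interpolating either cloud onto a DEM, including on vertical faces where a DEM cannot represent the surface at all.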
Heterogeneous data fusion for brain tumor classification.
Metsis, Vangelis; Huang, Heng; Andronesi, Ovidiu C; Makedon, Fillia; Tzika, Aria
2012-10-01
Current research in biomedical informatics involves analysis of multiple heterogeneous data sets. This includes patient demographics, clinical and pathology data, treatment history, patient outcomes as well as gene expression, DNA sequences and other information sources such as gene ontology. Analysis of these data sets could lead to better disease diagnosis, prognosis, treatment and drug discovery. In this report, we present a novel machine learning framework for brain tumor classification based on heterogeneous data fusion of metabolic and molecular datasets, including state-of-the-art high-resolution magic angle spinning (HRMAS) proton (1H) magnetic resonance spectroscopy and gene transcriptome profiling, obtained from intact brain tumor biopsies. Our experimental results show that our novel framework outperforms any analysis using individual dataset.
NASA Astrophysics Data System (ADS)
Wrzesien, M.; Durand, M. T.; Pavelsky, T.
2017-12-01
The hydrologic cycle is a key component of many aspects of daily life, yet not all water cycle processes are fully understood. In particular, water storage in mountain snowpacks remains largely unknown. Previous work with a high resolution regional climate model suggests that global and continental models underestimate mountain snow accumulation, perhaps by as much as 50%. Therefore, we hypothesize that since snow water equivalent (one aspect of the water balance) is underestimated, accepted water balances for major river basins are likely wrong, particularly for mountainous river basins. Here we examine water balances for four major high-latitude North American watersheds: the Columbia, Mackenzie, Nelson, and Yukon. The mountainous percentage of each basin varies, which allows us to consider whether bias in the water balance is related to the mountain area percentage within the watershed. For our water balance evaluation, we especially consider precipitation estimates from a variety of datasets, including models such as WRF and MERRA, and observation-based products such as CRU and GPCP. We ask whether the precipitation datasets provide enough moisture for seasonal snow to accumulate within each basin and whether we see differences in the variability of annual and seasonal precipitation across the datasets. From our reassessment of high-latitude water balances, we aim to determine whether the current understanding is sufficient to describe all processes within the hydrologic cycle or whether the datasets appear to be biased, particularly in high-elevation precipitation. Should currently available datasets prove to be similarly biased in precipitation, as we have seen for mountain snow accumulation, we discuss the implications for the continental water budget.
NASA Astrophysics Data System (ADS)
Stevens, S. E.; Nelson, B. R.; Langston, C.; Qi, Y.
2012-12-01
The National Mosaic and Multisensor QPE (NMQ/Q2) software suite, developed at NOAA's National Severe Storms Laboratory (NSSL) in Norman, OK, addresses a large deficiency in the resolution of currently archived precipitation datasets. Current standards, both radar- and satellite-based, provide nationwide precipitation data with a spatial resolution of at best 4-5 km and a temporal resolution as fine as one hour. Efforts are ongoing to process archived NEXRAD data for the period of record (1996-present), producing a continuous dataset of precipitation at a spatial resolution of 1 km on a timescale of only five minutes. In addition, radar-derived precipitation data are adjusted hourly using a wide variety of automated gauge networks spanning the United States. Applications for such a product range widely, from emergency management and flash flood guidance to hydrological studies and drought monitoring. Results are presented from a subset of the NEXRAD dataset, providing basic statistics on the distribution of rain rates, the relative frequency of precipitation types, and several other variables that demonstrate the variety of output provided by the software. Precipitation data from select case studies are also presented to highlight the increased resolution provided by this reanalysis and the possibilities that arise from the availability of data on such fine scales. A previously completed pilot project and steps toward a nationwide implementation are presented along with proposed strategies for managing and processing such a large dataset. Reprocessing efforts span several institutions in both North Carolina and Oklahoma, and data and software coordination are key to producing a homogeneous record of precipitation to be archived alongside NOAA's other Climate Data Records. Methods are presented for utilizing supercomputing capability to expedite processing and allow for the iterative nature of a reanalysis effort.
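The hourly gauge correction can be illustrated with the simplest possible scheme, a mean-field bias adjustment that scales the radar precipitation grid by the accumulated gauge/radar ratio. The real NMQ/Q2 adjustment is spatially varying; this sketch (with hypothetical function and parameter names) only conveys the idea:

```python
import numpy as np

def gauge_adjust(radar_field, radar_at_gauges, gauge_values,
                 min_ratio=0.1, max_ratio=10.0):
    """Mean-field bias adjustment sketch for radar QPE.

    Scales the whole radar grid by the ratio of gauge-accumulated to
    radar-accumulated precipitation over the gauge sites, clipped to a
    plausible range to guard against sparse or zero accumulations.
    """
    ratio = gauge_values.sum() / max(radar_at_gauges.sum(), 1e-9)
    ratio = float(np.clip(ratio, min_ratio, max_ratio))
    return radar_field * ratio, ratio
```

Applying such an adjustment hour by hour keeps the high-resolution radar spatial pattern while anchoring the accumulated totals to the gauge networks.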
Fast 3D Surface Extraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sewell, Christopher Meyer; Patchett, John M.; Ahrens, James P.
Ocean scientists searching for isosurfaces and/or thresholds of interest in high resolution 3D datasets required a tedious and time-consuming interactive exploration experience. PISTON research and development activities are enabling ocean scientists to rapidly and interactively explore isosurfaces and thresholds in their large data sets using a simple slider with real time calculation and visualization of these features. Ocean scientists can now visualize more features in less time, helping them gain a better understanding of the high resolution data sets they work with on a daily basis. Isosurface timings (512³ grid): VTK 7.7 s, Parallel VTK (48-core) 1.3 s, PISTON OpenMP (48-core) 0.2 s, PISTON CUDA (Quadro 6000) 0.1 s.
High resolution satellite image indexing and retrieval using SURF features and bag of visual words
NASA Astrophysics Data System (ADS)
Bouteldja, Samia; Kourgli, Assia
2017-03-01
In this paper, we evaluate the performance of SURF descriptor for high resolution satellite imagery (HRSI) retrieval through a BoVW model on a land-use/land-cover (LULC) dataset. Local feature approaches such as SIFT and SURF descriptors can deal with a large variation of scale, rotation and illumination of the images, providing, therefore, a better discriminative power and retrieval efficiency than global features, especially for HRSI which contain a great range of objects and spatial patterns. Moreover, we combine SURF and color features to improve the retrieval accuracy, and we propose to learn a category-specific dictionary for each image category which results in a more discriminative image representation and boosts the image retrieval performance.
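A bag-of-visual-words (BoVW) encoding has two steps: build a dictionary by clustering local descriptors, then represent each image as a normalized histogram of its descriptors' nearest visual words. The sketch below uses plain k-means on generic descriptor vectors; SURF extraction itself, the color features, and the category-specific dictionaries proposed in the paper are outside its scope:

```python
import numpy as np

def kmeans(descriptors, k=8, iters=25, seed=0):
    """Plain k-means clustering to build the visual dictionary."""
    rng = np.random.default_rng(seed)
    centers = descriptors[rng.choice(len(descriptors), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((descriptors[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = descriptors[labels == c].mean(0)
    return centers

def bovw_histogram(descriptors, centers):
    """Encode one image's descriptor set as a normalized visual-word histogram."""
    labels = np.argmin(((descriptors[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    hist = np.bincount(labels, minlength=len(centers)).astype(float)
    return hist / hist.sum()
```

Retrieval then reduces to comparing histograms (e.g., by chi-square or cosine distance), and a category-specific dictionary simply means fitting one `centers` array per image category.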
Smoothing and gap-filling of high resolution multi-spectral time series: Example of Landsat data
NASA Astrophysics Data System (ADS)
Vuolo, Francesco; Ng, Wai-Tim; Atzberger, Clement
2017-05-01
This paper introduces a novel methodology for generating 15-day, smoothed and gap-filled time series of high spatial resolution data. The approach is based on templates from high quality observations to fill data gaps that are subsequently filtered. We tested our method for one large contiguous area (Bavaria, Germany) and for nine smaller test sites in different ecoregions of Europe using Landsat data. Overall, our results match the validation dataset to a high degree of accuracy with a mean absolute error (MAE) of 0.01 for visible bands, 0.03 for near-infrared and 0.02 for short-wave-infrared. Occasionally, the reconstructed time series are affected by artefacts due to undetected clouds. Less frequently, larger uncertainties occur as a result of extended periods of missing data. Reliable cloud masks are highly warranted for making full use of time series.
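The fill-then-filter idea can be sketched with linear interpolation across gaps (a much simpler stand-in for the paper's template-based filling from high-quality observations) followed by a moving-average smoother:

```python
import numpy as np

def fill_and_smooth(series, window=3):
    """Fill gaps (NaN) by linear interpolation between the nearest valid
    observations, then apply a centered moving-average smoother.

    Stand-in for the template-based gap filling and subsequent filtering
    described in the paper; the window length is an illustrative choice.
    """
    x = np.arange(len(series))
    valid = ~np.isnan(series)
    filled = np.interp(x, x[valid], series[valid])   # gap filling
    pad = window // 2
    padded = np.pad(filled, pad, mode="edge")        # edge padding for the filter
    return np.convolve(padded, np.ones(window) / window, mode="valid")
```

As noted above, reconstructions of this kind degrade when clouds go undetected (bad values treated as valid) or when gaps are long, which is why reliable cloud masks matter so much.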
OpenMSI Arrayed Analysis Tools v2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
BOWEN, BENJAMIN; RUEBEL, OLIVER; DE ROND, TRISTAN
2017-02-07
Mass spectrometry imaging (MSI) enables high-resolution spatial mapping of biomolecules in samples and is a valuable tool for the analysis of tissues from plants and animals, microbial interactions, high-throughput screening, drug metabolism, and a host of other applications. This is accomplished by desorbing molecules from the surface at spatially defined locations using a laser or ion beam. These ions are analyzed by a mass spectrometer and collected into an MSI 'image', a dataset containing unique mass spectra from the sampled spatial locations. MSI is used in a diverse and increasing number of biological applications. The OpenMSI Arrayed Analysis Tool (OMAAT) is a new software method that addresses the challenges of analyzing spatially defined samples in large MSI datasets by providing support for automatic sample position optimization and ion selection.
Gregoretti, Francesco; Cesarini, Elisa; Lanzuolo, Chiara; Oliva, Gennaro; Antonelli, Laura
2016-01-01
The large amount of data generated in biological experiments that rely on advanced microscopy can be handled only with automated image analysis. Most analyses require reliable cell image segmentation, ideally capable of detecting subcellular structures. We present an automatic segmentation method to detect Polycomb group (PcG) protein areas isolated from nuclei regions in high-resolution fluorescent cell image stacks. It combines two segmentation algorithms that use an active contour model and a classification technique, serving as a tool to better understand the subcellular three-dimensional distribution of PcG proteins in live cell image sequences. We obtained accurate results throughout several cell image datasets coming from different cell types and corresponding to different fluorescent labels, without requiring elaborate adjustments for each dataset.
Towards a High-Resolution Global Inundation Delineation Dataset
NASA Astrophysics Data System (ADS)
Fluet-Chouinard, E.; Lehner, B.
2011-12-01
Although their importance for biodiversity, flow regulation and ecosystem service provision is widely recognized, wetlands and temporarily inundated landscapes remain poorly mapped globally because of their inherently elusive nature. Inventorying of wetland resources has been identified in international agreements as an essential component of appropriate conservation efforts and management initiatives for these threatened ecosystems. However, despite recent advances in remote sensing of surface water, current inventories of surface water variations remain incomplete at the regional-to-global scale due to methodological limitations restricting truly global application. Remote sensing wetland applications such as L-band SAR are particularly constrained by image availability and heterogeneity of acquisition dates, while coarse-resolution passive microwave and multi-sensor methods cannot discriminate distinct surface water bodies. As a result, the most popular global wetland dataset remains to this day the Global Lake & Wetland Database (Lehner and Doll, 2004), a spatially inconsistent database assembled from various existing data sources. The approach taken in this project circumvents the limitations of current global wetland monitoring methods by combining globally available topographic and hydrographic data to downscale coarse-resolution global inundation data (Prigent et al., 2007) and thus create a superior inundation delineation map product. The procedure downscales inundation data from the coarse resolution (~27 km) of current passive microwave sensors to the finer spatial resolution (~500 m) of the topographic and hydrographic layers of the HydroSHEDS data suite (Lehner et al., 2006), while retaining the high temporal resolution of the multi-sensor inundation dataset. The downscaling process yields new information not only on the specific location of inundation, but also on its frequency and duration.
The downscaling algorithm employs a decision tree classifier, trained on regional remote sensing wetland maps, to derive inundation probability, followed by a seeded region growing segmentation process to redistribute the inundated area at the finer resolution. The algorithm's performance is assessed by evaluating the level of agreement between its downscaled inundation maps and existing regional remote sensing inundation delineations. Upon completion, this project will offer a dynamic, globally seamless inundation map at an unprecedented spatial and temporal scale, which will provide the baseline inventory long requested by the research community and open the door to a wide array of conservation and hydrological modeling applications that were until now data-limited. Literature: Lehner, B., K. Verdin, and A. Jarvis. 2008. New global hydrography derived from spaceborne elevation data. Eos 89, no. 10. Lehner, B., and P. Doll. 2004. Development and validation of a global database of lakes, reservoirs and wetlands. Journal of Hydrology 296, no. 1-4: 1-22. Prigent, C., F. Papa, F. Aires, W. B. Rossow, and E. Matthews. 2007. Global inundation dynamics inferred from multiple satellite observations, 1993-2000. Journal of Geophysical Research 112, no. D12: 1-13.
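The two-stage logic described above (a per-cell inundation probability from a classifier, then redistribution of the coarse-cell inundated area at the finer resolution) can be sketched as follows. The probability-ranked assignment shown here is a simplified stand-in for the seeded region-growing segmentation, and all names and values are illustrative:

```python
import numpy as np

def downscale_inundation(coarse_fraction, fine_prob):
    """Redistribute a coarse cell's inundated fraction to its fine sub-cells.
    fine_prob: 2D array of inundation probabilities (e.g., from a decision
    tree trained on regional wetland maps). The most probable sub-cells are
    flagged wet until the coarse-cell fraction is met."""
    n_wet = int(round(coarse_fraction * fine_prob.size))
    order = np.argsort(fine_prob, axis=None)[::-1]  # most probable first
    wet = np.zeros(fine_prob.size, dtype=bool)
    wet[order[:n_wet]] = True
    return wet.reshape(fine_prob.shape)

# A 2x2 block of fine cells under one coarse cell that is 50% inundated
prob = np.array([[0.9, 0.2],
                 [0.7, 0.1]])
print(downscale_inundation(0.5, prob))
```

Repeating this per coarse cell and per time step preserves the coarse-scale inundated area while adding fine-scale location, frequency and duration information.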
Modeling diurnal land temperature cycles over Los Angeles using downscaled GOES imagery
NASA Astrophysics Data System (ADS)
Weng, Qihao; Fu, Peng
2014-11-01
Land surface temperature (LST) is a key parameter for monitoring urban heat islands, assessing heat-related risks, and estimating building energy consumption. These environmental issues are characterized by high temporal variability. A possible solution from the remote sensing perspective is to utilize images from geostationary satellites, for instance the Geostationary Operational Environmental Satellite (GOES) and Meteosat Second Generation (MSG) systems. However, the coarse spatial but high temporal resolution of these systems (sub-hourly imagery at 3-10 km resolution) often limits their usage to meteorological forecasting and global climate modeling. Therefore, efficient and effective methods to disaggregate these coarse-resolution images to a scale suitable for regional and local studies need to be explored. In this study, we propose a least squares support vector machine (LSSVM) method to downscale GOES image data to half-hourly 1-km LSTs by fusing it with MODIS data products and Shuttle Radar Topography Mission (SRTM) digital elevation data. The results suggest that the proposed method successfully disaggregated GOES images to half-hourly 1-km LSTs with an accuracy of approximately 2.5 K when validated against MODIS LSTs at the same overpass time. The synthetic LST datasets were further explored for monitoring the surface urban heat island (UHI) in the Los Angeles region by extracting key diurnal temperature cycle (DTC) parameters. We found that the datasets and the DTC-derived parameters were more suitable for monitoring the daytime than the nighttime UHI. With the downscaled GOES 1-km LSTs, diurnal temperature variations could be well characterized, with an accuracy of about 2.5 K in terms of the fitted results at both 1-km and 5-km resolutions.
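The train-at-coarse-scale, predict-at-fine-scale workflow behind such statistical downscaling can be sketched as below. The paper uses a least squares SVM; a plain linear least-squares fit is substituted here purely to keep the example self-contained, and the predictor values (e.g., NDVI and elevation) are hypothetical:

```python
import numpy as np

def fit_downscaling_model(coarse_predictors, coarse_lst):
    """Fit a linear surrogate of the assumed scale-invariant relation
    between LST and surface predictors at the coarse resolution."""
    X = np.column_stack([coarse_predictors, np.ones(len(coarse_predictors))])
    coef, *_ = np.linalg.lstsq(X, coarse_lst, rcond=None)
    return coef

def predict_fine_lst(coef, fine_predictors):
    """Apply the fitted relation to fine-resolution predictors."""
    X = np.column_stack([fine_predictors, np.ones(len(fine_predictors))])
    return X @ coef

# Hypothetical coarse-pixel predictors (NDVI, elevation in m) and LST (K)
Xc = np.array([[0.2, 100.0], [0.5, 300.0], [0.8, 50.0]])
yc = np.array([310.0, 300.0, 305.0])
coef = fit_downscaling_model(Xc, yc)
fine_lst = predict_fine_lst(coef, np.array([[0.3, 150.0]]))
```

In the actual method the fitted relation would be applied to 1-km MODIS/SRTM predictors, with the residual between predicted and observed coarse LST redistributed to preserve the GOES observation.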
Shi, Yue; Huang, Wenjiang; Ye, Huichun; Ruan, Chao; Xing, Naichen; Geng, Yun; Dong, Yingying; Peng, Dailiang
2018-06-11
In recent decades, rice disease co-epidemics have caused tremendous damage to crop production in both China and Southeast Asia. A variety of remote sensing based approaches have been developed and applied to map disease distributions using coarse- to moderate-resolution imagery. However, the detection and discrimination of the various disease species infecting rice have seldom been assessed using high spatial resolution data. The aims of this study were (1) to develop a set of normalized two-stage vegetation indices (VIs) for characterizing the progressive development of different rice diseases; (2) to explore the performance of the combined normalized two-stage VIs in partial least squares discriminant analysis (PLS-DA); and (3) to map and evaluate the damage caused by rice diseases at fine spatial scales, for the first time using bi-temporal, high spatial resolution imagery from PlanetScope datasets at a 3-m spatial resolution. Our findings suggest that the primary biophysical changes caused by different diseases (e.g., changes in leaf area, pigment contents, or canopy morphology) can be captured using the combined normalized two-stage VIs. PLS-DA was able to classify rice diseases at a sub-field scale, with an overall accuracy of 75.62% and a Kappa value of 0.47. The approach was successfully applied during a typical co-epidemic outbreak of rice dwarf (Rice dwarf virus, RDV), rice blast (Magnaporthe oryzae), and glume blight (Phyllosticta glumarum) in Guangxi Province, China. Furthermore, our results highlight the feasibility of the method for capturing heterogeneous disease patterns at fine spatial scales over large spatial extents.
Palaseanu-Lovejoy, Monica; Poppenga, Sandra K.; Danielson, Jeffrey J.; Tyler, Dean J.; Gesch, Dean B.; Kottermair, Maria; Jalandoni, Andrea; Carlson, Edward; Thatcher, Cindy A.; Barbee, Matthew M.
2018-03-30
Atoll and island coastal communities are highly exposed to sea-level rise, tsunamis, storm surges, rogue waves, king tides, and the occasional combination of multiple factors, such as high regional sea levels, extreme high local tides, and unusually strong wave set-up. The elevation of most of these atolls averages just under 3 meters (m), with many areas roughly at sea level. The lack of high-resolution topographic data has been identified as a critical data gap for hazard vulnerability and adaptation efforts and for high-resolution inundation modeling for atoll nations. Modern topographic survey equipment and airborne lidar surveys can be very difficult and costly to deploy. Therefore, unmanned aircraft systems (UAS) were investigated for collecting overlapping imagery to generate topographic digital elevation models (DEMs). Medium- and high-resolution satellite imagery (Landsat 8 and WorldView-3) was investigated to derive nearshore bathymetry. The Republic of the Marshall Islands is associated with the United States through a Compact of Free Association, and Majuro Atoll is home to the capital city of Majuro and the largest population of the Republic of the Marshall Islands. The only elevation datasets currently available for the entire Majuro Atoll are the Shuttle Radar Topography Mission and the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model Version 2 elevation data, which have a 30-m grid-cell spacing and an 8-m vertical root mean square error (RMSE). Both of these datasets have inadequate spatial resolution and vertical accuracy for inundation modeling. The final topobathymetric DEM (TBDEM) developed for Majuro Atoll is derived from various data sources including charts, soundings, acoustic sonar, and UAS and satellite imagery spanning over 70 years of data collection (1944 to 2016) on different sections of the atoll.
The RMSE of the TBDEM over the land area is 0.197 m using over 70,000 Global Navigation Satellite System real-time kinematic survey points for validation, and 1.066 m for Landsat 8 and 1.112 m for WorldView-3 derived bathymetry using over 16,000 and 9,000 lidar bathymetry points, respectively.
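The vertical RMSE quoted above is computed by sampling the DEM at each survey point and comparing elevations. A minimal sketch; the sample values are illustrative, not from the Majuro dataset:

```python
import numpy as np

def vertical_rmse(dem_elev, survey_elev):
    """Vertical root mean square error between DEM-sampled elevations and
    co-located survey (e.g., GNSS real-time kinematic) elevations, in m."""
    d = np.asarray(dem_elev, float) - np.asarray(survey_elev, float)
    return float(np.sqrt(np.mean(d ** 2)))

# Illustrative DEM vs. survey elevations (m) at three check points
print(vertical_rmse([2.1, 2.4, 1.9], [2.0, 2.2, 2.0]))
```

Applied over the >70,000 GNSS points and the lidar bathymetry points, this yields the 0.197-m land and ~1.1-m bathymetric RMSE values reported.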
NASA Astrophysics Data System (ADS)
Kleber, E.; Crosby, C. J.; Arrowsmith, R.; Robinson, S.; Haddad, D. E.
2013-12-01
The use of Light Detection and Ranging (lidar) derived topography has become an indispensable tool in Earth science research. The collection of high-resolution lidar topography from an airborne or terrestrial platform allows landscapes and landforms to be represented at sub-meter resolution and in three dimensions. In addition to its high value for scientific research, lidar-derived topography has tremendous potential as a tool for Earth science education. Recent science education initiatives and a community call for access to research-level data make the time ripe to expose lidar data and derived data products as a teaching tool. High-resolution topographic data supports several Disciplinary Core Ideas of the Next Generation Science Standards (NGSS, 2013), presents the Big Ideas of the community-driven Earth Science Literacy Initiative (ESLI, 2009), and addresses a number of National Science Education Standards (NSES, 1996) and Benchmarks for Science Literacy (AAAS, 1993) relevant to undergraduate physical and environmental Earth science classes. The spatial context of lidar data complements concepts like visualization, place-based learning, inquiry-based teaching and active learning that are essential to teaching in the geosciences. As official host to EarthScope lidar datasets for tectonically active areas in the western United States, the NSF-funded OpenTopography facility provides user-friendly access to a wealth of data that is easily incorporated into Earth science educational materials. OpenTopography (www.opentopography.org), in collaboration with EarthScope, has developed education and outreach activities to foster teacher, student and researcher utilization of lidar data. These educational resources use lidar data coupled with free tools such as Google Earth to provide a means for students and the interested public to visualize and explore Earth's surface in an interactive manner not possible with most other remotely sensed imagery.
The education section of the OpenTopography portal has recently been strengthened with the addition of several new resources and the re-organization of existing content for easy discovery. New resources include a detailed frequently asked questions (FAQ) section, updated 'How-to' videos for downloading data from OpenTopography and additional webpages aimed at students, educators and researchers leveraging existing and updated resources from OpenTopography, EarthScope and other organizations. In addition, the OpenLandform catalog, an online collection of classic geologic landforms depicted in lidar, has been updated to include additional tectonic landforms from EarthScope lidar datasets.
Towards systematic evaluation of crop model outputs for global land-use models
NASA Astrophysics Data System (ADS)
Leclere, David; Azevedo, Ligia B.; Skalský, Rastislav; Balkovič, Juraj; Havlík, Petr
2016-04-01
Land provides vital socioeconomic resources to society, though at the cost of substantial environmental degradation. Global integrated models combining high-resolution global gridded crop models (GGCMs) and global economic models (GEMs) are increasingly being used to inform sustainable solutions for agricultural land use. However, little effort has yet been made to evaluate and compare the accuracy of GGCM outputs. In addition, GGCM datasets require a large number of parameters whose values, and their variability across space, are weakly constrained, and increasing the accuracy of such datasets has a very high computing cost. Innovative evaluation methods are required both to lend credibility to the global integrated models and to allow efficient parameter specification of GGCMs. We propose an evaluation strategy for GGCM datasets intended for use in GEMs, illustrated with preliminary results from a novel dataset (the Hypercube) generated by the EPIC GGCM and used in the GLOBIOM land-use GEM to inform on present-day crop yield, water and nutrient input needs for 16 crops x 15 management intensities, at a spatial resolution of 5 arc-minutes. We adopt the following principle: evaluation should provide a transparent diagnosis of model adequacy for its intended use. We briefly describe how the Hypercube data is generated and how it articulates with GLOBIOM in order to transparently identify the performances to be evaluated, as well as the main assumptions and data processing involved. Expected performances include adequately representing the sub-national heterogeneity in crop yield and input needs: i) in space, ii) across crop species, and iii) across management intensities. We will present and discuss measures of these expected performances and weigh the relative contributions of the crop model, input data and data processing steps. We will also compare the obtained yield gaps and main yield-limiting factors against the M3 dataset.
Next steps include iterative improvement of parameter assumptions and evaluation of implications of GGCM performances for intended use in the IIASA EPIC-GLOBIOM model cluster. Our approach helps targeting future efforts at improving GGCM accuracy and would achieve highest efficiency if combined with traditional field-scale evaluation and sensitivity analysis.
NASA Astrophysics Data System (ADS)
Wang, Hui-Lin; An, Ru; You, Jia-jun; Wang, Ying; Chen, Yuehong; Shen, Xiao-ji; Gao, Wei; Wang, Yi-nan; Zhang, Yu; Wang, Zhe; Quaye-Ballard, Jonathan Arthur
2017-10-01
Soil moisture plays an important role in the water cycle of the surface ecosystem and is a basic condition for plant growth. Currently, the spatial resolutions of most remotely sensed soil moisture data range from ten to several tens of kilometers, while those observed in situ and simulated for watershed hydrology, ecology, agriculture, weather, and drought research are generally <1 km. Therefore, the existing coarse-resolution remotely sensed soil moisture data need to be downscaled. This paper proposes a universal, multitemporal soil moisture downscaling method suitable for large areas. The datasets comprise land surface, brightness temperature, precipitation, and soil and topographic parameters from high-resolution data, and active/passive microwave remotely sensed essential climate variable soil moisture (ECV_SM) data with a spatial resolution of 25 km. Using this method, a total of 288 soil moisture maps at 1-km resolution, from the first 10-day period of January 2003 to the last 10-day period of December 2010, were derived. In-situ observations were used to validate the downscaled ECV_SM. In general, the downscaled soil moisture values for different land cover and land use types are consistent with the in-situ observations. Root mean square error is reduced from 0.070 to 0.061 using 1970 in-situ time-series observations from 28 sites distributed over different land use and land cover types. The performance was also assessed using the GDOWN metric, a measure of the overall performance of downscaling methods, based on the same dataset; it was positive in 71.429% of cases, indicating that the proposed method generally improves the representation of soil moisture at 1-km resolution.
Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D
2009-01-01
Background: Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. Results: With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles.
Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the biological sample. Conclusion: Decon2LS is an efficient software package for discovering and visualizing features in proteomics studies that require automated interpretation of mass spectra. Besides being easy to use, fast, and reliable, Decon2LS is also open-source, which allows developers in the proteomics and bioinformatics communities to reuse and refine the algorithms to meet individual needs. Decon2LS source code, installer, and tutorials may be downloaded free of charge at . PMID:19292916
Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D
2009-03-17
Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles.
Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the biological sample. Decon2LS is an efficient software package for discovering and visualizing features in proteomics studies that require automated interpretation of mass spectra. Besides being easy to use, fast, and reliable, Decon2LS is also open-source, which allows developers in the proteomics and bioinformatics communities to reuse and refine the algorithms to meet individual needs. Decon2LS source code, installer, and tutorials may be downloaded free of charge at http://ncrr.pnl.gov/software/.
Moerel, Michelle; De Martino, Federico; Kemper, Valentin G; Schmitter, Sebastian; Vu, An T; Uğurbil, Kâmil; Formisano, Elia; Yacoub, Essa
2018-01-01
Following rapid technological advances, ultra-high field functional MRI (fMRI) enables exploring correlates of neuronal population activity at an increasing spatial resolution. However, as the fMRI blood-oxygenation-level-dependent (BOLD) contrast is a vascular signal, the spatial specificity of fMRI data is ultimately determined by the characteristics of the underlying vasculature. At 7T, fMRI measurement parameters determine the relative contribution of the macro- and microvasculature to the acquired signal. Here we investigate how these parameters affect relevant high-end fMRI analyses such as encoding, decoding, and submillimeter mapping of voxel preferences in the human auditory cortex. Specifically, we compare a T2*-weighted fMRI dataset, obtained with 2D gradient echo (GE) EPI, to a predominantly T2-weighted dataset obtained with 3D GRASE. We first investigated the decoding accuracy based on two encoding models that represented different hypotheses about auditory cortical processing. This encoding/decoding analysis profited from the large spatial coverage and sensitivity of the T2*-weighted acquisitions, as evidenced by a significantly higher prediction accuracy in the GE-EPI dataset compared to the 3D GRASE dataset for both encoding models. The main disadvantage of the T2*-weighted GE-EPI dataset for encoding/decoding analyses was that the prediction accuracy exhibited cortical depth-dependent vascular biases. However, we propose that the comparison of prediction accuracy across the different encoding models may be used as a post-processing technique to salvage the spatial interpretability of the GE-EPI cortical depth-dependent prediction accuracy. Second, we explored the mapping of voxel preferences. Large-scale maps of frequency preference (i.e., tonotopy) were similar across datasets, yet the GE-EPI dataset was preferable due to its larger spatial coverage and sensitivity.
However, submillimeter tonotopy maps revealed biases in assigned frequency preference and selectivity for the GE-EPI dataset, but not for the 3D GRASE dataset. Thus, a T2-weighted acquisition is recommended if high specificity in tonotopic maps is required. In conclusion, different fMRI acquisitions were better suited for different analyses. It is therefore critical that any sequence parameter optimization considers the eventual intended fMRI analyses and the nature of the neuroscience questions being asked.
Greg C. Liknes; Dacia M. Meneguzzo; Todd A. Kellerman
2017-01-01
Windbreaks are an important ecological resource across the large expanse of agricultural land in the central United States and are often planted in straight-line or L-shaped configurations to serve specific functions. As high-resolution (i.e., <5 m) land cover datasets become more available for these areas, semi- or fully-automated methods for distinguishing...
Liao, Congyu; Chen, Ying; Cao, Xiaozhi; Chen, Song; He, Hongjian; Mani, Merry; Jacob, Mathews; Magnotta, Vincent; Zhong, Jianhui
2017-03-01
We propose a novel reconstruction method using parallel imaging with a low rank constraint to accelerate high-resolution multishot spiral diffusion imaging. The undersampled high-resolution diffusion data were reconstructed under a low rank (LR) constraint using similarities between the data of different interleaves of a multishot spiral acquisition. Self-navigated phase compensation using the low-resolution phase data in the center of k-space was applied to correct shot-to-shot phase variations induced by motion. The low rank reconstruction was combined with sensitivity encoding (SENSE) for further acceleration. The efficiency of the proposed joint reconstruction framework, dubbed LR-SENSE, was evaluated through error quantification and compared with an ℓ1-regularized compressed sensing method and the conventional iterative SENSE method using the same datasets. With the same acceleration factor, the proposed LR-SENSE method had the smallest normalized sum-of-squares errors among all the compared methods in all diffusion-weighted images and DTI-derived index maps, when evaluated with different acceleration factors (R = 2, 3, 4) and for all acquired diffusion directions. Robust high-resolution diffusion-weighted images can be efficiently reconstructed from highly undersampled multishot spiral data with the proposed LR-SENSE method. Magn Reson Med 77:1359-1366, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
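One plausible form of the normalized sum-of-squares error used for such reconstruction comparisons is the squared residual energy divided by the energy of the fully sampled reference; the exact formulation in the paper may differ, and the image values below are illustrative:

```python
import numpy as np

def nsse(recon, reference):
    """Normalized sum-of-squares error between a reconstructed image and a
    fully sampled reference image: sum((recon - ref)^2) / sum(ref^2)."""
    recon = np.asarray(recon, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return float(np.sum((recon - reference) ** 2) / np.sum(reference ** 2))

# Toy 2x2 "images": reconstruction vs. fully sampled reference
ref = np.array([[1.0, 2.0], [3.0, 4.0]])
rec = np.array([[1.1, 2.0], [2.9, 4.2]])
print(nsse(rec, ref))
```

Evaluating this metric per diffusion-weighted image and per DTI index map, across acceleration factors, gives the kind of comparison reported for LR-SENSE versus compressed sensing and iterative SENSE.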
NASA Astrophysics Data System (ADS)
Crawford, Ben; Grimmond, Sue; Kent, Christoph; Gabey, Andrew; Ward, Helen; Sun, Ting; Morrison, William
2017-04-01
Remotely sensed data from satellites have potential to enable high-resolution, automated calculation of urban surface energy balance terms and inform decisions about urban adaptations to environmental change. However, aerodynamic resistance methods to estimate sensible heat flux (QH) in cities using satellite-derived observations of surface temperature are difficult in part due to spatial and temporal variability of the thermal aerodynamic resistance term (rah). In this work, we extend an empirical function to estimate rah using observational data from several cities with a broad range of surface vegetation land cover properties. We then use this function to calculate spatially and temporally variable rah in London based on high-resolution (100 m) land cover datasets and in situ meteorological observations. In order to calculate high-resolution QH based on satellite-observed land surface temperatures, we also develop and employ novel methods to i) apply source area-weighted averaging of surface and meteorological variables across the study spatial domain, ii) calculate spatially variable, high-resolution meteorological variables (wind speed, friction velocity, and Obukhov length), iii) incorporate spatially interpolated urban air temperatures from a distributed sensor network, and iv) apply a modified Monte Carlo approach to assess uncertainties with our results, methods, and input variables. Modeled QH using the aerodynamic resistance method is then compared to in situ observations in central London from a unique network of scintillometers and eddy-covariance measurements.
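The aerodynamic resistance method referred to above rests on the bulk transfer relation QH = ρ cp (Ts - Ta) / rah. A minimal sketch, with assumed typical near-surface values for air density and specific heat; the rah value is illustrative, standing in for the empirical function of vegetation cover described in the abstract:

```python
def sensible_heat_flux(t_surface, t_air, r_ah, rho=1.2, cp=1005.0):
    """Bulk aerodynamic estimate of sensible heat flux QH (W m-2):
    QH = rho * cp * (Ts - Ta) / r_ah, where Ts is the (satellite-observed)
    surface temperature (K), Ta the air temperature (K), and r_ah the
    thermal aerodynamic resistance (s m-1). rho (kg m-3) and cp
    (J kg-1 K-1) are assumed typical values, not from the paper."""
    return rho * cp * (t_surface - t_air) / r_ah

# e.g., a 3 K surface-air gradient with r_ah = 60 s m-1
q_h = sensible_heat_flux(303.0, 300.0, 60.0)
print(q_h)
```

In the study, rah varies in space and time with land cover and meteorology, which is exactly why its empirical parameterization dominates the uncertainty of satellite-derived QH.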
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Estes, Maurice G., Jr.; Crosson, William L.; Khan, Maudood N.
2006-01-01
The Atlanta Urban Heat Island and Air Quality Project had its genesis in Project ATLANTA (ATlanta Land use Analysis: Temperature and Air quality), which began in 1996. Project ATLANTA examined how high-spatial-resolution thermal remote sensing data could be used to derive better measurements of the urban heat island effect over Atlanta. We have explored how these thermal remote sensing data, as well as other image datasets, can be used to better characterize the urban landscape for improved air quality modeling over the Atlanta area. For the air quality modeling project, the National Land Cover Dataset and the local-scale Landpro99 dataset, at 30-m spatial resolution, have been used to derive land use/land cover characteristics for input into the MM5 mesoscale meteorological model, one of the foundations of the Community Multiscale Air Quality (CMAQ) model, to assess how these data can improve output from CMAQ. Additionally, land use changes to 2030 have been predicted using a Spatial Growth Model (SGM), which simulates growth around a region using population, employment and travel demand forecasts. Air quality modeling simulations were conducted using both current and future land cover. Meteorological modeling simulations indicate a 0.5 °C increase in daily maximum air temperatures by 2030. Air quality modeling simulations show substantial differences in the relative contributions of individual atmospheric pollutant constituents as a result of land cover change. Enhanced boundary layer mixing over the city tends to offset the increase in ozone concentration expected from higher surface temperatures as a result of urbanization.
NASA Astrophysics Data System (ADS)
Oo, Sungmin; Foelsche, Ulrich; Kirchengast, Gottfried; Fuchsberger, Jürgen
2016-04-01
The research level products of the Integrated Multi-Satellite Retrievals for Global Precipitation Measurement (IMERG "Final" run datasets) were compared with rainfall measurements from the WegenerNet high density network as part of ground validation (GV) projects of GPM missions. The WegenerNet network comprises 151 ground level weather stations in an area of 15 km × 20 km in south-eastern Austria (Feldbach region, ˜46.93° N, ˜15.90° E) designed to serve as a long-term monitoring and validation facility for weather and climate research and applications. While the IMERG provides rainfall estimations every half hour at 0.1° resolution, the WegenerNet network measures rainfall every 5 minutes at around 2 km2 resolution and produces 200 m × 200 m gridded datasets. The study was conducted on the domain of the WegenerNet network; eight IMERG grids are overlapped with the network, two of which are entirely covered by the WegenerNet (40 and 39 stations in each grid). We investigated data from April to September of the years 2014 to 2015; the date of first two years after the launch of the GPM Core Observatory. Since the network has a flexibility to work with various spatial and temporal scales, the comparison could be conducted on average-points to pixel basis at both sub-daily and daily timescales. This presentation will summarize the first results of the comparison and future plans to explore the characteristics of errors in the IMERG datasets.
NASA Astrophysics Data System (ADS)
Obelcz, J.; Xu, K.; Bentley, S. J.; Georgiou, I. Y.; Maloney, J. M.; Miner, M. D.; Hanegan, K.; Keller, G.
2014-12-01
The northern Gulf of Mexico, including the subaqueous Mississippi River delta front (MRDF), has been productive for oil and gas development since the early 1900s. In 1969 cyclic seafloor wave loading associated with the passage of Hurricane Camille triggered subaqueous mudflows across the MRDF, destroying several offshore oil platforms. This incident spurred geophysical and geotechnical studies of the MRDF, which found that the delta front is prone to mass failures on gentle gradients (<0.5°) due to (1) high rates of fine-grained sedimentation and associated underconsolidation, (2) excess sediment pore pressure attributed to in-situ biogenic gas production, and (3) the frequent passage of tropical cyclones. In June 2014, a geophysical pilot study was conducted 8 km southwest of Southwest Pass, the distributary that currently receives the largest fraction of Mississippi River sediment supply. The resultant dataset encompasses 216 km of subbottom Chirp seismic profiles and a 60 km2 grid of bathymetry and sidescan data. Preliminary interpretation of these data shows the survey area can be classified into four primary sedimentary facies: mudflow gullies, mudflow lobes, undisturbed prodelta, and undisturbed delta front. Subbottom profiles reveal extensive biogenic gas from 20 to about 80 m water depths on the delta front; sidescan data show a variety of bottleneck slides, mudflow gullies and mudflow noses. Previous studies have attempted to constrain the periodicity and magnitude of subaqueous mudslides on the MRDF. However, large age gaps and varied resolution between datasets result in ambiguity regarding the cause and magnitude of observed bathymetric changes. We present high-temporal-resolution MRDF bathymetric variations from 2005 (post-Hurricane Katrina), 2009 (relatively quiescent storm period), and 2014 (post-2011 Mississippi River flood). These data yield better magnitude and timing estimates of mass movements.
This exercise represents a first step towards (1) assembling a comprehensive geologic dataset upon which future MRDF geohazard assessments can be founded, and (2) understanding the dynamics of a massive passive margin deltaic lobe entering a phase of decline.
NASA Astrophysics Data System (ADS)
Sefton-Nash, E.; Williams, J.-P.; Greenhagen, B. T.; Aye, K.-M.; Paige, D. A.
2017-12-01
An approach is presented to efficiently produce high quality gridded data records from the large, global point-based dataset returned by the Diviner Lunar Radiometer Experiment aboard NASA's Lunar Reconnaissance Orbiter. The need to minimize data volume and processing time in production of science-ready map products is increasingly important with the growth in data volume of planetary datasets. Diviner makes on average >1400 observations per second of radiance that is reflected and emitted from the lunar surface, using 189 detectors divided into 9 spectral channels. Data management and processing bottlenecks are amplified by modeling every observation as a probability distribution function over the field of view, which can increase the required processing time by 2-3 orders of magnitude. Geometric corrections, such as projection of data points onto a digital elevation model, are numerically intensive and therefore it is desirable to perform them only once. Our approach reduces bottlenecks through parallel binning and efficient storage of a pre-processed database of observations. Database construction is via subdivision of a geodesic icosahedral grid, with a spatial resolution that can be tailored to suit the field of view of the observing instrument. Global geodesic grids with high spatial resolution are normally impractically memory intensive. We therefore demonstrate a minimum-storage and highly parallel method to bin very large numbers of data points onto such a grid. A database of the pre-processed and binned points is then used for production of mapped data products significantly faster than would be possible with unprocessed points. We explore quality controls in the production of gridded data records by conditional interpolation, allowed only where data density is sufficient. The resultant effects on the spatial continuity and uncertainty in maps of lunar brightness temperatures are illustrated.
We identify four binning regimes based on trade-offs between the spatial resolution of the grid, the size of the FOV and the on-target spacing of observations. Our approach may be applicable and beneficial for many existing and future point-based planetary datasets.
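The conditional-gridding idea above (keep a map cell only where data density is sufficient) can be sketched with a simple binning pass. A flat lat/lon grid stands in for the geodesic icosahedral grid, and the cell size, threshold, and observations are invented for illustration.

```python
from collections import defaultdict

# Sketch: bin point observations into grid cells and keep a cell's mean
# only where the data density passes a threshold. A simple lat/lon grid
# stands in for the geodesic icosahedral grid described above.

CELL_DEG = 0.5        # cell size, illustrative
MIN_COUNT = 3         # minimum observations for a "trusted" cell

def cell_id(lat, lon):
    return (int(lat // CELL_DEG), int(lon // CELL_DEG))

def bin_observations(obs):
    """obs: iterable of (lat, lon, value). Returns {cell: (count, mean)}."""
    acc = defaultdict(lambda: [0, 0.0])          # cell -> [count, sum]
    for lat, lon, val in obs:
        c = acc[cell_id(lat, lon)]
        c[0] += 1
        c[1] += val
    return {k: (n, s / n) for k, (n, s) in acc.items()}

def gridded_map(obs, min_count=MIN_COUNT):
    """Keep only cells with enough observations (conditional gridding)."""
    return {k: mean for k, (n, mean) in bin_observations(obs).items()
            if n >= min_count}

obs = [(10.1, 20.1, 100.0), (10.2, 20.2, 110.0), (10.3, 20.3, 120.0),
       (50.0, 60.0, 999.0)]                      # lone point: rejected
print(gridded_map(obs))
```

Because each observation maps independently to a cell id, the binning pass parallelizes trivially: partial accumulators from separate workers merge by summing counts and sums per cell.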
NASA Astrophysics Data System (ADS)
Sava, E.; Thornton, J. C.; Kalyanapu, A. J.; Cervone, G.
2016-12-01
Transportation infrastructure networks in urban areas are highly sensitive to natural disasters, yet are critical to the success of rescue, recovery, and renovation operations. Therefore, prompt restoration of such networks is of high importance for disaster relief services. Satellite and aerial images provide data with high spatial and temporal resolution and are a powerful tool for monitoring the environment and mapping the spatio-temporal variability of the Earth's surface. They provide a synoptic overview and give useful environmental information for a wide range of scales, from entire continents to urban areas, with spatial pixel resolutions ranging from kilometers to centimeters. However, sensor limitations are often a serious drawback, since no single sensor offers the optimal spectral, spatial, and temporal resolution at the same time. Specific data may not be collected in the time and space most urgently required and/or may contain gaps as a result of the satellite revisit time, atmospheric opacity, or other obstructions. In this study, the feasibility of integrating multiple sources of contributed data, including remotely sensed datasets and open-source geospatial datasets, into hydrodynamic models for flood inundation simulations is assessed. The 2015 Dallas floods, which caused up to $61 million in damage, were selected for this study. A Hydrologic Engineering Center - River Analysis System (HEC-RAS) model was developed for the study area, using reservoir surcharge releases and geometry provided by the U.S. Army Corps of Engineers Fort Worth District. The simulated flood inundation was compared with the contributed data for the location (such as Civil Air Patrol data and a WorldView-3 dataset), which indicated that the model did not represent lateral inflows near the upstream section. An Artificial Neural Network (ANN) model was developed that uses local precipitation and discharge values in the vicinity to estimate the lateral flows.
This addition of estimated lateral inflows is expected to improve the model performance to match with the observed flows. Future work will focus on extending this preliminary work to assess the model performance after integrating these additional data sources.
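As a rough sketch of the ANN approach, a single-hidden-layer network in plain Python can be trained to map precipitation and nearby discharge to a lateral inflow. The data, network size, and learning rate below are synthetic placeholders, not the study's configuration.

```python
import math, random

# Minimal single-hidden-layer network, trained by stochastic gradient
# descent to map (precipitation, upstream discharge) to lateral inflow.
# Everything below (data, sizes, rates) is synthetic and illustrative.

random.seed(0)
H = 4                                            # hidden units
w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(w1, b1)]
    return sum(w * hi for w, hi in zip(w2, h)) + b2, h

def train(data, epochs=500, lr=0.05):
    global b2
    for _ in range(epochs):
        for x, y in data:
            yhat, h = forward(x)
            err = yhat - y
            for j in range(H):
                grad_h = err * w2[j] * (1.0 - h[j] ** 2)
                w2[j] -= lr * err * h[j]
                b1[j] -= lr * grad_h
                for i in range(2):
                    w1[j][i] -= lr * grad_h * x[i]
            b2 -= lr * err

# Synthetic "truth": inflow grows with rainfall and upstream discharge.
data = [((p, q), 0.6 * p + 0.3 * q)
        for p in (0.0, 0.5, 1.0) for q in (0.0, 0.5, 1.0)]
loss0 = sum((forward(x)[0] - y) ** 2 for x, y in data)
train(data)
loss1 = sum((forward(x)[0] - y) ** 2 for x, y in data)
print(loss0, loss1)  # training reduces the squared error
```

Once trained on observed events, such a regressor can supply the missing upstream lateral-inflow boundary condition to the hydraulic model at each time step.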
NASA Astrophysics Data System (ADS)
Beamer, J.; Hill, D. F.; Arendt, A. A.; Luthcke, S. B.; Liston, G. E.
2015-12-01
A comprehensive study of the Gulf of Alaska (GOA) drainage basin was carried out to improve understanding of the coastal freshwater discharge (FWD) and surface mass balance (SMB) of glaciers. Coastal FWD and SMB for all glacier surfaces were modeled using a suite of physically based, spatially distributed weather, energy-balance snow/ice melt, soil water balance, and runoff routing models at high resolution (1 km horizontal grid; daily time step). A 35-year hindcast was performed, providing complete records of precipitation, runoff, snow water equivalent (SWE) depth, evapotranspiration, coastal FWD and glacier SMB. Meteorological forcing was provided by the North American Regional Reanalysis (NARR), Modern Era Retrospective Analysis for Research and Applications (MERRA), and NCEP Climate Forecast System Reanalysis (CFSR) datasets. A fourth dataset was created by bias-correcting the NARR data to recently developed monthly weather grids based on PRISM climatologies (NARR-BC). Each weather dataset and model combination was individually calibrated using PRISM climatologies, streamflow, and glacier mass balance measurements from four locations in the study domain. Simulated mean annual FWD into the GOA ranged from 600 km3 yr-1 using NARR to 850 km3 yr-1 from NARR-BC. The CFSR-forced simulations with optimized model parameters produced a simulated regional water storage that compared favorably to data from the NASA/DLR Gravity Recovery and Climate Experiment (GRACE) high-resolution mascon solutions (Figure). Glacier runoff, taken as the sum of rainfall, snow and ice melt occurring on glacier surfaces, ranged from 260 km3 yr-1 from MERRA to 400 km3 yr-1 from NARR-BC, approximately one half of the signal from both glaciers and surrounding terrain. The large contribution from non-glacier surfaces to the seasonal water balance is likely not being fully removed from GRACE solutions aimed at isolating the glacier signal alone.
We will discuss methods to use our simulations to forward-model the hydrology of the Gulf of Alaska region and minimize uncertainty in the partitioning of the hydrological signal. This study provides significant insight into the linkages between hydrological modeling and gravimetric measurements in mountain environments.
Bradbury, Kyle; Saboo, Raghav; L. Johnson, Timothy; Malof, Jordan M.; Devarajan, Arjun; Zhang, Wuming; M. Collins, Leslie; G. Newell, Richard
2016-01-01
Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, for which there is limited public data on deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment. PMID:27922592
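A small example of what the border-vertex annotations enable: estimating a panel's footprint area via the shoelace formula, which feeds directly into installed-capacity estimates. The sketch assumes vertices are already projected to metres; the polygon below is invented.

```python
# Sketch: footprint area of a PV array from its border vertices via the
# shoelace formula. Assumes vertices are in a projected (metre) system
# and listed in order around the polygon; the example polygon is invented.

def polygon_area_m2(vertices):
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

panel = [(0.0, 0.0), (4.0, 0.0), (4.0, 2.5), (0.0, 2.5)]  # 4 m x 2.5 m
print(polygon_area_m2(panel))   # 10.0
```

Summing such areas over all annotated panels, times an assumed W/m2 panel rating, gives a first-order capacity estimate for a city.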
A new, long-term daily satellite-based rainfall dataset for operational monitoring in Africa
NASA Astrophysics Data System (ADS)
Maidment, Ross I.; Grimes, David; Black, Emily; Tarnavsky, Elena; Young, Matthew; Greatrex, Helen; Allan, Richard P.; Stein, Thorwald; Nkonde, Edson; Senkunda, Samuel; Alcántara, Edgar Misael Uribe
2017-05-01
Rainfall information is essential for many applications in developing countries, and yet, continually updated information at fine temporal and spatial scales is lacking. In Africa, rainfall monitoring is particularly important given the close relationship between climate and livelihoods. To address this information gap, this paper describes two versions (v2.0 and v3.0) of the TAMSAT daily rainfall dataset based on high-resolution thermal-infrared observations, available from 1983 to the present. The datasets are based on the disaggregation of 10-day (v2.0) and 5-day (v3.0) total TAMSAT rainfall estimates to a daily time-step using daily cold cloud duration. This approach provides temporally consistent historic and near-real time daily rainfall information for all of Africa. The estimates have been evaluated using ground-based observations from five countries with contrasting rainfall climates (Mozambique, Niger, Nigeria, Uganda, and Zambia) and compared to other satellite-based rainfall estimates. The results indicate that both versions of the TAMSAT daily estimates reliably detect rainy days but have less skill in capturing rainfall amounts, results that are comparable to the other datasets.
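The disaggregation step can be sketched as proportional weighting of the multi-day total by daily cold cloud duration (CCD). The numbers below are illustrative, not TAMSAT's actual calibration.

```python
# Sketch: split a 5-day (pentadal) rainfall total into daily amounts in
# proportion to daily cold cloud duration (CCD), as described above.
# Values are illustrative, not TAMSAT calibration.

def disaggregate(total_mm, daily_ccd_hours):
    ccd_sum = sum(daily_ccd_hours)
    if ccd_sum == 0:                       # no cold cloud: no rain assigned
        return [0.0] * len(daily_ccd_hours)
    return [total_mm * c / ccd_sum for c in daily_ccd_hours]

pentad_total = 40.0                        # mm over 5 days
ccd = [0.0, 6.0, 10.0, 4.0, 0.0]           # hours of cold cloud per day
daily = disaggregate(pentad_total, ccd)
print(daily)                               # [0.0, 12.0, 20.0, 8.0, 0.0]
```

By construction the daily values sum back to the pentadal total, so the historic calibration of the 5-day estimates is preserved at the daily time-step.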
Primate Brain Anatomy: New Volumetric MRI Measurements for Neuroanatomical Studies.
Navarrete, Ana F; Blezer, Erwin L A; Pagnotta, Murillo; de Viet, Elizabeth S M; Todorov, Orlin S; Lindenfors, Patrik; Laland, Kevin N; Reader, Simon M
2018-06-12
Since the publication of the primate brain volumetric dataset of Stephan and colleagues in the early 1980s, no major new comparative datasets covering multiple brain regions and a large number of primate species have become available. However, technological and other advances in the last two decades, particularly magnetic resonance imaging (MRI) and the creation of institutions devoted to the collection and preservation of rare brain specimens, provide opportunities to rectify this situation. Here, we present a new dataset including brain region volumetric measurements of 39 species, including 20 species not previously available in the literature, with measurements of 16 brain areas. These volumes were extracted from MRI scans of 46 brains of 38 species from the Netherlands Institute for Neuroscience Primate Brain Bank, scanned at high resolution with a 9.4-T scanner, plus a further 7 donated MRI scans of 4 primate species. Partial measurements were made on an additional 8 brains of 5 species. We make the dataset and MRI scans available online in the hope that they will be of value to researchers conducting comparative studies of primate evolution. © 2018 S. Karger AG, Basel.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yu, E-mail: yuzhang@smu.edu.cn, E-mail: qianjinfeng08@gmail.com; Wu, Xiuxiu; Yang, Wei
2014-11-01
Purpose: The use of 4D computed tomography (4D-CT) of the lung is important in lung cancer radiotherapy for tumor localization and treatment planning. Sometimes, dense sampling is not acquired along the superior–inferior direction. This disadvantage results in an interslice thickness that is much greater than the in-plane voxel resolution. Isotropic resolution is necessary for multiplanar display, but the commonly used interpolation operation blurs images. This paper presents a super-resolution (SR) reconstruction method to enhance 4D-CT resolution. Methods: The authors assume that the low-resolution images of different phases at the same position can be regarded as input "frames" to reconstruct high-resolution images. The SR technique is used to recover high-resolution images. Specifically, the Demons deformable registration algorithm is used to estimate the motion field between different "frames." Then, the projection onto convex sets approach is implemented to reconstruct high-resolution lung images. Results: The performance of the SR algorithm is evaluated using both simulated and real datasets. The method generates clearer lung images and enhances image structure compared with cubic spline interpolation and the back projection (BP) method. Quantitative analysis shows that the proposed algorithm decreases the root mean square error by 40.8% relative to cubic spline interpolation and 10.2% versus BP. Conclusions: A new algorithm has been developed to improve the resolution of 4D-CT. The algorithm outperforms the cubic spline interpolation and BP approaches by producing images with markedly improved structural clarity and greatly reduced artifacts.
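A 1-D toy version of the projection-onto-convex-sets idea: each low-resolution sample constrains the mean of the high-resolution pixels it covers, and we project onto each constraint in turn. A known one-pixel shift stands in here for the Demons-estimated motion field; the signal is invented.

```python
# Toy 1-D projection-onto-convex-sets (POCS) reconstruction: two low-res
# "frames" of one signal, offset by a single high-res pixel, each sample
# constraining the mean of the HR pixel pair it covers. A known integer
# shift stands in for the Demons-estimated motion field.

def project(x, i, j, v):
    """Project x onto the affine set {(x[i] + x[j]) / 2 == v}."""
    m = (x[i] + x[j]) / 2.0
    x[i] += v - m
    x[j] += v - m

def pocs(constraints, n, iters=400):
    x = [0.0] * n
    for _ in range(iters):
        for i, j, v in constraints:
            project(x, i, j, v)
    return x

true = [1.0, 3.0, 2.0, 6.0, 4.0, 0.0]            # unknown HR signal
frame_a = [(2*k, 2*k + 1, (true[2*k] + true[2*k+1]) / 2) for k in range(3)]
frame_b = [(2*k + 1, 2*k + 2, (true[2*k+1] + true[2*k+2]) / 2) for k in range(2)]
x = pocs(frame_a + frame_b, len(true))
res = max(abs((x[i] + x[j]) / 2 - v) for i, j, v in frame_a + frame_b)
print(res)   # constraint residual shrinks toward 0 with iterations
```

With only two frames the system is underdetermined, so the iterate converges to a data-consistent signal rather than the exact truth; more frames pin the reconstruction down, which is why the many phases of a 4D-CT acquisition help.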
Multiresolution persistent homology for excessively large biomolecular datasets
NASA Astrophysics Data System (ADS)
Xia, Kelin; Zhao, Zhixiong; Wei, Guo-Wei
2015-10-01
Although persistent homology has emerged as a promising tool for the topological simplification of complex data, it is computationally intractable for large datasets. We introduce multiresolution persistent homology to handle excessively large datasets. We match the resolution with the scale of interest so as to represent large-scale datasets with appropriate resolution. We utilize the flexibility-rigidity index to assess the topological connectivity of the data set and define a rigidity density for the filtration analysis. By appropriately tuning the resolution of the rigidity density, we are able to focus the topological lens on the scale of interest. The proposed multiresolution topological analysis is validated by a hexagonal fractal image which has three distinct scales. We further demonstrate the proposed method for extracting topological fingerprints from DNA molecules. In particular, the topological persistence of a virus capsid with 273 780 atoms is successfully analyzed, which would otherwise be inaccessible to the normal point cloud method and unreliable by using coarse-grained multiscale persistent homology. The proposed method has also been successfully applied to protein domain classification, which is, to our knowledge, the first practical application of persistent homology to protein domain analysis. The proposed multiresolution topological method has potential applications in arbitrary data sets, such as social networks, biological networks, and graphs.
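The rigidity-density idea can be sketched with a Gaussian flexibility-rigidity-style kernel whose width eta acts as the resolution knob: small eta resolves individual atoms, large eta blurs them into coarse structure. Atom positions and eta values below are illustrative.

```python
import math

# Sketch: a flexibility-rigidity-index style rigidity density. Each atom
# contributes a Gaussian kernel of width eta; eta is the resolution knob
# discussed above. Atom positions are illustrative.

def rigidity_density(point, atoms, eta):
    return sum(math.exp(-((math.dist(point, a) / eta) ** 2)) for a in atoms)

atoms = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (5.0, 5.0, 5.0)]

near = rigidity_density((0.0, 0.0, 0.0), atoms, eta=1.0)      # atomic scale
far = rigidity_density((10.0, 10.0, 10.0), atoms, eta=1.0)
coarse = rigidity_density((10.0, 10.0, 10.0), atoms, eta=10.0)  # blurred scale
print(near, far, coarse)
```

Evaluating this density on a grid and filtering by density level (rather than by point-cloud distance) is what makes the filtration tractable for structures with hundreds of thousands of atoms.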
NASA Astrophysics Data System (ADS)
Alemu, H.; Senay, G. B.; Velpuri, N.; Asante, K. O.
2008-12-01
The nomadic pastoral communities in East Africa heavily depend on small water bodies and artificial lakes for domestic and livestock uses. The shortage of water in the region has made these water resources of great importance to them and sometimes even the reason for conflicts amongst rival communities in the region. Satellite-based data have significantly transformed the way we track and estimate hydrological processes such as precipitation and evapotranspiration. This approach has been particularly useful in remote places where conventional station-based weather networks are scarce. Tropical Rainfall Measuring Mission (TRMM) satellite data were extracted for the study region. National Oceanic and Atmospheric Administration's (NOAA) Global Data Assimilation System (GDAS) data were used to extract the climatic parameters needed to calculate reference evapotranspiration. The elevation data needed to delineate the watersheds were extracted from the Shuttle Radar Topography Mission (SRTM) with a spatial resolution of 90 m. The waterholes (most of which have an average surface area of less than a hectare) were identified using Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) images with a spatial resolution of 15 m. As part of a National Aeronautics and Space Administration (NASA) funded enhancement to a livestock early warning decision support system, a simple hydrologic water balance model was developed to estimate daily waterhole depth variations. The model was run for more than 10 years, from 1998 to 2008, for 10 representative waterholes in the region. Although there were no independent datasets to validate the results, the temporal patterns captured both the seasonal and inter-annual variations, depicting known drought and flood years. Future research includes the installation of staff gauges for model calibration and validation.
The simple modeling approach demonstrated the effectiveness of integrating dynamic coarse resolution datasets such as TRMM with high resolution static datasets such as ASTER and SRTM DEM (Digital Elevation Model) to monitor water resources for drought early warning applications.
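The daily balance behind such a model can be sketched as: depth rises with rainfall-driven inflow, falls with evaporation and seepage, and is bounded by zero and the waterhole capacity. All rates and coefficients below are invented placeholders, not the study's calibrated values.

```python
# Sketch of a simple daily waterhole balance: depth responds to rainfall
# and loses water to evaporation and seepage, bounded by [0, capacity].
# All rates and coefficients are invented placeholders.

def run_balance(depth0_m, rain_mm, eto_mm, capacity_m=3.0,
                runoff_coeff=5.0, seepage_mm=2.0):
    """runoff_coeff crudely converts catchment rainfall to depth gain."""
    depth = depth0_m
    series = []
    for p, et in zip(rain_mm, eto_mm):
        gain_m = runoff_coeff * p / 1000.0          # mm rain -> m depth
        loss_m = (et + seepage_mm) / 1000.0          # mm loss -> m depth
        depth = min(capacity_m, max(0.0, depth + gain_m - loss_m))
        series.append(depth)
    return series

rain = [0.0, 20.0, 0.0, 0.0, 50.0, 0.0]   # daily rainfall, e.g. from TRMM
eto  = [6.0,  4.0, 6.0, 6.0,  5.0, 6.0]   # reference ET, e.g. from GDAS
series = run_balance(1.0, rain, eto)
print(series)
```

Driving the rainfall term with TRMM and the evapotranspiration term with GDAS-derived reference ET is exactly the kind of coarse-dynamic plus fine-static integration the abstract describes.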
NASA Astrophysics Data System (ADS)
Kruschke, Tim; Kunze, Markus; Misios, Stergios; Matthes, Katja; Langematz, Ulrike; Tourpali, Kleareti
2016-04-01
Advanced spectral solar irradiance (SSI) reconstructions differ significantly from each other in terms of the mean solar spectrum, i.e., the spectral distribution of energy, and solar cycle variability. The largest uncertainties, relative to mean irradiance, are found for the ultraviolet range of the spectrum, a spectral region highly important for radiative heating and chemistry in the stratosphere and troposphere. This study systematically analyzes the effects of employing different SSI reconstructions in long-term (40-year) chemistry-climate model (CCM) simulations to estimate related uncertainties of the atmospheric response. These analyses are highly relevant for the next round of CCM studies as well as for climate models within the CMIP6 exercise. The simulations are conducted by means of two state-of-the-art CCMs - CESM1(WACCM) and EMAC - run in "atmosphere-only" mode. These models are quite different with respect to the complexity of the implemented radiation and chemistry schemes. CESM1(WACCM) features a chemistry module with considerably higher spectral resolution of the photolysis scheme, while EMAC employs a radiation code with notably higher spectral resolution. For all simulations, concentrations of greenhouse gases and ozone depleting substances, as well as observed sea surface temperatures (SSTs), are set to average conditions representative of the year 2000 (for SSTs: mean of the decade centered on the year 2000) to exclude anthropogenic influences and differences due to variable SST forcing. Only the SSI forcing differs between the simulations. Four different forcing datasets are used: NRLSSI1 (used as a reference in all previous climate modeling intercomparisons, i.e. CMIP5, CCMVal, CCMI), NRLSSI2, SATIRE-S, and the SSI forcing dataset recommended for the CMIP6 exercise. For each dataset, a solar maximum and a solar minimum timeslice are integrated.
The results of these simulations - eight in total - are compared to each other with respect to their shortwave heating rate differences (additionally collated with line-by-line calculations using libRadtran), differences in photolysis rates, as well as atmospheric circulation features (temperature, zonal wind, geopotential height, etc.). It is shown that the atmospheric responses to the different SSI datasets differ significantly from each other. This results from direct radiative effects as well as from indirect effects induced by ozone feedbacks. Differences originating from using different SSI datasets for the same level of solar activity are of the same order of magnitude as those associated with the 11-year solar cycle within a specific dataset. However, the climate signals related to the solar cycle are quite comparable across datasets.
Paech, S.J.; Mecikalski, J.R.; Sumner, D.M.; Pathak, C.S.; Wu, Q.; Islam, S.; Sangoyomi, T.
2009-01-01
Estimates of incoming solar radiation (insolation) from Geostationary Operational Environmental Satellite observations have been produced for the state of Florida over a 10-year period (1995-2004). These insolation estimates were developed into well-calibrated half-hourly and daily integrated solar insolation fields over the state at 2 km resolution, in addition to a 2-week running minimum surface albedo product. Model results of the daily integrated insolation were compared with ground-based pyranometers, and as a result, the entire dataset was calibrated. This calibration was accomplished through a three-step process: (1) comparison with ground-based pyranometer measurements on clear (noncloudy) reference days, (2) correcting for a bias related to cloudiness, and (3) deriving a monthly bias correction factor. Precalibration results indicated good model performance, with a station-averaged model error of 2.2 MJ m-2/day (13%). Calibration reduced errors to 1.7 MJ m-2/day (10%), and also removed temporal-related, seasonal-related, and satellite sensor-related biases. The calibrated insolation dataset will subsequently be used by state of Florida Water Management Districts to produce statewide, 2-km resolution maps of estimated daily reference and potential evapotranspiration for water management-related activities. © 2009 American Water Resources Association.
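Step (3), the monthly bias correction factor, can be sketched as scaling modeled daily insolation by the ratio of monthly mean observed to monthly mean modeled values. The numbers below are made up, not the paper's calibration coefficients.

```python
from collections import defaultdict

# Sketch of a monthly bias-correction factor: scale modeled daily
# insolation by (monthly mean observed) / (monthly mean modeled).
# Values are invented; this is not the paper's exact calibration.

def monthly_factors(months, model, obs):
    sums = defaultdict(lambda: [0.0, 0.0])       # month -> [model, obs]
    for m, md, ob in zip(months, model, obs):
        sums[m][0] += md
        sums[m][1] += ob
    return {m: ob / md for m, (md, ob) in sums.items()}

def apply_correction(months, model, factors):
    return [md * factors[m] for m, md in zip(months, model)]

months = [1, 1, 1, 2, 2, 2]
model  = [10.0, 12.0, 8.0, 20.0, 22.0, 18.0]    # MJ m-2/day, modeled
obs    = [ 9.0, 11.0, 7.0, 21.0, 23.0, 19.0]    # pyranometer measurements
f = monthly_factors(months, model, obs)
corrected = apply_correction(months, model, f)
print(f, corrected)
```

By construction the corrected monthly means match the pyranometer monthly means, which is what removes the seasonal-related bias.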
NASA Astrophysics Data System (ADS)
Delikaraoglou, D.; Mintourakis, I.; Kallianou, F.
2009-04-01
With the realization of the Shuttle Radar Topographic Mission (SRTM) and the free distribution of its global elevation dataset with 3 arcsec (90 m) resolution and less than 16 m vertical accuracy, together with the availability of the higher resolution (30 m) and accuracy (10 m) Digital Terrain Models (DTM) from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), these two valuable sources of uniform DEM data represent a revolution in the world of terrain modelling. DEMs are an important source of data for the generation of high resolution geoids since they provide the high-frequency content of the gravity field spectrum and are suitable for the computation of terrain effects to gravity and indirect effects to the geoid, thus allowing the combination of global geopotential models, local gravity anomalies and information about the earth's topography (represented by a given DEM). However, although such models are available for land, there are no readily accessible Digital Bathymetry Models (DBMs) of equivalent quality for the coastal and oceanic regions. Most of the global DBMs (e.g. ETOPO1, SRTM30, and the GEBCO global bathymetric grid) are compilations of heterogeneous data with medium resolution and accuracy. This prevents exploiting the potential of the recent high resolution (1 arcmin) marine free-air gravity anomaly datasets derived from satellite altimetry (such as the DNSC08 and the Sandwell & Smith v18.1 (S&Sv18.1) global solutions) in conjunction with such global DBMs. Fortunately, DBMs of much better accuracy and resolution have recently become available for some regions, such as the DBM of 1 km resolution for many regions of the Mediterranean Sea distributed by IFREMER, the French Research Institute for Exploitation of the Sea.
The scope of this study is to use this latest regional DBM in combination with the newly available DNSC08 and S&Sv18.1 global marine free-air gravity anomaly datasets for marine and near-shore geoid modelling of archipelagic (island) areas. We have concentrated on two test regions: (a) the Catalano-Balearic Sea (south of Spain, in the NW Mediterranean), where adequate marine and land gravity data allow a detailed evaluation of our processing methodologies and their results, and (b) the Aegean Sea, where the presence of many islands at varying distances from mainland Greece, located on the continental shelf and/or divided by steep sea-floor topography, presents some unique challenges for any high resolution geoid modelling effort. For both test regions, we generated a combined DEM (C-DEM) using the IFREMER and SRTM 30 arcsec bathymetric data for the sea areas and SRTM 3 arcsec data for the surrounding land areas. In this contribution, we discuss various computational aspects relating to the so-called "Direct Topographical Effect" (DTE) and the "Indirect Topographical Effect" (ITE), the two most significant topographical effects that have to be evaluated when a precise geoid is being compiled. In addition, we outline the evaluation and the impact of the results obtained, especially with regard to the differences in the geoid models when different elevation data are used, and point out the main limitations and possibilities for further improvements in the use of the aforementioned satellite and terrestrial data for regional and local geoid mapping in coastal and island regions. Keywords: IFREMER, SRTM, terrain effects, free-air gravity anomalies, geoid modelling, Digital Bathymetry Models.
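As a crude scale check on such topographic effects (not the rigorous DTE/ITE evaluation the study performs), the infinite Bouguer slab approximation gives the gravitational attraction of a flat layer of rock or of rock replacing sea water:

```python
import math

# First-order illustration only: attraction of an infinite Bouguer slab
# of thickness h, g = 2*pi*G*rho*h. This is NOT the full direct/indirect
# topographic effect computation, just an order-of-magnitude check.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
RHO_CRUST = 2670.0     # kg m^-3, standard reduction density
RHO_WATER = 1030.0     # kg m^-3, sea water

def bouguer_slab_mgal(h_m, rho=RHO_CRUST):
    return 2.0 * math.pi * G * rho * h_m * 1e5   # m/s^2 -> mGal

print(bouguer_slab_mgal(100.0))                  # ~11.2 mGal for 100 m rock
print(bouguer_slab_mgal(100.0, RHO_CRUST - RHO_WATER))  # rock replacing water
```

The density contrast in the second call (rock minus sea water) is why accurate bathymetry matters for marine geoid work: errors in water depth map directly into the computed terrain effect.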