DOE Office of Scientific and Technical Information (OSTI.GOV)
Sellers, P.J.; Collatz, J.; Koster, R.
1996-09-01
A comprehensive series of global datasets for land-atmosphere models has been collected, formatted to a common grid, and released on a set of CD-ROMs. This paper describes the motivation for and the contents of the dataset. In June of 1992, an interdisciplinary earth science workshop was convened in Columbia, Maryland, to assess progress in land-atmosphere research, specifically in the areas of models, satellite data algorithms, and field experiments. At the workshop, representatives of the land-atmosphere modeling community defined a need for global datasets to prescribe boundary conditions, initialize state variables, and provide near-surface meteorological and radiative forcings for their models. The International Satellite Land Surface Climatology Project (ISLSCP), a part of the Global Energy and Water Cycle Experiment, worked with the Distributed Active Archive Center of the National Aeronautics and Space Administration Goddard Space Flight Center to bring the required datasets together in a usable format. The data have since been released on a collection of CD-ROMs. The datasets on the CD-ROMs are grouped under the following headings: vegetation; hydrology and soils; snow, ice, and oceans; radiation and clouds; and near-surface meteorology. All datasets cover the period 1987-88, and all but a few are spatially continuous over the earth's land surface. All have been mapped to a common 1° x 1° equal-angle grid. The temporal frequency for most of the datasets is monthly. A few of the near-surface meteorological parameters are available both as six-hourly values and as monthly means. 26 refs., 8 figs., 2 tabs.
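As an illustration of the kind of spatial processing described above, the following minimal sketch aggregates a finer equal-angle field onto a 1° x 1° grid by block averaging. It assumes a 0.5° input grid and NaN-flagged missing cells; the function name and resolution are illustrative and this is not the ISLSCP processing chain.

```python
import numpy as np

def block_average_to_1deg(field, res_deg=0.5):
    """Aggregate an equal-angle lat/lon field to a 1 x 1 degree grid by block averaging.

    field   : 2-D array shaped (180/res_deg, 360/res_deg), NaN where missing.
    res_deg : input grid spacing in degrees (must divide 1 evenly).
    """
    factor = int(round(1.0 / res_deg))
    nlat, nlon = field.shape
    assert nlat == factor * 180 and nlon == factor * 360
    blocks = field.reshape(180, factor, 360, factor)
    # nanmean ignores missing cells inside each 1-degree block
    return np.nanmean(blocks, axis=(1, 3))

# Example: a synthetic 0.5-degree field with some gaps
fine = np.random.rand(360, 720)
fine[10:20, 50:60] = np.nan
coarse = block_average_to_1deg(fine)        # shape (180, 360)
print(coarse.shape)
```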
The SeaFlux Turbulent Flux Dataset Version 1.0 Documentation
NASA Technical Reports Server (NTRS)
Clayson, Carol Anne; Roberts, J. Brent; Bogdanoff, Alec S.
2012-01-01
Under the auspices of the World Climate Research Programme (WCRP) Global Energy and Water cycle EXperiment (GEWEX) Data and Assessment Panel (GDAP), the SeaFlux Project was created to investigate producing a high-resolution satellite-based dataset of surface turbulent fluxes over the global oceans. The most current release of the SeaFlux product is Version 1.0; this represents the initial release of turbulent surface heat fluxes and associated near-surface variables, including a diurnally varying sea surface temperature.
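The abstract does not spell out how the turbulent fluxes are computed; the sketch below shows a generic bulk aerodynamic estimate of sensible and latent heat fluxes from near-surface variables, with constant exchange coefficients assumed purely for illustration. It is not the SeaFlux algorithm; operational products use stability-dependent transfer coefficients.

```python
import numpy as np

# Generic bulk aerodynamic estimates of surface turbulent heat fluxes.
# Constant exchange coefficients are used for illustration only.
RHO_AIR = 1.22      # air density, kg m-3
CP_AIR = 1004.0     # specific heat of air, J kg-1 K-1
LV = 2.5e6          # latent heat of vaporization, J kg-1
CH = 1.1e-3         # bulk transfer coefficient for heat (assumed constant)
CE = 1.2e-3         # bulk transfer coefficient for moisture (assumed constant)

def sensible_heat_flux(wind_speed, t_sea, t_air):
    """Sensible heat flux (W m-2), positive upward (ocean to atmosphere)."""
    return RHO_AIR * CP_AIR * CH * wind_speed * (t_sea - t_air)

def latent_heat_flux(wind_speed, q_sea, q_air):
    """Latent heat flux (W m-2) from sea-surface and air specific humidities (kg kg-1)."""
    return RHO_AIR * LV * CE * wind_speed * (q_sea - q_air)

print(sensible_heat_flux(8.0, 300.0, 298.5))   # roughly 16 W m-2
print(latent_heat_flux(8.0, 0.022, 0.016))     # roughly 176 W m-2
```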
Automatic initialization for 3D bone registration
NASA Astrophysics Data System (ADS)
Foroughi, Pezhman; Taylor, Russell H.; Fichtinger, Gabor
2008-03-01
In image-guided bone surgery, sample points collected from the surface of the bone are registered to the preoperative CT model using well-known registration methods such as Iterative Closest Point (ICP). These techniques are generally very sensitive to the initial alignment of the datasets. Poor initialization significantly increases the chances of getting trapped in local minima. In order to reduce the risk of local minima, the registration is manually initialized by locating the sample points close to the corresponding points on the CT model. In this paper, we present an automatic initialization method that aligns the sample points collected from the surface of the pelvis with the CT model of the pelvis. The main idea is to exploit a mean shape of the pelvis, created from a large number of CT scans, as the prior knowledge to guide the initial alignment. The mean shape is constant for all registrations and facilitates the inclusion of application-specific information into the registration process. The CT model is first aligned with the mean shape using the bilateral symmetry of the pelvis and the similarity of multiple projections. The surface points collected using ultrasound are then aligned with the pelvis mean shape. This will, in turn, lead to initial alignment of the sample points with the CT model. The experiments using a dry pelvis and two cadavers show that the method can align the randomly dislocated datasets close enough for successful registration. The standard ICP has been used for final registration of the datasets.
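For readers unfamiliar with the registration step referenced above, here is a minimal point-to-point ICP sketch (nearest-neighbour matching plus a Kabsch least-squares rigid fit). It illustrates why ICP needs a reasonable starting alignment; it does not reproduce the paper's mean-shape-based initialization, and the toy point clouds are synthetic.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src -> dst (Kabsch/Procrustes)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    h = (src - src_c).T @ (dst - dst_c)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T       # proper rotation (no reflection)
    return r, dst_c - r @ src_c

def icp(source, target, n_iter=30):
    """Minimal point-to-point ICP aligning `source` (N,3) to `target` (M,3)."""
    tree = cKDTree(target)
    current = source.copy()
    for _ in range(n_iter):
        _, idx = tree.query(current)              # closest target point for each source point
        r, t = best_rigid_transform(current, target[idx])
        current = current @ r.T + t
    return current

# Toy usage: recover a known small rotation and translation
rng = np.random.default_rng(1)
target = rng.normal(size=(500, 3))
angle = np.deg2rad(5.0)
rz = np.array([[np.cos(angle), -np.sin(angle), 0.0],
               [np.sin(angle),  np.cos(angle), 0.0],
               [0.0, 0.0, 1.0]])
source = target @ rz.T + np.array([0.2, -0.1, 0.05])
aligned = icp(source, target)
print(np.abs(aligned - target).mean())            # small residual for this mild misalignment
```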
NASA Technical Reports Server (NTRS)
Jonathan L. Case; Kumar, Sujay V.; Srikishen, Jayanthi; Jedlovec, Gary J.
2010-01-01
One of the most challenging weather forecast problems in the southeastern U.S. is daily summertime pulse-type convection. During the summer, atmospheric flow and forcing are generally weak in this region; thus, convection typically initiates in response to local forcing along sea/lake breezes and other discontinuities, often related to horizontal gradients in surface heating rates. Numerical simulations of pulse convection usually have low skill, even in local predictions at high resolution, due to the inherent chaotic nature of these precipitation systems. Forecast errors can arise from assumptions within parameterization schemes, model resolution limitations, and uncertainties in both the initial state of the atmosphere and land surface variables such as soil moisture and temperature. For this study, it is hypothesized that high-resolution, consistent representations of surface properties such as soil moisture, soil temperature, and sea surface temperature (SST) are necessary to better simulate the interactions between the surface and atmosphere, and ultimately improve predictions of summertime pulse convection. This paper describes a sensitivity experiment using the Weather Research and Forecasting (WRF) model. Interpolated land and ocean surface fields from a large-scale model are replaced with high-resolution datasets provided by unique NASA assets in an experimental simulation: the Land Information System (LIS) and Moderate Resolution Imaging Spectroradiometer (MODIS) SSTs. The LIS is run in an offline mode for several years at the same grid resolution as the WRF model to provide compatible land surface initial conditions in an equilibrium state. The MODIS SSTs provide detailed analyses of SSTs over the oceans and large lakes compared to current operational products. The WRF model runs initialized with the LIS+MODIS datasets result in a reduction in the overprediction of rainfall areas; however, the skill is nearly as low in both experiments when using traditional verification methodologies. Output from object-based verification within NCAR's Model Evaluation Tools reveals that the WRF runs initialized with LIS+MODIS data consistently generated precipitation objects that better matched observed precipitation objects, especially at higher precipitation intensities. The LIS+MODIS runs produced on average a 4% increase in matched precipitation areas and a simultaneous 4% decrease in unmatched areas during three months of daily simulations.
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; LaCasse, Katherine M.; Santanello, Joseph A., Jr.; Lapenta, William M.; Peters-Lidard, Christa D.
2007-01-01
The exchange of energy and moisture between the Earth's surface and the atmospheric boundary layer plays a critical role in many hydrometeorological processes. Accurate and high-resolution representations of surface properties such as sea-surface temperature (SST), vegetation, soil temperature and moisture content, and ground fluxes are necessary to better understand the Earth-atmosphere interactions and improve numerical predictions of weather and climate phenomena. The NASA/NWS Short-term Prediction Research and Transition (SPORT) Center is currently investigating the potential benefits of assimilating high-resolution datasets derived from the NASA moderate resolution imaging spectroradiometer (MODIS) instruments using the Weather Research and Forecasting (WRF) model and the Goddard Space Flight Center Land Information System (LIS). The LIS is a software framework that integrates satellite and ground-based observational and modeled data along with multiple land surface models (LSMs) and advanced computing tools to accurately characterize land surface states and fluxes. The LIS can be run uncoupled to provide a high-resolution land surface initial condition, and can also be run in a coupled mode with WRF to integrate surface and soil quantities using any of the LSMs available in LIS. The LIS also includes the ability to optimize the initialization of surface and soil variables by tuning the spin-up time period and atmospheric forcing parameters, which cannot be done in the standard WRF. Among the datasets available from MODIS, a leaf-area index field and composite SST analysis are used to improve the lower boundary and initial conditions to the LIS/WRF coupled model over both land and water. Experiments will be conducted to measure the potential benefits from using the coupled LIS/WRF model over the Florida peninsula during May 2004. This month experienced relatively benign weather conditions, which will allow the experiments to focus on the local and mesoscale impacts of the high-resolution MODIS datasets and optimized soil and surface initial conditions. Follow-on experiments will examine the utility of such an optimized WRF configuration for more complex weather scenarios such as convective initiation. This paper will provide an overview of the experiment design and present preliminary results from selected cases in May 2004.
Recent Upgrades to NASA SPoRT Initialization Datasets for the Environmental Modeling System
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Lafontaine, Frank J.; Molthan, Andrew L.; Zavodsky, Bradley T.; Rozumalski, Robert A.
2012-01-01
The NASA Short-term Prediction Research and Transition (SPoRT) Center has developed several products for its NOAA/National Weather Service (NWS) partners that can initialize specific fields for local model runs within the NOAA/NWS Science and Training Resource Center Environmental Modeling System (EMS). The suite of SPoRT products for use in the EMS consists of a Sea Surface Temperature (SST) composite that includes a Lake Surface Temperature (LST) analysis over the Great Lakes, a Great Lakes sea-ice extent within the SST composite, a real-time Green Vegetation Fraction (GVF) composite, and NASA Land Information System (LIS) gridded output. This paper and companion poster describe each dataset and provide recent upgrades made to the SST, Great Lakes LST, GVF composites, and the real-time LIS runs.
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Kumar, Sujay V.; Kuligowski, Robert J.; Langston, Carrie
2013-01-01
This paper and poster presented a description of the current real-time SPoRT-LIS run over the southeastern CONUS, which provides high-resolution land surface initialization grids for local numerical model forecasts at NWS forecast offices. The LIS hourly output also offers a supplemental dataset to aid in situational awareness for convective initiation forecasts, assessing flood potential, and monitoring drought at fine scales. It is a goal of SPoRT and several NWS forecast offices to expand the LIS to an entire CONUS domain, so that LIS output can be utilized by NWS Western Region offices, among others. To make this expansion cleanly, so as to provide high-quality land surface output, SPoRT tested new precipitation datasets in LIS as alternative forcing datasets to the current radar+gauge Stage IV product. The NMQ product showed precipitation and soil moisture patterns comparable to the Stage IV product, but suffered from radar gaps in the intermountain West and incorrectly set values to zero instead of missing in the data-void regions of Mexico and Canada. The other dataset tested was the next-generation GOES-R QPE algorithm, which exhibited a high bias in both coverage and intensity of accumulated precipitation relative to the control (NLDAS-2), Stage IV, and NMQ simulations; the resulting root zone soil moisture was substantially higher in most areas.
Comparison of crown fire modeling systems used in three fire management applications
Joe H. Scott
2006-01-01
The relative behavior of surface-crown fire spread rate modeling systems used in three fire management applications-CFIS (Crown Fire Initiation and Spread), FlamMap and NEXUS- is compared using fire environment characteristics derived from a dataset of destructively measured canopy fuel and associated stand characteristics. Although the surface-crown modeling systems...
James, Eric P.; Benjamin, Stanley G.; Marquis, Melinda
2016-10-28
A new gridded dataset for wind and solar resource estimation over the contiguous United States has been derived from hourly updated 1-h forecasts from the National Oceanic and Atmospheric Administration High-Resolution Rapid Refresh (HRRR) 3-km model composited over a three-year period (approximately 22 000 forecast model runs). The unique dataset features hourly data assimilation, and provides physically consistent wind and solar estimates for the renewable energy industry. The wind resource dataset shows strong similarity to that previously provided by a Department of Energy-funded study, and it includes estimates in southern Canada and northern Mexico. The solar resource dataset represents an initial step towards application-specific fields such as global horizontal and direct normal irradiance. This combined dataset will continue to be augmented with new forecast data from the advanced HRRR atmospheric/land-surface model.
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.
2016-01-01
Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to represent accurately land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated into the KMD-WRF runs, using the product generated by NOAA/NESDIS. Model verification capabilities are also being transitioned to KMD using NCAR's Model Evaluation Tools (MET; Brown et al. 2009) software in conjunction with a SPoRT-developed scripting package, in order to quantify and compare errors in simulated temperature, moisture and precipitation in the experimental WRF model simulations. This extended abstract and accompanying presentation summarize the efforts and training done to date to support this unique regional modeling initiative at KMD. To honor the memory of Dr. Peter J. Lamb and his extensive efforts in bolstering weather and climate science and capacity-building in Africa, we offer this contribution to the special Peter J. Lamb symposium. The remainder of this extended abstract is organized as follows. The collaborating international organizations involved in the project are presented in Section 2. Background information on the unique land surface input datasets is presented in Section 3. The hands-on training sessions from March 2014 and June 2015 are described in Section 4. Sample experimental WRF output and verification from the June 2015 training are given in Section 5. A summary is given in Section 6, followed by Acknowledgements and References.
Osuri, K. K.; Nadimpalli, R.; Mohanty, U. C.; Chen, F.; Rajeevan, M.; Niyogi, D.
2017-01-01
The hypothesis that realistic land conditions such as soil moisture/soil temperature (SM/ST) can significantly improve the modeling of mesoscale deep convection is tested over the Indian monsoon region (IMR). A high resolution (3 km footprint) SM/ST dataset prepared from a land data assimilation system, as part of a national monsoon mission project, showed close agreement with observations. Experiments are conducted with (LDAS) and without (CNTL) initialization from the SM/ST dataset. Results highlight the significance of realistic land surface conditions for numerical prediction of the initiation, movement and timing of severe thunderstorms, as compared to the climatological fields used for initialization in the CNTL run. Realistic land conditions improved mass flux, convective updrafts and diabatic heating in the boundary layer, which contributed to low-level positive potential vorticity. The LDAS run reproduced reflectivity echoes and associated rainfall bands more efficiently. Improper representation of surface conditions in the CNTL run limits the evolution of boundary layer processes and thereby fails to simulate convection at the right time and place. These findings provide strong support for the role land conditions play in deep convection over the IMR. They also have direct implications for improving heavy rain forecasting over the IMR by developing realistic land conditions. PMID:28128293
A novel orthoimage mosaic method using the weighted A* algorithm for UAV imagery
NASA Astrophysics Data System (ADS)
Zheng, Maoteng; Zhou, Shunping; Xiong, Xiaodong; Zhu, Junfeng
2017-12-01
A weighted A* algorithm is proposed to select optimal seam-lines in orthoimage mosaicking for UAV (Unmanned Aircraft Vehicle) imagery. The whole workflow includes four steps: the initial seam-line network is first generated by the standard Voronoi diagram algorithm; an edge diagram is then detected based on DSM (Digital Surface Model) data; the vertices (conjunction nodes) of the initial network are relocated, since some of them lie on high objects (buildings, trees and other artificial structures); and the initial seam-lines are finally refined using the weighted A* algorithm based on the edge diagram and the relocated vertices. The method was tested with two real UAV datasets. Preliminary results show that the proposed method produces acceptable mosaic images in both urban and mountainous areas, and is better than the results of state-of-the-art methods on these datasets.
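The following is a minimal weighted A* sketch on a 2-D cost grid, where the heuristic is inflated by a weight w >= 1 to bias the search toward the goal. The per-cell costs stand in for the paper's DSM/edge-diagram penalties; the grid, weight value and cost layout are illustrative assumptions, not the authors' implementation.

```python
import heapq
import numpy as np

def weighted_astar(cost, start, goal, w=1.5):
    """Weighted A* on a 2-D cost grid (4-connected).

    `cost[r, c]` is the per-cell traversal penalty (a stand-in for the paper's
    edge-diagram weights). f = g + w*h with w >= 1 trades optimality for speed.
    Returns the path as a list of (row, col) cells.
    """
    rows, cols = cost.shape
    def h(node):                                   # Manhattan-distance heuristic
        return abs(node[0] - goal[0]) + abs(node[1] - goal[1])
    open_heap = [(w * h(start), 0.0, start, None)]
    came_from, g_best = {}, {start: 0.0}
    while open_heap:
        _, g, node, parent = heapq.heappop(open_heap)
        if node in came_from:
            continue
        came_from[node] = parent
        if node == goal:
            break
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                ng = g + cost[nr, nc]
                if ng < g_best.get((nr, nc), np.inf):
                    g_best[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + w * h((nr, nc)), ng, (nr, nc), node))
    path, node = [], goal
    while node is not None:                        # reconstruct the seam-line path
        path.append(node)
        node = came_from[node]
    return path[::-1]

# Toy usage: route a seam-line across a grid containing a high-cost block
cost = np.ones((50, 50))
cost[20:30, 10:40] = 50.0                          # e.g. a building roof to be avoided
path = weighted_astar(cost, (0, 0), (49, 49), w=2.0)
print(len(path), path[0], path[-1])
```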
NASA SPoRT Initialization Datasets for Local Model Runs in the Environmental Modeling System
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; LaFontaine, Frank J.; Molthan, Andrew L.; Carcione, Brian; Wood, Lance; Maloney, Joseph; Estupinan, Jeral; Medlin, Jeffrey M.; Blottman, Peter; Rozumalski, Robert A.
2011-01-01
The NASA Short-term Prediction Research and Transition (SPoRT) Center has developed several products for its National Weather Service (NWS) partners that can be used to initialize local model runs within the Weather Research and Forecasting (WRF) Environmental Modeling System (EMS). These real-time datasets consist of surface-based information updated at least once per day, and produced in a composite or gridded product that is easily incorporated into the WRF EMS. The primary goal for making these NASA datasets available to the WRF EMS community is to provide timely and high-quality information at a spatial resolution comparable to that used in the local model configurations (i.e., convection-allowing scales). The current suite of SPoRT products supported in the WRF EMS includes a Sea Surface Temperature (SST) composite, a Great Lakes sea-ice extent, a Greenness Vegetation Fraction (GVF) composite, and Land Information System (LIS) gridded output. The SPoRT SST composite is a blend of primarily the Moderate Resolution Imaging Spectroradiometer (MODIS) infrared and Advanced Microwave Scanning Radiometer for Earth Observing System data for non-precipitation coverage over the oceans at 2-km resolution. The composite includes a special lake surface temperature analysis over the Great Lakes using contributions from the Remote Sensing Systems temperature data. The Great Lakes Environmental Research Laboratory Ice Percentage product is used to create a sea-ice mask in the SPoRT SST composite. The sea-ice mask is produced daily (in-season) at 1.8-km resolution and identifies ice percentage from 0 to 100% in 10% increments, with values above 90% flagged as ice.
NASA Astrophysics Data System (ADS)
Andersen, O. B.; Passaro, M.; Benveniste, J.; Piccioni, G.
2016-12-01
A new initiative within the ESA Sea Level Climate Change Initiative (SL-cci) framework to improve the Arctic sea level record has been started as a combined effort to reprocess and retrack past altimetry to create a 25-year combined sea level record for sea level research studies. One of the objectives is to retrack the ERS-2 dataset for the high latitudes based on the ALES retracking algorithm, by adapting the ALES retracker for retracking of specular surfaces (leads). Secondly, a reprocessing using editing tailored to Arctic conditions will be carried out, also focusing on the merging of multi-mission data. Finally, an effort is made to combine physical and empirical retracked sea surface height information to derive an experimental spatio-temporally enhanced sea level product for high latitudes. The first results in analysing Arctic sea level variations on annual and inter-annual scales for 1992-2015, from a preliminary version of this dataset, are presented. By including GRACE water storage estimates and NOAA halosteric and thermosteric sea level variations since 2002, a preliminary attempt to close the Arctic sea level budget is presented here. Closing the Arctic sea level budget is by no means trivial, as both steric data and satellite altimetry are temporally sparse and geographically limited.
Assimilation of Cloud Information in Numerical Weather Prediction Model in Southwest China
NASA Astrophysics Data System (ADS)
HENG, Z.
2016-12-01
Based on the ARPS Data Analysis System (ADAS) and the Weather Research and Forecasting (WRF) model, simulation experiments from July 1st 2015 to August 1st 2015 are conducted in the region of Southwest China. In the assimilation experiment (EXP), datasets from surface observations are assimilated, and cloud information retrieved from weather Doppler radar and the Fengyun-2E (FY-2E) geostationary satellite is used via the complex cloud analysis scheme in the ADAS to insert microphysical variables and adjust the humidity structure in the initial condition. As a control run (CTL), datasets from surface observations are assimilated, but no cloud information is used in the ADAS. The simulation result of a rainstorm caused by the Southwest Vortex during 14-15 July 2015 shows that the EXP run has a better capability in representing the shape and intensity of precipitation, especially the center of the rainstorm. The one-month inter-comparison of the initial and prediction results between the EXP and CTL runs revealed that the EXP runs present a more reasonable representation of rain and score higher in rain prediction. Keywords: NWP, rainstorm, data assimilation
Multiplatform observations enabling albedo retrievals with high temporal resolution
NASA Astrophysics Data System (ADS)
Riihelä, Aku; Manninen, Terhikki; Key, Jeffrey; Sun, Qingsong; Sütterlin, Melanie; Lattanzio, Alessio; Schaaf, Crystal
2017-04-01
In this paper we show that combining observations from different polar orbiting satellite families (such as AVHRR and MODIS) is physically justifiable and technically feasible. Our proposed approach will lead to surface albedo retrievals at higher temporal resolution than the state of the art, with comparable or better accuracy. This study is carried out in the World Meteorological Organization (WMO) Sustained and coordinated processing of Environmental Satellite data for Climate Monitoring (SCOPE-CM) project SCM-02 (http://www.scope-cm.org/projects/scm-02/). Following a spectral homogenization of the Top-of-Atmosphere reflectances of bands 1 & 2 from AVHRR and MODIS, both observation datasets are atmospherically corrected with a coherent atmospheric profile and algorithm. The resulting surface reflectances are then fed into an inversion of the RossThick-LiSparse-Reciprocal surface bidirectional reflectance distribution function (BRDF) model. The results of the inversion (BRDF kernels) may then be integrated to estimate various surface albedo quantities. A key principle here is that the larger number of valid surface observations from multiple satellites allows us to invert the BRDF coefficients within a shorter time span, enabling the monitoring of relatively rapid surface phenomena such as snowmelt. The proposed multiplatform approach is expected to bring benefits in particular to the observation of the albedo of the polar regions, where persistent cloudiness and long atmospheric path lengths present challenges to satellite-based retrievals. Following a similar logic, retrievals over tropical regions with high cloudiness should also benefit from the method. We present results from a demonstrator dataset of a global combined AVHRR-GAC and MODIS dataset covering the year 2010. The retrieved surface albedo is compared against quality-monitored in situ albedo observations from the Baseline Surface Radiation Network (BSRN). Additionally, the combined retrieval dataset is compared against MODIS C6 albedo/BRDF datasets to assess the quality of the multiplatform approach against the current state of the art. This approach is not limited to AVHRR and MODIS observations. Provided that the spectral homogenization produces an acceptably good match, any instrument observing the Earth's surface in the visible and near-infrared wavelengths could, in principle, be included to further enhance the temporal resolution and accuracy of the retrievals. The SCOPE-CM initiative provides a potential framework for such expansion in the future.
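As a sketch of the kernel inversion step described above, the code below fits the linear kernel-driven BRDF model R = f_iso + f_vol*K_vol + f_geo*K_geo to multi-angular reflectances by least squares and integrates it to a white-sky albedo. The kernel values are synthetic placeholders (in practice they are computed from sun/view geometry), and the albedo weights follow the commonly published RossThick/LiSparse-Reciprocal integration constants; this is not the SCOPE-CM processing code.

```python
import numpy as np

def invert_brdf_kernels(reflectance, k_vol, k_geo):
    """Least-squares fit of the linear kernel BRDF model
        R = f_iso + f_vol * K_vol + f_geo * K_geo
    from N multi-angular observations. Returns (f_iso, f_vol, f_geo)."""
    design = np.column_stack([np.ones_like(k_vol), k_vol, k_geo])
    params, *_ = np.linalg.lstsq(design, reflectance, rcond=None)
    return params

def white_sky_albedo(f_iso, f_vol, f_geo, c_vol=0.189184, c_geo=-1.377622):
    """Bihemispherical (white-sky) albedo as a weighted sum of the kernel
    parameters; the integration constants follow the commonly published
    values for the RossThick and LiSparse-Reciprocal kernels."""
    return f_iso + c_vol * f_vol + c_geo * f_geo

# Toy usage with synthetic kernel values and noisy reflectances
rng = np.random.default_rng(2)
k_vol = rng.uniform(-0.3, 0.6, size=25)
k_geo = rng.uniform(-1.5, 0.0, size=25)
true = np.array([0.25, 0.10, 0.05])            # f_iso, f_vol, f_geo
refl = true[0] + true[1] * k_vol + true[2] * k_geo + rng.normal(0, 0.005, 25)
f_iso, f_vol, f_geo = invert_brdf_kernels(refl, k_vol, k_geo)
print(f_iso, f_vol, f_geo, white_sky_albedo(f_iso, f_vol, f_geo))
```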
Simultaneous segmentation of the bone and cartilage surfaces of a knee joint in 3D
NASA Astrophysics Data System (ADS)
Yin, Y.; Zhang, X.; Anderson, D. D.; Brown, T. D.; Hofwegen, C. Van; Sonka, M.
2009-02-01
We present a novel framework for the simultaneous segmentation of multiple interacting surfaces belonging to multiple mutually interacting objects. The method is a non-trivial extension of our previously reported optimal multi-surface segmentation. Considering an example application of knee-cartilage segmentation, the framework consists of the following main steps: 1) Shape model construction: Building a mean shape for each bone of the joint (femur, tibia, patella) from interactively segmented volumetric datasets. Using the resulting mean-shape model - identification of cartilage, non-cartilage, and transition areas on the mean-shape bone model surfaces. 2) Presegmentation: Employment of iterative optimal surface detection method to achieve approximate segmentation of individual bone surfaces. 3) Cross-object surface mapping: Detection of inter-bone equidistant separating sheets to help identify corresponding vertex pairs for all interacting surfaces. 4) Multi-object, multi-surface graph construction and final segmentation: Construction of a single multi-bone, multi-surface graph so that two surfaces (bone and cartilage) with zero and non-zero intervening distances can be detected for each bone of the joint, according to whether or not cartilage can be locally absent or present on the bone. To define inter-object relationships, corresponding vertex pairs identified using the separating sheets were interlinked in the graph. The graph optimization algorithm acted on the entire multiobject, multi-surface graph to yield a globally optimal solution. The segmentation framework was tested on 16 MR-DESS knee-joint datasets from the Osteoarthritis Initiative database. The average signed surface positioning error for the 6 detected surfaces ranged from 0.00 to 0.12 mm. When independently initialized, the signed reproducibility error of bone and cartilage segmentation ranged from 0.00 to 0.26 mm. The results showed that this framework provides robust, accurate, and reproducible segmentation of the knee joint bone and cartilage surfaces of the femur, tibia, and patella. As a general segmentation tool, the developed framework can be applied to a broad range of multi-object segmentation problems.
A novel orthoimage mosaic method using a weighted A∗ algorithm - Implementation and evaluation
NASA Astrophysics Data System (ADS)
Zheng, Maoteng; Xiong, Xiaodong; Zhu, Junfeng
2018-04-01
The implementation and evaluation of a weighted A∗ algorithm for orthoimage mosaic with UAV (Unmanned Aircraft Vehicle) imagery is proposed. The initial seam-line network is firstly generated by standard Voronoi Diagram algorithm; an edge diagram is generated based on DSM (Digital Surface Model) data; the vertices (conjunction nodes of seam-lines) of the initial network are relocated if they are on high objects (buildings, trees and other artificial structures); and the initial seam-lines are refined using the weighted A∗ algorithm based on the edge diagram and the relocated vertices. Our method was tested with three real UAV datasets. Two quantitative terms are introduced to evaluate the results of the proposed method. Preliminary results show that the method is suitable for regular and irregular aligned UAV images for most terrain types (flat or mountainous areas), and is better than the state-of-the-art method in both quality and efficiency based on the test datasets.
NASA Astrophysics Data System (ADS)
Merchant, C. J.; Hulley, G. C.
2013-12-01
There are many datasets describing the evolution of global sea surface temperature (SST) over recent decades -- so why make another one? Answer: to provide observations of SST that have particular qualities relevant to climate applications: independence, accuracy and stability. This has been done within the European Space Agency (ESA) Climate Change Initiative (CCI) project on SST. Independence refers to the fact that the new SST CCI dataset is not derived from or tuned to in situ observations. This matters for climate because the in situ observing network used to assess marine climate change (1) was not designed to monitor small changes over decadal timescales, and (2) has evolved significantly in its technology and mix of types of observation, even during the past 40 years. The potential for significant artefacts in our picture of global ocean surface warming is clear. Only by having an independent record can we confirm (or refute) that the work done to remove biases/trend artefacts in in-situ datasets has been successful. Accuracy is the degree to which SSTs are unbiased. For climate applications, a common accuracy target is 0.1 K for all regions of the ocean. Stability is the degree to which the bias, if any, in a dataset is constant over time. Long-term instability introduces trend artefacts. To observe trends of the magnitude of 'global warming', SST datasets need to be stable to <5 mK/year. The SST CCI project has produced a satellite-based dataset that addresses these characteristics relevant to climate applications. Satellite radiances (brightness temperatures) have been harmonised exploiting periods of overlapping observations between sensors. Less well-characterised sensors have had their calibration tuned to that of better characterised sensors (at radiance level). Non-conventional retrieval methods (optimal estimation) have been employed to reduce regional biases to the 0.1 K level, a target violated in most satellite SST datasets. Models for quantifying uncertainty have been developed to attach uncertainty to SST across a range of space-time scales. The stability of the data has been validated.
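The abstract mentions optimal estimation retrieval; the sketch below shows a single linear optimal-estimation (MAP) update under an assumed linear forward model y = Kx, with synthetic Jacobian, prior and error covariances. It is a textbook illustration of the technique, not the SST CCI retrieval scheme.

```python
import numpy as np

def linear_optimal_estimation(y, k, x_a, s_a, s_e):
    """One linear optimal-estimation (MAP) retrieval step:
        x_hat = x_a + (K^T S_e^-1 K + S_a^-1)^-1 K^T S_e^-1 (y - K x_a)
    y   : observation vector (e.g. brightness temperatures)
    k   : Jacobian of the forward model
    x_a : prior state;  s_a : prior covariance;  s_e : observation-error covariance
    Returns the retrieved state and its posterior covariance."""
    s_e_inv = np.linalg.inv(s_e)
    s_post = np.linalg.inv(k.T @ s_e_inv @ k + np.linalg.inv(s_a))
    x_hat = x_a + s_post @ k.T @ s_e_inv @ (y - k @ x_a)
    return x_hat, s_post

# Toy usage: retrieve a 2-element state from 3 synthetic channels
k = np.array([[1.0, 0.5], [0.8, 1.2], [0.3, 0.9]])
x_true = np.array([1.5, -0.5])
y = k @ x_true + np.array([0.05, -0.02, 0.01])
x_a = np.zeros(2)
s_a = np.eye(2) * 4.0
s_e = np.eye(3) * 0.01
x_hat, s_post = linear_optimal_estimation(y, k, x_a, s_a, s_e)
print(x_hat)          # close to x_true given the small observation errors
```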
First global WCRP shortwave surface radiation budget dataset
NASA Technical Reports Server (NTRS)
Whitlock, C. H.; Charlock, T. P.; Staylor, W. F.; Pinker, R. T.; Laszlo, I.; Ohmura, A.; Gilgen, H.; Konzelman, T.; DiPasquale, R. C.; Moats, C. D.
1995-01-01
Shortwave radiative fluxes that reach the Earth's surface are key factors that influence atmospheric and oceanic circulations as well as surface climate. Yet, information on these fluxes is meager. Surface site data are generally available from only a limited number of observing stations over land. Much less is known about the large-scale variability of the shortwave radiative fluxes over the oceans, which cover most of the globe. Recognizing the need to produce global-scale fields of such fluxes for use in climate research, the World Climate Research Program has initiated activities that led to the establishment of the Surface Radiation Budget Climatology Project with the ultimate goal to determine various components of the surface radiation budget from satellite data. In this paper, the first global products that resulted from this activity are described. Monthly and daily data on a 280-km grid scale are available. Samples of climate parameters obtainable from the dataset are presented. Emphasis is given to validation and limitations of the results. For most of the globe, satellite estimates have bias values between +/- 20 W/sq m and rms values are around 25 W/sq m. There are specific regions with much larger uncertainties however.
Drilling informatics: data-driven challenges of scientific drilling
NASA Astrophysics Data System (ADS)
Yamada, Yasuhiro; Kyaw, Moe; Saito, Sanny
2017-04-01
The primary aim of scientific drilling is to precisely understand the dynamic nature of the Earth. This is the reason why we investigate the subsurface materials (rock and fluid, including the microbial community) existing under particular environmental conditions. This requires sample collection and analytical data production from the samples, and in-situ data measurement at boreholes. Currently available data come from cores, cuttings, mud logging, geophysical logging, and exploration geophysics, but these datasets are difficult to integrate because of their different kinds and scales. We are now producing more useful datasets to fill the gap between the existing data, extracting more information from such datasets, and finally integrating the information. In particular, drilling parameters are very useful datasets as geomechanical properties. We believe such an approach, 'drilling informatics', would be the most appropriate to obtain a comprehensive and dynamic picture of our scientific targets, such as the seismogenic fault zone and the Moho discontinuity surface. This presentation introduces our initiative and current achievements in drilling informatics.
NASA Astrophysics Data System (ADS)
Karlsson, K.
2010-12-01
The EUMETSAT CMSAF project (www.cmsaf.eu) compiles climatological datasets from various satellite sources with emphasis on the use of EUMETSAT-operated satellites. However, since climate monitoring primarily has a global scope, datasets merging data from various satellites and satellite operators are also prepared. One such dataset is the CMSAF historic GAC (Global Area Coverage) dataset, which is based on AVHRR data from the full historic series of NOAA satellites and the European METOP satellite in mid-morning orbit launched in October 2006. The CMSAF GAC dataset consists of three groups of products: macroscopic cloud products (cloud amount, cloud type and cloud top), cloud physical products (cloud phase, cloud optical thickness and cloud liquid water path) and surface radiation products (including surface albedo). Results will be presented and discussed for all product groups, including some preliminary inter-comparisons with other datasets (e.g., PATMOS-X, MODIS and CloudSat/CALIPSO datasets). A background will also be given describing the basic methodology behind the derivation of all products. This will include a short historical review of AVHRR cloud processing and resulting AVHRR applications at SMHI. Historic GAC processing is one of five pilot projects selected by the SCOPE-CM (Sustained Co-Ordinated Processing of Environmental Satellite data for Climate Monitoring) project organised by the WMO Space Programme. The pilot project is carried out jointly between CMSAF and NOAA with the purpose of finding an optimal GAC processing approach. The initial activity is to inter-compare results of the CMSAF GAC dataset and the NOAA PATMOS-X dataset for the case when both datasets have been derived using the same inter-calibrated AVHRR radiance dataset. The aim is to gain further knowledge of, e.g., the most useful multispectral methods and the impact of ancillary datasets (for example from meteorological reanalysis datasets from NCEP and ECMWF). The CMSAF project is currently defining plans for another five years (2012-2017) of operations and development. New GAC reprocessing efforts are planned and new methodologies will be tested. Central questions here will be how to increase the quantitative use of the products through improved error and uncertainty estimates, and how to compile the information in a way that allows meaningful and efficient use of the data for, e.g., validation of climate model information.
NASA Astrophysics Data System (ADS)
Blyverket, J.; Hamer, P.; Bertino, L.; Lahoz, W. A.
2017-12-01
The European Space Agency Climate Change Initiative for soil moisture (ESA CCI SM) was initiated in 2012 for a period of six years; the objective for this period was to produce the most complete and consistent global soil moisture data record based on both active and passive sensors. The ESA CCI SM products consist of three surface soil moisture datasets: the ACTIVE product and the PASSIVE product were created by fusing scatterometer and radiometer soil moisture data, respectively, while the COMBINED product is a blended product based on the former two datasets. In this study we assimilate globally both the ACTIVE and the PASSIVE product at a 25 km spatial resolution. The different satellite platforms have different overpass times; an observation is mapped to the hours 00.00, 06.00, 12.00 or 18.00 if it falls within a 3 hour window centred at these times. We use the SURFEX land surface model with the ISBA diffusion scheme for the soil hydrology. For the assimilation routine we apply the Ensemble Transform Kalman Filter (ETKF). The land surface model is driven by perturbed MERRA-2 atmospheric forcing data, which has a temporal resolution of one hour and is mapped to the SURFEX model grid. Bias between the land surface model and the ESA CCI product is removed by cumulative distribution function (CDF) matching. This work is a step towards creating a global root zone soil moisture product from the most comprehensive satellite surface soil moisture product available. As a first step we consider the period from 2010 to 2016. This allows for comparison against other global root zone soil moisture products (SMAP Level 4, which is independent of the ESA CCI SM product).
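CDF matching, mentioned above as the bias-removal step, can be sketched as empirical quantile mapping of the satellite record onto the model climatology. The sample data below are synthetic and the function is a simplified stand-in for the operator used in the study.

```python
import numpy as np

def cdf_match(satellite, model):
    """Rescale `satellite` values onto the distribution of `model` values
    by empirical quantile (CDF) matching. Both inputs are 1-D samples
    from the overlapping period; NaNs should be removed beforehand."""
    sat_sorted = np.sort(satellite)
    mod_quantiles = np.quantile(model, np.linspace(0.0, 1.0, sat_sorted.size))
    # Map each satellite value to the model value at the same quantile
    return np.interp(satellite, sat_sorted, mod_quantiles)

# Toy usage: a biased, rescaled "satellite" record matched to the "model" climatology
rng = np.random.default_rng(3)
model = rng.beta(2.0, 5.0, size=2000) * 0.45           # volumetric soil moisture, m3/m3
satellite = 0.10 + 0.5 * rng.beta(2.0, 5.0, size=2000)
matched = cdf_match(satellite, model)
print(satellite.mean(), model.mean(), matched.mean())  # matched mean ~ model mean
```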
Using ERA-Interim reanalysis for creating datasets of energy-relevant climate variables
NASA Astrophysics Data System (ADS)
Jones, Philip D.; Harpham, Colin; Troccoli, Alberto; Gschwind, Benoit; Ranchin, Thierry; Wald, Lucien; Goodess, Clare M.; Dorling, Stephen
2017-07-01
The construction of a bias-adjusted dataset of climate variables at the near surface using ERA-Interim reanalysis is presented. A number of different, variable-dependent, bias-adjustment approaches have been proposed. Here we modify the parameters of different distributions (depending on the variable), adjusting ERA-Interim based on gridded station or direct station observations. The variables are air temperature, dewpoint temperature, precipitation (daily only), solar radiation, wind speed, and relative humidity. These are available on either 3 or 6 h timescales over the period 1979-2016. The resulting bias-adjusted dataset is available through the Climate Data Store (CDS) of the Copernicus Climate Change Service (C3S) and can be accessed at present from ftp://ecem.climate.copernicus.eu. The benefit of performing bias adjustment is demonstrated by comparing initial and bias-adjusted ERA-Interim data against gridded observational fields.
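As one concrete example of the variable-dependent, distribution-based adjustment described above, the sketch below shifts and scales a roughly Gaussian variable (such as air temperature) so its mean and standard deviation match station observations. It is illustrative only; the paper's exact per-variable procedures and distributions are not reproduced.

```python
import numpy as np

def adjust_gaussian(reanalysis, observations):
    """Bias-adjust a (roughly Gaussian) variable such as air temperature by
    shifting and scaling the reanalysis so its mean and standard deviation
    match the observations over a common calibration period.

    This mean/variance adjustment is only one of several per-variable choices;
    it is shown as an illustration, not as the paper's exact procedure."""
    mu_r, sd_r = reanalysis.mean(), reanalysis.std()
    mu_o, sd_o = observations.mean(), observations.std()
    return (reanalysis - mu_r) * (sd_o / sd_r) + mu_o

# Toy usage: a reanalysis series that is 1.2 K too warm and too smooth
rng = np.random.default_rng(4)
obs = 10.0 + 5.0 * rng.standard_normal(5000)
rean = 11.2 + 3.5 * rng.standard_normal(5000)
adj = adjust_gaussian(rean, obs)
print(adj.mean(), adj.std())      # roughly 10 and 5, matching the observations
```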
While the vast majority of operational air-pollution networks across the world are designed to measure relevant metrics at the surface, the air pollution problem is a three-dimensional phenomenon. The lack of adequate observations aloft to routinely characterize the nature of ai...
The Impact of Sea-Surface Winds on Meteorological Conditions in Israel: An Initial Study
NASA Technical Reports Server (NTRS)
Otterman, J.; Saaroni, H.; Atlas, R.; Ardizzone, J.; Ben-Dor, E.; Druyan, L.; Jusem, C. J.; Karnieli, A.; Terry, J.
2000-01-01
The SSM/I (Special Sensor Microwave/Imager) dataset is used to monitor surface wind speed and direction at four locations over the Eastern Mediterranean during December 1998 - January 1999. Time series of these data are compared to concurrent series of precipitation, surface temperature, humidity and winds at selected Israeli stations: Sde Dov (coastal), Bet Dagan (5 km inland), Jerusalem (Judean Hills), Hafetz Haim (3 km inland) and Sde Boker (central Negev). December 1998 and the beginning of January 1999 were dry in Israel, but significant precipitation was recorded at many stations during the second half of January 1999. SSM/I data show a surge in westerly surface winds west of Israel (32 N, 32.5 E) on 15 January, coinciding with the renewal of precipitation. We discuss the relevant circulation and pressure patterns during this transition in the context of the evolving meteorological conditions at the selected Israeli locations. The SSM/I dataset of near ocean surface winds, available for the last 12 years, is described. We analyze lagged correlations between these data and the Israeli station data and investigate the possibility of predictive skill. Application of such relationships to short-term weather prediction would require real-time access to the SSM/I observations.
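A lagged-correlation analysis of the kind described can be sketched as follows; the wind and rainfall series here are synthetic stand-ins, not SSM/I or Israeli station data, and the lag range is arbitrary.

```python
import numpy as np

def lagged_correlation(x, y, max_lag):
    """Pearson correlation of y against x for lags -max_lag..+max_lag (days).
    Positive lag means x leads y. Returns a dict {lag: r}."""
    out = {}
    n = len(x)
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            xs, ys = x[: n - lag], y[lag:]
        else:
            xs, ys = x[-lag:], y[: n + lag]
        out[lag] = np.corrcoef(xs, ys)[0, 1]
    return out

# Toy usage: a "precipitation" series that responds to "offshore wind" two days later
rng = np.random.default_rng(5)
wind = rng.gamma(2.0, 4.0, size=400)
rain = np.roll(wind, 2) + rng.normal(0, 2.0, size=400)
corrs = lagged_correlation(wind, rain, max_lag=5)
print(max(corrs, key=corrs.get))   # lag with the strongest correlation (about +2)
```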
The SPoRT-WRF: Evaluating the Impact of NASA Datasets on Convective Forecasts
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley; Case, Jonathan; Kozlowski, Danielle; Molthan, Andrew
2012-01-01
The Short-term Prediction Research and Transition Center (SPoRT) is a collaborative partnership between NASA and operational forecasting entities, including a number of National Weather Service offices. SPoRT transitions real-time NASA products and capabilities to its partners to address specific operational forecast challenges. One challenge that forecasters face is applying convection-allowing numerical models to predict mesoscale convective weather. In order to address this specific forecast challenge, SPoRT produces real-time mesoscale model forecasts using the Weather Research and Forecasting (WRF) model that includes unique NASA products and capabilities. Currently, the SPoRT configuration of the WRF model (SPoRT-WRF) incorporates the 4-km Land Information System (LIS) land surface data, 1-km SPoRT sea surface temperature analysis and 1-km Moderate Resolution Imaging Spectroradiometer (MODIS) greenness vegetation fraction (GVF) analysis, and retrieved thermodynamic profiles from the Atmospheric Infrared Sounder (AIRS). The LIS, SST, and GVF data are all integrated into the SPoRT-WRF through adjustments to the initial and boundary conditions, and the AIRS data are assimilated into a 9-hour SPoRT-WRF forecast each day at 0900 UTC. This study dissects the overall impact of the NASA datasets and the individual surface and atmospheric component datasets on daily mesoscale forecasts. A case study covering the super tornado outbreak across the Central and Southeastern United States during 25-27 April 2011 is examined. Three different forecasts are analyzed, including the SPoRT-WRF (NASA surface and atmospheric data), the SPoRT-WRF without AIRS (NASA surface data only), and the operational National Severe Storms Laboratory (NSSL) WRF (control with no NASA data). The forecasts are compared qualitatively by examining simulated versus observed radar reflectivity. Differences between the simulated reflectivity are further investigated using convective parameters along with model soundings to determine the impacts of the various NASA datasets. Additionally, quantitative evaluation of select meteorological parameters is performed using the Model Evaluation Tools verification package to compare forecasts to in situ surface and upper air observations.
Prototype global burnt area algorithm using the AVHRR-LTDR time series
NASA Astrophysics Data System (ADS)
López-Saldaña, Gerardo; Pereira, José Miguel; Aires, Filipe
2013-04-01
One of the main limitations of products derived from remotely-sensed data is the length of the data records available for climate studies. The Advanced Very High Resolution Radiometer (AVHRR) long-term data record (LTDR) comprises a daily global atmospherically-corrected surface reflectance dataset at 0.05° spatial resolution and is available for the 1981-1999 time period. Fire is a strong driver of land surface change and of emissions of greenhouse gases around the globe. A global long-term identification of areas affected by fire is needed to analyze trends and fire-climate relationships. A burnt area algorithm can be seen as a change point detection problem, where there is an abrupt change in the surface reflectance due to biomass burning. Using the AVHRR-LTDR dataset, a time series of bidirectional reflectance distribution function (BRDF) corrected surface reflectance was generated using the daily observations and constraining the BRDF model inversion with a climatology of BRDF parameters derived from 12 years of MODIS data. The identification of burnt areas was performed using a t-test on the pre- and post-fire reflectance values and a change point detection algorithm; spectral constraints were then applied to flag changes caused by natural land processes like vegetation seasonality or flooding. Additional temporal constraints are applied, focusing on the persistence of the affected areas. Initial results for the year 1998, which was selected because of a positive fire anomaly, show spatio-temporal coherence, but further analysis is required and a formal, rigorous validation will be applied using burn scars identified from high-resolution datasets.
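The burnt-area test described above can be sketched as a pixel-wise comparison of pre- and post-fire reflectance windows; the snippet below uses Welch's t-test plus a minimum-drop threshold. The window lengths, significance level and threshold are placeholders, and the spectral and temporal persistence constraints from the abstract are not included.

```python
import numpy as np
from scipy import stats

def flag_burn(pre_fire, post_fire, alpha=0.01, min_drop=0.05):
    """Flag a candidate burn when post-fire reflectance is significantly
    lower than pre-fire reflectance (Welch's t-test) and the mean drop
    exceeds `min_drop`. Window lengths, alpha and min_drop are placeholders."""
    t_stat, p_val = stats.ttest_ind(pre_fire, post_fire, equal_var=False)
    drop = pre_fire.mean() - post_fire.mean()
    return bool(p_val < alpha and drop > min_drop), drop, p_val

# Toy usage: NIR-like reflectance time series before and after a fire
rng = np.random.default_rng(6)
pre = rng.normal(0.30, 0.02, size=15)      # pre-fire window
post = rng.normal(0.18, 0.02, size=15)     # post-fire window (charred surface is darker)
print(flag_burn(pre, post))
```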
Localized Segment Based Processing for Automatic Building Extraction from LiDAR Data
NASA Astrophysics Data System (ADS)
Parida, G.; Rajan, K. S.
2017-05-01
The current methods of object segmentation, extraction and classification of aerial LiDAR data are manual and tedious tasks. This work proposes a technique for object segmentation from LiDAR data. A bottom-up geometric rule-based approach was used initially to devise a way to segment buildings out of the LiDAR datasets. For curved wall surfaces, comparison of localized surface normals was done to segment buildings. The algorithm has been applied to both synthetic datasets as well as a real-world dataset of Vaihingen, Germany. Preliminary results show successful segmentation of the building objects from a given scene in the case of the synthetic datasets and promising results in the case of the real-world data. The advantage of the proposed work is that it depends on no form of data other than LiDAR. It is an unsupervised method of building segmentation, and thus requires no model training as seen in supervised techniques. It focuses on extracting the walls of the buildings to construct the footprint, rather than focusing on the roof. The focus on extracting the walls to reconstruct the buildings from a LiDAR scene is the crux of the proposed method. The current segmentation approach can be used to get 2D footprints of the buildings, with further scope to generate 3D models. Thus, the proposed method can be used as a tool to get footprints of buildings in urban landscapes, helping in urban planning and the smart cities endeavour.
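The "comparison of localized surface normals" step can be illustrated with classic PCA normal estimation on a point cloud: the normal at each point is the smallest principal axis of its k-nearest-neighbour covariance, and near-horizontal normals flag candidate wall points. The point cloud, neighbourhood size and verticality threshold below are illustrative assumptions, not the authors' rule set.

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=15):
    """Per-point unit normals from the smallest principal axis of each
    point's k-nearest-neighbour covariance (classic PCA normal estimation)."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nbrs in enumerate(idx):
        nbr_pts = points[nbrs] - points[nbrs].mean(axis=0)
        # Eigenvector of the smallest eigenvalue approximates the surface normal
        _, vecs = np.linalg.eigh(nbr_pts.T @ nbr_pts)
        normals[i] = vecs[:, 0]
    return normals

def wall_candidates(points, max_vertical_component=0.2, k=15):
    """Points whose normals are nearly horizontal, i.e. likely wall points."""
    normals = estimate_normals(points, k=k)
    return np.abs(normals[:, 2]) < max_vertical_component

# Toy usage: a flat ground patch plus a vertical wall along y = 5
rng = np.random.default_rng(7)
ground = np.column_stack([rng.uniform(0, 10, 800), rng.uniform(0, 10, 800),
                          rng.normal(0, 0.02, 800)])
wall = np.column_stack([rng.uniform(0, 10, 400),
                        np.full(400, 5.0) + rng.normal(0, 0.02, 400),
                        rng.uniform(0, 3, 400)])
cloud = np.vstack([ground, wall])
mask = wall_candidates(cloud)
print(mask[:800].mean(), mask[800:].mean())   # low fraction for ground, high for wall
```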
Access to Emissions Distributions and Related Ancillary Data through the ECCAD database
NASA Astrophysics Data System (ADS)
Darras, Sabine; Granier, Claire; Liousse, Catherine; De Graaf, Erica; Enriquez, Edgar; Boulanger, Damien; Brissebrat, Guillaume
2017-04-01
The ECCAD database (Emissions of atmospheric Compounds and Compilation of Ancillary Data) provides user-friendly access to global and regional surface emissions for a large set of chemical compounds and to ancillary data (land use, active fires, burned areas, population, etc.). The emissions inventories are time series of gridded data at spatial resolutions from 1x1 to 0.1x0.1 degrees. ECCAD is the emissions database of the GEIA (Global Emissions InitiAtive) project and a sub-project of the French Atmospheric Data Center AERIS (http://www.aeris-data.fr). ECCAD currently has more than 2200 users originating from more than 80 countries. The project benefits from this large international community of users to expand the number of emission datasets made available. ECCAD provides detailed metadata for each of the datasets and various tools for data visualization, for computing global and regional totals, and for interactive spatial and temporal analysis. The data can be downloaded as interoperable NetCDF CF-compliant files, i.e. the data are compatible with many other client interfaces. The presentation will provide information on the datasets available within ECCAD, as well as examples of the analysis work that can be done online through the website: http://eccad.aeris-data.fr.
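Since the data are distributed as CF-compliant NetCDF, a downloaded ECCAD inventory can be read with standard tools; the sketch below opens such a file and computes a regional total. The file name, variable name, units and regional box are hypothetical placeholders, not actual ECCAD identifiers.

```python
import numpy as np
import netCDF4

# Hypothetical file and variable names; an actual ECCAD download will differ.
ds = netCDF4.Dataset("emissions_co_0.5deg_monthly.nc")
lat = ds.variables["lat"][:]                    # degrees north
lon = ds.variables["lon"][:]                    # degrees east
flux = ds.variables["emi_co"][:]                # assumed kg m-2 s-1, dims (time, lat, lon)

# Grid-cell areas for a regular lat/lon grid (spherical Earth, radius 6371 km)
dlat = np.deg2rad(abs(lat[1] - lat[0]))
dlon = np.deg2rad(abs(lon[1] - lon[0]))
area = (6371000.0 ** 2) * dlat * dlon * np.cos(np.deg2rad(lat))[:, None]   # m2, (lat, lon)

# Regional total over an illustrative box, first time step
in_lat = (lat >= 35) & (lat <= 70)
in_lon = (lon >= -10) & (lon <= 40)
total = np.sum(flux[0][np.ix_(in_lat, in_lon)] * area[np.ix_(in_lat, in_lon)])
print(total, "kg s-1 over the selected box")
ds.close()
```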
Access to Emissions Distributions and Related Ancillary Data through the ECCAD database
NASA Astrophysics Data System (ADS)
Darras, Sabine; Enriquez, Edgar; Granier, Claire; Liousse, Catherine; Boulanger, Damien; Fontaine, Alain
2016-04-01
The ECCAD database (Emissions of atmospheric Compounds and Compilation of Ancillary Data) provides user-friendly access to global and regional surface emissions for a large set of chemical compounds and to ancillary data (land use, active fires, burned areas, population, etc.). The emissions inventories are time series of gridded data at spatial resolutions from 1x1 to 0.1x0.1 degrees. ECCAD is the emissions database of the GEIA (Global Emissions InitiAtive) project and a sub-project of the French Atmospheric Data Center AERIS (http://www.aeris-data.fr). ECCAD currently has more than 2200 users originating from more than 80 countries. The project benefits from this large international community of users to expand the number of emission datasets made available. ECCAD provides detailed metadata for each of the datasets and various tools for data visualization, for computing global and regional totals, and for interactive spatial and temporal analysis. The data can be downloaded as interoperable NetCDF CF-compliant files, i.e. the data are compatible with many other client interfaces. The presentation will provide information on the datasets available within ECCAD, as well as examples of the analysis work that can be done online through the website: http://eccad.aeris-data.fr.
Khalid Hussein
2012-02-01
This "Weakly Anomalous to Anomalous Surface Temperature" dataset differs from the "Anomalous Surface Temperature" dataset for this county (another remotely sensed CIRES product) by showing areas of modeled temperatures between 1o and 2o above the mean, as opposed to the greater than 2o temperatures contained in the "Anomalous Surface Temperature" dataset. Note: 'o' is used in this description to represent lowercase sigma
Soil moisture - precipitation feedbacks in observations and models (Invited)
NASA Astrophysics Data System (ADS)
Taylor, C.
2013-12-01
There is considerable uncertainty about the strength, geographical extent, and even the sign of feedbacks between soil moisture and precipitation. Whilst precipitation trivially increases soil moisture, the impact of soil moisture, via surface fluxes, on convective rainfall is far from straightforward, and likely depends on space and time scale, soil and synoptic conditions, and the nature of the convection itself. In considering how daytime convection responds to surface fluxes, large-scale models based on convective parameterisations may not necessarily provide reliable depictions, particularly given their long-standing inability to reproduce a realistic diurnal cycle of convection. On the other hand, long-term satellite data provide the potential to establish robust relationships between soil moisture and precipitation across the world, notwithstanding some fundamental weaknesses and uncertainties in the datasets. Here, results from regional and global satellite-based analyses are presented. Globally, using 3-hourly precipitation and daily soil moisture datasets, a methodology has been developed to compare the statistics of antecedent soil moisture in the region of localised afternoon rain events (Taylor et al. 2012). Specifically, the analysis tests whether there are any significant differences in pre-event soil moisture between rainfall maxima and nearby (50-100 km) minima. The results reveal a clear signal across a number of semi-arid regions, most notably North Africa, indicating a preference for afternoon rain over drier soil. Analysis by continent and by climatic zone reveals that this signal (locally a negative feedback) is evident in other continents and climatic zones, but is somewhat weaker. This may be linked to the inherent geographical differences across the world, as detection of a feedback requires water-stressed surfaces coincident with frequent active convective initiations. The differences also reflect the quality and utility of the soil moisture datasets outside of sparsely-vegetated regions. No evidence is found for afternoon convection developing preferentially above locally moister soils. Higher resolution datasets are used to provide a clearer relationship between soil moisture patterns and convective initiation in both the Sahel (Taylor et al. 2011) and Europe. The observations indicate a preference for convection to initiate on soil moisture gradients, consistent with many high resolution numerical studies. The ability of models to capture the observed relationships between soil moisture and rainfall in the Sahel has been evaluated. This focuses on models run at different resolutions, and with convective parameterisations switched on or off, and highlights issues associated with the parameterisation of convection. Taylor, C.M., Gounou, A., Guichard, F., Harris, P.P., Ellis, R.J., Couvreux, F., and M. De Kauwe, 2011: Frequency of Sahelian storm initiation enhanced over mesoscale soil-moisture patterns, Nature Geoscience, 4, 430-433, doi:10.1038/ngeo1173. Taylor, C.M., de Jeu, R.A.M., Guichard, F., Harris, P.P., and W.A. Dorigo, 2012: Afternoon rain more likely over drier soils, Nature, 489, 423-426, doi:10.1038/nature11377.
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Kumar, Sujay V.; Kuligowski, Robert J.; Langston, Carrie
2013-01-01
The NASA Short-term Prediction Research and Transition (SPoRT) Center in Huntsville, AL is running a real-time configuration of the NASA Land Information System (LIS) with the Noah land surface model (LSM). Output from the SPoRT-LIS run is used to initialize land surface variables for local modeling applications at select National Weather Service (NWS) partner offices, and can be displayed in decision support systems for situational awareness and drought monitoring. The SPoRT-LIS is run over a domain covering the southern and eastern United States, fully nested within the National Centers for Environmental Prediction Stage IV precipitation analysis grid, which provides precipitation forcing to the offline LIS-Noah runs. The SPoRT Center seeks to expand the real-time LIS domain to the entire Continental U.S. (CONUS); however, geographical limitations with the Stage IV analysis product have inhibited this expansion. Therefore, a goal of this study is to test alternative precipitation forcing datasets that can enable the LIS expansion by improving upon the current geographical limitations of the Stage IV product. The four precipitation forcing datasets that are inter-compared on a 4-km resolution CONUS domain include the Stage IV, an experimental GOES quantitative precipitation estimate (QPE) from NESDIS/STAR, the National Mosaic and QPE (NMQ) product from the National Severe Storms Laboratory, and the North American Land Data Assimilation System phase 2 (NLDAS-2) analyses. The NLDAS-2 dataset is used as the control run, with each of the other three datasets considered experimental runs compared against the control. The regional strengths, weaknesses, and biases of each precipitation analysis are identified relative to the NLDAS-2 control in terms of accumulated precipitation pattern and amount, and the impacts on the subsequent LSM spin-up simulations. The ultimate goal is to identify an alternative precipitation forcing dataset that can best support an expansion of the real-time SPoRT-LIS to a domain covering the entire CONUS.
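A minimal sketch of the kind of grid-based intercomparison described above: each candidate precipitation forcing is accumulated on a common grid and summarized as a bias against the NLDAS-2 control. The arrays below are synthetic placeholders and the dataset names are illustrative only.

```python
# Accumulate each precipitation forcing dataset on a common grid and summarize
# its bias against a control dataset. All fields are synthetic toy data.
import numpy as np

ny, nx, ntimes = 100, 150, 240          # toy sub-grid, hourly time steps
rng = np.random.default_rng(1)
nldas2 = rng.gamma(shape=0.5, scale=2.0, size=(ntimes, ny, nx))   # control
experiments = {
    "StageIV":  nldas2 * rng.normal(1.05, 0.10, size=(ntimes, ny, nx)).clip(0),
    "GOES_QPE": nldas2 * rng.normal(0.90, 0.15, size=(ntimes, ny, nx)).clip(0),
    "NMQ":      nldas2 * rng.normal(1.00, 0.08, size=(ntimes, ny, nx)).clip(0),
}

control_total = nldas2.sum(axis=0)       # accumulated precipitation (mm)
for name, precip in experiments.items():
    total = precip.sum(axis=0)
    bias = total - control_total         # accumulated-amount bias field
    print(f"{name:9s} mean bias = {bias.mean():+7.1f} mm, "
          f"spatial RMS = {np.sqrt((bias**2).mean()):6.1f} mm")
```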
EverVIEW: a visualization platform for hydrologic and Earth science gridded data
Romañach, Stephanie S.; McKelvy, James M.; Suir, Kevin J.; Conzelmann, Craig
2015-01-01
The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.
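For readers who want to inspect the same kind of NetCDF gridded data programmatically rather than through EverVIEW, the sketch below (using xarray, which is not part of EverVIEW) builds a small time-varying water-surface-elevation field, writes it to NetCDF, and then steps through time slices and samples a grid cell. Variable and coordinate names are illustrative assumptions.

```python
# Sketch of exploring gridded NetCDF data with xarray. Writing NetCDF requires
# a backend such as the netCDF4 package. All values are synthetic.
import numpy as np
import xarray as xr

time = np.arange(12)
lat = np.linspace(25.0, 27.0, 40)
lon = np.linspace(-81.5, -80.0, 50)
elev = (0.3 + 0.1 * np.sin(2 * np.pi * time / 12)[:, None, None]
        + 0.05 * np.random.default_rng(2).random((12, lat.size, lon.size)))

ds = xr.Dataset(
    {"water_surface_elevation": (("time", "lat", "lon"), elev)},
    coords={"time": time, "lat": lat, "lon": lon},
)
ds.to_netcdf("everglades_demo.nc")                  # NetCDF file on disk

with xr.open_dataset("everglades_demo.nc") as nc:
    for t in nc.time.values[:3]:                    # step through time slices
        frame = nc["water_surface_elevation"].sel(time=t)
        print(f"t={t}: domain mean = {float(frame.mean()):.3f} m")
    # Value at a single grid cell, analogous to a tabular grid-cell display.
    cell = nc["water_surface_elevation"].sel(lat=26.0, lon=-80.75, method="nearest")
    print("cell time series:", np.round(cell.values, 3))
```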
NASA Astrophysics Data System (ADS)
Gelati, Emiliano; Decharme, Bertrand; Calvet, Jean-Christophe; Minvielle, Marie; Polcher, Jan; Fairbairn, David; Weedon, Graham P.
2018-04-01
Physically consistent descriptions of land surface hydrology are crucial for planning human activities that involve freshwater resources, especially in light of the expected climate change scenarios. We assess how atmospheric forcing data uncertainties affect land surface model (LSM) simulations by means of an extensive evaluation exercise using a number of state-of-the-art remote sensing and station-based datasets. For this purpose, we use the CO2-responsive ISBA-A-gs LSM coupled with the CNRM version of the Total Runoff Integrated Pathways (CTRIP) river routing model. We perform multi-forcing simulations over the Euro-Mediterranean area (25-75.5° N, 11.5° W-62.5° E, at 0.5° resolution) from 1979 to 2012. The model is forced using four atmospheric datasets. Three of them are based on the ERA-Interim reanalysis (ERA-I). The fourth dataset is independent from ERA-Interim: PGF, developed at Princeton University. The hydrological impacts of atmospheric forcing uncertainties are assessed by comparing simulated surface soil moisture (SSM), leaf area index (LAI) and river discharge against observation-based datasets: SSM from the European Space Agency's Water Cycle Multi-mission Observation Strategy and Climate Change Initiative projects (ESA-CCI), LAI of the Global Inventory Modeling and Mapping Studies (GIMMS), and Global Runoff Data Centre (GRDC) river discharge. The atmospheric forcing data are also compared to reference datasets. Precipitation is the most uncertain forcing variable across datasets, while the most consistent are air temperature and SW and LW radiation. At the monthly timescale, SSM and LAI simulations are relatively insensitive to forcing uncertainties. Some discrepancies with ESA-CCI appear to be forcing-independent and may be due to different assumptions underlying the LSM and the remote sensing retrieval algorithm. All simulations overestimate average summer and early-autumn LAI. Forcing uncertainty impacts on simulated river discharge are larger on mean values and standard deviations than on correlations with GRDC data. Anomaly correlation coefficients are not inferior to those computed from raw monthly discharge time series, indicating that the model reproduces inter-annual variability fairly well. However, simulated river discharge time series generally feature larger variability compared to measurements. They also tend to overestimate winter-spring high flows and underestimate summer-autumn low flows. Considering that several differences emerge between simulations and reference data, which may not be completely explained by forcing uncertainty, we suggest several research directions. These range from further investigating the discrepancies between LSMs and remote sensing retrievals to developing new model components to represent physical and anthropogenic processes.
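The anomaly-correlation diagnostic mentioned above can be illustrated with a short sketch: the mean seasonal cycle is removed from simulated and observed monthly discharge before correlating, so the score reflects inter-annual variability rather than the repeated annual cycle. The series below are synthetic placeholders, not GRDC or model output.

```python
# Raw monthly correlation versus anomaly correlation for a discharge series.
import numpy as np

rng = np.random.default_rng(3)
nyears = 30
months = np.tile(np.arange(12), nyears)
seasonal = 1000 + 600 * np.sin(2 * np.pi * months / 12)        # m3/s
obs = seasonal + rng.normal(0, 150, months.size)
sim = 1.15 * seasonal + rng.normal(0, 200, months.size)        # biased model

def anomalies(q):
    """Remove the mean seasonal cycle (monthly climatology)."""
    clim = np.array([q[months == m].mean() for m in range(12)])
    return q - clim[months]

raw_r = np.corrcoef(sim, obs)[0, 1]
acc = np.corrcoef(anomalies(sim), anomalies(obs))[0, 1]
print(f"raw monthly correlation = {raw_r:.2f}, anomaly correlation = {acc:.2f}")
```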
Determination of elastic moduli from measured acoustic velocities.
Brown, J Michael
2018-06-01
Methods are evaluated for the solution of the inverse problem associated with determination of elastic moduli for crystals of arbitrary symmetry from elastic wave velocities measured in many crystallographic directions. A package of MATLAB functions provides a robust and flexible environment for analysis of ultrasonic, Brillouin, or Impulsive Stimulated Light Scattering datasets. Three inverse algorithms are considered: the gradient-based methods of Levenberg-Marquardt and Backus-Gilbert, and a non-gradient-based (Nelder-Mead) simplex approach. Several data types are considered: body wave velocities alone, surface wave velocities plus a side constraint on X-ray-diffraction-based axes compressibilities, or joint body and surface wave velocities. The numerical algorithms are validated through comparisons with prior published results and through analysis of synthetic datasets. Although all approaches succeed in finding low-misfit solutions, the Levenberg-Marquardt method consistently demonstrates effectiveness and computational efficiency. However, linearized gradient-based methods, when applied to a strongly non-linear problem, may not adequately converge to the global minimum. The simplex method, while slower, is less susceptible to being trapped in local misfit minima. A "multi-start" strategy (initiating searches from more than one initial guess) provides better assurance that global minima have been located. Numerical estimates of parameter uncertainties based on Monte Carlo simulations are compared to formal uncertainties based on covariance calculations.
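The multi-start strategy described above can be sketched as follows (in Python rather than the paper's MATLAB package): several Nelder-Mead searches are launched from random initial guesses and the lowest-misfit solution is retained. The toy objective stands in for the sum of squared velocity residuals and is not the actual elastic-moduli inversion.

```python
# Multi-start Nelder-Mead minimization of a toy multi-modal misfit function.
import numpy as np
from scipy.optimize import minimize

def misfit(m):
    # Toy objective standing in for summed squared velocity residuals.
    return (m[0] - 3.0) ** 2 + (m[1] + 1.0) ** 2 + 2.0 * np.sin(3 * m[0]) ** 2

rng = np.random.default_rng(4)
best = None
for _ in range(20):                               # "multi-start" strategy
    x0 = rng.uniform(-5, 5, size=2)               # random initial guess
    res = minimize(misfit, x0, method="Nelder-Mead")
    if best is None or res.fun < best.fun:
        best = res

print("best parameters:", np.round(best.x, 3), "misfit:", round(best.fun, 4))
```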
Machine learning in a graph framework for subcortical segmentation
NASA Astrophysics Data System (ADS)
Guo, Zhihui; Kashyap, Satyananda; Sonka, Milan; Oguz, Ipek
2017-02-01
Automated and reliable segmentation of subcortical structures from human brain magnetic resonance images is of great importance for volumetric and shape analyses in quantitative neuroimaging studies. However, poor boundary contrast and variable shape of these structures make automated segmentation a challenging task. We propose a 3D graph-based machine learning method, called LOGISMOS-RF, to segment the caudate and the putamen from brain MRI scans in a robust and accurate way. An atlas-based tissue classification and bias-field correction method is applied to the images to generate an initial segmentation for each structure. Then a 3D graph framework is utilized to construct a geometric graph for each initial segmentation. A locally trained random forest classifier is used to assign a cost to each graph node. The max-flow algorithm is applied to solve the segmentation problem. Evaluation was performed on a dataset of T1-weighted MRIs of 62 subjects, with 42 images used for training and 20 images for testing. For comparison, FreeSurfer, FSL and BRAINSCut approaches were also evaluated using the same dataset. Dice overlap coefficients and surface-to-surface distances between the automated segmentation and expert manual segmentations indicate that the results of our method are statistically significantly more accurate than the three other methods, for both the caudate (Dice: 0.89 ± 0.03) and the putamen (0.89 ± 0.03).
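A brief sketch of the Dice overlap metric used in the evaluation above, Dice = 2|A∩B|/(|A|+|B|) for two binary masks; the masks here are random placeholders rather than actual caudate or putamen segmentations.

```python
# Dice overlap between two binary segmentation masks (synthetic example).
import numpy as np

def dice(a, b):
    a = a.astype(bool)
    b = b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

rng = np.random.default_rng(5)
auto = rng.random((64, 64, 64)) > 0.7         # "automated" mask (toy)
manual = auto.copy()
flip = rng.random(auto.shape) > 0.98          # perturb ~2% of voxels
manual[flip] = ~manual[flip]
print(f"Dice = {dice(auto, manual):.3f}")
```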
NASA Technical Reports Server (NTRS)
Case, Johnathan L.; Mungai, John; Sakwa, Vincent; Kabuchanga, Eric; Zavodsky, Bradley T.; Limaye, Ashutosh S.
2014-01-01
Flooding and drought are two key forecasting challenges for the Kenya Meteorological Service (KMS). Atmospheric processes leading to excessive precipitation and/or prolonged drought can be quite sensitive to the state of the land surface, which interacts with the planetary boundary layer (PBL) of the atmosphere, providing a source of heat and moisture. The development and evolution of precipitation systems are affected by heat and moisture fluxes from the land surface, particularly within weakly-sheared environments such as in the tropics and sub-tropics. These heat and moisture fluxes during the day can be strongly influenced by land cover, vegetation, and soil moisture content. Therefore, it is important to represent the land surface state as accurately as possible in land surface and numerical weather prediction (NWP) models. Enhanced regional modeling capabilities have the potential to improve forecast guidance in support of daily operations and high-impact weather over eastern Africa. KMS currently runs a configuration of the Weather Research and Forecasting (WRF) NWP model in real time to support its daily forecasting operations, making use of the NOAA/National Weather Service (NWS) Science and Training Resource Center's Environmental Modeling System (EMS) to manage and produce the KMS-WRF runs on a regional grid over eastern Africa. Two organizations at the NASA Marshall Space Flight Center in Huntsville, AL, SERVIR and the Short-term Prediction Research and Transition (SPoRT) Center, have established a working partnership with KMS for enhancing its regional modeling capabilities through new datasets and tools. To accomplish this goal, SPoRT and SERVIR are providing enhanced, experimental land surface initialization datasets and model verification capabilities to KMS as part of this collaboration. To produce a land surface initialization more consistent with the resolution of the KMS-WRF runs, the NASA Land Information System (LIS) is run at a comparable resolution to provide real-time, daily soil initialization data in place of data interpolated from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) model soil moisture and temperature fields. Additionally, real-time green vegetation fraction (GVF) data from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (Suomi-NPP) satellite will be incorporated into the KMS-WRF runs, once it becomes publicly available from the National Environmental Satellite Data and Information Service (NESDIS). Finally, model verification capabilities will be transitioned to KMS using the Model Evaluation Tools (MET; Brown et al. 2009) package in conjunction with a dynamic scripting package developed by SPoRT (Zavodsky et al. 2014), to help quantify possible improvements in simulated temperature, moisture and precipitation resulting from the experimental land surface initialization. Furthermore, the transition of these MET tools will enable KMS to monitor model forecast accuracy in near real time. This paper presents preliminary efforts to improve land surface model initialization over eastern Africa in support of operations at KMS.
The remainder of this extended abstract is organized as follows: The collaborating organizations involved in the project are described in Section 2; background information on LIS and the configuration for eastern Africa is presented in Section 3; the WRF configuration used in this modeling experiment is described in Section 4; sample experimental WRF output with and without LIS initialization data are given in Section 5; a summary is given in Section 6 followed by acknowledgements and references.
NASA Astrophysics Data System (ADS)
Lal, Mohan; Mishra, S. K.; Pandey, Ashish; Pandey, R. P.; Meena, P. K.; Chaudhary, Anubhav; Jha, Ranjit Kumar; Shreevastava, Ajit Kumar; Kumar, Yogendra
2017-01-01
The Soil Conservation Service curve number (SCS-CN) method, also known as the Natural Resources Conservation Service curve number (NRCS-CN) method, is popular for computing the volume of direct surface runoff for a given rainfall event. The performance of the SCS-CN method, based on large rainfall (P) and runoff (Q) datasets of United States watersheds, is evaluated using a large dataset of natural storm events from 27 agricultural plots in India. On the whole, the CN estimates from the National Engineering Handbook (chapter 4) tables do not match those derived from the observed P and Q datasets. As a result, the runoff prediction using the former CNs was poor for the data of 22 (out of 24) plots. However, the match was slightly better for higher CN values, consistent with the general notion that the existing SCS-CN method performs better for high rainfall-runoff (high CN) events. Infiltration capacity (fc) was the main explanatory variable for runoff (or CN) production in the study plots, exhibiting the expected inverse relationship between CN and fc. The plot-data optimization yielded initial abstraction coefficient (λ) values from 0 to 0.659 for the ordered dataset and 0 to 0.208 for the natural dataset (with 0 as the most frequent value). Mean and median λ values were, respectively, 0.030 and 0 for the natural rainfall-runoff dataset and 0.108 and 0 for the ordered rainfall-runoff dataset. Runoff estimation was very sensitive to λ, and it improved consistently as λ changed from 0.2 to 0.03.
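For reference, the standard SCS-CN relation discussed above can be written compactly in mm: S = 25400/CN - 254, Ia = λS, and Q = (P - Ia)²/(P - Ia + S) for P > Ia, with Q = 0 otherwise. The short sketch below evaluates it for the two λ values compared in the text; the example P and CN values are illustrative, not taken from the study plots.

```python
# SCS-CN direct runoff in SI units (mm), with an adjustable initial
# abstraction coefficient lambda. Example inputs are illustrative only.
def scs_cn_runoff(p_mm, cn, lam=0.2):
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = lam * s                      # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

for lam in (0.2, 0.03):               # the two coefficients compared in the text
    q = scs_cn_runoff(p_mm=50.0, cn=75.0, lam=lam)
    print(f"lambda={lam:>5}: Q = {q:.1f} mm for P = 50 mm, CN = 75")
```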
NASA Astrophysics Data System (ADS)
Li, Tao; Zheng, Xiaogu; Dai, Yongjiu; Yang, Chi; Chen, Zhuoqi; Zhang, Shupeng; Wu, Guocan; Wang, Zhonglei; Huang, Chengcheng; Shen, Yan; Liao, Rongwei
2014-09-01
As part of a joint effort to construct an atmospheric forcing dataset for mainland China with high spatiotemporal resolution, a new approach is proposed to construct gridded near-surface temperature, relative humidity, wind speed and surface pressure with a resolution of 1 km × 1 km. The approach comprises two steps: (1) fit a partial thin-plate smoothing spline, with orography and reanalysis data as explanatory variables, to ground-based observations to estimate a trend surface; (2) apply a simple kriging procedure to the residuals for trend-surface correction. The proposed approach is applied to observations collected at approximately 700 stations over mainland China. The generated forcing fields are compared with the corresponding components of the National Centers for Environmental Prediction (NCEP) Climate Forecast System Reanalysis dataset and the Princeton meteorological forcing dataset. The comparison shows that, both within the station network and within the resolutions of the two gridded datasets, the interpolation errors of the proposed approach are markedly smaller than those of the two gridded datasets.
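A schematic two-step interpolation in the spirit of the approach above: a thin-plate spline fitted with elevation as an extra explanatory coordinate stands in for the partial spline with orography/reanalysis covariates, and a Gaussian-process regression of the station residuals stands in for the simple kriging correction. All data are synthetic; this is not the authors' implementation.

```python
# Step 1: trend surface from a thin-plate spline; Step 2: kriging-like
# correction of the residuals via Gaussian-process regression.
import numpy as np
from scipy.interpolate import RBFInterpolator
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(6)
n_sta = 300
lon, lat = rng.uniform(75, 130, n_sta), rng.uniform(20, 50, n_sta)
elev = rng.uniform(0, 4.5, n_sta)                                    # km
t2m = 28 - 0.6 * (lat - 20) - 6.0 * elev + rng.normal(0, 0.8, n_sta)  # deg C

# Step 1: trend surface from (lon, lat, elevation).
X = np.column_stack([lon, lat, elev])
trend = RBFInterpolator(X, t2m, kernel="thin_plate_spline", smoothing=5.0)
residual = t2m - trend(X)

# Step 2: spatial model of the residuals.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0), alpha=0.5)
gp.fit(np.column_stack([lon, lat]), residual)

# Predict at a target grid point with known elevation (lon, lat, elev).
target = np.array([[100.0, 35.0, 1.2]])
t_hat = trend(target) + gp.predict(target[:, :2])
print(f"interpolated near-surface temperature: {t_hat[0]:.2f} deg C")
```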
NASA Astrophysics Data System (ADS)
Lara, Mark J.; Nitze, Ingmar; Grosse, Guido; McGuire, A. David
2018-04-01
Arctic tundra landscapes are composed of a complex mosaic of patterned ground features, varying in soil moisture, vegetation composition, and surface hydrology over small spatial scales (10-100 m). The importance of microtopography and associated geomorphic landforms in influencing ecosystem structure and function is well founded; however, spatial data products describing the local to regional scale distribution of patterned ground or polygonal tundra geomorphology are largely unavailable. Thus, our understanding of local impacts on regional scale processes (e.g., carbon dynamics) may be limited. We produced two key spatiotemporal datasets spanning the Arctic Coastal Plain of northern Alaska (~60,000 km2) to evaluate climate-geomorphological controls on arctic tundra productivity change, using (1) a novel 30 m classification of polygonal tundra geomorphology and (2) decadal trends in surface greenness from the Landsat archive (1999-2014). These datasets can be easily integrated and adapted in an array of local to regional applications such as (1) upscaling plot-level measurements (e.g., carbon/energy fluxes), (2) mapping of soils, vegetation, or permafrost, and/or (3) initializing ecosystem biogeochemistry, hydrology, and/or habitat modeling.
Recent Development on the NOAA's Global Surface Temperature Dataset
NASA Astrophysics Data System (ADS)
Zhang, H. M.; Huang, B.; Boyer, T.; Lawrimore, J. H.; Menne, M. J.; Rennie, J.
2016-12-01
Global Surface Temperature (GST) is one of the most widely used indicators for climate trend and extreme analyses. A widely used GST dataset is the NOAA merged land-ocean surface temperature dataset known as NOAAGlobalTemp (formerly MLOST). NOAAGlobalTemp was recently updated from version 3.5.4 to version 4. The update includes a significant improvement in the ocean surface component (Extended Reconstructed Sea Surface Temperature, or ERSST, from version 3b to version 4), which resulted in increased temperature trends in recent decades. Since then, advancements in both the ocean component (ERSST) and the land component (GHCN-Monthly) have been made, including the incorporation of Argo float SSTs and expanded EOT modes in ERSST, and the use of the ISTI databank in GHCN-Monthly. In this presentation, we describe the impact of those improvements on the merged global temperature dataset, in terms of global trends and other aspects.
Gridded global surface ozone metrics for atmospheric chemistry model evaluation
NASA Astrophysics Data System (ADS)
Sofen, E. D.; Bowdalo, D.; Evans, M. J.; Apadula, F.; Bonasoni, P.; Cupeiro, M.; Ellul, R.; Galbally, I. E.; Girgzdiene, R.; Luppo, S.; Mimouni, M.; Nahas, A. C.; Saliba, M.; Tørseth, K.; WMO GAW, EPA AQS, EPA CASTNET, CAPMoN, NAPS, AirBase, EMEP, EANET ozone datasets, all other contributors to
2015-07-01
The concentration of ozone at the Earth's surface is measured at many locations across the globe for the purposes of air quality monitoring and atmospheric chemistry research. We have brought together all publicly available surface ozone observations from online databases from the modern era to build a consistent dataset for the evaluation of chemical transport and chemistry-climate (Earth System) models for projects such as the Chemistry-Climate Model Initiative and AerChemMIP. From a total dataset of approximately 6600 sites and 500 million hourly observations spanning 1971-2015, approximately 2200 sites and 200 million hourly observations pass screening as high-quality sites in regional background locations that are appropriate for use in global model evaluation. Data volume is generally good from the expansion of air quality monitoring networks around 1990 through 2013. Ozone observations are biased heavily toward North America and Europe, with sparse coverage over the rest of the globe. This dataset is made available for the purposes of model evaluation as a set of gridded metrics intended to describe the distribution of ozone concentrations on monthly and annual timescales. Metrics include the moments of the distribution, percentiles, maximum daily eight-hour average (MDA8), SOMO35, AOT40, and metrics related to air quality regulatory thresholds. Gridded datasets are stored as netCDF-4 files and are available to download from the British Atmospheric Data Centre (doi:10.5285/08fbe63d-fa6d-4a7a-b952-5932e3ab0452). We provide recommendations to the ozone measurement community regarding improving metadata reporting to simplify ongoing and future efforts in working with ozone data from disparate networks in a consistent manner.
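One of the metrics named above, the maximum daily eight-hour average (MDA8), can be sketched from an hourly ozone series as follows; the synthetic diurnal cycle is a placeholder, and real implementations also apply data-completeness rules not shown here.

```python
# MDA8: the daily maximum of forward-looking 8-hour running means of hourly
# ozone. Data are synthetic; completeness screening is omitted.
import numpy as np
import pandas as pd

idx = pd.to_datetime("2013-07-01") + pd.to_timedelta(np.arange(31 * 24), unit="h")
hour = idx.hour.to_numpy()
rng = np.random.default_rng(7)
ozone = 30 + 20 * np.sin((hour - 6) / 24 * 2 * np.pi).clip(0) + rng.normal(0, 3, idx.size)
o3 = pd.Series(ozone, index=idx, name="o3_ppb")

# 8-hour running means aligned to the start hour of each window, then the
# daily maximum of those means.
avg8 = o3.rolling(window=8, min_periods=6).mean().shift(-7)
mda8 = avg8.resample("D").max()
print(mda8.head())
```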
NASA Astrophysics Data System (ADS)
Arendt, A. A.; Houser, P.; Kapnick, S. B.; Kargel, J. S.; Kirschbaum, D.; Kumar, S.; Margulis, S. A.; McDonald, K. C.; Osmanoglu, B.; Painter, T. H.; Raup, B. H.; Rupper, S.; Tsay, S. C.; Velicogna, I.
2017-12-01
The High Mountain Asia Team (HiMAT) is an assembly of 13 research groups funded by NASA to improve understanding of cryospheric and hydrological changes in High Mountain Asia (HMA). Our project goals are to quantify historical and future variability in weather and climate over the HMA, partition the components of the water budget across HMA watersheds, explore physical processes driving changes, and predict couplings and feedbacks between physical and human systems through assessment of hazards and downstream impacts. These objectives are being addressed through analysis of remote sensing datasets combined with modeling and assimilation methods to enable data integration across multiple spatial and temporal scales. Our work to date has focused on developing improved high resolution precipitation, snow cover and snow water equivalence products through a variety of statistical uncertainty analysis, dynamical downscaling and assimilation techniques. These and other high resolution climate products are being used as input and validation for an assembly of land surface and General Circulation Models. To quantify glacier change in the region we have calculated multidecadal mass balances of a subset of HMA glaciers by comparing commercial satellite imagery with earlier elevation datasets. HiMAT is using these tools and datasets to explore the impact of atmospheric aerosols and surface impurities on surface energy exchanges, to determine drivers of glacier and snowpack melt rates, and to improve our capacity to predict future hydrological variability. Outputs from the climate and land surface assessments are being combined with landslide and glacier lake inventories to refine our ability to predict hazards in the region. Economic valuation models are also being used to assess impacts on water resources and hydropower. Field data of atmospheric aerosol, radiative flux and glacier lake conditions are being collected to provide ground validation for models and remote sensing products. In this presentation we will discuss initial results and outline plans for a scheduled release of our datasets and findings to the broader community. We will also describe our methods for cross-team collaboration through the adoption of cloud computing and data integration tools.
NASA Technical Reports Server (NTRS)
Minnis, Patrick; Smith, William L., Jr.; Bedka, Kristopher M.; Nguyen, Louis; Palikonda, Rabindra; Hong, Gang; Trepte, Qing Z.; Chee, Thad; Scarino, Benjamin; Spangenberg, Douglas A.;
2014-01-01
Cloud properties determined from satellite imager radiances provide a valuable source of information for nowcasting and weather forecasting. In recent years, it has been shown that assimilation of cloud top temperature, optical depth, and total water path can increase the accuracies of weather analyses and forecasts. Aircraft icing conditions can be accurately diagnosed from near-real-time (NRT) retrievals of cloud effective particle size, phase, and water path, providing valuable data for pilots. NRT retrievals of surface skin temperature can also be assimilated in numerical weather prediction models to provide more accurate representations of solar heating and longwave cooling at the surface, where convective initiation occurs. These and other applications are being exploited more frequently as the value of NRT cloud data becomes recognized. At NASA Langley, cloud properties and surface skin temperature are being retrieved in near-real time globally from both geostationary (GEO) and low-Earth orbiting (LEO) satellite imagers for weather model assimilation and nowcasting for hazards such as aircraft icing. Cloud data from GEO satellites over North America are disseminated through NCEP, while those data and global LEO and GEO retrievals are disseminated from a Langley website. This paper presents an overview of the various available datasets, provides examples of their application, and discusses the use of the various datasets downstream. Future challenges and areas of improvement are also presented.
NASA Astrophysics Data System (ADS)
Minnis, P.; Smith, W., Jr.; Bedka, K. M.; Nguyen, L.; Palikonda, R.; Hong, G.; Trepte, Q.; Chee, T.; Scarino, B. R.; Spangenberg, D.; Sun-Mack, S.; Fleeger, C.; Ayers, J. K.; Chang, F. L.; Heck, P. W.
2014-12-01
Cloud properties determined from satellite imager radiances provide a valuable source of information for nowcasting and weather forecasting. In recent years, it has been shown that assimilation of cloud top temperature, optical depth, and total water path can increase the accuracies of weather analyses and forecasts. Aircraft icing conditions can be accurately diagnosed from near-real-time (NRT) retrievals of cloud effective particle size, phase, and water path, providing valuable data for pilots. NRT retrievals of surface skin temperature can also be assimilated in numerical weather prediction models to provide more accurate representations of solar heating and longwave cooling at the surface, where convective initiation occurs. These and other applications are being exploited more frequently as the value of NRT cloud data becomes recognized. At NASA Langley, cloud properties and surface skin temperature are being retrieved in near-real time globally from both geostationary (GEO) and low-Earth orbiting (LEO) satellite imagers for weather model assimilation and nowcasting for hazards such as aircraft icing. Cloud data from GEO satellites over North America are disseminated through NCEP, while those data and global LEO and GEO retrievals are disseminated from a Langley website. This paper presents an overview of the various available datasets, provides examples of their application, and discusses the use of the various datasets downstream. Future challenges and areas of improvement are also presented.
NASA Astrophysics Data System (ADS)
Ryu, Youngryel; Jiang, Chongya
2016-04-01
To gain insights about the underlying impacts of global climate change on terrestrial ecosystem fluxes, we present long-term (1982-2015) global radiation, carbon and water flux products obtained by integrating multi-satellite data with a process-based model, the Breathing Earth System Simulator (BESS). BESS is a coupled process model that integrates radiative transfer in the atmosphere and canopy, photosynthesis (GPP), and evapotranspiration (ET). BESS was designed to be most sensitive to the variables that can be quantified reliably, taking full advantage of remote sensing atmospheric and land products. Originally, BESS relied entirely on MODIS input variables to produce global GPP and ET during the MODIS era. This study extends the work to provide a series of long-term products from 1982 to 2015 by incorporating AVHRR data. In addition to GPP and ET, more datasets related to land surface processes are mapped to facilitate the discovery of ecological variations and changes. The CLARA-A1 cloud property datasets and the TOMS aerosol datasets, along with the GLASS land surface albedo datasets, were input to a look-up table derived from an atmospheric radiative transfer model to produce direct and diffuse components of visible and near-infrared radiation datasets. These radiation components, together with the LAI3g datasets and the GLASS land surface albedo datasets, were used to calculate absorbed radiation through a clumping-corrected two-stream canopy radiative transfer model. ECMWF ERA-Interim air temperature data were downscaled using the ALP-II land surface temperature dataset and a region-dependent regression model. The spatial and seasonal variations of CO2 concentration were accounted for using OCO-2 datasets, whereas NOAA's global CO2 growth rate data were used to describe interannual variations. All these remote-sensing-based datasets are used to run BESS. Daily fluxes at 1/12 degree were computed and then aggregated to half-month intervals to match the spatiotemporal resolution of the LAI3g dataset. The BESS GPP and ET products were compared to other independent datasets including MPI-BGC and CLM. Overall, the BESS products show good agreement with the other two datasets, indicating a compelling potential for bridging remote sensing and land surface models.
NASA Astrophysics Data System (ADS)
Moise Famien, Adjoua; Janicot, Serge; Delfin Ochou, Abe; Vrac, Mathieu; Defrance, Dimitri; Sultan, Benjamin; Noël, Thomas
2018-03-01
The objective of this paper is to present a new dataset of bias-corrected CMIP5 global climate model (GCM) daily data over Africa. This dataset was obtained using the cumulative distribution function transform (CDF-t) method, a method that has been applied to several regions and contexts but never to Africa. Here, CDF-t has been applied over the period 1950-2099, combining Historical runs and climate change scenarios for six variables: precipitation, mean near-surface air temperature, near-surface maximum air temperature, near-surface minimum air temperature, surface downwelling shortwave radiation, and wind speed, which are critical variables for agricultural purposes. WFDEI has been used as the reference dataset to correct the GCMs. Evaluation of the results over West Africa has been carried out on a list of priority user-based metrics that were discussed and selected with stakeholders. It includes simulated yields from a crop model of maize growth. These bias-corrected GCM data have been compared with another available dataset of bias-corrected GCMs using WATCH Forcing Data as the reference dataset. The impact of the WFD, WFDEI, and EWEMBI reference datasets has also been examined in detail. It is shown that CDF-t is very effective at removing the biases and reducing the high inter-GCM scattering. Differences with other bias-corrected GCM data are mainly due to the differences among the reference datasets. This is particularly true for surface downwelling shortwave radiation, which has a significant impact in terms of simulated maize yields. Projections of future yields over West Africa are quite different, depending on the bias-correction method used. However, all these projections show a similar relative decreasing trend over the 21st century.
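To illustrate the family of CDF-based corrections that CDF-t belongs to, the sketch below applies plain empirical quantile mapping to synthetic samples; CDF-t itself additionally transfers the calibration-period transform to the future period, which is not reproduced here, and the "GCM" and "reference" samples are synthetic stand-ins.

```python
# Empirical quantile mapping: map model values through the model-historical
# CDF onto the reference CDF. A simplified relative of CDF-t, not CDF-t itself.
import numpy as np

rng = np.random.default_rng(8)
ref_hist = rng.gamma(shape=2.0, scale=3.0, size=5000)        # reference climate
gcm_hist = rng.gamma(shape=2.0, scale=4.0, size=5000) + 1.0  # biased GCM, historical
gcm_fut = rng.gamma(shape=2.0, scale=4.5, size=5000) + 1.5   # biased GCM, future

def quantile_map(x, model_hist, ref):
    quantiles = np.linspace(0.0, 1.0, 1001)
    model_q = np.quantile(model_hist, quantiles)
    ref_q = np.quantile(ref, quantiles)
    return np.interp(x, model_q, ref_q)

corrected_fut = quantile_map(gcm_fut, gcm_hist, ref_hist)
print(f"historical bias before/after: "
      f"{gcm_hist.mean() - ref_hist.mean():+.2f} / "
      f"{quantile_map(gcm_hist, gcm_hist, ref_hist).mean() - ref_hist.mean():+.2f}")
print(f"future mean, raw vs corrected: {gcm_fut.mean():.2f} vs {corrected_fut.mean():.2f}")
```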
NASA Astrophysics Data System (ADS)
Newman, A. J.; Clark, M. P.; Nijssen, B.; Wood, A.; Gutmann, E. D.; Mizukami, N.; Longman, R. J.; Giambelluca, T. W.; Cherry, J.; Nowak, K.; Arnold, J.; Prein, A. F.
2016-12-01
Gridded precipitation and temperature products are inherently uncertain due to myriad factors. These include interpolation from a sparse observation network, measurement representativeness, and measurement errors. Despite this inherent uncertainty, uncertainty estimates are typically not included, or are dataset-specific additions with little general applicability across different datasets. A lack of quantitative uncertainty estimates for hydrometeorological forcing fields limits their utility to support land surface and hydrologic modeling techniques such as data assimilation, probabilistic forecasting and verification. To address this gap, we have developed a first-of-its-kind gridded, observation-based ensemble of precipitation and temperature at a daily increment for the period 1980-2012 over the United States (including Alaska and Hawaii). A longer, higher resolution version (1970-present, 1/16th degree) has also been implemented to support real-time hydrologic monitoring and prediction in several regional US domains. We will present the development and evaluation of the dataset, along with initial applications of the dataset for ensemble data assimilation and probabilistic evaluation of high resolution regional climate model simulations. We will also present results on the new high resolution products for Alaska and Hawaii (2 km and 250 m, respectively), to complete the first ensemble observation-based product suite for the entire 50 states. Finally, we will present plans to improve the ensemble dataset, focusing on efforts to improve the methods used for station interpolation and ensemble generation, as well as methods to fuse station data with numerical weather prediction model output.
NASA Technical Reports Server (NTRS)
Zhang, Yuanchong; Rossow, William B.; Stackhouse, Paul W., Jr.
2007-01-01
Direct estimates of surface radiative fluxes that resolve regional and weather-scale variability over the whole globe with reasonable accuracy have only become possible with the advent of extensive global, mostly satellite, datasets within the past couple of decades. The accuracy of these fluxes, estimated to be about 10-15 W per square meter, is largely limited by the accuracy of the input datasets. The leading uncertainties in the surface fluxes are no longer predominantly induced by clouds but are now as much associated with uncertainties in the surface and near-surface atmospheric properties. This study presents a fuller, more quantitative evaluation of the uncertainties for the surface albedo and emissivity and surface skin temperatures by comparing the main available global datasets from the Moderate-Resolution Imaging Spectroradiometer product, the NASA Global Energy and Water Cycle Experiment Surface Radiation Budget project, the European Centre for Medium-Range Weather Forecasts, the National Aeronautics and Space Administration, the National Centers for Environmental Prediction, the International Satellite Cloud Climatology Project (ISCCP), the Laboratoire de Meteorologie Dynamique, the NOAA/NASA Pathfinder Advanced Very High Resolution Radiometer project, the NOAA Optimum Interpolation Sea Surface Temperature Analysis and the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager project. The datasets are, in practice, treated as an ensemble of realizations of the actual climate such that their differences represent an estimate of the uncertainty in their measurements, because we do not possess global truth datasets for these quantities. The results are globally representative and may be taken as a generalization of our previous ISCCP-based uncertainty estimates for the input datasets. Surface properties have the primary role in determining the surface upward shortwave (SW) and longwave (LW) flux. From this study, the following conclusions are obtained. Although land surface albedos in the near-infrared remain poorly constrained (highly uncertain), they do not cause large errors in total surface SW fluxes; the more subtle regional and seasonal variations associated with vegetation and snow are still in doubt. The uncertainty of the broadband black-sky SW albedo for the land surface from this study is about 7%, which can easily induce 5-10 W per square meter uncertainty in (upwelling) surface SW flux estimates. Even though available surface (broadband) LW emissivity datasets differ significantly (3%-5% uncertainty), this disagreement is confined to wavelengths greater than 20 micrometers, so that there is little practical effect (1-3 W per square meter) on the surface upwelling LW fluxes. The surface skin temperature is one of two leading factors that cause problems with surface LW fluxes. Even though the differences among the various datasets are generally only 2-4 K, this can easily cause 10-15 W per square meter uncertainty in calculated surface (upwelling) LW fluxes. Significant improvements could be obtained for surface LW flux calculations by improving the retrievals of (in order of decreasing importance): (1) surface skin temperature, (2) surface air and near-surface-layer temperature, (3) column precipitable water amount and (4) broadband emissivity.
For surface SW fluxes, improvements could be obtained (excluding improved cloud treatment) by improving the retrievals of (1) aerosols (from our sensitivity studies, but not discussed in this work) and (2) surface (black-sky) albedo, of which the NIR part of the spectrum has the larger uncertainty.
NASA Technical Reports Server (NTRS)
Coy, James; Schultz, Christopher J.; Case, Jonathan L.
2017-01-01
Can modeled information about the land surface, together with lightning characteristics beyond flash occurrence, improve the identification and prediction of wildfires? The approach combines observed cloud-to-ground (CG) flashes with real-time land surface model output and compares them against areas where lightning did not start a wildfire, to determine which land surface conditions and lightning characteristics were responsible for fire starts. Statistical differences between suspected fire-starters and non-fire-starters were peak-current dependent. Comparisons of 0-10 cm volumetric and relative soil moisture were statistically dependent to at least the p = 0.05 level for both flash polarities, and suspected fire-starters typically occurred in areas of lower soil moisture than non-fire-starters. Green vegetation fraction (GVF) comparisons were found to be statistically dependent only for -CG flashes; however, random sampling of the -CG non-fire-starter dataset revealed that this relationship may not always hold.
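A hedged sketch of the kind of two-sample comparison described above: 0-10 cm soil moisture associated with suspected fire-starting CG flashes is tested against that of non-fire-starters. The Mann-Whitney U test and the synthetic samples are illustrative choices, not necessarily those used in the study.

```python
# Two-sample comparison of soil moisture for fire-starting vs non-fire-starting
# CG flashes (synthetic data, illustrative test choice).
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(9)
sm_fire_starters = rng.normal(0.12, 0.04, size=150).clip(0.02, 0.45)
sm_non_starters = rng.normal(0.18, 0.05, size=600).clip(0.02, 0.45)

stat, p = mannwhitneyu(sm_fire_starters, sm_non_starters, alternative="less")
print(f"median SM: fire-starters {np.median(sm_fire_starters):.3f}, "
      f"non-starters {np.median(sm_non_starters):.3f}, p = {p:.3g}")
```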
An empirical understanding of triple collocation evaluation measure
NASA Astrophysics Data System (ADS)
Scipal, Klaus; Doubkova, Marcela; Hegyova, Alena; Dorigo, Wouter; Wagner, Wolfgang
2013-04-01
The triple collocation method is an advanced evaluation method that has been used in the soil moisture field for only about half a decade. The method requires three datasets with independent error structures that represent an identical phenomenon. The main advantages of the method are that it (a) does not require a reference dataset assumed to represent the truth, (b) limits the effect of random and systematic errors of the other two datasets, and (c) simultaneously assesses the errors of all three datasets. The objective of this presentation is to assess the triple collocation error (Tc) of the ASAR Global Mode Surface Soil Moisture (GM SSM) 1 km dataset and to highlight problems of the method related to its ability to cancel the effect of errors in the ancillary datasets. In particular, the goals are (a) to investigate trends in Tc related to the change in spatial resolution from 5 to 25 km, (b) to investigate trends in Tc related to the choice of a hydrological model, and (c) to study the relationship between Tc and other absolute evaluation methods (namely RMSE and error propagation, EP). The triple collocation method is implemented using ASAR GM, AMSR-E, and a model (either AWRA-L, GLDAS-NOAH, or ERA-Interim). First, the significance of the relationship between the three soil moisture datasets, which is a prerequisite for the triple collocation method, was tested. Second, the trends in Tc related to the choice of the third reference dataset and scale were assessed; for this purpose the triple collocation is repeated replacing AWRA-L with two different globally available model reanalysis datasets operating at different spatial resolutions (ERA-Interim and GLDAS-NOAH). Finally, the retrieved results were compared to the results of the RMSE and EP evaluation measures. Our results demonstrate that the Tc method does not eliminate the random and time-variant systematic errors of the second and third datasets used in the Tc. Possible reasons include (a) that the TC method cannot fully function with datasets at very different spatial resolutions, or (b) that the errors were not fully independent, as initially assumed.
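For reference, the standard covariance formulation of triple collocation can be sketched in a few lines: with sample covariances C of three collocated datasets whose errors are mutually independent, the error variance of the first dataset is C11 - C12*C13/C23, and analogously for the other two. The series below are synthetic stand-ins for ASAR GM, AMSR-E and a model.

```python
# Covariance-based triple collocation error variances for three collocated
# soil moisture series with (assumed) mutually independent errors.
import numpy as np

rng = np.random.default_rng(10)
truth = 0.2 + 0.1 * np.sin(np.linspace(0, 20, 2000)) + 0.02 * rng.standard_normal(2000)
asar_gm = truth + rng.normal(0, 0.04, truth.size)
amsre   = 0.9 * truth + rng.normal(0, 0.03, truth.size)   # multiplicative bias allowed
model   = truth + rng.normal(0, 0.02, truth.size)

c = np.cov(np.vstack([asar_gm, amsre, model]))
err_var = np.array([
    c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2],   # ASAR GM
    c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2],   # AMSR-E
    c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1],   # model
])
print("TC error std dev:", np.round(np.sqrt(err_var.clip(min=0)), 3))
```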
Mapping Global Ocean Surface Albedo from Satellite Observations: Models, Algorithms, and Datasets
NASA Astrophysics Data System (ADS)
Li, X.; Fan, X.; Yan, H.; Li, A.; Wang, M.; Qu, Y.
2018-04-01
Ocean surface albedo (OSA) is one of the important parameters in the surface radiation budget (SRB). It is usually considered a controlling factor of the heat exchange between the atmosphere and the ocean. The temporal and spatial dynamics of OSA determine the energy absorption of upper-level ocean water and influence oceanic currents, atmospheric circulations, and the transport of material and energy in the hydrosphere. Therefore, various parameterizations and models have been developed for describing the dynamics of OSA. However, it has been demonstrated that the currently available OSA datasets cannot fulfill the requirements of global climate change studies. In this study, we present a literature review on mapping global OSA from satellite observations. The models (parameterizations, the coupled ocean-atmosphere radiative transfer (COART) model, and the three-component ocean water albedo (TCOWA) model), algorithms (the estimation method based on reanalysis data, and the direct-estimation algorithm), and datasets (the cloud, albedo and radiation (CLARA) surface albedo product, the dataset derived with the TCOWA model, and the global land surface satellite (GLASS) phase-2 surface broadband albedo product) of OSA are discussed separately.
NASA Technical Reports Server (NTRS)
Stubenrauch, C. J.; Rossow, W. B.; Kinne, S.; Ackerman, S.; Cesana, G.; Chepfer, H.; Getzewich, B.; Di Girolamo, L.; Guignard, A.; Heidinger, A.;
2012-01-01
Clouds cover about 70% of the Earth's surface and play a dominant role in the energy and water cycle of our planet. Only satellite observations provide a continuous survey of the state of the atmosphere over the whole globe and across the wide range of spatial and temporal scales that comprise weather and climate variability. Satellite cloud data records now exceed 25 years in length. However, climatologies compiled from different satellite datasets can exhibit systematic biases. Questions therefore arise as to the accuracy and limitations of the various sensors. The Global Energy and Water cycle Experiment (GEWEX) Cloud Assessment, initiated in 2005 by the GEWEX Radiation Panel, provided the first coordinated intercomparison of publicly available, standard global cloud products (gridded, monthly statistics) retrieved from measurements of multi-spectral imagers (some with multiangle view and polarization capabilities), IR sounders and lidar. Cloud properties under study include cloud amount, cloud height (in terms of pressure, temperature or altitude), cloud radiative properties (optical depth or emissivity), cloud thermodynamic phase and bulk microphysical properties (effective particle size and water path). Differences in average cloud properties, especially in the amount of high-level clouds, are mostly explained by the inherent instrument measurement capability for detecting and/or identifying optically thin cirrus, especially when overlying low-level clouds. The study of long-term variations with these datasets requires consideration of many factors. A monthly, gridded database, in common format, facilitates further assessments, climate studies and the evaluation of climate models.
NASA Astrophysics Data System (ADS)
Zhao, Chunhong
2018-04-01
The Local Climate Zones (LCZ) concept was initiated in 2012 to improve the documentation of Urban Heat Island (UHI) observations. Despite the indispensable role and initial aim of the LCZ concept in metadata reporting for atmospheric UHI research, its role in surface UHI (SUHI) investigation also needs to be emphasized. This study incorporated the LCZ concept to study the SUHI effect for San Antonio, Texas. An LCZ map was developed using a GIS-based LCZ classification scheme with the aid of an airborne lidar dataset and other freely available GIS data. The summer land surface temperature (LST) was then calculated from Landsat imagery and used to analyse the relations between LST and LCZs and the statistical significance of the differences in LST among the typical LCZs, in order to test whether LCZs can efficiently facilitate SUHI investigation. The linkage of LCZs and LST indicated that the LCZ map can be used to compare and investigate the SUHI. Most pairs of LCZs showed statistically significant differences in average LST. The intra-urban temperature comparison among different urban classes contributes to investigating the influence of heterogeneous urban morphology on local climate formation.
Resource Purpose: The National Hydrography Dataset (NHD) is a comprehensive set of digital spatial data that contains information about surface water features such as lakes, ponds, streams, rivers, springs and wells. Within the NHD, surface water features are combined to fo...
Polarimetric Signatures of Initiating Convection During MC3E
NASA Technical Reports Server (NTRS)
Emory, Amber
2012-01-01
One of the goals of the Mid-latitude Continental Convective Clouds Experiment (MC3E) field campaign was to provide constraints for space-based rainfall retrieval algorithms over land. This study used datasets collected during the 2011 field campaign to combine radiometer and ground-based radar polarimetric retrievals in order to better understand hydrometeor type, habit and distribution for initiating continental convection. Cross-track and conically scanning nadir views from the Conical Scanning Millimeter-wave Imaging Radiometer (CoSMIR) were compared with ground-based polarimetric radar retrievals along the ER-2 flight track. Polarimetric signatures for both airborne radiometers and ground-based radars were well co-located with deep convection to relate radiometric signatures with low-level polarimetric radar data for hydrometeor identification and diameter estimation. For the time period of study, differential reflectivity (ZDR) values indicated no hail at the surface. However, the ZDR column extended well above the melting level into the mixed-phase region, suggesting a possible source of frozen drop embryos for the future formation of hail. The results of this study contribute ground-truth datasets for GPM PR algorithm development for convective events, an improvement upon the previous stratiform-precipitation-centered framework.
A Combined Surface Temperature Dataset for the Arctic from MODIS and AVHRR
NASA Astrophysics Data System (ADS)
Dodd, E.; Veal, K. L.; Ghent, D.; Corlett, G. K.; Remedios, J. J.
2017-12-01
Surface Temperature (ST) changes in the Polar Regions are predicted to be more rapid than either global averages or responses in lower latitudes. Observations of STs and other changes associated with climate change increasingly confirm these predictions in the Arctic. Furthermore, recent high-profile events of anomalously warm temperatures have increased interest in Arctic surface temperatures. It is, therefore, particularly important to monitor Arctic climate change. The Polar Regions are well served by low-Earth orbiting satellites, making satellite observation particularly relevant there. Whilst clouds often cause problems for satellite observations of the surface, in situ observations of STs are much sparser. Previous work at the University of Leicester has produced a combined land, ocean and ice ST dataset for the Arctic using ATSR data (AAST), which covers the period 1995 to 2012. In order to facilitate investigation of more recent changes in the Arctic (2010 to 2016), we have produced another combined surface temperature dataset using MODIS and AVHRR: the Metop-A AVHRR and MODIS Arctic Surface Temperature dataset (AMAST). The method of cloud-clearing, the use of auxiliary data for ice classification, and the ST retrievals used for each surface type in AMAST will be described. AAST and AMAST were compared in the time period common to both datasets. We will provide results from this intercomparison, as well as an assessment of the impact of utilising data from wide and narrow swath sensors. Time series of ST anomalies over the Arctic region produced from AMAST will be presented.
How are the wetlands over tropical basins impacted by the extreme hydrological events?
NASA Astrophysics Data System (ADS)
Al-Bitar, A.; Parrens, M.; Frappart, F.; Papa, F.; Kerr, Y. H.; Cretaux, J. F.; Wigneron, J. P.
2016-12-01
Wetlands play a crucial role in tropical basins, and many questions remain unanswered on how extreme events (such as El Niño) impact them. Answering these questions is challenging, as monitoring of inland water surfaces via remote sensing over tropical areas is a difficult task because of the impact of vegetation and cloud cover. Several microwave-based products have been developed to monitor these surfaces (Papa et al. 2010). In this study we combine the use of L-band microwave brightness temperatures and altimetric data from SARAL/ALTIKA to derive water storage maps at relatively high (7-day) temporal frequency. The area of interest covers the Amazon, Congo and GBH basins. A first-order radiative model is used to derive surface water over land from the brightness temperature measured by the ESA SMOS mission at coarse resolution (25 km x 25 km) and 7-day frequency. An initial investigation of the use of the SMAP mission for the same purpose will also be presented. The product is compared to static land cover maps such as ESA CCI and the International Geosphere-Biosphere Program (IGBP) land cover, and also to dynamic maps from SWAPS. It is then combined with the altimetric data to derive water storage maps. The water surface and water storage products are then compared to precipitation data from GPM and TRMM datasets, groundwater storage change from GRACE, and river discharge from field data. The amplitudes and time shifts of the signals are compared based on the sub-basin definitions from the Hydroshed database. The dataset is then divided into years of strong and weak El Niño signal, and the anomaly between the two subsets is compared. The results show a strong influence of El Niño on the time shifts of the different components, indicating that the hydrological regime of wetlands is highly impacted by these extreme events. This can have dramatic impacts on the ecosystem, as wetlands are vulnerable and host high biodiversity.
NASA Technical Reports Server (NTRS)
Kaplan, Michael L.; Lin, Yuh-Lang
2004-01-01
During the research project, sounding datasets were generated for the regions surrounding 9 major airports, including Dallas, TX, Boston, MA, New York, NY, Chicago, IL, St. Louis, MO, Atlanta, GA, Miami, FL, San Francisco, CA, and Los Angeles, CA. The numerical simulation of winter and summer environments during which no instrument flight rule impact was occurring at these 9 terminals was performed using the most contemporary version of the Terminal Area PBL Prediction System (TAPPS) model, nested from 36 km to 6 km to 1 km horizontal resolution and with very detailed vertical resolution in the planetary boundary layer. The soundings from the 1 km model were archived at 30-minute intervals for a 24-hour period, and the vertical dependent variables as well as derived quantities (i.e., 3-dimensional wind components, temperatures, pressures, mixing ratios, turbulence kinetic energy and eddy dissipation rates) were then interpolated to 5 m vertical resolution up to 1000 m elevation above ground level. After partial validation against field experiment datasets for Dallas, as well as larger-scale and much coarser resolution observations at the other 8 airports, these sounding datasets were sent to NASA for use in the Virtual Air Space and Modeling program. The application of these datasets is to determine representative airport weather environments in order to diagnose the response of simulated wake vortices to realistic atmospheric environments. These virtual datasets are based on large-scale observed atmospheric initial conditions that are dynamically interpolated in space and time. The 1 km nested-grid simulated datasets provide a coarse and highly smoothed representation of airport-environment meteorological conditions. Details concerning the airport surface forcing are virtually absent from these simulated datasets, although the simulated fields have been compared with the observed background atmospheric processes and were found to accurately replicate the flows surrounding the airports where coarse verification data, as well as airport-scale datasets, were available.
NASA Technical Reports Server (NTRS)
Mocko, David M.; Rui, Hualan; Acker, James G.
2013-01-01
The North American Land Data Assimilation System (NLDAS) is a collaboration project among NASA/GSFC, NOAA, Princeton Univ., and the Univ. of Washington. NLDAS has created a surface meteorology dataset using the best-available observations and reanalyses; the backbone of this dataset is a gridded precipitation analysis from rain gauges. This dataset is used to drive four separate land-surface models (LSMs) to produce datasets of soil moisture, snow, runoff, and surface fluxes. NLDAS datasets are available hourly and extend from Jan 1979 to near real-time with a typical 4-day lag. The datasets are available at 1/8th-degree over CONUS and portions of Canada and Mexico from 25 to 53 degrees North. The datasets have been extensively evaluated against observations, and are also used as part of a drought monitor. NLDAS datasets are available from the NASA GES DISC and can be accessed via ftp, GDS, Mirador, and Giovanni. GES DISC news articles were published showing figures from the heat wave of 2011, Hurricane Irene, Tropical Storm Lee, and the low-snow winter of 2011-2012. For this presentation, Giovanni-generated figures using NLDAS data from the derecho across the U.S. Midwest and Mid-Atlantic will be presented. Also, similar figures will be presented from the landfall of Hurricane Isaac and the before-and-after drought conditions along the path of tropical moisture into the central United States. Updates on future products and datasets from the NLDAS project will also be introduced.
MatchingLand, geospatial data testbed for the assessment of matching methods.
Xavier, Emerson M A; Ariza-López, Francisco J; Ureña-Cámara, Manuel A
2017-12-05
This article presents datasets prepared with the aim of supporting the evaluation of geospatial matching methods for vector data. These datasets were built up from mapping data produced by official Spanish mapping agencies. The testbed supplied encompasses the three geometry types: point, line and area. Initial datasets were subjected to geometric transformations in order to generate synthetic datasets. These transformations represent factors that might influence the performance of geospatial matching methods, such as the morphology of linear or areal features, systematic transformations, and random disturbance of the initial data. We call our 11 GiB benchmark dataset 'MatchingLand', and we hope it will be useful for the geographic information science research community.
The Surface Radiation Budget over Oceans and Continents.
NASA Astrophysics Data System (ADS)
Garratt, J. R.; Prata, A. J.; Rotstayn, L. D.; McAvaney, B. J.; Cusack, S.
1998-08-01
An updated evaluation of the surface radiation budget in climate models (1994-96 versions; seven datasets available, with and without aerosols) and in two new satellite-based global datasets (with aerosols) is presented. All nine datasets capture the broad mean monthly zonal variations in the flux components and in the net radiation, with maximum differences of some 100 W m⁻² occurring in the downwelling fluxes at specific latitudes. Using long-term surface observations, both from land stations and the Pacific warm pool (with typical uncertainties in the annual values varying between ±5 and 20 W m⁻²), excess net radiation (RN) and downwelling shortwave flux density (So) are found in all datasets, consistent with results from earlier studies [for global land, excesses of 15%-20% (12 W m⁻²) in RN and about 12% (20 W m⁻²) in So]. For the nine datasets combined, the spread in annual fluxes is significant: for RN, it is 15 (50) W m⁻² over global land (Pacific warm pool) in an observed annual mean of 65 (135) W m⁻²; for So, it is 25 (60) W m⁻² over land (warm pool) in an annual mean of 176 (197) W m⁻². The effects of aerosols are included in three of the authors' datasets, based on simple aerosol climatologies and assumptions regarding aerosol optical properties. They offer guidance on the broad impact of aerosols on climate, suggesting that the inclusion of aerosols in models would reduce the annual So by 15-20 W m⁻² over land and 5-10 W m⁻² over the oceans. Model differences in cloud cover contribute to differences in So between datasets; for global land, this is most clearly demonstrated through the effects of cloud cover on the surface shortwave cloud forcing. The tendency for most datasets to underestimate cloudiness, particularly over global land, and possibly to underestimate atmospheric water vapor absorption, probably contributes to the excess downwelling shortwave flux at the surface.
NASA Astrophysics Data System (ADS)
Ott, L.; Sellers, P. J.; Schimel, D.; Moore, B., III; O'Dell, C.; Crowell, S.; Kawa, S. R.; Pawson, S.; Chatterjee, A.; Baker, D. F.; Schuh, A. E.
2017-12-01
Satellite observations of carbon dioxide (CO2) and methane (CH4) are critically needed to improve understanding of the contemporary carbon budget and carbon-climate feedbacks. Though current carbon observing satellites have provided valuable data in regions not covered by surface in situ measurements, limited sampling of key regions and small but spatially coherent biases have limited the ability to estimate fluxes at the time and space scales needed for improved process-level understanding and informed decision-making. Next generation satellites will improve coverage in data sparse regions, either through use of active remote sensing, a geostationary vantage point, or increased swath width, but all techniques have limitations. The relative strengths and weaknesses of these approaches and their synergism have not previously been examined. To address these needs, a significant subset of the US carbon modeling community has come together with support from NASA to conduct a series of coordinated observing system simulation experiments (OSSEs), with close collaboration in framing the experiments and in analyzing the results. Here, we report on the initial phase of this initiative, which focused on creating realistic, physically consistent synthetic CO2 and CH4 observational datasets for use in inversion and signal detection experiments. These datasets have been created using NASA's Goddard Earth Observing System Model (GEOS) to represent the current state of atmospheric carbon as well as best available estimates of expected flux changes. Scenarios represented include changes in urban emissions, release of permafrost soil carbon, changes in carbon uptake in tropical and mid-latitude forests, changes in the Southern Ocean sink, and changes in both anthropogenic and natural methane emissions. This GEOS carbon "nature run" was sampled by instrument simulators representing the most prominent observing strategies with a focus on consistently representing the impacts of random errors and limitations in viewing due to clouds and aerosols. Statistical analyses of these synthetic datasets provide a simple, objective method for evaluating mission design choices. These datasets will also be made publicly available for use by the international carbon modeling community and in mission planning activities.
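A toy sketch of the sampling step described above: drawing "true" column CO2 from a nature run along a track, screening cloudy scenes, and adding random retrieval noise. The thresholds, error levels, and array names are illustrative assumptions, not GEOS or instrument-simulator settings:

```python
import numpy as np

rng = np.random.default_rng(42)

n_soundings = 10_000
xco2_true = 400.0 + rng.normal(0.0, 2.0, n_soundings)      # ppm, synthetic "truth"
cloud_fraction = rng.uniform(0.0, 1.0, n_soundings)

clear = cloud_fraction < 0.2          # simple cloud-screening threshold
sigma_retrieval = 0.8                 # ppm, assumed single-sounding random error

xco2_obs = np.where(
    clear,
    xco2_true + rng.normal(0.0, sigma_retrieval, n_soundings),
    np.nan,                           # cloudy scenes yield no retrieval
)

yield_fraction = np.isfinite(xco2_obs).mean()   # fraction of usable soundings
```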
NASA Astrophysics Data System (ADS)
Moritz, R. E.; Rigor, I.
2006-12-01
The Arctic Buoy Program was initiated in 1978 to measure surface air pressure, surface temperature, and sea-ice motion in the Arctic Ocean, on the space and time scales of synoptic weather systems, and to make the data available for research, forecasting, and operations. The program, subsequently renamed the International Arctic Buoy Programme (IABP), has endured and expanded over the past 28 years. A hallmark of the IABP is the production, dissemination, and archival of research-quality datasets and analyses. These datasets have been used by the authors of over 500 papers on meteorology, sea-ice physics, oceanography, air-sea interactions, climate, remote sensing, and other topics. Elements of the IABP are described briefly, including measurements, analysis, data dissemination, and data archival. Selected highlights of the research applications are reviewed, including ice dynamics, ocean-ice modeling, low-frequency variability of Arctic air-sea-ice circulation, and recent changes in the age, thickness, and extent of Arctic sea ice. The extended temporal coverage of the data disseminated on the Environmental Working Group CDs is important for interpreting results in the context of climate.
Lara, Mark J.; Nitze, Ingmar; Grosse, Guido; McGuire, A. David
2018-01-01
Arctic tundra landscapes are composed of a complex mosaic of patterned ground features, varying in soil moisture, vegetation composition, and surface hydrology over small spatial scales (10–100 m). The importance of microtopography and associated geomorphic landforms in influencing ecosystem structure and function is well founded; however, spatial data products describing the local to regional scale distribution of patterned ground or polygonal tundra geomorphology are largely unavailable. Thus, our understanding of local impacts on regional scale processes (e.g., carbon dynamics) may be limited. We produced two key spatiotemporal datasets spanning the Arctic Coastal Plain of northern Alaska (~60,000 km2) to evaluate climate-geomorphological controls on arctic tundra productivity change, using (1) a novel 30 m classification of polygonal tundra geomorphology and (2) decadal trends in surface greenness using the Landsat archive (1999–2014). These datasets can be easily integrated and adapted in an array of local to regional applications such as (1) upscaling plot-level measurements (e.g., carbon/energy fluxes), (2) mapping of soils, vegetation, or permafrost, and/or (3) initializing ecosystem biogeochemistry, hydrology, and/or habitat modeling. PMID:29633984
Preprocessed Consortium for Neuropsychiatric Phenomics dataset.
Gorgolewski, Krzysztof J; Durnez, Joke; Poldrack, Russell A
2017-01-01
Here we present preprocessed MRI data of 265 participants from the Consortium for Neuropsychiatric Phenomics (CNP) dataset. The preprocessed dataset includes minimally preprocessed data in the native, MNI, and surface spaces, accompanied by potential confound regressors, tissue probability masks, brain masks, and transformations. In addition, the preprocessed dataset includes unthresholded group-level and single-subject statistical maps from all tasks included in the original dataset. We hope that the availability of this dataset will greatly accelerate research.
Comparison of Radiative Energy Flows in Observational Datasets and Climate Modeling
NASA Technical Reports Server (NTRS)
Raschke, Ehrhard; Kinne, Stefan; Rossow, William B.; Stackhouse, Paul W. Jr.; Wild, Martin
2016-01-01
This study examines radiative flux distributions and local spread of values from three major observational datasets (CERES, ISCCP, and SRB) and compares them with results from climate modeling (CMIP3). Examinations of the spread and differences also differentiate among contributions from cloudy and clear-sky conditions. The spread among observational datasets is in large part caused by noncloud ancillary data. Average differences of at least 10 W m⁻² each for clear-sky downward solar, upward solar, and upward infrared fluxes at the surface demonstrate via spatial difference patterns major differences in assumptions for atmospheric aerosol, solar surface albedo and surface temperature, and/or emittance in observational datasets. At the top of the atmosphere (TOA), observational datasets are less influenced by the ancillary data errors than at the surface. Comparisons of spatial radiative flux distributions at the TOA between observations and climate modeling indicate large deficiencies in the strength and distribution of model-simulated cloud radiative effects. Differences are largest for lower-altitude clouds over low-latitude oceans. Global modeling simulates stronger cloud radiative effects (CRE) by +30 W m⁻² over trade wind cumulus regions, yet smaller CRE by about -30 W m⁻² over (smaller in area) stratocumulus regions. At the surface, climate modeling simulates on average about 15 W m⁻² smaller radiative net flux imbalances, as if climate modeling underestimates latent heat release (and precipitation). Relative to observational datasets, simulated surface net fluxes are particularly lower over oceanic trade wind regions (where global modeling tends to overestimate the radiative impact of clouds). Still, with the uncertainty in noncloud ancillary data, observational data do not establish a reliable reference.
REM-3D Reference Datasets: Reconciling large and diverse compilations of travel-time observations
NASA Astrophysics Data System (ADS)
Moulik, P.; Lekic, V.; Romanowicz, B. A.
2017-12-01
A three-dimensional Reference Earth model (REM-3D) should ideally represent the consensus view of long-wavelength heterogeneity in the Earth's mantle through the joint modeling of large and diverse seismological datasets. This requires reconciliation of datasets obtained using various methodologies and identification of consistent features. The goal of the REM-3D datasets is to provide a quality-controlled and comprehensive set of seismic observations that would not only enable construction of REM-3D, but also allow identification of outliers and assist in more detailed studies of heterogeneity. The community response to data solicitation has been enthusiastic, with several groups across the world contributing recent measurements of normal modes, (fundamental mode and overtone) surface waves, and body waves. We present results from ongoing work with body and surface wave datasets analyzed in consultation with a Reference Dataset Working Group. We have formulated procedures for reconciling travel-time datasets that include: (1) quality control for salvaging missing metadata; (2) identification of and reasons for discrepant measurements; (3) homogenization of coverage through the construction of summary rays; and (4) inversions of structure at various wavelengths to evaluate inter-dataset consistency. In consultation with the Reference Dataset Working Group, we retrieved the station and earthquake metadata in several legacy compilations and codified several guidelines that would facilitate easy storage and reproducibility. We find strong agreement between the dispersion measurements of fundamental-mode Rayleigh waves, particularly when made using supervised techniques. The agreement deteriorates substantially in surface-wave overtones, for which discrepancies vary with frequency and overtone number. A half-cycle band of discrepancies is attributed to reversed instrument polarities at a limited number of stations, which are not reflected in the instrument response history. By assessing inter-dataset consistency across similar paths, we quantify travel-time measurement errors for both surface and body waves. Finally, we discuss challenges associated with combining high frequency (~1 Hz) and long period (10-20 s) body-wave measurements into the REM-3D reference dataset.
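One simple way to homogenize coverage through summary rays, as in step (3) above, is to bin individual measurements by rounded source and receiver coordinates and summarize each bin; a hedged sketch, not the REM-3D workflow:

```python
import numpy as np
from collections import defaultdict

def summary_rays(src_lat, src_lon, rcv_lat, rcv_lon, residuals, cell=5.0):
    """Group picks whose source and receiver fall in the same cell-degree bins
    and return the median residual and pick count for each summary ray."""
    bins = defaultdict(list)
    for slat, slon, rlat, rlon, res in zip(src_lat, src_lon, rcv_lat, rcv_lon, residuals):
        key = (round(slat / cell), round(slon / cell),
               round(rlat / cell), round(rlon / cell))
        bins[key].append(res)
    return {k: (float(np.median(v)), len(v)) for k, v in bins.items()}
```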
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Lazarus, Steven M.; Splitt, Michael E.; Crosson, William L.; Lapenta, William M.; Jedlovec, Gary J.; Peters-Lidard, Christa D.
2008-01-01
The exchange of energy and moisture between the Earth's surface and the atmospheric boundary layer plays a critical role in many meteorological processes. High-resolution, accurate representations of surface properties such as sea-surface temperature (SST), soil temperature and moisture content, ground fluxes, and vegetation are necessary to better understand Earth-atmosphere interactions and improve numerical predictions of sensible weather. The NASA Short-term Prediction Research and Transition (SPoRT) Center has been conducting separate studies to examine the impacts of high-resolution land-surface initialization data from the Goddard Space Flight Center Land Information System (LIS) on subsequent WRF forecasts, as well as the influence of initializing WRF with SST composites derived from the MODIS instrument. The current project addresses the combined impacts of using high-resolution lower boundary data over both land (LIS data) and water (MODIS SSTs) on the subsequent daily WRF forecasts over Florida during May 2004. For this experiment, the WRF model is configured to run on a nested domain with 9-km and 3-km grid spacing, centered on the Florida peninsula and the adjacent coastal waters of the Gulf of Mexico and Atlantic Ocean. A control configuration of WRF is established to take all initial condition data from the NCEP Eta model. Meanwhile, two WRF experimental runs are configured to use high-resolution initialization data from (1) LIS land-surface data only, and (2) a combination of LIS data and high-resolution MODIS SST composites. The experiment involves running 24-hour simulations of the control WRF configuration, the LIS-initialized WRF, and the LIS+MODIS-initialized WRF daily for the entire month of May 2004. All atmospheric data for initial and boundary conditions for the Control, LIS, and LIS+MODIS runs come from the NCEP Eta model on a 40-km grid. Verification statistics are generated at land surface observation sites and buoys, and the impacts of the high-resolution lower boundary data on the development and evolution of mesoscale circulations such as sea and land breezes are examined. This paper will present the results of these WRF modeling experiments using LIS and MODIS lower boundary datasets over the Florida peninsula during May 2004.
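The station verification mentioned above reduces to standard point-wise statistics; a generic sketch (the numbers are made up and this is not the SPoRT verification code):

```python
import numpy as np

def verification_stats(forecast, observed):
    """Bias and RMSE of forecast values matched to station observations."""
    forecast = np.asarray(forecast, dtype=float)
    observed = np.asarray(observed, dtype=float)
    error = forecast - observed
    return {"bias": error.mean(), "rmse": np.sqrt((error ** 2).mean())}

# 2-m temperature (deg C) at a handful of hypothetical Florida stations
control = [29.1, 30.4, 28.7, 31.0]
lis_modis = [28.6, 30.0, 28.4, 30.5]
obs = [28.4, 29.8, 28.1, 30.2]
print(verification_stats(control, obs), verification_stats(lis_modis, obs))
```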
Mapping 2000-2010 Impervious Surface Change in India Using Global Land Survey Landsat Data
NASA Technical Reports Server (NTRS)
Wang, Panshi; Huang, Chengquan; Brown De Colstoun, Eric C.
2017-01-01
Understanding and monitoring the environmental impacts of global urbanization requires better urban datasets. Continuous field impervious surface change (ISC) mapping using Landsat data is an effective way to quantify spatiotemporal dynamics of urbanization. It is well acknowledged that Landsat-based estimation of impervious surface is subject to seasonal and phenological variations. The overall goal of this paper is to map 2000-2010 ISC for India using Global Land Survey datasets and training data only available for 2010. To this end, a method was developed that could transfer the regression tree model developed for mapping 2010 impervious surface to 2000 using an iterative training and prediction (ITP) approach. An independent validation dataset was also developed using Google Earth imagery. Based on the reference ISC from the validation dataset, the RMSE of predicted ISC was estimated to be 18.4%. At 95% confidence, the total estimated ISC for India between 2000 and 2010 is 2274.62 +/- 7.84 sq km.
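The iterative training and prediction (ITP) approach is only named above, so the following is one hedged reading of it: train a regression tree on the 2010 data, predict for 2000, keep pixels whose prediction stays close to the 2010 label as pseudo-training samples, and retrain:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def itp_transfer(x2010, y2010, x2000, n_iter=3, tol=5.0):
    """Transfer a 2010-trained impervious-surface regressor to 2000 imagery.

    x2010 : (n, bands) Landsat features for 2010 training pixels
    y2010 : (n,) percent impervious surface labels for those pixels
    x2000 : (m, bands) features for the same pixels in the 2000 imagery
    tol   : pixels whose 2000 prediction stays within tol percent of the
            2010 label are treated as unchanged and reused for retraining
    """
    model = DecisionTreeRegressor(max_depth=12).fit(x2010, y2010)
    for _ in range(n_iter):
        pred2000 = model.predict(x2000)
        stable = np.abs(pred2000 - y2010) < tol
        model = DecisionTreeRegressor(max_depth=12).fit(x2000[stable], y2010[stable])
    return model
```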
Data assimilation and model evaluation experiment datasets
NASA Technical Reports Server (NTRS)
Lai, Chung-Cheng A.; Qian, Wen; Glenn, Scott M.
1994-01-01
The Institute for Naval Oceanography, in cooperation with the Naval Research Laboratories and universities, executed the Data Assimilation and Model Evaluation Experiment (DAMEE) for the Gulf Stream region during fiscal years 1991-1993. Enormous effort went into the preparation of several high-quality and consistent datasets for model initialization and verification. This paper describes the preparation process, the temporal and spatial scopes, the contents, and the structure of these datasets. The goal of DAMEE and the need for data in the four phases of the experiment are briefly stated. The preparation of the DAMEE datasets consisted of a series of processes: (1) collection of observational data; (2) analysis and interpretation; (3) interpolation using the Optimum Thermal Interpolation System package; (4) quality control and re-analysis; and (5) data archiving and software documentation. The data products from these processes included a time series of 3D fields of temperature and salinity, 2D fields of surface dynamic height and mixed-layer depth, analyses of the Gulf Stream and rings system, and bathythermograph profiles. To date, these are the most detailed and high-quality data for mesoscale ocean modeling, data assimilation, and forecasting research. Feedback from ocean modeling groups who tested the data was incorporated into its refinement. Suggested DAMEE data usages include (1) ocean modeling and data assimilation studies, (2) diagnostic and theoretical studies, and (3) comparisons with locally detailed observations.
A synoptic climatology of derecho producing mesoscale convective systems in the North-Central Plains
NASA Astrophysics Data System (ADS)
Bentley, Mace L.; Mote, Thomas L.; Byrd, Stephen F.
2000-09-01
Synoptic-scale environments favourable for producing derechos, or widespread convectively induced windstorms, in the North-Central Plains are examined with the goal of providing pattern-recognition/diagnosis techniques. Fifteen derechos were identified across the North-Central Plains region during 1986-1995. The synoptic environment at the initiation, mid-point and decay of each derecho was then evaluated using surface, upper-air and National Center for Atmospheric Research (NCAR)/National Center for Environmental Prediction (NCEP) reanalysis datasets. Results suggest that the synoptic environment is critical in maintaining derecho producing mesoscale convective systems (DMCSs). The synoptic environment in place downstream of the MCS initiation region determines the movement and potential strength of the system. Circulation around surface low pressure increased the instability gradient and maximized leading edge convergence in the initiation region of nearly all events regardless of DMCS location or movement. Other commonalities in the environments of these events include the presence of a weak thermal boundary, high convective instability and a layer of dry low-to-mid-tropospheric air. Of the two corridors sampled, northeastward moving derechos tend to initiate east of synoptic-scale troughs, while southeastward moving derechos form on the northeast periphery of a synoptic-scale ridge. Other differences between these two DMCS events are also discussed.
A Historical Forcing Ice Sheet Model Validation Framework for Greenland
NASA Astrophysics Data System (ADS)
Price, S. F.; Hoffman, M. J.; Howat, I. M.; Bonin, J. A.; Chambers, D. P.; Kalashnikova, I.; Neumann, T.; Nowicki, S.; Perego, M.; Salinger, A.
2014-12-01
We propose an ice sheet model testing and validation framework for Greenland for the years 2000 to the present. Following Perego et al. (2014), we start with a realistic ice sheet initial condition that is in quasi-equilibrium with climate forcing from the late 1990s. This initial condition is integrated forward in time while simultaneously applying (1) surface mass balance forcing (van Angelen et al., 2013) and (2) outlet glacier flux anomalies, defined using a new dataset of Greenland outlet glacier flux for the past decade (Enderlin et al., 2014). Modeled rates of mass and elevation change are compared directly to remote sensing observations obtained from GRACE and ICESat. Here, we present a detailed description of the proposed validation framework, including the ice sheet model and model forcing approach, the model-to-observation comparison process, and initial results comparing model output and observations for the time period 2000-2013.
Fast Longwave and Shortwave Radiative Fluxes (FLASHFlux) From CERES and MODIS Measurements
NASA Astrophysics Data System (ADS)
Stackhouse, Paul; Gupta, Shashi; Kratz, David; Geier, Erika; Edwards, Anne; Wilber, Anne
The Clouds and the Earth's Radiant Energy System (CERES) project is currently producing highly accurate surface and top-of-atmosphere (TOA) radiation budget datasets from measurements taken by CERES broadband radiometers and a subset of imaging channels on the Moderate-resolution Imaging Spectroradiometer (MODIS) instrument operating onboard the Terra and Aqua satellites. The primary objective of CERES is to produce highly accurate and stable time-series datasets of radiation budget parameters to meet the needs of climate change research. Accomplishing such accuracy and stability requires monitoring the calibration and stability of the instruments, maintaining constancy of processing algorithms and meteorological inputs, and extensively validating the products against independent measurements. Such stringent requirements inevitably delay the release of products to the user community by as much as six months to a year. While such delays are inconsequential for climate research, other applications like short-term and seasonal predictions, agricultural and solar energy research, ocean and atmosphere assimilation, and field experiment support could greatly benefit if CERES products were available quickly after satellite measurements. To meet the needs of the latter class of applications, FLASHFlux was developed and is being implemented at NASA/LaRC. FLASHFlux produces reliable surface and TOA radiative parameters within one week of satellite observations using the CERES "quicklook" data stream and fast surface flux algorithms. Cloud properties used in the flux computation are derived concurrently using MODIS channel radiances. In the process, a modest degree of accuracy is sacrificed in the interest of speed. All fluxes are derived initially on a CERES footprint basis. Daily average fluxes are then derived on a 1° x 1° grid in the next stage of processing. To date, FLASHFlux datasets have been used in operational processing of CloudSat data, in support of a field experiment, and for the S'COOL education outreach program. In this presentation, examples will be presented of footprint-level and gridded/daily averaged fluxes and their validation. FLASHFlux datasets are available to the science community at the LaRC Atmospheric Sciences Data Center (ASDC) at: eosweb.larc.nasa.gov/PRODOCS/flashflux/table flashflux.html.
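A minimal sketch of the footprint-to-grid stage described above, averaging footprint-level fluxes into 1° x 1° cells; the function is illustrative, not the FLASHFlux processing code:

```python
import numpy as np

def grid_daily_mean(lat, lon, flux, res=1.0):
    """Average footprint-level fluxes (W m-2) onto a res x res degree grid."""
    lat, lon, flux = map(np.asarray, (lat, lon, flux))
    nlat, nlon = int(180 / res), int(360 / res)
    total = np.zeros((nlat, nlon))
    count = np.zeros((nlat, nlon))
    i = np.clip(((lat + 90.0) / res).astype(int), 0, nlat - 1)   # lat in [-90, 90]
    j = np.clip(((lon + 180.0) / res).astype(int), 0, nlon - 1)  # lon in [-180, 180)
    np.add.at(total, (i, j), flux)
    np.add.at(count, (i, j), 1)
    with np.errstate(invalid="ignore"):
        return np.where(count > 0, total / count, np.nan)
```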
Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun
2014-01-01
Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations that are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models, or methods. These differences can be quantitatively described from three main aspects: the multiple remote sensing observations themselves, the crop parameter estimation models, and the spatial scale effects of surface parameters. We propose a new method to analyse and correct the differences between multi-source and multi-scale remote sensing surface reflectance datasets, aiming to provide a reference for further studies in agricultural applications that use remotely sensed observations from different sources. The method is built on the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory is used to extract the characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variation of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, Gaussian distribution theory is applied to correct the multiple surface reflectance datasets on the basis of the obtained physical characteristics, mathematical distribution properties, and their spatial variations. The proposed method was verified with two sets of multiple satellite images acquired over two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences among surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, providing a database for further multi-source and multi-scale crop growth monitoring and yield prediction and for their corresponding consistency evaluation. PMID:25405760
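Assuming the core of the correction amounts to matching the Gaussian statistics (mean and standard deviation) of each dataset to the fine-scale baseline, the idea can be sketched as follows; this is an illustration, not the authors' algorithm:

```python
import numpy as np

def gaussian_match(coarse, baseline):
    """Rescale a reflectance array so its mean and standard deviation
    match those of the fine-scale baseline dataset."""
    mu_c, sd_c = np.nanmean(coarse), np.nanstd(coarse)
    mu_b, sd_b = np.nanmean(baseline), np.nanstd(baseline)
    return (coarse - mu_c) / sd_c * sd_b + mu_b

# Synthetic example: a coarse product corrected against a fine-scale baseline
rng = np.random.default_rng(0)
baseline = rng.normal(0.18, 0.03, size=(200, 200))   # fine-scale reflectance
coarse = rng.normal(0.22, 0.05, size=(200, 200))     # biased coarse product
corrected = gaussian_match(coarse, baseline)
```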
WRF Simulation over the Eastern Africa by use of Land Surface Initialization
NASA Astrophysics Data System (ADS)
Sakwa, V. N.; Case, J.; Limaye, A. S.; Zavodsky, B.; Kabuchanga, E. S.; Mungai, J.
2014-12-01
The East Africa region experiences severe weather events associated with hazards of varying magnitude. Heavy precipitation leads to widespread flooding, while a lack of sufficient rainfall in some areas results in drought. Flooding and drought are two key forecasting challenges for the Kenya Meteorological Service (KMS). The supply of heat and moisture depends on the state of the land surface, which interacts with the atmospheric boundary layer to produce either excessive precipitation or the deficits that lead to severe drought. The development and evolution of precipitation systems are affected by heat and moisture fluxes from the land surface within weakly-sheared environments, such as in the tropics and sub-tropics. These daytime heat and moisture fluxes can be strongly influenced by land cover, vegetation, and soil moisture content. Therefore, it is important to represent the land surface state as accurately as possible in numerical weather prediction models. Improved modeling capabilities within the region have the potential to enhance forecast guidance in support of daily operations and high-impact weather over East Africa. KMS currently runs a configuration of the Weather Research and Forecasting (WRF) model in real time to support its daily forecasting operations, invoking the Non-hydrostatic Mesoscale Model (NMM) dynamical core. KMS uses the National Oceanic and Atmospheric Administration / National Weather Service Science and Training Resource Center's Environmental Modeling System (EMS) to manage and produce the WRF-NMM model runs on a 7-km regional grid over Eastern Africa. SPoRT and SERVIR provide land surface initialization datasets and a model verification tool. The NASA Land Information System (LIS) provides real-time, daily soil initialization data in place of interpolated Global Forecast System soil moisture and temperature data. Model verification is done using the Model Evaluation Tools (MET) package in order to quantify possible improvements in simulated temperature, moisture, and precipitation resulting from the experimental land surface initialization. These MET tools enable KMS to monitor model forecast accuracy in near real time. This study highlights verification results of WRF runs over East Africa using the LIS land surface initialization.
NASA Astrophysics Data System (ADS)
Shao, G.; Ji, C.; Lu, Z.; Hudnut, K. W.; Liu, J.; Zhang, W.
2009-12-01
We study the kinematic rupture process of the 2008 Mw 7.9 Wenchuan earthquake using all geophysical and geological datasets that we were able to access, including the waveforms of teleseismic long period surface waves, broadband body waves and local strong motions, GPS vectors, interferometric radar (InSAR) images, and geological surface offsets. The relocated aftershock locations have also been included to constrain the potential fault geometry. These datasets have very different sensitivities not only to the slip on the fault but also to the “a priori” information of the source inversions, such as the local velocity structure and the details of the irregular fault surface. Efforts have then been made to reconcile these datasets by reasonably perturbing the velocity structure and fault geometry, which are both poorly constrained. We have used two 1D velocity models, one for the Tibet plateau and the other for the Sichuan basin, to calculate the static and dynamic earth responses, and developed a complex fault system including two irregular fault planes for the Beichuan and Pengguan faults, respectively. The long wavelength errors of the InSAR LOS displacements have also been considered and corrected simultaneously during the joint inversions. Our preferred model not only explains the geodetic and tele-seismic data very well, but also reasonably matches most strong motion waveforms. According to this result, the Wenchuan earthquake had an unprecedentedly complex rupture process. It initiated southwest of the town of Yingxiu at a depth of about 12 km, where the low-angle Pengguan fault and the high-angle Beichuan fault intersect. The rupture initiated on the low angle Pengguan fault and later triggered the rupture on the high angle Beichuan fault. It then unilaterally ruptured northeastward for 270 km, mainly on the Beichuan fault. The entire rupture duration is over 95 seconds, with an average rupture velocity of 3.0 km/s. Except for the region near the hypocenter and the region near the northeast end of the rupture, the majority of slip occurred at depths less than 12 km. The total seismic moment released by this earthquake was 1.02 x 10²¹ Nm, with ~36% on the Pengguan fault. Our analysis also indicates that the aftershock zone along the extension of the Xiaoyudong fault is consistent with the theory of static stress triggering due to the co-seismic rupture.
NASA Astrophysics Data System (ADS)
Leavens, Claudia; Vik, Torbjørn; Schulz, Heinrich; Allaire, Stéphane; Kim, John; Dawson, Laura; O'Sullivan, Brian; Breen, Stephen; Jaffray, David; Pekar, Vladimir
2008-03-01
Manual contouring of target volumes and organs at risk in radiation therapy is extremely time-consuming, in particular for treating the head-and-neck area, where a single patient treatment plan can take several hours to contour. As radiation treatment delivery moves towards adaptive treatment, the need for more efficient segmentation techniques will increase. We are developing a method for automatic model-based segmentation of the head and neck. This process can be broken down into three main steps: i) automatic landmark identification in the image dataset of interest, ii) automatic landmark-based initialization of deformable surface models to the patient image dataset, and iii) adaptation of the deformable models to the patient-specific anatomical boundaries of interest. In this paper, we focus on the validation of the first step of this method, quantifying the results of our automatic landmark identification method. We use an image atlas formed by applying thin-plate spline (TPS) interpolation to ten atlas datasets, using 27 manually identified landmarks in each atlas/training dataset. The principal variation modes returned by principal component analysis (PCA) of the landmark positions were used by an automatic registration algorithm, which sought the corresponding landmarks in the clinical dataset of interest using a controlled random search algorithm. Applying a run time of 60 seconds to the random search, a root mean square (rms) distance to the ground-truth landmark position of 9.5 +/- 0.6 mm was calculated for the identified landmarks. Automatic segmentation of the brain, mandible and brain stem, using the detected landmarks, is demonstrated.
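A small sketch of the statistical-shape component described above: the landmark coordinates of the training datasets are stacked into vectors and the principal variation modes are extracted with PCA. The sizes and values are placeholders, not the authors' atlas data:

```python
import numpy as np
from sklearn.decomposition import PCA

n_training, n_landmarks = 10, 27
rng = np.random.default_rng(1)

# Each row is one training case: 27 landmarks flattened to 81 (x, y, z) coordinates
landmarks = rng.normal(0.0, 10.0, size=(n_training, n_landmarks * 3))

pca = PCA(n_components=5)          # principal variation modes of the landmark set
pca.fit(landmarks)

mean_shape = pca.mean_.reshape(n_landmarks, 3)

# A candidate landmark configuration: the mean shape perturbed along the modes,
# the kind of shape a controlled random search would score against the image
coeffs = np.array([1.0, -0.5, 0.0, 0.2, 0.0])
candidate = (pca.mean_ + coeffs @ pca.components_).reshape(n_landmarks, 3)
```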
A new combined surface and volume registration
NASA Astrophysics Data System (ADS)
Lepore, Natasha; Joshi, Anand A.; Leahy, Richard M.; Brun, Caroline; Chou, Yi-Yu; Pennec, Xavier; Lee, Agatha D.; Barysheva, Marina; De Zubicaray, Greig I.; Wright, Margaret J.; McMahon, Katie L.; Toga, Arthur W.; Thompson, Paul M.
2010-03-01
3D registration of brain MRI data is vital for many medical imaging applications. However, purely intensity-based approaches for inter-subject matching of brain structure are generally inaccurate in cortical regions, due to the highly complex network of sulci and gyri, which vary widely across subjects. Here we combine a surface-based cortical registration with a 3D fluid one for the first time, enabling precise matching of cortical folds, but allowing large deformations in the enclosed brain volume, which guarantee diffeomorphisms. This greatly improves the matching of anatomy in cortical areas. The cortices are segmented and registered with the software Freesurfer. The deformation field is initially extended to the full 3D brain volume using a 3D harmonic mapping that preserves the matching between cortical surfaces. Finally, these deformation fields are used to initialize a 3D Riemannian fluid registration algorithm that improves the alignment of subcortical brain regions. We validate this method on an MRI dataset from 92 healthy adult twins. Results are compared to those based on volumetric registration without surface constraints; the resulting mean templates resolve consistent anatomical features both subcortically and at the cortex, suggesting that the approach is well-suited for cross-subject integration of functional and anatomic data.
Descriptive Characteristics of Surface Water Quality in Hong Kong by a Self-Organising Map
An, Yan; Zou, Zhihong; Li, Ranran
2016-01-01
In this study, principal component analysis (PCA) and a self-organising map (SOM) were used to analyse a complex dataset obtained from the river water monitoring stations in the Tolo Harbor and Channel Water Control Zone (Hong Kong), covering the period of 2009–2011. PCA was initially applied to identify the principal components (PCs) among the nonlinear and complex surface water quality parameters. SOM followed PCA, and was implemented to analyze the complex relationships and behaviors of the parameters. The results reveal that PCA reduced the multidimensional parameters to four significant PCs which are combinations of the original ones. The positive and inverse relationships of the parameters were shown explicitly by pattern analysis in the component planes. It was found that PCA and SOM are efficient tools to capture and analyze the behavior of multivariable, complex, and nonlinear related surface water quality data. PMID:26761018
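A hedged sketch of the PCA-then-SOM workflow on a hypothetical samples-by-parameters matrix, using scikit-learn and the MiniSom package; this is not the authors' implementation:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from minisom import MiniSom

rng = np.random.default_rng(3)
samples = rng.normal(size=(300, 12))          # 300 samples of 12 water-quality parameters

scaled = StandardScaler().fit_transform(samples)

pca = PCA(n_components=4)                     # retain four principal components
scores = pca.fit_transform(scaled)

som = MiniSom(8, 8, scores.shape[1], sigma=1.0, learning_rate=0.5, random_seed=3)
som.random_weights_init(scores)
som.train_random(scores, 5000)                # unsupervised training

# Component planes: the weight of each retained PC across the 8 x 8 map
planes = som.get_weights()                    # shape (8, 8, 4)
```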
Unorthodox bubbles when boiling in cold water.
Parker, Scott; Granick, Steve
2014-01-01
High-speed movies are taken when bubbles grow at gold surfaces heated spotwise with a near-infrared laser beam heating water below the boiling point (60-70 °C) with heating powers spanning the range from very low to so high that water fails to rewet the surface after bubbles detach. Roughly half the bubbles are conventional: They grow symmetrically through evaporation until buoyancy lifts them away. Others have unorthodox shapes and appear to contribute disproportionately to heat transfer efficiency: mushroom cloud shapes, violently explosive bubbles, and cavitation events, probably stimulated by a combination of superheating, convection, turbulence, and surface dewetting during the initial bubble growth. Moreover, bubbles often follow one another in complex sequences, often beginning with an unorthodox bubble that stirs the water, followed by several conventional bubbles. This large dataset is analyzed and discussed with emphasis on how explosive phenomena such as cavitation induce discrepancies from classical expectations about boiling.
NASA Astrophysics Data System (ADS)
Cammalleri, Carmelo; Vogt, Jürgen V.; Bisselink, Bernard; de Roo, Ad
2017-12-01
Agricultural drought events can affect large regions across the world, implying the need for a suitable global tool for an accurate monitoring of this phenomenon. Soil moisture anomalies are considered a good metric to capture the occurrence of agricultural drought events, and they have become an important component of several operational drought monitoring systems. In the framework of the JRC Global Drought Observatory (GDO, http://edo.jrc.ec.europa.eu/gdo/), the suitability of three datasets as possible representations of root zone soil moisture anomalies has been evaluated: (1) the soil moisture from the Lisflood distributed hydrological model (namely LIS), (2) the remotely sensed Land Surface Temperature data from the MODIS satellite (namely LST), and (3) the ESA Climate Change Initiative combined passive/active microwave skin soil moisture dataset (namely CCI). Due to the independency of these three datasets, the triple collocation (TC) technique has been applied, aiming at quantifying the likely error associated with each dataset in comparison to the unknown true status of the system. TC analysis was performed on five macro-regions (namely North America, Europe, India, southern Africa and Australia) detected as suitable for the experiment, providing insight into the mutual relationship between these datasets as well as an assessment of the accuracy of each method. Even if no definitive statement on the spatial distribution of errors can be provided, a clear outcome of the TC analysis is the good performance of the remote sensing datasets, especially CCI, over dry regions such as Australia and southern Africa, whereas the outputs of LIS seem to be more reliable over areas that are well monitored through meteorological ground station networks, such as North America and Europe. In a global drought monitoring system, the results of the error analysis are used to design a weighted-average ensemble system that exploits the advantages of each dataset.
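The covariance-based form of triple collocation can be written compactly for three collocated series with mutually independent errors; a minimal illustration with synthetic soil-moisture anomalies, not the GDO code:

```python
import numpy as np

def triple_collocation_errors(x, y, z):
    """Estimate the error variance of each of three collocated datasets
    (classical covariance notation, errors assumed mutually independent)."""
    c = np.cov(np.vstack([x, y, z]))
    var_ex = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
    var_ey = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
    var_ez = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
    return var_ex, var_ey, var_ez

# Synthetic anomaly series sharing a common signal plus independent noise
rng = np.random.default_rng(7)
truth = rng.normal(0.0, 1.0, 1000)
lis = truth + rng.normal(0.0, 0.3, 1000)
lst = 0.9 * truth + rng.normal(0.0, 0.5, 1000)
cci = 1.1 * truth + rng.normal(0.0, 0.4, 1000)
print(triple_collocation_errors(lis, lst, cci))
```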
NASA Astrophysics Data System (ADS)
Kucera, P. A.; Steinson, M.
2016-12-01
Accurate and reliable real-time monitoring and dissemination of observations of precipitation and surface weather conditions in general is critical for a variety of research studies and applications. Surface precipitation observations provide important reference information for evaluating satellite (e.g., GPM) precipitation estimates. High quality surface observations of precipitation, temperature, moisture, and winds are important for applications such as agriculture, water resource monitoring, health, and hazardous weather early warning systems. In many regions of the world, surface weather station and precipitation gauge networks are sparsely located and/or of poor quality. Existing stations have often been sited incorrectly, not well-maintained, and have limited communications established at the site for real-time monitoring. The University Corporation for Atmospheric Research (UCAR)/National Center for Atmospheric Research (NCAR), with support from USAID, has started an initiative to develop and deploy low-cost weather instrumentation, including tipping bucket and weighing-type precipitation gauges, in sparsely observed regions of the world. The goal is to improve the number of observations (temporally and spatially) for the evaluation of satellite precipitation estimates in data-sparse regions and to improve the quality of applications for environmental monitoring and early warning alert systems on a regional to global scale. One important aspect of this initiative is to make the data open to the community. The weather station instrumentation has been developed using innovative new technologies such as 3D printers, Raspberry Pi computing systems, and wireless communications. An initial pilot project has been implemented in the country of Zambia. This effort could be expanded to other data-sparse regions around the globe. The presentation will provide an overview and demonstration of 3D-printed weather station development and an initial evaluation of the observed precipitation datasets.
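As an illustration of the kind of low-cost sensing described above (not the actual station firmware), a Raspberry Pi can count tipping-bucket reed-switch closures on a GPIO pin; the pin number and calibration factor are hypothetical:

```python
import time
import RPi.GPIO as GPIO

BUCKET_PIN = 17          # hypothetical BCM pin wired to the reed switch
MM_PER_TIP = 0.2         # hypothetical calibration (mm of rain per bucket tip)

tips = 0

def on_tip(channel):
    """Interrupt callback: one falling edge corresponds to one bucket tip."""
    global tips
    tips += 1

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUCKET_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.add_event_detect(BUCKET_PIN, GPIO.FALLING, callback=on_tip, bouncetime=250)

try:
    while True:
        time.sleep(60)
        print(f"rainfall last minute: {tips * MM_PER_TIP:.1f} mm")
        tips = 0
finally:
    GPIO.cleanup()
```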
Putative archaeal viruses from the mesopelagic ocean.
Vik, Dean R; Roux, Simon; Brum, Jennifer R; Bolduc, Ben; Emerson, Joanne B; Padilla, Cory C; Stewart, Frank J; Sullivan, Matthew B
2017-01-01
Oceanic viruses that infect bacteria, or phages, are known to modulate host diversity, metabolisms, and biogeochemical cycling, while the viruses that infect marine Archaea remain understudied despite the critical ecosystem roles played by their hosts. Here we introduce "MArVD", for Metagenomic Archaeal Virus Detector, an annotation tool designed to identify putative archaeal virus contigs in metagenomic datasets. MArVD is made publicly available through the online iVirus analytical platform. Benchmarking analysis of MArVD showed it to be >99% accurate and 100% sensitive in identifying the 127 known archaeal viruses among the 12,499 viruses in the VirSorter curated dataset. Application of MArVD to 10 viral metagenomes from two depth profiles in the Eastern Tropical North Pacific (ETNP) oxygen minimum zone revealed 43 new putative archaeal virus genomes and large genome fragments ranging in size from 10 to 31 kb. Network-based classifications, which were consistent with marker gene phylogenies where available, suggested that these putative archaeal virus contigs represented six novel candidate genera. Ecological analyses, via fragment recruitment and ordination, revealed that the diversity and relative abundances of these putative archaeal viruses were correlated with oxygen concentration and temperature along two OMZ-spanning depth profiles, presumably due to structuring of the host Archaea community. Peak viral diversity and abundances were found in surface waters, where Thermoplasmata 16S rRNA genes are prevalent, suggesting these archaea as hosts in the surface habitats. Together these findings provide a baseline for identifying archaeal viruses in sequence datasets, and an initial picture of the ecology of such viruses in non-extreme environments.
Vanderhoof, Melanie; Distler, Hayley; Lang, Megan W.; Alexander, Laurie C.
2018-01-01
The dependence of downstream waters on upstream ecosystems necessitates an improved understanding of watershed-scale hydrological interactions including connections between wetlands and streams. An evaluation of such connections is challenging when, (1) accurate and complete datasets of wetland and stream locations are often not available and (2) natural variability in surface-water extent influences the frequency and duration of wetland/stream connectivity. The Upper Choptank River watershed on the Delmarva Peninsula in eastern Maryland and Delaware is dominated by a high density of small, forested wetlands. In this analysis, wetland/stream surface water connections were quantified using multiple wetland and stream datasets, including headwater streams and depressions mapped from a lidar-derived digital elevation model. Surface-water extent was mapped across the watershed for spring 2015 using Landsat-8, Radarsat-2 and Worldview-3 imagery. The frequency of wetland/stream connections increased as a more complete and accurate stream dataset was used and surface-water extent was included, in particular when the spatial resolution of the imagery was finer (i.e., <10 m). Depending on the datasets used, 12–60% of wetlands by count (21–93% of wetlands by area) experienced surface-water interactions with streams during spring 2015. This translated into a range of 50–94% of the watershed contributing direct surface water runoff to streamflow. This finding suggests that our interpretation of the frequency and duration of wetland/stream connections will be influenced not only by the spatial and temporal characteristics of wetlands, streams and potential flowpaths, but also by the completeness, accuracy and resolution of input datasets.
NASA Astrophysics Data System (ADS)
Huang, Min; Carmichael, Gregory R.; Crawford, James H.; Wisthaler, Armin; Zhan, Xiwu; Hain, Christopher R.; Lee, Pius; Guenther, Alex B.
2017-08-01
Land and atmospheric initial conditions of the Weather Research and Forecasting (WRF) model are often interpolated from a different model output. We perform case studies during NASA's SEAC4RS and DISCOVER-AQ Houston airborne campaigns, demonstrating that using land initial conditions directly downscaled from a coarser resolution dataset led to significant positive biases in the coupled NASA-Unified WRF (NUWRF, version 7) surface and near-surface air temperature and planetary boundary layer height (PBLH) around the Missouri Ozarks and Houston, Texas, as well as poorly partitioned latent and sensible heat fluxes. Replacing land initial conditions with the output from a long-term offline Land Information System (LIS) simulation can effectively reduce the positive biases in NUWRF surface air temperature by ˜ 2 °C. We also show that the LIS land initialization can modify surface air temperature errors almost 10 times as effectively as applying a different atmospheric initialization method. The LIS-NUWRF-based isoprene emission calculations by the Model of Emissions of Gases and Aerosols from Nature (MEGAN, version 2.1) are at least 20 % lower than those computed using the coarser resolution data-initialized NUWRF run, and are closer to aircraft-observation-derived emissions. Higher resolution MEGAN calculations are prone to amplified discrepancies with aircraft-observation-derived emissions on small scales. This is possibly a result of some limitations of MEGAN's parameterization and uncertainty in its inputs on small scales, as well as the representation error and the neglect of horizontal transport in deriving emissions from aircraft data. This study emphasizes the importance of proper land initialization to the coupled atmospheric weather modeling and the follow-on emission modeling. We anticipate it to also be critical to accurately representing other processes included in air quality modeling and chemical data assimilation. Having more confidence in the weather inputs is also beneficial for determining and quantifying the other sources of uncertainties (e.g., parameterization, other input data) of the models that they drive.
Abatzoglou, John T; Dobrowski, Solomon Z; Parks, Sean A; Hegewisch, Katherine C
2018-01-09
We present TerraClimate, a dataset of high-spatial resolution (1/24°, ~4-km) monthly climate and climatic water balance for global terrestrial surfaces from 1958-2015. TerraClimate uses climatically aided interpolation, combining high-spatial resolution climatological normals from the WorldClim dataset, with coarser resolution time varying (i.e., monthly) data from other sources to produce a monthly dataset of precipitation, maximum and minimum temperature, wind speed, vapor pressure, and solar radiation. TerraClimate additionally produces monthly surface water balance datasets using a water balance model that incorporates reference evapotranspiration, precipitation, temperature, and interpolated plant extractable soil water capacity. These data provide important inputs for ecological and hydrological studies at global scales that require high spatial resolution and time varying climate and climatic water balance data. We validated spatiotemporal aspects of TerraClimate using annual temperature, precipitation, and calculated reference evapotranspiration from station data, as well as annual runoff from streamflow gauges. TerraClimate datasets showed noted improvement in overall mean absolute error and increased spatial realism relative to coarser resolution gridded datasets.
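A schematic version of climatically aided interpolation as described above: the coarse monthly anomaly relative to the coarse climatology is resampled to the fine grid and added to the high-resolution normal. The arrays are synthetic stand-ins, and an additive anomaly is assumed (a ratio would be more natural for precipitation):

```python
import numpy as np
from scipy.ndimage import zoom

def climatically_aided(fine_normal, coarse_month, coarse_normal):
    """Combine a high-resolution climatological normal with a coarse
    time-varying field by downscaling the coarse anomaly."""
    anomaly = coarse_month - coarse_normal                  # coarse-grid anomaly
    factor = fine_normal.shape[0] / anomaly.shape[0]
    anomaly_fine = zoom(anomaly, factor, order=1)           # bilinear resampling
    return fine_normal + anomaly_fine

# Synthetic example: normals four times finer than the monthly driver data
fine_normal = np.full((96, 96), 15.0)        # deg C climatology on the fine grid
coarse_normal = np.full((24, 24), 15.0)
coarse_month = coarse_normal + 1.5           # a month 1.5 deg C above normal
t_month_fine = climatically_aided(fine_normal, coarse_month, coarse_normal)
```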
Prototype Global Burnt Area Algorithm Using a Multi-sensor Approach
NASA Astrophysics Data System (ADS)
López Saldaña, G.; Pereira, J.; Aires, F.
2013-05-01
One of the main limitations of products derived from remotely-sensed data is the length of the data records available for climate studies. The Advanced Very High Resolution Radiometer (AVHRR) long-term data record (LTDR) comprises a daily global atmospherically-corrected surface reflectance dataset at 0.05° spatial resolution and is available for the 1981-1999 time period. The Moderate Resolution Imaging Spectroradiometer (MODIS) instrument has been on orbit on the Terra platform since late 1999 and on Aqua since mid 2002; surface reflectance products, MYD09CMG and MOD09CMG, are available at 0.05° spatial resolution. Fire is a strong cause of land surface change and of emissions of greenhouse gases around the globe. A global long-term identification of areas affected by fire is needed to analyze trends and fire-climate relationships. A burnt area algorithm can be seen as a change point detection problem, where there is an abrupt change in the surface reflectance due to biomass burning. Using the AVHRR-LTDR and the aforementioned MODIS products, a time series of bidirectional reflectance distribution function (BRDF) corrected surface reflectance was generated using the daily observations and constraining the BRDF model inversion with a climatology of BRDF parameters derived from 12 years of MODIS data. The identification of the burnt area was performed using a t-test on the pre- and post-fire reflectance values and a change point detection algorithm; spectral constraints were then applied to flag changes caused by natural land processes like vegetation seasonality or flooding. Additional temporal constraints are applied focusing on the persistence of the affected areas. Initial results for the years 1998 to 2002 show spatio-temporal coherence, but further analysis is required and a formal, rigorous validation will be applied using burn scars identified from high-resolution datasets.
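A toy version of the pre-/post-fire test described above, applying a two-sample t-test to reflectance samples before and after a candidate change date; the band choice, sample sizes, and significance level are illustrative, not the algorithm's actual settings:

```python
import numpy as np
from scipy import stats

def burn_candidate(pre_nir, post_nir, alpha=0.01):
    """Flag a pixel as a burn candidate when post-fire NIR reflectance is
    significantly lower than pre-fire reflectance (one-sided t-test)."""
    t_stat, p_two_sided = stats.ttest_ind(pre_nir, post_nir, equal_var=False)
    return (t_stat > 0) and (p_two_sided / 2 < alpha)

rng = np.random.default_rng(11)
pre = rng.normal(0.30, 0.02, 16)    # 16 daily NIR reflectances before the fire
post = rng.normal(0.18, 0.02, 16)   # lower NIR typical of a fresh burn scar
print(burn_candidate(pre, post))
```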
Quantification of surface emissions: An historical perspective from GEIA
NASA Astrophysics Data System (ADS)
Granier, C.; Denier Van Der Gon, H.; Doumbia, E. H. T.; Frost, G. J.; Guenther, A. B.; Hassler, B.; Janssens-Maenhout, G. G. A.; Lasslop, G.; Melamed, M. L.; Middleton, P.; Sindelarova, K.; Tarrason, L.; van Marle, M.; W Kaiser, J.; van der Werf, G.
2015-12-01
Assessments of the composition of the atmosphere and its evolution require accurate knowledge of the surface emissions of atmospheric compounds. The first community development of global surface emissions started in 1990, when GEIA was established as a component of the International Global Atmospheric Chemistry (IGAC) project. At that time, GEIA meant "Global Emissions Inventory Activity". Since its inception, GEIA has brought together people to understand emissions from anthropogenic, biomass burning and natural sources. The first goal of GEIA was to establish a "best" inventory for the base year 1985 at 1x1 degree resolution. Since then many inventories have been developed by various groups at the global and regional scale at different temporal and spatial resolutions. GEIA, which now means the "Global Emissions Initiative", has evolved into assessing, harmonizing and distributing emissions datasets. We will review the main achievements of GEIA, and show how the development and evaluation of surface emissions has evolved during the last 25 years. We will discuss the use of surface, in-situ and remote sensing observations to evaluate and improve the quantification of emissions. We will highlight the main uncertainties currently limiting emissions datasets, such as the spatial and temporal evolution of emissions at different resolutions, the quantification of emerging emission sources (such as oil/gas extraction and distribution, biofuels, etc.), the speciation of the emissions of volatile organic compounds and of particulate matter, the capacity building necessary for organizing the development of regional emissions across the world, emissions from shipping, etc. We will present the ECCAD (Emissions of Atmospheric Compounds and Compilation of Ancillary Data) database, developed as part of GEIA to facilitate the access and evaluation of emission inventories.
Gangodagamage, Chandana; Wullschleger, Stan
2014-07-03
This dataset represents a map of the high-center (HC) and low-center (LC) polygon boundaries delineated from high-resolution LiDAR data for the Arctic coastal plain at Barrow, Alaska. The polygon troughs are considered the surface expression of ice wedges. The troughs lie at lower elevations than the polygon interiors. The trough widths were initially identified from the LiDAR data, and the boundary between two polygons was assumed to be located along the lowest elevations within the trough between them.
Multisource Estimation of Long-term Global Terrestrial Surface Radiation
NASA Astrophysics Data System (ADS)
Peng, L.; Sheffield, J.
2017-12-01
Land surface net radiation is the essential energy source at the earth's surface. It determines the surface energy budget and its partitioning, drives the hydrological cycle by providing available energy, and offers heat, light, and energy for biological processes. Individual components of net radiation have changed historically due to natural and anthropogenic climate change and land use change. Decadal variations in radiation such as global dimming or brightening have important implications for the hydrological and carbon cycles. In order to assess the trends and variability of net radiation and evapotranspiration, there is a need for accurate estimates of long-term terrestrial surface radiation. While substantial progress has been made in measuring the top-of-atmosphere energy budget, large discrepancies exist among ground observations, satellite retrievals, and reanalysis fields of surface radiation, owing to sparse observational networks, the difficulty of measuring surface fluxes from space, and uncertainty in algorithm parameters. To overcome the weaknesses of single-source datasets, we propose a multi-source merging approach that treats the radiation components separately and fully utilizes multiple datasets, as they are complementary in space and time. First, we conduct a diagnostic analysis of multiple satellite and reanalysis datasets based on in-situ measurements such as the Global Energy Balance Archive (GEBA), existing validation studies, and other information such as network density and consistency with other meteorological variables. Then, we calculate the optimal weighted average of multiple datasets by minimizing the variance of the error between in-situ measurements and other observations. Finally, we quantify the uncertainties in the estimates of surface net radiation and employ physical constraints based on the surface energy balance to reduce these uncertainties. The final dataset is evaluated in terms of the long-term variability and its attribution to changes in individual components. The goal of this study is to provide a merged observational benchmark for large-scale diagnostic analyses, remote sensing and land surface modeling.
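A minimal sketch of the merging idea, assuming independent, unbiased errors for each input dataset: weight each radiation estimate by the inverse of its diagnosed error variance (e.g., against GEBA sites), which minimizes the variance of the merged estimate. Function and variable names are illustrative, not the authors' implementation.

```python
import numpy as np

def merge_inverse_variance(estimates, error_variances):
    """Minimum-variance weighted average of independent, unbiased estimates of
    the same radiation field.

    estimates       : list of arrays with the same shape (e.g. gridded net radiation)
    error_variances : per-dataset error variances, e.g. diagnosed against GEBA sites
    Returns the merged field and the variance of the merged estimate.
    """
    var = np.asarray(error_variances, dtype=float)
    w = 1.0 / var
    w /= w.sum()
    merged = sum(wi * np.asarray(x, dtype=float) for wi, x in zip(w, estimates))
    merged_var = 1.0 / np.sum(1.0 / var)   # variance of the optimal combination
    return merged, merged_var
```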
Online, On Demand Access to Coastal Digital Elevation Models
NASA Astrophysics Data System (ADS)
Long, J.; Bristol, S.; Long, D.; Thompson, S.
2014-12-01
Process-based numerical models for coastal waves, water levels, and sediment transport are initialized with digital elevation models (DEM) constructed by interpolating and merging bathymetric and topographic elevation data. These gridded surfaces must seamlessly span the land-water interface and may cover large regions where the individual raw data sources are collected at widely different spatial and temporal resolutions. In addition, the datasets are collected from different instrument platforms with varying accuracy and may or may not overlap in coverage. The lack of available tools and difficulties in constructing these DEMs lead scientists to 1) rely on previously merged, outdated, or over-smoothed DEMs; 2) discard more recent data that covers only a portion of the DEM domain; and 3) use inconsistent methodologies to generate DEMs. The objective of this work is to address the immediate need of integrating land and water-based elevation data sources and streamline the generation of a seamless data surface that spans the terrestrial-marine boundary. To achieve this, the U.S. Geological Survey (USGS) is developing a web processing service to format and initialize geoprocessing tasks designed to create coastal DEMs. The web processing service is maintained within the USGS ScienceBase data management system and has an associated user interface. Through the map-based interface, users define a geographic region that identifies the bounds of the desired DEM and a time period of interest. This initiates a query for elevation datasets within federal science agency data repositories. A geoprocessing service is then triggered to interpolate, merge, and smooth the data sources creating a DEM based on user-defined configuration parameters. Uncertainty and error estimates for the DEM are also returned by the geoprocessing service. Upon completion, the information management platform provides access to the final gridded data derivative and saves the configuration parameters for future reference. The resulting products and tools developed here could be adapted to future data sources and projects beyond the coastal environment.
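As an illustration of the interpolation and merging step only (not the USGS geoprocessing service itself), the sketch below grids merged topographic and bathymetric points onto a regular raster for a user-selected region; the helper name, the linear interpolation choice, and the parameters are assumptions.

```python
import numpy as np
from scipy.interpolate import griddata

def build_coastal_dem(points_xyz, bounds, resolution):
    """Interpolate merged topographic and bathymetric points onto a regular
    grid spanning the land-water interface.

    points_xyz : (N, 3) array of x, y, elevation (negative below the datum)
    bounds     : (xmin, xmax, ymin, ymax) of the user-selected region
    resolution : grid spacing in the same units as x and y
    """
    xmin, xmax, ymin, ymax = bounds
    xi = np.arange(xmin, xmax, resolution)
    yi = np.arange(ymin, ymax, resolution)
    X, Y = np.meshgrid(xi, yi)
    # Linear interpolation of the scattered soundings onto the regular grid.
    Z = griddata(points_xyz[:, :2], points_xyz[:, 2], (X, Y), method='linear')
    return X, Y, Z
```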
The advanced quality control techniques planned for the International Soil Moisture Network
NASA Astrophysics Data System (ADS)
Xaver, A.; Gruber, A.; Hegiova, A.; Sanchis-Dufau, A. D.; Dorigo, W. A.
2012-04-01
In situ soil moisture observations are essential to evaluate and calibrate modeled and remotely sensed soil moisture products. Although a number of meteorological networks and field campaigns measuring soil moisture exist on a global and long-term scale, their observations are not easily accessible and lack standardization of both technique and protocol. Thus, handling and especially comparing these datasets with satellite products or land surface models is a demanding task. To overcome these limitations, the International Soil Moisture Network (ISMN; http://www.ipf.tuwien.ac.at/insitu/) has been initiated to act as a centralized data hosting facility. One advantage of the ISMN is that users are able to access the harmonized datasets easily through a web portal. Another advantage is the fully automated processing chain, including data harmonization in terms of units and sampling interval; even more important is the advanced quality control system each measurement has to run through. The quality of in situ soil moisture measurements is crucial for the validation of satellite- and model-based soil moisture retrievals; therefore a sophisticated quality control system was developed. After a check for plausibility and geophysical limits, a quality flag is added to each measurement. An enhanced flagging mechanism was recently defined using a spectrum-based approach to detect spurious spikes, jumps and plateaus. The International Soil Moisture Network has already evolved into one of the most important distribution platforms for in situ soil moisture observations and is still growing. Currently, data from 27 networks in total, covering more than 800 stations in Europe, North America, Australia, Asia and Africa, are hosted by the ISMN. Available data include both historical datasets and near-real-time measurements. The improved quality control system will provide important information for satellite-based as well as land surface model-based validation studies.
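A minimal sketch of the flagging idea, assuming volumetric soil moisture in m3 m-3: values outside plausible geophysical limits are flagged, and isolated spikes are flagged by comparison with neighbouring values. The thresholds and flag codes are illustrative, not the ISMN operational settings.

```python
import numpy as np

def flag_soil_moisture(series, lower=0.0, upper=0.6, spike_z=4.0):
    """Attach a quality flag to each volumetric soil moisture value:
    'G' good, 'R' outside geophysical limits, 'S' suspected spike.

    series : 1-D array of volumetric soil moisture (m^3 m^-3)
    """
    flags = np.full(series.shape, 'G', dtype='<U1')
    flags[(series < lower) | (series > upper)] = 'R'
    # Spike test: large deviation from the mean of the two neighbours,
    # relative to the local variability.
    for i in range(1, len(series) - 1):
        neighbours = 0.5 * (series[i - 1] + series[i + 1])
        local_scale = max(np.std(series[max(0, i - 12):i + 12]), 1e-6)
        if flags[i] == 'G' and abs(series[i] - neighbours) > spike_z * local_scale:
            flags[i] = 'S'
    return flags
```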
Global surface displacement data for assessing variability of displacement at a point on a fault
Hecker, Suzanne; Sickler, Robert; Feigelson, Leah; Abrahamson, Norman; Hassett, Will; Rosa, Carla; Sanquini, Ann
2014-01-01
This report presents a global dataset of site-specific surface-displacement data on faults. We have compiled estimates of successive displacements attributed to individual earthquakes, mainly paleoearthquakes, at sites where two or more events have been documented, as a basis for analyzing inter-event variability in surface displacement on continental faults. An earlier version of this composite dataset was used in a recent study relating the variability of surface displacement at a point to the magnitude-frequency distribution of earthquakes on faults, and to hazard from fault rupture (Hecker and others, 2013). The purpose of this follow-on report is to provide potential data users with an updated comprehensive dataset, largely complete through 2010 for studies in English-language publications, as well as in some unpublished reports and abstract volumes.
NASA Technical Reports Server (NTRS)
Dickson, J.; Drury, H.; Van Essen, D. C.
2001-01-01
Surface reconstructions of the cerebral cortex are increasingly widely used in the analysis and visualization of cortical structure, function and connectivity. From a neuroinformatics perspective, dealing with surface-related data poses a number of challenges. These include the multiplicity of configurations in which surfaces are routinely viewed (e.g. inflated maps, spheres and flat maps), plus the diversity of experimental data that can be represented on any given surface. To address these challenges, we have developed a surface management system (SuMS) that allows automated storage and retrieval of complex surface-related datasets. SuMS provides a systematic framework for the classification, storage and retrieval of many types of surface-related data and associated volume data. Within this classification framework, it serves as a version-control system capable of handling large numbers of surface and volume datasets. With built-in database management system support, SuMS provides rapid search and retrieval capabilities across all the datasets, while also incorporating multiple security levels to regulate access. SuMS is implemented in Java and can be accessed via a Web interface (WebSuMS) or using downloaded client software. Thus, SuMS is well positioned to act as a multiplatform, multi-user 'surface request broker' for the neuroscience community.
Ability of the current global observing network to constrain N2O sources and sinks
NASA Astrophysics Data System (ADS)
Millet, D. B.; Wells, K. C.; Chaliyakunnel, S.; Griffis, T. J.; Henze, D. K.; Bousserez, N.
2014-12-01
The global observing network for atmospheric N2O combines flask and in-situ measurements at ground stations with sustained and campaign-based aircraft observations. In this talk we apply a new global model of N2O (based on GEOS-Chem) and its adjoint to assess the strengths and weaknesses of this network for quantifying N2O emissions. We employ an ensemble of pseudo-observation analyses to evaluate the relative constraints provided by ground-based (surface, tall tower) and airborne (HIPPO, CARIBIC) observations, and the extent to which variability (e.g. associated with pulsing or seasonality of emissions) not captured by the a priori inventory can bias the inferred fluxes. We find that the ground-based and HIPPO datasets each provide a stronger constraint on the distribution of global emissions than does the CARIBIC dataset on its own. Given appropriate initial conditions, we find that our inferred surface fluxes are insensitive to model errors in the stratospheric loss rate of N2O over the timescale of our analysis (2 years); however, the same is not necessarily true for model errors in stratosphere-troposphere exchange. Finally, we examine the a posteriori error reduction distribution to identify priority locations for future N2O measurements.
An Analysis of the Climate Data Initiative's Data Collection
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Bugbee, K.
2015-12-01
The Climate Data Initiative (CDI) is a broad multi-agency effort of the U.S. government that seeks to leverage the extensive existing federal climate-relevant data to stimulate innovation and private-sector entrepreneurship to support national climate-change preparedness. The CDI project is a systematic effort to manually curate and share openly available climate data from various federal agencies. To date, the CDI has curated seven themes, or topics, relevant to climate change resiliency. These themes include Coastal Flooding, Food Resilience, Water, Ecosystem Vulnerability, Human Health, Energy Infrastructure, and Transportation. Each theme was curated by subject matter experts who selected datasets relevant to the topic at hand. An analysis of the entire Climate Data Initiative data collection and the data curated for each theme offers insights into which datasets are considered most relevant in addressing climate resiliency. Other aspects of the data collection will be examined including which datasets were the most visited or popular and which datasets were the most sought after for curation by the theme teams. Results from the analysis of the CDI collection will be presented in this talk.
NASA Astrophysics Data System (ADS)
Willis, D. M.; Coffey, H. E.; Henwood, R.; Erwin, E. H.; Hoyt, D. V.; Wild, M. N.; Denig, W. F.
2013-11-01
The measurements of sunspot positions and areas that were published initially by the Royal Observatory, Greenwich, and subsequently by the Royal Greenwich Observatory (RGO), as the Greenwich Photo-heliographic Results ( GPR), 1874 - 1976, exist in both printed and digital forms. These printed and digital sunspot datasets have been archived in various libraries and data centres. Unfortunately, however, typographic, systematic and isolated errors can be found in the various datasets. The purpose of the present paper is to begin the task of identifying and correcting these errors. In particular, the intention is to provide in one foundational paper all the necessary background information on the original solar observations, their various applications in scientific research, the format of the different digital datasets, the necessary definitions of the quantities measured, and the initial identification of errors in both the printed publications and the digital datasets. Two companion papers address the question of specific identifiable errors; namely, typographic errors in the printed publications, and both isolated and systematic errors in the digital datasets. The existence of two independently prepared digital datasets, which both contain information on sunspot positions and areas, makes it possible to outline a preliminary strategy for the development of an even more accurate digital dataset. Further work is in progress to generate an extremely reliable sunspot digital dataset, based on the programme of solar observations supported for more than a century by the Royal Observatory, Greenwich, and the Royal Greenwich Observatory. This improved dataset should be of value in many future scientific investigations.
Improved Decadal Climate Prediction in the North Atlantic using EnOI-Assimilated Initial Condition
NASA Astrophysics Data System (ADS)
Li, Q.; Xin, X.; Wei, M.; Zhou, W.
2017-12-01
Decadal prediction experiments with version 1.1 of the Beijing Climate Center climate system model (BCC-CSM1.1) that participated in the Coupled Model Intercomparison Project Phase 5 (CMIP5) had poor skill in the extratropical North Atlantic; their initialization was done by relaxing modeled ocean temperature toward the Simple Ocean Data Assimilation (SODA) reanalysis. This study aims to improve the prediction skill of this model by using an assimilation technique in the initialization. New ocean data are first generated by assimilating sea surface temperature (SST) from the Hadley Centre Sea Ice and Sea Surface Temperature (HadISST) dataset into the ocean model of BCC-CSM1.1 via Ensemble Optimum Interpolation (EnOI). A suite of decadal re-forecasts launched annually over the period 1961-2005 is then carried out with simulated ocean temperature restored to the assimilated ocean data. Comparisons between the re-forecasts and the previous CMIP5 forecasts show that the re-forecasts are more skillful for mid-to-high-latitude SST in the North Atlantic. Improved prediction skill is also found for the Atlantic Multidecadal Oscillation (AMO), consistent with the better skill of the Atlantic meridional overturning circulation (AMOC) predicted by the re-forecasts. We conclude that the EnOI assimilation generates better ocean data than the SODA reanalysis for initializing decadal climate predictions with the BCC-CSM1.1 model.
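The EnOI analysis step can be sketched as follows, with the background-error covariance built from a static ensemble of model anomalies scaled by a factor alpha. The matrix sizes, the linear observation operator, and alpha are placeholders rather than the settings used for BCC-CSM1.1.

```python
import numpy as np

def enoi_update(xb, A, H, y, R, alpha=0.7):
    """Ensemble Optimum Interpolation analysis step (minimal sketch).

    xb    : background ocean state vector (n,)
    A     : static ensemble anomaly matrix (n, m), e.g. sampled from a long model run
    H     : linear observation operator (p, n), here mapping the state to SST points
    y     : observations (p,), e.g. HadISST SST
    R     : observation error covariance (p, p)
    alpha : scaling factor applied to the static ensemble covariance
    """
    m = A.shape[1]
    HA = H @ A                                  # ensemble anomalies in observation space
    Pb_Ht = alpha * (A @ HA.T) / (m - 1)        # B H^T from the static ensemble
    S = alpha * (HA @ HA.T) / (m - 1) + R       # innovation covariance H B H^T + R
    innovation = y - H @ xb
    return xb + Pb_Ht @ np.linalg.solve(S, innovation)
```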
Trace Gas/Aerosol Interactions and GMI Modeling Support
NASA Technical Reports Server (NTRS)
Penner, Joyce E.; Liu, Xiaohong; Das, Bigyani; Bergmann, Dan; Rodriquez, Jose M.; Strahan, Susan; Wang, Minghuai; Feng, Yan
2005-01-01
Current global aerosol models use different physical and chemical schemes and parameters, different meteorological fields, and often different emission sources. Since the physical and chemical parameterization schemes are often tuned to obtain results that are consistent with observations, it is difficult to assess the true uncertainty due to meteorology alone. Under the framework of the NASA global modeling initiative (GMI), the differences and uncertainties in aerosol simulations (for sulfate, organic carbon, black carbon, dust and sea salt) solely due to different meteorological fields are analyzed and quantified. Three meteorological datasets available from the NASA DAO GCM, the GISS-II' GCM, and the NASA finite volume GCM (FVGCM) are used to drive the same aerosol model. The global sulfate and mineral dust burdens with FVGCM fields are 40% and 20% less than those with DAO and GISS fields, respectively due to its heavier rainfall. Meanwhile, the sea salt burden predicted with FVGCM fields is 56% and 43% higher than those with DAO and GISS, respectively, due to its stronger convection especially over the Southern Hemispheric Ocean. Sulfate concentrations at the surface in the Northern Hemisphere extratropics and in the middle to upper troposphere differ by more than a factor of 3 between the three meteorological datasets. The agreement between model calculated and observed aerosol concentrations in the industrial regions (e.g., North America and Europe) is quite similar for all three meteorological datasets. Away from the source regions, however, the comparisons with observations differ greatly for DAO, FVGCM and GISS, and the performance of the model using different datasets varies largely depending on sites and species. Global annual average aerosol optical depth at 550 nm is 0.120-0.131 for the three meteorological datasets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Dengwang; Liu, Li; Chen, Jinhu
2014-06-01
Purpose: The aim of this study was to extract liver structures from daily cone-beam CT (CBCT) images automatically. Methods: Datasets were collected from 50 intravenous contrast planning CT images, which served as the training dataset for constructing the probabilistic atlas and the shape prior model. First, the probabilistic atlas and a shape prior model based on sparse shape composition (SSC) were constructed by iterative deformable registration. Second, artifacts and noise were removed from the daily CBCT image by edge-preserving filtering using total variation with an L1 norm (TV-L1). The initial liver region was then obtained by registering the incoming CBCT image with the atlas using edge-preserving deformable registration with a multi-scale strategy; this initial liver region was converted to a surface mesh, which was registered with the shape model in which the major variation of the specific patient was modeled by sparse vectors. At the last stage, shape and intensity information were incorporated into a joint probabilistic model, and the liver structure was finally extracted by maximum a posteriori segmentation. Regarding the construction process, the manually segmented contours were first converted into meshes, and then an arbitrary patient dataset was chosen as the reference image and registered with the rest of the training datasets by a deformable registration algorithm to construct the probabilistic atlas and prior shape model. To improve the efficiency of the proposed method, the initial probabilistic atlas was used as the reference image and registered with the other patient data in an iterative construction that removes the bias caused by the arbitrary selection. Results: The experiment validated the accuracy of the segmentation results quantitatively by comparison with manual segmentations. The volumetric overlap percentage between the automatically generated liver contours and the ground truth was on average 88%-95% for CBCT images. Conclusion: The experiment demonstrated that liver structures can be extracted accurately from CBCT images with artifacts for subsequent adaptive radiation therapy. This work is supported by the National Natural Science Foundation of China (No. 61201441), the Research Fund for Excellent Young and Middle-aged Scientists of Shandong Province (No. BS2012DX038), the Project of Shandong Province Higher Educational Science and Technology Program (No. J12LN23), and the Jinan youth science and technology star (No. 20120109)
Developing a Global Network of River Reaches in Preparation of SWOT
NASA Astrophysics Data System (ADS)
Lion, C.; Pavelsky, T.; Allen, G. H.; Beighley, E.; Schumann, G.; Durand, M. T.
2016-12-01
In 2020, the Surface Water and Ocean Topography satellite (SWOT), a joint mission of NASA/CNES/CSA/UK, will be launched. One of its major products will be measurements of continental water surfaces, including the width, height, and slope of rivers and the surface area and elevations of lakes. The mission will improve the monitoring of continental water and also our understanding of the interactions between different hydrologic reservoirs. For rivers, SWOT measurements of slope will be carried out over predefined river reaches. As such, an a priori dataset for rivers is needed in order to facilitate analysis of the raw SWOT data. The information required to produce this dataset includes measurements of river width, elevation, slope, planform, river network topology, and flow accumulation. To produce this product, we have linked two existing global datasets: the Global River Widths from Landsat (GRWL) database, which contains river centerline locations, widths, and a braiding index derived from Landsat imagery, and a modified version of the HydroSHEDS hydrologically corrected digital elevation product, which contains heights and flow accumulation measurements for streams at 3 arcseconds spatial resolution. Merging these two datasets requires considerable care. The difficulties lie, among others, in the difference in resolution (30 m versus 3 arcseconds) and in the ages of the datasets (2000 versus 2010; some rivers have moved and the braided sections differ). As such, we have developed custom software to merge the two datasets, taking into account the spatial proximity of river channels in the two datasets and ensuring that flow accumulation in the final dataset always increases downstream. Here, we present our results for the globe.
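One of the constraints mentioned above, that flow accumulation must never decrease downstream in the merged dataset, can be enforced with a running maximum along an upstream-to-downstream ordered centreline. This is a sketch of the constraint only, not the custom merging software.

```python
import numpy as np

def enforce_downstream_accumulation(flow_acc):
    """Given flow accumulation sampled along a river centreline ordered from
    upstream to downstream, enforce that accumulation never decreases
    downstream by taking a running maximum along the reach.
    """
    return np.maximum.accumulate(np.asarray(flow_acc, dtype=float))
```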
Sea-Level Projections from the SeaRISE Initiative
NASA Technical Reports Server (NTRS)
Nowicki, Sophie; Bindschadler, Robert
2011-01-01
SeaRISE (Sea-level Response to Ice Sheet Evolution) is a community-organized modeling effort whose goal is to inform the fifth IPCC assessment of the potential sea-level contribution from the Greenland and Antarctic ice sheets in the 21st and 22nd centuries. SeaRISE seeks to determine the most likely ice sheet response to imposed climatic forcing by initializing an ensemble of models with common datasets and applying the same forcing to each model. Sensitivity experiments were designed to quantify the sea-level rise associated with a change in: 1) surface mass balance, 2) basal lubrication, and 3) ocean-induced basal melt. The range of responses, resulting from the multi-model approach, is interpreted as a proxy of uncertainty in our sea-level projections. http://websrv.cs.umt.edu/isis/index.php/SeaRISE_Assessment
NASA Astrophysics Data System (ADS)
Boisserie, Marie
The goal of this dissertation research is to produce empirical soil moisture initial conditions (a soil moisture analysis) and investigate their impact on the short-term (2 weeks) to subseasonal (2 months) forecasting skill for 2-m air temperature and precipitation. Because soil moisture has a long memory and plays a role in controlling the surface water and energy budget, an accurate soil moisture analysis is today widely recognized as having the potential to increase summertime climate forecasting skill. However, because of a lack of global observations of soil moisture, there has been no scientific consensus on how much a soil moisture initialization that is as close to the truth as possible contributes to climate forecasting skill. In this study, the initial conditions are generated using a Precipitation Assimilation Reanalysis (PAR) technique to produce a soil moisture analysis. This technique consists mainly of nudging precipitation in the atmosphere component of a land-atmosphere model by adjusting the vertical air humidity profile based on the difference between the model-derived precipitation rate and the observed rate. The unique aspects of the PAR technique are the following: (1) the soil moisture analysis is generated using a coupled land-atmosphere forecast model, so no bias between the initial conditions and the forecast model (the spin-up problem) is encountered; and (2) the PAR technique is physically consistent: the surface and radiative fluxes remain consistent with the soil moisture analysis. To our knowledge, there has been no previous attempt to apply a physically consistent soil moisture land assimilation system to a land-atmosphere model in coupled mode. The effect of the PAR technique on the model soil moisture estimates is evaluated using the Global Soil Wetness Project Phase 2 (GSWP-2) multimodel analysis product (used as a proxy for global soil moisture observations) and actual in-situ observations from the state of Illinois. The results show that overall the PAR technique is effective; across most of the globe, the seasonal and anomaly variability of the model soil moisture estimates reproduces the GSWP-2 values well in the top 1.5 m soil layer, and comparison with in-situ observations in Illinois shows that the seasonal and anomaly soil moisture variability is also well represented deep into the soil. Therefore, in this study, we produce a new global soil moisture analysis dataset that can be used for many land surface studies (crop modeling, water resource management, soil erosion, etc.). The contribution of the resulting soil moisture analysis (used as initial conditions) to air temperature and precipitation forecasts is then investigated. For this, we follow the experimental setup of a model intercomparison study over the time period 1986-1995, the second phase of the Global Land-Atmosphere Coupling Experiment (GLACE-2), in which the FSU/COAPS climate model has participated. The results of the summertime air temperature forecasts show a significant increase in skill across most of the U.S. at short-term to subseasonal time scales. No increase in summertime precipitation forecasting skill is found at short-term to subseasonal time scales between 1986 and 1995, except for the anomalous drought year of 1988. We also analyze the forecasts of two extreme hydrological events, the 1988 U.S. drought and the 1993 U.S. flood.
In general, the comparison of these two extreme hydrological event forecasts shows greater improvement for the summer of 1988 than for that of 1993, suggesting that soil moisture contributes more to the development of a drought than to a flood. This result is consistent with Dirmeyer and Brubaker [1999] and Weaver et al. [2009]. By analyzing the evaporative sources of these two extreme events using the back-trajectory methodology of Dirmeyer and Brubaker [1999], we find results similar to those of that paper; the soil moisture-precipitation feedback mechanism seems to play a greater role during the drought year of 1988 than during the flood year of 1993. Finally, the accuracy of this soil moisture initialization depends upon the quality of the precipitation dataset that is assimilated. Because of the lack of observed precipitation at high temporal resolution (3-hourly) for the study period (1986-1995), a reanalysis product is used for precipitation assimilation in this study. It is important to keep in mind that precipitation data in reanalyses sometimes differ significantly from observations, since precipitation is often not assimilated into the reanalysis model. To investigate that aspect, an analysis similar to the one performed in this study could be done using the 3-hourly Tropical Rainfall Measuring Mission (TRMM) dataset, available for the period 1998 to the present. Since the TRMM dataset is a fully observational dataset, we expect the resulting soil moisture initialization to be improved over that obtained in this study, which, in turn, may further increase the forecast skill.
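Purely as an illustration of the nudging idea behind PAR (the actual adjustment of the vertical humidity profile is not reproduced here), the sketch below scales a column specific-humidity profile up or down according to the mismatch between modeled and observed precipitation rates; all parameter values and the scaling form are assumptions.

```python
import numpy as np

def nudge_humidity_profile(q, p_model, p_obs, tau=6 * 3600.0, dt=1800.0, gamma=0.1):
    """Illustrative nudging increment in the spirit of precipitation assimilation:
    moisten (dry) the column when the model rains less (more) than observed.

    q       : specific humidity profile (kg/kg), surface to model top
    p_model : model precipitation rate (mm/hr)
    p_obs   : observed precipitation rate (mm/hr)
    tau     : nudging time scale (s); dt : model time step (s)
    """
    # Relative precipitation error, bounded to keep the adjustment gentle.
    err = np.clip((p_obs - p_model) / max(p_obs, 0.1), -1.0, 1.0)
    factor = 1.0 + gamma * err * (dt / tau)
    return q * factor
```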
Khalid Hussein
2012-02-01
Note: This "Weakly Anomalous to Anomalous Surface Temperature" dataset differs from the "Anomalous Surface Temperature" dataset for this county (another remotely sensed CIRES product) by showing areas of modeled temperatures between 1o and 2o above the mean, as opposed to the greater than 2o temperatures contained in the "Anomalous Surface Temperature" dataset. This layer contains areas of anomalous surface temperature in Chaffee County identified from ASTER thermal data and spatial based insolation model. The temperature is calculated using the Emissivity Normalization Algorithm that separate temperature from emissivity. The incoming solar radiation was calculated using spatial based insolation model developed by Fu and Rich (1999). Then the temperature due to solar radiation was calculated using emissivity derived from ASTER data. The residual temperature, i.e. temperature due to solar radiation subtracted from ASTER temperature was used to identify thermally anomalous areas. Areas that had temperature greater than 2o were considered ASTER modeled very warm surface exposures (thermal anomalies). Note: 'o' is used in this description to represent lowercase sigma.
Khalid Hussein
2012-02-01
Note: This "Weakly Anomalous to Anomalous Surface Temperature" dataset differs from the "Anomalous Surface Temperature" dataset for this county (another remotely sensed CIRES product) by showing areas of modeled temperatures between 1o and 2o above the mean, as opposed to the greater than 2o temperatures contained in the "Anomalous Surface Temperature" dataset. This layer contains areas of anomalous surface temperature in Garfield County identified from ASTER thermal data and spatial based insolation model. The temperature is calculated using the Emissivity Normalization Algorithm that separate temperature from emissivity. The incoming solar radiation was calculated using spatial based insolation model developed by Fu and Rich (1999). Then the temperature due to solar radiation was calculated using emissivity derived from ASTER data. The residual temperature, i.e. temperature due to solar radiation subtracted from ASTER temperature was used to identify thermally anomalous areas. Areas that had temperature between 1o and 2o were considered ASTER modeled warm surface exposures (thermal anomalies) Note: 'o' is used in this description to represent lowercase sigma.
Khalid Hussein
2012-02-01
Note: This "Weakly Anomalous to Anomalous Surface Temperature" dataset differs from the "Anomalous Surface Temperature" dataset for this county (another remotely sensed CIRES product) by showing areas of modeled temperatures between 1o and 2o above the mean, as opposed to the greater than 2o temperatures contained in the "Anomalous Surface Temperature" dataset. This layer contains areas of anomalous surface temperature in Routt County identified from ASTER thermal data and spatial based insolation model. The temperature is calculated using the Emissivity Normalization Algorithm that separate temperature from emissivity. The incoming solar radiation was calculated using spatial based insolation model developed by Fu and Rich (1999). Then the temperature due to solar radiation was calculated using emissivity derived from ASTER data. The residual temperature, i.e. temperature due to solar radiation subtracted from ASTER temperature was used to identify thermally anomalous areas. Areas that had temperature between 1o and 2o were considered ASTER modeled warm surface exposures (thermal anomalies). Note: 'o' is used in this description to represent lowercase sigma.
Khalid Hussein
2012-02-01
Note: This "Weakly Anomalous to Anomalous Surface Temperature" dataset differs from the "Anomalous Surface Temperature" dataset for this county (another remotely sensed CIRES product) by showing areas of modeled temperatures between 1o and 2o above the mean, as opposed to the greater than 2o temperatures contained in the "Anomalous Surface Temperature" dataset. This layer contains areas of anomalous surface temperature in Dolores County identified from ASTER thermal data and spatial based insolation model. The temperature is calculated using the Emissivity Normalization Algorithm that separate temperature from emissivity. The incoming solar radiation was calculated using spatial based insolation model developed by Fu and Rich (1999). Then the temperature due to solar radiation was calculated using emissivity derived from ASTER data. The residual temperature, i.e. temperature due to solar radiation subtracted from ASTER temperature was used to identify thermally anomalous areas. Areas that had temperature greater than 2o were considered ASTER modeled very warm surface exposures (thermal anomalies) Note: 'o' is used in this description to represent lowercase sigma.
Khalid Hussein
2012-02-01
Note: This "Weakly Anomalous to Anomalous Surface Temperature" dataset differs from the "Anomalous Surface Temperature" dataset for this county (another remotely sensed CIRES product) by showing areas of modeled temperatures between 1o and 2o above the mean, as opposed to the greater than 2o temperatures contained in the "Anomalous Surface Temperature" dataset. This layer contains areas of anomalous surface temperature in Archuleta County identified from ASTER thermal data and spatial based insolation model. The temperature is calculated using the Emissivity Normalization Algorithm that separate temperature from emissivity. The incoming solar radiation was calculated using spatial based insolation model developed by Fu and Rich (1999). Then the temperature due to solar radiation was calculated using emissivity derived from ASTER data. The residual temperature, i.e. temperature due to solar radiation subtracted from ASTER temperature was used to identify thermally anomalous areas. Areas that had temperature between 1o and 2o were considered ASTER modeled warm surface exposures (thermal anomalies). Note: 'o' is used in this description to represent lowercase sigma.
NASA Astrophysics Data System (ADS)
Hoyos, Isabel; Baquero-Bernal, Astrid; Hagemann, Stefan
2013-09-01
In Colombia, the access to climate-related observational data is restricted and their quantity is limited. But information about the current climate is fundamental for studies on present and future climate changes and their impacts. In this respect, this information is especially important over the Colombian Caribbean Catchment Basin (CCCB), which comprises over 80% of the population of Colombia and produces about 85% of its GDP. Consequently, an ensemble of several datasets has been evaluated and compared with respect to their capability to represent the climate over the CCCB. The comparison includes observations, reconstructed data (CPC, Delaware), reanalyses (ERA-40, NCEP/NCAR), and simulated data produced with the regional climate model REMO. The capabilities to represent the average annual state, the seasonal cycle, and the interannual variability are investigated. The analyses focus on surface air temperature and precipitation as well as on surface water and energy balances. On the one hand, the CCCB's characteristics pose some difficulties for the datasets, as the CCCB includes a mountainous region with three mountain ranges, where the dynamical core of models and model parameterizations can fail. On the other hand, it has the densest network of stations, with the longest records, in the country. The results can be summarised as follows: all of the datasets demonstrate a cold bias in the average temperature of the CCCB. However, the variability of the average temperature of the CCCB is most poorly represented by the NCEP/NCAR dataset. The average precipitation in the CCCB is overestimated by all datasets. For the ERA-40, NCEP/NCAR, and REMO datasets, the amplitude of the annual cycle is extremely high. The variability of the average precipitation in the CCCB is better represented by the reconstructed data of CPC and Delaware, as well as by NCEP/NCAR. Regarding the capability to represent the spatial behaviour of the CCCB, temperature is better represented by Delaware and REMO, while precipitation is better represented by Delaware. Among the three datasets that permit an analysis of surface water and energy balances (REMO, ERA-40, and NCEP/NCAR), REMO best demonstrates the closure property of the surface water balance within the basin, while NCEP/NCAR does not demonstrate this property well. The three datasets represent the energy balance fairly well, although some inconsistencies were found in the individual balance components for NCEP/NCAR.
NASA Astrophysics Data System (ADS)
Fu, J. X.
2010-12-01
Predictability of Intra-Seasonal Oscillation (ISO) relies on both initial conditions and lower boundary conditions (or atmosphere-ocean interaction). The atmospheric reanalysis datasets are commonly used as initial conditions. Here, the biases of three reanalysis datasets (NCEP_R1, _R2, and ERA_Interim) in describing ISO were revealed and the impacts of these biases as initial conditions on ISO prediction skills were assessed. A signal recovery method is proposed to improve ISO prediction. All three reanalysis datasets underestimate the intensity of the equatorial eastward-propagating ISO. When these reanalyses are used as initial conditions in the ECHAM4-UH hybrid coupled model (UH_HCM hereinafter), skillful ISO prediction reaches only about one week for both the 850-hPa zonal winds (U850) and rainfall over Southeast Asia and the global tropics. An enhanced nudging of divergence field is shown to significantly improve the initial conditions, resulting in an extension of the skillful rainfall prediction by 2-3 days and U850 prediction by 5-10 days. After recovering the ISO signals in the original reanalyses, the resultant initial conditions contain ISO strength much closer to the observed. Use of these signal-recovered reanalyses as initial conditions extends the skillful prediction of U850 and rainfall, respectively, to 23 and 18 days over Southeast Asia, and to 20 and 10 days over the global tropics. This finding underlines the urgent need to improve data assimilation systems and observations in advancement of ISO prediction by offering better initial conditions. It is also found that small-scale synoptic weather disturbances in initial conditions generally increase ISO prediction skill. The UH_HCM has better rainfall prediction than the NCEP Climate Forecast System (CFS) over Southeast Asia and both models suffer the prediction barrier over the Maritime Continent.
NASA Astrophysics Data System (ADS)
Yin, L.; Kopans-Johnson, C. R.; LeGrande, A. N.; Kelly, S.
2015-12-01
The isotopic ratio of 18O to 16O in seawater (2005 ppm in ocean water is defined as δ18Oseawater ≡ 0 permil, or 0‰) is a fundamental ocean tracer due to its distinct linear relationship with salinity (δ18O-S) arising from regional inland freshwater sources. As opposed to salinity alone, δ18O distinguishes river runoff from sea-ice melt and traces ocean circulation pathways from coastal to open waters and from surface to deep waters. Observations of seawater δ18O from the past 60 years were compiled into a database by Schmidt et al. (1999) and subsequently used to calculate a three-dimensional 1°x1° global gridded δ18O dataset by LeGrande and Schmidt (2006). Although the Schmidt et al. (1999) Global Seawater Oxygen-18 Database (δ18Oobs) contains 25,514 measurements used to calculate the global gridded dataset, LeGrande and Schmidt (2006) point out that "data coverage varies greatly from region to region," with seasonal variability creating biases in areas where sea ice is present. Python Pandas is used to automate the addition of 2,942 records to the Schmidt et al. (1999) Global Seawater Oxygen-18 Database (δ18Oobs) and to examine the spatial and temporal distributions of 18O in the Arctic Ocean. Ten initial water masses are defined using spatial and temporal trends, clusters of observations, and Arctic surface circulation. Jackknife slope analysis of water-mass δ18O-S is used to identify anomalous data points and regional hydrology, resulting in four distinct Arctic water masses. These techniques are used to improve the gridded δ18Oseawater dataset by distinguishing unique water masses and accounting for the seasonal variability of complex high-latitude areas.
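Since the abstract mentions Python Pandas, a small sketch of the jackknife slope analysis is given below: refit the δ18O-salinity slope leaving one observation out at a time, then report the mean slope and its jackknife standard error. The column names and the ordinary least-squares fit are assumptions.

```python
import numpy as np
import pandas as pd

def jackknife_slope(df, x='salinity', y='d18O'):
    """Jackknife estimate of the d18O-salinity regression slope for one water
    mass: refit the slope leaving out each observation in turn, then report
    the mean slope and its jackknife standard error.
    """
    df = df.dropna(subset=[x, y]).reset_index(drop=True)
    n = len(df)
    slopes = np.empty(n)
    for i in range(n):
        sub = df.drop(index=i)
        slopes[i] = np.polyfit(sub[x], sub[y], 1)[0]   # leave-one-out slope
    se = np.sqrt((n - 1) / n * np.sum((slopes - slopes.mean()) ** 2))
    return slopes.mean(), se
```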
Object-Oriented Image Clustering Method Using UAS Photogrammetric Imagery
NASA Astrophysics Data System (ADS)
Lin, Y.; Larson, A.; Schultz-Fellenz, E. S.; Sussman, A. J.; Swanson, E.; Coppersmith, R.
2016-12-01
Unmanned Aerial Systems (UAS) have been used widely as an imaging modality to obtain remotely sensed multi-band surface imagery, and are growing in popularity due to their efficiency, ease of use, and affordability. Los Alamos National Laboratory (LANL) has employed UAS for geologic site characterization and change detection studies at a variety of field sites. The deployed UAS was equipped with a standard visible-band camera to collect imagery datasets. Based on the imagery collected, we use deep sparse algorithmic processing to detect and discriminate subtle topographic features created or impacted by subsurface activities. In this work, we develop an object-oriented remote sensing imagery clustering method for land cover classification. To improve the clustering and segmentation accuracy, instead of using conventional pixel-based clustering methods, we integrate the spatial information from neighboring regions to create super-pixels, avoiding salt-and-pepper noise and subsequent over-segmentation. To further improve the robustness of our clustering method, we also incorporate a custom digital elevation model (DEM) dataset generated using a structure-from-motion (SfM) algorithm together with the red, green, and blue (RGB) band data for clustering. In particular, we first employ an agglomerative clustering to create an initial segmentation map, in which every object is treated as a single (new) pixel. Based on the new pixels obtained, we generate new features to implement another level of clustering. We apply our clustering method to the RGB+DEM datasets collected at the field site. Through binary clustering and multi-object clustering tests, we verify that our method can accurately separate vegetation from non-vegetation regions and can also differentiate object features on the surface.
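A rough sketch of the object-oriented clustering chain, assuming scikit-image and scikit-learn as stand-ins for the custom implementation: SLIC superpixels replace individual pixels, mean RGB and DEM values form the object features, and agglomerative clustering separates, for example, vegetation from non-vegetation. Parameter values are illustrative.

```python
import numpy as np
from skimage.segmentation import slic
from sklearn.cluster import AgglomerativeClustering

def cluster_rgb_dem(rgb, dem, n_segments=2000, n_clusters=2):
    """Object-oriented clustering sketch: form superpixels on the RGB image,
    average RGB + DEM within each superpixel, then agglomeratively cluster
    the superpixel features (e.g. vegetation vs. non-vegetation).
    """
    segments = slic(rgb, n_segments=n_segments, compactness=10, start_label=0)
    n_sp = segments.max() + 1
    feats = np.zeros((n_sp, 4))
    for s in range(n_sp):
        mask = segments == s
        feats[s, :3] = rgb[mask].mean(axis=0)   # mean R, G, B within the object
        feats[s, 3] = dem[mask].mean()          # mean elevation within the object
    feats = (feats - feats.mean(axis=0)) / (feats.std(axis=0) + 1e-9)
    labels_sp = AgglomerativeClustering(n_clusters=n_clusters).fit_predict(feats)
    return labels_sp[segments]                  # map object labels back to pixels
```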
Advancing land surface model development with satellite-based Earth observations
NASA Astrophysics Data System (ADS)
Orth, Rene; Dutra, Emanuel; Trigo, Isabel F.; Balsamo, Gianpaolo
2017-04-01
The land surface forms an essential part of the climate system. It interacts with the atmosphere through the exchange of water and energy and hence influences weather and climate, as well as their predictability. Correspondingly, the land surface model (LSM) is an essential part of any weather forecasting system. LSMs rely on partly poorly constrained parameters, due to sparse land surface observations. With the use of newly available land surface temperature observations, we show in this study that novel satellite-derived datasets help to improve LSM configuration, and hence can contribute to improved weather predictability. We use the Hydrology Tiled ECMWF Scheme of Surface Exchanges over Land (HTESSEL) and validate it comprehensively against an array of Earth observation reference datasets, including the new land surface temperature product. This reveals satisfactory model performance in terms of hydrology, but poor performance in terms of land surface temperature. This is due to inconsistencies of process representations in the model as identified from an analysis of perturbed parameter simulations. We show that HTESSEL can be more robustly calibrated with multiple instead of single reference datasets as this mitigates the impact of the structural inconsistencies. Finally, performing coupled global weather forecasts we find that a more robust calibration of HTESSEL also contributes to improved weather forecast skills. In summary, new satellite-based Earth observations are shown to enhance the multi-dataset calibration of LSMs, thereby improving the representation of insufficiently captured processes, advancing weather predictability and understanding of climate system feedbacks. Orth, R., E. Dutra, I. F. Trigo, and G. Balsamo (2016): Advancing land surface model development with satellite-based Earth observations. Hydrol. Earth Syst. Sci. Discuss., doi:10.5194/hess-2016-628
Evaluation of reanalysis datasets against observational soil temperature data over China
NASA Astrophysics Data System (ADS)
Yang, Kai; Zhang, Jingyong
2018-01-01
Soil temperature is a key land surface variable and a potential predictor for seasonal climate anomalies and extremes. Using observational soil temperature data in China for 1981-2005, we evaluate four reanalysis datasets, the land surface reanalysis of the European Centre for Medium-Range Weather Forecasts (ERA-Interim/Land), the second Modern-Era Retrospective analysis for Research and Applications (MERRA-2), the National Centers for Environmental Prediction Climate Forecast System Reanalysis (NCEP-CFSR), and version 2 of the Global Land Data Assimilation System (GLDAS-2.0), with a focus on the 40 cm soil layer. The results show that the reanalysis data largely reproduce the spatial distributions of soil temperature in summer and winter, especially over the east of China, but generally underestimate their magnitudes. Owing to the influence of precipitation on soil temperature, the four datasets perform better in winter than in summer. The ERA-Interim/Land and GLDAS-2.0 produce spatial characteristics of the climatological mean that are similar to observations. The interannual variability of soil temperature is well reproduced by the ERA-Interim/Land dataset in summer and by the CFSR dataset in winter. The linear trend of soil temperature in summer is well reproduced by the reanalysis datasets. We demonstrate that soil heat fluxes in April-June and in winter are highly correlated with the soil temperature in summer and winter, respectively. Different estimations of surface energy balance components can contribute to different behaviors of the reanalysis products in estimating soil temperature. In addition, the reanalysis datasets can broadly reproduce the northwest-southeast gradient of soil temperature memory over China.
NASA Astrophysics Data System (ADS)
Erwin, E. H.; Coffey, H. E.; Denig, W. F.; Willis, D. M.; Henwood, R.; Wild, M. N.
2013-11-01
A new sunspot and faculae digital dataset for the interval 1874 - 1955 has been prepared under the auspices of the NOAA National Geophysical Data Center (NGDC). This digital dataset contains measurements of the positions and areas of both sunspots and faculae published initially by the Royal Observatory, Greenwich, and subsequently by the Royal Greenwich Observatory (RGO), under the title Greenwich Photo-heliographic Results (GPR), 1874 - 1976. Quality control (QC) procedures based on logical consistency have been used to identify the more obvious errors in the RGO publications. Typical examples of identifiable errors are North versus South errors in specifying heliographic latitude, errors in specifying heliographic (Carrington) longitude, errors in the dates and times, errors in sunspot group numbers, arithmetic errors in the summation process, and the occasional omission of solar ephemerides. Although the number of errors in the RGO publications is remarkably small, an initial table of necessary corrections is provided for the interval 1874 - 1917. Moreover, as noted in the preceding companion papers, the existence of two independently prepared digital datasets, which both contain information on sunspot positions and areas, makes it possible to outline a preliminary strategy for the development of an even more accurate digital dataset. Further work is in progress to generate an extremely reliable sunspot digital dataset, based on the long programme of solar observations supported first by the Royal Observatory, Greenwich, and then by the Royal Greenwich Observatory.
Poppenga, Sandra K.; Gesch, Dean B.; Worstell, Bruce B.
2013-01-01
The 1:24,000-scale high-resolution National Hydrography Dataset (NHD) mapped hydrography flow lines require regular updating because land surface conditions that affect surface channel drainage change over time. Historically, NHD flow lines were created by digitizing surface water information from aerial photography and paper maps. Using these same methods to update nationwide NHD flow lines is costly and inefficient; furthermore, these methods result in hydrography that lacks the horizontal and vertical accuracy needed for fully integrated datasets useful for mapping and scientific investigations. Effective methods for improving mapped hydrography employ change detection analysis of surface channels derived from light detection and ranging (LiDAR) digital elevation models (DEMs) and NHD flow lines. In this article, we describe the usefulness of surface channels derived from LiDAR DEMs for hydrography change detection to derive spatially accurate and time-relevant mapped hydrography. The methods employ analyses of horizontal and vertical differences between LiDAR-derived surface channels and NHD flow lines to define candidate locations of hydrography change. These methods alleviate the need to analyze and update the nationwide NHD for time relevant hydrography, and provide an avenue for updating the dataset where change has occurred.
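The horizontal part of the change-detection analysis can be sketched with shapely (a stand-in, not the authors' toolchain): flag NHD flow-line vertices that lie farther than a tolerance from any LiDAR-derived surface channel. The vertical comparison is omitted here and the tolerance is an assumption.

```python
from shapely.geometry import LineString, Point
from shapely.ops import unary_union

def flag_hydrography_change(nhd_lines, lidar_channels, horiz_tol=30.0):
    """Flag NHD flow-line vertices lying farther than horiz_tol (map units)
    from any LiDAR-derived surface channel -- candidate locations where the
    mapped hydrography no longer matches the current land surface.

    nhd_lines, lidar_channels : iterables of (x, y) vertex sequences
    """
    channels = unary_union([LineString(c) for c in lidar_channels])
    candidates = []
    for line in nhd_lines:
        for x, y in line:
            if Point(x, y).distance(channels) > horiz_tol:
                candidates.append((x, y))
    return candidates
```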
An Efficient Ray-Tracing Method for Determining Terrain Intercepts in EDL Simulations
NASA Technical Reports Server (NTRS)
Shidner, Jeremy D.
2016-01-01
The calculation of a ray's intercept from an arbitrary point in space to a prescribed surface is a common task in computer simulations. The arbitrary point often represents an object that is moving according to the simulation, while the prescribed surface is fixed in a defined frame. For detailed simulations, this surface becomes complex, taking the form of real-world objects such as mountains, craters or valleys which require more advanced methods to accurately calculate a ray's intercept location. Incorporation of these complex surfaces has commonly been implemented in graphics systems that utilize highly optimized graphics processing units to analyze such features. This paper proposes a simplified method that does not require computationally intensive graphics solutions, but rather an optimized ray-tracing method for an assumed terrain dataset. This approach was developed for the Mars Science Laboratory mission which landed on the complex terrain of Gale Crater. First, this paper begins with a discussion of the simulation used to implement the model and the applicability of finding surface intercepts with respect to atmosphere modeling, altitude determination, radar modeling, and contact forces influencing vehicle dynamics. Next, the derivation and assumptions of the intercept finding method are presented. Key assumptions are noted making the routines specific to only certain types of surface data sets that are equidistantly spaced in longitude and latitude. The derivation of the method relies on ray-tracing, requiring discussion on the formulation of the ray with respect to the terrain datasets. Further discussion includes techniques for ray initialization in order to optimize the intercept search. Then, the model implementation for various new applications in the simulation are demonstrated. Finally, a validation of the accuracy is presented along with the corresponding data sets used in the validation. A performance summary of the method will be shown using the analysis from the Mars Science Laboratory's terminal descent sensing model. Alternate uses will also be shown for determining horizon maps and orbiter set times.
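A simplified, unoptimized sketch of the intercept search, assuming the ray position can be expressed directly as (longitude, latitude, altitude) over an equidistantly spaced DEM and marched with a fixed step; the paper's optimized ray-tracing strategy is not reproduced here, and all grid conventions below are assumptions.

```python
import numpy as np

def ray_terrain_intercept(origin, direction, dem, lon0, lat0, dlon, dlat,
                          step, max_range):
    """March a ray expressed in (longitude, latitude, altitude) coordinates over
    an equidistant DEM grid and return the first point whose altitude falls
    below the terrain, or None if the ray leaves the dataset or never hits.

    dem        : 2-D array of terrain heights, row = latitude index, col = longitude index
    lon0, lat0 : coordinates of dem[0, 0]; dlon, dlat : grid spacing
    """
    direction = np.asarray(direction, float)
    direction /= np.linalg.norm(direction)
    origin = np.asarray(origin, float)
    for s in np.arange(0.0, max_range, step):
        p = origin + s * direction
        i = int(round((p[1] - lat0) / dlat))      # row index from latitude
        j = int(round((p[0] - lon0) / dlon))      # column index from longitude
        if not (0 <= i < dem.shape[0] and 0 <= j < dem.shape[1]):
            return None                           # ray left the terrain dataset
        if p[2] <= dem[i, j]:
            return p                              # first terrain intercept
    return None
```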
NASA Astrophysics Data System (ADS)
Cescatti, A.; Duveiller, G.; Hooker, J.
2017-12-01
Changing vegetation cover not only affects the atmospheric concentration of greenhouse gases but also alters the radiative and non-radiative properties of the surface. The result of competing biophysical processes on Earth's surface energy balance varies spatially and seasonally, and can lead to warming or cooling depending on the specific vegetation change and on the background climate. To date these effects are not accounted for in land-based climate policies because of the complexity of the phenomena, contrasting model predictions and the lack of global data-driven assessments. To overcome the limitations of available observation-based diagnostics and of the on-going model inter-comparison, here we present a new benchmarking dataset derived from satellite remote sensing. This global dataset provides the potential changes induced by multiple vegetation transitions on the single terms of the surface energy balance. We used this dataset for two major goals: 1) Quantify the impact of actual vegetation changes that occurred during the decade 2000-2010, showing the overwhelming role of tropical deforestation in warming the surface by reducing evapotranspiration despite the concurrent brightening of the Earth. 2) Benchmark a series of ESMs against data-driven metrics of the land cover change impacts on the various terms of the surface energy budget and on the surface temperature. We anticipate that the dataset could be also used to evaluate future scenarios of land cover change and to develop the monitoring, reporting and verification guidelines required for the implementation of mitigation plans that account for biophysical land processes.
A reanalysis dataset of the South China Sea.
Zeng, Xuezhi; Peng, Shiqiu; Li, Zhijin; Qi, Yiquan; Chen, Rongyu
2014-01-01
Ocean reanalysis provides a temporally continuous and spatially gridded four-dimensional estimate of the ocean state for a better understanding of the ocean dynamics and its spatial/temporal variability. Here we present a 19-year (1992-2010) high-resolution ocean reanalysis dataset of the upper ocean in the South China Sea (SCS) produced from an ocean data assimilation system. A wide variety of observations, including in-situ temperature/salinity profiles, ship-measured and satellite-derived sea surface temperatures, and sea surface height anomalies from satellite altimetry, are assimilated into the outputs of an ocean general circulation model using a multi-scale incremental three-dimensional variational data assimilation scheme, yielding a daily high-resolution reanalysis dataset of the SCS. Comparisons between the reanalysis and independent observations support the reliability of the dataset. The presented dataset provides the research community of the SCS an important data source for studying the thermodynamic processes of the ocean circulation and meso-scale features in the SCS, including their spatial and temporal variability.
NASA Technical Reports Server (NTRS)
Claverie, Martin; Matthews, Jessica L.; Vermote, Eric F.; Justice, Christopher O.
2016-01-01
In land surface models, which are used to evaluate the role of vegetation in the context of global climate change and variability, LAI and FAPAR play a key role, specifically with respect to the carbon and water cycles. The AVHRR-based LAI/FAPAR dataset offers daily temporal resolution, an improvement over previous products. This climate data record is based on a carefully calibrated and corrected land surface reflectance dataset to provide a high-quality, consistent time series suitable for climate studies. It spans from mid-1981 to the present. Further, this operational dataset is available in near real-time, allowing use for monitoring purposes. The algorithm relies on artificial neural networks calibrated using the MODIS LAI/FAPAR dataset. Evaluation based on cross-comparison with MODIS products and in situ data shows the dataset is consistent and reliable, with overall uncertainties of 1.03 and 0.15 for LAI and FAPAR, respectively. However, a clear saturation effect is observed in the broadleaf forest biomes with high LAI (greater than 4.5) and FAPAR (greater than 0.8) values.
NASA Astrophysics Data System (ADS)
Brotas, Vanda; Valente, André; Couto, André B.; Grant, Mike; Chuprin, Andrei; Jackson, Thomas; Groom, Steve; Sathyendranath, Shubha
2014-05-01
Ocean colour (OC) is an oceanic Essential Climate Variable (ECV) used by climate modellers and researchers. The European Space Agency (ESA) Climate Change Initiative (CCI) is the ESA response to the need for climate-quality satellite data, with the goal of providing stable, long-term, satellite-based ECV data products. The ESA Ocean Colour CCI focuses on the production of the Ocean Colour ECV, using remote sensing reflectances to derive inherent optical properties and chlorophyll-a concentration from ESA's MERIS (2002-2012) and NASA's SeaWiFS (1997-2010) and MODIS (2002-2012) sensor archives. This work presents an integrated approach by setting up a global database of in situ measurements and by inter-comparing OC-CCI products with precursor datasets. The availability of in situ databases is fundamental for the validation of satellite-derived ocean colour products. A globally distributed in situ database was assembled from several pre-existing datasets, with data spanning 1997 to 2012. It includes in situ measurements of remote sensing reflectances, chlorophyll-a concentration, inherent optical properties and the diffuse attenuation coefficient. The database is composed of observations from the following datasets: NOMAD, SeaBASS, MERMAID, AERONET-OC, BOUSSOLE and HOTS. The result was a merged dataset tuned for the validation of satellite-derived ocean colour products. This effort gathered, homogenized and merged a large body of high-quality bio-optical marine in situ data, since using all datasets in a single validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. An inter-comparison analysis between the OC-CCI chlorophyll-a product and satellite precursor datasets was carried out with single-mission and merged-mission products. Single-mission datasets considered were SeaWiFS, MODIS-Aqua and MERIS; merged-mission datasets were obtained from GlobColour (GC) as well as the Making Earth Science Data Records for Use in Research Environments (MEaSUREs) project. The OC-CCI product was found to be most similar to the SeaWiFS record and, in general, more similar to records derived from single-mission than from merged-mission initiatives. Results suggest that the CCI product is a more consistent dataset than other available merged-mission initiatives. In conclusion, climate-related science requires long-term data records to provide robust results; the OC-CCI product proves to be a worthy data record for climate research, as it combines multi-sensor OC observations to provide a >15-year, global, error-characterized record.
NASA Cold Land Processes Experiment (CLPX 2002/03): Atmospheric analyses datasets
Glen E. Liston; Daniel L. Birkenheuer; Christopher A. Hiemstra; Donald W. Cline; Kelly Elder
2008-01-01
This paper describes the Local Analysis and Prediction System (LAPS) and the 20-km horizontal grid version of the Rapid Update Cycle (RUC20) atmospheric analyses datasets, which are available as part of the Cold Land Processes Field Experiment (CLPX) data archive. The LAPS dataset contains spatially and temporally continuous atmospheric and surface variables over...
Evaluation of Ten Methods for Initializing a Land Surface Model
NASA Technical Reports Server (NTRS)
Rodell, M.; Houser, P. R.; Berg, A. A.; Famiglietti, J. S.
2005-01-01
Land surface models (LSMs) are computer programs, similar to weather and climate prediction models, which simulate the stocks and fluxes of water (including soil moisture, snow, evaporation, and runoff) and energy (including the temperature of and sensible heat released from the soil) after they arrive on the land surface as precipitation and sunlight. It is not currently possible to measure all of the variables of interest everywhere on Earth with sufficient accuracy and space-time resolution. Hence LSMs have been developed to integrate the available observations with our understanding of the physical processes involved, using powerful computers, in order to map these stocks and fluxes as they change in time. The maps are used to improve weather forecasts, support water resources and agricultural applications, and study the Earth's water cycle and climate variability. NASA's Global Land Data Assimilation System (GLDAS) project facilitates testing of several different LSMs with a variety of input datasets (e.g., precipitation, plant type).
NASA Technical Reports Server (NTRS)
Ruane, Alex C.; Goldberg, Richard; Chryssanthacopoulos, James
2014-01-01
The AgMERRA and AgCFSR climate forcing datasets provide daily, high-resolution, continuous, meteorological series over the 1980-2010 period designed for applications examining the agricultural impacts of climate variability and climate change. These datasets combine daily resolution data from retrospective analyses (the Modern-Era Retrospective Analysis for Research and Applications, MERRA, and the Climate Forecast System Reanalysis, CFSR) with in situ and remotely-sensed observational datasets for temperature, precipitation, and solar radiation, leading to substantial reductions in bias in comparison to a network of 2324 agricultural-region stations from the Hadley Integrated Surface Dataset (HadISD). Results compare favorably against the original reanalyses as well as the leading climate forcing datasets (Princeton, WFD, WFD-EI, and GRASP), and AgMERRA distinguishes itself with substantially improved representation of daily precipitation distributions and extreme events owing to its use of the MERRA-Land dataset. These datasets also peg relative humidity to the maximum temperature time of day, allowing for more accurate representation of the diurnal cycle of near-surface moisture in agricultural models. AgMERRA and AgCFSR enable a number of ongoing investigations in the Agricultural Model Intercomparison and Improvement Project (AgMIP) and related research networks, and may be used to fill gaps in historical observations as well as to provide a basis for the generation of future climate scenarios.
NASA Astrophysics Data System (ADS)
Sure, A.; Dikshit, O.
2017-12-01
Root zone soil moisture (RZSM) is an important element in hydrology and agriculture. The estimation of RZSM provides insight for selecting the appropriate crops for specific soil conditions (soil type, bulk density, etc.). RZSM governs various vadose zone phenomena and subsequently affects groundwater processes. With various satellite sensors dedicated to estimating surface soil moisture at different spatial and temporal resolutions, estimation of soil moisture at the root zone level for the Indo-Gangetic basin, which has a complex, heterogeneous environment, is quite challenging. This study aims at estimating RZSM and understanding its variation across the Indo-Gangetic basin with changing land use/land cover, topography, crop cycles, soil properties, temperature and precipitation patterns, using two satellite-derived soil moisture datasets operating at distinct frequencies with different principles of acquisition. The two surface soil moisture datasets are derived from the AMSR-2 (6.9 GHz, C band) and SMOS (1.4 GHz, L band) passive microwave sensors with coarse spatial resolution. The Soil Water Index (SWI), which accounts for soil moisture from the surface downwards, is derived by considering a theoretical two-layered water balance model and contributes to ascertaining soil moisture in the vadose zone. This index is evaluated against the widely used modelled soil moisture dataset of GLDAS-NOAH, version 2.1. This research broadens the use of modelled soil moisture datasets where ground data are unavailable. The coupling between surface soil moisture and RZSM is analysed for two years (2015-16) by defining a parameter T, the characteristic time length. The study demonstrates that the optimal value of T for estimating SWI at a given location is a function of various factors such as land, meteorological and agricultural characteristics.
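Note: the abstract does not give the exact SWI formulation used, so the following is an assumption; a minimal sketch of the widely used exponential-filter form of the Soil Water Index, in which the characteristic time length T controls how strongly past surface retrievals influence the root-zone estimate.

```python
import numpy as np

def soil_water_index(t_obs, ssm, t_eval, T=20.0):
    """Exponential-filter Soil Water Index (SWI).

    t_obs  : 1-D array of observation times (days) of surface soil moisture
    ssm    : 1-D array of surface soil moisture values at t_obs
    t_eval : time (days) at which the SWI is evaluated
    T      : characteristic time length (days) coupling the surface to the root zone
    """
    mask = t_obs <= t_eval                           # use only past observations
    weights = np.exp(-(t_eval - t_obs[mask]) / T)    # recent retrievals weigh more
    return np.sum(weights * ssm[mask]) / np.sum(weights)

# Example: synthetic daily retrievals, SWI evaluated on day 30 with T = 20 days
days = np.arange(31, dtype=float)
ssm = 0.25 + 0.05 * np.sin(days / 5.0)
print(soil_water_index(days, ssm, t_eval=30.0, T=20.0))
```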
Status update: is smoke on your mind? Using social media to assess smoke exposure
NASA Astrophysics Data System (ADS)
Ford, Bonne; Burke, Moira; Lassman, William; Pfister, Gabriele; Pierce, Jeffrey R.
2017-06-01
Exposure to wildland fire smoke is associated with negative effects on human health. However, these effects are poorly quantified. Accurately attributing health endpoints to wildland fire smoke requires determining the locations, concentrations, and durations of smoke events. Most current methods for assessing these smoke events (ground-based measurements, satellite observations, and chemical transport modeling) are limited temporally, spatially, and/or by their level of accuracy. In this work, we explore using daily social media posts from Facebook regarding smoke, haze, and air quality to assess population-level exposure for the summer of 2015 in the western US. We compare this de-identified, aggregated Facebook dataset to several other datasets that are commonly used for estimating exposure, such as satellite observations (MODIS aerosol optical depth and Hazard Mapping System smoke plumes), daily (24 h) average surface particulate matter measurements, and model-simulated (WRF-Chem) surface concentrations. After adding population-weighted spatial smoothing to the Facebook data, this dataset is well correlated (R2 generally above 0.5) with the other methods in smoke-impacted regions. The Facebook dataset is better correlated with surface measurements of PM2.5 at a majority of monitoring sites (163 of 293 sites) than the satellite observations and our model simulation. We also present an example case for Washington state in 2015, for which we combine this Facebook dataset with MODIS observations and WRF-Chem-simulated PM2.5 in a regression model. We show that the addition of the Facebook data improves the regression model's ability to predict surface concentrations. This high correlation of the Facebook data with surface monitors and our Washington state example suggests that this social-media-based proxy can be used to estimate smoke exposure in locations without direct ground-based particulate matter measurements.
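Note: the abstract does not specify the form of the regression combining the Facebook proxy with MODIS and WRF-Chem fields; the sketch below is a generic ordinary-least-squares multiple regression with synthetic, hypothetical predictors, intended only to illustrate the kind of model described.

```python
import numpy as np

# Hypothetical daily predictors for one region: Facebook smoke-post rate,
# MODIS AOD, and WRF-Chem simulated surface PM2.5; target is monitored PM2.5.
rng = np.random.default_rng(0)
n = 120
facebook_rate = rng.gamma(2.0, 1.0, n)
modis_aod = rng.gamma(1.5, 0.3, n)
wrfchem_pm25 = rng.gamma(3.0, 4.0, n)
observed_pm25 = (2.0 + 3.5 * facebook_rate + 10.0 * modis_aod
                 + 0.6 * wrfchem_pm25 + rng.normal(0.0, 2.0, n))

# Ordinary least squares: observed_pm25 ~ intercept + three predictors
X = np.column_stack([np.ones(n), facebook_rate, modis_aod, wrfchem_pm25])
coef, *_ = np.linalg.lstsq(X, observed_pm25, rcond=None)
predicted = X @ coef

# Fraction of variance explained by the combined regression
ss_res = np.sum((observed_pm25 - predicted) ** 2)
ss_tot = np.sum((observed_pm25 - observed_pm25.mean()) ** 2)
print("coefficients:", coef, "R^2:", 1.0 - ss_res / ss_tot)
```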
NASA Astrophysics Data System (ADS)
Bremer, Magnus; Sass, Oliver; Vetter, Michael; Geilhausen, Martin
2010-05-01
Country-wide ALS datasets of high resolution become more and more available and can provide a solid basis for geomorphological research. On the other hand, terrain changes after geomorphological extreme events can be quickly and flexibly documented by TLS and be compared to the pre-existing ALS datasets. For quantifying net-erosion, net-sedimentation and transport rates of events like rock falls, landslides and debris flows, comparing TLS surveys after the event to ALS data before the event is likely to become a widespread and powerful tool. However, the accuracy and possible errors of fitting ALS and TLS data have to be carefully assessed. We tried to quantify sediment movement and terrain changes caused by a major debris-flow event in the Halltal in the Karwendel Mountains (Tyrol, Austria). Wide areas of limestone debris were dissected and relocated in the course of an exceptional rainstorm event on 29th June 2008. The event occurred 64 years after wildfire-driven deforestation. In the area, dense dwarf pine (Pinus mugo) shrub cover is widespread, causing specific problems in generating terrain models. We compared a pre-event ALS dataset, provided by the federal state of Tyrol, and a post-event TLS survey. The two scanner systems have differing system characteristics (scan angles, resolutions, application of dGPS, etc.), causing different systematic and random errors. Combining TLS and ALS point data was achieved using an algorithm of the RISCAN_PRO software (Multi Station Adjustment), enabling a least-squares fitting between the two surfaces. Adjustment and registration accuracies as well as the quality of applied vegetation filters, mainly eliminating non-groundpoints from the raw data, are crucial for the generation of high-quality terrain models and a reliable comparison of the two data sets. Readily available filter algorithms provide good performance for gently sloped terrain and high forest vegetation. However, the low krummholz vegetation on steep terrain proved difficult to filter. This is due to a small height difference between terrain and canopy, a very strong height variation of the terrain points compared to the height variation of the canopy points and a very high density of the vegetation. The latter leads to very low percentages of groundpoints (1-5%). A combined filtering approach using a surface-based filter and a morphological filter, adapted to the characteristics of the krummholz vegetation, was applied to overcome these problems. In the next step, the datasets were compared, and erosion and sedimentation areas were detected and quantified (cut-and-fill) in view of the accuracy achieved. The positions of the relocated surface areas were compared to the morphological structures of the initial surface (inclination, curvature, flowpaths, hydrological catchments). Considerable deviations between the datasets were caused, besides the geomorphic terrain changes, by systematic and random errors. Due to the scanner perspective, parts of the steep slopes are depicted inaccurately by ALS. Rugged terrain surfaces cause random errors of ALS/TLS adjustment when the ratio of point density to surface variability is low. Due to multiple returns and alteration of pulse shape, terrain altitude is frequently overestimated when dense shrub cover is present. This effect becomes stronger with larger footprints. Despite these problems, erosional and depositional areas of debris flows could be clearly identified and match the results of field surveys.
Strongest erosion occurred along the flowpaths with the greatest runoff concentration, mainly at the bedrock-debris interface.
NHDPlusHR: A national geospatial framework for surface-water information
Viger, Roland; Rea, Alan H.; Simley, Jeffrey D.; Hanson, Karen M.
2016-01-01
The U.S. Geological Survey is developing a new geospatial hydrographic framework for the United States, called the National Hydrography Dataset Plus High Resolution (NHDPlusHR), that integrates a diversity of the best-available information, robustly supports ongoing dataset improvements, enables hydrographic generalization to derive alternate representations of the network while maintaining feature identity, and supports modern scientific computing and Internet accessibility needs. This framework is based on the High Resolution National Hydrography Dataset, the Watershed Boundaries Dataset, and elevation from the 3-D Elevation Program, and will provide an authoritative, high precision, and attribute-rich geospatial framework for surface-water information for the United States. Using this common geospatial framework will provide a consistent basis for indexing water information in the United States, eliminate redundancy, and harmonize access to, and exchange of water information.
Version 2 Goddard Satellite-Based Surface Turbulent Fluxes (GSSTF2)
NASA Technical Reports Server (NTRS)
Chou, Shu-Hsien; Nelkin, Eric; Ardizzone, Joe; Atlas, Robert M.; Shie, Chung-Lin; Starr, David O'C. (Technical Monitor)
2002-01-01
Information on the turbulent fluxes of momentum, moisture, and heat at the air-sea interface is essential in improving model simulations of climate variations and in climate studies. We have derived a 13.5-year (July 1987-December 2000) dataset of daily surface turbulent fluxes over global oceans from the Special Sensor Microwave/Imager (SSM/I) radiance measurements. This dataset, version 2 Goddard Satellite-based Surface Turbulent Fluxes (GSSTF2), has a spatial resolution of 1 degree x 1 degree latitude-longitude and a temporal resolution of 1 day. Turbulent fluxes are derived from the SSM/I surface winds and surface air humidity, as well as the 2-m air and sea surface temperatures (SST) of the NCEP/NCAR reanalysis, using a bulk aerodynamic algorithm based on the surface layer similarity theory.
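Note: the GSSTF2 algorithm itself is not reproduced here; the sketch below shows the generic bulk-aerodynamic formulas that underlie such flux datasets. The constant exchange coefficients are an illustrative simplification, as similarity-theory algorithms make them stability- and wind-dependent.

```python
RHO_AIR = 1.2      # air density (kg m-3)
CP = 1004.0        # specific heat of air (J kg-1 K-1)
LV = 2.5e6         # latent heat of vaporization (J kg-1)

def bulk_fluxes(wind, t_sea, t_air, q_sea, q_air,
                cd=1.2e-3, ch=1.1e-3, ce=1.2e-3):
    """Generic bulk-aerodynamic surface turbulent fluxes.

    wind          : near-surface wind speed (m s-1)
    t_sea, t_air  : sea surface and near-surface air temperature (K)
    q_sea, q_air  : saturation specific humidity at the SST and air
                    specific humidity (kg kg-1)
    cd, ch, ce    : illustrative constant exchange coefficients
    """
    tau = RHO_AIR * cd * wind ** 2                     # momentum flux (N m-2)
    shf = RHO_AIR * CP * ch * wind * (t_sea - t_air)   # sensible heat flux (W m-2)
    lhf = RHO_AIR * LV * ce * wind * (q_sea - q_air)   # latent heat flux (W m-2)
    return tau, shf, lhf

print(bulk_fluxes(wind=8.0, t_sea=300.0, t_air=298.5, q_sea=0.022, q_air=0.017))
```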
Operational use of open satellite data for marine water quality monitoring
NASA Astrophysics Data System (ADS)
Symeonidis, Panagiotis; Vakkas, Theodoros
2017-09-01
The purpose of this study was to develop an operational platform for marine water quality monitoring using near real-time satellite data. The developed platform utilizes free and open satellite data available from different data sources like COPERNICUS, the European Earth Observation Programme, or NASA, from different satellites and instruments. The quality of the marine environment is operationally evaluated using parameters like chlorophyll-a concentration, water color and Sea Surface Temperature (SST). For each parameter, there is more than one dataset available, from different data sources or satellites, to allow users to select the most appropriate dataset for their area or time of interest. The above datasets are automatically downloaded from the data providers' services and ingested into the central spatial engine. The spatial data platform uses the PostgreSQL database with the PostGIS extension for spatial data storage and GeoServer for the provision of the spatial data services. The system provides daily, 10-day and monthly maps and time series of the above parameters. The information is provided using a web client which is based on the GET SDI PORTAL, an easy-to-use and feature-rich geospatial visualization and analysis platform. The users can examine the temporal variation of the parameters using a simple time animation tool. In addition, with just one click on the map, the system provides an interactive time series chart for any of the parameters of the available datasets. The platform can be offered as Software as a Service (SaaS) for any area in the Mediterranean region.
Extensive validation of CM SAF surface radiation products over Europe.
Urraca, Ruben; Gracia-Amillo, Ana M; Koubli, Elena; Huld, Thomas; Trentmann, Jörg; Riihelä, Aku; Lindfors, Anders V; Palmer, Diane; Gottschalg, Ralph; Antonanzas-Torres, Fernando
2017-09-15
This work presents a validation of three satellite-based radiation products over an extensive network of 313 pyranometers across Europe, from 2005 to 2015. The products used have been developed by the Satellite Application Facility on Climate Monitoring (CM SAF) and are one geostationary climate dataset (SARAH-JRC), one polar-orbiting climate dataset (CLARA-A2) and one geostationary operational product. Further, the ERA-Interim reanalysis is also included in the comparison. The main objective is to determine the quality level of the daily means of the CM SAF datasets, identifying their limitations, as well as analyzing the different factors that can interfere with an adequate validation of the products. The quality of the pyranometer was the most critical source of uncertainty identified. In this respect, the use of records from Second Class pyranometers and silicon-based photodiodes increased the absolute error and the bias, as well as the dispersion of both metrics, preventing an adequate validation of the daily means. The best spatial estimates for the three datasets were obtained in Central Europe with a Mean Absolute Deviation (MAD) within 8-13 W/m2, whereas the MAD always increased at high latitudes, snow-covered surfaces, high mountain ranges and coastal areas. Overall, SARAH-JRC's accuracy was demonstrated over a dense network of stations, making it the most consistent dataset for climate monitoring applications. The operational dataset was comparable to SARAH-JRC in Central Europe but lacked the temporal stability of climate datasets, while CLARA-A2 did not achieve the same level of accuracy, although its predictions showed high uniformity with a small negative bias. The ERA-Interim reanalysis shows by far the largest deviations from the surface reference measurements.
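Note: the exact definitions used by the authors are not reproduced here; a minimal sketch, assuming bias and MAD are computed from paired daily means of satellite and pyranometer irradiance.

```python
import numpy as np

def validation_metrics(satellite, station):
    """Bias and mean absolute deviation (MAD) of daily-mean irradiance.

    satellite, station : 1-D arrays of daily means (W m-2), same length,
                         with NaN marking missing station records.
    """
    valid = ~np.isnan(satellite) & ~np.isnan(station)
    diff = satellite[valid] - station[valid]
    return diff.mean(), np.abs(diff).mean()

sat = np.array([180.0, 210.0, 95.0, 160.0, np.nan])
obs = np.array([175.0, 220.0, 100.0, 150.0, 170.0])
bias, mad = validation_metrics(sat, obs)
print(f"bias = {bias:.1f} W/m2, MAD = {mad:.1f} W/m2")
```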
NASA Technical Reports Server (NTRS)
deGoncalves, Luis Gustavo G.; Shuttleworth, William J.; Vila, Daniel; Larroza, Elaine; Bottino, Marcus J.; Herdies, Dirceu L.; Aravequia, Jose A.; De Mattos, Joao G. Z.; Toll, David L.; Rodell, Matthew;
2008-01-01
The definition and derivation of a 5-year, 0.125-degree, 3-hourly atmospheric forcing dataset for the South America continent is described which is appropriate for use in a Land Data Assimilation System and which, because of the limited surface observational networks available in this region, uses remotely sensed data merged with surface observations as the basis for the precipitation and downward shortwave radiation fields. The quality of this dataset is evaluated against available surface observations. There are regional differences in the biases for all variables in the dataset, with biases in precipitation of the order 0-1 mm/day and RMSE of 5-15 mm/day, biases in surface solar radiation of the order 10 W/sq m and RMSE of 20 W/sq m, positive biases in temperature typically between 0 and 4 K, depending on region, and positive biases in specific humidity around 2-3 g/kg in tropical regions and negative biases around 1-2 g/kg further south.
Cole, Christopher J.; Friesen, Beverly A.; Wilson, Earl M.; Wilds, Stanley R.; Noble, Suzanne M.
2015-01-01
This surface-water cover dataset was created as a timely representation of post-flood ground conditions to support response efforts. This dataset and all processed imagery and derived products were uploaded to the USGS Hazards Data Distribution System (HDDS) website (http://hddsexplorer.usgs.gov/uplift/hdds/) for distribution to those responding to the flood event.
Observation and numerical simulation of a convective initiation during COHMEX
NASA Technical Reports Server (NTRS)
Song, J. Aaron; Kaplan, Michael L.
1991-01-01
Under a synoptically undisturbed condition, a dual-peak convective lifecycle was observed with the COoperative Huntsville Meteorological EXperiment (COHMEX) observational network over a 24-hour period. The lifecycle included a multicell storm, which lasted about 6 hours, produced a peak rainrate exceeding 100 mm/hr, and initiated a downstream mesoscale convective system. The 24-hour accumulated rainfall of this event was the largest during the entire COHMEX. The downstream mesoscale convective system, unfortunately, was difficult to investigate quantitatively due to the lack of mesoscale observations. The dataset collected near the time of the multicell storm evolution, including its initiation, was one of the best datasets of COHMEX. In this study, the initiation of this multicell storm is chosen as the target of the numerical simulations.
LOGISMOS-B for primates: primate cortical surface reconstruction and thickness measurement
NASA Astrophysics Data System (ADS)
Oguz, Ipek; Styner, Martin; Sanchez, Mar; Shi, Yundi; Sonka, Milan
2015-03-01
Cortical thickness and surface area are important morphological measures with implications for many psychiatric and neurological conditions. Automated segmentation and reconstruction of the cortical surface from 3D MRI scans is challenging due to the variable anatomy of the cortex and its highly complex geometry. While many methods exist for this task in the context of the human brain, these methods are typically not readily applicable to the primate brain. We propose an innovative approach based on our recently proposed human cortical reconstruction algorithm, LOGISMOS-B, and the Laplace-based thickness measurement method. Quantitative evaluation of our approach was performed based on a dataset of T1- and T2-weighted MRI scans from 12-month-old macaques where labeling by our anatomical experts was used as independent standard. In this dataset, LOGISMOS-B has an average signed surface error of 0.01 +/- 0.03mm and an unsigned surface error of 0.42 +/- 0.03mm over the whole brain. Excluding the rather problematic temporal pole region further improves unsigned surface distance to 0.34 +/- 0.03mm. This high level of accuracy reached by our algorithm even in this challenging developmental dataset illustrates its robustness and its potential for primate brain studies.
Automatic 3D liver location and segmentation via convolutional neural network and graph cut.
Lu, Fang; Wu, Fa; Hu, Peijun; Peng, Zhiyi; Kong, Dexing
2017-02-01
Segmentation of the liver from abdominal computed tomography (CT) images is an essential step in some computer-assisted clinical interventions, such as surgery planning for living donor liver transplant, radiotherapy and volume measurement. In this work, we develop a deep learning algorithm with graph cut refinement to automatically segment the liver in CT scans. The proposed method consists of two main steps: (i) simultaneous liver detection and probabilistic segmentation using a 3D convolutional neural network; (ii) accuracy refinement of the initial segmentation with graph cut and the previously learned probability map. The proposed approach was validated on forty CT volumes taken from two public databases, MICCAI-Sliver07 and 3Dircadb1. For the MICCAI-Sliver07 test dataset, the calculated mean values of volumetric overlap error (VOE), relative volume difference (RVD), average symmetric surface distance (ASD), root-mean-square symmetric surface distance (RMSD) and maximum symmetric surface distance (MSD) are 5.9%, 2.7%, 0.91 mm, 1.88 mm and 18.94 mm, respectively. For the 3Dircadb1 dataset, the calculated mean values of VOE, RVD, ASD, RMSD and MSD are 9.36%, 0.97%, 1.89 mm, 4.15 mm and 33.14 mm, respectively. The proposed method is fully automatic without any user interaction. Quantitative results reveal that the proposed approach is efficient and accurate for hepatic volume estimation in a clinical setup. The high correlation between the automatic and manual references shows that the proposed method can be good enough to replace the time-consuming and nonreproducible manual segmentation method.
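Note: for readers unfamiliar with the overlap metrics named above, a small illustrative implementation of VOE and RVD is given below (the surface-distance metrics require mesh distance computations and are omitted); the toy masks are hypothetical, not taken from either database.

```python
import numpy as np

def voe_rvd(seg, ref):
    """Volumetric overlap error (VOE, %) and relative volume difference
    (RVD, %) between a binary segmentation and a reference mask."""
    seg = seg.astype(bool)
    ref = ref.astype(bool)
    intersection = np.logical_and(seg, ref).sum()
    union = np.logical_or(seg, ref).sum()
    voe = 100.0 * (1.0 - intersection / union)
    rvd = 100.0 * (seg.sum() - ref.sum()) / ref.sum()
    return voe, rvd

# Toy 3-D masks standing in for liver segmentations
ref = np.zeros((20, 20, 20), dtype=bool)
ref[5:15, 5:15, 5:15] = True
seg = np.zeros_like(ref)
seg[6:15, 5:15, 5:16] = True
print(voe_rvd(seg, ref))
```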
Mashburn, Shana L.; Winton, Kimberly T.
2010-01-01
This CD-ROM contains spatial datasets that describe natural and anthropogenic features and county-level estimates of agricultural pesticide use and pesticide data for surface-water, groundwater, and biological specimens in the state of Oklahoma. County-level estimates of pesticide use were compiled from the Pesticide National Synthesis Project of the U.S. Geological Survey, National Water-Quality Assessment Program. Pesticide data for surface water, groundwater, and biological specimens were compiled from the U.S. Geological Survey National Water Information System database. The spatial datasets that describe natural and manmade features were compiled from several agencies and contain information collected by the U.S. Geological Survey. The U.S. Geological Survey datasets were not collected specifically for this compilation, but were previously collected for projects with various objectives. The spatial datasets were created by different agencies from sources with varied quality. As a result, features common to multiple layers may not overlay exactly. Users should check the metadata to determine proper use of these spatial datasets. These data were not checked for accuracy or completeness. If a question of accuracy or completeness arises, the user should contact the originator cited in the metadata.
NASA Astrophysics Data System (ADS)
Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Pohlmann, Holger; Müller, Wolfgang; Cubasch, Ulrich
2014-05-01
Decadal forecasting of climate variability is a growing need for different parts of society, industry and the economy. The German initiative MiKlip (www.fona-miklip.de) focuses on the ongoing processes of medium-term climate prediction. This major scientific project, funded by the German Federal Ministry of Education and Research (BMBF), develops a forecast system that aims for reliable predictions on decadal timescales. Using a single earth system model from the Max Planck Institute (MPI-ESM), and moving from uninitialized runs to the first initialized 'Coupled Model Intercomparison Project Phase 5' (CMIP5) hindcast experiments, identified possibilities and open scientific tasks. The MiKlip decadal prediction system was improved in different aspects through new initialization techniques and datasets of the ocean and atmosphere. To accompany and emphasize such an improvement of a forecast system, a standardized evaluation system designed by the MiKlip sub-project 'Integrated data and evaluation system for decadal scale prediction' (INTEGRATION) analyzes every step of its evolution. This study aims at combining deterministic and probabilistic skill scores of this prediction system from its uninitialized state to anomaly and then full-field oceanic initialization. The improved forecast skill in these different decadal hindcast experiments of surface air temperature and precipitation in the Pacific region and the complex area of the North Atlantic illustrates potential sources of skill. A standardized evaluation helps a prediction system under development find its way to producing reliable forecasts. Different aspects of these research dependencies, e.g. ensemble size, resolution, initialization, etc., will be discussed.
Impacts of land cover changes on climate trends in Jiangxi province China.
Wang, Qi; Riemann, Dirk; Vogt, Steffen; Glaser, Rüdiger
2014-07-01
Land-use/land-cover (LULC) change is an important climate forcing, and is also affected by climate change. In the present study, we aimed to assess the regional-scale impact of LULC on climate change using Jiangxi Province, China, as a case study. To obtain reliable climate trends, we applied the standard normal homogeneity test (SNHT) to surface air temperature and precipitation data for the period 1951-1999. We also compared the temperature trends computed from Global Historical Climatology Network (GHCN) datasets and from our analysis. To examine the regional impacts of land surface types on surface air temperature and precipitation change, integrating regional topography, we used the observation minus reanalysis (OMR) method. Precipitation series were found to be homogeneous. Comparison of the GHCN and our analysis of adjusted temperatures indicated that the resulting climate trends varied slightly from dataset to dataset. OMR trends associated with surface vegetation types revealed a strong surface warming response to land barrenness and a weak warming response to land greenness. A total of 81.1% of the surface warming over vegetation index areas (0-0.2) was attributed to surface vegetation type change and regional topography. The contribution of surface vegetation type change decreases as land cover greenness increases. The OMR precipitation trend has a weak dependence on surface vegetation type change. We suggest that LULC, integrating regional topography, should be considered as a forcing in regional climate modeling.
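Note: the paper's exact processing is not reproduced here; a minimal sketch of the observation-minus-reanalysis idea, assuming it reduces to differencing linear trends from station and reanalysis series, with synthetic series standing in for real data.

```python
import numpy as np

def linear_trend(years, series):
    """Least-squares linear trend, converted to units per decade."""
    slope, _intercept = np.polyfit(years, series, 1)
    return 10.0 * slope

def omr_trend(years, observed, reanalysis):
    """Observation-minus-reanalysis (OMR) trend: the part of the observed
    trend not captured by the reanalysis, often attributed to land-surface
    effects the reanalysis does not resolve."""
    return linear_trend(years, observed) - linear_trend(years, reanalysis)

years = np.arange(1951, 2000, dtype=float)
rng = np.random.default_rng(1)
reanalysis = 0.010 * (years - 1951) + rng.normal(0.0, 0.1, years.size)
observed = reanalysis + 0.006 * (years - 1951)   # extra warming from land-surface change
print(f"OMR trend: {omr_trend(years, observed, reanalysis):.2f} K/decade")
```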
NASA Astrophysics Data System (ADS)
Green, D. N.; Neuberg, J.
2005-04-01
In March 2004, during a period of no magma extrusion at Soufrière Hills volcano, Montserrat, an explosive event occurred with little precursory activity. Recorded broadband seismic signals ranged from an ultra-long-period signal with a dominant period of 120 s to impulsive, short-duration events containing frequencies up to 30 Hz. Synthetic displacement functions were fit to the long-period data after application of the seismometer response. These indicate that a shallow collapse of the volcanic edifice occurred, initiated ~300 m below the surface and lasting ~100 s. Infrasonic tremor and pulses were also recorded in the 1-20 Hz range. The high-frequency seismicity and infrasound are interpreted as the subsequent collapse of a gravitationally unstable buttress of remnant dome material which impacted upon the edifice surface. This unique dataset demonstrates the benefits of deploying multi-parameter stations equipped with broadband instruments.
NASA Astrophysics Data System (ADS)
Sledd, A.; L'Ecuyer, T. S.
2017-12-01
With Arctic sea ice declining rapidly and Arctic temperatures rising faster than the rest of the globe, a better understanding of the Arctic climate, and ice cover-radiation feedbacks in particular, is needed. Here we present the Arctic Observation and Reanalysis Integrated System (ArORIS), a dataset of integrated products to facilitate studying the Arctic using satellite, reanalysis, and in-situ datasets. The data include cloud properties, radiative fluxes, aerosols, meteorology, precipitation, and surface properties, to name just a few. Each dataset has uniform grid-spacing, time-averaging and naming conventions for ease of use between products. One intended use of ArORIS is to assess Arctic radiation and moisture budgets. Following that goal, we use observations from ArORIS - CERES-EBAF radiative fluxes and NSIDC sea ice fraction and area to quantify relationships between the Arctic energy balance and surface properties. We find a discernable difference between energy budgets for years with high and low September sea ice areas. Surface fluxes are especially responsive to the September sea ice minimum in months both leading up to September and the months following. In particular, longwave fluxes at the surface show increased sensitivity in the months preceding September. Using a single-layer model of solar radiation we also investigate the individual responses of surface and planetary albedos to changes in sea ice area. By partitioning the planetary albedo into surface and atmospheric contributions, we find that the atmospheric contribution to planetary albedo is less sensitive to changes in sea ice area than the surface contribution. Further comparisons between observations and reanalyses can be made using the available datasets in ArORIS.
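Note: the "single-layer model of solar radiation" is not specified in the abstract; one common single-layer partition, assumed here, treats the atmosphere as a layer with reflectance R and transmittance T over a surface of albedo a, so that multiple surface-atmosphere reflections give a planetary albedo of R + aT^2/(1 - aR), with the second term being the surface contribution.

```python
def planetary_albedo(atm_reflectance, atm_transmittance, surface_albedo):
    """Single-layer partition of planetary albedo (one common formulation).

    The atmospheric contribution is the layer reflectance R; the surface
    contribution is sunlight transmitted down, reflected by the surface,
    and transmitted back out, summed over multiple reflections.
    """
    r, t, a = atm_reflectance, atm_transmittance, surface_albedo
    surface_part = a * t ** 2 / (1.0 - a * r)
    return r + surface_part, r, surface_part

# Illustrative values: changing sea ice alters the surface albedo a, but the
# planetary-albedo response is damped by the atmospheric layer above it.
for a in (0.1, 0.6):   # open ocean vs. sea ice
    total, atm, sfc = planetary_albedo(0.3, 0.6, a)
    print(f"a={a:.1f}: planetary={total:.3f} (atmospheric {atm:.3f}, surface {sfc:.3f})")
```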
NASA Astrophysics Data System (ADS)
Xu, Z.; Rhoades, A.; Johansen, H.; Ullrich, P. A.; Collins, W. D.
2017-12-01
Dynamical downscaling is widely used to properly characterize regional surface heterogeneities that shape the local hydroclimatology. However, the factors in dynamical downscaling, including the refinement of model horizontal resolution, large-scale forcing datasets and dynamical cores, have not been fully evaluated. Two cutting-edge global-to-regional downscaling methods are used to assess these factors, specifically the variable-resolution Community Earth System Model (VR-CESM) and the Weather Research & Forecasting (WRF) regional climate model, under different horizontal resolutions (28, 14, and 7 km). Two groups of WRF simulations are driven by either the NCEP reanalysis dataset (WRF_NCEP) or VR-CESM outputs (WRF_VRCESM) to evaluate the effects of the large-scale forcing datasets. The impacts of the dynamical core are assessed by comparing the VR-CESM simulations to the coupled WRF_VRCESM simulations with the same physical parameterizations and similar grid domains. The simulated hydroclimatology (i.e., total precipitation, snow cover, snow water equivalent and surface temperature) is compared with the reference datasets. The large-scale forcing datasets are critical to the WRF simulations in more accurately simulating total precipitation, SWE and snow cover, but not surface temperature. Both the WRF and VR-CESM results highlight that no significant benefit is found in the simulated hydroclimatology by simply refining the horizontal resolution from 28 to 7 km. Simulated surface temperature is sensitive to the choice of dynamical core. WRF generally simulates higher temperatures than VR-CESM, alleviates the systematic cold bias of DJF temperatures over the California mountain region, but overestimates the JJA temperature in California's Central Valley.
Estimating flow-duration and low-flow frequency statistics for unregulated streams in Oregon.
DOT National Transportation Integrated Search
2008-08-01
Flow statistical datasets, basin-characteristic datasets, and regression equations were developed to provide decision makers with surface-water information needed for activities such as water-quality regulation, water-rights adjudication, biological ...
Medical Subject Headings (MeSH) for indexing and retrieving open-source healthcare data.
Marc, David T; Khairat, Saif S
2014-01-01
The US federal government initiated the Open Government Directive, under which federal agencies are required to publish high-value datasets so that they are available to the public. Data.gov and the community site Healthdata.gov were initiated to disseminate such datasets. However, data searches and retrieval for these sites are keyword-driven and severely limited in performance. The purpose of this paper is to address the issue of extracting relevant open-source data by proposing a method of adopting the MeSH framework for indexing and data retrieval. A pilot study was conducted to compare the performance of traditional keywords to MeSH terms for retrieving relevant open-source datasets related to "mortality". The MeSH framework resulted in greater sensitivity with comparable specificity to the keyword search. MeSH showed promise as a method for indexing and retrieving data, yet future research should conduct a larger-scale evaluation of the performance of the MeSH framework for retrieving relevant open-source healthcare datasets.
The optimization of high resolution topographic data for 1D hydrodynamic models
NASA Astrophysics Data System (ADS)
Ales, Ronovsky; Michal, Podhoranyi
2016-06-01
The main focus of our research presented in this paper is to optimize and use high-resolution topographical data (HRTD) for hydrological modelling. Optimization of HRTD is done by generating an adaptive mesh: the distance between a coarse mesh and the surface of the dataset is measured, and the mesh is adapted with the aim of keeping the geometry as close to the initial resolution as possible. The technique described in this paper enables computation of very accurate 1-D hydrodynamic models. In the paper, we use HEC-RAS software as a solver. For comparison, we consider the number of generated cells/grid elements (over the whole discretization domain and in selected cross sections) with respect to preservation of the accuracy of the computational domain. Generation of the mesh for hydrodynamic modelling is strongly reliant on domain size and domain resolution. The topographical dataset used in this paper was created using the LiDAR method and captures a 5.9 km long section of the catchment of the river Olše. We studied crucial changes in topography for the generated mesh. Assessment was done by commonly used statistical and visualization methods.
Taylor, Charles J.; Nelson, Hugh L.
2008-01-01
Geospatial data needed to visualize and evaluate the hydrogeologic framework and distribution of karst features in the Interior Low Plateaus physiographic region of the central United States were compiled during 2004-2007 as part of the Ground-Water Resources Program Karst Hydrology Initiative (KHI) project. Because of the potential usefulness to environmental and water-resources regulators, private consultants, academic researchers, and others, the geospatial data files created during the KHI project are being made available to the public as a provisional regional karst dataset. To enhance accessibility and visualization, the geospatial data files have been compiled as ESRI ArcReader data folders and user interactive Published Map Files (.pmf files), all of which are catalogued by the boundaries of surface watersheds using U.S. Geological Survey (USGS) eight-digit hydrologic unit codes (HUC-8s). Specific karst features included in the dataset include mapped sinkhole locations, sinking (or disappearing) streams, internally drained catchments, karst springs inventoried in the USGS National Water Information System (NWIS) database, relic stream valleys, and karst flow paths obtained from results of previously reported water-tracer tests.
The optimization of high resolution topographic data for 1D hydrodynamic models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ales, Ronovsky, E-mail: ales.ronovsky@vsb.cz; Michal, Podhoranyi
2016-06-08
The main focus of our research presented in this paper is to optimize and use high-resolution topographical data (HRTD) for hydrological modelling. Optimization of HRTD is done by generating an adaptive mesh: the distance between a coarse mesh and the surface of the dataset is measured, and the mesh is adapted with the aim of keeping the geometry as close to the initial resolution as possible. The technique described in this paper enables computation of very accurate 1-D hydrodynamic models. In the paper, we use HEC-RAS software as a solver. For comparison, we consider the number of generated cells/grid elements (over the whole discretization domain and in selected cross sections) with respect to preservation of the accuracy of the computational domain. Generation of the mesh for hydrodynamic modelling is strongly reliant on domain size and domain resolution. The topographical dataset used in this paper was created using the LiDAR method and captures a 5.9 km long section of the catchment of the river Olše. We studied crucial changes in topography for the generated mesh. Assessment was done by commonly used statistical and visualization methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peng, Jialin, E-mail: 2004pjl@163.com; Zhang, Hongbo; Hu, Peijun
Purpose: Efficient and accurate 3D liver segmentations from contrast-enhanced computed tomography (CT) images play an important role in therapeutic strategies for hepatic diseases. However, inhomogeneous appearances, ambiguous boundaries, and large variance in shape often make it a challenging task. The existence of liver abnormalities poses further difficulty. Despite the significant intensity difference, liver tumors should be segmented as part of the liver. This study aims to address these challenges, especially when the target livers contain subregions with distinct appearances. Methods: The authors propose a novel multiregion-appearance based approach with graph cuts to delineate the liver surface. For livers with multiple subregions, a geodesic distance based appearance selection scheme is introduced to utilize a proper appearance constraint for each subregion. A special case of the proposed method, which uses only one appearance constraint to segment the liver, is also presented. The segmentation process is modeled with energy functions incorporating both boundary and region information. Rather than a simple fixed combination, an adaptive balancing weight is introduced and learned from training sets. The proposed method requires only an initialization inside the liver surface. No additional constraints from user interaction are utilized. Results: The proposed method was validated on 50 3D CT images from three datasets, i.e., the Medical Image Computing and Computer Assisted Intervention (MICCAI) training and testing sets, and a local dataset. On the MICCAI testing set, the proposed method achieved a total score of 83.4 ± 3.1, outperforming nonexpert manual segmentation (average score of 75.0). When applying their method to the MICCAI training set and the local dataset, it yielded a mean Dice similarity coefficient (DSC) of 97.7% ± 0.5% and 97.5% ± 0.4%, respectively. These results demonstrated the accuracy of the method when applied to different CT datasets. In addition, operator variability experiments showed its good reproducibility. Conclusions: A multiregion-appearance based method is proposed and evaluated to segment the liver. This approach does not require prior model construction and so eliminates the burdens associated with model construction and matching. The proposed method provides comparable results with state-of-the-art methods. Validation results suggest that it may be suitable for clinical use.
Landfalling Tropical Cyclones: Forecast Problems and Associated Research Opportunities
Marks, F.D.; Shay, L.K.; Barnes, G.; Black, P.; Demaria, M.; McCaul, B.; Mounari, J.; Montgomery, M.; Powell, M.; Smith, J.D.; Tuleya, B.; Tripoli, G.; Xie, Lingtian; Zehr, R.
1998-01-01
The Fifth Prospectus Development Team of the U.S. Weather Research Program was charged to identify and delineate emerging research opportunities relevant to the prediction of local weather, flooding, and coastal ocean currents associated with landfalling U.S. hurricanes specifically, and tropical cyclones in general. Central to this theme are basic and applied research topics, including rapid intensity change, initialization of and parameterization in dynamical models, coupling of atmospheric and oceanic models, quantitative use of satellite information, and mobile observing strategies to acquire observations to evaluate and validate predictive models. To improve the necessary understanding of physical processes and provide the initial conditions for realistic predictions, a focused, comprehensive mobile observing system in a translating storm-coordinate system is required. Given the development of proven instrumentation and improvement of existing systems, three-dimensional atmospheric and oceanic datasets need to be acquired whenever major hurricanes threaten the United States. The spatial context of these focused three-dimensional datasets over the storm scales is provided by satellites, aircraft, expendable probes released from aircraft, and coastal (both fixed and mobile), moored, and drifting surface platforms. To take full advantage of these new observations, techniques need to be developed to objectively analyze these observations, and initialize models aimed at improving prediction of hurricane track and intensity from global-scale to mesoscale dynamical models. Multinested models allow prediction of all scales, from the global scale, which determines long-term hurricane motion, to the convective scale, which affects intensity. Development of an integrated analysis and model forecast system optimizing the use of three-dimensional observations and providing the necessary forecast skill on all relevant spatial scales is required. Detailed diagnostic analyses of these datasets will lead to improved understanding of the physical processes of hurricane motion, intensity change, the atmospheric and oceanic boundary layers, and the air-sea coupling mechanisms. The ultimate aim of this effort is the construction of real-time analyses of storm surge, winds, and rain, prior to and during landfall, to improve warnings and provide local officials with the comprehensive information required for recovery efforts in the hardest hit areas as quickly as possible.
NASA Astrophysics Data System (ADS)
Almazroui, Mansour; Raju, P. V. S.; Yusef, A.; Hussein, M. A. A.; Omar, M.
2018-04-01
In this paper, a nonhydrostatic Weather Research and Forecasting (WRF) model has been used to simulate the extreme precipitation event of 25 November 2009 over Jeddah, Saudi Arabia. The model is integrated in three nested (27, 9, and 3 km) domains with the initial and boundary forcing derived from the NCEP reanalysis datasets. As a control experiment, the model was integrated for 48 h, initialized at 0000 UTC on 24 November 2009. The simulated rainfall in the control experiment is in good agreement with Tropical Rainfall Measuring Mission rainfall estimates in terms of intensity as well as spatio-temporal distribution. Results indicate that a strong low-level (850 hPa) wind over Jeddah and surrounding regions enhanced the moisture and temperature gradient and created a conditionally unstable atmosphere that favored the development of the mesoscale system. To investigate the influences of topography and of surface heat exchange with the atmosphere on the development of the extreme precipitation event, two sensitivity experiments were carried out: one without topography and another without exchange of surface heating to the atmosphere. The results show that both surface heating and topography played a crucial role in determining the spatial distribution and intensity of the extreme rainfall over Jeddah. The topography favored enhanced uplift motion that further strengthened the low-level jet and hence the rainfall over Jeddah and adjacent areas. On the other hand, the absence of surface heating considerably reduced the simulated rainfall, by 30% compared to the observations.
The Everglades Depth Estimation Network (EDEN) surface-water model, version 2
Telis, Pamela A.; Xie, Zhixiao; Liu, Zhongwei; Li, Yingru; Conrads, Paul
2015-01-01
Three applications of the EDEN-modeled water surfaces and other EDEN datasets are presented in the report to show how scientists and resource managers are using EDEN datasets to analyze biological and ecological responses to hydrologic changes in the Everglades. The biological responses of two important Everglades species, alligators and wading birds, to changes in hydrology are described. The effects of hydrology on fire dynamics in the Everglades are also discussed.
Correlation of Gear Surface Fatigue Lives to Lambda Ratio (Specific Film Thickness)
NASA Technical Reports Server (NTRS)
Krantz, Timothy Lewis
2013-01-01
The effect of the lubrication regime on gear performance has been recognized, qualitatively, for decades. Often the lubrication regime is characterized by the specific film thickness, defined as the ratio of lubricant film thickness to the composite surface roughness. Three studies done at NASA to investigate gear pitting life are revisited in this work. All tests were done at a common load. In one study, ground gears were tested using a variety of lubricants that included a range of viscosities, and therefore the gears operated with differing film thicknesses. In the second and third studies, the performance of gears with ground teeth and superfinished teeth was assessed. Thicker oil films provided longer lives, as did improved surface finish. These datasets were combined into a common dataset using the concept of specific film thickness. This unique dataset of more than 258 tests provides gear designers with some qualitative information to make gear design decisions.
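Note: a minimal sketch of the specific film thickness (lambda ratio) as commonly defined, assuming the composite roughness is the root-sum-square of the two surfaces' RMS roughness; the paper's exact roughness convention is not stated here, and the numbers below are illustrative.

```python
import math

def specific_film_thickness(h_min, rms_roughness_1, rms_roughness_2):
    """Lambda ratio: minimum lubricant film thickness divided by the
    composite (root-sum-square) RMS roughness of the two gear surfaces.
    All inputs must be in the same length unit (e.g., micrometers)."""
    composite_roughness = math.hypot(rms_roughness_1, rms_roughness_2)
    return h_min / composite_roughness

# Ground vs. superfinished teeth at the same film thickness: superfinishing
# raises lambda, which the combined dataset associates with longer fatigue life.
print(specific_film_thickness(0.3, 0.40, 0.40))   # ground teeth (illustrative)
print(specific_film_thickness(0.3, 0.08, 0.08))   # superfinished teeth (illustrative)
```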
NASA Technical Reports Server (NTRS)
Case, Jonathan; Mungai, John; Sakwa, Vincent; Kabuchanga, Eric; Zavodsky, Bradley T.; Limaye, Ashutosh S.
2014-01-01
Flooding and drought are two key forecasting challenges for the Kenya Meteorological Department (KMD). Atmospheric processes leading to excessive precipitation and/or prolonged drought can be quite sensitive to the state of the land surface, which interacts with the boundary layer of the atmosphere providing a source of heat and moisture. The development and evolution of precipitation systems are affected by heat and moisture fluxes from the land surface within weakly-sheared environments, such as in the tropics and sub-tropics. These heat and moisture fluxes during the day can be strongly influenced by land cover, vegetation, and soil moisture content. Therefore, it is important to represent the land surface state as accurately as possible in numerical weather prediction models. Enhanced regional modeling capabilities have the potential to improve forecast guidance in support of daily operations and high-end events over east Africa. KMD currently runs a configuration of the Weather Research and Forecasting (WRF) model in real time to support its daily forecasting operations, invoking the Nonhydrostatic Mesoscale Model (NMM) dynamical core. They make use of the National Oceanic and Atmospheric Administration / National Weather Service Science and Training Resource Center's Environmental Modeling System (EMS) to manage and produce the WRF-NMM model runs on a 7-km regional grid over eastern Africa. Two organizations at the National Aeronautics and Space Administration Marshall Space Flight Center in Huntsville, AL, SERVIR and the Short-term Prediction Research and Transition (SPoRT) Center, have established a working partnership with KMD for enhancing its regional modeling capabilities. To accomplish this goal, SPoRT and SERVIR will provide experimental land surface initialization datasets and model verification capabilities to KMD. To produce a land-surface initialization more consistent with the resolution of the KMD-WRF runs, the NASA Land Information System (LIS) will be run at a comparable resolution to provide real-time, daily soil initialization data in place of interpolated Global Forecast System soil moisture and temperature data. Additionally, real-time green vegetation fraction data from the Visible Infrared Imaging Radiometer Suite will be incorporated into the KMD-WRF runs, once it becomes publicly available from the National Environmental Satellite Data and Information Service. Finally, model verification capabilities will be transitioned to KMD using the Model Evaluation Tools (MET) package, in order to quantify possible improvements in simulated temperature, moisture and precipitation resulting from the experimental land surface initialization. The transition of these MET tools will enable KMD to monitor model forecast accuracy in near real time. This presentation will highlight preliminary verification results of WRF runs over east Africa using the LIS land surface initialization.
Sedimentology of Martian Gravels from Mardi Twilight Imaging: Techniques
NASA Technical Reports Server (NTRS)
Garvin, James B.; Malin, Michael C.; Minitti, M. E.
2014-01-01
Quantitative sedimentologic analysis of gravel surfaces dominated by pebble-sized clasts has been employed in an effort to untangle aspects of the provenance of surface sediments on Mars using Curiosity's MARDI nadir-viewing camera operated at twilight. Images have been systematically acquired since sol 310, providing a representative sample of gravel-covered surfaces since the rover departed the Shaler region. The MARDI twilight imaging dataset offers approximately 1 millimeter spatial resolution (slightly out of focus) for patches beneath the rover that cover just under 1 m2 in area, under illumination that makes clast size and inter-clast spacing analysis relatively straightforward using semi-automated codes developed for use with nadir images. Twilight images are utilized for these analyses in order to reduce light scattering off dust deposited on the front MARDI lens element during the terminal stages of Curiosity's entry, descent and landing. Such scattering is worse when imaging bright, directly-illuminated surfaces; twilight imaging times yield diffusely-illuminated surfaces that improve the clarity of the resulting MARDI product. Twilight images are obtained between 10-30 minutes after local sunset, governed by the timing of the end of the no-heat window for the camera. Techniques were also utilized to examine data from terrestrial locations (the Kau Desert in Hawaii and near Askja Caldera in Iceland). Methods employed include log hyperbolic size distribution (LHD) analysis and Delaunay triangulation (DT) inter-clast spacing analysis. This work extends the initial results reported in Yingst et al., which covered the initial landing zone, to the Rapid-Transit Route (RTR) towards Mount Sharp.
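Note: a hedged sketch of a Delaunay-triangulation spacing statistic of the kind referred to above; the clast centroids are synthetic, not MARDI measurements, and edge-length statistics are only one plausible spacing summary.

```python
import numpy as np
from scipy.spatial import Delaunay

def interclast_spacing(centroids):
    """Edge-length statistics of the Delaunay triangulation of clast
    centroids (a simple proxy for inter-clast spacing)."""
    tri = Delaunay(centroids)
    edges = set()
    for simplex in tri.simplices:                 # vertex indices of each triangle
        for i in range(3):
            a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
            edges.add((a, b))
    lengths = np.array([np.linalg.norm(centroids[a] - centroids[b])
                        for a, b in edges])
    return lengths.mean(), lengths.std()

# Synthetic clast centroids (mm) within a roughly 1 m x 1 m footprint
rng = np.random.default_rng(42)
centroids = rng.uniform(0.0, 1000.0, size=(200, 2))
print(interclast_spacing(centroids))
```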
NASA Technical Reports Server (NTRS)
Meneghini, Robert; Kim, Hyokyung; Liao, Liang; Jones, Jeffrey A.; Kwiatkowski, John M.
2015-01-01
It has long been recognized that path-integrated attenuation (PIA) can be used to improve precipitation estimates from high-frequency weather radar data. One approach that provides an estimate of this quantity from airborne or spaceborne radar data is the surface reference technique (SRT), which uses measurements of the surface cross section in the presence and absence of precipitation. Measurements from the dual-frequency precipitation radar (DPR) on the Global Precipitation Measurement (GPM) satellite afford the first opportunity to test the method for spaceborne radar data at Ka band as well as for the Ku-band-Ka-band combination. The study begins by reviewing the basis of the single- and dual-frequency SRT. As the performance of the method is closely tied to the behavior of the normalized radar cross section (NRCS or sigma(0)) of the surface, the statistics of sigma(0) derived from DPR measurements are given as a function of incidence angle and frequency for ocean and land backgrounds over a 1-month period. Several independent estimates of the PIA, formed by means of different surface reference datasets, can be used to test the consistency of the method since, in the absence of error, the estimates should be identical. Along with theoretical considerations, the comparisons provide an initial assessment of the performance of the single- and dual-frequency SRT for the DPR. The study finds that the dual-frequency SRT can provide improvement in the accuracy of path attenuation estimates relative to the single-frequency method, particularly at Ku band.
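Note: as an illustration of the surface reference idea described above (a sketch, not the DPR algorithm), the single-frequency PIA follows from the drop of the surface cross section below a rain-free reference, and the dual-frequency variant works with the Ku-Ka difference, whose reference is less variable than either band alone; all values below are synthetic.

```python
import numpy as np

def single_frequency_pia(sigma0_rain_free_db, sigma0_rain_db):
    """Single-frequency SRT: the two-way path-integrated attenuation (dB) is
    the drop of the surface cross section below its rain-free reference."""
    return np.mean(sigma0_rain_free_db) - sigma0_rain_db

def dual_frequency_differential_pia(ku_ref_db, ka_ref_db, ku_rain_db, ka_rain_db):
    """Dual-frequency SRT: differential PIA (Ka minus Ku, dB) from the change
    in the Ku-Ka surface cross-section difference relative to rain-free conditions."""
    reference_difference = np.mean(np.asarray(ku_ref_db) - np.asarray(ka_ref_db))
    return (ku_rain_db - ka_rain_db) - reference_difference

# Synthetic rain-free ocean references (dB) and one rain-affected measurement
ku_ref = [9.8, 10.1, 10.0, 9.9]
ka_ref = [8.9, 9.2, 9.1, 9.0]
print(single_frequency_pia(ku_ref, 7.4))                          # Ku-band PIA
print(dual_frequency_differential_pia(ku_ref, ka_ref, 7.4, 3.1))  # Ka-minus-Ku PIA
```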
An object detection and tracking system for unmanned surface vehicles
NASA Astrophysics Data System (ADS)
Yang, Jian; Xiao, Yang; Fang, Zhiwen; Zhang, Naiwen; Wang, Li; Li, Tao
2017-10-01
Object detection and tracking are critical parts of unmanned surface vehicles (USVs) for achieving automatic obstacle avoidance. Off-the-shelf object detection methods have achieved impressive accuracy on public datasets, though they still meet bottlenecks in practice, such as high time consumption and low detection quality. In this paper, we propose a novel system for USVs, which is able to locate objects more accurately while being fast and stable. Firstly, we employ Faster R-CNN to acquire several initial raw bounding boxes. Secondly, the image is segmented into a few superpixels. For each initial box, the superpixels inside are grouped into a whole according to a combination strategy, and a new box is then generated as the circumscribed bounding box of the merged superpixel. Thirdly, we utilize KCF to track these objects over several frames; Faster R-CNN is then used to re-detect objects inside the tracked boxes to prevent tracking failure and to remove empty boxes. Finally, we utilize Faster R-CNN to detect objects in the next image and refine the object boxes by repeating the second module of our system. The experimental results demonstrate that our system is fast, robust, and accurate, and can be applied to USVs in practice.
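The superpixel regrouping step (the second module) can be illustrated with a small sketch: superpixels whose area falls mostly inside an initial detection box are merged, and the circumscribed bounding box of the merged region is returned. The function below is a hypothetical outline using a generic label map (e.g., from SLIC); it is not the authors' implementation, and the inside-fraction threshold is an assumption.

import numpy as np

def refine_box(labels, box, inside_frac=0.5):
    # labels: (H, W) integer superpixel label map.
    # box: (x0, y0, x1, y1) initial detection box in pixel coordinates.
    x0, y0, x1, y1 = box
    inside = np.zeros_like(labels, dtype=bool)
    inside[y0:y1, x0:x1] = True

    keep = []
    for lab in np.unique(labels[inside]):
        mask = labels == lab
        # Keep superpixels whose area lies mostly inside the initial box.
        if mask[inside].sum() / mask.sum() >= inside_frac:
            keep.append(lab)
    if not keep:
        return box

    union = np.isin(labels, keep)
    ys, xs = np.nonzero(union)
    # Circumscribed bounding box of the merged superpixel region.
    return xs.min(), ys.min(), xs.max() + 1, ys.max() + 1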
Challenges in Extracting Information From Large Hydrogeophysical-monitoring Datasets
NASA Astrophysics Data System (ADS)
Day-Lewis, F. D.; Slater, L. D.; Johnson, T.
2012-12-01
Over the last decade, new automated geophysical data-acquisition systems have enabled collection of increasingly large and information-rich geophysical datasets. Concurrent advances in field instrumentation, web services, and high-performance computing have made real-time processing, inversion, and visualization of large three-dimensional tomographic datasets practical. Geophysical-monitoring datasets have provided high-resolution insights into diverse hydrologic processes including groundwater/surface-water exchange, infiltration, solute transport, and bioremediation. Despite the high information content of such datasets, extraction of quantitative or diagnostic hydrologic information is challenging. Visual inspection and interpretation for specific hydrologic processes is difficult for datasets that are large, complex, and (or) affected by forcings (e.g., seasonal variations) unrelated to the target hydrologic process. New strategies are needed to identify salient features in spatially distributed time-series data and to relate temporal changes in geophysical properties to hydrologic processes of interest while effectively filtering unrelated changes. Here, we review recent work using time-series and digital-signal-processing approaches in hydrogeophysics. Examples include applications of cross-correlation, spectral, and time-frequency (e.g., wavelet and Stockwell transforms) approaches to (1) identify salient features in large geophysical time series; (2) examine correlation or coherence between geophysical and hydrologic signals, even in the presence of non-stationarity; and (3) condense large datasets while preserving information of interest. Examples demonstrate analysis of large time-lapse electrical tomography and fiber-optic temperature datasets to extract information about groundwater/surface-water exchange and contaminant transport.
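As a minimal illustration of the cross-correlation analyses mentioned above, the sketch below estimates the lag between a synthetic geophysical time series (standing in for, say, bulk electrical conductivity) and a hydrologic driver (e.g., river stage). The synthetic data and the simple normalization are assumptions for demonstration only.

import numpy as np
from scipy import signal

def lagged_xcorr(geo, hyd, dt_hours=1.0):
    # Normalized cross-correlation between a geophysical and a hydrologic series;
    # returns lags (hours) and correlation values.
    g = (geo - geo.mean()) / geo.std()
    h = (hyd - hyd.mean()) / hyd.std()
    corr = signal.correlate(g, h, mode="full") / len(g)
    lags = signal.correlation_lags(len(g), len(h), mode="full") * dt_hours
    return lags, corr

# Synthetic example: "conductivity" responds to "stage" with a ~12 h delay.
rng = np.random.default_rng(1)
t = np.arange(0, 24 * 30)  # hourly samples for 30 days
stage = np.convolve(rng.standard_normal(t.size), np.ones(24) / 24, mode="same")
cond = np.roll(stage, 12) + 0.05 * rng.standard_normal(t.size)

lags, corr = lagged_xcorr(cond, stage)
print("best lag (h):", lags[np.argmax(corr)])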
Mapping and Visualization of Storm-Surge Dynamics for Hurricane Katrina and Hurricane Rita
Gesch, Dean B.
2009-01-01
The damages caused by the storm surges from Hurricane Katrina and Hurricane Rita were significant and occurred over broad areas. Storm-surge maps are among the most useful geospatial datasets for hurricane recovery, impact assessments, and mitigation planning for future storms. Surveyed high-water marks were used to generate a maximum storm-surge surface for Hurricane Katrina extending from eastern Louisiana to Mobile Bay, Alabama. The interpolated surface was intersected with high-resolution lidar elevation data covering the study area to produce a highly detailed digital storm-surge inundation map. The storm-surge dataset and related data are available for display and query in a Web-based viewer application. A unique water-level dataset from a network of portable pressure sensors deployed in the days just prior to Hurricane Rita's landfall captured the hurricane's storm surge. The recorded sensor data provided water-level measurements with a very high temporal resolution at surveyed point locations. The resulting dataset was used to generate a time series of storm-surge surfaces that documents the surge dynamics in a new, spatially explicit way. The temporal information contained in the multiple storm-surge surfaces can be visualized in a number of ways to portray how the surge interacted with and was affected by land surface features. Spatially explicit storm-surge products can be useful for a variety of hurricane impact assessments, especially studies of wetland and land changes where knowledge of the extent and magnitude of storm-surge flooding is critical.
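The intersection of the interpolated surge surface with the lidar elevation model reduces to a gridded difference masked to positive depths. The sketch below illustrates that step on a toy grid; it assumes both surfaces are already co-registered on a common grid, which in practice requires the interpolation and resampling described above.

import numpy as np

def inundation_depth(surge_surface, dem, nodata=np.nan):
    # surge_surface: 2-D interpolated maximum water-surface elevation (m).
    # dem: 2-D lidar bare-earth elevation on the same grid (m).
    # Positive differences indicate inundated cells.
    depth = surge_surface - dem
    return np.where(depth > 0, depth, nodata)

# Toy example on a 3x3 grid (values in metres).
surge = np.full((3, 3), 4.0)
dem = np.array([[1.0, 3.5, 6.0],
                [2.0, 4.0, 5.0],
                [0.5, 2.5, 7.0]])
print(inundation_depth(surge, dem))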
Factors affecting the simulated trajectory and intensification of Tropical Cyclone Yasi (2011)
NASA Astrophysics Data System (ADS)
Parker, Chelsea L.; Lynch, Amanda H.; Mooney, Priscilla A.
2017-09-01
This study investigates the sensitivity of the simulated trajectory, intensification, and forward speed of Tropical Cyclone Yasi to initial conditions, physical parameterizations, and sea surface temperatures. Yasi was a category 5 storm that made landfall in Queensland, Australia, in February 2011. A series of simulations was performed using WRF-ARW v3.4.1 driven by ERA-Interim data at the lateral boundaries. To assess these simulations, a new simple skill score is devised to summarize the deviation from observed conditions at landfall. The results demonstrate the sensitivity to initial condition resolution and the need for a new initialization dataset. Ensemble testing of physics parameterizations revealed strong sensitivity to cumulus schemes, with a trade-off between trajectory and intensity accuracy. The Tiedtke scheme produces an accurate trajectory evolution and landfall location. The Kain-Fritsch scheme is associated with larger errors in trajectory due to less active shallow convection over the ocean, leading to warmer temperatures at the 700 mb level and a stronger, more poleward steering flow. However, the Kain-Fritsch scheme produces more accurate intensities and translation speeds. Tiedtke-derived intensities were weaker due to suppression of deep convection by active shallow convection. Accurate representation of the sea surface temperature, through correcting a newly discovered SST lag in reanalysis data or increasing the resolution of the SST data, can improve the simulation. Higher resolution increases relative vorticity and intensity. However, the sea surface boundary had a more pronounced effect on the simulation with the Tiedtke scheme due to its moisture convergence trigger and active shallow convection over the tropical ocean.
EAARL topography-Potato Creek watershed, Georgia, 2010
Bonisteel-Cormier, J.M.; Nayegandhi, Amar; Fredericks, Xan; Jones, J.W.; Wright, C.W.; Brock, J.C.; Nagle, D.B.
2011-01-01
This DVD contains lidar-derived first-surface (FS) and bare-earth (BE) topography GIS datasets of a portion of the Potato Creek watershed in the Apalachicola-Chattahoochee-Flint River basin, Georgia. These datasets were acquired on February 27, 2010.
NASA Astrophysics Data System (ADS)
Seers, T. D.; Hodgetts, D.
2013-12-01
Seers, T. D. & Hodgetts, D. School of Earth, Atmospheric and Environmental Sciences, University of Manchester, UK. M13 9PL. The detection of topological change at the Earth's surface is of considerable scholarly interest, allowing the quantification of the rates of geomorphic processes whilst providing lucid insights into the underlying mechanisms driving landscape evolution. In this regard, the past decade has witnessed the ever increasing proliferation of studies employing multi-temporal topographic data within the geosciences, bolstered by continuing technical advancements in the acquisition and processing of the prerequisite datasets. Provided by workers within the field of Computer Vision, multiview stereo (MVS) dense surface reconstruction, primed by structure-from-motion (SfM) based camera pose estimation, represents one such development. Providing a cost effective, operationally efficient data capture medium, the modest requirement of a consumer grade camera for data collection, coupled with the minimal user intervention required during post-processing, makes SfM-MVS an attractive alternative to terrestrial laser scanners for collecting multi-temporal topographic datasets. However, as with terrestrial laser scanner derived data, the co-registration of spatially coincident or partially overlapping scans produced by SfM-MVS presents a major technical challenge, particularly in the case of semi non-rigid scenes produced during topographic change detection studies. Moreover, the arbitrary scaling resulting from SfM ambiguity requires that a scale matrix be estimated during the transformation, introducing further complexity into its formulation. Here, we present a novel, fully unsupervised algorithm which utilises non-linearly weighted image features for solving the similarity transform (scale, translation, rotation) between partially overlapping scans produced by SfM-MVS image processing. With the only initialization condition being partial intersection between the input image sets, our method has major advantages over conventional iterative least squares minimization based methods (e.g. Iterative Closest Point variants): it acts only on rigid areas of target scenes, is capable of reliably estimating the scaling factor, and requires no incipient estimate of the transformation to initialize (i.e. manual rough alignment). Moreover, because the solution is closed form, convergence is considerably more expedient than with most iterative methods. It is hoped that the availability of improved co-registration routines, such as the one presented here, will facilitate the routine collection of multi-temporal topographic datasets by a wider range of geoscience practitioners.
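The abstract does not give the closed-form solution itself, but a standard closed-form similarity estimate between corresponding point sets is the Umeyama/Horn absolute-orientation solution, sketched below as a stand-in. The feature weighting and robustness elements of the authors' method are not reproduced, and the example transform is synthetic.

import numpy as np

def similarity_transform(src, dst):
    # Closed-form least-squares similarity transform (Umeyama, 1991) mapping
    # src -> dst, where src and dst are (N, 3) corresponding point sets.
    # Returns scale s, rotation R (3x3), translation t with dst ~= s * R @ src + t.
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0  # guard against reflections
    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(D) @ S) / var_src
    t = mu_d - s * R @ mu_s
    return s, R, t

# Sanity check with a known synthetic transform.
rng = np.random.default_rng(0)
pts = rng.standard_normal((100, 3))
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
obs = 2.5 * pts @ R_true.T + np.array([1.0, -2.0, 0.5])
s, R, t = similarity_transform(pts, obs)
print(round(s, 3))  # ~2.5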
Combining deterministic and stochastic velocity fields in the analysis of deep crustal seismic data
NASA Astrophysics Data System (ADS)
Larkin, Steven Paul
Standard crustal seismic modeling obtains deterministic velocity models which ignore the effects of wavelength-scale heterogeneity, known to exist within the Earth's crust. Stochastic velocity models are a means to include wavelength-scale heterogeneity in the modeling. These models are defined by statistical parameters obtained from geologic maps of exposed crystalline rock, and are thus tied to actual geologic structures. Combining both deterministic and stochastic velocity models into a single model allows a realistic full wavefield (2-D) to be computed. By comparing these simulations to recorded seismic data, the effects of wavelength-scale heterogeneity can be investigated. Combined deterministic and stochastic velocity models are created for two datasets, the 1992 RISC seismic experiment in southeastern California and the 1986 PASSCAL seismic experiment in northern Nevada. The RISC experiment was located in the transition zone between the Salton Trough and the southern Basin and Range province. A high-velocity body previously identified beneath the Salton Trough is constrained to pinch out beneath the Chocolate Mountains to the northeast. The lateral extent of this body is evidence for the ephemeral nature of rifting loci as a continent is initially rifted. Stochastic modeling of wavelength-scale structures above this body indicates that little more than 5% mafic intrusion into a more felsic continental crust is responsible for the observed reflectivity. Modeling of the wide-angle RISC data indicates that coda waves following PmP are initially dominated by diffusion of energy out of the near-surface basin as the wavefield reverberates within this low-velocity layer. At later times, this coda consists of scattered body waves and P to S conversions. Surface waves do not play a significant role in this coda. Modeling of the PASSCAL dataset indicates that a high-gradient crust-mantle transition zone or a rough Moho interface is necessary to reduce precritical PmP energy. Possibly related, inconsistencies in published velocity models are rectified by hypothesizing the existence of large, elongate, high-velocity bodies at the base of the crust oriented similarly to, and of similar scale as, the basins and ranges at the surface. This structure would result in an anisotropic lower crust.
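A common way to build such combined models is to superimpose a correlated random perturbation field on a deterministic background velocity. The sketch below generates a simple Gaussian-correlated field by spectral filtering of white noise as a stand-in for the von Karman statistics typically used; the background gradient, correlation length, and 5% perturbation level are illustrative assumptions rather than the study's values.

import numpy as np

def stochastic_velocity_model(nz=256, nx=512, dz=25.0, dx=25.0,
                              corr_len=300.0, std_frac=0.05, seed=0):
    # Deterministic 1-D velocity gradient plus a correlated Gaussian
    # perturbation generated by filtering white noise in the wavenumber domain.
    z = np.arange(nz) * dz
    v_det = 5000.0 + 0.6 * z                       # m/s, linear gradient with depth
    v_det = np.tile(v_det[:, None], (1, nx))

    kz = np.fft.fftfreq(nz, dz)[:, None]
    kx = np.fft.fftfreq(nx, dx)[None, :]
    k2 = kz ** 2 + kx ** 2
    spectrum = np.exp(-0.5 * (2 * np.pi) ** 2 * k2 * corr_len ** 2)

    rng = np.random.default_rng(seed)
    noise = np.fft.fft2(rng.standard_normal((nz, nx)))
    pert = np.real(np.fft.ifft2(noise * np.sqrt(spectrum)))
    pert *= std_frac * v_det.mean() / pert.std()   # scale to the target std dev

    return v_det + pert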
NASA Technical Reports Server (NTRS)
Kumar, Sujay; Santanello, Joseph; Peters-Lidard, Christa; Harrison, Ken
2011-01-01
Land-atmosphere (L-A) interactions play a critical role in determining the diurnal evolution of both planetary boundary layer (PBL) and land surface temperature and moisture budgets, as well as controlling feedbacks with clouds and precipitation that lead to the persistence of dry and wet regimes. Recent efforts to quantify the strength of L-A coupling in prediction models have produced diagnostics that integrate across both the land and PBL components of the system. In this study, we examine the impact of improved specification of land surface states, anomalies, and fluxes on coupled WRF forecasts during the summers of extreme dry (2006) and wet (2007) conditions in the U.S. Southern Great Plains. The improved land initialization and surface flux parameterizations are obtained through the use of a new optimization and uncertainty module in NASA's Land Information System (LIS-OPT), whereby parameter sets are calibrated in the Noah land surface model and classified according to the land cover and soil type mapping of the observations and the full domain. The impact of the calibrated parameters on a) the spin-up of land surface states used as initial conditions, and b) the heat and moisture fluxes of the coupled (LIS-WRF) simulations is then assessed in terms of ambient weather, PBL budgets, and precipitation along with L-A coupling diagnostics. In addition, the sensitivity of this approach to the period of calibration (dry, wet, normal) is investigated. Finally, tradeoffs of computational tractability and scientific validity (e.g., relating to the representation of the spatial dependence of parameters) and the feasibility of calibrating to multiple observational datasets are also discussed.
NASA Astrophysics Data System (ADS)
Wright, L.; Coddington, O.; Pilewskie, P.
2016-12-01
Hyperspectral instruments are a growing class of Earth observing sensors designed to improve remote sensing capabilities beyond discrete multi-band sensors by providing tens to hundreds of continuous spectral channels. Improved spectral resolution, range and radiometric accuracy allow the collection of large amounts of spectral data, facilitating thorough characterization of both atmospheric and surface properties. These new instruments require novel approaches for processing imagery and separating surface and atmospheric signals. One approach is numerical source separation, which allows the determination of the underlying physical causes of observed signals. Improved source separation will enable hyperspectral imagery to better address key science questions relevant to climate change, including land-use changes, trends in clouds and atmospheric water vapor, and aerosol characteristics. We developed an Informed Non-negative Matrix Factorization (INMF) method for separating atmospheric and surface sources. INMF offers marked benefits over other commonly employed techniques including non-negativity, which avoids physically impossible results; and adaptability, which tailors the method to hyperspectral source separation. The INMF algorithm is adapted to separate contributions from physically distinct sources using constraints on spectral and spatial variability, and library spectra to improve the initial guess. We also explore methods to produce an initial guess of the spatial separation patterns. Using this INMF algorithm we decompose hyperspectral imagery from the NASA Hyperspectral Imager for the Coastal Ocean (HICO) with a focus on separating surface and atmospheric signal contributions. HICO's coastal ocean focus provides a dataset with a wide range of atmospheric conditions, including high and low aerosol optical thickness and cloud cover, with only minor contributions from the ocean surfaces in order to isolate the contributions of the multiple atmospheric sources.
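At a sketch level, the "informed" initialization can be emulated with scikit-learn's NMF by seeding the factorization with library spectra and an initial spatial abundance guess via init='custom'. The spectral and spatial variability constraints of the INMF algorithm described above are not reproduced here, and the array sizes are illustrative.

import numpy as np
from sklearn.decomposition import NMF

# X: (n_pixels, n_bands) non-negative radiance matrix from a hyperspectral scene.
rng = np.random.default_rng(0)
n_pixels, n_bands, k = 1000, 87, 4            # illustrative sizes only
X = rng.random((n_pixels, n_bands))

H0 = rng.random((k, n_bands))                 # stand-in for library source spectra
W0 = np.full((n_pixels, k), 1.0 / k)          # flat initial spatial abundances

model = NMF(n_components=k, init="custom", max_iter=500)
W = model.fit_transform(X, W=W0, H=H0)        # seeded ("informed") factorization
H = model.components_
print(W.shape, H.shape)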
Fast automatic 3D liver segmentation based on a three-level AdaBoost-guided active shape model
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Baochun; Huang, Cheng; Zhou, Shoujun
Purpose: A robust, automatic, and rapid method for liver delineation is urgently needed for the diagnosis and treatment of liver disorders. Until now, the high variability in liver shape, local image artifacts, and the presence of tumors have complicated the development of automatic 3D liver segmentation. In this study, an automatic three-level AdaBoost-guided active shape model (ASM) is proposed for the segmentation of the liver based on enhanced computed tomography images in a robust and fast manner, with an emphasis on the detection of tumors. Methods: The AdaBoost voxel classifier and AdaBoost profile classifier were used to automatically guide three-level active shape modeling. In the first level of model initialization, fast automatic liver segmentation by an AdaBoost voxel classifier method is proposed. A shape model is then initialized by registration with the resulting rough segmentation. In the second level of active shape model fitting, a prior model based on the two-class AdaBoost profile classifier is proposed to identify the optimal surface. In the third level, a deformable simplex mesh with profile probability and curvature constraint as the external force is used to refine the shape fitting result. In total, three registration methods—3D similarity registration, probability atlas B-spline, and their proposed deformable closest point registration—are used to establish shape correspondence. Results: The proposed method was evaluated using three public challenge datasets: 3Dircadb1, SLIVER07, and Visceral Anatomy3. The results showed that our approach performs with promising efficiency, with an average of 35 s, and accuracy, with an average Dice similarity coefficient (DSC) of 0.94 ± 0.02, 0.96 ± 0.01, and 0.94 ± 0.02 for the 3Dircadb1, SLIVER07, and Anatomy3 training datasets, respectively. The DSC of the SLIVER07 testing and Anatomy3 unseen testing datasets were 0.964 and 0.933, respectively. Conclusions: The proposed automatic approach achieves robust, accurate, and fast liver segmentation for 3D CTce datasets. The AdaBoost voxel classifier can detect liver area quickly without errors and provides sufficient liver shape information for model initialization. The AdaBoost profile classifier achieves sufficient accuracy and greatly decreases segmentation time. These results show that the proposed segmentation method achieves a level of accuracy comparable to that of state-of-the-art automatic methods based on ASM.
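The first-level voxel classification can be sketched with a generic boosted classifier over simple per-voxel features. The features, filter sizes, and placeholder volume below are assumptions for illustration and do not reproduce the paper's feature set or training data.

import numpy as np
from scipy import ndimage
from sklearn.ensemble import AdaBoostClassifier

def voxel_features(volume):
    # Simple per-voxel features: intensity, local mean, and local variance.
    mean = ndimage.uniform_filter(volume, size=5)
    var = ndimage.uniform_filter(volume ** 2, size=5) - mean ** 2
    return np.stack([volume, mean, var], axis=-1).reshape(-1, 3)

# Placeholder CT volume and training mask (in practice: HU volume + liver labels).
rng = np.random.default_rng(0)
volume = rng.normal(0, 50, size=(32, 64, 64))
labels = (volume > 20).astype(int)

clf = AdaBoostClassifier(n_estimators=50)
clf.fit(voxel_features(volume), labels.ravel())

rough_mask = clf.predict(voxel_features(volume)).reshape(volume.shape)
rough_mask = ndimage.binary_opening(rough_mask, iterations=2)  # clean up speckle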
Fast automatic 3D liver segmentation based on a three-level AdaBoost-guided active shape model.
He, Baochun; Huang, Cheng; Sharp, Gregory; Zhou, Shoujun; Hu, Qingmao; Fang, Chihua; Fan, Yingfang; Jia, Fucang
2016-05-01
A robust, automatic, and rapid method for liver delineation is urgently needed for the diagnosis and treatment of liver disorders. Until now, the high variability in liver shape, local image artifacts, and the presence of tumors have complicated the development of automatic 3D liver segmentation. In this study, an automatic three-level AdaBoost-guided active shape model (ASM) is proposed for the segmentation of the liver based on enhanced computed tomography images in a robust and fast manner, with an emphasis on the detection of tumors. The AdaBoost voxel classifier and AdaBoost profile classifier were used to automatically guide three-level active shape modeling. In the first level of model initialization, fast automatic liver segmentation by an AdaBoost voxel classifier method is proposed. A shape model is then initialized by registration with the resulting rough segmentation. In the second level of active shape model fitting, a prior model based on the two-class AdaBoost profile classifier is proposed to identify the optimal surface. In the third level, a deformable simplex mesh with profile probability and curvature constraint as the external force is used to refine the shape fitting result. In total, three registration methods-3D similarity registration, probability atlas B-spline, and their proposed deformable closest point registration-are used to establish shape correspondence. The proposed method was evaluated using three public challenge datasets: 3Dircadb1, SLIVER07, and Visceral Anatomy3. The results showed that our approach performs with promising efficiency, with an average of 35 s, and accuracy, with an average Dice similarity coefficient (DSC) of 0.94 ± 0.02, 0.96 ± 0.01, and 0.94 ± 0.02 for the 3Dircadb1, SLIVER07, and Anatomy3 training datasets, respectively. The DSC of the SLIVER07 testing and Anatomy3 unseen testing datasets were 0.964 and 0.933, respectively. The proposed automatic approach achieves robust, accurate, and fast liver segmentation for 3D CTce datasets. The AdaBoost voxel classifier can detect liver area quickly without errors and provides sufficient liver shape information for model initialization. The AdaBoost profile classifier achieves sufficient accuracy and greatly decreases segmentation time. These results show that the proposed segmentation method achieves a level of accuracy comparable to that of state-of-the-art automatic methods based on ASM.
Automatic multi-organ segmentation using learning-based segmentation and level set optimization.
Kohlberger, Timo; Sofka, Michal; Zhang, Jingdan; Birkbeck, Neil; Wetzl, Jens; Kaftan, Jens; Declerck, Jérôme; Zhou, S Kevin
2011-01-01
We present a novel generic segmentation system for the fully automatic multi-organ segmentation from CT medical images. Thereby we combine the advantages of learning-based approaches on point cloud-based shape representation, such as speed, robustness, and point correspondences, with those of PDE-optimization-based level set approaches, such as high accuracy and the straightforward prevention of segment overlaps. In a benchmark on 10-100 annotated datasets for the liver, the lungs, and the kidneys we show that the proposed system yields segmentation accuracies of 1.17-2.89 mm average surface errors. Thereby the level set segmentation (which is initialized by the learning-based segmentations) contributes with a 20%-40% increase in accuracy.
Changing Hydrology in Glacier-fed High Altitude Andean Peatbogs
NASA Astrophysics Data System (ADS)
Slayback, D. A.; Yager, K.; Baraer, M.; Mohr, K. I.; Argollo, J.; Wigmore, O.; Meneses, R. I.; Mark, B. G.
2012-12-01
Montane peatbogs in the glacierized Andean highlands of Peru and Bolivia provide critical forage for camelids (llama and alpaca) in regionally extensive pastoral agriculture systems. During the long dry season, these wetlands often provide the only available green forage. A key question for the future of these peatbog systems, and the livelihoods they support, is the impact of climate change and glacier recession on their hydrology, and thus forage production. We have already documented substantial regional glacier recession, of, on average, approximately 30% of surface area over the past two decades. As glaciers begin to retreat under climate change, there is initially a period of increased meltwater outflow, culminating in a period of "peak water", and followed by a continual decline in outflows. Based on previous work, we know that some glaciers in the region have already passed peak water conditions, and are now declining. To better understand the impacts of these processes on peatbog hydrology and productivity, we have begun collecting a variety of surface data at several study sites in both Bolivia and Peru. These include precipitation, stream flow, water levels, water chemistry and isotope analyses, and peatbog biodiversity and biomass. These measurements will be used in conjunction with a regional model driven by satellite data to predict likely future impacts. We will present the results from these initial surface measurements, and an overview of satellite datasets to be used in the regional model.
EAARL coastal topography-Assateague Island National Seashore, Maryland and Virginia, 2010
Bonisteel-Cormier, J.M.; Nayegandhi, Amar; Wright, C.W.; Brock, J.C.; Nagle, D.B.; Vivekanandan, Saisudha; Klipp, E.S.; Fredericks, Xan; Stevens, Sara
2011-01-01
This DVD contains lidar-derived bare-earth (BE) and first-surface (FS) topography GIS datasets of a portion of the Assateague Island National Seashore in Maryland and Virginia. These datasets were acquired on March 19 and 24, 2010.
EAARL topography-Three Mile Creek and Mobile-Tensaw Delta, Alabama, 2010
Nayegandhi, Amar; Bonisteel-Cormier, J.M.; Clark, A.P.; Wright, C.W.; Brock, J.C.; Nagle, D.B.; Vivekanandan, Saisudha; Fredericks, Xan
2011-01-01
This DVD contains lidar-derived first-surface (FS) and bare-earth (BE) topography GIS datasets of a portion of the Mobile-Tensaw Delta region and Three Mile Creek in Alabama. These datasets were acquired on March 6, 2010.
A 7.5-Year Dataset of SSM/I-Derived Surface Turbulent Fluxes Over Global Oceans
NASA Technical Reports Server (NTRS)
Chou, Shu-Hsien; Shie, Chung-Lin; Atlas, Robert M.; Ardizzone, Joe; Nelkin, Eric; Starr, David O'C. (Technical Monitor)
2001-01-01
The global air-sea turbulent fluxes are needed for driving ocean models and validating coupled ocean-atmosphere global models. A method was developed to retrieve surface air humidity from the radiances measured by the Special Sensor Microwave/Imager (SSM/I). Using both the SSM/I-retrieved surface wind and air humidity, daily turbulent fluxes over global oceans were computed with a stability-dependent bulk scheme. Based on this method, we have produced Version 1 of the Goddard Satellite-Based Surface Turbulent Fluxes (GSSTF) dataset from the SSM/I data and other data. It provides daily- and monthly-mean surface turbulent fluxes and some relevant parameters over global oceans for the individual F8, F10, and F11 satellites covering the period July 1987-December 1994. It also provides 1988-94 annual- and monthly-mean climatologies of the same variables, using only F8 and F11 satellite data. It has a spatial resolution of 2.0 degrees x 2.5 degrees lat-long and is archived at the NASA/GSFC DAAC. The purpose of this paper is to present an updated assessment of the GSSTF 1.0 dataset.
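For orientation, the bulk aerodynamic formulas underlying such flux datasets have the form LHF = rho * Lv * CE * U * (qs - qa) and SHF = rho * cp * CH * U * (Ts - Ta). The sketch below uses constant exchange coefficients purely for illustration; GSSTF itself uses a stability-dependent scheme in which the coefficients are computed iteratively.

import numpy as np

RHO_AIR = 1.2      # kg m-3, near-surface air density (assumed constant here)
LV = 2.5e6         # J kg-1, latent heat of vaporization
CP = 1004.0        # J kg-1 K-1, specific heat of air
CE = CH = 1.2e-3   # illustrative constant exchange coefficients

def bulk_fluxes(wind, q_sea, q_air, t_sea, t_air):
    # Latent and sensible heat flux (W m-2) from the bulk aerodynamic formulas.
    # wind in m/s, specific humidities in kg/kg, temperatures in K.
    lhf = RHO_AIR * LV * CE * wind * (q_sea - q_air)
    shf = RHO_AIR * CP * CH * wind * (t_sea - t_air)
    return lhf, shf

print(bulk_fluxes(wind=7.0, q_sea=0.020, q_air=0.015, t_sea=300.0, t_air=299.0))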
This paper highlights the similarities and differences in how emission inventories and datasets were developed and processed across North America and Europe for the Air Quality Model Evaluation International Initiative (AQMEII) project and then characterizes the emissions for the...
John T. Abatzoglou; Solomon Z. Dobrowski; Sean A. Parks; Katherine C. Hegewisch
2018-01-01
We present TerraClimate, a dataset of high-spatial resolution (1/24°, ~4-km) monthly climate and climatic water balance for global terrestrial surfaces from 1958–2015. TerraClimate uses climatically aided interpolation, combining high-spatial resolution climatological normals from the WorldClim dataset, with coarser resolution time varying (i.e., monthly) data from...
The Southampton-York Natural Scenes (SYNS) dataset: Statistics of surface attitude
Adams, Wendy J.; Elder, James H.; Graf, Erich W.; Leyland, Julian; Lugtigheid, Arthur J.; Muryy, Alexander
2016-01-01
Recovering 3D scenes from 2D images is an under-constrained task; optimal estimation depends upon knowledge of the underlying scene statistics. Here we introduce the Southampton-York Natural Scenes dataset (SYNS: https://syns.soton.ac.uk), which provides comprehensive scene statistics useful for understanding biological vision and for improving machine vision systems. In order to capture the diversity of environments that humans encounter, scenes were surveyed at random locations within 25 indoor and outdoor categories. Each survey includes (i) spherical LiDAR range data (ii) high-dynamic range spherical imagery and (iii) a panorama of stereo image pairs. We envisage many uses for the dataset and present one example: an analysis of surface attitude statistics, conditioned on scene category and viewing elevation. Surface normals were estimated using a novel adaptive scale selection algorithm. Across categories, surface attitude below the horizon is dominated by the ground plane (0° tilt). Near the horizon, probability density is elevated at 90°/270° tilt due to vertical surfaces (trees, walls). Above the horizon, probability density is elevated near 0° slant due to overhead structure such as ceilings and leaf canopies. These structural regularities represent potentially useful prior assumptions for human and machine observers, and may predict human biases in perceived surface attitude. PMID:27782103
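As an illustration of the surface-attitude statistics described above, the sketch below converts estimated unit surface normals into slant and tilt angles and histograms the tilt. The viewer-centred axis convention and the random normals are assumptions, not the SYNS processing chain.

import numpy as np

def slant_tilt(normals):
    # Convert surface normals (N, 3) in an assumed viewer-centred frame
    # (x right, y up, z toward the viewer) to slant and tilt in degrees.
    n = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    slant = np.degrees(np.arccos(np.clip(n[:, 2], -1.0, 1.0)))
    tilt = np.degrees(np.arctan2(n[:, 1], n[:, 0])) % 360.0
    return slant, tilt

# Illustrative: tilt histogram for randomly oriented normals.
rng = np.random.default_rng(0)
normals = rng.standard_normal((10000, 3))
slant, tilt = slant_tilt(normals)
hist, edges = np.histogram(tilt, bins=36, range=(0, 360), density=True)
print(hist[:5])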
An ontology design pattern for surface water features
Sinha, Gaurav; Mark, David; Kolas, Dave; Varanka, Dalia; Romero, Boleslo E.; Feng, Chen-Chieh; Usery, E. Lynn; Liebermann, Joshua; Sorokine, Alexandre
2014-01-01
Surface water is a primary concept of human experience, but concepts are captured in cultures and languages in many different ways. Still, many commonalities exist due to the physical basis of many of the properties and categories. An abstract ontology of surface water features based only on those physical properties of landscape features has the best potential for serving as a foundational domain ontology for other more context-dependent ontologies. The Surface Water ontology design pattern was developed both for domain knowledge distillation and to serve as a conceptual building-block for more complex or specialized surface water ontologies. A fundamental distinction is made in this ontology between landscape features that act as containers (e.g., stream channels, basins) and the bodies of water (e.g., rivers, lakes) that occupy those containers. The semantics of concave (container) landforms are specified in a Dry module and the semantics of contained bodies of water in a Wet module. The pattern is implemented in OWL, but Description Logic axioms and a detailed explanation are provided in this paper. The OWL ontology will be an important contribution to the Semantic Web vocabulary for annotating surface water feature datasets. Also provided is a discussion of why there is a need to complement the pattern with other ontologies, especially the previously developed Surface Network pattern. Finally, the practical value of the pattern in the semantic querying of surface water datasets is illustrated through an annotated geospatial dataset and sample queries using the classes of the Surface Water pattern.
Post-fire Thermokarst Development Along a Planned Road Corridor in Arctic Alaska
NASA Astrophysics Data System (ADS)
Jones, B. M.; Grosse, G.; Larsen, C. F.; Hayes, D. J.; Arp, C. D.; Liu, L.; Miller, E.
2015-12-01
Wildfire disturbance in northern high latitude regions is an important factor contributing to ecosystem and landscape change. In permafrost influenced terrain, fire may initiate thermokarst development which impacts hydrology, vegetation, wildlife, carbon storage and infrastructure. In this study we differenced two airborne LiDAR datasets that were acquired in the aftermath of the large and severe Anaktuvuk River tundra fire, which in 2007 burned across a proposed road corridor in Arctic Alaska. The 2009 LiDAR dataset was acquired by the Alaska Department of Transportation in preparation for construction of a gravel road that would connect the Dalton Highway with the logistical camp of Umiat. The 2014 LiDAR dataset was acquired by the USGS to quantify potential post-fire thermokarst development over the first seven years following the tundra fire event. By differencing the two 1 m resolution digital terrain models, we measured permafrost thaw subsidence across 34% of the burned tundra area studied, and observed less than 1% in similar, undisturbed tundra terrain units. Ice-rich, yedoma upland terrain was most susceptible to thermokarst development following the disturbance, accounting for 50% of the areal and volumetric change detected, with some locations subsiding more than six meters over the study period. Calculation of rugosity, or surface roughness, in the two datasets showed a doubling in microtopography on average across the burned portion of the study area, with a 340% increase in yedoma upland terrain. An additional LiDAR dataset was acquired in April 2015 to document the role of thermokarst development on enhanced snow accumulation and subsequent snowmelt runoff within the burn area. Our findings will enable future vulnerability assessments of ice-rich permafrost terrain as a result of shifting disturbance regimes. Such assessments are needed to address questions focused on the impact of permafrost degradation on physical, ecological, and socio-economic processes.
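The two core measurements, elevation change from differencing co-registered DTMs and rugosity as local surface roughness, can be sketched as follows. The subsidence threshold and window size are illustrative assumptions rather than values used in the study.

import numpy as np
from scipy import ndimage

def thermokarst_subsidence(dtm_2009, dtm_2014, threshold=-0.2):
    # Elevation change between two co-registered 1 m DTMs and a mask of cells
    # that subsided by more than |threshold| metres.
    dz = dtm_2014 - dtm_2009
    return dz, dz < threshold

def rugosity(dtm, window=9):
    # Surface roughness as the local standard deviation of elevation.
    mean = ndimage.uniform_filter(dtm, size=window)
    sq_mean = ndimage.uniform_filter(dtm ** 2, size=window)
    return np.sqrt(np.maximum(sq_mean - mean ** 2, 0.0))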
Developing a new global network of river reaches from merged satellite-derived datasets
NASA Astrophysics Data System (ADS)
Lion, C.; Allen, G. H.; Beighley, E.; Pavelsky, T.
2015-12-01
In 2020, the Surface Water and Ocean Topography satellite (SWOT), a joint mission of NASA/CNES/CSA/UK, will be launched. One of its major products will be measurements of continental water extent, including the width, height, and slope of rivers and the surface area and elevations of lakes. The mission will improve the monitoring of continental water and also our understanding of the interactions between different hydrologic reservoirs. For rivers, SWOT measurements of slope must be carried out over predefined river reaches. As such, an a priori dataset for rivers is needed in order to facilitate analysis of the raw SWOT data. The information required to produce this dataset includes measurements of river width, elevation, slope, planform, river network topology, and flow accumulation. To produce this product, we have linked two existing global datasets: the Global River Widths from Landsat (GRWL) database, which contains river centerline locations, widths, and a braiding index derived from Landsat imagery, and a modified version of the HydroSHEDS hydrologically corrected digital elevation product, which contains heights and flow accumulation measurements for streams at 3 arcsecond spatial resolution. Merging these two datasets requires considerable care. The difficulties, among others, lie in the difference in resolution (30 m versus 3 arcseconds) and the ages of the datasets (2000 versus ~2010; some rivers have moved, and the braided sections are different). As such, we have developed custom software to merge the two datasets, taking into account the spatial proximity of river channels in the two datasets and ensuring that flow accumulation in the final dataset always increases downstream. Here, we present our preliminary results for a portion of South America and demonstrate the strengths and weaknesses of the method.
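One of the merging rules described above, that flow accumulation must never decrease downstream along a merged centerline, can be sketched as a single downstream pass. The node representation and attribute name below are hypothetical.

def enforce_downstream_accumulation(reach_nodes):
    # reach_nodes: list of dicts ordered upstream -> downstream, each carrying a
    # 'flow_acc' value sampled from the coarser elevation dataset. Clamp values
    # so accumulation is non-decreasing in the downstream direction.
    running_max = 0.0
    for node in reach_nodes:
        running_max = max(running_max, node["flow_acc"])
        node["flow_acc"] = running_max
    return reach_nodes

reach = [{"flow_acc": a} for a in (10.0, 12.0, 11.5, 30.0, 28.0)]
print([n["flow_acc"] for n in enforce_downstream_accumulation(reach)])
# [10.0, 12.0, 12.0, 30.0, 30.0]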
High-resolution mapping of global surface water and its long-term changes
NASA Astrophysics Data System (ADS)
Pekel, J. F.; Cottam, A.; Gorelick, N.; Belward, A.
2016-12-01
The location and persistence of surface water is both affected by climate and human activity and affects climate, biological diversity and human wellbeing. Global datasets documenting surface water location and seasonality have been produced, but measuring long-term changes at high resolution remains a challenge. To address the dynamic nature of water, the European Commission's Joint Research Centre (JRC), working with the Google Earth Engine (GEE) team, has processed each single pixel acquired by Landsat 5, 7, and 8 between 16 March 1984 and 10 October 2015 (> 3,000,000 Landsat scenes, representing > 1,823 terabytes of data). The produced dataset records the months and years when water was present across the 32 years, where occurrence changed, and what form those changes took in terms of seasonality and persistence, and it documents intra-annual persistence, inter-annual variability, and trends. This validated dataset shows that the impacts of climate change and climate oscillations on surface water occurrence can be measured, and that evidence can be gathered showing how surface water is altered by human activities. The dataset is freely available, and we anticipate that it will provide valuable information to those working in areas linked to security of water supply for agriculture, industry and human consumption, for assessing water-related disaster reduction and recovery, and for the study of waterborne pollution and disease spread. The maps will also improve surface boundary condition setting in climate and weather models, improve carbon emissions estimates, inform regional climate change impact studies, delimit wetlands for biodiversity and determine desertification trends. Issues such as dam building (and less widespread dam removal), disappearing rivers, the geopolitics of water distribution and coastal erosion are also addressed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Katherine J; Hack, James J; Truesdale, John
A new high-resolution (0.9° x 1.25° in the horizontal) global tropospheric aerosol dataset with monthly resolution is generated using the finite-volume configuration of Community Atmosphere Model (CAM4) coupled to a bulk aerosol model and forced with recent estimates of surface emissions for the latter part of twentieth century. The surface emissions dataset is constructed from Coupled Model Inter-comparison Project (CMIP5) decadal-resolution surface emissions dataset to include REanalysis of TROpospheric chemical composition (RETRO) wildfire monthly emissions dataset. Experiments forced with the new tropospheric aerosol dataset and conducted using the spectral configuration of CAM4 with a T85 truncation (1.4° x 1.4°) with prescribed twentieth century observed sea surface temperature, sea-ice and greenhouse gases reveal that variations in tropospheric aerosol levels can induce significant regional climate variability on the inter-annual timescales. Regression analyses over tropical Atlantic and Africa reveal that increasing dust aerosols can cool the North African landmass and shift convection southwards from West Africa into the Gulf of Guinea in the spring season in the simulations. Further, we find that increasing carbonaceous aerosols emanating from the southwestern African savannas can cool the region significantly and increase the marine stratocumulus cloud cover over the southeast tropical Atlantic ocean by aerosol-induced diabatic heating of the free troposphere above the low clouds. Experiments conducted with CAM4 coupled to a slab ocean model suggest that present day aerosols can shift the ITCZ southwards over the tropical Atlantic and can reduce the ocean mixed layer temperature beneath the increased marine stratocumulus clouds in the southeastern tropical Atlantic.
Slavinskaya, N. A.; Abbasi, M.; Starcke, J. H.; ...
2017-01-24
An automated data-centric infrastructure, Process Informatics Model (PrIMe), was applied to validation and optimization of a syngas combustion model. The Bound-to-Bound Data Collaboration (B2BDC) module of PrIMe was employed to discover the limits of parameter modifications based on uncertainty quantification (UQ) and consistency analysis of the model–data system and experimental data, including shock-tube ignition delay times and laminar flame speeds. Existing syngas reaction models are reviewed, and the selected kinetic data are described in detail. Empirical rules were developed and applied to evaluate the uncertainty bounds of the literature experimental data. Here, the initial H2/CO reaction model, assembled from 73 reactions and 17 species, was subjected to a B2BDC analysis. For this purpose, a dataset was constructed that included a total of 167 experimental targets and 55 active model parameters. Consistency analysis of the composed dataset revealed disagreement between models and data. Further analysis suggested that removing 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. This dataset was subjected to a correlation analysis, which highlights possible directions for parameter modification and model improvement. Additionally, several methods of parameter optimization were applied, some of them unique to the B2BDC framework. The optimized models demonstrated improved agreement with experiments compared to the initially assembled model, and their predictions for experiments not included in the initial dataset (i.e., a blind prediction) were investigated. The results demonstrate benefits of applying the B2BDC methodology for developing predictive kinetic models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slavinskaya, N. A.; Abbasi, M.; Starcke, J. H.
An automated data-centric infrastructure, Process Informatics Model (PrIMe), was applied to validation and optimization of a syngas combustion model. The Bound-to-Bound Data Collaboration (B2BDC) module of PrIMe was employed to discover the limits of parameter modifications based on uncertainty quantification (UQ) and consistency analysis of the model–data system and experimental data, including shock-tube ignition delay times and laminar flame speeds. Existing syngas reaction models are reviewed, and the selected kinetic data are described in detail. Empirical rules were developed and applied to evaluate the uncertainty bounds of the literature experimental data. Here, the initial H2/CO reaction model, assembled from 73 reactions and 17 species, was subjected to a B2BDC analysis. For this purpose, a dataset was constructed that included a total of 167 experimental targets and 55 active model parameters. Consistency analysis of the composed dataset revealed disagreement between models and data. Further analysis suggested that removing 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. This dataset was subjected to a correlation analysis, which highlights possible directions for parameter modification and model improvement. Additionally, several methods of parameter optimization were applied, some of them unique to the B2BDC framework. The optimized models demonstrated improved agreement with experiments compared to the initially assembled model, and their predictions for experiments not included in the initial dataset (i.e., a blind prediction) were investigated. The results demonstrate benefits of applying the B2BDC methodology for developing predictive kinetic models.
Cao, Lijuan; Yan, Zhongwei; Zhao, Ping; ...
2017-05-26
Monthly mean instrumental surface air temperature (SAT) observations back to the nineteenth century in China are synthesized from different sources via specific quality-control, interpolation, and homogenization. Compared with the first homogenized long-term SAT dataset for China which contained 18 stations mainly located in the middle and eastern part of China, the present dataset includes homogenized monthly SAT series at 32 stations, with an extended coverage especially towards western China. Missing values are interpolated by using observations at nearby stations, including those from neighboring countries. Cross validation shows that the mean bias error (MBE) is generally small and falls between 0.45 °C and –0.35 °C. Multiple homogenization methods and available metadata are applied to assess the consistency of the time series and to adjust inhomogeneity biases. The homogenized annual mean SAT series shows a range of trends between 1.1 °C and 4.0 °C/century in northeastern China, between 0.4 °C and 1.9 °C/century in southeastern China, and between 1.4 °C and 3.7 °C/century in western China to the west of 105°E (from the initial years of the stations to 2015). The unadjusted data include unusually warm records during the 1940s and hence tend to underestimate the warming trends at a number of stations. As a result, the mean SAT series for China based on the climate anomaly method shows a warming trend of 1.56 °C/century during 1901–2015, larger than those based on other currently available datasets.
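As a schematic of the climate anomaly method and the century-scale trend quoted above, the sketch below averages station anomalies relative to a common base period and fits a linear trend. The 1961-1990 base period and the simple all-station mean are assumptions, not necessarily the choices made in the study.

import numpy as np

def anomaly_series(station_temps, years, base=(1961, 1990)):
    # station_temps: (n_stations, n_years) annual mean SAT with NaN gaps.
    # Returns the all-station mean anomaly series relative to the base period.
    in_base = (years >= base[0]) & (years <= base[1])
    clim = np.nanmean(station_temps[:, in_base], axis=1, keepdims=True)
    return np.nanmean(station_temps - clim, axis=0)

def century_trend(anoms, years):
    # Least-squares linear trend in degrees C per century, ignoring missing years.
    ok = ~np.isnan(anoms)
    slope = np.polyfit(years[ok], anoms[ok], 1)[0]
    return slope * 100.0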
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, Lijuan; Yan, Zhongwei; Zhao, Ping
Monthly mean instrumental surface air temperature (SAT) observations back to the nineteenth century in China are synthesized from different sources via specific quality-control, interpolation, and homogenization. Compared with the first homogenized long-term SAT dataset for China which contained 18 stations mainly located in the middle and eastern part of China, the present dataset includes homogenized monthly SAT series at 32 stations, with an extended coverage especially towards western China. Missing values are interpolated by using observations at nearby stations, including those from neighboring countries. Cross validation shows that the mean bias error (MBE) is generally small and falls between 0.45 °C and –0.35 °C. Multiple homogenization methods and available metadata are applied to assess the consistency of the time series and to adjust inhomogeneity biases. The homogenized annual mean SAT series shows a range of trends between 1.1 °C and 4.0 °C/century in northeastern China, between 0.4 °C and 1.9 °C/century in southeastern China, and between 1.4 °C and 3.7 °C/century in western China to the west of 105°E (from the initial years of the stations to 2015). The unadjusted data include unusually warm records during the 1940s and hence tend to underestimate the warming trends at a number of stations. As a result, the mean SAT series for China based on the climate anomaly method shows a warming trend of 1.56 °C/century during 1901–2015, larger than those based on other currently available datasets.
Mapping and spatiotemporal analysis tool for hydrological data: Spellmap
USDA-ARS?s Scientific Manuscript database
Lack of data management and analysis tools is one of the major limitations to effectively evaluating and using large datasets of high-resolution atmospheric, surface, and subsurface observations. High spatial and temporal resolution datasets better represent the spatiotemporal variability of hydrologica...
NASA Astrophysics Data System (ADS)
Roningen, J. M.; Daly, S. F.; Vuyovich, C.
2012-12-01
In Afghanistan, where both historical and current in situ hydrologic records are extremely limited, the development and stability operations communities require guidance as to how to best utilize capabilities in remote sensing of the water cycle to understand and predict seasonal flooding. In this study, three versions of Level 3 GRACE datasets (CSR, CSR 4.1 and GRGS) are compared to TRMM 3B42 products, SSM/I-derived snow water equivalent products (SWE), and MODIS-derived flooding extents to assess their potential for contributing to an understanding of the spatial and temporal patterns of spring flooding in Afghanistan over the period 2002-2012. GRACE, which allows for assessment of correlations between small-scale temporal changes in the gravitational field of the Earth and changes in the total water storage in the hydrosphere, opens the possibility for incorporation of subsurface components of the hydrologic cycle into remote monitoring and modeling of water resources. GRACE data exhibit clear seasonal fluctuations in many areas of Afghanistan, but an assessment is required of the extent to which these data can be disaggregated spatially and related to geographic patterns of precipitation, snowmelt and flooding. In this study, TRMM 3B42 and SSM/I-derived SWE datasets were used as proxies for measured precipitation. These datasets were convolved with a Gaussian filter with a 300 km half-radius at each reported GRACE data point in order to compensate for spatial correlation ('leakage' effects) in the GRACE data. In mountainous and snowmelt-dominated basins such as the majority of those in this study, GRACE analyses that make use of land surface model (LSM) derived estimates may not provide adequate characterization of snow water equivalent and soil moisture in this region. Therefore, soil and subsurface moisture were evaluated as a single storage component using the GRACE data, and flooding occurrence was evaluated as a qualitative surface expression of this storage component. Initial results show that cumulative Gaussian-smoothed TRMM data correlate positively with GRACE CSR during the periods between yearly GRACE minima and maxima at points throughout most watersheds. The timing of peaks in GRACE data in central Afghanistan following the onset of the seasonal SWE decline also corresponds to seasonal rises in the nearby Kajakai Reservoir as measured by Jason-2 satellite altimetry and validated by manual records. Differences between datasets also appear to confirm the irregularities introduced in this region by the CSR 4.1 product that used a land surface model in the signal restoration process.
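The leakage-matching step described above, smoothing cumulative TRMM precipitation with a roughly 300 km Gaussian before correlating it with a GRACE storage series, can be sketched as follows. The conversion of the 300 km half-radius to a Gaussian sigma, the grid spacing, and the use of a simple Pearson correlation are assumptions for illustration.

import numpy as np
from scipy import ndimage

GRID_KM = 27.8                   # assumed ~0.25 degree TRMM grid spacing at the equator
HALF_RADIUS_KM = 300.0
# Interpreting "half-radius" as the half-maximum radius of the Gaussian kernel.
SIGMA_CELLS = HALF_RADIUS_KM / np.sqrt(2 * np.log(2)) / GRID_KM

def smoothed_cumulative_trmm(precip_stack):
    # precip_stack: (n_months, ny, nx) monthly precipitation totals.
    # Returns cumulative precipitation smoothed in space to mimic GRACE leakage.
    cum = np.cumsum(precip_stack, axis=0)
    return np.array([ndimage.gaussian_filter(frame, sigma=SIGMA_CELLS) for frame in cum])

def point_correlation(smoothed, grace_series, iy, ix):
    # Pearson correlation between the smoothed cumulative precipitation at one
    # grid cell and a co-located GRACE total-water-storage anomaly series.
    return np.corrcoef(smoothed[:, iy, ix], grace_series)[0, 1]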
National Hydropower Plant Dataset, Version 1 (Update FY18Q2)
Samu, Nicole; Kao, Shih-Chieh; O'Connor, Patrick; Johnson, Megan; Uria-Martinez, Rocio; McManamay, Ryan
2016-09-30
The National Hydropower Plant Dataset, Version 1, Update FY18Q2, includes geospatial point-level locations and key characteristics of existing hydropower plants in the United States that are currently online. These data are a subset extracted from NHAAP’s Existing Hydropower Assets (EHA) dataset, which is a cornerstone of NHAAP’s EHA effort that has supported multiple U.S. hydropower R&D research initiatives related to market acceleration, environmental impact reduction, technology-to-market activities, and climate change impact assessment.
The Most Common Geometric and Semantic Errors in CityGML Datasets
NASA Astrophysics Data System (ADS)
Biljecki, F.; Ledoux, H.; Du, X.; Stoter, J.; Soon, K. H.; Khoo, V. H. S.
2016-10-01
To be used as input in most simulation and modelling software, 3D city models should be geometrically and topologically valid, and semantically rich. We investigate in this paper what is the quality of currently available CityGML datasets, i.e. we validate the geometry/topology of the 3D primitives (Solid and MultiSurface), and we validate whether the semantics of the boundary surfaces of buildings is correct or not. We have analysed all the CityGML datasets we could find, both from portals of cities and on different websites, plus a few that were made available to us. We have thus validated 40M surfaces in 16M 3D primitives and 3.6M buildings found in 37 CityGML datasets originating from 9 countries, and produced by several companies with diverse software and acquisition techniques. The results indicate that CityGML datasets without errors are rare, and those that are nearly valid are mostly simple LOD1 models. We report on the most common errors we have found, and analyse them. One main observation is that many of these errors could be automatically fixed or prevented with simple modifications to the modelling software. Our principal aim is to highlight the most common errors so that these are not repeated in the future. We hope that our paper and the open-source software we have developed will help raise awareness for data quality among data providers and 3D GIS software producers.
A variational approach to liver segmentation using statistics from multiple sources
NASA Astrophysics Data System (ADS)
Zheng, Shenhai; Fang, Bin; Li, Laquan; Gao, Mingqi; Wang, Yi
2018-01-01
Medical image segmentation plays an important role in digital medical research, and therapy planning and delivery. However, the presence of noise and low contrast renders automatic liver segmentation an extremely challenging task. In this study, we focus on a variational approach to liver segmentation in computed tomography scan volumes in a semiautomatic and slice-by-slice manner. In this method, one slice is selected and its connected component liver region is determined manually to initialize the subsequent automatic segmentation process. From this guiding slice, we execute the proposed method downward to the last slice and upward to the first slice, respectively. A segmentation energy function is proposed by combining the statistical shape prior, global Gaussian intensity analysis, and an enforced local statistical feature under the level set framework. During segmentation, the liver shape is estimated by minimization of this function. The improved Chan-Vese model is used to refine the shape to capture the long and narrow regions of the liver. The proposed method was verified on two independent public databases, 3D-IRCADb and SLIVER07. Among all the tested methods, our method yielded the best volumetric overlap error (VOE) of 6.5 ± 2.8%, the best root mean square symmetric surface distance (RMSD) of 2.1 ± 0.8 mm, and the best maximum symmetric surface distance (MSD) of 18.9 ± 8.3 mm on the 3D-IRCADb dataset, and the best average symmetric surface distance (ASD) of 0.8 ± 0.5 mm and the best RMSD of 1.5 ± 1.1 mm on the SLIVER07 dataset, respectively. The results of the quantitative comparison show that the proposed liver segmentation method achieves competitive segmentation performance with state-of-the-art techniques.
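The Chan-Vese refinement stage can be approximated with scikit-image's morphological Chan-Vese implementation, initialized from the shape-prior mask. This is a simplified stand-in that omits the statistical shape prior and the enforced local statistical feature of the proposed energy, and the smoothing and weighting parameters are illustrative.

import numpy as np
from skimage.segmentation import morphological_chan_vese

def refine_liver_slice(ct_slice, prior_mask, iterations=100):
    # Refine a shape-prior liver mask on one CT slice with a morphological
    # Chan-Vese evolution (an approximation of the active-contour refinement).
    img = (ct_slice - ct_slice.min()) / (np.ptp(ct_slice) + 1e-9)  # normalize to [0, 1]
    return morphological_chan_vese(img, iterations,
                                   init_level_set=prior_mask.astype(np.int8),
                                   smoothing=2, lambda1=1.0, lambda2=1.0)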
Estimating Vegetation Height from WorldView-02 and ArcticDEM Data for Broad Ecological Applications
NASA Astrophysics Data System (ADS)
Meddens, A. J.; Vierling, L. A.; Eitel, J.; Jennewein, J. S.; White, J. C.; Wulder, M.
2017-12-01
Boreal and arctic regions are warming at an unprecedented rate, and at a rate higher than in other regions across the globe. Ecological processes are highly responsive to temperature and therefore substantial changes in these northern ecosystems are expected. Recently, NASA initiated the Arctic-Boreal Vulnerability Experiment (ABoVE), which is a large-scale field campaign that aims to gain a better understanding of how the arctic responds to environmental change. High-resolution data products that quantify vegetation structure and function will improve efforts to assess these environmental change impacts. Our objective was to develop and test an approach that allows for mapping vegetation height at a 5m grid cell resolution across the ABoVE domain. To accomplish this, we selected three study areas across a north-south gradient in Alaska, representing an area of approximately 130 km2. We developed a RandomForest modeling approach for predicting vegetation height using the ArcticDEM (a digital surface model produced across the Arctic by the Polar Geospatial Center) and high-resolution multispectral satellite data (WorldView-2) in conjunction with aerial lidar data for calibration and validation. Vegetation height was successfully predicted across the three study areas and evaluated using an independent dataset, with R2 ranging from 0.58 to 0.76 and RMSEs ranging from 1.8 to 2.4 m. This predicted vegetation height dataset also led to the development of a digital terrain model using the ArcticDEM digital surface model by removing canopy heights from the surface heights. Our results show potential to establish a high resolution pan-arctic vegetation height map, which will provide useful information to a broad range of ongoing and future ecological research in high northern latitudes.
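The regression setup described above can be sketched with scikit-learn's RandomForestRegressor, with a predictor stack standing in for ArcticDEM-derived heights and WorldView-2 bands and a synthetic lidar height target. The arrays, feature count, and hyperparameters are illustrative assumptions.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# X: per-cell predictors (e.g., ArcticDEM elevation/roughness + WorldView-2 bands);
# y: lidar-derived vegetation height used as the reference. Both synthetic here.
rng = np.random.default_rng(0)
X = rng.random((5000, 10))
y = 20 * X[:, 0] + 5 * X[:, 3] + rng.normal(0, 1.5, 5000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
print("R2:", round(r2_score(y_te, pred), 2),
      "RMSE:", round(mean_squared_error(y_te, pred) ** 0.5, 2))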
Unstructured-grid coastal ocean modelling in Southern Adriatic and Northern Ionian Seas
NASA Astrophysics Data System (ADS)
Federico, Ivan; Pinardi, Nadia; Coppini, Giovanni; Oddo, Paolo
2016-04-01
The Southern Adriatic Northern Ionian coastal Forecasting System (SANIFS) is a short-term forecasting system based on an unstructured grid approach. The model component is built on the SHYFEM finite element three-dimensional hydrodynamic model. The operational chain exploits a downscaling approach starting from the Mediterranean oceanographic-scale model MFS (Mediterranean Forecasting System, operated by INGV). The implementation set-up has been designed to provide accurate hydrodynamics and active tracer processes in the coastal waters of southeastern Italy (Apulia, Basilicata and Calabria regions), where the model is characterized by a variable resolution in the range of 50-500 m. The horizontal resolution is also high in open-sea areas, where the element size is approximately 3 km. The model is forced: (i) at the lateral open boundaries through a full nesting strategy directly with the MFS (temperature, salinity, non-tidal sea surface height and currents) and OTPS (tidal forcing) fields; (ii) at the surface through two alternative atmospheric forcing datasets (ECMWF and COSMO-ME) via MFS-bulk-formulae. Given that the coastal fields are driven by a combination of both local/coastal and deep ocean forcings propagating along the shelf, the performance of SANIFS was verified first (i) at the large and shelf-coastal scales by comparison with a large-scale CTD survey and then (ii) at the coastal-harbour scale by comparison with CTD, ADCP and tide gauge data. Sensitivity tests were performed on initialization conditions (mainly focused on spin-up procedures) and on surface boundary conditions by assessing the reliability of two alternative datasets at different horizontal resolutions (12.5 and 7 km). The present work highlights how downscaling can improve the simulation of the flow field going from typical open-ocean scales of the order of several kilometres to coastal (and harbour) scales of tens to hundreds of metres.
Automatic metastatic brain tumor segmentation for stereotactic radiosurgery applications.
Liu, Yan; Stojadinovic, Strahinja; Hrycushko, Brian; Wardak, Zabi; Lu, Weiguo; Yan, Yulong; Jiang, Steve B; Timmerman, Robert; Abdulrahman, Ramzi; Nedzi, Lucien; Gu, Xuejun
2016-12-21
The objective of this study is to develop an automatic segmentation strategy for efficient and accurate metastatic brain tumor delineation on contrast-enhanced T1-weighted (T1c) magnetic resonance images (MRI) for stereotactic radiosurgery (SRS) applications. The proposed four-step automatic brain metastases segmentation strategy comprises pre-processing, initial contouring, contour evolution, and contour triage. First, T1c brain images are preprocessed to remove the skull. Second, an initial tumor contour is created using a multi-scaled adaptive threshold-based bounding box and a super-voxel clustering technique. Third, the initial contours are evolved to the tumor boundary using a regional active contour technique. Fourth, all detected false-positive contours are removed with geometric characterization. The segmentation process was validated on a realistic virtual phantom containing Gaussian or Rician noise. For each type of noise distribution, five different noise levels were tested. Twenty-one cases from the multimodal brain tumor image segmentation (BRATS) challenge dataset and fifteen clinical metastases cases were also included in the validation. Segmentation performance was quantified by the Dice coefficient (DC), normalized mutual information (NMI), structural similarity (SSIM), Hausdorff distance (HD), mean value of surface-to-surface distance (MSSD) and standard deviation of surface-to-surface distance (SDSSD). In the numerical phantom study, the evaluation yielded a DC of 0.98 ± 0.01, an NMI of 0.97 ± 0.01, an SSIM of 0.999 ± 0.001, an HD of 2.2 ± 0.8 mm, an MSSD of 0.1 ± 0.1 mm, and an SDSSD of 0.3 ± 0.1 mm. The validation on the BRATS data resulted in a DC of 0.89 ± 0.08, which outperforms the BRATS challenge algorithms. Evaluation on clinical datasets gave a DC of 0.86 ± 0.09, an NMI of 0.80 ± 0.11, an SSIM of 0.999 ± 0.001, an HD of 8.8 ± 12.6 mm, an MSSD of 1.5 ± 3.2 mm, and an SDSSD of 1.8 ± 3.4 mm when compared to the physician-drawn ground truth. The results indicated that the developed automatic segmentation strategy yields accurate brain tumor delineation and could serve as a useful clinical tool for SRS applications.
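For reference, two of the reported metrics can be computed as follows. This is a generic sketch, not the authors' implementation; the binary masks and surface point sets are assumed inputs.

    # Dice coefficient between binary masks and symmetric Hausdorff distance
    # between tumor surface point sets (coordinates assumed to be in mm).
    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    def dice(a, b):
        a, b = a.astype(bool), b.astype(bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    def hausdorff(pts_a, pts_b):
        return max(directed_hausdorff(pts_a, pts_b)[0],
                   directed_hausdorff(pts_b, pts_a)[0])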
EAARL coastal topography-Northern Outer Banks, North Carolina, post-Nor'Ida, 2009
Bonisteel-Cormier, J.M.; Nayegandhi, Amar; Wright, C.W.; Sallenger, A.H.; Brock, J.C.; Nagle, D.B.; Vivekanandan, Saisudha; Klipp, E.S.; Fredericks, Xan
2011-01-01
This DVD contains lidar-derived first-surface (FS) and bare-earth (BE) topography GIS datasets of a portion of the northern Outer Banks beachface in North Carolina. These datasets were acquired post-Nor'Ida on November 27 and 29, 2009.
Integrating Healthcare Ethical Issues into IS Education
ERIC Educational Resources Information Center
Cellucci, Leigh W.; Layman, Elizabeth J.; Campbell, Robert; Zeng, Xiaoming
2011-01-01
Federal initiatives are encouraging the increase of IS graduates to work in the healthcare environment because they possess knowledge of datasets and dataset management that are key to effective management of electronic health records (EHRs) and health information technology (IT). IS graduates will be members of the healthcare team, and as such,…
Efficiently Exploring Multilevel Data with Recursive Partitioning
ERIC Educational Resources Information Center
Martin, Daniel P.; von Oertzen, Timo; Rimm-Kaufman, Sara E.
2015-01-01
There is an increasing number of datasets with many participants, variables, or both, in education and other fields that often deal with large, multilevel data structures. Once initial confirmatory hypotheses are exhausted, it can be difficult to determine how best to explore the dataset to discover hidden relationships that could help to inform…
Enhancing Conservation with High Resolution Productivity Datasets for the Conterminous United States
NASA Astrophysics Data System (ADS)
Robinson, Nathaniel Paul
Human driven alteration of the earth's terrestrial surface is accelerating through land use changes, intensification of human activity, climate change, and other anthropogenic pressures. These changes occur at broad spatio-temporal scales, challenging our ability to effectively monitor and assess the impacts and subsequent conservation strategies. While satellite remote sensing (SRS) products enable monitoring of the earth's terrestrial surface continuously across space and time, the practical applications for conservation and management of these products are limited. Often the processes driving ecological change occur at fine spatial resolutions and are undetectable given the resolution of available datasets. Additionally, the links between SRS data and ecologically meaningful metrics are weak. Recent advances in cloud computing technology along with the growing record of high resolution SRS data enable the development of SRS products that quantify ecologically meaningful variables at relevant scales applicable for conservation and management. The focus of my dissertation is to improve the applicability of terrestrial gross and net primary productivity (GPP/NPP) datasets for the conterminous United States (CONUS). In chapter one, I develop a framework for creating high resolution datasets of vegetation dynamics. I use the entire archive of Landsat 5, 7, and 8 surface reflectance data and a novel gap filling approach to create spatially continuous 30 m, 16-day composites of the normalized difference vegetation index (NDVI) from 1986 to 2016. In chapter two, I integrate this with other high resolution datasets and the MOD17 algorithm to create the first high resolution GPP and NPP datasets for CONUS. I demonstrate the applicability of these products for conservation and management, showing the improvements beyond currently available products. In chapter three, I utilize this dataset to evaluate the relationships between land ownership and terrestrial production across the CONUS domain. The main results of this work are three publicly available datasets: 1) 30 m Landsat NDVI; 2) 250 m MODIS based GPP and NPP; and 3) 30 m Landsat based GPP and NPP. My goal is that these products prove useful for the wider scientific, conservation, and land management communities as we continue to strive for better conservation and management practices.
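As an illustration of the compositing and gap-filling idea in chapter one (not the dissertation's actual Landsat processing chain), the sketch below computes NDVI from red and near-infrared surface reflectance and replaces masked composites with a per-period climatology; the arrays and the fill rule are assumptions.

    # Hedged sketch: NDVI from Landsat red/NIR reflectance plus a simple gap fill.
    import numpy as np

    def ndvi(nir, red):
        # normalized difference vegetation index, guarded against division by zero
        return (nir - red) / np.clip(nir + red, 1e-6, None)

    def fill_gaps(composite, climatology):
        # replace cloudy/missing (NaN) cells with the long-term mean for that 16-day period
        return np.where(np.isnan(composite), climatology, composite)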
Holmes, Avram J; Hollinshead, Marisa O; O'Keefe, Timothy M; Petrov, Victor I; Fariello, Gabriele R; Wald, Lawrence L; Fischl, Bruce; Rosen, Bruce R; Mair, Ross W; Roffman, Joshua L; Smoller, Jordan W; Buckner, Randy L
2015-01-01
The goal of the Brain Genomics Superstruct Project (GSP) is to enable large-scale exploration of the links between brain function, behavior, and ultimately genetic variation. To provide the broader scientific community data to probe these associations, a repository of structural and functional magnetic resonance imaging (MRI) scans linked to genetic information was constructed from a sample of healthy individuals. The initial release, detailed in the present manuscript, encompasses quality screened cross-sectional data from 1,570 participants ages 18 to 35 years who were scanned with MRI and completed demographic and health questionnaires. Personality and cognitive measures were obtained on a subset of participants. Each dataset contains a T1-weighted structural MRI scan and either one (n=1,570) or two (n=1,139) resting state functional MRI scans. Test-retest reliability datasets are included from 69 participants scanned within six months of their initial visit. For the majority of participants self-report behavioral and cognitive measures are included (n=926 and n=892 respectively). Analyses of data quality, structure, function, personality, and cognition are presented to demonstrate the dataset's utility.
NASA Astrophysics Data System (ADS)
Dunn, R. J. H.; Willett, K. M.; Thorne, P. W.; Woolley, E. V.; Durre, I.; Dai, A.; Parker, D. E.; Vose, R. S.
2012-10-01
This paper describes the creation of HadISD: an automatically quality-controlled synoptic resolution dataset of temperature, dewpoint temperature, sea-level pressure, wind speed, wind direction and cloud cover from global weather stations for 1973-2011. The full dataset consists of over 6000 stations, with 3427 long-term stations deemed to have sufficient sampling and quality for climate applications requiring sub-daily resolution. As with other surface datasets, coverage is heavily skewed towards Northern Hemisphere mid-latitudes. The dataset is constructed from a large pre-existing ASCII flatfile data bank that represents over a decade of substantial effort at data retrieval, reformatting and provision. These raw data have had varying levels of quality control applied to them by individual data providers. The work proceeded in several steps: merging stations with multiple reporting identifiers; reformatting to netCDF; quality control; and then filtering to form a final dataset. Particular attention has been paid to maintaining true extreme values where possible within an automated, objective process. Detailed validation has been performed on a subset of global stations and also on UK data using known extreme events to help finalise the QC tests. Further validation was performed on a selection of extreme events world-wide (Hurricane Katrina in 2005, the cold snap in Alaska in 1989 and heat waves in SE Australia in 2009). Some very initial analyses are performed to illustrate some of the types of problems to which the final data could be applied. Although the filtering has removed the poorest station records, no attempt has been made to homogenise the data thus far, due to the complexity of retaining the true distribution of high-resolution data when applying adjustments. Hence non-climatic, time-varying errors may still exist in many of the individual station records and care is needed in inferring long-term trends from these data. This dataset will allow the study of high frequency variations of temperature, pressure and humidity on a global basis over the last four decades. Both individual extremes and the overall population of extreme events could be investigated in detail to allow for comparison with past and projected climate. A version-control system has been constructed for this dataset to allow for the clear documentation of any updates and corrections in the future.
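HadISD applies a suite of automated tests; the snippet below shows only one generic climatological outlier check as an illustration of how such flagging works, and is not the HadISD code. The 5-sigma threshold is an assumption.

    # Toy QC test: flag sub-daily observations far outside the station's monthly climatology.
    import numpy as np

    def climatological_outliers(values, months, n_sigma=5.0):
        flags = np.zeros(values.shape, dtype=bool)
        for m in range(1, 13):
            sel = months == m
            mu, sd = np.nanmean(values[sel]), np.nanstd(values[sel])
            flags[sel] = np.abs(values[sel] - mu) > n_sigma * sd
        return flags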
Initial Everglades Depth Estimation Network (EDEN) Digital Elevation Model Research and Development
Jones, John W.; Price, Susan D.
2007-01-01
Introduction The Everglades Depth Estimation Network (EDEN) offers a consistent and documented dataset that can be used to guide large-scale field operations, to integrate hydrologic and ecological responses, and to support biological and ecological assessments that measure ecosystem responses to the Comprehensive Everglades Restoration Plan (Telis, 2006). To produce historic and near-real time maps of water depths, the EDEN requires a system-wide digital elevation model (DEM) of the ground surface. Accurate Everglades wetland ground surface elevation data were non-existent before the U.S. Geological Survey (USGS) undertook the collection of highly accurate surface elevations at the regional scale. These form the foundation for EDEN DEM development. This development process is iterative as additional high accuracy elevation data (HAED) are collected, water surfacing algorithms improve, and additional ground-based ancillary data become available. Models are tested using withheld HAED and independently measured water depth data, and by using DEM data in EDEN adaptive management applications. Here the collection of HAED is briefly described before the approach to DEM development and the current EDEN DEM are detailed. Finally future research directions for continued model development, testing, and refinement are provided.
Quantifying the Consumptive Landscape in the Potomac Watershed Upstream From Washington DC
NASA Astrophysics Data System (ADS)
Kearns, M.; Zegre, N.; Fernandez, R.
2017-12-01
Some of the largest and fastest-growing eastern cities depend upon Appalachian headwaters for their fresh water. Today's relative abundance of water may be at risk: changes in climate and land use could alter the availability of surface water and human consumption could increase to meet the needs of a growing population and economy. Neither the supply of surface water nor the various withdrawals that support our population, irrigation, energy, and industry are distributed uniformly throughout our watersheds. This study correlates surface water withdrawals, consumptive use coefficients, and land-use/land-cover datasets to create a model for quantifying anthropogenic water consumption. The model suggests a method for downscaling and redistributing USGS county-level surface water withdrawals to 30 meter cells. Initially completed for the Potomac River watershed upstream from Washington DC's public supply intake, this approach could easily scale regionally or nationally. When combined with runoff estimates over the same landscape, the net-production or net-consumption of an area of interest may be calculated at high resolution. By better understanding the spatial relationship between hydrologic supply and demand, we can seek to improve the efficiency and security of our water resources.
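The downscaling idea can be sketched as a weighted redistribution of a county total across 30 m cells, with weights assigned by land-use/land-cover class. This is not the study's model; the class codes, weights and totals below are illustrative assumptions.

    # Hedged sketch: redistribute a county-level surface-water withdrawal to 30 m cells.
    import numpy as np

    def downscale_withdrawal(county_total, lulc, class_weights):
        # lulc: 2-D array of land-cover codes for the county's cells
        w = np.vectorize(lambda c: class_weights.get(c, 0.0))(lulc).astype(float)
        return np.zeros_like(w) if w.sum() == 0 else county_total * w / w.sum()

    cells = downscale_withdrawal(
        1.2e6,                                        # county withdrawal, m3/day (assumed)
        np.array([[82, 82, 21], [11, 82, 21]]),       # NLCD-style land-cover codes (assumed)
        {82: 1.0, 21: 0.3, 11: 0.0})                  # per-class weights (assumed)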
NASA Astrophysics Data System (ADS)
Mandolesi, E.; Jones, A. G.; Roux, E.; Lebedev, S.
2009-12-01
Recently, different studies have been undertaken on the correlation between diverse geophysical datasets. Magnetotelluric (MT) data are used to map the electrical conductivity structure beneath the Earth's surface, but one of the problems with the MT method is its lack of resolution in mapping zones beneath a region of high conductivity. Joint inversion of different datasets in which a common structure is recognizable reduces non-uniqueness and may improve the quality of interpretation when the different datasets are sensitive to different physical properties that share an underlying common structure. A common structure is recognized if the changes in physical properties occur at the same spatial locations. Common structure may be recognized in 1D inversion of seismic and MT datasets, and numerous authors have shown that a 2D common structure may also lead to an improvement in inversion quality when the datasets are jointly inverted. In this presentation, a tool to constrain 2D MT inversion with phase velocities of surface wave seismic data (SW) is proposed and is being developed and tested on synthetic data. The results obtained suggest that a joint inversion scheme could be applied with success along a section profile for which the data are compatible with a 2D MT model.
Relationship between Defect Size and Fatigue Life Distributions in Al-7 Pct Si-Mg Alloy Castings
NASA Astrophysics Data System (ADS)
Tiryakioğlu, Murat
2009-07-01
A new method for predicting the variability in fatigue life of castings was developed by combining the size distribution of the fatigue-initiating defects with a fatigue life model based on the Paris-Erdoğan law for crack propagation. Two datasets for the fatigue-initiating defects in Al-7 pct Si-Mg alloy castings, reported previously in the literature, were used to demonstrate that (1) the size of fatigue-initiating defects follows the Gumbel distribution; (2) the crack propagation model developed previously provides respectable fits to experimental data; and (3) the method developed in the present study expresses the variability in both datasets almost as well as the lognormal distribution and better than the Weibull distribution.
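The first step, fitting a largest-extreme-value (Gumbel) distribution to the fatigue-initiating defect sizes, can be sketched as follows; the input file and the exceedance query are assumptions, and this is not the author's code.

    # Hedged sketch: Gumbel fit to defect sizes and an exceedance probability.
    import numpy as np
    from scipy.stats import gumbel_r

    defect_sizes = np.loadtxt("defect_sizes_um.txt")   # assumed input, sizes in micrometres
    loc, scale = gumbel_r.fit(defect_sizes)
    p_exceed = gumbel_r.sf(100.0, loc=loc, scale=scale)
    print(f"loc={loc:.1f} um, scale={scale:.1f} um, P(size > 100 um)={p_exceed:.3f}")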
NASA Astrophysics Data System (ADS)
Kang, D. H.; Hwang, E.; Jung, H. C.; Kim, E. J.; Peters-Lidard, C. D.; Kumar, S.; Chae, H.; Baeck, S. H.
2017-12-01
NASA has contributed to resolving global water issues by utilizing its long-term legacy of remote sensing technologies supported by state-of-the-art software engineering. In this context, NASA Goddard Space Flight Center has developed a land surface model framework, the Land Information System (hereafter LIS), to monitor and predict water hazards such as floods and droughts, applied to North America and extended to global coverage. However, it is still challenging to apply the LIS to East Asia, where rice-paddy agriculture is more prevalent than in other parts of the world and population density is high. Thus, this paper introduces recent efforts by the Korea Water Resources Corporation (K-water) in South Korea to establish the LIS in East Asia, including Korea, with the aim of producing surface hydrology datasets for Asia. One of the ultimate goals of this project is to manage water hazards in Korea and to provide water resources datasets for East Asia by adapting the LIS to the abundant hydrometeorological observations available to support LIS applications. Preliminary results from the efforts initiated at the beginning of 2017 between NASA and K-water are addressed in the paper to review the possible outcomes of this ongoing project for the benefit of both entities. Acknowledgements This research was supported by a grant (17AWMP-B079625-04) from the Water Management Research Program sponsored by the Ministry of Land, Infrastructure and Transport of the Korean government.
Impact of Land Cover Characterization and Properties on Snow Albedo in Climate Models
NASA Astrophysics Data System (ADS)
Wang, L.; Bartlett, P. A.; Chan, E.; Montesano, P.
2017-12-01
The simulation of winter albedo in boreal and northern environments has been a particular challenge for land surface modellers. Assessments of output from CMIP3 and CMIP5 climate models have revealed that many simulations are characterized by an overestimation of albedo in the boreal forest. Recent studies suggest that inaccurate representation of vegetation distribution, improper simulation of leaf area index, and poor treatment of canopy-snow processes are the primary causes of albedo errors. While several land cover datasets are commonly used to derive plant functional types (PFTs) for use in climate models, new land cover and vegetation datasets with higher spatial resolution have become available in recent years. In this study, we compare the spatial distribution of the dominant PFTs and canopy cover fractions based on different land cover datasets, and present results from offline simulations of the latest version of the Canadian Land Surface Scheme (CLASS) over Northern Hemisphere land. We discuss the impact of land cover representation and surface properties on winter albedo simulations in climate models.
Decibel: The Relational Dataset Branching System
Maddox, Michael; Goehring, David; Elmore, Aaron J.; Madden, Samuel; Parameswaran, Aditya; Deshpande, Amol
2017-01-01
As scientific endeavors and data analysis become increasingly collaborative, there is a need for data management systems that natively support the versioning or branching of datasets to enable concurrent analysis, cleaning, integration, manipulation, or curation of data across teams of individuals. Common practice for sharing and collaborating on datasets involves creating or storing multiple copies of the dataset, one for each stage of analysis, with no provenance information tracking the relationships between these datasets. This results not only in wasted storage, but also makes it challenging to track and integrate modifications made by different users to the same dataset. In this paper, we introduce the Relational Dataset Branching System, Decibel, a new relational storage system with built-in version control designed to address these shortcomings. We present our initial design for Decibel and provide a thorough evaluation of three versioned storage engine designs that focus on efficient query processing with minimal storage overhead. We also develop an exhaustive benchmark to enable the rigorous testing of these and future versioned storage engine designs. PMID:28149668
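The branching semantics can be illustrated with a toy in-memory structure. Decibel itself is a relational storage engine with versioned physical layouts, so this is only a conceptual sketch with hypothetical names.

    # Toy illustration of dataset branching: each branch points to a commit,
    # and each commit records the set of record ids visible in that version.
    class VersionedDataset:
        def __init__(self):
            self.commits = {0: set()}          # commit id -> record ids
            self.branches = {"master": 0}      # branch name -> head commit id
            self._next = 1

        def commit(self, branch, records):
            parent = self.branches[branch]
            cid, self._next = self._next, self._next + 1
            self.commits[cid] = set(self.commits[parent]) | set(records)
            self.branches[branch] = cid
            return cid

        def branch(self, src, new_name):
            # a new branch initially shares its parent's head commit
            self.branches[new_name] = self.branches[src]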
NASA Astrophysics Data System (ADS)
Jensen, K.; McDonald, K. C.; Ceccato, P.; Schroeder, R.; Podest, E.
2014-12-01
The potential impact of climate variability and change on the spread of infectious disease is of increasingly critical concern to public health. Newly available remote sensing datasets may be combined with predictive modeling to develop new capabilities to mitigate risks of vector-borne diseases such as malaria, leishmaniasis, and Rift Valley fever. We have developed improved remote sensing-based products for monitoring water bodies and inundation dynamics that have potential utility for improving risk forecasts of vector-borne disease epidemics. These products include daily and seasonal surface inundation based on global mappings of inundated area fraction derived at the 25-km scale from active and passive microwave instruments (ERS, QuikSCAT, ASCAT, and SSM/I) - the Satellite Water Microwave Product Series (SWAMPS). Focusing on the East African region, we present validation of this product using multi-temporal classification of inundated areas derived from high-resolution PALSAR (100 m) and Landsat (30 m) observations. We assess the historical occurrence of malaria in the East African country of Eritrea with respect to the time-series SWAMPS datasets, and we aim to construct a framework for use of these new datasets to improve prediction of future malaria risk in this region. This work is supported through funding from the NASA Applied Sciences Program, the NASA Terrestrial Ecology Program, and the NASA Making Earth System Data Records for Use in Research Environments (MEaSUREs) Program. This study is also supported and monitored by the National Oceanic and Atmospheric Administration (NOAA) under CREST Grant # NA11SEC4810004. The statements contained within the manuscript/research article are not the opinions of the funding agency or the U.S. government, but reflect the authors' opinions. This work was conducted in part under the framework of the ALOS Kyoto and Carbon Initiative. ALOS PALSAR data were provided by JAXA EORC.
Evaluating RGB photogrammetry and multi-temporal digital surface models for detecting soil erosion
NASA Astrophysics Data System (ADS)
Anders, Niels; Keesstra, Saskia; Seeger, Manuel
2013-04-01
Photogrammetry is a widely used tool for generating high-resolution digital surface models. Unmanned Aerial Vehicles (UAVs), equipped with a Red Green Blue (RGB) camera, have great potential for quickly acquiring multi-temporal high-resolution orthophotos and surface models. Such datasets would ease the monitoring of geomorphological processes, such as local soil erosion and rill formation after heavy rainfall events. In this study we test a photogrammetric setup to determine data requirements for soil erosion studies with UAVs. We used a rainfall simulator (5 m2) with a rig mounted above it carrying a Panasonic GX1 16-megapixel digital camera with a 20 mm lens. The soil material in the simulator consisted of loamy sand at a slope of 5 degrees. Stereo pair images were taken before and after rainfall simulation with 75-85% overlap. The acquired images were automatically mosaicked to create high-resolution orthorectified images and digital surface models (DSMs). We resampled the DSMs to different spatial resolutions to analyze the effect of cell size on the accuracy of measured rill depth and soil loss estimates, and determined an optimal cell size (and thus flight altitude). Furthermore, the high spatial accuracy of the acquired surface models allows further analysis of rill formation and channel initiation related to, e.g., surface roughness. We suggest adding near-infrared and temperature sensors to combine soil moisture and soil physical properties with surface morphology in future investigations.
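The multi-temporal analysis implied above reduces to differencing two co-registered DSMs. A minimal sketch, in which the cell size and the change-detection threshold are assumptions:

    # Hedged sketch: DEM of difference (before minus after) and eroded volume.
    import numpy as np

    def erosion_from_dsms(dsm_before, dsm_after, cell_size_m, min_change_m=0.005):
        dod = dsm_before - dsm_after                  # positive values = surface lowering
        dod[np.abs(dod) < min_change_m] = 0.0         # suppress change below noise level
        eroded_volume = np.clip(dod, 0.0, None).sum() * cell_size_m ** 2
        return dod, eroded_volume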
SMERGE: A multi-decadal root-zone soil moisture product for CONUS
NASA Astrophysics Data System (ADS)
Crow, W. T.; Dong, J.; Tobin, K. J.; Torres, R.
2017-12-01
Multi-decadal root-zone soil moisture products are of value for a range of water resource and climate applications. The NASA-funded root-zone soil moisture merging project (SMERGE) seeks to develop such products through the optimal merging of land surface model predictions with surface soil moisture retrievals acquired from multi-sensor remote sensing products. This presentation will describe the creation and validation of a daily, multi-decadal (1979-2015), vertically-integrated (both surface to 40 cm and surface to 100 cm), 0.125-degree root-zone product over the contiguous United States (CONUS). The modeling backbone of the system is based on hourly root-zone soil moisture simulations generated by the Noah model (v3.2) operating within the North American Land Data Assimilation System (NLDAS-2). Remotely-sensed surface soil moisture retrievals are taken from the multi-sensor European Space Agency Climate Change Initiative soil moisture data set (ESA CCI SM). In particular, the talk will detail: 1) the exponential smoothing approach used to convert surface ESA CCI SM retrievals into root-zone soil moisture estimates, 2) the averaging technique applied to merge (temporally-sporadic) remotely-sensed with (continuous) NLDAS-2 land surface model estimates of root-zone soil moisture into the unified SMERGE product, and 3) the validation of the SMERGE product using long-term, ground-based soil moisture datasets available within CONUS.
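A commonly used way to propagate surface retrievals to the root zone is the recursive exponential filter shown below; SMERGE's exact formulation, and its subsequent merging with the NLDAS-2 Noah simulations, may differ, so treat this strictly as a hedged sketch. T is the characteristic time scale in days.

    # Hedged sketch: recursive exponential filter from surface soil moisture to a
    # root-zone soil wetness index, handling irregular retrieval times.
    import numpy as np

    def exp_filter(ssm, t_days, T=20.0):
        swi = np.full(len(ssm), np.nan)
        swi[0], K = ssm[0], 1.0
        for n in range(1, len(ssm)):
            K = K / (K + np.exp(-(t_days[n] - t_days[n - 1]) / T))
            swi[n] = swi[n - 1] + K * (ssm[n] - swi[n - 1])
        return swi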
NASA Technical Reports Server (NTRS)
Kempler, Steven; Teng, William; Acker, James; Belvedere, Deborah; Liu, Zhong; Leptoukh, Gregory
2010-01-01
In support of the NASA Energy and Water Cycle Study (NEWS), the Collaborative Energy and Water Cycle Information Services (CEWIS), sponsored by NEWS Program Manager Jared Entin, was initiated to develop an evolving set of community-based data and information services that would help users locate, access, and bring together multiple distributed heterogeneous energy and water cycle datasets. The CEWIS workshop, June 15-16, 2010, at NASA/GSFC, was the initial step of the process, starting with identifying and scoping the issues, as defined by the community.
Modelling surface-water depression storage in a Prairie Pothole Region
Hay, Lauren E.; Norton, Parker A.; Viger, Roland; Markstrom, Steven; Regan, R. Steven; Vanderhoof, Melanie
2018-01-01
In this study, the Precipitation-Runoff Modelling System (PRMS) was used to simulate changes in surface-water depression storage in the 1,126-km2 Upper Pipestem Creek basin located within the Prairie Pothole Region of North Dakota, USA. The Prairie Pothole Region is characterized by millions of small water bodies (or surface-water depressions) that provide numerous ecosystem services and are considered an important contribution to the hydrologic cycle. The Upper Pipestem PRMS model was extracted from the U.S. Geological Survey's (USGS) National Hydrologic Model (NHM), developed to support consistent hydrologic modelling across the conterminous United States. The Geospatial Fabric database, created for the USGS NHM, contains hydrologic model parameter values derived from datasets that characterize the physical features of the entire conterminous United States for 109,951 hydrologic response units. Each hydrologic response unit in the Geospatial Fabric was parameterized using aggregated surface-water depression area derived from the National Hydrography Dataset Plus, an integrated suite of application-ready geospatial datasets. This paper presents a calibration strategy for the Upper Pipestem PRMS model that uses normalized lake elevation measurements to calibrate the parameters influencing simulated fractional surface-water depression storage. Results indicate that including measurements indicative of the change in surface-water depression storage in the calibration procedure resulted in accurate simulation of changes in surface-water depression storage in the water balance. Regionalized parameterization of the USGS NHM will require a proxy for change in surface storage to accurately parameterize surface-water depression storage within the USGS NHM.
Zhang, Xu; Li, Yun; Chen, Xiang; Li, Guanglin; Rymer, William Zev; Zhou, Ping
2013-01-01
This study investigates the effect of involuntary motor activity of paretic-spastic muscles on classification of surface electromyography (EMG) signals. Two data collection sessions were designed for 8 stroke subjects to voluntarily perform 11 functional movements using their affected forearm and hand at a relatively slow and fast speed. For each stroke subject, the degree of involuntary motor activity present in voluntary surface EMG recordings was qualitatively described from such slow and fast experimental protocols. Myoelectric pattern recognition analysis was performed using different combinations of voluntary surface EMG data recorded from slow and fast sessions. Across all tested stroke subjects, our results revealed that when involuntary surface EMG was absent or present in both training and testing datasets, high accuracies (> 96%, > 98%, respectively, averaged over all the subjects) can be achieved in classification of different movements using surface EMG signals from paretic muscles. When involuntary surface EMG was solely involved in either training or testing datasets, the classification accuracies were dramatically reduced (< 89%, < 85%, respectively). However, if both training and testing datasets contained EMG signals with presence and absence of involuntary EMG interference, high accuracies were still achieved (> 97%). The findings of this study can be used to guide appropriate design and implementation of myoelectric pattern recognition based systems or devices toward promoting robot-aided therapy for stroke rehabilitation. PMID:23860192
NASA Astrophysics Data System (ADS)
van Osnabrugge, Bart; Weerts, Albrecht; Uijlenhoet, Remko
2017-04-01
Gridded areal precipitation, as one of the most important hydrometeorological input variables for initial state estimation in operational hydrological forecasting, is available in the form of raster datasets (e.g. HYRAS and EOBS) for the River Rhine basin. These datasets are compiled off-line on a daily time step using station data with the highest possible spatial density. However, such a product is not available operationally at an hourly discretisation. Therefore, we constructed an hourly gridded precipitation dataset at 1.44 km2 resolution for the Rhine basin for the period from 1998 to present, using a REGNIE-like interpolation procedure (Weerts et al., 2008) with both a low- and a high-density rain gauge network. The datasets were validated against daily HYRAS (Rauthe et al., 2013) and EOBS (Haylock et al., 2008) data. The main goal of the operational procedure is to emulate the HYRAS dataset as closely as possible, as the daily HYRAS dataset is used in the off-line calibration of the hydrological model. Our main findings are that even with low station density, the spatial patterns found in the HYRAS dataset are well reproduced. With low station density (years 1999-2006) our dataset underestimates precipitation compared to HYRAS and EOBS, notably during winter. However, interpolation based on the same set of stations overestimates precipitation compared to EOBS for the years 2006-2014. This discrepancy disappears when switching to the high station density. We also analyze the robustness of the hourly precipitation fields by comparison with stations not used in the interpolation. Specific issues encountered with the data when creating the gridded precipitation fields will be highlighted. Finally, the datasets are used to drive hourly and daily gridded WFLOW_HBV models of the Rhine at the same spatial resolution. Haylock, M.R., N. Hofstra, A.M.G. Klein Tank, E.J. Klok, P.D. Jones and M. New, 2008: A European daily high-resolution gridded dataset of surface temperature and precipitation. J. Geophys. Res. (Atmospheres), 113, D20119, doi:10.1029/2008JD10201. Rauthe, M., Steiner, H., Riediger, U., Mazurkiewicz, A., Gratzki, A., 2013: A Central European precipitation climatology - Part 1: Generation and validation of a high-resolution gridded daily data set (HYRAS). Meteorologische Zeitschrift, 22(3), 235-256. Weerts, A.H., D. Meißner, and S. Rademacher, 2008: Input data rainfall-runoff model operational system FEWS-NL & FEWS-DE. Technical report, Deltares.
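The station-to-grid step can be illustrated with simple inverse-distance weighting; the operational REGNIE-like procedure is more involved (it works with anomalies relative to a background climatology), so the snippet below is only a hedged sketch with assumed inputs.

    # Hedged sketch: IDW interpolation of hourly gauge values onto grid cell centres.
    import numpy as np

    def idw_grid(xy_sta, values, xy_grid, power=2.0, eps=1e-6):
        # xy_sta: (n, 2) station coordinates; xy_grid: (m, 2) cell centres
        d = np.linalg.norm(xy_grid[:, None, :] - xy_sta[None, :, :], axis=2)
        w = 1.0 / np.maximum(d, eps) ** power
        return (w * values).sum(axis=1) / w.sum(axis=1)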
Kittel, T.G.F.; Rosenbloom, N.A.; Royle, J. Andrew; Daly, Christopher; Gibson, W.P.; Fisher, H.H.; Thornton, P.; Yates, D.N.; Aulenbach, S.; Kaufman, C.; McKeown, R.; Bachelet, D.; Schimel, D.S.; Neilson, R.; Lenihan, J.; Drapek, R.; Ojima, D.S.; Parton, W.J.; Melillo, J.M.; Kicklighter, D.W.; Tian, H.; McGuire, A.D.; Sykes, M.T.; Smith, B.; Cowling, S.; Hickler, T.; Prentice, I.C.; Running, S.; Hibbard, K.A.; Post, W.M.; King, A.W.; Smith, T.; Rizzo, B.; Woodward, F.I.
2004-01-01
Analysis and simulation of biospheric responses to historical forcing require surface climate data that capture those aspects of climate that control ecological processes, including key spatial gradients and modes of temporal variability. We developed a multivariate, gridded historical climate dataset for the conterminous USA as a common input database for the Vegetation/Ecosystem Modeling and Analysis Project (VEMAP), a biogeochemical and dynamic vegetation model intercomparison. The dataset covers the period 1895-1993 on a 0.5° latitude/longitude grid. Climate is represented at both monthly and daily timesteps. Variables are: precipitation, minimum and maximum temperature, total incident solar radiation, daylight-period irradiance, vapor pressure, and daylight-period relative humidity. The dataset was derived from US Historical Climate Network (HCN), cooperative network, and snowpack telemetry (SNOTEL) monthly precipitation and mean minimum and maximum temperature station data. We employed techniques that rely on geostatistical and physical relationships to create the temporally and spatially complete dataset. We developed a local kriging prediction model to infill discontinuous and limited-length station records based on the spatial autocorrelation structure of climate anomalies. A spatial interpolation model (PRISM) that accounts for physiographic controls was used to grid the infilled monthly station data. We implemented a stochastic weather generator (modified WGEN) to disaggregate the gridded monthly series to dailies. Radiation and humidity variables were estimated from the dailies using a physically-based empirical surface climate model (MTCLIM3). Derived datasets include a 100 yr model spin-up climate and a historical Palmer Drought Severity Index (PDSI) dataset. The VEMAP dataset exhibits statistically significant trends in temperature, precipitation, solar radiation, vapor pressure, and PDSI for US National Assessment regions. The historical climate and companion datasets are available online at data archive centers. © Inter-Research 2004.
Generation of the 30 M-Mesh Global Digital Surface Model by Alos Prism
NASA Astrophysics Data System (ADS)
Tadono, T.; Nagai, H.; Ishida, H.; Oda, F.; Naito, S.; Minakawa, K.; Iwamoto, H.
2016-06-01
Topographical information is fundamental to many geospatial applications on Earth. Remote sensing satellites have the advantage in such fields because they are capable of repeated global observation. Several satellite-based digital elevation datasets have been provided to examine global terrain at medium resolutions, e.g. the Shuttle Radar Topography Mission (SRTM) and the global digital elevation model from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER GDEM). A new global digital surface model (DSM) dataset using the archived data of the Panchromatic Remote-sensing Instrument for Stereo Mapping (PRISM) onboard the Advanced Land Observing Satellite (ALOS, nicknamed "Daichi") was completed in March 2016 by the Japan Aerospace Exploration Agency (JAXA) in collaboration with NTT DATA Corp. and the Remote Sensing Technology Center of Japan. This project is called "ALOS World 3D" (AW3D), and its dataset consists of the global DSM with 0.15 arcsec. pixel spacing (approx. 5 m mesh) and ortho-rectified PRISM images with 2.5 m resolution. JAXA is also processing the global DSM with 1 arcsec. spacing (approx. 30 m mesh) based on the AW3D DSM dataset, and partially releasing it free of charge as "ALOS World 3D 30 m mesh" (AW3D30). The global AW3D30 dataset will be released in May 2016. This paper describes the processing status, a preliminary validation result of the AW3D30 DSM dataset, and its public release status. In the preliminary validation of the AW3D30 DSM, a height accuracy of 4.40 m (RMSE) was confirmed using 5,121 independent check points distributed around the world.
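The reported vertical accuracy check amounts to sampling the DSM at independently surveyed points and computing the RMSE of the height differences. A minimal sketch, in which the raster transform, nearest-neighbour sampling and input arrays are assumptions:

    # Hedged sketch: DSM height RMSE against independent check points.
    import numpy as np

    def dsm_rmse(dsm, transform, checkpoints_xyz):
        # transform = (x0, y0, dx, dy); checkpoints_xyz: (n, 3) array of x, y, z
        x0, y0, dx, dy = transform
        cols = np.round((checkpoints_xyz[:, 0] - x0) / dx).astype(int)
        rows = np.round((checkpoints_xyz[:, 1] - y0) / dy).astype(int)
        dz = dsm[rows, cols] - checkpoints_xyz[:, 2]
        return np.sqrt(np.mean(dz ** 2))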
A Comparison of the Forecast Skills among Three Numerical Models
NASA Astrophysics Data System (ADS)
Lu, D.; Reddy, S. R.; White, L. J.
2003-12-01
Three numerical weather forecast models, MM5, COAMPS and WRF, operated through a joint effort of NOAA HU-NCAS and Jackson State University (JSU) during summer 2003, were chosen to study their forecast skill against observations. The models forecast over the same region with the same initialization, boundary conditions, forecast length and spatial resolution. The AVN global dataset was ingested as initial conditions. A grid resolution of 27 km was chosen to represent the current mesoscale model. Forecasts of 36 h length were produced, with output at 12 h intervals. The key parameters used to evaluate the forecast skill include 12 h accumulated precipitation, sea level pressure, wind, surface temperature and dew point. Precipitation is evaluated statistically using conventional skill scores, Threat Score (TS) and Bias Score (BS), for different threshold values based on 12 h rainfall observations, whereas other statistical measures such as Mean Error (ME), Mean Absolute Error (MAE) and Root Mean Square Error (RMSE) are applied to the other forecast parameters.
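The Threat Score and Bias Score are standard categorical scores computed from a 2x2 contingency table of forecast versus observed exceedance of a rainfall threshold. The sketch below uses their conventional definitions; the array inputs are assumed, and it presumes at least one event is forecast or observed.

    # Hedged sketch: Threat Score (critical success index) and Bias Score.
    import numpy as np

    def threat_and_bias(forecast, observed, threshold):
        f, o = forecast >= threshold, observed >= threshold
        hits = np.sum(f & o)
        false_alarms = np.sum(f & ~o)
        misses = np.sum(~f & o)
        ts = hits / float(hits + misses + false_alarms)     # Threat Score
        bs = (hits + false_alarms) / float(hits + misses)   # Bias Score
        return ts, bs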
Rainforest-initiated wet season onset over the southern Amazon.
Wright, Jonathon S; Fu, Rong; Worden, John R; Chakraborty, Sudip; Clinton, Nicholas E; Risi, Camille; Sun, Ying; Yin, Lei
2017-08-08
Although it is well established that transpiration contributes much of the water for rainfall over Amazonia, it remains unclear whether transpiration helps to drive or merely responds to the seasonal cycle of rainfall. Here, we use multiple independent satellite datasets to show that rainforest transpiration enables an increase of shallow convection that moistens and destabilizes the atmosphere during the initial stages of the dry-to-wet season transition. This shallow convection moisture pump (SCMP) preconditions the atmosphere at the regional scale for a rapid increase in rain-bearing deep convection, which in turn drives moisture convergence and wet season onset 2-3 mo before the arrival of the Intertropical Convergence Zone (ITCZ). Aerosols produced by late dry season biomass burning may alter the efficiency of the SCMP. Our results highlight the mechanisms by which interactions among land surface processes, atmospheric convection, and biomass burning may alter the timing of wet season onset and provide a mechanistic framework for understanding how deforestation extends the dry season and enhances regional vulnerability to drought.
Quantifying Uncertainties in Land Surface Microwave Emissivity Retrievals
NASA Technical Reports Server (NTRS)
Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko
2012-01-01
Uncertainties in the retrievals of microwave land surface emissivities were quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including SSM/I, TMI and AMSR-E, were studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging from 1-4% (3-12 K) over desert and 1-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5-2% (2-6 K). In particular, at 85.0/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.
NASA Astrophysics Data System (ADS)
Tuozzolo, S.; Frasson, R. P. M.; Durand, M. T.
2017-12-01
We analyze a multi-temporal dataset of in-situ and airborne water surface measurements from the March 2015 AirSWOT field campaign on the Willamette River in Western Oregon, which included six days of AirSWOT flights over a 75km stretch of the river. We examine systematic errors associated with dark water and layover effects in the AirSWOT dataset, and test the efficacies of different filtering and spatial averaging techniques at reconstructing the water surface profile. Finally, we generate a spatially-averaged time-series of water surface elevation and water surface slope. These AirSWOT-derived reach-averaged values are ingested in a prospective SWOT discharge algorithm to assess its performance on SWOT-like data collected from a borderline SWOT-measurable river (mean width = 90m).
In Situ Global Sea Surface Salinity and Variability from the NCEI Global Thermosalinograph Database
NASA Astrophysics Data System (ADS)
Wang, Z.; Boyer, T.; Zhang, H. M.
2017-12-01
Sea surface salinity (SSS) plays an important role in global ocean circulation. Variations in sea surface salinity are key indicators of changes in air-sea water fluxes. Using nearly 30 years of in situ measurements of sea surface salinity from thermosalinographs, we will evaluate the variations of sea surface salinity in the global ocean. The sea surface salinity data used are from our newly developed NCEI Global Thermosalinograph Database (NCEI-TSG). This database provides a comprehensive set of quality-controlled in situ sea surface salinity and temperature measurements collected from over 340 vessels during the period 1989 to the present. The NCEI-TSG is the world's most complete TSG dataset, containing all data from the different TSG data assembly centers, e.g. COAPS (SAMOS), IODE (GOSUD) and AOML, with more historical data from NCEI's archive to be added. Using this unique dataset, we will investigate the spatial variations of global SSS and its variability. Annual and interannual variability will also be studied at selected regions.
Metal surface corrosion grade estimation from single image
NASA Astrophysics Data System (ADS)
Chen, Yijun; Qi, Lin; Sun, Huyuan; Fan, Hao; Dong, Junyu
2018-04-01
Metal corrosion can cause many problems; how to quickly and effectively assess the grade of metal corrosion and remediate it in a timely manner is an important issue. Typically, this is done by trained surveyors at great cost. Assisting them in the inspection process with computer vision and artificial intelligence would decrease the inspection cost. In this paper, we propose a dataset of metal surface corrosion for computer vision detection and present a comparison, on this dataset, between standard computer vision techniques using OpenCV and a deep learning method for automatic metal surface corrosion grade estimation from a single image. The test was performed by classifying images and calculating the accuracy of the two different approaches.
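The "standard computer vision" baseline can be sketched as hand-crafted colour-histogram features fed to a conventional classifier; the deep learning branch is not shown. The file names, grade labels and histogram size below are illustrative assumptions, and this is not the paper's pipeline.

    # Hedged sketch: OpenCV colour histograms + SVM for corrosion grade classification.
    import cv2
    import numpy as np
    from sklearn.svm import SVC

    def colour_hist(path, bins=16):
        img = cv2.imread(path)                               # BGR image (assumed to exist)
        hist = cv2.calcHist([img], [0, 1, 2], None, [bins] * 3,
                            [0, 256, 0, 256, 0, 256])
        return cv2.normalize(hist, hist).flatten()

    train_paths = ["grade_a_01.jpg", "grade_b_01.jpg"]       # assumed example files
    train_labels = ["A", "B"]                                # assumed grade labels
    X = np.array([colour_hist(p) for p in train_paths])
    clf = SVC(kernel="rbf").fit(X, train_labels)
    grade = clf.predict(colour_hist("sample.jpg").reshape(1, -1))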
New DTM Extraction Approach from Airborne Images Derived Dsm
NASA Astrophysics Data System (ADS)
Mousa, Y. A.; Helmholz, P.; Belton, D.
2017-05-01
In this work, a new filtering approach is proposed for fully automatic Digital Terrain Model (DTM) extraction from Digital Surface Models (DSMs) derived from very high resolution airborne images. Our approach is an enhancement of the existing DTM extraction algorithm Multi-directional and Slope Dependent (MSD), proposing parameters that are more reliable for the selection of ground pixels and the pixelwise classification. To achieve this, four main steps are implemented: Firstly, 8 well-distributed scanlines are used to search for minima as ground points within a pre-defined filtering window. The selected ground points are stored with their positions on a 2D surface to create a network of ground points. Then, an initial DTM is created using an interpolation method to fill the gaps in the 2D surface. Afterwards, a pixel-to-pixel comparison between the initial DTM and the original DSM is performed, classifying ground and non-ground pixels by applying a vertical height threshold. Finally, the pixels classified as non-ground are removed and the remaining holes are filled. The approach is evaluated using the Vaihingen benchmark dataset provided by the ISPRS working group III/4. The evaluation includes the comparison of our approach, denoted as the Network of Ground Points (NGPs) algorithm, with the DTM created based on MSD as well as with a reference DTM generated from LiDAR data. The results show that our proposed approach outperforms the MSD approach.
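The pixelwise classification step can be sketched as a comparison of the original DSM with the interpolated initial DTM against a vertical height threshold; the threshold value below is an illustrative assumption, not the paper's parameter.

    # Hedged sketch: ground/non-ground classification and hole creation for the DTM.
    import numpy as np

    def classify_ground(dsm, initial_dtm, height_threshold=0.5):
        return (dsm - initial_dtm) <= height_threshold       # True for ground pixels

    def build_dtm(dsm, initial_dtm, height_threshold=0.5):
        ground = classify_ground(dsm, initial_dtm, height_threshold)
        return np.where(ground, dsm, np.nan)                 # non-ground left as holes to fill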
The international surface temperature initiative
NASA Astrophysics Data System (ADS)
Thorne, P. W.; Lawrimore, J. H.; Willett, K. M.; Allan, R.; Chandler, R. E.; Mhanda, A.; de Podesta, M.; Possolo, A.; Revadekar, J.; Rusticucci, M.; Stott, P. A.; Strouse, G. F.; Trewin, B.; Wang, X. L.; Yatagai, A.; Merchant, C.; Merlone, A.; Peterson, T. C.; Scott, E. M.
2013-09-01
The aim of International Surface Temperature Initiative is to create an end-to-end process for analysis of air temperature data taken over the land surface of the Earth. The foundation of any analysis is the source data. Land surface air temperature records have traditionally been stored in local, organizational, national and international holdings, some of which have been available digitally but many of which are available solely on paper or as imaged files. Further, economic and geopolitical realities have often precluded open sharing of these data. The necessary first step therefore is to collate readily available holdings and augment these over time either through gaining access to previously unavailable digital data or through data rescue and digitization activities. Next, it must be recognized that these historical measurements were made primarily in support of real-time weather applications where timeliness and coverage are key. At almost every long-term station it is virtually certain that changes in instrumentation, siting or observing practices have occurred. Because none of the historical measures were made in a metrologically traceable manner there is no unambiguous way to retrieve the true climate evolution from the heterogeneous raw data holdings. Therefore it is desirable for multiple independent groups to produce adjusted data sets (so-called homogenized data) to adequately understand the data characteristics and estimate uncertainties. Then it is necessary to benchmark the performance of the contributed algorithms (equivalent to metrological software validation) through development of realistic benchmark datasets. In support of this, a series of successive benchmarking and assessment cycles are envisaged, allowing continual improvement while avoiding over-tuning of algorithms. Finally, a portal is proposed giving access to related data-products, utilizing the assessment results to provide guidance to end-users on which product is the most suited to their needs. Recognizing that the expertise of the metrological community has been under-utilized historically in such climate data analysis problems, the governance of the Initiative includes significant representation from the metrological community. We actively welcome contributions from interested parties to any relevant aspects of the Initiative work.
Comparing MODIS C6 'Deep Blue' and 'Dark Target' Aerosol Data
NASA Technical Reports Server (NTRS)
Hsu, N. C.; Sayer, A. M.; Bettenhausen, C.; Lee, J.; Levy, R. C.; Mattoo, S.; Munchak, L. A.; Kleidman, R.
2014-01-01
The MODIS Collection 6 Atmospheres product suite includes refined versions of both the 'Deep Blue' (DB) and 'Dark Target' (DT) aerosol algorithms, with the DB dataset now expanded to include coverage over vegetated land surfaces. This means that, over much of the global land surface, users will have both DB and DT data to choose from. A 'merged' dataset is also provided, primarily for visualization purposes, which takes retrievals from either or both algorithms based on regional and seasonal climatologies of the normalized difference vegetation index (NDVI). This poster presents some comparisons of these two C6 aerosol algorithms, focusing on AOD at 550 nm derived from MODIS Aqua measurements, with each other and with Aerosol Robotic Network (AERONET) data, with the intent of facilitating user decisions about the suitability of the two datasets for their desired applications.
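An NDVI-based merge rule of the kind described can be sketched as below. The operational C6 merged product uses regional and seasonal NDVI climatologies, and the thresholds here are assumptions for illustration only: bright (low-NDVI) surfaces take Deep Blue, vegetated (high-NDVI) surfaces take Dark Target, and intermediate cases are averaged.

    # Hedged sketch: selecting between DB and DT AOD retrievals by NDVI.
    import numpy as np

    def merge_aod(aod_db, aod_dt, ndvi, lo=0.2, hi=0.3):
        between = np.nanmean(np.stack([aod_db, aod_dt]), axis=0)
        return np.where(ndvi < lo, aod_db, np.where(ndvi > hi, aod_dt, between))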
NASA Astrophysics Data System (ADS)
Norton, P. A., II; Haj, A. E., Jr.
2014-12-01
The United States Geological Survey is currently developing a National Hydrologic Model (NHM) to support and facilitate coordinated and consistent hydrologic modeling efforts at the scale of the continental United States. As part of this effort, the Geospatial Fabric (GF) for the NHM was created. The GF is a database that contains parameters derived from datasets that characterize the physical features of watersheds. The GF was used to aggregate catchments and flowlines defined in the National Hydrography Dataset Plus dataset for more than 100,000 hydrologic response units (HRUs), and to establish initial parameter values for input to the Precipitation-Runoff Modeling System (PRMS). Many parameter values are adjusted in PRMS using an automated calibration process. Using these adjusted parameter values, the PRMS model estimated variables such as evapotranspiration (ET), potential evapotranspiration (PET), snow-covered area (SCA), and snow water equivalent (SWE). In order to evaluate the effectiveness of parameter calibration, and model performance in general, several satellite-based Moderate Resolution Imaging Spectroradiometer (MODIS) and Snow Data Assimilation System (SNODAS) gridded datasets including ET, PET, SCA, and SWE were compared to PRMS-simulated values. The MODIS and SNODAS data were spatially averaged for each HRU, and compared to PRMS-simulated ET, PET, SCA, and SWE values for each HRU in the Upper Missouri River watershed. Default initial GF parameter values and PRMS calibration ranges were evaluated. Evaluation results, and the use of MODIS and SNODAS datasets to update GF parameter values and PRMS calibration ranges, are presented and discussed.
This draft Geographic Information System (GIS) tool can be used to generate scenarios of housing-density changes and calculate impervious surface cover for the conterminous United States. A draft User’s Guide accompanies the tool. This product distributes the population project...
Exposing earth surface process model simulations to a large audience
NASA Astrophysics Data System (ADS)
Overeem, I.; Kettner, A. J.; Borkowski, L.; Russell, E. L.; Peddicord, H.
2015-12-01
The Community Surface Dynamics Modeling System (CSDMS) represents a diverse group of >1300 scientists who develop and apply numerical models to better understand the Earth's surface. CSDMS has a mandate to make the public more aware of model capabilities and therefore started sharing state-of-the-art surface process modeling results with large audiences. One platform to reach audiences outside the science community is through museum displays on 'Science on a Sphere' (SOS). Developed by NOAA, SOS is a giant globe, linked with computers and multiple projectors, that can display data and animations on a sphere. CSDMS has developed and contributed model simulation datasets for the SOS system since 2014, including hydrological processes, coastal processes, and human interactions with the environment. Model simulations of a hydrological and sediment transport model (WBM-SED) illustrate global river discharge patterns. WAVEWATCH III simulations have been specifically processed to show the impacts of hurricanes on ocean waves, with a focus on Hurricane Katrina and Superstorm Sandy. A large world dataset of dams built over the last two centuries gives an impression of the profound influence of humans on water management. Given the exposure of SOS, CSDMS aims to contribute at least 2 model datasets a year, and will soon provide displays of global river sediment fluxes and changes in the sea-ice-free season along the Arctic coast. Over 100 facilities worldwide show these numerical model displays to an estimated 33 million people every year. Dataset storyboards and teacher follow-up materials associated with the simulations are developed to address common core science K-12 standards. CSDMS dataset documentation aims to make people aware of the fact that they look at numerical model results, that underlying models have inherent assumptions and simplifications, and that limitations are known. CSDMS contributions aim to familiarize large audiences with the use of numerical modeling as a tool to create understanding of environmental processes.
Evaluation of Greenland near surface air temperature datasets
Reeves Eyre, J. E. Jack; Zeng, Xubin
2017-07-05
Near-surface air temperature (SAT) over Greenland has important effects on mass balance of the ice sheet, but it is unclear which SAT datasets are reliable in the region. Here extensive in situ SAT measurements (∼1400 station-years) are used to assess monthly mean SAT from seven global reanalysis datasets, five gridded SAT analyses, one satellite retrieval and three dynamically downscaled reanalyses. Strengths and weaknesses of these products are identified, and their biases are found to vary by season and glaciological regime. MERRA2 reanalysis overall performs best with mean absolute error less than 2 °C in all months. Ice sheet-average annual mean SAT from different datasets are highly correlated in recent decades, but their 1901–2000 trends differ even in sign. Compared with the MERRA2 climatology combined with gridded SAT analysis anomalies, thirty-one earth system model historical runs from the CMIP5 archive reach ∼5 °C for the 1901–2000 average bias and have opposite trends for a number of sub-periods.
Do pre-trained deep learning models improve computer-aided classification of digital mammograms?
NASA Astrophysics Data System (ADS)
Aboutalib, Sarah S.; Mohamed, Aly A.; Zuley, Margarita L.; Berg, Wendie A.; Luo, Yahong; Wu, Shandong
2018-02-01
Digital mammography screening is an important exam for the early detection of breast cancer and reduction in mortality. False positives leading to high recall rates, however, result in unnecessary negative consequences for patients and health care systems. In order to better aid radiologists, computer-aided tools can be utilized to improve the distinction between image classes and thus potentially reduce false recalls. The emergence of deep learning has shown promising results in the area of biomedical imaging data analysis. This study aimed to investigate deep learning and transfer learning methods that can improve digital mammography classification performance. In particular, we evaluated the effect of pre-training deep learning models with other imaging datasets in order to boost classification performance on a digital mammography dataset. Two types of datasets were used for pre-training: (1) a digitized film mammography dataset, and (2) a very large non-medical imaging dataset. By using either of these datasets to pre-train the network initially, and then fine-tuning with the digital mammography dataset, we found an increase in overall classification performance in comparison to a model without pre-training, with the very large non-medical dataset performing the best in improving the classification accuracy.
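A minimal sketch of the pre-train/fine-tune workflow described above, assuming a torchvision ResNet-50 backbone pre-trained on ImageNet as the "very large non-medical dataset"; the study's actual network architecture, datasets, and hyperparameters are not specified here.

```python
# Minimal sketch of the pre-train/fine-tune idea, assuming a ResNet-50
# backbone pre-trained on a large non-medical collection (ImageNet) via
# torchvision; architecture and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models

# Start from weights learned on a large non-medical image collection.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)

# Replace the classification head for the mammography task
# (e.g. recalled vs. not recalled), then fine-tune on digital mammograms.
num_classes = 2
model.fc = nn.Linear(model.fc.in_features, num_classes)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def fine_tune_step(images, labels):
    """One fine-tuning step on a batch of digital mammography images."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```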
NASA Astrophysics Data System (ADS)
Hubbard, Susan S.; Ajo-Franklin, Jonathan B.; Dafflon, Baptiste; Dou, Shan; Kneafsey, Tim J.; Peterson, John E.; Tas, Neslihan; Torn, Margaret S.; Phuong Tran, Anh; Ulrich, Craig; Wainwright, Haruko; Wu, Yuxin; Wullschleger, Stan
2015-04-01
Although accurate prediction of ecosystem feedbacks to climate requires characterization of the properties that influence terrestrial carbon cycling, performing such characterization is challenging due to the disparity of scales involved. This is particularly true in vulnerable Arctic ecosystems, where microbial activities leading to the production of greenhouse gases are a function of small-scale hydrological, geochemical, and thermal conditions influenced by geomorphology and seasonal dynamics. As part of the DOE Next-Generation Ecosystem Experiment (NGEE-Arctic), we are advancing two approaches to improve the characterization of complex Arctic ecosystems, with an initial application to an ice-wedge polygon dominated tundra site near Barrow, AK, USA. The first advance focuses on developing a new strategy to jointly monitor above- and below-ground properties critical for carbon cycling in the tundra. The strategy includes co-characterization of properties within the three critical ecosystem compartments: land surface (vegetation, water inundation, snow thickness, and geomorphology); active layer (peat thickness, soil moisture, soil texture, hydraulic conductivity, soil temperature, and geochemistry); and permafrost (mineral soil and ice content, nature, and distribution). Using a nested sampling strategy, a wide range of measurements have been collected at the study site over the past three years, including: above-ground imagery (LiDAR, visible, near infrared, NDVI) from various platforms, surface geophysical datasets (electrical, electromagnetic, ground penetrating radar, seismic), and point measurements (such as CO2 and methane fluxes, soil properties, microbial community composition). A subset of the coincident datasets is autonomously collected daily. Laboratory experiments and new inversion approaches are used to improve interpretation of the field geophysical datasets in terms of ecosystem properties. The new strategy has significantly advanced our ability to characterize and monitor ecosystem functioning - within and across permafrost, active layer and land-surface compartments and as a function of geomorphology and seasonal dynamics (thaw, growing season, freeze-up, and winter seasons). The second advance uses statistical approaches applied to these rich datasets to identify Arctic functional zones. Functional zones are regions in the landscape that have unique assemblages of above- and below-ground properties relevant to ecosystem functioning. Results demonstrate the strong co-variation of above- and below-ground properties in this Arctic ecosystem, particularly highlighting the critical influence of soil moisture on vegetation dynamics and redox-based active-layer biogeochemistry important for carbon cycling. The results also indicate that polygon types (low centered, high centered) have more power to explain the variations in properties than polygon features (trough, rim, center). This finding allows delineation of functional zones through grouping contiguous, similar types of polygons using remote sensing and surface geophysical datasets. Applied to the tundra NGEE study site, the functional zone approach permitted aggregation of critical properties associated with ~1350 polygons and their individual features, which vary over centimeter-to-meter length scales, into a few functional zones having suites of co-varying properties that were tractably defined over ~hundred meter length scales.
The developed above-and-below ground monitoring strategy and functional zone approach are proving to be extremely valuable for gaining new insights about a complex Arctic ecosystem and for characterizing the system properties at high resolution and yet with spatial extents relevant for informing models focused on simulating ecosystem-climate feedbacks.
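As a hedged illustration of the functional-zone grouping described above, the sketch below clusters per-polygon summaries of co-varying above- and below-ground properties; k-means is a stand-in for the study's statistical approach, and the synthetic feature set is purely illustrative.

```python
# Hedged sketch: grouping polygon-scale measurements into a few "functional
# zones" by clustering co-varying above- and below-ground properties.
# KMeans is a stand-in; the study's actual statistical method and the feature
# names below (soil_moisture, ndvi, ...) are illustrative assumptions.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_polygons = 1350
# Columns: soil_moisture, ndvi, thaw_depth, peat_thickness (synthetic values)
features = rng.normal(size=(n_polygons, 4))

X = StandardScaler().fit_transform(features)
zones = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
# 'zones' assigns each polygon to one of a few functional zones whose
# property suites can then be summarized at ~hundred-meter scales.
```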
NASA Astrophysics Data System (ADS)
Wheatland, Jonathan; Bushby, Andy; Droppo, Ian; Carr, Simon; Spencer, Kate
2015-04-01
Suspended estuarine sediments form flocs that are compositionally complex, fragile and irregularly shaped. The fate and transport of suspended particulate matter (SPM) is determined by the size, shape, density, porosity and stability of these flocs, and prediction of SPM transport requires accurate measurements of these three-dimensional (3D) physical properties. However, the multi-scaled nature of flocs in addition to their fragility makes their characterisation in 3D problematic. Correlative microscopy is a strategy involving the spatial registration of information collected at different scales using several imaging modalities. Previously, conventional optical microscopy (COM) and transmission electron microscopy (TEM) have enabled 2-dimensional (2D) floc characterisation at the gross (> 1 µm) and sub-micron scales, respectively. Whilst this has proven insightful, there remains a critical spatial and dimensional gap preventing the accurate measurement of geometric properties and an understanding of how structures at different scales are related. Within the life sciences, volumetric imaging techniques such as 3D micro-computed tomography (3D µCT) and focused ion beam scanning electron microscopy [FIB-SEM (or FIB-tomography)] have been combined to characterise materials at the centimetre to micron scale. Combining these techniques with TEM enables an advanced correlative study, allowing material properties across multiple spatial and dimensional scales to be visualised. The aims of this study are: 1) to formulate an advanced correlative imaging strategy combining 3D µCT, FIB-tomography and TEM; 2) to acquire 3D datasets; 3) to produce a model allowing their co-visualisation; 4) to interpret 3D floc structure. To reduce the chance of structural alterations during analysis, samples were first 'fixed' in 2.5% glutaraldehyde/2% formaldehyde before being embedded in Durcupan resin. Intermediate steps were implemented to improve contrast and remove pore water, achieved by the addition of heavy metal stains and washing samples in a series of ethanol solutions and acetone. Gross-scale characterisation involved scanning samples using a Nikon Metrology HM X 225 µCT. For micro-scale analysis, a working surface was revealed by microtoming the sample. Ultrathin sections were then collected and analysed using a JEOL 1200 Ex II TEM, and FIB-tomography datasets obtained using an FEI Quanta 3D FIB-SEM. Finally, to locate the surface and relate TEM and FIB-tomography datasets to the original floc, samples were rescanned using the µCT. Image processing was initially conducted in ImageJ. Following this, datasets were imported into Amira 5.5, where pixel intensity thresholding allowed particle-matrix boundaries to be defined. Using 'landmarks', datasets were then registered to enable their co-visualisation in 3D models. Analysis of registered datasets reveals the complex non-fractal nature of flocs, whose properties span several orders of magnitude. Primary particles are organised into discrete 'bundles', the arrangement of which directly influences their gross morphology. This strategy, which allows the co-visualisation of spatially registered multi-scale 3D datasets, provides unique insights into the true nature of flocs which would otherwise have been impossible.
NASA Astrophysics Data System (ADS)
Abul Ehsan Bhuiyan, Md; Nikolopoulos, Efthymios I.; Anagnostou, Emmanouil N.; Quintana-Seguí, Pere; Barella-Ortiz, Anaïs
2018-02-01
This study investigates the use of a nonparametric, tree-based model, quantile regression forests (QRF), for combining multiple global precipitation datasets and characterizing the uncertainty of the combined product. We used the Iberian Peninsula as the study area, with a study period spanning 11 years (2000-2010). Inputs to the QRF model included three satellite precipitation products, CMORPH, PERSIANN, and 3B42 (V7); an atmospheric reanalysis precipitation and air temperature dataset; satellite-derived near-surface daily soil moisture data; and a terrain elevation dataset. We calibrated the QRF model for two seasons and two terrain elevation categories and used it to generate precipitation ensembles for these conditions. Evaluation of the combined product was based on a high-resolution, ground-reference precipitation dataset (SAFRAN) available at 5 km, 1 h resolution. Furthermore, to evaluate relative improvements and the overall impact of the combined product in hydrological response, we used the generated ensemble to force a distributed hydrological model (the SURFEX land surface model and the RAPID river routing scheme) and compared its streamflow simulation results with the corresponding simulations from the individual global precipitation and reference datasets. We concluded that the proposed technique could generate realizations that successfully encapsulate the reference precipitation and provide significant improvement in streamflow simulations, with reductions in systematic and random error on the order of 20-99% and 44-88%, respectively, when considering the ensemble mean.
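A hedged sketch of the dataset-merging step, approximating quantile regression forests with per-tree predictions from scikit-learn's random forest; a true QRF retains all leaf observations, so this is a simplification, and the predictor layout is an assumption.

```python
# Hedged sketch of the merging idea: a random forest is trained on predictors
# (satellite rain rates, reanalysis precipitation/temperature, soil moisture,
# elevation) against reference precipitation, and ensemble spread is
# approximated from per-tree predictions. True quantile regression forests
# keep all leaf observations; this per-tree approximation is a simplification,
# and the column layout of X is an assumption.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def fit_merge_model(X_train, y_train):
    rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=10,
                               random_state=0)
    rf.fit(X_train, y_train)
    return rf

def predict_quantiles(rf, X, quantiles=(0.05, 0.5, 0.95)):
    # Collect one prediction per tree, then take empirical quantiles.
    per_tree = np.stack([tree.predict(X) for tree in rf.estimators_], axis=0)
    return {q: np.quantile(per_tree, q, axis=0) for q in quantiles}
```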
Antony, Bhavna; Abràmoff, Michael D.; Tang, Li; Ramdas, Wishal D.; Vingerling, Johannes R.; Jansonius, Nomdo M.; Lee, Kyungmoo; Kwon, Young H.; Sonka, Milan; Garvin, Mona K.
2011-01-01
The 3-D spectral-domain optical coherence tomography (SD-OCT) images of the retina often do not reflect the true shape of the retina and are distorted differently along the x and y axes. In this paper, we propose a novel technique that uses thin-plate splines in two stages to estimate and correct the distinct axial artifacts in SD-OCT images. The method was quantitatively validated using nine pairs of OCT scans obtained with orthogonal fast-scanning axes, where a segmented surface was compared after both datasets had been corrected. The mean unsigned difference computed between the locations of this artifact-corrected surface after the single-spline and dual-spline correction was 23.36 ± 4.04 μm and 5.94 ± 1.09 μm, respectively, and showed a significant difference (p < 0.001 from two-tailed paired t-test). The method was also validated using depth maps constructed from stereo fundus photographs of the optic nerve head, which were compared to the flattened top surface from the OCT datasets. Significant differences (p < 0.001) were noted between the artifact-corrected datasets and the original datasets, where the mean unsigned differences computed over 30 optic-nerve-head-centered scans (in normalized units) were 0.134 ± 0.035 and 0.302 ± 0.134, respectively. PMID:21833377
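A minimal sketch of the core operation, assuming SciPy's thin-plate-spline radial basis interpolator; the paper's two-stage dual-spline correction along orthogonal fast-scan axes is more involved than this single-spline pass.

```python
# Hedged sketch: fit a smooth thin-plate spline to a segmented reference
# surface z(x, y) from the OCT volume and subtract it to flatten the axial
# artifact. The paper uses a two-stage (dual-spline) scheme per fast-scan
# axis; this single-spline pass only illustrates the core operation.
import numpy as np
from scipy.interpolate import RBFInterpolator

def flatten_surface(xy, z, smoothing=1e3):
    """xy: (n, 2) A-scan positions; z: (n,) axial depth of a segmented surface."""
    tps = RBFInterpolator(xy, z, kernel='thin_plate_spline', smoothing=smoothing)
    trend = tps(xy)              # slowly varying axial distortion estimate
    return z - trend, tps        # corrected depths and the fitted spline
```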
GLEAM v3: updated land evaporation and root-zone soil moisture datasets
NASA Astrophysics Data System (ADS)
Martens, Brecht; Miralles, Diego; Lievens, Hans; van der Schalie, Robin; de Jeu, Richard; Fernández-Prieto, Diego; Verhoest, Niko
2016-04-01
Evaporation determines the availability of surface water resources and the requirements for irrigation. In addition, through its impacts on the water, carbon and energy budgets, evaporation influences the occurrence of rainfall and the dynamics of air temperature. Therefore, reliable estimates of this flux at regional to global scales are of major importance for water management and meteorological forecasting of extreme events. However, the global-scale magnitude and variability of the flux, and the sensitivity of the underlying physical process to changes in environmental factors, are still poorly understood due to the limited global coverage of in situ measurements. Remote sensing techniques can help to overcome the lack of ground data. However, evaporation is not directly observable from satellite systems. As a result, recent efforts have focussed on combining the observable drivers of evaporation within process-based models. The Global Land Evaporation Amsterdam Model (GLEAM, www.gleam.eu) estimates terrestrial evaporation based on daily satellite observations of meteorological drivers of terrestrial evaporation, vegetation characteristics and soil moisture. Since the publication of the first version of the model in 2011, GLEAM has been widely applied for the study of trends in the water cycle, interactions between land and atmosphere and hydrometeorological extreme events. A third version of the GLEAM global datasets will be available from the beginning of 2016 and will be distributed using www.gleam.eu as gateway. The updated datasets include separate estimates for the different components of the evaporative flux (i.e. transpiration, bare-soil evaporation, interception loss, open-water evaporation and snow sublimation), as well as variables like the evaporative stress, potential evaporation, root-zone soil moisture and surface soil moisture. A new dataset using SMOS-based input data of surface soil moisture and vegetation optical depth will also be distributed. The most important updates in GLEAM include the revision of the soil moisture data assimilation system, the evaporative stress functions and the infiltration of rainfall. In this presentation, we will highlight the changes of the methodology and present the new datasets, their validation against in situ observations and the comparisons against alternative datasets of terrestrial evaporation, such as GLDAS-Noah, ERA-Interim and previous GLEAM datasets. Preliminary results indicate that the magnitude and the spatio-temporal variability of the evaporation estimates have been slightly improved upon previous versions of the datasets.
NASA Technical Reports Server (NTRS)
Andrews, A.
2002-01-01
A detailed mechanistic understanding of the sources and sinks of CO2 will be required to reliably predict future CO2 levels and climate. A commonly used technique for deriving information about CO2 exchange with surface reservoirs is to solve an "inverse problem," where CO2 observations are used with an atmospheric transport model to find the optimal distribution of sources and sinks. Synthesis inversion methods are powerful tools for addressing this question, but the results are disturbingly sensitive to the details of the calculation. Studies done using different atmospheric transport models and combinations of surface station data have produced substantially different distributions of surface fluxes. Adjoint methods are now being developed that will more effectively incorporate diverse datasets in estimates of surface fluxes of CO2. In an adjoint framework, it will be possible to combine CO2 concentration data from long-term surface monitoring stations with data from intensive field campaigns and with proposed future satellite observations. A major advantage of the adjoint approach is that meteorological and surface data, as well as data for other atmospheric constituents and pollutants, can be efficiently included in addition to observations of CO2 mixing ratios. This presentation will provide an overview of potentially useful datasets for carbon cycle research in general with an emphasis on planning for the North American Carbon Project. Areas of overlap with ongoing and proposed work on air quality/air pollution issues will be highlighted.
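For reference, a hedged numerical sketch of the linear Bayesian synthesis inversion mentioned above; the matrix names (transport Jacobian H, prior covariance B, observation-error covariance R) follow common convention rather than any specific study.

```python
# Hedged sketch of a linear Bayesian "synthesis inversion": given a transport
# (Jacobian) matrix H mapping surface fluxes to CO2 mixing ratios,
# observations y, prior fluxes x_a with covariance B, and observation-error
# covariance R, the posterior flux estimate is the standard least-squares
# solution below. Names follow common convention, not any specific study.
import numpy as np

def synthesis_inversion(H, y, x_a, B, R):
    HT_Rinv = H.T @ np.linalg.inv(R)
    A = HT_Rinv @ H + np.linalg.inv(B)          # posterior precision matrix
    x_hat = x_a + np.linalg.solve(A, HT_Rinv @ (y - H @ x_a))
    P = np.linalg.inv(A)                        # posterior covariance
    return x_hat, P
```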
NASA Astrophysics Data System (ADS)
Zhang, Xu; Li, Yun; Chen, Xiang; Li, Guanglin; Zev Rymer, William; Zhou, Ping
2013-08-01
Objective. This study investigates the effect of the involuntary motor activity of paretic-spastic muscles on the classification of surface electromyography (EMG) signals. Approach. Two data collection sessions were designed for 8 stroke subjects to voluntarily perform 11 functional movements using their affected forearm and hand at relatively slow and fast speeds. For each stroke subject, the degree of involuntary motor activity present in the voluntary surface EMG recordings was qualitatively described from such slow and fast experimental protocols. Myoelectric pattern recognition analysis was performed using different combinations of voluntary surface EMG data recorded from the slow and fast sessions. Main results. Across all tested stroke subjects, our results revealed that when involuntary surface EMG is absent or present in both the training and testing datasets, high accuracies (>96%, >98%, respectively, averaged over all the subjects) can be achieved in the classification of different movements using surface EMG signals from paretic muscles. When involuntary surface EMG was solely involved in either the training or testing datasets, the classification accuracies were dramatically reduced (<89%, <85%, respectively). However, if both the training and testing datasets contained EMG signals with the presence and absence of involuntary EMG interference, high accuracies were still achieved (>97%). Significance. The findings of this study can be used to guide the appropriate design and implementation of myoelectric pattern recognition based systems or devices toward promoting robot-aided therapy for stroke rehabilitation.
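A hedged sketch of the myoelectric pattern-recognition step, using simple time-domain features and linear discriminant analysis; the study's actual feature set, window length, and classifier are not specified here.

```python
# Hedged sketch of myoelectric pattern recognition: extract simple
# time-domain features from windowed multi-channel surface EMG and classify
# with LDA. Feature choices and the classifier are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def td_features(window):
    """window: (n_samples, n_channels) EMG segment -> per-channel features."""
    mav = np.mean(np.abs(window), axis=0)                        # mean absolute value
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)         # waveform length
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)   # zero crossings
    return np.concatenate([mav, wl, zc])

def train_classifier(windows, labels):
    X = np.stack([td_features(w) for w in windows])
    return LinearDiscriminantAnalysis().fit(X, labels)
```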
The evolution of rotating very massive stars with LMC composition
NASA Astrophysics Data System (ADS)
Köhler, K.; Langer, N.; de Koter, A.; de Mink, S. E.; Crowther, P. A.; Evans, C. J.; Gräfener, G.; Sana, H.; Sanyal, D.; Schneider, F. R. N.; Vink, J. S.
2015-01-01
Context. With growing evidence for the existence of very massive stars at subsolar metallicity, there is an increased need for corresponding stellar evolution models. Aims: We present a dense model grid with a tailored input chemical composition appropriate for the Large Magellanic Cloud (LMC). Methods: We use a one-dimensional hydrodynamic stellar evolution code, which accounts for rotation, transport of angular momentum by magnetic fields, and stellar wind mass loss to compute our detailed models. We calculate stellar evolution models with initial masses from 70 to 500 M⊙ and with initial surface rotational velocities from 0 to 550 km s-1, covering the core-hydrogen burning phase of evolution. Results: We find our rapid rotators to be strongly influenced by rotationally induced mixing of helium, with quasi-chemically homogeneous evolution occurring for the fastest rotating models. Above 160 M⊙, homogeneous evolution is also established through mass loss, producing pure helium stars at core hydrogen exhaustion independent of the initial rotation rate. Surface nitrogen enrichment is also found for slower rotators, even for stars that lose only a small fraction of their initial mass. For models above ~150 M⊙ at zero age, and for models in the whole considered mass range later on, we find a considerable envelope inflation due to the proximity of these models to their Eddington limit. This leads to a maximum ZAMS surface temperature of ~56 000 K, at ~180 M⊙, and to an evolution of stars in the mass range 50 M⊙...100 M⊙ to the regime of luminous blue variables in the Hertzsprung-Russell diagram with high internal Eddington factors. Inflation also leads to decreasing surface temperatures during the chemically homogeneous evolution of stars above ~180 M⊙. Conclusions: The cool surface temperatures due to the envelope inflation in our models lead to an enhanced mass loss, which prevents stars at LMC metallicity from evolving into pair-instability supernovae. The corresponding spin-down will also prevent very massive LMC stars from producing long-duration gamma-ray bursts, which might, however, originate from lower masses. The dataset of the presented stellar evolution models is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/573/A71. Appendices are available in electronic form at http://www.aanda.org
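For reference, the "Eddington limit" and "Eddington factors" referred to above follow the standard definition below, where κ is the opacity and L and M are the stellar luminosity and mass.

```latex
% Standard (textbook) definition of the Eddington factor referred to in the text.
\Gamma_{\mathrm{Edd}} \;=\; \frac{\kappa L}{4\pi c\, G M},
\qquad \Gamma_{\mathrm{Edd}} \rightarrow 1 \;\Rightarrow\; \text{proximity to the Eddington limit and envelope inflation}
```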
NASA Technical Reports Server (NTRS)
Lang, Timothy; Mecikalski, John; Li, Xuanli; Chronis, Themis; Brewer, Alan; Churnside, James; Rutledge, Steve
2014-01-01
CYGNSS is a planned constellation consisting of multiple micro-satellites that leverage the Global Positioning System (GPS) to provide rapidly updated, high resolution (approx. 15-50 km, approx. 4 h) surface wind speeds (via bi-static scatterometry) over the tropical oceans in any weather condition, including heavy rainfall. The approach of the work to be presented at this conference is to utilize a limited-domain, cloud-system resolving model (Weather Research and Forecasting or WRF) and its attendant data assimilation scheme (Three-Dimensional Variational Assimilation or 3DVAR) to investigate the utility of the CYGNSS mission for helping characterize key convective- to mesoscale processes - such as surface evaporation, moisture advection and convergence, and upscale development of precipitation systems - that help drive the initiation and development of the Madden-Julian Oscillation (MJO) in the equatorial Indian Ocean. The proposed work will focus on three scientific objectives. Objective 1 is to produce a high-resolution surface wind dataset (approx. 0.5 h, approx. 1-4 km) for multiple MJO onsets using WRF-assimilated winds and other data from the DYNAmics of the MJO (DYNAMO) field campaign, which took place during October 2011 - March 2012. Objective 2 is to study the variability of surface winds during MJO onsets at temporal and spatial scales of finer resolution than future CYGNSS data. The goal is to understand how sub-CYGNSS-resolution processes will shape the observations made by the satellite constellation. Objective 3 is to ingest simulated CYGNSS data into the WRF model in order to perform observing system simulation experiments (OSSEs). These will be used to test and quantify the potential beneficial effects provided by CYGNSS, particularly for characterizing the physical processes driving convective organization and upscale development during the initiation and development of the MJO. The proposed research is ideal for answering important questions about the CYGNSS mission, such as the representativeness of surface wind retrievals in the context of the complex airflow processes that occur during heavy precipitation, as well as the tradeoffs in retrieval accuracy that result from finer spatial resolution of the CYGNSS winds versus increased errors/noisiness in those data. Research plans and initial progress toward these objectives will be presented.
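For context, the 3DVAR scheme mentioned above minimizes the standard variational cost function below, where x_b is the background state, B and R are the background- and observation-error covariances, H is the observation operator, and y are the (real or simulated CYGNSS) observations.

```latex
% Standard 3DVAR cost function minimized by the assimilation step described above.
J(\mathbf{x}) \;=\; \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
\;+\; \tfrac{1}{2}\,\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(\mathbf{y}-H(\mathbf{x})\bigr)
```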
Implementing DOIs for Oceanographic Satellite Data at PO.DAAC
NASA Astrophysics Data System (ADS)
Hausman, J.; Tauer, E.; Chung, N.; Chen, C.; Moroni, D. F.
2013-12-01
The Physical Oceanographic Distributed Active Archive Center (PO.DAAC) is NASA's archive for physical oceanographic satellite data. It distributes over 500 datasets from gravity, ocean wind, sea surface topography, sea ice, ocean currents, salinity, and sea surface temperature satellite missions. A dataset is a collection of granules/files that share the same mission/project, versioning, processing level, spatial, and temporal characteristics. The large number of datasets is partially due to the number of satellite missions, but mostly because a single satellite mission typically has multiple versions or even temporal and spatial resolutions of data. As a result, a user might mistake one dataset for a different dataset from the same satellite mission. Due to PO.DAAC's vast variety and volume of data and growing requirements to report dataset usage, it has begun implementing DOIs for the datasets it archives and distributes. However, this was not as simple as registering a name for a DOI and providing a URL. Before implementing DOIs, multiple questions needed to be answered. What are the sponsor and end-user expectations regarding DOIs? At what level does a DOI get assigned (dataset, file/granule)? Do all data get a DOI, or only selected data? How do we create a DOI? How do we create landing pages and manage them? What changes need to be made to the data archive, life cycle policy and web portal to accommodate DOIs? What if the data also exists at another archive and a DOI already exists? How is a DOI included if the data were obtained via a subsetting tool? How does a researcher or author provide a unique, definitive reference (standard citation) for a given dataset? This presentation will discuss how these questions were answered through changes in policy, process, and system design. Implementing DOIs is not a trivial undertaking, but as DOIs are rapidly becoming the de facto approach, it is worth the effort. Researchers have historically referenced the source satellite and data center (or archive), but scientific writings do not typically provide enough detail to point to a singular, uniquely identifiable dataset. DOIs provide the means to help researchers be precise in their data citations and provide needed clarity, standardization and permanence.
A polymer dataset for accelerated property prediction and design.
Huan, Tran Doan; Mannodi-Kanakkithodi, Arun; Kim, Chiho; Sharma, Vinit; Pilania, Ghanshyam; Ramprasad, Rampi
2016-03-01
Emerging computation- and data-driven approaches are particularly useful for rationally designing materials with targeted properties. Generally, these approaches rely on identifying structure-property relationships by learning from a sufficiently large dataset of relevant materials. The learned information can then be used to predict the properties of materials not already in the dataset, thus accelerating the materials design. Herein, we develop a dataset of 1,073 polymers and related materials and make it available at http://khazana.uconn.edu/. This dataset is uniformly prepared using first-principles calculations with structures obtained either from other sources or by using structure search methods. Because the immediate target of this work is to assist the design of high dielectric constant polymers, it is initially designed to include the optimized structures, atomization energies, band gaps, and dielectric constants. It will be progressively expanded by accumulating new materials and including additional properties calculated for the optimized structures provided.
Exploring Relationships in Big Data
NASA Astrophysics Data System (ADS)
Mahabal, A.; Djorgovski, S. G.; Crichton, D. J.; Cinquini, L.; Kelly, S.; Colbert, M. A.; Kincaid, H.
2015-12-01
Big Data are characterized by several different 'V's: Volume, Veracity, Volatility, Value, and so on. For many datasets, Volumes inflated by redundant features often make the data noisier and the Value harder to extract. This is especially true if one is comparing/combining different datasets and the metadata are diverse. We have been exploring ways to exploit such datasets through a variety of statistical machinery and visualization. We show how we have applied it to time-series from large astronomical sky-surveys. This was done in the Virtual Observatory framework. More recently we have been doing similar work in a completely different domain, viz. biology/cancer. The methodology reuse involves application to diverse datasets gathered through the various centers associated with the Early Detection Research Network (EDRN) for cancer, an initiative of the National Cancer Institute (NCI). Application to Geo datasets is a natural extension.
Hard exudates segmentation based on learned initial seeds and iterative graph cut.
Kusakunniran, Worapan; Wu, Qiang; Ritthipravat, Panrasee; Zhang, Jian
2018-05-01
(Background and Objective): The occurrence of hard exudates is one of the early signs of diabetic retinopathy, which is one of the leading causes of blindness. Many patients with diabetic retinopathy lose their vision because of the late detection of the disease. Thus, this paper proposes a novel method for automatic hard exudates segmentation in retinal images. (Methods): The existing methods are based on either supervised or unsupervised learning techniques. In addition, the learned segmentation models may often cause missed detections and/or false detections of hard exudates, due to the lack of rich characteristics, the intra-variations, and the similarity with other components in the retinal image. Thus, in this paper, supervised learning based on the multilayer perceptron (MLP) is used only to identify initial seeds with high confidence of being hard exudates. Then, the segmentation is finalized by unsupervised learning based on the iterative graph cut (GC) using clusters of initial seeds. Also, in order to reduce color intra-variations of hard exudates in different retinal images, color transfer (CT) is applied to normalize their color information in the pre-processing step. (Results): The experiments and comparisons with the other existing methods are based on two well-known datasets, e_ophtha EX and DIARETDB1. The proposed method outperforms the other existing methods in the literature, with a pixel-level sensitivity of 0.891 for the DIARETDB1 dataset and 0.564 for the e_ophtha EX dataset. Cross-dataset validation, where the training process is performed on one dataset and the testing process is performed on another, is also evaluated in this paper in order to illustrate the robustness of the proposed method. (Conclusions): This newly proposed method integrates supervised and unsupervised learning based techniques. It achieves improved performance when compared with the existing methods in the literature. The robustness of the proposed method in the cross-dataset scenario could enhance its practical usage. That is, the trained model could be more practical for unseen data in real-world situations, especially when the capturing environments of training and testing images are not the same. Copyright © 2018 Elsevier B.V. All rights reserved.
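A hedged sketch of the seeds-then-graph-cut idea, with OpenCV's GrabCut standing in for the paper's iterative graph cut; the probability thresholds and the assumption that an MLP probability map is available are illustrative.

```python
# Hedged sketch using OpenCV's GrabCut as a stand-in for the iterative graph
# cut: pixels whose MLP probability of being hard exudate is very high become
# definite foreground seeds, very low probabilities become definite
# background, and the rest is left for the graph cut to resolve.
import numpy as np
import cv2

def segment_exudates(image_bgr, prob_map, hi=0.9, lo=0.1, iters=5):
    mask = np.full(prob_map.shape, cv2.GC_PR_BGD, dtype=np.uint8)
    mask[prob_map >= hi] = cv2.GC_FGD   # confident exudate seeds from the MLP
    mask[prob_map <= lo] = cv2.GC_BGD   # confident background
    bgd, fgd = np.zeros((1, 65), np.float64), np.zeros((1, 65), np.float64)
    cv2.grabCut(image_bgr, mask, None, bgd, fgd, iters, cv2.GC_INIT_WITH_MASK)
    return np.isin(mask, (cv2.GC_FGD, cv2.GC_PR_FGD))  # final exudate mask
```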
The ROSCOE Manual. Volume I-1. Program Description
1980-02-29
... dataset is denoted by E8. Every dataset is tied to the basic dataset along with some important lists such as the object list, the radar list, and the ... used for the track initiation and track functions are shown next. Most of these parameters are well defined, with the exception of the range gate
Exploring Antarctic Land Surface Temperature Extremes Using Condensed Anomaly Databases
NASA Astrophysics Data System (ADS)
Grant, Glenn Edwin
Satellite observations have revolutionized the Earth Sciences and climate studies. However, data and imagery continue to accumulate at an accelerating rate, and efficient tools for data discovery, analysis, and quality checking lag behind. In particular, studies of long-term, continental-scale processes at high spatiotemporal resolutions are especially problematic. The traditional technique of downloading an entire dataset and using customized analysis code is often impractical or consumes too many resources. The Condensate Database Project was envisioned as an alternative method for data exploration and quality checking. The project's premise was that much of the data in any satellite dataset is unneeded and can be eliminated, compacting massive datasets into more manageable sizes. Dataset sizes are further reduced by retaining only anomalous data of high interest. Hosting the resulting "condensed" datasets in high-speed databases enables immediate availability for queries and exploration. Proof of the project's success relied on demonstrating that the anomaly database methods can enhance and accelerate scientific investigations. The hypothesis of this dissertation is that the condensed datasets are effective tools for exploring many scientific questions, spurring further investigations and revealing important information that might otherwise remain undetected. This dissertation uses condensed databases containing 17 years of Antarctic land surface temperature anomalies as its primary data. The study demonstrates the utility of the condensate database methods by discovering new information. In particular, the process revealed critical quality problems in the source satellite data. The results are used as the starting point for four case studies, investigating Antarctic temperature extremes, cloud detection errors, and the teleconnections between Antarctic temperature anomalies and climate indices. The results confirm the hypothesis that the condensate databases are a highly useful tool for Earth Science analyses. Moreover, the quality checking capabilities provide an important method for independent evaluation of dataset veracity.
Smith, Kathryn E. L.; Flocks, James G.; Steyer, Gregory D.; Piazza, Sarai C.
2015-01-01
Wetland sediment data were collected in 2009 and 2010 throughout the southwest Louisiana Chenier Plain as part of a pilot study to develop a diatom-based proxy for past wetland water chemistry and the identification of sediment deposits from tropical storms. The complete dataset includes forty-six surface sediment samples and nine sediment cores. The surface sediment samples were collected in fresh, intermediate, and brackish marsh and are located coincident with Coastwide Reference Monitoring System (CRMS) sites. The nine sediment cores were collected at the Rockefeller Wildlife Refuge (RWR) located in Grand Chenier, La.
Carbone, V; Fluit, R; Pellikaan, P; van der Krogt, M M; Janssen, D; Damsgaard, M; Vigneron, L; Feilkas, T; Koopman, H F J M; Verdonschot, N
2015-03-18
When analyzing complex biomechanical problems such as predicting the effects of orthopedic surgery, subject-specific musculoskeletal models are essential to achieve reliable predictions. The aim of this paper is to present the Twente Lower Extremity Model 2.0, a new comprehensive dataset of the musculoskeletal geometry of the lower extremity, which is based on medical imaging data and dissection performed on the right lower extremity of a fresh male cadaver. Bone, muscle and subcutaneous fat (including skin) volumes were segmented from computed tomography and magnetic resonance imaging scans. Inertial parameters were estimated from the image-based segmented volumes. A complete cadaver dissection was performed, in which bony landmarks, attachment sites and lines-of-action of 55 muscle actuators and 12 ligaments, bony wrapping surfaces, and joint geometry were measured. The obtained musculoskeletal geometry dataset was finally implemented in the AnyBody Modeling System (AnyBody Technology A/S, Aalborg, Denmark), resulting in a model consisting of 12 segments, 11 joints and 21 degrees of freedom, and including 166 muscle-tendon elements for each leg. The new TLEM 2.0 dataset was purposely built to be easily combined with novel image-based scaling techniques, such as bone surface morphing, muscle volume registration and muscle-tendon path identification, in order to obtain subject-specific musculoskeletal models in a quick and accurate way. The complete dataset, including CT and MRI scans and segmented volumes and surfaces, is made available at http://www.utwente.nl/ctw/bw/research/projects/TLEMsafe for the biomechanical community, in order to accelerate the development and adoption of subject-specific models on a large scale. TLEM 2.0 is freely shared for non-commercial use only, under acceptance of the TLEMsafe Research License Agreement. Copyright © 2014 Elsevier Ltd. All rights reserved.
Water Balance in the Amazon Basin from a Land Surface Model Ensemble
NASA Technical Reports Server (NTRS)
Getirana, Augusto C. V.; Dutra, Emanuel; Guimberteau, Matthieu; Kam, Jonghun; Li, Hong-Yi; Decharme, Bertrand; Zhang, Zhengqiu; Ducharne, Agnes; Boone, Aaron; Balsamo, Gianpaolo;
2014-01-01
Despite recent advances in land surface modeling and remote sensing, estimates of the global water budget are still fairly uncertain. This study aims to evaluate the water budget of the Amazon basin based on several state-of-the-art land surface model (LSM) outputs. Water budget variables (terrestrial water storage TWS, evapotranspiration ET, surface runoff R, and base flow B) are evaluated at the basin scale using both remote sensing and in situ data. Meteorological forcings at a 3-hourly time step and 1° spatial resolution were used to run 14 LSMs. Precipitation datasets that have been rescaled to match monthly Global Precipitation Climatology Project (GPCP) and Global Precipitation Climatology Centre (GPCC) datasets and the daily Hydrologie du Bassin de l'Amazone (HYBAM) dataset were used to perform three experiments. The Hydrological Modeling and Analysis Platform (HyMAP) river routing scheme was forced with R and B and simulated discharges are compared against observations at 165 gauges. Simulated ET and TWS are compared against FLUXNET and MOD16A2 evapotranspiration datasets and Gravity Recovery and Climate Experiment (GRACE) TWS estimates in two subcatchments of main tributaries (Madeira and Negro Rivers). At the basin scale, simulated ET ranges from 2.39 to 3.26 mm day(exp -1) and a low spatial correlation between ET and precipitation indicates that evapotranspiration does not depend on water availability over most of the basin. Results also show that other simulated water budget components vary significantly as a function of both the LSM and precipitation dataset, but simulated TWS generally agrees with GRACE estimates at the basin scale. The best water budget simulations resulted from experiments using HYBAM, mostly explained by a denser rainfall gauge network and the rescaling at a finer temporal scale.
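The evaluation above rests on the standard basin-scale water budget linking the listed variables; written out for reference, with P precipitation, ET evapotranspiration, R surface runoff and B base flow:

```latex
% Standard basin-scale water budget relating the evaluated variables.
\frac{d\,\mathrm{TWS}}{dt} \;=\; P \;-\; \mathrm{ET} \;-\; (R + B)
```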
Spatial distribution of pingos in Northern Asia
Grosse, G.; Jones, Benjamin M.
2010-01-01
Pingos are prominent periglacial landforms in vast regions of the Arctic and Subarctic. They are indicators of modern and past conditions of permafrost, surface geology, hydrology and climate. A first version of a detailed spatial geodatabase of more than 6000 pingo locations in a 3.5 × 10⁶ km² region of Northern Asia was assembled from topographic maps. A first-order analysis was carried out with respect to permafrost, landscape characteristics, surface geology, hydrology, climate, and elevation datasets using a Geographic Information System (GIS). Pingo heights in the dataset vary between 2 and 37 m, with a mean height of 4.8 m. About 64% of the pingos occur in continuous permafrost with high ice content and thick sediments; another 19% in continuous permafrost with moderate ice content and thick sediments. The majority of these pingos likely formed through closed system freezing, typical of those located in drained thermokarst lake basins of northern lowlands with continuous permafrost. About 82% of the pingos are located in the tundra bioclimatic zone. Most pingos in the dataset are located in regions with mean annual ground temperatures between -3 and -11 °C and mean annual air temperatures between -7 and -18 °C. The dataset confirms that surface geology and hydrology are key factors for pingo formation and occurrence. Based on model predictions for near-future permafrost distribution, hundreds of pingos along the southern margins of permafrost will be located in regions with thawing permafrost by 2100, which ultimately may lead to increased occurrence of pingo collapse. Based on our dataset and previously published estimates of pingo numbers from other regions, we conclude that there are more than 11 000 pingos on Earth. © 2010 Author(s).
Gross, Markus; Magar, Vanesa
2016-01-01
In previous work, the authors demonstrated how data from climate simulations can be utilized to estimate regional wind power densities. In particular, it was shown that the quality of wind power densities, estimated from the UPSCALE global dataset in offshore regions of Mexico, compared well with regional high resolution studies. Additionally, a link between surface temperature and moist air density in the estimates was presented. UPSCALE is an acronym for UK on PRACE (the Partnership for Advanced Computing in Europe)-weather-resolving Simulations of Climate for globAL Environmental risk. The UPSCALE experiment was performed in 2012 by NCAS (National Centre for Atmospheric Science)-Climate, at the University of Reading and the UK Met Office Hadley Centre. The study included a 25.6-year, five-member ensemble simulation of the HadGEM3 global atmosphere, at 25 km resolution for present climate conditions. The initial conditions for the ensemble runs were taken from consecutive days of a test configuration. In the present paper, the emphasis is placed on the single climate run for a potential future climate scenario in the UPSCALE experiment dataset, using the Representative Concentration Pathways (RCP) 8.5 climate change scenario. Firstly, some tests were performed to ensure that the results using only one instantiation of the current climate dataset are as robust as possible within the constraints of the available data. In order to achieve this, an artificial time series over a longer sampling period was created. Then, it was shown that these longer time series provided almost the same results as the short ones, thus leading to the argument that the short time series is sufficient to capture the climate. Finally, with the confidence that one instantiation is sufficient, the future climate dataset was analysed to provide, for the first time, a projection of future changes in wind power resources using the UPSCALE dataset. It is hoped that this, in turn, will provide some guidance for wind power developers and policy makers to prepare and adapt for climate change impacts on wind energy production. Although offshore locations around Mexico were used as a case study, the dataset is global and hence the methodology presented can be readily applied at any desired location. PMID:27788208
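For reference, the estimates discussed above rest on the standard relations below: wind power density from wind speed v and moist-air density ρ, with ρ tied to surface pressure and (virtual) temperature, which is the link between surface temperature and air density noted in the abstract.

```latex
% Standard wind power density and ideal-gas air density relations; p is
% surface pressure, R_d the gas constant for dry air, T_v virtual temperature.
\mathrm{WPD} \;=\; \tfrac{1}{2}\,\rho\, v^{3},
\qquad \rho \;=\; \frac{p}{R_d\, T_v}
```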
NASA Astrophysics Data System (ADS)
Langheinrich, M.; Fischer, P.; Probeck, M.; Ramminger, G.; Wagner, T.; Krauß, T.
2017-05-01
The growing volume of available optical remote sensing data with large spatial and temporal coverage enables the coherent and gapless observation of the earth's surface on the scale of whole countries or continents. To produce datasets of that size, individual satellite scenes have to be stitched together to form so-called mosaics. Here the problem arises that the different images have varying radiometric properties depending on the acquisition conditions at the time of imaging. The interpretation of optical remote sensing data is to a great extent based on the analysis of the spectral composition of an observed surface reflection. Therefore, the normalization of all images included in a large image mosaic is necessary to ensure consistent results when procedures are applied to the whole dataset. In this work, an algorithm is described that enables the automated spectral harmonization of satellite images to a reference scene. Since the proposed algorithm has already proven stable in operational use, processing a large number of SPOT-4/-5, IRS LISS-III and Landsat-5 scenes in the frame of the European Environment Agency's Copernicus/GMES Initial Operations (GIO) High-Resolution Layer (HRL) Forest mapping for 20 Western, Central and (South)Eastern European countries, its reliability is further evaluated for application to newer Sentinel-2 multispectral imaging products. The results show that the algorithm is comparably efficient for processing satellite image data from sources other than the sensor configurations it was originally designed for.
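A hedged sketch of one common way to harmonize a target scene to a reference: per-band gains and offsets estimated by least squares over shared (ideally pseudo-invariant) pixels; the operational algorithm described above is not reproduced here.

```python
# Hedged sketch of per-band linear normalization to a reference scene: gains
# and offsets are estimated by least squares over pixels common to the target
# and reference images. This only illustrates the usual gain/offset
# harmonization step, not the algorithm described in the abstract.
import numpy as np

def normalize_band(target, reference, valid):
    """target, reference: 2-D arrays of one spectral band; valid: boolean mask."""
    t, r = target[valid].astype(float), reference[valid].astype(float)
    A = np.vstack([t, np.ones_like(t)]).T
    gain, offset = np.linalg.lstsq(A, r, rcond=None)[0]
    return gain * target + offset

def normalize_scene(target_bands, reference_bands, valid):
    return [normalize_band(t, r, valid) for t, r in zip(target_bands, reference_bands)]
```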
Predicting the Location of Human Perirhinal Cortex, Brodmann's area 35, from MRI
Augustinack, Jean C.; Huber, Kristen E.; Stevens, Allison A.; Roy, Michelle; Frosch, Matthew P.; van der Kouwe, André J.W.; Wald, Lawrence L.; Van Leemput, Koen; McKee, Ann; Fischl, Bruce
2012-01-01
The perirhinal cortex (Brodmann's area 35) is a multimodal area that is important for normal memory function. Specifically, perirhinal cortex is involved in detection of novel objects and manifests neurofibrillary tangles in Alzheimer's disease very early in disease progression. We scanned ex vivo brain hemispheres at standard resolution (1 mm × 1 mm × 1 mm) to construct pial/white matter surfaces in FreeSurfer and scanned again at high resolution (120 μm × 120 μm × 120 μm) to determine cortical architectural boundaries. After labeling perirhinal area 35 in the high resolution images, we mapped the high resolution labels to the surface models to localize area 35 in fourteen cases. We validated the area boundaries determined using histological Nissl staining. To test the accuracy of the probabilistic mapping, we measured the Hausdorff distance between the predicted and true labels and found that the median Hausdorff distance was 4.0 mm for left hemispheres (n = 7) and 3.2 mm for right hemispheres (n = 7) across subjects. To show the utility of perirhinal localization, we mapped our labels to a subset of the Alzheimer's Disease Neuroimaging Initiative dataset and found decreased cortical thickness measures in mild cognitive impairment and Alzheimer's disease compared to controls in the predicted perirhinal area 35. Our ex vivo probabilistic mapping of perirhinal cortex provides histologically validated, automated and accurate labeling of architectonic regions in the medial temporal lobe, and facilitates the analysis of atrophic changes in a large dataset for earlier detection and diagnosis. PMID:22960087
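A minimal sketch of the label-accuracy metric, computing the symmetric Hausdorff distance between predicted and manually labeled boundary points with SciPy; loading of the actual FreeSurfer labels is omitted, and the point-set representation is an assumption.

```python
# Hedged sketch of the label-accuracy metric: the symmetric Hausdorff
# distance between predicted and manually labeled perirhinal boundaries,
# computed on point sets of surface-vertex coordinates.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def hausdorff_mm(pred_xyz, true_xyz):
    """pred_xyz, true_xyz: (n, 3) arrays of labeled vertex coordinates in mm."""
    d_ab = directed_hausdorff(pred_xyz, true_xyz)[0]
    d_ba = directed_hausdorff(true_xyz, pred_xyz)[0]
    return max(d_ab, d_ba)
```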
Optimising Habitat-Based Models for Wide-Ranging Marine Predators: Scale Matters
NASA Astrophysics Data System (ADS)
Scales, K. L.; Hazen, E. L.; Jacox, M.; Edwards, C. A.; Bograd, S. J.
2016-12-01
Predicting the responses of marine top predators to dynamic oceanographic conditions requires habitat-based models that sufficiently capture environmental preferences. Spatial resolution and temporal averaging of environmental data layers is a key aspect of model construction. The utility of surfaces contemporaneous to animal movement (e.g. daily, weekly), versus synoptic products (monthly, seasonal, climatological) is currently under debate, as is the optimal spatial resolution for predictive products. Using movement simulations with built-in environmental preferences (correlated random walks, multi-state hidden Markov-type models) together with modeled (Regional Oceanographic Modeling System, ROMS) and remotely-sensed (MODIS-Aqua) datasets, we explored the effects of degrading environmental surfaces (3km - 1 degree, daily - climatological) on model inference. We simulated the movements of a hypothetical wide-ranging marine predator through the California Current system over a three month period (May-June-July), based on metrics derived from previously published blue whale Balaenoptera musculus tracking studies. Results indicate that models using seasonal or climatological data fields can overfit true environmental preferences, in both presence-absence and behaviour-based model formulations. Moreover, the effects of a degradation in spatial resolution are more pronounced when using temporally averaged fields than when using daily, weekly or monthly datasets. In addition, we observed a notable divergence between the `best' models selected using common methods (e.g. AUC, AICc) and those that most accurately reproduced built-in environmental preferences. These findings have important implications for conservation and management of marine mammals, seabirds, sharks, sea turtles and large teleost fish, particularly in implementing dynamic ocean management initiatives and in forecasting responses to future climate-mediated ecosystem change.
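A hedged sketch of a correlated random walk with a built-in environmental preference, in the spirit of the movement simulations described above; the turning-angle model, step-length modulation, and preference function are illustrative choices, not those of the study.

```python
# Hedged sketch of a correlated random walk with a built-in environmental
# preference: turning angles have small variance (directional persistence)
# and step lengths shrink where a habitat-preference field is high, so the
# simulated track concentrates in preferred habitat. Parameters are illustrative.
import numpy as np

def simulate_crw(n_steps, preference, start=(0.0, 0.0), kappa=0.5, step0=1.0, seed=0):
    """preference(x, y) -> value in [0, 1]; higher means more preferred habitat."""
    rng = np.random.default_rng(seed)
    xy = np.zeros((n_steps + 1, 2))
    xy[0] = start
    heading = rng.uniform(0, 2 * np.pi)
    for i in range(n_steps):
        heading += rng.normal(0.0, kappa)                  # correlated turning
        step = step0 * (1.0 - 0.8 * preference(*xy[i]))    # slow down in good habitat
        xy[i + 1] = xy[i] + step * np.array([np.cos(heading), np.sin(heading)])
    return xy
```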
NASA Astrophysics Data System (ADS)
Awe, Thomas
2017-10-01
Implosions on the Z Facility assemble high-energy-density plasmas for radiation effects and ICF experiments, but achievable stagnation pressures and temperatures are degraded by the Magneto-Rayleigh-Taylor (MRT) instability. While the beryllium liners (tubes) used in Magnetized Liner Inertial Fusion (MagLIF) experiments are astonishingly smooth (10 to 50 nm RMS roughness), they also contain distributed micron-scale resistive inclusions, and large MRT amplitudes are observed. Early in the implosion, an electrothermal instability (ETI) may provide a perturbation which greatly exceeds the initial surface roughness of the liner. Resistive inhomogeneities drive nonuniform current density and Joule heating, resulting in locally higher temperature, and thus still higher resistivity. Such unstable temperature and pressure growth produce density perturbations which seed MRT. For MagLIF liners, ETI seeding of MRT has been inferred by evaluating late-time MRT, but a direct observation of ETI is not made. ETI is directly observed on the surface of 1.0-mm-diameter solid Al rods pulsed to 1 MA in 100 ns via high resolution gated optical imaging (2 ns temporal and 3 micron spatial resolution). Aluminum 6061 alloy rods, with micron-scale resistive inclusions, consistently first demonstrate overheating from distinct, 10-micron-scale, sub-eV spots, which 5-10 ns later merge into azimuthally stretched elliptical spots and discrete strata (40-100 microns wide by 10 microns tall). Axial plasma filaments form shortly thereafter. Surface plasma can be suppressed for rods coated with dielectric, enabling extended study of the evolution of stratified ETI structures, and experimental inference of ETI growth rates. This fundamentally new and highly 3-dimensional dataset informs ETI physics, including when the ETI seed of MRT may be initiated.
A Generalized Distributed Data Match-Up Service in Support of Oceanographic Application
NASA Astrophysics Data System (ADS)
Tsontos, V. M.; Huang, T.; Holt, B.; Smith, S. R.; Bourassa, M. A.; Worley, S. J.; Ji, Z.; Elya, J. L.; Stallard, A. P.
2016-02-01
Oceanographic applications increasingly rely on the integration and collocation of satellite and field observations providing complementary data coverage over a continuum of spatio-temporal scales. Here we report on a collaborative venture between NASA/JPL, NCAR and FSU/COAPS to develop a Distributed Oceanographic Match-up Service (DOMS). The DOMS project aims to implement a technical infrastructure providing a generalized, publicly accessible data collocation capability for satellite and in situ datasets utilizing remote data stores in support of satellite mission cal/val and a range of research and operational applications. The service will provide a mechanism for users to specify geospatial references and receive collocated satellite and field observations within the selected spatio-temporal domain and match-up window extent. DOMS will include several representative in situ and satellite datasets. Field data will focus on surface observations from NCAR's International Comprehensive Ocean-Atmosphere Data Set (ICOADS), the Shipboard Automated Meteorological and Oceanographic System Initiative (SAMOS) at FSU/COAPS, and the Salinity Processes in the Upper Ocean Regional Study (SPURS) data hosted at JPL/PO.DAAC. Satellite data will include JPL ASCAT L2 12.5 km winds, the Aquarius L2 orbital dataset, MODIS L2 swath data, and the high-resolution gridded L4 MUR-SST product. Importantly, while DOMS will be developed with these select datasets, it will be readily extendable to other in situ and satellite data collections and easily ported to other remote providers, thus potentially supporting additional science disciplines. Technical challenges to be addressed include: 1) ensuring accurate, efficient, and scalable match-up algorithm performance, 2) undertaking collocation using datasets that are distributed on the network, and 3) returning matched observations with sufficient metadata so that value differences can be properly interpreted. DOMS leverages existing technologies (EDGE, w10n, OPeNDAP, relational and graph/triple-store databases) and cloud computing. It will implement both a web portal interface for users to review and submit match-up requests interactively and an underlying web service interface facilitating large-scale and automated machine-to-machine based queries.
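DOMS itself is a distributed service, but the core match-up step can be illustrated locally: pair each in situ observation with satellite observations falling inside a user-chosen space-time window. The sketch below does this with plain pandas/numpy on synthetic data; column names and tolerances are assumptions, and a production service would use spatial indexing rather than the brute-force loop shown.

```python
# A minimal, local illustration of the core match-up step behind a service like
# DOMS: pair each in situ observation with satellite observations falling inside
# a user-chosen space-time window. Column names and tolerances are assumptions.
import numpy as np
import pandas as pd

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

def matchup(insitu, satellite, max_km=25.0, max_hours=6.0):
    """Brute-force match-up; a production service would use spatial indexing."""
    pairs = []
    for _, obs in insitu.iterrows():
        in_time = (satellite["time"] - obs["time"]).abs() <= pd.Timedelta(hours=max_hours)
        cand = satellite[in_time].copy()
        if cand.empty:
            continue
        cand["dist_km"] = haversine_km(obs["lat"], obs["lon"], cand["lat"], cand["lon"])
        cand = cand[cand["dist_km"] <= max_km]
        if not cand.empty:
            best = cand.nsmallest(1, "dist_km").iloc[0]
            pairs.append({"insitu_sst": obs["sst"], "sat_sst": best["sst"],
                          "dist_km": best["dist_km"]})
    return pd.DataFrame(pairs)

# Tiny synthetic example.
t0 = pd.Timestamp("2016-02-01")
insitu = pd.DataFrame({"time": [t0, t0 + pd.Timedelta(hours=3)],
                       "lat": [34.0, 34.5], "lon": [-120.0, -119.5],
                       "sst": [15.2, 15.6]})
satellite = pd.DataFrame({"time": [t0 + pd.Timedelta(hours=1)] * 3,
                          "lat": [34.05, 34.6, 36.0],
                          "lon": [-120.1, -119.4, -118.0],
                          "sst": [15.0, 15.8, 16.4]})
print(matchup(insitu, satellite))
```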
Publishing high-quality climate data on the semantic web
NASA Astrophysics Data System (ADS)
Woolf, Andrew; Haller, Armin; Lefort, Laurent; Taylor, Kerry
2013-04-01
The effort over more than a decade to establish the semantic web [Berners-Lee et al., 2001] has received a major boost in recent years through the Open Government movement. Governments around the world are seeking technical solutions to enable more open and transparent access to the Public Sector Information (PSI) they hold. Existing technical protocols and data standards tend to be domain specific, and so limit the ability to publish and integrate data across domains (health, environment, statistics, education, etc.). The web provides a domain-neutral platform for information publishing, and has proven itself beyond expectations for publishing and linking human-readable electronic documents. Extending the web pattern to data (often called Web 3.0) offers enormous potential. The semantic web applies the basic web principles to data [Berners-Lee, 2006]: using URIs as identifiers (for data objects and real-world 'things', instead of documents); making the URIs actionable by providing useful information via HTTP; using a common exchange standard (serialised RDF for data instead of HTML for documents); and establishing typed links between information objects to enable linking and integration. Leading examples of 'linked data' for publishing PSI may be found in both the UK (http://data.gov.uk/linked-data) and the US (http://www.data.gov/page/semantic-web). The Bureau of Meteorology (BoM) is Australia's national meteorological agency, and has a new mandate to establish a national environmental information infrastructure (under the National Plan for Environmental Information, NPEI [BoM, 2012a]). While the initial approach is based on the existing best-practice Spatial Data Infrastructure (SDI) architecture, linked data is being explored as a technological alternative that shows great promise for the future. We report here the first trial of government linked data in Australia under data.gov.au. In this initial pilot study, we have taken BoM's new high-quality reference surface temperature dataset, the Australian Climate Observations Reference Network - Surface Air Temperature (ACORN-SAT) [BoM, 2012b]. This dataset contains daily homogenised surface temperature observations for 112 locations around Australia, dating back to 1910. An ontology for the dataset was developed [Lefort et al., 2012], based on the existing Semantic Sensor Network ontology [Compton et al., 2012] and the W3C RDF Data Cube vocabulary [W3C, 2012]. Additional vocabularies were developed, e.g. for BoM weather stations and rainfall districts. The dataset was converted to RDF and loaded into an RDF triplestore. The Linked Data API (http://code.google.com/p/linked-data-api) was used to configure specific URI query patterns (e.g. for observation timeseries slices by station), and a SPARQL endpoint was provided for direct querying. In addition, some demonstration 'mash-ups' were developed, providing an interactive browser-based interface to the temperature timeseries. References: [Berners-Lee et al., 2001] Tim Berners-Lee, James Hendler and Ora Lassila (2001), "The Semantic Web", Scientific American, May 2001. [Berners-Lee, 2006] Tim Berners-Lee (2006), "Linked Data - Design Issues", W3C [http://www.w3.org/DesignIssues/LinkedData.html] [BoM, 2012a] Bureau of Meteorology (2012), "Environmental information" [http://www.bom.gov.au/environment/] [BoM, 2012b] Bureau of Meteorology (2012), "Australian Climate Observations Reference Network - Surface Air Temperature" [http://www.bom.gov.au/climate/change/acorn-sat/] [Compton et al., 2012] Michael Compton, Payam Barnaghi, Luis Bermudez, Raul Garcia-Castro, Oscar Corcho, Simon Cox, John Graybeal, Manfred Hauswirth, Cory Henson, Arthur Herzog, Vincent Huang, Krzysztof Janowicz, W. David Kelsey, Danh Le Phuoc, Laurent Lefort, Myriam Leggieri, Holger Neuhaus, Andriy Nikolov, Kevin Page, Alexandre Passant, Amit Sheth, Kerry Taylor (2012), "The SSN Ontology of the W3C Semantic Sensor Network Incubator Group", J. Web Semantics, 17 (2012) [http://dx.doi.org/10.1016/j.websem.2012.05.003] [Lefort et al., 2012] Laurent Lefort, Josh Bobruk, Armin Haller, Kerry Taylor and Andrew Woolf (2012), "A Linked Sensor Data Cube for a 100 Year Homogenised daily temperature dataset", Proc. Semantic Sensor Networks 2012 [http://ceur-ws.org/Vol-904/paper10.pdf] [W3C, 2012] W3C (2012), "The RDF Data Cube Vocabulary" [http://www.w3.org/TR/vocab-data-cube/]
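To make the "SPARQL endpoint for direct querying" concrete, the sketch below shows how a client might request a slice of RDF Data Cube observations over HTTP. The endpoint URL, the ex: vocabulary, the station URI and the property names are illustrative placeholders, not the actual ACORN-SAT service; only the qb: Data Cube namespace is real.

```python
# Illustrative only: how a client might query an RDF Data Cube observation slice
# from a SPARQL endpoint over HTTP. The endpoint URL, graph URIs and property
# names below are placeholders, not the actual ACORN-SAT service.
import requests

ENDPOINT = "https://example.org/acorn-sat/sparql"  # hypothetical endpoint

QUERY = """
PREFIX qb:   <http://purl.org/linked-data/cube#>
PREFIX xsd:  <http://www.w3.org/2001/XMLSchema#>
PREFIX ex:   <http://example.org/acorn-sat/def/>   # hypothetical vocabulary

SELECT ?date ?tmax
WHERE {
  ?obs a qb:Observation ;
       ex:station  ex:station-086071 ;             # hypothetical station URI
       ex:date     ?date ;
       ex:maxTemperature ?tmax .
  FILTER (?date >= "2010-01-01"^^xsd:date && ?date < "2010-02-01"^^xsd:date)
}
ORDER BY ?date
"""

def run_query(endpoint, query):
    """POST a SPARQL query and return the JSON results bindings."""
    resp = requests.post(endpoint,
                         data={"query": query},
                         headers={"Accept": "application/sparql-results+json"},
                         timeout=30)
    resp.raise_for_status()
    return resp.json()["results"]["bindings"]

if __name__ == "__main__":
    for row in run_query(ENDPOINT, QUERY):
        print(row["date"]["value"], row["tmax"]["value"])
```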
Quantifying Uncertainties in Land-Surface Microwave Emissivity Retrievals
NASA Technical Reports Server (NTRS)
Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko
2013-01-01
Uncertainties in the retrievals of microwave land-surface emissivities are quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including the Special Sensor Microwave Imager, the Tropical Rainfall Measuring Mission Microwave Imager, and the Advanced Microwave Scanning Radiometer for Earth Observing System, are studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land-surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging from 1% to 4% (3-12 K) over desert and from 1% to 7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.
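One common way to separate the systematic and random components described here is to compute, for each retrieval product, its time-mean offset from the multi-product ensemble and its temporal scatter under the assumption of a constant true emissivity. The sketch below does this on synthetic numbers and is not necessarily the authors' exact estimator.

```python
# A minimal sketch (not necessarily the authors' exact estimator) of separating
# systematic and random differences between emissivity retrievals, assuming the
# true emissivity is approximately constant in time at a given site.
import numpy as np

rng = np.random.default_rng(1)
n_months = 24

# Synthetic monthly emissivity retrievals at one desert site from three products.
truth = 0.95
retrievals = {
    "product_A": truth + 0.010 + rng.normal(0, 0.004, n_months),
    "product_B": truth - 0.005 + rng.normal(0, 0.006, n_months),
    "product_C": truth + 0.020 + rng.normal(0, 0.010, n_months),
}

ensemble_mean = np.mean([v for v in retrievals.values()], axis=0)

for name, series in retrievals.items():
    diff = series - ensemble_mean
    systematic = diff.mean()          # time-mean offset relative to the ensemble
    random_err = series.std(ddof=1)   # temporal scatter, with truth assumed constant
    print(f"{name}: systematic offset {systematic:+.4f}, "
          f"random error {random_err:.4f} (emissivity units)")
```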
NASA Astrophysics Data System (ADS)
Hopp, T.; Zapf, M.; Ruiter, N. V.
2014-03-01
An essential processing step for comparison of Ultrasound Computer Tomography images to other modalities, as well as for the use in further image processing, is to segment the breast from the background. In this work we present a (semi-)automated 3D segmentation method which is based on the detection of the breast boundary in coronal slice images and a subsequent surface fitting. The method was evaluated using a software phantom and in-vivo data. The fully automatically processed phantom results showed that a segmentation of approx. 10% of the slices of a dataset is sufficient to recover the overall breast shape. Application to 16 in-vivo datasets was performed successfully using semi-automated processing, i.e. using a graphical user interface for manual corrections of the automated breast boundary detection. The processing time for the segmentation of an in-vivo dataset could be significantly reduced by a factor of four compared to a fully manual segmentation. Comparison to manually segmented images identified a smoother surface for the semi-automated segmentation with an average of 11% of differing voxels and an average surface deviation of 2 mm. Limitations of the edge detection may be overcome by future updates of the KIT USCT system, allowing a fully-automated usage of our segmentation approach.
NASA Technical Reports Server (NTRS)
Roberts, J. Brent; Robertson, Franklin R.; Clayson, Carol Anne
2012-01-01
Improved estimates of near-surface air temperature and air humidity are critical to the development of more accurate turbulent surface heat fluxes over the ocean. Recent progress in retrieving these parameters has been made through the application of artificial neural networks (ANN) and the use of multi-sensor passive microwave observations. Details are provided on the development of an improved retrieval algorithm that applies the nonlinear statistical ANN methodology to a set of observations from the Advanced Microwave Scanning Radiometer (AMSR-E) and the Advanced Microwave Sounding Unit (AMSU-A) that are currently available from the NASA AQUA satellite platform. Statistical inversion techniques require an adequate training dataset to properly capture embedded physical relationships. The development of multiple training datasets containing only in-situ observations, only synthetic observations produced using the Community Radiative Transfer Model (CRTM), or a mixture of each is discussed. An intercomparison of results using each training dataset is provided to highlight the relative advantages and disadvantages of each methodology. Particular emphasis will be placed on the development of retrievals in cloudy versus clear-sky conditions. Near-surface air temperature and humidity retrievals using the multi-sensor ANN algorithms are compared to previous linear and non-linear retrieval schemes.
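The retrieval concept described above, a neural network mapping multi-sensor brightness temperatures to near-surface air temperature, can be illustrated generically with scikit-learn on synthetic data. The sketch below is not the operational AMSR-E/AMSU-A algorithm; the channel count, the synthetic relation and the network size are assumptions.

```python
# A generic sketch of the statistical-retrieval idea: train a small neural
# network to map brightness temperatures to near-surface air temperature.
# Synthetic data stands in for AMSR-E/AMSU-A observations; this is not the
# operational algorithm described in the abstract.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_obs, n_channels = 5000, 6

# Synthetic brightness temperatures (K) and a nonlinear "true" relation to Ta.
tb = rng.uniform(150, 290, size=(n_obs, n_channels))
ta = (0.4 * tb[:, 0] + 0.2 * tb[:, 1] - 0.1 * tb[:, 2]
      + 0.002 * tb[:, 3] * tb[:, 4] / 100.0
      + rng.normal(0, 1.0, n_obs))

x_train, x_test, y_train, y_test = train_test_split(tb, ta, test_size=0.2,
                                                    random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16),
                                   max_iter=2000, random_state=0))
model.fit(x_train, y_train)

rmse = np.sqrt(np.mean((model.predict(x_test) - y_test) ** 2))
print(f"test RMSE: {rmse:.2f} K")
```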
Abandoned Uranium Mine (AUM) Surface Areas, Navajo Nation, 2016, US EPA Region 9
This GIS dataset contains polygon features that represent all Abandoned Uranium Mines (AUMs) on or within one mile of the Navajo Nation. Attributes include mine names, aliases, Potentially Responsible Parties, reclamation status, EPA mine status, links to AUM reports, and the region in which an AUM is located. This dataset contains 608 features.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mardirossian, Narbe; Head-Gordon, Martin
2016-11-09
Benchmark datasets of non-covalent interactions are essential for assessing the performance of density functionals and other quantum chemistry approaches. In a recent blind test, Taylor et al. benchmarked 14 methods on a new dataset consisting of 10 dimer potential energy curves calculated using coupled cluster with singles, doubles, and perturbative triples (CCSD(T)) at the complete basis set (CBS) limit (80 data points in total). Finally, the dataset is particularly interesting because compressed, near-equilibrium, and stretched regions of the potential energy surface are extensively sampled.
NASA Astrophysics Data System (ADS)
Yang, Junhua; Ji, Zhenming; Chen, Deliang; Kang, Shichang; Fu, Congshen; Duan, Keqin; Shen, Miaogen
2018-06-01
The application of satellite radiance assimilation can improve the simulation of precipitation by numerical weather prediction models. However, substantial quantities of satellite data, especially those derived from low-level (surface-sensitive) channels, are rejected because of the difficulty in realistically modeling land surface emissivity and energy budgets. Here, we used an improved land use and leaf area index (LAI) dataset in the WRF-3DVAR assimilation system to explore the benefit of improved land surface information for rainfall simulation, using the Shule River Basin in the northeastern Tibetan Plateau as a case study. The results for July 2013 show that, for low-level channels (e.g., channel 3), the underestimation of brightness temperature in the original simulation was largely removed by the more realistic land surface information. In addition, the realistic land use and LAI data allowed more satellite radiance data to pass the deviation test and be used in the assimilation, which resulted in improved initial driving fields and better simulation of temperature, relative humidity, vertical convection, and cumulative precipitation.
Kim, Won Hwa; Singh, Vikas; Chung, Moo K.; Hinrichs, Chris; Pachauri, Deepti; Okonkwo, Ozioma C.; Johnson, Sterling C.
2014-01-01
Statistical analysis on arbitrary surface meshes such as the cortical surface is an important approach to understanding brain diseases such as Alzheimer’s disease (AD). Surface analysis may be able to identify specific cortical patterns that relate to certain disease characteristics or exhibit differences between groups. Our goal in this paper is to make group analysis of signals on surfaces more sensitive. To do this, we derive multi-scale shape descriptors that characterize the signal around each mesh vertex, i.e., its local context, at varying levels of resolution. In order to define such a shape descriptor, we make use of recent results from harmonic analysis that extend traditional continuous wavelet theory from the Euclidean to a non-Euclidean setting (i.e., a graph, mesh or network). Using this descriptor, we conduct experiments on two different datasets, the Alzheimer’s Disease NeuroImaging Initiative (ADNI) data and images acquired at the Wisconsin Alzheimer’s Disease Research Center (W-ADRC), focusing on individuals labeled as having Alzheimer’s disease (AD), mild cognitive impairment (MCI) and healthy controls. In particular, we contrast traditional univariate methods with our multi-resolution approach, which shows increased sensitivity and improved statistical power to detect group-level effects. We also provide an open source implementation. PMID:24614060
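A simplified way to see the multi-scale idea is to diffuse a vertex signal with the graph-Laplacian heat kernel at several scales and stack the results per vertex; this is related to, but simpler than, the spectral graph wavelets used in the paper. The toy mesh (a ring graph) and the scales below are assumptions for illustration only.

```python
# A simplified illustration of multi-scale descriptors on a mesh graph: diffuse a
# vertex signal with the graph-Laplacian heat kernel at several scales and stack
# the results per vertex. This is a related, simpler construction than the
# spectral graph wavelets used in the paper.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import expm_multiply

def graph_laplacian(edges, n_vertices):
    """Unnormalized Laplacian L = D - A from an edge list."""
    i, j = np.array(edges).T
    a = sp.coo_matrix((np.ones(len(edges)), (i, j)), shape=(n_vertices, n_vertices))
    a = (a + a.T).tocsr()
    d = sp.diags(np.asarray(a.sum(axis=1)).ravel())
    return (d - a).tocsc()

def multiscale_descriptor(laplacian, signal, scales=(0.5, 2.0, 8.0)):
    """Return an (n_vertices, n_scales) array of heat-diffused signals."""
    return np.column_stack([expm_multiply(-t * laplacian, signal) for t in scales])

# Toy "mesh": a ring of 20 vertices with a spike signal at vertex 0.
n = 20
edges = [(k, (k + 1) % n) for k in range(n)]
L = graph_laplacian(edges, n)
signal = np.zeros(n)
signal[0] = 1.0

desc = multiscale_descriptor(L, signal)
print(desc.shape)   # (20, 3): one multi-scale descriptor per vertex
print(desc[0])      # descriptor at the spike vertex across scales
```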
The center for expanded data annotation and retrieval
Bean, Carol A; Cheung, Kei-Hoi; Dumontier, Michel; Durante, Kim A; Gevaert, Olivier; Gonzalez-Beltran, Alejandra; Khatri, Purvesh; Kleinstein, Steven H; O’Connor, Martin J; Pouliot, Yannick; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Wiser, Jeffrey A
2015-01-01
The Center for Expanded Data Annotation and Retrieval is studying the creation of comprehensive and expressive metadata for biomedical datasets to facilitate data discovery, data interpretation, and data reuse. We take advantage of emerging community-based standard templates for describing different kinds of biomedical datasets, and we investigate the use of computational techniques to help investigators to assemble templates and to fill in their values. We are creating a repository of metadata from which we plan to identify metadata patterns that will drive predictive data entry when filling in metadata templates. The metadata repository not only will capture annotations specified when experimental datasets are initially created, but also will incorporate links to the published literature, including secondary analyses and possible refinements or retractions of experimental interpretations. By working initially with the Human Immunology Project Consortium and the developers of the ImmPort data repository, we are developing and evaluating an end-to-end solution to the problems of metadata authoring and management that will generalize to other data-management environments. PMID:26112029
Impact of automatization in temperature series in Spain and comparison with the POST-AWS dataset
NASA Astrophysics Data System (ADS)
Aguilar, Enric; López-Díaz, José Antonio; Prohom Duran, Marc; Gilabert, Alba; Luna Rico, Yolanda; Venema, Victor; Auchmann, Renate; Stepanek, Petr; Brandsma, Theo
2016-04-01
Climate data records are often affected by inhomogeneities. Inhomogeneities that introduce network-wide biases are sometimes related to changes happening almost simultaneously in an entire network. Relative homogenization is difficult in these cases, especially at the daily scale. A good example of this is the substitution of manual observations (MAN) by automatic weather stations (AWS). Parallel measurements (i.e. records taken at the same time with the old (MAN) and new (AWS) sensors) can provide an idea of the bias introduced and help to evaluate the suitability of different correction approaches. We present here a quality-controlled dataset compiled under the DAAMEC Project, comprising 46 stations across Spain and over 85,000 parallel measurements (AWS-MAN) of daily maximum and minimum temperature. We study the differences between both sensors and compare them with the available metadata to account for internal inhomogeneities. The differences between both systems vary considerably across stations, with patterns more related to their particular settings than to climatic or geographical reasons. The typical median biases (AWS-MAN) by station (within the interquartile range) range between -0.2°C and 0.4°C in daily maximum temperature and between -0.4°C and 0.2°C in daily minimum temperature. These and other results are compared with a larger network, the dataset of the Parallel Observations Scientific Team, a working group of the International Surface Temperature Initiative (ISTI-POST), which comprises our stations as well as others from different countries in America, Asia and Europe.
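The station-level summary reported here (per-station median AWS-MAN difference and the interquartile range of those medians across stations) is straightforward to compute; the sketch below does so on a synthetic table standing in for the parallel-measurement dataset, with made-up biases and record lengths.

```python
# A minimal sketch of the summary statistics reported for parallel measurements:
# per-station median AWS-MAN temperature difference and the interquartile range
# of those medians across stations. The data frame here is synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
stations = [f"ST{i:02d}" for i in range(46)]

rows = []
for st in stations:
    n = rng.integers(1000, 2500)               # parallel days per station
    bias = rng.normal(0.1, 0.2)                # station-specific AWS-MAN bias
    rows.append(pd.DataFrame({
        "station": st,
        "tmax_diff": bias + rng.normal(0, 0.5, n),   # daily AWS - MAN, deg C
    }))
parallel = pd.concat(rows, ignore_index=True)

station_medians = parallel.groupby("station")["tmax_diff"].median()
q1, q3 = station_medians.quantile([0.25, 0.75])
print(f"median AWS-MAN difference per station: "
      f"IQR {q1:+.2f} to {q3:+.2f} deg C (n = {len(station_medians)} stations)")
```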
NASA Astrophysics Data System (ADS)
Sanchez-Lorenzo, A.
2010-09-01
One problem encountered when establishing the causes of global dimming and brightening is the limited number of long-term solar radiation series with accurate and calibrated measurements. For this reason, the analysis is often supported and extended with the use of other climatic variables such as sunshine duration and cloud cover. Specifically, sunshine duration is defined as the amount of time, usually expressed in hours, that direct solar radiation exceeds a certain threshold (usually taken as 120 W m-2). Consequently, this variable can be considered an excellent proxy measure of solar radiation at interannual and decadal time scales, with the advantage that measurements of this variable were initiated in the late 19th century at major meteorological stations worldwide. Nevertheless, detailed and up-to-date analyses of sunshine duration behavior on global or hemispheric scales are still missing. Thus, starting in September 2010, in the framework of different research projects, we will undertake a worldwide compilation of the longest daily or monthly sunshine duration series from the late 19th century until the present. Several quality control checks and homogenization methods will be applied to the generated sunshine dataset. The relationship between the more precise downward solar radiation series from the Global Energy Balance Archive (GEBA) and the homogenized sunshine series will be studied in order to reconstruct global and regional solar irradiance at the Earth's surface since the late 19th century. Since clouds are the main cause of interannual and decadal variability of radiation reaching the Earth's surface, as a complement to the long-term sunshine series we will also compile worldwide surface cloudiness observations. With this presentation we seek to encourage the climate community to contribute their own local datasets to the SunCloud project. The SunCloud Team: M. Wild, Institute for Atmospheric and Climate Science, ETH Zurich, Switzerland (martin.wild@env.ethz.ch) E. Pallé, Institute of Astrophysics of the Canary Islands, Spain (epalle@iac.es) J. Calbó, Group of Environmental Physics, University of Girona, Spain (josep.calbo@udg.edu) M. Brunetti, Institute of Atmospheric Sciences and Climate, Italian National Research Council, Italy (m.brunetti@isac.cnr.it) G. Stanhill, Department of Environmental Physics and Irrigation, The Volcani Center, Israel (gerald@volcani.agri.gov.il) R. Brázdil, Institute of Geography, Masaryk University, Czech Republic (brazdil@sci.muni.cz) M. Barriendos, Department of Modern History, University of Barcelona, Spain (mbarriendos@ub.edu) C. Deser, National Center for Atmospheric Research, USA (cdeser@ucar.edu) P. Pereira, Department of Environmental Protection, Vilnius Gediminas Technical University, Lithuania (pereiraub@gmail.com) C. Azorin-Molina, The CEAM Foundation (Fundación Centro de Estudios Ambientales del Mediterráneo), Spain (cazorin@ceam.es) Q. You, Institute of Tibetan Plateau Research, Chinese Academy of Sciences, Beijing, China (yqingl@126.com)
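The definition cited above (time with direct solar radiation above 120 W m-2) translates directly into a simple calculation; the sketch below applies it to synthetic one-minute direct-radiation data for a single day. The synthetic day and its cloud attenuation are assumptions for illustration.

```python
# The cited definition of sunshine duration (time with direct solar radiation
# above 120 W m-2) translates directly into a simple calculation; here applied
# to synthetic one-minute direct-radiation data for a single day.
import numpy as np

THRESHOLD_W_M2 = 120.0
minutes_per_day = 24 * 60
t = np.arange(minutes_per_day)

# Synthetic day: a daylight sine curve with random cloud attenuation.
rng = np.random.default_rng(4)
direct = 900 * np.clip(np.sin((t - 6 * 60) / (12 * 60) * np.pi), 0, None)
direct *= rng.uniform(0.3, 1.0, minutes_per_day)

sunshine_minutes = np.count_nonzero(direct > THRESHOLD_W_M2)
print(f"sunshine duration: {sunshine_minutes / 60:.1f} hours")
```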
Lefranc, Sandrine; Roca, Pauline; Perrot, Matthieu; Poupon, Cyril; Le Bihan, Denis; Mangin, Jean-François; Rivière, Denis
2016-05-01
Segregating the human cortex into distinct areas based on structural connectivity criteria is of widespread interest in neuroscience. This paper presents a groupwise connectivity-based parcellation framework for the whole cortical surface using a new high-quality diffusion dataset of 79 healthy subjects. Our approach proceeds gyrus by gyrus to parcellate the whole human cortex. The main originality of the method is that, for each gyrus, the connectivity profiles used for the clustering are compressed without any prior anatomical information. This step takes into account the interindividual cortical and connectivity variability. To this end, we consider intersubject high-density connectivity areas extracted using a surface-based watershed algorithm. An extensive validation study has led to a fully automatic pipeline that is robust to variations in data preprocessing (tracking type, cortical mesh characteristics and boundaries of initial gyri), data characteristics (including number of subjects), and the main algorithmic parameters. A remarkable reproducibility is achieved in parcellation results for the whole cortex, leading to clear and stable cortical patterns. This reproducibility has been tested across non-overlapping subgroups and the validation is presented mainly on the pre- and postcentral gyri. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
The Nanomaterial Data Curation Initiative (NDCI) explores the critical aspect of data curation within the development of informatics approaches to understanding nanomaterial behavior. Data repositories and tools for integrating and interrogating complex nanomaterial datasets are...
Web-GIS visualisation of permafrost-related Remote Sensing products for ESA GlobPermafrost
NASA Astrophysics Data System (ADS)
Haas, A.; Heim, B.; Schaefer-Neth, C.; Laboor, S.; Nitze, I.; Grosse, G.; Bartsch, A.; Kaab, A.; Strozzi, T.; Wiesmann, A.; Seifert, F. M.
2016-12-01
The ESA GlobPermafrost project (www.globpermafrost.info) provides a remote sensing service for permafrost research and applications. The service comprises data product generation for various sites and regions as well as a specific infrastructure allowing overview of and access to the datasets. Based on an online user survey conducted within the project, the user community extensively applies GIS software to handle remote sensing-derived datasets and requires preview functionalities before accessing them. In response, we are developing the Permafrost Information System PerSys, which is conceptualized as an open-access geospatial data dissemination and visualization portal. PerSys will allow visualisation of GlobPermafrost raster and vector products such as land cover classifications, Landsat multispectral index trend datasets, lake and wetland extents, InSAR-based land surface deformation maps, rock glacier velocity fields, spatially distributed permafrost model outputs, and land surface temperature datasets. The datasets will be published as WebGIS services relying on OGC-standardized Web Mapping Service (WMS) and Web Feature Service (WFS) technologies for data display and visualization. The WebGIS environment will be hosted at the AWI computing centre, where a geodata infrastructure has been implemented comprising ArcGIS for Server 10.4, PostgreSQL 9.2 and a browser-driven data viewer based on Leaflet (http://leafletjs.com). Independently, we will provide an 'Access-Restricted Data Dissemination Service', which will be available to registered users for testing frequently updated versions of project datasets. PerSys will become a core project of the Arctic Permafrost Geospatial Centre (APGC) within the ERC-funded PETA-CARB project (www.awi.de/petacarb). The APGC Data Catalogue will contain all final products of GlobPermafrost, allow in-depth dataset search via keywords, spatial and temporal coverage, data type, etc., and will provide DOI-based links to the datasets archived in the long-term, open access PANGAEA data repository.
Given the relatively high cost of mapping impervious surfaces at regional scales, substantial effort is being expended in the development of moderate-resolution, satellite-based methods for estimating impervious surface area (ISA). To rigorously assess the accuracy of these data ...
Capturing Data Connections within the Climate Data Initiative to Support Resiliency
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Bugbee, K.; Weigel, A. M.; Tilmes, C.
2015-12-01
The Climate Data Initiative (CDI) focuses on preparing the United States for the impacts of climate change by leveraging existing federal climate-relevant data to stimulate innovation and private-sector entrepreneurship supporting national climate-change preparedness. To achieve these goals, relevant data were curated around seven thematic areas relevant to climate change resiliency. Data for each theme were selected by subject matter experts from various Federal agencies and collected in Data.gov at http://climate.data.gov. While the curation effort for each theme has been immensely valuable on its own, in the end the themes essentially become a long directory or list, and the valuable connections between datasets and their intended uses are lost. The user therefore understands that the datasets in the list have been approved by the CDI subject matter experts, but has less certainty when making connections between the various datasets and their possible applications. Additionally, the curated list can be overwhelming and its intended use difficult to interpret. In order to better address the needs of the CDI data end users, the CDI team has been developing a new controlled vocabulary that will assist in capturing connections between datasets. This new vocabulary will be implemented in the Global Change Information System (GCIS), which has the capability to link individual items within the system. This presentation will highlight the methodology used to develop the controlled vocabulary that will aid end users in both understanding and locating relevant datasets for their intended use.
Jakobsson, Sofie; Ekman, Tor; Ahlberg, Karin
2015-06-01
To explore patients' experience of their illness when undergoing pelvic radiotherapy by describing the presence and severity of distressful symptoms and to explore the self-care activities initiated in response to illness and symptoms. A mixed-method study was performed which included a core qualitative dataset and a supplementary quantitative dataset. Twenty-nine women undergoing five weeks of radiotherapy were prospectively interviewed during the five weeks of treatment in order to capture experiences, distressful symptoms and quality of life during treatment. Grounded theory guided the collection and analysis of the qualitative dataset, and statistics were used to analyze the quantitative dataset. Maintaining self-identity was concluded to be central during the trajectory of treatment. The self-care activities initiated served to alleviate physical, emotional, and social suffering, helping the respondents keep their integrity and sense of self. Previous life experiences influenced the process of being able to maintain self-identity. The gastrointestinal symptoms and pain caused the most distress. In order to maintain self-identity, patients endure treatment by focusing on symptoms, on getting cured and on their self-image. Several distressful symptoms implied social limitations and a sense that the body would not take the strain. The results of this study can help health care professionals gain a better understanding of the struggle to endure pelvic radiotherapy. Further, health care professionals should be more proactive in alleviating their patients' distressful symptoms. The results imply that previous life experiences should be taken into account when initiating interventions, because these experiences affect the patients' self-care activities. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Otterman, J.; Ardizzone, J.; Atlas, R.; Demaree, G.; Huth, R.; Jaagus, J.; Koslowsky, D.; Przybylak, R.; Wos, A.; Atlas, Robert (Technical Monitor)
1999-01-01
It is well recognized that advection from the North Atlantic has a profound effect on the climatic conditions in central Europe. A new dataset of the ocean-surface winds, derived from the Special Sensor Microwave Imager, SSM/I, is now available. This satellite instrument measures the wind speed, but not the direction. However, variational analysis developed at the Data Assimilation Office, NASA Goddard Space Flight Center, by combining the SSM/I measurements with wind vectors measured from ships, etc., produced global maps of the ocean surface winds suitable for climate analysis. From this SSM/I dataset, a specific index I(sub na) of the North Atlantic surface winds has been developed, which pertinently quantifies the low-level advection into central Europe. For a selected time period, the index I(sub na) reports the average of the amplitude of the wind, averaging only the speed when the direction is from the southwest (when the wind is from another direction, the contribution counts toward the average as zero speed). Strong correlations were found between February I(sub na) and the surface air temperatures in Europe 50-60 deg N. In the present study, we present the correlations between I(sub na) and surface temperature T(sub s), and also the sensitivity of T(sub s) to an increase in I(sub na), in various seasons and various regions. We specifically analyze the flow of maritime air from the North Atlantic that produced two extraordinarily warm periods: February 1990, and early winter 2000/2001. The very cold December 2001 was clearly due to a northerly flow. Our conclusion is that the SSM/I dataset is very useful for providing insight into the forcing of climatic fluctuations in Europe.
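The conditional averaging described for I(sub na) is easy to reproduce: average the wind speed over a period, counting any observation whose direction is not from the southwest as zero speed. In the sketch below the 180-270 degree sector used to define "from the southwest", and the synthetic 6-hourly winds, are assumptions for illustration.

```python
# A sketch of the conditional averaging described for the I_na index: average
# the wind speed over a period, counting any observation whose direction is not
# from the southwest as zero speed. The 180-270 degree sector used to define
# "from the southwest" is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(5)
n = 4 * 28                                    # 6-hourly values for a February

u = rng.normal(3.0, 4.0, n)                   # eastward wind component (m/s)
v = rng.normal(2.0, 4.0, n)                   # northward wind component (m/s)

speed = np.hypot(u, v)
# Meteorological direction the wind is blowing FROM, in degrees from north.
direction_from = np.degrees(np.arctan2(-u, -v)) % 360.0

from_southwest = (direction_from >= 180.0) & (direction_from <= 270.0)
i_na = np.where(from_southwest, speed, 0.0).mean()

print(f"I_na for the month: {i_na:.2f} m/s "
      f"({from_southwest.mean() * 100:.0f}% of observations from the SW)")
```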
The Status of the NASA MEaSUREs Combined ASTER and MODIS Emissivity Over Land (CAMEL) Products
NASA Astrophysics Data System (ADS)
Borbas, E. E.; Feltz, M.; Hulley, G. C.; Knuteson, R. O.; Hook, S. J.
2017-12-01
As part of a NASA MEaSUREs Land Surface Temperature and Emissivity project, the University of Wisconsin-Madison Space Science and Engineering Center and NASA's Jet Propulsion Laboratory have developed a global monthly mean emissivity Earth System Data Record (ESDR). The CAMEL ESDR was produced by merging two current state-of-the-art emissivity datasets: the UW-Madison MODIS Infrared emissivity dataset (UWIREMIS) and the JPL ASTER Global Emissivity Dataset v4 (GEDv4). The dataset includes monthly global records of emissivity, uncertainty at 13 hinge points between 3.6-14.3 µm, and Principal Components Analysis (PCA) coefficients at 5 kilometer resolution for the years 2003 to 2015. A high spectral resolution algorithm is also provided for HSR applications. The dataset is currently being tested in sounder retrieval algorithms (e.g., CrIS, IASI) and has already been implemented in RTTOV-12 for immediate use in numerical weather modeling and data assimilation. This poster will present the current status of the dataset.
CIFAR10-DVS: An Event-Stream Dataset for Object Classification
Li, Hongmin; Liu, Hanchao; Ji, Xiangyang; Li, Guoqi; Shi, Luping
2017-01-01
Neuromorphic vision research requires high-quality and appropriately challenging event-stream datasets to support continuous improvement of algorithms and methods. However, creating event-stream datasets is a time-consuming task, as they need to be recorded using neuromorphic cameras. Currently, there are limited event-stream datasets available. In this work, by utilizing the popular computer vision dataset CIFAR-10, we converted 10,000 frame-based images into 10,000 event streams using a dynamic vision sensor (DVS), providing an event-stream dataset of intermediate difficulty in 10 different classes, named “CIFAR10-DVS.” The conversion of the event-stream dataset was implemented by a repeated closed-loop smooth (RCLS) movement of the frame-based images. Unlike conversion by moving the camera, the image movement is more realistic with respect to practical applications. The repeated closed-loop image movement generates rich local intensity changes in continuous time which are quantized by each pixel of the DVS camera to generate events. Furthermore, a performance benchmark in event-driven object classification is provided based on state-of-the-art classification algorithms. This work provides a large event-stream dataset and an initial benchmark for comparison, which may boost algorithm developments in event-driven pattern recognition and object classification. PMID:28611582
Land Capability Potential Index (LCPI) for the Lower Missouri River Valley
Jacobson, Robert B.; Chojnacki, Kimberly A.; Reuter, Joanna M.
2007-01-01
The Land Capability Potential Index (LCPI) was developed to serve as a relatively coarse-scale index to delineate broad land capability classes in the valley of the Lower Missouri River. The index integrates fundamental factors that determine suitability of land for various uses, and may provide a useful mechanism to guide land-management decisions. The LCPI was constructed from integration of hydrology, hydraulics, land-surface elevations, and soil permeability (or saturated hydraulic conductivity) datasets for an area of the Lower Missouri River, river miles 423–670. The LCPI estimates relative wetness based on intersecting water-surface elevations, interpolated from measurements or calculated from hydraulic models, with a high-resolution land-surface elevation dataset. The potential for wet areas to retain or drain water is assessed using soil-drainage classes that are estimated from saturated hydraulic conductivity of surface soils. Terrain mapping that delineates areas with convex, concave, and flat parts of the landscape provides another means to assess tendency of landscape patches to retain surface water.
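The wetness component described above (intersecting water-surface elevations with a land-surface elevation dataset, then factoring in soil drainage) can be sketched with simple raster arrays. The arrays, the exceedance water level and the class breaks below are illustrative placeholders, not the LCPI's actual datasets or thresholds.

```python
# A minimal raster-style sketch of the wetness component described for the LCPI:
# compare an interpolated water-surface elevation with land-surface elevation,
# then combine the resulting inundation depth with a soil-drainage class. The
# arrays, water level and class breaks are illustrative only.
import numpy as np

rng = np.random.default_rng(6)
shape = (50, 50)

land_elev = 180.0 + rng.normal(0, 1.5, shape)      # m, valley-bottom DEM
water_surface = np.full(shape, 181.0)              # m, e.g. a given exceedance flow
soil_drainage = rng.integers(1, 4, shape)          # 1=poor, 2=moderate, 3=well drained

depth = water_surface - land_elev                  # positive = inundated
wetness_class = np.select(
    [depth > 1.0, depth > 0.0],
    ["frequently wet", "occasionally wet"],
    default="dry",
)

# Poorly drained soils keep "occasionally wet" cells wet longer.
retains_water = (wetness_class == "occasionally wet") & (soil_drainage == 1)
print("inundated fraction:", np.mean(depth > 0).round(2))
print("cells likely to retain water:", int(retains_water.sum()))
```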
NASA Astrophysics Data System (ADS)
Li, Hongsong; Lyu, Hang; Liao, Ningfang; Wu, Wenmin
2016-12-01
The bidirectional reflectance distribution function (BRDF) data in the ultraviolet (UV) band are valuable for many applications including cultural heritage, material analysis, surface characterization, and trace detection. We present a BRDF measurement instrument working in the near- and middle-UV spectral range. The instrument includes a collimated UV light source, a rotation stage, a UV imaging spectrometer, and a control computer. The data captured by the proposed instrument describe spatial, spectral, and angular variations of the light scattering from a sample surface. Such a multidimensional dataset of an example sample is captured by the proposed instrument and analyzed by a k-means clustering algorithm to separate surface regions with the same material but different surface roughness. The clustering results show that the angular dimension of the dataset can be exploited for surface roughness characterization. The two clustered BRDFs are fitted to a theoretical BRDF model. The fitting results show good agreement between the measurement data and the theoretical model.
Brokering technologies to realize the hydrology scenario in NSF BCube
NASA Astrophysics Data System (ADS)
Boldrini, Enrico; Easton, Zachary; Fuka, Daniel; Pearlman, Jay; Nativi, Stefano
2015-04-01
In the National Science Foundation (NSF) BCube project an international team composed of cyber infrastructure experts, geoscientists, social scientists and educators are working together to explore the use of brokering technologies, initially focusing on four domains: hydrology, oceans, polar, and weather. In the hydrology domain, environmental models are fundamental to understanding the behaviour of hydrological systems. A specific model usually requires datasets coming from different disciplines for its initialization (e.g. elevation models from Earth observation, weather data from Atmospheric sciences, etc.). Scientific datasets are usually available on heterogeneous publishing services, such as inventory and access services (e.g. OGC Web Coverage Service, THREDDS Data Server, etc.). Indeed, datasets are published according to different protocols; moreover, they usually come in different formats, resolutions, and Coordinate Reference Systems (CRSs): in short, different grid environments depending on the original data and the publishing service processing capabilities. Scientists can thus be impeded by the burden of discovering, accessing and normalizing the desired datasets to the grid environment required by the model. These technological tasks divert scientists from their main scientific goals. The GI-axe brokering framework has been tested in a hydrology scenario where scientists needed to compare a particular hydrological model with two different input datasets (digital elevation models): - the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) dataset, v.2. - the Shuttle Radar Topography Mission (SRTM) dataset, v.3. These datasets were published by means of Hyrax Server technology, which can provide NetCDF files at their original resolution and CRS. Scientists had their model running on ArcGIS, so the main goal was to import the datasets using the available ArcPy library, with EPSG:4326 as the reference system and a common resolution grid, so that model outputs could be compared. ArcPy, however, is able to access only GeoTIFF datasets that are published by an OGC Web Coverage Service (WCS). The GI-axe broker was then deployed between the client application and the data providers. It was configured to broker the two different Hyrax service endpoints and republish the data content through a WCS interface for use by the ArcPy library. Finally, scientists were able to easily run the model and to concentrate on the comparison of the different results obtained according to the selected input dataset. The use of a third-party broker to perform such technological tasks has also been shown to have the potential advantage of increasing the repeatability of a study among different researchers.
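As a standalone illustration of the kind of request the brokered WCS interface serves, the sketch below builds a standard WCS 1.0.0 GetCoverage call for a GeoTIFF subset in EPSG:4326 using plain HTTP. The endpoint URL, coverage name and bounding box are placeholders, not the actual GI-axe/BCube service.

```python
# A standalone illustration of the kind of request the brokered WCS interface
# serves: a WCS 1.0.0 GetCoverage call returning a GeoTIFF subset in EPSG:4326.
# The endpoint URL, coverage name and bounding box are placeholders, not the
# actual GI-axe/BCube service.
import requests

ENDPOINT = "https://example.org/broker/wcs"        # hypothetical brokered endpoint

params = {
    "service": "WCS",
    "version": "1.0.0",
    "request": "GetCoverage",
    "coverage": "SRTM_v3",                         # hypothetical coverage name
    "crs": "EPSG:4326",
    "bbox": "11.0,45.0,12.0,46.0",                 # minx,miny,maxx,maxy
    "width": "1200",
    "height": "1200",
    "format": "GeoTIFF",
}

def fetch_dem(path="srtm_subset.tif"):
    """Download the requested coverage and save it to disk."""
    resp = requests.get(ENDPOINT, params=params, timeout=120)
    resp.raise_for_status()
    with open(path, "wb") as fh:
        fh.write(resp.content)
    return path

if __name__ == "__main__":
    print("saved:", fetch_dem())
```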
DOMstudio: an integrated workflow for Digital Outcrop Model reconstruction and interpretation
NASA Astrophysics Data System (ADS)
Bistacchi, Andrea
2015-04-01
Different Remote Sensing technologies, including photogrammetry and LIDAR, allow collecting 3D datasets that can be used to create 3D digital representations of outcrop surfaces, called Digital Outcrop Models (DOM), or sometimes Virtual Outcrop Models (VOM). Irrespective of the Remote Sensing technique used, DOMs can be represented either by photorealistic point clouds (PC-DOM) or textured surfaces (TS-DOM). The former are datasets composed of millions of points with XYZ coordinates and RGB colour, whilst the latter are triangulated surfaces onto which images of the outcrop have been mapped or "textured" (applying a technology originally developed for movies and videogames). Here we present a workflow that allows both kinds of dataset, PC-DOMs and TS-DOMs, to be exploited in an integrated and efficient, yet flexible way. The workflow is composed of three main steps: (1) data collection and processing, (2) interpretation, and (3) modelling. Data collection can be performed with photogrammetry, LIDAR, or other techniques. The quality of photogrammetric datasets obtained with Structure From Motion (SFM) techniques has shown a tremendous improvement over the past few years, and this is becoming the most effective way to collect DOM datasets. The main advantages of photogrammetry over LIDAR are represented by the very simple and lightweight field equipment (a digital camera), and by the arbitrary spatial resolution, which can be increased simply by getting closer to the outcrop or by using a different lens. It must be noted that concerns about the precision of close-range photogrammetric surveys, which were justified in the past, are no longer a problem if modern software and acquisition schemas are applied. In any case, LIDAR is a well-tested technology and it is still very common. Irrespective of the data collection technology, the output will be a photorealistic point cloud and a collection of oriented photos, plus additional imagery in special projects (e.g. infrared images). This dataset can be used as-is (PC-DOM), or a 3D triangulated surface can be interpolated from the point cloud, and images can be used to associate a texture to this surface (TS-DOM). In the DOMstudio workflow we use both PC-DOMs and TS-DOMs. Particularly, the latter are obtained by projecting the original images onto the triangulated surface, without any downsampling, thus retaining the original resolution and quality of images collected in the field. In the DOMstudio interpretation step, the PC-DOM is considered the best option for fracture analysis in outcrops where facets corresponding to fractures are present. This allows obtaining orientation statistics (e.g. stereoplots, Fisher statistics, etc.) directly from a point cloud where, for each point, the unit vector normal to the outcrop surface has been calculated. A recent development in this kind of processing is represented by the possibility to automatically select (segment) subset point clouds representing single fracture surfaces, which can be used for studies on fracture length, spacing, etc., allowing parameters like the frequency-length distribution, P21, etc. to be obtained. PC-DOM interpretation can be combined or complemented, depending on the outcrop morphology, with an interpretation carried out on a TS-DOM in terms of traces, which are the linear intersections of "geological" surfaces (fractures, faults, bedding, etc.) with the outcrop surface.
This kind of interpretation is very well suited for outcrops with smooth surfaces, and can be performed either by manual picking, or by applying image analysis techniques to the images associated with the DOM. In this case, a huge mass of data, with very high resolution, can be collected very effectively. If we consider applications like lithological or mineral mapping, TS-DOM datasets are the only suitable support. Finally, the DOMstudio workflow produces output in formats that are compatible with all common geomodelling packages (e.g. Gocad/Skua, Petrel, Move), allowing the quantitative data collected on DOMs to be used directly to generate and calibrate geological, structural, or geostatistical models. I will present examples of applications including hydrocarbon reservoir analogue studies, studies on fault zone architecture, lithological mapping on sedimentary and metamorphic rocks, and studies on the surfaces of planets and small bodies in the Solar System.
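The per-point normals used for the orientation statistics mentioned above are commonly estimated by local principal component analysis of each point's neighbourhood; the sketch below shows that generic approach (not the DOMstudio implementation) and converts the normals to dip / dip-direction angles on a toy, synthetically generated dipping plane.

```python
# A generic sketch (not the DOMstudio implementation) of one step described in
# the workflow: estimate per-point normals on a point cloud by local PCA, then
# convert them to dip / dip-direction angles for orientation statistics.
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=12):
    """Normal of each point = direction of smallest variance of its k nearest neighbours."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, neighbours in enumerate(idx):
        local = points[neighbours] - points[neighbours].mean(axis=0)
        _, _, vt = np.linalg.svd(local, full_matrices=False)
        n = vt[-1]
        normals[i] = n if n[2] >= 0 else -n      # orient upward consistently
    return normals

def dip_and_direction(normals):
    """Convert unit normals to dip angle and dip direction (degrees)."""
    dip = np.degrees(np.arccos(np.clip(normals[:, 2], -1, 1)))
    dip_dir = np.degrees(np.arctan2(normals[:, 0], normals[:, 1])) % 360.0
    return dip, dip_dir

# Toy outcrop: a noisy plane dipping ~30 degrees toward the east (+x).
rng = np.random.default_rng(7)
xy = rng.uniform(0, 10, (2000, 2))
z = -np.tan(np.radians(30)) * xy[:, 0] + rng.normal(0, 0.02, 2000)
cloud = np.column_stack([xy, z])

dip, dip_dir = dip_and_direction(estimate_normals(cloud))
print(f"mean dip: {dip.mean():.1f} deg, mean dip direction: {dip_dir.mean():.0f} deg")
```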
NASA Astrophysics Data System (ADS)
Cao, Faxian; Yang, Zhijing; Ren, Jinchang; Ling, Wing-Kuen; Zhao, Huimin; Marshall, Stephen
2017-12-01
Although the sparse multinomial logistic regression (SMLR) has provided a useful tool for sparse classification, it suffers from inefficacy in dealing with high-dimensional features and manually set initial regressor values. This has significantly constrained its applications for hyperspectral image (HSI) classification. In order to tackle these two drawbacks, an extreme sparse multinomial logistic regression (ESMLR) is proposed for effective classification of HSI. First, the HSI dataset is projected to a new feature space with randomly generated weights and biases. Second, an optimization model is established by the Lagrange multiplier method and the dual principle to automatically determine a good initial regressor for SMLR via minimizing the training error and the regressor value. Furthermore, the extended multi-attribute profiles (EMAPs) are utilized for extracting both the spectral and spatial features. A combinational linear multiple features learning (MFL) method is proposed to further enhance the features extracted by ESMLR and EMAPs. Finally, the logistic regression via variable splitting and the augmented Lagrangian (LORSAL) is adopted in the proposed framework to reduce the computational time. Experiments are conducted on two well-known HSI datasets, namely the Indian Pines dataset and the Pavia University dataset, which demonstrate the fast and robust performance of the proposed ESMLR framework.
Wave equation datuming applied to marine OBS data and to land high resolution seismic profiling
NASA Astrophysics Data System (ADS)
Barison, Erika; Brancatelli, Giuseppe; Nicolich, Rinaldo; Accaino, Flavio; Giustiniani, Michela; Tinivella, Umberta
2011-03-01
One key step in seismic data processing flows is the computation of static corrections, which relocate shots and receivers to the same datum plane and remove near-surface weathering effects. We applied a standard static correction and wave equation datuming and compared the results in two case studies: 1) a sparse ocean bottom seismometer dataset for deep crustal prospecting; 2) a high-resolution land reflection dataset for hydrogeological investigation. In both cases, a detailed velocity field, obtained by tomographic inversion of the first breaks, was adopted to relocate shots and receivers to the datum plane. The results emphasize the importance of wave equation datuming to properly handle complex near-surface conditions. In the first dataset, the deployed ocean bottom seismometers were relocated to the sea level (shot positions) and a standard processing sequence was subsequently applied to the output. In the second dataset, the application of wave equation datuming allowed us to remove coherent noise, such as ground roll, and to improve the image quality with respect to the application of the static correction. The comparison of the two approaches shows that the main reflecting markers are better resolved when the wave equation datuming procedure is adopted.
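For context on the conventional approach the paper compares against (not the wave equation datuming itself), an elevation static simply shifts each trace by the time needed to move source and receiver to the datum at a replacement velocity. The worked example below uses illustrative numbers only.

```python
# A small worked example of the conventional elevation static the paper compares
# against (not the wave equation datuming itself): shift each trace by the time
# needed to move source and receiver to the datum plane at a replacement
# velocity. All numbers are illustrative.
def elevation_static_ms(elev_src, elev_rec, datum, v_repl):
    """Total static shift (ms) for one shot-receiver pair."""
    t_src = (elev_src - datum) / v_repl
    t_rec = (elev_rec - datum) / v_repl
    return 1000.0 * (t_src + t_rec)

datum = 100.0          # m above sea level
v_replacement = 1800.0 # m/s, assumed near-surface replacement velocity

print(f"{elevation_static_ms(143.0, 127.0, datum, v_replacement):.1f} ms")
# -> (43 + 27) / 1800 s, i.e. about 38.9 ms stripped from the trace times
```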
Historical instrumental climate data for Australia - quality and utility for palaeoclimatic studies
NASA Astrophysics Data System (ADS)
Nicholls, Neville; Collins, Dean; Trewin, Blair; Hope, Pandora
2006-10-01
The quality and availability of climate data suitable for palaeoclimatic calibration and verification for the Australian region are discussed and documented. Details of the various datasets, including problems with the data, are presented. High-quality datasets, where such problems are reduced or even eliminated, are discussed. Many climate datasets are now analysed onto grids, facilitating the preparation of regional-average time series. Work is under way to produce such high-quality, gridded datasets for a variety of hitherto unavailable climate data, including surface humidity, pan evaporation, wind, and cloud. An experiment suggests that only a relatively small number of palaeoclimatic time series could provide a useful estimate of long-term changes in Australian annual average temperature.
Influence of spatial and temporal scales in identifying temperature extremes
NASA Astrophysics Data System (ADS)
van Eck, Christel M.; Friedlingstein, Pierre; Mulder, Vera L.; Regnier, Pierre A. G.
2016-04-01
Extreme heat events are becoming more frequent. Notable are severe heatwaves such as the European heatwave of 2003, the Russian heatwave of 2010 and the Australian heatwave of 2013. Surface temperature is attaining new maxima not only during the summer but also during the winter. The year 2015 is reported to be a record-breaking year for both summer and winter temperatures. These extreme temperatures are taking their human and environmental toll, emphasizing the need for an accurate method to define a heat extreme in order to fully understand the spatial and temporal spread of an extreme and its impact. This research aims to explore how the use of different spatial and temporal scales influences the identification of a heat extreme. For this purpose, two near-surface temperature datasets of different temporal and spatial scales are used. The first is the daily ERA-Interim dataset, at 0.25 degree resolution and spanning 32 years (1979-2010). The second is the daily Princeton Meteorological Forcing Dataset, at 0.5 degree resolution and spanning 63 years (1948-2010). A temperature is considered anomalously extreme when it surpasses the 90th, 95th, or 99th percentile threshold based on the aforementioned pre-processed datasets. The analysis is conducted on a global scale, dividing the world into the IPCC SREX regions developed for the analysis of extreme climate events. Pre-processing is done by detrending and/or subtracting the monthly climatology based on 32 years of data for both datasets and on 63 years of data for only the Princeton Meteorological Forcing Dataset. This results in 6 datasets of temperature anomalies from which the locations in time and space of the anomalously warm days are identified. Comparison of the differences between these 6 datasets in terms of absolute threshold temperatures for extremes and the temporal and spatial spread of the extreme anomalously warm days shows a dependence of the results on the datasets and methodology used. This stresses the need for a careful selection of data and methodology when identifying heat extremes.
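The identification step described above (remove the climatology, then flag days whose anomaly exceeds a percentile threshold) is shown below on a synthetic daily series; for brevity the sketch skips detrending and uses made-up data, so it illustrates the method rather than reproducing the study's preprocessing.

```python
# A minimal sketch of the extreme-identification step described above: remove
# the monthly climatology from a daily temperature series, then flag days whose
# anomaly exceeds the 90th, 95th or 99th percentile threshold. Data are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(8)
dates = pd.date_range("1979-01-01", "2010-12-31", freq="D")
seasonal = 10 * np.sin(2 * np.pi * (dates.dayofyear - 30) / 365.25)
temps = pd.Series(15 + seasonal + rng.normal(0, 3, len(dates)), index=dates)

# Anomalies relative to the monthly climatology (no detrending in this sketch).
climatology = temps.groupby(temps.index.month).transform("mean")
anomaly = temps - climatology

for pct in (90, 95, 99):
    threshold = np.percentile(anomaly, pct)
    n_days = int((anomaly > threshold).sum())
    print(f"{pct}th percentile threshold: {threshold:+.2f} deg C "
          f"-> {n_days} anomalously warm days")
```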
Structure of the Cascadia Subduction Zone Imaged Using Surface Wave Tomography
NASA Astrophysics Data System (ADS)
Schaeffer, A. J.; Audet, P.
2017-12-01
Studies of the complete structure of the Cascadia subduction zone from the ridge to the arc have historically been limited by the lack of offshore ocean bottom seismograph (OBS) infrastructure. On land, numerous dense seismic deployments have illuminated detailed structures and dynamics associated with the interaction between the subducting oceanic plate and the overriding continental plate, including the cycling of fluids, serpentinization of the overlying forearc mantle wedge, and the location of the upper surface of the Juan de Fuca plate as it subducts beneath the Pacific Northwest. In the last half-decade, the Cascadia Initiative (CI), along with Neptune (ONC) and several other OBS initiatives, has instrumented both the continental shelf and abyssal plains offshore of the Cascadia subduction zone, facilitating the construction of a complete picture of the subduction zone from ridge to trench and volcanic arc. In this study, we present a preliminary azimuthally anisotropic, surface-wave phase-velocity-based model of the complete system, capturing the young, unaltered Juan de Fuca plate at the ridge, its alteration as it enters the subduction zone, and the overlying continent. This model is constructed from a combination of ambient noise cross-correlations and teleseismic two-station interferometry, and combines concurrently running offshore OBS and onshore stations. We furthermore perform a number of representative 1D depth inversions for shear velocity to categorize the pristine oceanic, subducted oceanic, and continental crust and lithospheric structure. In the future, the dispersion dataset will be jointly inverted with receiver functions to constrain a 3D shear-velocity model of the complete region.
Konias, Sokratis; Chouvarda, Ioanna; Vlahavas, Ioannis; Maglaveras, Nicos
2005-09-01
Current approaches for mining association rules usually assume that the mining is performed on a static database, where the problem of missing attribute values does not practically exist. However, these assumptions do not hold in some medical databases, such as a home care system. In this paper, a novel uncertainty rule algorithm is illustrated, namely URG-2 (Uncertainty Rule Generator), which addresses the problem of mining dynamic databases containing missing values. This algorithm requires only one pass over the initial dataset in order to generate the itemsets, while new metrics corresponding to the notions of Support and Confidence are used. URG-2 was evaluated over two medical databases, randomly introducing multiple missing values for each record's attributes (rate: 5-20% in 5% increments) in the initial dataset. Compared with the classical approach (records with missing values are ignored), the proposed algorithm was more robust in mining rules from datasets containing missing values. In all cases, the difference in preserving the initial rules ranged between 30% and 60% in favour of URG-2. Moreover, due to its incremental nature, URG-2 saved over 90% of the time required for thorough re-mining. Thus, the proposed algorithm can offer a preferable solution for mining in dynamic relational databases.
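The published URG-2 metrics are not reproduced here; the sketch below only illustrates the general idea of adjusting support and confidence in the presence of missing values, by counting a record only when the attributes involved in the rule are actually observed. The record fields and the rule are made up for illustration.

```python
# A simplified illustration of adjusting support and confidence when attribute
# values are missing: count a record only if the attributes involved in the rule
# are actually observed (None marks a missing value). This is the general idea,
# not the published URG-2 metrics.
def support_confidence(records, antecedent, consequent):
    """Rule 'antecedent -> consequent' over dict records with possible None values."""
    keys = {**antecedent, **consequent}
    observed = [r for r in records if all(r.get(k) is not None for k in keys)]
    if not observed:
        return 0.0, 0.0
    has_ante = [r for r in observed if all(r[k] == v for k, v in antecedent.items())]
    has_both = [r for r in has_ante if all(r[k] == v for k, v in consequent.items())]
    support = len(has_both) / len(observed)
    confidence = len(has_both) / len(has_ante) if has_ante else 0.0
    return support, confidence

records = [
    {"hypertension": True,  "age_over_65": True,  "readmitted": True},
    {"hypertension": True,  "age_over_65": None,  "readmitted": True},   # missing value
    {"hypertension": True,  "age_over_65": True,  "readmitted": False},
    {"hypertension": False, "age_over_65": True,  "readmitted": None},   # missing value
    {"hypertension": True,  "age_over_65": True,  "readmitted": True},
]

s, c = support_confidence(records, {"hypertension": True, "age_over_65": True},
                          {"readmitted": True})
print(f"support = {s:.2f}, confidence = {c:.2f}")
```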
EnviroAtlas - Austin, TX - Impervious Proximity Gradient
At any given 1-square-meter point in this EnviroAtlas dataset, the value shown gives the percentage of impervious surface within the 1 square kilometer centered on that point. Water is shown as '-99999' in this dataset to distinguish it from land areas with low impervious cover. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
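A minimal sketch of the kind of moving-window computation behind such a proximity gradient: for each cell, the percentage of impervious surface within a neighborhood approximating 1 km², with water masked using the dataset's -99999 flag. The grid size, 10 m cell size, and square (rather than circular) window are assumptions for illustration, not the EPA's actual processing.

```python
# Hypothetical moving-window impervious percentage with a water mask.
import numpy as np
from scipy.ndimage import uniform_filter

impervious = np.random.rand(500, 500) < 0.3   # boolean impervious mask (stand-in)
water = np.zeros_like(impervious, dtype=bool)
water[200:220, 100:140] = True                # a hypothetical water body

cell_size_m = 10                              # assumed cell size
window_cells = int(1000 / cell_size_m)        # ~1 km square window

percent_impervious = uniform_filter(impervious.astype(float), size=window_cells) * 100
percent_impervious[water] = -99999            # water flag used by the dataset
```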
An update on sORFs.org: a repository of small ORFs identified by ribosome profiling.
Olexiouk, Volodimir; Van Criekinge, Wim; Menschaert, Gerben
2018-01-04
sORFs.org (http://www.sorfs.org) is a public repository of small open reading frames (sORFs) identified by ribosome profiling (RIBO-seq). This update elaborates on the major improvements implemented since its initial release. sORFs.org now additionally supports three more species (zebrafish, rat and Caenorhabditis elegans) and currently includes 78 RIBO-seq datasets, a vast increase compared to the three that were processed in the initial release. To this end, a novel pipeline was constructed that enables sORF detection in RIBO-seq datasets comprising solely elongating RIBO-seq data, whereas previously matching initiating RIBO-seq data were necessary to delineate the sORFs. Furthermore, a novel noise-filtering algorithm was designed that is able to distinguish sORFs with true ribosomal activity from simulated noise, consequently reducing the false-positive identification rate. The inclusion of other species also led to the development of an inner BLAST pipeline, assessing sequence similarity between sORFs in the repository. Building on the proof-of-concept model in the initial release of sORFs.org, a full PRIDE-ReSpin pipeline has now been released, reprocessing publicly available MS-based proteomics PRIDE datasets and reporting on true translation events. Next to reporting those identified peptides, sORFs.org allows visual inspection of the annotated spectra within the Lorikeet MS/MS viewer, thus enabling detailed manual inspection and interpretation. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
NASA Astrophysics Data System (ADS)
Sekiyama, Thomas; Kajino, Mizuo; Kunii, Masaru
2017-04-01
The authors investigated the impact of surface wind velocity data assimilation on the predictability of plume advection in the lower troposphere, exploiting the radioactive cesium emitted by the Fukushima nuclear accident in March 2011 as an atmospheric tracer. This was possible because the radioactive cesium plume was dispersed from a single point source located at the Fukushima Daiichi Nuclear Power Plant, and its surface concentration was measured at many locations with high frequency and high accuracy. In this study we used a non-hydrostatic regional weather prediction model with a horizontal resolution of 3 km, coupled with an ensemble Kalman filter data assimilation system, to simulate the wind velocity and plume advection. The main module of this weather prediction model has been developed and used operationally by the Japan Meteorological Agency (JMA) since before March 2011. The weather observation data assimilated into the model simulation were provided from two sources: [#1] the JMA observation archives collected for numerical weather predictions (NWPs) and [#2] the land-surface wind velocity data archived by the JMA surface weather observation network. The former dataset [#1] does not contain land-surface wind velocity observations because their spatial representativeness is relatively small, and assimilating them normally degrades NWP performance beyond one day. The latter dataset [#2] is usually used for real-time weather monitoring and never for data assimilation in NWPs longer than one day. We conducted two experiments (STD and TEST) to reproduce the radioactive cesium plume behavior for 48 hours from 12 UTC 14 March to 12 UTC 16 March 2011 over the land area of western Japan. The STD experiment was performed to replicate the operational NWP using only the #1 dataset, without assimilating land-surface wind observations. In contrast, the TEST experiment assimilated both the #1 dataset and the #2 dataset, including land-surface wind observations measured at more than 200 stations in the model domain. The meteorological boundary conditions for both experiments were imported from the JMA operational global NWP model results. The modeled radioactive cesium concentrations were examined for plume arrival timing at each observatory by comparison with the hourly measured "suspended particulate matter" filter-tape cesium concentrations retrieved by Tsuruta et al. at more than 40 observatories. The average difference in plume arrival times at 40 observatories between the observations and the STD experiment was 82.0 minutes; at this time, the forecast period was 13 hours on average. Meanwhile, the average difference for the TEST experiment was 72.8 minutes, which was smaller than that of the STD experiment with a statistical significance of 99.2%. In summary, land-surface wind velocity data assimilation improves the predictability of plume advection in the lower troposphere, at least in the case of wintertime air pollution over complex terrain. We need further investigation of the data assimilation impact of land-surface weather observations on the predictability of pollutant dispersion, especially in the planetary boundary layer.
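A hedged sketch of how two experiments' arrival-time errors at a set of observatories could be compared for significance. The abstract does not state which test was used, so the paired t-test and the synthetic error values below are illustrative assumptions only.

```python
# Illustrative comparison of STD vs TEST plume arrival-time errors (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sites = 40
err_std = np.abs(rng.normal(82.0, 30.0, n_sites))             # STD errors (minutes)
err_test = err_std - np.abs(rng.normal(9.0, 10.0, n_sites))   # TEST errors (minutes)

t_stat, p_value = stats.ttest_rel(err_std, err_test)          # paired test across sites
print(f"mean STD error:  {err_std.mean():.1f} min")
print(f"mean TEST error: {err_test.mean():.1f} min")
print(f"paired t-test p-value: {p_value:.4f}")
```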
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lydia Vaughn; Margaret Torn; Rachel Porras
Dataset includes Delta14C measurements made from CO2 that was collected and purified in 2012-2014 from surface soil chambers, soil pore space, and background atmosphere. In addition to 14CO2 data, dataset includes co-located measurements of CO2 and CH4 flux, soil and air temperature, and soil moisture. Measurements and field samples were taken from intensive study site 1 areas A, B, and C, and the site 0 and AB transects, from specified positions in high-centered, flat-centered, and low centered polygons.
Evaluating new SMAP soil moisture for drought monitoring in the rangelands of the US High Plains
Velpuri, Naga Manohar; Senay, Gabriel B.; Morisette, Jeffrey T.
2016-01-01
Level 3 soil moisture datasets from the recently launched Soil Moisture Active Passive (SMAP) satellite are evaluated for drought monitoring in rangelands. Validation of SMAP soil moisture (SSM) against in situ and modeled estimates showed a high level of agreement. SSM showed the highest correlation with surface soil moisture (0-5 cm) and a strong correlation at depths up to 20 cm. SSM showed a reliable and expected response in capturing seasonal dynamics in relation to precipitation, land surface temperature, and evapotranspiration. Further evaluation using multi-year SMAP datasets is necessary to quantify the full benefits and limitations for drought monitoring in rangelands.
NASA Astrophysics Data System (ADS)
Zhou, J.; Ding, L.
2017-12-01
Land surface air temperature (SAT) is an important parameter in the modeling of the radiation balance and energy budget of the earth surface. Generally, SAT is measured at ground meteorological stations, and SAT maps are then produced through spatial interpolation. The interpolated SAT map depends on the spatial distribution of ground stations, the terrain, and many other factors, and thus has large uncertainties in regions with complicated terrain. Alternatively, SAT maps can be obtained through physical modeling of the interactions between the land surface and the atmosphere. Such datasets generally have coarse spatial resolution (e.g., coarser than 0.1°) and cannot satisfy applications at fine scales, e.g., 1 km. This presentation reports the reconstruction of a three-hourly 1-km SAT dataset from 2001 to 2015 over the Qinghai-Tibet Plateau. The terrain of the Qinghai-Tibet Plateau, especially in the eastern part, is extremely complicated. Two SAT datasets with good quality are used in this study. The first is the 3-h China Meteorological Forcing Dataset with a 0.1° resolution released by the Institute of Tibetan Plateau Research, Chinese Academy of Sciences (Yang et al., 2010); the second is the ERA-Interim product with the same temporal resolution and a 0.125° resolution. A statistical approach is developed to downscale the spatial resolution of the derived SAT to 1 km. The elevation and the normalized difference vegetation index (NDVI) are selected as two scaling factors in the downscaling approach. Results demonstrate a significant negative correlation between SAT and elevation in all seasons; there is also a significant negative correlation between SAT and NDVI in the vegetation growth seasons, while the correlation weakens in the other seasons. Therefore, a temporally dynamic downscaling approach is feasible for enhancing the spatial resolution of the SAT. Compared with the SAT at 0.1° or 0.125°, the reconstructed 1-km SAT provides much more spatial detail in areas with complicated terrain. Additionally, the 1-km SAT agrees well with ground-measured air temperatures as well as with the SAT before downscaling. The reconstructed SAT will be beneficial for the modeling of the surface radiation balance and energy budget over the Qinghai-Tibet Plateau.
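A minimal sketch of a regression-based downscaling step of the kind described above, assuming a linear relation between coarse SAT and the two scaling factors (elevation, NDVI) that is then applied at the fine resolution with a residual correction. The regression form, nearest-neighbour upsampling, and synthetic inputs are assumptions; the study's actual statistical approach may differ.

```python
# Hypothetical elevation/NDVI-based downscaling with residual correction.
import numpy as np

def downscale_sat(sat_coarse, elev_coarse, ndvi_coarse, elev_fine, ndvi_fine, upsample):
    """All inputs are 2-D arrays; `upsample` maps a coarse grid onto the fine grid."""
    X = np.column_stack([np.ones(sat_coarse.size),
                         elev_coarse.ravel(), ndvi_coarse.ravel()])
    coef, *_ = np.linalg.lstsq(X, sat_coarse.ravel(), rcond=None)

    # Predict at the fine resolution, then add back interpolated coarse residuals.
    sat_fine = coef[0] + coef[1] * elev_fine + coef[2] * ndvi_fine
    residual_coarse = sat_coarse - (coef[0] + coef[1] * elev_coarse + coef[2] * ndvi_coarse)
    return sat_fine + upsample(residual_coarse)

# Example: nearest-neighbour upsampling from a coarse grid to a 10x finer grid.
upsample = lambda a: np.kron(a, np.ones((10, 10)))
elev_c = np.random.rand(20, 20) * 4000
ndvi_c = np.random.rand(20, 20)
sat_c = 25 - 0.0065 * elev_c + 2 * ndvi_c + np.random.normal(0, 0.5, (20, 20))
sat_1km = downscale_sat(sat_c, elev_c, ndvi_c, upsample(elev_c), upsample(ndvi_c), upsample)
```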
NASA Technical Reports Server (NTRS)
Johnson, Matthew Stephen
2017-01-01
A primary objective for TOLNet is the evaluation and validation of space-based tropospheric O3 retrievals from future systems such as the Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite. This study is designed to evaluate the tropopause-based O3 climatology (TB-Clim) dataset which will be used as the a priori profile information in TEMPO O3 retrievals. This study also evaluates model simulated O3 profiles, which could potentially serve as a priori O3 profile information in TEMPO retrievals, from near-real-time (NRT) data assimilation model products (NASA Global Modeling and Assimilation Office (GMAO) Goddard Earth Observing System (GEOS-5) Forward Processing (FP) and Modern-Era Retrospective analysis for Research and Applications version 2 (MERRA2)) and full chemical transport model (CTM), GEOS-Chem, simulations. The TB-Clim dataset and model products are evaluated with surface (0-2 km) and tropospheric (0-10 km) TOLNet observations to demonstrate the accuracy of the suggested a priori dataset and information which could potentially be used in TEMPO O3 algorithms. This study also presents the impact of individual a priori profile sources on the accuracy of theoretical TEMPO O3 retrievals in the troposphere and at the surface. Preliminary results indicate that while the TB-Clim climatological dataset can replicate seasonally-averaged tropospheric O3 profiles observed by TOLNet, model-simulated profiles from a full CTM (GEOS-Chem is used as a proxy for CTM O3 predictions) resulted in more accurate tropospheric and surface-level O3 retrievals from TEMPO when compared to hourly (diurnal cycle evaluation) and daily-averaged (daily variability evaluation) TOLNet observations. Furthermore, it was determined that when large daily-averaged surface O3 mixing ratios are observed (65 ppb), which are important for air quality purposes, TEMPO retrieval values at the surface display higher correlations and less bias when applying CTM a priori profile information compared to all other data products. The primary reason for this is that CTM predictions better capture the spatio-temporal variability of the vertical profiles of observed tropospheric O3 compared to the TB-Clim dataset and other NRT data assimilation models evaluated during this study.
Initial Development and Validation of the Global Citizenship Scale
ERIC Educational Resources Information Center
Morais, Duarte B.; Ogden, Anthony C.
2011-01-01
The purpose of this article is to report on the initial development of a theoretically grounded and empirically validated scale to measure global citizenship. The methodology employed is multi-faceted, including two expert face validity trials, extensive exploratory and confirmatory factor analyses with multiple datasets, and a series of three…
NASA Astrophysics Data System (ADS)
Dafflon, B.; Hubbard, S. S.; Ulrich, C.; Peterson, J. E.; Wu, Y.; Wainwright, H. M.; Gangodagamage, C.; Kholodov, A. L.; Kneafsey, T. J.
2013-12-01
Improvement in parameterizing Arctic process-rich terrestrial models to simulate feedbacks to a changing climate requires advances in estimating the spatiotemporal variations in active layer and permafrost properties - in sufficiently high resolution yet over modeling-relevant scales. As part of the DOE Next-Generation Ecosystem Experiments (NGEE-Arctic), we are developing advanced strategies for imaging the subsurface and for investigating land and subsurface co-variability and dynamics. Our studies include the acquisition and integration of various measurements, including point-based, surface-based geophysical, and remote sensing datasets. These data have been collected during a series of campaigns at the NGEE Barrow, AK site along transects that traverse a range of hydrological and geomorphological conditions, including low- to high-centered polygons and drained thaw lake basins. In this study, we describe the use of galvanic-coupled electrical resistance tomography (ERT), capacitively-coupled resistivity (CCR), permafrost cores, above-ground orthophotography, and a digital elevation model (DEM) to (1) explore the complementary nature of and trade-offs between the characterization resolution, spatial extent, and accuracy of the different datasets; (2) develop inversion approaches to quantify permafrost characteristics (such as ice content, ice wedge frequency, and the presence of an unfrozen deep layer); and (3) identify correspondences between permafrost and land surface properties (such as water inundation, topography, and vegetation). In terms of methods, we developed a 1D-based direct search approach to estimate the electrical conductivity distribution while allowing exploration of multiple solutions and prior information in a flexible way. Application of the method to the Barrow datasets reveals the relative information content of each dataset for characterizing permafrost properties, which show variability in features from sub-meter length scales to large trends over more than a kilometer. Further, we used pole- and kite-based low-altitude aerial photography with inferred DEMs, as well as a DEM from a LiDAR dataset, to quantify land-surface properties and their co-variability with the subsurface properties. Comparison of the above- and below-ground characterization information indicates that while some permafrost characteristics correspond with changes in hydrogeomorphological expressions, other features show more complex linkages with landscape properties. Overall, our results indicate that remote sensing data, point-scale measurements, and surface geophysical measurements enable the identification of regional zones having similar relations between subsurface and land surface properties. Identification of such zonation and the associated permafrost-land surface properties can be used to guide investigations of carbon cycling processes and for model parameterization.
XML-BSPM: an XML format for storing Body Surface Potential Map recordings.
Bond, Raymond R; Finlay, Dewar D; Nugent, Chris D; Moore, George
2010-05-14
The Body Surface Potential Map (BSPM) is an electrocardiographic method for recording and displaying the electrical activity of the heart from a spatial perspective. The BSPM has been deemed more accurate for assessing certain cardiac pathologies when compared to the 12-lead ECG. Nevertheless, the 12-lead ECG remains the most popular ECG acquisition method for non-invasively assessing the electrical activity of the heart. Although data from the 12-lead ECG can be stored and shared using open formats such as SCP-ECG, no open formats currently exist for storing and sharing the BSPM. As a result, an innovative format for storing BSPM datasets has been developed within this study. The XML vocabulary was chosen for implementation, as opposed to binary, for the purpose of human readability. There are currently no standards to dictate the number of electrodes and electrode positions for recording a BSPM; in fact, there are at least 11 different BSPM electrode configurations in use today. Therefore, in order to support these BSPM variants, the XML-BSPM format was made versatile. Hence, the format supports the storage of custom torso diagrams using SVG graphics. This diagram can then be used in a 2D coordinate system for retaining electrode positions. The XML-BSPM format has been successfully used to store the Kornreich-117 BSPM dataset and the Lux-192 BSPM dataset. The resulting file sizes were in the region of 277 kilobytes for each BSPM recording, which can be deemed suitable, for example, for use with telemonitoring applications. Moreover, there is potential for file sizes to be further reduced using basic compression algorithms, i.e. the deflate algorithm. Finally, these BSPM files have been parsed and visualised within a convenient time period using a web-based BSPM viewer. This format, if widely adopted, could promote BSPM interoperability, knowledge sharing and data mining. This work could also be used to provide conceptual solutions and inspire existing formats such as DICOM, SCP-ECG and aECG to support the storage of BSPMs. In summary, this research provides initial groundwork for creating a complete BSPM management system.
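The real XML-BSPM schema is not reproduced in the abstract, so the element and attribute names below (bspm, torsoDiagram, electrode, samples) are hypothetical. The sketch only illustrates the general idea of pairing a referenced SVG torso diagram with electrode positions on a 2-D coordinate system and their recorded signals.

```python
# Hypothetical XML-BSPM-style file built with the standard library.
import xml.etree.ElementTree as ET

root = ET.Element("bspm", version="1.0")
ET.SubElement(root, "torsoDiagram", format="svg", href="torso.svg")  # custom torso diagram
electrodes = ET.SubElement(root, "electrodes")
for name, x, y in [("E1", 120, 80), ("E2", 140, 80)]:   # 2-D coordinates on the diagram
    e = ET.SubElement(electrodes, "electrode", id=name, x=str(x), y=str(y))
    ET.SubElement(e, "samples", unit="uV").text = "12 15 14 13"

ET.ElementTree(root).write("example_bspm.xml", xml_declaration=True, encoding="utf-8")
```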
Franesqui, Miguel A; Yepes, Jorge; García-González, Cándida
2017-08-01
This article outlines the ultrasound data employed to calibrate, in the laboratory, an analytical model that permits the calculation of the depth of partial-depth surface-initiated cracks on bituminous pavements using this non-destructive technique. This initial calibration is required so that the model provides sufficient precision during practical application. The ultrasonic pulse transit times were measured on beam samples of different asphalt mixtures (semi-dense asphalt concrete AC-S; asphalt concrete for very thin layers BBTM; and porous asphalt PA). The cracks on the laboratory samples were simulated by means of notches of variable depth. With the data on ultrasound transmission time ratios, curve fittings were carried out on the analytical model, thus determining the regression parameters and their statistical dispersion. The calibrated models obtained from the laboratory datasets were subsequently applied to auscultate the evolution of crack depth after microwave exposure in the research article entitled "Top-down cracking self-healing of asphalt pavements with steel filler from industrial waste applying microwaves" (Franesqui et al., 2017) [1].
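A hedged sketch of the calibration step: fitting a parametric model to measured transit-time ratios against known notch depths. The power-law form and the numbers below are assumptions for illustration; the article's actual analytical model should be substituted where known.

```python
# Illustrative curve fit of transit-time ratio versus notch depth (synthetic data).
import numpy as np
from scipy.optimize import curve_fit

def model(depth_mm, a, b):
    return 1.0 + a * depth_mm**b   # assumed functional form, not the article's model

notch_depth_mm = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
time_ratio = np.array([1.08, 1.19, 1.33, 1.51, 1.72])   # synthetic measurements

params, cov = curve_fit(model, notch_depth_mm, time_ratio, p0=[0.01, 1.0])
perr = np.sqrt(np.diag(cov))                             # statistical dispersion
print("fitted parameters:", params, "+/-", perr)
# Inverting the calibrated model then gives crack depth from a field-measured ratio.
```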
DOIs for Data: Progress in Data Citation and Publication in the Geosciences
NASA Astrophysics Data System (ADS)
Callaghan, S.; Murphy, F.; Tedds, J.; Allan, R.
2012-12-01
Identifiers for data are the bedrock on which data citation and publication rests. These, in their turn, are widely proposed as methods for encouraging researchers to share their datasets, and at the same time receive academic credit for their efforts in producing them. However, neither data citation nor publication can be properly achieved without a method of identifying clearly what is, and what isn't, part of the dataset. Once a dataset becomes part of the scientific record (either through formal data publication or through being cited) then issues such as dataset stability and permanence become vital to address. In the geosciences, several projects in the UK are concentrating on issues of dataset identification, citation and publication. The UK's Natural Environment Research Council's (NERC) Science Information Strategy data citation and publication project is addressing the issue of identifiers for data, stability, transparency, and credit for data producers through data citation. At a data publication level, 2012 has seen the launch of the new Wiley title Geoscience Data Journal and the PREPARDE (Peer Review for Publication & Accreditation of Research Data in the Earth sciences) project, both aiming to encourage data publication by addressing issues such as data paper submission workflows and the scientific peer-review of data. All of these initiatives work with a range of partners including academic institutions, learned societies, data centers and commercial publishers, both nationally and internationally, with a cross-project aim of developing the mechanisms so data can be identified, cited and published with confidence. This involves investigating barriers and drivers to data publishing and sharing, peer review, and re-use of geoscientific datasets, and specifically such topics as dataset requirements for citation, workflows for dataset ingestion into data centers and publishers, procedures and policies for editors, reviewers and authors of data publication, and assessing the trustworthiness of data archives. A key goal is to ensure that these projects reach out to, and are informed by, other related initiatives on a global basis, in particular anyone interested in developing long-term sustainable policies, processes, incentives and business models for managing and publishing research data. This presentation will give an overview of progress in the projects mentioned above, specifically focussing on the use of DOIs for datasets hosted in the NERC environmental data centers, and how DOIs are enabling formal data citation and publication in the geosciences.
Ensemble analyses improve signatures of tumour hypoxia and reveal inter-platform differences
2014-01-01
Background: The reproducibility of transcriptomic biomarkers across datasets remains poor, limiting clinical application. We and others have suggested that this is in part caused by differential error structure between datasets, and its incomplete removal by pre-processing algorithms. Methods: To test this hypothesis, we systematically assessed the effects of pre-processing on biomarker classification using 24 different pre-processing methods and 15 distinct signatures of tumour hypoxia in 10 datasets (2,143 patients). Results: We confirm strong pre-processing effects for all datasets and signatures, and find that these differ between microarray versions. Importantly, exploiting different pre-processing techniques in an ensemble approach improved classification for a majority of signatures. Conclusions: Assessing biomarkers using an ensemble of pre-processing techniques shows clear value across multiple diseases, datasets and biomarkers. Importantly, ensemble classification improves biomarkers with initially good results but does not result in spuriously improved performance for poor biomarkers. While further research is required, this approach has the potential to become a standard for transcriptomic biomarkers. PMID:24902696
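A minimal sketch of the ensemble idea, under the assumption that "ensemble" here means combining predictions from classifiers trained on the same data processed with different pre-processing schemes and taking a majority vote. The dataset, signature, scalers, and classifier are placeholders, not the study's actual 24 methods or 15 signatures.

```python
# Illustrative majority vote over pre-processing variants (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler, RobustScaler, QuantileTransformer

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 15))                 # expression of a 15-gene signature
y = (X[:, :3].sum(axis=1) > 0).astype(int)     # synthetic hypoxia labels

preprocessors = [StandardScaler(), RobustScaler(),
                 QuantileTransformer(n_quantiles=100, output_distribution="normal")]

votes = []
for prep in preprocessors:
    Xp = prep.fit_transform(X)
    clf = LogisticRegression().fit(Xp, y)
    votes.append(clf.predict(Xp))

ensemble_call = (np.mean(votes, axis=0) >= 0.5).astype(int)   # majority vote
print("ensemble accuracy:", (ensemble_call == y).mean())
```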
A polymer dataset for accelerated property prediction and design
Huan, Tran Doan; Mannodi-Kanakkithodi, Arun; Kim, Chiho; ...
2016-03-01
Emerging computation- and data-driven approaches are particularly useful for rationally designing materials with targeted properties. Generally, these approaches rely on identifying structure-property relationships by learning from a sufficiently large dataset of relevant materials. The learned information can then be used to predict the properties of materials not already in the dataset, thus accelerating materials design. Herein, we develop a dataset of 1,073 polymers and related materials and make it available at http://khazana.uconn.edu/. This dataset is uniformly prepared using first-principles calculations with structures obtained either from other sources or by using structure search methods. Because the immediate target of this work is to assist the design of high dielectric constant polymers, the dataset is initially designed to include the optimized structures, atomization energies, band gaps, and dielectric constants. It will be progressively expanded by accumulating new materials and including additional properties calculated for the optimized structures provided.
ERIC Educational Resources Information Center
O'Brien, Mark
2011-01-01
The appropriateness of using statistical data to inform the design of any given service development or initiative often depends upon judgements regarding scale. Large-scale data sets, perhaps national in scope, whilst potentially important in informing the design, implementation and roll-out of experimental initiatives, will often remain unused…
Constraining the common properties of active region formation using the SDO/HEAR dataset
NASA Astrophysics Data System (ADS)
Schunker, H.; Braun, D. C.; Birch, A. C.
2016-10-01
Observations from the Solar Dynamics Observatory (SDO) have the potential to allow the helioseismic study of the formation of hundreds of active regions, enabling us to perform statistical analyses. We collated a uniform dataset of emerging active regions (EARs) observed by the SDO/HMI instrument suitable for helioseismic analysis, in which each active region can be observed up to 7 days before emergence. We call this dataset the SDO Helioseismic Emerging Active Region (SDO/HEAR) survey. We have used this dataset to understand the nature of active region emergence. The latitudinally averaged line-of-sight magnetic field of all the EARs shows that the leading (trailing) polarity moves in a prograde (retrograde) direction with a speed of 110 ± 15 m/s (-60 ± 10 m/s) relative to the Carrington rotation rate in the first day after emergence. However, relative to the differential rotation of the surface plasma the East-West velocity is symmetric, with a mean of 90 ± 10 m/s. We have also compared the surface flows associated with the EARs at the time of emergence with surface flows from numerical simulations of flux emergence with different rise speeds. We found that the surface flows in simulations of emerging flux with a low rise speed of 70 m/s best match the observations.
Transcriptional and Chromatin Dynamics of Muscle Regeneration After Severe Trauma
2016-10-12
performed pathway analysis of the time-clustered RNA-Seq data and showed an initial burst of pro-inflammatory and immune-response transcripts in the... 143 showed dynamic behavior (see Methods) and analysis of the dynamic miRNAs reinforced many of the results observed from the RNA-Seq datasets... excellent agreement was viewed. Hierarchical clustering of the datasets through time revealed 5 clusters, and gene ontology (GO) analysis of the
NASA Astrophysics Data System (ADS)
Morton, E.; Bilek, S. L.; Rowe, C. A.
2017-12-01
Understanding the spatial extent and behavior of the interplate contact in the Cascadia Subduction Zone (CSZ) may prove pivotal to preparation for future great earthquakes, such as the M9 event of 1700. Current and historic seismic catalogs are limited by their short duration, given the recurrence rate of great earthquakes, and by their rather high magnitude of completeness for the interplate seismic zone, which lies offshore of land-based networks. This issue is addressed via the 2011-2015 Cascadia Initiative (CI) amphibious seismic array deployment, which combined coastal land seismometers with more than 60 ocean-bottom seismometers (OBS) situated directly above the presumed plate interface. We search the CI dataset for small, previously undetected interplate earthquakes to identify seismic patches on the megathrust, using an automated subspace detection method. Our subspace comprises eigenvectors derived from CI OBS and on-land waveforms extracted for existing catalog events that appear to have occurred on the plate interface. Previous work focused on analysis of two repeating event clusters off the coast of Oregon spanning all 4 years of deployment. Here we expand those earlier results, applying detection and location analysis to the entire CSZ margin during the first year of CI deployment, with more than 200 new events detected for the central portion of the margin. Template events used for subspace scanning primarily occurred beneath the land surface along the coast, at the downdip edge of modeled high-slip patches for the 1700 event, with most concentrated at the northwestern edge of the Olympic Peninsula.
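A hedged sketch of subspace detection: an orthonormal signal subspace is built from aligned template waveforms via the SVD, and continuous data are scanned for windows whose energy is largely captured by that subspace. The single-channel simplification, subspace rank, and threshold are assumptions; operational implementations work on multi-channel, carefully aligned templates.

```python
# Illustrative single-channel subspace detector (synthetic data).
import numpy as np

def build_subspace(templates: np.ndarray, rank: int) -> np.ndarray:
    """templates: (n_templates, n_samples) aligned, demeaned waveforms."""
    u, s, vt = np.linalg.svd(templates, full_matrices=False)
    return vt[:rank]                      # (rank, n_samples) orthonormal basis

def detection_statistic(data: np.ndarray, basis: np.ndarray) -> np.ndarray:
    n = basis.shape[1]
    stats = np.zeros(len(data) - n + 1)
    for i in range(len(stats)):
        w = data[i:i + n]
        energy = np.dot(w, w)
        proj = basis @ w                  # coefficients in the subspace
        stats[i] = np.dot(proj, proj) / energy if energy > 0 else 0.0
    return stats                          # fraction of window energy in the subspace

rng = np.random.default_rng(2)
templates = rng.normal(size=(5, 200))     # stand-ins for catalog event waveforms
basis = build_subspace(templates, rank=3)
stats = detection_statistic(rng.normal(size=5000), basis)
detections = np.where(stats > 0.6)[0]     # assumed detection threshold
```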
NASA Astrophysics Data System (ADS)
Sathyanarayanan, S.; Moitra, A.; Sasikala, G.; Bhaduri, A. K.
2015-02-01
K_IA is increasingly being regarded as a characteristic fracture toughness below which cleavage fracture does not occur. Its evaluation from small-sized Charpy specimens would be advantageous for applications in power plant industries. In this study, K_IA has been evaluated for P91 steel in various cold worked and thermally aged conditions. Evaluation of K_IA requires determination of the crack arrest load (P_arrest) and crack arrest length (a_arrest). The main challenge is in the determination of a_arrest due to the non-availability of standard methodologies and the absence of unequivocal microstructural signatures on the fracture surface in this steel to identify crack arrest. a_arrest has been determined using the analytical Key-Curve methodology, which has proven successful for this steel in the unaged condition. The applicability of the Key-Curve method is validated by the good agreement of the determined final crack length with that measured optically on unbroken specimens of N&T and subsequently 15% cold-worked P91 steel which had been previously aged at 650 °C for 5000 h. Mean K_IA varies from 47.46 MPa√m (NT steel aged at 600 °C for 5000 h) to 69.85 MPa√m (NT + 15% cw steel aged at 650 °C for 10000 h) for the various cold worked and aged datasets. K_IA is found to be an average property, unlike initiation toughness (K_Jd) which shows statistical scatter. Mean K_IA is found to be in reasonable agreement with the lower bound values of cleavage initiation toughness (K_Jd) for the datasets in this study.
17 years of aerosol and clouds from the ATSR Series of Instruments
NASA Astrophysics Data System (ADS)
Poulsen, C. A.
2015-12-01
Aerosols play a significant role in Earth's climate by scattering and absorbing incoming sunlight and affecting the formation and radiative properties of clouds. The extent to which aerosols affect clouds remains one of the largest sources of uncertainty amongst all influences on climate change. Now, a new comprehensive dataset has been developed under the ESA Climate Change Initiative (CCI) programme to quantify how changes in aerosol levels affect these clouds. The unique dataset is constructed from the Optimal Retrieval of Aerosol and Cloud (ORAC) algorithm used for (A)ATSR (Along Track Scanning Radiometer) retrievals of aerosols generated in the Aerosol CCI, and from CC4CL (Community Code for CLimate) for cloud retrieval in the Cloud CCI. The ATSR instrument is a dual-viewing instrument with on-board visible and infrared calibration systems, making it ideal for studying trends in aerosol and cloud and their interactions. The dataset begins in 1995 and ends in 2012. A new instrument in the series, SLSTR (Sea and Land Surface Temperature Radiometer), will be launched in 2015. The aerosol and cloud properties are retrieved using similar algorithms to maximise the consistency of the results. These state-of-the-art retrievals have been merged together to quantify the susceptibility of cloud properties to changes in aerosol concentration. Aerosol-cloud susceptibilities are calculated from several thousand samples in each 1x1 degree globally gridded region. Two-dimensional histograms of the aerosol and cloud properties are also included to facilitate seamless comparisons with other satellite and modelling datasets. The analysis of these two long-term records will be discussed individually, and initial comparisons between these new joint products and models will be presented.
NASA Astrophysics Data System (ADS)
Xu, Wenhui; Li, Qingxiang; Jones, Phil; Wang, Xiaolan L.; Trewin, Blair; Yang, Su; Zhu, Chen; Zhai, Panmao; Wang, Jinfeng; Vincent, Lucie; Dai, Aiguo; Gao, Yun; Ding, Yihui
2018-04-01
A new dataset of integrated and homogenized monthly surface air temperature over global land for the period since 1900 [China Meteorological Administration global Land Surface Air Temperature (CMA-LSAT)] is developed. In total, 14 sources have been collected and integrated into the newly developed dataset, including three global (CRUTEM4, GHCN, and BEST), three regional and eight national sources. Duplicate stations are identified, and those with the higher priority are chosen or spliced. Then, a consistency test and a climate outlier test are conducted to ensure that each station series is quality controlled. Next, two steps are adopted to assure the homogeneity of the station series: (1) homogenized station series in existing national datasets (by National Meteorological Services) are directly integrated into the dataset without any changes (50% of all stations), and (2) inhomogeneities are detected and adjusted for in the remaining data series using a penalized maximal t test (50% of all stations). Based on the dataset, we re-assess the temperature changes in global and regional areas compared with GHCN-V3 and CRUTEM4, as well as the temperature changes during the three periods 1900-2014, 1979-2014 and 1998-2014. The best estimates of warming trends and their 95% confidence ranges for 1900-2014 are approximately 0.102 ± 0.006 °C/decade for the whole year, and 0.104 ± 0.009, 0.112 ± 0.007, 0.090 ± 0.006, and 0.092 ± 0.007 °C/decade for the DJF (December, January, February), MAM, JJA, and SON seasons, respectively. MAM saw the most significant warming trend in both 1900-2014 and 1979-2014. For an even shorter and more recent period (1998-2014), MAM, JJA and SON show similar warming trends, while DJF shows an opposite trend. The results show that the ability of CMA-LSAT to describe global temperature changes is similar to that of other existing products, while there are some differences in describing regional temperature changes.
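A short sketch of how a warming trend with a 95% confidence range of the kind quoted above can be estimated, using ordinary least squares on annual-mean anomalies. The synthetic series stands in for the CMA-LSAT data, and serial correlation is ignored here although trend studies normally account for it.

```python
# Illustrative trend estimate with a 95% confidence interval (synthetic anomalies).
import numpy as np
from scipy import stats

years = np.arange(1900, 2015)
rng = np.random.default_rng(3)
anomaly = 0.0102 * (years - years[0]) + rng.normal(0, 0.15, len(years))  # ~0.102 °C/decade

res = stats.linregress(years, anomaly)
trend_per_decade = res.slope * 10
ci95_per_decade = 1.96 * res.stderr * 10
print(f"trend: {trend_per_decade:.3f} ± {ci95_per_decade:.3f} °C/decade")
```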
The Berkeley SuperNova Ia Program (BSNIP): Dataset and Initial Analysis
NASA Astrophysics Data System (ADS)
Silverman, Jeffrey; Ganeshalingam, M.; Kong, J.; Li, W.; Filippenko, A.
2012-01-01
I will present spectroscopic data from the Berkeley SuperNova Ia Program (BSNIP), their initial analysis, and the results of attempts to use spectral information to improve cosmological distance determinations to Type Ia supernovae (SNe Ia). The dataset consists of 1298 low-redshift (z < 0.2) optical spectra of 582 SNe Ia observed from 1989 through the end of 2008. Many of the SNe have well-calibrated light curves with measured distance moduli as well as spectra that have been corrected for host-galaxy contamination. I will also describe the spectral classification scheme employed (using the SuperNova Identification code, SNID; Blondin & Tonry 2007), which utilizes a newly constructed set of SNID spectral templates. The sheer size of the BSNIP dataset and the consistency of the observation and reduction methods make this sample unique among all other published SN Ia datasets. I will also discuss measurements of the spectral features of about one-third of the spectra, which were obtained within 20 days of maximum light. I will briefly describe the adopted method of automated, robust spectral-feature definition and measurement, which expands upon similar previous studies. Comparisons of these measurements of SN Ia spectral features to photometric observables will be presented with an eye toward using spectral information to calculate more accurate cosmological distances. Finally, I will comment on related projects which also utilize the BSNIP dataset that are planned for the near future. This research was supported by NSF grant AST-0908886 and the TABASGO Foundation. I am grateful to Marc J. Staley for a Graduate Fellowship.
Weaver, J. Curtis
2006-01-01
A study of annual maximum precipitation frequency in Mecklenburg County, North Carolina, was conducted to characterize the frequency of precipitation at sites having at least 10 years of precipitation record. Precipitation-frequency studies provide information about the occurrence of precipitation amounts for given durations (for example, 1 hour or 24 hours) that can be expected to occur within a specified recurrence interval (expressed in years). In this study, annual maximum precipitation totals were determined for durations of 15 and 30 minutes; 1, 2, 3, 6, 12, and 24 hours; and for recurrence intervals of 2, 5, 10, 25, 50, 100, and 500 years. Precipitation data collected by the U.S. Geological Survey network of raingages in the city of Charlotte and Mecklenburg County were analyzed for this study. In September 2004, more than 70 precipitation sites were in operation; 27 of these sites had at least 10 years of record, which is the minimum record typically required in frequency studies. Missing record at one site, however, resulted in its removal from the dataset. Two datasets--the Charlotte Raingage Network (CRN) initial and CRN modified datasets--were developed from the U.S. Geological Survey data, which represented relatively short periods of record (10 and 11 years). The CRN initial dataset included very high precipitation totals from two storms that caused severe flooding in areas of the city and county in August 1995 and July 1997, which could significantly influence the statistical results. The CRN modified dataset excluded the highest precipitation totals from these two storms but included the second highest totals.
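The report's actual frequency-analysis method is not stated in the abstract; one common approach to estimating depths for given recurrence intervals is to fit a generalized extreme value (GEV) distribution to the annual maxima, as sketched below with synthetic values.

```python
# Illustrative GEV fit to annual maxima and return-level estimation (synthetic data).
import numpy as np
from scipy.stats import genextreme

annual_max_24h_in = np.array([2.1, 3.4, 2.8, 4.9, 2.5, 3.1, 6.2, 2.9, 3.7, 4.1, 3.3])
shape, loc, scale = genextreme.fit(annual_max_24h_in)

for T in (2, 5, 10, 25, 50, 100):
    # The T-year event is the (1 - 1/T) quantile of the annual-maximum distribution.
    depth = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T:>3}-yr 24-hour depth: {depth:.2f} in")
```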
NASA Astrophysics Data System (ADS)
Martha, Tapas R.; Jain, Nirmala; Vamshi, Gasiganti T.; Vinod Kumar, K.
2017-11-01
This study shows results of morphological and spectroscopic analyses of Ius Chasma and its southern branched valleys using orbiter datasets such as the Mars Reconnaissance Orbiter Compact Reconnaissance Imaging Spectrometer for Mars (MRO-CRISM), the High Resolution Imaging Science Experiment (MRO-HiRISE) and a digital terrain model (HRSC-DTM). Results of the spectral analysis reveal the presence of hydrated minerals such as opal, nontronite and vermiculite in the floor and wall rock areas of Ius Chasma, indicating alteration of the parent rock in a water-rich environment on early Mars. Topographic gradients and morphological evidence such as V-shaped valleys with theatre-shaped stubby channels, dendritic drainage and river piracy indicate that these valleys were initially developed by surface runoff due to episodic floods and further expanded by groundwater sapping controlled by faults and fractures. Minerals formed by aqueous alteration during valley formation and their intricate association with different morphological domains suggest that surface runoff played a key role in the development of the branched valleys south of Ius Chasma on Mars.
FORGE Newberry 3D Gravity Density Model for Newberry Volcano
Alain Bonneville
2016-03-11
These data are Pacific Northwest National Lab inversions of an amalgamation of two surface gravity datasets: Davenport-Newberry gravity collected prior to the 2012 stimulations and Zonge International gravity collected in 2012 for the project "Novel use of 4D Monitoring Techniques to Improve Reservoir Longevity and Productivity in Enhanced Geothermal Systems". Inversions of surface gravity recover a 3D distribution of density contrast from which intrusive igneous bodies are identified. Each record includes a body name, body type, point type, UTM X and Y coordinates, Z (specified as meters below sea level, so negative values indicate elevations above sea level), thickness of the body in meters, susceptibility (suscept), density anomaly in g/cc, background density in g/cc, and density in g/cc. The model was created using a commercial gravity inversion software package, ModelVision 12.0 (http://www.tensor-research.com.au/Geophysical-Products/ModelVision). The initial model is based on the seismic tomography interpretation (Beachly et al., 2012). All the gravity data used to constrain this model are on the GDR: https://gdr.openei.org/submissions/760.
Customizing WRF-Hydro for the Laurentian Great Lakes Basin
NASA Astrophysics Data System (ADS)
Gronewold, A.; Pei, L.; Gochis, D.; Mason, L.; Sampson, K. M.; Dugger, A. L.; Read, L.; McCreight, J. L.; Xiao, C.; Lofgren, B. M.; Anderson, E. J.; Chu, P. Y.
2017-12-01
To advance the state of the art in regional hydrological forecasting, and to align with operational deployment of the National Water Model, a team of scientists has been customizing WRF-Hydro (the Weather Research and Forecasting model - Hydrological modeling extension package) for the entirety of the Laurentian Great Lakes basin, including binational land and lake surfaces. Objectives of this customization project include operational simulation and forecasting of the Great Lakes water balance and, in the short term, research-oriented insights into modeling one- and two-way coupled lake-atmosphere and near-shore processes. Initial steps in this project have focused on overcoming inconsistencies in land surface hydrographic datasets between the United States and Canada. Improvements in the model's current representation of lake physics and stream routing are also critical components of this effort. Here, we present an update on the status of this project, including a synthesis of offline tests with WRF-Hydro based on the newly developed Great Lakes hydrographic data, and an assessment of the model's ability to simulate seasonal and multi-decadal hydrological responses across the Great Lakes.
Rupture history of 2008 May 12 Mw 8.0 Wen-Chuan earthquake: Evidence of slip interaction
NASA Astrophysics Data System (ADS)
Ji, C.; Shao, G.; Lu, Z.; Hudnut, K.; Jiu, J.; Hayes, G.; Zeng, Y.
2008-12-01
We will present the rupture process of the May 12, 2008 Mw 8.0 Wenchuan earthquake using all available data. The current model, using both teleseismic body and surface waves and interferometric LOS displacements, reveals an unprecedentedly complex rupture process which cannot be resolved using either of the datasets individually. Rupture of this earthquake involved both the low-angle Pengguan fault and the high-angle Beichuan fault, which intersect each other at depth and are separated by approximately 5-15 km at the surface. Rupture initiated on the Pengguan fault and triggered rupture on the Beichuan fault 10 sec later. The two faults dynamically interacted and unilaterally ruptured over 270 km with an average rupture velocity of 3.0 km/sec. The total seismic moment is 1.1 x 10^21 N·m (Mw 8.0), roughly equally partitioned between the two faults. However, the spatiotemporal evolutions of the two faults are very different. This study will focus on the evidence for fault interactions and will analyze the corresponding uncertainties, in preparation for future dynamic studies of the same detailed nature.
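A short check of the quoted magnitude, using the standard moment-magnitude relation Mw = (2/3)(log10 M0 - 9.1) with M0 in N·m.

```python
# Moment magnitude from the seismic moment reported above.
import math

M0 = 1.1e21                      # seismic moment, N·m
Mw = (2.0 / 3.0) * (math.log10(M0) - 9.1)
print(f"Mw = {Mw:.2f}")          # ≈ 7.96, consistent with the reported Mw 8.0
```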
Layers: A molecular surface peeling algorithm and its applications to analyze protein structures
Karampudi, Naga Bhushana Rao; Bahadur, Ranjit Prasad
2015-01-01
We present an algorithm, 'Layers', to peel the atoms of proteins as layers. Using Layers we show an efficient way to transform protein structures into a 2D pattern, named the residue transition pattern (RTP), which is independent of molecular orientation. RTP explains the folding patterns of proteins, and hence identification of similarity between proteins is simpler and more reliable using RTP than with standard sequence- or structure-based methods. Moreover, Layers generates a fine-tunable coarse model of the molecular surface by using non-random sampling. The coarse model can be used for shape comparison, protein recognition and ligand design. Additionally, Layers can be used to develop biased initial configurations of molecules for protein folding simulations. We have developed a random forest classifier to predict the RTP of a given polypeptide sequence. Layers is a standalone application; however, it can be merged with other applications to reduce the computational load when working with large datasets of protein structures. Layers is available freely at http://www.csb.iitkgp.ernet.in/applications/mol_layers/main. PMID:26553411
NASA Astrophysics Data System (ADS)
Sun, Qingsong; Wang, Zhuosen; Li, Zhan; Erb, Angela; Schaaf, Crystal B.
2017-06-01
Land surface albedo is an essential variable for surface energy and climate modeling as it describes the proportion of incident solar radiant flux that is reflected from the Earth's surface. To capture the temporal variability and spatial heterogeneity of the land surface, satellite remote sensing must be used to monitor albedo accurately at a global scale. However, large data gaps caused by cloud or ephemeral snow have slowed the adoption of satellite albedo products by the climate modeling community. To address the needs of this community, we used a number of temporal and spatial gap-filling strategies to improve the spatial and temporal coverage of the global land surface MODIS BRDF, albedo and NBAR products. A rigorous evaluation of the gap-filled values shows good agreement with original high quality data (RMSE = 0.027 for the NIR band albedo, 0.020 for the red band albedo). This global snow-free and cloud-free MODIS BRDF and albedo dataset (established from 2001 to 2015) offers unique opportunities to monitor and assess the impact of the changes on the Earth's land surface.
A dataset mapping the potential biophysical effects of vegetation cover change
NASA Astrophysics Data System (ADS)
Duveiller, Gregory; Hooker, Josh; Cescatti, Alessandro
2018-02-01
Changing the vegetation cover of the Earth has impacts on the biophysical properties of the surface and ultimately on the local climate. Depending on the specific type of vegetation change and on the background climate, the resulting competing biophysical processes can have a net warming or cooling effect, which can further vary both spatially and seasonally. Due to uncertain climate impacts and the lack of robust observations, biophysical effects are not yet considered in land-based climate policies. Here we present a dataset based on satellite remote sensing observations that provides the potential changes i) of the full surface energy balance, ii) at global scale, and iii) for multiple vegetation transitions, as would now be required for the comprehensive evaluation of land based mitigation plans. We anticipate that this dataset will provide valuable information to benchmark Earth system models, to assess future scenarios of land cover change and to develop the monitoring, reporting and verification guidelines required for the implementation of mitigation plans that account for biophysical land processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Xiaoma; Zhou, Yuyu; Asrar, Ghassem R.
High spatiotemporal resolution air temperature (Ta) datasets are increasingly needed for assessing the impact of temperature change on people, ecosystems, and energy systems, especially in urban domains. However, such datasets are not widely available because of the large spatiotemporal heterogeneity of Ta caused by complex biophysical and socioeconomic factors such as built infrastructure and human activities. In this study, we developed a 1-km gridded dataset of daily minimum Ta (Tmin) and maximum Ta (Tmax), and the associated uncertainties, in urban and surrounding areas in the conterminous U.S. for the 2003–2016 period. Daily geographically weighted regression (GWR) models were developed and used to interpolate Ta using 1-km daily land surface temperature and elevation as explanatory variables. The leave-one-out cross-validation approach indicates that our method performs reasonably well, with root mean square errors of 2.1 °C and 1.9 °C, mean absolute errors of 1.5 °C and 1.3 °C, and R² of 0.95 and 0.97, for Tmin and Tmax, respectively. The resulting dataset reasonably captures the spatial heterogeneity of Ta in urban areas, and also effectively captures the urban heat island (UHI) phenomenon in which Ta rises with increasing urban development (i.e., impervious surface area). The new dataset is valuable for studying the environmental impacts of urbanization, such as the UHI and other related effects (e.g., on building energy consumption and human health). The proposed methodology also shows potential for building a long-term record of Ta worldwide, to fill the data gap that currently exists for studies of urban systems.
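A simplified, hedged sketch of the evaluation idea: a geographically weighted regression of station Ta on land surface temperature and elevation, scored by leave-one-out cross-validation. The Gaussian kernel, fixed bandwidth, and synthetic station data are assumptions; dedicated GWR packages handle bandwidth selection and diagnostics properly.

```python
# Illustrative locally weighted regression with leave-one-out cross-validation.
import numpy as np

rng = np.random.default_rng(4)
n = 150
xy = rng.uniform(0, 100, (n, 2))                         # station coordinates (km)
lst = rng.normal(30, 5, n)                               # land surface temperature
elev = rng.uniform(0, 500, n)
ta = 0.8 * lst - 0.006 * elev + rng.normal(0, 1.0, n)    # synthetic daily Tmax

X = np.column_stack([np.ones(n), lst, elev])
bandwidth_km = 30.0

preds = np.empty(n)
for i in range(n):
    d = np.linalg.norm(xy - xy[i], axis=1)
    w = np.exp(-0.5 * (d / bandwidth_km) ** 2)           # Gaussian distance weights
    w[i] = 0.0                                           # leave the target station out
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ ta)    # weighted least squares
    preds[i] = X[i] @ beta

rmse = np.sqrt(np.mean((preds - ta) ** 2))
print(f"LOOCV RMSE: {rmse:.2f} °C")
```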
NASA Technical Reports Server (NTRS)
Moody, Eric G.; King, Michael D.; Platnick, Steven; Schaaf, Crystal B.; Gao, Feng
2004-01-01
Land surface albedo is an important parameter in describing the radiative properties of the earth's surface, as it represents the amount of incoming solar radiation that is reflected from the surface. The amount and type of vegetation dramatically alters the amount of radiation that is reflected; for example, croplands that contain leafy vegetation will reflect radiation very differently than the blacktop associated with urban areas. In addition, since vegetation goes through a growth, or phenological, cycle, the amount of radiation that is reflected changes over the course of a year. As a result, albedo is both temporally and spatially dependent upon global location, as there is a distribution of vegetated surface types and growing conditions. Land surface albedo is critical for a wide variety of earth system research projects, including but not restricted to remote sensing of atmospheric aerosol and cloud properties from space, ground-based analysis of aerosol optical properties from surface-based sun/sky radiometers, biophysically based land surface modeling of the exchange of energy, water, momentum, and carbon for various land use categories, and surface energy balance studies. These projects require proper representation of the surface albedo's spatial, spectral, and temporal variations; however, these representations are often lacking in datasets prior to the latest generation of land surface albedo products.
Exploration of scaling effects on coarse resolution land surface phenology
USDA-ARS?s Scientific Manuscript database
A great number of land surface phenology (LSP) data products have been produced from various coarse resolution satellite datasets and detection algorithms across regional and global scales. Unlike field-measured phenological events, which are quantitatively defined with clear biophysical meaning, current LSP ...
Moderate-Resolution Sea Surface Temperature Data for the Nearshore North Pacific
Coastal sea surface temperature (SST) is an important environmental characteristic defining habitat suitability for nearshore marine and estuarine organisms. The purpose of this publication is to provide access to an easy-to-use coastal SST dataset for ecologists, biogeographers...
NASA Astrophysics Data System (ADS)
Tamminen, J.; Sofieva, V.; Kyrölä, E.; Laine, M.; Degenstein, D. A.; Bourassa, A. E.; Roth, C.; Zawada, D.; Weber, M.; Rozanov, A.; Rahpoe, N.; Stiller, G. P.; Laeng, A.; von Clarmann, T.; Walker, K. A.; Sheese, P.; Hubert, D.; Van Roozendael, M.; Zehner, C.; Damadeo, R. P.; Zawodny, J. M.; Kramarova, N. A.; Bhartia, P. K.
2017-12-01
We present a merged dataset of ozone profiles from several satellite instruments: SAGE II on ERBS, GOMOS, SCIAMACHY and MIPAS on Envisat, OSIRIS on Odin, ACE-FTS on SCISAT, and OMPS on Suomi-NPP. The merged dataset is created in the framework of the European Space Agency Climate Change Initiative (Ozone_cci) with the aim of analyzing stratospheric ozone trends. For the merged dataset, we used the latest versions of the original ozone datasets. The datasets from the individual instruments have been extensively validated and inter-compared; only those datasets which are in good agreement and do not exhibit significant drifts with respect to collocated ground-based observations and with respect to each other are used for merging. The long-term SAGE-CCI-OMPS dataset is created by computing and merging deseasonalized anomalies from the individual instruments. The merged SAGE-CCI-OMPS dataset consists of deseasonalized anomalies of ozone in 10° latitude bands from 90°S to 90°N and from 10 to 50 km in steps of 1 km, covering the period from October 1984 to July 2016. This newly created dataset is used for evaluating ozone trends in the stratosphere through multiple linear regression. Negative ozone trends in the upper stratosphere are observed before 1997 and positive trends are found after 1997. The upper stratospheric trends are statistically significant at mid-latitudes and indicate ozone recovery, as expected from the decrease of stratospheric halogens that started in the middle of the 1990s.
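A hedged sketch of the merging idea: each instrument's series is deseasonalized, the anomalies are combined across instruments (here with a median), and a trend is fitted. The median combination, the simple post-1997 linear fit, and the synthetic inputs are placeholders for the actual Ozone_cci procedure, which uses multiple linear regression.

```python
# Illustrative merging of deseasonalized anomalies and a post-1997 trend fit.
import numpy as np
import pandas as pd

months = pd.date_range("1984-10-01", "2016-07-01", freq="MS")
rng = np.random.default_rng(5)

def deseasonalize(series: pd.Series) -> pd.Series:
    return series - series.groupby(series.index.month).transform("mean")

instruments = {}
for name in ["SAGE-II", "GOMOS", "OSIRIS", "OMPS"]:
    o3 = pd.Series(5 + 0.1 * np.sin(2 * np.pi * months.month / 12)
                   + rng.normal(0, 0.05, len(months)), index=months)
    instruments[name] = deseasonalize(o3)

merged = pd.concat(instruments, axis=1).median(axis=1)   # combine across instruments

post97 = merged[merged.index >= "1997-01-01"]
t_years = (post97.index - post97.index[0]).days / 365.25
slope = np.polyfit(t_years, post97.values, 1)[0]
print(f"post-1997 trend: {slope:+.4f} (anomaly units per year)")
```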
USDA-ARS?s Scientific Manuscript database
Surface albedo is widely used in climate and environment applications as an important parameter for controlling the surface energy budget. There is an increasing need for fine resolution (< 100 m) albedo data for use in small scale applications and for validating coarse-resolution datasets; however,...
SEEG initiative estimates of Brazilian greenhouse gas emissions from 1970 to 2015.
de Azevedo, Tasso Rezende; Costa Junior, Ciniro; Brandão Junior, Amintas; Cremer, Marcelo Dos Santos; Piatto, Marina; Tsai, David Shiling; Barreto, Paulo; Martins, Heron; Sales, Márcio; Galuchi, Tharic; Rodrigues, Alessandro; Morgado, Renato; Ferreira, André Luis; Barcellos E Silva, Felipe; Viscondi, Gabriel de Freitas; Dos Santos, Karoline Costal; Cunha, Kamyla Borges da; Manetti, Andrea; Coluna, Iris Moura Esteves; Albuquerque, Igor Reis de; Junior, Shigueo Watanabe; Leite, Clauber; Kishinami, Roberto
2018-05-29
This work presents the SEEG platform, a 46-year long dataset of greenhouse gas emissions (GHG) in Brazil (1970-2015) providing more than 2 million data records for the Agriculture, Energy, Industry, Waste and Land Use Change Sectors at national and subnational levels. The SEEG dataset was developed by the Climate Observatory, a Brazilian civil society initiative, based on the IPCC guidelines and Brazilian National Inventories embedded with country specific emission factors and processes, raw data from multiple official and non-official sources, and organized together with social and economic indicators. Once completed, the SEEG dataset was converted into a spreadsheet format and shared via web-platform that, by means of simple queries, allows users to search data by emission sources and country and state activities. Because of its effectiveness in producing and making available data on a consistent and accessible basis, SEEG may significantly increase the capacity of civil society, scientists and stakeholders to understand and anticipate trends related to GHG emissions as well as its implications to public policies in Brazil.
Harmonizing Access to Federal Data - Lessons Learned Through the Climate Data Initiative
NASA Astrophysics Data System (ADS)
Bugbee, K.; Pinheiro Privette, A. C.; Meyer, D. J.; Ramachandran, R.
2016-12-01
The Climate Data Initiative (CDI), launched by the Obama Administration in March of 2014, is an effort to leverage the extensive open Federal data to spur innovation and private-sector entrepreneurship in order to advance awareness of and preparedness for the impacts of climate change (see the White House fact sheet). The project includes an online catalog of climate-related datasets and data products in key areas of climate change risk and vulnerability from across the U.S. federal government through http://Climate.Data.gov. NASA was tasked with the implementation and management of the project and has been working closely with Subject Matter Experts (SMEs) and Data Curators (DCs) from across the Federal Government to identify and catalog federal datasets relevant for assessing climate risks and impacts. These datasets are organized around key themes and are framed by key climate questions. The current themes within CDI include: Arctic, Coastal Flooding, Ecosystem Vulnerability, Energy Infrastructure, Food Resilience, Human Health, Transportation, Tribal Nations and Water. This paper summarizes the main lessons learned from the last 2.5 years of CDI implementation.
Gesch, D.; Williams, J.; Miller, W.
2001-01-01
Elevation models produced from Shuttle Radar Topography Mission (SRTM) data will be the most comprehensive, consistently processed, highest resolution topographic dataset ever produced for the Earth's land surface. Many applications that currently use elevation data will benefit from the increased availability of data with higher accuracy, quality, and resolution, especially in poorly mapped areas of the globe. SRTM data will be produced as seamless data, thereby avoiding many of the problems inherent in existing multi-source topographic databases. Serving as precursors to SRTM datasets, the U.S. Geological Survey (USGS) has produced and is distributing seamless elevation datasets that facilitate scientific use of elevation data over large areas. GTOPO30 is a global elevation model with a 30 arc-second resolution (approximately 1-kilometer). The National Elevation Dataset (NED) covers the United States at a resolution of 1 arc-second (approximately 30-meters). Due to their seamless format and broad area coverage, both GTOPO30 and NED represent an advance in the usability of elevation data, but each still includes artifacts from the highly variable source data used to produce them. The consistent source data and processing approach for SRTM data will result in elevation products that will be a significant addition to the current availability of seamless datasets, specifically for many areas outside the U.S. One application that demonstrates some advantages that may be realized with SRTM data is delineation of land surface drainage features (watersheds and stream channels). Seamless distribution of elevation data in which a user interactively specifies the area of interest and order parameters via a map server is already being successfully demonstrated with existing USGS datasets. Such an approach for distributing SRTM data is ideal for a dataset that undoubtedly will be of very high interest to the spatial data user community.
NASA Astrophysics Data System (ADS)
Khan, S.; Salas, F.; Sampson, K. M.; Read, L. K.; Cosgrove, B.; Li, Z.; Gochis, D. J.
2017-12-01
The representation of inland surface water bodies in distributed hydrologic models at the continental scale is a challenge. The National Water Model (NWM) utilizes the National Hydrography Dataset Plus Version 2 (NHDPlusV2) "waterbody" dataset to represent lakes and reservoirs. The "waterbody" layer is a comprehensive dataset that represents surface water bodies using common features like lakes, ponds, reservoirs, estuaries, playas, and swamps/marshes. However, a major issue that remains unresolved even in the latest revision of NHDPlusV2 is inconsistency in waterbody digitization and delineation errors. Manually correcting the waterbody polygons is tedious and quickly becomes impossible for continental-scale hydrologic models such as the NWM. In this study, we improved the spatial representation of 6,802 lakes and reservoirs by analyzing 379,110 waterbodies in the contiguous United States (excluding the Laurentian Great Lakes). We performed a step-by-step process that integrates a set of geospatial analyses to identify, track, and correct the extent of lake and reservoir features larger than 0.75 km2. The following assumptions were applied while developing the new dataset: a) lakes and reservoirs cannot directly feed into each other; b) each waterbody must have one outlet; and c) a single lake or reservoir feature cannot have multiple parts. The majority of the NHDPlusV2 waterbody features in the original dataset are delineated correctly; however, approximately 3% of the lake and reservoir polygons were found to have topological errors and were corrected accordingly. It is important to fix these digitizing errors because the waterbody features are closely linked to the river topology. This new waterbody dataset will ensure that model-simulated water is directed into and through the lakes and reservoirs in a manner that supports the NWM code base and assumptions. The improved dataset will facilitate more effective integration of lakes and reservoirs, with correct spatial features, into the updated NWM.
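A hedged sketch of the kind of geospatial filtering described above: selecting waterbody polygons above the 0.75 km2 threshold and flagging multipart features that violate assumption (c). The file path, attribute names and projection choice are placeholders, not the study's actual workflow.

```python
import geopandas as gpd

def select_and_check_waterbodies(path, min_area_km2=0.75):
    """Select waterbody polygons above an area threshold and flag multipart features.

    `path` is a placeholder for an NHDPlusV2 waterbody layer; the layer is
    reprojected to an equal-area CRS (assumed here) so polygon areas are meaningful.
    """
    wb = gpd.read_file(path)
    wb = wb.to_crs(epsg=5070)                       # CONUS Albers equal-area (assumption)
    wb["area_km2"] = wb.geometry.area / 1e6
    large = wb[wb["area_km2"] >= min_area_km2].copy()
    # a single lake/reservoir feature should not have multiple parts
    large["is_multipart"] = large.geometry.geom_type == "MultiPolygon"
    flagged = large[large["is_multipart"]]
    return large, flagged
```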
Coordinated Time Resolved Spectrophotometry of Asteroid 163249 (2002 GT)
NASA Astrophysics Data System (ADS)
Ryan, Erin L.; Woodward, C.; Gordon, M.; Wagner, M. R.; Chesley, S.; Hicks, M.; Pittichova, J.; Pravec, P.
2013-10-01
The near-Earth asteroid 163249 (2002 GT), classified as a potentially hazardous asteroid (PHA), has been identified as a potential rendezvous target for the NASA Deep Impact spacecraft on 4 Jan 2020. As part of a coordinated international effort to study this asteroid during its 2013 apparition (J. Pittichová et al., DPS 2013), we obtained simultaneous Sloan r-band photometry at the Steward Observatory Bok 2.3-m telescope (+90Prime) and optical spectroscopic observations covering a wavelength interval from ~5400 to ~8500 Angstrom at the MMT 6.5-m (+RedChannel spectrograph) on 2013 June 16 and 17 UT near close Earth approach (heliocentric distance ~1.07 AU; geocentric distance ~0.13 AU), at 180 sec intervals over the ~3.76 hr rotational period. Our objective was to obtain a temporal sequence of spectra to assess surface mineralogy (seeking to potentially detect the 0.7 micron absorption band attributed to phyllosilicate materials) and to determine whether variations in the spectral slope and/or surface mineralogy are evident as a function of rotational period. Here we present an initial analysis of these datasets, describing the light curve and the reflectance spectra as a function of rotational phase. These datasets will be incorporated into a larger compendium describing the characteristics of asteroid 163249. Acknowledgement: This research was supported in part by NASA 12-PAST-12-0010 grant NNX13AJ11G, and an appointment (E.L.R.) to the NASA Postdoctoral Program at the Goddard Space Flight Center, administered by Oak Ridge Associated Universities through a contract with NASA. Observations reported here were obtained at the MMT Observatory, a joint facility of the Smithsonian Institution and the University of Arizona. P.P. was supported by the Grant Agency of the Czech Republic, Grant P209/12/0229.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xia, Youlong; Ek, Michael; Sheffield, Justin
2013-02-25
Soil temperature can exhibit considerable memory from weather and climate signals and is among the most important initial conditions in numerical weather and climate models. Consequently, a more accurate long-term land surface soil temperature dataset is needed to improve weather and climate simulation and prediction, and is also important for the simulation of agricultural crop yield and ecological processes. The North American Land Data Assimilation System (NLDAS) Phase 2 (NLDAS-2) has generated 31 years (1979-2009) of simulated hourly soil temperature data with a spatial resolution of 1/8°. This dataset has not been comprehensively evaluated to date. Thus, the ultimate purpose of the present work is to assess Noah-simulated soil temperature for different soil depths and timescales. We used long-term (1979-2001) observed monthly mean soil temperatures from 137 cooperative stations over the United States to evaluate simulated soil temperature for three soil layers (0-10 cm, 10-40 cm, 40-100 cm) at annual and monthly timescales. We used short-term (1997-1999) observed soil temperature from 72 Oklahoma Mesonet stations to validate simulated soil temperatures for the three soil layers at daily and hourly timescales. The results showed that the Noah land surface model (Noah LSM) generally matches observed soil temperature well for different soil layers and timescales. At greater depths, the simulation skill (anomaly correlation) decreased for all timescales. The monthly mean diurnal cycle difference between simulated and observed soil temperature revealed large midnight biases in the cold season due to small downward longwave radiation and issues related to model parameters.
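The anomaly correlation used as the skill measure above can be illustrated with a short sketch: remove each series' mean seasonal cycle and correlate the residuals. Variable names are illustrative.

```python
import numpy as np

def anomaly_correlation(sim, obs, months):
    """Correlate simulated and observed monthly soil temperatures after removing
    each series' mean seasonal cycle (anomaly correlation)."""
    sim = np.asarray(sim, dtype=float)
    obs = np.asarray(obs, dtype=float)
    sim_a, obs_a = sim.copy(), obs.copy()
    for m in range(1, 13):
        idx = months == m
        sim_a[idx] -= sim[idx].mean()
        obs_a[idx] -= obs[idx].mean()
    return np.corrcoef(sim_a, obs_a)[0, 1]
```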
Noda, Masaru; Okayama, Hirokazu; Tachibana, Kazunoshin; Sakamoto, Wataru; Saito, Katsuharu; Thar Min, Aung Kyi; Ashizawa, Mai; Nakajima, Takahiro; Aoto, Keita; Momma, Tomoyuki; Katakura, Kyoko; Ohki, Shinji; Kono, Koji
2018-05-29
We aimed to discover glycosyltransferase gene (glycogene)-derived molecular subtypes of colorectal cancer (CRC) associated with patient outcomes. Transcriptomic and epigenomic datasets of non-tumor, pre-cancerous, and cancerous tissues and cell lines, with somatic mutation, mismatch repair status, clinicopathological, and survival information, were assembled (n=4223), and glycogene profiles were analyzed. Immunohistochemistry for a glycogene, GALNT6, was conducted in adenoma and carcinoma specimens (n=403). The functional role and cell surface glycan profiles were further investigated by in vitro loss-of-function assays and lectin microarray analysis. We initially developed and validated a 15-glycogene signature that can identify a poor-prognostic subtype, which was closely related to deficient mismatch repair (dMMR) and GALNT6 downregulation. The association of decreased GALNT6 with dMMR was confirmed in multiple datasets of tumors and cell lines, and was further recapitulated by immunohistochemistry, where approximately 15% of tumors exhibited loss of GALNT6 protein. GALNT6 mRNA and protein were expressed in premalignant/preinvasive lesions but were subsequently downregulated in a subset of carcinomas, possibly through epigenetic silencing. Decreased GALNT6 was independently associated with poor prognosis in the immunohistochemistry cohort and in an additional microarray meta-cohort by multivariate analyses, and its discriminative power for survival was particularly remarkable in stage III patients. GALNT6 silencing in SW480 cells promoted invasion, migration, chemoresistance, and increased cell surface expression of a cancer-associated truncated O-glycan, Tn-antigen. The 15-glycogene signature and the expression levels of GALNT6 mRNA and protein each serve as a novel prognostic biomarker, highlighting the role of dysregulated glycogenes in cancer-associated glycan synthesis and poor prognosis. Copyright ©2018, American Association for Cancer Research.
OpenSHS: Open Smart Home Simulator.
Alshammari, Nasser; Alshammari, Talal; Sedky, Mohamed; Champion, Justin; Bauer, Carolin
2017-05-02
This paper presents a new hybrid, open-source, cross-platform 3D smart home simulator, OpenSHS, for dataset generation. OpenSHS offers an opportunity for researchers in the field of the Internet of Things (IoT) and machine learning to test and evaluate their models. Following a hybrid approach, OpenSHS combines advantages from both interactive and model-based approaches. This approach reduces the time and effort required to generate simulated smart home datasets. We have designed a replication algorithm for extending and expanding a dataset: a small sample dataset produced by OpenSHS can be extended without affecting the logical order of the events. The replication provides a solution for generating large, representative smart home datasets. We have built an extensible library of smart devices that facilitates the simulation of current and future smart home environments. Our tool divides the dataset generation process into three distinct phases: first, design, in which the researcher builds the initial virtual environment by constructing the home, importing smart devices, and creating contexts; second, simulation, in which the participant simulates his/her context-specific events; and third, aggregation, in which the researcher applies the replication algorithm to generate the final dataset. We conducted a study to assess the ease of use of our tool using the System Usability Scale (SUS).
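As an interpretation of the replication idea, and not OpenSHS's actual algorithm, the sketch below extends a small sample of timestamped device events by appending time-shifted copies, so that the order and spacing of events within each copy are preserved.

```python
from datetime import timedelta

def replicate_events(events, copies, gap=timedelta(days=1)):
    """Extend a list of (timestamp, device, state) events by appending shifted copies,
    preserving the relative order and spacing of events within each copy."""
    if not events:
        return []
    events = sorted(events, key=lambda e: e[0])
    span = events[-1][0] - events[0][0] + gap      # shift between successive copies
    out = []
    for i in range(copies):
        offset = i * span
        out.extend((ts + offset, device, state) for ts, device, state in events)
    return out
```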
Evaluation of reanalysis near-surface winds over northern Africa in Boreal summer
NASA Astrophysics Data System (ADS)
Engelstaedter, Sebastian; Washington, Richard
2014-05-01
The emission of dust from desert surfaces depends on the combined effects of surface properties such as surface roughness, soil moisture, soil texture and particle size (erodibility) and wind speed (erosivity). For dust cycle models to simulate dust emissions realistically and for the right reasons, it is essential that the factors controlling erosivity and erodibility are represented correctly. There has been a focus on improving dust emission schemes or input fields of soil distribution and texture, even though it has been shown that using wind fields from different reanalysis datasets to drive the same model can result in significant differences in the dust emissions. Here we evaluate the representation of near-surface wind speed from three reanalysis datasets (ERA-Interim, CFSR and MERRA) over the North African domain. Reanalysis 10 m wind speeds are compared with observations from SYNOP and METAR reports available from the UK Meteorological Office Integrated Data Archive System (MIDAS) Land and Marine Surface Stations Dataset. We compare 6-hourly observations of 10 m wind speed between 1 January 1989 and 31 December 2009 from more than 500 surface stations with the corresponding reanalysis values. A station-based mean wind speed climatology for North Africa is presented. Overall, the representation of 10 m winds is relatively poor in all three reanalysis datasets, with stations in the northern parts of the Sahara better simulated (correlation coefficients ~0.5) than stations in the Sahel (correlation coefficients < 0.3), which suggests that the reanalyses do not realistically capture Sahelian dynamical systems. All three reanalyses have a systematic bias towards overestimating wind speeds below 3-4 m/s and underestimating wind speeds above 4 m/s. This bias becomes larger with increasing wind speed but is independent of the time of day. For instance, observed wind speeds of 14 m/s are underestimated on average by 6 m/s in the ERA-Interim reanalysis. Given the cubic relationship between wind speed and dust emission, this large underestimation is expected to significantly impact the simulation of dust emissions. A negative relationship between observed and ERA-Interim wind speed is found for winds above 14 m/s, indicating that the processes generating high wind speeds are not well (if at all) represented in the model.
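A small numeric illustration of why the wind-speed bias matters, assuming an emission law that is zero below a threshold and otherwise scales with the cube of the wind speed; the threshold and scaling are arbitrary and are not the schemes used in dust models.

```python
def dust_flux(u, u_threshold=7.0, c=1.0):
    """Illustrative emission law: zero below a threshold wind speed, otherwise c * u**3
    (arbitrary units)."""
    return c * u**3 if u > u_threshold else 0.0

# 14 m/s observed vs. the ~8 m/s a low-biased reanalysis might supply:
print(dust_flux(14.0), dust_flux(8.0))   # 2744.0 vs 512.0 -- over five times less emission
```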
Evaluation and inter-comparison of modern day reanalysis datasets over Africa and the Middle East
NASA Astrophysics Data System (ADS)
Shukla, S.; Arsenault, K. R.; Hobbins, M.; Peters-Lidard, C. D.; Verdin, J. P.
2015-12-01
Reanalysis datasets are potentially very valuable for otherwise data-sparse regions such as Africa and the Middle East. They are potentially useful for long-term climate and hydrologic analyses and, given their availability in real time, they are particularly attractive for real-time hydrologic monitoring purposes (e.g. to monitor flood and drought events). Generally, in data-sparse regions, reanalysis variables such as precipitation, temperature, radiation and humidity are used in conjunction with in-situ and/or satellite-based datasets to generate long-term gridded atmospheric forcing datasets. These atmospheric forcing datasets are used to drive offline land surface models and simulate soil moisture and runoff, which are natural indicators of hydrologic conditions. Therefore, any uncertainty or bias in the reanalysis datasets contributes to uncertainties in hydrologic monitoring estimates. In this presentation, we report on a comprehensive analysis that evaluates several modern-day reanalysis products (such as NASA's MERRA-1 and -2, ECMWF's ERA-Interim and NCEP's CFS Reanalysis) over Africa and the Middle East region. We compare the precipitation and temperature from the reanalysis products with other independent gridded datasets such as the GPCC, CRU, and USGS/UCSB CHIRPS precipitation datasets, and the CRU temperature dataset. The evaluations are conducted at a monthly timescale, since some of these independent datasets are only available at this temporal resolution. The evaluations range from comparison of the monthly mean climatology to inter-annual variability and long-term changes. Finally, we also present the results of inter-comparisons of radiation and humidity variables from the different reanalysis datasets.
Satellite Monitoring of Boston Harbor Water Quality: Initial Investigations
NASA Astrophysics Data System (ADS)
Sheldon, P.; Chen, R. F.; Schaaf, C.; Pahlevan, N.; Lee, Z.
2016-02-01
The transformation of Boston Harbor from the "dirtiest in America" to a National Park Area is one of the most remarkable estuarine recoveries in the world. A long-term water quality dataset from 1991 to the present exists in Boston Harbor due to a $3.8 billion lawsuit requiring the harbor clean-up. This project uses discrete water sampling and underway transects with a towed vehicle, coordinated with Landsat 7 and Landsat 8, to create surface maps of chlorophyll a (Chl a), dissolved organic matter (CDOM and DOC), total suspended solids (TSS), diffuse attenuation coefficient (Kd_490), and photic depth in Boston Harbor. In addition, three buoys have been designed, constructed, and deployed in Boston Harbor that measure Chl a and CDOM fluorescence, optical backscatter, salinity, temperature, and meteorological parameters. We are initially using the summer and fall of 2015 to develop atmospheric corrections for conditions in Boston Harbor and to develop algorithms for Landsat 8 data to estimate in-water photic depth, TSS, Chl a, Kd_490, and CDOM. We will report on initial buoy and cruise data and show 2015 Landsat-derived distributions of water quality parameters. It is our hope that once algorithms for present Landsat imagery are developed, historical maps of water quality can be constructed using in-water data back to 1991.
NASA Astrophysics Data System (ADS)
Bonfanti, C. E.; Stewart, J.; Lee, Y. J.; Govett, M.; Trailovic, L.; Etherton, B.
2017-12-01
One of the National Oceanic and Atmospheric Administration (NOAA) goals is to provide timely and reliable weather forecasts to support important decisions when and where people need them, for safety, emergencies, and planning of day-to-day activities. Satellite data are essential for areas lacking in-situ observations, such as spans of the ocean or remote areas of land, for use as initial conditions in Numerical Weather Prediction (NWP) models. Currently, only about 7% of the satellite data received is selected for use, and of that, an even smaller percentage is ever assimilated into NWP models. With machine learning, the computational and time costs of satellite data selection can be greatly reduced. We study various machine learning approaches to process orders of magnitude more satellite data in significantly less time, allowing for a greater quantity and more intelligent selection of data to be used for assimilation purposes. Given the satellite launches planned for the upcoming years, machine learning can be applied to better select Regions of Interest (ROI) in the orders-of-magnitude more satellite data that will be received. This paper discusses the background of machine learning methods as applied to weather forecasting and the challenges of creating a "labeled dataset" for training and testing purposes. In the training stage of supervised machine learning, labeled data are needed to mark each ROI as true or false so that the model learns which signatures in the satellite data to identify. The authors selected cyclones, including tropical cyclones and mid-latitude lows, as ROI for their machine learning purposes and created a labeled true/false ROI dataset from Global Forecast System (GFS) reanalysis data. A dataset like this did not yet exist, and given the need for a large number of samples, it was decided that this was best done with automation. This was accomplished by developing a program, similar to the National Centers for Environmental Prediction (NCEP) tropical cyclone tracker by Marchok, that identifies cyclones based on their physical characteristics. We discuss the methods and challenges of creating this dataset and its use in our current supervised machine learning model, as well as in future work on events such as convection initiation.
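A hedged sketch of how ROI labels might be generated automatically from a gridded reanalysis field, here by flagging windows that contain a pronounced low in mean sea-level pressure; the window size, thresholds and choice of field are assumptions, and this is only a crude proxy for the cyclone-tracker approach described above.

```python
import numpy as np

def label_windows(mslp, window=16, low_thresh_hpa=1000.0):
    """Slide a window over a 2-D mean-sea-level-pressure grid (hPa) and label each
    patch True if it contains a pronounced low below the threshold (crude cyclone proxy)."""
    labels = []
    ny, nx = mslp.shape
    for j in range(0, ny - window, window):
        for i in range(0, nx - window, window):
            patch = mslp[j:j + window, i:i + window]
            is_roi = patch.min() < low_thresh_hpa and patch.min() < patch.mean() - 5.0
            labels.append(((j, i), bool(is_roi)))
    return labels
```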
Decadal prediction skill in the ocean with surface nudging in the IPSL-CM5A-LR climate model
NASA Astrophysics Data System (ADS)
Mignot, Juliette; García-Serrano, Javier; Swingedouw, Didier; Germe, Agathe; Nguyen, Sébastien; Ortega, Pablo; Guilyardi, Eric; Ray, Sulagna
2016-08-01
Two decadal prediction ensembles, based on the same climate model (IPSL-CM5A-LR) and the same surface nudging initialization strategy, are analyzed and compared with a focus on upper-ocean variables in different regions of the globe. One ensemble consists of 3-member hindcasts launched every year since 1961, while the other ensemble benefits from 9 members but with start dates only every 5 years. The analysis includes anomaly correlation coefficients and root mean square errors computed against several reanalysis and gridded observational fields, as well as against the nudged simulation used to produce the hindcast initial conditions. The latter skill measure gives an upper limit of the predictability horizon one can expect in the forecast system, while the comparison with different datasets highlights the uncertainty in assessing the actual skill. Results provide a potential prediction skill (verification against the nudged simulation) beyond the linear trend of the order of 10 years ahead at the global scale, but essentially associated with non-linear radiative forcings, in particular from volcanoes. At the regional scale, we obtain 1 year in the tropical band, 10 years at midlatitudes in the North Atlantic and North Pacific, and 5 years at tropical latitudes in the North Atlantic, for both sea surface temperature (SST) and upper-ocean heat content. Actual prediction skill (verified against observational or reanalysis data) is overall more limited and less robust. Even so, large actual skill is found in the extratropical North Atlantic for SST and in the tropical to subtropical North Pacific for upper-ocean heat content. Results are analyzed with respect to the specific dynamics of the model and the way it is influenced by the nudging. The interplay between initialization and internal modes of variability is also analyzed for sea surface salinity. The study illustrates the importance of two key ingredients for the success of future coordinated decadal prediction exercises: a high frequency of start dates is needed to achieve robust statistical significance, and a large ensemble size is required to increase the signal-to-noise ratio.
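A minimal sketch of the two skill measures named above, computed per lead year from arrays of hindcast ensemble-mean anomalies and verifying anomalies; the array layout and names are assumptions.

```python
import numpy as np

def skill_by_lead(hindcasts, verification):
    """Anomaly correlation coefficient and RMSE per lead year.

    hindcasts: array (start_dates, lead_years) of ensemble-mean anomalies;
    verification: matching array of verifying anomalies (reanalysis, observations,
    or the nudged simulation used for potential skill).
    """
    acc, rmse = [], []
    for lead in range(hindcasts.shape[1]):
        h, v = hindcasts[:, lead], verification[:, lead]
        acc.append(np.corrcoef(h, v)[0, 1])
        rmse.append(np.sqrt(np.mean((h - v) ** 2)))
    return np.array(acc), np.array(rmse)
```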
The Alzheimer’s Disease Neuroimaging Initiative Informatics Core: A Decade in Review
Toga, Arthur W.; Crawford, Karen L.
2015-01-01
The Informatics Core of the Alzheimer’s Disease Neuroimaging Initiative (ADNI) has coordinated data integration and dissemination for a continually growing and complex dataset in which both data contributors and recipients span institutions, scientific disciplines and geographic boundaries. This article provides an update on the accomplishments and future plans. PMID:26194316
Wieczorek, Michael; LaMotte, Andrew E.
2010-01-01
This tabular dataset represents the estimated area of artificial drainage for the year 1992 and irrigation types for the year 1997, compiled for every catchment of NHDPlus for the conterminous United States. The source datasets were derived from tabular National Resources Inventory (NRI) datasets created by the Natural Resources Conservation Service (NRCS, U.S. Department of Agriculture, 1995, 1997). Artificial drainage is defined as subsurface drains and ditches. Irrigation types are defined as gravity and pressure. Subsurface drains are described as conduits, such as corrugated plastic tubing, tile, or pipe, installed beneath the ground surface to collect and/or convey drainage. Surface drainage field ditches are described as graded ditches for collecting excess water. A gravity irrigation source is described as irrigation delivered to the farm and/or field by canals or pipelines open to the atmosphere, with water distributed by the force of gravity down the field by: (1) a surface irrigation system (border, basin, furrow, corrugation, wild flooding, etc.) or (2) sub-surface irrigation pipelines or ditches. A pressure irrigation source is described as irrigation delivered to the farm and/or field in pump- or elevation-induced pressure pipelines, with water distributed across the field by: (1) sprinkle irrigation (center pivot, linear move, traveling gun, side roll, hand move, big gun, or fixed set sprinklers), or (2) micro irrigation (drip emitters, continuous tube bubblers, micro spray or micro sprinklers). NRI data do not include Federal lands, which are thus excluded from this dataset. The tabular data for drainage were spatially apportioned to the National Land Cover Dataset (NLCD; Kerie Hitt, written commun., 2005) and the tabular data for irrigation were spatially apportioned to an enhanced version of the National Land Cover Dataset (NLCDe; Nakagaki and others, 2007). The NHDPlus Version 1.1 is an integrated suite of application-ready geospatial datasets that incorporates many of the best features of the National Hydrography Dataset (NHD) and the National Elevation Dataset (NED). The NHDPlus includes a stream network (based on the 1:100,000-scale NHD), improved networking, naming, and value-added attributes (VAAs). NHDPlus also includes elevation-derived catchments (drainage areas) produced using a drainage enforcement technique first widely used in New England, and thus referred to as "the New England Method." This technique involves "burning in" the 1:100,000-scale NHD and, when available, building "walls" using the National Watershed Boundary Dataset (WBD). The resulting modified digital elevation model (HydroDEM) is used to produce hydrologic derivatives that agree with the NHD and WBD. Over the past two years, an interdisciplinary team from the U.S. Geological Survey (USGS), the U.S. Environmental Protection Agency (USEPA), and contractors found that this method produces the best quality NHD catchments using an automated process (USEPA, 2007). The NHDPlus dataset is organized by 18 Production Units that cover the conterminous United States. The NHDPlus version 1.1 data are grouped by the U.S. Geological Survey's Major River Basins (MRBs; Crawford and others, 2006). MRB1, covering the New England and Mid-Atlantic River basins, contains NHDPlus Production Units 1 and 2. MRB2, covering the South Atlantic-Gulf and Tennessee River basins, contains NHDPlus Production Units 3 and 6.
MRB3, covering the Great Lakes, Ohio, Upper Mississippi, and Souris-Red-Rainy River basins, contains NHDPlus Production Units 4, 5, 7 and 9. MRB4, covering the Missouri River basins, contains NHDPlus Production Units 10-lower and 10-upper. MRB5, covering the Lower Mississippi, Arkansas-White-Red, and Texas-Gulf River basins, contains NHDPlus Production Units 8, 11 and 12. MRB6, covering the Rio Grande, Colorado and Great Basin River basins, contains NHDPlus Production Units 13, 14, 15 and 16. MRB7, covering the Pacific Northwest River basins, contains NHDPlus Production Unit 17. MRB8, covering California River basins, contains NHDPlus Production Unit 18.
Ground robotic measurement of aeolian processes
NASA Astrophysics Data System (ADS)
Qian, Feifei; Jerolmack, Douglas; Lancaster, Nicholas; Nikolich, George; Reverdy, Paul; Roberts, Sonia; Shipley, Thomas; Van Pelt, R. Scott; Zobeck, Ted M.; Koditschek, Daniel E.
2017-08-01
Models of aeolian processes rely on accurate measurements of the rates of sediment transport by wind, and careful evaluation of the environmental controls of these processes. Existing field approaches typically require intensive, event-based experiments involving dense arrays of instruments. These devices are often cumbersome and logistically difficult to set up and maintain, especially near steep or vegetated dune surfaces. Significant advances in instrumentation are needed to provide the datasets that are required to validate and improve mechanistic models of aeolian sediment transport. Recent advances in robotics show great promise for assisting and amplifying scientists' efforts to increase the spatial and temporal resolution of many environmental measurements governing sediment transport. The emergence of cheap, agile, human-scale robotic platforms endowed with increasingly sophisticated sensor and motor suites opens up the prospect of deploying programmable, reactive sensor payloads across complex terrain in the service of aeolian science. This paper surveys the need and assesses the opportunities and challenges for amassing novel, highly resolved spatiotemporal datasets for aeolian research using partially-automated ground mobility. We review the limitations of existing measurement approaches for aeolian processes, and discuss how they may be transformed by ground-based robotic platforms, using examples from our initial field experiments. We then review how the need to traverse challenging aeolian terrains and simultaneously make high-resolution measurements of critical variables requires enhanced robotic capability. Finally, we conclude with a look to the future, in which robotic platforms may operate with increasing autonomy in harsh conditions. Besides expanding the completeness of terrestrial datasets, bringing ground-based robots to the aeolian research community may lead to unexpected discoveries that generate new hypotheses to expand the science itself.
Geobrowser Enhanced Access of Real-Time Antarctic Data
NASA Astrophysics Data System (ADS)
Breen, P.; Judge, D.; Cunningham, N.; Kirsch, P. J.
2007-12-01
A proof-of-principle project was initiated in the fall of 2006 to develop a system enabling remote field station and ship-borne data, collected in near real time, to be discovered, visualised and acquired through a web-accessible framework. The two principal enabling drivers for this system were the recent improvements in communications with remote field stations and ships, and the advent of low-cost, easily accessible geobrowser technology providing the ability to visualise multiple, sometimes physically disparate datasets within a common interface. Strongly spatial in nature, the oceanographic datasets suggested the incorporation of geobrowser (Google Earth) technology into this framework. A number of scientific benefits were identified by the project; these include enhancing the value of many of the datasets through their real-time contribution to forecasting models, satellite ground truthing and calibration of autonomous instrumentation. Improved efficacy of fieldwork led to rapid discovery of problems and the ability to deal with them promptly, the ability to correct or improve experiment parameters, and an increased capability for routine collection of high-quality data. In the past, it could be over a year before data arrived back at HQ, potentially unusable, definitely unrepeatable, and significantly reducing or delaying scientific output. The geobrowser interface provides the platform from which the spatial data are discovered; for example, ship tracks and aspects of the physical oceanography such as sea surface temperature can be visualised directly. Importantly, ancillary and auxiliary information and metadata can be linked to the cruise data in a straightforward and accessible manner; scientists in Cambridge using a geobrowser were able to access and visualise cruise data from the Southern Ocean 20 minutes after collection.
Das, Sayantan; Patel, Priyank Pravin; Sengupta, Somasis
2016-01-01
With myriad geospatial datasets now available for terrain information extraction, and particularly for streamline demarcation, questions arise regarding the scale, accuracy and sensitivity of the initial dataset from which these aspects are derived, as they influence all other parameters computed subsequently. In this study, digital elevation models (DEMs) derived from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER GDEM V2), the Shuttle Radar Topography Mission (SRTM V4, C-band, 3 arc-second), Cartosat-1 (CartoDEM 1.0) and topographical maps (R.F. 1:250,000 and 1:50,000) have been used to individually extract and analyze the relief, surface, size, shape and texture properties of a mountainous drainage basin. Nestled inside a mountainous setting, the basin is semi-elongated, with a high relief ratio (>90), steep slopes (25°-30°) and high drainage density (>3.5 km/sq km), as computed from the different DEMs. The basin terrain and stream network are extracted from each DEM, and their morphometric attributes are compared with the surveyed stream networks present in the topographical maps, with finer DEM datasets resampled to coarser resolutions to reduce scale implications during the delineation process. Ground-truth verification of altitudinal accuracy has also been done by a GPS survey. DEMs derived from the 1:50,000 topographical map and ASTER GDEM V2 data are found to be more accurate and consistent in terms of absolute accuracy than the other generated or available DEM data products, on the basis of the morphometric parameters extracted from each. They also exhibit a certain degree of proximity to the surveyed topographical map.
Velocity Structure of the Iran Region Using Seismic and Gravity Observations
NASA Astrophysics Data System (ADS)
Syracuse, E. M.; Maceira, M.; Phillips, W. S.; Begnaud, M. L.; Nippress, S. E. J.; Bergman, E.; Zhang, H.
2015-12-01
We present a 3D Vp and Vs model of Iran generated using a joint inversion of body wave travel times, Rayleigh wave dispersion curves, and high-wavenumber filtered Bouguer gravity observations. Our work has two main goals: 1) To better understand the tectonics of a prominent example of continental collision, and 2) To assess the improvements in earthquake location possible as a result of joint inversion. The body wave dataset is mainly derived from previous work on location calibration and includes the first-arrival P and S phases of 2500 earthquakes whose initial locations qualify as GT25 or better. The surface wave dataset consists of Rayleigh wave group velocity measurements for regional earthquakes, which are inverted for a suite of period-dependent Rayleigh wave velocity maps prior to inclusion in the joint inversion for body wave velocities. We use gravity anomalies derived from the global gravity model EGM2008. To avoid mapping broad, possibly dynamic features in the gravity field into variations in density and body wave velocity, we apply a high-pass wavenumber filter to the gravity measurements. We use a simple, approximate relationship between density and velocity so that the three datasets may be combined in a single inversion. The final optimized 3D Vp and Vs model allows us to explore how multi-parameter tomography addresses crustal heterogeneities in areas of limited coverage and improves travel time predictions. We compare earthquake locations from our models to independent locations obtained from InSAR analysis to assess the improvement in locations derived in a joint-inversion model in comparison to those derived in a more traditional body-wave-only velocity model.
The Large Scale Distribution of Water Ice in the Polar Regions of the Moon
NASA Astrophysics Data System (ADS)
Jordan, A.; Wilson, J. K.; Schwadron, N.; Spence, H. E.
2017-12-01
For in situ resource utilization, one must know where water ice is on the Moon. Many datasets have revealed both surface deposits of water ice and subsurface deposits of hydrogen near the lunar poles, but it has proved difficult to resolve the differences among the locations of these deposits. Despite these datasets disagreeing on how deposits are distributed on small scales, we show that most of these datasets do agree on the large scale distribution of water ice. We present data from the Cosmic Ray Telescope for the Effects of Radiation (CRaTER) on the Lunar Reconnaissance Orbiter (LRO), LRO's Lunar Exploration Neutron Detector (LEND), the Neutron Spectrometer on Lunar Prospector (LPNS), LRO's Lyman Alpha Mapping Project (LAMP), LRO's Lunar Orbiter Laser Altimeter (LOLA), and Chandrayaan-1's Moon Mineralogy Mapper (M3). All, including those that show clear evidence for water ice, reveal surprisingly similar trends with latitude, suggesting that both surface and subsurface datasets are measuring ice. All show that water ice increases towards the poles, and most demonstrate that its signature appears at about ±70° latitude and increases poleward. This is consistent with simulations of how surface and subsurface cold traps are distributed with latitude. This large scale agreement constrains the origin of the ice, suggesting that an ancient cometary impact (or impacts) created a large scale deposit that has been rendered locally heterogeneous by subsequent impacts. Furthermore, it also shows that water ice may be available down to ±70°—latitudes that are more accessible than the poles for landing.
NASA Technical Reports Server (NTRS)
Gottschalck, Jon; Meng, Jesse; Rodell, Matt; Houser, Paul
2005-01-01
Land surface models (LSMs) are computer programs, similar to weather and climate prediction models, which simulate the stocks and fluxes of water (including soil moisture, snow, evaporation, and runoff) and energy (including the temperature of and sensible heat released from the soil) after they arrive on the land surface as precipitation and sunlight. It is not currently possible to measure all of the variables of interest everywhere on Earth with sufficient accuracy and space-time resolution. Hence LSMs have been developed to integrate the available observations with our understanding of the physical processes involved, using powerful computers, in order to map these stocks and fluxes as they change in time. The maps are used to improve weather forecasts, support water resources and agricultural applications, and study the Earth's water cycle and climate variability. NASA's Global Land Data Assimilation System (GLDAS) project facilitates testing of several different LSMs with a variety of input datasets (e.g., precipitation, plant type). Precipitation is arguably the most important input to LSMs. Many precipitation datasets have been produced using satellite and rain gauge observations and weather forecast models. In this study, seven different global precipitation datasets were evaluated over the United States, where dense rain gauge networks contribute to reliable precipitation maps. We then used the seven datasets as inputs to GLDAS simulations, so that we could diagnose their impacts on output stocks and fluxes of water. In terms of totals, the Climate Prediction Center (CPC) Merged Analysis of Precipitation (CMAP) had the closest agreement with the US rain gauge dataset for all seasons except winter. The CMAP precipitation was also the most closely correlated in time with the rain gauge data during spring, fall, and winter, while the satellite-based estimates performed best in summer. The GLDAS simulations revealed that modeled soil moisture is highly sensitive to precipitation, with differences in spring and summer as large as 45% depending on the choice of precipitation input.
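A rough sketch of the kind of seasonal comparison described, computing a total bias and a temporal correlation against a gauge-based series for each season; the season-to-month mapping and variable names are illustrative.

```python
import numpy as np

def compare_precip(candidate, gauge, season_index):
    """Seasonal total bias and monthly correlation of a candidate precipitation series
    against a gauge-based analysis (1-D monthly series for the same region).

    season_index maps season names to 0-based month indices, e.g. {"DJF": [11, 0, 1]}.
    """
    candidate = np.asarray(candidate, dtype=float)
    gauge = np.asarray(gauge, dtype=float)
    results = {}
    for season, idx in season_index.items():
        sel = np.isin(np.arange(candidate.size) % 12, idx)
        results[season] = {
            "total_bias": candidate[sel].sum() - gauge[sel].sum(),
            "correlation": np.corrcoef(candidate[sel], gauge[sel])[0, 1],
        }
    return results
```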
Lung mass density analysis using deep neural network and lung ultrasound surface wave elastography.
Zhou, Boran; Zhang, Xiaoming
2018-05-23
Lung mass density is directly associated with lung pathology. Computed tomography (CT) evaluates lung pathology using the Hounsfield unit (HU) but does not measure lung density directly. We have developed a lung ultrasound surface wave elastography (LUSWE) technique to measure the surface wave speed of superficial lung tissue. The objective of this study was to develop a method for analyzing the lung mass density of superficial lung tissue using a deep neural network (DNN) and synthetic data of wave speed measurements from LUSWE. The synthetic training dataset of surface wave speed, excitation frequency, lung mass density, and viscoelasticity from LUSWE (788,000 samples in total) was used to train the DNN model. The DNN was composed of 3 hidden layers of 1024 neurons each and was trained for 10 epochs with a batch size of 4096 and a learning rate of 0.001, using three types of optimizers. The test dataset (4,000 samples) of wave speeds at three excitation frequencies (100, 150, and 200 Hz) and shear elasticity of superficial lung tissue was used to predict lung density and evaluate its accuracy against predefined lung mass densities. The technique was then validated in a sponge phantom experiment. The obtained results showed that predictions matched the test dataset well (validation accuracy of 0.992) as well as the experimental data from the sponge phantom experiment. This method may be useful for analyzing lung mass density by using the DNN model together with the surface wave speed and lung stiffness measurements. Copyright © 2018 Elsevier B.V. All rights reserved.
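A sketch of a network matching the described architecture (three hidden layers of 1024 neurons, regression output, learning rate 0.001); the input feature count, the ReLU activation, the mean-squared-error loss and the choice of Adam as one of the optimizers are assumptions rather than details taken from the paper.

```python
import tensorflow as tf

def build_lung_density_model(n_features=5):
    """Three hidden layers of 1024 neurons regressing lung mass density from wave
    speeds, excitation frequency and elasticity (the feature count is an assumption)."""
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(1024, activation="relu", input_shape=(n_features,)),
        tf.keras.layers.Dense(1024, activation="relu"),
        tf.keras.layers.Dense(1024, activation="relu"),
        tf.keras.layers.Dense(1),               # predicted lung mass density
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001), loss="mse")
    return model

# training would mirror the described setup, e.g.:
# model.fit(X_train, y_train, epochs=10, batch_size=4096)
```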
NASA Astrophysics Data System (ADS)
Barsai, Gabor
Creating accurate, current digital maps and 3-D scenes is a high priority in today's fast-changing environment. The nation's maps are in a constant state of revision, with many alterations or new additions each day. Digital maps have become quite common; Google Maps, MapQuest and others are examples, and these also have 3-D viewing capability. Many details are now included, such as the height of low bridges, in the attribute data for the objects displayed on digital maps and scenes. To expedite the updating of these datasets, they should be created autonomously, without human intervention, from data streams. Though systems exist that attain fast, or even real-time, performance in mapping and reconstruction, they are typically restricted to creating sketches from the data stream, and not accurate maps or scenes. The ever-increasing amount of image data available from private companies, governments and the internet suggests that the development of an automated system is of utmost importance. The proposed framework can create 3-D views autonomously, which extends the functionality of digital mapping. The first step in creating 3-D views is to reconstruct the scene of the area to be mapped. To reconstruct a scene from heterogeneous sources, the data have to be registered: either to each other or, preferably, to a general, absolute coordinate system. Registering an image is based on the reconstruction of the geometric relationship of the image to the coordinate system at the time of imaging. Registration is the process of determining the geometric transformation parameters of a dataset in one coordinate system, the source, with respect to the other coordinate system, the target. The advantage of fusing these datasets by registration lies in the complementary information that datasets of different modalities contain. The complementary characteristics of these systems can be fully utilized only after successful registration of the photogrammetric and alternative data relative to a common reference frame. This research provides a novel approach to finding registration parameters without the explicit use of conjugate points, using conjugate features instead. These features are open or closed free-form linear features; there is no need for a parametric or any other type of representation of these features. The proposed method uses datasets of different modalities over the same area: lidar data, image data and GIS data. There are two datasets: one from the Ohio State University and the other from San Bernardino, California. The reconstruction of scenes from imagery and range data, using laser and radar data, has been an active research area in the fields of photogrammetry and computer vision. Automation, or even just less human intervention, would have a great impact on alleviating the "bottleneck" that describes the current state of creating knowledge from data. Pixels or laser points, the output of the sensor, represent a discretization of the real world. By themselves, these data points do not contain representative information. The values that are associated with them, intensity values and coordinates, do not define an object, and thus accurate maps are not possible from data alone. Data are not an end product, nor do they directly provide answers to applications, although implicitly the information about the object in question is contained in the data.
In some form, the data from the initial acquisition by the sensor have to be further processed to create usable information, and this information has to be combined with facts, procedures and heuristics that can be used to make inferences for reconstruction. Reconstructing a scene perfectly, whether it is an urban or rural scene, requires prior knowledge and heuristics. Buildings usually have smooth surfaces, and many buildings are blocky with orthogonal, straight edges and sides; streets are smooth; vegetation is rough, with trees and bushes of different shapes and sizes. This research provides a path to fusing data from lidar, GIS and digital multispectral images and reconstructing a precise 3-D scene model, without human intervention, regardless of the type of data or the features in the data. The data are initially registered to each other using GPS/INS initial positional values; conjugate features are then found in the datasets to refine the registration. The novelty of the research is that no conjugate points are necessary across the various datasets, and registration is performed without human intervention. The proposed system uses the original lidar and GIS data and finds edges of buildings with the help of the digital images, utilizing the exterior orientation parameters to project the lidar points onto the edge-extracted image/map. These edge points are then utilized to orient and locate the datasets in a correct position with respect to each other.
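As an illustration of the generic registration refinement step only, and not of the thesis's conjugate-feature method (which avoids explicit point correspondences), the sketch below runs a plain ICP-style loop: match each source point to its nearest target point and re-estimate a rigid transform, assuming a coarse GPS/INS alignment has already been applied.

```python
import numpy as np
from scipy.spatial import cKDTree

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src points onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def refine_registration(source, target, iterations=20):
    """ICP-style refinement: repeatedly match each source point to its nearest target
    point and re-estimate the rigid transform; assumes a coarse initial alignment."""
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iterations):
        _, idx = tree.query(src)
        R, t = rigid_transform(src, target[idx])
        src = src @ R.T + t
    return src
```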
Quantifying the impact of human activity on temperatures in Germany
NASA Astrophysics Data System (ADS)
Benz, Susanne A.; Bayer, Peter; Blum, Philipp
2017-04-01
Human activity directly influences ambient air, surface and groundwater temperatures. Alterations of surface cover and land use influence the ambient thermal regime, causing spatial temperature anomalies, most commonly heat islands. These local temperature anomalies are primarily described within the bounds of large and densely populated urban settlements, where they form so-called urban heat islands (UHI). This study explores the anthropogenic impact not only for selected cities but for the thermal regime on a countrywide scale, by analyzing mean annual temperature datasets in Germany in three different compartments: measured surface air temperature (SAT), measured groundwater temperature (GWT), and satellite-derived land surface temperature (LST). As a universal parameter to quantify anthropogenic heat anomalies, the anthropogenic heat intensity (AHI) is introduced. It is closely related to the urban heat island intensity, but is determined individually for each pixel (for satellite-derived LST) or measurement point (for SAT and GWT) of a large, even global, dataset, regardless of land use and location. Hence, it provides the unique opportunity to a) compare the anthropogenic impact on temperatures in the air, at the surface and in the subsurface, b) find the main instances of anthropogenic temperature anomalies within the study area, in this case Germany, and c) study the impact of smaller settlements or industrial sites on temperatures. For all three analyzed temperature datasets, anthropogenic heat intensity grows with increasing nighttime lights and declines with increasing vegetation, whereas population density has only minor effects. While surface anthropogenic heat intensity cannot be linked to specific land cover types at the studied resolution (1 km × 1 km) and classification system, both air and groundwater show increased heat intensities for artificial surfaces. Overall, groundwater temperature appears most vulnerable to human activity; unlike land surface temperature and surface air temperature, groundwater temperatures are elevated in cultivated areas as well. At the surface of Germany, the highest anthropogenic heat intensity, 4.5 K, is found at an open-pit lignite mine near Jülich, followed by three large cities (Munich, Düsseldorf and Nuremberg) with annual mean anthropogenic heat intensities > 4 K. Overall, surface anthropogenic heat intensities > 0 K, and therefore urban heat islands, are observed in communities down to a population of 5,000.
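One possible way to compute a per-pixel anthropogenic heat intensity is sketched below: the temperature anomaly relative to the median of nearby background (rural or undisturbed) pixels. The search radius and the background definition are assumptions and need not match the paper's definition.

```python
import numpy as np

def anthropogenic_heat_intensity(temp, is_background, radius=10):
    """Per-pixel temperature anomaly relative to the median of nearby background pixels.

    temp: 2-D mean annual temperature grid; is_background: boolean mask marking
    rural/undisturbed pixels used as the local reference.
    """
    ny, nx = temp.shape
    ahi = np.full_like(temp, np.nan, dtype=float)
    for j in range(ny):
        for i in range(nx):
            j0, j1 = max(0, j - radius), min(ny, j + radius + 1)
            i0, i1 = max(0, i - radius), min(nx, i + radius + 1)
            ref = temp[j0:j1, i0:i1][is_background[j0:j1, i0:i1]]
            if ref.size:
                ahi[j, i] = temp[j, i] - np.median(ref)
    return ahi
```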
Rheem, Sungsue; Rheem, Insoo; Oh, Sejong
2017-01-01
Response surface methodology (RSM) is a useful set of statistical techniques for modeling and optimizing responses in food science research. In the analysis of response surface data, a second-order polynomial regression model is usually used. However, we sometimes encounter situations where the fit of the second-order model is poor. If the model fitted to the data has a poor fit, including a lack of fit, the modeling and optimization results might not be accurate. In such a case, using a fullest balanced model, which has no lack of fit, can fix this problem and enhance the accuracy of the response surface modeling and optimization. This article presents how to develop and use such a model for better modeling and optimization of the response, through an illustrative re-analysis of a dataset in Park et al. (2014), published in the Korean Journal for Food Science of Animal Resources.
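For reference, the usual second-order response surface in two factors can be fitted by least squares as sketched below; the article's fullest balanced model adds further terms to remove lack of fit, which this baseline sketch does not include.

```python
import numpy as np

def _design_matrix(x1, x2):
    """Columns for the standard second-order model:
    y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

def fit_second_order_rsm(x1, x2, y):
    """Least-squares estimates of the second-order response surface coefficients."""
    coeffs, *_ = np.linalg.lstsq(_design_matrix(x1, x2), y, rcond=None)
    return coeffs

def predict(coeffs, x1, x2):
    """Predicted response at the given factor settings."""
    return _design_matrix(x1, x2) @ coeffs
```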
NASA Technical Reports Server (NTRS)
Fasnacht, Zachary; Qin, Wenhan; Haffner, David P.; Loyola, Diego; Joiner, Joanna; Krotkov, Nickolay; Vasilkov, Alexander; Spurr, Robert
2017-01-01
Surface Lambertian-equivalent reflectivity (LER) is important for trace gas retrievals in the direct calculation of cloud fractions and the indirect calculation of the air mass factor. Current trace gas retrievals use climatological surface LERs. Surface properties that affect the bidirectional reflectance distribution function (BRDF), as well as varying satellite viewing geometry, can be important for the retrieval of trace gases. Geometry-Dependent LER (GLER) captures these effects through its calculation of sun-normalized radiances (I/F) and can be used in current LER algorithms (Vasilkov et al. 2016). Pixel-by-pixel radiative transfer calculations are computationally expensive for large datasets. Modern satellite missions such as the Tropospheric Monitoring Instrument (TROPOMI) produce very large datasets as they take measurements at much higher spatial and spectral resolutions. Look-up table (LUT) interpolation improves the speed of radiative transfer calculations, but its complexity increases for non-linear functions. Neural networks perform fast calculations and can accurately predict both non-linear and linear functions with little effort.
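A minimal sketch of the emulation idea: train a small neural network on offline radiative-transfer output so that I/F can be predicted from viewing geometry and surface parameters instead of interpolating a LUT. The network size, feature choices and scikit-learn implementation are assumptions, not the operational algorithm.

```python
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def train_radiance_emulator(X, y):
    """Fit a small neural network emulating sun-normalized radiance (I/F).

    X: (n_samples, n_features) geometry and surface BRDF parameters from offline
    radiative-transfer runs; y: (n_samples,) corresponding I/F values.
    """
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
    )
    model.fit(X, y)
    return model   # model.predict(X_new) replaces per-pixel LUT interpolation
```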
Forecasting surface water flooding hazard and impact in real-time
NASA Astrophysics Data System (ADS)
Cole, Steven J.; Moore, Robert J.; Wells, Steven C.
2016-04-01
Across the world, there is increasing demand for more robust and timely forecast and alert information on Surface Water Flooding (SWF). Within a UK context, the government's Pitt Review into the summer 2007 floods provided recommendations and impetus to improve the understanding of SWF risk, both for off-line design and for real-time forecasting and warning. Ongoing development and trialling of an end-to-end real-time SWF system is being progressed through the recently formed Natural Hazards Partnership (NHP), with delivery to the Flood Forecasting Centre (FFC) providing coverage over England & Wales. The NHP is a unique forum that aims to deliver coordinated assessments, research and advice on natural hazards for governments and resilience communities across the UK. Within the NHP, a real-time Hazard Impact Model (HIM) framework has been developed that includes SWF as one of three hazards chosen for initial trialling. The trial SWF HIM system uses dynamic gridded surface-runoff estimates from the Grid-to-Grid (G2G) hydrological model to estimate the SWF hazard. National datasets on population, infrastructure, property and transport are available to assess impact severity for a given rarity of SWF hazard. Whilst the SWF hazard footprint is calculated in real time using 1-, 3- and 6-hour accumulations of G2G surface runoff on a 1 km grid, it has been possible to associate these with the effective rainfall design profiles (at 250 m resolution) used as input to a detailed flood inundation model (JFlow+) run offline to produce hazard information resolved to 2 m resolution. This information is contained in the updated Flood Map for Surface Water (uFMfSW) held by the Environment Agency. The national impact datasets can then be used with the uFMfSW hazard dataset to assess impacts at this scale, with severity levels of potential impact assigned at 1 km and for aggregated county areas in real time. The impact component is being led by the Health and Safety Laboratory (HSL) within the NHP. Flood Guidance within the FFC employs the national Flood Risk Matrix, which categorises potential impacts as minimal, minor, significant or severe and likelihood as very low, low, medium or high; the matrix entries then define the overall flood risk as very low, low, medium or high. Likelihood is quantified by running G2G with Met Office ensemble rainfall inputs, which in turn allows a probability to be assigned to the SWF hazard and associated impact. This overall procedure is being trialled and refined off-line by CEH and HSL using case study data, and at the same time implemented as a pre-operational test system at the Met Office for evaluation by the FFC (a joint Environment Agency and Met Office centre for flood forecasting) in 2016.
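The combination of impact and likelihood classes into an overall risk category can be expressed as a simple lookup; the matrix entries below are illustrative only and are not the FFC's operational Flood Risk Matrix.

```python
IMPACTS = ["minimal", "minor", "significant", "severe"]
LIKELIHOODS = ["very low", "low", "medium", "high"]

# Illustrative entries only: rows are impact severity, columns are likelihood.
RISK = [
    ["very low", "very low", "very low", "low"],      # minimal impact
    ["very low", "low",      "low",      "medium"],   # minor impact
    ["low",      "medium",   "medium",   "high"],     # significant impact
    ["low",      "medium",   "high",     "high"],     # severe impact
]

def overall_flood_risk(impact, likelihood):
    """Combine an impact class and a likelihood class into an overall risk class."""
    return RISK[IMPACTS.index(impact)][LIKELIHOODS.index(likelihood)]

print(overall_flood_risk("significant", "medium"))    # -> "medium"
```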
Skipping Strategy (SS) for Initial Population of Job-Shop Scheduling Problem
NASA Astrophysics Data System (ADS)
Abdolrazzagh-Nezhad, M.; Nababan, E. B.; Sarim, H. M.
2018-03-01
Initial population in the job-shop scheduling problem (JSSP) is an essential step toward obtaining a near-optimal solution. Techniques used to solve the JSSP are computationally demanding. A skipping strategy (SS) is employed to acquire the initial population after the sequence of jobs on machines and the sequence of operations (expressed in Plates-jobs and mPlates-jobs) are determined. The proposed technique is applied to benchmark datasets and the results are compared to those of other initialization techniques. It is shown that the initial population obtained from the SS approach can generate optimal solutions.
New NOAA-15 Advanced Microwave Sounding Unit (AMSU) Datasets for Stratospheric Research
NASA Technical Reports Server (NTRS)
Spencer, Roy W.; Braswell, William D.
1999-01-01
The NOAA-15 spacecraft launched in May 1998 carried the first Advanced Microwave Sounding Unit (AMSU). The AMSU has eleven oxygen absorption channels with weighting functions peaking from near the surface to 2 mb. Twice-daily, limb-corrected, 1-degree gridded datasets of layer temperatures have been constructed since the AMSU went operational in early August 1998. Examples of AMSU imagery will be shown, as will preliminary analyses of daily fluctuations in tropical stratospheric temperatures and their relationship to daily variations in tropical-average rainfall measured by the Special Sensor Microwave Imager (SSM/I). The AMSU datasets are now available for other researchers to utilize.
An overview of results from the GEWEX radiation flux assessment
NASA Astrophysics Data System (ADS)
Raschke, E.; Stackhouse, P.; Kinne, S.; Contributors from Europe; the USA
2013-05-01
Multi-annual radiative flux averages from the International Satellite Cloud Climatology Project (ISCCP), the GEWEX Surface Radiation Budget (SRB) project and the Clouds and the Earth's Radiant Energy System (CERES) are compared and analyzed to characterize the Earth's radiative budget, assess differences and identify possible causes. These satellite-based datasets are also compared to results of a median model representing 20 climate models that participated in the 4th IPCC assessment. Consistent distribution patterns and seasonal variations among the satellite datasets demonstrate their scientific value, which would increase further if the datasets were reanalyzed with more accurate and consistent ancillary data.
Check your biosignals here: a new dataset for off-the-person ECG biometrics.
da Silva, Hugo Plácido; Lourenço, André; Fred, Ana; Raposo, Nuno; Aires-de-Sousa, Marta
2014-02-01
The Check Your Biosignals Here initiative (CYBHi) was developed as a way of creating a dataset and consistently repeatable acquisition framework, to further extend research in electrocardiographic (ECG) biometrics. In particular, our work targets the novel trend towards off-the-person data acquisition, which opens a broad new set of challenges and opportunities both for research and industry. While datasets with ECG signals collected using medical grade equipment at the chest can be easily found, for off-the-person ECG data the solution is generally for each team to collect their own corpus at considerable expense of resources. In this paper we describe the context, experimental considerations, methods, and preliminary findings of two public datasets created by our team, one for short-term and another for long-term assessment, with ECG data collected at the hand palms and fingers. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Defining surfaces for skewed, highly variable data
Helsel, D.R.; Ryker, S.J.
2002-01-01
Skewness of environmental data is often caused by more than simply a handful of outliers in an otherwise normal distribution. Statistical procedures for such datasets must be sufficiently robust to deal with distributions that are strongly non-normal, containing both a large proportion of outliers and a skewed main body of data. In the field of water quality, skewness is commonly associated with large variation over short distances. Spatial analysis of such data generally requires either considerable effort at modeling or the use of robust procedures not strongly affected by skewness and local variability. Using a skewed dataset of 675 nitrate measurements in ground water, commonly used methods for defining a surface (least-squares regression and kriging) are compared to a more robust method (loess). Three choices are critical in defining a surface: (i) is the surface to be a central mean or median surface? (ii) is either a well-fitting transformation or a robust and scale-independent measure of center used? (iii) does local spatial autocorrelation assist in or detract from addressing objectives? Published in 2002 by John Wiley & Sons, Ltd.
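A simplified one-dimensional sketch of the contrast discussed above between a least-squares fit and a robust local estimate on skewed data; it illustrates the idea only, not the loess/kriging surface workflow of the paper, and all values are synthetic.

```python
# Compare an OLS fit (pulled upward by a skewed tail) with a robust
# moving-window median estimate (a crude loess stand-in) on skewed data.
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 300))
true = 2 + 0.5 * x
y = true + rng.lognormal(mean=0.0, sigma=1.0, size=x.size)  # skewed, outlier-prone noise

slope, intercept = np.polyfit(x, y, 1)          # least-squares line

def local_median(x, y, width=1.0):
    """Median of y within a sliding window around each x (robust local estimate)."""
    return np.array([np.median(y[np.abs(x - xi) <= width]) for xi in x])

robust = local_median(x, y)
print("OLS estimate at x=5:        ", intercept + slope * 5)
print("local median estimate at x=5:", robust[np.argmin(np.abs(x - 5))])
```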
Hou, Ying-Yu; He, Yan-Bo; Wang, Jian-Lin; Tian, Guo-Liang
2009-10-01
Based on the time-series 10-day composite NOAA Pathfinder AVHRR Land (PAL) dataset (8 km x 8 km), and using the land surface energy balance equation and the "VI-Ts" (vegetation index-land surface temperature) method, a new algorithm for land surface evapotranspiration (ET) was constructed. This new algorithm did not need support from meteorological observation data, and all of its parameters and variables were directly inverted or derived from remote sensing data. A widely accepted remote sensing ET model, i.e., the SEBS model, was chosen to validate the new algorithm. The validation test showed that the ET and its seasonal variation trend estimated by the SEBS model and the new algorithm agreed well, suggesting that the ET estimated from the new algorithm was reliable and able to reflect the actual land surface ET. The new remote sensing ET algorithm is practical and operational, and offers a new approach to studying the spatiotemporal variation of ET at continental and global scales based on long-term time series of satellite remote sensing images.
Influence of crisp values on the object-based data extraction procedure from LiDAR data
NASA Astrophysics Data System (ADS)
Tomljenovic, Ivan; Rousell, Adam
2014-05-01
Nowadays a plethora of approaches attempt to automate the process of object extraction from LiDAR data. However, the majority of these methods require the fusion of the LiDAR dataset with other information such as photogrammetric imagery. The approach used as the basis for this paper is a novel method that makes use of human knowledge and the CNL modelling language to automatically extract buildings solely from LiDAR point cloud data in a transferable way. A number of rules are implemented to generate an artificial intelligence algorithm which is used for the object extraction. Although the single-dataset method has been found to successfully extract building footprints from the point cloud dataset, at this initial stage it has one restriction that may limit its effectiveness: a number of the rules used are based on crisp boundary values. If, for example, the slope of the ground surface is used as a rule for determining objects, then the slope value of a pixel would be assessed to determine whether it is suitable for a building structure. This check would be performed by identifying whether the slope value is less than or greater than a threshold value. However, in reality such a crisp classification process is unlikely to be a true reflection of real-world scenarios. For example, using the crisp methods, a difference of 1° in slope could result in one region in a dataset being deemed suitable and its neighboring region being seen as not suitable, even though there is in reality little difference in the actual suitability of the two neighboring regions. A more suitable classification process may be the use of fuzzy set theory, whereby each region has a degree of membership in a number of sets (or classifications). In the above example, the two regions would likely have very similar membership values for the different sets, although this obviously depends on factors such as the extent of each region. The purpose of this study is to identify what effect the use of explicit boundary values has on the overall extracted building footprint dataset. By performing the analysis multiple times using differing threshold values for the rules, it is possible to compare the resultant datasets and thus identify the impact of using such classification procedures. If a significant difference is found between the resultant datasets, this would highlight that the use of such crisp methods in the extraction process may not be optimal and that a future enhancement to the method would be to consider the use of fuzzy classification methods.
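A brief sketch of the crisp-versus-fuzzy contrast described above; the 10° threshold and the fuzzy ramp bounds are hypothetical illustration values, not the rules used in the study.

```python
# Crisp threshold versus fuzzy membership for a slope-based suitability rule.
import numpy as np

slopes = np.array([8.9, 9.9, 10.1, 11.0, 25.0])  # slope values in degrees

# Crisp rule: suitable only if slope < 10 degrees
crisp = slopes < 10.0

# Fuzzy alternative: full membership below 8 degrees, none above 12, linear ramp between
fuzzy = np.clip((12.0 - slopes) / (12.0 - 8.0), 0.0, 1.0)

for s, c, f in zip(slopes, crisp, fuzzy):
    print(f"slope {s:5.1f}  crisp={c!s:5}  fuzzy membership={f:.2f}")
# Note: 9.9 and 10.1 receive opposite crisp labels but nearly identical fuzzy memberships.
```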
NASA Astrophysics Data System (ADS)
Lovell, Arminel; Carr, Rachel; Stokes, Chris
2017-04-01
Himalayan glaciers have shrunk rapidly during the past twenty years. Understanding the factors controlling these losses is vital for forecasting changes in water resources, as the Himalaya houses the headwaters of major river systems, with densely populated catchments downstream. However, our knowledge of Himalayan glaciers is comparatively limited, due to their high-altitude, remote location. This is particularly the case in the Annapurna-Manaslu region, which has received relatively little scientific attention to date. Here, we present initial findings from remotely sensed data analysis and from our first field campaign in October 2016. Feature tracking of Band 8 Landsat imagery demonstrates that velocities in the region reach a maximum of 70-100 m a-1, which is somewhat faster than those reported in the Khumbu region (e.g. Quincey et al. 2009). A number of glaciers have substantial stagnant ice tongues, and most are flowing faster in the upper ablation zone than in the lower sections. The most rapidly flowing glaciers are located in the south-east of the Annapurna-Manaslu region and tend also to be the largest. Interestingly, initial observations suggest that the debris-covered ablation zones in the south-east are flowing more rapidly than the smaller, clean-ice glaciers in the north of the region. Comparison of velocities between 2000-2001 and 2014-2015 suggests deceleration on some glacier tongues. In October 2016, we conducted fieldwork on Annapurna South Glacier, located at the foot of Annapurna I. Here, we collected a number of datasets, with the aim of assessing the relationship between surface elevation change, ice velocities and debris cover. These included: i) installing ablation stakes in areas with varying debris cover; ii) quantifying debris characteristics, using Wolman counting and by measuring thickness; iii) surveying the glacier surface, using a differential GPS; iv) monitoring ice cliff melting, using Structure from Motion; and v) measuring surface and sub-surface temperatures, using i-buttons. Initial results demonstrate large spatial variability in debris characteristics and thickness, which in turn appears to substantially influence melt rates. The surface topography is highly uneven and a number of ice cliffs are present, where melt rates appear to be much higher than in surrounding areas. Interestingly, we observed very few surface melt ponds and little surface meltwater, which we suggest may be due to the basal topography and/or debris characteristics; we aim to investigate this further during our 2017 fieldwork.
A novel binary shape context for 3D local surface description
NASA Astrophysics Data System (ADS)
Dong, Zhen; Yang, Bisheng; Liu, Yuan; Liang, Fuxun; Li, Bijun; Zang, Yufu
2017-08-01
3D local surface description is now at the core of many computer vision technologies, such as 3D object recognition, intelligent driving, and 3D model reconstruction. However, most existing 3D feature descriptors still suffer from low descriptiveness, weak robustness, and inefficiency in both time and memory. To overcome these challenges, this paper presents a robust and descriptive 3D Binary Shape Context (BSC) descriptor with high efficiency in both time and memory. First, a novel BSC descriptor is generated for 3D local surface description, and the performance of the BSC descriptor under different settings of its parameters is analyzed. Next, the descriptiveness, robustness, and efficiency in both time and memory of the BSC descriptor are evaluated and compared to those of several state-of-the-art 3D feature descriptors. Finally, the performance of the BSC descriptor for 3D object recognition is also evaluated on a number of popular benchmark datasets and on an urban-scene dataset collected by a terrestrial laser scanner system. Comprehensive experiments demonstrate that the proposed BSC descriptor achieves high descriptiveness, strong robustness, and high efficiency in both time and memory, with recognition rates of 94.8%, 94.1% and 82.1% on the UWA, Queen, and WHU datasets, respectively.
Linking Satellite Derived Land Surface Temperature with Cholera: A Case Study for South Sudan
NASA Astrophysics Data System (ADS)
Aldaach, H. S. V.; Jutla, A.; Akanda, A. S.; Colwell, R. R.
2014-12-01
A sudden onset of cholera in South Sudan in April 2014, in Northern Bari in Juba town, resulted in more than 400 cholera cases within four weeks of the initial outbreak, with a case fatality rate (CFR) of 5.4%. The total number of reported cholera cases for the period April to July 2014 was 5,141, including 114 deaths. Given the limited efficacy of cholera vaccines, it is necessary to develop mechanisms to predict cholera occurrence and thereafter devise intervention strategies for mitigating impacts of the disease. Hydroclimatic processes, primarily precipitation and air temperature, are related to epidemic and episodic outbreaks of cholera. However, due to the coarse resolution of both datasets, it is not possible to precisely locate the geographical location of disease. Here, using Land Surface Temperature (LST) from MODIS sensors, we have developed an algorithm to identify regions susceptible to cholera. Conditions for the occurrence of cholera were detectable at least one month in advance in South Sudan and were statistically sensitive to hydroclimatic anomalies of land surface and air temperature, and precipitation. Our results indicate that significant spatial and temporal averaging is required to infer usable information from LST over South Sudan. Preliminary results show that the geographical location of the cholera outbreak was identifiable within the 1 km resolution of the LST data.
Agriculture-driven deforestation in the tropics from 1990-2015: emissions, trends and uncertainties
NASA Astrophysics Data System (ADS)
Carter, Sarah; Herold, Martin; Avitabile, Valerio; de Bruin, Sytze; De Sy, Veronique; Kooistra, Lammert; Rufino, Mariana C.
2018-01-01
Limited data exist on emissions from agriculture-driven deforestation, and the available data are typically uncertain. In this paper, we provide comparable estimates of emissions from both all deforestation and agriculture-driven deforestation, with uncertainties, for 91 countries across the tropics between 1990 and 2015. Uncertainties associated with the input datasets (activity data and emission factors) were used to combine the datasets, so that the most certain datasets contribute the most. This method utilizes all the input data while minimizing the uncertainty of the emissions estimate. The uncertainty of the input datasets was influenced by the quality of the data, the sample size (for sample-based datasets), and the extent to which the timeframe of the data matches the period of interest. The area of deforestation and the agriculture-driver factor (the extent to which agriculture drives deforestation) were the most uncertain components of the emissions estimates, so improvements in the uncertainties of these estimates will provide the greatest reductions in the uncertainties of emissions estimates. Over the period of the study, Latin America had the highest proportion of deforestation driven by agriculture (78%), and Africa had the lowest (62%). Latin America had the highest emissions from agriculture-driven deforestation, and these peaked at 974 ± 148 Mt CO2 yr-1 in 2000-2005. Africa saw a continuous increase in emissions between 1990 and 2015 (from 154 ± 21 to 412 ± 75 Mt CO2 yr-1), so mitigation initiatives could be prioritized there. Uncertainties for emissions from agriculture-driven deforestation average ± 62.4% over 1990-2015, and were highest in Asia and lowest in Latin America. Uncertainty information is crucial for transparency when reporting and gives credibility to related mitigation initiatives. We demonstrate that uncertainty data can also be useful when combining multiple open datasets, so we recommend that new data providers include this information.
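A minimal sketch of combining estimates so that the most certain datasets contribute the most, assuming inverse-variance weighting; the paper's exact combination scheme may differ, and the numbers below are placeholders.

```python
# Inverse-variance weighted combination of several emission estimates.
import numpy as np

estimates = np.array([950.0, 1010.0, 880.0])   # e.g. Mt CO2 yr-1 from three datasets (toy values)
uncerts   = np.array([150.0, 90.0, 200.0])     # 1-sigma uncertainties of each estimate

weights = 1.0 / uncerts**2                     # most certain dataset gets the largest weight
combined = np.sum(weights * estimates) / np.sum(weights)
combined_unc = np.sqrt(1.0 / np.sum(weights))  # assumes independent errors

print(f"combined estimate: {combined:.0f} +/- {combined_unc:.0f}")
```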
USDA-ARS?s Scientific Manuscript database
Land surface albedo has been recognized by the Global Terrestrial Observing System (GTOS) as an essential climate variable crucial for accurate modeling and monitoring of the Earth’s radiative budget. While global climate studies can leverage albedo datasets from MODIS, VIIRS, and other coarse-reso...
Assessment of surface runoff depth changes in Sărăţel River basin, Romania using GIS techniques
NASA Astrophysics Data System (ADS)
Romulus, Costache; Iulia, Fontanine; Ema, Corodescu
2014-09-01
Sărăţel River basin, which is located in the Curvature Subcarpathian area, has been facing an obvious increase in the frequency of hydrological risk phenomena associated with torrential events during recent years. This trend is highly related to the increase in frequency of extreme climatic phenomena and to land use changes. The present study aims to highlight the spatial and quantitative changes in surface runoff depth that occurred in the Sărăţel catchment between 1990 and 2006. This purpose was reached by estimating the surface runoff depth attributable to the average annual rainfall by means of the SCS-CN method, which was integrated into the GIS environment through the ArcCN-Runoff extension for ArcGIS 10.1. In order to compute the surface runoff depth by the CN method, the land cover and the hydrological soil classes were introduced as vector (polygon) data, while the curve number and the average annual rainfall were introduced as tables. After spatially modeling the surface runoff depth for the two years, the 1990 raster dataset was subtracted from the 2006 raster dataset in order to highlight the changes in surface runoff depth.
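A short sketch of the SCS Curve Number runoff relation applied (via ArcCN-Runoff) in the study; the rainfall and CN values below are arbitrary examples.

```python
# SCS-CN direct runoff depth: Q = (P - Ia)^2 / (P - Ia + S), with S = 25400/CN - 254 (mm)
# and the common initial abstraction Ia = 0.2 * S.
def scs_runoff_depth_mm(rainfall_mm: float, cn: float) -> float:
    """Direct runoff depth Q (mm) from rainfall P (mm) and curve number CN."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = 0.2 * s                      # initial abstraction (mm)
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

# Example: the same storm over pasture-like cover (CN ~ 69) versus urban cover (CN ~ 92)
print(scs_runoff_depth_mm(60.0, 69))  # ~9 mm of runoff
print(scs_runoff_depth_mm(60.0, 92))  # ~40 mm of runoff
```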
NASA Astrophysics Data System (ADS)
Chou, H. K.; Ochoa-Tocachi, B. F.; Buytaert, W.
2017-12-01
Community land surface models such as JULES are increasingly used for hydrological assessment because of their state-of-the-art representation of land-surface processes. However, a major weakness of JULES and other land surface models is the limited number of land surface parameterizations available. Therefore, this study explores the use of data from a network of catchments under homogeneous land use to generate parameter "libraries" that extend the land surface parameterizations of JULES. The network (called iMHEA) is part of a grassroots initiative to characterise the hydrological response of different Andean ecosystems, and collects data on streamflow, precipitation, and several weather variables at high temporal resolution. The tropical Andes are a useful case study because of the complexity of meteorological and geographical conditions, combined with extremely heterogeneous land use, that results in a wide range of hydrological responses. We then calibrated JULES for each land use represented in the iMHEA dataset. For the individual land-use types, the results show improved simulations of streamflow when using the calibrated parameters with respect to default values. In particular, the partitioning between surface and subsurface flows can be improved. On a regional scale, hydrological modelling also benefited greatly from constraining parameters using such distributed, citizen-science-generated streamflow data. This study demonstrates regional hydrological modelling and prediction that integrates citizen science with a land surface model. In this context, the limitation of data scarcity can indeed be addressed using such a framework. Improved predictions of such impacts could be leveraged by catchment managers to guide watershed interventions, to evaluate their effectiveness, and to minimize risks.
MOLA-Based Landing Site Characterization
NASA Technical Reports Server (NTRS)
Duxbury, T. C.; Ivanov, A. B.
2001-01-01
The Mars Global Surveyor (MGS) Mars Orbiter Laser Altimeter (MOLA) data provide a basis for site characterization and selection never before possible. The basic MOLA information includes absolute radii, elevation and 1-micrometer albedo, with derived datasets including digital image models (DIMs: illuminated elevation data), slope maps and slope statistics, and small-scale surface roughness maps and statistics. These quantities are useful in downsizing potential sites based on descent engineering constraints and landing/roving hazard and mobility assessments. Slope baselines at the few-hundred-meter level and surface roughness at the 10-meter level are possible. Additionally, the MOLA-derived Mars surface offers the possibility to precisely register and map-project other instrument datasets (images, ultraviolet, infrared, radar, etc.) taken at different resolutions, viewing and lighting geometries, building multiple layers of an information cube for site characterization and selection. Examples of direct MOLA data, data derived from MOLA, and other instrument data registered to MOLA are given for the Hematite area.
Phobos spectral clustering: first results using the MRO-CRISM 0.4-2.5 micron dataset
NASA Astrophysics Data System (ADS)
Pajola, M.; Roush, T. L.; Marzo, G. A.; Simioni, E.
2016-12-01
Whether Phobos is a captured asteroid or formed in situ around Mars is still an outstanding question within the scientific community. The proposed Japanese Mars Moon eXploration (MMX) sample return mission has the chief scientific objective of solving this conundrum, reaching Phobos in the early 2020s and returning Phobos samples to Earth a few years later. Nonetheless, well before surface samples are returned to Earth, there are important spectral datasets that can be mined in order to constrain Phobos' surface properties and address implications regarding Phobos' origin. One of these is the set of MRO-CRISM multispectral observations of Phobos. The MRO-CRISM visible and infrared observations (0.4-2.5 micron) are here corrected for the incidence and emission angles of the observation. Unlike previous studies of the MRO-CRISM data that selected specific regions for analysis, we apply a statistical technique that identifies different clusters based on a K-means partitioning algorithm. Selecting specific wavelength ranges of Phobos' reflectance spectra permits identification of possible mineralogical compounds and their spatial distribution on the surface of Phobos. This work paves the way to a deeper analysis of the available Phobos dataset, potentially identifying regions of interest on the surface of Phobos that may warrant more detailed investigation by the MMX mission as potential sampling areas. Acknowledgments: M. Pajola was supported for this research by an appointment to the NASA Postdoctoral Program at the Ames Research Center administered by USRA.
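A minimal sketch of the clustering step, assuming scikit-learn's KMeans applied to per-pixel reflectance spectra; the spectra generated here are synthetic placeholders, not CRISM observations.

```python
# K-means partitioning of per-pixel reflectance spectra into spectral clusters.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
n_pixels, n_bands = 5000, 72          # hypothetical sampling of a 0.4-2.5 micron range

# Two synthetic spectral "units" with slightly different slopes plus noise
spectra = np.vstack([
    0.04 + 0.02 * np.linspace(0, 1, n_bands) + 0.002 * rng.standard_normal((n_pixels // 2, n_bands)),
    0.06 + 0.01 * np.linspace(0, 1, n_bands) + 0.002 * rng.standard_normal((n_pixels // 2, n_bands)),
])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(spectra)
labels = km.labels_                   # cluster index per pixel, mappable back onto the surface
print(np.bincount(labels))            # pixel count per spectral cluster
```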
Lu, Yongtao; Boudiffa, Maya; Dall'Ara, Enrico; Bellantuono, Ilaria; Viceconti, Marco
2015-11-01
In vivo micro-computed tomography (µCT) scanning is an important tool for longitudinal monitoring of the bone adaptation process in animal models. However, the errors associated with the usage of in vivo µCT measurements for the evaluation of bone adaptations remain unclear. The aim of this study was to evaluate the measurement errors using the bone surface distance approach. The right tibiae of eight 14-week-old C57BL/6 J female mice were consecutively scanned four times in an in vivo µCT scanner using a nominal isotropic image voxel size (10.4 µm) and the tibiae were repositioned between each scan. The repeated scan image datasets were aligned to the corresponding baseline (first) scan image dataset using rigid registration and a region of interest was selected in the proximal tibia metaphysis for analysis. The bone surface distances between the repeated and the baseline scan datasets were evaluated. It was found that the average (±standard deviation) median and 95th percentile bone surface distances were 3.10 ± 0.76 µm and 9.58 ± 1.70 µm, respectively. This study indicated that there were inevitable errors associated with the in vivo µCT measurements of bone microarchitecture and these errors should be taken into account for a better interpretation of bone adaptations measured with in vivo µCT. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
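A minimal sketch of the bone-surface-distance evaluation, assuming nearest-neighbour distances between registered point clouds computed with a SciPy KD-tree; the point clouds below are random placeholders, not µCT segmentations.

```python
# Nearest-neighbour surface distances from a repeat-scan surface to the baseline
# surface after rigid registration, summarized by the median and 95th percentile.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
baseline_pts = rng.uniform(0, 1, size=(20000, 3))                              # baseline surface points (mm)
repeat_pts = baseline_pts + 0.003 * rng.standard_normal(baseline_pts.shape)    # registered repeat scan

dists, _ = cKDTree(baseline_pts).query(repeat_pts)   # distance to closest baseline point
print("median surface distance:", np.median(dists))
print("95th percentile distance:", np.percentile(dists, 95))
```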
NASA Astrophysics Data System (ADS)
He, T.; Liang, S.; Zhang, Y.
2017-12-01
Massive melting events over Greenland have been observed over the past few decades. Accompanying the melting events are surface albedo changes, which have temporal and spatial variations. Albedo changes over Greenland during the past few decades have been reported in previous studies with the help of satellite observations; however, the magnitudes and timing of the albedo trends differ greatly among those studies. This has limited our understanding of albedo change mechanisms over Greenland. In this study, we present an analysis of surface albedo change over Greenland since the 1980s combining four satellite albedo datasets: MODIS, GLASS, CLARA, and Landsat. The MODIS, GLASS, and CLARA albedo data are publicly available, and the Landsat albedos were derived in our earlier study in an effort to bridge the scale difference between coarse-resolution data and the ground measurements available from the early 1980s. Inter-comparisons were made among the satellite albedos and against ground measurements. We have several new findings. First, trends in surface albedo change among the satellite albedo datasets generally agree with each other and with ground measurements. Second, all datasets show negative albedo trends after 2000, but the magnitudes differ greatly. Third, trends before 2000 from coarse-resolution data are not significant, but the Landsat data show positive albedo changes. Fourth, the turning point of the albedo trend was found to be earlier than 2000. These findings may open new research topics on the timing and magnitude, and provide an improved understanding of the mechanisms, of albedo changes over Greenland during the past few decades.
NASA Astrophysics Data System (ADS)
Yun, S.; Koketsu, K.; Aoki, Y.
2014-12-01
The September 4, 2010, Canterbury earthquake, with a moment magnitude (Mw) of 7.1, was a crustal earthquake in the South Island, New Zealand. The February 22, 2011, Christchurch earthquake (Mw = 6.3) was the biggest aftershock of the 2010 Canterbury earthquake, located about 50 km east of the mainshock. Both earthquakes occurred on previously unrecognized faults. Field observations indicate that the rupture of the 2010 Canterbury earthquake reached the surface; the surface rupture, with a length of about 30 km, is located about 4 km south of the epicenter. Various data, including the aftershock distribution and strong-motion seismograms, also suggest a very complex rupture process. For these reasons it is useful to investigate the complex rupture process using multiple data with various sensitivities to the rupture process. While previously published source models are based on one or two datasets, here we infer the rupture process with three datasets: InSAR, strong-motion, and teleseismic data. We first performed point-source inversions to derive the focal mechanism of the 2010 Canterbury earthquake. Based on the focal mechanism, the aftershock distribution, the surface fault traces and the SAR interferograms, we assigned several source faults. We then performed a joint inversion to determine the rupture process of the 2010 Canterbury earthquake most suitable for reproducing all the datasets. The obtained slip distribution is in good agreement with the surface fault traces. We also performed similar inversions to reveal the rupture process of the 2011 Christchurch earthquake. Our result indicates a steep dip and large up-dip slip. This reveals that the observed large vertical ground motion around the source region is due to the rupture process, rather than to the local subsurface structure. To investigate the effects of the 3-D velocity structure on the characteristic strong-motion seismograms of the two earthquakes, we plan to perform the inversion taking the 3-D velocity structure of this region into account.
Reconstruction of Arctic surface temperature in past 100 years using DINEOF
NASA Astrophysics Data System (ADS)
Zhang, Qiyi; Huang, Jianbin; Luo, Yong
2015-04-01
Global annual mean surface temperature has apparently not risen since 1998, a pattern described in recent years as the global warming hiatus. However, measuring temperature variability in the Arctic is difficult because of large gaps in coverage of the Arctic region in most observed gridded datasets. Since the Arctic has experienced rapid temperature change in recent years, known as polar amplification, and the temperature rise in the Arctic is faster than the global mean, unobserved temperatures in the central Arctic produce a cold bias in both global and Arctic temperature estimates compared with model simulations and reanalysis datasets. Moreover, some datasets that have complete coverage of the Arctic but a short temporal span cannot show Arctic temperature variability over long periods. Data Interpolating Empirical Orthogonal Functions (DINEOF) was applied to fill the coverage gap of NASA's Goddard Institute for Space Studies Surface Temperature Analysis (GISTEMP, 250 km smoothing) product in the Arctic using the IABP dataset, which covers the entire Arctic region between 1979 and 1998, and to reconstruct Arctic temperature over 1900-2012. This method provided a temperature reconstruction in the central Arctic and a precise estimation of both global and Arctic temperature variability over a long temporal span. Results have been verified against additional independent station records in the Arctic using statistical analysis, such as variance and standard deviation. The reconstruction shows a significant warming trend in the Arctic over the recent 30 years: the temperature trend in the Arctic since 1997 is 0.76°C per decade, compared with 0.48°C and 0.67°C per decade from the 250 km and 1200 km smoothed versions of GISTEMP. The global temperature trend is twice as large after using DINEOF. These discrepancies stress the importance of fully accounting for temperature variability in the Arctic, because gaps in Arctic coverage cause an apparent cold bias in temperature estimates. The result for global surface temperature also indicates that global warming in recent years is not as slow as previously thought.
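A simplified sketch of the DINEOF idea, iteratively filling gaps with a truncated SVD reconstruction; a full DINEOF implementation additionally cross-validates the number of retained modes, which is omitted here, and the toy field below is not observational data.

```python
# Iterative EOF (truncated SVD) gap filling of a space-time matrix with NaNs.
import numpy as np

def dineof_fill(data, n_modes=3, n_iter=50):
    """data: 2-D array (time x grid cells) with NaNs at unobserved points."""
    filled = np.where(np.isnan(data), np.nanmean(data), data)  # initial guess: overall mean
    mask = np.isnan(data)
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(filled, full_matrices=False)
        recon = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes]
        filled[mask] = recon[mask]        # only the missing entries are updated
    return filled

# Toy test: a rank-1 space-time field with 30% of values removed
rng = np.random.default_rng(4)
field = np.outer(np.sin(np.linspace(0, 6, 100)), np.linspace(-1, 1, 50))
obs = field.copy()
obs[rng.uniform(size=obs.shape) < 0.3] = np.nan
print("max reconstruction error:", np.max(np.abs(dineof_fill(obs, n_modes=1) - field)))
```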
ERIC Educational Resources Information Center
Jones, Francis
2018-01-01
We compared seven unrelated data-sets to evaluate a major education improvement initiative. Perceptions of students in 54 course sections were surveyed regarding the helpfulness of 39 specific teaching or learning strategies, and relative workloads and enthusiasm were compared to their other courses. Classes were observed using an established…
Dhyani, Manish; Grajo, Joseph R; Rodriguez, Dayron; Chen, Zhikui; Feldman, Adam; Tambouret, Rosemary; Gervais, Debra A; Arellano, Ronald S; Hahn, Peter F; Samir, Anthony E
2017-06-01
To evaluate whether the Aorta-Lesion-Attenuation-Difference on contrast-enhanced CT can aid in the differentiation of malignant and benign oncocytic renal neoplasms. Two independent cohorts, an initial (biopsy) dataset and a validation (surgical) dataset, with oncocytomas and chromophobe renal cell carcinomas (chRCC) were included in this IRB-approved retrospective study. A region of interest was placed on the renal mass and the abdominal aorta on the same CT image slice to calculate an Aorta-Lesion-Attenuation-Difference (ALAD). ROC curves were plotted for different enhancement phases, and the diagnostic performance of ALAD for differentiating chRCC from oncocytomas was calculated. Seventy-nine renal masses (56 oncocytomas, 23 chRCC) were analyzed in the initial (biopsy) dataset. Thirty-six renal masses (16 oncocytomas, 20 chRCC) were reviewed in the validation (surgical) cohort. ALAD showed a statistically significant difference between oncocytomas and chromophobes during the nephrographic phase (p < 0.001), early excretory phase (p < 0.001), and excretory phase (p = 0.029). The area under the ROC curve for the nephrographic phase was 1.00 (95% CI: 1.00-1.00) for the biopsy dataset and showed the narrowest confidence interval. At a threshold value of 25.5 HU, sensitivity was 100% (82.2%-100%) and specificity was 81.5% (61.9%-93.7%). When tested on the validation dataset with measurements made by an independent reader, the AUROC was 0.93 (95% CI: 0.84-1.00), with a sensitivity of 100% (80.0%-100%) and a specificity of 87.5% (60.4%-97.8%). Nephrographic-phase ALAD has potential to differentiate benign and malignant oncocytic renal neoplasms on contrast-enhanced CT if histologic evaluation on biopsy is indeterminate.
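A minimal sketch of the ALAD computation and an ROC evaluation, assuming scikit-learn's roc_auc_score; the HU values are invented, and the direction of the association (larger ALAD for chRCC) is assumed only for illustration, with the 25.5 HU threshold being the one quoted in the abstract.

```python
# Compute ALAD = aorta HU - lesion HU and evaluate it as a classifier score.
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical nephrographic-phase attenuation measurements (HU)
aorta_hu  = np.array([180, 175, 190, 185, 170, 178])
lesion_hu = np.array([140, 160, 120, 150, 100, 105])
is_chrcc  = np.array([0, 0, 0, 0, 1, 1])   # 1 = chromophobe RCC, 0 = oncocytoma

alad = aorta_hu - lesion_hu                 # Aorta-Lesion-Attenuation-Difference
print("AUROC:", roc_auc_score(is_chrcc, alad))
print("classified as chRCC at the 25.5 HU threshold:", alad > 25.5)
```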
MIDG-Emerging grid technologies for multi-site preclinical molecular imaging research communities.
Lee, Jasper; Documet, Jorge; Liu, Brent; Park, Ryan; Tank, Archana; Huang, H K
2011-03-01
Molecular imaging is the visualization and identification of specific molecules in anatomy for insight into metabolic pathways, tissue consistency, and the tracing of solute transport mechanisms. This paper presents the Molecular Imaging Data Grid (MIDG), which utilizes emerging grid technologies in preclinical molecular imaging to facilitate data sharing and discovery between preclinical molecular imaging facilities and their collaborating investigator institutions to expedite translational sciences research. Grid-enabled archiving, management, and distribution of animal-model imaging datasets help preclinical investigators to monitor, access and share their imaging data remotely, and help preclinical imaging facilities to share published imaging datasets as resources for new investigators. The system architecture of the Molecular Imaging Data Grid is described in a four-layer diagram. A data model for preclinical molecular imaging datasets is also presented, based on imaging modalities currently used in a molecular imaging center. The MIDG system components and connectivity are presented. Finally, the workflow steps for grid-based archiving, management, and retrieval of preclinical molecular imaging data are described. Initial performance tests of the Molecular Imaging Data Grid system have been conducted at the USC IPILab using dedicated VMware servers. System connectivity, evaluated datasets, and preliminary results are presented. The results show the system's feasibility, its limitations, and directions for future research. Translational and interdisciplinary research in medicine is increasingly interested in cellular and molecular biology activity at the preclinical level, utilizing molecular imaging methods on animal models. The task of integrated archiving, management, and distribution of these preclinical molecular imaging datasets at preclinical molecular imaging facilities is challenging due to disparate imaging systems and multiple off-site investigators. A Molecular Imaging Data Grid design, implementation, and initial evaluation is presented to demonstrate a secure and novel data grid solution for sharing preclinical molecular imaging data across the wide-area network (WAN).
Fourcade, Yoan; Engler, Jan O; Rödder, Dennis; Secondi, Jean
2014-01-01
MAXENT is now a common species distribution modeling (SDM) tool used by conservation practitioners for predicting the distribution of a species from a set of records and environmental predictors. However, datasets of species occurrence used to train the model are often biased in the geographical space because of unequal sampling effort across the study area. This bias may be a source of strong inaccuracy in the resulting model and could lead to incorrect predictions. Although a number of sampling bias correction methods have been proposed, there is no consensual guideline to account for it. We compared here the performance of five methods of bias correction on three datasets of species occurrence: one "virtual" derived from a land cover map, and two actual datasets for a turtle (Chrysemys picta) and a salamander (Plethodon cylindraceus). We subjected these datasets to four types of sampling biases corresponding to potential types of empirical biases. We applied five correction methods to the biased samples and compared the outputs of distribution models to unbiased datasets to assess the overall correction performance of each method. The results revealed that the ability of methods to correct the initial sampling bias varied greatly depending on bias type, bias intensity and species. However, the simple systematic sampling of records consistently ranked among the best performing across the range of conditions tested, whereas other methods performed more poorly in most cases. The strong effect of initial conditions on correction performance highlights the need for further research to develop a step-by-step guideline to account for sampling bias. However, this method seems to be the most efficient in correcting sampling bias and should be advised in most cases.
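A sketch of one way to implement the systematic spatial sampling of records that performed best above: keeping at most one occurrence per cell of a regular grid, which evens out spatially uneven sampling effort. The grid cell size and coordinates below are arbitrary.

```python
# Grid-based systematic subsampling of species occurrence records.
import numpy as np

def systematic_sample(lon, lat, cell_deg=0.5, seed=0):
    """Return indices of at most one record per occupied grid cell."""
    rng = np.random.default_rng(seed)
    cells = np.stack([np.floor(lon / cell_deg), np.floor(lat / cell_deg)], axis=1)
    keep = []
    for cell in np.unique(cells, axis=0):
        idx = np.flatnonzero((cells == cell).all(axis=1))
        keep.append(rng.choice(idx))          # one randomly chosen record per cell
    return np.array(keep)

lon = np.array([10.01, 10.02, 10.03, 12.5, 14.9])
lat = np.array([45.01, 45.02, 45.03, 46.0, 47.2])
print(systematic_sample(lon, lat))            # the densely sampled cluster collapses to one record
```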
Paxton, Alexandra; Griffiths, Thomas L
2017-10-01
Today, people generate and store more data than ever before as they interact with both real and virtual environments. These digital traces of behavior and cognition offer cognitive scientists and psychologists an unprecedented opportunity to test theories outside the laboratory. Despite general excitement about big data and naturally occurring datasets among researchers, three "gaps" stand in the way of their wider adoption in theory-driven research: the imagination gap, the skills gap, and the culture gap. We outline an approach to bridging these three gaps while respecting our responsibilities to the public as participants in and consumers of the resulting research. To that end, we introduce Data on the Mind (http://www.dataonthemind.org), a community-focused initiative aimed at meeting the unprecedented challenges and opportunities of theory-driven research with big data and naturally occurring datasets. We argue that big data and naturally occurring datasets are most powerfully used to supplement, not supplant, traditional experimental paradigms in order to understand human behavior and cognition, and we highlight emerging ethical issues related to the collection, sharing, and use of these powerful datasets.
Enhancing studies of the connectome in autism using the autism brain imaging data exchange II
Di Martino, Adriana; O’Connor, David; Chen, Bosi; Alaerts, Kaat; Anderson, Jeffrey S.; Assaf, Michal; Balsters, Joshua H.; Baxter, Leslie; Beggiato, Anita; Bernaerts, Sylvie; Blanken, Laura M. E.; Bookheimer, Susan Y.; Braden, B. Blair; Byrge, Lisa; Castellanos, F. Xavier; Dapretto, Mirella; Delorme, Richard; Fair, Damien A.; Fishman, Inna; Fitzgerald, Jacqueline; Gallagher, Louise; Keehn, R. Joanne Jao; Kennedy, Daniel P.; Lainhart, Janet E.; Luna, Beatriz; Mostofsky, Stewart H.; Müller, Ralph-Axel; Nebel, Mary Beth; Nigg, Joel T.; O’Hearn, Kirsten; Solomon, Marjorie; Toro, Roberto; Vaidya, Chandan J.; Wenderoth, Nicole; White, Tonya; Craddock, R. Cameron; Lord, Catherine; Leventhal, Bennett; Milham, Michael P.
2017-01-01
The second iteration of the Autism Brain Imaging Data Exchange (ABIDE II) aims to enhance the scope of brain connectomics research in Autism Spectrum Disorder (ASD). Consistent with the initial ABIDE effort (ABIDE I), which released 1112 datasets in 2012, this new multisite open-data resource is an aggregate of resting-state functional magnetic resonance imaging (MRI) and corresponding structural MRI and phenotypic datasets. ABIDE II includes datasets from an additional 487 individuals with ASD and 557 controls previously collected across 16 international institutions. The combination of ABIDE I and ABIDE II provides investigators with 2156 unique cross-sectional datasets, allowing selection of samples for discovery and/or replication. This sample size can also facilitate the identification of neurobiological subgroups, as well as preliminary examinations of sex differences in ASD. Additionally, ABIDE II includes a range of psychiatric variables to inform our understanding of the neural correlates of co-occurring psychopathology; 284 diffusion imaging datasets are also included. It is anticipated that these enhancements will contribute to unraveling key sources of ASD heterogeneity. PMID:28291247
DATS, the data tag suite to enable discoverability of datasets.
Sansone, Susanna-Assunta; Gonzalez-Beltran, Alejandra; Rocca-Serra, Philippe; Alter, George; Grethe, Jeffrey S; Xu, Hua; Fore, Ian M; Lyle, Jared; Gururaj, Anupama E; Chen, Xiaoling; Kim, Hyeon-Eui; Zong, Nansu; Li, Yueling; Liu, Ruiling; Ozyurt, I Burak; Ohno-Machado, Lucila
2017-06-06
Today's science increasingly requires effective ways to find and access existing datasets that are distributed across a range of repositories. For researchers in the life sciences, discoverability of datasets may soon become as essential as identifying the latest publications via PubMed. Through an international collaborative effort funded by the National Institutes of Health (NIH)'s Big Data to Knowledge (BD2K) initiative, we have designed and implemented the DAta Tag Suite (DATS) model to support the DataMed data discovery index. DataMed's goal is to be for data what PubMed has been for the scientific literature. Akin to the Journal Article Tag Suite (JATS) used in PubMed, the DATS model enables submission of metadata on datasets to DataMed. DATS has a core set of elements, which are generic and applicable to any type of dataset, and an extended set that can accommodate more specialized data types. DATS is a platform-independent model also available as an annotated serialization in schema.org, which in turn is widely used by major search engines like Google, Microsoft, Yahoo and Yandex.
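A minimal, hypothetical example of dataset metadata serialized as schema.org JSON-LD, in the spirit of the annotated serialization mentioned above; the field values are invented, and this is not an official DATS record.

```python
# Build and print a toy schema.org Dataset record as JSON-LD.
import json

dataset_metadata = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Example turbulent flux dataset",
    "description": "Illustrative metadata record for a data discovery index.",
    "identifier": "doi:10.0000/example",          # placeholder identifier
    "creator": {"@type": "Organization", "name": "Example Project"},
    "distribution": {
        "@type": "DataDownload",
        "encodingFormat": "NetCDF",
        "contentUrl": "https://example.org/data/example.nc",
    },
}
print(json.dumps(dataset_metadata, indent=2))
```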
NASA's Big Earth Data Initiative Accomplishments
NASA Technical Reports Server (NTRS)
Klene, Stephan A.; Pauli, Elisheva; Pressley, Natalie N.; Cechini, Matthew F.; McInerney, Mark
2017-01-01
The goal of NASA's effort for BEDI is to improve the usability, discoverability, and accessibility of Earth Observation data in support of societal benefit areas. Accomplishments: In support of BEDI goals, datasets have been entered into the Common Metadata Repository (CMR), made available via the Open-source Project for a Network Data Access Protocol (OPeNDAP), and registered with a Digital Object Identifier (DOI), and, to support fast visualization, many layers have been added to the Global Imagery Browse Services (GIBS).
NASA Technical Reports Server (NTRS)
Gatebe, C. K.; Dubovik, O.; King, M. D.; Sinyuk, A.
2010-01-01
This paper presents a new method for simultaneously retrieving aerosol and surface reflectance properties from combined airborne and ground-based direct and diffuse radiometric measurements. The method is based on the standard Aerosol Robotic Network (AERONET) method for retrieving aerosol size distribution, complex index of refraction, and single scattering albedo, but modified to retrieve aerosol properties in two layers, below and above the aircraft, and parameters on surface optical properties from combined datasets (Cloud Absorption Radiometer (CAR) and AERONET data). A key advantage of this method is the inversion of all available spectral and angular data at the same time, while accounting for the influence of noise in the inversion procedure using statistical optimization. The wide spectral (0.34-2.30 µm) and angular (180°) range of the CAR instrument, combined with observations from an AERONET sunphotometer, provide sufficient measurement constraints for characterizing aerosol and surface properties with minimal assumptions. The robustness of the method was tested on observations made during four different field campaigns: (a) the Southern African Regional Science Initiative 2000 over Mongu, Zambia, (b) the Intercontinental Transport Experiment-Phase B over Mexico City, Mexico, (c) the Cloud and Land Surface Interaction Campaign over the Atmospheric Radiation Measurement (ARM) Central Facility, Oklahoma, USA, and (d) the Arctic Research of the Composition of the Troposphere from Aircraft and Satellites (ARCTAS) over Elson Lagoon in Barrow, Alaska, USA. The four areas are dominated by different surface characteristics and aerosol types, and therefore provide good test cases for the new inversion method.
Surface and Flow Field Measurements on the FAITH Hill Model
NASA Technical Reports Server (NTRS)
Bell, James H.; Heineck, James T.; Zilliac, Gregory; Mehta, Rabindra D.; Long, Kurtis R.
2012-01-01
A series of experimental tests, using both qualitative and quantitative techniques, was conducted to characterize both surface and off-surface flow characteristics of an axisymmetric, modified-cosine-shaped, wall-mounted hill named "FAITH" (Fundamental Aero Investigates The Hill). Two separate models were employed: a 6" high, 18" base diameter machined aluminum model that was used for wind tunnel tests, and a smaller scale (2" high, 6" base diameter) sintered nylon version that was used in the water channel facility. Wind tunnel and water channel tests were conducted at mean test section speeds of 165 fps (Reynolds number based on height = 500,000) and 0.1 fps (Reynolds number of 1000), respectively. The ratio of model height to boundary layer height was approximately 3 for both tests. Qualitative techniques that were employed to characterize the complex flow included surface oil flow visualization for the wind tunnel tests and dye injection for the water channel tests. Quantitative techniques that were employed to characterize the flow included a Cobra Probe to determine point-wise steady and unsteady 3D velocities, Particle Image Velocimetry (PIV) to determine 3D velocities and turbulence statistics along specified planes, Pressure Sensitive Paint (PSP) to determine mean surface pressures, and Fringe Imaging Skin Friction (FISF) to determine surface skin friction (magnitude and direction). This initial report summarizes the experimental set-up and techniques used, describes the data acquired, and gives some details of the dataset being constructed for use by other researchers, especially the CFD community. Subsequent reports will discuss the data and their interpretation in more detail.
NASA Astrophysics Data System (ADS)
Ladd, Matthew; Viau, Andre
2013-04-01
Paleoclimate reconstructions rely on the accuracy of modern climate datasets for the calibration of fossil records, under the assumption of climate normality through time, meaning that the modern climate operates in a similar manner as it has over the past 2,000 years. In this study, we show how the use of different modern climate datasets affects a pollen-based reconstruction of the mean temperature of the warmest month (MTWA) during the past 2,000 years for North America. The modern climate datasets used to explore this research question include the Whitmore et al. (2005) modern climate dataset; the North American Regional Reanalysis (NARR); the National Centers for Environmental Prediction (NCEP) reanalysis; the European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-40 reanalysis; WorldClim; the Global Historical Climatology Network (GHCN); and New et al., which is derived from the CRU dataset. Results show that some caution is advised in using the reanalysis data for large-scale reconstructions. Station data appear to dampen the variability of the reconstruction produced using station-based datasets. The reanalysis or model-based datasets are not recommended for large-scale North American paleoclimate reconstructions, as they appear to lack some of the dynamics observed in station datasets (CRU), which resulted in warm-biased reconstructions compared to the station-based reconstructions. The Whitmore et al. (2005) modern climate dataset appears to be a compromise between CRU-based and model-based datasets, except for ERA-40. In addition, an ultra-high-resolution gridded climate dataset such as WorldClim may only be useful if the pollen calibration sites in North America have at least the same spatial precision. We reconstruct the MTWA to within +/-0.01°C by using an average of all curves derived from the different modern climate datasets, demonstrating the robustness of the procedure. Using an average of different modern datasets may reduce the impact of uncertainty in paleoclimate reconstructions; however, this remains to be determined with certainty. Future evaluations should test, for example, the newly developed Berkeley Earth surface temperature dataset against the paleoclimate record.
Aerosol climate time series from ESA Aerosol_cci (Invited)
NASA Astrophysics Data System (ADS)
Holzer-Popp, T.
2013-12-01
Within the ESA Climate Change Initiative (CCI), the Aerosol_cci project (mid 2010 - mid 2013, phase 2 proposed 2014-2016) has conducted intensive work to improve algorithms for the retrieval of aerosol information from the European sensors AATSR (3 algorithms), PARASOL, MERIS (3 algorithms), synergetic AATSR/SCIAMACHY, OMI and GOMOS. Whereas OMI and GOMOS were used to derive absorbing aerosol index and stratospheric extinction profiles, respectively, Aerosol Optical Depth (AOD) and Angstrom coefficient were retrieved from the other sensors. Global datasets for 2008 were produced and validated versus independent ground-based data and other satellite datasets (MODIS, MISR). An additional 17-year dataset is currently being generated using ATSR-2/AATSR data. During the three years of the project, intensive collaborative efforts were made to improve the retrieval algorithms, focusing on the most critical modules. The team agreed on the use of a common definition for the aerosol optical properties. Cloud masking was evaluated, but a rigorous analysis with a prescribed cloud mask did not lead to improvement for all algorithms. Better results were obtained using a post-processing step in which sudden transitions, indicative of possible occurrence of cloud contamination, were removed. Surface parameterization, which is most critical for the nadir-only algorithms (MERIS and synergetic AATSR/SCIAMACHY), was studied to a limited extent. The retrieval results for AOD, Ångström exponent (AE) and uncertainties were evaluated by comparison with data from AERONET (and a limited amount of MAN) sun photometers and with satellite data available from MODIS and MISR. Both Level 2 and Level 3 (gridded daily) datasets were validated. Several validation metrics were used (standard statistical quantities such as bias, RMSE, Pearson correlation and linear regression, as well as scoring approaches to quantitatively evaluate the spatial and temporal correlations against AERONET), and in some cases developed further, to evaluate the datasets and their regional and seasonal merits. The validation showed that most datasets have improved significantly and that, in particular, PARASOL (ocean only) provides excellent results. The metrics for the AATSR (land and ocean) datasets are similar to those of MODIS and MISR, with AATSR better in some land regions and less good in some others (ocean). However, AATSR coverage is smaller than that of MODIS due to swath width. The MERIS dataset provides better coverage than AATSR but has lower quality (especially over land) than the other datasets. The synergetic AATSR/SCIAMACHY dataset also has lower quality. The evaluation of the pixel uncertainties shows initial good results but also reveals that more work needs to be done to provide comprehensive information for data assimilation. Users (MACC/ECMWF, AEROCOM) confirmed the relevance of this additional information and encouraged Aerosol_cci to release the current uncertainties. The paper will summarize and discuss the results of three years of work in Aerosol_cci, extract the lessons learned and conclude with an outlook to the work proposed for the next three years. In this second phase, a cyclic effort of algorithm evolution, dataset generation, validation and assessment will be applied to produce and further improve complete time series from all sensors under investigation; new sensors will be added (e.g. IASI), and preparation for the Sentinel missions will be made.
NASA Astrophysics Data System (ADS)
Laiti, Lavinia; Giovannini, Lorenzo; Zardi, Dino
2015-04-01
The accurate assessment of the solar radiation available at the Earth's surface is essential for a wide range of energy-related applications, such as the design of solar power plants, water heating systems and energy-efficient buildings, as well as in the fields of climatology, hydrology, ecology and agriculture. The characterization of solar radiation is particularly challenging in complex-orography areas, where topographic shadowing and altitude effects, together with local weather phenomena, greatly increase the spatial and temporal variability of such variable. At present, approaches ranging from surface measurements interpolation to orographic down-scaling of satellite data, to numerical model simulations are adopted for mapping solar radiation. In this contribution a high-resolution (200 m) solar atlas for the Trentino region (Italy) is presented, which was recently developed on the basis of hourly observations of global radiation collected from the local radiometric stations during the period 2004-2012. Monthly and annual climatological irradiation maps were obtained by the combined use of a GIS-based clear-sky model (r.sun module of GRASS GIS) and geostatistical interpolation techniques (kriging). Moreover, satellite radiation data derived by the MeteoSwiss HelioMont algorithm (2 km resolution) were used for missing-data reconstruction and for the final mapping, thus integrating ground-based and remote-sensing information. The results are compared with existing solar resource datasets, such as the PVGIS dataset, produced by the Joint Research Center Institute for Energy and Transport, and the HelioMont dataset, in order to evaluate the accuracy of the different datasets available for the region of interest.
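A rough sketch of the geostatistical interpolation step, using a Gaussian-process (kriging-like) regressor from scikit-learn as a stand-in; the station coordinates and irradiation values are invented, and the operational atlas combined r.sun clear-sky modelling with kriging rather than this exact setup.

```python
# Kriging-like interpolation of station irradiation over a regular grid.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(5)
xy = rng.uniform(0, 100, size=(40, 2))                    # station coordinates (km), toy values
obs = 1500 + 3 * xy[:, 0] + 10 * rng.standard_normal(40)  # annual irradiation (kWh/m2), toy values

gp = GaussianProcessRegressor(kernel=RBF(length_scale=20.0) + WhiteKernel(1.0), normalize_y=True)
gp.fit(xy, obs)

# Predict on a 50 x 50 grid, with a per-cell uncertainty estimate
grid = np.stack(np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50)), axis=-1).reshape(-1, 2)
pred, std = gp.predict(grid, return_std=True)
print(pred.shape, std.mean())
```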
NASA Astrophysics Data System (ADS)
Leydsman-McGinty, E. I.; Ramsey, R. D.; McGinty, C.
2013-12-01
The Remote Sensing/GIS Laboratory at Utah State University, in cooperation with the United States Environmental Protection Agency, is quantifying impervious surfaces for three watershed sub-basins in Utah. The primary objective of developing watershed-scale quantifications of impervious surfaces is to provide an indicator of potential impacts to wetlands that occur within the Wasatch Front and along the Great Salt Lake. A geospatial layer of impervious surfaces can assist state agencies involved with Utah's Wetlands Program Plan (WPP) in understanding the impacts of impervious surfaces on wetlands, as well as support them in carrying out goals and actions identified in the WPP. The three watershed sub-basins, Lower Bear-Malad, Lower Weber, and Jordan, span the highly urbanized Wasatch Front and are consistent with focal areas in need of wetland monitoring and assessment as identified in Utah's WPP. Geospatial layers of impervious surface currently exist in the form of national and regional land cover datasets; however, these datasets are too coarse to be utilized in fine-scale analyses. In addition, the pixel-based image processing techniques used to develop these coarse datasets have proven insufficient in smaller scale or detailed studies, particularly when applied to high-resolution satellite imagery or aerial photography. Therefore, object-based image analysis techniques are being implemented to develop the geospatial layer of impervious surfaces. Object-based image analysis techniques employ a combination of both geospatial and image processing methods to extract meaningful information from high-resolution imagery. Spectral, spatial, textural, and contextual information is used to group pixels into image objects and then subsequently used to develop rule sets for image classification. eCognition, an object-based image analysis software program, is being utilized in conjunction with one-meter resolution National Agriculture Imagery Program (NAIP) aerial photography from 2011.
Comparing Temperature Effects on E. Coli, Salmonella, and Enterococcus Survival in Surface Waters
The objective of this study was to compare dependency of survival rates on temperature for indicator organisms E. coli and Enterococcus and the pathogen Salmonella in surface waters. A database of 86 survival datasets from peer-reviewed papers on inactivation of E. coli, Salmonel...
Ecoregional analysis of nearshore sea-surface temperature in the North Pacific
Aim Sea surface temperature (SST) has been a parameter widely-identified to be useful to the investigation of marine species distribution, migration, and invasion, especially as SSTs are predicted to be affected by climate change. Here we use a remotely-sensed dataset to focus on...
Evaluation of the validated soil moisture product from the SMAP radiometer
USDA-ARS?s Scientific Manuscript database
In this study, we used a multilinear regression approach to retrieve surface soil moisture from NASA’s Soil Moisture Active Passive (SMAP) satellite data to create a global dataset of surface soil moisture which is consistent with ESA’s Soil Moisture and Ocean Salinity (SMOS) satellite retrieved sur...
DOT National Transportation Integrated Search
2014-07-01
The report summarizes a 2-day workshop held on November 6-7, 2013, to discuss data sources for surface transportation human factors research. The workshop was designed to assess the increasing number of different datasets and multiple ways of collect...
NASA Astrophysics Data System (ADS)
Karaoglanis, K.; Efthimiou, N.; Tsoumpas, C.
2015-09-01
Low-count PET data are a challenge for medical image reconstruction. The statistics of a dataset are a key factor in the quality of the reconstructed images. Reconstruction algorithms able to compensate for low-count datasets could provide the means to reduce patient injected doses and/or scan times. It has been shown that the use of priors improves image quality in low-count conditions. In this study we compared regularised and post-filtered OSEM in terms of their performance on challenging simulated low-count datasets. An initial visual comparison demonstrated that both algorithms improve image quality, although the use of regularization does not introduce the undesired blurring that post-filtering does.
Empirical retrieval of sea spray aerosol production using satellite microwave radiometry
NASA Astrophysics Data System (ADS)
Savelyev, I. B.; Yelland, M. J.; Norris, S. J.; Salisbury, D.; Pascal, R. W.; Bettenhausen, M. H.; Prytherch, J.; Anguelova, M. D.; Brooks, I. M.
2017-12-01
This study presents a novel approach to obtaining a global sea spray aerosol (SSA) production source term by relying on direct satellite observations of the ocean surface, instead of the more traditional approaches driven by surface meteorology. The primary challenge in developing this empirical algorithm is to compile a calibrated, consistent dataset of SSA surface flux collected offshore over a variety of conditions (i.e., regions and seasons), and thus representative of the global SSA production variability. This dataset includes observations from the SEASAW, HiWASE, and WAGES field campaigns, during which the SSA flux was measured from the bow of a research vessel using a consistent, state-of-the-art eddy covariance methodology. These in situ data are matched to observations of the state of the ocean surface from the WindSat polarimetric microwave satellite radiometer. Previous studies demonstrated the ability of WindSat to detect variations in surface wave slopes, roughness and foam, which led to the development of retrieval algorithms for the surface wind vector and, more recently, the whitecap fraction. Similarly, in this study, microwave emissions from the ocean surface are matched to and calibrated against in situ observations of the SSA production flux. The resulting calibrated empirical algorithm is applicable for retrieval of the SSA source term throughout the duration of the WindSat mission, from 2003 to the present.
Open-source algorithm for detecting sea ice surface features in high-resolution optical imagery
NASA Astrophysics Data System (ADS)
Wright, Nicholas C.; Polashenski, Chris M.
2018-04-01
Snow, ice, and melt ponds cover the surface of the Arctic Ocean in fractions that change throughout the seasons. These surfaces control albedo and exert tremendous influence over the energy balance in the Arctic. Increasingly available meter- to decimeter-scale resolution optical imagery captures the evolution of the ice and ocean surface state visually, but methods for quantifying coverage of key surface types from raw imagery are not yet well established. Here we present an open-source system designed to provide a standardized, automated, and reproducible technique for processing optical imagery of sea ice. The method classifies surface coverage into three main categories: snow and bare ice, melt ponds and submerged ice, and open water. The method is demonstrated on imagery from four sensor platforms and on imagery spanning from spring thaw to fall freeze-up. Tests show the classification accuracy of this method typically exceeds 96 %. To facilitate scientific use, we evaluate the minimum observation area required for reporting a representative sample of surface coverage. We provide an open-source distribution of this algorithm and associated training datasets and suggest the community consider this a step towards standardizing optical sea ice imagery processing. We hope to encourage future collaborative efforts to improve the code base and to analyze large datasets of optical sea ice imagery.
NASA Astrophysics Data System (ADS)
Ham, S. H.; Loeb, N. G.; Kato, S.; Rose, F. G.; Bosilovich, M. G.; Rutan, D. A.; Huang, X.; Collow, A.
2017-12-01
Global Modeling and Assimilation Office (GMAO) GEOS assimilated datasets are used to describe temperature and humidity profiles in the Clouds and the Earth's Radiant Energy System (CERES) data processing. Given that advanced versions of the assimilated datasets, known as Forward Processing (FP), FP Parallel (FPP), and Modern-Era Retrospective Analysis for Research and Applications version 2 (MERRA-2), are available, we examine clear-sky irradiance calculations to see whether accuracy is improved when the temperature and humidity profiles from these newer versions of the GMAO datasets are used in computing irradiances. Two older versions, GEOS-5.2.0 and GEOS-5.4.1, are used for producing, respectively, the Ed3 and Ed4 CERES data products. For the evaluation, CERES-derived TOA irradiances and observed ground-based surface irradiances are compared with the computed irradiances for clear skies identified by the Moderate Resolution Imaging Spectroradiometer (MODIS). Surface-type-dependent spectral emissivity is taken from an observationally based monthly gridded emissivity dataset. TOA longwave (LW) irradiances computed with GEOS-5.2.0 temperature and humidity profiles are biased low, by up to -5 W m-2, compared to CERES-derived TOA longwave irradiance over tropical oceans. In contrast, computed longwave irradiances agree well with CERES observations, with biases of less than 2 W m-2, when GEOS-5.4.1, FP v5.13, or MERRA-2 temperature and humidity are used. The negative biases of the TOA LW irradiance computed with GEOS-5.2.0 appear to be related to a wet bias in the 500-850 hPa layer. This indicates that if the input to the CERES algorithm switches from GEOS-5.2.0 to FP v5.13 or MERRA-2, the bias in clear-sky longwave TOA fluxes over tropical oceans is expected to be smaller. At the surface, downward LW irradiances computed with FP v5.13 and MERRA-2 are biased low, by up to -10 W m-2, compared to ground observations over tropical oceans. The magnitude of the bias in the longwave surface irradiances cannot be explained by uncertainties related to aerosol, which are estimated to be less than 2.5 W m-2. Therefore, the negative biases are likely caused by cold or dry biases in the FP v5.13 and MERRA-2 datasets. We plan to continue the investigation with more ground sites.
Moving towards Hyper-Resolution Hydrologic Modeling
NASA Astrophysics Data System (ADS)
Rouf, T.; Maggioni, V.; Houser, P.; Mei, Y.
2017-12-01
Developing a predictive capability for terrestrial hydrology across landscapes, with water, energy and nutrients as the drivers of these dynamic systems, faces the challenge of scaling meter-scale process understanding to practical modeling scales. Hyper-resolution land surface modeling can provide a framework for addressing science questions that we are not able to answer with coarse modeling scales. In this study, we develop a hyper-resolution forcing dataset from coarser-resolution products using a physically based downscaling approach. These downscaling techniques rely on correlations with landscape variables such as topography, roughness, and land cover. A proof of concept has been implemented over the Oklahoma domain, where high-resolution observations are available for validation purposes. Hourly NLDAS (North America Land Data Assimilation System) forcing data (i.e., near-surface air temperature, pressure, and humidity) have been downscaled to 500-m resolution over the study area from 2015 to the present. Results show that correlation coefficients between the downscaled temperature dataset and ground observations are consistently higher than those between the NLDAS temperature data at their native resolution and ground observations. Not only are the correlation coefficients higher, but the deviation around the 1:1 line in the density scatterplots is also smaller for the downscaled dataset than for the original one with respect to the ground observations. The results are therefore encouraging, as they demonstrate that the 500-m temperature dataset agrees well with the ground information and can be adopted to force the land surface model for soil moisture estimation. The study has been expanded to wind speed and direction, incident longwave and shortwave radiation, pressure, and precipitation. Precipitation is well known to vary dramatically with elevation and orography. Therefore, we are pursuing a downscaling technique based on both topographical and vegetation characteristics.
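A minimal sketch of one common physically based downscaling step for near-surface air temperature, using a lapse-rate correction against high-resolution topography; the exact predictors and regressions used in the study are not reproduced here, and the constant lapse rate and variable names are assumptions.

```python
import numpy as np

def downscale_temperature(t_coarse, elev_coarse, elev_fine, lapse_rate=-6.5e-3):
    """Lapse-rate downscaling of near-surface air temperature.

    t_coarse    : coarse-grid temperature resampled to the fine grid (K)
    elev_coarse : coarse-grid elevation resampled to the fine grid (m)
    elev_fine   : high-resolution elevation (m)
    lapse_rate  : assumed environmental lapse rate (K per m)
    """
    # correct for the elevation difference between the two grids
    return t_coarse + lapse_rate * (elev_fine - elev_coarse)
```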
NASA Astrophysics Data System (ADS)
Bhuiyan, M. A. E.; Nikolopoulos, E. I.; Anagnostou, E. N.
2017-12-01
Quantifying the uncertainty of global precipitation datasets is beneficial when using these precipitation products in hydrological applications, because precipitation uncertainty propagation through hydrologic modeling can significantly affect the accuracy of the simulated hydrologic variables. In this research the Iberian Peninsula is used as the study area, with a study period spanning eleven years (2000-2010). This study evaluates the performance of multiple hydrologic models forced with combined global rainfall estimates derived with a Quantile Regression Forests (QRF) technique. The QRF technique uses three satellite precipitation products (CMORPH, PERSIANN, and 3B42 V7), an atmospheric reanalysis precipitation and air temperature dataset, satellite-derived near-surface daily soil moisture data, and a terrain elevation dataset. A high-resolution precipitation dataset driven by ground-based observations (named SAFRAN), available at 5 km/1 h resolution, is used as the reference. Through the QRF blending framework, the stochastic error model produces error-adjusted ensemble precipitation realizations, which are used to force four global hydrological models (JULES (Joint UK Land Environment Simulator), WaterGAP3 (Water-Global Assessment and Prognosis), ORCHIDEE (Organizing Carbon and Hydrology in Dynamic Ecosystems) and SURFEX (Surface Externalisée)) to simulate three hydrologic variables (surface runoff, subsurface runoff and evapotranspiration). The models are also forced with the reference precipitation to generate reference-based hydrologic simulations. This study presents a comparative analysis of multiple hydrologic model simulations for different hydrologic variables and the impact of the blending algorithm on the simulated hydrologic variables. Results show how precipitation uncertainty propagates through the different hydrologic model structures, manifesting as a reduction of error in the hydrologic variables.
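A hedged approximation of the quantile regression forest idea used for the rainfall blending, implemented with scikit-learn by taking quantiles of per-tree predictions; a true QRF uses leaf-level weights, so this is a simplified surrogate, and the predictor and reference names are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def precipitation_quantiles(X_train, y_train, X_new, quantiles=(0.1, 0.5, 0.9)):
    """Approximate QRF-style conditional quantiles for rainfall blending.

    X_train : predictors (e.g. satellite rainfall estimates, reanalysis
              precipitation and temperature, soil moisture, elevation)
    y_train : reference rainfall (e.g. SAFRAN) at the training points
    X_new   : predictors where quantile estimates are required
    """
    forest = RandomForestRegressor(n_estimators=200, random_state=0)
    forest.fit(X_train, y_train)
    # one prediction per tree -> shape (n_trees, n_samples)
    per_tree = np.stack([tree.predict(X_new) for tree in forest.estimators_])
    # spread across trees as a surrogate for the conditional distribution
    return np.quantile(per_tree, quantiles, axis=0)
```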
Water masses in the Monterey Bay during the summer of 2000
NASA Astrophysics Data System (ADS)
Warn-Varnas, Alex; Gangopadhyay, Avijit; Hawkins, J. A.
2007-06-01
Water masses in Monterey Bay are determined from the CTD casts of the Monterey Ocean Observing System (MOOS) Upper-water-column Science Experiment (MUSE) August 2000 dataset. It is shown through cluster analysis that the MUSE 2000 CTD dataset contains five water masses: bay surface water (BSW), bay warm water (BWW), bay intermediate water (BIW), subarctic upper water (SUW) and North Pacific deep water (NPDW). The BWW is a new water mass that exists in one area and is attributed to the effects of solar heating. The volumes occupied by each of the water masses are obtained. The BIW is the dominant water mass and occupies 68.8% of the volume. The statistical means and standard deviations for each water parameter, including spiciness and oxygen concentration, are calculated during separate upwelling and relaxed periods. The water-mass content and structure are analyzed during an upwelling period and a relaxed period. During upwelling, along a CTD track off Pt. Ano Nuevo, the water-mass T-S distribution tended to be organized along three branches. Off Pt. Ano Nuevo the Innovative Coastal Observation Network (ICON) model showed the formation of a cyclonic eddy during the analyzed upwelling period. In time the eddy moved southwest and became absorbed into the southerly flow during the initial phases of the following wind-relaxed period.
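The abstract does not specify the clustering method in detail; the sketch below illustrates one simple way to group CTD samples into five water masses in temperature-salinity space using K-Means, with hypothetical input arrays.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def classify_water_masses(temperature, salinity, n_masses=5):
    """Group CTD samples into water masses in T-S space.

    temperature, salinity : 1-D arrays of all CTD samples (hypothetical).
    Returns a label per sample and the cluster centres in T-S units.
    """
    X = np.column_stack([temperature, salinity])
    scaler = StandardScaler()
    Xs = scaler.fit_transform(X)                # balance the T and S scales
    km = KMeans(n_clusters=n_masses, n_init=10, random_state=0).fit(Xs)
    centres = scaler.inverse_transform(km.cluster_centers_)
    return km.labels_, centres
```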
NASA Astrophysics Data System (ADS)
Harbeck, J.; Kurtz, N. T.; Studinger, M.; Onana, V.; Yi, D.
2015-12-01
The NASA Operation IceBridge Project Science Office has recently released an updated version of the sea ice freeboard, snow depth and thickness product (IDCSI4). This product is generated through the combination of multiple IceBridge instrument data, primarily the ATM laser altimeter, DMS georeferenced imagery and the CReSIS snow radar, and is available on a campaign-specific basis as all upstream datasets become available. Version 1 (IDCSI2) was the initial data product; community feedback received since then has been incorporated, allowing us to provide an improved data product. All data now available to the public at the National Snow and Ice Data Center (NSIDC) have been homogeneously reprocessed using the new IDCSI4 algorithm. This algorithm contains significant upgrades that improve the quality and consistency of the dataset, including updated atmospheric and oceanic tidal models and replacement of the geoid with a more representative mean sea surface height product. Corrections for known errors in the IDCSI2 algorithm, identified by the Project Science Office as well as through feedback from the scientific community, have been incorporated into the new algorithm as well. We will describe in detail the various steps of the IDCSI4 algorithm, show the improvements made over the IDCSI2 dataset and their beneficial impact, and discuss future upgrades planned for the next version.
Datasets on hub-height wind speed comparisons for wind farms in California.
Wang, Meina; Ullrich, Paul; Millstein, Dev
2018-08-01
This article describes the data related to the research article entitled "The future of wind energy in California: Future projections with the Variable-Resolution CESM" [1], with reference number RENE_RENE-D-17-03392. Datasets from the Variable-Resolution CESM, Det Norske Veritas Germanischer Lloyd Virtual Met, MERRA-2, CFSR, NARR, ISD surface observations, and upper-air sounding observations were used for calculating and comparing hub-height wind speeds at multiple major wind farms across California. Information on hub-height wind speed interpolation and power curves at each wind farm site is also presented. All datasets, except Det Norske Veritas Germanischer Lloyd Virtual Met, are publicly available for future analysis.
Clear-Sky Longwave Irradiance at the Earth's Surface--Evaluation of Climate Models.
NASA Astrophysics Data System (ADS)
Garratt, J. R.
2001-04-01
An evaluation of the clear-sky longwave irradiance at the earth's surface (LI) simulated in climate models and in satellite-based global datasets is presented. Algorithm-based estimates of LI, derived from global observations of column water vapor and surface (or screen air) temperature, serve as proxy `observations.' All datasets capture the broad zonal variation and seasonal behavior in LI, mainly because the behavior in column water vapor and temperature is reproduced well. Over oceans, the dependence of annual and monthly mean irradiance upon sea surface temperature (SST) closely resembles the observed behavior of column water with SST. In particular, the observed hemispheric difference in the summer minus winter column water dependence on SST is found in all models, though with varying seasonal amplitudes. The analogous behavior in the summer minus winter LI is seen in all datasets. Over land, all models have a more highly scattered dependence of LI upon surface temperature compared with the situation over the oceans. This is related to a much weaker dependence of model column water on the screen-air temperature at both monthly and annual timescales, as observed. The ability of climate models to simulate realistic LI fields depends as much on the quality of model water vapor and temperature fields as on the quality of the longwave radiation codes. In a comparison of models with observations, root-mean-square gridpoint differences in mean monthly column water and temperature are 4-6 mm (5-8 mm) and 0.5-2 K (3-4 K), respectively, over large regions of ocean (land), consistent with the intermodel differences in LI of 5-13 W m-2 (15-28 W m-2).
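For context, algorithm-based proxy estimates of LI of the kind described above can be computed from screen temperature and column (precipitable) water vapor; the sketch below uses a Prata (1996)-type clear-sky emissivity as one widely used example, which is not necessarily the algorithm employed in the paper.

```python
import numpy as np

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m-2 K-4

def clear_sky_longwave_down(t_screen, precip_water):
    """Illustrative clear-sky downward longwave irradiance (W m-2).

    t_screen     : screen-level air temperature (K)
    precip_water : column (precipitable) water vapor (g cm-2)
    Uses a Prata (1996)-type clear-sky emissivity; other algorithms
    differ mainly in this emissivity parameterization.
    """
    w = np.asarray(precip_water, dtype=float)
    emissivity = 1.0 - (1.0 + w) * np.exp(-np.sqrt(1.2 + 3.0 * w))
    return emissivity * SIGMA * np.asarray(t_screen, dtype=float) ** 4
```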
NASA Astrophysics Data System (ADS)
Nowicki, S. A.; Skuse, R. J.
2012-12-01
High-resolution ecological and climate modeling requires quantification of surface characteristics such as rock abundance, soil induration and surface roughness at fine scales, since these features can affect the micro- and macro-habitat of a given area and ultimately determine the assemblage of plant and animal species that may occur there. Our objective is to develop quantitative data layers of thermophysical properties of the entire Mojave Desert Ecoregion for applications to habitat modeling being conducted by the USGS Western Ecological Research Center. These research efforts are focused on developing habitat models and a better physical understanding of the Mojave Desert, which have implications for the development of solar and wind energy resources, military installation expansion and residential development planned for the Mojave. Thus there is a need to improve our understanding of the mechanical composition and thermal characteristics of natural and modified surfaces in the southwestern US at as high a resolution as possible. Since the Mojave is a sparsely vegetated, arid landscape with little precipitation, remote sensing-based thermophysical analyses using Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) day and nighttime imagery are ideal for determining the physical properties of the surface. New mosaicking techniques for thermal imagery acquired at different dates, seasons and temperatures have allowed for the highest-resolution mosaics yet generated, at 100 m/pixel, for thermal infrared wavelengths. Among our contributions is the development of seamless day and night ASTER mosaics of land surface temperature that are calibrated to coincident Moderate Resolution Imaging Spectroradiometer (MODIS) observations, producing both a seamless mosaic and quantitative temperatures across a region that varies spectrally and thermophysically over a large number of orbit tracks. Products derived from this dataset include surface rock abundance, apparent thermal inertia, and the diurnal/seasonal thermal regime. Additionally, the combination of moderate- and high-resolution thermal observations is used to map the spatial and temporal variation of significant rain storms that intermittently increase the surface moisture. The resulting thermally derived layers are in the process of being combined with composition, vegetation and surface reflectance datasets to map the Mojave at the highest VNIR resolution (20 m/pixel) and compared to currently available lower-resolution datasets.
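A minimal sketch of the apparent-thermal-inertia product mentioned above, using the standard first-order definition ATI = (1 - albedo) / (T_day - T_night); the input arrays (co-registered ASTER day/night LST mosaics and a broadband albedo) are assumptions.

```python
import numpy as np

def apparent_thermal_inertia(albedo, lst_day, lst_night):
    """First-order apparent thermal inertia from day/night LST.

    albedo             : broadband surface albedo (0-1)
    lst_day, lst_night : co-registered day and night land surface
                         temperatures (K), e.g. from the ASTER mosaics
    Higher values indicate surfaces (rock, indurated soil) that resist
    the diurnal temperature swing.
    """
    dt = lst_day - lst_night
    dt = np.where(dt <= 0, np.nan, dt)          # mask unphysical differences
    return (1.0 - albedo) / dt
```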
NASA Astrophysics Data System (ADS)
Eberle, Jonas; Urban, Marcel; Hüttich, Christian; Schmullius, Christiane
2014-05-01
Numerous datasets providing temperature information from meteorological stations or remote sensing satellites are available. However, the challenging issue is to search the archives and process the time series information for further analysis. These steps can be automated for each individual product if the preconditions are met, e.g. data access through web services (HTTP, FTP) or the legal rights to redistribute the datasets. Therefore, a Python-based package was developed to provide data access and data processing tools for MODIS Land Surface Temperature (LST) data, which are provided by the NASA Land Processes Distributed Active Archive Center (LP DAAC), as well as the Global Surface Summary of the Day (GSOD) and the Global Historical Climatology Network (GHCN) daily datasets provided by the NOAA National Climatic Data Center (NCDC). The package to access and process the information is available as web services used by an interactive web portal for simple data access and analysis. Tools for time series analysis were linked to the system, e.g. time series plotting, decomposition, aggregation (monthly, seasonal, etc.), trend analyses, and breakpoint detection. Especially for temperature data, a plot was integrated for the comparison of two temperature datasets, based on the work of Urban et al. (2013). As a first result, a kernel density plot compares daily MODIS LST from the Aqua and Terra satellites with daily means from the GSOD and GHCN datasets. Without any data download and data processing, users can analyze different time series datasets in an easy-to-use web portal. As a first use case, we built up this complementary system with remotely sensed MODIS data and in situ measurements from meteorological stations for Siberia within the Siberian Earth System Science Cluster (www.sibessc.uni-jena.de). References: Urban, Marcel; Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane; Herold, Martin. 2013. "Comparison of Satellite-Derived Land Surface Temperature and Air Temperature from Meteorological Stations on the Pan-Arctic Scale." Remote Sens. 5, no. 5: 2348-2367. Further materials: Eberle, Jonas; Clausnitzer, Siegfried; Hüttich, Christian; Schmullius, Christiane. 2013. "Multi-Source Data Processing Middleware for Land Monitoring within a Web-Based Spatial Data Infrastructure for Siberia." ISPRS Int. J. Geo-Inf. 2, no. 3: 553-576.
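A minimal sketch of the kind of kernel-density comparison of two matched daily temperature series described above (e.g. MODIS LST versus station air temperature); the variable names are assumptions, and this is not the package's actual code.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import gaussian_kde

def kde_comparison(t_modis, t_station):
    """Density-coloured scatter plot of two matched daily temperature series."""
    mask = np.isfinite(t_modis) & np.isfinite(t_station)
    x, y = t_modis[mask], t_station[mask]
    density = gaussian_kde(np.vstack([x, y]))(np.vstack([x, y]))
    fig, ax = plt.subplots()
    sc = ax.scatter(x, y, c=density, s=8)
    lims = [min(x.min(), y.min()), max(x.max(), y.max())]
    ax.plot(lims, lims, "k--", lw=1)            # 1:1 line
    ax.set_xlabel("MODIS LST (°C)")
    ax.set_ylabel("Station air temperature (°C)")
    fig.colorbar(sc, label="kernel density")
    return fig
```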
Initial stages of organic film growth characterized by thermal desorption spectroscopy
Winkler, Adolf
2015-01-01
In the wake of the increasing importance of organic electronics, a more in-depth understanding of the early stages of organic film growth is indispensable. In this review a survey of several rod-like and plate-like organic molecules (p-quaterphenyl, p-sexiphenyl, hexaazatriphenylene-hexacarbonitrile (HATCN), rubicene, indigo) deposited on various application-relevant substrates (gold, silver, mica, silicon dioxide) is given. The focus is placed particularly on the application of thermal desorption spectroscopy to shed light on the kinetics and energetics of the molecule-substrate interaction. While each adsorption system reveals a manifold of features that are specific to the individual system, some general conclusions on the early stages of organic film formation can be drawn from the available datasets. Among the important issues in this context are the formation of wetting layers and dewetting as a function of the substrate surface conditions, organic film thickness and temperature. PMID:26778860
Coral mass spawning predicted by rapid seasonal rise in ocean temperature
Maynard, Jeffrey A.; Edwards, Alasdair J.; Guest, James R.; Rahbek, Carsten
2016-01-01
Coral spawning times have been linked to multiple environmental factors; however, to what extent these factors act as generalized cues across multiple species and large spatial scales is unknown. We used a unique dataset of coral spawning from 34 reefs in the Indian and Pacific Oceans to test if month of spawning and peak spawning month in assemblages of Acropora spp. can be predicted by sea surface temperature (SST), photosynthetically available radiation, wind speed, current speed, rainfall or sunset time. Contrary to the classic view that high mean SST initiates coral spawning, we found rapid increases in SST to be the best predictor in both cases (month of spawning: R2 = 0.73, peak: R2 = 0.62). Our findings suggest that a rapid increase in SST provides the dominant proximate cue for coral mass spawning over large geographical scales. We hypothesize that coral spawning is ultimately timed to ensure optimal fertilization success. PMID:27170709
Uncertainty Estimation in Tsunami Initial Condition From Rapid Bayesian Finite Fault Modeling
NASA Astrophysics Data System (ADS)
Benavente, R. F.; Dettmer, J.; Cummins, P. R.; Urrutia, A.; Cienfuegos, R.
2017-12-01
It is well known that kinematic rupture models for a given earthquake can present discrepancies even when similar datasets are employed in the inversion process. While quantifying this variability can be critical when making early estimates of the earthquake and triggered tsunami impact, "most likely models" are normally used for this purpose. In this work, we quantify the uncertainty of the tsunami initial condition for the great Illapel earthquake (Mw = 8.3, 2015, Chile). We focus on utilizing data and inversion methods that are suitable to rapid source characterization yet provide meaningful and robust results. Rupture models from teleseismic body and surface waves as well as W-phase are derived and accompanied by Bayesian uncertainty estimates from linearized inversion under positivity constraints. We show that robust and consistent features about the rupture kinematics appear when working within this probabilistic framework. Moreover, by using static dislocation theory, we translate the probabilistic slip distributions into seafloor deformation which we interpret as a tsunami initial condition. After considering uncertainty, our probabilistic seafloor deformation models obtained from different data types appear consistent with each other providing meaningful results. We also show that selecting just a single "representative" solution from the ensemble of initial conditions for tsunami propagation may lead to overestimating information content in the data. Our results suggest that rapid, probabilistic rupture models can play a significant role during emergency response by providing robust information about the extent of the disaster.
NASA Astrophysics Data System (ADS)
Nikolakopoulos, Konstantinos G.
2017-09-01
A global digital surface model dataset named ALOS Global Digital Surface Model (AW3D30), with a horizontal resolution of approximately 30 m (1 arcsec), has been released by the Japan Aerospace Exploration Agency (JAXA). The dataset has been compiled from images acquired by the Advanced Land Observing Satellite "DAICHI" (ALOS) and is published based on the DSM dataset (5-m mesh version) of the "World 3D Topographic Data", which is the most precise global-scale elevation data at this time; its elevation precision is also at a world-leading level for a 30-m mesh version. In this study the accuracy of ALOS AW3D30 was examined. For an area with complex geomorphological characteristics, DSMs from ALOS stereo pairs were created with classical photogrammetric techniques. These DSMs were compared with ALOS AW3D30. Points of certified elevation collected with DGPS were used to estimate the accuracy of the DSMs. The elevation difference between the two DSMs was calculated. The 2D RMSE, correlation and percentile values were also computed, and the results are presented.
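A minimal sketch of the accuracy statistics mentioned above (elevation difference, RMSE, correlation, percentile), assuming two co-registered elevation grids (e.g. AW3D30 and a photogrammetric DSM resampled to the same grid); the array names are assumptions.

```python
import numpy as np

def dsm_accuracy(dsm_test, dsm_reference, percentile=95):
    """Elevation-difference statistics for two co-registered DSMs."""
    a = np.asarray(dsm_test, dtype=float).ravel()
    b = np.asarray(dsm_reference, dtype=float).ravel()
    ok = np.isfinite(a) & np.isfinite(b)
    diff = a[ok] - b[ok]
    return {
        "mean_difference": diff.mean(),
        "rmse": np.sqrt(np.mean(diff ** 2)),
        "correlation": np.corrcoef(a[ok], b[ok])[0, 1],
        "abs_diff_percentile": np.percentile(np.abs(diff), percentile),
    }
```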
NASA Astrophysics Data System (ADS)
Qingyuan, Wang; Yanan, Wang; Yiwei, Liu
2017-08-01
Two widely used sea surface temperature (SST) datasets are compared in this article. We examine characteristics of the climate variability of SST in the China Seas. The two series yielded almost the same warming trend for 1890-2013 (0.7-0.8°C/100 years). However, the HadISST1 series shows much stronger warming trends during 1961-2013 and 1981-2013 than the COBE SST2 series. The disagreement between the datasets is marked after 1981. For the hiatus period 1998-2013, the cooling trend of the HadISST1 series is much lower than that of COBE SST2. These differences between the two datasets are possibly caused by the different observations that have been incorporated to fill data-sparse regions since 1982. These findings illustrate that there are some uncertainties in the estimate of SST warming patterns in certain regions. The results also indicate that the temporal and spatial deficiency of observed data is still the biggest handicap for analyzing multi-scale SST characteristics at the regional scale.
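For reference, a warming trend expressed in °C per 100 years, as quoted above, can be obtained from an annual-mean SST series with a simple least-squares fit; the sketch below assumes hypothetical input arrays.

```python
import numpy as np

def sst_trend_per_century(years, sst_annual):
    """Least-squares linear trend of an annual-mean SST series (°C / 100 yr)."""
    years = np.asarray(years, dtype=float)
    sst = np.asarray(sst_annual, dtype=float)
    ok = np.isfinite(sst)
    slope, _intercept = np.polyfit(years[ok], sst[ok], 1)   # °C per year
    return slope * 100.0
```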
NASA Astrophysics Data System (ADS)
Beylich, A. A.; Lamoureux, S. F.; Decaulne, A.
2012-04-01
Projected climate change in cold regions is expected to alter melt season duration and intensity, along with the number of extreme rainfall events, total annual precipitation and the balance between snowfall and rainfall. Similarly, changes to the thermal balance are expected to reduce the extent of permafrost and seasonal ground frost and increase active layer depths. These effects will undoubtedly change surface environments in cold regions and alter the fluxes of sediments, nutrients and solutes, but the absence of quantitative data and coordinated process monitoring and analysis to understand the sensitivity of the Earth surface environment is acute in cold climate environments. The International Association of Geomorphologists (I.A.G./A.I.G.) SEDIBUD (Sediment Budgets in Cold Environments) Programme was formed in 2005 to address this existing key knowledge gap. SEDIBUD currently has about 400 members worldwide and the Steering Committee of this international programme is composed of ten scientists from eight different countries: Achim A. Beylich (Chair) (Norway), Armelle Decaulne (Secretary) (France), John C. Dixon (USA), Scott F. Lamoureux (Vice-Chair) (Canada), John F. Orwin (Canada), Jan-Christoph Otto (Austria), Irina Overeem (USA), Thorsteinn Saemundsson (Iceland), Jeff Warburton (UK), Zbigniew Zwolinski (Poland). The central objective of this global group of scientists is to assess and model the contemporary sedimentary fluxes in cold climates, with emphasis on both particulate and dissolved components. Initially formed as the European Science Foundation (ESF) Network SEDIFLUX (2004-2006), SEDIBUD has further expanded to a global group of researchers with field research sites located in polar and alpine regions in the northern and southern hemisphere. Research carried out at each of the close to 50 defined SEDIBUD key test sites varies by programme, logistics and available resources, but typically represents interdisciplinary collaborations of geomorphologists, hydrologists, ecologists, permafrost scientists and glaciologists. SEDIBUD has developed manuals and protocols (SEDIFLUX Manual, available online, see below) with a key set of primary surface process monitoring and research data requirements to incorporate results from these diverse projects and allow coordinated quantitative analysis across the programme. Defined SEDIBUD key test sites provide data on annual climate conditions, total discharge and particulate and dissolved fluxes as well as information on other relevant surface processes. A number of selected key test sites are providing high-resolution data on climate conditions, runoff and sedimentary fluxes, which in addition to the annual data contribute to the SEDIBUD metadata database which is currently being developed. Comparable datasets from different SEDIBUD key test sites are integrated and analysed to address key research questions as defined in the SEDIBUD Objective (available online, see below). Defined SEDIBUD key tasks for the coming years include (i) The continued generation and compilation of comparable longer-term datasets on contemporary sedimentary fluxes and sediment yields from SEDIBUD key test sites worldwide, (ii) The continued extension of the SEDIBUD metadata database with these datasets, (iii) The testing of defined SEDIBUD hypotheses (available online, see below) by using the datasets continuously compiled in the SEDIBUD metadata database. Detailed information on the I.A.G./A.I.G. 
SEDIBUD Programme, SEDIBUD meetings, SEDIBUD publications and SEDIBUD online documents and databases is available at the SEDIBUD website under http://www.geomorph.org/wg/wgsb.html.
NASA Astrophysics Data System (ADS)
Beylich, Achim A.; Lamoureux, Scott; Decaulne, Armelle
2013-04-01
Projected climate change in cold regions is expected to alter melt season duration and intensity, along with the number of extreme rainfall events, total annual precipitation and the balance between snowfall and rainfall. Similarly, changes to the thermal balance are expected to reduce the extent of permafrost and seasonal ground frost and increase active layer depths. These effects will undoubtedly change surface environments in cold regions and alter the fluxes of sediments, nutrients and solutes, but the absence of quantitative data and coordinated geomorphic process monitoring and analysis to understand the sensitivity of the Earth surface environment is acute in cold climate environments. The International Association of Geomorphologists (I.A.G./A.I.G.) SEDIBUD (Sediment Budgets in Cold Environments) Programme was formed in 2005 to address this existing key knowledge gap. SEDIBUD currently has about 400 members worldwide and the Steering Committee of this international programme is composed of ten scientists from eight different countries: Achim A. Beylich (Chair) (Norway), Armelle Decaulne (Secretary) (France), John C. Dixon (USA), Scott F. Lamoureux (Vice-Chair) (Canada), John F. Orwin (Canada), Jan-Christoph Otto (Austria), Irina Overeem (USA), Thorsteinn Sæmundsson (Iceland), Jeff Warburton (UK) and Zbigniew Zwolinski (Poland). The central objective of this global group of scientists is to assess and model the contemporary sedimentary fluxes in cold climates, with emphasis on both particulate and dissolved components. Initially formed as the European Science Foundation (ESF) Network SEDIFLUX (Sedimentary Source-to-Sink Fluxes in Cold Environments) (2004-2006), SEDIBUD has further expanded to a global group of researchers with field research sites located in polar and alpine regions in the northern and southern hemisphere. Research carried out at each of the close to 50 defined SEDIBUD key test sites varies by programme, logistics and available resources, but typically represents interdisciplinary collaborations of geomorphologists, hydrologists, ecologists, permafrost scientists and glaciologists. SEDIBUD has developed manuals and protocols (SEDIFLUX Manual, available online, see below) with a key set of primary surface process monitoring and research data requirements to incorporate results from these diverse projects and allow coordinated quantitative analysis across the programme. Defined SEDIBUD key test sites provide data on annual climate conditions, total discharge and particulate and dissolved fluxes (yields) as well as information on other relevant surface processes. A number of selected key test sites are providing high-resolution data on climate conditions, runoff and sedimentary fluxes (yields), which in addition to the annual data contribute to the SEDIBUD metadata database. Comparable datasets from different SEDIBUD key test sites are integrated and analysed to address key research questions as defined in the SEDIBUD objective (available online, see below). 
Defined SEDIBUD key tasks for the coming years include (i) The continued generation and compilation of comparable longer-term datasets on contemporary sedimentary fluxes and sediment yields from SEDIBUD key test sites worldwide, (ii) The continued extension of the SEDIBUD metadata database with these datasets, (iii) The testing of defined SEDIBUD hypotheses (available online, see below) by using datasets continuously compiled in the SEDIBUD metadata database, (iv) The publication of a SEDIBUD book (synthesis book). Detailed information on the SEDIBUD Programme, SEDIBUD meetings, SEDIBUD publications and SEDIBUD online documents and databases is available at the SEDIBUD website under http://www.geomorph.org/wg/wgsb.html.
Shi, Jie; Thompson, Paul M.; Gutman, Boris; Wang, Yalin
2013-01-01
In this paper, we develop a new automated surface registration system based on surface conformal parameterization by holomorphic 1-forms, inverse consistent surface fluid registration, and multivariate tensor-based morphometry (mTBM). First, we conformally map a surface onto a planar rectangular domain with holomorphic 1-forms. Second, we compute the surface conformal representation by combining its local conformal factor and mean curvature, and linearly scale the dynamic range of the conformal representation to form the feature image of the surface. Third, we align the feature image with a chosen template image via the fluid image registration algorithm, which has been extended into curvilinear coordinates to adjust for the distortion introduced by surface parameterization. The inverse consistent image registration algorithm is also incorporated in the system to jointly estimate the forward and inverse transformations between the study and template images. This alignment induces a corresponding deformation on the surface. We tested the system on the Alzheimer's Disease Neuroimaging Initiative (ADNI) baseline dataset to study AD effects on the hippocampus. In our system, by modeling a hippocampus as a 3D parametric surface, we nonlinearly registered each surface with a selected template surface. Then we used mTBM to analyze the morphometry differences between diagnostic groups. Experimental results show that the new system has better performance than two publicly available subcortical surface registration tools: FIRST and SPHARM. We also analyzed the genetic influence of the Apolipoprotein E ε4 allele (ApoE4), which is considered the most prevalent risk factor for AD. Our work successfully detected statistically significant differences between ApoE4 carriers and non-carriers in both patients with mild cognitive impairment (MCI) and healthy control subjects. The results provide evidence that the ApoE genotype may be associated with accelerated brain atrophy, so our work provides a new MRI analysis tool that may help presymptomatic AD research. PMID:23587689
X-ray computed tomography library of shark anatomy and lower jaw surface models.
Kamminga, Pepijn; De Bruin, Paul W; Geleijns, Jacob; Brazeau, Martin D
2017-04-11
The cranial diversity of sharks reflects disparate biomechanical adaptations to feeding. In order to investigate and better understand the ecomorphology of extant shark feeding systems, we created an x-ray computed tomography (CT) library of shark cranial anatomy with three-dimensional (3D) lower jaw reconstructions. This library is used to examine and quantify lower jaw disparity in extant shark species in a separate study. The library is divided into two datasets: the first comprises medical CT scans of 122 sharks (Selachimorpha, Chondrichthyes) representing 73 extant species, including digitized morphology of entire shark specimens. This CT dataset, together with additional data provided by other researchers, was used to reconstruct the second dataset, containing 3D models of the left lower jaw for 153 individuals representing 94 extant shark species. These datasets form an extensive anatomical record of shark skeletal anatomy, necessary for comparative morphological, biomechanical, ecological and phylogenetic studies.
Berkeley SuperNova Ia Program (BSNIP): Initial Spectral Analysis
NASA Astrophysics Data System (ADS)
Silverman, Jeffrey; Kong, J.; Ganeshalingam, M.; Li, W.; Filippenko, A. V.
2011-01-01
The Berkeley SuperNova Ia Program (BSNIP) has been observing nearby (z < 0.1) Type Ia supernovae (SNe Ia) both photometrically and spectroscopically for over two decades. Using telescopes at both Lick and Keck Observatories, we have amassed an extensive collection of well-sampled optical light curves with complementary spectra covering, on average, 3400-10,000 Å. In total, we have obtained nearly 600 spectra of over 200 SNe Ia with densely sampled multi-color light curves. The initial analysis of this dataset consists of accurately and robustly measuring the strength and position of various spectral features near maximum brightness. We determine the endpoints, pseudo-continuum, expansion velocity, equivalent width, and depth of each major feature observed in our wavelength range. For objects with multiple spectra near maximum brightness we investigate how these values change with time. From these measurements we also calculate velocity gradients and various flux ratios within a given spectrum which will allow us to explore correlations between spectral and photometric observables. Some possible correlations have been studied previously, but our dataset is unique in how self-consistent the data reduction and spectral feature measurements have been, and it is a factor of a few larger than most earlier studies. We will briefly summarize the contents of the full dataset as an introduction to our initial analysis. Some of our measurements of SN Ia spectral features, along with a few initial results from those measurements, will be presented. Finally, we will comment on our current progress and planned future work. We gratefully acknowledge the financial support of NSF grant AST-0908886, the TABASGO Foundation, and the Marc J. Staley Graduate Fellowship in Astronomy.
NASA Astrophysics Data System (ADS)
Huang, Xiaomeng; Hu, Chenqi; Huang, Xing; Chu, Yang; Tseng, Yu-heng; Zhang, Guang Jun; Lin, Yanluan
2018-01-01
Mesoscale convective systems (MCSs) are important components of tropical weather systems and the climate system. Long-term MCS data are of great significance for weather and climate research. Using long-term (1985-2008) global satellite infrared (IR) data, we developed a novel objective automatic tracking algorithm, which combines a Kalman filter (KF) with the conventional area-overlapping method, to generate a comprehensive MCS dataset. The new algorithm can effectively track small and fast-moving MCSs and thus obtains more realistic and complete tracking results than previous studies. A few examples are provided to illustrate the potential application of the dataset, with a focus on the diurnal variations of MCSs over land and ocean regions. We find that MCSs occurring over land tend to initiate in the afternoon with greater intensity, whereas oceanic MCSs are more likely to initiate in the early morning with weaker intensity. A double peak in the maximum spatial coverage is noted over the western Pacific, especially over the southwestern Pacific during the austral summer. Oceanic MCSs also persist for approximately 1 h longer than their continental counterparts.
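A minimal sketch of the two ingredients named above: a constant-velocity Kalman filter that predicts the next centroid of a cloud cluster, and an area-overlap score for matching the prediction to candidate clusters in the next image. The state vector, noise settings and overlap definition are assumptions, not the authors' exact formulation.

```python
import numpy as np

class CentroidKalman:
    """Constant-velocity Kalman filter for an MCS centroid (x, y)."""

    def __init__(self, x0, y0, dt=1.0, q=1e-2, r=1.0):
        self.state = np.array([x0, y0, 0.0, 0.0])             # x, y, vx, vy
        self.P = np.eye(4)
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt  # motion model
        self.H = np.zeros((2, 4)); self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = q * np.eye(4)                                # process noise
        self.R = r * np.eye(2)                                # observation noise

    def predict(self):
        self.state = self.F @ self.state
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.state[:2]                                 # predicted centroid

    def update(self, observed_xy):
        innovation = np.asarray(observed_xy, float) - self.H @ self.state
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)              # Kalman gain
        self.state = self.state + K @ innovation
        self.P = (np.eye(4) - K @ self.H) @ self.P

def overlap_fraction(mask_a, mask_b):
    """Fractional area overlap between two boolean cloud-cluster masks."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    return intersection / min(mask_a.sum(), mask_b.sum())
```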
Wherry, Susan A.; Wood, Tamara M.; Anderson, Chauncey W.
2015-01-01
Using the extended 1991–2010 external phosphorus loading dataset, the lake TMDL model was recalibrated following the same procedures outlined in the Phase 1 review. The version of the model selected for further development incorporated an updated sediment initial condition, a numerical solution method for the chlorophyll a model, changes to light and phosphorus factors limiting algal growth, and a new pH-model regression, which removed Julian day dependence in order to avoid discontinuities in pH at year boundaries. This updated lake TMDL model was recalibrated using the extended dataset in order to compare calibration parameters to those obtained from a calibration with the original 7.5-year dataset. The resulting algal settling velocity calibrated from the extended dataset was more than twice the value calibrated with the original dataset, and, because the calibrated values of algal settling velocity and recycle rate are related (more rapid settling required more rapid recycling), the recycling rate also was larger than that determined with the original dataset. These changes in calibration parameters highlight the uncertainty in critical rates in the Upper Klamath Lake TMDL model and argue for their direct measurement in future data collection to increase confidence in the model predictions.
Wave-Breaking Turbulence in the Ocean Surface Layer
2016-06-01
bubbles may be important, both to the process of energy dissipation and to the quality of acoustic Doppler data, especially during rough conditions...energy beneath a breaking wave. For the roughest conditions in this dataset (20 m s-1 winds), bubbles and "spindrift" (spraying foam) may become...to occur at the upper end of this dataset (U10 = 20 m s-1). The pulse-coherent acoustic Doppler methods used on board the SWIFTs are not capable of
Wang, Kai; Mao, Jiafu; Dickinson, Robert; ...
2013-06-05
This paper examines a land surface solar radiation partitioning scheme, i.e., that of the Community Land Model version 4 (CLM4) with coupled carbon and nitrogen cycles. Taking advantage of a unique 30-year fraction of absorbed photosynthetically active radiation (FPAR) dataset derived from the Global Inventory Modeling and Mapping Studies (GIMMS) normalized difference vegetation index (NDVI) dataset, multiple other remote sensing datasets, and site-level observations, we evaluated the CLM4 FPAR's seasonal cycle, diurnal cycle, long-term trends and spatial patterns. These findings show that the model generally agrees with observations in the seasonal cycle, long-term trends, and spatial patterns, but does not reproduce the diurnal cycle. Discrepancies also exist in seasonality magnitudes, peak value months, and spatial heterogeneity. Here, we identify the discrepancy in the diurnal cycle as due to the absence of dependence on sun angle in the model. Implementation of sun angle dependence in a one-dimensional (1-D) model is proposed. The need to better relate vegetation to climate in the model, indicated by the long-term trends, is also noted. Evaluation of the CLM4 land surface solar radiation partitioning scheme using remote sensing and site-level FPAR datasets provides targets for future development in its representation of this naturally complicated process.
NASA Technical Reports Server (NTRS)
Medlin, Jeffrey M.; Wood, Lance; Zavodsky, Brad; Case, Jon; Molthan, Andrew
2012-01-01
The initiation of deep convection during the warm season is a forecast challenge in the relatively high-instability and low-wind-shear environment of the U.S. Deep South. Despite improved knowledge of the character of well-known mesoscale features such as local sea, bay and land breezes, observations show that the evolution of these features falls well short of fully describing the location of first initiates. A joint collaborative modeling effort among the NWS offices in Mobile, AL, and Houston, TX, and NASA's Short-term Prediction Research and Transition (SPoRT) Center was undertaken during the 2012 warm season to examine the impact of certain NASA-produced products on the Weather Research and Forecasting Environmental Modeling System. The NASA products were: 4-km Land Information System data, a 1-km sea surface temperature analysis, and a 4-km greenness vegetation fraction analysis. Similar domains were established over the southeast Texas and Alabama coastlines, each with a 9-km outer grid spacing and a 3-km inner nest spacing. The model was run at each NWS office once per day out to 24 hours from 0600 UTC, using the NCEP Global Forecast System for initial and boundary conditions. Control runs without the NASA products were made at the NASA SPoRT Center. The NCAR Model Evaluation Tools verification package was used to evaluate both the forecast timing and location of the first initiates, with a focus on the impacts of the NASA products on the model forecasts. Select case studies will be presented to highlight the influence of the products.
An ant colony based algorithm for overlapping community detection in complex networks
NASA Astrophysics Data System (ADS)
Zhou, Xu; Liu, Yanheng; Zhang, Jindong; Liu, Tuming; Zhang, Di
2015-06-01
Community detection is of great importance for understanding the structures and functions of networks. Overlap is a significant feature of networks, and overlapping community detection has attracted increasing attention. Many algorithms have been presented to detect overlapping communities. In this paper, we present an ant-colony-based overlapping community detection algorithm which mainly includes ants' location initialization, ants' movement and post-processing phases. An ants' location initialization strategy is designed to identify the initial location of the ants and to initialize the label list stored in each node. During the ants' movement phase, all ants move according to the transition probability matrix, and a new heuristic information computation approach is defined to measure the similarity between two nodes. Every node keeps a label list through the cooperation of the ants until a termination criterion is reached. A post-processing phase is executed on the label list to obtain the final overlapping community structure. We illustrate the capability of our algorithm by experiments on both synthetic networks and real-world networks. The results demonstrate that our algorithm performs better in finding overlapping communities and overlapping nodes in synthetic and real-world datasets than state-of-the-art algorithms.
Local and Cumulative Impervious Cover of Massachusetts Stream Basins
Brandt, Sara L.; Steeves, Peter A.
2009-01-01
Impervious surfaces such as paved roads, parking lots, and building roofs can affect the natural streamflow patterns and ecosystems of nearby streams. This dataset summarizes the percentage of impervious area for watersheds across Massachusetts by using a newly available statewide 1-m binary raster dataset of impervious surface for 2005. In order to accurately capture the wide spatial variability of impervious surface, it was necessary to delineate a new set of finely discretized basin boundaries for Massachusetts. This new set of basins was delineated at a scale finer than that of the existing 12-digit Hydrologic Unit Code basins (HUC-12s) of the national Watershed Boundary Dataset. The dataset consists of three GIS shapefiles. The Massachusetts nested subbasins and the hydrologic units data layers consist of topographically delineated boundaries and their associated percentage of impervious cover for all of Massachusetts except Cape Cod, the Islands, and the Plymouth-Carver region. The Massachusetts groundwater-contributing areas data layer consists of groundwater contributing-area boundaries for streams and coastal areas of Cape Cod and the Plymouth-Carver region. These boundaries were delineated by using groundwater-flow models previously published by the U.S. Geological Survey. Subbasin and hydrologic unit boundaries were delineated statewide with the exception of Cape Cod and the Plymouth-Carver Region. For the purpose of this study, a subbasin is defined as the entire drainage area upstream of an outlet point. Subbasins draining to multiple outlet points on the same stream are nested. That is, a large downstream subbasin polygon comprises all of the smaller upstream subbasin polygons. A hydrologic unit is the intervening drainage area between a given outlet point and the outlet point of the next upstream unit (Fig. 1). Hydrologic units divide subbasins into discrete, nonoverlapping areas. Each hydrologic unit corresponds to a subbasin delineated from the same outlet point; the hydrologic unit and the subbasin share the same unique identifier attribute. Because the same set of outlet points was used for the delineation of subbasins and hydrologic units, the linework for both data layers is identical; however, polygon attributes differ because for a given outlet point, the subbasin polygon area is the sum of all the upstream hydrologic units. Impervious surface summarized for a subbasin represents the percentage of impervious surface area of the entire upstream watershed, whereas the impervious surface for a hydrologic unit represents the percentage of impervious surface area for the intervening drainage area between two outlet points.
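A minimal sketch of the zonal summary underlying such a dataset: the percentage of impervious pixels per subbasin or hydrologic unit, assuming the basin boundaries have already been rasterized onto the grid of the 1-m binary impervious raster; the names and the rasterization step are assumptions.

```python
import numpy as np

def impervious_percentage(impervious, zone_ids):
    """Percent impervious area per zone (subbasin or hydrologic unit).

    impervious : 2-D array of 0/1 impervious pixels (1-m binary raster)
    zone_ids   : 2-D array of the same shape holding an integer zone ID
                 per pixel (0 = outside all zones)
    """
    result = {}
    for zone in np.unique(zone_ids):
        if zone == 0:
            continue
        cells = impervious[zone_ids == zone]
        result[int(zone)] = 100.0 * cells.mean()   # share of impervious pixels
    return result
```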
Nidheesh, N; Abdul Nazeer, K A; Ameer, P M
2017-12-01
Clustering algorithms with steps involving randomness usually give different results on different executions for the same dataset. This non-deterministic nature of algorithms such as the K-Means clustering algorithm limits their applicability in areas such as cancer subtype prediction using gene expression data. It is hard to sensibly compare the results of such algorithms with those of other algorithms. The non-deterministic nature of K-Means is due to its random selection of data points as initial centroids. We propose an improved, density-based version of K-Means, which involves a novel and systematic method for selecting initial centroids. The key idea of the algorithm is to select as initial centroids data points which belong to dense regions and which are adequately separated in feature space. We compared the proposed algorithm with a set of eleven widely used single clustering algorithms and a prominent ensemble clustering algorithm used for cancer data classification, based on their performance on ten cancer gene expression datasets. The proposed algorithm showed better overall performance than the others. There is a pressing need in the biomedical domain for simple, easy-to-use and more accurate machine learning tools for cancer subtype prediction. The proposed algorithm is simple, easy to use and gives stable results. Moreover, it provides comparatively better predictions of cancer subtypes from gene expression data. Copyright © 2017 Elsevier Ltd. All rights reserved.
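A hedged sketch of the general idea described above (choose initial centroids from dense, well-separated regions of feature space) using an inverse mean k-nearest-neighbour distance as the density estimate; this illustrates the principle, not the authors' exact procedure, and the separation threshold is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

def density_based_centroids(X, k, n_neighbors=10):
    """Pick k initial centroids from dense, well-separated data points."""
    nn = NearestNeighbors(n_neighbors=n_neighbors + 1).fit(X)
    dist, _ = nn.kneighbors(X)                  # column 0 is the point itself
    density = 1.0 / (dist[:, 1:].mean(axis=1) + 1e-12)
    order = np.argsort(density)[::-1]           # densest points first
    min_sep = np.median(dist[:, 1:])            # assumed separation threshold
    chosen = [order[0]]
    for idx in order[1:]:
        if len(chosen) == k:
            break
        if all(np.linalg.norm(X[idx] - X[j]) > min_sep for j in chosen):
            chosen.append(idx)
    for idx in order:                           # fallback if too few points passed
        if len(chosen) == k:
            break
        if idx not in chosen:
            chosen.append(idx)
    return X[np.asarray(chosen)]

# Deterministic K-Means: pass the selected points as the initialization.
# labels = KMeans(n_clusters=k, init=density_based_centroids(X, k), n_init=1).fit_predict(X)
```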
AstroGrid: the UK's Virtual Observatory Initiative
NASA Astrophysics Data System (ADS)
Mann, Robert G.; Astrogrid Consortium; Lawrence, Andy; Davenhall, Clive; Mann, Bob; McMahon, Richard; Irwin, Mike; Walton, Nic; Rixon, Guy; Watson, Mike; Osborne, Julian; Page, Clive; Allan, Peter; Giaretta, David; Perry, Chris; Pike, Dave; Sherman, John; Murtagh, Fionn; Harra, Louise; Bentley, Bob; Mason, Keith; Garrington, Simon
AstroGrid is the UK's Virtual Observatory (VO) initiative. It brings together the principal astronomical data centres in the UK, and has been funded to the tune of ~£5M over the next three years, via PPARC, as part of the UK e-science programme. Its twin goals are the provision of the infrastructure and tools for the federation and exploitation of large astronomical (X-ray to radio), solar and space plasma physics datasets, and the delivery of federations of current datasets for its user communities to exploit using those tools. Whilst AstroGrid's work will be centred on existing and future (e.g. VISTA) UK datasets, it will seek solutions to generic VO problems and will contribute to the developing international virtual observatory framework: AstroGrid is a member of the EU-funded Astrophysical Virtual Observatory project, has close links to a second EU Grid initiative, the European Grid of Solar Observations (EGSO), and will seek an active role in the development of the common standards on which the international virtual observatory will rely. In this paper we shall primarily describe the concrete plans for AstroGrid's one-year Phase A study, which will centre on: (i) the definition of detailed science requirements through community consultation; (ii) the undertaking of a "functionality market survey" to test the utility of existing technologies for the VO; and (iii) a pilot programme of database federations, each addressing different aspects of the general database federation problem. Further information can be found on the AstroGrid web site.
Carrea, Laura; Embury, Owen; Merchant, Christopher J
2015-11-01
Datasets containing information to locate and identify water bodies have been generated from data locating static water bodies with a resolution of about 300 m (1/360°), recently released by the Land Cover Climate Change Initiative (LC CCI) of the European Space Agency. The LC CCI water-bodies dataset has been obtained from multi-temporal metrics based on time series of the backscattered intensity recorded by ASAR on Envisat between 2005 and 2010. The newly derived datasets coherently provide distance to land, distance to water, water-body identifiers, and lake-centre locations. The water-body identifier dataset locates the water bodies by assigning the identifiers of the Global Lakes and Wetlands Database (GLWD), and lake centres are defined for inland waters for which GLWD IDs were determined. The new datasets therefore link recent lake/reservoir/wetlands extent to the GLWD, together with a set of coordinates which unambiguously locates the water bodies in the database. Information on distance-to-land for each water cell and distance-to-water for each land cell has many potential applications in remote sensing, where the applicability of geophysical retrieval algorithms may be affected by the presence of water or land within a satellite field of view (image pixel). During the generation and validation of the datasets some limitations of the GLWD database and of the LC CCI water-bodies mask have been found. Some examples of the inaccuracies/limitations are presented and discussed. Temporal change in water-body extent is common. Future versions of the LC CCI dataset are planned to represent temporal variation, and this will permit these derived datasets to be updated.
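A minimal sketch (not the LC CCI production code) of deriving distance-to-water and distance-to-land layers from a binary water mask with a Euclidean distance transform; the toy mask and the conversion using the roughly 300 m pixel size are illustrative.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

PIXEL_M = 300.0                       # approximate grid spacing of the mask
water = np.zeros((200, 200), dtype=bool)
water[80:120, 90:150] = True          # a toy "lake"

# distance_transform_edt measures distance to the nearest zero-valued cell:
dist_to_water_m = distance_transform_edt(~water) * PIXEL_M   # meaningful on land cells
dist_to_land_m  = distance_transform_edt(water)  * PIXEL_M   # meaningful on water cells
```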
Observational Evidence for Desert Amplification Using Multiple Satellite Datasets.
Wei, Nan; Zhou, Liming; Dai, Yongjiu; Xia, Geng; Hua, Wenjian
2017-05-17
Desert amplification identified in recent studies has large uncertainties due to data paucity over remote deserts. Here we present observational evidence, using multiple satellite-derived datasets, that desert amplification is a real, large-scale mode of warming in near-surface and lower-tropospheric temperatures. Trend analyses of three long-term temperature products consistently confirm that near-surface warming is generally strongest over the driest climate regions and that this spatial pattern of warming maximizes near the surface, gradually decays with height, and disappears in the upper troposphere. Short-term anomaly analyses show a strong spatial and temporal coupling of changes in temperatures, water vapor and downward longwave radiation (DLR), indicating that the large increase in DLR primarily drives near-surface warming and is tightly associated with increasing water vapor over deserts. Atmospheric soundings of temperature and water vapor anomalies support the results of the long-term temperature trend analysis and suggest that desert amplification is due to comparable warming and moistening effects of the troposphere. Desert amplification likely results from the strongest water vapor feedbacks near the surface over the driest deserts, where the air is very sensitive to changes in water vapor and thus efficient in enhancing the longwave greenhouse effect in a warming climate.
NASA Astrophysics Data System (ADS)
Lombardo, Kelly; Sinsky, Eric; Edson, James; Whitney, Michael M.; Jia, Yan
2018-03-01
A series of numerical sensitivity experiments is performed to quantify the impact of sea-surface temperature (SST) distribution on offshore surface fluxes and simulated sea-breeze dynamics. Simulations of two mid-latitude sea-breeze events over coastal New England are performed using a spatially-uniform SST, as well as spatially-varying SST datasets of 32- and 1-km horizontal resolutions. Offshore surface heat and buoyancy fluxes vary in response to the SST distribution. Local sea-breeze circulations are relatively insensitive, with minimal differences in vertical structure and propagation speed among the experiments. The largest thermal perturbations are confined to the lowest 10% of the sea-breeze column due to the relatively high stability of the mid-Atlantic marine atmospheric boundary layer (ABL) suppressing vertical mixing, resulting in the depth of the marine layer remaining unchanged. Minimal impacts on the column-averaged virtual potential temperature and sea-breeze depth translate to small changes in sea-breeze propagation speed. This indicates that the use of datasets with a fine-scale SST may not produce more accurate sea-breeze simulations in highly stable marine ABL regimes, though it may prove more beneficial in less stable sub-tropical environments.
Ian Warren
2010-09-15
P- and S-wave datasets and an associated report studying the ability to use three-component, long-offset surface seismic surveys to find large-aperture fractures in geothermal resources at the San Emidio geothermal resource area in Washoe County, Nevada.
USDA-ARS?s Scientific Manuscript database
The objective of this study was to compare the dependencies of survival rates on temperature for indicator organisms E. coli and Enterococcus and the pathogen Salmonella in surface waters. A database consisting of 86 survival datasets from peer-reviewed papers on inactivation of E. coli, Salmonella...
Lee, Won-Joon; Wilkinson, Caroline M; Hwang, Hyeon-Shik; Lee, Sang-Mi
2015-05-01
Accuracy, measured against the corresponding actual face, is the most important factor supporting the reliability of forensic facial reconstruction (FFR). A number of methods have been employed to evaluate the objective accuracy of FFR. Recently, the degree of resemblance between a computer-generated FFR and the actual face has been measured by geometric surface comparison. In this study, three FFRs were produced using live adult Korean subjects and three-dimensional computerized modeling software. The deviations between the facial surfaces of each FFR and the head CT scan of the corresponding subject were analyzed in reverse modeling software. The results were compared with those from a previous study that applied the same methodology except for the average facial soft tissue depth dataset. The three FFRs of this study, which applied the updated dataset, showed smaller deviation errors between the facial surfaces of the FFR and the corresponding subject than those from the previous study. The results suggest that appropriate average tissue depth data are important for increasing the quantitative accuracy of FFR. © 2015 American Academy of Forensic Sciences.
On the relationship between land surface infrared emissivity and soil moisture
NASA Astrophysics Data System (ADS)
Zhou, Daniel K.; Larar, Allen M.; Liu, Xu
2018-01-01
The relationship between surface infrared (IR) emissivity and soil moisture content has been investigated based on satellite measurements. Surface soil moisture content can be estimated by IR remote sensing, namely using the surface parameters of IR emissivity, temperature, vegetation coverage, and soil texture. It is possible to separate IR emissivity from other parameters affecting surface soil moisture estimation. The main objective of this paper is to examine the correlation between land surface IR emissivity and soil moisture. To this end, we have developed a simple yet effective scheme to estimate volumetric soil moisture (VSM) using IR land surface emissivity retrieved from satellite IR spectral radiance measurements, assuming those other parameters impacting the radiative transfer (e.g., temperature, vegetation coverage, and surface roughness) are known for an acceptable time and space reference location. This scheme is applied to a decade of global IR emissivity data retrieved from MetOp-A infrared atmospheric sounding interferometer measurements. The VSM estimated from these IR emissivity data (denoted as IR-VSM) is used to demonstrate its measurement-to-measurement variations. Representative 0.25-deg spatially-gridded monthly-mean IR-VSM global datasets are then assembled to compare with those routinely provided from satellite microwave (MW) multisensor measurements (denoted as MW-VSM), demonstrating VSM spatial variations as well as seasonal cycles and interannual variability. Initial positive agreement is shown between IR- and MW-VSM (R^2 = 0.85). IR land surface emissivity contains surface water content information, so soil moisture estimated from IR measurements corresponds well with that customarily obtained from MW measurements. A decade-long monthly-gridded emissivity atlas is used to estimate IR-VSM, to demonstrate its seasonal-cycle and interannual variation, which is spatially coherent and consistent with that from MW measurements, and, moreover, to achieve our objective of investigating the relationship between land surface IR emissivity and soil moisture.
Global retrieval of soil moisture and vegetation properties using data-driven methods
NASA Astrophysics Data System (ADS)
Rodriguez-Fernandez, Nemesio; Richaume, Philippe; Kerr, Yann
2017-04-01
Data-driven methods such as neural networks (NNs) are a powerful tool to retrieve soil moisture from multi-wavelength remote sensing observations at global scale. In this presentation we will review a number of recent results regarding the retrieval of soil moisture with the Soil Moisture and Ocean Salinity (SMOS) satellite, either using SMOS brightness temperatures as input data for the retrieval or using SMOS soil moisture retrievals as reference dataset for the training. The presentation will discuss several possibilities for both the input datasets and the datasets to be used as reference for the supervised learning phase. Regarding the input datasets, it will be shown that NNs take advantage of the synergy of SMOS data and data from other sensors such as the Advanced Scatterometer (ASCAT, active microwaves) and MODIS (visible and infrared). NNs have also been successfully used to construct long time series of soil moisture from the Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E) and SMOS. An NN with input data from AMSR-E observations and SMOS soil moisture as reference for the training was used to construct a dataset sharing a similar climatology and without a significant bias with respect to SMOS soil moisture. Regarding the reference data to train the data-driven retrievals, we will show different possibilities depending on the application. Using actual in situ measurements is challenging at global scale due to the scarce distribution of sensors. In contrast, in situ measurements have been successfully used to retrieve SM at continental scale in North America, where the density of in situ measurement stations is high. Using global land surface models to train the NN constitutes an interesting alternative for implementing new remote sensing surface datasets. In addition, these datasets can be used to perform data assimilation into the model used as reference for the training. This approach has recently been tested at the European Centre for Medium-Range Weather Forecasts (ECMWF). Finally, retrievals using radiative transfer models can also be used as a reference SM dataset for the training phase. This approach was used to retrieve soil moisture from AMSR-E, as mentioned above, and also to implement the official European Space Agency (ESA) SMOS soil moisture product in near-real time. We will finish with a discussion of the retrieval of vegetation parameters from SMOS observations using data-driven methods.
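A minimal sketch of the supervised-learning setup described above: a neural network mapping multi-sensor inputs to a reference soil-moisture product used as the training target. The features and data below are synthetic stand-ins, not actual SMOS or ASCAT products.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000
tb_h = rng.uniform(150.0, 300.0, n)          # stand-in for SMOS TB, H-pol (K)
tb_v = rng.uniform(150.0, 300.0, n)          # stand-in for SMOS TB, V-pol (K)
sigma0 = rng.uniform(-25.0, -5.0, n)         # stand-in for ASCAT backscatter (dB)
X = np.column_stack([tb_h, tb_v, sigma0])
# Synthetic "reference" soil moisture (m^3/m^3), loosely tied to the inputs.
sm_ref = 0.5 - 0.001 * tb_v + 0.005 * (sigma0 + 25.0) + rng.normal(0.0, 0.02, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, sm_ref, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                                   random_state=0))
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
```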
Evolution of anthropogenic emissions at the global and regional scale during the past three decades
NASA Astrophysics Data System (ADS)
Granier, C.; Bessagnet, B. B.; Bond, T. C.; D'Angiola, A.; Denier van der Gon, H.; Frost, G. J.; Heil, A.; Kaiser, J.; Kinne, S. A.; Klimont, Z.; Kloster, S.; Lamarque, J.; Liousse, C.; Masui, T.; Meleux, F.; Mieville, A.; Ohara, T.; Raut, J.; Riahi, K.; Schultz, M. G.; Smith, S.; Thomson, A. M.; van Aardenne, J.; van der Werf, G.; van Vuuren, D.
2010-12-01
The knowledge of the distributions of surface emissions of gases and aerosols is essential for an accurate modeling and analysis of the distribution and evolution of the concentration of gaseous and particulate chemical species. The quantification of surface fluxes by source of origin is furthermore central to the assessment of effects and the development of control measures. Over the past few years, different ranges of emission fluxes have been proposed by several studies, which have provided emissions at different spatial and temporal scales. We have compared the emissions of several chemical compounds, i.e. carbon monoxide, nitrogen oxides, sulfur dioxide and black carbon, as provided by global and regional emissions inventories in different regions of the world for the past thirty years. The presentation will focus on the United States, Europe and China. Significant differences between the datasets providing emissions in these regions have been identified, reaching for example 60% and 35% for anthropogenic emissions of carbon monoxide and nitrogen oxides in both regions, respectively. We will assess the current uncertainties on surface emissions and their recent trends. This analysis is often hindered because of differences in base years and in species considered in the different datasets. Current work aiming at compiling comparable metrics for such species for the analysis of regional and global emission datasets will be discussed.
NASA Technical Reports Server (NTRS)
McCaul, Eugene W., Jr.; Case, Jonathan L.; Zavodsky, Bradley; Srikishen, Jayanthi; Medlin, Jeffrey; Wood, Lance
2014-01-01
Convection-allowing numerical weather simulations have often been shown to produce convective storms that have significant sensitivity to choices of model physical parameterizations. Among the most important of these sensitivities are those related to cloud microphysics, but planetary boundary layer parameterizations also have a significant impact on the evolution of the convection. Aspects of the simulated convection that display sensitivity to these physics schemes include updraft size and intensity, simulated radar reflectivity, timing and placement of storm initiation and decay, total storm rainfall, and other storm features derived from storm structure and hydrometeor fields, such as predicted lightning flash rates. In addition to the basic parameters listed above, the simulated storms may also exhibit sensitivity to imposed initial conditions, such as the fields of soil temperature and moisture, vegetation cover and health, and sea and lake water surface temperatures. Some of these sensitivities may rival the basic physics sensitivities mentioned earlier. These sensitivities have the potential to disrupt the accuracy of short-term forecast simulations of convective storms, and thereby pose significant difficulties for weather forecasters. To make a systematic study of the quantitative impacts of each of these sensitivities, a matrix of simulations has been performed using all combinations of eight separate microphysics schemes, three boundary layer schemes, and two sets of initial conditions. The first version of initial conditions consists of the default data from large-scale operational model fields, while the second features specialized higher-resolution soil conditions, vegetation conditions, and water surface temperatures derived from datasets created at NASA's Short-term Prediction Research and Transition (SPoRT) Center at the National Space Science and Technology Center (NSSTC) in Huntsville, AL. The resulting 48 simulations were conducted for five midsummer, weakly sheared coastal convective events at each of two sites, Mobile, AL (MOB) and Houston, TX (HGX). Of special interest to operational forecasters at MOB and HGX were the accuracy of timing and placement of convective storm initiation, reflectivity magnitudes and coverage, rainfall, and inferred lightning threat.
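A minimal sketch of how such an 8 × 3 × 2 = 48-member run matrix can be enumerated; the scheme names below are hypothetical placeholders, not the specific options used in the study.

```python
from itertools import product

microphysics = [f"mp_{i}" for i in range(1, 9)]        # 8 microphysics schemes
pbl_schemes  = ["pbl_A", "pbl_B", "pbl_C"]             # 3 boundary-layer schemes
init_data    = ["operational", "SPoRT"]                # 2 sets of initial conditions

runs = [
    {"microphysics": mp, "pbl": pbl, "init": ic}
    for mp, pbl, ic in product(microphysics, pbl_schemes, init_data)
]
assert len(runs) == 48
```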
Page, William R.; Berry, Margaret E.; VanSistine, D. Paco; Snyders, Scott R.
2009-01-01
The purpose of this map is to provide an integrated, bi-national geologic map dataset for display and analyses on an Arc Internet Map Service (IMS) dedicated to environmental health studies in the United States-Mexico border region. The IMS web site was designed by the US-Mexico Border Environmental Health Initiative project and collaborators, and the IMS and project web site address is http://borderhealth.cr.usgs.gov/. The objective of the project is to acquire, evaluate, analyze, and provide earth, biologic, and human health resources data within a GIS framework (IMS) to further our understanding of possible linkages between the physical environment and public health issues. The geologic map dataset is just one of many datasets included in the web site; other datasets include biologic, hydrologic, geographic, and human health themes.
A conceptual prototype for the next-generation national elevation dataset
Stoker, Jason M.; Heidemann, Hans Karl; Evans, Gayla A.; Greenlee, Susan K.
2013-01-01
In 2012 the U.S. Geological Survey's (USGS) National Geospatial Program (NGP) funded a study to develop a conceptual prototype for a new National Elevation Dataset (NED) design with expanded capabilities to generate and deliver a suite of bare earth and above ground feature information over the United States. This report details the research on identifying operational requirements based on prior research, evaluation of what is needed for the USGS to meet these requirements, and development of a possible conceptual framework that could potentially deliver the kinds of information that are needed to support NGP's partners and constituents. This report provides an initial proof-of-concept demonstration using an existing dataset, and recommendations for the future, to inform NGP's ongoing and future elevation program planning and management decisions. The demonstration shows that this type of functional process can robustly create derivatives from lidar point cloud data; however, more research needs to be done to see how well it extends to multiple datasets.
NASA Astrophysics Data System (ADS)
Deem, Eric; Cattafesta, Louis; Zhang, Hao; Rowley, Clancy
2016-11-01
Closed-loop control of flow separation requires the spatio-temporal states of the flow to be fed back through the controller in real time. Previously, static and dynamic estimation methods have been employed that provide reduced-order model estimates of the POD-coefficients of the flow velocity using surface pressure measurements. However, this requires a "learning" dataset a priori. This approach is effective as long as the dynamics during control do not stray from the learning dataset. Since only a few dynamical features are required for feedback control of flow separation, many of the details provided by full-field snapshots are superfluous. This motivates a state-observation technique that extracts key dynamical features directly from surface pressure, without requiring PIV snapshots. The results of identifying DMD modes of separated flow through an array of surface pressure sensors in real-time are presented. This is accomplished by employing streaming DMD "on the fly" to surface pressure snapshots. These modal characteristics exhibit striking similarities to those extracted from PIV data and the pressure field obtained via solving Poisson's equation. Progress towards closed-loop separation control based on the dynamic modes of surface pressure will be discussed. Supported by AFOSR Grant FA9550-14-1-0289.
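A minimal sketch of exact (batch) DMD applied to a snapshot matrix, to illustrate the modal decomposition referred to above; the study uses a streaming DMD variant on surface-pressure snapshots, and the sensor data here are synthetic.

```python
import numpy as np

def dmd(X: np.ndarray, r: int):
    """X: (n_sensors, n_snapshots). Return rank-r DMD eigenvalues and modes."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, V = U[:, :r], s[:r], Vh.conj().T[:, :r]
    A_tilde = U.conj().T @ X2 @ V @ np.diag(1.0 / s)      # reduced linear operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = X2 @ V @ np.diag(1.0 / s) @ W                 # exact DMD modes
    return eigvals, modes

# Synthetic "pressure" data: two travelling waves sampled at 16 sensors.
t = np.linspace(0, 4 * np.pi, 200)
x = np.linspace(0, 1, 16)[:, None]
X = np.sin(2 * np.pi * x + 2 * t) + 0.5 * np.cos(4 * np.pi * x + 5 * t)

eigvals, modes = dmd(X, r=4)
freqs = np.angle(eigvals) / (t[1] - t[0])                 # modal frequencies (rad/s)
```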
NASA Astrophysics Data System (ADS)
Banerjee, Arindam; Kolekar, Nitin
2015-11-01
The current experimental investigation aims at understanding the effect of free-surface proximity and associated blockage on the near-wake flow field and performance of a three-bladed horizontal-axis marine hydrokinetic turbine. Experiments were conducted on a 0.14 m radius, three-bladed, constant-chord turbine in a 0.61 m × 0.61 m test-section water channel. The turbine was subjected to various rotational speeds, flow speeds, and depths of immersion. Experimental data were acquired through a submerged in-line thrust-torque sensor and corrected to an unblocked dataset with a blockage correction using measured thrust data. A detailed comparison is presented between blocked and unblocked datasets to identify the influence of Reynolds number and free-surface proximity on blockage effects. The percent change in Cp was found to depend on flow velocity, rotational speed, and free-surface-to-blade-tip clearance. Further, flow visualization using stereoscopic particle image velocimetry was carried out in the near-wake region of the turbine to understand the mechanism responsible for the variation of Cp with rotational speed and free-surface proximity. Results revealed the presence of a slower wake at higher rotational velocities and increased asymmetry in the wake at high free-surface proximity.
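For reference, the power coefficient Cp discussed above is conventionally defined as below; the symbols (torque Q, rotational speed ω, rotor radius R, water density ρ, freestream speed U∞) are standard notation assumed here, and the specific blockage-correction procedure used in the study is not reproduced.

$$C_P \;=\; \frac{P}{\tfrac{1}{2}\,\rho\,A\,U_\infty^{3}} \;=\; \frac{Q\,\omega}{\tfrac{1}{2}\,\rho\,\pi R^{2}\,U_\infty^{3}}$$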
NASA Astrophysics Data System (ADS)
Prat, O. P.; Nelson, B. R.
2013-12-01
We use a suite of quantitative precipitation estimates (QPEs) derived from satellite, radar, surface observations, and models to derive precipitation characteristics over CONUS for the period 2002-2012. This comparison effort includes satellite multi-sensor datasets of TMPA 3B42, CMORPH, and PERSIANN. The satellite-based QPEs are compared over the concurrent period with the NCEP Stage IV product, which is a near-real-time product providing precipitation data at the hourly temporal scale gridded at a nominal 4-km spatial resolution. In addition, remotely sensed precipitation datasets are compared with surface observations from the Global Historical Climatology Network (GHCN-Daily) and from the PRISM (Parameter-elevation Regressions on Independent Slopes Model), which provides gridded precipitation estimates that are used as a baseline for multi-sensor QPE products comparison. The comparisons are performed at the annual, seasonal, monthly, and daily scales with focus on selected river basins (Southeastern US, Pacific Northwest, Great Plains). While unconditional annual rain rates show satisfactory agreement among all products, results suggest that satellite QPE datasets exhibit important biases, in particular at higher rain rates (≥4 mm/day). Conversely, on seasonal scales differences between remotely sensed data and ground surface observations can be greater than 50% and up to 90% for low daily accumulation (≤1 mm/day) such as in the Western US (summer) and Central US (winter). The conditional analysis performed using different daily rainfall accumulation thresholds (from low rainfall intensity to intense precipitation) shows that while intense events measured at the ground are infrequent (around 2% for daily accumulation above 2 inches/day), remotely sensed products display differences of 20-50% and up to 90-100%. A discussion on the impact of differing spatial and temporal resolutions with respect to the datasets' ability to capture extreme precipitation events is also provided. Furthermore, this work is part of a broader effort to evaluate long-term multi-sensor QPEs with a view toward developing Climate Data Records (CDRs) for precipitation.
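A minimal sketch of the kind of conditional, threshold-based comparison described above; the daily series are synthetic and the thresholds merely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
reference = rng.gamma(shape=0.4, scale=6.0, size=3650)        # "observed" mm/day
satellite = reference * rng.lognormal(mean=0.1, sigma=0.5, size=reference.size)

for thresh in (1.0, 4.0, 25.4, 50.8):                         # mm/day (1 and 2 in/day)
    mask = reference >= thresh
    if mask.any():
        bias = 100.0 * (satellite[mask].mean() - reference[mask].mean()) / reference[mask].mean()
        print(f">= {thresh:5.1f} mm/day: {mask.mean()*100:4.1f}% of days, "
              f"conditional bias {bias:+.0f}%")
```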
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Xiaoma; Zhou, Yuyu; Asrar, Ghassem R.
High spatiotemporal land surface temperature (LST) datasets are increasingly needed in a variety of fields such as ecology, hydrology, meteorology, epidemiology, and energy systems. Moderate Resolution Imaging Spectroradiometer (MODIS) LST is one such high spatiotemporal dataset that is widely used. However, it has a large amount of missing values, primarily because of clouds. Gapfilling the missing values is an important approach to create high spatiotemporal LST datasets. However, current gapfilling methods have limitations in terms of accuracy and time required to assemble the data over large areas (e.g., national and continental levels). In this study, we developed a 3-step hybrid method by integrating a combination of daily merging, spatiotemporal gapfilling, and temporal interpolation methods, to create a high spatiotemporal LST dataset using the four daily LST observations from the two MODIS instruments on the Terra and Aqua satellites. We applied this method in urban and surrounding areas for the conterminous U.S. in 2010. The evaluation of the gapfilled LST product indicates a root mean squared error (RMSE) of 3.3 K for mid-daytime (1:30 pm) and 2.7 K for mid-nighttime (1:30 am) observations. The method can be easily extended to other years and regions and is also applicable to other satellite products. This seamless daily (mid-daytime and mid-nighttime) LST product with 1 km spatial resolution is of great value for studying effects of urbanization (e.g., urban heat island) and the related impacts on people, ecosystems, energy systems and other infrastructure for cities.
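A highly simplified sketch of the three-step workflow described above (daily merging, spatial gapfilling, temporal interpolation) on a toy pixel grid; the real product uses four MODIS overpasses and more sophisticated gapfilling steps.

```python
import numpy as np

def gapfill(terra: np.ndarray, aqua: np.ndarray) -> np.ndarray:
    """terra, aqua: (time, y, x) arrays of LST in K with NaN where cloudy."""
    # Step 1: daily merging - prefer Terra, fall back to Aqua.
    merged = np.where(np.isnan(terra), aqua, terra)

    # Step 2: spatial gapfilling - replace NaNs with the mean of a 3x3 window.
    pad = np.pad(merged, ((0, 0), (1, 1), (1, 1)), constant_values=np.nan)
    windows = np.stack([pad[:, i:i + merged.shape[1], j:j + merged.shape[2]]
                        for i in range(3) for j in range(3)])
    neigh_mean = np.nanmean(windows, axis=0)
    filled = np.where(np.isnan(merged), neigh_mean, merged)

    # Step 3: temporal interpolation of any remaining gaps, pixel by pixel.
    t = np.arange(filled.shape[0])
    for iy in range(filled.shape[1]):
        for ix in range(filled.shape[2]):
            col = filled[:, iy, ix]
            good = ~np.isnan(col)
            if good.any() and not good.all():
                filled[:, iy, ix] = np.interp(t, t[good], col[good])
    return filled
```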
Integration of coastal inundation modeling from storm tides to individual waves
NASA Astrophysics Data System (ADS)
Li, Ning; Roeber, Volker; Yamazaki, Yoshiki; Heitmann, Troy W.; Bai, Yefei; Cheung, Kwok Fai
2014-11-01
Modeling of storm-induced coastal inundation has primarily focused on the surge generated by atmospheric pressure and surface winds with phase-averaged effects of the waves as setup. Through an interoperable model package, we investigate the role of phase-resolving wave processes in simulation of coastal flood hazards. A spectral ocean wave model describes generation and propagation of storm waves from deep to intermediate water, while a non-hydrostatic storm-tide model has the option to couple with a spectral coastal wave model for computation of phase-averaged processes in a near-shore region. The ocean wave and storm-tide models can alternatively provide the wave spectrum and the surface elevation as the boundary and initial conditions for a nested Boussinesq model. Additional surface-gradient terms in the Boussinesq equations maintain the quasi-steady, non-uniform storm tide for modeling of phase-resolving surf and swash-zone processes as well as combined tide, surge, and wave inundation. The two nesting schemes are demonstrated through a case study of Hurricane Iniki, which made landfall on the Hawaiian Island of Kauai in 1992. With input from a parametric hurricane model and global reanalysis and tidal datasets, the two approaches produce comparable significant wave heights and phase-averaged surface elevations in the surf zone. The nesting of the Boussinesq model provides a seamless approach to augment the inundation due to the individual waves in matching the recorded debris line along the coast.
This funding opportunity announcement (FOA) encourages applications that propose to conduct secondary data analysis and integration of existing datasets and database resources, with the ultimate aim to elucidate the genetic architecture of cancer risk and related outcomes. The goal of this initiative is to address key scientific questions relevant to cancer epidemiology by supporting the analysis of existing genetic or genomic datasets, possibly in combination with environmental, outcomes, behavioral, lifestyle, and molecular profiles data.
NASA Technical Reports Server (NTRS)
Vander Kaaden, K. E.; Harrington, A. D.; McCubbin, F. M.
2017-01-01
With the resurgence of human curiosity to explore planetary bodies beyond our own comes the possibility of health risks associated with the materials covering the surfaces of these planetary bodies. In order to mitigate these health risks and prepare ourselves for the eventuality of sending humans to other planetary bodies, toxicological evaluation of extraterrestrial materials is imperative (Harrington et al. 2017). Given our close proximity, as well as our increased datasets from various missions (e.g., Apollo, Mars Exploration Rovers, Dawn, etc.), the three most likely candidates for initial human surface exploration are the Moon, Mars, and asteroid 4 Vesta. Seven samples, including lunar mare basalt NWA 4734, lunar regolith breccia NWA 7611, martian basalt Tissint, martian regolith breccia NWA 7034, the vestian basalt Berthoud, the vestian regolith breccia NWA 2060, and a terrestrial mid-ocean ridge basalt, were examined for bulk chemistry, mineralogy, geochemical reactivity, and inflammatory potential. In this study, we have taken aliquots from these samples, both the fresh samples and those that underwent iron leaching (Tissint, NWA 7034, NWA 4734, MORB), and performed low-pressure, high-temperature melting experiments to determine the bulk composition of the materials that were previously examined.
Reddy, James E.; Kappel, William M.
2010-01-01
Existing hydrogeologic and geospatial data useful for the assessment of focused recharge to the carbonate-rock aquifer in the central part of Genesee County, NY, were compiled from numerous local, State, and Federal agency sources. Data sources utilized in this pilot study include available geospatial datasets from Federal and State agencies, interviews with local highway departments and the Genesee County Soil and Water Conservation District, and an initial assessment of karst features through the analysis of ortho-photographs, with minimal field verification. The compiled information is presented in a series of county-wide and quadrangle maps. The county-wide maps present generalized hydrogeologic conditions including distribution of geologic units, major faults, and karst features, and bedrock-surface and water-table configurations. Ten sets of quadrangle maps of the area that overlies the carbonate-rock aquifer present more detailed and additional information including distribution of bedrock outcrops, thin and (or) permeable soils, and karst features such as sinkholes and swallets. Water-resource managers can utilize the information summarized in this report as a guide to their assessment of focused recharge to, and the potential for surface contaminants to reach the carbonate-rock aquifer.
Higher homocysteine associated with thinner cortical gray matter in 803 ADNI subjects
Madsen, Sarah K.; Rajagopalan, Priya; Joshi, Shantanu H.; Toga, Arthur W.; Thompson, Paul M.
2014-01-01
A significant portion of our risk for dementia in old age is associated with lifestyle factors (diet, exercise, and cardiovascular health) that are modifiable, at least in principle. One such risk factor – high homocysteine levels in the blood – is known to increase risk for Alzheimer’s disease and vascular disorders. Here we set out to understand how homocysteine levels relate to 3D surface-based maps of cortical gray matter distribution (thickness, volume, surface area) computed from brain MRI in 803 elderly subjects from the Alzheimer’s Disease Neuroimaging Initiative (ADNI) dataset. Individuals with higher plasma levels of homocysteine had lower gray matter thickness in bilateral frontal, parietal, occipital and right temporal regions; and lower gray matter volumes in left frontal, parietal, temporal, and occipital regions, after controlling for diagnosis, age, and sex, and after correcting for multiple comparisons. No significant within-group associations were found in cognitively healthy people, mild cognitive impairment, or Alzheimer’s disease. These regional differences in gray matter structure may be useful biomarkers to assess the effectiveness of interventions, such as vitamin B supplements, that aim to prevent homocysteine-related brain atrophy by normalizing homocysteine levels. PMID:25444607
Crew Earth Observations: Twelve Years of Documenting Earth from the International Space Station
NASA Technical Reports Server (NTRS)
Evans, Cynthia A.; Stefanov, William L.; Willis, Kimberley; Runco, Susan; Wilkinson, M. Justin; Dawson, Melissa; Trenchard, Michael
2012-01-01
The Crew Earth Observations (CEO) payload was one of the initial experiments aboard the International Space Station, and has been continuously collecting data about the Earth since Expedition 1. The design of the experiment is simple: using state-of-the-art camera equipment, astronauts collect imagery of the Earth's surface over defined regions of scientific interest and also document dynamic events such as storm systems, floods, wildfires and volcanic eruptions. To date, CEO has provided roughly 600,000 images of Earth, capturing views of features and processes on land, the oceans, and the atmosphere. CEO data are less rigorously constrained than other remote sensing data, but the volume of data, and the unique attributes of the imagery provide a rich and understandable view of the Earth that is difficult to achieve from the classic remote sensing platforms. In addition, the length-of-record of the imagery dataset, especially when combined with astronaut photography from other NASA and Russian missions starting in the early 1960s, provides a valuable record of changes on the surface of the Earth over 50 years. This time period coincides with the rapid growth of human settlements and human infrastructure.
Ocean Surface Vector Wind: Research Challenges and Operational Opportunities
NASA Technical Reports Server (NTRS)
Halpern, David
2012-01-01
The atmosphere and ocean are joined together over seventy percent of Earth, with ocean surface vector wind (OSVW) stress one of the linkages. Satellite OSVW measurements provide estimates of wind divergence at the bottom of the atmosphere and wind stress curl at the top of the ocean; both variables are critical for weather and climate applications. As is common with satellite measurements, a multitude of OSVW data products exist for each currently operating satellite instrument. In 2012 the Joint Technical Commission on Oceanography and Marine Meteorology (JCOMM) launched an initiative to coordinate production of OSVW data products to maximize the impact and benefit of existing and future OSVW measurements in atmospheric and oceanic applications. This paper describes meteorological and oceanographic requirements for OSVW data products; provides an inventory of unique data products to illustrate that the challenge is not the production of individual data products, but the generation of harmonized datasets for analysis and synthesis of the ensemble of data products; and outlines a vision for JCOMM, in partnership with other international groups, to assemble an international network to share ideas, data, tools, strategies, and deliverables to improve utilization of satellite OSVW data products for research and operational applications.
Development of a global historic monthly mean precipitation dataset
NASA Astrophysics Data System (ADS)
Yang, Su; Xu, Wenhui; Xu, Yan; Li, Qingxiang
2016-04-01
A global historic precipitation dataset is the basis for climate and water cycle research. There have been several global historic land surface precipitation datasets developed by international data centers such as the US National Climatic Data Center (NCDC), the European Climate Assessment & Dataset project team, the Met Office, etc., but so far there are no such datasets developed by any research institute in China. In addition, each dataset has its own regional focus, and the existing global precipitation datasets only contain sparse observational stations over China, which may result in uncertainties in East Asian precipitation studies. In order to take into account comprehensive historic information, users might need to employ two or more datasets. However, the non-uniform data formats, data units, station IDs, and so on add extra difficulties for users to exploit these datasets. For this reason, a complete historic precipitation dataset that takes advantage of various datasets has been developed and produced in the National Meteorological Information Center of China. Precipitation observations from 12 sources are aggregated, and the data formats, data units, and station IDs are unified. Duplicated stations with the same ID are identified, with duplicated observations removed. Consistency test, correlation coefficient test, significance t-test at the 95% confidence level, and significance F-test at the 95% confidence level are conducted first to ensure the data reliability. Only those datasets that satisfy all the above four criteria are integrated to produce the China Meteorological Administration global precipitation (CGP) historic precipitation dataset version 1.0. It contains observations at 31 thousand stations with 1.87 × 10^7 data records, among which 4152 time series of precipitation are longer than 100 yr. This dataset plays a critical role in climate research due to its advantages in large data volume and high density of station network, compared to other datasets. Using the Penalized Maximal t-test method, significant inhomogeneity has been detected in historic precipitation datasets at 340 stations. The ratio method is then employed to effectively remove these remarkable change points. Global precipitation analysis based on CGP v1.0 shows that rainfall has been increasing during 1901-2013 at a rate of 3.52 ± 0.5 mm (10 yr)^-1, slightly higher than that in the NCDC data. Analysis also reveals distinct long-term trends at different latitude zones.
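A minimal sketch of expressing a linear precipitation trend in mm per decade, the units quoted above; the annual series is synthetic.

```python
import numpy as np

years = np.arange(1901, 2014)
rng = np.random.default_rng(3)
annual_precip = 1000 + 0.35 * (years - years[0]) + rng.normal(0, 20, years.size)

slope_per_yr, intercept = np.polyfit(years, annual_precip, 1)
print(f"trend: {slope_per_yr * 10:.2f} mm per decade")
```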
Effect of Diffuse Backscatter in Cassini Datasets on the Inferred Properties of Titan's surface
NASA Astrophysics Data System (ADS)
Sultan-Salem, A. K.; Tyler, G. L.
2006-12-01
Microwave (2.18-cm wavelength) backscatter data for the surface of Titan obtained with the Cassini Radar instrument exhibit a significant diffuse scattering component. An empirical scattering law of the form A cos^n(θ), with free parameters A and n, is often employed to model diffuse scattering, which may involve one or more unidentified mechanisms and processes, such as volume scattering and scattering from surface structure that is much smaller than the electromagnetic wavelength used to probe the surface. The cosine law in general is not explicit in its dependence on either the surface structure or electromagnetic parameters. Further, the cosine law often is only a poor representation of the observed diffuse scattering, as can be inferred from computation of standard goodness-of-fit measures such as the statistical significance. We fit four Cassini datasets (TA Inbound and Outbound, T3 Outbound, and T8 Inbound) with a linear combination of a cosine law and a generalized fractal-based quasi-specular scattering law (A. K. Sultan-Salem and G. L. Tyler, J. Geophys. Res., 111, E06S08, doi:10.1029/2005JE002540, 2006), in order to demonstrate how the presence of diffuse scattering considerably increases the uncertainty in surface parameters inferred from the quasi-specular component, typically the dielectric constant of the surface material and the surface root-mean-square slope. This uncertainty impacts inferences concerning the physical properties of the surfaces that display mixed scattering properties.
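A minimal sketch of fitting the empirical diffuse law σ0(θ) = A cos^n(θ) to backscatter measurements by nonlinear least squares; the data are synthetic, and the paper's combined fit with the fractal-based quasi-specular law is not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def cosine_law(theta_rad, A, n):
    return A * np.cos(theta_rad) ** n

theta = np.deg2rad(np.linspace(5, 70, 40))
rng = np.random.default_rng(4)
sigma0 = cosine_law(theta, 0.3, 1.7) * (1 + rng.normal(0, 0.05, theta.size))

(A_fit, n_fit), cov = curve_fit(cosine_law, theta, sigma0, p0=(0.1, 1.0))
print(f"A = {A_fit:.3f}, n = {n_fit:.2f}")
```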
Deformable Image Registration based on Similarity-Steered CNN Regression.
Cao, Xiaohuan; Yang, Jianhua; Zhang, Jun; Nie, Dong; Kim, Min-Jeong; Wang, Qian; Shen, Dinggang
2017-09-01
Existing deformable registration methods require exhaustively iterative optimization, along with careful parameter tuning, to estimate the deformation field between images. Although some learning-based methods have been proposed for initiating deformation estimation, they are often template-specific and not flexible in practical use. In this paper, we propose a convolutional neural network (CNN) based regression model to directly learn the complex mapping from the input image pair (i.e., a pair of template and subject) to their corresponding deformation field. Specifically, our CNN architecture is designed in a patch-based manner to learn the complex mapping from the input patch pairs to their respective deformation field. First, the equalized active-points guided sampling strategy is introduced to facilitate accurate CNN model learning upon a limited image dataset. Then, the similarity-steered CNN architecture is designed, where we propose to add the auxiliary contextual cue, i.e., the similarity between input patches, to more directly guide the learning process. Experiments on different brain image datasets demonstrate promising registration performance based on our CNN model. Furthermore, it is found that the trained CNN model from one dataset can be successfully transferred to another dataset, although brain appearances across datasets are quite variable.
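A minimal sketch of the patch-based regression setup described above: a small CNN mapping a concatenated (template, subject) 3-D patch pair to a displacement-field patch. The architecture, patch size, and loss below are illustrative assumptions, not the authors' network.

```python
import torch
import torch.nn as nn

class PatchDeformationCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(32, 3, kernel_size=3, padding=1),   # dx, dy, dz per voxel
        )

    def forward(self, template_patch, subject_patch):
        x = torch.cat([template_patch, subject_patch], dim=1)  # (N, 2, D, H, W)
        return self.net(x)                                      # (N, 3, D, H, W)

model = PatchDeformationCNN()
tmpl = torch.randn(8, 1, 16, 16, 16)        # batch of template patches
subj = torch.randn(8, 1, 16, 16, 16)        # corresponding subject patches
target = torch.randn(8, 3, 16, 16, 16)      # "ground-truth" deformation patches
loss = nn.MSELoss()(model(tmpl, subj), target)
loss.backward()
```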
Space-based surface wind vectors to aid understanding of air-sea interactions
NASA Technical Reports Server (NTRS)
Atlas, R.; Bloom, S. C.; Hoffman, R. N.; Ardizzone, J. V.; Brin, G.
1991-01-01
A novel and unique ocean-surface wind data-set has been derived by combining the Defense Meteorological Satellite Program Special Sensor Microwave Imager data with additional conventional data. The variational analysis used generates a gridded surface wind analysis that minimizes an objective function measuring the misfit of the analysis to the background, the data, and certain a priori constraints. In the present case, the European Center for Medium-Range Weather Forecasts surface-wind analysis is used as the background.
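The abstract does not write out the objective function; a generic variational-analysis cost of the kind described, with background, observation, and a priori constraint terms, takes the following form (the symbols are standard notation assumed here, not taken from the paper):

$$J(\mathbf{x}) \;=\; (\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b) \;+\; (\mathbf{y}-H\mathbf{x})^{\mathsf T}\mathbf{R}^{-1}(\mathbf{y}-H\mathbf{x}) \;+\; J_c(\mathbf{x}),$$

where x_b is the background surface-wind analysis, y the satellite and conventional observations, H the observation operator, B and R the background and observation error covariances, and J_c the a priori constraint term.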
1996-12-01
Examples ranging from academic to industrial demonstrate the utility of the proposed procedure for ab initio surface meshing from discrete data input within industrial simulations, such as hypersonic reentry problems where ray-tracing is based on the surface data. The original CAD dataset had over 500 surface patches.
Dataset on predictive compressive strength model for self-compacting concrete.
Ofuyatan, O M; Edeki, S O
2018-04-01
The determination of compressive strength is affected by many variables such as the water-cement (WC) ratio, the superplasticizer (SP), the aggregate combination, and the binder combination. In this dataset article, 7-, 28-, and 90-day compressive strength models are derived using statistical analysis. Response surface methodology is used to investigate the effect of varying percentages of ash, cement, WC, and SP on the hardened property of compressive strength at 7, 28, and 90 days. The levels of the independent parameters are determined based on preliminary experiments. The experimental values for compressive strength at 7, 28, and 90 days and for the modulus of elasticity under different treatment conditions are also discussed and presented. These datasets can effectively be used for modelling and prediction in concrete production settings.
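A minimal sketch of the response-surface idea described above: a second-order polynomial in the mix parameters (WC ratio, SP dosage, ash replacement) fitted to measured 28-day strength. The data and coefficients are synthetic placeholders, not values from the dataset article.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
wc  = rng.uniform(0.35, 0.55, 60)     # water-cement ratio
sp  = rng.uniform(0.5, 2.0, 60)       # superplasticizer (% of binder)
ash = rng.uniform(0.0, 30.0, 60)      # ash replacement (%)
strength_28d = 70 - 60 * wc + 3 * sp - 0.2 * ash + rng.normal(0, 1.5, 60)

X = np.column_stack([wc, sp, ash])
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, strength_28d)
print("predicted 28-day strength (MPa):", model.predict([[0.45, 1.0, 15.0]]).round(1))
```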
Map the Permafrost and its Affected Soils and Vegetation on the Tibetan Plateau
NASA Astrophysics Data System (ADS)
Zhao, L.; Sheng, Y.; Pang, Q.; Zou, D.; Wang, Z.; Li, W.; Wu, X.; Yue, G.; Fang, H.; Zhao, Y.
2015-12-01
A large body of literature has been published during the last several decades on the actual distribution and changes of permafrost on the Tibetan Plateau (TP), based on observed ground temperature datasets along the Qinghai-Xizang Highway and/or Railway (QXH/R). However, very limited data are available in the eastern part of the QXH/R and almost no observations in the western part, not only for observed permafrost data but also for datasets on ground surface conditions, such as soil and vegetation, which are used as model parameters, initial variables, or benchmark data sets for calibration, validation, and comparison in various Earth System Models (ESMs). To evaluate the status of permafrost and its environmental conditions, such as the distribution and thermal state of permafrost, soil, and vegetation on the TP, detailed permafrost investigations were conducted in 5 regions with different climatic and geologic conditions over the whole plateau from 2009 to 2013, and more than 100 ground temperature (GT) monitoring boreholes were drilled and equipped with thermistors, of which 10 sites were also equipped with automatic meteorological stations. Geophysical prospecting methods, such as ground penetrating radar (GPR) and electromagnetic prospecting, were used at the same time to detect permafrost distribution and thickness. The monitoring data revealed that the thermal state of permafrost was well correlated with elevation and regulated by annual precipitation and local geological, geomorphological, and hydrological conditions through heat exchanges between the ground and the atmosphere. Different models, including a GT statistical model, the Common Land Surface Model (CoLM), the Noah land surface model, and TTOP models, were used to map permafrost in the 5 selected regions and over the whole TP, while the investigated and monitored data were used for calibration and validation of all models. Finally, we compiled the permafrost map of the TP and the soil and vegetation map within the permafrost regions on the TP. We also compiled the soil organic carbon density map of permafrost-affected soils on the TP. An overview of permafrost thickness, GTs, and ice content was statistically summarized based on the investigation data.
NASA Astrophysics Data System (ADS)
Smith, Shawn; Bourassa, Mark
2014-05-01
The development of a new surface flux dataset based on underway meteorological observations from research vessels will be presented. The research vessel data center at the Florida State University routinely acquires, quality controls, and distributes underway surface meteorological and oceanographic observations from over 30 oceanographic vessels. These activities are coordinated by the Shipboard Automated Meteorological and Oceanographic System (SAMOS) initiative in partnership with the Rolling Deck to Repository (R2R) project. Recently, the SAMOS data center has used these underway observations to produce bulk flux estimates for each vessel along individual cruise tracks. A description of this new flux product, along with the underlying data quality control procedures applied to SAMOS observations, will be provided. Research vessels provide underway observations at high temporal frequency (1 min. sampling interval) that include navigational (position, course, heading, and speed), meteorological (air temperature, humidity, wind, surface pressure, radiation, rainfall), and oceanographic (sea surface temperature and salinity) samples. Vessels recruited to the SAMOS initiative collect a high concentration of data within the U.S. continental shelf and also frequently operate well outside routine shipping lanes, capturing observations in extreme ocean environments (Southern, Arctic, South Atlantic, and South Pacific oceans). These observations are atypical for their spatial and temporal sampling, making them very useful for many applications including validation of numerical models and satellite retrievals, as well as local assessments of natural variability. Individual SAMOS observations undergo routine automated quality control and select vessels receive detailed visual data quality inspection. The result is a quality-flagged data set that is ideal for calculating turbulent flux estimates. We will describe the bulk flux algorithms that have been applied to the observations and the choices of constants that are used. Analysis of the preliminary SAMOS flux products will be presented, including spatial and temporal coverage for each derived parameter. The unique quality and sampling locations of research vessel observations and their independence from many models and products make them ideal for validation studies. The strengths and limitations of research observations for flux validation studies will be discussed. The authors welcome a discussion with the flux community regarding expansion of the SAMOS program to include additional international vessels, thus facilitating an expansion of this research vessel-based flux product.
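For context, bulk flux estimates of the kind described above are typically computed from bulk aerodynamic formulas of the following generic form; the specific algorithm and transfer coefficients used for the SAMOS product are described in the presentation and are not reproduced here:

$$\tau = \rho\,C_D\,U^{2},\qquad Q_H = \rho\,c_p\,C_H\,U\,(\theta_s-\theta_a),\qquad Q_E = \rho\,L_v\,C_E\,U\,(q_s-q_a),$$

where U is the near-surface wind speed relative to the sea surface, θ_s, θ_a and q_s, q_a are the sea-surface and near-surface air potential temperatures and specific humidities, and C_D, C_H, C_E are the exchange coefficients for momentum, sensible heat, and latent heat.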
NASA's Earth Science Use of Commercially Available Remote Sensing Datasets: Cover Image
NASA Technical Reports Server (NTRS)
Underwood, Lauren W.; Goward, Samuel N.; Fearon, Matthew G.; Fletcher, Rose; Garvin, Jim; Hurtt, George
2008-01-01
The cover image incorporates high resolution stereo pairs acquired from the DigitalGlobe(R) QuickBird sensor. It shows a digital elevation model of Meteor Crater, Arizona at approximately 1.3 meter point-spacing. Image analysts used the Leica Photogrammetry Suite to produce the DEM. The outside portion was computed from two QuickBird panchromatic scenes acquired October 2006, while an Optech laser scan dataset was used for the crater's interior elevations. The crater's terrain model and image drape were created in a NASA Constellation Program project focused on simulating lunar surface environments for prototyping and testing lunar surface mission analysis and planning tools. This work exemplifies NASA's Scientific Data Purchase legacy and commercial high resolution imagery applications, as scientists use commercial high resolution data to examine lunar analog Earth landscapes for advanced planning and trade studies for future lunar surface activities. Other applications include landscape dynamics related to volcanism, hydrologic events, climate change, and ice movement.
Sea surface temperature and salinity from French research vessels, 2001–2013
Gaillard, Fabienne; Diverres, Denis; Jacquin, Stéphane; Gouriou, Yves; Grelet, Jacques; Le Menn, Marc; Tassel, Joelle; Reverdin, Gilles
2015-01-01
French Research vessels have been collecting thermo-salinometer (TSG) data since 1999 to contribute to the Global Ocean Surface Underway Data (GOSUD) programme. The instruments are regularly calibrated and continuously monitored. Water samples are taken on a daily basis by the crew and later analysed in the laboratory. We present here the delayed mode processing of the 2001–2013 dataset and an overview of the resulting quality. Salinity measurement error was a few hundredths of a unit or less on the practical salinity scale (PSS), due to careful calibration and instrument maintenance, complemented with a rigorous adjustment on water samples. In a global comparison, these data show excellent agreement with an ARGO-based salinity gridded product. The Sea Surface Salinity and Temperature from French REsearch SHips (SSST-FRESH) dataset is very valuable for the ‘calibration and validation’ of the new satellite observations delivered by the Soil Moisture and Ocean Salinity (SMOS) and Aquarius missions. PMID:26504523
NASA Astrophysics Data System (ADS)
Machío, Francisco; Rodríguez-Cielos, Ricardo; Navarro, Francisco; Lapazaran, Javier; Otero, Jaime
2017-10-01
We present a 14-year record of in situ glacier surface velocities determined by repeated global navigation satellite system (GNSS) measurements in a dense network of 52 stakes distributed across two glaciers, Johnsons (tidewater) and Hurd (land-terminating), located on Livingston Island, South Shetland Islands, Antarctica. The measurements cover the time period 2000-2013 and were collected at the beginning and end of each austral summer season. A second-degree polynomial approximation is fitted to each stake position, which allows estimating the approximate positions and associated velocities at intermediate times. This dataset is useful as input data for numerical models of glacier dynamics or for the calibration and validation of remotely sensed velocities for a region where very scarce in situ glacier surface velocity measurements have been available so far. The link to the data repository is as follows: http://doi.pangaea.de/10.1594/PANGAEA.846791.
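A minimal sketch of the approach described above: fit a second-degree polynomial to a stake's position as a function of time and differentiate it to estimate velocity at an intermediate date. The positions below are synthetic.

```python
import numpy as np

t_yr = np.array([0.0, 0.45, 1.0, 1.45, 2.0])        # decimal years since first survey
easting_m = np.array([0.0, 2.1, 4.6, 6.9, 9.5])     # stake easting relative to start

coeffs = np.polyfit(t_yr, easting_m, deg=2)          # position(t) ~ a t^2 + b t + c
velocity = np.polyval(np.polyder(coeffs), 1.2)       # d(position)/dt at t = 1.2 yr
print(f"estimated velocity at t = 1.2 yr: {velocity:.2f} m/yr")
```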
Holmes, Avram J.; Hollinshead, Marisa O.; O’Keefe, Timothy M.; Petrov, Victor I.; Fariello, Gabriele R.; Wald, Lawrence L.; Fischl, Bruce; Rosen, Bruce R.; Mair, Ross W.; Roffman, Joshua L.; Smoller, Jordan W.; Buckner, Randy L.
2015-01-01
The goal of the Brain Genomics Superstruct Project (GSP) is to enable large-scale exploration of the links between brain function, behavior, and ultimately genetic variation. To provide the broader scientific community data to probe these associations, a repository of structural and functional magnetic resonance imaging (MRI) scans linked to genetic information was constructed from a sample of healthy individuals. The initial release, detailed in the present manuscript, encompasses quality screened cross-sectional data from 1,570 participants ages 18 to 35 years who were scanned with MRI and completed demographic and health questionnaires. Personality and cognitive measures were obtained on a subset of participants. Each dataset contains a T1-weighted structural MRI scan and either one (n=1,570) or two (n=1,139) resting state functional MRI scans. Test-retest reliability datasets are included from 69 participants scanned within six months of their initial visit. For the majority of participants self-report behavioral and cognitive measures are included (n=926 and n=892 respectively). Analyses of data quality, structure, function, personality, and cognition are presented to demonstrate the dataset’s utility. PMID:26175908
Edge-oriented dual-dictionary guided enrichment (EDGE) for MRI-CT image reconstruction.
Li, Liang; Wang, Bigong; Wang, Ge
2016-01-01
In this paper, we formulate the joint/simultaneous X-ray CT and MRI image reconstruction problem. In particular, a novel algorithm is proposed for MRI image reconstruction from highly under-sampled MRI data and CT images. It consists of two steps. First, a training dataset is generated from a series of well-registered MRI and CT images of the same patients. Then, an initial MRI image of a patient can be reconstructed via edge-oriented dual-dictionary guided enrichment (EDGE) based on the training dataset and a CT image of the patient. Second, an MRI image is reconstructed using the dictionary learning (DL) algorithm from highly under-sampled k-space data and the initial MRI image. Our algorithm can establish a one-to-one correspondence between the two imaging modalities, and obtain a good initial MRI estimation. Both noise-free and noisy simulation studies were performed to evaluate and validate the proposed algorithm. The results with different under-sampling factors show that the proposed algorithm performs significantly better than reconstruction using the DL algorithm from MRI data alone.
NASA Astrophysics Data System (ADS)
Boisserie, M.; Cocke, S.; O'Brien, J. J.
2009-12-01
Although the amount of water contained in the soil seems insignificant when compared to the total amount of water on a global scale, soil moisture is widely recognized as a crucial variable for climate studies. It plays a key role in regulating the interaction between the atmosphere and the land surface by controlling the partitioning between the surface latent and sensible heat fluxes. In addition, the persistence of soil moisture anomalies provides one of the most important components of memory for the climate system. Several studies have shown that, during the boreal summer in mid-latitudes, the role of soil moisture in controlling continental precipitation variability may be more important than that of sea surface temperature (Koster et al. 2000, Hong and Kalnay 2000, Kumar and Hoerling 1995, Trenberth et al. 1998, Shukla 1998). Although all of the above studies have demonstrated the strong sensitivity of seasonal forecasts to the soil moisture initial conditions, they relied on extreme or idealized soil moisture levels. The question of whether realistic soil moisture initial conditions lead to improved seasonal predictions has not been adequately addressed. Progress in addressing this question has been hampered by the lack of long-term, reliable, observation-based global soil moisture datasets. Since precipitation strongly affects soil moisture both at the surface and at depth, an alternative is to assimilate precipitation. Because precipitation is a diagnostic variable, most current reanalyses do not directly assimilate it into their models (Bosilovich, 2008). In this study, an effective technique that directly assimilates precipitation is used. We examine two experiments. In the first experiment, the model is initialized by directly assimilating a global, 3-hourly, 1.0° precipitation dataset, provided by Sheffield et al. (2006), over a continuous assimilation period of a couple of months. For this, we use a technique named Precipitation Assimilation Reanalysis (PAR), described in Nunes and Cocke (2004). This technique consists of modifying the vertical profile of humidity as a function of the observed and predicted model rain rates. In the second experiment, the model is initialized without precipitation assimilation. For each experiment, ten sets of seasonal forecasts of the coupled land-atmosphere Florida State University/Center for Ocean-Atmospheric Prediction Studies (FSU/COAPS) model were generated, starting from the boreal summer of each year between 1986 and 1995. For each forecast, ten ensemble members are produced by starting the forecast from the 1st and the 15th of each month from April to August. The results of these experiments show, first, that the PAR technique greatly improves the temporal and spatial variability of our model soil moisture estimate. Second, using these realistic soil moisture initial conditions, we found a significant increase in air temperature seasonal forecasting skill. However, no significant increase was found in precipitation seasonal forecasting skill. The results of this study contribute to the GLACE-2 international multi-model experiment.
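A minimal, heavily simplified sketch of rain-rate-based humidity adjustment (an assumed illustration only; the actual PAR scheme of Nunes and Cocke (2004) modifies the vertical humidity profile in a more sophisticated way): the specific-humidity profile is moistened toward saturation when the model rains less than observed and dried when it rains more.

```python
# Scale the specific-humidity profile according to the observed/model rain-rate
# ratio. Function name, gain, and profile values are hypothetical.
import numpy as np

def adjust_humidity(q, q_sat, rain_obs, rain_model, gain=0.2, eps=1e-6):
    """q, q_sat: specific humidity and its saturation value per model level (kg/kg)."""
    ratio = (rain_obs + eps) / (rain_model + eps)   # >1: model too dry, <1: too wet
    factor = 1.0 + gain * np.log(ratio)             # damped response to the ratio
    return np.clip(q * factor, 0.0, q_sat)          # keep physically plausible values

q = np.array([0.014, 0.010, 0.006, 0.003, 0.001])   # near-surface to upper levels
q_sat = np.array([0.018, 0.013, 0.008, 0.004, 0.0015])
print(adjust_humidity(q, q_sat, rain_obs=4.0, rain_model=1.0))
```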
Atmospheric Science Data Center
2017-10-11
... new inland water class for RCCM calculation and changed threshold and surface classification datasets accordingly. Modified land second ... 06/21/2000 First version of RCCM. Pre-launch threshold values are used. New ancillary files: ...
NASA Astrophysics Data System (ADS)
Calla, O. P. N.; Mathur, Shubhra; Gadri, Kishan Lal; Jangid, Monika
2016-12-01
In the present paper, permittivity maps of the equatorial lunar surface are generated using brightness temperature (TB) data obtained from the Microwave Radiometer (MRM) of Chang'e-1 and physical temperature (TP) data obtained from Diviner on the Lunar Reconnaissance Orbiter (LRO). Permittivity mapping is not carried out above 60° latitude towards the lunar poles because of large anomalies in the physical temperature obtained from Diviner. The microwave frequencies used to generate these maps are 3 GHz, 7.8 GHz, 19.35 GHz and 37 GHz. Permittivity values are simulated using TB values at these four frequencies, with a weighted average of the Diviner physical temperatures used to compute the permittivity at each microwave frequency. Longer microwave wavelengths provide information on deeper layers of the lunar surface than shorter wavelengths. Initially, the microwave emissivity is estimated using TB values from the MRM and the physical temperature (TP) from Diviner. From the estimated emissivity, the real part of the permittivity (ε) is calculated using the Fresnel equations, and permittivity maps of the equatorial lunar surface are generated. The simulated permittivity values are normalized with respect to density for easy comparison with the permittivity values of Apollo samples as well as with those of the Terrestrial Analogue of Lunar Soil (TALS) JSC-1A. A lower value of the dielectric constant (ε‧) indicates that the corresponding lunar surface is smooth and does not have rough, rocky terrain; these data can therefore help in selecting suitable landing sites for future lunar missions. The results of this paper will serve as input to future exploration of the lunar surface.
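A minimal sketch of the emissivity-to-permittivity step under a smooth-surface, normal-incidence assumption (the paper's exact retrieval may differ; the TB and TP values below are hypothetical): emissivity is taken as TB/TP and the normal-incidence Fresnel reflectivity is inverted for the real part of the permittivity.

```python
# Estimate emissivity as TB/TP and invert the normal-incidence Fresnel
# reflectivity R = |(sqrt(eps) - 1) / (sqrt(eps) + 1)|^2 for the real permittivity.
import numpy as np

def permittivity_from_tb(tb, tp):
    """tb: microwave brightness temperature (K); tp: physical temperature (K)."""
    e = np.clip(tb / tp, 1e-3, 0.999)      # emissivity
    r = 1.0 - e                            # reflectivity at normal incidence
    sqrt_r = np.sqrt(r)
    sqrt_eps = (1.0 + sqrt_r) / (1.0 - sqrt_r)
    return sqrt_eps ** 2                   # real part of relative permittivity

# hypothetical 3 GHz values for a few equatorial pixels
tb = np.array([238.0, 245.0, 251.0])
tp = np.array([255.0, 258.0, 260.0])
print(permittivity_from_tb(tb, tp))
```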
NASA Astrophysics Data System (ADS)
Wright, L.; Coddington, O.; Pilewskie, P.
2017-12-01
Hyperspectral instruments are a growing class of Earth observing sensors designed to improve remote sensing capabilities beyond discrete multi-band sensors by providing tens to hundreds of continuous spectral channels. Improved spectral resolution, range and radiometric accuracy allow the collection of large amounts of spectral data, facilitating thorough characterization of both atmospheric and surface properties. We describe the development of an Informed Non-Negative Matrix Factorization (INMF) spectral unmixing method to exploit this spectral information and separate atmospheric and surface signals based on their physical sources. INMF offers marked benefits over other commonly employed techniques including non-negativity, which avoids physically impossible results; and adaptability, which tailors the method to hyperspectral source separation. The INMF algorithm is adapted to separate contributions from physically distinct sources using constraints on spectral and spatial variability, and library spectra to improve the initial guess. Using this INMF algorithm we decompose hyperspectral imagery from the NASA Hyperspectral Imager for the Coastal Ocean (HICO), with a focus on separating surface and atmospheric signal contributions. HICO's coastal ocean focus provides a dataset with a wide range of atmospheric and surface conditions. These include atmospheres with varying aerosol optical thicknesses and cloud cover. HICO images also provide a range of surface conditions including deep ocean regions, with only minor contributions from the ocean surfaces; and more complex shallow coastal regions with contributions from the seafloor or suspended sediments. We provide extensive comparison of INMF decomposition results against independent measurements of physical properties. These include comparison against traditional model-based retrievals of water-leaving, aerosol, and molecular scattering radiances and other satellite products, such as aerosol optical thickness from the Moderate Resolution Imaging Spectroradiometer (MODIS).
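A minimal sketch of the informed-initialization idea using plain NMF from scikit-learn (the INMF constraints on spectral and spatial variability are omitted, and the radiance cube and library spectra below are random stand-ins): the endmember matrix is initialized from library spectra, as the abstract describes for the initial guess.

```python
# Non-negative matrix factorization of a flattened hyperspectral cube with a
# custom initialization of the source-spectra matrix from library spectra.
import numpy as np
from sklearn.decomposition import NMF

n_pixels, n_bands, n_sources = 500, 87, 4           # HICO-like sizes (hypothetical)
rng = np.random.default_rng(1)
X = rng.random((n_pixels, n_bands))                  # radiance cube, pixels x bands

# library spectra (e.g., water-leaving, aerosol, molecular scattering, cloud)
# as initial endmembers; random stand-ins with the right shape
H_init = rng.random((n_sources, n_bands))
W_init = np.full((n_pixels, n_sources), X.mean() / n_sources)

model = NMF(n_components=n_sources, init="custom", max_iter=500)
W = model.fit_transform(X, W=W_init, H=H_init)       # per-pixel abundances
H = model.components_                                # refined source spectra
```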
Zhang, Xiangmin; Williams, Rachel; Wu, Xiaodong; Anderson, Donald D.; Sonka, Milan
2011-01-01
A novel method for simultaneous segmentation of multiple interacting surfaces belonging to multiple interacting objects, called LOGISMOS (layered optimal graph image segmentation of multiple objects and surfaces), is reported. The approach is based on the algorithmic incorporation of multiple spatial inter-relationships in a single n-dimensional graph, followed by graph optimization that yields a globally optimal solution. The LOGISMOS method’s utility and performance are demonstrated on a bone and cartilage segmentation task in the human knee joint. Although trained on a relatively small set of only nine example images, this system achieved good performance. Judged by Dice similarity coefficients (DSC) using a leave-one-out test, DSC values of 0.84 ± 0.04, 0.80 ± 0.04 and 0.80 ± 0.04 were obtained for the femoral, tibial, and patellar cartilage regions, respectively. These are excellent DSC values, considering the narrow-sheet character of the cartilage regions. Similarly, low signed mean cartilage thickness errors were obtained when compared to a manually-traced independent standard in 60 randomly selected 3-D MR image datasets from the Osteoarthritis Initiative database: 0.11 ± 0.24, 0.05 ± 0.23, and 0.03 ± 0.17 mm for the femoral, tibial, and patellar cartilage thickness, respectively. The average signed surface positioning errors for the six detected surfaces ranged from 0.04 ± 0.12 mm to 0.16 ± 0.22 mm. The reported LOGISMOS framework provides robust and accurate segmentation of the knee joint bone and cartilage surfaces of the femur, tibia, and patella. As a general segmentation tool, the developed framework can be applied to a broad range of multiobject multisurface segmentation problems. PMID:20643602
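For reference, a minimal sketch of the Dice similarity coefficient used to score the segmentations (standard definition, not code from the LOGISMOS paper):

```python
# Dice similarity coefficient between two binary segmentation masks.
import numpy as np

def dice(a, b):
    """a, b: boolean 3-D masks of the automated and reference segmentations."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# toy masks: the overlap covers half of each region, so DSC = 0.5
auto = np.zeros((4, 4, 4), bool); auto[1:3, 1:3, 1:3] = True
ref = np.zeros((4, 4, 4), bool); ref[1:3, 1:3, :2] = True
print(dice(auto, ref))
```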
How to Make Data a Blessing to Parametric Uncertainty Quantification and Reduction?
NASA Astrophysics Data System (ADS)
Ye, M.; Shi, X.; Curtis, G. P.; Kohler, M.; Wu, J.
2013-12-01
From a Bayesian point of view, the probabilities of model parameters and predictions are conditioned on the data used for parameter inference and prediction analysis. It is critical to use appropriate data for quantifying parametric uncertainty and its propagation to model predictions. However, data are always limited and imperfect. When a dataset cannot properly constrain the model parameters, it may lead to inaccurate uncertainty quantification. While in this case data appear to be a curse to uncertainty quantification, a comprehensive modeling analysis may help in understanding the cause and characteristics of parametric uncertainty and thus turn data into a blessing. In this study, we illustrate the impacts of data on uncertainty quantification and reduction using the example of a surface complexation model (SCM) developed to simulate uranyl (U(VI)) adsorption. The model includes two adsorption sites, referred to as strong and weak sites. The amount of uranium adsorption on these sites determines both the mean arrival time and the long tail of the breakthrough curves. There is one reaction on the weak site but two reactions on the strong site. The unknown parameters include the fractions of the total surface site density of the two sites and the surface complex formation constants of the three reactions. A total of seven experiments were conducted with different geochemical conditions to estimate these parameters. The experiments with a low initial concentration of U(VI) result in a large amount of parametric uncertainty. A modeling analysis shows that this is because the experiments cannot distinguish the relative adsorption affinities of the strong and weak sites. Therefore, experiments with a high initial concentration of U(VI) are needed, because in these experiments the strong site is nearly saturated and the weak-site parameters can be determined. The experiments with a high initial concentration of U(VI) are thus a blessing to uncertainty quantification, and the experiments with a low initial concentration help modelers turn a curse into a blessing. The impacts of data on uncertainty quantification and reduction are quantified using probability density functions of the model parameters obtained from Markov chain Monte Carlo simulation using the DREAM algorithm. This study provides insights into model calibration, uncertainty quantification, experiment design, and data collection in groundwater reactive transport modeling and other environmental modeling.
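A minimal sketch of Bayesian parameter estimation with a generic random-walk Metropolis sampler, used here as a simplified stand-in for the DREAM algorithm; the forward model, data, and priors are hypothetical placeholders rather than the actual surface complexation model:

```python
# Draw posterior samples of two placeholder parameters given noisy synthetic
# breakthrough data, using random-walk Metropolis with uniform priors.
import numpy as np

rng = np.random.default_rng(0)

def forward_model(theta, t):
    # placeholder breakthrough curve; the real SCM is a reactive-transport model
    site_fraction, log_k = theta
    return site_fraction * np.exp(-t / np.exp(log_k))

t_obs = np.linspace(0.0, 10.0, 25)
y_obs = forward_model([0.3, 1.0], t_obs) + rng.normal(0, 0.01, t_obs.size)

def log_posterior(theta, sigma=0.01):
    if not (0 < theta[0] < 1 and -3 < theta[1] < 3):     # uniform priors
        return -np.inf
    resid = y_obs - forward_model(theta, t_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

theta = np.array([0.5, 0.0])
logp = log_posterior(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.02, 0.05])           # random-walk proposal
    logp_prop = log_posterior(prop)
    if np.log(rng.random()) < logp_prop - logp:          # accept/reject
        theta, logp = prop, logp_prop
    samples.append(theta.copy())
samples = np.array(samples[5000:])                       # discard burn-in
print(samples.mean(axis=0), samples.std(axis=0))         # posterior summaries
```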
Building a better search engine for earth science data
NASA Astrophysics Data System (ADS)
Armstrong, E. M.; Yang, C. P.; Moroni, D. F.; McGibbney, L. J.; Jiang, Y.; Huang, T.; Greguska, F. R., III; Li, Y.; Finch, C. J.
2017-12-01
Free-text searching of earth science datasets has been implemented with varying degrees of success and completeness across the spectrum of the 12 NASA earth science data centers. At the JPL Physical Oceanography Distributed Active Archive Center (PO.DAAC) the search engine has been developed around the Solr/Lucene platform. Others have chosen other popular enterprise search platforms such as Elasticsearch. Regardless, the default implementations of these search engines, which leverage factors such as dataset popularity, term frequency, and inverse document frequency, do not fully meet the need for precise relevancy and ranking of earth science search results. For the PO.DAAC, this shortcoming has been identified for several years by its external User Working Group, which has issued several recommendations to improve the relevancy and discoverability of datasets related to remotely sensed sea surface temperature, ocean wind, waves, salinity, height, and gravity, comprising over 500 publicly available datasets. Recently, the PO.DAAC has teamed with an effort led by George Mason University to improve the search and relevancy ranking of oceanographic data via a simple search interface and powerful backend services called MUDROD (Mining and Utilizing Dataset Relevancy from Oceanographic Datasets to Improve Data Discovery), funded by the NASA AIST program. MUDROD has mined and utilized the combination of PO.DAAC earth science dataset metadata, usage metrics, and user feedback and search history to objectively extract relevance for improved data discovery and access. In addition to improved dataset relevance and ranking, the MUDROD search engine also returns recommendations for related datasets and related user queries. This presentation will report on the use cases that drove the architecture and development, and on the success metrics and improvements in search precision and recall that MUDROD has demonstrated over the existing PO.DAAC search interfaces.
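A minimal sketch of the kind of blending the abstract describes, combining a TF-IDF text score with a usage-metrics signal (this is an assumed illustration, not the MUDROD implementation; dataset titles and download counts are hypothetical):

```python
# Rank dataset titles by a weighted combination of TF-IDF cosine similarity
# to the query and a normalized popularity score from download counts.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

titles = [
    "GHRSST Level 4 sea surface temperature analysis",
    "Aquarius sea surface salinity monthly gridded product",
    "Ocean surface wind vectors from scatterometer",
]
downloads = np.array([12000, 800, 3000])                 # usage metrics

vec = TfidfVectorizer()
doc_matrix = vec.fit_transform(titles)

def rank(query, text_weight=0.7):
    text_score = cosine_similarity(vec.transform([query]), doc_matrix).ravel()
    pop_score = np.log1p(downloads) / np.log1p(downloads).max()
    score = text_weight * text_score + (1 - text_weight) * pop_score
    return np.argsort(score)[::-1], score

order, score = rank("sea surface temperature")
for i in order:
    print(f"{score[i]:.3f}  {titles[i]}")
```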
USDA-ARS?s Scientific Manuscript database
Using multiple historical satellite surface soil moisture products, the Kalman Filtering-based Soil Moisture Analysis Rainfall Tool (SMART) is applied to improve the accuracy of a multi-decadal global daily rainfall product that has been bias-corrected to match the monthly totals of available rain g...
San Pedro River Basin Data Browser Report
Acquisition of primary spatial data and database development are initial features of any type of landscape assessment project. They provide contemporary land cover and the ancillary datasets necessary to establish reference condition and develop alternative future scenarios that ...
NASA Astrophysics Data System (ADS)
Snidero, M.; Amilibia, A.; Gratacos, O.; Muñoz, J. A.
2009-04-01
This work presents a methodological workflow for the 3D reconstruction of geological surfaces at regional scale, based on remote sensing data and geological maps. The workflow has been tested on the reconstruction of the Anaran anticline, located at the mountain front of the Zagros fold-and-thrust belt. The remote sensing dataset used is a combination of ASTER and SPOT images as well as a high-resolution digital elevation model. Consistent spatial positioning of the complete dataset in a 3D environment is necessary to obtain satisfactory results during the reconstruction. The ASTER images have been processed with the Optimum Index Factor (OIF) technique in order to facilitate geological mapping. By pansharpening the resulting ASTER image with the SPOT panchromatic image, we obtain the final high-resolution image used during the 3D mapping. Structural data (dip data) have been acquired through the analysis of the 3D mapped geological traces. Structural analysis of the resulting dataset allows us to divide the structure into different cylindrical domains. The orientations of the related plunge lines have been used to project data along the structure, covering areas with little or no information. Once a satisfactory dataset has been acquired, we reconstruct a selected horizon following the dip-domain concept. By manual editing, the obtained surfaces have been adjusted to the mapped geological limits as well as to the modeled faults. With the implementation of the Discrete Smooth Interpolation (DSI) algorithm, the final surfaces have been reconstructed along the anticline. To date, the results demonstrate that the proposed methodology is a powerful tool for the 3D reconstruction of geological surfaces from remote sensing data in very inaccessible areas (e.g., Iran, China, Africa). It is especially useful in semiarid regions where the structure strongly controls the topography. The reconstructed surfaces clearly show the geometry of the different sectors of the structure: the presence of a back thrust affecting the back limb in the southern part of the anticline, the geometry of the grabens located along the anticline crest, the crosscutting relationship of the north-south faulted zone with the main thrust, and the periclinal closure of the northern dome.
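A minimal sketch of the Optimum Index Factor step using its standard formula (the paper gives no code, and the synthetic bands below are placeholders): band triplets are ranked by the sum of band standard deviations divided by the sum of absolute pairwise correlations.

```python
# Rank band triplets by the Optimum Index Factor to pick an informative
# three-band combination for geological mapping.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(2)
bands = rng.random((6, 200, 200))                 # 6 hypothetical ASTER bands
flat = bands.reshape(6, -1)
std = flat.std(axis=1)
corr = np.corrcoef(flat)

def oif(i, j, k):
    return (std[i] + std[j] + std[k]) / (
        abs(corr[i, j]) + abs(corr[i, k]) + abs(corr[j, k]))

best = max(combinations(range(6), 3), key=lambda t: oif(*t))
print("best band triplet:", best, "OIF =", round(oif(*best), 2))
```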
Acceptance of Domestic Cat Mitochondrial DNA in a Criminal Proceeding
Lyons, Leslie A.; Grahn, Robert A.; Kun, Teri J.; Netzel, Linda R.; Wictum, Elizabeth E.; Halverson, Joy L.
2014-01-01
Shed hair from domestic animals readily adheres to clothing and other contact items, providing a source of transfer evidence for criminal investigations. Mitochondrial DNA is often the only option for DNA analysis of shed hair. Human mitochondrial DNA analysis has been accepted in the US court system since 1996. The murder trial of the State of Missouri versus Henry L. Polk, Jr. represents the first legal proceeding in which cat mitochondrial DNA analysis was introduced into evidence. The mitochondrial DNA evidence was initially considered inadmissible due to concerns about the cat dataset and the scientific acceptance of the marker. Those concerns were subsequently addressed, and the evidence was deemed admissible. This report reviews the case in regard to the cat biological evidence and its ultimate admission as generally accepted and reliable. Expansion and saturation analysis of the cat mitochondrial DNA control region dataset supported the initial interpretation of the evidence. PMID:25086413
Moonesinghe, S Ramani; Grocott, Michael P W; Bennett-Guerrero, Elliott; Bergamaschi, Roberto; Gottumukkala, Vijaya; Hopkins, Thomas J; McCluskey, Stuart; Gan, Tong J; Mythen, Michael Monty G; Shaw, Andrew D; Miller, Timothy E
2017-01-01
This article sets out a framework for measurement of quality of care relevant to enhanced recovery pathways (ERPs) in elective colorectal surgery. The proposed framework is based on established measurement systems and/or theories, and provides an overview of the different approaches for improving clinical monitoring, and enhancing quality improvement or research in varied settings with different levels of available resources. Using a structure-process-outcome framework, we make recommendations for three hierarchical tiers of data collection. Core, Quality Improvement, and Best Practice datasets are proposed. The suggested datasets incorporate patient data to describe case-mix, process measures to describe delivery of enhanced recovery and clinical outcomes. The fundamental importance of routine collection of data for the initiation, maintenance, and enhancement of enhanced recovery pathways is emphasized.
Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data
NASA Technical Reports Server (NTRS)
Baxes, Gregory; Mixon, Brian; Linger, TIm
2013-01-01
Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application is continually issuing data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method has been developed for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets, in particular massively sized datasets. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent dynamically generated KML code that directs the client application to make follow-on requests for higher level-of-detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset. The approach provides an efficient data traversal path and mechanism that can be flexibly established for any dataset regardless of size or other characteristics. The method yields significant improvements in user-interactive geospatial client and data server interaction and associated network bandwidth requirements. The innovation uses a C- or PHP-code-like grammar that provides a high degree of processing flexibility. A set of language lexer and parser elements is provided that offers a complete language grammar for writing and executing language directives. A script is wrapped and passed to the geospatial data server by a client application as a component of a standard KML-compliant statement. The approach provides an efficient means for a geospatial client application to request server preprocessing of data prior to client delivery. Data is structured in a quadtree format. As the user zooms into the dataset, geographic regions are subdivided into four child regions. Conversely, as the user zooms out, four child regions collapse into a single, lower-LOD region.
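A minimal sketch of the cascading-KML idea (an assumed illustration, not the authors' server code; the server URL and pixel threshold are hypothetical): a Python function emits, for one quadtree tile, a ground overlay plus four region-limited NetworkLinks that trigger higher-LOD requests as the user zooms in.

```python
# Generate KML for one quadtree tile: a GroundOverlay for the current LOD and
# four NetworkLinks whose Regions request the child tiles when zoomed in.
def tile_kml(north, south, east, west, depth, server="http://example.org/tiles"):
    def box(n, s, e, w):
        return f"<north>{n}</north><south>{s}</south><east>{e}</east><west>{w}</west>"

    mid_lat, mid_lon = (north + south) / 2, (east + west) / 2
    children = [(north, mid_lat, east, mid_lon), (north, mid_lat, mid_lon, west),
                (mid_lat, south, east, mid_lon), (mid_lat, south, mid_lon, west)]
    links = "".join(
        f"""<NetworkLink>
              <Region><LatLonAltBox>{box(*c)}</LatLonAltBox>
                <Lod><minLodPixels>256</minLodPixels></Lod></Region>
              <Link><href>{server}?bbox={c[3]},{c[1]},{c[2]},{c[0]}&amp;depth={depth + 1}</href>
                <viewRefreshMode>onRegion</viewRefreshMode></Link>
            </NetworkLink>""" for c in children)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"><Document>
  <GroundOverlay>
    <Icon><href>{server}/img?bbox={west},{south},{east},{north}</href></Icon>
    <LatLonBox>{box(north, south, east, west)}</LatLonBox>
  </GroundOverlay>
  {links}
</Document></kml>"""

print(tile_kml(40.0, 39.0, -104.0, -105.0, depth=0))
```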
Rane, Swati; Plassard, Andrew; Landman, Bennett A.; Claassen, Daniel O.; Donahue, Manus J.
2017-01-01
This work explores the feasibility of combining anatomical MRI data across two public repositories namely, the Alzheimer’s Disease Neuroimaging Initiative (ADNI) and the Progressive Parkinson’s Markers Initiative (PPMI). We compared cortical thickness and subcortical volumes in cognitively normal older adults between datasets with distinct imaging parameters to assess if they would provide equivalent information. Three distinct datasets were identified. Major differences in data were scanner manufacturer and the use of magnetization inversion to enhance tissue contrast. Equivalent datasets, i.e., those providing similar volumetric measurements in cognitively normal controls, were identified in ADNI and PPMI. These were datasets obtained on the Siemens scanner with TI = 900 ms. Our secondary goal was to assess the agreement between subcortical volumes that are obtained with different software packages. Three subcortical measurement applications (FSL, FreeSurfer, and a recent multi-atlas approach) were compared. Our results show significant agreement in the measurements of caudate, putamen, pallidum, and hippocampus across the packages and poor agreement between measurements of accumbens and amygdala. This is likely due to their smaller size and lack of gray matter-white matter tissue contrast for accurate segmentation. This work provides a segue to combine imaging data from ADNI and PPMI to increase statistical power as well as to interrogate common mechanisms in disparate pathologies such as Alzheimer’s and Parkinson’s diseases. It lays the foundation for comparison of anatomical data acquired with disparate imaging parameters and analyzed with disparate software tools. Furthermore, our work partly explains the variability in the results of studies using different software packages. PMID:29756095
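The paper reports agreement between packages without specifying a statistic here; as one common way to quantify such agreement, a minimal sketch of Lin's concordance correlation coefficient applied to hypothetical hippocampal volumes from two packages:

```python
# Lin's concordance correlation coefficient between two sets of volume estimates.
import numpy as np

def lin_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# hypothetical hippocampal volumes (mm^3) from two software packages
freesurfer = [4012, 3850, 4230, 3998, 4105]
fsl_first = [3950, 3790, 4170, 3940, 4080]
print(round(lin_ccc(freesurfer, fsl_first), 3))
```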
NASA Astrophysics Data System (ADS)
Koch, J.; Jensen, K. H.; Stisen, S.
2017-12-01
Hydrological models that integrate numerical process descriptions across compartments of the water cycle are typically required to undergo thorough model calibration in order to estimate suitable effective model parameters. In this study, we apply a spatially distributed hydrological model code which couples the saturated zone with the unsaturated zone and the energy partitioning at the land surface. We conduct a comprehensive multi-constraint model calibration against nine independent observational datasets that reflect both the temporal and the spatial behavior of the hydrological response of a 1000 km2 catchment in Denmark. The datasets are obtained from satellite remote sensing and in-situ measurements and cover five keystone hydrological variables: discharge, evapotranspiration, groundwater head, soil moisture and land surface temperature. Results indicate that a balanced optimization can be achieved in which the errors in the objective functions for all nine observational datasets are reduced simultaneously. The applied calibration framework was tailored with a focus on improving the spatial pattern performance; however, the results suggest that the optimization still tends to improve the temporal dimension of model performance more. This study features a post-calibration linear uncertainty analysis, which allows parameter identifiability to be quantified, that is, the worth of a specific observational dataset for inferring model parameter values through calibration. Furthermore, the ability of an observation to reduce predictive uncertainty is assessed. Such findings have concrete implications for the design of model calibration frameworks and, more generally, for the acquisition of data in hydrological observatories.
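A minimal sketch of multi-constraint aggregation (an assumed illustration, not the authors' calibration setup): normalized error terms from several observational datasets are combined into a single objective and minimized over the model parameters, with a trivial placeholder standing in for the coupled hydrological model.

```python
# Aggregate per-dataset RMSE terms into one objective and minimize it.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
obs = {"discharge": rng.random(50), "et": rng.random(50), "lst": rng.random(50)}
weights = {"discharge": 1.0, "et": 1.0, "lst": 1.0}

def simulate(params, name):
    # placeholder for the coupled hydrological model's output for each dataset
    a, b = params
    return a * obs[name] + b

def objective(params):
    terms = []
    for name, series in obs.items():
        rmse = np.sqrt(np.mean((simulate(params, name) - series) ** 2))
        terms.append(weights[name] * rmse / series.std())   # normalize per dataset
    return sum(terms)                                        # balanced aggregation

result = minimize(objective, x0=[0.5, 0.1], method="Nelder-Mead")
print(result.x, result.fun)
```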
ARM Research in the Equatorial Western Pacific: A Decade and Counting
NASA Technical Reports Server (NTRS)
Long, C. N.; McFarlane, S. A.; DelGenio, A.; Minnis, P.; Ackerman, T. S.; Mather, J.; Comstock, J.; Mace, G. G.; Jensen, M.; Jakob, C.
2013-01-01
The tropical western Pacific (TWP) is an important climatic region. Strong solar heating, warm sea surface temperatures, and the annual progression of the intertropical convergence zone (ITCZ) across this region generate abundant convective systems, which through their effects on the heat and water budgets have a profound impact on global climate and precipitation. In order to accurately evaluate tropical cloud systems in models, measurements of tropical clouds, the environment in which they reside, and their impact on the radiation and water budgets are needed. Because of the remote location, ground-based datasets of cloud, atmosphere, and radiation properties from the TWP region have come primarily from short-term field experiments. While providing extremely useful information on physical processes, these short-term datasets are limited in statistical and climatological information. To provide long-term measurements of the surface radiation budget in the tropics and the atmospheric properties that affect it, the Atmospheric Radiation Measurement program established a measurement site on Manus Island, Papua New Guinea, in 1996 and on the island republic of Nauru in late 1998. These sites provide unique datasets, now spanning more than 10 years on Manus and Nauru. This article presents examples of the scientific use of these datasets, including characterization of cloud properties, analysis of cloud radiative forcing, model studies of tropical clouds and processes, and validation of satellite algorithms. New instrumentation recently installed at the Manus site will provide expanded opportunities for tropical atmospheric science.
Carmen Legaz-García, María Del; Miñarro-Giménez, José Antonio; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2016-06-03
Biomedical research usually requires combining large volumes of data from multiple heterogeneous sources, which makes the integrated exploitation of such data difficult. The Semantic Web paradigm offers a natural technological space for data integration and exploitation by generating content readable by machines. Linked Open Data is a Semantic Web initiative that promotes the publication and sharing of data in machine-readable semantic formats. We present an approach for the transformation and integration of heterogeneous biomedical data with the objective of generating open biomedical datasets in Semantic Web formats. The transformation of the data is based on mappings between the entities of the data schema and the ontological infrastructure that provides the meaning of the content. Our approach permits different types of mappings and includes the possibility of defining complex transformation patterns. Once the mappings are defined, they can be automatically applied to datasets to generate logically consistent content, and the mappings can be reused in further transformation processes. The results of our research are (1) a common transformation and integration process for heterogeneous biomedical data; (2) the application of Linked Open Data principles to generate interoperable, open, biomedical datasets; and (3) a software tool, called SWIT, that implements the approach. In this paper we also describe how we have applied SWIT in different biomedical scenarios and some of the lessons learned. We have presented an approach that is able to generate open biomedical repositories in Semantic Web formats. SWIT is able to apply the Linked Open Data principles in the generation of the datasets, thus allowing their content to be linked to external repositories and creating linked open datasets. SWIT datasets may contain data from multiple sources and schemas, thus becoming integrated datasets.
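A minimal sketch of the mapping idea with rdflib (not the SWIT tool itself; the namespace, column names, and records are hypothetical): rows of a tabular biomedical dataset are turned into RDF triples via a simple entity-to-class and column-to-property mapping.

```python
# Map tabular records to RDF triples using a column-to-property mapping.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/biomed/")
rows = [
    {"id": "p001", "diagnosis": "asthma", "age": 34},
    {"id": "p002", "diagnosis": "copd", "age": 61},
]
mapping = {"diagnosis": EX.hasDiagnosis, "age": EX.hasAge}   # column -> property

g = Graph()
g.bind("ex", EX)
for row in rows:
    subject = EX[f"patient/{row['id']}"]
    g.add((subject, RDF.type, EX.Patient))
    for column, prop in mapping.items():
        g.add((subject, prop, Literal(row[column])))

print(g.serialize(format="turtle"))
```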
NASA Astrophysics Data System (ADS)
Ruan, Jinshuai; Wen, Xiaohang; Fan, Guangzhou; Li, Deqin; Hua, Wei; Wang, Bingyun; Zhang, Yi; Zhang, Mingjun; Wang, Chao; Wang, Lei
2017-11-01
To study the land surface and atmospheric meteorological characteristics of non-uniform underlying surfaces in the semi-arid area of Northeast China, we use the High-Resolution Assimilation Dataset of the water-energy cycle in China (HRADC). Grid points for three different underlying surface types (mixed forest, grassland, and cropland) were selected, and their meteorological elements were averaged for each type. For 2009, we compared and analyzed the different components of leaf area index (LAI), soil temperature and moisture, surface albedo, precipitation, and surface energy for the various underlying surfaces in Northeast China. The results indicate that the summer LAI of mixed forest and cropland is greater than 5 m2 m-2, whereas that of grassland is below 2.5 m2 m-2; in the winter and spring seasons, the Green Vegetation Fraction (GVF) is below 30%. The soil temperature and moisture both vary greatly. Throughout the year, the surface energy budget of the mixed forest is dominated by the latent heat flux; in grasslands and croplands, the sensible heat flux and the latent heat flux are approximately equal, and in summer the GVF contributes more to the latent heat flux than to the sensible heat flux. This study compares the meteorological characteristics of three different underlying surfaces in the semi-arid area of Northeast China and compensates for the limitations of relying purely on observations. This research is important for understanding the water-energy cycle and transport in the semi-arid area.
Poppenga, Sandra K.; Worstell, Bruce B.; Stoker, Jason M.; Greenlee, Susan K.
2009-01-01
The U.S. Geological Survey (USGS) has taken the lead in the creation of a valuable remote sensing product by incorporating digital elevation models (DEMs) derived from Light Detection and Ranging (lidar) into the National Elevation Dataset (NED), the elevation layer of 'The National Map'. High-resolution lidar-derived DEMs provide the accuracy needed to systematically quantify and fully integrate surface flow including flow direction, flow accumulation, sinks, slope, and a dense drainage network. In 2008, 1-meter resolution lidar data were acquired in Minnehaha County, South Dakota. The acquisition was a collaborative effort between Minnehaha County, the city of Sioux Falls, and the USGS Earth Resources Observation and Science (EROS) Center. With the newly acquired lidar data, USGS scientists generated high-resolution DEMs and surface flow features. This report compares lidar-derived surface flow features in Minnehaha County to 30- and 10-meter elevation data previously incorporated in the NED and ancillary hydrography datasets. Surface flow features generated from lidar-derived DEMs are consistently integrated with elevation and are important in understanding surface-water movement to better detect surface-water runoff, flood inundation, and erosion. Many topographic and hydrologic applications will benefit from the increased availability of accurate, high-quality, and high-resolution surface-water data. The remotely sensed data provide topographic information and data integration capabilities needed for meeting current and future human and environmental needs.
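A minimal sketch of the flow-direction step using the standard D8 scheme (not the USGS production workflow): each DEM cell is assigned the direction of steepest descent among its eight neighbors, the starting point for flow accumulation and drainage-network extraction.

```python
# D8 flow direction: assign each DEM cell the conventional code of the
# steepest-descent neighbor (1=E, 2=SE, 4=S, 8=SW, 16=W, 32=NW, 64=N, 128=NE).
import numpy as np

OFFSETS = [(-1, 0, 64), (-1, 1, 128), (0, 1, 1), (1, 1, 2),
           (1, 0, 4), (1, -1, 8), (0, -1, 16), (-1, -1, 32)]

def d8_flow_direction(dem, cellsize=1.0):
    rows, cols = dem.shape
    fdir = np.zeros_like(dem, dtype=np.int32)
    for r in range(rows):
        for c in range(cols):
            best_slope, best_code = 0.0, 0          # 0 = sink or flat cell
            for dr, dc, code in OFFSETS:
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = cellsize * (1.4142 if dr and dc else 1.0)
                    slope = (dem[r, c] - dem[rr, cc]) / dist
                    if slope > best_slope:
                        best_slope, best_code = slope, code
            fdir[r, c] = best_code
    return fdir

dem = np.array([[5.0, 4.5, 4.0],
                [4.8, 4.0, 3.5],
                [4.5, 3.6, 3.0]])
print(d8_flow_direction(dem))
```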
Multibeam 3D Underwater SLAM with Probabilistic Registration.
Palomer, Albert; Ridao, Pere; Ribas, David
2016-04-20
This paper describes a pose-based underwater 3D Simultaneous Localization and Mapping (SLAM) approach using a multibeam echosounder to produce highly consistent underwater maps. The proposed algorithm compounds swath profiles of the seafloor with dead-reckoning localization to build surface patches (i.e., point clouds). An Iterative Closest Point (ICP) algorithm with a probabilistic implementation is then used to register the point clouds, taking into account their uncertainties. The registration process is divided into two steps: (1) point-to-point association for coarse registration and (2) point-to-plane association for fine registration. The point clouds of the surfaces to be registered are sub-sampled in order to decrease both the computation time and the potential of falling into local minima during registration. In addition, a heuristic is used to decrease the complexity of the association step of the ICP from O(n²) to O(n). The performance of the SLAM framework is tested using two real-world datasets: first, a 2.5D bathymetric dataset obtained with the usual down-looking multibeam sonar configuration, and second, a full 3D underwater dataset acquired with a multibeam sonar mounted on a pan-and-tilt unit.
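A minimal sketch of the coarse registration step as a basic point-to-point ICP (the paper's probabilistic, uncertainty-aware implementation and the point-to-plane refinement are omitted; the point clouds below are synthetic):

```python
# Point-to-point ICP: nearest-neighbor association with a k-d tree and
# rigid-transform estimation via SVD (Kabsch) at each iteration.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                       # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp(src, dst, iters=30):
    tree = cKDTree(dst)
    current = src.copy()
    for _ in range(iters):
        _, idx = tree.query(current)               # point-to-point association
        R, t = best_rigid_transform(current, dst[idx])
        current = current @ R.T + t
    return current

rng = np.random.default_rng(4)
seafloor = rng.random((500, 3))                    # reference patch
angle = 0.1
rot = np.array([[np.cos(angle), -np.sin(angle), 0],
                [np.sin(angle), np.cos(angle), 0],
                [0, 0, 1]])
moved = seafloor @ rot.T + np.array([0.05, -0.02, 0.01])
aligned = icp(moved, seafloor)
print(np.abs(aligned - seafloor).mean())           # should be near zero
```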
NASA Technical Reports Server (NTRS)
Poulos, Gregory S.; Stamus, Peter A.; Snook, John S.
2005-01-01
The Cold Land Processes Experiment (CLPX) emphasized the development of a strong synergism between process-oriented understanding, land surface models, and microwave remote sensing. Our work sought to investigate which topographically-generated atmospheric phenomena are most relevant to the CLPX MSAs for the purpose of evaluating their climatic importance to net local moisture fluxes and snow transport, through the use of high-resolution data assimilation and atmospheric numerical modeling techniques. Our task was to create three long-term, scientific-quality atmospheric datasets for quantitative analysis (for all CLPX researchers) and provide a summary of the meteorologically relevant phenomena of the three MSAs (see Figure) over northern Colorado. Our efforts required the ingest of a variety of CLPX datasets and the execution of an atmospheric and land surface data assimilation system based on the Navier-Stokes equations (the Local Analysis and Prediction System, LAPS, and an atmospheric numerical weather prediction model, as required) at topographically relevant grid spacing (approx. 500 m). The resulting dataset will be analyzed by the CLPX community as a part of their larger research goals to determine the relative influence of various atmospheric phenomena on processes relevant to CLPX scientific goals.
Global lake response to the recent warming hiatus
NASA Astrophysics Data System (ADS)
Winslow, Luke A.; Leach, Taylor H.; Rose, Kevin C.
2018-05-01
Understanding temporal variability in lake warming rates over decadal scales is important for understanding observed change in aquatic systems. We analyzed a global dataset of lake surface water temperature observations (1985‑2009) to examine how lake temperatures responded to a recent global air temperature warming hiatus (1998‑2012). Prior to the hiatus (1985‑1998), surface water temperatures significantly increased at an average rate of 0.532 °C decade‑1 (±0.214). In contrast, water temperatures did not change significantly during the hiatus (average rate ‑0.087 °C decade‑1 ±0.223). Overall, 83% of lakes in our dataset (129 of 155) had faster warming rates during the pre-hiatus period than during the hiatus period. These results demonstrate that lakes have exhibited decadal-scale variability in warming rates coherent with global air temperatures and represent an independent line of evidence for the recent warming hiatus. Our analyses provide evidence that lakes are sentinels of broader climatological processes and indicate that warming rates based on datasets where a large proportion of observations were collected during the hiatus period may underestimate longer-term trends.
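A minimal sketch of how such decadal rates can be estimated (not the authors' analysis code; the temperature series is synthetic): surface temperatures are regressed on year separately for the pre-hiatus and hiatus windows and the slopes are reported per decade.

```python
# Compare pre-hiatus and hiatus warming rates for one lake via linear regression.
import numpy as np
from scipy.stats import linregress

years = np.arange(1985, 2010)
rng = np.random.default_rng(5)
temps = (18.0 + 0.05 * (years - 1985) * (years < 1998)
         + 0.05 * 13 * (years >= 1998)
         + rng.normal(0, 0.15, years.size))       # warming, then a plateau

def decadal_trend(mask):
    fit = linregress(years[mask], temps[mask])
    return 10.0 * fit.slope                       # degrees C per decade

print("pre-hiatus:", round(decadal_trend(years <= 1998), 2))
print("hiatus:    ", round(decadal_trend(years >= 1998), 2))
```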
NASA Astrophysics Data System (ADS)
Geminale, A.; Grassi, D.; Altieri, F.; Serventi, G.; Carli, C.; Carrozzo, F. G.; Sgavetti, M.; Orosei, R.; D'Aversa, E.; Bellucci, G.; Frigeri, A.
2015-06-01
The aim of this work is to extract the surface contribution in martian visible/near-infrared spectra by removing the atmospheric components by means of Principal Component Analysis (PCA) and target transformation (TT). The developed technique is suitable for separating spectral components in a dataset large enough to enable an effective use of statistical methods, in support of the more common approaches to removing the gaseous component. In this context, a key role is played by the estimation, from the spectral population, of the covariance matrix that describes the statistical correlation of the signal among different points in the spectrum. As a general rule, the covariance matrix becomes more and more meaningful as the size of the initial population increases, therefore justifying the importance of sizable datasets. Data collected by imaging spectrometers, such as the OMEGA (Observatoire pour la Minéralogie, l'Eau, les Glaces et l'Activité) instrument on board the ESA Mars Express (MEx) mission, are particularly suitable for this purpose, since the same observation session includes a large number of spectra with different aerosol, gas and mineralogical content. The methodology presented in this work has first been validated using a simulated dataset of spectra to evaluate its accuracy. Then, it has been applied to the analysis of OMEGA sessions over the Nili Fossae and Mawrth Vallis regions, which have already been widely studied because of the presence of hydrated minerals. These minerals are key surface components for investigating the presence of liquid water flowing on the martian surface in the Noachian period. Moreover, since a correction for the atmospheric aerosol (dust) component is also applied to these observations, the present work is able to completely remove the atmospheric contribution from the analysed spectra. Once the surface reflectance, free from atmospheric contributions, has been obtained, the Modified Gaussian Model (MGM) has been applied to spectra showing the hydrated phase. Silicates and iron-bearing hydrated minerals have been identified by means of the electronic transitions of Fe2+ between 0.8 and 1.2 μm, while at longer wavelengths the hydrated mineralogy is identified by overtones of the OH group. Surface reflectance spectra, as derived through the method discussed in this paper, clearly show a lower level of atmospheric residuals in the 1.9 μm hydration band, thus resulting in a better match with the MGM deconvolution parameters found for laboratory spectra of martian hydrated mineral analogues and allowing a deeper investigation of this spectral range.
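A minimal sketch of the PCA plus target-transformation step in its generic form (not the authors' pipeline; the spectra are synthetic placeholders): the leading eigenvectors of the spectral covariance are derived and a library spectrum is least-squares fitted in that subspace to test whether it is present in the dataset.

```python
# PCA of a spectral population followed by a target-transformation fit of a
# library spectrum in the subspace of the leading eigenvectors.
import numpy as np

rng = np.random.default_rng(6)
wl = np.linspace(1.0, 2.5, 200)                      # wavelength axis (micron)
gas = np.exp(-((wl - 2.0) / 0.02) ** 2)              # narrow "atmospheric" band
mineral = np.exp(-((wl - 1.9) / 0.15) ** 2)          # broad "hydration" band
spectra = (rng.random((300, 1)) * gas + rng.random((300, 1)) * mineral
           + rng.normal(0, 0.005, (300, wl.size)))   # mixed synthetic dataset

# principal components from the covariance of the mean-centered spectra
mean = spectra.mean(axis=0)
cov = np.cov(spectra - mean, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
pcs = eigvecs[:, ::-1][:, :5]                        # 5 leading eigenvectors

# target transformation: least-squares fit of a library spectrum in PC space
target = mineral
coeffs, *_ = np.linalg.lstsq(pcs, target - mean, rcond=None)
fit = pcs @ coeffs + mean
print("rms misfit:", np.sqrt(np.mean((fit - target) ** 2)))
```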
The Reach Address Database (RAD) stores reach address information for each Water Program feature that has been linked to the underlying surface water features (streams, lakes, etc.) in the National Hydrography Dataset (NHD) Plus dataset.
Creating a Linked Data Hub in the Geosciences
NASA Astrophysics Data System (ADS)
Narock, T. W.; Rozell, E. A.; Robinson, E. M.
2012-12-01
Linked data is a paradigm for publishing data on the Web by using, among other things, non-proprietary data formats and resolvable identifiers for the things in a dataset. One linked data initiative, DBPedia, is widely used as a "crystallization point" for linked data on the Web. It serves as a hub for links from external datasets covering a broad variety of domains. Within the Earth Science Information Partnership (ESIP), efforts have begun to create a similar crystallization point for linked data in the geosciences. The initial project was created by converting more than 100,000 abstracts from the American Geophysical Union (AGU) into linked data using the Resource Description Framework. Like the Wikipedia data DBPedia is derived from, AGU publications have extremely broad coverage of topics in the geosciences. To better characterize the network, we have linked this AGU data to ESIP meeting and membership data, as well as to National Science Foundation-funded research projects. In doing so, we can visualize connections between different collaborative clusters, such as the ESIP community or NSF grantees, within the broader geoscience communities that attend AGU conferences. Efforts to extend this project include the ability to annotate abstracts, provide links to referenced tools or datasets, and enable a crowd-sourcing approach to co-reference resolution.
Damage and protection cost curves for coastal floods within the 600 largest European cities
NASA Astrophysics Data System (ADS)
Prahl, Boris F.; Boettle, Markus; Costa, Luís; Kropp, Jürgen P.; Rybski, Diego
2018-03-01
The economic assessment of the impacts of storm surges and sea-level rise in coastal cities requires high-level information on the damage and protection costs associated with varying flood heights. We provide a systematically and consistently calculated dataset of macroscale damage and protection cost curves for the 600 largest European coastal cities, opening the perspective for a wide range of applications. Offering the first comprehensive dataset to include the costs of dike protection, we provide the underpinning information to run comparative assessments of the costs and benefits of coastal adaptation. Aggregate cost curves for coastal flooding at the city level are commonly regarded as by-products of impact assessments and are generally not published as a standalone dataset. Hence, our work also aims at initiating a more critical discussion on the availability and derivation of cost curves.
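A minimal sketch of the kind of calculation such cost curves support, an expected-annual-damage estimate (an assumed illustration, not the authors' assessment code; the damage curve and return levels are hypothetical):

```python
# Interpolate a stage-damage curve at a set of return levels and integrate the
# damages over annual exceedance probability to get expected annual damage.
import numpy as np

# macroscale damage curve: flood height (m) -> damage (million EUR)
heights = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0])
damage = np.array([0.0, 12.0, 45.0, 110.0, 210.0, 520.0])

# flood heights for a set of return periods (years)
return_periods = np.array([2, 5, 10, 50, 100, 500])
surge_height = np.array([0.4, 0.8, 1.1, 1.7, 2.1, 2.8])

prob = 1.0 / return_periods                          # annual exceedance probability
dmg = np.interp(surge_height, heights, damage)       # damage at each return level

# trapezoidal integration of damage over exceedance probability
ead = np.sum(0.5 * (dmg[:-1] + dmg[1:]) * (prob[:-1] - prob[1:]))
print(f"expected annual damage ~ {ead:.1f} million EUR")
```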